Key Takeaways
1. Salesforce's unique architecture optimizes data management for multi-tenancy
The Salesforce platform architecture is complex, comprising many databases and application servers along with a multitude of services that perform different functions, such as searching, business logic execution, caching, and event management.
Multi-tenant architecture. Salesforce's unique architecture is designed to support multiple organizations (tenants) on shared infrastructure while maintaining data isolation and security. This is achieved through a metadata-driven approach, where customizations and configurations are stored as metadata rather than direct database changes.
Optimized for read operations. Unlike traditional databases optimized for write operations, Salesforce prioritizes read performance. This is reflected in its denormalized data structure, where data is often duplicated across objects to reduce the need for complex joins during queries. For example, the Email field may appear on Lead, Contact, and Campaign Member objects to improve query performance.
Key components of Salesforce architecture:
- Universal Data Dictionary (UDD)
- Multi-tenant query optimizer
- Metadata cache
- Bulk data processing engine
- Search engine service
2. Large Data Volumes (LDVs) require strategic planning and optimization techniques
LDVs in Salesforce don't have a strict formal definition, and if you ask an experienced architect what constitutes an LDV scenario, more than likely the answer you will get will be, "It depends."
Defining LDVs. Large Data Volumes in Salesforce are generally considered when an object has more than 1 million records. However, LDV scenarios can occur with fewer records depending on factors such as transaction volume, query complexity, data skew, and overall org complexity.
Implications of LDVs. Unmanaged LDVs can lead to performance issues in various areas:
- Slow reports and list views
- Degraded search performance
- SOQL query timeouts
- Lengthy sharing calculations
- Slow loading of related lists
Optimization techniques. To manage LDVs effectively, consider the following strategies:
- Implement data archiving to move older, less frequently accessed data off the main objects
- Use skinny tables for frequently accessed fields to improve query performance
- Optimize SOQL queries and reports by using selective filters and indexed fields
- Leverage asynchronous processing for bulk operations
- Implement data skew prevention strategies
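The asynchronous, bulk-processing strategy above boils down to splitting a large record set into fixed-size batches and processing each batch independently. A minimal Python sketch of that chunking step (using 200, the chunk size in which the platform hands records to Apex triggers; the record IDs are illustrative):

```python
def chunk_records(records, batch_size=200):
    """Split a record list into fixed-size batches for asynchronous processing.

    Mirrors the batching pattern used by Batch Apex / Bulk API jobs: each
    batch is small enough to stay within per-transaction limits.
    """
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

# 1,000 records -> 5 full batches of 200
batches = chunk_records(list(range(1000)), batch_size=200)

# 450 records -> 2 full batches plus a remainder batch of 50
partial = chunk_records(list(range(450)), batch_size=200)
```

Each batch can then be submitted as its own unit of work, so a failure in one batch does not roll back the others.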
3. Data skew can significantly impact performance and should be carefully managed
When you have more than 10,000 child records that are all linked to the same parent record, a data skew situation arises.
Types of data skew. There are three main types of data skew in Salesforce:
- Account data skew: Too many child records associated with a single account
- Ownership skew: A single user owns an excessive number of records
- Lookup skew: Too many records associated with a single lookup record
Impact of data skew. Data skew can cause various performance issues:
- Record locking conflicts during updates
- Slow sharing calculations and recalculations
- Degraded query performance
- Timeouts during mass updates or deletions
Mitigation strategies:
- Distribute child records across multiple parent records when possible
- Use queues instead of individual users for record ownership in high-volume scenarios
- Consider using custom objects or big objects for storing historical data
- Implement batch processing and asynchronous operations for large data updates
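The first mitigation, distributing child records across multiple parent records, is often implemented as a simple round-robin assignment that keeps every parent below the 10,000-child skew threshold. A conceptual Python sketch (the parent/child IDs are hypothetical):

```python
from itertools import cycle

def assign_round_robin(child_ids, parent_ids, cap=10_000):
    """Distribute children across parents round-robin so that no single
    parent accumulates enough children to cause data skew."""
    assignment = {p: [] for p in parent_ids}
    parents = cycle(parent_ids)
    for child in child_ids:
        assignment[next(parents)].append(child)
    # Sanity check: every parent stays under the skew threshold.
    assert all(len(children) <= cap for children in assignment.values())
    return assignment

# 25,000 children over 3 "bucket" parent accounts: max load ~8,334 per parent,
# safely under the 10,000-record skew threshold.
result = assign_round_robin(range(25_000), ["acct_1", "acct_2", "acct_3"])
```

In practice the "parents" are often a set of placeholder bucket accounts or queues created specifically to absorb high-volume child records.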
4. Effective data modeling and denormalization are crucial for Salesforce performance
Because the Salesforce platform is optimized for read operations, we will need to be cognizant of denormalizing the data as much as possible.
Denormalization benefits. Salesforce's architecture favors denormalized data structures to improve read performance. This approach reduces the need for complex joins and allows for faster query execution.
Data modeling considerations:
- Replicate commonly accessed data across related objects
- Use formula fields judiciously, as they can impact performance
- Leverage custom indexes for frequently queried fields
- Consider the impact of relationships (lookup vs. master-detail) on query performance
Best practices:
- Analyze reporting and querying needs when designing data models
- Use external IDs for efficient data loading and integration
- Implement field-level security and sharing rules to control data access
- Regularly review and optimize existing data models as business needs evolve
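The replication practice above (copying commonly accessed data onto related objects) can be sketched as a write-time denormalization step: the parent's value is stamped onto each child so that list views and reports on the child never need a join. A minimal Python sketch, assuming a hypothetical `ContactEmail__c` custom field on Case:

```python
def denormalize_email(contacts, cases):
    """Copy each parent Contact's email onto its Cases so reads avoid a join.

    Runs at write time (e.g., in an automation when the Case is created or
    the Contact's email changes), trading extra storage for faster reads.
    """
    email_by_contact = {c["Id"]: c["Email"] for c in contacts}
    for case in cases:
        case["ContactEmail__c"] = email_by_contact.get(case["ContactId"])
    return cases

contacts = [{"Id": "c1", "Email": "ada@example.com"}]
cases = denormalize_email(contacts, [{"Id": "k1", "ContactId": "c1"}])
```

The trade-off is the usual one for denormalization: reads get cheaper, but every update to the source field must be propagated to the copies.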
5. Salesforce Connect and external objects offer flexible data integration solutions
Salesforce Connect is a point-and-click solution to integrate external systems with Salesforce without the need for writing complex code or requiring middleware to move data across the systems.
Benefits of Salesforce Connect:
- Real-time data access from external systems
- Reduced data storage costs in Salesforce
- Simplified integration with legacy systems
- Support for both read and write operations
Implementation options:
- OData 2.0 and 4.0 adapters for systems supporting these protocols
- Custom adapters using Apex Connector Framework for more complex integrations
Considerations:
- Limited availability of some standard Salesforce features for external objects
- Potential performance impact for large data volumes or complex queries
- API call limits that may affect scalability in high-traffic scenarios
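Under the OData adapters, each access to an external object is translated into an OData query against the external system. A sketch of the kind of URL that gets issued, built in Python with the standard OData `$select`, `$filter`, and `$top` options (the endpoint and entity names are hypothetical):

```python
from urllib.parse import urlencode

def odata_query(base_url, entity, select, filter_expr, top=None):
    """Build an OData query URL of the sort Salesforce Connect issues
    against an external system when an external object is accessed."""
    params = {"$select": ",".join(select), "$filter": filter_expr}
    if top is not None:
        params["$top"] = str(top)  # cap the result set, like LIMIT in SOQL
    return f"{base_url}/{entity}?{urlencode(params)}"

url = odata_query(
    "https://erp.example.com/odata", "Orders",
    select=["OrderId", "Status"],
    filter_expr="Status eq 'Open'",
    top=50,
)
```

Because every such request counts against API and callout limits, pushing filters (`$filter`) and row caps (`$top`) to the external system, rather than fetching everything and filtering in Salesforce, is what keeps this pattern scalable.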
6. Big objects provide scalable storage for massive data volumes with specific considerations
Big objects can store millions of records, reaching up to the 1 billion mark and more.
Key benefits of big objects:
- Scalable storage for massive data volumes
- Improved performance for core Salesforce objects by offloading historical data
- Cost-effective storage solution for long-term data retention
- Integration with Platform Events for real-time data processing
Architectural considerations:
- Use of Async SOQL for querying big object data
- Lack of UI for direct data manipulation (custom UI required)
- Limited field types compared to standard/custom objects
- Consistency-focused architecture requiring robust error handling in integrations
Use cases:
- Long-term data archiving for compliance and auditing
- Storage of high-volume IoT or event data
- Historical trending and analytics on large datasets
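A defining big-object behavior worth modeling: rows are identified by their composite index fields, so writing a row whose index values already exist overwrites the existing row instead of creating a duplicate, which is why writes are idempotent and integrations need the robust error handling noted above. A toy Python model (field names are hypothetical):

```python
class BigObjectStore:
    """Toy model of big-object write semantics: rows are keyed by their
    composite index, so re-inserting the same index values is an upsert."""

    def __init__(self, index_fields):
        self.index_fields = index_fields
        self.rows = {}

    def insert(self, row):
        key = tuple(row[f] for f in self.index_fields)
        self.rows[key] = row  # idempotent: same index -> overwrite, not duplicate

store = BigObjectStore(["DeviceId__c", "Timestamp__c"])
store.insert({"DeviceId__c": "d1", "Timestamp__c": 1, "Reading__c": 20})
store.insert({"DeviceId__c": "d1", "Timestamp__c": 1, "Reading__c": 21})
# One row survives, holding the latest reading.
```

This idempotency is convenient for retry-heavy IoT and archiving pipelines: replaying the same batch after a partial failure cannot create duplicates.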
7. Data governance and archiving strategies are essential for long-term Salesforce success
Regular data archiving using automated tools can be a good strategy to prevent growing your data massively on the platform.
Data governance importance. Implementing robust data governance practices ensures data quality, compliance, and optimal performance of your Salesforce org over time.
Key components of data governance:
- Data quality standards and monitoring
- Data retention and archiving policies
- Access control and security measures
- Data lifecycle management
Archiving strategies:
- Use big objects for long-term data storage within Salesforce
- Leverage external storage solutions (e.g., Heroku) for off-platform archiving
- Implement automated archiving processes based on data age or relevance
- Ensure archived data remains accessible for reporting and compliance needs
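Automated, age-based archiving like the strategy above usually starts with a selection step: compute a cutoff from the retention policy and pick the records that fall outside it. A minimal Python sketch (the record shape and two-year retention window are illustrative):

```python
from datetime import date, timedelta

def select_for_archive(records, retention_days=730, today=None):
    """Return records older than the retention window.

    These are the candidates to move to a big object or external store,
    then delete from the live object.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=retention_days)
    return [r for r in records if r["closed_date"] < cutoff]

records = [
    {"Id": "a1", "closed_date": date(2021, 6, 1)},   # outside retention
    {"Id": "a2", "closed_date": date(2023, 6, 1)},   # still in retention
]
to_archive = select_for_archive(records, today=date(2024, 1, 1))
```

A scheduled job would run this selection periodically, copy the matches to the archive store, verify the copy, and only then delete them from the live object.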
8. Query optimization and SOQL best practices are critical for maintaining performance
Because the Salesforce platform is multi-tenant and customers don't get access to the database directly, when you run a report, a SOQL query, or a list view, your request is sent to what's called the query optimizer.
Query optimization techniques:
- Use selective filters to leverage indexes
- Avoid using negative operators (e.g., !=, NOT IN) in filter conditions
- Limit the number of fields in SELECT statements
- Use LIMIT clauses to restrict result set size
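"Selective" has a concrete meaning here: the query optimizer will only use an index when the filter is expected to return a small enough fraction of the object's rows. A simplified Python sketch of that decision, using the commonly documented thresholds of roughly 10% for custom indexes and 30% for standard indexes (the real rules taper off above one million records, so treat this as an approximation):

```python
def is_selective(filtered_count, total_count, custom_index=True):
    """Rough check of whether a filter is selective enough for an index scan.

    Approximates the documented optimizer thresholds: ~10% of rows for a
    custom index, ~30% for a standard index (first million records only).
    """
    threshold = 0.10 if custom_index else 0.30
    return filtered_count <= total_count * threshold

# 50k of 1M rows behind a custom index: selective, index is used.
a = is_selective(50_000, 1_000_000, custom_index=True)
# 200k of 1M rows behind a custom index: not selective, full scan instead.
b = is_selective(200_000, 1_000_000, custom_index=True)
```

This is also why negative operators hurt: the optimizer cannot estimate `!=` or `NOT IN` filters against an index, so they fall back to a full scan regardless of the actual row counts.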
SOQL best practices:
- Leverage indexed fields in filter conditions
- Use relationship queries to minimize the number of separate queries
- Implement proper error handling and bulkification in Apex code
- Utilize query hints for complex queries when appropriate
Performance monitoring:
- Use the Query Plan tool to analyze query execution plans
- Monitor SOQL query performance in debug logs
- Regularly review and optimize frequently used reports and list views
9. Salesforce's multi-tenancy model impacts data security and sharing calculations
Salesforce stores access to records in Sharing tables. Whenever records are inserted or the ownership changes, the platform has to insert/update these Sharing tables.
Sharing model implications:
- Organization-Wide Defaults (OWDs) impact sharing table size and calculation time
- Complex sharing rules can significantly affect performance in LDV scenarios
- Role hierarchy changes can trigger massive sharing recalculations
Optimization strategies:
- Use Public Read/Write OWDs during large data loads, then restrict access later
- Implement parallel sharing rule calculations for faster processing
- Leverage deferred sharing calculations for bulk changes to roles or territories
- Consider using a minimal role hierarchy to reduce sharing complexity
Security considerations:
- Regularly review and optimize sharing rules and role hierarchies
- Use permission sets instead of profiles for more granular access control
- Implement field-level security to protect sensitive data
10. Performance monitoring and testing are crucial for maintaining a healthy Salesforce org
Performance testing must be done in a very well-planned manner, ensuring that the scope of the testing is known, and that the goals of the tests are clear.
Performance monitoring tools:
- Salesforce Optimizer for identifying org health issues
- Event Monitoring for tracking user activities and system performance
- Debug logs and Developer Console for detailed performance analysis
Performance testing considerations:
- Test in full sandbox environments to simulate production conditions
- Use realistic data volumes and user loads in performance tests
- Consider multi-user scenarios to identify concurrency issues
- Test integrations and custom code under various load conditions
Best practices:
- Establish performance baselines and regularly monitor for deviations
- Implement proactive alerting for performance thresholds
- Conduct performance testing before major releases or significant org changes
- Regularly review and optimize custom code, workflows, and automations
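The baseline-and-alert practice above can be reduced to a small check: record a baseline latency for a key transaction, then flag runs whose median drifts beyond an agreed tolerance. A minimal Python sketch (the 20% tolerance is an illustrative threshold, not a platform value):

```python
def regressed(samples_ms, baseline_ms, tolerance=0.20):
    """Flag a performance regression when the median of recent latency
    samples exceeds the baseline by more than the tolerance."""
    ordered = sorted(samples_ms)
    median = ordered[len(ordered) // 2]
    return median > baseline_ms * (1 + tolerance)

# Median 110 ms against a 100 ms baseline: within 20% tolerance, no alert.
ok = regressed([100, 110, 130], baseline_ms=100)
# Median 150 ms against the same baseline: regression, raise an alert.
alert = regressed([100, 150, 160], baseline_ms=100)
```

Medians are used rather than averages so that one outlier sample (a cache miss, a cold start) does not trip the alert on its own.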