When you need to load a very large amount of data into Force.com quickly, you want to ensure that each insert is as efficient as possible. With appropriate preparation and post-processing, you can disable data validation and enrichment operations while loading, without compromising your data integrity or business rules.
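Part of that efficiency comes from submitting records in large batches rather than one at a time. Here's a minimal client-side sketch of the batching pattern; the 10,000-record batch size reflects a historical Bulk API limit, but the exact limits of the API version you target are an assumption here, so check its documentation.

```python
# Sketch of client-side batching for a bulk load. Batch size is an
# assumption based on historical Bulk API limits; verify against the
# API version you actually use.
from typing import Iterator, List


def chunk(records: List[dict], batch_size: int = 10_000) -> Iterator[List[dict]]:
    """Split records into fixed-size batches for submission."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]


records = [{"Name": f"Account {i}"} for i in range(25_000)]
batches = list(chunk(records))
# 25,000 records split into batches of 10,000, 10,000, and 5,000
```

Each batch would then be submitted as one Bulk API job batch, keeping round trips (and per-insert overhead) to a minimum.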
Tag Archives: architecture
Want to scale your Salesforce data set to new heights? Then check out this hands-on tutorial that shows you how to implement an external application to store historical records. You also learn how to integrate its UI and user authentication into Salesforce so that your users can analyze historical data right alongside operational data. This practical step-by-step guide uses a combination of Salesforce Platform technologies, including Force.com, Heroku, and Force.com Canvas.
If you’ve built an application on the Force.com platform, you want to deliver a great experience to your users. But how can you tell if your applications are performing well and will continue to perform well? With the Developer Console, you can use performance profiling to identify and fix performance hotspots, and ensure that your applications are both fast and scalable.
You are planning a Force.com implementation with large volumes of data. Your data model is in place, all your code is written and has been tested, and now it’s time to load the objects, some of which have tens of millions of records. What is the most efficient way to get all those records into the system?
This is the second entry in the six-part series of blog posts about data loading for very large enterprise deployments, covering how to load data into a lean configuration.
If you regularly perform Salesforce implementations with objects that store a lot of records, you probably already know about the strategies that you can use to appropriately manage and distribute your data. Some of the more obvious strategies include using indexes, skinny tables, archival strategies, and even divisions.
However, even if you choose the most appropriate strategy or strategies, you might miss a lesser-known “silent killer” within your architecture: lookup skew.
Read this blog post to learn both how lookup skew affects objects with large volumes of data and what you can do now to minimize its effects.
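Lookup skew arises when a very large number of child records all reference the same parent record through a lookup field. One common mitigation is to spread those children across a pool of parent records instead. The round-robin sketch below illustrates that idea; the record IDs and the specific distribution strategy are illustrative assumptions, not the only fix the post discusses.

```python
# Sketch: mitigate lookup skew by distributing child records evenly
# across a pool of parents, rather than pointing millions of children
# at a single parent record. IDs here are illustrative placeholders.
from itertools import cycle
from typing import Dict, List


def assign_parents(child_ids: List[str], parent_ids: List[str]) -> Dict[str, str]:
    """Round-robin children across parents to keep per-parent counts even."""
    rotation = cycle(parent_ids)
    return {child: next(rotation) for child in child_ids}


children = [f"child-{i}" for i in range(6)]
parents = ["parent-A", "parent-B", "parent-C"]
mapping = assign_parents(children, parents)
# Each of the three parents ends up with two children
```

Keeping per-parent child counts even reduces record-lock contention on the parent side during high-volume inserts and updates.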
If you’re an advanced Force.com developer or architect, or someone who’s just interested in developing applications on the Force.com platform, get a peek behind the Force.com query optimizer curtain later this month. Register for the Inside the Force.com Query Optimizer webinar, which will be co-hosted by Technical Enablement’s own John Tan on April 23, 2013, at 7 a.m. and 10 a.m. PST.
When you have a good understanding of how something works, you greatly increase the likelihood that you’ll have a better experience using that something. In this blog post, I’ll expand on that thought, pointing out how it applies to Force.com application design, development, and implementation.
Understanding Force.com internals can give you an edge in building applications that perform well, especially when your organization has a large volume of data. This blog post can help you get that edge. Read on to better understand formula fields, field indexes, and date field logic so that you can deliver SOQL queries, list views, and reports that fly rather than crawl.
Managing your Salesforce organization’s data is a crucial part of keeping your organization healthy, and you might have heard about one tool that can help it stay fit: skinny tables. Read this post to learn how skinny tables work, how they can help you with large volumes of data, and what you should consider before using them.