Testing Best Practices in Salesforce

When migrating your Apex code from a Sandbox to a Production environment, the Force.com platform requires your code to pass a minimum 75% code coverage threshold in unit testing. Code coverage has been a best practice in the coding world for a long time, for obvious reasons; Salesforce has made it mandatory on their platform so that your code doesn't misbehave once deployed to a Production environment with live data.

In this write-up we will explore testing strategies: common pitfalls and best practices.

  • Test Classes run in parallel

When tests are initiated from the Salesforce user interface (including the Developer Console), test classes run in parallel. Individual test methods inside a test class run serially, but not necessarily in order. Although test classes run in parallel by default to speed up test run time, parallel execution can sometimes lead to data contention issues, typically when the tests access your org's data. Consider the following code sample:

Test Class 1

1A  @isTest(SeeAllData=true)
2A  public class TestClass1 {
3A      private static testMethod void setAccount() {
4A          Account a = [SELECT Id, Name FROM Account LIMIT 1]; // select a single account
5A          a.Name = 'ABC';
6A          update a;
7A          // check logic
8A      }
9A  }

Test Class 2

1B  @isTest(SeeAllData=true)
2B  public class TestClass2 {
3B      private static testMethod void setAccount2() {
4B          Account a = [SELECT Id, Name FROM Account LIMIT 1]; // select a single account
5B          a.Name = 'XYZ';
6B          update a;
7B          // check logic
8B      }
9B  }


In the above code, if lines 4A and 4B happen to lock the same account record in your database, the result could be an UNABLE_TO_LOCK_ROW error. To avoid this, first try to avoid using your org data as test data: it is a best practice to create your own test data in a @testSetup method for each test class. That data is isolated per class and exists only for the duration of the test run. Another option is to disable parallel test execution by going to Setup | Develop | Apex Test Execution | Options… | Disable Parallel Apex Testing. Finally, you could refactor the code so that both test methods live in the same class, since methods within a class do not run in parallel with each other.
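As a sketch, the @testSetup approach could look like this (the class name and field values here are hypothetical):

```apex
@isTest
private class TestClassWithSetup {
    // Runs once before the test methods in this class; the records it
    // creates are isolated test data, invisible to other test classes.
    @testSetup
    static void createTestData() {
        insert new Account(Name = 'Setup Account');
    }

    private static testMethod void setAccount() {
        // This query sees only the data created in @testSetup,
        // so there is no contention with live records in your org.
        Account a = [SELECT Id, Name FROM Account WHERE Name = 'Setup Account' LIMIT 1];
        a.Name = 'ABC';
        update a;
        System.assertEquals('ABC', [SELECT Name FROM Account WHERE Id = :a.Id].Name);
    }
}
```

Because this class no longer needs SeeAllData=true, parallel test runs cannot collide on the same org record.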

  • Code coverage calculation

Code coverage percentage is calculated whenever tests are executed. The general formula for code coverage percentage calculation is:

Code coverage percentage = Number of covered lines / (Number of covered lines + Number of uncovered lines) × 100

For example, a class with 90 covered lines and 30 uncovered lines has 90 / (90 + 30) × 100 = 75% coverage.

Tests are run in two situations: explicitly, when you run them through the Developer Console or from the Setup menu, and implicitly, when you deploy your code to another org. The code coverage percentage is calculated in both cases, but it is treated differently. In the first case the percentage is persisted to a code coverage table for the benefit of the testing tools. In the second case it isn't stored anywhere. The reason is that during deployment, apart from the Apex code being deployed, some metadata may also be modified in the target org (for example, a lookup relationship being changed to Master-Detail). If such metadata changes fail (a case in point: some child records lacking parents while converting from lookup to Master-Detail), they could roll back the entire deployment. If the code coverage percentage were stored in this case and the rollback happened, the stored percentage could be counting executable lines of Apex code that no longer exist, since they were part of the deployment that rolled back.

In short, the code coverage percentage that is stored when you manually run tests isn’t referred to when deploying code, instead, the platform implicitly recalculates it on the fly.

Things not considered in the code coverage calculation include comments, blank lines, System.debug() statements, and curly brackets that appear alone on a line. Multiple executable statements appearing on a single line are counted as one line, while a single executable statement spanning multiple lines is counted as multiple lines.
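As an illustration, here is how those rules apply to a small, hypothetical Apex snippet:

```apex
public class CoverageExample {
    public static Integer compute(Integer n) {
        // Comments and blank lines are not counted.

        System.debug('input: ' + n);  // System.debug() statements are not counted.
        Integer a = n * 2; a += 1;    // Two statements on one line count as one covered line.
        a = a
            + 3;                      // One statement split across lines counts as multiple lines.
        return a;
    }                                 // Curly brackets alone on a line are not counted.
}
```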

Some best practices that you should adhere to are as follows:

  • Test for good data, bad data and bulk data. Your test classes should have test methods covering both the desired and undesired situations, and they should verify the behavior for multiple records.
  • If multiple tests run in a single transaction, remember that all of them consume a single set of governor limits, which could cause some tests to fail from limit exhaustion. Ensure that the code under test gets its own fresh set of governor limits by using the Test.startTest() and Test.stopTest() methods.
  • To ensure users see only the data they are supposed to see when Apex code executes in the background, test with sharing rules enforced. You can simulate specific users with the System.runAs() method.
  • If you are using static resources in your tests, ensure that those resources are also available on the production org so that your tests don’t fail due to unavailability of the resource.
  • Don't hardcode IDs in your tests, since record IDs differ from org to org; an ID that is valid in your sandbox will not exist in production.
  • Have only one test class per Apex class you are testing.
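The startTest()/stopTest() and runAs() recommendations above can be sketched as follows (the profile name and user details are hypothetical placeholders):

```apex
@isTest
private class GovernorLimitsAndSharingTest {
    private static testMethod void runAsStandardUser() {
        Profile p = [SELECT Id FROM Profile WHERE Name = 'Standard User' LIMIT 1];
        User u = new User(Alias = 'tuser', LastName = 'Tester',
                          Email = 'tuser@example.com',
                          UserName = 'tuser@example.com.testorg',
                          ProfileId = p.Id,
                          EmailEncodingKey = 'UTF-8',
                          LanguageLocaleKey = 'en_US',
                          LocaleSidKey = 'en_US',
                          TimeZoneSidKey = 'America/Los_Angeles');

        System.runAs(u) {
            // Code between startTest() and stopTest() gets a fresh set of
            // governor limits, separate from any test setup done above.
            Test.startTest();
            // ... invoke the code under test here ...
            Test.stopTest();
        }
    }
}
```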

Of course, numerous general testing best practices also apply to Apex tests, and you should try to follow all of them. The above write-up mentions but a few that are specific to testing in Salesforce.

Very basic understanding of SSL

The purpose of this post is to give a very high-level, non-technical overview of the transactions that happen when using the SSL (Secure Sockets Layer) protocol. More details of the actual messages and their contents can be found on the internet in abundance.

Whenever I have tried to search for a simple explanation of SSL, I haven't found one, since most of the literature on SSL carries a lot of technical complexity. Hence this is my attempt to explain the basic flow of messages when a client tries to connect to a server using the SSL protocol for sensitive data transfer.

Read more


Understanding Outbound Messaging in Salesforce

Although most of the knowledge presented here has been taken from the Help & Training documentation provided by Salesforce.com, I have attempted to simplify it and put some important facts about Outbound Messages in focus for the reader's consideration.

An Outbound Message is an XML message that can be sent as an action of a Workflow Rule. By definition:

“Outbound Messaging allows you to specify that changes to fields within Salesforce can cause messages with field values to be sent to designated external servers.”

Use cases for the Outbound Messaging:

Let's first see how we could use Outbound Messaging. Here is a simple example:

When an Opportunity is won, we would want the order fulfillment system to get all the details of the Opportunity, such as the billing and shipping addresses and the Line Items, to complete the order. Here we assume that the order fulfillment system is a legacy system running outside Salesforce servers, perhaps on an in-house ERP server.

How can we implement the Outbound message mentioned in the above example?

Some important points to consider when using Outbound messages:

  • Outbound Messaging uses the notifications() call to send messages to a designated endpoint when triggered by a workflow rule.
  • A single SOAP message can contain up to 100 notifications.
  • A SessionId can be included in the Outbound Message, which the client can then use to call back into Salesforce.
  • Each notification contains the object ID and a reference to the associated sObject data.
  • If the information in the object changes after the notification is queued but before it is sent, only the updated information will be delivered, meaning intermediate changes do not reach the client.
  • If you issue multiple discrete calls, the calls may be batched together into one or more SOAP messages.
  • Messages are queued locally; a separate background process performs the actual sending, to preserve message reliability.
  • If the endpoint is unavailable, messages stay in the queue until sent successfully or until they are 24 hours old. After 24 hours, messages are dropped from the queue.
  • If a message cannot be delivered, the interval between retries increases exponentially, up to a maximum of two hours between retries.
  • Messages are retried independent of their order in the queue, which may result in messages being delivered out of order.
  • An audit trail is not possible, since messages might be delivered multiple times, or not at all.
  • Because a message may be delivered more than once, your listener client should check the notification ID before processing it, to avoid handling duplicates.


Having configured the Outbound Message to fire, we have to take the Outbound Message WSDL and build a listener on the client side. In the above example we would have to build a listener for the order fulfillment system to receive and process the message. The steps for building the listener can be found in any of the documents below:





Is the Outbound Messaging a truly declarative solution?

In my opinion it is NOT, since we have to do some coding to build the listener on the client side. A developer might prefer to use the raw web services API provided by Apex to build both the outgoing and incoming web services instead of using Outbound Messages, although they do simplify message creation.


List views in Salesforce

Have you ever wanted a simple list of records for a marketing or pre-sales activity without having to generate a report? Well, List Views are your answer. List Views let you create criteria-based lists of records, much like stored queries. The best thing about List Views is that they are persistent: you can run the same List View without recreating it, and every time you run it, the results reflect changes to the database, such as new records being added and existing records being modified or deleted.

Creating the List View: Creating a List View is quite easy. Below are the steps to create a List View for Large Customer Accounts (Accounts where Type is Customer and Annual Revenue is greater than $8,000,000).

Read more

How to create a blog website

Well, the title of this post does not exactly reflect what I am about to write. It should rather have read "How I created this blog website", but the former is a more search-friendly term. I am writing this post as a response to some frequently asked questions on my other posts regarding this blog website.

To tell you the truth, I am no web designer; this website just happened out of trial and error. I definitely wanted something like this, since I was eager to have a personal website and not use the blogging capabilities provided by popular websites such as WordPress, Blogger, Tumblr or Google+.

Read more


External objects in Salesforce

Salesforce.com allows you to access data stored in external data sources from within your organization. Previously this was possible only by writing your own adapters in Apex that used web services to access that data. This method of accessing external data within your organization is still valid and is widely used by customers who wish to have a single view of their data, internal or external, but it is largely programmatic and requires developer skills.

External Objects allow us to map tables in external data sources into Salesforce, enabling a federated search of all your data and content. External Objects are similar to Custom Objects in that they are defined declaratively; unlike Custom Objects, however, they map to data outside your Salesforce organization. You can define up to 100 External Objects in Developer, Enterprise, Performance and Unlimited editions.

External Objects are Customizable and Searchable, but not Reportable or Securable. The visibility of an External Object's records is not controlled by the Salesforce organization's security model; instead it is controlled by the external system, and it reflects within your organization according to the Identity Type chosen while defining the External Data Source.

External Data Source:

External objects rely on an external data source definition to connect with the external system’s data. Each external object definition maps to a data table on the external system. Each of the external object’s fields maps to a table column on the external system.

External Data Sources can use the following types:

  • Files Connect: used for Google Drive, SharePoint 2010 or 2013, SharePoint Online, or OneDrive for Business.
  • Lightning Connect: used with OData 2.0 or 4.0 to connect to external systems (e.g. an ERP), with the Salesforce connector for connecting two Salesforce organizations (a pub-sub model), or with custom Apex adapters.
  • Simple URL: used for accessing data from another web domain.

More information about these types and about defining an External Data Source can be found in the following Help and Training link: Defining External Data Source

Two ways of Defining External Objects:

Once you define an External Data Source, you get an option to sync the data source within the Salesforce organization. 'Validate and Sync', which is a one-time activity, automatically creates External Objects corresponding to the tables in the external system, along with custom fields on those External Objects corresponding to the table columns that are compatible with a Salesforce metadata field type. Since this is a one-time activity, it does not reflect changes in the external system's schema that occur after the sync.

The second way is to create the External Objects manually, which lets you customize the external object names and create the custom fields yourself.

The steps for defining the External Objects can be found in the following help and training link: Defining External Object

External Object Relationships:

You can create lookup relationships with External Objects within Salesforce. There are three types of lookup relationships that can be created:

  • Lookup Relationship
  • External Lookup Relationship
  • Indirect Lookup Relationship


The following summarizes the types of relationships that are available to external objects:

  • Lookup: child objects can be Standard, Custom or External; parent objects can be Standard or Custom; records are matched on the 18-character Salesforce record ID.
  • External lookup: child objects can be Standard, Custom or External; parent objects are External; records are matched on the External ID standard field.
  • Indirect lookup: child objects are External; parent objects can be Standard or Custom; you select a custom field with the External ID and Unique attributes as the parent field for matching records.

*Courtesy of Help and Training.

You can find more information about External Object Relationships and considerations in the following Help and Training link: External Object Relationships

Writable External Objects:

You can create, edit and delete records in the external data source by modifying the corresponding External Object records within your Salesforce organization. This capability is only available when using Lightning Connect with OData 2.0 or 4.0, or with custom Apex adapters. If you chose Lightning Connect for OData 2.0 or 4.0 as the type when defining the External Data Source, you get a checkbox to enable this feature. It is not available for the other External Data Source types.


External Objects simplify viewing your external data within Salesforce by providing a declarative solution. You can use External Objects with Lightning Connect when:

  • You have a large amount of data that you don’t want to copy into your Salesforce organization.
  • You need only small amounts of data at any one time.
  • You want real-time access to the latest data.


Happy Reading!

Difference between Pointers and References

Pointers give us the benefit of avoiding duplication of data. For example, if you have a structure (or class) that you have instantiated in memory in one of your methods, and you then pass that instance to another method, passing by value would create a copy of the instance in the called method. Instead, you could pass a pointer to the instance, and then both the calling and the called method would be working on the same copy of the data.
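Although this post talks in terms of C-style pointers, the same idea can be sketched in Apex, where object variables hold references: passing an sObject to a method passes a reference to the instance, so the called method and the caller work on the same data. A minimal, hypothetical sketch:

```apex
public class ReferenceDemo {
    private static void rename(Account acct) {
        // 'acct' refers to the caller's instance, not a copy,
        // so this change is visible to the caller.
        acct.Name = 'Renamed';
    }

    public static void demo() {
        Account a = new Account(Name = 'Original');
        rename(a);
        System.assertEquals('Renamed', a.Name); // same instance, no duplication
    }
}
```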
Read more