Salesforce to Postgres

This page provides you with instructions on how to extract data from Salesforce and load it into PostgreSQL. (If this manual process sounds onerous, check out Stitch, which can do all the heavy lifting for you in just a few clicks.)

What is Salesforce?

Salesforce, a cloud-based software-as-a-service platform, is the most popular CRM application in use today. Salesforce is amazingly customizable, has tons of integration functionality, and includes almost too many bells and whistles to count. Companies can use it for everything from account planning to time management and team collaboration.

What is PostgreSQL?

PostgreSQL, known by most simply as Postgres, is a hugely popular object-relational database management system (ORDBMS). It labels itself as "the world's most advanced open source database," and for good reason. The platform, despite being available for free via an open source license, offers enterprise-grade features including a strong emphasis on extensibility and standards compliance.

It runs on all major operating systems, including Linux, Unix, and Windows. It is fully ACID-compliant and has full support for foreign keys, joins, views, triggers, and stored procedures (in multiple languages). Postgres is often the best tool for the job as a back-end database for web systems and software tools, and most major cloud vendors offer hosted deployments. Its syntax also forms the basis for querying Amazon Redshift, which makes migration between the two systems relatively painless and makes Postgres a good "first step" for developers who may later expand into Redshift's data warehouse platform.

Getting data out of Salesforce

Step one is to get all of that data out of Salesforce. Salesforce provides many APIs for its products that can deliver data on accounts, leads, tasks, and more. You can find a list of these APIs in one of the company's helpdesk posts, along with guidance on when and how to use each one. By looking through that post, you can get an idea of which API makes the most sense for your use case.

For our purposes, we'll use the REST API with SOQL (Salesforce Object Query Language), but the same data is available using other protocols, including streaming for real-time receipt of data.
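
To make that concrete, here's a minimal sketch of querying the REST API from Python using the requests library. The instance URL and access token are placeholders; in practice you'd obtain the token through one of Salesforce's OAuth flows.

import requests

INSTANCE_URL = "https://yourInstance.salesforce.com"  # hypothetical instance URL
ACCESS_TOKEN = "00D...placeholder..."                 # obtained via an OAuth flow

# Run a SOQL query through the REST API's query endpoint.
response = requests.get(
    INSTANCE_URL + "/services/data/v20.0/query",
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    params={"q": "SELECT Id, Name FROM Account"},
)
response.raise_for_status()
payload = response.json()

for record in payload["records"]:
    print(record["Name"])

# Large result sets are paginated: when "done" is false, follow
# payload["nextRecordsUrl"] to fetch the next batch of records.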

Sample Salesforce data

The Salesforce REST API can return JSON- or XML-formatted data depending on your preference. Here's what a sample response might look like in JSON format:

{
    "done": true,
    "totalSize": 14,
    "records": [
        {
            "attributes": {
                "type": "Account",
                "url": "/services/data/v20.0/sobjects/Account/001D000000IRFmaIAH"
            },
            "Name": "Test 1"
        },
        {
            "attributes": {
                "type": "Account",
                "url": "/services/data/v20.0/sobjects/Account/001D000000IomazIAB"
            },
            "Name": "Test 2"
        },

        ...

    ]
}

Loading data into Postgres

Once you have identified all of the columns you will want to insert, you can use the CREATE TABLE statement in Postgres to create a table that can receive all of this data. Then, Postgres offers a number of methods for loading in data, and the best method varies depending on the quantity of data you have and the regularity with which you plan to load it.
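
Here's a minimal sketch of what that might look like using Python and the psycopg2 driver. The connection string, table name, and columns are assumptions based on the Account fields in the sample response above.

import psycopg2

# Hypothetical connection details -- substitute your own.
conn = psycopg2.connect("dbname=warehouse user=etl")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS salesforce_account (
        id   varchar(18) PRIMARY KEY,  -- Salesforce record IDs are 15 or 18 characters
        name text
    )
""")
conn.commit()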

For simple, day-to-day data insertion, running INSERT queries against the database directly is the standard SQL method for getting data added. Documentation on INSERT queries and their brethren can be found in the Postgres documentation here.
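
Continuing the sketch above, a parameterized INSERT might look like the following. The upsert behavior via ON CONFLICT (available in Postgres 9.5 and later) is an assumption about how you want to handle reruns that pull the same record twice.

# Reusing cur and conn from the sketch above; values come from the sample response.
cur.execute(
    """
    INSERT INTO salesforce_account (id, name)
    VALUES (%s, %s)
    ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name
    """,
    ("001D000000IRFmaIAH", "Test 1"),
)
conn.commit()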

For bulk insertions, which you'll likely want if you have a high volume of data to load, the COPY command becomes quite useful: it lets you load large sets of data into Postgres without running a series of INSERT statements. Documentation can be found here.
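
Here's one way that might look from Python, again reusing the connection from the earlier sketch. psycopg2's copy_expert streams the rows through a single COPY statement, which is far faster than row-by-row INSERTs for large volumes.

import io

# Stage rows as CSV in memory; in a real pipeline you'd write the API
# results to a file or buffer in this format.
rows = io.StringIO(
    "001D000000IRFmaIAH,Test 1\n"
    "001D000000IomazIAB,Test 2\n"
)
cur.copy_expert(
    "COPY salesforce_account (id, name) FROM STDIN WITH (FORMAT csv)",
    rows,
)
conn.commit()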

The Postgres documentation also provides a helpful overall guide for conducting fast data inserts, populating your database, and avoiding common pitfalls in the process. You can find it here.

Keeping Salesforce data up to date

At this point you've coded up a script or written a program to get the data you want and successfully moved it into your data warehouse. But how will you load new or updated data? It's not a good idea to replicate all of your data each time you have updated records. That process would be painfully slow and resource-intensive.

Instead, identify key fields that your script can use to bookmark its progression through the data, so it can pick up where it left off as it looks for updated records. Fields whose values always increase, such as updated_at or created_at timestamps (in Salesforce, SystemModstamp and CreatedDate), work best for this. When you've built in this functionality, you can set up your script as a cron job or continuous loop to get new data as it appears in Salesforce.
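
Here's a rough sketch of what that bookmarking logic might look like in SOQL. SystemModstamp is the last-modified timestamp Salesforce maintains on each record; the bookmark value and where you persist it between runs are assumptions.

# Hypothetical timestamp saved at the end of the last run.
bookmark = "2021-01-01T00:00:00Z"

# SOQL datetime literals are not quoted.
soql = (
    "SELECT Id, Name, SystemModstamp FROM Account "
    "WHERE SystemModstamp > " + bookmark + " "
    "ORDER BY SystemModstamp"
)

# Run soql through the REST API as shown earlier, upsert the results,
# then save the largest SystemModstamp you saw as the next bookmark.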

And remember, as with any code, once you write it, you have to maintain it. If Salesforce modifies its API, or the API sends a field with a datatype your code doesn't recognize, you may have to modify the script. If your users want slightly different information, you definitely will have to.

Other data warehouse options

PostgreSQL is great, but sometimes you need to optimize for different things when you're choosing a data warehouse. Some folks choose to go with Amazon Redshift, Google BigQuery, or Snowflake, which are cloud data warehouses that use similar SQL syntax, or Panoply, which works with Redshift instances. If you're interested in seeing the relevant steps for loading data into one of these platforms, check out To Redshift, To BigQuery, To Snowflake, and To Panoply.

Easier and faster alternatives

If all this sounds a bit overwhelming, don't be alarmed. If you have all the skills necessary to go through this process, chances are building and maintaining a script like this isn't a very high-leverage use of your time.

Thankfully, products like Stitch were built to solve this problem automatically. With just a few clicks, Stitch starts extracting your Salesforce data via the API, structuring it in a way that is optimized for analysis, and inserting that data into your PostgreSQL data warehouse.