How to Load data from Intercom to MS SQL Server

Extract your Intercom data

Intercom exposes a rich REST API for interacting with its services. There are two main reasons to use the Intercom REST API:

  1. To pull data out of Intercom and use it for analytics purposes
  2. To push data into Intercom to enrich the information it holds about your customers and enhance its service

In this post, we’ll focus mainly on pulling data using the API and using it to enrich our data warehouse with data generated from our interactions with our customers.

The main entities of the Intercom API are the following:

  • Users – the primary way of interacting with Intercom.
  • Leads – represent logged-out users of your application.
  • Companies – allow you to represent commercial organizations using your product.
  • Tags – A tag allows you to label your users and companies and list them using that tag.
  • Segments – A segment is a group of your users defined by the rules that you set.
  • Notes – Notes allow you to annotate and comment on your users.
  • Events – Events are how you can submit user activity to Intercom.
  • Counts – You can get counts of users and companies filtered by certain criteria.
  • Conversations – Conversations are how you can communicate with users in Intercom.

At this point, it has to be noted that not all of the above entities can be pulled from the Intercom API. For example, Events can only be pushed into Intercom; it’s not possible to extract them again. So if you also plan to use Intercom to track the behavior of your users, keep in mind that, contrary to services like Mixpanel, it’s not possible to pull user events out of the system.

A good strategy for ensuring that your user activity will be pushed to all the different services that you need, and that you will always have access to the data, is to use a service like Segment.

Intercom exposes a RESTful web API, which means that we interact with its resources through HTTP verbs like POST and GET by using an HTTP client. Intercom also offers SDKs built around an HTTP client for a number of popular languages.

Pull Data from the Intercom REST API

A typical use case for pulling data out of Intercom is to fetch all your users together with all the conversations you have had with each one. Then you can load this information into your data warehouse and enhance your analytics with the additional interactions you have had with your customers. To do that, you first need to get all your users. With cURL, you can do that in the following way:

SH
curl https://api.intercom.io/users \
-u pi3243fa:da39a3ee5e6b4b0d3255bfef95601890afd80709 \
-H 'Accept: application/json'

A typical result will look like this:

JSON
{
  "type": "user.list",
  "total_count": 105,
  "users": [
    {
      "type": "user",
      "id": "530370b477ad7120001d",
      ...
    },
    ...
  ],
  "pages": {
    "next": "https://api.intercom.io/users?per_page=50&page=2",
    "page": 1,
    "per_page": 50,
    "total_pages": 3
  }
}

Note the pages object in the response: results are paginated, so to retrieve every user, we need to keep following the next URL until all total_pages have been fetched. In the same way, we can also extract a full list of the conversations that have taken place on Intercom by doing the following:

SH
curl 'https://api.intercom.io/conversations?type=admin&admin_id=25&open=true' \
-u pi3243fa:da39a3ee5e6b4b0d3255bfef95601890afd80709 \
-H 'Accept: application/json'

and a typical result will look like the following:

JSON
{
  "type": "conversation.list",
  "conversations": [
    {
      "type": "conversation",
      "id": "147",
      "created_at": 1400850973,
      "updated_at": 1400857494,
      "user": {
        "type": "user",
        "id": "536e564f316c83104c000020"
      },
      "assignee": {
        "type": "admin",
        "id": "25"
      },
      "conversation_message": {
        "type": "conversation_message",
        "subject": "",
        "body": "<p>Hi Alice,</p>\n\n<p>We noticed you using our Product, do you have any questions?</p>\n<p>- Jane</p>",
        "author": {
          "type": "admin",
          "id": "25"
        },
        "attachments": [
          {
            "name": "signature",
            "url": "http://someurl.com/signature.jpg"
          }
        ]
      }
    }
  ]
}

As we can see, each conversation contains a user object with an id, which lets us associate the conversations with the users we extracted earlier. Of course, in order to do that in our data warehouse, we need to map the above structures to the data model our repository follows, respecting both the schema and the data types.

Then we can write a pipeline that extracts the data, transforms it into the model of our repository, and loads it by following the instructions below. Of course, if something changes in the Intercom API, the pipeline will break, and we will have to fix it.
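
For illustration, the two responses above could be mapped to two tables, with each conversation referencing the user it belongs to. The following is a minimal sketch; the table and column names are assumptions for this post, not something prescribed by Intercom or SQL Server:

SQL
-- Illustrative target tables; names and columns are assumptions, not Intercom's schema
CREATE TABLE dbo.intercom_users (
    id         VARCHAR(64) NOT NULL PRIMARY KEY, -- Intercom user id, e.g. '530370b477ad7120001d'
    email      NVARCHAR(255),                    -- assumed attribute pulled from the user object
    created_at DATETIME2,
    updated_at DATETIME2
);

CREATE TABLE dbo.intercom_conversations (
    id          VARCHAR(64) NOT NULL PRIMARY KEY,                       -- e.g. '147'
    user_id     VARCHAR(64) NOT NULL REFERENCES dbo.intercom_users(id), -- links back to the user
    assignee_id VARCHAR(64),                                            -- the admin handling it
    subject     NVARCHAR(1000),
    body        NVARCHAR(MAX),
    created_at  DATETIME2,
    updated_at  DATETIME2
);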

Use Webhooks to Push Events from Intercom to MS SQL Server

Intercom also supports the definition of webhooks, where you register for certain events and the system pushes a message to a URL you define whenever the event is triggered. So, for example, you might define a webhook that pushes a message every time a new conversation starts with one of your customers.

By doing this, it is possible to create a near real-time streaming load process for your data warehouse. To implement something like that, though, you need to take into consideration the limitations of both ends and ensure the delivery semantics that your use case requires from the data management infrastructure you will build.
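
One possible landing pattern on the SQL Server side is a table that stores each webhook notification's raw JSON for later parsing with SQL Server's built-in JSON functions (available since SQL Server 2016). This is a minimal sketch; the table name and the $.topic path are assumptions about the payload:

SQL
-- Hypothetical landing table for raw webhook notifications
CREATE TABLE dbo.intercom_webhook_events (
    event_id    INT IDENTITY(1,1) PRIMARY KEY,
    payload     NVARCHAR(MAX) NOT NULL,           -- raw JSON body of the notification
    received_at DATETIME2 DEFAULT SYSUTCDATETIME()
);

-- Later, pull individual fields out of the stored JSON
SELECT event_id,
       JSON_VALUE(payload, '$.topic') AS topic,   -- assumes the payload carries a topic field
       received_at
FROM dbo.intercom_webhook_events;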

For more information, you can check the webhooks section on the Intercom API documentation.


Intercom Data Preparation for Microsoft SQL Server

As with every relational database, SQL Server requires a well-defined database schema before we can start populating it with data. Data is organized in schemas, which are distinct namespaces that database objects belong to.

The most common database objects are, of course, tables, each of which has a number of columns with a declared data type. MS SQL Server supports a large number of different data types, which gives us great flexibility in expressing the data we have while optimizing our data warehouse.

When working with data coming from web services, where data is usually serialized in JSON, it is important to map each field to the right data type. As changing data types later might require database downtime, it is important to spend enough time up front thinking about the proper data type assignments.

For example, dates in JSON are just strings, but by transforming the raw string data into an appropriate date type when storing it in the database, we unlock far richer analytics. A typical strategy for loading data from Intercom into an MS SQL Server database is to create a schema where you map each API endpoint to a table. Each key inside an Intercom API endpoint response should be mapped to a column of that table, with the right conversion to an SQL Server compatible data type.

Of course, as the data types returned by the Intercom API might change, you will need to adapt your database tables accordingly; there’s no such thing as automatic data typecasting. After you have a complete and well-defined data model or schema for MS SQL Server, you can move forward and start loading your data into the database.
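
For instance, Intercom serializes timestamps such as created_at as Unix epoch seconds (1400850973 in the conversation shown earlier). A hedged sketch of converting such a value to a native DATETIME2 while loading:

SQL
-- Convert Unix epoch seconds (as delivered by Intercom) into a DATETIME2 value
SELECT DATEADD(SECOND, 1400850973, CAST('1970-01-01' AS DATETIME2)) AS created_at;
-- -> 2014-05-23 13:16:13 (UTC)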


Load data from Intercom to MS SQL Server

As a feature-rich and mature product, MS SQL Server offers a large and diverse set of methods for loading data into a database. One way of importing data into your database is the SQL Server Import and Export Wizard: through a visual interface, you can bulk load data from a number of supported data sources.

You can import data from another SQL Server, an Oracle database, flat files, an Access data source, PostgreSQL, MySQL, and Azure Blob Storage. Especially if you are using a managed version of MS SQL Server on Azure, you should definitely consider utilizing the Azure Blob Storage connection.

In this way, you load data as blobs on Azure, and your MS SQL Server database syncs with them through the Import and Export Wizard.
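
If you go the Azure Blob Storage route, newer SQL Server versions (2017+) and Azure SQL can also read blobs directly with BULK INSERT through an external data source. This is a sketch, assuming the exported CSVs live in a container and a database scoped credential has already been created; the account, container, and credential names are hypothetical:

SQL
-- One-time setup: point the database at the blob container
-- (assumes a database scoped credential named AzureBlobCredential already exists)
CREATE EXTERNAL DATA SOURCE IntercomBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://myaccount.blob.core.windows.net/intercom-exports', -- hypothetical account/container
    CREDENTIAL = AzureBlobCredential
);

-- Load a CSV export of users straight from the container
BULK INSERT dbo.intercom_users
FROM 'users.csv'
WITH (DATA_SOURCE = 'IntercomBlobStorage', FORMAT = 'CSV', FIRSTROW = 2);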

Another way of importing bulk data into SQL Server, both on Azure and on-premises, is the bcp utility, a command-line tool built specifically for bulk loading and unloading data from an MS SQL database.

Finally, and for compatibility reasons, especially if you are managing databases from different vendors, you can use BULK INSERT SQL statements.
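
For example, assuming the users have already been exported to a local CSV file, a minimal BULK INSERT into the illustrative table from earlier might look like this:

SQL
BULK INSERT dbo.intercom_users
FROM 'C:\data\intercom_users.csv'  -- hypothetical export file
WITH (
    FIELDTERMINATOR = ',',  -- column delimiter in the CSV
    ROWTERMINATOR   = '\n', -- one record per line
    FIRSTROW        = 2     -- skip the header row
);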

In a similar way, and as with other databases, you can also use standard INSERT statements to add data row by row directly to a table. This is the most basic and straightforward way of adding data to a table, but it doesn’t scale well with larger datasets, so for bulk loads you should prefer one of the previous methods.
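
For completeness, a row-by-row load into the hypothetical users table from earlier would look like this:

SQL
-- Single-row insert; the email value is made up for illustration
INSERT INTO dbo.intercom_users (id, email, created_at)
VALUES ('530370b477ad7120001d', 'alice@example.com', '2014-05-23T13:16:13');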

Updating your Intercom data on MS SQL Server

As you generate more data on Intercom, you will need to update your older data in the MS SQL Server database. This includes both new records and older records that, for any reason, have been updated on Intercom.

You will need to periodically check Intercom for new data and repeat the process described previously, updating your currently available data if needed. Updating an already existing row in a SQL Server table is achieved with UPDATE statements.
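
In practice, inserts and updates are often handled together as an upsert. Here is a sketch using T-SQL’s MERGE, assuming each fresh pull from Intercom first lands in a hypothetical staging table:

SQL
MERGE dbo.intercom_users AS target
USING dbo.intercom_users_staging AS source -- hypothetical staging table holding the latest pull
    ON target.id = source.id
WHEN MATCHED THEN
    UPDATE SET target.email      = source.email,
               target.updated_at = source.updated_at
WHEN NOT MATCHED THEN
    INSERT (id, email, created_at, updated_at)
    VALUES (source.id, source.email, source.created_at, source.updated_at);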

Another issue that you need to take care of is the identification and removal of duplicate records in your database. Whether because Intercom lacks a mechanism to identify new and updated records or because of errors in your data pipelines, duplicate records might be introduced into your database.
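
A common cleanup pattern keeps only the most recent copy of each record by ranking duplicates with ROW_NUMBER, assuming a staging table without a uniqueness constraint:

SQL
WITH ranked AS (
    SELECT id,
           ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) AS rn
    FROM dbo.intercom_conversations_staging -- hypothetical staging table that allows duplicates
)
DELETE FROM ranked -- deleting through the CTE removes the underlying duplicate rows
WHERE rn > 1;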

In general, ensuring the quality of the data inserted into your database is a big and difficult problem, and MS SQL Server features like transactions can help tremendously, although they do not solve the problem in the general case.
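
For example, wrapping a load in a transaction guarantees that a failed batch doesn’t leave half-written data behind; a minimal T-SQL sketch:

SQL
BEGIN TRY
    BEGIN TRANSACTION;

    -- load or update statements go here, e.g. the MERGE shown above

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION; -- undo everything if any statement failed
    THROW; -- re-raise the original error
END CATCH;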

The best way to load data from Intercom to MS SQL Server

So far, we have just scratched the surface of what you can do with MS SQL Server and how to load data into it. Things can get even more complicated if you want to integrate data coming from different sources.

Do you need results right now?

Instead of writing, hosting, and maintaining a flexible data infrastructure, use RudderStack, which can handle everything automatically for you.

With one click, RudderStack integrates with sources and services, creates analytics-ready data, and syncs your Intercom data to MS SQL Server right away.

Sign Up For Free And Start Sending Data

Test out our event stream, ELT, and reverse-ETL pipelines. Use our HTTP source to send data in less than 5 minutes, or install one of our 12 SDKs in your website or app.

Don't want to go through the pain of direct integration? RudderStack's Intercom integration makes it easy to send data from Intercom to MS SQL Server.