How to load data from DoubleClick to Snowflake

Access your data on DoubleClick

The first step in loading your DoubleClick data into any data warehouse solution is to access that data and start extracting it.

For accessing your data, you can use the DoubleClick for Publishers API. It is implemented using the SOAP protocol, which adds some complexity to your development, as you will have to manage SOAP and XML responses. However, to help you get started, Google offers client libraries for Java, .NET, Python, PHP, and Ruby that provide wrapper functions and various features.

In addition to the above, the things that you have to keep in mind when dealing with the DoubleClick for Publishers API are:

  1. Rate limits. Depending on your plan and the API version you are using, the DoubleClick for Publishers API allows a limited number of calls per hour.
  2. Authentication. You authenticate all DoubleClick for Publishers API requests using OAuth2; a typical setup is sketched after this list.
  3. Error handling. Make sure that you handle errors correctly.
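
As a minimal illustration, here is a hedged sketch of authenticating with Google's googleads Python client library. The API version string and the googleads.yaml credentials file location are assumptions; check the library's documentation for the values that apply to your account.

```python
# Minimal sketch: authenticate against the DoubleClick for Publishers
# (now Google Ad Manager) API with Google's googleads client library.
# OAuth2 credentials are read from ~/googleads.yaml by default; the API
# version below is an assumption -- use one current for your account.
from googleads import ad_manager

client = ad_manager.AdManagerClient.LoadFromStorage()

# Fetch a service and make a simple call to verify authentication works.
network_service = client.GetService('NetworkService', version='v202311')
network = network_service.getCurrentNetwork()
print('Authenticated against network "%s".' % network['displayName'])
```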

Each custom report is composed of the following:

  1. Dimensions. The user can select a number of dimensions for the report.
  2. Dimension attributes. Specific dimensions can optionally be enhanced with attributes. There are constraints on which attributes can be selected, depending on the dimensions the user has chosen.
  3. Columns. These can be considered metrics that provide all the trafficking statistics and revenue information available for the chosen dimension object. There are constraints on which columns can be combined with which dimensions. The sketch after this list shows how these pieces fit into a report request.
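
To make the above concrete, here is a hedged sketch of requesting and downloading a report with the googleads Python library. The chosen dimensions and columns are illustrative; whether a given combination is valid is constrained by the API, as described above.

```python
# Sketch: run a report job with chosen dimensions and columns, then
# download the result as a gzipped CSV file. Names are examples only.
import tempfile
from googleads import ad_manager

client = ad_manager.AdManagerClient.LoadFromStorage()
report_downloader = client.GetDataDownloader(version='v202311')

report_job = {
    'reportQuery': {
        'dimensions': ['DATE', 'AD_UNIT_NAME'],
        'adUnitView': 'HIERARCHICAL',
        'columns': ['AD_SERVER_IMPRESSIONS', 'AD_SERVER_CLICKS'],
        'dateRangeType': 'LAST_WEEK',
    }
}

# WaitForReport blocks until the report job has completed on Google's side.
report_job_id = report_downloader.WaitForReport(report_job)

# Download the finished report; CSV_DUMP is gzip-compressed by default.
with tempfile.NamedTemporaryFile(suffix='.csv.gz', delete=False) as f:
    report_downloader.DownloadReportToFile(report_job_id, 'CSV_DUMP', f)
print('Report saved to %s' % f.name)
```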

Transform and prepare your DoubleClick data for Snowflake replication

After you have accessed your data on DoubleClick, you will have to transform it based on two main factors:

  1. The limitations of the database that is going to be used
  2. The type of analysis that you plan to perform

Each system has specific limitations on the data types and data structures that it supports. If, for example, you want to push data into Google BigQuery, you can send nested data like JSON directly. Keep in mind, though, that with a SOAP API like DoubleClick's, you get XML responses.

Of course, when dealing with tabular data stores like Microsoft SQL Server, this is not an option. Instead, you will have to flatten out your data, just as you would with JSON, before loading it into the database.
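
As a simple illustration of flattening, the sketch below collapses a nested structure into a single level of scalar columns. The field names are made up for the example; they are not actual DoubleClick response fields.

```python
# Sketch: recursively flatten a nested (XML/JSON-derived) record into a
# flat dict whose keys can become columns in a tabular store.
def flatten(record, parent_key='', sep='_'):
    items = {}
    for key, value in record.items():
        new_key = f'{parent_key}{sep}{key}' if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

nested = {'id': 123, 'adUnit': {'name': 'homepage_top', 'parentId': 45}}
print(flatten(nested))
# -> {'id': 123, 'adUnit_name': 'homepage_top', 'adUnit_parentId': 45}
```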

Also, you have to choose the right data types. Again, depending on the system you will send data to and the data types the API exposes, you will have to make the right choices. These choices are important because they can limit the expressivity of your queries and restrict what your analysts can do directly from the database.

With DoubleClick data, you have two main additional sources of complexity. When it comes to data types, keep in mind that SOAP uses XML to describe both the service and the data, so the data types you have to map come from XML and may already have been automatically transformed into the primitive data types of the language you are using.

Also, consider that the reports you'll get from DoubleClick are CSV-like in structure, so you need to decide what maps to a table in your database and how. This way, you will be able to join, combine, and query your data in order to assess the performance of various ads and, ultimately, improve ROI for display ad campaigns.
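
Here is a hedged sketch of reading such a downloaded report and coercing each column to a Python type that maps cleanly onto a Snowflake type. The header names follow the Dimension./Column. convention of the CSV dump format, but treat them, and the type choices, as assumptions to verify against your own reports.

```python
# Sketch: parse a gzipped CSV report and coerce values to types that map
# onto Snowflake's DATE, VARCHAR, and NUMBER. Header names are assumptions.
import csv
import gzip
from datetime import date

def parse_row(row):
    return {
        'report_date': date.fromisoformat(row['Dimension.DATE']),  # -> DATE
        'ad_unit': row['Dimension.AD_UNIT_NAME'],                   # -> VARCHAR
        'impressions': int(row['Column.AD_SERVER_IMPRESSIONS']),    # -> NUMBER
        'clicks': int(row['Column.AD_SERVER_CLICKS']),              # -> NUMBER
    }

with gzip.open('report.csv.gz', 'rt', newline='') as f:
    rows = [parse_row(r) for r in csv.DictReader(f)]
```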

Data in Snowflake is organized around tables with a well-defined set of columns, each one having a specific data type.

Snowflake supports a rich set of data types. It is worth mentioning that a number of semi-structured data types are also supported. With Snowflake, it is possible to load data directly in JSON, Avro, ORC, Parquet, or XML format. Hierarchical data is treated as a first-class citizen, similar to what Google BigQuery offers.

There is also one notable common data type that Snowflake does not support: the LOB, or large object, data type. Instead, you should use a BINARY or VARCHAR type. These types are not that useful for data warehouse use cases anyway.
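
As a quick illustration of the semi-structured support, the sketch below stores raw records in a VARIANT column and queries a nested field with path notation. It uses the snowflake-connector-python package; the connection parameters and field names are placeholders.

```python
# Sketch: a VARIANT column holds semi-structured data, and nested fields
# can be queried with path notation directly in SQL.
import snowflake.connector

conn = snowflake.connector.connect(
    account='your_account', user='your_user', password='your_password',
    warehouse='your_wh', database='your_db', schema='public',
)
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT)")
cur.execute("""
    SELECT payload:adUnit.name::VARCHAR AS ad_unit
    FROM raw_events
    LIMIT 10
""")
```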

A typical strategy for loading data from DoubleClick to Snowflake is to create a schema where you will map each API endpoint to a table.

Each key inside the DoubleClick API endpoint response should be mapped to a column of that table, and you should ensure the right conversion to a Snowflake data type.

Of course, as any data type from the DoubleClick API might change, you will need to adapt your database tables accordingly; there's no such thing as automatic data type casting.
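
For the report sketched earlier, such a mapping could look like the hypothetical DDL below; the table and column names are placeholders.

```python
import snowflake.connector

# Placeholder connection, as in the earlier sketch.
cur = snowflake.connector.connect(
    account='your_account', user='your_user', password='your_password'
).cursor()

# One table per report/endpoint, one column per field, explicit types.
cur.execute("""
    CREATE TABLE IF NOT EXISTS dfp_ad_unit_report (
        report_date  DATE,
        ad_unit      VARCHAR,
        impressions  NUMBER,
        clicks       NUMBER
    )
""")
```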

After you have a complete and well-defined data model or schema for Snowflake, you can move forward and start loading your data into the database.

Export data from DoubleClick (for Publishers) to Snowflake

Usually, data is loaded into Snowflake in bulk using the COPY INTO command. Files containing data, usually in JSON format, are stored in a local file system or in Amazon S3 buckets. Then a COPY INTO command is invoked on the Snowflake instance, and the data is copied into the data warehouse.

The files can be pushed into a staging environment with the PUT command before the COPY command is invoked.

Another alternative is to upload the data directly to a service like Amazon S3, from where Snowflake can access it directly.
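
Putting those two steps together, here is a hedged sketch of the staged-load flow with the Python connector: PUT uploads the local report file to an internal stage, and COPY INTO loads it into the target table. The stage, file, and table names are placeholders.

```python
import snowflake.connector

# Placeholder connection, as in the earlier sketches.
cur = snowflake.connector.connect(
    account='your_account', user='your_user', password='your_password',
    warehouse='your_wh', database='your_db', schema='public',
).cursor()

# Upload the local file to an internal stage, then bulk-load it.
cur.execute("CREATE STAGE IF NOT EXISTS dfp_stage")
cur.execute("PUT file:///tmp/report.csv.gz @dfp_stage")
cur.execute("""
    COPY INTO dfp_ad_unit_report
    FROM @dfp_stage/report.csv.gz
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
```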

If you are looking into other data warehouses, you may check our how-tos on DoubleClick to Redshift, DoubleClick to MS SQL Server, DoubleClick to BigQuery, and DoubleClick to PostgreSQL.

Updating your DoubleClick data on Snowflake

As you generate more data on DoubleClick, you will need to update your older data in Snowflake. This includes new records together with updates to older records that, for any reason, have been changed on DoubleClick.

You will need to periodically check DoubleClick for new data and repeat the process described previously, updating your currently available data if needed. Updating an already existing row in a Snowflake table is achieved with UPDATE statements.
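
Alternatively, Snowflake's MERGE statement handles inserts and updates in one pass. The sketch below upserts from a hypothetical staging table; the table and key names are placeholders.

```python
import snowflake.connector

# Placeholder connection, as in the earlier sketches.
cur = snowflake.connector.connect(
    account='your_account', user='your_user', password='your_password'
).cursor()

# Upsert: update rows that already exist, insert the ones that don't.
cur.execute("""
    MERGE INTO dfp_ad_unit_report AS t
    USING dfp_ad_unit_report_staging AS s
      ON t.report_date = s.report_date AND t.ad_unit = s.ad_unit
    WHEN MATCHED THEN UPDATE SET
      t.impressions = s.impressions, t.clicks = s.clicks
    WHEN NOT MATCHED THEN INSERT (report_date, ad_unit, impressions, clicks)
      VALUES (s.report_date, s.ad_unit, s.impressions, s.clicks)
""")
```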

Another issue that you need to take care of is the identification and removal of any duplicate records in your database. Whether because DoubleClick lacks a mechanism to identify new and updated records or because of errors in your data pipelines, duplicate records might be introduced into your database.
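
One common way to deduplicate in Snowflake is to rebuild the table, keeping a single row per logical key, as in the hedged sketch below; the key columns are placeholders.

```python
import snowflake.connector

# Placeholder connection, as in the earlier sketches.
cur = snowflake.connector.connect(
    account='your_account', user='your_user', password='your_password'
).cursor()

# Keep one row per (report_date, ad_unit); the tie-break here is arbitrary.
cur.execute("""
    CREATE OR REPLACE TABLE dfp_ad_unit_report AS
    SELECT *
    FROM dfp_ad_unit_report
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY report_date, ad_unit
        ORDER BY impressions DESC
    ) = 1
""")
```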

In general, ensuring the quality of the data that is inserted into your database is a big and difficult problem.

The best way to load data from DoubleClick (for Publishers) to Snowflake

So far, we have just scratched the surface of what can be done with Snowflake and how to ingest data into it. How to proceed depends heavily on the data you want to load, the service it is coming from, and the requirements of your use case.

Things can get even more complicated if you want to integrate data coming from different sources. Instead of writing, hosting, and maintaining a flexible data infrastructure, a possible alternative is to use a product like RudderStack that can automatically handle this kind of problem for you.

Easily use the DoubleClick (for Publishers) connector from RudderStack, along with multiple sources or services like databases, CRM, email campaigns, analytics, and more. Quickly and safely ingest DoubleClick data into Snowflake and start generating insights from your data.

Sign Up For Free And Start Sending Data

Test out our event stream, ELT, and reverse-ETL pipelines. Use our HTTP source to send data in less than 5 minutes, or install one of our 12 SDKs in your website or app.

Don't want to go through the pain of direct integration?

RudderStack's DoubleClick for Publishers integration makes it easy to send data from DoubleClick for Publishers to Snowflake.