This post will help you export your DoubleClick data to Snowflake. With a few clicks, you will start consistently collecting analytics-ready data into your Snowflake instance. No need for scripts or engineering effort and resources. Just replicate your data and focus on what matters: the analysis of your advertising data.
Access your data on DoubleClick
The first step in loading your DoubleClick data into any data warehouse solution is to access it and start extracting it.
To access your data, you can use the DoubleClick for Publishers API. It is implemented using the SOAP protocol, which adds some complexity to your development, as you will have to manage SOAP and XML responses. To help you get started, however, Google offers client libraries for Java, .NET, Python, PHP, and Ruby that provide wrapper functions and various features.
In addition to the above, the things that you have to keep in mind when dealing with the DoubleClick for Publishers API are:
- Rate limits. Depending on your plan and the API version you use, the DoubleClick for Publishers API allows a limited number of calls per hour.
- Authentication. You authenticate all DoubleClick for Publishers API requests using OAuth 2.0.
- Error handling. Make sure that you handle errors correctly, retrying failed requests where it makes sense.
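To make the rate-limit and error-handling points concrete, here is a minimal sketch of a retry wrapper with exponential backoff. The names `call_with_retries` and `request_fn` are illustrative, not part of any Google client library; `request_fn` stands in for whatever API call you make.

```python
import random
import time

def call_with_retries(request_fn, max_attempts=5, base_delay=1.0):
    """Call a flaky API function, retrying with exponential backoff.

    A minimal sketch: request_fn stands in for any DoubleClick API call
    that may fail transiently (rate limits, timeouts).
    """
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error to the caller
            # back off exponentially, with a little jitter so concurrent
            # clients do not all retry at the same instant
            time.sleep(base_delay * (2 ** attempt) + random.random() * 0.1)
```

In a real pipeline you would narrow the `except` clause to the transient error types your client library raises, rather than catching every exception.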
Data is extracted from the API by running custom reports. Each custom report is composed of the following:
- Dimensions. The user can select a number of dimensions for the report.
- Dimension Attributes. Specific dimensions can optionally be enhanced with some attributes. There are constraints on what attributes can be selected, depending on the dimensions that the user has chosen.
- Columns. These can be considered metrics that provide all the trafficking statistics and revenue information available for the chosen dimension object. There are constraints on which columns can be combined with which dimensions.
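The three parts above can be sketched as a report definition. In the googleads Python client, a report query is expressed as a plain dict like the one below; the specific dimension, attribute, and column names here are illustrative examples, so check the API reference for the exact values and combinations allowed.

```python
# A sketch of a custom report definition mirroring the structure above:
# dimensions, optional dimension attributes, and columns (metrics).
report_query = {
    'dimensions': ['DATE', 'AD_UNIT_NAME'],       # what to group by
    'dimensionAttributes': ['AD_UNIT_CODE'],      # optional extras per dimension
    'columns': [                                  # the metrics to report
        'AD_SERVER_IMPRESSIONS',
        'AD_SERVER_CLICKS',
        'AD_SERVER_CPM_AND_CPC_REVENUE',
    ],
    'dateRangeType': 'LAST_WEEK',
}
# The report job wraps the query before being submitted to the ReportService.
report_job = {'reportQuery': report_query}
```

Remember that not every combination is valid: the API rejects reports whose dimension attributes or columns are incompatible with the chosen dimensions.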
About DoubleClick (for Publishers)
DoubleClick is a platform initially designed to facilitate the process of delivering ads to websites and mobile devices. As of June 2018, DoubleClick for Publishers was merged with DoubleClick Ad Exchange into Google Ad Manager, a new online advertising brand introduced by Google.
While using DoubleClick, you can define ad units, i.e. the spaces within your website or app where your ads will be shown. Without any further action, DoubleClick can then determine the best possible placement per ad request and record data relevant to the specific ad campaign. You can also produce various custom reports on the properties that you have selected to expose, in order to gain insights into the performance of your ads and to determine possible improvements.
Transform and prepare your DoubleClick data for Snowflake Replication
After you have accessed your data on DoubleClick, you will have to transform it based on two main factors:
- The limitations of the database that is going to be used
- The type of analysis that you plan to perform
Each system has specific limitations on the data types and data structures that it supports. If, for example, you want to push data into Google BigQuery, then you can send nested data like JSON directly. Keep in mind, though, that in the case of a SOAP API like DoubleClick's, you get XML responses, which you would first have to convert to JSON.
Of course, when dealing with tabular data stores, like Microsoft SQL Server, this is not an option. Instead, you will have to flatten out your data, just as in the case of JSON, before loading it into the database.
You also have to choose the right data types. Again, depending on the system you will send data to and the data types that the API exposes, you will have to make the right choices. These choices are important because they can limit the expressivity of your queries and restrict what your analysts can do directly from the database.
With DoubleClick data, you have two main additional sources of complexity. When it comes to data types, keep in mind that SOAP uses XML to describe the service and the data, so the data types you have to map come from XML and may have been automatically transformed into the primitive data types of the language that you are using.
You also have to consider that the reports you'll get from DoubleClick are CSV-like in structure, and you need to identify what to map to a table in your database, and how. This way you will be able to join, combine, and query your data in order to assess the performance of various ads and, ultimately, improve the ROI of your display ad campaigns.
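As a sketch of that mapping step, the snippet below parses a small CSV-like report into flat rows ready for a database table. The sample text and the dotted header names (`Dimension.DATE`, `Column.AD_SERVER_IMPRESSIONS`) are illustrative assumptions about the report format, so adjust the cleaning logic to the headers your reports actually contain.

```python
import csv
import io

# Illustrative sample of a CSV-like DoubleClick report download.
sample_report = """\
Dimension.DATE,Dimension.AD_UNIT_NAME,Column.AD_SERVER_IMPRESSIONS
2018-06-01,homepage_banner,10450
2018-06-02,homepage_banner,9801
"""

def report_rows(report_text):
    """Yield each report line as a dict keyed by a cleaned column name."""
    reader = csv.DictReader(io.StringIO(report_text))
    for row in reader:
        # Strip the "Dimension."/"Column." prefixes so each key maps 1:1
        # to a database column name.
        yield {key.split('.', 1)[-1].lower(): value
               for key, value in row.items()}

rows = list(report_rows(sample_report))
```

Each resulting dict then corresponds to one row of the target table, with its keys matching the table's column names.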
Data in Snowflake is organized around tables with a well-defined set of columns, each with a specific data type.
Snowflake supports a rich set of data types. It is worth mentioning that a number of semi-structured data types are also supported. With Snowflake, it is possible to load data directly in JSON, Avro, ORC, Parquet, or XML format. Hierarchical data is treated as a first-class citizen, similar to what Google BigQuery offers.
One notable common data type is not supported by Snowflake, though: LOB, the large object data type. You should use a BINARY or VARCHAR type instead, but these types are not that useful for data warehouse use cases.
A typical strategy for loading data from DoubleClick to Snowflake is to create a schema where you will map each API endpoint to a table.
Each key inside the DoubleClick API endpoint response should be mapped to a column of that table, and you should ensure the correct conversion to a Snowflake data type.
Of course, as any data type from the DoubleClick API might change, you will need to adapt your database tables accordingly; there's no such thing as automatic data type casting.
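One way to keep that mapping explicit is to hold it in code and generate the table DDL from it. The field names and type choices below are illustrative assumptions about one possible report schema, not a schema prescribed by DoubleClick or Snowflake.

```python
# Hypothetical mapping from report fields to Snowflake data types;
# adjust names and types to match your actual report.
AD_REPORT_SCHEMA = {
    'date': 'DATE',
    'ad_unit_name': 'VARCHAR',
    'ad_server_impressions': 'NUMBER',
    'ad_server_clicks': 'NUMBER',
    'revenue_micros': 'NUMBER(18, 0)',
}

def create_table_ddl(table_name, schema):
    """Generate a CREATE TABLE statement from a field -> type mapping."""
    columns = ',\n  '.join(f'{name} {sf_type}'
                           for name, sf_type in schema.items())
    return f'CREATE TABLE IF NOT EXISTS {table_name} (\n  {columns}\n);'

ddl = create_table_ddl('doubleclick_ad_report', AD_REPORT_SCHEMA)
```

Keeping the schema in one place like this makes it easier to adapt the table when a field or type in the API response changes.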
After you have a complete and well-defined data model or schema for Snowflake, you can move forward and start loading your data into the database.
Export data from DoubleClick (for Publishers) to Snowflake
Usually, data is loaded into Snowflake in bulk, using the COPY INTO command. Files containing the data, usually in JSON format, are stored in a local file system or in Amazon S3 buckets, and then a COPY INTO command is invoked on the Snowflake instance to copy the data into the data warehouse.
The files can be pushed into a Snowflake staging area with the PUT command before COPY INTO is invoked.
Another alternative is to upload the data to a service like Amazon S3, from where Snowflake can access it directly.
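The PUT-then-COPY flow can be sketched as below. The table name and file path are illustrative; the function only builds the two SQL statements, which you would then run with a cursor from a client such as snowflake-connector-python.

```python
def put_and_copy_statements(table, local_path):
    """Build the PUT and COPY INTO statements that upload a local CSV file
    to the table's internal stage (@%table) and then bulk-load it."""
    put_sql = f"PUT file://{local_path} @%{table}"
    copy_sql = (
        f"COPY INTO {table} FROM @%{table} "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    return put_sql, copy_sql

put_sql, copy_sql = put_and_copy_statements('doubleclick_ad_report',
                                            '/tmp/ad_report.csv')
# Typically executed as: cursor.execute(put_sql); cursor.execute(copy_sql)
```

Here `@%table` refers to the table's built-in internal stage; you could equally stage the files in a named stage or an external S3 location.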
If you are looking into other data warehouses, you may check our how-tos on DoubleClick to Redshift, DoubleClick to MS SQL Server, DoubleClick to BigQuery, and DoubleClick to PostgreSQL.
Updating your DoubleClick data on Snowflake
As you generate more data on DoubleClick, you will need to update your older data on Snowflake. This includes new records together with updates to older records that, for any reason, have been updated on DoubleClick.
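One common way to apply both new records and updates is to load each batch into a staging table and then run a MERGE into the final table on a natural key. The table and column names below are illustrative assumptions, matching no particular report; pick the key columns that uniquely identify a row in your own schema.

```python
# Hypothetical Snowflake MERGE statement for upserting a freshly loaded
# batch (staging table) into the main report table.
MERGE_SQL = """
MERGE INTO doubleclick_ad_report AS t
USING doubleclick_ad_report_staging AS s
  ON t.date = s.date AND t.ad_unit_name = s.ad_unit_name
WHEN MATCHED THEN UPDATE SET
  t.ad_server_impressions = s.ad_server_impressions,
  t.ad_server_clicks = s.ad_server_clicks
WHEN NOT MATCHED THEN INSERT
  (date, ad_unit_name, ad_server_impressions, ad_server_clicks)
VALUES
  (s.date, s.ad_unit_name, s.ad_server_impressions, s.ad_server_clicks);
"""
```

With this pattern, changed records on DoubleClick overwrite their older copies in Snowflake, while new records are inserted, in a single statement.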