
How to load data from AfterShip to Snowflake

Access your data on AfterShip

The first step in loading your AfterShip data into any data warehouse solution is to access your data and start extracting it.

Using the REST API that AfterShip offers, you can programmatically interact with your account in order to gain access to your order tracking data (a short example follows the list below). By doing so, you can:

  • Get the list of all supported couriers.
  • Retrieve tracking results.
  • Get the tracking information of the last checkpoint of a tracking.
  • Add contacts (SMS or email) to be notified when the status of a tracking changes.
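
As a minimal sketch, this is roughly what a call to one of these endpoints looks like with Python's requests library. The /couriers path, the aftership-api-key header, and the response envelope follow AfterShip's v4 REST API, but you should verify them against the current documentation:

    import requests

    API_KEY = "YOUR_AFTERSHIP_API_KEY"  # the private key linked to your account
    BASE_URL = "https://api.aftership.com/v4"

    # Fetch the list of couriers supported on your account.
    response = requests.get(
        f"{BASE_URL}/couriers",
        headers={"aftership-api-key": API_KEY},
    )
    response.raise_for_status()

    # The v4 API wraps results in a "data" envelope.
    for courier in response.json()["data"]["couriers"]:
        print(courier["slug"], "-", courier["name"])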

You can also restrict the trackings you retrieve to a user-defined time period, courier, or delivery status, which makes it possible to compute aggregated shipment metrics downstream.

In addition to the above, there are a few things to keep in mind when dealing with the AfterShip API (see the sketch after this list):

  • Rate limits. To guarantee a high quality of service to all users of the API, AfterShip applies rate limits. Currently, accounts are limited to 600 requests per minute.
  • Authentication. You can authenticate to AfterShip using a private API key that is linked to your account.
  • Pagination. API endpoints that return a collection of items are always paginated.
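
Putting these three points together, an extraction loop might look like the following sketch. The page and limit parameters, the 200-item page cap, and the HTTP 429 behavior are assumptions based on AfterShip's documented v4 API, so verify them before relying on this:

    import time
    import requests

    API_KEY = "YOUR_AFTERSHIP_API_KEY"
    BASE_URL = "https://api.aftership.com/v4"
    HEADERS = {"aftership-api-key": API_KEY}

    def fetch_all_trackings():
        """Walk the paginated /trackings endpoint, backing off on rate limits."""
        page = 1
        while True:
            response = requests.get(
                f"{BASE_URL}/trackings",
                headers=HEADERS,
                params={"page": page, "limit": 200},  # assumed page-size cap
            )
            if response.status_code == 429:
                # Rate limited: wait for the window to reset, then retry the page.
                time.sleep(60)
                continue
            response.raise_for_status()

            trackings = response.json()["data"]["trackings"]
            if not trackings:
                break  # an empty page means the whole collection has been walked
            yield from trackings
            page += 1

    for tracking in fetch_all_trackings():
        print(tracking["tracking_number"], tracking.get("tag"))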

Transform and prepare your AfterShip data for Snowflake

After you have accessed your data on AfterShip, you will have to transform it based on two main factors:

  1. The limitations of the database that the data will be loaded into
  2. The type of analysis that you plan to perform

Each system has specific limitations on the data types and data structures that it supports. If, for example, you want to push data into Google BigQuery, you can send nested data like JSON directly; other systems may require you to flatten such structures first.

You also have to choose the right data types. Again, depending on the system you will send the data to and the data types that the API exposes, you will have to make the right choices. These choices are important because they can limit the expressivity of your queries and restrict what your analysts can do directly from the database.

You also have to consider that the data you get from the AfterShip API is semi-structured JSON, so you need to decide what to map to which table and column in your database.

Data in Snowflake is organized around tables with a well-defined set of columns with each one having a specific data type.

Snowflake supports a rich set of data types, and it is worth mentioning that a number of semi-structured data types are also supported. With Snowflake, it is possible to load data directly in JSON, Avro, ORC, Parquet, or XML format. Hierarchical data is treated as a first-class citizen, similar to what Google BigQuery offers.
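
For example, you can land the raw JSON from AfterShip in a VARIANT column and query into it with Snowflake's path notation. A minimal sketch using the snowflake-connector-python package (connection parameters are placeholders for your own account):

    import snowflake.connector

    # Connection parameters are placeholders for your own account.
    conn = snowflake.connector.connect(
        account="YOUR_ACCOUNT", user="YOUR_USER", password="YOUR_PASSWORD",
        warehouse="YOUR_WAREHOUSE", database="YOUR_DATABASE", schema="PUBLIC",
    )
    cur = conn.cursor()

    # A single VARIANT column can hold the raw JSON returned by the AfterShip API.
    cur.execute("CREATE TABLE IF NOT EXISTS raw_trackings (payload VARIANT)")

    # Hierarchical fields are then queryable directly with path notation.
    cur.execute("""
        SELECT payload:tracking_number::STRING AS tracking_number,
               payload:tag::STRING             AS status
        FROM raw_trackings
    """)
    for row in cur.fetchall():
        print(row)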

There is one notable common data type that Snowflake does not support: LOB, or large object. In its place, you should use a BINARY or VARCHAR type, although such large values are rarely useful for data warehouse use cases anyway.

A typical strategy for loading data from AfterShip to Snowflake is to create a schema where you map each API endpoint to a table.

Each key inside the AfterShip API endpoint response should be mapped to a column of that table, and you should ensure the right conversion to a Snowflake data type, as in the sketch below.
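
One lightweight way to keep that mapping explicit is to maintain it as data and generate the DDL from it. The field names below are illustrative guesses at keys in the /trackings response, so check them against the actual API output:

    # Hypothetical mapping from keys in the /trackings response to Snowflake
    # column types; the field names are illustrative, so verify them against
    # the actual API output.
    TRACKINGS_SCHEMA = {
        "id": "VARCHAR",
        "tracking_number": "VARCHAR",
        "slug": "VARCHAR",              # courier identifier
        "tag": "VARCHAR",               # delivery status
        "created_at": "TIMESTAMP_NTZ",
        "updated_at": "TIMESTAMP_NTZ",
    }

    def create_table_ddl(table: str, schema: dict) -> str:
        """Render a CREATE TABLE statement from a column -> type mapping."""
        columns = ",\n    ".join(f"{name} {dtype}" for name, dtype in schema.items())
        return f"CREATE TABLE IF NOT EXISTS {table} (\n    {columns}\n)"

    print(create_table_ddl("trackings", TRACKINGS_SCHEMA))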

Of course, as the data types coming from the AfterShip API change over time, you will need to adapt your database tables accordingly. There's no such thing as automatic data typecasting.

After you have a complete and well-defined data model or schema for Snowflake, you can move forward and start loading your data into the database.

Load data from AfterShip to Snowflake

Usually, data is loaded into Snowflake in bulk, using the COPY INTO command. Files containing the data, usually in JSON format, are stored in a local file system or in Amazon S3 buckets. Then a COPY INTO command is invoked on the Snowflake instance, and the data is copied into the data warehouse.

The files can be pushed into a Snowflake staging area using the PUT command before the COPY INTO command is invoked.

Another alternative is to upload the data directly into a service like Amazon S3, from where Snowflake can access the data directly.
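
Here is a minimal sketch of the PUT-then-COPY flow using the Python connector, loading newline-delimited JSON into the internal stage of the raw_trackings table from the earlier sketch (credentials and file path are placeholders):

    import json
    import snowflake.connector

    # `trackings` stands in for the records extracted from the AfterShip API;
    # write them out as newline-delimited JSON, one object per line.
    trackings = [{"tracking_number": "RA123456789CN", "tag": "Delivered"}]
    with open("/tmp/trackings.json", "w") as f:
        for t in trackings:
            f.write(json.dumps(t) + "\n")

    conn = snowflake.connector.connect(
        account="YOUR_ACCOUNT", user="YOUR_USER", password="YOUR_PASSWORD",
        warehouse="YOUR_WAREHOUSE", database="YOUR_DATABASE", schema="PUBLIC",
    )
    cur = conn.cursor()

    # PUT uploads (and by default compresses) the file into the table's
    # internal stage; COPY INTO then bulk-loads each JSON object as one row.
    cur.execute("PUT file:///tmp/trackings.json @%raw_trackings")
    cur.execute("COPY INTO raw_trackings FROM @%raw_trackings FILE_FORMAT = (TYPE = 'JSON')")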

Updating your AfterShip data on Snowflake

As you generate more data on AfterShip, you will need to update your older data on Snowflake. This includes new records together with updates to older records that, for any reason, have been updated on AfterShip.

You will need to periodically check AfterShip for new data and repeat the process described previously, updating your currently available data if needed. Updating an already existing row on a Snowflake table is achieved with UPDATE statements, or with a MERGE statement that combines the update and insert cases, as sketched below.
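
For example, assuming fresh records have been loaded into a hypothetical trackings_staging table, a single MERGE statement can apply both the updates and the inserts:

    import snowflake.connector

    # Placeholder credentials, as in the earlier sketches.
    conn = snowflake.connector.connect(
        account="YOUR_ACCOUNT", user="YOUR_USER", password="YOUR_PASSWORD",
        warehouse="YOUR_WAREHOUSE", database="YOUR_DATABASE", schema="PUBLIC",
    )
    cur = conn.cursor()

    # Upsert from a (hypothetical) staging table holding the freshly fetched
    # records: matched rows are updated, unmatched rows are inserted.
    cur.execute("""
        MERGE INTO trackings AS t
        USING trackings_staging AS s
            ON t.id = s.id
        WHEN MATCHED THEN UPDATE SET
            t.tag = s.tag,
            t.updated_at = s.updated_at
        WHEN NOT MATCHED THEN
            INSERT (id, tracking_number, slug, tag, created_at, updated_at)
            VALUES (s.id, s.tracking_number, s.slug, s.tag, s.created_at, s.updated_at)
    """)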

Another issue that you need to take care of is the identification and removal of duplicate records in your database. Either because AfterShip does not have a mechanism to identify new and updated records, or because of errors in your data pipelines, duplicate records might be introduced into your database.
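
One common way to clean up duplicates in Snowflake is to keep only the most recent row per business key, for instance with ROW_NUMBER and QUALIFY. A sketch, assuming updated_at identifies the freshest copy of each record:

    import snowflake.connector

    # Placeholder credentials, as in the earlier sketches.
    conn = snowflake.connector.connect(
        account="YOUR_ACCOUNT", user="YOUR_USER", password="YOUR_PASSWORD",
        warehouse="YOUR_WAREHOUSE", database="YOUR_DATABASE", schema="PUBLIC",
    )
    cur = conn.cursor()

    # Rebuild the table keeping only the most recently updated row per id;
    # QUALIFY filters on the window function directly.
    cur.execute("""
        CREATE OR REPLACE TABLE trackings AS
        SELECT *
        FROM trackings
        QUALIFY ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) = 1
    """)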

In general, ensuring the quality of the data that is inserted into your database is a big and difficult issue.

The best way to load data from AfterShip to Snowflake

So far, we have just scratched the surface of what can be done with Snowflake and how to load data into it. The way to proceed depends heavily on the data you want to load, the service it is coming from, and the requirements of your use case.

Things can get even more complicated if you want to integrate data coming from different sources. A possible alternative, instead of writing, hosting, and maintaining a flexible data infrastructure, is to use a product like RudderStack that can handle this kind of problem automatically for you.

RudderStack integrates with multiple sources and services like databases, CRMs, email campaign tools, analytics platforms, and more. Quickly and safely move all your data from AfterShip to Snowflake and start generating insights from it.

Sign Up For Free And Start Sending Data

Test out our event stream, ELT, and reverse-ETL pipelines. Use our HTTP source to send data in less than 5 minutes, or install one of our 12 SDKs in your website or app.

Don't want to go through the pain of direct integration? RudderStack's AfterShip integration makes it easy to send data from AfterShip to Snowflake.