How to load data from Xero to SQL Data Warehouse

Extract your data from Xero

Xero has an excellent API, or more precisely a number of APIs, and encourages developers to build applications that can be sold on its add-on marketplace. The APIs it exposes are the following:

  • Xero Core (Accounting) API – exposes the accounting and related functions of the main Xero application. It can be used for everything from creating transactions such as invoices and credit notes through to extracting accounting data via the reports endpoint.
  • Xero Payroll API – exposes payroll-related functions of Payroll in Xero and can be used for various purposes such as syncing employee details, importing timesheets, etc.
  • Files API – provides access to the files, folders, and the association of files within a Xero organization.
  • Fixed Assets API – currently under review; this feature is not yet available, but users can vote for it to become publicly available.
  • Xero Practice Manager API – a recently released API for managing workflows, built on the WorkflowMax product.

In this post, we’ll focus on the Xero Core (Accounting) API, which exposes the core accounting functionality of the Xero product. The Xero API is a RESTful web service and uses the OAuth (v1.0a) protocol to authenticate 3rd-party applications. As a RESTful API, it can be used with tools like cURL, Postman, or Apirise, or with the HTTP client of your favorite language or framework.

Because both the product and its API handle sensitive data, Xero takes security very seriously. Several different application types can be developed against the API; the main differences between them are how the application authenticates, how often tokens expire, and other security-related aspects. For more about the different application types, you can consult the application types guides in the Xero documentation.

Xero API requests limits

The Xero API enforces three different types of limits on its usage. It is extremely important to keep them in mind when developing against the API, as they are a common source of headaches when building an infrastructure to extract data from it.

  • Daily limit – 1,000 API calls per organization per day.
  • Requests per minute – each OAuth access token can be used up to 60 times in any 60 second period. This rate limit is based on a rolling 60-second window.
  • Request Size Limit – A single POST to the Accounting or Payroll APIs has a size limit of 5MB.

For more information about the API limits, please consult the Xero documentation. By default, the API returns responses as text/xml, but you can override this and request JSON responses if preferred.
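To stay under the per-minute limit while extracting large volumes of data, you will typically need some form of throttling. A minimal sketch in Node.js, assuming a hypothetical xeroGet helper that performs one signed API call, could look like this:

JAVASCRIPT
// Minimal sketch: keep under 60 calls per rolling 60-second window by
// recording the timestamps of recent calls and waiting when necessary.
// `xeroGet` is a hypothetical helper that performs one signed API call.
const callTimes = [];

async function throttledGet(url) {
  const now = Date.now();
  // Drop timestamps that have fallen out of the 60-second window.
  while (callTimes.length && now - callTimes[0] > 60000) {
    callTimes.shift();
  }
  if (callTimes.length >= 60) {
    // Wait until the oldest call leaves the window.
    const waitMs = 60000 - (now - callTimes[0]) + 50;
    await new Promise(function (resolve) { setTimeout(resolve, waitMs); });
  }
  callTimes.push(Date.now());
  return xeroGet(url);
}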

Requesting data from the Xero API

Let’s assume that you would like to retrieve all the invoices that you have issued through Xero and put the information in your data warehouse to perform analytics and reporting. To do that, you should perform a GET request to the https://api.xero.com/api.xro/2.0/Invoices endpoint.
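A minimal sketch of such a request in Node.js, assuming the request library and placeholder OAuth credentials (the exact signing setup depends on your application type), could look like the following. Here we ask for JSON via the Accept header; omitting it returns the default XML shown below.

JAVASCRIPT
// Minimal sketch: fetch invoices from the Xero Accounting API.
// Assumes the `request` library; all credential values are placeholders
// and the exact OAuth signing setup depends on your application type.
var request = require('request');

request.get({
  url: 'https://api.xero.com/api.xro/2.0/Invoices',
  oauth: {
    consumer_key: 'YOUR_CONSUMER_KEY',
    consumer_secret: 'YOUR_CONSUMER_SECRET',
    token: 'YOUR_ACCESS_TOKEN',
    token_secret: 'YOUR_TOKEN_SECRET'
  },
  headers: { Accept: 'application/json' } // ask for JSON instead of the default XML
}, function (error, response, body) {
  if (!error && response.statusCode === 200) {
    var invoices = JSON.parse(body).Invoices;
    console.log('Fetched %d invoices', invoices.length);
  }
});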

A typical result, in the default XML format, from performing such a request looks like the following:

XML
<Invoices>
<Invoice>
<Type>ACCREC</Type>
<Contact>
<ContactID>025867f1-d741-4d6b-b1af-9ac774b59ba7</ContactID>
<ContactStatus>ACTIVE</ContactStatus>
<Name>City Agency</Name>
<Addresses>
<Address>
<AddressType>STREET</AddressType>
</Address>
<Address>
<AddressType>POBOX</AddressType>
<AddressLine1>L4, CA House</AddressLine1>
<AddressLine2>14 Boulevard Quay</AddressLine2>
<City>Wellington</City>
<PostalCode>6012</PostalCode>
</Address>
</Addresses>
<Phones>
<Phone>
<PhoneType>DEFAULT</PhoneType>
</Phone>
<Phone>
<PhoneType>DDI</PhoneType>
</Phone>
<Phone>
<PhoneType>MOBILE</PhoneType>
</Phone>
<Phone>
<PhoneType>FAX</PhoneType>
</Phone>
</Phones>
<UpdatedDateUTC>2009-08-15T00:18:43.473</UpdatedDateUTC>
<IsSupplier>false</IsSupplier>
<IsCustomer>true</IsCustomer>
</Contact>
<Date>2009-05-27T00:00:00</Date>
<DueDate>2009-06-06T00:00:00</DueDate>
<Status>AUTHORISED</Status>
<LineAmountTypes>Exclusive</LineAmountTypes>
<LineItems>
<LineItem>
<Description>Onsite project management </Description>
<Quantity>1.0000</Quantity>
<UnitAmount>1800.00</UnitAmount>
<TaxType>OUTPUT</TaxType>
<TaxAmount>225.00</TaxAmount>
<LineAmount>1800.00</LineAmount>
<AccountCode>200</AccountCode>
<Tracking>
<TrackingCategory>
<TrackingCategoryID>e2f2f732-e92a-4f3a-9c4d-ee4da0182a13</TrackingCategoryID>
<Name>Activity/Workstream</Name>
<Option>Onsite consultancy</Option>
</TrackingCategory>
</Tracking>
<LineItemID>52208ff9-528a-4985-a9ad-b2b1d4210e38</LineItemID>
</LineItem>
</LineItems>
<SubTotal>1800.00</SubTotal>
<TotalTax>225.00</TotalTax>
<Total>2025.00</Total>
<UpdatedDateUTC>2009-08-15T00:18:43.457</UpdatedDateUTC>
<CurrencyCode>NZD</CurrencyCode>
<InvoiceID>243216c5-369e-4056-ac67-05388f86dc81</InvoiceID>
<InvoiceNumber>OIT00546</InvoiceNumber>
<Payments>
<Payment>
<Date>2009-09-01T00:00:00</Date>
<Amount>1000.00</Amount>
<PaymentID>0d666415-cf77-43fa-80c7-56775591d426</PaymentID>
</Payment>
</Payments>
<AmountDue>1025.00</AmountDue>
<AmountPaid>1000.00</AmountPaid>
<AmountCredited>0.00</AmountCredited>
</Invoice>
</Invoices>

It is possible to paginate your results using the paging support of the Xero API, which is very useful when you have to work with a large number of invoices. It is also possible to request only the most recently changed invoices by providing a “Modified After” filter on the GET request, which is passed as the If-Modified-Since HTTP header.

Its value is a UTC timestamp (yyyy-mm-ddThh:mm:ss), e.g. 2009-11-12T00:00:00; only invoices created or modified since that timestamp will be returned.
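As a sketch, and building on the hypothetical request setup with placeholder credentials shown earlier, combining paging with the If-Modified-Since header might look like this:

JAVASCRIPT
// Minimal sketch: fetch one page of invoices, limited to records modified
// since a given UTC timestamp. Builds on the hypothetical `request` setup
// shown earlier; all credential values are placeholders.
function fetchInvoicesPage(page, modifiedSince, callback) {
  request.get({
    url: 'https://api.xero.com/api.xro/2.0/Invoices',
    qs: { page: page },                      // paging support
    headers: {
      Accept: 'application/json',
      'If-Modified-Since': modifiedSince     // e.g. '2009-11-12T00:00:00'
    },
    oauth: {
      consumer_key: 'YOUR_CONSUMER_KEY',
      consumer_secret: 'YOUR_CONSUMER_SECRET',
      token: 'YOUR_ACCESS_TOKEN',
      token_secret: 'YOUR_TOKEN_SECRET'
    }
  }, function (error, response, body) {
    if (error) return callback(error);
    callback(null, JSON.parse(body).Invoices);
  });
}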

Xero exposes a very rich API that gives you the opportunity to get very granular data about your accounting activities and use it for analytics and reporting purposes. Of course, this richness comes with a price: many resources have to be handled, and only some of them support fetching incremental updates.

Load Data from Xero to SQL Data Warehouse

SQL Data Warehouse supports numerous options for loading data, such as:

  • PolyBase
  • Azure Data Factory
  • BCP command-line utility
  • SQL Server Integration Services

As we are interested in loading data from online services through their exposed HTTP APIs, we will not consider the BCP command-line utility or SQL Server Integration Services in this guide. Instead, we’ll load our data into Azure Storage blobs and then use PolyBase to load the data into SQL Data Warehouse.

Accessing these services also happens through HTTP APIs; as we see again, APIs play an important role both in extracting data and in loading it into our data warehouse. You can access these APIs with a tool like cURL, Postman, or Apirise, or with the libraries Microsoft provides for your favorite language. Before you upload any data, you have to create a container, a concept similar to an Amazon S3 bucket. Creating a container is a straightforward operation, and you can do it by following the instructions in Microsoft's Blob storage documentation. As an example, the following code creates a container in Node.js, assuming the azure-storage SDK and a storage connection string in your environment:

JAVASCRIPT
// Assumes the azure-storage Node.js SDK and that the
// AZURE_STORAGE_CONNECTION_STRING environment variable is set.
var azure = require('azure-storage');
var blobSvc = azure.createBlobService();

blobSvc.createContainerIfNotExists('mycontainer', { publicAccessLevel: 'blob' }, function (error, result, response) {
  if (!error) {
    // Container exists and allows
    // anonymous read access to blob
    // content and metadata within this container
  }
});

After creating the container, you can start uploading data to it using the same SDK in a similar fashion:

JAVASCRIPT
// Upload a local file into the container as a block blob.
blobSvc.createBlockBlobFromLocalFile('mycontainer', 'myblob', 'test.txt', function (error, result, response) {
  if (!error) {
    // file uploaded
  }
});

When you are done putting your data into Azure Blobs, you can load it into SQL Data Warehouse using PolyBase. To do that, follow the directions in the Load with PolyBase documentation. In summary, the required steps are the following:

  • create a database master key
  • create a database scoped credential
  • create an external file format
  • create an external data source

PolyBase’s ability to transparently parallelize loads from Azure Blob Storage will make it the fastest tool for loading data. After configuring PolyBase, you can load data directly into your SQL Data Warehouse by simply creating an external table that points to your data in storage and then mapping that data to a new table within SQL Data Warehouse.
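As a rough sketch, and using placeholder names, credentials, and file layout, the T-SQL for these steps looks roughly like this:

SQL
-- Rough sketch with placeholder names, credentials, and file layout.
CREATE MASTER KEY;

CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH IDENTITY = 'user',
     SECRET = '<azure_storage_account_key>';

CREATE EXTERNAL DATA SOURCE AzureStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://mycontainer@<account_name>.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);

CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- External table pointing at the exported invoice files in blob storage.
CREATE EXTERNAL TABLE dbo.XeroInvoicesExternal (
    InvoiceID NVARCHAR(36),
    InvoiceNumber NVARCHAR(255),
    Total DECIMAL(18, 2)
)
WITH (
    LOCATION = '/invoices/',
    DATA_SOURCE = AzureStorage,
    FILE_FORMAT = TextFileFormat
);

-- Load the data into a regular table inside SQL Data Warehouse.
CREATE TABLE dbo.XeroInvoices
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.XeroInvoicesExternal;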

Of course, you will need to establish a recurring process that extracts any newly created data from your service, loads it in the form of Azure Blobs, and initiates the PolyBase process to import the data into SQL Data Warehouse again. One way of doing this is by using the Azure Data Factory service. If you would like to follow this path, you can read the documentation on how to move data to and from Azure SQL Data Warehouse using Azure Data Factory.

The best way to load data from Xero to SQL Data Warehouse and possible alternatives

So far we have just scratched the surface of what can be done with Microsoft Azure SQL Data Warehouse and how to load data into it. The way to proceed depends heavily on the data you want to load, the service it is coming from, and the requirements of your use case. Things can get even more complicated if you want to integrate data coming from different sources. Instead of writing, hosting, and maintaining a flexible data infrastructure, a possible alternative is to use a product like RudderStack, which can handle this kind of problem for you automatically.

RudderStack integrates with multiple sources and services like databases, CRMs, email campaigns, analytics tools, and more. It lets you quickly and safely move all your data from Xero into SQL Data Warehouse and start generating insights from it.

Sign Up For Free And Start Sending Data
Test out our event stream, ELT, and reverse-ETL pipelines. Use our HTTP source to send data in less than 5 minutes, or install one of our 12 SDKs in your website or app.