How to load data from LinkedIn Ads to Snowflake
Access your data on LinkedIn Ads
The first step in loading your LinkedIn Ads data to any kind of data warehouse solution is to access your data and start extracting it.
Using the REST API that LinkedIn Ads offers, you can programmatically interact with your account and access your digital advertising data. By doing so, you can retrieve aggregated metrics that include, among others, the following:
- Counts of clicks on the action button and on the ad unit
- The number of impressions or clicks for each card of a carousel ad, as well as creative landing page clicks
- Counts of comments and of likes on each comment
- The value of conversions and cost in the account’s local currency
All the available aggregated metrics can be retrieved for any user-defined time period.
In addition to the above, keep the following in mind when dealing with the LinkedIn Ads API:
- Rate limits. Daily request quotas apply per application, per user, and per application developer, as described in the documentation. These vary depending on the application tier. The current day’s usage and limits can be found in the application settings (choose application → Application Settings → Usage & Limits).
- Authentication. LinkedIn uses OAuth 2.0 for authentication. An access token is valid for 60 days, so adding it to the request header as a bearer token is sufficient to fetch the reports.
- Pagination. API endpoints that return a collection of items are always paginated, as shown in the sketch below.
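To make the extraction step concrete, here is a minimal Python sketch that pulls aggregated metrics from the Ad Analytics endpoint with a bearer token and walks through the paginated results. The endpoint path, query parameters, and pivot value are illustrative assumptions; check the LinkedIn Marketing API documentation for the exact request shape your application supports.

```python
import datetime
import requests

ACCESS_TOKEN = "YOUR_60_DAY_ACCESS_TOKEN"   # obtained through LinkedIn's OAuth 2.0 flow
BASE_URL = "https://api.linkedin.com/v2/adAnalyticsV2"  # illustrative endpoint path

def fetch_ad_analytics(start, end, account_urn, page_size=100):
    """Yield analytics records for a user-defined time period, following pagination."""
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    params = {
        "q": "analytics",
        "pivot": "CREATIVE",                  # assumed pivot; could also be CAMPAIGN, etc.
        "timeGranularity": "DAILY",
        "dateRange.start.day": start.day,
        "dateRange.start.month": start.month,
        "dateRange.start.year": start.year,
        "dateRange.end.day": end.day,
        "dateRange.end.month": end.month,
        "dateRange.end.year": end.year,
        "accounts[0]": account_urn,
        "start": 0,
        "count": page_size,
    }
    while True:
        response = requests.get(BASE_URL, headers=headers, params=params)
        response.raise_for_status()           # 429 responses signal the daily quota was hit
        elements = response.json().get("elements", [])
        yield from elements
        if len(elements) < page_size:         # last page reached
            return
        params["start"] += page_size          # advance to the next page

rows = list(fetch_ad_analytics(datetime.date(2023, 1, 1),
                               datetime.date(2023, 1, 31),
                               "urn:li:sponsoredAccount:123456789"))
```

Each returned element is typically a flat record of metrics (impressions, clicks, costInLocalCurrency, and so on) for one creative and time bucket, which makes the later mapping to table columns straightforward.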
Transform and prepare your LinkedIn Ads data
After you have accessed your data on LinkedIn Ads, you will have to transform it based on two main factors:
- The limitations of the database that is going to be used
- The type of analysis that you plan to perform
Each system has specific limitations on the data types and data structures that it supports. For example, if you want to push data into Google BigQuery, you can send nested data like JSON directly, whereas a strictly tabular schema would require you to flatten that data first.
Additionally, you have to choose the right data types. Again, depending on the system you will send data to and the data types the API exposes, you will have to make the right choices. These choices are important because they can limit the expressivity of your queries and constrain what your analysts can do directly out of the database.
Also, keep in mind that the reports you get from LinkedIn Ads are structured like CSV files, so you need to decide how their fields will map to tables and columns in your database.
Data in Snowflake is organized around tables with a well-defined set of columns, with each one having a specific data type.
Snowflake supports a rich set of data types. It is worth mentioning that a number of semi-structured data types are also supported. It is possible to load data directly in JSON, Avro, ORC, Parquet, or XML format with Snowflake. Hierarchical data is treated as a first-class citizen, similar to what Google BigQuery offers.
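To illustrate what that looks like in practice, the short sketch below stores a raw JSON report in a single VARIANT column and queries nested fields with path notation. It uses the snowflake-connector-python package, and the table, column, and field names are made up for illustration.

```python
import snowflake.connector

# Placeholder connection details; replace with your own account settings.
conn = snowflake.connector.connect(
    user="MY_USER", password="MY_PASSWORD", account="MY_ACCOUNT",
    warehouse="MY_WH", database="MY_DB", schema="PUBLIC",
)
cur = conn.cursor()

# A single VARIANT column can hold each raw JSON report record as-is.
cur.execute("CREATE TABLE IF NOT EXISTS linkedin_ads_raw (report VARIANT)")

# Nested fields remain directly queryable through path notation and casts.
cur.execute("""
    SELECT report:dateRange.start.day::INT     AS day,
           report:impressions::INT             AS impressions,
           report:costInLocalCurrency::FLOAT   AS cost
    FROM linkedin_ads_raw
""")
print(cur.fetchall())
conn.close()
```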
There is also one notable common data type that Snowflake does not support: LOB, or large object. Instead, you should use a BINARY or VARCHAR type, although these are not particularly useful for data warehouse use cases.
A typical strategy for loading data from LinkedIn Ads to Snowflake is to create a schema where you will map each API endpoint to a table.
Each key inside the LinkedIn Ads API endpoint response should be mapped to a column of that table, and you should ensure the right conversion to a Snowflake data type.
Of course, as the data types returned by the LinkedIn Ads API might change over time, you will need to adapt your database tables accordingly; unfortunately, there is no such thing as automatic data type casting.
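One way to keep that mapping explicit and easy to adjust is to maintain it in code and generate the DDL from it. The field names and Snowflake types below are assumptions for illustration; revisit them whenever the API response changes.

```python
# Illustrative mapping of report fields to Snowflake column types.
ANALYTICS_COLUMNS = {
    "creative_urn": "VARCHAR",
    "date_start": "DATE",
    "impressions": "NUMBER",
    "clicks": "NUMBER",
    "cost_in_local_currency": "FLOAT",
    "external_website_conversions": "NUMBER",
}

def build_create_table(table_name: str, columns: dict) -> str:
    """Render a CREATE TABLE statement from the field-to-type mapping."""
    column_defs = ",\n  ".join(f"{name} {sql_type}" for name, sql_type in columns.items())
    return f"CREATE TABLE IF NOT EXISTS {table_name} (\n  {column_defs}\n)"

print(build_create_table("linkedin_ad_analytics", ANALYTICS_COLUMNS))
```

When the API adds or changes a field, updating this single dictionary (and issuing the corresponding ALTER TABLE) keeps the schema definition and the load code in sync.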
After you have a complete and well-defined data model or schema for Snowflake, you can move forward and start loading your data into a database.
Load data from LinkedIn Ads to Snowflake
Usually, data is loaded into Snowflake in bulk, using the COPY INTO command. Files containing the data, usually in JSON format, are stored in a local file system or in Amazon S3 buckets. Then a COPY INTO command is invoked on the Snowflake instance, and the data is copied into the data warehouse.
The files can be pushed into a Snowflake staging area using the PUT command before the COPY command is invoked.
Alternatively, you can upload the data directly to a service like Amazon S3, from where Snowflake can access it directly as an external stage.
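Put together, the loading step can look like the sketch below: the PUT command stages a local file of JSON records, and COPY INTO bulk-loads it into the table defined earlier. It assumes the snowflake-connector-python package; the file path and stage are placeholders, and MATCH_BY_COLUMN_NAME is one way to map JSON keys onto column names.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="MY_USER", password="MY_PASSWORD", account="MY_ACCOUNT",
    warehouse="MY_WH", database="MY_DB", schema="PUBLIC",
)
cur = conn.cursor()

# 1. Push the local extract into the table's internal stage with PUT.
cur.execute("PUT file:///tmp/linkedin_ad_analytics.json @%linkedin_ad_analytics")

# 2. Bulk-load the staged file with COPY INTO, matching JSON keys to columns.
cur.execute("""
    COPY INTO linkedin_ad_analytics
    FROM @%linkedin_ad_analytics
    FILE_FORMAT = (TYPE = 'JSON')
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
conn.close()
```

If the files live in Amazon S3 instead, you would point COPY INTO at an external stage created over the bucket rather than running PUT.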
Updating your LinkedIn Ads data on Snowflake
As you generate more data on LinkedIn Ads, you will need to update your older data on Snowflake. This includes both new records and older records that have been updated on LinkedIn Ads for any reason.
You will need to periodically check LinkedIn Ads for new data and repeat the process described above, updating your existing data where needed. For example, updating an already existing row in a Snowflake table is done with UPDATE statements.
Another issue you need to take care of is identifying and removing any duplicate records in your database. Either because LinkedIn Ads does not have a mechanism to identify new and updated records or because of errors in your data pipelines, duplicate records might be introduced into your database.
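One common way to handle both updates and duplicates at once is to load each new extract into a staging table and MERGE it into the target on a natural key, so existing rows are updated in place and only genuinely new rows are inserted. This is just a sketch; the (creative_urn, date_start) key and the staging table name are assumptions about what identifies a report row.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="MY_USER", password="MY_PASSWORD", account="MY_ACCOUNT",
    warehouse="MY_WH", database="MY_DB", schema="PUBLIC",
)

# Upsert freshly extracted rows from the staging table into the target table.
conn.cursor().execute("""
    MERGE INTO linkedin_ad_analytics AS target
    USING linkedin_ad_analytics_staging AS source
      ON  target.creative_urn = source.creative_urn
      AND target.date_start   = source.date_start
    WHEN MATCHED THEN UPDATE SET
      impressions            = source.impressions,
      clicks                 = source.clicks,
      cost_in_local_currency = source.cost_in_local_currency
    WHEN NOT MATCHED THEN INSERT
      (creative_urn, date_start, impressions, clicks, cost_in_local_currency)
    VALUES
      (source.creative_urn, source.date_start, source.impressions,
       source.clicks, source.cost_in_local_currency)
""")
conn.close()
```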
In general, ensuring the quality of data inserted into your database is a big and difficult issue.
The best way to load data from LinkedIn Ads to Snowflake
So far, we have just scratched the surface of what you can do with Snowflake and how to load data into it. Things can get even more complicated if you want to integrate data coming from different sources.
Are you striving to achieve results right now?
Instead of writing, hosting, and maintaining a flexible data infrastructure, use RudderStack to handle everything for you automatically.
RudderStack, with one click, integrates with sources or services, creates analytics-ready data, and syncs your LinkedIn Ads data to Snowflake right away.