How to load data from PostgreSQL to Google BigQuery

Access your data on PostgreSQL

The first step in migrating your PostgreSQL data to any data warehouse solution is accessing and extracting it. There are several ways to do this; one of them is to use PostgreSQL's logical replication log.

With this approach, you listen to the log for changes on the database and reflect them on the target system. When pulling data from a database, you also need to filter tables and columns, find a way to identify updates, and replicate the appropriate database schema, keeping in mind that the data will end up in a columnar database built for analytics.
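One common way to identify updates without consuming the replication log is to rely on a modification timestamp column. Below is a minimal sketch in Python using psycopg2; the events table, its columns, and the connection details are hypothetical:

```python
# Minimal sketch: incremental extraction from PostgreSQL with psycopg2.
# Assumes a hypothetical `events` table with an `updated_at` timestamp
# column that lets us identify rows created or changed since the last sync.
import psycopg2

LAST_SYNC = "2024-01-01 00:00:00"  # in practice, persisted from the previous run

conn = psycopg2.connect(
    host="localhost", dbname="mydb", user="replicator", password="secret"
)
with conn, conn.cursor() as cur:
    # Filter to only the columns and rows the target warehouse needs
    cur.execute(
        "SELECT id, user_id, event_type, updated_at "
        "FROM events WHERE updated_at > %s ORDER BY updated_at",
        (LAST_SYNC,),
    )
    for row in cur:
        print(row)  # in a real pipeline, buffer rows and write them downstream
conn.close()
```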

Another way is to use a JDBC importer. In this case, the input configuration contains all the values needed for database authentication and connection. By configuring the JDBC importer appropriately, you can control each table's behavior during import and, if desired, alter its schema. Moreover, you can simulate pagination by querying tables in batches, as sketched below.
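A JDBC importer is a Java tool, but the batching idea is easy to illustrate in Python with keyset pagination on the primary key; the table, columns, and connection string here are illustrative:

```python
# Sketch of the batching a JDBC importer performs, here with psycopg2:
# pull a large table in fixed-size pages using keyset pagination on the
# primary key instead of reading everything in one query.
import psycopg2

BATCH_SIZE = 10_000

conn = psycopg2.connect("postgresql://replicator:secret@localhost/mydb")
last_id = 0
with conn, conn.cursor() as cur:
    while True:
        cur.execute(
            "SELECT id, user_id, event_type FROM events "
            "WHERE id > %s ORDER BY id LIMIT %s",
            (last_id, BATCH_SIZE),
        )
        rows = cur.fetchall()
        if not rows:
            break
        last_id = rows[-1][0]  # resume after the last key we saw
        # ... write this batch to the target system ...
conn.close()
```

Keyset pagination keeps every batch query cheap; a LIMIT/OFFSET approach would get progressively slower as the offset grows.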

Transform and prepare your PostgreSQL data

After you have accessed your data on PostgreSQL, you will have to transform it based on two main factors:

  1. The limitations of the database that you will load the data into
  2. The type of analysis that you plan to perform


Each system has specific limitations on the data types and data structures that it supports, so depending on the system you will send the data to, you will have to choose your data types accordingly.

For the most common data types, the mapping choices may seem obvious. However, each database system also supports a set of more sophisticated, database-specific types whose mapping requires careful consideration, because the choice can directly limit the expressivity of the queries you run against the destination.
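As an illustration, assuming BigQuery as the destination, a deliberately incomplete mapping might look like the following; the tricky entries are the PostgreSQL-specific ones:

```python
# Illustrative (not exhaustive) mapping from common PostgreSQL types to
# BigQuery types. The simple cases are one-to-one; PostgreSQL-specific
# types such as JSONB or arrays require a deliberate choice.
PG_TO_BIGQUERY = {
    "integer": "INT64",
    "bigint": "INT64",
    "numeric": "NUMERIC",
    "double precision": "FLOAT64",
    "text": "STRING",
    "varchar": "STRING",
    "boolean": "BOOL",
    "timestamp with time zone": "TIMESTAMP",
    "date": "DATE",
    "jsonb": "STRING",            # often kept as a JSON string, or BigQuery's JSON type
    "integer[]": "ARRAY<INT64>",  # arrays map to REPEATED fields
}
```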

However, if you plan to push the data to another PostgreSQL database, you probably don't have to worry about data types, unless the analysis you plan to perform gives you a reason to change them.



Load data from PostgreSQL to Google BigQuery

If you want to load PostgreSQL data to Google BigQuery, you have to use one of the following supported data sources:

  1. Google Cloud Storage
  2. Send data directly to BigQuery with a POST request
  3. Google Cloud Datastore Backup
  4. Streaming insert
  5. App Engine log files
  6. Cloud Storage logs

From the above list of sources, 5 and 6 are not applicable in our case.
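As a quick illustration of option 4 before walking through the Cloud Storage path: a streaming insert with the official google-cloud-bigquery Python client can be as small as the sketch below. The project, dataset, table, and rows are placeholders, and the destination table must already exist:

```python
# Minimal sketch of a streaming insert with google-cloud-bigquery.
# Placeholder project, dataset, table, and rows; the table must exist.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.events"

rows = [
    {"id": 1, "user_id": 42, "event_type": "signup"},
    {"id": 2, "user_id": 43, "event_type": "login"},
]

errors = client.insert_rows_json(table_id, rows)  # rows appear almost immediately
if errors:
    print(f"Streaming insert failed: {errors}")
```

Streamed rows become queryable almost immediately, but streaming inserts are billed separately, whereas batch load jobs are free.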

For Google Cloud Storage, you first have to load your data into it, and there are a few ways to do this. For example, you can use the Cloud Storage console directly, keeping Google's documented best practices in mind.
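You can also script the upload with the official google-cloud-storage Python client. A minimal sketch, assuming the data has already been exported to a local CSV file and using placeholder bucket and object names:

```python
# Sketch of uploading an exported file to Google Cloud Storage.
# The export file could come from `COPY ... TO` or `pg_dump`, for example.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-export-bucket")      # placeholder bucket name
blob = bucket.blob("postgres/events.csv")       # placeholder object name
blob.upload_from_filename("/tmp/events.csv")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")
```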

Another option is to post your data through the JSON API. As we see again, APIs play an important role in both the extraction and the loading of data into our data warehouse. In its simplest case, it's just a matter of one HTTP POST request, using a tool like cURL or Postman.
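The same POST can be made from Python with the requests library against Cloud Storage's simple-upload endpoint. In this sketch the access token, bucket, and object name are placeholders; the token could come from, for example, `gcloud auth print-access-token`:

```python
# Sketch of a simple-media upload to the Cloud Storage JSON API with a
# plain HTTP POST. ACCESS_TOKEN, bucket, and object name are placeholders.
import requests

ACCESS_TOKEN = "ya29...."           # placeholder OAuth 2.0 token
BUCKET = "my-export-bucket"
OBJECT_NAME = "postgres/events.csv"

with open("/tmp/events.csv", "rb") as f:
    resp = requests.post(
        f"https://storage.googleapis.com/upload/storage/v1/b/{BUCKET}/o",
        params={"uploadType": "media", "name": OBJECT_NAME},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "text/csv",
        },
        data=f,  # stream the file as the request body
    )
resp.raise_for_status()
```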

After you have loaded your data into Google Cloud Storage, you have to create a load job that tells BigQuery where to find it. The job should point to the source data in Cloud Storage by providing source URIs that identify the appropriate objects.
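A hedged sketch of such a load job with the google-cloud-bigquery client, again with placeholder names and assuming a CSV export with a header row:

```python
# Sketch of a BigQuery load job pointing at the object staged in
# Cloud Storage. Bucket, project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.events"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # or pass an explicit schema instead
)

load_job = client.load_table_from_uri(
    "gs://my-export-bucket/postgres/events.csv", table_id, job_config=job_config
)
load_job.result()  # block until the job finishes
print(f"Loaded {client.get_table(table_id).num_rows} rows.")
```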

The best way to load data from PostgreSQL to Google BigQuery and possible alternatives

In this post, we just scratched the surface of what can be done with Google BigQuery and how to load data into it. The way to proceed relies heavily on the data you want to load, the service it is coming from, and your use case's requirements. However, things can get even more complicated if you want to integrate data coming from different sources.

Instead of writing, hosting, and maintaining a flexible data infrastructure, a possible alternative is to use a product like RudderStack that can handle this kind of problem automatically for you.

RudderStack integrates seamlessly with multiple sources or services like databases, CRM, email campaigns, analytics, and more. Now you can quickly and safely move all your data from PostgreSQL into Google BigQuery and start generating insights from your data in no time.

Easily sync your product data from PostgreSQL into your Analytics data warehouse without affecting your live product system and combine it with your Sales, Marketing, Support, and Finance data to transform your business insights.

Sign Up For Free And Start Sending Data

Test out our event stream, ELT, and reverse-ETL pipelines. Use our HTTP source to send data in less than 5 minutes, or install one of our 12 SDKs in your website or app.

Don't want to go through the pain of direct integration? RudderStack's Reverse ETL connection makes it easy to send data from PostgreSQL to Google BigQuery.