What is BigQuery Data Transfer Service & how it works

What is BigQuery Data Transfer Service?

The BigQuery Data Transfer Service automatically moves data from a data source into a BigQuery project on a recurring schedule.

When you create a new project in BigQuery, you can either manually import data into one of its data tables or automate the data transfer so that it runs on a regular basis.

If you want to automate the data transfer on a regular basis (which you most likely would like to do), then you need to create one or more data transfers for your BigQuery project:

From the screenshot above, we can conclude that the BigQuery project named ‘optimizesmart.com’ is connected to the following data transfer services:

#1 Transfer Google Search Console to BigQuery 

This data transfer service is used to extract data from the ‘Google Search Console’ account and then move it to the ‘optimizesmart.com’ project on a regular basis.

#2 Transfer Google Ads to BigQuery 

This data transfer service is used to extract data from the ‘Google Ads’ account and then move it to the ‘optimizesmart.com’ project on a regular basis.

#3 Transfer Facebook Ads to BigQuery 

This data transfer service is used to extract data from the ‘Facebook Ads’ account and then move it to the ‘optimizesmart.com’ project on a regular basis.

#4 Transfer GA to BigQuery 

This data transfer service is used to extract data from the ‘Google Analytics’ account and then move it to the ‘optimizesmart.com’ project on a regular basis.

So, one BigQuery project is automatically getting data from several data sources on a regular basis.

How to access the BigQuery Data Transfer Service

You can access the BigQuery Data Transfer Service via the Cloud Console, the bq command-line tool, or the BigQuery Data Transfer Service API.
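
If you prefer working with the API, the official Python client library can list the transfers attached to a project. The snippet below is a minimal sketch: it assumes you have installed the google-cloud-bigquery-datatransfer package, set up Application Default Credentials, and replaced the placeholder project ID and location with your own.

```python
# Minimal sketch: list the data transfers configured for a project.
# Assumes: pip install google-cloud-bigquery-datatransfer
# and Application Default Credentials (e.g. `gcloud auth application-default login`).
from google.cloud import bigquery_datatransfer

project_id = "your-project-id"  # placeholder: replace with your own project ID

client = bigquery_datatransfer.DataTransferServiceClient()

# Transfer configs live under projects/{project}/locations/{location};
# "us" is only an example location.
parent = f"projects/{project_id}/locations/us"

for config in client.list_transfer_configs(parent=parent):
    print(config.display_name, config.data_source_id, config.schedule)
```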

How much does BigQuery Data Transfer Service cost?

You can find details about BigQuery Data Transfer Service pricing here: https://cloud.google.com/bigquery-transfer/pricing

Prerequisites for creating a data transfer service in BigQuery

Before you can create a data transfer service in BigQuery:

#1 You need access to a BigQuery project (remember, a data transfer is created for a particular project):

#2 You need access to a data source:

We pull data into BigQuery from this data source. 

The following are examples of data sources:

  • Amazon S3
  • Campaign Manager
  • Google Ad Manager
  • Google Ads
  • Google Analytics by Supermetrics 
  • Google Merchant Center, etc.

A data source can also be a third-party connector (like a Supermetrics connector) that pulls data from another platform on your behalf.

#3 You would need to decide the scheduling options: 

The scheduling options denote the day and time when the data should automatically be extracted from a data source.

#4 You would need to decide the format in which the data should be extracted from a data source.

We decide this format by defining the schema (structure) of our data tables:

#5 You would need access to a dataset (also called the destination dataset):

This dataset contains one or more tables where the extracted data would be stored in BigQuery. 
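
Once you have decided on all of the prerequisites above, a data transfer can also be set up programmatically. The sketch below creates a destination dataset and then registers a transfer config against it. Treat it as an illustration only: the project ID, dataset ID, data source ID and params shown here are placeholders, and each data source (or third-party connector) documents its own data_source_id and parameter set.

```python
# Minimal sketch: create a destination dataset and attach a scheduled transfer to it.
# Assumes: pip install google-cloud-bigquery google-cloud-bigquery-datatransfer
from google.cloud import bigquery, bigquery_datatransfer

project_id = "your-project-id"         # placeholder
dataset_id = "my_destination_dataset"  # placeholder

# 1. Create the destination dataset (prerequisite #5).
bq_client = bigquery.Client(project=project_id)
dataset = bigquery.Dataset(f"{project_id}.{dataset_id}")
dataset.location = "US"
bq_client.create_dataset(dataset, exists_ok=True)

# 2. Create the transfer config, including its schedule (prerequisite #3).
transfer_client = bigquery_datatransfer.DataTransferServiceClient()
transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id=dataset_id,
    display_name="Example transfer",
    data_source_id="google_ads",              # placeholder: each source/connector has its own ID
    params={"customer_id": "123-456-7890"},   # placeholder: params depend on the data source
    schedule="every 24 hours",                # example schedule; the day/time is up to you
)
transfer_config = transfer_client.create_transfer_config(
    parent=transfer_client.common_project_path(project_id),
    transfer_config=transfer_config,
)
print("Created transfer:", transfer_config.name)
```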

Before you can create a data transfer in BigQuery, you should decide the format in which the data should be extracted from a data source. 

We decide this format by defining the schema (structure) of our data tables. 

There are two types of schemas: 

  1. Default (standard) schema
  2. Custom schema

If you don’t create a custom schema before creating a data transfer, the default schema is used. This schema is defined either by the data source itself or by the connector you use to extract data from it.

For example, if you are using the Supermetrics connector to pull data from a data source (like Google Analytics) and you did not create a custom schema before creating the data transfer, then the default schema (provided by the Supermetrics connector) will be used.

Earlier when we sent Google Analytics data to BigQuery via the Supermetrics Google Analytics connector, we used the default schema provided by Supermetrics while creating the data transfer.

As a result, the Supermetrics Google Analytics connector automatically created a set of tables in the specified dataset, but it did not give us any option to choose which data tables were created or which fields appeared in them:

When you use the default schema you have no control over the fields (columns) that appear in your data tables and no control over the number and type of data tables you see in your dataset.

If you want your data tables to contain only the fields you care about, you first need to create your own schema and then select that schema while creating your data transfer.

The custom schema that you create depends on the data source being used.

For example, if you are using Google Analytics as a data source then you define a schema by selecting the dimensions and metrics you want to see in your data table. 

Then you add one or more queries to it.
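
To make the idea of a schema more concrete, here is a sketch of what an equivalent table structure could look like if you defined it yourself with the BigQuery client library. The field names below (date, channel, sessions, conversions) are purely illustrative stand-ins for whichever dimensions and metrics you pick; a connector such as Supermetrics has its own interface for selecting them.

```python
# Minimal sketch: define a custom table structure (schema) in BigQuery.
# The dimensions/metrics below are hypothetical examples, not a prescribed schema.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("date", "DATE", mode="REQUIRED"),           # dimension
    bigquery.SchemaField("channel", "STRING", mode="NULLABLE"),      # dimension
    bigquery.SchemaField("sessions", "INTEGER", mode="NULLABLE"),    # metric
    bigquery.SchemaField("conversions", "INTEGER", mode="NULLABLE"), # metric
]

# Placeholder project, dataset and table names.
table = bigquery.Table("your-project-id.my_destination_dataset.ga_custom_report", schema=schema)
client.create_table(table, exists_ok=True)
```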

How to edit a data transfer in BigQuery

Once you have created a data transfer, you can edit it by changing any or all of the following settings:

  • Display Name
  • Schedule Options
  • Destination Dataset
  • Configuration (schema, account)

To edit your data transfer service, follow the steps below:

Step-1: Navigate to your BigQuery account: https://console.cloud.google.com/bigquery

Step-2: Make sure that you are in the project whose data transfer service you want to edit:

Step-3: Click on the ‘Data transfers’ link in the left-hand side navigation:

Step-4: Click on the name of the data transfer service you want to edit:

You should now see a screen like the one below:

Step-5: Click on the ‘EDIT’ button at the top right-hand side of your screen:

You can now do the following:

#1 Change the display name of your data transfer service:

#2 Change the Schedule options:

#3 Change the destination dataset:

#4 Change the Configuration Settings (like changing the schema or data source):

Step-6: Once you have edited your data transfer service, click on the ‘Save’ button to save the changes you made:

In addition to editing your data transfer service, you can also delete it, disable it or schedule a backfill:
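
The same edits, and the backfill mentioned above, can also be made through the Data Transfer Service API. The sketch below is illustrative only: the transfer config name and dates are placeholders, and it assumes the google-cloud-bigquery-datatransfer package and valid credentials.

```python
# Minimal sketch: rename a transfer, change its schedule, and schedule a backfill.
# The transfer config name below is a placeholder; copy the real one from the
# transfer's details page or from list_transfer_configs().
import datetime

from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

client = bigquery_datatransfer.DataTransferServiceClient()

config_name = "projects/your-project-id/locations/us/transferConfigs/your-config-id"  # placeholder

# Edit the display name and schedule (equivalent to the 'EDIT' screen).
transfer_config = bigquery_datatransfer.TransferConfig(
    name=config_name,
    display_name="Transfer Google Ads to BigQuery (renamed)",
    schedule="every 24 hours",
)
update_mask = field_mask_pb2.FieldMask(paths=["display_name", "schedule"])
client.update_transfer_config(transfer_config=transfer_config, update_mask=update_mask)

# Schedule a backfill for a past date range (times must be timezone-aware UTC).
start = datetime.datetime(2023, 1, 1, tzinfo=datetime.timezone.utc)
end = datetime.datetime(2023, 1, 7, tzinfo=datetime.timezone.utc)
client.schedule_transfer_runs(parent=config_name, start_time=start, end_time=end)
```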

Other articles on Google Analytics BigQuery

  1. Advantages of using Google BigQuery for Google Analytics
  2. Cost of using BigQuery for Google Analytics
  3. Guide to BigQuery Cost optimization
  4. What is Google BigQuery Sandbox and how to use it
  5. Understanding the BigQuery User Interface
  6. Sending data from Google Analytics to BigQuery without 360
  7. How to connect GA4 (Google Analytics 4) with BigQuery
  8. events_ & events_intraday_ tables in BigQuery for GA4 (Google Analytics 4)
  9. Using Google Cloud pricing calculator for BigQuery
  10. How to access BigQuery Public Data Sets
  11. How to use Google Analytics sample dataset for BigQuery
  12. Connect and transfer data from Google Sheets to BigQuery
  13. How to query Google Analytics data in BigQuery
  14. How to send data from Google Ads to BigQuery
  15. How to send data from Facebook ads to BigQuery
  16. How to send data from Google Search Console to BigQuery
  17. How to pull custom data from Google Analytics to BigQuery
  18. Best Supermetrics Alternative – Dataddo
  19. Google Analytics BigQuery Tutorial
  20. How to backfill Google Analytics data in BigQuery


