The BigQuery Data Transfer Service automatically moves data from a data source into a BigQuery project on a recurring schedule.
When you create a new BigQuery project, you can either manually import data into its data tables or automate the data transfer. To automate it (which is what you would most likely want), you need to create one or more data transfers for your BigQuery project:
From the screenshot above, we can see that the BigQuery project named ‘optimizesmart.com’ is connected to the following data transfers:
#1 Transfer Google Search Console to BigQuery
This data transfer extracts data from the ‘Google Search Console’ account and moves it to the ‘optimizesmart.com’ project on a regular basis.
#2 Transfer Google Ads to BigQuery
This data transfer extracts data from the ‘Google Ads’ account and moves it to the ‘optimizesmart.com’ project on a regular basis.
#3 Transfer Facebook Ads to BigQuery
This data transfer extracts data from the ‘Facebook Ads’ account and moves it to the ‘optimizesmart.com’ project on a regular basis.
#4 Transfer GA to BigQuery
This data transfer extracts data from the ‘Google Analytics’ account and moves it to the ‘optimizesmart.com’ project on a regular basis.
So, one BigQuery project can automatically receive data from several data sources on a regular basis.
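As a rough sketch, each transfer listed above corresponds to a TransferConfig resource in the BigQuery Data Transfer API. The dataset name, data-source ID, customer ID, and schedule below are illustrative placeholders, not values taken from the screenshot:

```python
import json

# Sketch of a TransferConfig resource, as accepted by the BigQuery Data
# Transfer API's transferConfigs.create method. All values below are
# illustrative placeholders.
transfer_config = {
    "displayName": "Transfer Google Ads to BigQuery",
    "dataSourceId": "google_ads",            # identifies the data source/connector
    "destinationDatasetId": "marketing_data",  # where extracted data lands
    "schedule": "every 24 hours",            # the scheduling option
    "params": {
        "customer_id": "123-456-7890",       # hypothetical Google Ads account
    },
}

print(json.dumps(transfer_config, indent=2))
```

One such configuration exists per data source, which is how a single project ends up receiving data from Search Console, Google Ads, Facebook Ads, and Google Analytics at the same time.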
Prerequisites for creating a data transfer service in BigQuery
Before you can create a data transfer service in BigQuery:
#1 You need access to a BigQuery project (remember, a data transfer is created for a particular project):
#2 You need access to a data source:
We pull data into BigQuery from this data source.
Following are examples of data sources:
Amazon S3
Campaign Manager
Google Ad Manager
Google Ads
Google Analytics by Supermetrics
Google Merchant Center, etc.
A data source can also be a third-party connector (like a Supermetrics connector) that pulls data from another platform.
#3 You would need to decide on the scheduling options:
The scheduling options specify the day and time when data should automatically be extracted from the data source.
#4 You would need to decide the format in which the data should be extracted from a data source.
We decide this format by defining the schema (structure) of our data tables:
#5 You would need access to a dataset (also called destination dataset):
This dataset contains one or more tables where the extracted data would be stored in BigQuery.
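Prerequisite #4 above refers to the table schema. A BigQuery table schema can be written in its JSON form: a list of field definitions, each with a name, data type, and mode. The field names below are hypothetical examples for a marketing table, not a schema from the article:

```python
import json

# A BigQuery table schema in JSON form: a list of field definitions.
# Field names here are hypothetical examples.
schema = [
    {"name": "date",     "type": "DATE",    "mode": "REQUIRED"},
    {"name": "campaign", "type": "STRING",  "mode": "NULLABLE"},
    {"name": "clicks",   "type": "INTEGER", "mode": "NULLABLE"},
    {"name": "cost",     "type": "FLOAT",   "mode": "NULLABLE"},
]

print(json.dumps(schema, indent=2))
```

Defining the schema up front is what lets you control which columns appear in the destination tables, which is the subject of the next section.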
Default schema vs custom schema in BigQuery
Before you create a data transfer in BigQuery, you should decide the format in which data is extracted from the data source. You define this format through the schema (structure) of your data tables.
There are two types of schemas:
Default (standard) schema
Custom schema
If you don’t create a custom schema before creating a data transfer, the default schema is used. This schema is defined either by the data source itself or by the connector you use to extract data from it.
For example, if you use the Supermetrics connector to pull data from a data source (like Google Analytics) and you did not create a custom schema before creating the data transfer, then the default schema (provided by the Supermetrics connector) is used.
Earlier when we sent Google Analytics data to BigQuery via the Supermetrics Google Analytics connector, we used the default schema provided by Supermetrics while creating the data transfer.
As a result, the Supermetrics Google Analytics connector automatically created a set of tables in the specified dataset, but it gave us no option to choose which data tables to create or which fields to include in them:
When you use the default schema you have no control over the fields (columns) that appear in your data tables and no control over the number and type of data tables you see in your dataset.
If you want your data tables to contain only the fields you need, you must first create your own schema and then select it while creating your data transfer.
The custom schema you create depends on the data source being used.
For example, if you are using Google Analytics as a data source, you define a schema by selecting the dimensions and metrics you want to see in your data table, and then you add one or more queries to it.
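As an illustration of that idea, a custom schema for a Google Analytics source pairs the dimensions and metrics you select with column definitions. The dimension and metric names below are hypothetical examples, and the type mapping (strings for dimensions, floats for metrics) is an assumption for the sketch:

```python
# Hypothetical custom schema for a Google Analytics data source:
# each selected dimension or metric becomes a column in the data table.
dimensions = ["date", "deviceCategory", "sourceMedium"]
metrics = ["sessions", "transactions", "transactionRevenue"]

schema = (
    [{"name": d, "type": "STRING", "mode": "NULLABLE"} for d in dimensions]
    + [{"name": m, "type": "FLOAT", "mode": "NULLABLE"} for m in metrics]
)

for field in schema:
    print(field["name"], field["type"])
```

The point of the exercise is that you, rather than the connector, decide exactly which columns the resulting table contains.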
How to edit a data transfer in BigQuery
Once you have created a data transfer, you can edit it by changing one or more of the following settings:
Display Name
Schedule Options
Destination Dataset
Configuration (schema, account)
To edit your data transfer service follow the steps below:
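For a programmatic view of the same edit, the settings listed above correspond to fields on the TransferConfig resource; an update is essentially a patch that names the changed fields in an update mask. The display name and schedule values below are placeholders:

```python
import json

# Sketch of an update to an existing transfer config, in the spirit of
# the Data Transfer API's transferConfigs.patch method: only fields named
# in the update mask are changed. Values are illustrative placeholders.
update = {
    "displayName": "GA to BigQuery (daily)",
    "schedule": "every 24 hours",
}
update_mask = ",".join(("display_name", "schedule"))

request_summary = {"transferConfig": update, "updateMask": update_mask}
print(json.dumps(request_summary, indent=2))
```

Changing the destination dataset or the configuration (schema, account) works the same way: include the field in the body and name it in the mask.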
About the Author
Himanshu Sharma
Founder, OptimizeSmart.com