GA4 BigQuery Export Schema Tutorial
Once you understand how the GA4 data is stored in BigQuery data tables, it will become easier for you to query it.
What is the GA4 BigQuery Export Schema?
The GA4 BigQuery export schema refers to the structure of GA4 and Firebase data that is exported to a Google BigQuery project.
This schema defines how the data is organized within datasets and data tables.
To understand the GA4 BigQuery export schema, you will first need to understand the basic structure of how data is stored in a BigQuery project.
- Google Cloud Platform (GCP) can consist of one or more organizations.
- Each organization can consist of one or more projects.
- One such project can be a BigQuery project. It’s the environment within which all BigQuery datasets and operations reside.
- Each project has a name, project number, and project ID.
- Each project can consist of one or more datasets.
- Each dataset can consist of one or more data tables.
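This hierarchy means every table is addressed by a fully-qualified ID of the form project.dataset.table. As a minimal sketch (using the example project, dataset, and table names that appear later in this article), you can build such an ID like this:

```python
# Build a fully-qualified BigQuery table ID: project.dataset.table
# The example names ('dbrt-ga4', property ID 207472454) are taken from this article.

def full_table_id(project: str, dataset: str, table: str) -> str:
    return f"{project}.{dataset}.{table}"

table_id = full_table_id("dbrt-ga4", "analytics_207472454", "events_20240407")
print(table_id)  # dbrt-ga4.analytics_207472454.events_20240407
```

This is the same dotted form you will use in the FROM clause when querying the data.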
Project ID and Datasets
When you click on a project ID, you can see its datasets:
Here, the project ID ‘dbrt-ga4’ contains the following datasets:
- analytics_207472454
- custom_ga4
- ga3_data_backup
- google_ads
- historical_ga4_data
- searchconsole
For every GA4 property and every Firebase project linked to BigQuery, a dataset named “analytics_<property_id>” is created in your BigQuery project.
Here, “property_id” is your GA4 Property ID.
The dataset named “analytics_<property_id>” is the one that holds your GA4 export data:
Click on that “analytics_<property_id>” dataset.
You should now be able to see the dataset information, such as the Dataset ID, creation date, last modification date, data location, etc.:
Note down the Dataset ID by copying it to the clipboard:
We are going to reference this dataset ID later when querying the data.
Click on the ‘analytics_’ dataset again.
You should now see the following three to four data tables:
Each dataset is made up of one or more data tables.
The ‘analytics_207472454’ dataset contains the following four data tables:
- events_(<number of daily tables>)
- events_intraday_(<number of daily tables>)
- pseudonymous_users_(<number of daily tables>)
- users_(<number of daily tables>)
The number in parentheses is how the BigQuery UI groups date-sharded tables: it is the count of daily tables in that group, not part of a single table name.
‘events_’ and ‘events_intraday_’ Data Tables
The ‘events_’ and ‘events_intraday_’ data tables contain the event-level GA4 export data in BigQuery.
All the GA4 event data from the previous day(s) is available in the ‘events_’ data table.
This table is automatically imported for each day of export.
events_(1) means the ‘events_’ group contains one daily table, holding the GA4 event data for one exported day.
events_(2) means the group contains two daily tables, covering the previous two days.
events_(3) means the group contains three daily tables, covering the previous three days.
Similarly,
events_(1187) means the group contains 1,187 daily tables, covering the previous 1,187 days.
All the GA4 event data from the current day is available in the ‘events_intraday_’ data table.
This table is automatically updated throughout the day, which is why it is called the ‘events_intraday_’ table.
Note: We usually do not query GA4 data from the ‘events_intraday_’ data table.
‘pseudonymous_users_’ and ‘users_’ Data Tables
The ‘pseudonymous_users_’ and ‘users_’ data tables contain only user-level GA4 export data in BigQuery.
The advantage of using the ‘pseudonymous_users_’ and ‘users_’ data tables over the ‘events_’ and ‘events_intraday_’ data tables is that you get access to more user data.
The ‘pseudonymous_users_’ and ‘users_’ data tables contain audience and prediction data that is not available in the ‘events_’ and ‘events_intraday_’ data tables.
The ‘pseudonymous_users_’ data table contains all the data for every pseudonymous identifier that is not a user ID.
A pseudonymous identifier is a unique identifier (like ‘Google Signals’ or ‘Device ID’) created and assigned to each user by Google Analytics to track users across devices and sessions and to create a complete picture of their behaviour.
The ‘pseudonymous_users_’ data table is updated whenever data for a user is updated.
pseudonymous_users_(1) means all the data for every pseudonymous identifier that is not a user ID from the previous day is available in this data table.
pseudonymous_users_(2) means all the data for every pseudonymous identifier that is not a user ID from the previous two days is available in this data table.
Similarly,
pseudonymous_users_(236) means all the data for every pseudonymous identifier that is not a user ID from the previous 236 days is available in this data table.
The ‘users_’ data table contains all the data for every pseudonymous identifier that is a user ID.
Data for a user is updated when there is a change to one of the fields.
Note: The ‘users_’ data table is not available in your BigQuery project if you are not collecting User-IDs in GA4.
The SCHEMA tab of the data table
Clicking on the ‘events_’ data table will show you the structure of that table (also known as ‘Schema’):
The schema shows you how the data table has been set up, what type of values it accepts, etc.
Take a close look at the various fields available under the ‘SCHEMA’ tab:
We are going to reference these fields when querying the GA4 data.
Bookmark the [GA4] BigQuery Export schema help documentation from Google to find more information about each field:
You can increase or decrease the size of the right-hand side panel by dragging via the mouse:
Selecting Data Table based on Date
The ‘events_’ data tables are named “events_YYYYMMDD”, where “YYYYMMDD” is the date the table was exported to BigQuery.
YYYY denotes the year, for example 2024.
MM denotes the month, for example 04 (April).
DD denotes the day, for example 07.
So the data table exported to BigQuery on April 7th, 2024 is named events_20240407.
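This date-sharded naming is easy to derive programmatically. A minimal sketch:

```python
from datetime import date

def events_table_name(d: date) -> str:
    # GA4 daily export tables are sharded by date: events_YYYYMMDD
    return "events_" + d.strftime("%Y%m%d")

print(events_table_name(date(2024, 4, 7)))  # events_20240407
```

The same pattern applies to the other sharded tables (events_intraday_, pseudonymous_users_, users_), only the prefix changes.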
So you are looking at the data for April 7th, 2024.
If you want to look at data for a different date, then click on the date drop-down menu and select a different date:
The DETAILS and the PREVIEW tabs of the Data Table
Click on the ‘DETAILS’ tab to get information about the data table:
Take note of the table ID:
We are going to reference the table ID later when querying the GA4 data.
Look at the ‘Storage Info’ section to determine the size of your data table:
It is always a best practice to check the size of a table before querying the data from it.
If the size of the data table is just a few kilobytes (KB) or megabytes (MB), you don’t need to worry.
But if the table size is in gigabytes (GB), terabytes (TB) or petabytes (PB), you should be careful how you query your data.
Your monthly cost of using BigQuery depends on two factors:
#1 The amount of data you store in BigQuery (i.e. the storage cost).
#2 The amount of data processed by the queries you run (i.e. the query cost).
At the time of writing, the first 10 GB of active storage is free each month; after that, you are charged $0.020 per GB of active storage.
The first 1 terabyte (TB) of data processed is free each month; after that, you are charged $5 per TB of data processed. Check Google’s current pricing page, as these rates change over time.
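The free tiers and rates above can be turned into a back-of-the-envelope estimate. This is a sketch using the figures quoted in this article; Google’s actual pricing may differ:

```python
def monthly_cost(storage_gb: float, processed_tb: float) -> float:
    # Free tiers per the figures above: 10 GB of active storage, 1 TB processed.
    storage_cost = max(0.0, storage_gb - 10) * 0.020  # $0.020 per GB of active storage
    query_cost = max(0.0, processed_tb - 1) * 5.0     # $5 per TB processed
    return round(storage_cost + query_cost, 2)

# 50 GB stored and 3 TB processed: 40 billable GB + 2 billable TB
print(monthly_cost(50, 3))    # 10.8
print(monthly_cost(5, 0.5))   # 0.0 (entirely within the free tiers)
```

Even a rough estimate like this makes it obvious why checking table sizes before querying matters: the query cost dominates as soon as you scan more than the free terabyte.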
Click on the ‘Preview’ tab to view the data in the ‘events_’ data table:
It is always a best practice to preview a table before querying the data from it.
Many people, especially new users, run queries just to preview the data in a data table. This can cost you a considerable amount if you accidentally query gigabytes or terabytes of data.
Instead of running queries just to preview the data in a data table, click on the ‘Preview’ tab to preview the table.
There is no cost for previewing the data table.
The table preview will give you an idea of what type of data is available in the table without querying the table.
Rows and Columns of the Data Table
From the table preview, you can see that the table is made up of rows and columns:
Use the horizontal slider to see more columns:
Use the vertical slider to see more rows:
Use the ‘Results per page’ drop-down menu if you want to see more than 50 rows:
Note: You can see up to 200 rows per page.
To see the next 200 rows, press the > button:
You can re-size the width of each column in the data table.
How the GA4 data is stored in data tables
Each row of the data table corresponds to a single GA4 event.
For example,
The first row corresponds to the ‘first_visit’ event:
The second row corresponds to the ‘session_start’ event:
Each column of the data table corresponds to a single field (key) of the GA4 event.
Event parameters in GA4 are additional pieces of information about an event that are sent along with the event. The key of an event parameter is the name of the parameter, and the value is the data associated with it.
Here,
‘event_name’ is the key, and ‘first_visit’ is one of its values.
Keys are also known as fields.
Fields can be regular or nested.
A regular field is a single piece of data, such as a string, a number, or a date. A nested field is a collection of fields (often an array of structures) that are stored together as a single unit.
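As an illustration, a single exported event row can be pictured as a Python dictionary: regular fields hold one value, while event_params is an array of key/value structures. The field names below follow the GA4 export schema; the values are invented for illustration:

```python
# A simplified picture of one GA4 event row (sample values are made up).
row = {
    "event_date": "20240407",    # regular field: a single value
    "event_name": "page_view",   # regular field
    "event_params": [            # nested field: an array of structures
        {"key": "page_title", "value": {"string_value": "Home"}},
        {"key": "ga_session_id", "value": {"int_value": 1712345678}},
    ],
}
print(len(row["event_params"]))  # 2
```

Each element of the event_params array is one structure: a key plus a value object.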
The event_params field is an array of structures that contains additional event parameters.
Here,
‘ga_session_id’ is an event parameter.
‘engaged_session_event’ is an event parameter.
Similarly,
‘page_title’ is an event parameter.
Nested fields can themselves contain nested fields, to any depth. Data organized this way is often called hierarchical data: a tree-like structure in which each node can contain other nodes.
For example,
The ‘page_referrer’ event parameter is stored inside the ‘event_params’ nested field: one element of the event_params array holds the key ‘page_referrer’ together with its value.
Likewise, the ‘page_location’ event parameter is stored as another element of the event_params array, with the key ‘page_location’ and its value.
Each event has information on event-specific parameters.
The information about GA4 event parameters is stored in the data table in the key-value format:
The key field (event_params.key) denotes the name of the event parameter.
For example: ‘page_title’:
A value field is an object containing the event parameter’s value in one of its four fields:
- string_value
- int_value
- float_value
- double_value
So we can have the following value fields:
- event_params.value.string_value
- event_params.value.int_value
- event_params.value.float_value
- event_params.value.double_value
The event_params.value.string_value field stores the string value of an event parameter.
The event_params.value.int_value field stores the integer value of an event parameter.
The event_params.value.float_value field stores the float value of an event parameter.
The event_params.value.double_value field stores the double value of an event parameter.
Which field stores the value depends on the parameter’s data type.
For example, if the event parameter is a string, its value is stored in event_params.value.string_value; if it is an integer, the value is stored in event_params.value.int_value, and so on.
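This key/value layout can be mimicked in plain Python: given an event_params array, look up a key and return whichever value field is populated. The field names match the export schema; the sample data is invented:

```python
def get_param(event_params, key):
    # Return the populated value field for the given parameter key, else None.
    for param in event_params:
        if param["key"] == key:
            v = param["value"]
            for field in ("string_value", "int_value", "float_value", "double_value"):
                if v.get(field) is not None:
                    return v[field]
    return None

params = [
    {"key": "page_title", "value": {"string_value": "Home", "int_value": None}},
    {"key": "ga_session_id", "value": {"string_value": None, "int_value": 12345}},
]
print(get_param(params, "page_title"))     # Home
print(get_param(params, "ga_session_id"))  # 12345
```

In BigQuery SQL, the equivalent lookup is typically done with UNNEST(event_params) in a subquery filtered on the key.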
The following table lists the most common event parameters in the GA4 BigQuery data table:
Other articles on GA4 BigQuery
#1 BigQuery Introduction
- How to create a new Google Cloud Platform account.
- How to create a new BigQuery project.
- What is Google BigQuery Sandbox and how to use it.
- Understanding the BigQuery User Interface.
- What is BigQuery Data Transfer Service & how it works.
- How to create data transfer in BigQuery.
- Connect and transfer data from Google Sheets to BigQuery.
- How to access BigQuery Public Data Sets.
- Best Supermetrics Alternative – Dataddo.
#2 GA4 BigQuery Introduction
- Google Analytics 4 BigQuery Tutorial for Beginners to Advanced.
- GA4 BigQuery Export Schema Tutorial.
- GA4 BigQuery – Connect Google Analytics 4 with BigQuery.
- events_ & events_intraday_ tables in BigQuery for GA4 (Google Analytics 4).
- pseudonymous_users_ & users_ data tables in BigQuery for GA4 (Google Analytics 4).
- How to access GA4 Sample Data in BigQuery.
- Advantages of using Google BigQuery for Google Analytics 4.
- Impact of Google Advanced Consent Mode on BigQuery & GDPR.
#3 GA4 BigQuery Data Transfer
- How to Connect and Export Data from GA4 to BigQuery
- How to backfill GA4 data in BigQuery.
- How to overcome GA4 BigQuery Export limit.
- How to Send Custom GA4 Data to BigQuery.
- How to backup Universal Analytics data to BigQuery.
- How to send data from Google Ads to BigQuery.
- How to send data from Google Search Console to BigQuery.
- Sending data from Google Analytics to BigQuery without 360.
- How to send data from Facebook ads to BigQuery.
- How to pull custom data from Google Analytics to BigQuery.
#4 BigQuery Cost Optimization
- Guide to BigQuery Cost Optimization.
- Using Google Cloud pricing calculator for BigQuery.
- Cost of using BigQuery for Google Analytics 4.
#5 Query GA4 BigQuery Data
- How to query Google Analytics data in BigQuery.
- Query GA4 data in BigQuery without understanding SQL.
- Using GA4 BigQuery SQL generator to create SQL queries.
- New vs Returning users in GA4 BigQuery data table.
- GA4 BigQuery Composer Tutorial for ChatGPT.
- How to track GA4 BigQuery Schema Change.
- Calculating Sessions and Engaged Sessions in GA4 BigQuery.
- Calculating Total Users in GA4 BigQuery.
#6 GA4 to BigQuery Dimension/Metric Mapping
- GA4 to BigQuery Mapping Tutorial.
- GA4 Attribution Dimensions to BigQuery Mapping.
- GA4 Google Ads Dimensions to BigQuery Mapping.
- GA4 Demographic Dimensions to BigQuery Mapping.
- GA4 Ecommerce Dimensions to BigQuery Mapping.
- GA4 Event-Scoped Ecommerce Metrics to BigQuery Mapping.
- GA4 Item-Scoped Ecommerce Metrics to BigQuery Mapping.
- GA4 Revenue Metrics to BigQuery Mapping.
- GA4 Event Dimensions to BigQuery Mapping.
- GA4 Event Metrics to BigQuery Mapping.
- GA4 Geography Dimensions to BigQuery Mapping.
- GA4 Link Dimensions to BigQuery Mapping.
- GA4 Page/Screen Dimensions to BigQuery Mapping.
- GA4 Page/Screen Metrics to BigQuery Mapping.
- GA4 Platform/Device Dimensions to BigQuery Mapping.
- GA4 User-Scoped Traffic Dimensions to BigQuery Mapping.
- GA4 Session-Scoped Traffic Dimensions to BigQuery Mapping.
- GA4 Session Metrics to BigQuery Mapping.
- GA4 User Dimensions to BigQuery Mapping.
- GA4 User Metrics to BigQuery Mapping.
- GA4 Advertising Metrics to BigQuery Mapping.