How to overcome GA4 BigQuery Export limit
The two biggest complaints I hear about GA4 are the Google Analytics Data API quota limits and the daily BigQuery export limit.
Standard GA4 properties have a BigQuery export limit of 1 million events per day for daily (batch) exports.
Use the following methods to overcome the GA4 BigQuery Export limit:
- Seriously evaluate your tracking requirements.
- Send more information via event parameters.
- Find and fix duplicate events.
- Exclude data streams and events from daily export.
- Use streaming export.
- Use a third-party tool.
- Use multiple GA4 properties.
- Utilize server-side tagging solutions.
- Upgrade to GA4 360.
#1 Seriously evaluate your tracking requirements.
First, seriously evaluate whether you really need to track every event at such granularity at the GA4 property level.
Are you collecting unnecessary data, especially at the expense of business-critical information?
Many businesses have a bad habit of collecting as much data as possible about their users.
You could get away with this habit in Universal Analytics, but GA4 won’t let you.
GA4 has an unspecified limit on the number of rows it will process to produce the data tables you see in reports via the user interface or Data API.
This row limit exists to reduce data processing costs.
When you use a high-cardinality dimension, it increases the number of rows processed for a data table.
When the underlying data table hits this unspecified row limit, any data past the limit is reported under the (other) row.
The underlying data table may also have to process dimensions that are not part of the report you see, and such dimensions contribute to the row count as well.
If any of them are high-cardinality dimensions, they can push the underlying data tables throughout your GA4 property towards the row limit.
So even one high-cardinality dimension can degrade most of the data you see in your GA4 property.
The best practice, therefore, is to avoid collecting high-cardinality dimensions.
Audit your GA4 property, find and remove events that do not carry business-critical information, and track fewer events.
There should be a solid business case and an approval process before you start tracking new events or high-cardinality dimensions in GA4.
For example, client IDs can easily become a high-cardinality dimension and introduce (other) rows in most of your data tables.
Do you really need to track client IDs?
The best practice is to minimize the number of events you track (without losing critical information) so you don’t easily hit the API quota or BigQuery export limits.
Identify the most critical events that directly impact your business objectives, track these events first, and remove any redundant or unnecessary events.
You might be able to reduce the number of events being tracked without losing critical information, and thus avoid hitting the daily export limit.
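As a quick illustration of reducing cardinality, a raw value such as scroll depth in pixels can be bucketed into a handful of ranges before it is sent. The bucket boundaries below are arbitrary examples, not a GA4 requirement:

```javascript
// Bucket a raw pixel scroll depth into a low-cardinality range label.
// The boundaries are illustrative, not a GA4 requirement.
function bucketScrollDepth(pixels) {
  if (pixels < 500) return '0-499';
  if (pixels < 1500) return '500-1499';
  if (pixels < 3000) return '1500-2999';
  return '3000+';
}

// A raw value like 1234 would otherwise create thousands of distinct
// dimension values; the bucket keeps cardinality at four.
```

Sending the bucket label as the parameter value instead of the raw number keeps the dimension far away from the row limit.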
#2 Send more information via event parameters.
Send information via event parameters rather than via separate events, as this reduces the event volume.
Parameters are not counted as separate events, so you can send more data without hitting the export limit.
Instead of tracking separate events for similar user actions, use event parameters to capture additional information related to a single event.
This can help you reduce the total number of events while still collecting the required data.
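For example, instead of sending separate events for each video action, a single event can carry the distinguishing details as parameters. The event and parameter names below are illustrative, not reserved GA4 identifiers:

```javascript
// Build one generic event whose parameters carry the detail that
// would otherwise be split across many distinct event names.
// 'video_interaction', 'video_action' and 'video_title' are
// illustrative names, not reserved GA4 identifiers.
function buildVideoEvent(action, videoTitle) {
  return {
    name: 'video_interaction',
    params: {
      video_action: action,   // e.g. 'play', 'pause', 'complete'
      video_title: videoTitle
    }
  };
}

// With gtag.js this could be sent as:
// gtag('event', evt.name, evt.params);
const evt = buildVideoEvent('play', 'Intro to GA4');
```

Three event names (play/pause/complete variants) collapse into one event, cutting the event count without losing information.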
Related Article: Understanding Event Parameters in Google Analytics 4 (GA4)
#3 Find and fix duplicate events.
Find and fix duplicate events to reduce the event volume and avoid hitting the event limits.
Duplicate events are events that are sent multiple times to your GA4 property.
Various factors can cause them, such as technical errors, unexpected user behaviour (e.g., double-clicks), or bugs in your tracking code.
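A common client-side safeguard, assuming each logical event can be given a stable key (for purchases, the transaction ID), is to drop repeats before they are sent. The key scheme below is an illustrative convention, not something GA4 enforces:

```javascript
// Track keys of events already sent so repeats are dropped.
// The key scheme (eventName + transactionId) is an illustrative
// convention, not something GA4 enforces.
const sentKeys = new Set();

function sendOnce(eventName, transactionId, send) {
  const key = eventName + ':' + transactionId;
  if (sentKeys.has(key)) return false; // duplicate, skip it
  sentKeys.add(key);
  send(eventName, transactionId);      // e.g. forward to gtag('event', ...)
  return true;
}
```

Calling sendOnce('purchase', 'T123', fn) twice forwards the event only once; the second call returns false.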
For more information, check out this article: How to fix duplicate events in GA4
#4 Exclude data streams and events from daily export.
Specify the data streams and/or events that you don’t really need and exclude them from the daily export, thus potentially staying under the daily export limit.
For more information, check out the official help document from Google on data filtering.
#5 Use streaming export.
Use streaming export, as it allows you to export more than 1 million events per day.
There is no daily event limit for streaming export.
Make sure that your BigQuery project has enough storage space and quota to handle the streaming export.
Note that streaming export incurs additional costs.
For more information on streaming exports, check out the official documentation on [GA4] BigQuery Export.
#6 Use a third-party tool.
Use a third-party tool to export GA4 data to BigQuery.
These tools can often handle larger volumes of data than the native GA4 BigQuery export.
#7 Use multiple GA4 properties.
Track different parts of the website via different GA4 properties.
This involves creating multiple GA4 properties, each with its own BigQuery export.
Each GA4 property will have its own separate limit of 1 million events per day. That way, you can export more than 1 million events per day in total without exceeding the limit of any single property.
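A minimal sketch of this routing, assuming two hypothetical measurement IDs (G-BLOG-XXXX and G-SHOP-XXXX, placeholders for illustration) and a split by URL path:

```javascript
// Pick the GA4 property (measurement ID) based on the site section.
// Both measurement IDs below are placeholders for illustration.
function measurementIdFor(path) {
  if (path.startsWith('/shop')) return 'G-SHOP-XXXX';
  return 'G-BLOG-XXXX';
}

// With gtag.js, events can then be targeted at one property via send_to:
// gtag('event', 'page_view', { send_to: measurementIdFor(location.pathname) });
```

Each property then accumulates only its own section's events against its own daily limit.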
#8 Utilize server-side tagging solutions.
Filter and aggregate events on the server before sending them to GA4.
For example, you could combine multiple small interaction events into a single larger event before sending it to GA4.
Process and transform data on the server to reduce the number of parameters or event types being sent to GA4.
This allows you to consolidate data before it reaches GA4.
Imagine tracking every button click individually.
With server-side tagging, you could combine all button clicks within a timeframe (e.g., 5 seconds) into a single “user interaction session” event.
// Pseudocode for server-side event aggregation.
// A flush timer is started only once per batch; the original version
// scheduled a new timer on every click, spawning redundant timeouts.
let eventQueue = [];
let flushTimer = null;
const aggregationTimeframe = 5000; // 5 seconds

function handleButtonClickEvent(event) {
  eventQueue.push(event);
  if (flushTimer === null) {
    flushTimer = setTimeout(() => {
      if (eventQueue.length > 0) {
        const aggregatedEvent = {
          eventName: 'user_interaction_session',
          eventParams: {
            buttonClicks: eventQueue.length,
            details: eventQueue
          }
        };
        sendToGA4(aggregatedEvent);
        eventQueue = [];
      }
      flushTimer = null;
    }, aggregationTimeframe);
  }
}

function sendToGA4(event) {
  // Code to send the aggregated event to GA4
}
This significantly reduces the total event count exported, potentially bringing it under the 1 million limit.
#9 Upgrade to GA4 360.
Upgrade to GA4 360 (the most expensive option) if everything else fails.
GA4 360 properties have a much higher BigQuery export limit of billions of events per day, along with a higher Data API quota limit of 250k tokens per day.
Finally, don’t pull data directly from GA4 or a marketing platform into Looker Studio and try to manipulate it there.
Looker Studio is not meant for data manipulation. It is not a spreadsheet or data warehouse.
Pull the data from the source platform into Google Sheets or BigQuery, manipulate it there, and only then use that data in Looker Studio.
Other articles on GA4 BigQuery
#1 BigQuery Introduction
- How to create a new Google Cloud Platform account.
- How to create a new BigQuery project.
- What is Google BigQuery Sandbox and how to use it.
- Understanding the BigQuery User Interface.
- What is BigQuery Data Transfer Service & how it works.
- How to create data transfer in BigQuery.
- Connect and transfer data from Google Sheets to BigQuery.
- How to access BigQuery Public Data Sets.
- Best Supermetrics Alternative – Dataddo.
#2 GA4 BigQuery Introduction
- Google Analytics 4 BigQuery Tutorial for Beginners to Advanced.
- GA4 BigQuery Export Schema Tutorial.
- GA4 BigQuery – Connect Google Analytics 4 with BigQuery.
- events_ & events_intraday_ tables in BigQuery for GA4 (Google Analytics 4).
- pseudonymous_users_ & users_ data tables in BigQuery for GA4 (Google Analytics 4).
- How to access GA4 Sample Data in BigQuery.
- Advantages of using Google BigQuery for Google Analytics 4.
- Impact of Google Advanced Consent Mode on BigQuery & GDPR.
#3 GA4 BigQuery Data Transfer
- How to Connect and Export Data from GA4 to BigQuery
- How to backfill GA4 data in BigQuery.
- How to overcome GA4 BigQuery Export limit.
- How to Send Custom GA4 Data to BigQuery.
- How to backup Universal Analytics data to BigQuery.
- How to send data from Google Ads to BigQuery.
- How to send data from Google Search Console to BigQuery.
- Sending data from Google Analytics to BigQuery without 360.
- How to send data from Facebook ads to BigQuery.
- How to pull custom data from Google Analytics to BigQuery.
#4 BigQuery Cost Optimization
- Guide to BigQuery Cost Optimization.
- Using Google Cloud pricing calculator for BigQuery.
- Cost of using BigQuery for Google Analytics 4.
#5 Query GA4 BigQuery Data
- How to query Google Analytics data in BigQuery.
- Query GA4 data in BigQuery without understanding SQL.
- Using GA4 BigQuery SQL generator to create SQL queries.
- New vs Returning users in GA4 BigQuery data table.
- GA4 BigQuery Composer Tutorial for ChatGPT.
- How to track GA4 BigQuery Schema Change.
- Calculating Sessions and Engaged Sessions in GA4 BigQuery.
- Calculating Total Users in GA4 BigQuery.
#6 GA4 to BigQuery Dimension/Metric Mapping
- GA4 to BigQuery Mapping Tutorial.
- GA4 Attribution Dimensions to BigQuery Mapping.
- GA4 Google Ads Dimensions to BigQuery Mapping.
- GA4 Demographic Dimensions to BigQuery Mapping.
- GA4 Ecommerce Dimensions to BigQuery Mapping.
- GA4 Event-Scoped Ecommerce Metrics to BigQuery Mapping.
- GA4 Item-Scoped Ecommerce Metrics to BigQuery Mapping.
- GA4 Revenue Metrics to BigQuery Mapping.
- GA4 Event Dimensions to BigQuery Mapping.
- GA4 Event Metrics to BigQuery Mapping.
- GA4 Geography Dimensions to BigQuery Mapping.
- GA4 Link Dimensions to BigQuery Mapping.
- GA4 Page/Screen Dimensions to BigQuery Mapping.
- GA4 Page/Screen Metrics to BigQuery Mapping.
- GA4 Platform/Device Dimensions to BigQuery Mapping.
- GA4 User-Scoped Traffic Dimensions to BigQuery Mapping.
- GA4 Session-Scoped Traffic Dimensions to BigQuery Mapping.
- GA4 Session Metrics to BigQuery Mapping.
- GA4 User Dimensions to BigQuery Mapping.
- GA4 User Metrics to BigQuery Mapping.
- GA4 Advertising Metrics to BigQuery Mapping.