How to overcome GA4 BigQuery Export limit

The two biggest complaints I hear about GA4 are the Google Analytics Data API quota limits and the daily BigQuery export limit.

Standard GA4 properties have a BigQuery export limit of 1 million events per day for daily (batch) exports.

Use the following methods to overcome the GA4 BigQuery Export limit:

  1. Seriously evaluate your tracking requirements.
  2. Send more information via event parameters.
  3. Find and fix duplicate events.
  4. Exclude data streams and events from daily export.
  5. Use streaming export.
  6. Use a third-party tool.
  7. Use multiple GA4 properties.
  8. Utilize server-side tagging solutions.
  9. Upgrade to GA4 360.

#1 Seriously evaluate your tracking requirements.

First, you should seriously evaluate if you really need to track all events at such granularity at the GA4 property level.

Are you collecting unnecessary data, especially at the expense of business-critical information?

Many businesses have the bad habit of collecting as much data as possible about their users.

You could get away with this habit in Universal Analytics, but GA4 won't let you.

GA4 has an unspecified limit on the number of rows it will process to produce the data tables you see in reports via the user interface or data API.

This row limit is in place to reduce data processing costs.

When you use a high cardinality dimension, it increases the number of rows that are processed for a data table. 

And when the underlying data table hits the unspecified row limit, any data past the row limit is reported under the (other) row.


The underlying data table may also have to process dimensions that are not part of the report you see. Those dimensions count towards the row limit as well.

And if they are high cardinality dimensions, they can push the underlying data tables past the row limit throughout your GA4 property.

So even one high cardinality dimension can negatively affect most of the data you see in your GA4 property.

So the best practice is to avoid collecting high cardinality dimensions.

Audit your GA4 property, then find and remove events that do not capture business-critical information. Track fewer events.

There needs to be a solid business case and approval process before you start tracking events or high cardinality dimensions in GA4. 

For example, client IDs can easily become a high cardinality dimension and introduce (other) rows in most of your data tables.

Do you really need to track client IDs?

The best practice is to minimize the number of events you track (without losing critical information) so you don’t easily hit the API quota or BigQuery export limits.

Identify the most critical events that directly impact your business objectives. Track these events first. Remove any redundant or unnecessary events.

You might be able to reduce the number of events being tracked without losing critical information, and thus avoid hitting the daily export limit.
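
If you already have the BigQuery export running, one practical way to audit this is to query the export tables and see which events dominate your daily volume. Below is a minimal sketch using the Node.js BigQuery client; the project ID, the export dataset name (analytics_123456789) and the date range are placeholders you would replace with your own.

// Minimal sketch: list the highest-volume events in your GA4 BigQuery export
// so you can decide which ones are candidates for trimming.
// Assumes the @google-cloud/bigquery client is installed and authenticated;
// 'my-project' and 'analytics_123456789' are placeholders.

const { BigQuery } = require('@google-cloud/bigquery');

async function auditEventVolume() {
    const bigquery = new BigQuery();

    const query = `
        SELECT event_name, COUNT(*) AS event_count
        FROM \`my-project.analytics_123456789.events_*\`
        WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
        GROUP BY event_name
        ORDER BY event_count DESC`;

    const [rows] = await bigquery.query({ query });
    rows.forEach(row => console.log(`${row.event_name}: ${row.event_count}`));
}

auditEventVolume().catch(console.error);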

#2 Send more information via event parameters.

Send more information via parameters than via events, as it can help reduce the event volume.

This is because parameters are not counted as separate events, so you can send more data without hitting the daily export limit.

Instead of tracking separate events for similar user actions, use event parameters to capture additional information related to a single event. 


This can help you reduce the total number of events while still collecting the required data.
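
For example, instead of firing a separate event for every call-to-action on a page, you could send a single event and capture the detail in parameters. Here is a minimal gtag.js sketch; the event and parameter names are illustrative, and any custom parameter you want to report on must be registered as a custom dimension in GA4.

// Instead of separate events such as cta_click_header, cta_click_footer,
// cta_click_sidebar ... send one event and describe the click via parameters.
gtag('event', 'cta_click', {
    cta_label: 'Request a demo',   // which call-to-action was clicked
    cta_location: 'footer'         // where on the page it appears
});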

Related Article: Understanding Event Parameters in Google Analytics 4 (GA4)

#3 Find and fix duplicate events.

Find and fix duplicate events to reduce the event volume and avoid hitting the event limits.

Duplicate events are events that are sent multiple times to your GA4 property.

Various factors, such as technical errors, user behaviour, or bugs in your tracking code, can cause them.
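
A common source of duplicates is the same tag firing more than once for a single user action, for example, a purchase event that fires again when the confirmation page is reloaded. Here is a minimal client-side sketch of a de-duplication guard; using the transaction ID as the key and sessionStorage as the memory are assumptions you would adapt to your own setup.

// Sketch: avoid sending the same purchase event twice if the confirmation
// page is reloaded. The transaction ID acts as the de-duplication key and
// sessionStorage remembers what has already been sent in this session.
function trackPurchaseOnce(transactionId, value, currency) {
    const dedupeKey = 'ga4_purchase_' + transactionId;

    // Skip if this transaction has already been tracked.
    if (sessionStorage.getItem(dedupeKey)) {
        return;
    }

    gtag('event', 'purchase', {
        transaction_id: transactionId,
        value: value,
        currency: currency
    });

    sessionStorage.setItem(dedupeKey, 'sent');
}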

For more information, check out this article: How to fix duplicate events in GA4

#4 Exclude data streams and events from daily export.

Specify the data streams and/or events that you don't really need and exclude them from the daily export. This reduces the exported event count and helps you stay under the daily limit.


For more information, check out the official help document from Google on Data filtering

#5 Use streaming export.

Use streaming export as it allows you to send more than 1 million events per day.


There is no limit on the number of events for streaming export.

Make sure that your BigQuery project has enough storage space and quota to handle the streaming export.

You can incur additional costs for using streaming export.

For more information on streaming exports, check out the official documentation on [GA4] BigQuery Export.

#6 Use a third-party tool.

Use a third-party tool to export GA4 data to BigQuery.

These tools can often handle larger volumes of data than the native GA4 BigQuery export.

#7 Use multiple GA4 properties.

Track different parts of the website via different GA4 properties.

This involves creating multiple GA4 properties, each with its own BigQuery export.

Each GA4 property will have its own separate limit of 1 million events per day. That way, you can send more than 1 million events per day without exceeding the export limit.
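
With gtag.js, you can load both properties on the same page and route specific events to a specific property using the send_to parameter. Here is a minimal sketch; the measurement IDs below are placeholders.

// Two GA4 properties on the same site, each with its own BigQuery export
// and its own 1 million events per day limit.
// G-XXXXXXX1 and G-XXXXXXX2 are placeholder measurement IDs.
gtag('config', 'G-XXXXXXX1');   // e.g. property for the marketing site
gtag('config', 'G-XXXXXXX2');   // e.g. property for the web app

// An event without send_to is sent to all configured properties...
gtag('event', 'page_view');

// ...while send_to routes an event to one property only.
gtag('event', 'sign_up', {
    send_to: 'G-XXXXXXX2'
});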

#8 Utilize server-side tagging solutions.

Filter and aggregate events on the server before sending them to GA4.

For example, you could combine multiple small interaction events into a single larger event before sending it to GA4.

Process and transform data on the server to reduce the number of parameters or event types being sent to GA4.

This allows you to consolidate data before it reaches GA4.

Imagine tracking every button click individually.

With server-side tagging, you could combine all button clicks within a timeframe (e.g., 5 seconds) into a single “user interaction session” event.

// Pseudocode sketch for server-side event aggregation: buffer button clicks
// and flush them as one aggregated event every 5 seconds.

let eventQueue = [];
let flushTimer = null;
const aggregationTimeframe = 5000; // 5 seconds

function handleButtonClickEvent(event) {
    eventQueue.push(event);

    // Start the flush timer only once per aggregation window;
    // otherwise every click would schedule its own (redundant) send.
    if (flushTimer === null) {
        flushTimer = setTimeout(flushEventQueue, aggregationTimeframe);
    }
}

function flushEventQueue() {
    if (eventQueue.length > 0) {
        const aggregatedEvent = {
            eventName: 'user_interaction_session',
            eventParams: {
                buttonClicks: eventQueue.length,
                details: eventQueue
            }
        };

        sendToGA4(aggregatedEvent);
        eventQueue = [];
    }
    flushTimer = null;
}

function sendToGA4(event) {
    // Code to send the aggregated event to GA4
    // (e.g. via the Measurement Protocol from your server-side container)
}

This significantly reduces the total event count exported, potentially bringing it under the 1 million limit.

#9 Upgrade to GA4 360.

Upgrade to GA4 360 (the most expensive option) if everything else fails.

GA4 360 properties have a higher BigQuery export limit of billions of events per day. They also provide a higher API quota limit of 250k tokens per day.

Finally, don’t pull data directly from GA4 or a marketing platform into Looker Studio and try to manipulate it there.


Looker Studio is not meant for data manipulation. It is not a spreadsheet or data warehouse.

Pull the data from the data platform into Google Sheets or BigQuery first, manipulate it there, and only then use it in Looker Studio.
