How to Use BigQuery With GA4 for Advanced Analysis
By Emily Redmond, Data Analyst at Emilytics · April 2026
TL;DR: Export GA4 to BigQuery for raw data access. Write SQL queries to analyze patterns the GA4 interface can't show. You get unlimited data retention and full event-level detail.
GA4 is good for dashboards and standard reports. But for real analysis—cohort analysis, statistical modeling, multi-month patterns—you need BigQuery.
What Is BigQuery?
BigQuery is Google's data warehouse. You store large amounts of data and query it with SQL.
GA4 exports your raw event data to BigQuery daily. Then you can:
- Query any dimension, any metric, any combination
- Write complex SQL for analyses GA4 can't do
- Export to BI tools (Looker Studio, Looker, Tableau)
- Feed data into data warehouses or data lakes
- Combine GA4 with other data sources (CRM, billing, etc.)
Prerequisites
- Google Cloud project (free tier is fine for small datasets)
- GA4 property
- A little SQL knowledge (or willingness to learn)
Setting Up BigQuery Export
Step 1: Create a Google Cloud Project (if needed)
- Go to console.cloud.google.com
- Create a new project
- Note the project ID
Step 2: Link GA4 to BigQuery
- Go to GA4 → Admin
- Under "Product links," click BigQuery links
- Click Link and select your Cloud project
- Choose the export frequency:
  - "Daily" (a complete export of each day's events — recommended)
  - "Streaming" (continuous export into intraday tables, if you need near-real-time data)
- Submit
GA4 starts exporting from the day you link — there's no historical backfill. The first daily table usually appears within 24 hours; after that, exports run daily.
Step 3: Access BigQuery
- Go to console.cloud.google.com
- Select your project
- Go to BigQuery
- In the left sidebar, you should see your project with GA4 datasets
Understanding GA4 BigQuery Data Structure
GA4 creates up to two datasets:
- analytics_[PROPERTY_ID]: the daily export, with each day's events finalized once per day
- analytics_[PROPERTY_ID]_intraday: streaming data for the current day (only if you enabled streaming export)
Within the daily dataset, tables are named by date: events_20260101, events_20260102, etc. Intraday tables are named events_intraday_20260101, and so on.
Each row is one event, with columns:
- event_timestamp: when the event fired (microseconds since the epoch)
- event_name: the event name (page_view, purchase, etc.)
- user_id: your User-ID, if you set one
- user_pseudo_id: an anonymous device/user ID
- user_properties: custom user properties (nested)
- event_params: event parameters (nested)
- And more...
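A quick way to get familiar with the schema is to pull a handful of raw rows and inspect the nested fields (replace project and 123456 with your own project ID and property ID):

```sql
-- Peek at a few raw events to see the nested structure
SELECT
  event_name,
  TIMESTAMP_MICROS(event_timestamp) AS event_time,
  user_pseudo_id,
  event_params
FROM `project.analytics_123456.events_*`
LIMIT 5
```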
Your First BigQuery Query
Simple Query: Pageviews by Day
SELECT
DATE(TIMESTAMP_MICROS(event_timestamp)) as date,
COUNT(*) as pageviews
FROM `project.analytics_123456.events_*`
WHERE event_name = 'page_view'
GROUP BY date
ORDER BY date DESC
Replace project and 123456 with your actual project ID and GA4 property ID.
This pulls:
- The date
- The count of page_view events per day
- For every day in the export (the events_* wildcard matches all daily event tables, not just the last 30 days)
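If you do want only a recent window, the wildcard exposes a _TABLE_SUFFIX pseudo-column (the date part of each table name) you can filter on — a sketch for the last 30 days:

```sql
SELECT
  DATE(TIMESTAMP_MICROS(event_timestamp)) AS date,
  COUNT(*) AS pageviews
FROM `project.analytics_123456.events_*`
WHERE event_name = 'page_view'
  -- _TABLE_SUFFIX is the YYYYMMDD part of each events_YYYYMMDD table name
  AND _TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY))
                        AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY date
ORDER BY date DESC
```

Filtering on _TABLE_SUFFIX also reduces the bytes scanned, which keeps costs down.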
Query: Revenue by Country
SELECT
  geo.country AS country,
  ROUND(SUM((SELECT value.double_value FROM UNNEST(event_params) WHERE key = 'value')), 2) AS revenue
FROM `project.analytics_123456.events_*`
WHERE event_name = 'purchase'
GROUP BY country
ORDER BY revenue DESC
This is more complex because GA4's data is nested: event parameters are arrays of key/value records, so you have to UNNEST them to extract a value. Top-level record fields like geo.country, by contrast, you can select directly.
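The unnest pattern is worth seeing in isolation — here extracting the page_location parameter from page_view events (a sketch; swap in whatever parameter key you track):

```sql
SELECT
  -- Pull one parameter out of the nested event_params array
  (SELECT value.string_value
   FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_location,
  COUNT(*) AS views
FROM `project.analytics_123456.events_*`
WHERE event_name = 'page_view'
GROUP BY page_location
ORDER BY views DESC
LIMIT 20
```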
Common GA4 BigQuery Analyses
User Cohorts
"Users who signed up in January—how many are still active 30 days later?"
WITH signup_cohort AS (
  SELECT
    user_id,
    MIN(DATE(TIMESTAMP_MICROS(event_timestamp))) AS signup_date
  FROM `project.analytics_123456.events_*`
  WHERE event_name = 'sign_up'
    AND user_id IS NOT NULL
  GROUP BY user_id
),
activity AS (
  SELECT DISTINCT
    user_id,
    DATE(TIMESTAMP_MICROS(event_timestamp)) AS active_date
  FROM `project.analytics_123456.events_*`
  WHERE user_id IS NOT NULL
)
SELECT
  c.signup_date,
  COUNT(DISTINCT c.user_id) AS signups_that_day,
  COUNT(DISTINCT IF(a.active_date >= DATE_ADD(c.signup_date, INTERVAL 30 DAY), c.user_id, NULL)) AS active_after_30_days
FROM signup_cohort c
LEFT JOIN activity a ON c.user_id = a.user_id
WHERE c.signup_date = '2026-01-01'
GROUP BY c.signup_date
Funnel Analysis
"Of users who viewed product, how many added to cart? Of those, how many purchased?"
WITH product_viewers AS (
SELECT DISTINCT user_id
FROM `project.analytics_123456.events_*`
WHERE event_name = 'view_item'
),
cart_adders AS (
SELECT DISTINCT user_id
FROM `project.analytics_123456.events_*`
WHERE event_name = 'add_to_cart'
),
purchasers AS (
SELECT DISTINCT user_id
FROM `project.analytics_123456.events_*`
WHERE event_name = 'purchase'
)
SELECT
COUNT(DISTINCT pv.user_id) as viewed_product,
COUNT(DISTINCT ca.user_id) as added_to_cart,
COUNT(DISTINCT p.user_id) as purchased,
ROUND(100 * COUNT(DISTINCT ca.user_id) / COUNT(DISTINCT pv.user_id), 2) as product_to_cart_rate,
ROUND(100 * COUNT(DISTINCT p.user_id) / COUNT(DISTINCT ca.user_id), 2) as cart_to_purchase_rate
FROM product_viewers pv
LEFT JOIN cart_adders ca ON pv.user_id = ca.user_id
LEFT JOIN purchasers p ON ca.user_id = p.user_id
Customer Lifetime Value (CLV)
"What's the total revenue per user?"
SELECT
  user_id,
  COUNT(*) AS purchase_count,
  ROUND(SUM((SELECT value.double_value FROM UNNEST(event_params) WHERE key = 'value')), 2) AS total_revenue,
  ROUND(AVG((SELECT value.double_value FROM UNNEST(event_params) WHERE key = 'value')), 2) AS avg_order_value
FROM `project.analytics_123456.events_*`
WHERE event_name = 'purchase'
  AND user_id IS NOT NULL
GROUP BY user_id
ORDER BY total_revenue DESC
Tips for Writing GA4 BigQuery Queries
- Use the events_* wildcard: it matches all daily event tables. Without it, you only query one day's table.
- TIMESTAMP_MICROS: GA4 stores timestamps in microseconds since the epoch. Convert with TIMESTAMP_MICROS(event_timestamp).
- Unnesting: user properties and event parameters are nested arrays. Use UNNEST() with WHERE key = 'parameter_name' to extract values.
- Limit before you query: start with LIMIT 100 to sanity-check results. Note that LIMIT alone doesn't reduce the data scanned — filtering on _TABLE_SUFFIX does.
- Check costs: large queries scan a lot of data (on-demand pricing is about $6.25/TB). The query editor's validator shows an estimate of bytes to be processed before you run.
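One concrete way to check cost before running anything is a dry run, which reports the bytes a query would scan without executing or billing it. A sketch using the bq CLI (assumes the Google Cloud SDK is installed and authenticated):

```
# Prints "Query successfully validated. Assuming the tables are not modified,
# running this query will process N bytes of data." — without running it
bq query --use_legacy_sql=false --dry_run \
  'SELECT COUNT(*) FROM `project.analytics_123456.events_*` WHERE event_name = "page_view"'
```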
Connecting BigQuery to BI Tools
Once you have queries in BigQuery, visualize them:
Looker Studio (formerly Data Studio)
- Create a report at lookerstudio.google.com
- Select BigQuery as the data source
- Pick your project and table, or paste a custom query
- Design visualizations
- Share with your team
Looker Studio is free; Looker Studio Pro is the paid tier, adding team workspaces and enterprise support.
Tableau
- Connect to BigQuery (Tableau has a native connector)
- Write your queries or select tables
- Build dashboards
Costs
- Free tier: 1TB of queried data per month is free
- Pay-as-you-go: $6.25 per TB scanned after that
Most small businesses stay under 1TB/month and don't pay anything.
Frequently Asked Questions
Q: How long until data appears in BigQuery after setting up export? A: The first daily export usually appears within 24 hours of linking. Note that GA4 does not backfill historical data — only events from the link date forward are exported.
Q: Can I query real-time data? A: Yes, if you enabled streaming export — query the events_intraday_* tables, which hold the current day's data and are replaced by the finalized daily table.
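For example, a rough sketch of counting today's events by name from the intraday tables (these only exist if streaming export is enabled):

```sql
SELECT
  event_name,
  COUNT(*) AS events_today
FROM `project.analytics_123456.events_intraday_*`
GROUP BY event_name
ORDER BY events_today DESC
```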
Q: Do I need to know SQL to use BigQuery with GA4? A: For basic queries, no — use pre-built templates or Looker Studio. For custom analysis, yes.
Q: Can I combine GA4 data with other data sources in BigQuery? A: Yes. You can have GA4 events and a separate table of customer CRM data, then join them.
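As a sketch, assuming you've loaded a hypothetical crm.customers table keyed on the same user_id you send to GA4:

```sql
-- `project.crm.customers` is a hypothetical table you load yourself
SELECT
  c.plan_tier,
  COUNT(DISTINCT g.user_id) AS purchasers
FROM `project.analytics_123456.events_*` g
JOIN `project.crm.customers` c
  ON g.user_id = c.user_id
WHERE g.event_name = 'purchase'
GROUP BY c.plan_tier
```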
Q: How long is data retained in BigQuery? A: Unlimited (you set retention policies). This is why BigQuery is better than GA4 for long-term analysis.
The Bottom Line
BigQuery is where GA4 data comes alive. If you need:
- Unlimited data retention
- Complex multi-month analysis
- Control over what you measure
- Integration with other data
Set up BigQuery export. It's free to set up and nearly free to run.
Start simple (one query), learn SQL gradually, and build from there.
Emily Redmond is a data analyst at Emilytics — the AI analytics agent that watches your GA4, Search Console, and Bing data around the clock so you never miss what matters. 8 years of experience helping founders and growth teams turn data noise into clear decisions. Say hi →