4 changes: 4 additions & 0 deletions contents/docs/cdp/batch-exports/bigquery.mdx
@@ -235,3 +235,7 @@ If you check your BigQuery [JOBS view](https://cloud.google.com/bigquery/docs/information-schema-jobs)
Regardless of the model, PostHog batch exports run a [load job](https://cloud.google.com/bigquery/docs/batch-loading-data) to batch load the data for the current period into BigQuery. When exporting a mutable model, you will also see additional [query jobs](https://cloud.google.com/bigquery/docs/running-queries) in your logs, as the merge operation the batch export executes requires running additional queries in your BigQuery warehouse.

If you notice an issue with your BigQuery batch export, it can be useful to check the aforementioned [JOBS view](https://cloud.google.com/bigquery/docs/information-schema-jobs) and the [Google Cloud console](https://cloud.google.com/bigquery/docs/managing-jobs#view-job). The error logs there can help you diagnose the issue yourself, or provide useful detail when you create a support request for us to look into.
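For example, a query along the following lines (run in the BigQuery console) lists recent load and query jobs together with any errors. The helper below is a sketch that merely composes the SQL; the `region-us` region and the 24-hour window are assumptions, so adjust both for your project:

```python
# Sketch: compose an INFORMATION_SCHEMA.JOBS query to inspect recent
# batch export jobs. The region ("region-us") and the 24-hour window
# are assumptions -- adjust both for your project.
def jobs_query(region: str = "region-us", hours: int = 24) -> str:
    return f"""
    SELECT job_id, job_type, state, error_result, creation_time
    FROM `{region}`.INFORMATION_SCHEMA.JOBS
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL {hours} HOUR)
      AND job_type IN ('LOAD', 'QUERY')
    ORDER BY creation_time DESC
    """

print(jobs_query())
```

Any job with a non-null `error_result` is worth including in a support request.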

### How is the BigQuery table partitioned?

When PostHog creates the BigQuery table, it is automatically partitioned by the `timestamp` column. If you need a different partition column or want to configure clustering, you should create the table manually in BigQuery before setting up the batch export. PostHog will use the existing table as long as it has a compatible schema.
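As a sketch of what creating the table manually might look like, the helper below composes a `CREATE TABLE` DDL statement with a custom partition column and clustering. The table name, column list, partition field, and clustering keys are all hypothetical, so match them to the schema your batch export writes:

```python
# Sketch: build a CREATE TABLE DDL statement for pre-creating the export
# table with a custom partition column and clustering. The table name,
# columns, partition field, and clustering keys are hypothetical examples.
def create_table_ddl(
    table: str = "my-project.my_dataset.posthog_events",
    partition_field: str = "created_at",
    cluster_by: tuple[str, ...] = ("event", "distinct_id"),
) -> str:
    return (
        f"CREATE TABLE `{table}` (\n"
        "  uuid STRING,\n"
        "  event STRING,\n"
        "  distinct_id STRING,\n"
        "  properties JSON,\n"
        "  timestamp TIMESTAMP,\n"
        "  created_at TIMESTAMP\n"
        ")\n"
        f"PARTITION BY DATE({partition_field})\n"
        f"CLUSTER BY {', '.join(cluster_by)}"
    )

print(create_table_ddl())
```

Running the generated statement in the BigQuery console creates the table ahead of time, and the batch export will then write into it instead of creating its own `timestamp`-partitioned table.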