diff --git a/contents/docs/cdp/batch-exports/bigquery.mdx b/contents/docs/cdp/batch-exports/bigquery.mdx
index 6b57ce6d7c0a..3d156544bbf6 100644
--- a/contents/docs/cdp/batch-exports/bigquery.mdx
+++ b/contents/docs/cdp/batch-exports/bigquery.mdx
@@ -235,3 +235,7 @@ If you check your BigQuery [JOBS view](https://cloud.google.com/bigquery/docs/in
 Regardless of model, PostHog batch exports run a [load job](https://cloud.google.com/bigquery/docs/batch-loading-data) to batch load the data for the current period into BigQuery. Moreover, you will see additional [query jobs](https://cloud.google.com/bigquery/docs/running-queries) in your logs when exporting a mutable model as the merge operation the batch export executes requires running additional queries in your BigQuery warehouse.
 
 If you are noticing an issue with your BigQuery batch export, it may be useful to check the aforementioned [JOBS view](https://cloud.google.com/bigquery/docs/information-schema-jobs) and the [Google Cloud console](https://cloud.google.com/bigquery/docs/managing-jobs#view-job). The error logs in them could be valuable to either diagnose the issue by yourself, or when creating a support request for us to look into.
+
+### How is the BigQuery table partitioned?
+
+When PostHog creates the BigQuery table, it is automatically partitioned by the `timestamp` column. If you need a different partition column or want to configure clustering, you should create the table manually in BigQuery before setting up the batch export. PostHog will use the existing table as long as it has a compatible schema.
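The FAQ entry added by this patch suggests creating the table manually when the default `timestamp` partitioning does not fit. A minimal sketch of such a table in BigQuery DDL might look like the following; the project, dataset, table, and column names here are hypothetical, and the column list must match the schema your batch export actually writes:

```sql
-- Hypothetical example: adjust the project, dataset, table, and
-- columns to match the schema written by your batch export.
CREATE TABLE `my-project.analytics.posthog_events` (
  uuid STRING,
  event STRING,
  properties STRING,
  timestamp TIMESTAMP,
  inserted_at TIMESTAMP
)
PARTITION BY DATE(inserted_at)  -- partition on a column other than `timestamp`
CLUSTER BY event;               -- optional clustering
```

`PARTITION BY DATE(inserted_at)` gives daily partitions keyed on a custom column, and `CLUSTER BY` is optional; per the note above, PostHog should pick up this pre-existing table as long as its schema is compatible.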