Google BigQuery
Prerequisites
To connect Google BigQuery to Cube, you need to provide service account credentials. Cube requires the service account to have the BigQuery Data Viewer and BigQuery Job User roles enabled. If you plan to use pre-aggregations, the account needs the BigQuery Data Editor role instead of BigQuery Data Viewer. You can learn more about acquiring Google BigQuery credentials here. To set up the connection, you will need:
- The Google Cloud Project ID for the BigQuery project
- A set of Google Cloud service credentials which allow access to the BigQuery project
- The Google Cloud region for the BigQuery project
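If you manage IAM from the command line, the roles above can be granted with gcloud. A minimal sketch, assuming a placeholder project (my-bigquery-project-12345) and a placeholder service account (cube-sa@my-bigquery-project-12345.iam.gserviceaccount.com):

```bash
# Grant the roles Cube needs to the service account (project and account are placeholders)
gcloud projects add-iam-policy-binding my-bigquery-project-12345 \
  --member="serviceAccount:cube-sa@my-bigquery-project-12345.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# Use roles/bigquery.dataEditor instead of roles/bigquery.dataViewer
# if you plan to use pre-aggregations
gcloud projects add-iam-policy-binding my-bigquery-project-12345 \
  --member="serviceAccount:cube-sa@my-bigquery-project-12345.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
```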
Setup
Manual
Add the following to a .env file in your Cube project:

```dotenv
CUBEJS_DB_TYPE=bigquery
CUBEJS_DB_BQ_PROJECT_ID=my-bigquery-project-12345
CUBEJS_DB_BQ_KEY_FILE=/path/to/my/keyfile.json
```
You can also encode the key file using Base64 and set the result to CUBEJS_DB_BQ_CREDENTIALS:

```bash
CUBEJS_DB_BQ_CREDENTIALS=$(cat /path/to/my/keyfile.json | base64)
```
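If you run Cube with Docker Compose, the same variables can be passed to the container. A minimal sketch, where the port mapping, volume, and Base64-encoded key are assumptions to adapt to your setup:

```yaml
services:
  cube:
    image: cubejs/cube
    ports:
      - 4000:4000
    environment:
      - CUBEJS_DB_TYPE=bigquery
      - CUBEJS_DB_BQ_PROJECT_ID=my-bigquery-project-12345
      # Base64-encoded contents of the service account key file
      - CUBEJS_DB_BQ_CREDENTIALS=<BASE64_ENCODED_KEY>
    volumes:
      - .:/cube/conf
```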
Cube Cloud
In some cases you'll need to allow connections from your Cube Cloud deployment IP address to your database. You can copy the IP address from either the Database Setup step in deployment creation, or from Settings → Configuration in your deployment.
When creating a BigQuery connection in Cube Cloud, you'll need to provide the same details listed in the prerequisites above: the Google Cloud project ID, the service account JSON key file, and the BigQuery region.
Cube Cloud also supports connecting to data sources within private VPCs if dedicated infrastructure is used. Check out the VPC connectivity guide for details.
Environment Variables
Environment Variable | Description | Possible Values | Required |
---|---|---|---|
CUBEJS_DB_BQ_PROJECT_ID | The Google BigQuery project ID to connect to | A valid Google BigQuery Project ID | ✅ |
CUBEJS_DB_BQ_KEY_FILE | The path to a JSON key file for connecting to Google BigQuery | A valid Google BigQuery JSON key file | ✅ |
CUBEJS_DB_BQ_CREDENTIALS | A Base64 encoded JSON key file for connecting to Google BigQuery | A valid Google BigQuery JSON key file encoded as a Base64 string | ❌ |
CUBEJS_DB_BQ_LOCATION | The Google BigQuery dataset location to connect to. Required when using pre-aggregations outside of the US; if not set, the BigQuery driver fails with a Dataset was not found in location US error | A valid Google BigQuery regional location | ⚠️ |
CUBEJS_DB_EXPORT_BUCKET | The name of a bucket in cloud storage | A valid bucket name from cloud storage | ❌ |
CUBEJS_DB_EXPORT_BUCKET_TYPE | The cloud provider where the bucket is hosted | gcp | ❌ |
CUBEJS_CONCURRENCY | The number of concurrent connections each queue has to the database. Default is 10 | A valid number | ❌ |
CUBEJS_DB_MAX_POOL | The maximum number of concurrent database connections to pool. Default is 40 | A valid number | ❌ |
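For example, if your datasets live outside of the US multi-region, set the location explicitly in your .env file; the region below is only a placeholder:

```dotenv
# Match the location of your BigQuery datasets to avoid "Dataset was not found in location US" errors
CUBEJS_DB_BQ_LOCATION=europe-west2
```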
Pre-Aggregation Feature Support
count_distinct_approx
Measures of type count_distinct_approx can be used in pre-aggregations when using Google BigQuery as a source database. To learn more about Google BigQuery's support for approximate aggregate functions, click here.
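Such a measure can be referenced from a pre-aggregation like any other measure. A minimal data model sketch, assuming a hypothetical orders table with user_id and created_at columns:

```yaml
cubes:
  - name: orders
    sql_table: orders

    dimensions:
      - name: created_at
        sql: created_at
        type: time

    measures:
      # Approximate distinct count, backed by BigQuery's approximate aggregate functions
      - name: unique_users
        sql: user_id
        type: count_distinct_approx

    pre_aggregations:
      - name: unique_users_by_day
        measures:
          - CUBE.unique_users
        time_dimension: CUBE.created_at
        granularity: day
```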
Pre-Aggregation Build Strategies
To learn more about pre-aggregation build strategies, head here.
Feature | Works with read-only mode? | Is default? |
---|---|---|
Batching | ❌ | ✅ |
Export Bucket | ❌ | ❌ |
By default, Cube uses batching to build pre-aggregations in Google BigQuery.
Batching
No extra configuration is required to use batching with Google BigQuery.
Export bucket
BigQuery only supports using Google Cloud Storage for export buckets.
Google Cloud Storage
For improved pre-aggregation performance with large datasets, enable the export bucket functionality by configuring Cube with the following environment variables:

```dotenv
CUBEJS_DB_EXPORT_BUCKET=export_data_58148478376
CUBEJS_DB_EXPORT_BUCKET_TYPE=gcp
```

When using an export bucket, remember to assign the BigQuery Data Editor and Storage Object Admin roles to your BigQuery service account, for example as sketched below.
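A sketch of granting those roles from the command line, assuming the same placeholder project and service account as above and the bucket from the configuration:

```bash
# Project-level role needed to build pre-aggregations (project and account are placeholders)
gcloud projects add-iam-policy-binding my-bigquery-project-12345 \
  --member="serviceAccount:cube-sa@my-bigquery-project-12345.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"

# Bucket-level role so Cube can read and write exported data
gsutil iam ch \
  serviceAccount:cube-sa@my-bigquery-project-12345.iam.gserviceaccount.com:roles/storage.objectAdmin \
  gs://export_data_58148478376
```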
SSL
Cube does not require any additional configuration to enable SSL as Google BigQuery connections are made over HTTPS.