# Snowflake

## Prerequisites

- The account ID for the Snowflake account
- The warehouse name in the Snowflake account
- The region for the Snowflake warehouse
- The username/password for the Snowflake account
## Setup

If you get a network error saying Snowflake can't be reached, make sure you have tried format 2 for the account identifier.
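As a sketch, the two common account identifier formats look like this (the organization, account, and locator values below are hypothetical placeholders):

```
# Format 1: organization name and account name, separated by a hyphen
CUBEJS_DB_SNOWFLAKE_ACCOUNT=myorg-myaccount

# Format 2: account locator, with the region appended after a dot
CUBEJS_DB_SNOWFLAKE_ACCOUNT=xy12345.us-east-1
```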
### Manual

Add the following to a `.env` file in your Cube project:

```
CUBEJS_DB_TYPE=snowflake
CUBEJS_DB_SNOWFLAKE_ACCOUNT=XXXXXXXXX.us-east-1
CUBEJS_DB_SNOWFLAKE_WAREHOUSE=MY_SNOWFLAKE_WAREHOUSE
CUBEJS_DB_NAME=my_snowflake_database
CUBEJS_DB_USER=snowflake_user
CUBEJS_DB_PASS=**********
```
### Cube Cloud

In some cases you'll need to allow connections from your Cube Cloud deployment IP address to your database. You can copy the IP address from either the Database Setup step in deployment creation, or from Settings → Configuration in your deployment.

The following fields are required when creating a Snowflake connection: account, warehouse, database, username, and password.
Cube Cloud also supports connecting to data sources within private VPCs if dedicated infrastructure is used. Check out the VPC connectivity guide for details.
## Environment Variables

| Environment Variable | Description | Possible Values | Required |
| --- | --- | --- | --- |
| `CUBEJS_DB_SNOWFLAKE_ACCOUNT` | The Snowflake account identifier to use when connecting to the database | A valid Snowflake account ID | ✅ |
| `CUBEJS_DB_SNOWFLAKE_REGION` | The Snowflake region to use when connecting to the database | A valid Snowflake region | ❌ |
| `CUBEJS_DB_SNOWFLAKE_WAREHOUSE` | The Snowflake warehouse to use when connecting to the database | A valid Snowflake warehouse in the account | ✅ |
| `CUBEJS_DB_SNOWFLAKE_ROLE` | The Snowflake role to use when connecting to the database | A valid Snowflake role in the account | ❌ |
| `CUBEJS_DB_SNOWFLAKE_CLIENT_SESSION_KEEP_ALIVE` | If `true`, keep the Snowflake connection alive indefinitely | `true`, `false` | ❌ |
| `CUBEJS_DB_NAME` | The name of the database to connect to | A valid database name | ✅ |
| `CUBEJS_DB_USER` | The username used to connect to the database | A valid database username | ✅ |
| `CUBEJS_DB_PASS` | The password used to connect to the database | A valid database password | ✅ |
| `CUBEJS_DB_SNOWFLAKE_AUTHENTICATOR` | The type of authenticator to use with Snowflake. Use `SNOWFLAKE` with username/password, or `SNOWFLAKE_JWT` with key pairs. Defaults to `SNOWFLAKE` | `SNOWFLAKE`, `SNOWFLAKE_JWT` | ❌ |
| `CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PATH` | The path to the private RSA key file | A valid path to the private RSA key | ❌ |
| `CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PASS` | The password for the private RSA key. Only required for encrypted keys | A valid password for the encrypted private RSA key | ❌ |
| `CUBEJS_CONCURRENCY` | The number of concurrent connections each queue has to the database. Default is `5` | A valid number | ❌ |
| `CUBEJS_DB_MAX_POOL` | The maximum number of concurrent database connections to pool. Default is `20` | A valid number | ❌ |
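As a sketch, key-pair authentication could be configured with the variables above like this (the key path is a hypothetical placeholder, and `CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PASS` is only needed if the key is encrypted):

```
CUBEJS_DB_SNOWFLAKE_AUTHENTICATOR=SNOWFLAKE_JWT
CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PATH=/path/to/rsa_key.p8
CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PASS=**********
```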
## Pre-Aggregation Feature Support

### count_distinct_approx

Measures of type `count_distinct_approx` can be used in pre-aggregations when using Snowflake as a source database. To learn more about Snowflake's support for approximate aggregate functions, see the Snowflake documentation.
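As a sketch, a measure of this type might be declared in a data model like this (the `orders` cube and `user_id` column are hypothetical):

```yaml
cubes:
  - name: orders
    sql_table: orders

    measures:
      # Approximate distinct count, pushed down to Snowflake's
      # approximate aggregate functions
      - name: unique_users
        type: count_distinct_approx
        sql: user_id
```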
## Pre-Aggregation Build Strategies

To learn more about pre-aggregation build strategies, see the Using Pre-Aggregations documentation.

| Feature | Works with read-only mode? | Is default? |
| --- | --- | --- |
| Batching | ❌ | ✅ |
| Export Bucket | ❌ | ❌ |

By default, Snowflake uses batching to build pre-aggregations.
### Batching

No extra configuration is required to use batching with Snowflake.
### Export Bucket

Snowflake supports using both AWS S3 and Google Cloud Storage for export bucket functionality.
#### AWS S3

Ensure the AWS credentials are correctly configured in IAM to allow reads and writes to the export bucket in S3.

```
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3
CUBEJS_DB_EXPORT_BUCKET_AWS_KEY=<AWS_KEY>
CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET>
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
```
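As a sketch, a minimal IAM policy granting those reads and writes might look like the following (the bucket name matches the example above; the exact set of actions your setup needs may differ):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my.bucket.on.s3/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my.bucket.on.s3"
    }
  ]
}
```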
#### Google Cloud Storage

When using an export bucket, remember to assign the Storage Object Admin role to your Google Cloud credentials (`CUBEJS_DB_EXPORT_GCS_CREDENTIALS`).

Before configuring Cube, an integration must be created and configured in Snowflake. Take note of the integration name (`gcs_int` in Snowflake's example) as you'll need it to configure Cube.
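As a sketch, creating such an integration in Snowflake might look like this (the integration name `gcs_int` comes from the text above; the bucket path is a hypothetical placeholder):

```sql
-- Create a storage integration pointing at the export bucket in GCS
CREATE STORAGE INTEGRATION gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://snowflake-export-bucket/');
```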
Once the Snowflake integration is set up, configure Cube using the following:

```
CUBEJS_DB_EXPORT_BUCKET=snowflake-export-bucket
CUBEJS_DB_EXPORT_BUCKET_TYPE=gcp
CUBEJS_DB_EXPORT_GCS_CREDENTIALS=<BASE64_ENCODED_SERVICE_CREDENTIALS_JSON>
CUBEJS_DB_EXPORT_INTEGRATION=gcs_int
```
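The service account credentials JSON must be base64-encoded before being set. As a sketch, one way to produce the value (assuming the key file is named `service-account.json`, a hypothetical path):

```shell
# Encode the service-account key as a single-line base64 string
# (-w0 disables line wrapping; GNU coreutils. On macOS, use
# `base64 -i service-account.json | tr -d '\n'` instead.)
base64 -w0 service-account.json
```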
## SSL

Cube does not require any additional configuration to enable SSL, as Snowflake connections are made over HTTPS.