From 464979a0759340ef64449625dbbed645b0ab2da9 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:31:36 +0000 Subject: [PATCH 01/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- self-host/customize-deployment/environment-variables.mdx | 1 + 1 file changed, 1 insertion(+) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index aa91fe18..5318449f 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -1,5 +1,6 @@ --- title: "Environment variables" +mode: wide --- This is a reference to all environment variables that can be used to configure a Lightdash deployment. From 9303413aef7739021fc9f9b874165c269736d0a2 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:42:28 +0000 Subject: [PATCH 02/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 90 +++++++++---------- 1 file changed, 45 insertions(+), 45 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 5318449f..3d683344 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -5,51 +5,51 @@ mode: wide This is a reference to all environment variables that can be used to configure a Lightdash deployment. -| Variable | Description | Required? | Default | -| :------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | :-------------------------------------------: | :---------------------: | -| `PGHOST` | Hostname of postgres server to store Lightdash data | | | -| `PGPORT` | Port of postgres server to store Lightdash data | | | -| `PGUSER` | Username of postgres user to access postgres server to store Lightdash data | | | -| `PGPASSWORD` | Password for PGUSER | | | -| `PGDATABASE` | Database name inside postgres server to store Lightdash data | | | -| `PGCONNECTIONURI` | Connection URI for postgres server to store Lightdash data in the format postgresql://user:password@host:port/db?params. This is an alternative to providing the previous PG variables. | | | -| `PGMAXCONNECTIONS` | Maximum number of connections to the database | | | -| `PGMINCONNECTIONS` | Minimum number of connections to the database | | | -| `LIGHTDASH_SECRET` | Secret key used to secure various tokens in Lightdash. This **must** be fixed between deployments. If the secret changes, you won't have access to Lightdash data. | | | -| `SECURE_COOKIES` | Only allows cookies to be stored over a https connection. We use cookies to keep you logged in. This is recommended to be set to true in production. | | `false` | -| `COOKIES_MAX_AGE_HOURS` | How many hours a user session exists before the user is automatically signed out. For example if 24, then the user will be automatically after 24 hours of inactivity. 
| | | -| `TRUST_PROXY` | This tells the Lightdash server that it can trust the X-Forwarded-Proto header it receives in requests. This is useful if you use `SECURE_COOKIES=true` behind a HTTPS terminated proxy that you can trust. | | `false` | -| `SITE_URL` | Site url where Lightdash is being hosted. It should include the protocol. E.g https://lightdash.mycompany.com | | `http://localhost:8080` | -| `INTERNAL_LIGHTDASH_HOST` | Internal Lightdash host for the Headless browser to send requests when your Lightdash instance is not accessible from the Internet. Needs to support `https` if `SECURE_COOKIES=true` | | Same as `SITE_URL` | -| `STATIC_IP` | Server static IP so users can add the IP to their warehouse allow-list. | | `http://localhost:8080` | -| `LIGHTDASH_QUERY_MAX_LIMIT` | Query max rows limit | | `5000` | -| `LIGHTDASH_QUERY_DEFAULT_LIMIT` | Default number of rows to return in a query | | `500` | -| `LIGHTDASH_QUERY_MAX_PAGE_SIZE` | Maximum page size for paginated queries | | `2500` | -| `SCHEDULER_ENABLED` | Enables/Disables the scheduler worker that triggers the scheduled deliveries. | | `true` | -| `SCHEDULER_CONCURRENCY` | How many scheduled delivery jobs can be processed concurrently. | | `3` | -| `SCHEDULER_JOB_TIMEOUT` | After how many milliseconds the job should be timeout so the scheduler worker can pick other jobs. | | `600000` (10 minutes) | -| `SCHEDULER_SCREENSHOT_TIMEOUT` | Timeout in milliseconds for taking screenshots | | | -| `SCHEDULER_INCLUDE_TASKS` | Comma-separated list of scheduler tasks to include | | | -| `SCHEDULER_EXCLUDE_TASKS` | Comma-separated list of scheduler tasks to exclude | | | -| `LIGHTDASH_CSV_CELLS_LIMIT` | Max cells on CSV file exports | | `100000` | -| `LIGHTDASH_CHART_VERSION_HISTORY_DAYS_LIMIT` | Configure how far back the chart versions history goes in days | | `3` | -| `LIGHTDASH_PIVOT_TABLE_MAX_COLUMN_LIMIT` | Configure maximum number of columns in pivot table | | `60` | -| `GROUPS_ENABLED` | Enables/Disables groups functionality | | `false` | -| `CUSTOM_VISUALIZATIONS_ENABLED` | Enables/Disables custom chart functionality | | `false` | -| `LIGHTDASH_MAX_PAYLOAD` | Maximum HTTP request body size | | `5mb` | -| `LIGHTDASH_LICENSE_KEY` | License key for Lightdash Enterprise Edition. See [Enterprise License Keys](/self-host/customize-deployment/enterprise-license-keys) for details. [Get your license key](https://calendly.com/lightdash-cloud/enterprise?utm_source=docs&utm_medium=referral&utm_campaign=enterprise_licensing&utm_content=license_key_cta) | | | -| `HEADLESS_BROWSER_HOST` | Hostname for the headless browser | | — | -| `HEADLESS_BROWSER_PORT` | Port for the headless browser | | `3001` | -| `ALLOW_MULTIPLE_ORGS` | If set to `true`, new users registering on Lightdash will have their own organization, separated from others | | `false` | -| `LIGHTDASH_MODE` | Mode for Lightdash (default, demo, pr, etc.) 
| | `default` | -| `DISABLE_PAT` | Disables Personal Access Tokens | | `false` | -| `PAT_ALLOWED_ORG_ROLES` | Comma-separated list of organization roles allowed to use Personal Access Tokens | | All roles | -| `PAT_MAX_EXPIRATION_TIME_IN_DAYS` | Maximum expiration time in days for Personal Access Tokens | | | -| `MAX_DOWNLOADS_AS_CODE` | Maximum number of downloads as code | | `100` | -| `EXTENDED_USAGE_ANALYTICS` | Enables extended usage analytics | | `false` | -| `USE_SECURE_BROWSER` | Use secure WebSocket connections for headless browser | | `false` | -| `DISABLE_DASHBOARD_COMMENTS` | Disables dashboard comments | | `false` | -| `ORGANIZATION_WAREHOUSE_CREDENTIALS_ENABLED` | Enables organization warehouse settings | | `false` | +| Variable | Required? | Default | Description | +| :------------------------------------------- | :-------------------------------------------: | :---------------------: | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| `PGHOST` | | | Hostname of postgres server to store Lightdash data | +| `PGPORT` | | | Port of postgres server to store Lightdash data | +| `PGUSER` | | | Username of postgres user to access postgres server to store Lightdash data | +| `PGPASSWORD` | | | Password for PGUSER | +| `PGDATABASE` | | | Database name inside postgres server to store Lightdash data | +| `PGCONNECTIONURI` | | | Connection URI for postgres server to store Lightdash data in the format postgresql://user:password@host:port/db?params. This is an alternative to providing the previous PG variables. | +| `PGMAXCONNECTIONS` | | | Maximum number of connections to the database | +| `PGMINCONNECTIONS` | | | Minimum number of connections to the database | +| `LIGHTDASH_SECRET` | | | Secret key used to secure various tokens in Lightdash. This **must** be fixed between deployments. If the secret changes, you won't have access to Lightdash data. | +| `SECURE_COOKIES` | | `false` | Only allows cookies to be stored over a https connection. We use cookies to keep you logged in. This is recommended to be set to true in production. | +| `COOKIES_MAX_AGE_HOURS` | | | How many hours a user session exists before the user is automatically signed out. For example if 24, then the user will be automatically after 24 hours of inactivity. | +| `TRUST_PROXY` | | `false` | This tells the Lightdash server that it can trust the X-Forwarded-Proto header it receives in requests. This is useful if you use `SECURE_COOKIES=true` behind a HTTPS terminated proxy that you can trust. | +| `SITE_URL` | | `http://localhost:8080` | Site url where Lightdash is being hosted. It should include the protocol. E.g https://lightdash.mycompany.com | +| `INTERNAL_LIGHTDASH_HOST` | | Same as `SITE_URL` | Internal Lightdash host for the Headless browser to send requests when your Lightdash instance is not accessible from the Internet. Needs to support `https` if `SECURE_COOKIES=true` | +| `STATIC_IP` | | `http://localhost:8080` | Server static IP so users can add the IP to their warehouse allow-list. 
| +| `LIGHTDASH_QUERY_MAX_LIMIT` | | `5000` | Query max rows limit | +| `LIGHTDASH_QUERY_DEFAULT_LIMIT` | | `500` | Default number of rows to return in a query | +| `LIGHTDASH_QUERY_MAX_PAGE_SIZE` | | `2500` | Maximum page size for paginated queries | +| `SCHEDULER_ENABLED` | | `true` | Enables/Disables the scheduler worker that triggers the scheduled deliveries. | +| `SCHEDULER_CONCURRENCY` | | `3` | How many scheduled delivery jobs can be processed concurrently. | +| `SCHEDULER_JOB_TIMEOUT` | | `600000` (10 minutes) | After how many milliseconds the job should be timeout so the scheduler worker can pick other jobs. | +| `SCHEDULER_SCREENSHOT_TIMEOUT` | | | Timeout in milliseconds for taking screenshots | +| `SCHEDULER_INCLUDE_TASKS` | | | Comma-separated list of scheduler tasks to include | +| `SCHEDULER_EXCLUDE_TASKS` | | | Comma-separated list of scheduler tasks to exclude | +| `LIGHTDASH_CSV_CELLS_LIMIT` | | `100000` | Max cells on CSV file exports | +| `LIGHTDASH_CHART_VERSION_HISTORY_DAYS_LIMIT` | | `3` | Configure how far back the chart versions history goes in days | +| `LIGHTDASH_PIVOT_TABLE_MAX_COLUMN_LIMIT` | | `60` | Configure maximum number of columns in pivot table | +| `GROUPS_ENABLED` | | `false` | Enables/Disables groups functionality | +| `CUSTOM_VISUALIZATIONS_ENABLED` | | `false` | Enables/Disables custom chart functionality | +| `LIGHTDASH_MAX_PAYLOAD` | | `5mb` | Maximum HTTP request body size | +| `LIGHTDASH_LICENSE_KEY` | | | License key for Lightdash Enterprise Edition. See [Enterprise License Keys](/self-host/customize-deployment/enterprise-license-keys) for details. [Get your license key](https://calendly.com/lightdash-cloud/enterprise?utm_source=docs&utm_medium=referral&utm_campaign=enterprise_licensing&utm_content=license_key_cta) | +| `HEADLESS_BROWSER_HOST` | | — | Hostname for the headless browser | +| `HEADLESS_BROWSER_PORT` | | `3001` | Port for the headless browser | +| `ALLOW_MULTIPLE_ORGS` | | `false` | If set to `true`, new users registering on Lightdash will have their own organization, separated from others | +| `LIGHTDASH_MODE` | | `default` | Mode for Lightdash (default, demo, pr, etc.) 
| +| `DISABLE_PAT` | | `false` | Disables Personal Access Tokens | +| `PAT_ALLOWED_ORG_ROLES` | | All roles | Comma-separated list of organization roles allowed to use Personal Access Tokens | +| `PAT_MAX_EXPIRATION_TIME_IN_DAYS` | | | Maximum expiration time in days for Personal Access Tokens | +| `MAX_DOWNLOADS_AS_CODE` | | `100` | Maximum number of downloads as code | +| `EXTENDED_USAGE_ANALYTICS` | | `false` | Enables extended usage analytics | +| `USE_SECURE_BROWSER` | | `false` | Use secure WebSocket connections for headless browser | +| `DISABLE_DASHBOARD_COMMENTS` | | `false` | Disables dashboard comments | +| `ORGANIZATION_WAREHOUSE_CREDENTIALS_ENABLED` | | `false` | Enables organization warehouse settings | Lightdash also accepts all [standard postgres environment variables](https://www.postgresql.org/docs/9.3/libpq-envars.html) From fbe01ab319ae79da7e21d5bf20a644ba0ec71b86 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:42:49 +0000 Subject: [PATCH 03/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 22 +++++++++---------- 1 file changed, 11 insertions(+), 11 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 3d683344..d4af1470 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -57,17 +57,17 @@ Lightdash also accepts all [standard postgres environment variables](https://www This is a reference to all the SMTP environment variables that can be used to configure a Lightdash email client. -| Variable | Description | Required? | Default | -| :------------------------------ | :------------------------------------------------------------------------- | :-------------------------------------------: | :---------: | -| `EMAIL_SMTP_HOST` | Hostname of email server | | | -| `EMAIL_SMTP_PORT` | Port of email server | | `587` | -| `EMAIL_SMTP_SECURE` | Secure connection | | `true` | -| `EMAIL_SMTP_USER` | Auth user | | | -| `EMAIL_SMTP_PASSWORD` | Auth password | `[1]` | | -| `EMAIL_SMTP_ACCESS_TOKEN` | Auth access token for Oauth2 authentication | `[1]` | | -| `EMAIL_SMTP_ALLOW_INVALID_CERT` | Allow connection to TLS server with self-signed or invalid TLS certificate | | `false` | -| `EMAIL_SMTP_SENDER_EMAIL` | The email address that sends emails | | | -| `EMAIL_SMTP_SENDER_NAME` | The name of the email address that sends emails | | `Lightdash` | +| Variable | Required? 
| Default | Description | +| :------------------------------ | :-------------------------------------------: | :---------: | :------------------------------------------------------------------------- | +| `EMAIL_SMTP_HOST` | | | Hostname of email server | +| `EMAIL_SMTP_PORT` | | `587` | Port of email server | +| `EMAIL_SMTP_SECURE` | | `true` | Secure connection | +| `EMAIL_SMTP_USER` | | | Auth user | +| `EMAIL_SMTP_PASSWORD` | `[1]` | | Auth password | +| `EMAIL_SMTP_ACCESS_TOKEN` | `[1]` | | Auth access token for Oauth2 authentication | +| `EMAIL_SMTP_ALLOW_INVALID_CERT` | | `false` | Allow connection to TLS server with self-signed or invalid TLS certificate | +| `EMAIL_SMTP_SENDER_EMAIL` | | | The email address that sends emails | +| `EMAIL_SMTP_SENDER_NAME` | | `Lightdash` | The name of the email address that sends emails | [1] `EMAIL_SMTP_PASSWORD` or `EMAIL_SMTP_ACCESS_TOKEN` needs to be provided From 8db08d5179a76d3f8b578ddaef4b86f54164a437 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:43:12 +0000 Subject: [PATCH 04/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 58 +++++++++---------- 1 file changed, 29 insertions(+), 29 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index d4af1470..6967ba2c 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -75,35 +75,35 @@ This is a reference to all the SMTP environment variables that can be used to co These variables enable you to control Single Sign On (SSO) functionality. -| Variable | Description | Required? | Default | -| :------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------- | :-------: | :-----: | -| `AUTH_DISABLE_PASSWORD_AUTHENTICATION` | If "true" disables signing in with plain passwords | | `false` | -| `AUTH_ENABLE_GROUP_SYNC` | If "true" enables assigning SSO groups to Lightdash groups | | `false` | -| `AUTH_ENABLE_OIDC_LINKING` | Enables linking a new OIDC identity to an existing user if they already have another OIDC with the same email | | `false` | -| `AUTH_ENABLE_OIDC_TO_EMAIL_LINKING` | Enables linking OIDC identity to an existing user by email. Required when using [SCIM](/references/scim-integration) with SSO | | `false` | -| `AUTH_GOOGLE_OAUTH2_CLIENT_ID` | Required for Google SSO | | | -| `AUTH_GOOGLE_OAUTH2_CLIENT_SECRET` | Required for Google SSO | | | -| `AUTH_OKTA_OAUTH_CLIENT_ID` | Required for Okta SSO | | | -| `AUTH_OKTA_OAUTH_CLIENT_SECRET` | Required for Okta SSO | | | -| `AUTH_OKTA_OAUTH_ISSUER` | Required for Okta SSO | | | -| `AUTH_OKTA_DOMAIN` | Required for Okta SSO | | | -| `AUTH_OKTA_AUTHORIZATION_SERVER_ID` | Optional for Okta SSO with a custom authorization server | | | -| `AUTH_OKTA_EXTRA_SCOPES` | Optional for Okta SSO scopes (e.g. 
groups) without a custom authorization server | | | -| `AUTH_ONE_LOGIN_OAUTH_CLIENT_ID` | Required for One Login SSO | | | -| `AUTH_ONE_LOGIN_OAUTH_CLIENT_SECRET` | Required for One Login SSO | | | -| `AUTH_ONE_LOGIN_OAUTH_ISSUER` | Required for One Login SSO | | | -| `AUTH_AZURE_AD_OAUTH_CLIENT_ID` | Required for Azure AD | | | -| `AUTH_AZURE_AD_OAUTH_CLIENT_SECRET` | Required for Azure AD | | | -| `AUTH_AZURE_AD_OAUTH_TENANT_ID` | Required for Azure AD | | | -| `AUTH_AZURE_AD_OIDC_METADATA_ENDPOINT` | Optional for Azure AD | | | -| `AUTH_AZURE_AD_X509_CERT_PATH` | Optional for Azure AD | | | -| `AUTH_AZURE_AD_X509_CERT` | Optional for Azure AD | | | -| `AUTH_AZURE_AD_PRIVATE_KEY_PATH` | Optional for Azure AD | | | -| `AUTH_AZURE_AD_PRIVATE_KEY` | Optional for Azure AD | | | -| `DATABRICKS_OAUTH_CLIENT_ID` | Client ID for Databricks OAuth | | | -| `DATABRICKS_OAUTH_CLIENT_SECRET` | Client secret for Databricks OAuth (optional) | | | -| `DATABRICKS_OAUTH_AUTHORIZATION_ENDPOINT` | Authorization endpoint URL for Databricks OAuth | | | -| `DATABRICKS_OAUTH_TOKEN_ENDPOINT` | Token endpoint URL for Databricks OAuth | | | +| Variable | Required? | Default | Description | +| :------------------------------------- | :-------: | :-----: | :------------------------------------------------------------------------------------------------------------------------------------------- | +| `AUTH_DISABLE_PASSWORD_AUTHENTICATION` | | `false` | If "true" disables signing in with plain passwords | +| `AUTH_ENABLE_GROUP_SYNC` | | `false` | If "true" enables assigning SSO groups to Lightdash groups | +| `AUTH_ENABLE_OIDC_LINKING` | | `false` | Enables linking a new OIDC identity to an existing user if they already have another OIDC with the same email | +| `AUTH_ENABLE_OIDC_TO_EMAIL_LINKING` | | `false` | Enables linking OIDC identity to an existing user by email. Required when using [SCIM](/references/scim-integration) with SSO | +| `AUTH_GOOGLE_OAUTH2_CLIENT_ID` | | | Required for Google SSO | +| `AUTH_GOOGLE_OAUTH2_CLIENT_SECRET` | | | Required for Google SSO | +| `AUTH_OKTA_OAUTH_CLIENT_ID` | | | Required for Okta SSO | +| `AUTH_OKTA_OAUTH_CLIENT_SECRET` | | | Required for Okta SSO | +| `AUTH_OKTA_OAUTH_ISSUER` | | | Required for Okta SSO | +| `AUTH_OKTA_DOMAIN` | | | Required for Okta SSO | +| `AUTH_OKTA_AUTHORIZATION_SERVER_ID` | | | Optional for Okta SSO with a custom authorization server | +| `AUTH_OKTA_EXTRA_SCOPES` | | | Optional for Okta SSO scopes (e.g. 
groups) without a custom authorization server | +| `AUTH_ONE_LOGIN_OAUTH_CLIENT_ID` | | | Required for One Login SSO | +| `AUTH_ONE_LOGIN_OAUTH_CLIENT_SECRET` | | | Required for One Login SSO | +| `AUTH_ONE_LOGIN_OAUTH_ISSUER` | | | Required for One Login SSO | +| `AUTH_AZURE_AD_OAUTH_CLIENT_ID` | | | Required for Azure AD | +| `AUTH_AZURE_AD_OAUTH_CLIENT_SECRET` | | | Required for Azure AD | +| `AUTH_AZURE_AD_OAUTH_TENANT_ID` | | | Required for Azure AD | +| `AUTH_AZURE_AD_OIDC_METADATA_ENDPOINT` | | | Optional for Azure AD | +| `AUTH_AZURE_AD_X509_CERT_PATH` | | | Optional for Azure AD | +| `AUTH_AZURE_AD_X509_CERT` | | | Optional for Azure AD | +| `AUTH_AZURE_AD_PRIVATE_KEY_PATH` | | | Optional for Azure AD | +| `AUTH_AZURE_AD_PRIVATE_KEY` | | | Optional for Azure AD | +| `DATABRICKS_OAUTH_CLIENT_ID` | | | Client ID for Databricks OAuth | +| `DATABRICKS_OAUTH_CLIENT_SECRET` | | | Client secret for Databricks OAuth (optional) | +| `DATABRICKS_OAUTH_AUTHORIZATION_ENDPOINT` | | | Authorization endpoint URL for Databricks OAuth | +| `DATABRICKS_OAUTH_TOKEN_ENDPOINT` | | | Token endpoint URL for Databricks OAuth | ## S3 From 9394ef9758f6fb0602ebf22a089c4e45526e5d16 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:43:35 +0000 Subject: [PATCH 05/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 28 +++++++++---------- 1 file changed, 14 insertions(+), 14 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 6967ba2c..2263d27b 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -109,20 +109,20 @@ These variables enable you to control Single Sign On (SSO) functionality. These variables allow you to configure [S3 Object Storage](/self-host/customize-deployment/configure-lightdash-to-use-external-object-storage), which is required to self-host Lightdash. -| Variable | Description | Required? | Default | -|:--------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------:|:---------------:| -| `S3_ENDPOINT` | S3 endpoint for storing results | | | -| `S3_BUCKET` | Name of the S3 bucket for storing files | | | -| `S3_REGION` | Region where the S3 bucket is located | | | -| `S3_ACCESS_KEY` | Access key for authenticating with the S3 bucket | | | -| `S3_SECRET_KEY` | Secret key for authenticating with the S3 bucket | | | -| `S3_USE_CREDENTIALS_FROM` | Configures the credential provider chain for AWS S3 authentication if access key and secret is not provided. Supports: `env` (environment variables), `token_file` (token file credentials), `ini` (initialization file credentials), `ecs` (container metadata credentials), `ec2` (instance metadata credentials). Multiple values can be specified in order of preference. 
| | | -| `S3_EXPIRATION_TIME` | Expiration time for scheduled deliveries files | | 259200 (3d) | -| `S3_FORCE_PATH_STYLE` | Force path style addressing, needed for MinIO setup e.g. `http://your.s3.domain/BUCKET/KEY` instead of `http://BUCKET.your.s3.domain/KEY` | | `false` | -| `RESULTS_S3_BUCKET` | Name of the S3 bucket used for storing query results | | `S3_BUCKET` | -| `RESULTS_S3_REGION` | Region where the S3 query storage bucket is located | | `S3_REGION` | -| `RESULTS_S3_ACCESS_KEY` | Access key for authenticating with the S3 query storage bucket | | `S3_ACCESS_KEY` | -| `RESULTS_S3_SECRET_KEY` | Secret key for authenticating with the S3 query storage bucket | | `S3_SECRET_KEY` | +| Variable | Required? | Default | Description | +|:--------------------------|:---------------------------------------------:|:---------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| `S3_ENDPOINT` | | | S3 endpoint for storing results | +| `S3_BUCKET` | | | Name of the S3 bucket for storing files | +| `S3_REGION` | | | Region where the S3 bucket is located | +| `S3_ACCESS_KEY` | | | Access key for authenticating with the S3 bucket | +| `S3_SECRET_KEY` | | | Secret key for authenticating with the S3 bucket | +| `S3_USE_CREDENTIALS_FROM` | | | Configures the credential provider chain for AWS S3 authentication if access key and secret is not provided. Supports: `env` (environment variables), `token_file` (token file credentials), `ini` (initialization file credentials), `ecs` (container metadata credentials), `ec2` (instance metadata credentials). Multiple values can be specified in order of preference. | +| `S3_EXPIRATION_TIME` | | 259200 (3d) | Expiration time for scheduled deliveries files | +| `S3_FORCE_PATH_STYLE` | | `false` | Force path style addressing, needed for MinIO setup e.g. 
`http://your.s3.domain/BUCKET/KEY` instead of `http://BUCKET.your.s3.domain/KEY` | +| `RESULTS_S3_BUCKET` | | `S3_BUCKET` | Name of the S3 bucket used for storing query results | +| `RESULTS_S3_REGION` | | `S3_REGION` | Region where the S3 query storage bucket is located | +| `RESULTS_S3_ACCESS_KEY` | | `S3_ACCESS_KEY` | Access key for authenticating with the S3 query storage bucket | +| `RESULTS_S3_SECRET_KEY` | | `S3_SECRET_KEY` | Secret key for authenticating with the S3 query storage bucket | ## Cache From 298d26948a1c22a08f614589f291e263f7a0248c Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:43:43 +0000 Subject: [PATCH 06/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 2263d27b..8a318286 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -130,11 +130,11 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- Note that you will need an Enterprise License Key for this functionality. -| Variable | Description | Required? | Default | -| :--------------------------- | :------------------------------------------------------------------------- | :-------: | :---------: | -| `RESULTS_CACHE_ENABLED` | Enables caching for chart results | | `false` | -| `AUTOCOMPLETE_CACHE_ENABLED` | Enables caching for filter autocomplete results | | `false` | -| `CACHE_STALE_TIME_SECONDS` | Defines how long cached results remain valid before being considered stale | | 86400 (24h) | +| Variable | Required? | Default | Description | +| :--------------------------- | :-------: | :---------: | :------------------------------------------------------------------------- | +| `RESULTS_CACHE_ENABLED` | | `false` | Enables caching for chart results | +| `AUTOCOMPLETE_CACHE_ENABLED` | | `false` | Enables caching for filter autocomplete results | +| `CACHE_STALE_TIME_SECONDS` | | 86400 (24h) | Defines how long cached results remain valid before being considered stale | These variables are **deprecated**; use the `RESULTS_S3_*` versions instead. From 6847ebbabe59941f7c3d9a1e1ea1188fa9803836 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:43:51 +0000 Subject: [PATCH 07/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 8a318286..a6ad1ed0 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -140,12 +140,12 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- These variables are **deprecated**; use the `RESULTS_S3_*` versions instead. -| Variable | Description | Required? 
| Default | -| :---------------------------- | :------------------------------------- | :-------: | :-------------: | -| `RESULTS_CACHE_S3_BUCKET` | Deprecated - use RESULTS_S3_BUCKET | | `S3_BUCKET` | -| `RESULTS_CACHE_S3_REGION` | Deprecated - use RESULTS_S3_REGION | | `S3_REGION` | -| `RESULTS_CACHE_S3_ACCESS_KEY` | Deprecated - use RESULTS_S3_ACCESS_KEY | | `S3_ACCESS_KEY` | -| `RESULTS_CACHE_S3_SECRET_KEY` | Deprecated - use RESULTS_S3_SECRET_KEY | | `S3_SECRET_KEY` | +| Variable | Required? | Default | Description | +| :---------------------------- | :-------: | :-------------: | :------------------------------------- | +| `RESULTS_CACHE_S3_BUCKET` | | `S3_BUCKET` | Deprecated - use RESULTS_S3_BUCKET | +| `RESULTS_CACHE_S3_REGION` | | `S3_REGION` | Deprecated - use RESULTS_S3_REGION | +| `RESULTS_CACHE_S3_ACCESS_KEY` | | `S3_ACCESS_KEY` | Deprecated - use RESULTS_S3_ACCESS_KEY | +| `RESULTS_CACHE_S3_SECRET_KEY` | | `S3_SECRET_KEY` | Deprecated - use RESULTS_S3_SECRET_KEY | ## Logging From 5563884a4af9eefe4ce2c81fccadf2bb6e010fb6 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:44:06 +0000 Subject: [PATCH 08/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 20 +++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index a6ad1ed0..6d1e8e82 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -149,16 +149,16 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- ## Logging -| Variable | Description | Required? | Default | -| :----------------------------- | :------------------------------------------------------------------------------------------ | :-------: | :--------------------: | -| `LIGHTDASH_LOG_LEVEL` | The minimum level of log messages to show. `DEBUG`, `AUDIT`, `HTTP`, `INFO`, `WARN` `ERROR` | | `INFO` | -| `LIGHTDASH_LOG_FORMAT` | The format of log messages. `PLAIN`, `PRETTY`, `JSON` | | `pretty` | -| `LIGHTDASH_LOG_OUTPUTS` | The outputs to send log messages to | | `console` | -| `LIGHTDASH_LOG_CONSOLE_LEVEL` | The minimum level of log messages to display on the console | | `LIGHTDASH_LOG_LEVEL` | -| `LIGHTDASH_LOG_CONSOLE_FORMAT` | The format of log messages on the console | | `LIGHTDASH_LOG_FORMAT` | -| `LIGHTDASH_LOG_FILE_LEVEL` | The minimum level of log messages to write to the log file | | `LIGHTDASH_LOG_LEVEL` | -| `LIGHTDASH_LOG_FILE_FORMAT` | The format of log messages in the log file | | `LIGHTDASH_LOG_FORMAT` | -| `LIGHTDASH_LOG_FILE_PATH` | The path to the log file. Requires `LIGHTDASH_LOG_OUTPUTS` to include `file` to enable file output. | | `./logs/all.log` | +| Variable | Required? | Default | Description | +| :----------------------------- | :-------: | :--------------------: | :------------------------------------------------------------------------------------------ | +| `LIGHTDASH_LOG_LEVEL` | | `INFO` | The minimum level of log messages to show. `DEBUG`, `AUDIT`, `HTTP`, `INFO`, `WARN` `ERROR` | +| `LIGHTDASH_LOG_FORMAT` | | `pretty` | The format of log messages. 
`PLAIN`, `PRETTY`, `JSON` | +| `LIGHTDASH_LOG_OUTPUTS` | | `console` | The outputs to send log messages to | +| `LIGHTDASH_LOG_CONSOLE_LEVEL` | | `LIGHTDASH_LOG_LEVEL` | The minimum level of log messages to display on the console | +| `LIGHTDASH_LOG_CONSOLE_FORMAT` | | `LIGHTDASH_LOG_FORMAT` | The format of log messages on the console | +| `LIGHTDASH_LOG_FILE_LEVEL` | | `LIGHTDASH_LOG_LEVEL` | The minimum level of log messages to write to the log file | +| `LIGHTDASH_LOG_FILE_FORMAT` | | `LIGHTDASH_LOG_FORMAT` | The format of log messages in the log file | +| `LIGHTDASH_LOG_FILE_PATH` | | `./logs/all.log` | The path to the log file. Requires `LIGHTDASH_LOG_OUTPUTS` to include `file` to enable file output. | ## Prometheus From 05265b03d73de97392630d1f2bfba0320c28d4f9 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:44:19 +0000 Subject: [PATCH 09/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 18 +++++++++--------- 1 file changed, 9 insertions(+), 9 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 6d1e8e82..5ca834aa 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -162,15 +162,15 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- ## Prometheus -| Variable | Description | Required? | Default | -| :------------------------------------------ | :------------------------------------------------------------------------------ | :-------: | :-------------------------: | -| `LIGHTDASH_PROMETHEUS_ENABLED` | Enables/Disables Prometheus metrics endpoint | | `false` | -| `LIGHTDASH_PROMETHEUS_PORT` | Port for Prometheus metrics endpoint | | `9090` | -| `LIGHTDASH_PROMETHEUS_PATH` | Path for Prometheus metrics endpoint | | `/metrics` | -| `LIGHTDASH_PROMETHEUS_PREFIX` | Prefix for metric names. | | | -| `LIGHTDASH_GC_DURATION_BUCKETS` | Buckets for duration histogram in seconds. | | `0.001, 0.01, 0.1, 1, 2, 5` | -| `LIGHTDASH_EVENT_LOOP_MONITORING_PRECISION` | Precision for event loop monitoring in milliseconds. Must be greater than zero. | | `10` | -| `LIGHTDASH_PROMETHEUS_LABELS` | Labels to add to all metrics. Must be valid JSON | | | +| Variable | Required? | Default | Description | +| :------------------------------------------ | :-------: | :-------------------------: | :------------------------------------------------------------------------------ | +| `LIGHTDASH_PROMETHEUS_ENABLED` | | `false` | Enables/Disables Prometheus metrics endpoint | +| `LIGHTDASH_PROMETHEUS_PORT` | | `9090` | Port for Prometheus metrics endpoint | +| `LIGHTDASH_PROMETHEUS_PATH` | | `/metrics` | Path for Prometheus metrics endpoint | +| `LIGHTDASH_PROMETHEUS_PREFIX` | | | Prefix for metric names. | +| `LIGHTDASH_GC_DURATION_BUCKETS` | | `0.001, 0.01, 0.1, 1, 2, 5` | Buckets for duration histogram in seconds. | +| `LIGHTDASH_EVENT_LOOP_MONITORING_PRECISION` | | `10` | Precision for event loop monitoring in milliseconds. Must be greater than zero. | +| `LIGHTDASH_PROMETHEUS_LABELS` | | | Labels to add to all metrics. 
Must be valid JSON | ## Security From ff15bf5fb0e991bd6dd9c442e80835b800316cf5 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:44:36 +0000 Subject: [PATCH 10/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 5ca834aa..0d3bb004 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -174,13 +174,13 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- ## Security -| Variable | Description | Required? | Default | -| :------------------------------- | :--------------------------------------------------------------------------------------------------------------- | :-------: | :-----: | -| `LIGHTDASH_CSP_REPORT_ONLY` | Enables Content Security Policy (CSP) reporting only mode. This is recommended to be set to false in production. | | `true` | -| `LIGHTDASH_CSP_ALLOWED_DOMAINS` | List of domains that are allowed to load resources from. Values must be separated by commas. | | | -| `LIGHTDASH_CSP_REPORT_URI` | URI to send CSP violation reports to. | | | -| `LIGHTDASH_CORS_ENABLED` | Enables Cross-Origin Resource Sharing (CORS) | | `false` | -| `LIGHTDASH_CORS_ALLOWED_DOMAINS` | List of domains that are allowed to make cross-origin requests. Values must be separated by commas. | | | +| Variable | Required? | Default | Description | +| :------------------------------- | :-------: | :-----: | :--------------------------------------------------------------------------------------------------------------- | +| `LIGHTDASH_CSP_REPORT_ONLY` | | `true` | Enables Content Security Policy (CSP) reporting only mode. This is recommended to be set to false in production. | +| `LIGHTDASH_CSP_ALLOWED_DOMAINS` | | | List of domains that are allowed to load resources from. Values must be separated by commas. | +| `LIGHTDASH_CSP_REPORT_URI` | | | URI to send CSP violation reports to. | +| `LIGHTDASH_CORS_ENABLED` | | `false` | Enables Cross-Origin Resource Sharing (CORS) | +| `LIGHTDASH_CORS_ALLOWED_DOMAINS` | | | List of domains that are allowed to make cross-origin requests. Values must be separated by commas. | ## Analytics & Event Tracking From d22d3226e0efc7714b4d84c921e86508e1019ef6 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:45:26 +0000 Subject: [PATCH 11/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 24 +++++++++---------- 1 file changed, 12 insertions(+), 12 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 0d3bb004..c268a6de 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -197,18 +197,18 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- These variables enable you to configure the [AI Analyst functionality](/guides/ai-agents). 
Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Description | Required? | Default | -| ---------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------- | --------- | -------- | -| `AI_COPILOT_ENABLED` | Enables/Disables AI Analyst functionality | | `false` | -| `ASK_AI_BUTTON_ENABLED` | Enables the "Ask AI" button in the interface for direct access to AI agents, when disabled agents can be acessed from `/ai-agents` route | | `false` | -| `AI_EMBEDDING_ENABLED` | Enables AI embedding functionality for verified answers similarity matching | | `false` | -| `AI_DEFAULT_PROVIDER` | Default AI provider to use (`openai`, `azure`, `anthropic`, `openrouter`, `bedrock`) | | `openai` | -| `AI_DEFAULT_EMBEDDING_PROVIDER` | Default AI provider for embeddings (`openai`, `bedrock`, `azure`) | | `openai` | -| `AI_COPILOT_DEBUG_LOGGING_ENABLED` | Enables debug logging for AI Copilot | | `false` | -| `AI_COPILOT_TELEMETRY_ENABLED` | Enables telemetry for AI Copilot | | `false` | -| `AI_COPILOT_REQUIRES_FEATURE_FLAG` | Requires a feature flag to use AI Copilot | | `false` | -| `AI_COPILOT_MAX_QUERY_LIMIT` | Maximum number of rows returned in AI-generated queries | | `500` | -| `AI_VERIFIED_ANSWER_SIMILARITY_THRESHOLD`| Similarity threshold (0-1) for verified answer matching | | `0.6` | +| Variable | Required? | Default | Description | +| ---------------------------------------- | --------- | -------- | ---------------------------------------------------------------------------------------------------------------------------------------- | +| `AI_COPILOT_ENABLED` | | `false` | Enables/Disables AI Analyst functionality | +| `ASK_AI_BUTTON_ENABLED` | | `false` | Enables the "Ask AI" button in the interface for direct access to AI agents, when disabled agents can be acessed from `/ai-agents` route | +| `AI_EMBEDDING_ENABLED` | | `false` | Enables AI embedding functionality for verified answers similarity matching | +| `AI_DEFAULT_PROVIDER` | | `openai` | Default AI provider to use (`openai`, `azure`, `anthropic`, `openrouter`, `bedrock`) | +| `AI_DEFAULT_EMBEDDING_PROVIDER` | | `openai` | Default AI provider for embeddings (`openai`, `bedrock`, `azure`) | +| `AI_COPILOT_DEBUG_LOGGING_ENABLED` | | `false` | Enables debug logging for AI Copilot | +| `AI_COPILOT_TELEMETRY_ENABLED` | | `false` | Enables telemetry for AI Copilot | +| `AI_COPILOT_REQUIRES_FEATURE_FLAG` | | `false` | Requires a feature flag to use AI Copilot | +| `AI_COPILOT_MAX_QUERY_LIMIT` | | `500` | Maximum number of rows returned in AI-generated queries | +| `AI_VERIFIED_ANSWER_SIMILARITY_THRESHOLD`| | `0.6` | Similarity threshold (0-1) for verified answer matching | The AI Analyst supports multiple providers for flexibility. Choose one of the provider configurations below based on your preferred AI service. 
**OpenAI integration is the recommended option as it is the most tested and stable implementation.** From 66c78e2819b9b6330ba2670151c44f9be58d7a3c Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:45:36 +0000 Subject: [PATCH 12/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index c268a6de..a364fda9 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -218,13 +218,13 @@ To enable AI Analyst, set `AI_COPILOT_ENABLED=true` and provide an API key for ` #### OpenAI Configuration -| Variable | Description | Required? | Default | -| ------------------------ | ------------------------------------------------------------- | -------------------------- | ------------------------ | -| `OPENAI_API_KEY` | API key for OpenAI | Required when using OpenAI | | -| `OPENAI_MODEL_NAME` | OpenAI model name to use | | `gpt-4.1` | -| `OPENAI_EMBEDDING_MODEL` | OpenAI embedding model for verified answers | | `text-embedding-3-small` | -| `OPENAI_BASE_URL` | Optional base URL for OpenAI compatible API | | | -| `OPENAI_AVAILABLE_MODELS`| Comma-separated list of models available in the model picker | | All supported models | +| Variable | Required? | Default | Description | +| ------------------------ | -------------------------- | ------------------------ | ------------------------------------------------------------- | +| `OPENAI_API_KEY` | Required when using OpenAI | | API key for OpenAI | +| `OPENAI_MODEL_NAME` | | `gpt-4.1` | OpenAI model name to use | +| `OPENAI_EMBEDDING_MODEL` | | `text-embedding-3-small` | OpenAI embedding model for verified answers | +| `OPENAI_BASE_URL` | | | Optional base URL for OpenAI compatible API | +| `OPENAI_AVAILABLE_MODELS`| | All supported models | Comma-separated list of models available in the model picker | #### Anthropic Configuration From 0327e4c78c49a32ae7e029cd6ccc50fd14b9c4ad Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:45:44 +0000 Subject: [PATCH 13/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index a364fda9..31673b32 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -228,11 +228,11 @@ To enable AI Analyst, set `AI_COPILOT_ENABLED=true` and provide an API key for ` #### Anthropic Configuration -| Variable | Description | Required? 
| Default | -| -------------------------- | ------------------------------------------------------------ | ----------------------------- | ------------------ | -| `ANTHROPIC_API_KEY` | API key for Anthropic | Required when using Anthropic | | -| `ANTHROPIC_MODEL_NAME` | Anthropic model name to use | | `claude-sonnet-4-5`| -| `ANTHROPIC_AVAILABLE_MODELS`| Comma-separated list of models available in the model picker | | All supported models | +| Variable | Required? | Default | Description | +| -------------------------- | ----------------------------- | ------------------ | ------------------------------------------------------------ | +| `ANTHROPIC_API_KEY` | Required when using Anthropic | | API key for Anthropic | +| `ANTHROPIC_MODEL_NAME` | | `claude-sonnet-4-5`| Anthropic model name to use | +| `ANTHROPIC_AVAILABLE_MODELS`| | All supported models | Comma-separated list of models available in the model picker | #### Azure AI Configuration From 4132cd80113e9db4063198c50037d77d10e8ff3d Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:45:54 +0000 Subject: [PATCH 14/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 16 ++++++++-------- 1 file changed, 8 insertions(+), 8 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 31673b32..db4a1d6c 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -236,14 +236,14 @@ To enable AI Analyst, set `AI_COPILOT_ENABLED=true` and provide an API key for ` #### Azure AI Configuration -| Variable | Description | Required? | Default | -| --------------------------------- | ------------------------------------------------------------ | ---------------------------- | ------------------------ | -| `AZURE_AI_API_KEY` | API key for Azure AI | Required when using Azure AI | | -| `AZURE_AI_ENDPOINT` | Endpoint for Azure AI | Required when using Azure AI | | -| `AZURE_AI_API_VERSION` | API version for Azure AI | Required when using Azure AI | | -| `AZURE_AI_DEPLOYMENT_NAME` | Deployment name for Azure AI | Required when using Azure AI | | -| `AZURE_EMBEDDING_DEPLOYMENT_NAME` | Deployment name for Azure embedding model | | `text-embedding-3-small` | -| `AZURE_USE_DEPLOYMENT_BASED_URLS` | Use deployment-based URLs for Azure OpenAI API calls | | `true` | +| Variable | Required? 
| Default | Description | +| --------------------------------- | ---------------------------- | ------------------------ | ------------------------------------------------------------ | +| `AZURE_AI_API_KEY` | Required when using Azure AI | | API key for Azure AI | +| `AZURE_AI_ENDPOINT` | Required when using Azure AI | | Endpoint for Azure AI | +| `AZURE_AI_API_VERSION` | Required when using Azure AI | | API version for Azure AI | +| `AZURE_AI_DEPLOYMENT_NAME` | Required when using Azure AI | | Deployment name for Azure AI | +| `AZURE_EMBEDDING_DEPLOYMENT_NAME` | | `text-embedding-3-small` | Deployment name for Azure embedding model | +| `AZURE_USE_DEPLOYMENT_BASED_URLS` | | `true` | Use deployment-based URLs for Azure OpenAI API calls | #### OpenRouter Configuration From 5ac48b0bd582781a1f861b0cda0a054b95829237 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:46:05 +0000 Subject: [PATCH 15/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index db4a1d6c..f9451dd6 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -247,12 +247,12 @@ To enable AI Analyst, set `AI_COPILOT_ENABLED=true` and provide an API key for ` #### OpenRouter Configuration -| Variable | Description | Required? | Default | -| ------------------------------ | --------------------------------------------------------------------------- | ------------------------------ | --------------------------- | -| `OPENROUTER_API_KEY` | API key for OpenRouter | Required when using OpenRouter | | -| `OPENROUTER_MODEL_NAME` | OpenRouter model name to use | | `openai/gpt-4.1-2025-04-14` | -| `OPENROUTER_SORT_ORDER` | Provider sorting method (`price`, `throughput`, `latency`) | | `latency` | -| `OPENROUTER_ALLOWED_PROVIDERS` | Comma-separated list of allowed providers (`anthropic`, `openai`, `google`) | | `openai` | +| Variable | Required? 
| Default | Description | +| ------------------------------ | ------------------------------ | --------------------------- | --------------------------------------------------------------------------- | +| `OPENROUTER_API_KEY` | Required when using OpenRouter | | API key for OpenRouter | +| `OPENROUTER_MODEL_NAME` | | `openai/gpt-4.1-2025-04-14` | OpenRouter model name to use | +| `OPENROUTER_SORT_ORDER` | | `latency` | Provider sorting method (`price`, `throughput`, `latency`) | +| `OPENROUTER_ALLOWED_PROVIDERS` | | `openai` | Comma-separated list of allowed providers (`anthropic`, `openai`, `google`) | #### AWS Bedrock Configuration From edf1a9f1e86613217e3e0cc318eb9a4c9da06120 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:46:17 +0000 Subject: [PATCH 16/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 20 +++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index f9451dd6..000073a2 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -256,16 +256,16 @@ To enable AI Analyst, set `AI_COPILOT_ENABLED=true` and provide an API key for ` #### AWS Bedrock Configuration -| Variable | Description | Required? | Default | -| -------------------------- | ------------------------------------------------------------- | ------------------------------------------------ | --------------------- | -| `BEDROCK_API_KEY` | API key for Bedrock (alternative to IAM credentials) | Required if not using IAM credentials | | -| `BEDROCK_ACCESS_KEY_ID` | AWS access key ID for Bedrock | Required if not using API key | | -| `BEDROCK_SECRET_ACCESS_KEY`| AWS secret access key for Bedrock | Required if using access key ID | | -| `BEDROCK_SESSION_TOKEN` | AWS session token (for temporary credentials) | | | -| `BEDROCK_REGION` | AWS region for Bedrock | | | -| `BEDROCK_MODEL_NAME` | Bedrock model name to use | | `claude-sonnet-4-5` | -| `BEDROCK_EMBEDDING_MODEL` | Bedrock embedding model for verified answers | | `cohere.embed-english-v3` | -| `BEDROCK_AVAILABLE_MODELS` | Comma-separated list of models available in the model picker | | All supported models | +| Variable | Required? 
| Default | Description | +| -------------------------- | ------------------------------------------------ | --------------------- | ------------------------------------------------------------- | +| `BEDROCK_API_KEY` | Required if not using IAM credentials | | API key for Bedrock (alternative to IAM credentials) | +| `BEDROCK_ACCESS_KEY_ID` | Required if not using API key | | AWS access key ID for Bedrock | +| `BEDROCK_SECRET_ACCESS_KEY`| Required if using access key ID | | AWS secret access key for Bedrock | +| `BEDROCK_SESSION_TOKEN` | | | AWS session token (for temporary credentials) | +| `BEDROCK_REGION` | | | AWS region for Bedrock | +| `BEDROCK_MODEL_NAME` | | `claude-sonnet-4-5` | Bedrock model name to use | +| `BEDROCK_EMBEDDING_MODEL` | | `cohere.embed-english-v3` | Bedrock embedding model for verified answers | +| `BEDROCK_AVAILABLE_MODELS` | | All supported models | Comma-separated list of models available in the model picker | #### Supported Models From d99117056dd0fd5a6959f47cc4f28f3a2d5a4622 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:46:30 +0000 Subject: [PATCH 17/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 22 +++++++++---------- 1 file changed, 11 insertions(+), 11 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 000073a2..e47861a0 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -287,17 +287,17 @@ For Bedrock, the region prefix is also added based on `BEDROCK_REGION` (e.g., `c These variables enable you to configure the [Slack integration](/guides/using-slack-integration). -| Variable | Description | Required? | Default | -| :--------------------------- | :------------------------------------------- | :-------: | :-------------------: | -| `SLACK_SIGNING_SECRET` | Required for Slack integration | | | -| `SLACK_CLIENT_ID` | Required for Slack integration | | | -| `SLACK_CLIENT_SECRET` | Required for Slack integration | | | -| `SLACK_STATE_SECRET` | Required for Slack integration | | `slack-state-secret` | -| `SLACK_APP_TOKEN` | App token for Slack | | | -| `SLACK_PORT` | Port for Slack integration | | `4351` | -| `SLACK_SOCKET_MODE` | Enable socket mode for Slack | | `false` | -| `SLACK_CHANNELS_CACHED_TIME` | Time in milliseconds to cache Slack channels | | `600000` (10 minutes) | -| `SLACK_SUPPORT_URL` | URL for Slack support | | | +| Variable | Required? 
| Default | Description | +| :--------------------------- | :-------: | :-------------------: | :------------------------------------------- | +| `SLACK_SIGNING_SECRET` | | | Required for Slack integration | +| `SLACK_CLIENT_ID` | | | Required for Slack integration | +| `SLACK_CLIENT_SECRET` | | | Required for Slack integration | +| `SLACK_STATE_SECRET` | | `slack-state-secret` | Required for Slack integration | +| `SLACK_APP_TOKEN` | | | App token for Slack | +| `SLACK_PORT` | | `4351` | Port for Slack integration | +| `SLACK_SOCKET_MODE` | | `false` | Enable socket mode for Slack | +| `SLACK_CHANNELS_CACHED_TIME` | | `600000` (10 minutes) | Time in milliseconds to cache Slack channels | +| `SLACK_SUPPORT_URL` | | | URL for Slack support | ## GitHub Integration From c85489d3e06e958577a8940f5ab74d3f3380af2a Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:46:41 +0000 Subject: [PATCH 18/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 16 ++++++++-------- 1 file changed, 8 insertions(+), 8 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index e47861a0..fba88bd5 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -303,14 +303,14 @@ These variables enable you to configure the [Slack integration](/guides/using-sl These variables enable you to configure [Github integrations](/self-host/customize-deployment/configure-github-for-lightdash) -| Variable | Description | Required? | Default | -| :----------------------- | :----------------------------------------------- | :-------------------------------------------: | :-----: | -| `GITHUB_PRIVATE_KEY` | GitHub private key for GitHub App authentication | | | -| `GITHUB_APP_ID` | GitHub Application ID | | | -| `GITHUB_CLIENT_ID` | GitHub OAuth client ID | | | -| `GITHUB_CLIENT_SECRET` | GitHub OAuth client secret | | | -| `GITHUB_APP_NAME` | Name of the GitHub App | | | -| `GITHUB_REDIRECT_DOMAIN` | Domain for GitHub OAuth redirection | | | +| Variable | Required? 
| Default | Description | +| :----------------------- | :-------------------------------------------: | :-----: | :----------------------------------------------- | +| `GITHUB_PRIVATE_KEY` | | | GitHub private key for GitHub App authentication | +| `GITHUB_APP_ID` | | | GitHub Application ID | +| `GITHUB_CLIENT_ID` | | | GitHub OAuth client ID | +| `GITHUB_CLIENT_SECRET` | | | GitHub OAuth client secret | +| `GITHUB_APP_NAME` | | | Name of the GitHub App | +| `GITHUB_REDIRECT_DOMAIN` | | | Domain for GitHub OAuth redirection | ## Microsoft Teams Integration From abefc00cbffe6dcc24377826d4b607029c02f694 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:46:47 +0000 Subject: [PATCH 19/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- self-host/customize-deployment/environment-variables.mdx | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index fba88bd5..9d4674c0 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -316,9 +316,9 @@ These variables enable you to configure [Github integrations](/self-host/customi These variables enable you to configure Microsoft Teams integration. -| Variable | Description | Required? | Default | -| :------------------------ | :---------------------------------- | :-------: | :-----: | -| `MICROSOFT_TEAMS_ENABLED` | Enables Microsoft Teams integration | | `false` | +| Variable | Required? | Default | Description | +| :------------------------ | :-------: | :-----: | :---------------------------------- | +| `MICROSOFT_TEAMS_ENABLED` | | `false` | Enables Microsoft Teams integration | ## Google Cloud Platform From 0a28f0f0967cc3f2373efbb8e7b958e0b8ee46f4 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:46:56 +0000 Subject: [PATCH 20/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 9d4674c0..c4174dd7 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -324,12 +324,12 @@ These variables enable you to configure Microsoft Teams integration. These variables enable you to configure Google Cloud Platform integration. -| Variable | Description | Required? | Default | -| :------------------------ | :--------------------------------------------------- | :-------: | :-----: | -| `GOOGLE_CLOUD_PROJECT_ID` | Google Cloud Platform project ID | | | -| `GOOGLE_DRIVE_API_KEY` | Google Drive API key | | | -| `AUTH_GOOGLE_ENABLED` | Enables Google authentication | | `false` | -| `AUTH_ENABLE_GCLOUD_ADC` | Enables Google Cloud Application Default Credentials | | `false` | +| Variable | Required? 
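As a hedged example of the Google Cloud variables in this section, a deployment that enables Google authentication and relies on Application Default Credentials might look roughly like this; the project ID and API key are placeholders:

```bash
# Google Cloud Platform sketch with placeholder identifiers
export GOOGLE_CLOUD_PROJECT_ID="my-gcp-project"
export GOOGLE_DRIVE_API_KEY="replace-with-your-drive-api-key"
export AUTH_GOOGLE_ENABLED=true
export AUTH_ENABLE_GCLOUD_ADC=true   # use Application Default Credentials instead of static keys
```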
| Default | Description | +| :------------------------ | :-------: | :-----: | :--------------------------------------------------- | +| `GOOGLE_CLOUD_PROJECT_ID` | | | Google Cloud Platform project ID | +| `GOOGLE_DRIVE_API_KEY` | | | Google Drive API key | +| `AUTH_GOOGLE_ENABLED` | | `false` | Enables Google authentication | +| `AUTH_ENABLE_GCLOUD_ADC` | | `false` | Enables Google Cloud Application Default Credentials | ## Embedding From ef0e4d165202c27cc09b6bc854da5e47c525738a Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:47:06 +0000 Subject: [PATCH 21/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index c4174dd7..c639b7cc 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -335,12 +335,12 @@ These variables enable you to configure Google Cloud Platform integration. Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Description | Required? | Default | -| :---------------------------------------- | :---------------------------------------------------------------------------------------------------- | :-------: | :-----: | -| `EMBEDDING_ENABLED` | Enables embedding functionality | | `false` | -| `EMBED_ALLOW_ALL_DASHBOARDS_BY_DEFAULT` | When creating new embeds, allow all dashboards by default | | `false` | -| `EMBED_ALLOW_ALL_CHARTS_BY_DEFAULT` | When creating new embeds, allow all charts by default | | `false` | -| `LIGHTDASH_IFRAME_EMBEDDING_DOMAINS` | List of domains that are allowed to embed Lightdash in an iframe. Values must be separated by commas. | | | +| Variable | Required? | Default | Description | +| :---------------------------------------- | :-------: | :-----: | :---------------------------------------------------------------------------------------------------- | +| `EMBEDDING_ENABLED` | | `false` | Enables embedding functionality | +| `EMBED_ALLOW_ALL_DASHBOARDS_BY_DEFAULT` | | `false` | When creating new embeds, allow all dashboards by default | +| `EMBED_ALLOW_ALL_CHARTS_BY_DEFAULT` | | `false` | When creating new embeds, allow all charts by default | +| `LIGHTDASH_IFRAME_EMBEDDING_DOMAINS` | | | List of domains that are allowed to embed Lightdash in an iframe. Values must be separated by commas. 
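For example, an Enterprise deployment that embeds dashboards in two customer-facing apps might set the embedding variables in this table roughly as follows; the domains are placeholders:

```bash
# Embedding sketch: requires an Enterprise licence key
export EMBEDDING_ENABLED=true
export EMBED_ALLOW_ALL_DASHBOARDS_BY_DEFAULT=false
export LIGHTDASH_IFRAME_EMBEDDING_DOMAINS="https://app.example.com,https://portal.example.com"   # comma-separated list
```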
| ## Custom roles From 131a233b2ae4b5b6f040cc3225b5c745baf22916 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:47:16 +0000 Subject: [PATCH 22/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- self-host/customize-deployment/environment-variables.mdx | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index c639b7cc..c857c7fb 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -346,9 +346,9 @@ Note that you will need an **Enterprise Licence Key** for this functionality. Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Description | Required? | Default | -| :--------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------- | :-------: | :-----: | -| `CUSTOM_ROLES_ENABLED` | Enables creation of custom organization roles with configurable permission scopes beyond the default Admin, Developer, Editor, and Viewer roles. | | `false` | +| Variable | Required? | Default | Description | +| :--------------------- | :-------: | :-----: | :----------------------------------------------------------------------------------------------------------------------------------------------- | +| `CUSTOM_ROLES_ENABLED` | | `false` | Enables creation of custom organization roles with configurable permission scopes beyond the default Admin, Developer, Editor, and Viewer roles. | ## Service account From adf8b3daa2c311afc00e8e9644527bd348b44a44 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:47:23 +0000 Subject: [PATCH 23/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- self-host/customize-deployment/environment-variables.mdx | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index c857c7fb..a0d5254f 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -354,9 +354,9 @@ Note that you will need an **Enterprise Licence Key** for this functionality. Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Description | Required? | Default | -| :------------------------ | :------------------------------------ | :-------: | :-----: | -| `SERVICE_ACCOUNT_ENABLED` | Enables service account functionality | | `false` | +| Variable | Required? 
| Default | Description | +| :------------------------ | :-------: | :-----: | :------------------------------------ | +| `SERVICE_ACCOUNT_ENABLED` | | `false` | Enables service account functionality | ## SCIM From 638f6e91d6bb700e1cf132ee8f9cd634c6c2c105 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:47:32 +0000 Subject: [PATCH 24/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- self-host/customize-deployment/environment-variables.mdx | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index a0d5254f..5d696499 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -362,9 +362,9 @@ Note that you will need an **Enterprise Licence Key** for this functionality. Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Description | Required? | Default | -| :------------- | :--------------------------------------------------------- | :-------: | :-----: | -| `SCIM_ENABLED` | Enables SCIM (System for Cross-domain Identity Management) | | `false` | +| Variable | Required? | Default | Description | +| :------------- | :-------: | :-----: | :--------------------------------------------------------- | +| `SCIM_ENABLED` | | `false` | Enables SCIM (System for Cross-domain Identity Management) | ## Sentry From dca923b39cd78efb688b1c4baae722d7681b6ec3 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:47:43 +0000 Subject: [PATCH 25/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 22 +++++++++---------- 1 file changed, 11 insertions(+), 11 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 5d696499..3dd529ea 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -370,17 +370,17 @@ Note that you will need an **Enterprise Licence Key** for this functionality. These variables enable you to configure Sentry for error tracking. -| Variable | Description | Required? | Default | -| :------------------------------ | :-------------------------------------------------- | :-------: | :-----: | -| `SENTRY_DSN` | Sentry DSN for both frontend and backend | | | -| `SENTRY_BE_DSN` | Sentry DSN for backend only | | | -| `SENTRY_FE_DSN` | Sentry DSN for frontend only | | | -| `SENTRY_BE_SECURITY_REPORT_URI` | URI for Sentry backend security reports | | | -| `SENTRY_TRACES_SAMPLE_RATE` | Sample rate for Sentry traces (0.0 to 1.0) | | `0.1` | -| `SENTRY_PROFILES_SAMPLE_RATE` | Sample rate for Sentry profiles (0.0 to 1.0) | | `0.2` | -| `SENTRY_ANR_ENABLED` | Enables Sentry Application Not Responding detection | | `false` | -| `SENTRY_ANR_CAPTURE_STACKTRACE` | Captures stacktrace for ANR events | | `false` | -| `SENTRY_ANR_TIMEOUT` | Timeout in milliseconds for ANR detection | | | +| Variable | Required? 
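As a small sketch of the Sentry variables in this section, with a placeholder DSN and the documented default sample rates spelled out explicitly:

```bash
# Sentry error tracking sketch: the DSN is a placeholder
export SENTRY_DSN="https://examplePublicKey@o0.ingest.sentry.io/0"
export SENTRY_TRACES_SAMPLE_RATE=0.1     # sample rate for traces (0.0 to 1.0)
export SENTRY_PROFILES_SAMPLE_RATE=0.2   # sample rate for profiles (0.0 to 1.0)
```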
| Default | Description | +| :------------------------------ | :-------: | :-----: | :-------------------------------------------------- | +| `SENTRY_DSN` | | | Sentry DSN for both frontend and backend | +| `SENTRY_BE_DSN` | | | Sentry DSN for backend only | +| `SENTRY_FE_DSN` | | | Sentry DSN for frontend only | +| `SENTRY_BE_SECURITY_REPORT_URI` | | | URI for Sentry backend security reports | +| `SENTRY_TRACES_SAMPLE_RATE` | | `0.1` | Sample rate for Sentry traces (0.0 to 1.0) | +| `SENTRY_PROFILES_SAMPLE_RATE` | | `0.2` | Sample rate for Sentry profiles (0.0 to 1.0) | +| `SENTRY_ANR_ENABLED` | | `false` | Enables Sentry Application Not Responding detection | +| `SENTRY_ANR_CAPTURE_STACKTRACE` | | `false` | Captures stacktrace for ANR events | +| `SENTRY_ANR_TIMEOUT` | | | Timeout in milliseconds for ANR detection | ## Intercom & Pylon From 0d31aedbfdecbc63d513508b269ebb09b9dd2e3c Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:47:52 +0000 Subject: [PATCH 26/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 3dd529ea..ad7818be 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -386,12 +386,12 @@ These variables enable you to configure Sentry for error tracking. These variables enable you to configure Intercom and Pylon for customer support and feedback. -| Variable | Description | Required? | Default | -| :----------------------------------- | :------------------------------------ | :-------: | :---------------------------: | -| `INTERCOM_APP_ID` | Intercom application ID | | | -| `INTERCOM_APP_BASE` | Base URL for Intercom API | | `https://api-iam.intercom.io` | -| `PYLON_APP_ID` | Pylon application ID | | | -| `PYLON_IDENTITY_VERIFICATION_SECRET` | Secret for verifying Pylon identities | | | +| Variable | Required? 
| Default | Description | +| :----------------------------------- | :-------: | :---------------------------: | :------------------------------------ | +| `INTERCOM_APP_ID` | | | Intercom application ID | +| `INTERCOM_APP_BASE` | | `https://api-iam.intercom.io` | Base URL for Intercom API | +| `PYLON_APP_ID` | | | Pylon application ID | +| `PYLON_IDENTITY_VERIFICATION_SECRET` | | | Secret for verifying Pylon identities | ## Kubernetes From 8c285d6f37221a538dc842746dea7109d7764eb1 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:48:00 +0000 Subject: [PATCH 27/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../customize-deployment/environment-variables.mdx | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index ad7818be..0713fe50 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -397,12 +397,12 @@ These variables enable you to configure Intercom and Pylon for customer support These variables enable you to configure Kubernetes integration. -| Variable | Description | Required? | Default | -| :------------------------- | :-------------------------------------- | :-------: | :-----: | -| `K8S_NODE_NAME` | Name of the Kubernetes node | | | -| `K8S_POD_NAME` | Name of the Kubernetes pod | | | -| `K8S_POD_NAMESPACE` | Namespace of the Kubernetes pod | | | -| `LIGHTDASH_CLOUD_INSTANCE` | Identifier for Lightdash cloud instance | | | +| Variable | Required? | Default | Description | +| :------------------------- | :-------: | :-----: | :-------------------------------------- | +| `K8S_NODE_NAME` | | | Name of the Kubernetes node | +| `K8S_POD_NAME` | | | Name of the Kubernetes pod | +| `K8S_POD_NAMESPACE` | | | Namespace of the Kubernetes pod | +| `LIGHTDASH_CLOUD_INSTANCE` | | | Identifier for Lightdash cloud instance | ## **Organization appearance** From f6a1fd490b1564165bf4b7bafb6ec135dde80d3a Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:48:08 +0000 Subject: [PATCH 28/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- self-host/customize-deployment/environment-variables.mdx | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 0713fe50..cfe7dab1 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -408,10 +408,10 @@ These variables enable you to configure Kubernetes integration. These variables allow you to customize the default appearance settings for your Lightdash instance's organizations. This color palette will be set for all organizations in your instance. You can't choose another one while these env vars are set. -| Variable | Description | Required? 
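To illustrate the palette override described here, a sketch with an invented palette name follows; note that the comma-separated list must contain 20 hex colors:

```bash
# Organization color palette sketch: the name and colors are arbitrary examples
export OVERRIDE_COLOR_PALETTE_NAME="Acme brand palette"
export OVERRIDE_COLOR_PALETTE_COLORS="#1f77b4,#ff7f0e,#2ca02c,#d62728,#9467bd,#8c564b,#e377c2,#7f7f7f,#bcbd22,#17becf,#aec7e8,#ffbb78,#98df8a,#ff9896,#c5b0d5,#c49c94,#f7b6d2,#c7c7c7,#dbdb8d,#9edae5"   # exactly 20 colors
```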
| Default | -| :------------------------------ | :---------------------------------------------------------------------------------------- | :-------: | :-----: | -| `OVERRIDE_COLOR_PALETTE_NAME` | Name of the default color palette | | | -| `OVERRIDE_COLOR_PALETTE_COLORS` | Comma-separated list of hex color codes for the default color palette (must be 20 colors) | | | +| Variable | Required? | Default | Description | +| :------------------------------ | :-------: | :-----: | :---------------------------------------------------------------------------------------- | +| `OVERRIDE_COLOR_PALETTE_NAME` | | | Name of the default color palette | +| `OVERRIDE_COLOR_PALETTE_COLORS` | | | Comma-separated list of hex color codes for the default color palette (must be 20 colors) | ## Initialize instance From ef177fe1594cfb88e9be303632624fedfdf29ccd Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:48:35 +0000 Subject: [PATCH 29/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 48 +++++++++---------- 1 file changed, 24 insertions(+), 24 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index cfe7dab1..25358ce6 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -427,30 +427,30 @@ When a new Lightdash instance is created, and there are no orgs and projects. Yo Currently we only support Databricks project types and Github dbt configuration. -| Variable | Description | Required? | Default | -| :------------------------------------ | :------------------------------------------------------------------------------------------------ | :-------------------------------------------: | :----------: | -| `LD_SETUP_ADMIN_NAME` | Name of the admin user for initial setup | | `Admin User` | -| `LD_SETUP_ADMIN_EMAIL` | Email of the admin user for initial setup | | | -| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | Comma-separated list of email domains for organization whitelisting | | | -| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | Default role for new organization members | | `viewer` | -| `LD_SETUP_ORGANIZATION_NAME` | Name of the organization | | | -| `LD_SETUP_ADMIN_API_KEY` | API key for the admin user, must start with `ldpat_` prefix | | | -| `LD_SETUP_API_KEY_EXPIRATION` | Number of days until API key expires (0 for no expiration) | | `30` | -| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | A pre-set token for the service account, must start with `ldsvc_` prefix | | | -| `LD_SETUP_SERVICE_ACCOUNT_EXPIRATION` | Number of days until service account token expires (0 for no expiration) | | `30` | -| `LD_SETUP_PROJECT_NAME` | Name of the project | | | -| `LD_SETUP_PROJECT_CATALOG` | Catalog name for Databricks project | | | -| `LD_SETUP_PROJECT_SCHEMA` | Schema/database name for the project | | | -| `LD_SETUP_PROJECT_HOST` | Hostname for the Databricks server | | | -| `LD_SETUP_PROJECT_HTTP_PATH` | HTTP path for Databricks connection | | | -| `LD_SETUP_PROJECT_PAT` | Personal access token for Databricks | | | -| `LD_SETUP_START_OF_WEEK` | Day to use as start of week | | `SUNDAY` | -| `LD_SETUP_PROJECT_COMPUTE` | JSON string with Databricks compute configuration like `{"name": "string", "httpPath": "string"}` | | | -| `LD_SETUP_DBT_VERSION` | Version of dbt to 
use (eg: v1.8) | | `latest` | -| `LD_SETUP_GITHUB_PAT` | GitHub personal access token | | | -| `LD_SETUP_GITHUB_REPOSITORY` | GitHub repository for dbt project | | | -| `LD_SETUP_GITHUB_BRANCH` | GitHub branch for dbt project | | | -| `LD_SETUP_GITHUB_PATH` | Subdirectory path within GitHub repository | | `/` | +| Variable | Required? | Default | Description | +| :------------------------------------ | :-------------------------------------------: | :----------: | :------------------------------------------------------------------------------------------------ | +| `LD_SETUP_ADMIN_NAME` | | `Admin User` | Name of the admin user for initial setup | +| `LD_SETUP_ADMIN_EMAIL` | | | Email of the admin user for initial setup | +| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | | | Comma-separated list of email domains for organization whitelisting | +| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | | `viewer` | Default role for new organization members | +| `LD_SETUP_ORGANIZATION_NAME` | | | Name of the organization | +| `LD_SETUP_ADMIN_API_KEY` | | | API key for the admin user, must start with `ldpat_` prefix | +| `LD_SETUP_API_KEY_EXPIRATION` | | `30` | Number of days until API key expires (0 for no expiration) | +| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | | | A pre-set token for the service account, must start with `ldsvc_` prefix | +| `LD_SETUP_SERVICE_ACCOUNT_EXPIRATION` | | `30` | Number of days until service account token expires (0 for no expiration) | +| `LD_SETUP_PROJECT_NAME` | | | Name of the project | +| `LD_SETUP_PROJECT_CATALOG` | | | Catalog name for Databricks project | +| `LD_SETUP_PROJECT_SCHEMA` | | | Schema/database name for the project | +| `LD_SETUP_PROJECT_HOST` | | | Hostname for the Databricks server | +| `LD_SETUP_PROJECT_HTTP_PATH` | | | HTTP path for Databricks connection | +| `LD_SETUP_PROJECT_PAT` | | | Personal access token for Databricks | +| `LD_SETUP_START_OF_WEEK` | | `SUNDAY` | Day to use as start of week | +| `LD_SETUP_PROJECT_COMPUTE` | | | JSON string with Databricks compute configuration like `{"name": "string", "httpPath": "string"}` | +| `LD_SETUP_DBT_VERSION` | | `latest` | Version of dbt to use (eg: v1.8) | +| `LD_SETUP_GITHUB_PAT` | | | GitHub personal access token | +| `LD_SETUP_GITHUB_REPOSITORY` | | | GitHub repository for dbt project | +| `LD_SETUP_GITHUB_BRANCH` | | | GitHub branch for dbt project | +| `LD_SETUP_GITHUB_PATH` | | `/` | Subdirectory path within GitHub repository | In order to login as the admin user using SSO, you must enable the following ENV variable too: From fbd47ceef3570de492629e700006874e82fcfd64 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Mon, 15 Dec 2025 18:48:48 +0000 Subject: [PATCH 30/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 22 +++++++++---------- 1 file changed, 11 insertions(+), 11 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 25358ce6..78b24089 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -473,14 +473,14 @@ On server start, we will check the following variables, and update some configur For more information on our plans, visit our [pricing page](https://www.lightdash.com/pricing). -| Variable | Description | Required? 
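As a hedged, end-to-end sketch of these bootstrap variables for a Databricks project wired to a GitHub-hosted dbt repository; every hostname, token, and repository name below is a placeholder:

```bash
# First-boot initialization sketch: all credentials and hosts are placeholders
export LD_SETUP_ADMIN_NAME="Admin User"
export LD_SETUP_ADMIN_EMAIL="admin@example.com"
export LD_SETUP_ORGANIZATION_NAME="Acme Analytics"
export LD_SETUP_PROJECT_NAME="Jaffle Shop"
export LD_SETUP_PROJECT_HOST="dbc-12345678-abcd.cloud.databricks.com"
export LD_SETUP_PROJECT_HTTP_PATH="/sql/1.0/warehouses/0123456789abcdef"
export LD_SETUP_PROJECT_CATALOG="main"
export LD_SETUP_PROJECT_SCHEMA="analytics"
export LD_SETUP_PROJECT_PAT="dapi-placeholder-token"
export LD_SETUP_GITHUB_PAT="ghp_placeholder"
export LD_SETUP_GITHUB_REPOSITORY="acme/dbt-analytics"
export LD_SETUP_GITHUB_BRANCH="main"
```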
| Default | -| :----------------------------------- | :----------------------------------------------------------------------- | :---------------------------------------------: | :------: | -| `LD_SETUP_ADMIN_EMAIL` | Email of the admin to update its Personal access token | Required if `LD_SETUP_ADMIN_API_KEY` is present | | -| `LD_SETUP_ADMIN_API_KEY` | API key for the admin user, must start with `ldpat_` prefix | | | -| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | Comma-separated list of email domains for organization whitelisting | | | -| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | Default role for new organization members | | `viewer` | -| `LD_SETUP_PROJECT_HTTP_PATH` | HTTP path for Databricks connection | | | -| `LD_SETUP_PROJECT_PAT` | Personal access token for Databricks | | | -| `LD_SETUP_DBT_VERSION` | Version of dbt to use (eg: v1.8) | | `latest` | -| `LD_SETUP_GITHUB_PAT` | GitHub personal access token | | | -| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | A pre-set token for the service account, must start with `ldsvc_` prefix | | | \ No newline at end of file +| Variable | Required? | Default | Description | +| :----------------------------------- | :---------------------------------------------: | :------: | :----------------------------------------------------------------------- | +| `LD_SETUP_ADMIN_EMAIL` | Required if `LD_SETUP_ADMIN_API_KEY` is present | | Email of the admin to update its Personal access token | +| `LD_SETUP_ADMIN_API_KEY` | | | API key for the admin user, must start with `ldpat_` prefix | +| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | | | Comma-separated list of email domains for organization whitelisting | +| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | | `viewer` | Default role for new organization members | +| `LD_SETUP_PROJECT_HTTP_PATH` | | | HTTP path for Databricks connection | +| `LD_SETUP_PROJECT_PAT` | | | Personal access token for Databricks | +| `LD_SETUP_DBT_VERSION` | | `latest` | Version of dbt to use (eg: v1.8) | +| `LD_SETUP_GITHUB_PAT` | | | GitHub personal access token | +| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | | | A pre-set token for the service account, must start with `ldsvc_` prefix | \ No newline at end of file From b662a92aac7966d8d2882fb785c7eda45a6d6298 Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Tue, 16 Dec 2025 13:41:57 +0000 Subject: [PATCH 31/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 594 +++++++++--------- 1 file changed, 297 insertions(+), 297 deletions(-) diff --git a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index 78b24089..da0f03cc 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -5,51 +5,51 @@ mode: wide This is a reference to all environment variables that can be used to configure a Lightdash deployment. -| Variable | Required? 
| Default | Description | -| :------------------------------------------- | :-------------------------------------------: | :---------------------: | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| `PGHOST` | | | Hostname of postgres server to store Lightdash data | -| `PGPORT` | | | Port of postgres server to store Lightdash data | -| `PGUSER` | | | Username of postgres user to access postgres server to store Lightdash data | -| `PGPASSWORD` | | | Password for PGUSER | -| `PGDATABASE` | | | Database name inside postgres server to store Lightdash data | -| `PGCONNECTIONURI` | | | Connection URI for postgres server to store Lightdash data in the format postgresql://user:password@host:port/db?params. This is an alternative to providing the previous PG variables. | -| `PGMAXCONNECTIONS` | | | Maximum number of connections to the database | -| `PGMINCONNECTIONS` | | | Minimum number of connections to the database | -| `LIGHTDASH_SECRET` | | | Secret key used to secure various tokens in Lightdash. This **must** be fixed between deployments. If the secret changes, you won't have access to Lightdash data. | -| `SECURE_COOKIES` | | `false` | Only allows cookies to be stored over a https connection. We use cookies to keep you logged in. This is recommended to be set to true in production. | -| `COOKIES_MAX_AGE_HOURS` | | | How many hours a user session exists before the user is automatically signed out. For example if 24, then the user will be automatically after 24 hours of inactivity. | -| `TRUST_PROXY` | | `false` | This tells the Lightdash server that it can trust the X-Forwarded-Proto header it receives in requests. This is useful if you use `SECURE_COOKIES=true` behind a HTTPS terminated proxy that you can trust. | -| `SITE_URL` | | `http://localhost:8080` | Site url where Lightdash is being hosted. It should include the protocol. E.g https://lightdash.mycompany.com | -| `INTERNAL_LIGHTDASH_HOST` | | Same as `SITE_URL` | Internal Lightdash host for the Headless browser to send requests when your Lightdash instance is not accessible from the Internet. Needs to support `https` if `SECURE_COOKIES=true` | -| `STATIC_IP` | | `http://localhost:8080` | Server static IP so users can add the IP to their warehouse allow-list. | -| `LIGHTDASH_QUERY_MAX_LIMIT` | | `5000` | Query max rows limit | -| `LIGHTDASH_QUERY_DEFAULT_LIMIT` | | `500` | Default number of rows to return in a query | -| `LIGHTDASH_QUERY_MAX_PAGE_SIZE` | | `2500` | Maximum page size for paginated queries | -| `SCHEDULER_ENABLED` | | `true` | Enables/Disables the scheduler worker that triggers the scheduled deliveries. | -| `SCHEDULER_CONCURRENCY` | | `3` | How many scheduled delivery jobs can be processed concurrently. | -| `SCHEDULER_JOB_TIMEOUT` | | `600000` (10 minutes) | After how many milliseconds the job should be timeout so the scheduler worker can pick other jobs. 
| -| `SCHEDULER_SCREENSHOT_TIMEOUT` | | | Timeout in milliseconds for taking screenshots | -| `SCHEDULER_INCLUDE_TASKS` | | | Comma-separated list of scheduler tasks to include | -| `SCHEDULER_EXCLUDE_TASKS` | | | Comma-separated list of scheduler tasks to exclude | -| `LIGHTDASH_CSV_CELLS_LIMIT` | | `100000` | Max cells on CSV file exports | -| `LIGHTDASH_CHART_VERSION_HISTORY_DAYS_LIMIT` | | `3` | Configure how far back the chart versions history goes in days | -| `LIGHTDASH_PIVOT_TABLE_MAX_COLUMN_LIMIT` | | `60` | Configure maximum number of columns in pivot table | -| `GROUPS_ENABLED` | | `false` | Enables/Disables groups functionality | -| `CUSTOM_VISUALIZATIONS_ENABLED` | | `false` | Enables/Disables custom chart functionality | -| `LIGHTDASH_MAX_PAYLOAD` | | `5mb` | Maximum HTTP request body size | -| `LIGHTDASH_LICENSE_KEY` | | | License key for Lightdash Enterprise Edition. See [Enterprise License Keys](/self-host/customize-deployment/enterprise-license-keys) for details. [Get your license key](https://calendly.com/lightdash-cloud/enterprise?utm_source=docs&utm_medium=referral&utm_campaign=enterprise_licensing&utm_content=license_key_cta) | -| `HEADLESS_BROWSER_HOST` | | — | Hostname for the headless browser | -| `HEADLESS_BROWSER_PORT` | | `3001` | Port for the headless browser | -| `ALLOW_MULTIPLE_ORGS` | | `false` | If set to `true`, new users registering on Lightdash will have their own organization, separated from others | -| `LIGHTDASH_MODE` | | `default` | Mode for Lightdash (default, demo, pr, etc.) | -| `DISABLE_PAT` | | `false` | Disables Personal Access Tokens | -| `PAT_ALLOWED_ORG_ROLES` | | All roles | Comma-separated list of organization roles allowed to use Personal Access Tokens | -| `PAT_MAX_EXPIRATION_TIME_IN_DAYS` | | | Maximum expiration time in days for Personal Access Tokens | -| `MAX_DOWNLOADS_AS_CODE` | | `100` | Maximum number of downloads as code | -| `EXTENDED_USAGE_ANALYTICS` | | `false` | Enables extended usage analytics | -| `USE_SECURE_BROWSER` | | `false` | Use secure WebSocket connections for headless browser | -| `DISABLE_DASHBOARD_COMMENTS` | | `false` | Disables dashboard comments | -| `ORGANIZATION_WAREHOUSE_CREDENTIALS_ENABLED` | | `false` | Enables organization warehouse settings | +| Variable | Description | +| :------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| `PGHOST` | (Required) Hostname of postgres server to store Lightdash data | +| `PGPORT` | (Required) Port of postgres server to store Lightdash data | +| `PGUSER` | (Required) Username of postgres user to access postgres server to store Lightdash data | +| `PGPASSWORD` | (Required) Password for PGUSER | +| `PGDATABASE` | (Required) Database name inside postgres server to store Lightdash data | +| `PGCONNECTIONURI` | Connection URI for postgres server to store Lightdash data in the format postgresql://user:password@host:port/db?params. This is an alternative to providing the previous PG variables. | +| `PGMAXCONNECTIONS` | Maximum number of connections to the database | +| `PGMINCONNECTIONS` | Minimum number of connections to the database | +| `LIGHTDASH_SECRET` | (Required) Secret key used to secure various tokens in Lightdash. This **must** be fixed between deployments. If the secret changes, you won't have access to Lightdash data. 
| +| `SECURE_COOKIES` | Only allows cookies to be stored over a https connection. We use cookies to keep you logged in. This is recommended to be set to true in production. (default=false) | +| `COOKIES_MAX_AGE_HOURS` | How many hours a user session exists before the user is automatically signed out. For example if 24, then the user will be automatically after 24 hours of inactivity. | +| `TRUST_PROXY` | This tells the Lightdash server that it can trust the X-Forwarded-Proto header it receives in requests. This is useful if you use `SECURE_COOKIES=true` behind a HTTPS terminated proxy that you can trust. (default=false) | +| `SITE_URL` | Site url where Lightdash is being hosted. It should include the protocol. E.g https://lightdash.mycompany.com (default=http://localhost:8080) | +| `INTERNAL_LIGHTDASH_HOST` | Internal Lightdash host for the Headless browser to send requests when your Lightdash instance is not accessible from the Internet. Needs to support `https` if `SECURE_COOKIES=true` (default=Same as `SITE_URL`) | +| `STATIC_IP` | Server static IP so users can add the IP to their warehouse allow-list. (default=http://localhost:8080) | +| `LIGHTDASH_QUERY_MAX_LIMIT` | Query max rows limit (default=5000) | +| `LIGHTDASH_QUERY_DEFAULT_LIMIT` | Default number of rows to return in a query (default=500) | +| `LIGHTDASH_QUERY_MAX_PAGE_SIZE` | Maximum page size for paginated queries (default=2500) | +| `SCHEDULER_ENABLED` | Enables/Disables the scheduler worker that triggers the scheduled deliveries. (default=true) | +| `SCHEDULER_CONCURRENCY` | How many scheduled delivery jobs can be processed concurrently. (default=3) | +| `SCHEDULER_JOB_TIMEOUT` | After how many milliseconds the job should be timeout so the scheduler worker can pick other jobs. (default=600000, 10 minutes) | +| `SCHEDULER_SCREENSHOT_TIMEOUT` | Timeout in milliseconds for taking screenshots | +| `SCHEDULER_INCLUDE_TASKS` | Comma-separated list of scheduler tasks to include | +| `SCHEDULER_EXCLUDE_TASKS` | Comma-separated list of scheduler tasks to exclude | +| `LIGHTDASH_CSV_CELLS_LIMIT` | Max cells on CSV file exports (default=100000) | +| `LIGHTDASH_CHART_VERSION_HISTORY_DAYS_LIMIT` | Configure how far back the chart versions history goes in days (default=3) | +| `LIGHTDASH_PIVOT_TABLE_MAX_COLUMN_LIMIT` | Configure maximum number of columns in pivot table (default=60) | +| `GROUPS_ENABLED` | Enables/Disables groups functionality (default=false) | +| `CUSTOM_VISUALIZATIONS_ENABLED` | Enables/Disables custom chart functionality (default=false) | +| `LIGHTDASH_MAX_PAYLOAD` | Maximum HTTP request body size (default=5mb) | +| `LIGHTDASH_LICENSE_KEY` | License key for Lightdash Enterprise Edition. See [Enterprise License Keys](/self-host/customize-deployment/enterprise-license-keys) for details. [Get your license key](https://calendly.com/lightdash-cloud/enterprise?utm_source=docs&utm_medium=referral&utm_campaign=enterprise_licensing&utm_content=license_key_cta) | +| `HEADLESS_BROWSER_HOST` | Hostname for the headless browser | +| `HEADLESS_BROWSER_PORT` | Port for the headless browser (default=3001) | +| `ALLOW_MULTIPLE_ORGS` | If set to `true`, new users registering on Lightdash will have their own organization, separated from others (default=false) | +| `LIGHTDASH_MODE` | Mode for Lightdash (default, demo, pr, etc.) 
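To ground the core table in something concrete, a minimal production-style sketch could look like the following; the connection string, secret, and domain are placeholders:

```bash
# Core deployment sketch with placeholder values
export PGCONNECTIONURI="postgresql://lightdash:password@postgres:5432/lightdash"
export LIGHTDASH_SECRET="a-long-random-string-that-must-never-change"
export SITE_URL="https://lightdash.mycompany.com"
export SECURE_COOKIES=true
export TRUST_PROXY=true   # only when a trusted HTTPS proxy terminates TLS in front of Lightdash
```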
(default=default) | +| `DISABLE_PAT` | Disables Personal Access Tokens (default=false) | +| `PAT_ALLOWED_ORG_ROLES` | Comma-separated list of organization roles allowed to use Personal Access Tokens (default=All roles) | +| `PAT_MAX_EXPIRATION_TIME_IN_DAYS` | Maximum expiration time in days for Personal Access Tokens | +| `MAX_DOWNLOADS_AS_CODE` | Maximum number of downloads as code (default=100) | +| `EXTENDED_USAGE_ANALYTICS` | Enables extended usage analytics (default=false) | +| `USE_SECURE_BROWSER` | Use secure WebSocket connections for headless browser (default=false) | +| `DISABLE_DASHBOARD_COMMENTS` | Disables dashboard comments (default=false) | +| `ORGANIZATION_WAREHOUSE_CREDENTIALS_ENABLED` | Enables organization warehouse settings (default=false) | Lightdash also accepts all [standard postgres environment variables](https://www.postgresql.org/docs/9.3/libpq-envars.html) @@ -57,17 +57,17 @@ Lightdash also accepts all [standard postgres environment variables](https://www This is a reference to all the SMTP environment variables that can be used to configure a Lightdash email client. -| Variable | Required? | Default | Description | -| :------------------------------ | :-------------------------------------------: | :---------: | :------------------------------------------------------------------------- | -| `EMAIL_SMTP_HOST` | | | Hostname of email server | -| `EMAIL_SMTP_PORT` | | `587` | Port of email server | -| `EMAIL_SMTP_SECURE` | | `true` | Secure connection | -| `EMAIL_SMTP_USER` | | | Auth user | -| `EMAIL_SMTP_PASSWORD` | `[1]` | | Auth password | -| `EMAIL_SMTP_ACCESS_TOKEN` | `[1]` | | Auth access token for Oauth2 authentication | -| `EMAIL_SMTP_ALLOW_INVALID_CERT` | | `false` | Allow connection to TLS server with self-signed or invalid TLS certificate | -| `EMAIL_SMTP_SENDER_EMAIL` | | | The email address that sends emails | -| `EMAIL_SMTP_SENDER_NAME` | | `Lightdash` | The name of the email address that sends emails | +| Variable | Description | +| :------------------------------ | :------------------------------------------------------------------------- | +| `EMAIL_SMTP_HOST` | (Required) Hostname of email server | +| `EMAIL_SMTP_PORT` | Port of email server (default=587) | +| `EMAIL_SMTP_SECURE` | Secure connection (default=true) | +| `EMAIL_SMTP_USER` | (Required) Auth user | +| `EMAIL_SMTP_PASSWORD` | Auth password [1] | +| `EMAIL_SMTP_ACCESS_TOKEN` | Auth access token for Oauth2 authentication [1] | +| `EMAIL_SMTP_ALLOW_INVALID_CERT` | Allow connection to TLS server with self-signed or invalid TLS certificate (default=false) | +| `EMAIL_SMTP_SENDER_EMAIL` | (Required) The email address that sends emails | +| `EMAIL_SMTP_SENDER_NAME` | The name of the email address that sends emails (default=Lightdash) | [1] `EMAIL_SMTP_PASSWORD` or `EMAIL_SMTP_ACCESS_TOKEN` needs to be provided @@ -75,54 +75,54 @@ This is a reference to all the SMTP environment variables that can be used to co These variables enable you to control Single Sign On (SSO) functionality. -| Variable | Required? 
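As one hedged example of wiring up SSO, an Okta setup might look roughly like this; the client ID, secret, and domain are placeholders, and the exact issuer value depends on how your Okta org is configured:

```bash
# Okta SSO sketch: identifiers are placeholders
export AUTH_OKTA_OAUTH_CLIENT_ID="0oa0placeholder"
export AUTH_OKTA_OAUTH_CLIENT_SECRET="replace-with-your-okta-client-secret"
export AUTH_OKTA_DOMAIN="acme.okta.com"
export AUTH_OKTA_OAUTH_ISSUER="https://acme.okta.com"
export AUTH_DISABLE_PASSWORD_AUTHENTICATION=true   # optional: require SSO for sign-in
```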
| Default | Description | -| :------------------------------------- | :-------: | :-----: | :------------------------------------------------------------------------------------------------------------------------------------------- | -| `AUTH_DISABLE_PASSWORD_AUTHENTICATION` | | `false` | If "true" disables signing in with plain passwords | -| `AUTH_ENABLE_GROUP_SYNC` | | `false` | If "true" enables assigning SSO groups to Lightdash groups | -| `AUTH_ENABLE_OIDC_LINKING` | | `false` | Enables linking a new OIDC identity to an existing user if they already have another OIDC with the same email | -| `AUTH_ENABLE_OIDC_TO_EMAIL_LINKING` | | `false` | Enables linking OIDC identity to an existing user by email. Required when using [SCIM](/references/scim-integration) with SSO | -| `AUTH_GOOGLE_OAUTH2_CLIENT_ID` | | | Required for Google SSO | -| `AUTH_GOOGLE_OAUTH2_CLIENT_SECRET` | | | Required for Google SSO | -| `AUTH_OKTA_OAUTH_CLIENT_ID` | | | Required for Okta SSO | -| `AUTH_OKTA_OAUTH_CLIENT_SECRET` | | | Required for Okta SSO | -| `AUTH_OKTA_OAUTH_ISSUER` | | | Required for Okta SSO | -| `AUTH_OKTA_DOMAIN` | | | Required for Okta SSO | -| `AUTH_OKTA_AUTHORIZATION_SERVER_ID` | | | Optional for Okta SSO with a custom authorization server | -| `AUTH_OKTA_EXTRA_SCOPES` | | | Optional for Okta SSO scopes (e.g. groups) without a custom authorization server | -| `AUTH_ONE_LOGIN_OAUTH_CLIENT_ID` | | | Required for One Login SSO | -| `AUTH_ONE_LOGIN_OAUTH_CLIENT_SECRET` | | | Required for One Login SSO | -| `AUTH_ONE_LOGIN_OAUTH_ISSUER` | | | Required for One Login SSO | -| `AUTH_AZURE_AD_OAUTH_CLIENT_ID` | | | Required for Azure AD | -| `AUTH_AZURE_AD_OAUTH_CLIENT_SECRET` | | | Required for Azure AD | -| `AUTH_AZURE_AD_OAUTH_TENANT_ID` | | | Required for Azure AD | -| `AUTH_AZURE_AD_OIDC_METADATA_ENDPOINT` | | | Optional for Azure AD | -| `AUTH_AZURE_AD_X509_CERT_PATH` | | | Optional for Azure AD | -| `AUTH_AZURE_AD_X509_CERT` | | | Optional for Azure AD | -| `AUTH_AZURE_AD_PRIVATE_KEY_PATH` | | | Optional for Azure AD | -| `AUTH_AZURE_AD_PRIVATE_KEY` | | | Optional for Azure AD | -| `DATABRICKS_OAUTH_CLIENT_ID` | | | Client ID for Databricks OAuth | -| `DATABRICKS_OAUTH_CLIENT_SECRET` | | | Client secret for Databricks OAuth (optional) | -| `DATABRICKS_OAUTH_AUTHORIZATION_ENDPOINT` | | | Authorization endpoint URL for Databricks OAuth | -| `DATABRICKS_OAUTH_TOKEN_ENDPOINT` | | | Token endpoint URL for Databricks OAuth | +| Variable | Description | +| :------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------- | +| `AUTH_DISABLE_PASSWORD_AUTHENTICATION` | If "true" disables signing in with plain passwords (default=false) | +| `AUTH_ENABLE_GROUP_SYNC` | If "true" enables assigning SSO groups to Lightdash groups (default=false) | +| `AUTH_ENABLE_OIDC_LINKING` | Enables linking a new OIDC identity to an existing user if they already have another OIDC with the same email (default=false) | +| `AUTH_ENABLE_OIDC_TO_EMAIL_LINKING` | Enables linking OIDC identity to an existing user by email. 
Required when using [SCIM](/references/scim-integration) with SSO (default=false)| +| `AUTH_GOOGLE_OAUTH2_CLIENT_ID` | Required for Google SSO | +| `AUTH_GOOGLE_OAUTH2_CLIENT_SECRET` | Required for Google SSO | +| `AUTH_OKTA_OAUTH_CLIENT_ID` | Required for Okta SSO | +| `AUTH_OKTA_OAUTH_CLIENT_SECRET` | Required for Okta SSO | +| `AUTH_OKTA_OAUTH_ISSUER` | Required for Okta SSO | +| `AUTH_OKTA_DOMAIN` | Required for Okta SSO | +| `AUTH_OKTA_AUTHORIZATION_SERVER_ID` | Optional for Okta SSO with a custom authorization server | +| `AUTH_OKTA_EXTRA_SCOPES` | Optional for Okta SSO scopes (e.g. groups) without a custom authorization server | +| `AUTH_ONE_LOGIN_OAUTH_CLIENT_ID` | Required for One Login SSO | +| `AUTH_ONE_LOGIN_OAUTH_CLIENT_SECRET` | Required for One Login SSO | +| `AUTH_ONE_LOGIN_OAUTH_ISSUER` | Required for One Login SSO | +| `AUTH_AZURE_AD_OAUTH_CLIENT_ID` | Required for Azure AD | +| `AUTH_AZURE_AD_OAUTH_CLIENT_SECRET` | Required for Azure AD | +| `AUTH_AZURE_AD_OAUTH_TENANT_ID` | Required for Azure AD | +| `AUTH_AZURE_AD_OIDC_METADATA_ENDPOINT` | Optional for Azure AD | +| `AUTH_AZURE_AD_X509_CERT_PATH` | Optional for Azure AD | +| `AUTH_AZURE_AD_X509_CERT` | Optional for Azure AD | +| `AUTH_AZURE_AD_PRIVATE_KEY_PATH` | Optional for Azure AD | +| `AUTH_AZURE_AD_PRIVATE_KEY` | Optional for Azure AD | +| `DATABRICKS_OAUTH_CLIENT_ID` | Client ID for Databricks OAuth | +| `DATABRICKS_OAUTH_CLIENT_SECRET` | Client secret for Databricks OAuth (optional) | +| `DATABRICKS_OAUTH_AUTHORIZATION_ENDPOINT` | Authorization endpoint URL for Databricks OAuth | +| `DATABRICKS_OAUTH_TOKEN_ENDPOINT` | Token endpoint URL for Databricks OAuth | ## S3 These variables allow you to configure [S3 Object Storage](/self-host/customize-deployment/configure-lightdash-to-use-external-object-storage), which is required to self-host Lightdash. -| Variable | Required? | Default | Description | -|:--------------------------|:---------------------------------------------:|:---------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| `S3_ENDPOINT` | | | S3 endpoint for storing results | -| `S3_BUCKET` | | | Name of the S3 bucket for storing files | -| `S3_REGION` | | | Region where the S3 bucket is located | -| `S3_ACCESS_KEY` | | | Access key for authenticating with the S3 bucket | -| `S3_SECRET_KEY` | | | Secret key for authenticating with the S3 bucket | -| `S3_USE_CREDENTIALS_FROM` | | | Configures the credential provider chain for AWS S3 authentication if access key and secret is not provided. Supports: `env` (environment variables), `token_file` (token file credentials), `ini` (initialization file credentials), `ecs` (container metadata credentials), `ec2` (instance metadata credentials). Multiple values can be specified in order of preference. | -| `S3_EXPIRATION_TIME` | | 259200 (3d) | Expiration time for scheduled deliveries files | -| `S3_FORCE_PATH_STYLE` | | `false` | Force path style addressing, needed for MinIO setup e.g. 
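Since external object storage is required to self-host Lightdash, here is a hedged sketch of the S3 variables using a MinIO-style endpoint; the endpoint, bucket, and keys are placeholders:

```bash
# S3 / MinIO object storage sketch: endpoint and credentials are placeholders
export S3_ENDPOINT="https://minio.internal.mycompany.com"
export S3_BUCKET="lightdash-files"
export S3_REGION="us-east-1"
export S3_ACCESS_KEY="replace-with-access-key"
export S3_SECRET_KEY="replace-with-secret-key"
export S3_FORCE_PATH_STYLE=true   # needed for MinIO-style path addressing
```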
`http://your.s3.domain/BUCKET/KEY` instead of `http://BUCKET.your.s3.domain/KEY` | -| `RESULTS_S3_BUCKET` | | `S3_BUCKET` | Name of the S3 bucket used for storing query results | -| `RESULTS_S3_REGION` | | `S3_REGION` | Region where the S3 query storage bucket is located | -| `RESULTS_S3_ACCESS_KEY` | | `S3_ACCESS_KEY` | Access key for authenticating with the S3 query storage bucket | -| `RESULTS_S3_SECRET_KEY` | | `S3_SECRET_KEY` | Secret key for authenticating with the S3 query storage bucket | +| Variable | Description | +|:--------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| +| `S3_ENDPOINT` | (Required) S3 endpoint for storing results | +| `S3_BUCKET` | (Required) Name of the S3 bucket for storing files | +| `S3_REGION` | (Required) Region where the S3 bucket is located | +| `S3_ACCESS_KEY` | Access key for authenticating with the S3 bucket | +| `S3_SECRET_KEY` | Secret key for authenticating with the S3 bucket | +| `S3_USE_CREDENTIALS_FROM` | Configures the credential provider chain for AWS S3 authentication if access key and secret is not provided. Supports: `env` (environment variables), `token_file` (token file credentials), `ini` (initialization file credentials), `ecs` (container metadata credentials), `ec2` (instance metadata credentials). Multiple values can be specified in order of preference. | +| `S3_EXPIRATION_TIME` | Expiration time for scheduled deliveries files (default=259200, 3d) | +| `S3_FORCE_PATH_STYLE` | Force path style addressing, needed for MinIO setup e.g. `http://your.s3.domain/BUCKET/KEY` instead of `http://BUCKET.your.s3.domain/KEY` (default=false) | +| `RESULTS_S3_BUCKET` | Name of the S3 bucket used for storing query results (default=S3_BUCKET) | +| `RESULTS_S3_REGION` | Region where the S3 query storage bucket is located (default=S3_REGION) | +| `RESULTS_S3_ACCESS_KEY` | Access key for authenticating with the S3 query storage bucket (default=S3_ACCESS_KEY) | +| `RESULTS_S3_SECRET_KEY` | Secret key for authenticating with the S3 query storage bucket (default=S3_SECRET_KEY) | ## Cache @@ -130,85 +130,85 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- Note that you will need an Enterprise License Key for this functionality. -| Variable | Required? 
| Default | Description | -| :--------------------------- | :-------: | :---------: | :------------------------------------------------------------------------- | -| `RESULTS_CACHE_ENABLED` | | `false` | Enables caching for chart results | -| `AUTOCOMPLETE_CACHE_ENABLED` | | `false` | Enables caching for filter autocomplete results | -| `CACHE_STALE_TIME_SECONDS` | | 86400 (24h) | Defines how long cached results remain valid before being considered stale | +| Variable | Description | +| :--------------------------- | :------------------------------------------------------------------------- | +| `RESULTS_CACHE_ENABLED` | Enables caching for chart results (default=false) | +| `AUTOCOMPLETE_CACHE_ENABLED` | Enables caching for filter autocomplete results (default=false) | +| `CACHE_STALE_TIME_SECONDS` | Defines how long cached results remain valid before being considered stale (default=86400, 24h) | These variables are **deprecated**; use the `RESULTS_S3_*` versions instead. -| Variable | Required? | Default | Description | -| :---------------------------- | :-------: | :-------------: | :------------------------------------- | -| `RESULTS_CACHE_S3_BUCKET` | | `S3_BUCKET` | Deprecated - use RESULTS_S3_BUCKET | -| `RESULTS_CACHE_S3_REGION` | | `S3_REGION` | Deprecated - use RESULTS_S3_REGION | -| `RESULTS_CACHE_S3_ACCESS_KEY` | | `S3_ACCESS_KEY` | Deprecated - use RESULTS_S3_ACCESS_KEY | -| `RESULTS_CACHE_S3_SECRET_KEY` | | `S3_SECRET_KEY` | Deprecated - use RESULTS_S3_SECRET_KEY | +| Variable | Description | +| :---------------------------- | :------------------------------------------------- | +| `RESULTS_CACHE_S3_BUCKET` | Deprecated - use RESULTS_S3_BUCKET (default=S3_BUCKET) | +| `RESULTS_CACHE_S3_REGION` | Deprecated - use RESULTS_S3_REGION (default=S3_REGION) | +| `RESULTS_CACHE_S3_ACCESS_KEY` | Deprecated - use RESULTS_S3_ACCESS_KEY (default=S3_ACCESS_KEY) | +| `RESULTS_CACHE_S3_SECRET_KEY` | Deprecated - use RESULTS_S3_SECRET_KEY (default=S3_SECRET_KEY) | ## Logging -| Variable | Required? | Default | Description | -| :----------------------------- | :-------: | :--------------------: | :------------------------------------------------------------------------------------------ | -| `LIGHTDASH_LOG_LEVEL` | | `INFO` | The minimum level of log messages to show. `DEBUG`, `AUDIT`, `HTTP`, `INFO`, `WARN` `ERROR` | -| `LIGHTDASH_LOG_FORMAT` | | `pretty` | The format of log messages. `PLAIN`, `PRETTY`, `JSON` | -| `LIGHTDASH_LOG_OUTPUTS` | | `console` | The outputs to send log messages to | -| `LIGHTDASH_LOG_CONSOLE_LEVEL` | | `LIGHTDASH_LOG_LEVEL` | The minimum level of log messages to display on the console | -| `LIGHTDASH_LOG_CONSOLE_FORMAT` | | `LIGHTDASH_LOG_FORMAT` | The format of log messages on the console | -| `LIGHTDASH_LOG_FILE_LEVEL` | | `LIGHTDASH_LOG_LEVEL` | The minimum level of log messages to write to the log file | -| `LIGHTDASH_LOG_FILE_FORMAT` | | `LIGHTDASH_LOG_FORMAT` | The format of log messages in the log file | -| `LIGHTDASH_LOG_FILE_PATH` | | `./logs/all.log` | The path to the log file. Requires `LIGHTDASH_LOG_OUTPUTS` to include `file` to enable file output. | +| Variable | Description | +| :----------------------------- | :------------------------------------------------------------------------------------------ | +| `LIGHTDASH_LOG_LEVEL` | The minimum level of log messages to show. `DEBUG`, `AUDIT`, `HTTP`, `INFO`, `WARN` `ERROR` (default=INFO) | +| `LIGHTDASH_LOG_FORMAT` | The format of log messages. 
`PLAIN`, `PRETTY`, `JSON` (default=pretty) | +| `LIGHTDASH_LOG_OUTPUTS` | The outputs to send log messages to (default=console) | +| `LIGHTDASH_LOG_CONSOLE_LEVEL` | The minimum level of log messages to display on the console (default=LIGHTDASH_LOG_LEVEL) | +| `LIGHTDASH_LOG_CONSOLE_FORMAT` | The format of log messages on the console (default=LIGHTDASH_LOG_FORMAT) | +| `LIGHTDASH_LOG_FILE_LEVEL` | The minimum level of log messages to write to the log file (default=LIGHTDASH_LOG_LEVEL) | +| `LIGHTDASH_LOG_FILE_FORMAT` | The format of log messages in the log file (default=LIGHTDASH_LOG_FORMAT) | +| `LIGHTDASH_LOG_FILE_PATH` | The path to the log file. Requires `LIGHTDASH_LOG_OUTPUTS` to include `file` to enable file output. (default=./logs/all.log) | ## Prometheus -| Variable | Required? | Default | Description | -| :------------------------------------------ | :-------: | :-------------------------: | :------------------------------------------------------------------------------ | -| `LIGHTDASH_PROMETHEUS_ENABLED` | | `false` | Enables/Disables Prometheus metrics endpoint | -| `LIGHTDASH_PROMETHEUS_PORT` | | `9090` | Port for Prometheus metrics endpoint | -| `LIGHTDASH_PROMETHEUS_PATH` | | `/metrics` | Path for Prometheus metrics endpoint | -| `LIGHTDASH_PROMETHEUS_PREFIX` | | | Prefix for metric names. | -| `LIGHTDASH_GC_DURATION_BUCKETS` | | `0.001, 0.01, 0.1, 1, 2, 5` | Buckets for duration histogram in seconds. | -| `LIGHTDASH_EVENT_LOOP_MONITORING_PRECISION` | | `10` | Precision for event loop monitoring in milliseconds. Must be greater than zero. | -| `LIGHTDASH_PROMETHEUS_LABELS` | | | Labels to add to all metrics. Must be valid JSON | +| Variable | Description | +| :------------------------------------------ | :------------------------------------------------------------------------------ | +| `LIGHTDASH_PROMETHEUS_ENABLED` | Enables/Disables Prometheus metrics endpoint (default=false) | +| `LIGHTDASH_PROMETHEUS_PORT` | Port for Prometheus metrics endpoint (default=9090) | +| `LIGHTDASH_PROMETHEUS_PATH` | Path for Prometheus metrics endpoint (default=/metrics) | +| `LIGHTDASH_PROMETHEUS_PREFIX` | Prefix for metric names. | +| `LIGHTDASH_GC_DURATION_BUCKETS` | Buckets for duration histogram in seconds. (default=0.001, 0.01, 0.1, 1, 2, 5) | +| `LIGHTDASH_EVENT_LOOP_MONITORING_PRECISION` | Precision for event loop monitoring in milliseconds. Must be greater than zero. (default=10) | +| `LIGHTDASH_PROMETHEUS_LABELS` | Labels to add to all metrics. Must be valid JSON | ## Security -| Variable | Required? | Default | Description | -| :------------------------------- | :-------: | :-----: | :--------------------------------------------------------------------------------------------------------------- | -| `LIGHTDASH_CSP_REPORT_ONLY` | | `true` | Enables Content Security Policy (CSP) reporting only mode. This is recommended to be set to false in production. | -| `LIGHTDASH_CSP_ALLOWED_DOMAINS` | | | List of domains that are allowed to load resources from. Values must be separated by commas. | -| `LIGHTDASH_CSP_REPORT_URI` | | | URI to send CSP violation reports to. | -| `LIGHTDASH_CORS_ENABLED` | | `false` | Enables Cross-Origin Resource Sharing (CORS) | -| `LIGHTDASH_CORS_ALLOWED_DOMAINS` | | | List of domains that are allowed to make cross-origin requests. Values must be separated by commas. 
+| Variable | Description |
+| :------------------------------- | :--------------------------------------------------------------------------------------------------------------- |
+| `LIGHTDASH_CSP_REPORT_ONLY` | Enables Content Security Policy (CSP) report-only mode. Setting this to false is recommended in production. (default=true) |
+| `LIGHTDASH_CSP_ALLOWED_DOMAINS` | List of domains that resources are allowed to be loaded from. Values must be separated by commas. |
+| `LIGHTDASH_CSP_REPORT_URI` | URI to send CSP violation reports to. |
+| `LIGHTDASH_CORS_ENABLED` | Enables Cross-Origin Resource Sharing (CORS) (default=false) |
+| `LIGHTDASH_CORS_ALLOWED_DOMAINS` | List of domains that are allowed to make cross-origin requests. Values must be separated by commas. |
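+
+For instance, to let an internal tool on a different origin call the Lightdash API, CORS could be enabled with an explicit allow-list (the domains below are placeholders):
+
+```bash
+LIGHTDASH_CORS_ENABLED=true
+LIGHTDASH_CORS_ALLOWED_DOMAINS=https://internal-tool.example.com,https://bi.example.com
+```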
 
 ## Analytics & Event Tracking
 
-| Variable | Description | Required? | Default |
-| :------------------------------- | :------------------------------------------------------------------------------------------------- | :-------: | :-----: |
-| `RUDDERSTACK_WRITE_KEY` | RudderStack key used to track events (by default Lightdash’s key is used) | | |
-| `RUDDERSTACK_DATA_PLANE_URL` | RudderStack data plane URL to which events are tracked (by default Lightdash’s data plane is used) | | |
-| `RUDDERSTACK_ANALYTICS_DISABLED` | Set to true to disable RudderStack analytics | | |
-| `POSTHOG_PROJECT_API_KEY` | API key for Posthog (by default Lightdash’s key is used) | | |
-| `POSTHOG_FE_API_HOST` | Hostname for Posthog’s front-end API | | |
-| `POSTHOG_BE_API_HOST` | Hostname for Posthog’s back-end API | | |
+| Variable | Description |
+| :------------------------------- | :------------------------------------------------------------------------------------------------- |
+| `RUDDERSTACK_WRITE_KEY` | RudderStack key used to track events (by default Lightdash's key is used) |
+| `RUDDERSTACK_DATA_PLANE_URL` | RudderStack data plane URL to which events are tracked (by default Lightdash's data plane is used) |
+| `RUDDERSTACK_ANALYTICS_DISABLED` | Set to true to disable RudderStack analytics |
+| `POSTHOG_PROJECT_API_KEY` | API key for Posthog (by default Lightdash's key is used) |
+| `POSTHOG_FE_API_HOST` | Hostname for Posthog's front-end API |
+| `POSTHOG_BE_API_HOST` | Hostname for Posthog's back-end API |
 
 ## AI Analyst
 
 These variables enable you to configure the [AI Analyst functionality](/guides/ai-agents). Note that you will need an **Enterprise Licence Key** for this functionality.
 
-| Variable | Required? | Default | Description |
-| ---------------------------------------- | --------- | -------- | ---------------------------------------------------------------------------------------------------------------------------------------- |
-| `AI_COPILOT_ENABLED` | | `false` | Enables/Disables AI Analyst functionality |
-| `ASK_AI_BUTTON_ENABLED` | | `false` | Enables the "Ask AI" button in the interface for direct access to AI agents, when disabled agents can be acessed from `/ai-agents` route |
-| `AI_EMBEDDING_ENABLED` | | `false` | Enables AI embedding functionality for verified answers similarity matching |
-| `AI_DEFAULT_PROVIDER` | | `openai` | Default AI provider to use (`openai`, `azure`, `anthropic`, `openrouter`, `bedrock`) |
-| `AI_DEFAULT_EMBEDDING_PROVIDER` | | `openai` | Default AI provider for embeddings (`openai`, `bedrock`, `azure`) |
-| `AI_COPILOT_DEBUG_LOGGING_ENABLED` | | `false` | Enables debug logging for AI Copilot |
-| `AI_COPILOT_TELEMETRY_ENABLED` | | `false` | Enables telemetry for AI Copilot |
-| `AI_COPILOT_REQUIRES_FEATURE_FLAG` | | `false` | Requires a feature flag to use AI Copilot |
-| `AI_COPILOT_MAX_QUERY_LIMIT` | | `500` | Maximum number of rows returned in AI-generated queries |
-| `AI_VERIFIED_ANSWER_SIMILARITY_THRESHOLD`| | `0.6` | Similarity threshold (0-1) for verified answer matching |
+| Variable | Description |
+| ---------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------- |
+| `AI_COPILOT_ENABLED` | Enables/Disables AI Analyst functionality (default=false) |
+| `ASK_AI_BUTTON_ENABLED` | Enables the "Ask AI" button in the interface for direct access to AI agents; when disabled, agents can be accessed from the `/ai-agents` route (default=false) |
+| `AI_EMBEDDING_ENABLED` | Enables AI embedding functionality for verified answers similarity matching (default=false) |
+| `AI_DEFAULT_PROVIDER` | Default AI provider to use (`openai`, `azure`, `anthropic`, `openrouter`, `bedrock`) (default=openai) |
+| `AI_DEFAULT_EMBEDDING_PROVIDER` | Default AI provider for embeddings (`openai`, `bedrock`, `azure`) (default=openai) |
+| `AI_COPILOT_DEBUG_LOGGING_ENABLED` | Enables debug logging for AI Copilot (default=false) |
+| `AI_COPILOT_TELEMETRY_ENABLED` | Enables telemetry for AI Copilot (default=false) |
+| `AI_COPILOT_REQUIRES_FEATURE_FLAG` | Requires a feature flag to use AI Copilot (default=false) |
+| `AI_COPILOT_MAX_QUERY_LIMIT` | Maximum number of rows returned in AI-generated queries (default=500) |
+| `AI_VERIFIED_ANSWER_SIMILARITY_THRESHOLD`| Similarity threshold (0-1) for verified answer matching (default=0.6) |
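+
+As a minimal illustration, enabling the assistant with the default OpenAI provider could look like the snippet below (the key value is a placeholder; the provider-specific variables are described in the sections that follow):
+
+```bash
+AI_COPILOT_ENABLED=true
+AI_DEFAULT_PROVIDER=openai
+OPENAI_API_KEY=<your-openai-api-key>
+```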
 
 The AI Analyst supports multiple providers for flexibility. Choose one of the provider configurations below based on your preferred AI service. **OpenAI integration is the recommended option as it is the most tested and stable implementation.**
@@ -218,54 +218,54 @@ To enable AI Analyst, set `AI_COPILOT_ENABLED=true` and provide an API key for `
 
 #### OpenAI Configuration
 
-| Variable | Required? | Default | Description |
-| ------------------------ | -------------------------- | ------------------------ | ------------------------------------------------------------- |
-| `OPENAI_API_KEY` | Required when using OpenAI | | API key for OpenAI |
-| `OPENAI_MODEL_NAME` | | `gpt-4.1` | OpenAI model name to use |
-| `OPENAI_EMBEDDING_MODEL` | | `text-embedding-3-small` | OpenAI embedding model for verified answers |
-| `OPENAI_BASE_URL` | | | Optional base URL for OpenAI compatible API |
-| `OPENAI_AVAILABLE_MODELS`| | All supported models | Comma-separated list of models available in the model picker |
+| Variable | Description |
+| ------------------------ | ------------------------------------------------------------- |
+| `OPENAI_API_KEY` | (Required when using OpenAI) API key for OpenAI |
+| `OPENAI_MODEL_NAME` | OpenAI model name to use (default=gpt-4.1) |
+| `OPENAI_EMBEDDING_MODEL` | OpenAI embedding model for verified answers (default=text-embedding-3-small) |
+| `OPENAI_BASE_URL` | Optional base URL for OpenAI compatible API |
+| `OPENAI_AVAILABLE_MODELS`| Comma-separated list of models available in the model picker (default=All supported models) |
 
 #### Anthropic Configuration
 
-| Variable | Required? | Default | Description |
-| -------------------------- | ----------------------------- | ------------------ | ------------------------------------------------------------ |
-| `ANTHROPIC_API_KEY` | Required when using Anthropic | | API key for Anthropic |
-| `ANTHROPIC_MODEL_NAME` | | `claude-sonnet-4-5`| Anthropic model name to use |
-| `ANTHROPIC_AVAILABLE_MODELS`| | All supported models | Comma-separated list of models available in the model picker |
+| Variable | Description |
+| -------------------------- | ------------------------------------------------------------ |
+| `ANTHROPIC_API_KEY` | (Required when using Anthropic) API key for Anthropic |
+| `ANTHROPIC_MODEL_NAME` | Anthropic model name to use (default=claude-sonnet-4-5) |
+| `ANTHROPIC_AVAILABLE_MODELS`| Comma-separated list of models available in the model picker (default=All supported models) |
 
 #### Azure AI Configuration
 
-| Variable | Required? 
| Default | Description | -| --------------------------------- | ---------------------------- | ------------------------ | ------------------------------------------------------------ | -| `AZURE_AI_API_KEY` | Required when using Azure AI | | API key for Azure AI | -| `AZURE_AI_ENDPOINT` | Required when using Azure AI | | Endpoint for Azure AI | -| `AZURE_AI_API_VERSION` | Required when using Azure AI | | API version for Azure AI | -| `AZURE_AI_DEPLOYMENT_NAME` | Required when using Azure AI | | Deployment name for Azure AI | -| `AZURE_EMBEDDING_DEPLOYMENT_NAME` | | `text-embedding-3-small` | Deployment name for Azure embedding model | -| `AZURE_USE_DEPLOYMENT_BASED_URLS` | | `true` | Use deployment-based URLs for Azure OpenAI API calls | +| Variable | Description | +| --------------------------------- | ------------------------------------------------------------ | +| `AZURE_AI_API_KEY` | (Required when using Azure AI) API key for Azure AI | +| `AZURE_AI_ENDPOINT` | (Required when using Azure AI) Endpoint for Azure AI | +| `AZURE_AI_API_VERSION` | (Required when using Azure AI) API version for Azure AI | +| `AZURE_AI_DEPLOYMENT_NAME` | (Required when using Azure AI) Deployment name for Azure AI | +| `AZURE_EMBEDDING_DEPLOYMENT_NAME` | Deployment name for Azure embedding model (default=text-embedding-3-small) | +| `AZURE_USE_DEPLOYMENT_BASED_URLS` | Use deployment-based URLs for Azure OpenAI API calls (default=true) | #### OpenRouter Configuration -| Variable | Required? | Default | Description | -| ------------------------------ | ------------------------------ | --------------------------- | --------------------------------------------------------------------------- | -| `OPENROUTER_API_KEY` | Required when using OpenRouter | | API key for OpenRouter | -| `OPENROUTER_MODEL_NAME` | | `openai/gpt-4.1-2025-04-14` | OpenRouter model name to use | -| `OPENROUTER_SORT_ORDER` | | `latency` | Provider sorting method (`price`, `throughput`, `latency`) | -| `OPENROUTER_ALLOWED_PROVIDERS` | | `openai` | Comma-separated list of allowed providers (`anthropic`, `openai`, `google`) | +| Variable | Description | +| ------------------------------ | --------------------------------------------------------------------------- | +| `OPENROUTER_API_KEY` | (Required when using OpenRouter) API key for OpenRouter | +| `OPENROUTER_MODEL_NAME` | OpenRouter model name to use (default=openai/gpt-4.1-2025-04-14) | +| `OPENROUTER_SORT_ORDER` | Provider sorting method (`price`, `throughput`, `latency`) (default=latency)| +| `OPENROUTER_ALLOWED_PROVIDERS` | Comma-separated list of allowed providers (`anthropic`, `openai`, `google`) (default=openai) | #### AWS Bedrock Configuration -| Variable | Required? 
| Default | Description | -| -------------------------- | ------------------------------------------------ | --------------------- | ------------------------------------------------------------- | -| `BEDROCK_API_KEY` | Required if not using IAM credentials | | API key for Bedrock (alternative to IAM credentials) | -| `BEDROCK_ACCESS_KEY_ID` | Required if not using API key | | AWS access key ID for Bedrock | -| `BEDROCK_SECRET_ACCESS_KEY`| Required if using access key ID | | AWS secret access key for Bedrock | -| `BEDROCK_SESSION_TOKEN` | | | AWS session token (for temporary credentials) | -| `BEDROCK_REGION` | | | AWS region for Bedrock | -| `BEDROCK_MODEL_NAME` | | `claude-sonnet-4-5` | Bedrock model name to use | -| `BEDROCK_EMBEDDING_MODEL` | | `cohere.embed-english-v3` | Bedrock embedding model for verified answers | -| `BEDROCK_AVAILABLE_MODELS` | | All supported models | Comma-separated list of models available in the model picker | +| Variable | Description | +| -------------------------- | ------------------------------------------------------------- | +| `BEDROCK_API_KEY` | (Required if not using IAM credentials) API key for Bedrock (alternative to IAM credentials) | +| `BEDROCK_ACCESS_KEY_ID` | (Required if not using API key) AWS access key ID for Bedrock | +| `BEDROCK_SECRET_ACCESS_KEY`| (Required if using access key ID) AWS secret access key for Bedrock | +| `BEDROCK_SESSION_TOKEN` | AWS session token (for temporary credentials) | +| `BEDROCK_REGION` | (Required) AWS region for Bedrock | +| `BEDROCK_MODEL_NAME` | Bedrock model name to use (default=claude-sonnet-4-5) | +| `BEDROCK_EMBEDDING_MODEL` | Bedrock embedding model for verified answers (default=cohere.embed-english-v3) | +| `BEDROCK_AVAILABLE_MODELS` | Comma-separated list of models available in the model picker (default=All supported models) | #### Supported Models @@ -287,131 +287,131 @@ For Bedrock, the region prefix is also added based on `BEDROCK_REGION` (e.g., `c These variables enable you to configure the [Slack integration](/guides/using-slack-integration). -| Variable | Required? 
| Default | Description | -| :--------------------------- | :-------: | :-------------------: | :------------------------------------------- | -| `SLACK_SIGNING_SECRET` | | | Required for Slack integration | -| `SLACK_CLIENT_ID` | | | Required for Slack integration | -| `SLACK_CLIENT_SECRET` | | | Required for Slack integration | -| `SLACK_STATE_SECRET` | | `slack-state-secret` | Required for Slack integration | -| `SLACK_APP_TOKEN` | | | App token for Slack | -| `SLACK_PORT` | | `4351` | Port for Slack integration | -| `SLACK_SOCKET_MODE` | | `false` | Enable socket mode for Slack | -| `SLACK_CHANNELS_CACHED_TIME` | | `600000` (10 minutes) | Time in milliseconds to cache Slack channels | -| `SLACK_SUPPORT_URL` | | | URL for Slack support | +| Variable | Description | +| :--------------------------- | :------------------------------------------- | +| `SLACK_SIGNING_SECRET` | Required for Slack integration | +| `SLACK_CLIENT_ID` | Required for Slack integration | +| `SLACK_CLIENT_SECRET` | Required for Slack integration | +| `SLACK_STATE_SECRET` | Required for Slack integration (default=slack-state-secret) | +| `SLACK_APP_TOKEN` | App token for Slack | +| `SLACK_PORT` | Port for Slack integration (default=4351) | +| `SLACK_SOCKET_MODE` | Enable socket mode for Slack (default=false) | +| `SLACK_CHANNELS_CACHED_TIME` | Time in milliseconds to cache Slack channels (default=600000, 10 minutes) | +| `SLACK_SUPPORT_URL` | URL for Slack support | ## GitHub Integration These variables enable you to configure [Github integrations](/self-host/customize-deployment/configure-github-for-lightdash) -| Variable | Required? | Default | Description | -| :----------------------- | :-------------------------------------------: | :-----: | :----------------------------------------------- | -| `GITHUB_PRIVATE_KEY` | | | GitHub private key for GitHub App authentication | -| `GITHUB_APP_ID` | | | GitHub Application ID | -| `GITHUB_CLIENT_ID` | | | GitHub OAuth client ID | -| `GITHUB_CLIENT_SECRET` | | | GitHub OAuth client secret | -| `GITHUB_APP_NAME` | | | Name of the GitHub App | -| `GITHUB_REDIRECT_DOMAIN` | | | Domain for GitHub OAuth redirection | +| Variable | Description | +| :----------------------- | :----------------------------------------------- | +| `GITHUB_PRIVATE_KEY` | (Required) GitHub private key for GitHub App authentication | +| `GITHUB_APP_ID` | (Required) GitHub Application ID | +| `GITHUB_CLIENT_ID` | (Required) GitHub OAuth client ID | +| `GITHUB_CLIENT_SECRET` | (Required) GitHub OAuth client secret | +| `GITHUB_APP_NAME` | (Required) Name of the GitHub App | +| `GITHUB_REDIRECT_DOMAIN` | Domain for GitHub OAuth redirection | ## Microsoft Teams Integration These variables enable you to configure Microsoft Teams integration. -| Variable | Required? | Default | Description | -| :------------------------ | :-------: | :-----: | :---------------------------------- | -| `MICROSOFT_TEAMS_ENABLED` | | `false` | Enables Microsoft Teams integration | +| Variable | Description | +| :------------------------ | :---------------------------------- | +| `MICROSOFT_TEAMS_ENABLED` | Enables Microsoft Teams integration (default=false) | ## Google Cloud Platform These variables enable you to configure Google Cloud Platform integration. -| Variable | Required? 
| Default | Description | -| :------------------------ | :-------: | :-----: | :--------------------------------------------------- | -| `GOOGLE_CLOUD_PROJECT_ID` | | | Google Cloud Platform project ID | -| `GOOGLE_DRIVE_API_KEY` | | | Google Drive API key | -| `AUTH_GOOGLE_ENABLED` | | `false` | Enables Google authentication | -| `AUTH_ENABLE_GCLOUD_ADC` | | `false` | Enables Google Cloud Application Default Credentials | +| Variable | Description | +| :------------------------ | :--------------------------------------------------- | +| `GOOGLE_CLOUD_PROJECT_ID` | Google Cloud Platform project ID | +| `GOOGLE_DRIVE_API_KEY` | Google Drive API key | +| `AUTH_GOOGLE_ENABLED` | Enables Google authentication (default=false) | +| `AUTH_ENABLE_GCLOUD_ADC` | Enables Google Cloud Application Default Credentials (default=false) | ## Embedding Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Required? | Default | Description | -| :---------------------------------------- | :-------: | :-----: | :---------------------------------------------------------------------------------------------------- | -| `EMBEDDING_ENABLED` | | `false` | Enables embedding functionality | -| `EMBED_ALLOW_ALL_DASHBOARDS_BY_DEFAULT` | | `false` | When creating new embeds, allow all dashboards by default | -| `EMBED_ALLOW_ALL_CHARTS_BY_DEFAULT` | | `false` | When creating new embeds, allow all charts by default | -| `LIGHTDASH_IFRAME_EMBEDDING_DOMAINS` | | | List of domains that are allowed to embed Lightdash in an iframe. Values must be separated by commas. | +| Variable | Description | +| :---------------------------------------- | :---------------------------------------------------------------------------------------------------- | +| `EMBEDDING_ENABLED` | Enables embedding functionality (default=false) | +| `EMBED_ALLOW_ALL_DASHBOARDS_BY_DEFAULT` | When creating new embeds, allow all dashboards by default (default=false) | +| `EMBED_ALLOW_ALL_CHARTS_BY_DEFAULT` | When creating new embeds, allow all charts by default (default=false) | +| `LIGHTDASH_IFRAME_EMBEDDING_DOMAINS` | List of domains that are allowed to embed Lightdash in an iframe. Values must be separated by commas. | ## Custom roles Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Required? | Default | Description | -| :--------------------- | :-------: | :-----: | :----------------------------------------------------------------------------------------------------------------------------------------------- | -| `CUSTOM_ROLES_ENABLED` | | `false` | Enables creation of custom organization roles with configurable permission scopes beyond the default Admin, Developer, Editor, and Viewer roles. | +| Variable | Description | +| :--------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------- | +| `CUSTOM_ROLES_ENABLED` | Enables creation of custom organization roles with configurable permission scopes beyond the default Admin, Developer, Editor, and Viewer roles. (default=false) | ## Service account Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Required? 
| Default | Description | -| :------------------------ | :-------: | :-----: | :------------------------------------ | -| `SERVICE_ACCOUNT_ENABLED` | | `false` | Enables service account functionality | +| Variable | Description | +| :------------------------ | :------------------------------------ | +| `SERVICE_ACCOUNT_ENABLED` | Enables service account functionality (default=false) | ## SCIM Note that you will need an **Enterprise Licence Key** for this functionality. -| Variable | Required? | Default | Description | -| :------------- | :-------: | :-----: | :--------------------------------------------------------- | -| `SCIM_ENABLED` | | `false` | Enables SCIM (System for Cross-domain Identity Management) | +| Variable | Description | +| :------------- | :--------------------------------------------------------- | +| `SCIM_ENABLED` | Enables SCIM (System for Cross-domain Identity Management) (default=false) | ## Sentry These variables enable you to configure Sentry for error tracking. -| Variable | Required? | Default | Description | -| :------------------------------ | :-------: | :-----: | :-------------------------------------------------- | -| `SENTRY_DSN` | | | Sentry DSN for both frontend and backend | -| `SENTRY_BE_DSN` | | | Sentry DSN for backend only | -| `SENTRY_FE_DSN` | | | Sentry DSN for frontend only | -| `SENTRY_BE_SECURITY_REPORT_URI` | | | URI for Sentry backend security reports | -| `SENTRY_TRACES_SAMPLE_RATE` | | `0.1` | Sample rate for Sentry traces (0.0 to 1.0) | -| `SENTRY_PROFILES_SAMPLE_RATE` | | `0.2` | Sample rate for Sentry profiles (0.0 to 1.0) | -| `SENTRY_ANR_ENABLED` | | `false` | Enables Sentry Application Not Responding detection | -| `SENTRY_ANR_CAPTURE_STACKTRACE` | | `false` | Captures stacktrace for ANR events | -| `SENTRY_ANR_TIMEOUT` | | | Timeout in milliseconds for ANR detection | +| Variable | Description | +| :------------------------------ | :-------------------------------------------------- | +| `SENTRY_DSN` | Sentry DSN for both frontend and backend | +| `SENTRY_BE_DSN` | Sentry DSN for backend only | +| `SENTRY_FE_DSN` | Sentry DSN for frontend only | +| `SENTRY_BE_SECURITY_REPORT_URI` | URI for Sentry backend security reports | +| `SENTRY_TRACES_SAMPLE_RATE` | Sample rate for Sentry traces (0.0 to 1.0) (default=0.1) | +| `SENTRY_PROFILES_SAMPLE_RATE` | Sample rate for Sentry profiles (0.0 to 1.0) (default=0.2) | +| `SENTRY_ANR_ENABLED` | Enables Sentry Application Not Responding detection (default=false) | +| `SENTRY_ANR_CAPTURE_STACKTRACE` | Captures stacktrace for ANR events (default=false) | +| `SENTRY_ANR_TIMEOUT` | Timeout in milliseconds for ANR detection | ## Intercom & Pylon These variables enable you to configure Intercom and Pylon for customer support and feedback. -| Variable | Required? 
| Default | Description | -| :----------------------------------- | :-------: | :---------------------------: | :------------------------------------ | -| `INTERCOM_APP_ID` | | | Intercom application ID | -| `INTERCOM_APP_BASE` | | `https://api-iam.intercom.io` | Base URL for Intercom API | -| `PYLON_APP_ID` | | | Pylon application ID | -| `PYLON_IDENTITY_VERIFICATION_SECRET` | | | Secret for verifying Pylon identities | +| Variable | Description | +| :----------------------------------- | :------------------------------------ | +| `INTERCOM_APP_ID` | Intercom application ID | +| `INTERCOM_APP_BASE` | Base URL for Intercom API (default=https://api-iam.intercom.io) | +| `PYLON_APP_ID` | Pylon application ID | +| `PYLON_IDENTITY_VERIFICATION_SECRET` | Secret for verifying Pylon identities | ## Kubernetes These variables enable you to configure Kubernetes integration. -| Variable | Required? | Default | Description | -| :------------------------- | :-------: | :-----: | :-------------------------------------- | -| `K8S_NODE_NAME` | | | Name of the Kubernetes node | -| `K8S_POD_NAME` | | | Name of the Kubernetes pod | -| `K8S_POD_NAMESPACE` | | | Namespace of the Kubernetes pod | -| `LIGHTDASH_CLOUD_INSTANCE` | | | Identifier for Lightdash cloud instance | +| Variable | Description | +| :------------------------- | :-------------------------------------- | +| `K8S_NODE_NAME` | Name of the Kubernetes node | +| `K8S_POD_NAME` | Name of the Kubernetes pod | +| `K8S_POD_NAMESPACE` | Namespace of the Kubernetes pod | +| `LIGHTDASH_CLOUD_INSTANCE` | Identifier for Lightdash cloud instance | ## **Organization appearance** These variables allow you to customize the default appearance settings for your Lightdash instance's organizations. This color palette will be set for all organizations in your instance. You can't choose another one while these env vars are set. -| Variable | Required? | Default | Description | -| :------------------------------ | :-------: | :-----: | :---------------------------------------------------------------------------------------- | -| `OVERRIDE_COLOR_PALETTE_NAME` | | | Name of the default color palette | -| `OVERRIDE_COLOR_PALETTE_COLORS` | | | Comma-separated list of hex color codes for the default color palette (must be 20 colors) | +| Variable | Description | +| :------------------------------ | :---------------------------------------------------------------------------------------- | +| `OVERRIDE_COLOR_PALETTE_NAME` | Name of the default color palette | +| `OVERRIDE_COLOR_PALETTE_COLORS` | Comma-separated list of hex color codes for the default color palette (must be 20 colors) | ## Initialize instance @@ -427,30 +427,30 @@ When a new Lightdash instance is created, and there are no orgs and projects. Yo Currently we only support Databricks project types and Github dbt configuration. -| Variable | Required? 
| Default | Description | -| :------------------------------------ | :-------------------------------------------: | :----------: | :------------------------------------------------------------------------------------------------ | -| `LD_SETUP_ADMIN_NAME` | | `Admin User` | Name of the admin user for initial setup | -| `LD_SETUP_ADMIN_EMAIL` | | | Email of the admin user for initial setup | -| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | | | Comma-separated list of email domains for organization whitelisting | -| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | | `viewer` | Default role for new organization members | -| `LD_SETUP_ORGANIZATION_NAME` | | | Name of the organization | -| `LD_SETUP_ADMIN_API_KEY` | | | API key for the admin user, must start with `ldpat_` prefix | -| `LD_SETUP_API_KEY_EXPIRATION` | | `30` | Number of days until API key expires (0 for no expiration) | -| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | | | A pre-set token for the service account, must start with `ldsvc_` prefix | -| `LD_SETUP_SERVICE_ACCOUNT_EXPIRATION` | | `30` | Number of days until service account token expires (0 for no expiration) | -| `LD_SETUP_PROJECT_NAME` | | | Name of the project | -| `LD_SETUP_PROJECT_CATALOG` | | | Catalog name for Databricks project | -| `LD_SETUP_PROJECT_SCHEMA` | | | Schema/database name for the project | -| `LD_SETUP_PROJECT_HOST` | | | Hostname for the Databricks server | -| `LD_SETUP_PROJECT_HTTP_PATH` | | | HTTP path for Databricks connection | -| `LD_SETUP_PROJECT_PAT` | | | Personal access token for Databricks | -| `LD_SETUP_START_OF_WEEK` | | `SUNDAY` | Day to use as start of week | -| `LD_SETUP_PROJECT_COMPUTE` | | | JSON string with Databricks compute configuration like `{"name": "string", "httpPath": "string"}` | -| `LD_SETUP_DBT_VERSION` | | `latest` | Version of dbt to use (eg: v1.8) | -| `LD_SETUP_GITHUB_PAT` | | | GitHub personal access token | -| `LD_SETUP_GITHUB_REPOSITORY` | | | GitHub repository for dbt project | -| `LD_SETUP_GITHUB_BRANCH` | | | GitHub branch for dbt project | -| `LD_SETUP_GITHUB_PATH` | | `/` | Subdirectory path within GitHub repository | +| Variable | Description | +| :------------------------------------ | :------------------------------------------------------------------------------------------------ | +| `LD_SETUP_ADMIN_NAME` | Name of the admin user for initial setup (default=Admin User) | +| `LD_SETUP_ADMIN_EMAIL` | (Required) Email of the admin user for initial setup | +| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | Comma-separated list of email domains for organization whitelisting | +| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | Default role for new organization members (default=viewer) | +| `LD_SETUP_ORGANIZATION_NAME` | (Required) Name of the organization | +| `LD_SETUP_ADMIN_API_KEY` | (Required) API key for the admin user, must start with `ldpat_` prefix | +| `LD_SETUP_API_KEY_EXPIRATION` | Number of days until API key expires (0 for no expiration) (default=30) | +| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | (Required) A pre-set token for the service account, must start with `ldsvc_` prefix | +| `LD_SETUP_SERVICE_ACCOUNT_EXPIRATION` | Number of days until service account token expires (0 for no expiration) (default=30) | +| `LD_SETUP_PROJECT_NAME` | (Required) Name of the project | +| `LD_SETUP_PROJECT_CATALOG` | Catalog name for Databricks project | +| `LD_SETUP_PROJECT_SCHEMA` | (Required) Schema/database name for the project | +| `LD_SETUP_PROJECT_HOST` | (Required) Hostname for the Databricks server | +| `LD_SETUP_PROJECT_HTTP_PATH` | (Required) HTTP path 
for Databricks connection | +| `LD_SETUP_PROJECT_PAT` | (Required) Personal access token for Databricks | +| `LD_SETUP_START_OF_WEEK` | Day to use as start of week (default=SUNDAY) | +| `LD_SETUP_PROJECT_COMPUTE` | JSON string with Databricks compute configuration like `{"name": "string", "httpPath": "string"}` | +| `LD_SETUP_DBT_VERSION` | Version of dbt to use (eg: v1.8) (default=latest) | +| `LD_SETUP_GITHUB_PAT` | (Required) GitHub personal access token | +| `LD_SETUP_GITHUB_REPOSITORY` | (Required) GitHub repository for dbt project | +| `LD_SETUP_GITHUB_BRANCH` | (Required) GitHub branch for dbt project | +| `LD_SETUP_GITHUB_PATH` | Subdirectory path within GitHub repository (default=/) | In order to login as the admin user using SSO, you must enable the following ENV variable too: @@ -473,14 +473,14 @@ On server start, we will check the following variables, and update some configur For more information on our plans, visit our [pricing page](https://www.lightdash.com/pricing). -| Variable | Required? | Default | Description | -| :----------------------------------- | :---------------------------------------------: | :------: | :----------------------------------------------------------------------- | -| `LD_SETUP_ADMIN_EMAIL` | Required if `LD_SETUP_ADMIN_API_KEY` is present | | Email of the admin to update its Personal access token | -| `LD_SETUP_ADMIN_API_KEY` | | | API key for the admin user, must start with `ldpat_` prefix | -| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | | | Comma-separated list of email domains for organization whitelisting | -| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | | `viewer` | Default role for new organization members | -| `LD_SETUP_PROJECT_HTTP_PATH` | | | HTTP path for Databricks connection | -| `LD_SETUP_PROJECT_PAT` | | | Personal access token for Databricks | -| `LD_SETUP_DBT_VERSION` | | `latest` | Version of dbt to use (eg: v1.8) | -| `LD_SETUP_GITHUB_PAT` | | | GitHub personal access token | -| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | | | A pre-set token for the service account, must start with `ldsvc_` prefix | \ No newline at end of file +| Variable | Description | +| :----------------------------------- | :----------------------------------------------------------------------- | +| `LD_SETUP_ADMIN_EMAIL` | (Required if `LD_SETUP_ADMIN_API_KEY` is present) Email of the admin to update its Personal access token | +| `LD_SETUP_ADMIN_API_KEY` | API key for the admin user, must start with `ldpat_` prefix | +| `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | Comma-separated list of email domains for organization whitelisting | +| `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | Default role for new organization members (default=viewer) | +| `LD_SETUP_PROJECT_HTTP_PATH` | HTTP path for Databricks connection | +| `LD_SETUP_PROJECT_PAT` | Personal access token for Databricks | +| `LD_SETUP_DBT_VERSION` | Version of dbt to use (eg: v1.8) (default=latest) | +| `LD_SETUP_GITHUB_PAT` | GitHub personal access token | +| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | A pre-set token for the service account, must start with `ldsvc_` prefix | From b379c73ca44ec888705a37de745d87b537dfc6dd Mon Sep 17 00:00:00 2001 From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com> Date: Tue, 16 Dec 2025 14:14:46 +0000 Subject: [PATCH 32/32] Update self-host/customize-deployment/environment-variables.mdx Co-Authored-By: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com> --- .../environment-variables.mdx | 60 +++++++++---------- 1 file changed, 30 insertions(+), 30 deletions(-) diff --git 
a/self-host/customize-deployment/environment-variables.mdx b/self-host/customize-deployment/environment-variables.mdx index da0f03cc..16d4993a 100644 --- a/self-host/customize-deployment/environment-variables.mdx +++ b/self-host/customize-deployment/environment-variables.mdx @@ -7,15 +7,15 @@ This is a reference to all environment variables that can be used to configure a | Variable | Description | | :------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| `PGHOST` | (Required) Hostname of postgres server to store Lightdash data | -| `PGPORT` | (Required) Port of postgres server to store Lightdash data | -| `PGUSER` | (Required) Username of postgres user to access postgres server to store Lightdash data | -| `PGPASSWORD` | (Required) Password for PGUSER | -| `PGDATABASE` | (Required) Database name inside postgres server to store Lightdash data | +| `PGHOST` | **(Required)** Hostname of postgres server to store Lightdash data | +| `PGPORT` | **(Required)** Port of postgres server to store Lightdash data | +| `PGUSER` | **(Required)** Username of postgres user to access postgres server to store Lightdash data | +| `PGPASSWORD` | **(Required)** Password for PGUSER | +| `PGDATABASE` | **(Required)** Database name inside postgres server to store Lightdash data | | `PGCONNECTIONURI` | Connection URI for postgres server to store Lightdash data in the format postgresql://user:password@host:port/db?params. This is an alternative to providing the previous PG variables. | | `PGMAXCONNECTIONS` | Maximum number of connections to the database | | `PGMINCONNECTIONS` | Minimum number of connections to the database | -| `LIGHTDASH_SECRET` | (Required) Secret key used to secure various tokens in Lightdash. This **must** be fixed between deployments. If the secret changes, you won't have access to Lightdash data. | +| `LIGHTDASH_SECRET` | **(Required)** Secret key used to secure various tokens in Lightdash. This **must** be fixed between deployments. If the secret changes, you won't have access to Lightdash data. | | `SECURE_COOKIES` | Only allows cookies to be stored over a https connection. We use cookies to keep you logged in. This is recommended to be set to true in production. (default=false) | | `COOKIES_MAX_AGE_HOURS` | How many hours a user session exists before the user is automatically signed out. For example if 24, then the user will be automatically after 24 hours of inactivity. | | `TRUST_PROXY` | This tells the Lightdash server that it can trust the X-Forwarded-Proto header it receives in requests. This is useful if you use `SECURE_COOKIES=true` behind a HTTPS terminated proxy that you can trust. 
(default=false) | @@ -59,14 +59,14 @@ This is a reference to all the SMTP environment variables that can be used to co | Variable | Description | | :------------------------------ | :------------------------------------------------------------------------- | -| `EMAIL_SMTP_HOST` | (Required) Hostname of email server | +| `EMAIL_SMTP_HOST` | **(Required)** Hostname of email server | | `EMAIL_SMTP_PORT` | Port of email server (default=587) | | `EMAIL_SMTP_SECURE` | Secure connection (default=true) | -| `EMAIL_SMTP_USER` | (Required) Auth user | +| `EMAIL_SMTP_USER` | **(Required)** Auth user | | `EMAIL_SMTP_PASSWORD` | Auth password [1] | | `EMAIL_SMTP_ACCESS_TOKEN` | Auth access token for Oauth2 authentication [1] | | `EMAIL_SMTP_ALLOW_INVALID_CERT` | Allow connection to TLS server with self-signed or invalid TLS certificate (default=false) | -| `EMAIL_SMTP_SENDER_EMAIL` | (Required) The email address that sends emails | +| `EMAIL_SMTP_SENDER_EMAIL` | **(Required)** The email address that sends emails | | `EMAIL_SMTP_SENDER_NAME` | The name of the email address that sends emails (default=Lightdash) | [1] `EMAIL_SMTP_PASSWORD` or `EMAIL_SMTP_ACCESS_TOKEN` needs to be provided @@ -111,9 +111,9 @@ These variables allow you to configure [S3 Object Storage](/self-host/customize- | Variable | Description | |:--------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| -| `S3_ENDPOINT` | (Required) S3 endpoint for storing results | -| `S3_BUCKET` | (Required) Name of the S3 bucket for storing files | -| `S3_REGION` | (Required) Region where the S3 bucket is located | +| `S3_ENDPOINT` | **(Required)** S3 endpoint for storing results | +| `S3_BUCKET` | **(Required)** Name of the S3 bucket for storing files | +| `S3_REGION` | **(Required)** Region where the S3 bucket is located | | `S3_ACCESS_KEY` | Access key for authenticating with the S3 bucket | | `S3_SECRET_KEY` | Secret key for authenticating with the S3 bucket | | `S3_USE_CREDENTIALS_FROM` | Configures the credential provider chain for AWS S3 authentication if access key and secret is not provided. Supports: `env` (environment variables), `token_file` (token file credentials), `ini` (initialization file credentials), `ecs` (container metadata credentials), `ec2` (instance metadata credentials). Multiple values can be specified in order of preference. 
| @@ -262,7 +262,7 @@ To enable AI Analyst, set `AI_COPILOT_ENABLED=true` and provide an API key for ` | `BEDROCK_ACCESS_KEY_ID` | (Required if not using API key) AWS access key ID for Bedrock | | `BEDROCK_SECRET_ACCESS_KEY`| (Required if using access key ID) AWS secret access key for Bedrock | | `BEDROCK_SESSION_TOKEN` | AWS session token (for temporary credentials) | -| `BEDROCK_REGION` | (Required) AWS region for Bedrock | +| `BEDROCK_REGION` | **(Required)** AWS region for Bedrock | | `BEDROCK_MODEL_NAME` | Bedrock model name to use (default=claude-sonnet-4-5) | | `BEDROCK_EMBEDDING_MODEL` | Bedrock embedding model for verified answers (default=cohere.embed-english-v3) | | `BEDROCK_AVAILABLE_MODELS` | Comma-separated list of models available in the model picker (default=All supported models) | @@ -305,11 +305,11 @@ These variables enable you to configure [Github integrations](/self-host/customi | Variable | Description | | :----------------------- | :----------------------------------------------- | -| `GITHUB_PRIVATE_KEY` | (Required) GitHub private key for GitHub App authentication | -| `GITHUB_APP_ID` | (Required) GitHub Application ID | -| `GITHUB_CLIENT_ID` | (Required) GitHub OAuth client ID | -| `GITHUB_CLIENT_SECRET` | (Required) GitHub OAuth client secret | -| `GITHUB_APP_NAME` | (Required) Name of the GitHub App | +| `GITHUB_PRIVATE_KEY` | **(Required)** GitHub private key for GitHub App authentication | +| `GITHUB_APP_ID` | **(Required)** GitHub Application ID | +| `GITHUB_CLIENT_ID` | **(Required)** GitHub OAuth client ID | +| `GITHUB_CLIENT_SECRET` | **(Required)** GitHub OAuth client secret | +| `GITHUB_APP_NAME` | **(Required)** Name of the GitHub App | | `GITHUB_REDIRECT_DOMAIN` | Domain for GitHub OAuth redirection | ## Microsoft Teams Integration @@ -430,26 +430,26 @@ When a new Lightdash instance is created, and there are no orgs and projects. 
Yo | Variable | Description | | :------------------------------------ | :------------------------------------------------------------------------------------------------ | | `LD_SETUP_ADMIN_NAME` | Name of the admin user for initial setup (default=Admin User) | -| `LD_SETUP_ADMIN_EMAIL` | (Required) Email of the admin user for initial setup | +| `LD_SETUP_ADMIN_EMAIL` | **(Required)** Email of the admin user for initial setup | | `LD_SETUP_ORGANIZATION_EMAIL_DOMAIN` | Comma-separated list of email domains for organization whitelisting | | `LD_SETUP_ORGANIZATION_DEFAULT_ROLE` | Default role for new organization members (default=viewer) | -| `LD_SETUP_ORGANIZATION_NAME` | (Required) Name of the organization | -| `LD_SETUP_ADMIN_API_KEY` | (Required) API key for the admin user, must start with `ldpat_` prefix | +| `LD_SETUP_ORGANIZATION_NAME` | **(Required)** Name of the organization | +| `LD_SETUP_ADMIN_API_KEY` | **(Required)** API key for the admin user, must start with `ldpat_` prefix | | `LD_SETUP_API_KEY_EXPIRATION` | Number of days until API key expires (0 for no expiration) (default=30) | -| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | (Required) A pre-set token for the service account, must start with `ldsvc_` prefix | +| `LD_SETUP_SERVICE_ACCOUNT_TOKEN` | **(Required)** A pre-set token for the service account, must start with `ldsvc_` prefix | | `LD_SETUP_SERVICE_ACCOUNT_EXPIRATION` | Number of days until service account token expires (0 for no expiration) (default=30) | -| `LD_SETUP_PROJECT_NAME` | (Required) Name of the project | +| `LD_SETUP_PROJECT_NAME` | **(Required)** Name of the project | | `LD_SETUP_PROJECT_CATALOG` | Catalog name for Databricks project | -| `LD_SETUP_PROJECT_SCHEMA` | (Required) Schema/database name for the project | -| `LD_SETUP_PROJECT_HOST` | (Required) Hostname for the Databricks server | -| `LD_SETUP_PROJECT_HTTP_PATH` | (Required) HTTP path for Databricks connection | -| `LD_SETUP_PROJECT_PAT` | (Required) Personal access token for Databricks | +| `LD_SETUP_PROJECT_SCHEMA` | **(Required)** Schema/database name for the project | +| `LD_SETUP_PROJECT_HOST` | **(Required)** Hostname for the Databricks server | +| `LD_SETUP_PROJECT_HTTP_PATH` | **(Required)** HTTP path for Databricks connection | +| `LD_SETUP_PROJECT_PAT` | **(Required)** Personal access token for Databricks | | `LD_SETUP_START_OF_WEEK` | Day to use as start of week (default=SUNDAY) | | `LD_SETUP_PROJECT_COMPUTE` | JSON string with Databricks compute configuration like `{"name": "string", "httpPath": "string"}` | | `LD_SETUP_DBT_VERSION` | Version of dbt to use (eg: v1.8) (default=latest) | -| `LD_SETUP_GITHUB_PAT` | (Required) GitHub personal access token | -| `LD_SETUP_GITHUB_REPOSITORY` | (Required) GitHub repository for dbt project | -| `LD_SETUP_GITHUB_BRANCH` | (Required) GitHub branch for dbt project | +| `LD_SETUP_GITHUB_PAT` | **(Required)** GitHub personal access token | +| `LD_SETUP_GITHUB_REPOSITORY` | **(Required)** GitHub repository for dbt project | +| `LD_SETUP_GITHUB_BRANCH` | **(Required)** GitHub branch for dbt project | | `LD_SETUP_GITHUB_PATH` | Subdirectory path within GitHub repository (default=/) |