diff --git a/README.md b/README.md index e48f4401..de605c54 100644 --- a/README.md +++ b/README.md @@ -1,6 +1,6 @@ # bundle-examples -This repository provides Databricks Asset Bundles examples. +This repository provides Declarative Automation Bundles examples. To learn more, see: * The launch blog post at https://www.databricks.com/blog/announcing-general-availability-databricks-asset-bundles diff --git a/contrib/README.md b/contrib/README.md index 31f7d015..2cfcf695 100644 --- a/contrib/README.md +++ b/contrib/README.md @@ -1,10 +1,10 @@ # Contrib Directory -The `contrib` directory contains additional community-contributed examples and resources for Databricks Asset Bundles. These examples may include: +The `contrib` directory contains additional community-contributed examples and resources for Declarative Automation Bundles. These examples may include: - Custom configurations and extensions - Advanced usage patterns -- Tools or utilities for enhancing Databricks Asset Bundles workflows +- Tools or utilities for enhancing Declarative Automation Bundles workflows ## Structure @@ -38,6 +38,6 @@ If you would like to add your own examples or resources, please: 2. Include a `README.md` file explaining the contribution. 3. Ensure that any necessary configuration files, scripts, or dependencies are included. -For more information on Databricks Asset Bundles, see: +For more information on Declarative Automation Bundles, see: - The launch blog post at https://www.databricks.com/blog/announcing-general-availability-databricks-asset-bundles - The docs at https://docs.databricks.com/dev-tools/bundles/index.html diff --git a/contrib/data_engineering/assets/README.md b/contrib/data_engineering/assets/README.md index f6c8907f..7dc13560 100644 --- a/contrib/data_engineering/assets/README.md +++ b/contrib/data_engineering/assets/README.md @@ -1,4 +1,4 @@ -This folder is reserved for Databricks Asset Bundles definitions. 
+This folder is reserved for Declarative Automation Bundles definitions. New jobs and pipelines should follow conventions from the 'data-engineering' template. See https://github.com/databricks/bundle-examples/blob/main/contrib/templates/data-engineering/README.md. diff --git a/contrib/data_engineering/databricks.yml b/contrib/data_engineering/databricks.yml index 0577aa4b..366cfc6f 100644 --- a/contrib/data_engineering/databricks.yml +++ b/contrib/data_engineering/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for data_engineering. +# This is a Declarative Automation Bundle definition for data_engineering. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: data_engineering diff --git a/contrib/databricks_ingestion_monitoring/COMMON_CONFIGURATION.md b/contrib/databricks_ingestion_monitoring/COMMON_CONFIGURATION.md index ace63810..a879a4b0 100644 --- a/contrib/databricks_ingestion_monitoring/COMMON_CONFIGURATION.md +++ b/contrib/databricks_ingestion_monitoring/COMMON_CONFIGURATION.md @@ -1,6 +1,6 @@ # Common Configuration Guide -This document describes common configuration parameters shared among monitoring DABs (Databricks Asset Bundles). +This document describes common configuration parameters shared among monitoring DABs (Declarative Automation Bundles). Configuration is done through variables in a DAB deployment target. 
diff --git a/contrib/databricks_ingestion_monitoring/README.md b/contrib/databricks_ingestion_monitoring/README.md index d4f14cbb..e2e52e73 100644 --- a/contrib/databricks_ingestion_monitoring/README.md +++ b/contrib/databricks_ingestion_monitoring/README.md @@ -10,7 +10,7 @@ In particular, the package provides: - Provide out-of-the-box AI/BI Dashboards based on the above observability tables - Code and examples to integrate the observability tables with third-party monitoring providers such as Datadog, New Relic, Azure Monitor, Splunk -The package contains deployable [Databricks Asset Bundles (DABs)](https://docs.databricks.com/aws/en/dev-tools/bundles/) for easy distribution: +The package contains deployable [Declarative Automation Bundles](https://docs.databricks.com/aws/en/dev-tools/bundles/) for easy distribution: - Generic SDP pipelines - CDC Connector @@ -22,7 +22,7 @@ Coming soon # Prerequisites -- [Databricks Asset Bundles (DABs)](https://docs.databricks.com/aws/en/dev-tools/bundles/) +- [Declarative Automation Bundles](https://docs.databricks.com/aws/en/dev-tools/bundles/) - PrPr for forEachBatch sinks in SDP (if using the 3P observability platforms integration) diff --git a/contrib/job_with_ai_parse_document/README.md b/contrib/job_with_ai_parse_document/README.md index f899e1d0..d7e6ad2e 100644 --- a/contrib/job_with_ai_parse_document/README.md +++ b/contrib/job_with_ai_parse_document/README.md @@ -1,6 +1,6 @@ # AI Document Processing Job with Structured Streaming -A Databricks Asset Bundle demonstrating **incremental document processing** using `ai_parse_document`, `ai_query`, and Databricks Jobs with Structured Streaming. +A Declarative Automation Bundle demonstrating **incremental document processing** using `ai_parse_document`, `ai_query`, and Databricks Jobs with Structured Streaming. 
## Overview @@ -171,7 +171,7 @@ The included notebook visualizes parsing results with interactive bounding boxes ## Resources -- [Databricks Asset Bundles](https://docs.databricks.com/dev-tools/bundles/) +- [Declarative Automation Bundles](https://docs.databricks.com/dev-tools/bundles/) - [Databricks Workflows](https://docs.databricks.com/workflows/) - [Structured Streaming](https://docs.databricks.com/structured-streaming/) - [`ai_parse_document` Function](https://docs.databricks.com/aws/en/sql/language-manual/functions/ai_parse_document) diff --git a/contrib/job_with_ai_parse_document/databricks.yml b/contrib/job_with_ai_parse_document/databricks.yml index c8784a9a..ad2ee7e0 100644 --- a/contrib/job_with_ai_parse_document/databricks.yml +++ b/contrib/job_with_ai_parse_document/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for ai_parse_document_workflow. +# This is a Declarative Automation Bundle definition for ai_parse_document_workflow. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. 
bundle: name: ai_parse_document_workflow diff --git a/contrib/templates/data-engineering/databricks_template_schema.json b/contrib/templates/data-engineering/databricks_template_schema.json index 575488f0..acdc72e0 100644 --- a/contrib/templates/data-engineering/databricks_template_schema.json +++ b/contrib/templates/data-engineering/databricks_template_schema.json @@ -1,5 +1,5 @@ { - "welcome_message": "\nWelcome to the data-engineering template for Databricks Asset Bundles!", + "welcome_message": "\nWelcome to the data-engineering template for Declarative Automation Bundles!", "properties": { "project_name": { "type": "string", diff --git a/contrib/templates/data-engineering/template/{{.project_name}}/assets/README.md b/contrib/templates/data-engineering/template/{{.project_name}}/assets/README.md index f6c8907f..7dc13560 100644 --- a/contrib/templates/data-engineering/template/{{.project_name}}/assets/README.md +++ b/contrib/templates/data-engineering/template/{{.project_name}}/assets/README.md @@ -1,4 +1,4 @@ -This folder is reserved for Databricks Asset Bundles definitions. +This folder is reserved for Declarative Automation Bundles definitions. New jobs and pipelines should follow conventions from the 'data-engineering' template. See https://github.com/databricks/bundle-examples/blob/main/contrib/templates/data-engineering/README.md. diff --git a/contrib/templates/data-engineering/template/{{.project_name}}/databricks.yml.tmpl b/contrib/templates/data-engineering/template/{{.project_name}}/databricks.yml.tmpl index d988fccc..b33fdbfd 100644 --- a/contrib/templates/data-engineering/template/{{.project_name}}/databricks.yml.tmpl +++ b/contrib/templates/data-engineering/template/{{.project_name}}/databricks.yml.tmpl @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for {{.project_name}}. +# This is a Declarative Automation Bundle definition for {{.project_name}}. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. 
bundle: name: {{.project_name}} diff --git a/contrib/templates/default-scala/README.md b/contrib/templates/default-scala/README.md index 5c6be785..7d93a16f 100644 --- a/contrib/templates/default-scala/README.md +++ b/contrib/templates/default-scala/README.md @@ -1,6 +1,6 @@ # default-scala -This template helps you create Scala projects with Databricks Asset Bundles. It uses sbt to compile and package Scala files, and can be used with Databricks Connect for local development. +This template helps you create Scala projects with Declarative Automation Bundles. It uses sbt to compile and package Scala files, and can be used with Databricks Connect for local development. It supports two compute types: standard clusters and serverless compute. diff --git a/contrib/templates/default-scala/databricks_template_schema.json b/contrib/templates/default-scala/databricks_template_schema.json index 9d239768..91fbda78 100644 --- a/contrib/templates/default-scala/databricks_template_schema.json +++ b/contrib/templates/default-scala/databricks_template_schema.json @@ -1,5 +1,5 @@ { - "welcome_message": "\nWelcome to the default-scala template for Databricks Asset Bundles!\n\nA workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}", + "welcome_message": "\nWelcome to the default-scala template for Declarative Automation Bundles!\n\nA workspace was selected based on your current profile. 
For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}", "properties": { "project_name": { "type": "string", diff --git a/contrib/templates/default-scala/template/{{.project_name}}/README.md.tmpl b/contrib/templates/default-scala/template/{{.project_name}}/README.md.tmpl index 3d4792ed..cc4be258 100644 --- a/contrib/templates/default-scala/template/{{.project_name}}/README.md.tmpl +++ b/contrib/templates/default-scala/template/{{.project_name}}/README.md.tmpl @@ -36,7 +36,7 @@ The '{{.project_name}}' project was generated by using the default-scala templat 6. Optionally, install developer tools such as the Databricks extension for Visual Studio Code from https://docs.databricks.com/dev-tools/vscode-ext.html. -7. For documentation on the Databricks Asset Bundles format used +7. For documentation on the Declarative Automation Bundles format used for this project, and for CI/CD configuration, see https://docs.databricks.com/dev-tools/bundles/index.html. diff --git a/contrib/templates/default-scala/template/{{.project_name}}/databricks.yml.tmpl b/contrib/templates/default-scala/template/{{.project_name}}/databricks.yml.tmpl index bfd50a10..9ad15fdd 100644 --- a/contrib/templates/default-scala/template/{{.project_name}}/databricks.yml.tmpl +++ b/contrib/templates/default-scala/template/{{.project_name}}/databricks.yml.tmpl @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for {{.project_name}}. +# This is a Declarative Automation Bundle definition for {{.project_name}}. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. 
bundle: name: {{.project_name}} diff --git a/contrib/templates/default-scala/template/{{.project_name}}/resources/.gitkeep b/contrib/templates/default-scala/template/{{.project_name}}/resources/.gitkeep index f915206c..44b83b9e 100644 --- a/contrib/templates/default-scala/template/{{.project_name}}/resources/.gitkeep +++ b/contrib/templates/default-scala/template/{{.project_name}}/resources/.gitkeep @@ -1,3 +1,3 @@ -This folder is reserved for Databricks Asset Bundles resource definitions. +This folder is reserved for Declarative Automation Bundles resource definitions. diff --git a/contrib/templates/file-push/README.md b/contrib/templates/file-push/README.md index 46b524e3..ce5995a3 100644 --- a/contrib/templates/file-push/README.md +++ b/contrib/templates/file-push/README.md @@ -1,6 +1,6 @@ # Zerobus - File Mode -This is an (experimental) template for creating a file push pipeline with Databricks Asset Bundles. +This is an (experimental) template for creating a file push pipeline with Declarative Automation Bundles. Install it using ``` diff --git a/contrib/templates/file-push/databricks_template_schema.json b/contrib/templates/file-push/databricks_template_schema.json index 6e75b02b..497f21b5 100644 --- a/contrib/templates/file-push/databricks_template_schema.json +++ b/contrib/templates/file-push/databricks_template_schema.json @@ -1,5 +1,5 @@ { - "welcome_message": "\nWelcome to the file-push template for Databricks Asset Bundles!\n\nA workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}", + "welcome_message": "\nWelcome to the file-push template for Declarative Automation Bundles!\n\nA workspace was selected based on your current profile. 
For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}", "properties": { "catalog_name": { "type": "string", diff --git a/contrib/templates/streamlit-app/databricks_template_schema.json b/contrib/templates/streamlit-app/databricks_template_schema.json index 9faa1a49..acc0cab1 100644 --- a/contrib/templates/streamlit-app/databricks_template_schema.json +++ b/contrib/templates/streamlit-app/databricks_template_schema.json @@ -1,5 +1,5 @@ { - "welcome_message": "\nWelcome to the streamlit-app template for Databricks Asset Bundles!", + "welcome_message": "\nWelcome to the streamlit-app template for Declarative Automation Bundles!", "properties": { "project_name": { "type": "string", diff --git a/contrib/templates/streamlit-app/template/{{.project_name}}/databricks.yml.tmpl b/contrib/templates/streamlit-app/template/{{.project_name}}/databricks.yml.tmpl index 8b572a7a..2b4f0b92 100644 --- a/contrib/templates/streamlit-app/template/{{.project_name}}/databricks.yml.tmpl +++ b/contrib/templates/streamlit-app/template/{{.project_name}}/databricks.yml.tmpl @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for {{.project_name}}. +# This is a Declarative Automation Bundle definition for {{.project_name}}. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: {{.project_name}} diff --git a/knowledge_base/alerts/README.md b/knowledge_base/alerts/README.md index 900c9781..4fa50ea6 100644 --- a/knowledge_base/alerts/README.md +++ b/knowledge_base/alerts/README.md @@ -1,6 +1,6 @@ -# SQL Alerts with Databricks Asset Bundles +# SQL Alerts with Declarative Automation Bundles -This example shows how to define SQL alerts using Databricks Asset Bundles. The alert monitors daily NYC Taxi revenue and triggers when it exceeds a threshold. +This example shows how to define SQL alerts using Declarative Automation Bundles. 
The alert monitors daily NYC Taxi revenue and triggers when it exceeds a threshold. For more information about SQL alerts, see the [Databricks documentation](https://docs.databricks.com/aws/en/sql/user/alerts/). diff --git a/knowledge_base/app_with_database/README.md b/knowledge_base/app_with_database/README.md index afa39add..0512a42a 100644 --- a/knowledge_base/app_with_database/README.md +++ b/knowledge_base/app_with_database/README.md @@ -1,7 +1,7 @@ # Databricks app with OLTP database This example demonstrates how to define a Databricks app backed by -an OLTP Postgres in a Databricks Asset Bundle. +an OLTP Postgres database in a Declarative Automation Bundle. It includes and deploys an example application that uses Python and Dash and a database instance. When the application is started, it provisions its own schema and demonstration data in the OLTP database. diff --git a/knowledge_base/app_with_database/databricks.yml b/knowledge_base/app_with_database/databricks.yml index 4949c22b..ec03708b 100644 --- a/knowledge_base/app_with_database/databricks.yml +++ b/knowledge_base/app_with_database/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for app_with_database. +# This is a Declarative Automation Bundle definition for app_with_database. bundle: name: app_with_database_example diff --git a/knowledge_base/dashboard_nyc_taxi/README.md b/knowledge_base/dashboard_nyc_taxi/README.md index 06d633cd..8e1708d2 100644 --- a/knowledge_base/dashboard_nyc_taxi/README.md +++ b/knowledge_base/dashboard_nyc_taxi/README.md @@ -1,6 +1,6 @@ # Dashboard for NYC Taxi Trip Analysis -This example shows how to define a Databricks Asset Bundle with an AI/BI dashboard and a job that captures a snapshot of the dashboard and emails it to a subscriber. +This example shows how to define a Declarative Automation Bundle with an AI/BI dashboard and a job that captures a snapshot of the dashboard and emails it to a subscriber. 
It deploys the sample __NYC Taxi Trip Analysis__ dashboard to a Databricks workspace and configures a daily schedule to run the dashboard and send the snapshot by email to a specified email address. diff --git a/knowledge_base/database_with_catalog/README.md b/knowledge_base/database_with_catalog/README.md index 5a37ac0e..41a28de8 100644 --- a/knowledge_base/database_with_catalog/README.md +++ b/knowledge_base/database_with_catalog/README.md @@ -1,6 +1,6 @@ # OLTP database instance with a catalog -This example demonstrates how to define an OLTP database instance and a database catalog in a Databricks Asset Bundle. +This example demonstrates how to define an OLTP database instance and a database catalog in a Declarative Automation Bundle. It includes and deploys an example database instance and a catalog. When data changes in the database instance, they are reflected in Unity Catalog. diff --git a/knowledge_base/databricks_app/README.md b/knowledge_base/databricks_app/README.md index 4508a73e..e6b6de75 100644 --- a/knowledge_base/databricks_app/README.md +++ b/knowledge_base/databricks_app/README.md @@ -1,6 +1,6 @@ # Databricks App for working with Databricks jobs -This example demonstrates how to define an Databricks App in a Databricks Asset Bundle. +This example demonstrates how to define a Databricks App in a Declarative Automation Bundle. It includes and deploys an example app and a job managed by DABs to a Databricks workspace. The app shows the current status of the job and lists all existing runs. diff --git a/knowledge_base/development_cluster/README.md b/knowledge_base/development_cluster/README.md index af48ec14..92e83f0a 100644 --- a/knowledge_base/development_cluster/README.md +++ b/knowledge_base/development_cluster/README.md @@ -1,6 +1,6 @@ # Development cluster -This example demonstrates how to define and use a development (all-purpose) cluster in a Databricks Asset Bundle. 
+This example demonstrates how to define and use a development (all-purpose) cluster in a Declarative Automation Bundle. This bundle defines an `example_job` which is run on a job cluster in production mode. diff --git a/knowledge_base/job_backfill_data/README.md b/knowledge_base/job_backfill_data/README.md index 3e41f4d7..760a1ba4 100644 --- a/knowledge_base/job_backfill_data/README.md +++ b/knowledge_base/job_backfill_data/README.md @@ -1,6 +1,6 @@ # job_backfill_data -This example demonstrates a Databricks Asset Bundle (DABs) Job that runs a SQL task with a date parameter for backfilling data. +This example demonstrates a Declarative Automation Bundle Job that runs a SQL task with a date parameter for backfilling data. The Job consists of: diff --git a/knowledge_base/job_backfill_data/databricks.yml b/knowledge_base/job_backfill_data/databricks.yml index b1fb02d0..302bba7f 100644 --- a/knowledge_base/job_backfill_data/databricks.yml +++ b/knowledge_base/job_backfill_data/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for job backfill data. +# This is a Declarative Automation Bundle definition for job backfill data. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: job_backfill_data diff --git a/knowledge_base/job_conditional_execution/databricks.yml b/knowledge_base/job_conditional_execution/databricks.yml index 23f0ff7e..8900f3a3 100644 --- a/knowledge_base/job_conditional_execution/databricks.yml +++ b/knowledge_base/job_conditional_execution/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for job conditional execution. +# This is a Declarative Automation Bundle definition for job conditional execution. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. 
bundle: name: job_conditional_execution diff --git a/knowledge_base/job_file_arrival/databricks.yml b/knowledge_base/job_file_arrival/databricks.yml index 1d7fb3e5..f74af408 100644 --- a/knowledge_base/job_file_arrival/databricks.yml +++ b/knowledge_base/job_file_arrival/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for job file arrival. +# This is a Declarative Automation Bundle definition for job file arrival. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: job_file_arrival diff --git a/knowledge_base/job_programmatic_generation/databricks.yml b/knowledge_base/job_programmatic_generation/databricks.yml index 6fbcbb26..f8c25fcc 100644 --- a/knowledge_base/job_programmatic_generation/databricks.yml +++ b/knowledge_base/job_programmatic_generation/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for job programmatic generation. +# This is a Declarative Automation Bundle definition for job programmatic generation. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: job_programmatic_generation diff --git a/knowledge_base/job_read_secret/README.md b/knowledge_base/job_read_secret/README.md index a5157435..16fb033e 100644 --- a/knowledge_base/job_read_secret/README.md +++ b/knowledge_base/job_read_secret/README.md @@ -1,6 +1,6 @@ # Databricks job that reads a secret from a secret scope -This example demonstrates how to define a secret scope and a job with a task that reads from it in a Databricks Asset Bundle. +This example demonstrates how to define a secret scope and a job with a task that reads from it in a Declarative Automation Bundle. It includes and deploys an example secret scope, and a job with a task in a bundle that reads a secret from the secret scope to a Databricks workspace. 
diff --git a/knowledge_base/job_table_update_trigger/databricks.yml b/knowledge_base/job_table_update_trigger/databricks.yml index 8006cb6e..54ac9075 100644 --- a/knowledge_base/job_table_update_trigger/databricks.yml +++ b/knowledge_base/job_table_update_trigger/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for job table update trigger. +# This is a Declarative Automation Bundle definition for job table update trigger. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: job_table_update_trigger diff --git a/knowledge_base/job_with_for_each/databricks.yml b/knowledge_base/job_with_for_each/databricks.yml index 5b131164..f8b72ecf 100644 --- a/knowledge_base/job_with_for_each/databricks.yml +++ b/knowledge_base/job_with_for_each/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for job with for each. +# This is a Declarative Automation Bundle definition for job with for each. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: job_with_for_each diff --git a/knowledge_base/job_with_multiple_wheels/README.md b/knowledge_base/job_with_multiple_wheels/README.md index 07b42ef9..cdb0b4b2 100644 --- a/knowledge_base/job_with_multiple_wheels/README.md +++ b/knowledge_base/job_with_multiple_wheels/README.md @@ -1,6 +1,6 @@ # Job with multiple wheels -This example demonstrates how to define and use a job with multiple wheel dependencies in a Databricks Asset Bundle. +This example demonstrates how to define and use a job with multiple wheel dependencies in a Declarative Automation Bundle. One of the wheel files depends on the other. It is important to specify the order of the wheels in the job such that the dependent wheel is installed first, since it won't be available in a public registry. 
diff --git a/knowledge_base/job_with_task_values/databricks.yml b/knowledge_base/job_with_task_values/databricks.yml index 83b50ede..0f33ce36 100644 --- a/knowledge_base/job_with_task_values/databricks.yml +++ b/knowledge_base/job_with_task_values/databricks.yml @@ -1,4 +1,4 @@ -# This is a Databricks asset bundle definition for job with task values. +# This is a Declarative Automation Bundle definition for job with task values. # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. bundle: name: job_with_task_values diff --git a/knowledge_base/private_wheel_packages/README.md b/knowledge_base/private_wheel_packages/README.md index 9ae72df8..f172ec6a 100644 --- a/knowledge_base/private_wheel_packages/README.md +++ b/knowledge_base/private_wheel_packages/README.md @@ -1,6 +1,6 @@ # Private wheel packages -This example demonstrates how to use a private wheel package from a job in a Databricks Asset Bundle. +This Declarative Automation Bundles example demonstrates how to use a private wheel package from a job. If you are using notebooks, you can use the approach documented in [Notebook-scoped Python libraries][doc] to install wheels from a private repository in a notebook. You can use the workaround documented here if you are not using notebooks. @@ -15,7 +15,7 @@ wheels from a private repository in a notebook. You can use the workaround docum # Usage You can refer to private wheel files from job libraries or serverless environments by downloading the wheel -and making it part of your Databricks Asset Bundle deployment. +and making it part of your Declarative Automation Bundle deployment. To emulate this for this example, we will download a wheel from PyPI, include it in deployment, and refer to it from job configuration. 
diff --git a/knowledge_base/python_wheel_poetry/README.md b/knowledge_base/python_wheel_poetry/README.md index 003af249..f387fd2e 100644 --- a/knowledge_base/python_wheel_poetry/README.md +++ b/knowledge_base/python_wheel_poetry/README.md @@ -1,6 +1,6 @@ # Python wheel with Poetry -This example demonstrates how to use Poetry with a Databricks Asset Bundle. +This Declarative Automation Bundles example demonstrates how to use Poetry to build a wheel. ## Prerequisites diff --git a/knowledge_base/python_wheel_poetry/pyproject.toml b/knowledge_base/python_wheel_poetry/pyproject.toml index 54e5b34d..fe8b7e63 100644 --- a/knowledge_base/python_wheel_poetry/pyproject.toml +++ b/knowledge_base/python_wheel_poetry/pyproject.toml @@ -1,7 +1,7 @@ [tool.poetry] name = "python_wheel_poetry" version = "0.0.1" -description = "Example package to demonstrate using Poetry with Databricks Asset Bundles." +description = "Example package to demonstrate using Poetry with Declarative Automation Bundles." authors = ["Pieter Noordhuis "] readme = "README.md" packages = [{ include = "python_wheel_poetry", from = "src" }] diff --git a/knowledge_base/serverless_job/README.md b/knowledge_base/serverless_job/README.md index 403f50c5..318c9311 100644 --- a/knowledge_base/serverless_job/README.md +++ b/knowledge_base/serverless_job/README.md @@ -1,6 +1,6 @@ # Serverless job -This example demonstrates how to define and use a serverless job in a Databricks Asset Bundle. +This Declarative Automation Bundles example demonstrates how to define a job that runs on serverless compute. For more information, please refer to the [documentation](https://docs.databricks.com/en/workflows/jobs/how-to/use-bundles-with-jobs.html#configure-a-job-that-uses-serverless-compute). 
diff --git a/knowledge_base/target_includes/README.md b/knowledge_base/target_includes/README.md index 09579aa7..dfe7140d 100644 --- a/knowledge_base/target_includes/README.md +++ b/knowledge_base/target_includes/README.md @@ -1,6 +1,6 @@ # Target Includes Example -This example demonstrates the concept of using `target_includes` (or similar include mechanisms) in Databricks Asset Bundles to organize job configurations across different environments without duplication. +This example demonstrates the concept of using `target_includes` (or similar include mechanisms) in Declarative Automation Bundles to organize job configurations across different environments without duplication. It addresses the use case described in [GitHub Issue #2878](https://github.com/databricks/cli/issues/2878), which requests the ability to include specific resource files based on target configurations. diff --git a/knowledge_base/write_from_job_to_volume/README.md b/knowledge_base/write_from_job_to_volume/README.md index 5194ca53..2bc444df 100644 --- a/knowledge_base/write_from_job_to_volume/README.md +++ b/knowledge_base/write_from_job_to_volume/README.md @@ -1,6 +1,6 @@ # Save job result to volume -This example demonstrates how to define and use a Unity Catalog Volume in a Databricks Asset Bundle. +This Declarative Automation Bundles example demonstrates how to define and use a Unity Catalog volume. Specifically we'll define a `hello_world_job` job which writes "Hello, World!" to a file in a Unity Catalog Volume. 
diff --git a/knowledge_base/write_from_job_to_volume/resources/my_volume.volume.yml b/knowledge_base/write_from_job_to_volume/resources/my_volume.volume.yml index 7e473214..1efdbe24 100644 --- a/knowledge_base/write_from_job_to_volume/resources/my_volume.volume.yml +++ b/knowledge_base/write_from_job_to_volume/resources/my_volume.volume.yml @@ -4,6 +4,6 @@ resources: catalog_name: main # We use the ${resources.schemas...} interpolation syntax to force the creation # of the schema before the volume. Usage of the ${resources.schemas...} syntax - # allows Databricks Asset Bundles to form a dependency graph between resources. + # allows Declarative Automation Bundles to form a dependency graph between resources. schema_name: ${resources.schemas.hello_world_schema.name} name: my_volume