Merged
2 changes: 1 addition & 1 deletion README.md
@@ -1,6 +1,6 @@
# bundle-examples

This repository provides Databricks Asset Bundles examples.
This repository provides Declarative Automation Bundles examples.

To learn more, see:
* The launch blog post at https://www.databricks.com/blog/announcing-general-availability-databricks-asset-bundles
6 changes: 3 additions & 3 deletions contrib/README.md
@@ -1,10 +1,10 @@
# Contrib Directory

The `contrib` directory contains additional community-contributed examples and resources for Databricks Asset Bundles. These examples may include:
The `contrib` directory contains additional community-contributed examples and resources for Declarative Automation Bundles. These examples may include:

- Custom configurations and extensions
- Advanced usage patterns
- Tools or utilities for enhancing Databricks Asset Bundles workflows
- Tools or utilities for enhancing Declarative Automation Bundles workflows

## Structure

@@ -38,6 +38,6 @@ If you would like to add your own examples or resources, please:
2. Include a `README.md` file explaining the contribution.
3. Ensure that any necessary configuration files, scripts, or dependencies are included.

For more information on Databricks Asset Bundles, see:
For more information on Declarative Automation Bundles, see:
- The launch blog post at https://www.databricks.com/blog/announcing-general-availability-databricks-asset-bundles
- The docs at https://docs.databricks.com/dev-tools/bundles/index.html
2 changes: 1 addition & 1 deletion contrib/data_engineering/assets/README.md
@@ -1,4 +1,4 @@
This folder is reserved for Databricks Asset Bundles definitions.
This folder is reserved for Declarative Automation Bundles definitions.

New jobs and pipelines should follow the conventions from the 'data-engineering' template.
See https://github.com/databricks/bundle-examples/blob/main/contrib/templates/data-engineering/README.md.
2 changes: 1 addition & 1 deletion contrib/data_engineering/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for data_engineering.
# This is a Declarative Automation Bundle definition for data_engineering.

We never do this. The product name is "Declarative Automation Bundles". So either use that (i think it's fine here) or if it's singular, then just use "bundle".

Suggested change
# This is a Declarative Automation Bundle definition for data_engineering.
# This is a bundle definition for data_engineering.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: data_engineering
@@ -1,6 +1,6 @@
# Common Configuration Guide

This document describes common configuration parameters shared among monitoring DABs (Databricks Asset Bundles).
This document describes common configuration parameters shared among monitoring DABs (Declarative Automation Bundles).

Configuration is done through variables in a DAB deployment target.

4 changes: 2 additions & 2 deletions contrib/databricks_ingestion_monitoring/README.md
@@ -10,7 +10,7 @@ In particular, the package provides:
- Provide out-of-the-box AI/BI Dashboards based on the above observability tables
- Code and examples to integrate the observability tables with third-party monitoring providers such as Datadog, New Relic, Azure Monitor, Splunk

The package contains deployable [Databricks Asset Bundles (DABs)](https://docs.databricks.com/aws/en/dev-tools/bundles/) for easy distribution:
The package contains deployable [Declarative Automation Bundles](https://docs.databricks.com/aws/en/dev-tools/bundles/) for easy distribution:

- Generic SDP pipelines
- CDC Connector
@@ -22,7 +22,7 @@ Coming soon

# Prerequisites

- [Databricks Asset Bundles (DABs)](https://docs.databricks.com/aws/en/dev-tools/bundles/)
- [Declarative Automation Bundles](https://docs.databricks.com/aws/en/dev-tools/bundles/)
- PrPr for forEachBatch sinks in SDP (if using the 3P observability platforms integration)


4 changes: 2 additions & 2 deletions contrib/job_with_ai_parse_document/README.md
@@ -1,6 +1,6 @@
# AI Document Processing Job with Structured Streaming

A Databricks Asset Bundle demonstrating **incremental document processing** using `ai_parse_document`, `ai_query`, and Databricks Jobs with Structured Streaming.
A Declarative Automation Bundle demonstrating **incremental document processing** using `ai_parse_document`, `ai_query`, and Databricks Jobs with Structured Streaming.

Suggested change
A Declarative Automation Bundle demonstrating **incremental document processing** using `ai_parse_document`, `ai_query`, and Databricks Jobs with Structured Streaming.
This Declarative Automation Bundles example demonstrates incremental document processing using `ai_parse_document`, `ai_query`, and Databricks Jobs with Structured Streaming.

Same here - the product name is plural, so changed the language to work with that. Also a nit to remove the inline-bold AI tell.


## Overview

@@ -171,7 +171,7 @@ The included notebook visualizes parsing results with interactive bounding boxes

## Resources

- [Databricks Asset Bundles](https://docs.databricks.com/dev-tools/bundles/)
- [Declarative Automation Bundles](https://docs.databricks.com/dev-tools/bundles/)
- [Databricks Workflows](https://docs.databricks.com/workflows/)
- [Structured Streaming](https://docs.databricks.com/structured-streaming/)
- [`ai_parse_document` Function](https://docs.databricks.com/aws/en/sql/language-manual/functions/ai_parse_document)
2 changes: 1 addition & 1 deletion contrib/job_with_ai_parse_document/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for ai_parse_document_workflow.
# This is a Declarative Automation Bundle definition for ai_parse_document_workflow.

Suggested change
# This is a Declarative Automation Bundle definition for ai_parse_document_workflow.
# This is a bundle definition for ai_parse_document_workflow.

Same here.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: ai_parse_document_workflow
@@ -1,5 +1,5 @@
{
"welcome_message": "\nWelcome to the data-engineering template for Databricks Asset Bundles!",
"welcome_message": "\nWelcome to the data-engineering template for Declarative Automation Bundles!",
"properties": {
"project_name": {
"type": "string",
@@ -1,4 +1,4 @@
This folder is reserved for Databricks Asset Bundles definitions.
This folder is reserved for Declarative Automation Bundles definitions.

New jobs and pipelines should follow the conventions from the 'data-engineering' template.
See https://github.com/databricks/bundle-examples/blob/main/contrib/templates/data-engineering/README.md.
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for {{.project_name}}.
# This is a Declarative Automation Bundle definition for {{.project_name}}.

Suggested change
# This is a Declarative Automation Bundle definition for {{.project_name}}.
# This is a bundle definition for {{.project_name}}.

Same here

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: {{.project_name}}
2 changes: 1 addition & 1 deletion contrib/templates/default-scala/README.md
@@ -1,6 +1,6 @@
# default-scala

This template helps you create Scala projects with Databricks Asset Bundles. It uses sbt to compile and package Scala files, and can be used with Databricks Connect for local development.
This template helps you create Scala projects with Declarative Automation Bundles. It uses sbt to compile and package Scala files, and can be used with Databricks Connect for local development.

It supports two compute types: standard clusters and serverless compute.

@@ -1,5 +1,5 @@
{
"welcome_message": "\nWelcome to the default-scala template for Databricks Asset Bundles!\n\nA workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}",
"welcome_message": "\nWelcome to the default-scala template for Declarative Automation Bundles!\n\nA workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}",
"properties": {
"project_name": {
"type": "string",
@@ -36,7 +36,7 @@ The '{{.project_name}}' project was generated by using the default-scala template
6. Optionally, install developer tools such as the Databricks extension for Visual Studio Code from
https://docs.databricks.com/dev-tools/vscode-ext.html.

7. For documentation on the Databricks Asset Bundles format used
7. For documentation on the Declarative Automation Bundles format used
for this project, and for CI/CD configuration, see
https://docs.databricks.com/dev-tools/bundles/index.html.

@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for {{.project_name}}.
# This is a Declarative Automation Bundle definition for {{.project_name}}.

Suggested change
# This is a Declarative Automation Bundle definition for {{.project_name}}.
# This is a bundle definition for {{.project_name}}.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: {{.project_name}}
@@ -1,3 +1,3 @@

This folder is reserved for Databricks Asset Bundles resource definitions.
This folder is reserved for Declarative Automation Bundles resource definitions.

2 changes: 1 addition & 1 deletion contrib/templates/file-push/README.md
@@ -1,6 +1,6 @@
# Zerobus - File Mode

This is an (experimental) template for creating a file push pipeline with Databricks Asset Bundles.
This is an (experimental) template for creating a file push pipeline with Declarative Automation Bundles.

Install it using
```
@@ -1,5 +1,5 @@
{
"welcome_message": "\nWelcome to the file-push template for Databricks Asset Bundles!\n\nA workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}",
"welcome_message": "\nWelcome to the file-push template for Declarative Automation Bundles!\n\nA workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.\nworkspace_host: {{workspace_host}}",
"properties": {
"catalog_name": {
"type": "string",
@@ -1,5 +1,5 @@
{
"welcome_message": "\nWelcome to the streamlit-app template for Databricks Asset Bundles!",
"welcome_message": "\nWelcome to the streamlit-app template for Declarative Automation Bundles!",
"properties": {
"project_name": {
"type": "string",
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for {{.project_name}}.
# This is a Declarative Automation Bundle definition for {{.project_name}}.

Suggested change
# This is a Declarative Automation Bundle definition for {{.project_name}}.
# This is a bundle definition for {{.project_name}}.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: {{.project_name}}
4 changes: 2 additions & 2 deletions knowledge_base/alerts/README.md
@@ -1,6 +1,6 @@
# SQL Alerts with Databricks Asset Bundles
# SQL Alerts with Declarative Automation Bundles

This example shows how to define SQL alerts using Databricks Asset Bundles. The alert monitors daily NYC Taxi revenue and triggers when it exceeds a threshold.
This example shows how to define SQL alerts using Declarative Automation Bundles. The alert monitors daily NYC Taxi revenue and triggers when it exceeds a threshold.

For more information about SQL alerts, see the [Databricks documentation](https://docs.databricks.com/aws/en/sql/user/alerts/).

2 changes: 1 addition & 1 deletion knowledge_base/app_with_database/README.md
@@ -1,7 +1,7 @@
# Databricks app with OLTP database

This example demonstrates how to define a Databricks app backed by
an OLTP Postgres in a Databricks Asset Bundle.
an OLTP Postgres in a Declarative Automation Bundle.

Suggested change
an OLTP Postgres in a Declarative Automation Bundle.
an OLTP Postgres in a bundle.


It includes and deploys an example application that uses Python and Dash and a database instance.
When the application is started, it provisions its own schema and demonstration data in the OLTP database.
2 changes: 1 addition & 1 deletion knowledge_base/app_with_database/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for app_with_database.
# This is a Declarative Automation Bundle definition for app_with_database.

Suggested change
# This is a Declarative Automation Bundle definition for app_with_database.
# This is a bundle definition for app_with_database.

bundle:
name: app_with_database_example

2 changes: 1 addition & 1 deletion knowledge_base/dashboard_nyc_taxi/README.md
@@ -1,6 +1,6 @@
# Dashboard for NYC Taxi Trip Analysis

This example shows how to define a Databricks Asset Bundle with an AI/BI dashboard and a job that captures a snapshot of the dashboard and emails it to a subscriber.
This example shows how to define a Declarative Automation Bundle with an AI/BI dashboard and a job that captures a snapshot of the dashboard and emails it to a subscriber.

Suggested change
This example shows how to define a Declarative Automation Bundle with an AI/BI dashboard and a job that captures a snapshot of the dashboard and emails it to a subscriber.
This Declarative Automation Bundles example shows how to define an AI/BI dashboard and a job that captures a snapshot of the dashboard and emails it to a subscriber.


It deploys the sample __NYC Taxi Trip Analysis__ dashboard to a Databricks workspace and configures a daily schedule to run the dashboard and send the snapshot in email to a specified email address.

2 changes: 1 addition & 1 deletion knowledge_base/database_with_catalog/README.md
@@ -1,6 +1,6 @@
# OLTP database instance with a catalog

This example demonstrates how to define an OLTP database instance and a database catalog in a Databricks Asset Bundle.
This example demonstrates how to define an OLTP database instance and a database catalog in a Declarative Automation Bundle.

Suggested change
This example demonstrates how to define an OLTP database instance and a database catalog in a Declarative Automation Bundle.
This Declarative Automation Bundles example demonstrates how to define an OLTP database instance and a database catalog.


It includes and deploys an example database instance and a catalog. When data changes in the database instance, they are reflected in Unity Catalog.

2 changes: 1 addition & 1 deletion knowledge_base/databricks_app/README.md
@@ -1,6 +1,6 @@
# Databricks App for working with Databricks jobs

This example demonstrates how to define an Databricks App in a Databricks Asset Bundle.
This example demonstrates how to define an Databricks App in a Declarative Automation Bundle.

Suggested change
This example demonstrates how to define an Databricks App in a Declarative Automation Bundle.
This Declarative Automation Bundles example demonstrates how to define a Databricks App.


It includes and deploys an example app and a job managed by DABs to a Databricks workspace.
The app shows current status of the job and lists all existing runs.
2 changes: 1 addition & 1 deletion knowledge_base/development_cluster/README.md
@@ -1,6 +1,6 @@
# Development cluster

This example demonstrates how to define and use a development (all-purpose) cluster in a Databricks Asset Bundle.
This example demonstrates how to define and use a development (all-purpose) cluster in a Declarative Automation Bundle.

Suggested change
This example demonstrates how to define and use a development (all-purpose) cluster in a Declarative Automation Bundle.
This Declarative Automation Bundles example demonstrates how to define and use a development (all-purpose) cluster.


This bundle defines an `example_job` which is run on a job cluster in production mode.

2 changes: 1 addition & 1 deletion knowledge_base/job_backfill_data/README.md
@@ -1,6 +1,6 @@
# job_backfill_data

This example demonstrates a Databricks Asset Bundle (DABs) Job that runs a SQL task with a date parameter for backfilling data.
This example demonstrates a Declarative Automation Bundle Job that runs a SQL task with a date parameter for backfilling data.

Suggested change
This example demonstrates a Declarative Automation Bundle Job that runs a SQL task with a date parameter for backfilling data.
This Declarative Automation Bundles example demonstrates how to define a job that runs a SQL task with a date parameter for backfilling data.


The Job consists of:

2 changes: 1 addition & 1 deletion knowledge_base/job_backfill_data/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for job backfill data.
# This is a Declarative Automation Bundle definition for job backfill data.

Suggested change
# This is a Declarative Automation Bundle definition for job backfill data.
# This is a bundle definition for job backfill data.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: job_backfill_data
2 changes: 1 addition & 1 deletion knowledge_base/job_conditional_execution/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for job conditional execution.
# This is a Declarative Automation Bundle definition for job conditional execution.

Suggested change
# This is a Declarative Automation Bundle definition for job conditional execution.
# This is a bundle definition for a job with conditional execution.

Or "...a job that runs based on a condition."?

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: job_conditional_execution
2 changes: 1 addition & 1 deletion knowledge_base/job_file_arrival/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for job file arrival.
# This is a Declarative Automation Bundle definition for job file arrival.

Suggested change
# This is a Declarative Automation Bundle definition for job file arrival.
# This is a bundle definition for job file arrival.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: job_file_arrival
2 changes: 1 addition & 1 deletion knowledge_base/job_programmatic_generation/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for job programmatic generation.
# This is a Declarative Automation Bundle definition for job programmatic generation.

Suggested change
# This is a Declarative Automation Bundle definition for job programmatic generation.
# This is a bundle definition for job programmatic generation.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: job_programmatic_generation
2 changes: 1 addition & 1 deletion knowledge_base/job_read_secret/README.md
@@ -1,6 +1,6 @@
# Databricks job that reads a secret from a secret scope

This example demonstrates how to define a secret scope and a job with a task that reads from it in a Databricks Asset Bundle.
This example demonstrates how to define a secret scope and a job with a task that reads from it in a Declarative Automation Bundle.

Suggested change
This example demonstrates how to define a secret scope and a job with a task that reads from it in a Declarative Automation Bundle.
This Declarative Automation Bundles example demonstrates how to define a secret scope and a job with a task that reads from it.


It includes and deploys an example secret scope, and a job with a task in a bundle that reads a secret from the secret scope to a Databricks workspace.

2 changes: 1 addition & 1 deletion knowledge_base/job_table_update_trigger/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for job table update trigger.
# This is a Declarative Automation Bundle definition for job table update trigger.
@juliacrawf-db Mar 26, 2026

Suggested change
# This is a Declarative Automation Bundle definition for job table update trigger.
# This is a bundle definition for a job with a table update trigger.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: job_table_update_trigger
2 changes: 1 addition & 1 deletion knowledge_base/job_with_for_each/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for job with for each.
# This is a Declarative Automation Bundle definition for job with for each.

Suggested change
# This is a Declarative Automation Bundle definition for job with for each.
# This is a bundle definition for a job with a for each task.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: job_with_for_each
2 changes: 1 addition & 1 deletion knowledge_base/job_with_multiple_wheels/README.md
@@ -1,6 +1,6 @@
# Job with multiple wheels

This example demonstrates how to define and use a job with multiple wheel dependencies in a Databricks Asset Bundle.
This example demonstrates how to define and use a job with multiple wheel dependencies in a Declarative Automation Bundle.

Suggested change
This example demonstrates how to define and use a job with multiple wheel dependencies in a Declarative Automation Bundle.
This Declarative Automation Bundles example demonstrates how to define and use a job with multiple wheel dependencies.


One of the wheel files depends on the other. It is important to specify the order of the wheels in the job such that
the dependent wheel is installed first, since it won't be available in a public registry.
2 changes: 1 addition & 1 deletion knowledge_base/job_with_task_values/databricks.yml
@@ -1,4 +1,4 @@
# This is a Databricks asset bundle definition for job with task values.
# This is a Declarative Automation Bundle definition for job with task values.

Suggested change
# This is a Declarative Automation Bundle definition for job with task values.
# This is a bundle definition for a job with task values.

# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
bundle:
name: job_with_task_values
4 changes: 2 additions & 2 deletions knowledge_base/private_wheel_packages/README.md
@@ -1,6 +1,6 @@
# Private wheel packages

This example demonstrates how to use a private wheel package from a job in a Databricks Asset Bundle.
This Declarative Automation Bundles example demonstrates how to use a private wheel package from a job.

If you are using notebooks, you can use the approach documented in [Notebook-scoped Python libraries][doc] to install
wheels from a private repository in a notebook. You can use the workaround documented here if you are not using notebooks.
@@ -15,7 +15,7 @@ wheels from a private repository in a notebook. You can use the workaround documented
# Usage

You can refer to private wheel files from job libraries or serverless environments by downloading the wheel
and making it part of your Databricks Asset Bundle deployment.
and making it part of your Declarative Automation Bundle deployment.

Suggested change
and making it part of your Declarative Automation Bundle deployment.
and making it part of your bundle deployment.


To emulate this for this example, we will download a wheel from PyPI, include it in deployment, and refer to it from job configuration.

2 changes: 1 addition & 1 deletion knowledge_base/python_wheel_poetry/README.md
@@ -1,6 +1,6 @@
# Python wheel with Poetry

This example demonstrates how to use Poetry with a Databricks Asset Bundle.
This Declarative Automation Bundles example demonstrates how to use Poetry to build a whl.

## Prerequisites

2 changes: 1 addition & 1 deletion knowledge_base/python_wheel_poetry/pyproject.toml
@@ -1,7 +1,7 @@
[tool.poetry]
name = "python_wheel_poetry"
version = "0.0.1"
description = "Example package to demonstrate using Poetry with Databricks Asset Bundles."
description = "Example package to demonstrate using Poetry with Declarative Automation Bundles."
authors = ["Pieter Noordhuis <pieter.noordhuis@databricks.com>"]
readme = "README.md"
packages = [{ include = "python_wheel_poetry", from = "src" }]
2 changes: 1 addition & 1 deletion knowledge_base/serverless_job/README.md
@@ -1,6 +1,6 @@
# Serverless job

This example demonstrates how to define and use a serverless job in a Databricks Asset Bundle.
This Declarative Automation Bundles example demonstrates how to define a job that runs on serverless compute.

For more information, please refer to the [documentation](https://docs.databricks.com/en/workflows/jobs/how-to/use-bundles-with-jobs.html#configure-a-job-that-uses-serverless-compute).

2 changes: 1 addition & 1 deletion knowledge_base/target_includes/README.md
@@ -1,6 +1,6 @@
# Target Includes Example

This example demonstrates the concept of using `target_includes` (or similar include mechanisms) in Databricks Asset Bundles to organize job configurations across different environments without duplication.
This example demonstrates the concept of using `target_includes` (or similar include mechanisms) in Declarative Automation Bundles to organize job configurations across different environments without duplication.

It addresses the use case described in [GitHub Issue #2878](https://github.com/databricks/cli/issues/2878), which requests the ability to include specific resource files based on target configurations.

2 changes: 1 addition & 1 deletion knowledge_base/write_from_job_to_volume/README.md
@@ -1,6 +1,6 @@
# Save job result to volume

This example demonstrates how to define and use a Unity Catalog Volume in a Databricks Asset Bundle.
This Declarative Automation Bundles example demonstrates how to define and use a Unity Catalog volume.

Specifically we'll define a `hello_world_job` job which writes "Hello, World!"
to a file in a Unity Catalog Volume.
@@ -4,6 +4,6 @@ resources:
catalog_name: main
# We use the ${resources.schemas...} interpolation syntax to force the creation
# of the schema before the volume. Usage of the ${resources.schemas...} syntax
# allows Databricks Asset Bundles to form a dependency graph between resources.
# allows Declarative Automation Bundles to form a dependency graph between resources.
schema_name: ${resources.schemas.hello_world_schema.name}
name: my_volume