
Commit 74899eb

denikclaude and Claude Sonnet 4.6 authored
direct: fix drift detection for jobs with >100 tasks (#4675)
## Changes
Use jobs.Get() instead of jobs.GetById() to properly paginate tasks.

## Why
Fixes plan for jobs with more than 100 tasks.

## Tests
New pydabs configs for invariant tests covering jobs with 10 and 1000 tasks. The 1000-task config failed the no_drift test because the remote state only had 100 tasks and the rest were considered 'update'. The test server for jobs is extended to read all tasks.

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
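The failure mode can be sketched outside the CLI with a toy paginated API (the names `get_by_id`/`get` and the page size are illustrative stand-ins, not the Databricks SDK's actual surface; the real fix swaps jobs.GetById() for jobs.Get() in the Go codebase):

```python
# Toy model of the bug: a jobs API that returns at most 100 tasks per page.
PAGE_SIZE = 100

def get_by_id(tasks):
    """Old behavior: a single unpaginated read (first page only)."""
    return tasks[:PAGE_SIZE]

def get(tasks):
    """New behavior: follow pagination until every task has been read."""
    out, offset = [], 0
    while True:
        page = tasks[offset:offset + PAGE_SIZE]
        if not page:
            return out
        out.extend(page)
        offset += PAGE_SIZE

local = [f"task_{i:04d}" for i in range(1000)]  # desired state
remote = list(local)                            # deployed state: identical

# Drift = tasks present locally but missing from what we read remotely.
drift_old = set(local) - set(get_by_id(remote))
drift_new = set(local) - set(get(remote))
print(len(drift_old))  # 900 tasks wrongly flagged as needing an update
print(len(drift_new))  # 0 -- no drift once all pages are read
```

With only the first page read back, 900 identical tasks look like remote drift and the plan reports spurious updates; reading every page makes the states compare equal.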
1 parent c779b91 commit 74899eb

File tree

12 files changed: +157 −3 lines changed


CHANGELOG.md

Lines changed: 6 additions & 0 deletions

@@ -1,5 +1,11 @@
 # Version changelog
 
+## Release v0.293.0 (2026-03-06)
+
+### Bundles
+* direct: fix drift detection for jobs with >100 tasks by paginating all tasks when reading job state ([#4675](https://github.com/databricks/cli/pull/4675))
+
+
 ## Release v0.292.0 (2026-03-05)
 
 ### Bundles
Lines changed: 7 additions & 0 deletions

@@ -0,0 +1,7 @@
+bundle:
+  name: test-bundle-$UNIQUE_NAME
+
+python:
+  venv_path: .venv
+  resources:
+    - "job_pydabs_1000_tasks:load_resources"
Lines changed: 4 additions & 0 deletions

@@ -0,0 +1,4 @@
+#!/bin/bash
+
+uv venv --quiet
+uv pip install --quiet "$DATABRICKS_BUNDLES_WHEEL"
Lines changed: 7 additions & 0 deletions

@@ -0,0 +1,7 @@
+bundle:
+  name: test-bundle-$UNIQUE_NAME
+
+python:
+  venv_path: .venv
+  resources:
+    - "job_pydabs_10_tasks:load_resources"
Lines changed: 4 additions & 0 deletions

@@ -0,0 +1,4 @@
+#!/bin/bash
+
+uv venv --quiet
+uv pip install --quiet "$DATABRICKS_BUNDLES_WHEEL"
Lines changed: 38 additions & 0 deletions

@@ -0,0 +1,38 @@
+import os
+
+from databricks.bundles.core import Resources
+
+
+def load_resources() -> Resources:
+    unique_name = os.environ["UNIQUE_NAME"]
+    spark_version = os.environ.get("DEFAULT_SPARK_VERSION", "13.3.x-scala2.12")
+    node_type_id = os.environ.get("NODE_TYPE_ID", "i3.xlarge")
+
+    resources = Resources()
+    resources.add_job(
+        resource_name="foo",
+        job={
+            "name": f"test-job-{unique_name}",
+            "tasks": [
+                {
+                    "task_key": f"task_{i:04d}",
+                    "notebook_task": {
+                        "notebook_path": "/Shared/notebook",
+                    },
+                    "job_cluster_key": "main_cluster",
+                }
+                for i in range(1000)
+            ],
+            "job_clusters": [
+                {
+                    "job_cluster_key": "main_cluster",
+                    "new_cluster": {
+                        "spark_version": spark_version,
+                        "node_type_id": node_type_id,
+                        "num_workers": 1,
+                    },
+                }
+            ],
+        },
+    )
+    return resources
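The fixture above generates 1000 tasks sharing one job cluster, with zero-padded keys for stable ordering. The shape of the payload it builds can be checked in plain Python, with no pydabs dependency (a standalone sketch, not part of the test suite):

```python
# Rebuild just the "tasks" payload the 1000-task fixture generates.
tasks = [
    {
        "task_key": f"task_{i:04d}",
        "notebook_task": {"notebook_path": "/Shared/notebook"},
        "job_cluster_key": "main_cluster",
    }
    for i in range(1000)
]

assert len(tasks) == 1000                           # well past the 100-task page size
assert tasks[0]["task_key"] == "task_0000"          # zero-padded, stable ordering
assert len({t["task_key"] for t in tasks}) == 1000  # every key is unique
```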
Lines changed: 38 additions & 0 deletions

@@ -0,0 +1,38 @@
+import os
+
+from databricks.bundles.core import Resources
+
+
+def load_resources() -> Resources:
+    unique_name = os.environ["UNIQUE_NAME"]
+    spark_version = os.environ.get("DEFAULT_SPARK_VERSION", "13.3.x-scala2.12")
+    node_type_id = os.environ.get("NODE_TYPE_ID", "i3.xlarge")
+
+    resources = Resources()
+    resources.add_job(
+        resource_name="foo",
+        job={
+            "name": f"test-job-{unique_name}",
+            "tasks": [
+                {
+                    "task_key": f"task_{i:02d}",
+                    "notebook_task": {
+                        "notebook_path": "/Shared/notebook",
+                    },
+                    "job_cluster_key": "main_cluster",
+                }
+                for i in range(10)
+            ],
+            "job_clusters": [
+                {
+                    "job_cluster_key": "main_cluster",
+                    "new_cluster": {
+                        "spark_version": spark_version,
+                        "node_type_id": node_type_id,
+                        "num_workers": 1,
+                    },
+                }
+            ],
+        },
+    )
+    return resources

acceptance/bundle/invariant/migrate/out.test.toml

Lines changed: 1 addition & 1 deletion
(generated file; diff not rendered)

acceptance/bundle/invariant/no_drift/out.test.toml

Lines changed: 1 addition & 1 deletion
(generated file; diff not rendered)

acceptance/bundle/invariant/test.toml

Lines changed: 4 additions & 0 deletions

@@ -1,9 +1,11 @@
 Local = true
 Cloud = true
 RequiresUnityCatalog = true
+Timeout = '10m'
 
 Ignore = [
   ".databricks",
+  ".venv",
   "databricks.yml",
   "plan.json",
   "*.py",
@@ -28,6 +30,8 @@ EnvMatrix.INPUT_CONFIG = [
   "experiment.yml.tmpl",
   "external_location.yml.tmpl",
   "job.yml.tmpl",
+  "job_pydabs_10_tasks.yml.tmpl",
+  "job_pydabs_1000_tasks.yml.tmpl",
   "job_with_task.yml.tmpl",
   "model.yml.tmpl",
   "model_serving_endpoint.yml.tmpl",
