Commit 0bab892
acc: Add instance_pool_and_node_type test for jobs clusters transformation (#3226)
## Tests

- Move bundle/resources/jobs to bundle/resources/jobs/update.
- Add a new test, bundle/resources/jobs/instance_pool_and_node_type, that exercises the mutator added in #3198.

## Why

Previously we only had a slow integration test covering this behaviour. This test is local and explicit, and it highlights an issue where node_type_id is serialized as "" rather than being omitted, which trips up the backend.
1 parent 7e8e974 commit 0bab892

File tree

9 files changed: +119 −0 lines changed

Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
resources:
  jobs:
    some_other_job:
      name: "[${bundle.target}] Test Wheel Job $UNIQUE_NAME"
      tasks:
        - task_key: TestTask
          new_cluster:
            num_workers: 1
            spark_version: $DEFAULT_SPARK_VERSION
            node_type_id: $NODE_TYPE_ID
            data_security_mode: USER_ISOLATION
            instance_pool_id: $TEST_INSTANCE_POOL_ID
          python_wheel_task:
            package_name: my_test_code
            entry_point: run
            parameters:
              - "one"
              - "two"
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
Local = true
Cloud = false

[EnvMatrix]
DATABRICKS_CLI_DEPLOYMENT = ["terraform"]
Lines changed: 71 additions & 0 deletions
@@ -0,0 +1,71 @@

>>> [CLI] bundle validate -o json
[
  {
    "new_cluster": {
      "data_security_mode": "USER_ISOLATION",
      "instance_pool_id": "$TEST_INSTANCE_POOL_ID",
      "node_type_id": "",
      "num_workers": 1,
      "spark_version": "$DEFAULT_SPARK_VERSION"
    },
    "python_wheel_task": {
      "entry_point": "run",
      "package_name": "my_test_code",
      "parameters": [
        "one",
        "two"
      ]
    },
    "task_key": "TestTask"
  }
]

>>> [CLI] bundle summary -o json
[
  {
    "new_cluster": {
      "data_security_mode": "USER_ISOLATION",
      "instance_pool_id": "$TEST_INSTANCE_POOL_ID",
      "node_type_id": "",
      "num_workers": 1,
      "spark_version": "$DEFAULT_SPARK_VERSION"
    },
    "python_wheel_task": {
      "entry_point": "run",
      "package_name": "my_test_code",
      "parameters": [
        "one",
        "two"
      ]
    },
    "task_key": "TestTask"
  }
]

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> jq -s .[] | select(.path=="/api/2.2/jobs/create") | .body.tasks out.requests.txt
[
  {
    "new_cluster": {
      "data_security_mode": "USER_ISOLATION",
      "instance_pool_id": "$TEST_INSTANCE_POOL_ID",
      "num_workers": 1,
      "spark_version": "$DEFAULT_SPARK_VERSION"
    },
    "python_wheel_task": {
      "entry_point": "run",
      "package_name": "my_test_code",
      "parameters": [
        "one",
        "two"
      ]
    },
    "task_key": "TestTask"
  }
]
Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
trace $CLI bundle validate -o json | jq .resources.jobs.some_other_job.tasks
trace $CLI bundle summary -o json | jq .resources.jobs.some_other_job.tasks

trace $CLI bundle deploy

trace jq -s '.[] | select(.path=="/api/2.2/jobs/create") | .body.tasks' out.requests.txt
rm out.requests.txt
Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
RecordRequests = true

# Fails on direct with
# --- FAIL: TestAccept/bundle/resources/jobs/instance_pool_and_node_type (0.00s)
#     --- FAIL: TestAccept/bundle/resources/jobs/instance_pool_and_node_type/DATABRICKS_CLI_DEPLOYMENT=direct-exp (1.60s)
#         acceptance_test.go:1178: Writing updated bundle config to databricks.yml. BundleConfig sections: default_name
#         acceptance_test.go:722: Diff:
#             --- bundle/resources/jobs/instance_pool_and_node_type/output.txt
#             +++ /var/folders/5y/9kkdnjw91p11vsqwk0cvmk200000gp/T/TestAcceptbundleresourcesjobsinstance_pool_and_node_typeDATABRICKS_CLI_DEPLOYMENT=direct-exp3221363519/001/output.txt
#             @@ -55,6 +55,7 @@
#                  "new_cluster": {
#                    "data_security_mode": "USER_ISOLATION",
#                    "instance_pool_id": "$TEST_INSTANCE_POOL_ID",
#             +      "node_type_id": "",
#                    "num_workers": 1,
#                    "spark_version": "$DEFAULT_SPARK_VERSION"
#                  },
EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"]
4 files renamed without changes.

0 commit comments