Commit 9138476
Convert TestGenerateAndBind to an acceptance test (#3160)
## Why

One change in a series of changes converting integration tests into acceptance tests. This will allow easier testing of various backing solutions for bundle deployment.
1 parent 4b32ca9 · commit 9138476

File tree

6 files changed: +131 −70 lines changed
Lines changed: 8 additions & 0 deletions (new file)

```yaml
bundle:
  name: test-bundle-generate-bind-$UNIQUE_NAME

workspace:
  root_path: "~/.bundle/test-bundle-generate-bind-$UNIQUE_NAME"

include:
  - resources/*.yml
```
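The `$UNIQUE_NAME` placeholders in this template are expanded from the environment before the bundle config is used (the test script renders the template with `envsubst`). A minimal sketch of the same expansion, using a hypothetical suffix value:

```python
import os

# Hypothetical unique suffix; the real value comes from the test harness.
os.environ["UNIQUE_NAME"] = "abc123"

template = "bundle:\n  name: test-bundle-generate-bind-$UNIQUE_NAME\n"
rendered = os.path.expandvars(template)  # expands $UNIQUE_NAME like envsubst
print(rendered)
```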
Lines changed: 47 additions & 0 deletions (new file)

```
=== Create a pre-defined job:
Created job with ID: [NUMID]

>>> [CLI] workspace mkdirs /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]

>>> [CLI] workspace import /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]/test --file test.py --language PYTHON

>>> [CLI] bundle generate job --key test_job_key --existing-job-id [NUMID] --config-dir resources --source-dir src
File successfully saved to src/test.py
Job configuration successfully saved to resources/test_job_key.job.yml

>>> ls src/
test.py

>>> cat resources/test_job_key.job.yml
name: generate-job-[UNIQUE_NAME]

>>> [CLI] bundle deployment bind test_job_key [NUMID] --auto-approve
Updating deployment state...
Successfully bound job with an id '[NUMID]'. Run 'bundle deploy' to deploy changes to your workspace

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle-generate-bind-[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
  delete job test_job_key

All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/test-bundle-generate-bind-[UNIQUE_NAME]

Deleting files...
Destroy complete!

=== Check that job is bound and does not exist after bundle is destroyed:
>>> errcode [CLI] jobs get [NUMID] --output json
Error: Job [NUMID] does not exist.

Exit code: 1

=== Delete the tmp folder:
>>> [CLI] workspace delete /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]/test

>>> [CLI] workspace delete /Workspace/Users/[USERNAME]/python-[UNIQUE_NAME]
```
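Tokens such as `[NUMID]`, `[USERNAME]`, and `[UNIQUE_NAME]` stand in for run-specific values so the recorded output stays stable across runs. A rough sketch of that normalization idea; the patterns below are illustrative, not the framework's actual replacement rules:

```python
import re

def normalize(output: str, username: str, unique_name: str) -> str:
    # Substitute run-specific values with stable placeholders.
    output = output.replace(username, "[USERNAME]")
    output = output.replace(unique_name, "[UNIQUE_NAME]")
    # Long digit runs are treated as job/resource IDs here (an assumption).
    output = re.sub(r"\b\d{6,}\b", "[NUMID]", output)
    return output

raw = "Created job with ID: 123456789012"
print(normalize(raw, "someone@example.com", "abc123"))
```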
Lines changed: 53 additions & 0 deletions (new file)

```shell
title "Create a pre-defined job:\n"

PYTHON_NOTEBOOK_DIR="/Workspace/Users/${CURRENT_USER_NAME}/python-${UNIQUE_NAME}"
PYTHON_NOTEBOOK="${PYTHON_NOTEBOOK_DIR}/test"

JOB_ID=$($CLI jobs create --json '
{
  "name": "generate-job-'${UNIQUE_NAME}'",
  "tasks": [
    {
      "task_key": "test",
      "new_cluster": {
        "spark_version": "'${DEFAULT_SPARK_VERSION}'",
        "node_type_id": "'${NODE_TYPE_ID}'",
        "num_workers": 1,
        "spark_conf": {
          "spark.databricks.enableWsfs": true,
          "spark.databricks.hive.metastore.glueCatalog.enabled": true,
          "spark.databricks.pip.ignoreSSL": true
        }
      },
      "notebook_task": {
        "notebook_path": "'${PYTHON_NOTEBOOK}'"
      }
    }
  ]
}' | jq -r '.job_id')

echo "Created job with ID: $JOB_ID"

envsubst < databricks.yml.tmpl > databricks.yml

cleanup() {
  title "Delete the tmp folder:"
  trace $CLI workspace delete ${PYTHON_NOTEBOOK}
  trace $CLI workspace delete ${PYTHON_NOTEBOOK_DIR}
}
trap cleanup EXIT

trace $CLI workspace mkdirs "${PYTHON_NOTEBOOK_DIR}"
trace $CLI workspace import "${PYTHON_NOTEBOOK}" --file test.py --language PYTHON

trace $CLI bundle generate job --key test_job_key --existing-job-id $JOB_ID --config-dir resources --source-dir src
trace ls src/
trace cat resources/test_job_key.job.yml | grep "name: generate-job-${UNIQUE_NAME}"

trace $CLI bundle deployment bind test_job_key $JOB_ID --auto-approve
trace $CLI bundle deploy

trace $CLI bundle destroy --auto-approve

title "Check that job is bound and does not exist after bundle is destroyed:"
trace errcode $CLI jobs get "${JOB_ID}" --output json
```
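The script captures the new job's ID by piping the `jobs create` response through `jq -r '.job_id'`. The equivalent extraction in Python, assuming the response is a JSON object with a numeric `job_id` field (the ID shown is made up):

```python
import json

# Illustrative response body; a real `jobs create` call returns more fields.
response = '{"job_id": 123456789012}'
job_id = json.loads(response)["job_id"]
print(job_id)  # 123456789012
```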
Lines changed: 2 additions & 0 deletions (new file)

```python
# Databricks notebook source
print("Hello world!")
```
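The `# Databricks notebook source` marker on the first line is what lets the workspace import API recognize this file as a notebook rather than a plain Python file; this is the logic the test config says would need to be replicated in the fake workspace to run locally. A hypothetical sketch of that detection, not the server's actual implementation:

```python
NOTEBOOK_HEADER = "# Databricks notebook source"

def is_python_notebook(source: str) -> bool:
    """Guess whether a .py file body should be imported as a notebook."""
    first_line = source.splitlines()[0].strip() if source.strip() else ""
    return first_line == NOTEBOOK_HEADER

print(is_python_notebook('# Databricks notebook source\nprint("Hello world!")'))  # True
print(is_python_notebook('print("Hello world!")'))  # False
```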
Lines changed: 21 additions & 0 deletions (new file)

```toml
# This test uses the workspace import API to load a notebook file.
# That API has logic for accepting notebook files and distinguishing them from regular Python files.
# To succeed locally, we would need to replicate this logic in the fake_workspace.
Local = false
Cloud = true

Ignore = [
  "databricks.yml",
  "resources/*",
  "src/*"
]

[Env]
# MSYS2 automatically converts absolute paths like /Users/$username/$UNIQUE_NAME to
# C:/Program Files/Git/Users/$username/UNIQUE_NAME before passing them to the CLI.
# Setting this environment variable prevents that conversion on Windows.
MSYS_NO_PATHCONV = "1"

[[Repls]]
Old = '\\'
New = '/'
```

integration/bundle/bind_resource_test.go

Lines changed: 0 additions & 70 deletions (this file was deleted)
