2 changes: 2 additions & 0 deletions NEXT_CHANGELOG.md
@@ -8,6 +8,8 @@

### Bundles

* Add declarative bind support for direct deployment engine ([#4630](https://github.com/databricks/cli/pull/4630)).
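A minimal sketch of the new declarative bind block, as exercised by the acceptance tests in this PR (the resource key `foo` and the job ID are illustrative placeholders):

```yaml
# Binds the bundle resource jobs.foo in the `default` target
# to an existing workspace job with ID 12345, instead of creating a new job.
targets:
  default:
    bind:
      jobs:
        foo:
          id: "12345"
```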

### Dependency updates

### API Changes
23 changes: 23 additions & 0 deletions acceptance/bundle/deploy/bind/basic/databricks.yml
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-basic

resources:
  jobs:
    foo:
      name: test-bind-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        foo:
          id: "PLACEHOLDER_JOB_ID"
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/bind/basic/hello.py
@@ -0,0 +1 @@
print("hello")
5 changes: 5 additions & 0 deletions acceptance/bundle/deploy/bind/basic/out.test.toml


66 changes: 66 additions & 0 deletions acceptance/bundle/deploy/bind/basic/output.txt
@@ -0,0 +1,66 @@

>>> [CLI] bundle plan
bind jobs.foo (id: [NEW_JOB_ID])

Plan: 0 to add, 1 to change, 0 to delete, 0 unchanged, 1 to bind

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] bundle plan
Plan: 0 to add, 0 to change, 0 to delete, 1 unchanged

>>> print_state.py
{
  "state_version": 1,
  "cli_version": "[DEV_VERSION]",
  "lineage": "[UUID]",
  "serial": 1,
  "state": {
    "resources.jobs.foo": {
      "__id__": "[NEW_JOB_ID]",
      "state": {
        "deployment": {
          "kind": "BUNDLE",
          "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default/state/metadata.json"
        },
        "edit_mode": "UI_LOCKED",
        "environments": [
          {
            "environment_key": "default",
            "spec": {
              "client": "1"
            }
          }
        ],
        "format": "MULTI_TASK",
        "max_concurrent_runs": 1,
        "name": "test-bind-job",
        "queue": {
          "enabled": true
        },
        "tasks": [
          {
            "environment_key": "default",
            "spark_python_task": {
              "python_file": "/Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default/files/hello.py"
            },
            "task_key": "my_task"
          }
        ]
      }
    }
  }
}

>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.foo

All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/test-bind-basic/default

Deleting files...
Destroy complete!
21 changes: 21 additions & 0 deletions acceptance/bundle/deploy/bind/basic/script
@@ -0,0 +1,21 @@
# Create a job in the workspace
NEW_JOB_ID=$($CLI jobs create --json '{"name": "test-import-job", "environments": [{"environment_key": "default", "spec": {"client": "1"}}], "tasks": [{"task_key": "my_task", "environment_key": "default", "spark_python_task": {"python_file": "/Workspace/test.py"}}]}' | jq -r .job_id)
add_repl.py $NEW_JOB_ID NEW_JOB_ID

# Update the databricks.yml with the actual job ID
update_file.py databricks.yml 'PLACEHOLDER_JOB_ID' "$NEW_JOB_ID"

# Run plan - should show import action
trace $CLI bundle plan

# Deploy with auto-approve
trace $CLI bundle deploy --auto-approve

# Plan again - should show no changes (skip)
trace $CLI bundle plan

# Verify state file contains the imported ID
trace print_state.py | contains.py "$NEW_JOB_ID"

# Cleanup
trace $CLI bundle destroy --auto-approve
23 changes: 23 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/databricks.yml
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-update

resources:
  jobs:
    foo:
      name: updated-job-name
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        foo:
          id: "PLACEHOLDER_JOB_ID"
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/hello.py
@@ -0,0 +1 @@
print("hello")
5 changes: 5 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/out.test.toml


23 changes: 23 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/output.txt
@@ -0,0 +1,23 @@

>>> [CLI] bundle plan
bind jobs.foo (id: [NEW_JOB_ID])

Plan: 0 to add, 1 to change, 0 to delete, 0 unchanged, 1 to bind

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bind-update/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] jobs get [NEW_JOB_ID]
updated-job-name

>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.foo

All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/test-bind-update/default

Deleting files...
Destroy complete!
18 changes: 18 additions & 0 deletions acceptance/bundle/deploy/bind/bind-and-update/script
@@ -0,0 +1,18 @@
# Create a job in the workspace with a different name
NEW_JOB_ID=$($CLI jobs create --json '{"name": "original-job-name", "environments": [{"environment_key": "default", "spec": {"client": "1"}}], "tasks": [{"task_key": "my_task", "environment_key": "default", "spark_python_task": {"python_file": "/Workspace/test.py"}}]}' | jq -r .job_id)
add_repl.py $NEW_JOB_ID NEW_JOB_ID

# Update the databricks.yml with the actual job ID
update_file.py databricks.yml 'PLACEHOLDER_JOB_ID' "$NEW_JOB_ID"

# Run plan - should show import_and_update action (name differs from config)
trace $CLI bundle plan

# Deploy with auto-approve
trace $CLI bundle deploy --auto-approve

# Verify the job was updated
trace $CLI jobs get $NEW_JOB_ID | jq -r .settings.name

# Cleanup
trace $CLI bundle destroy --auto-approve
23 changes: 23 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/databricks.yml
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-block-migrate

resources:
  jobs:
    foo:
      name: test-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        foo:
          id: "12345"
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/bind/block-migrate/hello.py
@@ -0,0 +1 @@
print("hello")
5 changes: 5 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/out.test.toml


3 changes: 3 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/output.txt
@@ -0,0 +1,3 @@

>>> musterr [CLI] bundle deployment migrate
Error: cannot run 'bundle deployment migrate' when bind blocks are defined in the target configuration; bind blocks are only supported with the direct deployment engine
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/script
@@ -0,0 +1,2 @@
# Try to run migration with bind blocks - should fail
trace musterr $CLI bundle deployment migrate
2 changes: 2 additions & 0 deletions acceptance/bundle/deploy/bind/block-migrate/test.toml
@@ -0,0 +1,2 @@
# Migration test does not need engine matrix
[EnvMatrix]
@@ -0,0 +1,19 @@
bundle:
  name: test-bind-delete-conflict

resources:
  jobs:
    foo:
      name: test-bind-delete-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
@@ -0,0 +1,23 @@
bundle:
  name: test-bind-delete-conflict

resources:
  jobs:
    bar:
      name: test-bind-delete-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        bar:
          id: "PLACEHOLDER_JOB_ID"
@@ -0,0 +1 @@
print("hello")


@@ -0,0 +1,22 @@

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bind-delete-conflict/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> musterr [CLI] bundle plan
Error: bind block for "resources.jobs.bar" has the same ID "[FOO_ID]" as existing resource "resources.jobs.foo"; remove the bind block or the conflicting resource
at targets.default.bind.jobs.bar

Error: bind validation failed


>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.foo

All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/test-bind-delete-conflict/default

Deleting files...
Destroy complete!
18 changes: 18 additions & 0 deletions acceptance/bundle/deploy/bind/delete-and-bind-conflict/script
@@ -0,0 +1,18 @@
# Deploy foo to create it in state
trace $CLI bundle deploy --auto-approve

# Get the job ID from state
JOB_ID=$(read_id.py foo)

# Switch to a config that renames foo->bar and adds a bind block for bar
# with the same job ID. This creates a conflict: foo is being deleted
# (still in state) while bar is being bound with the same ID.
cp databricks.yml databricks.yml.bak
cp databricks_conflict.yml databricks.yml
update_file.py databricks.yml 'PLACEHOLDER_JOB_ID' "$JOB_ID"

trace musterr $CLI bundle plan

# Cleanup: restore original config and destroy
cp databricks.yml.bak databricks.yml
trace $CLI bundle destroy --auto-approve
@@ -0,0 +1 @@
Ignore = [".databricks", "databricks.yml.bak", "databricks_conflict.yml"]
19 changes: 19 additions & 0 deletions acceptance/bundle/deploy/bind/duplicate-bind-id/databricks.yml
@@ -0,0 +1,19 @@
bundle:
  name: test-bind-duplicate-id

resources:
  jobs:
    foo:
      name: test-bind-dup-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
@@ -0,0 +1,34 @@
bundle:
  name: test-bind-duplicate-id

resources:
  jobs:
    foo:
      name: test-bind-dup-job
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py
    bar:
      name: test-bind-dup-bar
      environments:
        - environment_key: default
          spec:
            client: "1"
      tasks:
        - task_key: my_task
          environment_key: default
          spark_python_task:
            python_file: ./hello.py

targets:
  default:
    bind:
      jobs:
        bar:
          id: "PLACEHOLDER_JOB_ID"
1 change: 1 addition & 0 deletions acceptance/bundle/deploy/bind/duplicate-bind-id/hello.py
@@ -0,0 +1 @@
print("hello")