
Fix dashboard serialisation: query is now queryLines on upstream REST API#491

Merged
sundarshankar89 merged 5 commits into main from fix/dataset-model
Mar 13, 2026

Conversation

@asnare
Contributor

@asnare asnare commented Mar 6, 2026

This PR updates the way we load datasets from dashboards via the API:

  • Previously the JSON for a dataset included a query attribute in the response; now it returns queryLines, an array containing the lines of the query.
  • When writing, the upstream API still accepts query for compatibility, but it converts it to queryLines on fetch.

For compatibility this change retains the use of query (formatted via SQLGlot) in the saved dashboards.
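The read/write asymmetry described above can be sketched roughly as follows. This is a hypothetical model, not the actual lsql implementation: the class and attribute names (other than the `query`/`queryLines` JSON keys) are illustrative.

```python
from dataclasses import dataclass


@dataclass
class Dataset:
    """Hypothetical dashboard dataset model; names are illustrative."""

    name: str
    query: str

    @classmethod
    def from_dict(cls, raw: dict) -> "Dataset":
        # Newer API responses return the query as an array of lines under
        # "queryLines"; older payloads carried a single "query" string.
        if "queryLines" in raw:
            query = "\n".join(raw["queryLines"])
        else:
            query = raw["query"]
        return cls(name=raw["name"], query=query)

    def as_dict(self) -> dict:
        # Keep writing "query" for compatibility: the upstream API still
        # accepts it on write and converts it to "queryLines" on fetch.
        return {"name": self.name, "query": self.query}


old = Dataset.from_dict({"name": "d1", "query": "SELECT 1"})
new = Dataset.from_dict({"name": "d1", "queryLines": ["SELECT", "1"]})
```

Accepting both keys on read while emitting only `query` on write keeps saved dashboards loadable by older and newer API versions alike.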

An integration test has been added to cover .save_to_folder(): this path previously had no integration test coverage.

Related issues:

As an incidental change, Hatch is upgraded from 1.9.4 to 1.16.5: this is necessary because 1.9.4 no longer works with the current version of pip, and without Hatch the CI/CD cannot run.

asnare added 2 commits March 6, 2026 15:45
Serialized dashboards now return the query as an array of lines in the `queryLines` attribute.
@asnare asnare self-assigned this Mar 6, 2026
@asnare asnare added the bug Something isn't working label Mar 6, 2026
@asnare asnare added this to UCX Mar 6, 2026
@asnare asnare moved this to Ready for Review in UCX Mar 6, 2026
@github-actions

github-actions bot commented Mar 6, 2026

❌ 36/37 passed, 1 failed, 4 skipped, 26m56s total

❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import Unknown\nbackend = RuntimeBackend()\ntry:\n grants = backend.fetch("SHOW GRANTS ON METASTORE")\n print("FAILED")\nexcept Unknown:\n print("PASSED")\n]: AssertionError: assert 'FAILED' == 'PASSED' (23.843s)
AssertionError: assert 'FAILED' == 'PASSED'
  
  - PASSED
  + FAILED
08:17 DEBUG [databricks.sdk] Loaded from environment
08:17 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:17 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:17 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:17 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw3] linux -- Python 3.10.19 /home/runner/work/lsql/lsql/.venv/bin/python
08:17 DEBUG [databricks.sdk] Loaded from environment
08:17 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:17 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:17 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:17 INFO [databricks.sdk] Using Databricks Metadata Service authentication
08:17 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/153383108335587",
<       "display": "users",
<       "type": "direct",
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
08:17 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpvwq9p1s_/working-copy in /tmp/tmpvwq9p1s_
08:17 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.RPKE/wheels/databricks_labs_lsql-0.16.1+1020260313081742-py3-none-any.whl
08:17 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
08:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.RPKE/wheels) does not exist."
< }
08:17 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.RPKE/wheels
08:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.RPKE/wheels"
> }
< 200 OK
< {}
08:17 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
08:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 3050996465364851
< }
08:17 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
08:17 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.RPKE/version.json
08:17 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
08:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 3050996465364852
< }
08:17 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "ON_DEMAND_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "liran.bareket@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "liran.bareket@databricks.com",
<     "DatabricksInstanceGroupId": "-7571316921879686317",
<     "DatabricksInstancePoolCreatorId": "6779888502363704",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.12.10",
<     "instance_id": "5e13c81f05184961a5ed1781ff058021",
<     "ngrok_endpoint_base_domain": "green.mux.ngrok-dataplane.wildcard",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "d22a4e3eef08442d9761e87054942901",
<     "node_type_id": "Standard_D8ads_v6",
<     "private_ip": "10.179.14.10",
<     "start_timestamp": 1773389782271
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8ads_v6",
<   "effective_spark_version": "16.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1773389805510,
<   "last_restarted_time": 1773389830056,
<   "last_state_loss_time": 1773389830002,
<   "node_type_id": "Standard_D8ads_v6",
<   "num_workers": 0,
<   "pinned_by_user_name": "6779888502363704",
<   "release_version": "16.4.19",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 3580861177023537875,
<   "spark_version": "16.4.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.4.x-scala2.12"
<   },
<   "start_time": 1759339672984,
<   "state": "RUNNING",
<   "state_message": ""
< }
08:17 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "8953140847238503821"
< }
08:17 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8953140847238503821
< 200 OK
< {
<   "id": "8953140847238503821",
<   "status": "Pending"
< }
08:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=8953140847238503821: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
08:17 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8953140847238503821
< 200 OK
< {
<   "id": "8953140847238503821",
<   "status": "Pending"
< }
08:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=8953140847238503821: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
08:17 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8953140847238503821
< 200 OK
< {
<   "id": "8953140847238503821",
<   "status": "Running"
< }
08:17 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (111 more bytes)",
>   "contextId": "8953140847238503821",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "90664dd351784f028016d40029c54a33"
< }
08:17 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=90664dd351784f028016d40029c54a33&contextId=8953140847238503821
< 200 OK
< {
<   "id": "90664dd351784f028016d40029c54a33",
<   "results": null,
<   "status": "Running"
< }
08:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=90664dd351784f028016d40029c54a33, context_id=8953140847238503821: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
08:17 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=90664dd351784f028016d40029c54a33&contextId=8953140847238503821
< 200 OK
< {
<   "id": "90664dd351784f028016d40029c54a33",
<   "results": null,
<   "status": "Running"
< }
08:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=90664dd351784f028016d40029c54a33, context_id=8953140847238503821: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
08:17 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=90664dd351784f028016d40029c54a33&contextId=8953140847238503821
< 200 OK
< {
<   "id": "90664dd351784f028016d40029c54a33",
<   "results": null,
<   "status": "Running"
< }
08:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=90664dd351784f028016d40029c54a33, context_id=8953140847238503821: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
08:17 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=90664dd351784f028016d40029c54a33&contextId=8953140847238503821
< 200 OK
< {
<   "id": "90664dd351784f028016d40029c54a33",
<   "results": null,
<   "status": "Running"
< }
08:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=90664dd351784f028016d40029c54a33, context_id=8953140847238503821: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
08:18 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=90664dd351784f028016d40029c54a33&contextId=8953140847238503821
< 200 OK
< {
<   "id": "90664dd351784f028016d40029c54a33",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.RPKE/wheels/databricks_labs_ls... (5309 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
08:18 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "from databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import Unkno... (145 more bytes)",
>   "contextId": "8953140847238503821",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "7bf5d16e526a4ec680f776913d6bf704"
< }
08:18 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=7bf5d16e526a4ec680f776913d6bf704&contextId=8953140847238503821
< 200 OK
< {
<   "id": "7bf5d16e526a4ec680f776913d6bf704",
<   "results": {
<     "data": "FAILED",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }

Running from acceptance #561

Hatch 1.9.4 does not work with the current version of pip.
@asnare
Contributor Author

asnare commented Mar 6, 2026

❌ 36/37 passed, 1 failed, 4 skipped, 4m32s total

❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import Unknown\nbackend = RuntimeBackend()\ntry:\n grants = backend.fetch("SHOW GRANTS ON METASTORE")\n print("FAILED")\nexcept Unknown:\n print("PASSED")\n]: ValueError: TEST_SCHEMA auth: metadata-service: HTTPConnectionPool(host='127.0.0.1', port=33719): Read timed out. (read timeout=10). Config: host=https://DATABRICKS_HOST, CLOUD_ENV_client_id=4106dc97-a963-48f0-a079-a578238959a6, CLOUD_ENV_tenant_id=9f37a392-f0ae-4280-9796-f1864a10effc, auth_type=metadata-service, cluster_id=DATABRICKS_CLUSTER_ID, warehouse_id=DATABRICKS_WAREHOUSE_ID, metadata_service_url=***. Env: DATABRICKS_HOST, ARM_CLIENT_ID, ARM_TENANT_ID, DATABRICKS_AUTH_TYPE, DATABRICKS_CLUSTER_ID, DATABRICKS_WAREHOUSE_ID, DATABRICKS_METADATA_SERVICE_URL (10.003s)
Running from acceptance #552

This test is failing on main and is unrelated to this PR. (It is addressed by #492.)

@asnare asnare enabled auto-merge (squash) March 12, 2026 08:56
@sundarshankar89 sundarshankar89 merged commit 29d6faf into main Mar 13, 2026
9 of 10 checks passed
@sundarshankar89 sundarshankar89 deleted the fix/dataset-model branch March 13, 2026 08:12
@github-project-automation github-project-automation bot moved this from Ready for Review to Done in UCX Mar 13, 2026

Labels

bug Something isn't working

Projects

Status: Done

3 participants