
Commit 6b59062

fix(databricks): regenerate docs to reflect API spec fixes
1 parent af865eb commit 6b59062

File tree

1 file changed (+14, -11 lines)

apps/docs/content/docs/en/tools/databricks.mdx

Lines changed: 14 additions & 11 deletions
@@ -33,7 +33,7 @@ Execute a SQL statement against a Databricks SQL warehouse and return results in
 | `catalog` | string | No | Unity Catalog name \(equivalent to USE CATALOG\) |
 | `schema` | string | No | Schema name \(equivalent to USE SCHEMA\) |
 | `rowLimit` | number | No | Maximum number of rows to return |
-| `waitTimeout` | string | No | How long to wait for results \(e.g., "50s"\). Range: "0s" or "5s" to "50s". Default: "10s" |
+| `waitTimeout` | string | No | How long to wait for results \(e.g., "50s"\). Range: "0s" or "5s" to "50s". Default: "50s" |

 #### Output

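The `waitTimeout` range documented in the hunk above ("0s", or "5s" to "50s") can be checked client-side before calling the tool. This is a minimal sketch; `validate_wait_timeout` is a hypothetical helper, not part of the tool or the Databricks API.

```python
import re

def validate_wait_timeout(value: str) -> bool:
    # Hypothetical helper: accepts "0s" (fire-and-forget) or any whole
    # number of seconds from "5s" to "50s", per the documented range.
    m = re.fullmatch(r"(\d+)s", value)
    if m is None:
        return False
    seconds = int(m.group(1))
    return seconds == 0 or 5 <= seconds <= 50
```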
@@ -45,7 +45,7 @@ Execute a SQL statement against a Databricks SQL warehouse and return results in
 |`name` | string | Column name |
 |`position` | number | Column position \(0-based\) |
 |`typeName` | string | Column type \(STRING, INT, LONG, DOUBLE, BOOLEAN, TIMESTAMP, DATE, DECIMAL, etc.\) |
-| `data` | array | Result rows as a 2D array of strings |
+| `data` | array | Result rows as a 2D array of strings where each inner array is a row of column values |
 | `totalRows` | number | Total number of rows in the result |
 | `truncated` | boolean | Whether the result set was truncated due to row_limit or byte_limit |

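The clarified `data` shape (one inner string array per row) pairs naturally with the `columns` metadata. A minimal sketch of that pairing, using illustrative data shaped like the documented output (the sample rows are assumptions, not real results):

```python
# Illustrative `columns` metadata and `data` 2D array of strings.
columns = [
    {"name": "id", "position": 0, "typeName": "LONG"},
    {"name": "city", "position": 1, "typeName": "STRING"},
]
data = [["1", "Oslo"], ["2", "Lima"]]

def rows_to_dicts(columns, data):
    # Order column names by the documented 0-based `position` field,
    # then pair each inner array (one row) with those names.
    names = [c["name"] for c in sorted(columns, key=lambda c: c["position"])]
    return [dict(zip(names, row)) for row in data]
```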
@@ -61,7 +61,7 @@ List all jobs in a Databricks workspace with optional filtering by name.
 | `apiKey` | string | Yes | Databricks Personal Access Token |
 | `limit` | number | No | Maximum number of jobs to return \(range 1-100, default 20\) |
 | `offset` | number | No | Offset for pagination |
-| `name` | string | No | Filter jobs by name \(case-insensitive match\) |
+| `name` | string | No | Filter jobs by exact name \(case-insensitive\) |
 | `expandTasks` | boolean | No | Include task and cluster details in the response \(max 100 elements\) |

 #### Output
@@ -124,15 +124,16 @@ Get the status, timing, and details of a Databricks job run by its run ID.
 | `runType` | string | Type of run \(JOB_RUN, WORKFLOW_RUN, SUBMIT_RUN\) |
 | `attemptNumber` | number | Retry attempt number \(0 for initial attempt\) |
 | `state` | object | Run state information |
-|`lifeCycleState` | string | Lifecycle state \(QUEUED, PENDING, RUNNING, TERMINATING, TERMINATED, SKIPPED, INTERNAL_ERROR\) |
-|`resultState` | string | Result state \(SUCCESS, FAILED, TIMEDOUT, CANCELED\) |
+|`lifeCycleState` | string | Lifecycle state \(QUEUED, PENDING, RUNNING, TERMINATING, TERMINATED, SKIPPED, INTERNAL_ERROR, BLOCKED, WAITING_FOR_RETRY\) |
+|`resultState` | string | Result state \(SUCCESS, FAILED, TIMEDOUT, CANCELED, SUCCESS_WITH_FAILURES, UPSTREAM_FAILED, UPSTREAM_CANCELED, EXCLUDED\) |
 |`stateMessage` | string | Descriptive message for the current state |
 |`userCancelledOrTimedout` | boolean | Whether the run was cancelled by user or timed out |
 | `startTime` | number | Run start timestamp \(epoch ms\) |
 | `endTime` | number | Run end timestamp \(epoch ms, 0 if still running\) |
 | `setupDuration` | number | Cluster setup duration \(ms\) |
 | `executionDuration` | number | Execution duration \(ms\) |
 | `cleanupDuration` | number | Cleanup duration \(ms\) |
+| `queueDuration` | number | Time spent in queue before execution \(ms\) |
 | `runPageUrl` | string | URL to the run detail page in Databricks UI |
 | `creatorUserName` | string | Email of the user who triggered the run |

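A common use of the `state` object documented above is polling until a run finishes. A minimal sketch, assuming (per the table) that TERMINATED, SKIPPED, and INTERNAL_ERROR are the lifecycle states after which a run will not progress; the helper names are hypothetical:

```python
# Terminal lifecycle states, taken from the documented enum values.
TERMINAL_LIFECYCLE_STATES = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def is_finished(state: dict) -> bool:
    # `resultState` is only meaningful once a run reaches a terminal state.
    return state.get("lifeCycleState") in TERMINAL_LIFECYCLE_STATES

def succeeded(state: dict) -> bool:
    return is_finished(state) and state.get("resultState") in {
        "SUCCESS",
        "SUCCESS_WITH_FAILURES",
    }
```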
@@ -149,7 +150,7 @@ List job runs in a Databricks workspace with optional filtering by job, status,
 | `jobId` | number | No | Filter runs by job ID. Omit to list runs across all jobs |
 | `activeOnly` | boolean | No | Only include active runs \(PENDING, RUNNING, or TERMINATING\) |
 | `completedOnly` | boolean | No | Only include completed runs |
-| `limit` | number | No | Maximum number of runs to return \(range 1-25, default 20\) |
+| `limit` | number | No | Maximum number of runs to return \(range 1-24, default 20\) |
 | `offset` | number | No | Offset for pagination |
 | `runType` | string | No | Filter by run type \(JOB_RUN, WORKFLOW_RUN, SUBMIT_RUN\) |
 | `startTimeFrom` | number | No | Filter runs started at or after this timestamp \(epoch ms\) |
@@ -165,9 +166,10 @@ List job runs in a Databricks workspace with optional filtering by job, status,
 |`runName` | string | Run name |
 |`runType` | string | Run type \(JOB_RUN, WORKFLOW_RUN, SUBMIT_RUN\) |
 |`state` | object | Run state information |
-|`lifeCycleState` | string | Lifecycle state \(QUEUED, PENDING, RUNNING, TERMINATING, TERMINATED, SKIPPED, INTERNAL_ERROR\) |
-|`resultState` | string | Result state \(SUCCESS, FAILED, TIMEDOUT, CANCELED\) |
+|`lifeCycleState` | string | Lifecycle state \(QUEUED, PENDING, RUNNING, TERMINATING, TERMINATED, SKIPPED, INTERNAL_ERROR, BLOCKED, WAITING_FOR_RETRY\) |
+|`resultState` | string | Result state \(SUCCESS, FAILED, TIMEDOUT, CANCELED, SUCCESS_WITH_FAILURES, UPSTREAM_FAILED, UPSTREAM_CANCELED, EXCLUDED\) |
 |`stateMessage` | string | Descriptive state message |
+|`userCancelledOrTimedout` | boolean | Whether the run was cancelled by user or timed out |
 |`startTime` | number | Run start timestamp \(epoch ms\) |
 |`endTime` | number | Run end timestamp \(epoch ms\) |
 | `hasMore` | boolean | Whether more runs are available for pagination |
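The `offset`/`limit` parameters and the `hasMore` flag documented above combine into a standard pagination loop. A minimal sketch under the assumption that `fetch_page` stands in for the actual run-listing call (the fake pages and helper names are illustrative only):

```python
# Fake pages standing in for successive tool responses (illustrative data).
_pages = [
    {"runs": [{"runId": 1}, {"runId": 2}], "hasMore": True},
    {"runs": [{"runId": 3}], "hasMore": False},
]

def fake_fetch(offset, limit):
    # Stand-in for the real list-runs call; indexes pages by offset.
    return _pages[offset // limit]

def list_all_runs(fetch_page, limit=20):
    # Advance `offset` by `limit` each page until `hasMore` comes back false.
    runs, offset = [], 0
    while True:
        page = fetch_page(offset=offset, limit=limit)
        runs.extend(page["runs"])
        if not page.get("hasMore"):
            return runs
        offset += limit
```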
@@ -208,11 +210,12 @@ Get the output of a completed Databricks job run, including notebook results, er
 | Parameter | Type | Description |
 | --------- | ---- | ----------- |
 | `notebookOutput` | object | Notebook task output \(from dbutils.notebook.exit\(\)\) |
-|`result` | string | Value passed to dbutils.notebook.exit\(\) \(max 1 MB\) |
+|`result` | string | Value passed to dbutils.notebook.exit\(\) \(max 5 MB\) |
 |`truncated` | boolean | Whether the result was truncated |
 | `error` | string | Error message if the run failed or output is unavailable |
 | `errorTrace` | string | Error stack trace if available |
-| `logs` | string | Log output from the run if available |
+| `logs` | string | Log output \(last 5 MB\) from spark_jar, spark_python, or python_wheel tasks |
+| `logsTruncated` | boolean | Whether the log output was truncated |

 ### `databricks_list_clusters`

@@ -242,7 +245,7 @@ List all clusters in a Databricks workspace including their state, configuration
 |`autoscale` | object | Autoscaling configuration \(null for fixed-size clusters\) |
 |`minWorkers` | number | Minimum number of workers |
 |`maxWorkers` | number | Maximum number of workers |
-|`clusterSource` | string | Origin \(API, UI, JOB, MODELS, PIPELINE, SQL\) |
+|`clusterSource` | string | Origin \(API, UI, JOB, MODELS, PIPELINE, PIPELINE_MAINTENANCE, SQL\) |
 |`autoterminationMinutes` | number | Minutes of inactivity before auto-termination \(0 = disabled\) |
 |`startTime` | number | Cluster start timestamp \(epoch ms\) |

0 commit comments
