This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
```bash
# Run locally (debug mode, port 8050)
python app.py

# Deploy to Databricks Apps via Asset Bundle (dev target)
databricks bundle deploy
databricks bundle run api_explorer

# Deploy directly via CLI
databricks apps deploy databricks-api-explorer --source-code-path . --profile <your-profile>

# View production logs
databricks apps logs databricks-api-explorer --profile <your-profile>
```

There are no tests, linters, or build steps.
Three Python files do all the work. There is no database, no state beyond in-memory Dash stores, and no background tasks.
The entire UI and all server-side logic live in `app.py`. Key sections:

- `highlight_json_components()` — pure-Python JSON tokenizer that builds `html.Span`/`html.Button` trees (no `dangerouslySetInnerHTML`). Accepts optional `id_link_data` (chip list from `extract_chips`) to render ID field values as inline clickable `html.Button` elements.
- `build_response_panel()` — assembles the response view; passes chips to the JSON renderer.
- `build_param_form(endpoint, prefill=None)` — generates type-aware parameter inputs; the `prefill` dict overrides the default value for named params (used when navigating via an inline ID link).
- Callbacks are numbered 1–12 (with 8b inserted between 8 and 9).
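To make the renderer's shape concrete, here is a minimal sketch of the tokenizer idea. It emits `(css_class, text)` tuples instead of real `dash.html` components so it runs without Dash; the class names mirror `assets/style.css`, but the function name and tuple format are illustrative, not the repo's actual API.

```python
import json

# Hypothetical stand-in for highlight_json_components(): emits
# (css_class, text) tuples instead of html.Span/html.Button trees.
# Class names (jk/jv/jn/jb/jbn) mirror assets/style.css.
def tokenize_json(value, indent=0):
    pad = "  " * indent
    if isinstance(value, dict):
        yield (None, "{\n")
        for i, (k, v) in enumerate(value.items()):
            yield (None, pad + "  ")
            yield ("jk", json.dumps(k))   # object keys
            yield (None, ": ")
            yield from tokenize_json(v, indent + 1)
            if i < len(value) - 1:
                yield (None, ",")
            yield (None, "\n")
        yield (None, pad + "}")
    elif isinstance(value, list):
        yield (None, "[")
        for i, v in enumerate(value):
            yield from tokenize_json(v, indent)
            if i < len(value) - 1:
                yield (None, ", ")
        yield (None, "]")
    elif isinstance(value, bool):         # check bool before numbers
        yield ("jb", json.dumps(value))
    elif value is None:
        yield ("jbn", "null")
    elif isinstance(value, str):
        yield ("jv", json.dumps(value))
    else:
        yield ("jn", json.dumps(value))   # ints and floats

tokens = list(tokenize_json({"id": "abc", "count": 2, "ok": True}))
```

In the real renderer, each tuple would become an `html.Span` (or an `html.Button` when the key matches a chip's `id_field`).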
- Two catalogs: `API_CATALOG` (workspace APIs) and `ACCOUNT_API_CATALOG` (account-level APIs targeting `accounts.cloud.databricks.com`).
- Each endpoint has `id`, `method`, `path`, `params`, optional `path_params`, optional `body`. Account endpoints also carry `scope: "account"` when looked up via `get_endpoint_by_id`/`ENDPOINT_MAP`.
- `LIST_TO_GET`/`ACCOUNT_LIST_TO_GET` dicts map list-endpoint IDs to `(get_endpoint_id, list_key, id_field, param_name, label_field)`. `list_key=None` means the response is a bare JSON array.
- `extract_chips()` — reads both link maps, walks the API response, and returns chip dicts. The `id_field` key is what the JSON tree viewer uses to match keys by name.
- Adding a new endpoint: add it to the appropriate `*_CATALOG` and, if it's a list with a corresponding get, add it to the matching `*_LIST_TO_GET`.
- A scope switcher in the sidebar toggles between Workspace and Account views. Account API calls route through `_accounts_host()`, which derives the accounts console URL from the workspace host.
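A sketch of the list-to-get mapping and chip extraction, assuming made-up endpoint IDs and field names; only the tuple shape `(get_endpoint_id, list_key, id_field, param_name, label_field)` and the `list_key=None` convention come from the description above.

```python
# Illustrative entry: IDs and field names here are hypothetical.
LIST_TO_GET = {
    "list-clusters": ("get-cluster", "clusters", "cluster_id", "cluster_id", "cluster_name"),
}

def extract_chips(list_endpoint_id, response_data, link_map):
    entry = link_map.get(list_endpoint_id)
    if entry is None:
        return []
    get_id, list_key, id_field, param_name, label_field = entry
    # list_key=None means the response itself is a bare JSON array.
    items = response_data if list_key is None else response_data.get(list_key, [])
    chips = []
    for item in items:
        if isinstance(item, dict) and id_field in item:
            chips.append({
                "gid": get_id,           # target Get endpoint
                "par": param_name,       # param to prefill on navigation
                "val": item[id_field],   # the ID value itself
                "id_field": id_field,    # key name the tree viewer matches on
                "label": item.get(label_field),
            })
    return chips

chips = extract_chips(
    "list-clusters",
    {"clusters": [{"cluster_id": "c-1", "cluster_name": "dev"}]},
    LIST_TO_GET,
)
```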
- `IS_DATABRICKS_APP = bool(os.getenv("DATABRICKS_CLIENT_SECRET"))` — this flag switches the entire app between local and Databricks App mode at startup.
- `_resolve_conn(conn_config)` in `app.py` is the central dispatcher: when `IS_DATABRICKS_APP` is true it reads the `DATABRICKS_HOST` env var and the `x-forwarded-access-token` request header; otherwise it calls `resolve_local_connection(conn_config)`.
- `resolve_local_connection()` uses `databricks.sdk.core.Config(profile=...)` to resolve host + token for profile mode, or reads host/token directly for custom mode.
- `make_api_call()` returns a consistent dict: `{status_code, elapsed_ms, data, success, error, url}`.
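A simplified sketch of the dispatch, assuming headers are passed in explicitly (the real code reads them from the Flask request) and with `resolve_local_connection` stubbed so the sketch runs without the Databricks SDK:

```python
import os

IS_DATABRICKS_APP = bool(os.getenv("DATABRICKS_CLIENT_SECRET"))

def resolve_local_connection(conn_config):
    # Stub: the real version resolves profile mode via
    # databricks.sdk.core.Config(profile=...).
    if conn_config.get("mode") == "custom":
        return conn_config["host"], conn_config["token"]
    raise NotImplementedError("profile mode needs the Databricks SDK")

def _resolve_conn(conn_config, headers=None):
    if IS_DATABRICKS_APP:
        # Databricks Apps inject the host via env var and the user's
        # token via the x-forwarded-access-token request header.
        return os.environ["DATABRICKS_HOST"], (headers or {})["x-forwarded-access-token"]
    return resolve_local_connection(conn_config)

host, token = _resolve_conn(
    {"mode": "custom", "host": "https://example.cloud.databricks.com", "token": "t"}
)
```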
The `conn-config` store is the single source of truth for local connection state. Shape: `{"mode": "profile", "profile": "<name>"}` or `{"mode": "custom", "host": "...", "token": "..."}`. Callbacks that make API calls read it as a `State` and pass it to `_resolve_conn()`.
`version.txt` auto-increments on every `import app` (via `version.py`). Don't be surprised when it bumps during development; it's intentional.
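A plausible sketch of that pattern (not the repo's actual `version.py`): bump a counter file each time the module is imported. A temp directory stands in for the repo root so the sketch is self-contained.

```python
from pathlib import Path
import tempfile

def bump_version(version_file: Path) -> int:
    # Read the current counter (0 if the file doesn't exist yet),
    # increment it, and write it back.
    current = int(version_file.read_text()) if version_file.exists() else 0
    version_file.write_text(str(current + 1))
    return current + 1

tmp = Path(tempfile.mkdtemp()) / "version.txt"
first = bump_version(tmp)   # fresh file
second = bump_version(tmp)  # next "import"
```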
Dash 4.0.0 `allow_duplicate` constraint: `allow_duplicate=True` only works when the target output is the primary (first) output of another callback. Never add `allow_duplicate=True` to a secondary output — this causes a `KeyError: "Callback function not found for output '...'"` at runtime. `handle_id_link_click` uses `allow_duplicate=True` on `selected-endpoint.data` (first output of `select_endpoint`) and `response-container.children` (first output of `execute_api_call`); both are safe.
Sidebar button highlighting is decoupled from `select_endpoint`: `select_endpoint` (callback 8) only writes `selected-endpoint.data`. A separate `sync_active_button` callback (8b) watches `selected-endpoint` and derives `endpoint-btn.className` from it. This means any callback that updates `selected-endpoint` with `allow_duplicate=True` automatically gets correct sidebar highlighting for free, without needing `allow_duplicate` on the pattern-matching `className` output.
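The derivation itself is trivial, which is the point of the decoupling. A sketch, with hypothetical class names, of what the 8b callback body computes:

```python
# className for every sidebar button is derived from the
# selected-endpoint store alone, so no callback that writes that store
# needs allow_duplicate on the className outputs. Class names are
# hypothetical stand-ins for the real CSS classes.
def sync_active_button(selected, endpoint_ids):
    active_id = (selected or {}).get("id")
    return [
        "endpoint-btn active" if eid == active_id else "endpoint-btn"
        for eid in endpoint_ids
    ]

classes = sync_active_button({"id": "get-cluster"}, ["list-clusters", "get-cluster"])
```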
Inline ID links + `_prefill` flow: when a list API succeeds, `extract_chips()` builds a chip list. `highlight_json_components()` renders matching field values as `html.Button` elements with component IDs embedding `gid`, `par`, `val`. Clicking one fires `handle_id_link_click`, which: (1) calls the Get API and updates `response-container`; (2) writes `{...endpoint, "_prefill": {param_name: value}}` to `selected-endpoint`. That store change triggers both `sync_active_button` (highlights the Get button) and `render_endpoint_detail` (reads `_prefill`, calls `build_param_form(endpoint, prefill)` with the ID pre-filled).
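Dash pattern-matching component IDs are plain dicts, so `gid`, `par`, and `val` can ride along in the button's `id` and be read back in the click handler. A sketch of both halves of that round trip, assuming a hypothetical `"type"` key for the pattern match:

```python
def id_link_component_id(chip):
    # Component id for one inline ID-link button; the "type" value is a
    # hypothetical name for the pattern-matching family.
    return {"type": "id-link", "gid": chip["gid"], "par": chip["par"], "val": chip["val"]}

def prefilled_endpoint(endpoint, triggered_id):
    # What handle_id_link_click writes back to the selected-endpoint
    # store: the Get endpoint plus a _prefill dict for its param form.
    return {**endpoint, "_prefill": {triggered_id["par"]: triggered_id["val"]}}

cid = id_link_component_id({"gid": "get-cluster", "par": "cluster_id", "val": "c-1"})
updated = prefilled_endpoint({"id": "get-cluster", "method": "GET"}, cid)
```

In the real callback, `cid` would arrive as `ctx.triggered_id` rather than being passed in directly.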
Default CLI profile is resolved at startup in `auth.py` as the first profile found in `~/.databrickscfg` (falls back to `"DEFAULT"` if the file is absent or empty). No hardcoding needed.
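A sketch of that rule using stdlib `configparser` (the actual `auth.py` implementation may differ): first named section wins, `"DEFAULT"` when the file is absent or has no sections.

```python
import configparser
import tempfile
from pathlib import Path

def default_profile(cfg_path: Path) -> str:
    # First section of the ini-style config wins; fall back to "DEFAULT"
    # if the file is missing or empty. configparser treats a literal
    # [DEFAULT] section specially, so sections() lists only named ones.
    if not cfg_path.exists():
        return "DEFAULT"
    parser = configparser.ConfigParser()
    parser.read(cfg_path)
    return parser.sections()[0] if parser.sections() else "DEFAULT"

cfg = Path(tempfile.mkdtemp()) / ".databrickscfg"
cfg.write_text("[dev]\nhost = https://example.cloud.databricks.com\n\n[prod]\nhost = https://p.example.com\n")
profile = default_profile(cfg)
```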
- `assets/style.css` — all styling; uses CSS custom properties defined in `:root`. JSON syntax colors: `.jk` (keys, blue), `.jv` (strings, green), `.jn` (numbers, amber), `.jb` (booleans, purple), `.jbn` (null, muted).
- `assets/devtools_patch.js` — MutationObserver that removes the Plotly Cloud button from the Dash debug bar and injects workspace URL links. Watches `#host-display` for connection changes.