A dbt adapter for Altertable, backed by Arrow Flight SQL.
- Python 3.10+
- dbt-core >=1.8,<2.0
```shell
pip install dbt-altertable
```

Or with uv:

```shell
uv add dbt-altertable
```

Add a profile to `~/.dbt/profiles.yml`:
```yaml
my_project:
  target: dev
  outputs:
    dev:
      type: altertable
      username: your_username
      password: your_password
      database: your_database
      schema: your_schema
      host: flight.altertable.ai # optional, this is the default
      port: 443 # optional, this is the default
      tls: true # optional, this is the default
```

| Field | Required | Default | Description |
|---|---|---|---|
| `username` | yes | — | Altertable username |
| `password` | yes | — | Altertable password |
| `database` | yes | — | Target catalog name |
| `schema` | yes | — | Target schema name |
| `host` | no | `flight.altertable.ai` | Flight SQL endpoint host |
| `port` | no | `443` | Flight SQL endpoint port |
| `tls` | no | `true` | Use TLS for the Flight SQL connection |
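As a quick sanity check, the required fields and defaults above can be resolved with a small standalone script before running dbt (an illustrative helper, not part of the adapter):

```python
# Required fields and defaults from the profile table above.
REQUIRED = ("username", "password", "database", "schema")
DEFAULTS = {"host": "flight.altertable.ai", "port": 443, "tls": True}


def resolve_profile(target: dict) -> dict:
    """Fail fast on missing required fields, then fill in defaults."""
    missing = [field for field in REQUIRED if not target.get(field)]
    if missing:
        raise ValueError(f"profile is missing required fields: {missing}")
    return {**DEFAULTS, **target}


profile = resolve_profile({
    "type": "altertable",
    "username": "your_username",
    "password": "your_password",
    "database": "your_database",
    "schema": "your_schema",
})
print(profile["host"], profile["port"], profile["tls"])
```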
dbt models should use DuckDB-compatible SQL. Altertable executes queries via DuckDB, so all DuckDB SQL features and functions are available — see the DuckDB SQL reference.
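For example, a model can use DuckDB functions directly (a hypothetical model; the source, table, and column names are illustrative):

```sql
-- models/recent_signups.sql — relies on DuckDB's strftime and list() aggregate
select
    strftime(created_at, '%Y-%m') as signup_month,
    count(*) as signups,
    list(distinct plan) as plans_seen
from {{ source('app', 'users') }}
where created_at >= current_date - interval 90 day
group by 1
```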
When you enable [persist_docs](https://docs.getdbt.com/reference/resource-configs/persist_docs), dbt writes model and column `description` values to the warehouse using DuckDB's `COMMENT ON TABLE` / `COMMENT ON COLUMN` syntax (one statement per column, so Arrow Flight SQL accepts each round trip).
Enable it in `dbt_project.yml` or on a model:

```yaml
models:
  my_project:
    +persist_docs:
      relation: true
      columns: true
```

After a successful `dbt run`, descriptions show up in `duckdb_tables()` / `duckdb_columns()` (and therefore in `dbt docs generate` / catalog metadata).
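Under the hood this amounts to statements like the following (schema, table, and column names are illustrative):

```sql
COMMENT ON TABLE my_schema.my_model IS 'One row per customer order';
COMMENT ON COLUMN my_schema.my_model.order_id IS 'Primary key';
COMMENT ON COLUMN my_schema.my_model.amount IS 'Order total in cents';
```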
This project is managed with uv and hatchling.
```shell
git clone https://github.com/altertable-ai/dbt-altertable.git
cd dbt-altertable
uv sync --extra dev
```

Common tasks (see `Makefile`):
```shell
make lint       # ruff format + ruff check --fix
make typecheck  # ty check
make test       # pytest
make build      # uv build (wheel + sdist)
```

Flight SQL integration tests (for example `persist_docs`) run against altertable-mock in CI: the workflow builds the mock from `main`, starts it with `--user dbt_ci:dbt_ci_secret`, then runs `pytest tests/integration`. The default matrix job runs unit tests only (`pytest --ignore=tests/integration`).
Locally — Testcontainers (requires Docker; installs `testcontainers` via the `integration` extra):

```shell
ALTERTABLE_USE_TESTCONTAINERS=1 make test
```

`make test` detects that flag, runs `uv sync --extra dev --extra integration`, sets `TESTCONTAINERS_RYUK_DISABLED=true` by default (Ryuk often hangs on Docker Desktop), then runs the full pytest suite.
Or run only the integration tests:

```shell
uv sync --extra dev --extra integration
export TESTCONTAINERS_RYUK_DISABLED=${TESTCONTAINERS_RYUK_DISABLED:-true}
ALTERTABLE_USE_TESTCONTAINERS=1 uv run pytest tests/integration -v
```

Optional overrides:

- `ALTERTABLE_MOCK_IMAGE` (default `ghcr.io/altertable-ai/altertable-mock:latest`)
- `ALTERTABLE_MOCK_BOOT_USER` (default `dbt_ci:dbt_ci_secret`, passed as `--user` to the mock)
- `ALTERTABLE_MOCK_WAIT_TIMEOUT` (seconds to wait for `Starting Flight SQL server` in the logs, default `180`)
If `dbt run` fails against the published image (for example catalog or `information_schema` errors), build the mock from source and point the tests at it: `docker build -t altertable-mock:local .`, then `export ALTERTABLE_MOCK_IMAGE=altertable-mock:local`. CI builds from `main` for the same reason.
Real Altertable — set the `ALTERTABLE_TEST_*` variables to your Flight endpoint, catalog, and schema; do not set `ALTERTABLE_USE_TESTCONTAINERS`.
Optional pre-commit hooks:

```shell
uv run pre-commit install --hook-type pre-commit --hook-type commit-msg
```

Releases are managed via release-please — every push to `main` updates a rolling release PR. Merging it bumps the version, updates `CHANGELOG.md`, tags the release, and triggers PyPI publishing via trusted publishing.
This adapter draws on the design of dbt-duckdb.
MIT — see `LICENSE`.