I originally submitted this issue to dbt_external_tables. Perhaps it belongs in dbt-core instead.
Describe the bug
When I run dbt run-operation stage_external_sources, I get the error message:
Encountered an error while running operation: Database Error cross-database reference to database "sources" is not supported
because dbt tries running this command:
drop table if exists "sources"."src_pendo"."src_accounts" cascade
when it should be:
drop table if exists "sources.src_pendo.src_accounts" cascade
And, indeed, when I run the latter SQL command in Redshift Query Editor on my warehouse there's no error, whereas when I run the former SQL command I get the same error.
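To make the difference concrete, here is a small illustrative sketch (hypothetical helper names, not dbt's actual code) contrasting the two quoting strategies: quoting each identifier separately makes Redshift parse "sources" as a separate database and reject it as a cross-database reference, while quoting the whole dotted path avoids that error.

```python
# Hypothetical sketch contrasting the two quoting strategies behind the error.
# Neither function is part of dbt; they only reproduce the two SQL strings above.

def quote_parts(database: str, schema: str, table: str) -> str:
    """Quote each identifier on its own -- what dbt appears to emit."""
    return f'drop table if exists "{database}"."{schema}"."{table}" cascade'

def quote_whole(database: str, schema: str, table: str) -> str:
    """Quote the whole dotted name -- the form that succeeds in Query Editor."""
    return f'drop table if exists "{database}.{schema}.{table}" cascade'

print(quote_parts("sources", "src_pendo", "src_accounts"))
# drop table if exists "sources"."src_pendo"."src_accounts" cascade
print(quote_whole("sources", "src_pendo", "src_accounts"))
# drop table if exists "sources.src_pendo.src_accounts" cascade
```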
Steps to reproduce
I'm running all CLI commands in PowerShell.
My models/staging/pendo/src_pendo.yml file (following this example) looks like this (one column for brevity):
version: 2

sources:
  - name: s3_pendo
    database: sources
    schema: src_pendo
    loader: S3
    loaded_at_field: _sdc_batched_at
    tables:
      - name: src_accounts
        external:
          location: <s3 path>
          row_format: serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
          table_properties: "('skip.header.line.count'='1')"
          stored_as: parquet
        columns:
          - name: account_id
            data_type: varchar
            tests:
              - not_null
              - unique
<s3 path> contains CSV files.
Expected results
If I'm understanding correctly, the dbt run-operation stage_external_sources macro will create the external schema and table defined in src_pendo.yml in Redshift Spectrum, then populate the table with all data in the CSV files in <s3 path>. Let me know if this isn't the use case.
Actual results
See bug description.
Screenshots and log output
Full (PowerShell) output after running the macro:

System information
The contents of your packages.yml file:
packages:
  - package: dbt-labs/dbt_external_tables
    version: 0.8.0
Which database are you using dbt with? Redshift
The output of dbt --version:
installed version: 1.0.1
latest version: 1.0.1
Up to date!
Plugins:
- postgres: 1.0.1
- redshift: 1.0.0
The operating system you're using: Windows 10
The output of python --version: Python 3.9.6
Additional context
If this is truly a bug, I'll try submitting a PR.