Too many quotes in generated command #51

Open

SteadyGiant opened this issue Jan 12, 2022 · 0 comments
SteadyGiant commented Jan 12, 2022

I originally submitted this issue to dbt_external_tables. Perhaps it belongs in dbt-core instead.

Describe the bug

When I run dbt run-operation stage_external_sources, I get the error message:

Encountered an error while running operation: Database Error cross-database reference to database "sources" is not supported

because dbt tries running this command:

drop table if exists "sources"."src_pendo"."src_accounts" cascade

when it should be:

drop table if exists "sources.src_pendo.src_accounts" cascade

Indeed, when I run the latter command in the Redshift Query Editor against my warehouse there's no error, whereas the former command produces the same error as above.

Steps to reproduce

I'm running all CLI commands in PowerShell.

My models/staging/pendo/src_pendo.yml file (following this example) looks like this (one column for brevity):

version: 2

sources:
  - name: s3_pendo
    database: sources
    schema: src_pendo
    loader: S3
    loaded_at_field: _sdc_batched_at
    tables:
      - name: src_accounts
        external:
          location: <s3 path>
          row_format: serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
          table_properties: "('skip.header.line.count'='1')"
          stored_as: parquet
        columns:
          - name: account_id
            data_type: varchar
            tests:
              - not_null
              - unique

<s3 path> contains CSV files.

Expected results

If I'm understanding correctly, running dbt run-operation stage_external_sources should create the external schema and table defined in src_pendo.yml in Redshift Spectrum, then populate the table with all of the data in the CSV files at <s3 path>. Let me know if that isn't the intended use case.
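
For reference, this is roughly the Spectrum DDL I'd expect to end up with for the source above. It's a hand-written sketch, not the macro's actual output; the Glue database name, the IAM role, and stored as textfile are placeholders/assumptions on my part (the files at <s3 path> are CSV):

-- sketch only: approximate Redshift Spectrum DDL for the source in src_pendo.yml
-- '<glue database>', '<iam role arn>', and '<s3 path>' are placeholders
create external schema if not exists src_pendo
from data catalog
database '<glue database>'
iam_role '<iam role arn>'
create external database if not exists;

create external table src_pendo.src_accounts (
    account_id varchar
)
row format serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
stored as textfile
location '<s3 path>'
table properties ('skip.header.line.count'='1');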

Actual results

See bug description.

Screenshots and log output

Full (PowerShell) output after running the macro:

(screenshot of the full PowerShell output)

System information

The contents of your packages.yml file:

packages:
  - package: dbt-labs/dbt_external_tables
    version: 0.8.0

Which database are you using dbt with?

  • [x] redshift
  • [ ] snowflake
  • [ ] other (specify: ____________)

The output of dbt --version:

installed version: 1.0.1
   latest version: 1.0.1

Up to date!

Plugins:
  - postgres: 1.0.1
  - redshift: 1.0.0

The operating system you're using: Windows 10

The output of python --version: Python 3.9.6

Additional context

If this is truly a bug, I'll try submitting a PR.
