
[ADAP-366] [Bug] Issue with table materialization #365

Closed
2 tasks done
dave-connors-3 opened this issue Mar 10, 2023 · 19 comments · Fixed by #375
Assignees
Labels
type:bug Something isn't working

Comments

@dave-connors-3
Contributor

Is this a regression in a recent version of dbt-redshift?

  • I believe this is a regression in dbt-redshift functionality
  • I have searched the existing issues, and I could not find an existing issue for this regression

Current Behavior

When running dbt-redshift on 1.5.0b2, the backup relation step of the table materialization seems to fail.

21:24:23.321701 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator.stg_exposure_relationships"
21:24:23.322455 [debug] [Thread-1  ]: On model.dbt_project_evaluator.stg_exposure_relationships: alter table "ci"."dbt_dconnors"."stg_exposure_relationships" rename to "stg_exposure_relationships__dbt_backup"
21:24:23.376670 [debug] [Thread-1  ]: Redshift adapter: Redshift error: {'S': 'ERROR', 'C': '42P01', 'M': 'relation "dbt_dconnors.stg_exposure_relationships" does not exist', 'F': '../src/pg/src/backend/catalog/namespace.c', 'L': '262', 'R': 'LocalRangeVarGetRelid'}
21:24:23.377097 [debug] [Thread-1  ]: On model.dbt_project_evaluator.stg_exposure_relationships: ROLLBACK

dbt seems unable to find the relation in the load_cached_relation step, and throws a LocalRangeVarGetRelid error.

Expected/Previous Behavior

In 1.4.4 this same project runs without error!

Steps To Reproduce

  1. run a table model in v1.4-latest
  2. switch to 1.5.0b2
  3. run it again
  4. observe the error

Relevant log output

No response

Environment

- OS: Mac
- Python: Python 3.9.13
- dbt-core (working version): 1.4.4
- dbt-redshift (working version): 1.4.0
- dbt-core (regression version): 1.5.0-b3
- dbt-redshift (regression version): 1.5.0-b2

Additional Context

No response

@dave-connors-3 dave-connors-3 added type:bug Something isn't working triage:product labels Mar 10, 2023
@github-actions github-actions bot changed the title [Regression] Issue with table materialization [ADAP-366] [Regression] Issue with table materialization Mar 10, 2023
@dbeatty10 dbeatty10 self-assigned this Mar 10, 2023
@dbeatty10
Contributor

Thanks for reporting @dave-connors-3 !

I tried the following model and wasn't able to replicate:

models/my_model.sql

{{
  config(
    materialized='table',
  )
}}

select 1 as id

dbt run -s my_model

I know you are able to replicate this in the CI for dbt-project-evaluator though :)

Could you try tweaking the example above to see if you can get it to fail?

@dave-connors-3
Contributor Author

@dbeatty10 I have done you a great disservice! these are view models!!

@dbeatty10
Contributor

Lol, no prob @dave-connors-3 😂

I was able to reproduce a similar error to what you reported! 🎉

I used dbt-redshift==1.5.0b2 and dbt-core==1.5.0b2.

models/base_model.sql

{{
  config(
    materialized='table',
  )
}}

select 1 as id

models/my_model.sql

{{
  config(
    materialized='view',
  )
}}

select * from {{ ref('base_model') }}

Run it twice -- first time should work and second time should 💥

dbt run
dbt run

@dbeatty10 dbeatty10 removed their assignment Mar 10, 2023
@mikealfare
Contributor

@dbeatty10 To clarify on your reproduction, did you run the first dbt run in 1.4.latest and the second one in 1.5.0b2? Or did you literally just run it twice in the same session?

@dbeatty10
Contributor

@mikealfare ran both using the same virtual environment. Doing dbt run twice in succession worked for 1.4.latest, but then didn't for 1.5.0b2.

@dbeatty10
Contributor

I am going to re-label the subject line from "Regression" to "Bug" instead since we don't consider it a regression unless it shows up in a final release. Maybe the most accurate title would be "Pre-regression"? But we'll stick with "Bug" for today.

@dbeatty10 dbeatty10 changed the title [ADAP-366] [Regression] Issue with table materialization [ADAP-366] [Bug] Issue with table materialization Mar 10, 2023
@mikealfare
Contributor

Thanks @dbeatty10. That's how I read it too, which is slightly different from @dave-connors-3's version. The 1.4.latest is a red herring; we just can't run again in 1.5.latest, regardless of where the first run happens. And given how basic the example is, this seems like a pretty big issue.

@Fleid, I want to make sure you see this so that we can prioritize it. If I understand the issue correctly, we basically can't run dbt run twice on a view.

@b-per

b-per commented Mar 13, 2023

A bit of a guess, but could it be due to the fact that this commit changed class RedshiftAdapter(PostgresAdapter, SQLAdapter): to class RedshiftAdapter(SQLAdapter):?
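The effect of that base-class change can be illustrated with a toy sketch (hypothetical class and method names; the real adapters dispatch most of this through Jinja macros, not Python methods like these). Any behavior that RedshiftAdapter previously inherited from PostgresAdapter silently falls back to the generic SQLAdapter implementation once PostgresAdapter is removed from the bases:

```python
class SQLAdapter:
    def rename_sql(self, relation_type: str) -> str:
        # Generic implementation: always emits ALTER TABLE.
        return "alter table ... rename to ..."

class PostgresAdapter(SQLAdapter):
    def rename_sql(self, relation_type: str) -> str:
        # Postgres-flavored override knows views need different handling.
        if relation_type == "view":
            return "alter view ... rename to ..."
        return super().rename_sql(relation_type)

class OldRedshiftAdapter(PostgresAdapter, SQLAdapter):  # <= 1.4: inherits the override
    pass

class NewRedshiftAdapter(SQLAdapter):  # 1.5.0b2: override is gone
    pass

print(OldRedshiftAdapter().rename_sql("view"))  # view-aware path
print(NewRedshiftAdapter().rename_sql("view"))  # generic ALTER TABLE path
```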

@dbeatty10
Copy link
Contributor

@b-per I'm guessing you're right that they are related.

Context: dbt-redshift 1.5 is switching from the psycopg2 driver to the redshift-connector driver.

@Fleid
Contributor

Fleid commented Mar 13, 2023

@sathiish-kumar we will need your eyes on this one too!

@sathiish-kumar
Contributor

sathiish-kumar commented Mar 13, 2023

I troubleshot this a bit. Based on the examples above, I was able to repro the issue, but it's a bit puzzling how it was working in the first place in 1.4.0 (this may just be me not understanding things right).

When doing the second dbt run, the base_model table and my_model view already exist on the Redshift side. During the second run we're able to successfully rename the existing base_model table using an ALTER statement, but when we try a similar ALTER statement for the view, it fails. The specific macro in question is rename_relation. In my understanding, ALTER cannot be used to rename views in Redshift; they only work with CREATE OR REPLACE.
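A sketch of the statements involved, following the explanation above (hypothetical relation names, mirroring the example models; the statements dbt actually issued are in the debug logs further down):

```sql
-- Per the comment above, renaming the table succeeds:
alter table "dev"."dbt_bperigaud"."base_model__dbt_tmp" rename to "base_model";

-- ...while the equivalent rename attempt for the view fails:
-- alter table "dev"."dbt_bperigaud"."my_model__dbt_tmp" rename to "my_model";

-- A view would instead have to be redefined in place:
create or replace view "dev"."dbt_bperigaud"."my_model" as (
    select * from "dev"."dbt_bperigaud"."base_model"
);
```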

To understand how things work in 1.4.0, I would love access to the raw SQL that's logged when executing the rename_relation macro with the same models. @Fleid / @dbeatty10 would either of you have access to it? (I'm finding it a bit difficult to switch to the 1.4.0 version locally, unfortunately.)

@b-per

b-per commented Mar 14, 2023

What is the issue you're getting when going back to 1.4? Are you using a Python venv to install different dbt versions? (I would recommend doing it that way.)

For this one I generated the debug output for both 1.4 and 1.5.

Output on 1.4 (working)
❯ dbt --debug run -s +my_model -t red
============================== 2023-03-14 08:41:08.799495 | 936ce199-60e5-4559-92ec-33ad94f5a15d ==============================
08:41:08.799495 [info ] [MainThread]: Running with dbt=1.4.5
08:41:08.800707 [debug] [MainThread]: running dbt with arguments {'debug': True, 'write_json': True, 'use_colors': True, 'printer_width': 80, 'version_check': True, 'partial_parse': True, 'static_parser': True, 'profiles_dir': '/Users/bper/.dbt', 'send_anonymous_usage_stats': True, 'quiet': False, 'no_print': False, 'cache_selected_only': False, 'target': 'red', 'select': ['+my_model'], 'which': 'run', 'rpc_method': 'run', 'indirect_selection': 'eager'}
08:41:08.801355 [debug] [MainThread]: Tracking: tracking
08:41:08.827950 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x104fe3100>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x104fe3b50>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x104fe3730>]}
08:41:08.854643 [debug] [MainThread]: checksum: e0800a81b5377c92a65f92df2a69f48b40465b0f6236a335b5becef2794b124e, vars: {}, profile: None, target: red, version: 1.4.5
08:41:08.876127 [info ] [MainThread]: Unable to do partial parsing because of a version mismatch
08:41:08.876572 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'partial_parser', 'label': '936ce199-60e5-4559-92ec-33ad94f5a15d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x104fe38e0>]}
08:41:09.592922 [debug] [MainThread]: 1699: static parser successfully parsed staging/stg_model_5.sql
08:41:09.645133 [debug] [MainThread]: 1699: static parser successfully parsed staging/stg_model_4.sql
08:41:09.646751 [debug] [MainThread]: 1699: static parser successfully parsed staging/source_1/stg_model_3.sql
08:41:09.649622 [debug] [MainThread]: 1699: static parser successfully parsed staging/source_1/stg_model_2.sql
08:41:09.652099 [debug] [MainThread]: 1699: static parser successfully parsed staging/source_1/stg_model_1.sql
08:41:09.654112 [debug] [MainThread]: 1603: static parser failed on marts/fct_model_6.sql
08:41:09.658160 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/fct_model_6.sql
08:41:09.659164 [debug] [MainThread]: 1699: static parser successfully parsed marts/model_8.sql
08:41:09.660687 [debug] [MainThread]: 1699: static parser successfully parsed marts/fct_model_9.sql
08:41:09.662481 [debug] [MainThread]: 1699: static parser successfully parsed marts/int_model_4.sql
08:41:09.664418 [debug] [MainThread]: 1699: static parser successfully parsed marts/int_model_5.sql
08:41:09.676061 [debug] [MainThread]: 1699: static parser successfully parsed marts/intermediate/dim_model_7.sql
08:41:09.683441 [debug] [MainThread]: 1699: static parser successfully parsed del/my_model.sql
08:41:09.686660 [debug] [MainThread]: 1699: static parser successfully parsed del/base_model.sql
08:41:09.690422 [debug] [MainThread]: 1699: static parser successfully parsed reports/report_3.sql
08:41:09.693121 [debug] [MainThread]: 1699: static parser successfully parsed reports/report_2.sql
08:41:09.695597 [debug] [MainThread]: 1699: static parser successfully parsed reports/report_1.sql
08:41:09.754221 [debug] [MainThread]: 1603: static parser failed on staging/graph/stg_nodes.sql
08:41:09.774933 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/stg_nodes.sql
08:41:09.776164 [debug] [MainThread]: 1603: static parser failed on staging/graph/stg_exposure_relationships.sql
08:41:09.784691 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/stg_exposure_relationships.sql
08:41:09.785715 [debug] [MainThread]: 1603: static parser failed on staging/graph/stg_node_relationships.sql
08:41:09.789668 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/stg_node_relationships.sql
08:41:09.791415 [debug] [MainThread]: 1603: static parser failed on staging/graph/stg_sources.sql
08:41:09.804089 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/stg_sources.sql
08:41:09.805501 [debug] [MainThread]: 1603: static parser failed on staging/graph/stg_exposures.sql
08:41:09.814467 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/stg_exposures.sql
08:41:09.815553 [debug] [MainThread]: 1603: static parser failed on staging/graph/stg_metric_relationships.sql
08:41:09.819351 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/stg_metric_relationships.sql
08:41:09.820541 [debug] [MainThread]: 1603: static parser failed on staging/graph/stg_metrics.sql
08:41:09.832849 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/stg_metrics.sql
08:41:09.834006 [debug] [MainThread]: 1603: static parser failed on staging/graph/base/base_node_relationships.sql
08:41:09.844624 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/base/base_node_relationships.sql
08:41:09.845864 [debug] [MainThread]: 1603: static parser failed on staging/graph/base/base_exposure_relationships.sql
08:41:09.849819 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/base/base_exposure_relationships.sql
08:41:09.850850 [debug] [MainThread]: 1603: static parser failed on staging/graph/base/base_metric_relationships.sql
08:41:09.854885 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/graph/base/base_metric_relationships.sql
08:41:09.856267 [debug] [MainThread]: 1603: static parser failed on staging/variables/stg_naming_convention_folders.sql
08:41:09.865538 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/variables/stg_naming_convention_folders.sql
08:41:09.866607 [debug] [MainThread]: 1603: static parser failed on staging/variables/stg_naming_convention_prefixes.sql
08:41:09.872718 [debug] [MainThread]: 1602: parser fallback to jinja rendering on staging/variables/stg_naming_convention_prefixes.sql
08:41:09.874363 [debug] [MainThread]: 1603: static parser failed on marts/documentation/fct_undocumented_models.sql
08:41:09.884079 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/documentation/fct_undocumented_models.sql
08:41:09.885936 [debug] [MainThread]: 1603: static parser failed on marts/documentation/fct_documentation_coverage.sql
08:41:09.893727 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/documentation/fct_documentation_coverage.sql
08:41:09.894840 [debug] [MainThread]: 1603: static parser failed on marts/core/int_direct_relationships.sql
08:41:09.898809 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/core/int_direct_relationships.sql
08:41:09.899953 [debug] [MainThread]: 1603: static parser failed on marts/core/int_all_dag_relationships.sql
08:41:09.908978 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/core/int_all_dag_relationships.sql
08:41:09.910634 [debug] [MainThread]: 1603: static parser failed on marts/core/int_all_graph_resources.sql
08:41:09.935191 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/core/int_all_graph_resources.sql
08:41:09.936323 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_multiple_sources_joined.sql
08:41:09.944223 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_multiple_sources_joined.sql
08:41:09.945296 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_root_models.sql
08:41:09.948241 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_root_models.sql
08:41:09.949223 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_model_fanout.sql
08:41:09.953047 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_model_fanout.sql
08:41:09.954080 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_unused_sources.sql
08:41:09.957547 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_unused_sources.sql
08:41:09.958598 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_source_fanout.sql
08:41:09.962523 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_source_fanout.sql
08:41:09.965008 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_direct_join_to_source.sql
08:41:09.970046 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_direct_join_to_source.sql
08:41:09.971477 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_staging_dependent_on_staging.sql
08:41:09.978024 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_staging_dependent_on_staging.sql
08:41:09.979337 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_staging_dependent_on_marts_or_intermediate.sql
08:41:09.982931 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_staging_dependent_on_marts_or_intermediate.sql
08:41:09.983938 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_marts_or_intermediate_dependent_on_source.sql
08:41:09.986995 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_marts_or_intermediate_dependent_on_source.sql
08:41:09.988083 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_hard_coded_references.sql
08:41:09.991973 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_hard_coded_references.sql
08:41:09.993024 [debug] [MainThread]: 1603: static parser failed on marts/dag/fct_rejoining_of_upstream_concepts.sql
08:41:09.996391 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/dag/fct_rejoining_of_upstream_concepts.sql
08:41:09.997378 [debug] [MainThread]: 1603: static parser failed on marts/tests/fct_missing_primary_key_tests.sql
08:41:10.000531 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/tests/fct_missing_primary_key_tests.sql
08:41:10.001596 [debug] [MainThread]: 1603: static parser failed on marts/tests/fct_test_coverage.sql
08:41:10.007247 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/tests/fct_test_coverage.sql
08:41:10.008581 [debug] [MainThread]: 1603: static parser failed on marts/tests/intermediate/int_model_test_summary.sql
08:41:10.013938 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/tests/intermediate/int_model_test_summary.sql
08:41:10.015596 [debug] [MainThread]: 1603: static parser failed on marts/performance/fct_exposure_parents_materializations.sql
08:41:10.019672 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/performance/fct_exposure_parents_materializations.sql
08:41:10.021048 [debug] [MainThread]: 1603: static parser failed on marts/performance/fct_chained_views_dependencies.sql
08:41:10.025012 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/performance/fct_chained_views_dependencies.sql
08:41:10.026335 [debug] [MainThread]: 1603: static parser failed on marts/structure/fct_source_directories.sql
08:41:10.030224 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/structure/fct_source_directories.sql
08:41:10.031283 [debug] [MainThread]: 1603: static parser failed on marts/structure/fct_test_directories.sql
08:41:10.034977 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/structure/fct_test_directories.sql
08:41:10.036169 [debug] [MainThread]: 1603: static parser failed on marts/structure/fct_model_naming_conventions.sql
08:41:10.042010 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/structure/fct_model_naming_conventions.sql
08:41:10.043364 [debug] [MainThread]: 1603: static parser failed on marts/structure/fct_model_directories.sql
08:41:10.047829 [debug] [MainThread]: 1602: parser fallback to jinja rendering on marts/structure/fct_model_directories.sql
08:41:10.332478 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': '936ce199-60e5-4559-92ec-33ad94f5a15d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x105221700>]}
08:41:10.347846 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': '936ce199-60e5-4559-92ec-33ad94f5a15d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10512e160>]}
08:41:10.348244 [info ] [MainThread]: Found 53 models, 59 tests, 0 snapshots, 0 analyses, 481 macros, 0 operations, 20 seed files, 5 sources, 1 exposure, 1 metric
08:41:10.348444 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '936ce199-60e5-4559-92ec-33ad94f5a15d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1053ec370>]}
08:41:10.350705 [info ] [MainThread]: 
08:41:10.351848 [debug] [MainThread]: Acquiring new redshift connection 'master'
08:41:10.352756 [debug] [ThreadPool]: Acquiring new redshift connection 'list_dev'
08:41:10.361183 [debug] [ThreadPool]: Using redshift connection "list_dev"
08:41:10.361528 [debug] [ThreadPool]: On list_dev: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "connection_name": "list_dev"} */

    select distinct nspname from pg_namespace
  
08:41:10.361729 [debug] [ThreadPool]: Opening a new connection, currently in state init
08:41:10.361954 [debug] [ThreadPool]: Redshift adapter: Connecting to Redshift using 'database' credentials
08:41:12.546256 [debug] [ThreadPool]: SQL status: SELECT in 2 seconds
08:41:12.554911 [debug] [ThreadPool]: On list_dev: Close
08:41:12.563884 [debug] [ThreadPool]: Acquiring new redshift connection 'list_dev_dbt_bperigaud'
08:41:12.583035 [debug] [ThreadPool]: Using redshift connection "list_dev_dbt_bperigaud"
08:41:12.583609 [debug] [ThreadPool]: On list_dev_dbt_bperigaud: BEGIN
08:41:12.584435 [debug] [ThreadPool]: Opening a new connection, currently in state closed
08:41:12.585634 [debug] [ThreadPool]: Redshift adapter: Connecting to Redshift using 'database' credentials
08:41:13.531033 [debug] [ThreadPool]: SQL status: BEGIN in 1 seconds
08:41:13.531825 [debug] [ThreadPool]: Using redshift connection "list_dev_dbt_bperigaud"
08:41:13.532314 [debug] [ThreadPool]: On list_dev_dbt_bperigaud: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "connection_name": "list_dev_dbt_bperigaud"} */
select
      'dev' as database,
      tablename as name,
      schemaname as schema,
      'table' as type
    from pg_tables
    where schemaname ilike 'dbt_bperigaud'
    union all
    select
      'dev' as database,
      viewname as name,
      schemaname as schema,
      'view' as type
    from pg_views
    where schemaname ilike 'dbt_bperigaud'
  
08:41:13.717134 [debug] [ThreadPool]: SQL status: SELECT in 0 seconds
08:41:13.722850 [debug] [ThreadPool]: On list_dev_dbt_bperigaud: ROLLBACK
08:41:13.826579 [debug] [ThreadPool]: On list_dev_dbt_bperigaud: Close
08:41:13.845371 [debug] [MainThread]: Using redshift connection "master"
08:41:13.845793 [debug] [MainThread]: On master: BEGIN
08:41:13.846169 [debug] [MainThread]: Opening a new connection, currently in state init
08:41:13.846439 [debug] [MainThread]: Redshift adapter: Connecting to Redshift using 'database' credentials
08:41:14.850222 [debug] [MainThread]: SQL status: BEGIN in 1 seconds
08:41:14.854581 [debug] [MainThread]: Using redshift connection "master"
08:41:14.855247 [debug] [MainThread]: On master: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "connection_name": "master"} */
with relation as (
        select
            pg_rewrite.ev_class as class,
            pg_rewrite.oid as id
        from pg_rewrite
    ),
    class as (
        select
            oid as id,
            relname as name,
            relnamespace as schema,
            relkind as kind
        from pg_class
    ),
    dependency as (
        select distinct
            pg_depend.objid as id,
            pg_depend.refobjid as ref
        from pg_depend
    ),
    schema as (
        select
            pg_namespace.oid as id,
            pg_namespace.nspname as name
        from pg_namespace
        where nspname != 'information_schema' and nspname not like 'pg\_%'
    ),
    referenced as (
        select
            relation.id AS id,
            referenced_class.name ,
            referenced_class.schema ,
            referenced_class.kind
        from relation
        join class as referenced_class on relation.class=referenced_class.id
        where referenced_class.kind in ('r', 'v')
    ),
    relationships as (
        select
            referenced.name as referenced_name,
            referenced.schema as referenced_schema_id,
            dependent_class.name as dependent_name,
            dependent_class.schema as dependent_schema_id,
            referenced.kind as kind
        from referenced
        join dependency on referenced.id=dependency.id
        join class as dependent_class on dependency.ref=dependent_class.id
        where
            (referenced.name != dependent_class.name or
             referenced.schema != dependent_class.schema)
    )

    select
        referenced_schema.name as referenced_schema,
        relationships.referenced_name as referenced_name,
        dependent_schema.name as dependent_schema,
        relationships.dependent_name as dependent_name
    from relationships
    join schema as dependent_schema on relationships.dependent_schema_id=dependent_schema.id
    join schema as referenced_schema on relationships.referenced_schema_id=referenced_schema.id
    group by referenced_schema, referenced_name, dependent_schema, dependent_name
    order by referenced_schema, referenced_name, dependent_schema, dependent_name;
08:41:15.461432 [debug] [MainThread]: SQL status: SELECT in 1 seconds
08:41:15.474389 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '936ce199-60e5-4559-92ec-33ad94f5a15d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x105221550>]}
08:41:15.476924 [debug] [MainThread]: On master: ROLLBACK
08:41:15.586027 [debug] [MainThread]: Using redshift connection "master"
08:41:15.587263 [debug] [MainThread]: On master: BEGIN
08:41:15.793060 [debug] [MainThread]: SQL status: BEGIN in 0 seconds
08:41:15.794864 [debug] [MainThread]: On master: COMMIT
08:41:15.796097 [debug] [MainThread]: Using redshift connection "master"
08:41:15.797028 [debug] [MainThread]: On master: COMMIT
08:41:15.899530 [debug] [MainThread]: SQL status: COMMIT in 0 seconds
08:41:15.900790 [debug] [MainThread]: On master: Close
08:41:15.904006 [info ] [MainThread]: Concurrency: 4 threads (target='red')
08:41:15.905442 [info ] [MainThread]: 
08:41:15.917693 [debug] [Thread-1  ]: Began running node model.dbt_project_evaluator_integration_tests.base_model
08:41:15.919119 [info ] [Thread-1  ]: 1 of 2 START sql table model dbt_bperigaud.base_model .......................... [RUN]
08:41:15.920613 [debug] [Thread-1  ]: Acquiring new redshift connection 'model.dbt_project_evaluator_integration_tests.base_model'
08:41:15.921355 [debug] [Thread-1  ]: Began compiling node model.dbt_project_evaluator_integration_tests.base_model
08:41:15.940406 [debug] [Thread-1  ]: Writing injected SQL for node "model.dbt_project_evaluator_integration_tests.base_model"
08:41:15.947232 [debug] [Thread-1  ]: Timing info for model.dbt_project_evaluator_integration_tests.base_model (compile): 2023-03-14 08:41:15.921818 => 2023-03-14 08:41:15.946907
08:41:15.948061 [debug] [Thread-1  ]: Began executing node model.dbt_project_evaluator_integration_tests.base_model
08:41:15.988176 [debug] [Thread-1  ]: Writing runtime sql for node "model.dbt_project_evaluator_integration_tests.base_model"
08:41:15.989123 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:15.989389 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: BEGIN
08:41:15.989591 [debug] [Thread-1  ]: Opening a new connection, currently in state closed
08:41:15.989818 [debug] [Thread-1  ]: Redshift adapter: Connecting to Redshift using 'database' credentials
08:41:16.983688 [debug] [Thread-1  ]: SQL status: BEGIN in 1 seconds
08:41:16.984692 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:16.985333 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.base_model"} */

  
    

  create  table
    "dev"."dbt_bperigaud"."base_model__dbt_tmp"
    
    
    
  as (
    

select 1 as id
  );
  
08:41:17.129461 [debug] [Thread-1  ]: SQL status: SELECT in 0 seconds
08:41:17.145729 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:17.146450 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.base_model"} */
alter table "dev"."dbt_bperigaud"."base_model__dbt_tmp" rename to "base_model"
08:41:17.252661 [debug] [Thread-1  ]: SQL status: ALTER TABLE in 0 seconds
08:41:17.268040 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:17.268399 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:17.268629 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:17.428340 [debug] [Thread-1  ]: SQL status: COMMIT in 0 seconds
08:41:17.430475 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:17.431216 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: BEGIN
08:41:17.532585 [debug] [Thread-1  ]: SQL status: BEGIN in 0 seconds
08:41:17.537647 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:17.538024 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.base_model"} */
drop table if exists "dev"."dbt_bperigaud"."base_model__dbt_backup" cascade
08:41:17.639517 [debug] [Thread-1  ]: SQL status: DROP TABLE in 0 seconds
08:41:17.641209 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:17.641672 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:17.642028 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:17.750470 [debug] [Thread-1  ]: SQL status: COMMIT in 0 seconds
08:41:17.751163 [debug] [Thread-1  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:17.751599 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: BEGIN
08:41:17.855362 [debug] [Thread-1  ]: SQL status: BEGIN in 0 seconds
08:41:17.857795 [debug] [Thread-1  ]: Timing info for model.dbt_project_evaluator_integration_tests.base_model (execute): 2023-03-14 08:41:15.948616 => 2023-03-14 08:41:17.857530
08:41:17.858792 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: ROLLBACK
08:41:17.960038 [debug] [Thread-1  ]: On model.dbt_project_evaluator_integration_tests.base_model: Close
08:41:17.962403 [debug] [Thread-1  ]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '936ce199-60e5-4559-92ec-33ad94f5a15d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x106931400>]}
08:41:17.963579 [info ] [Thread-1  ]: 1 of 2 OK created sql table model dbt_bperigaud.base_model ..................... [SELECT in 2.04s]
08:41:17.966462 [debug] [Thread-1  ]: Finished running node model.dbt_project_evaluator_integration_tests.base_model
08:41:17.969203 [debug] [Thread-3  ]: Began running node model.dbt_project_evaluator_integration_tests.my_model
08:41:17.970101 [info ] [Thread-3  ]: 2 of 2 START sql view model dbt_bperigaud.my_model ............................. [RUN]
08:41:17.971356 [debug] [Thread-3  ]: Acquiring new redshift connection 'model.dbt_project_evaluator_integration_tests.my_model'
08:41:17.971898 [debug] [Thread-3  ]: Began compiling node model.dbt_project_evaluator_integration_tests.my_model
08:41:17.977614 [debug] [Thread-3  ]: Writing injected SQL for node "model.dbt_project_evaluator_integration_tests.my_model"
08:41:17.978806 [debug] [Thread-3  ]: Timing info for model.dbt_project_evaluator_integration_tests.my_model (compile): 2023-03-14 08:41:17.972272 => 2023-03-14 08:41:17.978720
08:41:17.979738 [debug] [Thread-3  ]: Began executing node model.dbt_project_evaluator_integration_tests.my_model
08:41:18.057999 [debug] [Thread-3  ]: Writing runtime sql for node "model.dbt_project_evaluator_integration_tests.my_model"
08:41:18.058638 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:18.058862 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: BEGIN
08:41:18.059056 [debug] [Thread-3  ]: Opening a new connection, currently in state init
08:41:18.059234 [debug] [Thread-3  ]: Redshift adapter: Connecting to Redshift using 'database' credentials
08:41:18.990605 [debug] [Thread-3  ]: SQL status: BEGIN in 1 seconds
08:41:18.992909 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:18.993738 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.my_model"} */


  create view "dev"."dbt_bperigaud"."my_model__dbt_tmp" as (
    

select * from "dev"."dbt_bperigaud"."base_model"
  ) ;

08:41:19.115287 [debug] [Thread-3  ]: SQL status: CREATE VIEW in 0 seconds
08:41:19.123925 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:19.124871 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.my_model"} */
alter table "dev"."dbt_bperigaud"."my_model__dbt_tmp" rename to "my_model"
08:41:19.237433 [debug] [Thread-3  ]: SQL status: ALTER TABLE in 0 seconds
08:41:19.243051 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: COMMIT
08:41:19.244066 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:19.244662 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: COMMIT
08:41:19.523877 [debug] [Thread-3  ]: SQL status: COMMIT in 0 seconds
08:41:19.526355 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:19.527304 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: BEGIN
08:41:19.657522 [debug] [Thread-3  ]: SQL status: BEGIN in 0 seconds
08:41:19.664436 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:19.665269 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.my_model"} */
drop view if exists "dev"."dbt_bperigaud"."my_model__dbt_backup" cascade
08:41:20.580062 [debug] [Thread-3  ]: SQL status: DROP VIEW in 1 seconds
08:41:20.583934 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: COMMIT
08:41:20.585145 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:20.585945 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: COMMIT
08:41:20.724743 [debug] [Thread-3  ]: SQL status: COMMIT in 0 seconds
08:41:20.726817 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:20.728439 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: BEGIN
08:41:20.833443 [debug] [Thread-3  ]: SQL status: BEGIN in 0 seconds
08:41:20.836930 [debug] [Thread-3  ]: Timing info for model.dbt_project_evaluator_integration_tests.my_model (execute): 2023-03-14 08:41:17.980423 => 2023-03-14 08:41:20.836790
08:41:20.838366 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: ROLLBACK
08:41:20.944209 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: Close
08:41:20.947317 [debug] [Thread-3  ]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '936ce199-60e5-4559-92ec-33ad94f5a15d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1057ad490>]}
08:41:20.948667 [info ] [Thread-3  ]: 2 of 2 OK created sql view model dbt_bperigaud.my_model ........................ [CREATE VIEW in 2.98s]
08:41:20.949746 [debug] [Thread-3  ]: Finished running node model.dbt_project_evaluator_integration_tests.my_model
08:41:20.953021 [debug] [MainThread]: Acquiring new redshift connection 'master'
08:41:20.953825 [debug] [MainThread]: Using redshift connection "master"
08:41:20.954344 [debug] [MainThread]: On master: BEGIN
08:41:20.955089 [debug] [MainThread]: Opening a new connection, currently in state closed
08:41:20.956055 [debug] [MainThread]: Redshift adapter: Connecting to Redshift using 'database' credentials
08:41:21.950814 [debug] [MainThread]: SQL status: BEGIN in 1 seconds
08:41:21.952683 [debug] [MainThread]: On master: COMMIT
08:41:21.953704 [debug] [MainThread]: Using redshift connection "master"
08:41:21.954586 [debug] [MainThread]: On master: COMMIT
08:41:22.059697 [debug] [MainThread]: SQL status: COMMIT in 0 seconds
08:41:22.060794 [debug] [MainThread]: On master: Close
08:41:22.062918 [debug] [MainThread]: Connection 'master' was properly closed.
08:41:22.063832 [debug] [MainThread]: Connection 'model.dbt_project_evaluator_integration_tests.base_model' was properly closed.
08:41:22.064571 [debug] [MainThread]: Connection 'model.dbt_project_evaluator_integration_tests.my_model' was properly closed.
08:41:22.065668 [info ] [MainThread]: 
08:41:22.066482 [info ] [MainThread]: Finished running 1 table model, 1 view model in 0 hours 0 minutes and 11.71 seconds (11.71s).
08:41:22.067558 [debug] [MainThread]: Command end result
08:41:22.099446 [info ] [MainThread]: 
08:41:22.100200 [info ] [MainThread]: Completed successfully
08:41:22.100880 [info ] [MainThread]: 
08:41:22.101686 [info ] [MainThread]: Done. PASS=2 WARN=0 ERROR=0 SKIP=0 TOTAL=2
08:41:22.102843 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1051e1f70>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1051e1ee0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1051e1a30>]}
08:41:22.105123 [debug] [MainThread]: Flushing usage events
Output on 1.5 (failing)
❯ dbt --debug run -s +my_model -t red
08:41:33  Running with dbt=1.5.0-b3
08:41:33  running dbt with arguments {'profiles_dir': '/Users/bper/.dbt', 'use_experimental_parser': 'False', 'partial_parse': 'True', 'version_check': 'True', 'target_path': 'None', 'fail_fast': 'False', 'log_path': '/Users/bper/dev/pro-serv-dag-auditing/integration_tests/logs', 'log_format': 'default', 'use_colors': 'True', 'log_cache_events': 'False', 'warn_error': 'None', 'printer_width': '80', 'send_anonymous_usage_stats': 'True', 'static_parser': 'True', 'no_print': 'None', 'debug': 'True', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'write_json': 'True', 'indirect_selection': 'eager', 'cache_selected_only': 'False', 'quiet': 'False'}
08:41:33  checksum: ea32a27705f94e4720613ac0a5c6e0a175d13f06ab24f6e49c5a7ba04dc217a3, vars: {}, profile: None, target: red, version: 1.5.0b3
08:41:33  Unable to do partial parsing because of a version mismatch
08:41:33  Sending event: {'category': 'dbt', 'action': 'partial_parser', 'label': 'e99cd034-f0fe-4755-9ee0-1ab323b207ba', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b9f7730>]}
08:41:33  1699: static parser successfully parsed staging/stg_model_5.sql
08:41:33  1699: static parser successfully parsed staging/stg_model_4.sql
08:41:33  1699: static parser successfully parsed staging/source_1/stg_model_3.sql
08:41:33  1699: static parser successfully parsed staging/source_1/stg_model_2.sql
08:41:33  1699: static parser successfully parsed staging/source_1/stg_model_1.sql
08:41:33  1603: static parser failed on marts/fct_model_6.sql
08:41:33  1602: parser fallback to jinja rendering on marts/fct_model_6.sql
08:41:33  1699: static parser successfully parsed marts/model_8.sql
08:41:33  1699: static parser successfully parsed marts/fct_model_9.sql
08:41:33  1699: static parser successfully parsed marts/int_model_4.sql
08:41:33  1699: static parser successfully parsed marts/int_model_5.sql
08:41:33  1699: static parser successfully parsed marts/intermediate/dim_model_7.sql
08:41:33  1699: static parser successfully parsed del/my_model.sql
08:41:33  1699: static parser successfully parsed del/base_model.sql
08:41:33  1699: static parser successfully parsed reports/report_3.sql
08:41:33  1699: static parser successfully parsed reports/report_2.sql
08:41:33  1699: static parser successfully parsed reports/report_1.sql
08:41:34  1603: static parser failed on staging/graph/stg_nodes.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/stg_nodes.sql
08:41:34  1603: static parser failed on staging/graph/stg_exposure_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/stg_exposure_relationships.sql
08:41:34  1603: static parser failed on staging/graph/stg_node_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/stg_node_relationships.sql
08:41:34  1603: static parser failed on staging/graph/stg_sources.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/stg_sources.sql
08:41:34  1603: static parser failed on staging/graph/stg_exposures.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/stg_exposures.sql
08:41:34  1603: static parser failed on staging/graph/stg_metric_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/stg_metric_relationships.sql
08:41:34  1603: static parser failed on staging/graph/stg_metrics.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/stg_metrics.sql
08:41:34  1603: static parser failed on staging/graph/base/base_node_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/base/base_node_relationships.sql
08:41:34  1603: static parser failed on staging/graph/base/base_exposure_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/base/base_exposure_relationships.sql
08:41:34  1603: static parser failed on staging/graph/base/base_metric_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on staging/graph/base/base_metric_relationships.sql
08:41:34  1603: static parser failed on staging/variables/stg_naming_convention_folders.sql
08:41:34  1602: parser fallback to jinja rendering on staging/variables/stg_naming_convention_folders.sql
08:41:34  1603: static parser failed on staging/variables/stg_naming_convention_prefixes.sql
08:41:34  1602: parser fallback to jinja rendering on staging/variables/stg_naming_convention_prefixes.sql
08:41:34  1603: static parser failed on marts/documentation/fct_undocumented_models.sql
08:41:34  1602: parser fallback to jinja rendering on marts/documentation/fct_undocumented_models.sql
08:41:34  1603: static parser failed on marts/documentation/fct_documentation_coverage.sql
08:41:34  1602: parser fallback to jinja rendering on marts/documentation/fct_documentation_coverage.sql
08:41:34  1603: static parser failed on marts/core/int_direct_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on marts/core/int_direct_relationships.sql
08:41:34  1603: static parser failed on marts/core/int_all_dag_relationships.sql
08:41:34  1602: parser fallback to jinja rendering on marts/core/int_all_dag_relationships.sql
08:41:34  1603: static parser failed on marts/core/int_all_graph_resources.sql
08:41:34  1602: parser fallback to jinja rendering on marts/core/int_all_graph_resources.sql
08:41:34  1603: static parser failed on marts/dag/fct_multiple_sources_joined.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_multiple_sources_joined.sql
08:41:34  1603: static parser failed on marts/dag/fct_root_models.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_root_models.sql
08:41:34  1603: static parser failed on marts/dag/fct_model_fanout.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_model_fanout.sql
08:41:34  1603: static parser failed on marts/dag/fct_unused_sources.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_unused_sources.sql
08:41:34  1603: static parser failed on marts/dag/fct_source_fanout.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_source_fanout.sql
08:41:34  1603: static parser failed on marts/dag/fct_direct_join_to_source.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_direct_join_to_source.sql
08:41:34  1603: static parser failed on marts/dag/fct_staging_dependent_on_staging.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_staging_dependent_on_staging.sql
08:41:34  1603: static parser failed on marts/dag/fct_staging_dependent_on_marts_or_intermediate.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_staging_dependent_on_marts_or_intermediate.sql
08:41:34  1603: static parser failed on marts/dag/fct_marts_or_intermediate_dependent_on_source.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_marts_or_intermediate_dependent_on_source.sql
08:41:34  1603: static parser failed on marts/dag/fct_hard_coded_references.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_hard_coded_references.sql
08:41:34  1603: static parser failed on marts/dag/fct_rejoining_of_upstream_concepts.sql
08:41:34  1602: parser fallback to jinja rendering on marts/dag/fct_rejoining_of_upstream_concepts.sql
08:41:34  1603: static parser failed on marts/tests/fct_missing_primary_key_tests.sql
08:41:34  1602: parser fallback to jinja rendering on marts/tests/fct_missing_primary_key_tests.sql
08:41:34  1603: static parser failed on marts/tests/fct_test_coverage.sql
08:41:34  1602: parser fallback to jinja rendering on marts/tests/fct_test_coverage.sql
08:41:34  1603: static parser failed on marts/tests/intermediate/int_model_test_summary.sql
08:41:34  1602: parser fallback to jinja rendering on marts/tests/intermediate/int_model_test_summary.sql
08:41:34  1603: static parser failed on marts/performance/fct_exposure_parents_materializations.sql
08:41:34  1602: parser fallback to jinja rendering on marts/performance/fct_exposure_parents_materializations.sql
08:41:34  1603: static parser failed on marts/performance/fct_chained_views_dependencies.sql
08:41:34  1602: parser fallback to jinja rendering on marts/performance/fct_chained_views_dependencies.sql
08:41:34  1603: static parser failed on marts/structure/fct_source_directories.sql
08:41:34  1602: parser fallback to jinja rendering on marts/structure/fct_source_directories.sql
08:41:34  1603: static parser failed on marts/structure/fct_test_directories.sql
08:41:34  1602: parser fallback to jinja rendering on marts/structure/fct_test_directories.sql
08:41:34  1603: static parser failed on marts/structure/fct_model_naming_conventions.sql
08:41:34  1602: parser fallback to jinja rendering on marts/structure/fct_model_naming_conventions.sql
08:41:34  1603: static parser failed on marts/structure/fct_model_directories.sql
08:41:34  1602: parser fallback to jinja rendering on marts/structure/fct_model_directories.sql
08:41:34  Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'e99cd034-f0fe-4755-9ee0-1ab323b207ba', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10be3d0d0>]}
08:41:34  Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'e99cd034-f0fe-4755-9ee0-1ab323b207ba', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10befa400>]}
08:41:34  Found 53 models, 59 tests, 0 snapshots, 0 analyses, 488 macros, 0 operations, 20 seed files, 5 sources, 1 exposure, 1 metric, 0 groups
08:41:34  Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'e99cd034-f0fe-4755-9ee0-1ab323b207ba', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b984d00>]}
08:41:34  
08:41:34  Acquiring new redshift connection 'master'
08:41:34  Acquiring new redshift connection 'list_dev'
08:41:34  Using redshift connection "list_dev"
08:41:34  On list_dev: select distinct nspname from pg_namespace
08:41:34  Opening a new connection, currently in state init
08:41:34  Redshift adapter: Connecting to redshift with username/password based auth...
08:41:36  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:36  On list_dev: Close
08:41:36  Re-using an available connection from the pool (formerly list_dev, now list_dev_dbt_bperigaud)
08:41:36  Using redshift connection "list_dev_dbt_bperigaud"
08:41:36  On list_dev_dbt_bperigaud: BEGIN
08:41:36  Opening a new connection, currently in state closed
08:41:36  Redshift adapter: Connecting to redshift with username/password based auth...
08:41:37  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:37  Using redshift connection "list_dev_dbt_bperigaud"
08:41:37  On list_dev_dbt_bperigaud: select
      'dev' as database,
      tablename as name,
      schemaname as schema,
      'table' as type
    from pg_tables
    where schemaname ilike 'dbt_bperigaud'
    union all
    select
      'dev' as database,
      viewname as name,
      schemaname as schema,
      'view' as type
    from pg_views
    where schemaname ilike 'dbt_bperigaud'
08:41:37  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:37  On list_dev_dbt_bperigaud: ROLLBACK
08:41:37  On list_dev_dbt_bperigaud: Close
08:41:37  Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'e99cd034-f0fe-4755-9ee0-1ab323b207ba', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b9e8c40>]}
08:41:37  Using redshift connection "master"
08:41:37  On master: BEGIN
08:41:37  Opening a new connection, currently in state init
08:41:37  Redshift adapter: Connecting to redshift with username/password based auth...
08:41:38  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:38  On master: COMMIT
08:41:38  Using redshift connection "master"
08:41:38  On master: COMMIT
08:41:40  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:40  On master: Close
08:41:40  Concurrency: 4 threads (target='red')
08:41:40  
08:41:40  Began running node model.dbt_project_evaluator_integration_tests.base_model
08:41:40  1 of 2 START sql table model dbt_bperigaud.base_model .......................... [RUN]
08:41:40  Re-using an available connection from the pool (formerly list_dev_dbt_bperigaud, now model.dbt_project_evaluator_integration_tests.base_model)
08:41:40  Began compiling node model.dbt_project_evaluator_integration_tests.base_model
08:41:40  Writing injected SQL for node "model.dbt_project_evaluator_integration_tests.base_model"
08:41:40  Timing info for model.dbt_project_evaluator_integration_tests.base_model (compile): 2023-03-14 08:41:40.266320 => 2023-03-14 08:41:40.273197
08:41:40  Began executing node model.dbt_project_evaluator_integration_tests.base_model
08:41:40  Writing runtime sql for node "model.dbt_project_evaluator_integration_tests.base_model"
08:41:40  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:40  On model.dbt_project_evaluator_integration_tests.base_model: BEGIN
08:41:40  Opening a new connection, currently in state closed
08:41:40  Redshift adapter: Connecting to redshift with username/password based auth...
08:41:42  SQL status: cursor.rowcount = -1 in 2 seconds
08:41:42  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:42  On model.dbt_project_evaluator_integration_tests.base_model: create  table
    "dev"."dbt_bperigaud"."base_model__dbt_tmp"
    
    
    
  as (
    

select 1 as id
  );
08:41:42  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:42  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:42  On model.dbt_project_evaluator_integration_tests.base_model: alter table "dev"."dbt_bperigaud"."base_model" rename to "base_model__dbt_backup"
08:41:43  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:43  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:43  On model.dbt_project_evaluator_integration_tests.base_model: alter table "dev"."dbt_bperigaud"."base_model__dbt_tmp" rename to "base_model"
08:41:43  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:43  On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:43  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:43  On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:43  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:43  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:43  On model.dbt_project_evaluator_integration_tests.base_model: BEGIN
08:41:44  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:44  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:44  On model.dbt_project_evaluator_integration_tests.base_model: drop table if exists "dev"."dbt_bperigaud"."base_model__dbt_backup" cascade
08:41:45  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:45  On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:45  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:45  On model.dbt_project_evaluator_integration_tests.base_model: COMMIT
08:41:45  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:45  Using redshift connection "model.dbt_project_evaluator_integration_tests.base_model"
08:41:45  On model.dbt_project_evaluator_integration_tests.base_model: BEGIN
08:41:45  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:45  Timing info for model.dbt_project_evaluator_integration_tests.base_model (execute): 2023-03-14 08:41:40.274327 => 2023-03-14 08:41:45.526500
08:41:45  On model.dbt_project_evaluator_integration_tests.base_model: ROLLBACK
08:41:45  On model.dbt_project_evaluator_integration_tests.base_model: Close
08:41:45  Sending event: {'category': 'dbt', 'action': 'run_model', 'label': 'e99cd034-f0fe-4755-9ee0-1ab323b207ba', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10c705dc0>]}
08:41:45  1 of 2 OK created sql table model dbt_bperigaud.base_model ..................... [cursor.rowcount = -1 in 5.47s]
08:41:45  Finished running node model.dbt_project_evaluator_integration_tests.base_model
08:41:45  Began running node model.dbt_project_evaluator_integration_tests.my_model
08:41:45  2 of 2 START sql view model dbt_bperigaud.my_model ............................. [RUN]
08:41:45  Acquiring new redshift connection 'model.dbt_project_evaluator_integration_tests.my_model'
08:41:45  Began compiling node model.dbt_project_evaluator_integration_tests.my_model
08:41:45  Writing injected SQL for node "model.dbt_project_evaluator_integration_tests.my_model"
08:41:45  Timing info for model.dbt_project_evaluator_integration_tests.my_model (compile): 2023-03-14 08:41:45.749256 => 2023-03-14 08:41:45.756402
08:41:45  Began executing node model.dbt_project_evaluator_integration_tests.my_model
08:41:45  Writing runtime sql for node "model.dbt_project_evaluator_integration_tests.my_model"
08:41:45  Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:45  On model.dbt_project_evaluator_integration_tests.my_model: BEGIN
08:41:45  Opening a new connection, currently in state init
08:41:45  Redshift adapter: Connecting to redshift with username/password based auth...
08:41:46  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:46  Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:46  On model.dbt_project_evaluator_integration_tests.my_model: create view "dev"."dbt_bperigaud"."my_model__dbt_tmp" as (
    

select * from "dev"."dbt_bperigaud"."base_model"
  ) ;
08:41:47  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:47  Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:47  On model.dbt_project_evaluator_integration_tests.my_model: alter table "dev"."dbt_bperigaud"."my_model" rename to "my_model__dbt_backup"
08:41:47  Redshift adapter: Redshift error: {'S': 'ERROR', 'C': '42P01', 'M': 'relation "dbt_bperigaud.my_model" does not exist', 'F': '../src/pg/src/backend/catalog/namespace.c', 'L': '267', 'R': 'LocalRangeVarGetRelid'}
08:41:47  On model.dbt_project_evaluator_integration_tests.my_model: ROLLBACK
08:41:47  Redshift adapter: Error running SQL: macro rename_relation
08:41:47  Redshift adapter: Rolling back transaction.
08:41:47  Timing info for model.dbt_project_evaluator_integration_tests.my_model (execute): 2023-03-14 08:41:45.758365 => 2023-03-14 08:41:47.748380
08:41:47  On model.dbt_project_evaluator_integration_tests.my_model: Close
08:41:47  Database Error in model my_model (models/del/my_model.sql)
  {'S': 'ERROR', 'C': '42P01', 'M': 'relation "dbt_bperigaud.my_model" does not exist', 'F': '../src/pg/src/backend/catalog/namespace.c', 'L': '267', 'R': 'LocalRangeVarGetRelid'}
  compiled Code at target/run/dbt_project_evaluator_integration_tests/models/del/my_model.sql
08:41:47  Sending event: {'category': 'dbt', 'action': 'run_model', 'label': 'e99cd034-f0fe-4755-9ee0-1ab323b207ba', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10ca56520>]}
08:41:47  2 of 2 ERROR creating sql view model dbt_bperigaud.my_model .................... [ERROR in 2.04s]
08:41:47  Finished running node model.dbt_project_evaluator_integration_tests.my_model
08:41:47  Using redshift connection "master"
08:41:47  On master: BEGIN
08:41:47  Opening a new connection, currently in state closed
08:41:47  Redshift adapter: Connecting to redshift with username/password based auth...
08:41:48  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:48  On master: COMMIT
08:41:48  Using redshift connection "master"
08:41:48  On master: COMMIT
08:41:49  SQL status: cursor.rowcount = -1 in 0 seconds
08:41:49  On master: Close
08:41:49  Connection 'master' was properly closed.
08:41:49  Connection 'model.dbt_project_evaluator_integration_tests.base_model' was properly closed.
08:41:49  Connection 'model.dbt_project_evaluator_integration_tests.my_model' was properly closed.
08:41:49  
08:41:49  Finished running 1 table model, 1 view model in 0 hours 0 minutes and 14.42 seconds (14.42s).
08:41:49  Command end result
08:41:49  
08:41:49  Completed with 1 error and 0 warnings:
08:41:49  
08:41:49  Database Error in model my_model (models/del/my_model.sql)
08:41:49    {'S': 'ERROR', 'C': '42P01', 'M': 'relation "dbt_bperigaud.my_model" does not exist', 'F': '../src/pg/src/backend/catalog/namespace.c', 'L': '267', 'R': 'LocalRangeVarGetRelid'}
08:41:49    compiled Code at target/run/dbt_project_evaluator_integration_tests/models/del/my_model.sql
08:41:49  
08:41:49  Done. PASS=1 WARN=0 ERROR=1 SKIP=0 TOTAL=2
08:41:49  Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1052cceb0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x1087b7520>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10c10dfa0>]}
08:41:49  Flushing usage events

@dbeatty10
Contributor

@b-per you are a prince 👑 for generating output on both 1.4 and 1.5.

For anyone looking for an easy way to use Python venv to install different dbt versions, see below for some examples. The nice thing about venv over alternatives is that it is built into all supported versions of Python 3, so there's no need to install anything extra.

These commands are not optimized for readability. Rather, they are optimized for copy-pasting and updating the template (regardless of whether the version is a pre-release or not).

dbt-redshift 1.4.0

python3 -m venv redshift_1.4                 # create the environment
source redshift_1.4/bin/activate             # activate it
python3 -m pip install --upgrade pip
python3 -m pip install --pre dbt-redshift~=1.4.0.dev0 dbt-core~=1.4.0.dev0
dbt --version                                # confirm the installed versions
deactivate

dbt-redshift 1.5.0

python3 -m venv redshift_1.5                 # create the environment
source redshift_1.5/bin/activate             # activate it
python3 -m pip install --upgrade pip
python3 -m pip install --pre dbt-redshift~=1.5.0.dev0 dbt-core~=1.5.0.dev0
dbt --version                                # confirm the installed versions
deactivate

General case

python3 -m venv {{adapter}}_{{version_number}}                 # create the environment
source {{adapter}}_{{version_number}}/bin/activate             # activate it
python3 -m pip install --upgrade pip
python3 -m pip install --pre dbt-{{adapter}}~={{version_number}}.dev0 dbt-core~={{version_number}}.dev0
dbt --version                                                  # confirm the installed versions
deactivate

@sathiish-kumar
Contributor

Thanks @b-per, I'll try using the venv.
Scanning your logs reveals a major difference that's probably worth noting here:

In the 1.4.0 logs (relevant snippet below), we see that the CREATE VIEW happens on my_model__dbt_tmp, and the rename targets this same relation (my_model__dbt_tmp is renamed to my_model).

1.4.0
08:41:18.993738 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.my_model"} */


  create view "dev"."dbt_bperigaud"."my_model__dbt_tmp" as (


select * from "dev"."dbt_bperigaud"."base_model"
  ) ;

08:41:19.115287 [debug] [Thread-3  ]: SQL status: CREATE VIEW in 0 seconds
08:41:19.123925 [debug] [Thread-3  ]: Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:19.124871 [debug] [Thread-3  ]: On model.dbt_project_evaluator_integration_tests.my_model: /* {"app": "dbt", "dbt_version": "1.4.5", "profile_name": "integration_tests", "target_name": "red", "node_id": "model.dbt_project_evaluator_integration_tests.my_model"} */
alter table "dev"."dbt_bperigaud"."my_model__dbt_tmp" rename to "my_model"

In the 1.5.0-b3 logs (relevant snippet below), we see that the CREATE VIEW happens on the same view name my_model__dbt_tmp, but the rename targets my_model itself (alter table "dev"."dbt_bperigaud"."my_model" rename to "my_model__dbt_backup").

1.5.0-b3
08:41:46  On model.dbt_project_evaluator_integration_tests.my_model: create view "dev"."dbt_bperigaud"."my_model__dbt_tmp" as (


select * from "dev"."dbt_bperigaud"."base_model"
  ) ;
08:41:47  SQL status: cursor.rowcount = -1 in 1 seconds
08:41:47  Using redshift connection "model.dbt_project_evaluator_integration_tests.my_model"
08:41:47  On model.dbt_project_evaluator_integration_tests.my_model: alter table "dev"."dbt_bperigaud"."my_model" rename to "my_model__dbt_backup"
08:41:47  Redshift adapter: Redshift error: {'S': 'ERROR', 'C': '42P01', 'M': 'relation "dbt_bperigaud.my_model" does not exist', 'F': '../src/pg/src/backend/catalog/namespace.c', 'L': '267', 'R': 'LocalRangeVarGetRelid'}
08:41:47  On model.dbt_project_evaluator_integration_tests.my_model: ROLLBACK

Assuming the view my_model already exists, here is my understanding of the sequence of materialization steps in dbt (I could be very off here, so please correct me if I'm wrong):

  1. Create a new view that's a temp view that follows the naming convention with a __dbt_tmp suffix.
  2. Rename existing view (in this case my_model) in Redshift with a __dbt_backup suffix.
  3. Rename new tmp view as the my_model view.
  4. Drop the backup.
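
Assuming those four steps are right, they can be sketched as a toy in-memory simulation (the function and dict here are purely illustrative, not dbt internals):

```python
# Toy model of the four-step relation swap described above; a schema is
# just a dict mapping relation name -> its defining SQL.
def swap_relation(schema: dict, name: str, new_sql: str) -> None:
    tmp, backup = f"{name}__dbt_tmp", f"{name}__dbt_backup"
    schema[tmp] = new_sql                # 1. create <name>__dbt_tmp
    schema[backup] = schema.pop(name)    # 2. rename <name> -> <name>__dbt_backup
    schema[name] = schema.pop(tmp)       # 3. rename <name>__dbt_tmp -> <name>
    del schema[backup]                   # 4. drop <name>__dbt_backup

schema = {"my_model": "select * from base_model  -- old"}
swap_relation(schema, "my_model", "select * from base_model  -- new")
print(schema)  # {'my_model': 'select * from base_model  -- new'}
```

In this simplified picture the swap is always safe; the bug only appears once relations hold references to each other, as the later comments work out.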

Assuming this is correct, I'm not sure why 1.4.0 doesn't do this but directly chooses to rename the tmp view to my_model. Looking beyond this, it's surprising to me that ALTER table appears to work on the view still; I'll be troubleshooting this further.

@mikealfare
Contributor

mikealfare commented Mar 14, 2023

Assuming this is correct, I'm not sure why 1.4.0 doesn't do this but directly chooses to rename the tmp view to my_model.

From a cursory glance, it seems like maybe we tried to implement a safety rollback feature and botched the view name.

Looking beyond this, it's surprising to me that ALTER table appears to work on the view still

There are several spots where table and view are interchangeable in Redshift. My guess is that it was so common to replace a view with a table (for performance) that they just made things work with both where possible to avoid having to rewrite that code if you decided to materialize a view.

@sathiish-kumar
Contributor

I think I've stumbled upon the root cause of the issue here; I have yet to figure out why this doesn't happen in 1.4.5. This is predicated on my understanding of the sequence of steps for materialization being right:

Here is the sequence of SQL that would be executed on the second run with 1.5.0-b3, assuming the first run created a base_model table and my_model view.

create table "dev"."dbt"."base_model__dbt_tmp" as (select 1 as id);

alter table "dev"."dbt"."base_model" rename to "base_model__dbt_backup" /* After this two tables exist: base_model__dbt_backup, base_model__dbt_tmp. One view exists as well - my_model. */

alter table "dev"."dbt"."base_model__dbt_tmp" rename to "base_model" /* After this two tables exist: base_model, base_model__dbt_backup, One view exists as well - my_model */

drop table if exists "dev"."dbt"."base_model__dbt_backup" cascade /* After this one table exists: base_model, zero view exists. */

After the first alter table, Redshift starts pointing the view at the table base_model__dbt_backup. Once this happens, it is never pointed back at base_model, when it should be after the second alter table. After the final drop table, the view is silently gone since the table it depends on (base_model__dbt_backup) has been dropped.
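
This binding behavior can be modeled with a toy catalog (illustrative only, not Redshift internals): a bound view stores the underlying table's object id rather than its name, so renames drag the view along and dropping the object takes the view with it.

```python
# Toy catalog: relations are tracked by a numeric object id (oid), and a
# bound view references an oid, not a name.
class Catalog:
    def __init__(self):
        self.names = {}          # oid -> current relation name
        self.next_oid = 1

    def create(self, name):
        oid = self.next_oid
        self.next_oid += 1
        self.names[oid] = name
        return oid

    def rename(self, old, new):
        oid = next(o for o, n in self.names.items() if n == old)
        self.names[oid] = new

    def drop(self, name):
        oid = next(o for o, n in self.names.items() if n == name)
        del self.names[oid]

cat = Catalog()
base_oid = cat.create("base_model")                  # first run
view_target = base_oid                               # my_model bound to this oid
cat.create("base_model__dbt_tmp")                    # second run, step 1
cat.rename("base_model", "base_model__dbt_backup")   # step 2: view follows the rename
cat.rename("base_model__dbt_tmp", "base_model")      # step 3: new table takes old name
print(cat.names[view_target])                        # base_model__dbt_backup
cat.drop("base_model__dbt_backup")                   # step 4: view's target is gone
print(view_target in cat.names)                      # False -> view silently dropped
```

The view is "correct by name" at every step from dbt's perspective, yet by object identity it ends up attached to the backup table that gets dropped.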

Will try to figure out why things aren't working the same way in 1.4.5 but any pointers on changes in materialization between the two would be appreciated. Thanks!

@mikealfare mikealfare self-assigned this Mar 14, 2023
@mikealfare
Contributor

mikealfare commented Mar 15, 2023

@sathiish-kumar is correct. I created a test case to reflect this issue and it fails. If I add bind=False into the view config, it passes. I also split out the second run to do the table and then the view, and confirmed that the view is not there after refreshing the table.

Since this didn't seem to be an issue in 1.4, perhaps we changed the default for view binding from False to True in 1.5?
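
For context: bind=False produces a Redshift late-binding view (created with no schema binding), which resolves its source relation by name at query time and therefore survives the table swap. A hedged sketch of how such a config might alter the rendered DDL — the render function here is illustrative, not dbt's actual macro:

```python
# Illustrative DDL renderer (not dbt's real view materialization macro).
# In Redshift, "with no schema binding" creates a late-binding view.
def render_create_view(name: str, sql: str, bind: bool) -> str:
    ddl = f"create view {name} as (\n{sql}\n)"
    if not bind:
        ddl += " with no schema binding"
    return ddl + ";"

print(render_create_view('"dev"."dbt_bperigaud"."my_model"',
                         'select * from "dev"."dbt_bperigaud"."base_model"',
                         bind=False))
```

With bind=True (a regular bound view), the clause is omitted and the view tracks its source table by object identity, which is exactly what trips up the backup-and-swap sequence above.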

@mikealfare
Contributor

Providing an update in case you're also looking into this @sathiish-kumar, @b-per, @dbeatty10, et. al.

It looks like 1.4 is doing a dependency check before starting its model update run, reading catalog tables like pg_depend and pg_namespace. In 1.5 we're not doing that. The view creation statement, the bind option and its default, and the adapter jinja templates are pretty much the same; it just looks like we lost a step in the process somewhere. I'll continue to dive deeper.
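
The "link relations in cache" idea from the eventual fix can be illustrated with a stub cache (not dbt's actual implementation): once view-to-table edges from the dependency check are recorded, dropping a table also evicts its dependent views from the cache, so dbt knows they must be rebuilt rather than assuming they still exist.

```python
# Stub relation cache (illustrative, not dbt's RelationsCache): links record
# "dependent depends on referenced", and drops cascade along those links.
class RelationCache:
    def __init__(self):
        self.relations = set()
        self.links = []                      # (dependent, referenced) pairs

    def add(self, name):
        self.relations.add(name)

    def link(self, dependent, referenced):
        self.links.append((dependent, referenced))

    def drop(self, name):
        self.relations.discard(name)
        for dep, ref in list(self.links):
            if ref == name:
                self.links.remove((dep, ref))
                self.drop(dep)               # cascade: dependent views vanish too

cache = RelationCache()
for rel in ("base_model", "my_model"):
    cache.add(rel)
cache.link("my_model", "base_model")         # learned from the dependency check
cache.drop("base_model")
print(cache.relations)                       # set() -- my_model is known to be gone
```

Without the links, the cache would still list my_model after the drop, mirroring the silent disappearance described in the comments above.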

@mikealfare
Contributor

We dropped inheritance from dbt-postgres's PostgresAdapter in 1.5. That might have been intentional, in order to separate the two, but we must have dropped some of the functionality along the way. The test case passes when I reinstate that superclass.
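
The shape of that fix can be sketched with stub classes (these are stand-ins, not real dbt imports or the actual 1.5 source): by inheriting from the Postgres adapter again, the Redshift adapter regains the catalog-driven relation-linking behavior it had silently lost.

```python
# Stub classes sketching the inheritance fix; the method name mirrors the
# Postgres adapter's dependency-linking step, but the body is a stand-in.
class PostgresAdapter:
    def _link_cached_relations(self, manifest):
        # Real implementation reads pg_depend/pg_class to link views to tables.
        return "linked"

class RedshiftAdapter(PostgresAdapter):      # restored superclass
    pass

print(RedshiftAdapter()._link_cached_relations(manifest=None))  # linked
```

The point is only that the subclass inherits the linking step for free; in 1.5 the superclass was dropped, so the step never ran.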

mikealfare added a commit that referenced this issue Mar 16, 2023
mikealfare added a commit that referenced this issue Mar 16, 2023
* moved models out of test file (more than 20 lines); moved files to directory (more than 1 file)
* created test case for #365
* added sslmode to `RedshiftCredentials._connection_keys`
* moved code out of try block that would not trigger exception
* added link relations in cache logic
* pulled up abstract methods that were not implemented, but retained `NotImplementedError`
* the macro `postgres_get_relations` only has one underscore in `dbt-core` instead of two, like `redshift__get_relations`
* changie
abbywh pushed a commit to abbywh/dbt-redshift that referenced this issue Oct 11, 2023
Labels
type:bug Something isn't working