Setting `on_schema_change` on incremental models using the `insert_overwrite` strategy with dynamic partitioning causes those models to fail #39
Labels: type:bug
Describe the bug
If a model is materialized as `incremental`, its strategy is set to `insert_overwrite`, its `on_schema_change` is set to any value other than `ignore`, and the compiled model refers to `_dbt_max_partition`, the model will fail to run with the following error:

Steps To Reproduce
Here's a minimal example model:
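(The model body did not survive on this page; the following is a minimal sketch of the kind of model that reproduces the issue — the column name, upstream ref, and `on_schema_change` value are illustrative, not the original reporter's exact code:)

```sql
{{
  config(
    materialized='incremental',
    incremental_strategy='insert_overwrite',
    on_schema_change='append_new_columns',
    partition_by={'field': 'created_at', 'data_type': 'timestamp', 'granularity': 'day'}
  )
}}

select id, created_at
from {{ ref('some_upstream_model') }}  -- illustrative upstream model

{% if is_incremental() %}
-- no static `partitions` config, so partitions are dynamic and the
-- compiled SQL references the _dbt_max_partition scripting variable
where created_at >= _dbt_max_partition
{% endif %}
```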
If `is_incremental()` is `True`, the model will compile to the following:

When the BigQuery adapter builds the larger SQL script it needs to run, it will hit this `run_query()` command, which (if I'm understanding correctly) is mainly used to compute the updates' schema so that it can be compared against the destination table's existing schema. When it tries to run that query, it hasn't yet declared `_dbt_max_partition` (that declaration is done here, in `bq_insert_overwrite()`), which triggers the error.

Expected behavior
I'm expecting the temporary relation to be created after `_dbt_max_partition` has been defined, which ought to allow the rest of the incremental update to proceed as normal.

Screenshots and log output
If applicable, add screenshots or log output to help explain your problem.
System information
The output of `dbt --version`:

The operating system you're using: macOS Mojave 10.14.6

The output of `python --version`: Python 3.7.10
Additional context
Add any other context about the problem here.
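To illustrate the ordering problem described above, here is a simplified sketch of the shape of the script the adapter ends up running — this is illustrative pseudo-SQL, not the adapter's actual generated code, and the table/column names are made up:

```sql
-- (1) With on_schema_change set, the adapter first creates the temporary
--     relation so it can compare the update's schema against the target.
--     The compiled model SQL references _dbt_max_partition here, but the
--     variable has not been declared yet, so this statement errors out.
create or replace table `proj.dataset.my_model__dbt_tmp` as (
  select id, created_at
  from `proj.dataset.some_upstream_model`
  where created_at >= _dbt_max_partition
);

-- (2) Only later, inside bq_insert_overwrite(), would the script have
--     declared the variable, e.g. something along these lines:
declare _dbt_max_partition timestamp default (
  select max(created_at) from `proj.dataset.my_model`
);
```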