New add_load_date_suffix import option. #681
Conversation
With the new import option, the pipeline will now create a new BigQuery table for each new load date. This helps handle API-incompatible schema changes better, as each day's imports only need to be consistent with themselves rather than with all prior imports.
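The per-load-date naming the option describes can be sketched as follows. This is a minimal illustration, not the PR's actual code: the function name `table_with_load_date_suffix` and the base table name are hypothetical, and it only shows the suffixing convention (one table per load date, named with a `_YYYYMMDD` suffix).

```python
from datetime import date


def table_with_load_date_suffix(base_table: str, load_date: date) -> str:
    # Append the load date as a _YYYYMMDD suffix so each day's import
    # lands in its own BigQuery table. A schema change between days then
    # only has to be consistent within that day's table, not with every
    # prior import.
    return f"{base_table}_{load_date.strftime('%Y%m%d')}"


print(table_with_load_date_suffix("assets", date(2023, 5, 17)))
# assets_20230517
```

With this convention, a breaking schema change simply starts appearing in the next day's table instead of conflicting with historical rows.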
/gcbrun
Those failures aren't related to this commit. I would fix them myself, but I'd rather @TheLanceLord take a look and resolve them in a different PR, as it may help find bugs in that tool. I manually resolved them locally and the cloud-build pipeline was clean afterwards. Thanks @TheLanceLord for writing that bot; I've had customers asking for that very thing in the past and am very happy you took the effort to write it.
Yep, I'll take a look into it and see what I broke
@TheLanceLord This is the PR that broke our CI: #673. Can you have a look, please?
/gcbrun
@TheLanceLord I ran the linter locally; here are the errors I get:
I will add tools/google-cloud-support-slackbot to the exclusion list for now so it doesn't break CI for everything else. I will open an issue and assign it to you.
/gcbrun
LGTM
With the new import option, the pipeline will now create a new BigQuery table for each new load date. This helps handle API-incompatible schema changes better, as each day's imports only need to be consistent with themselves rather than with all prior imports. Co-authored-by: Abdel SGHIOUAR <abdelfettah@google.com>