bigquery: add copy functionality #127
Conversation
This commit adds a `copy` method to the `parsons.google.google_bigquery.GoogleBigQuery` connector. The `copy` method can be used to load a Parsons table into a BigQuery table by uploading the file to Google Cloud Storage.
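For context, a copy-via-GCS load generally has three steps: serialize the table to a file, stage that file in a temporary Cloud Storage bucket, and point a BigQuery load job at the staged URI. A minimal self-contained sketch of that pattern follows; the class and function names are illustrative stand-ins, not the actual connector API, and the final load-job call is stubbed out in a comment:

```python
import csv
import os
import tempfile

class FakeGCS:
    """Stand-in for the GoogleCloudStorage connector (illustrative only)."""
    def __init__(self):
        self.blobs = {}

    def upload(self, bucket, blob_name, local_path):
        # Store file contents keyed by a gs:// URI, mimicking an upload.
        with open(local_path) as f:
            self.blobs[f"gs://{bucket}/{blob_name}"] = f.read()
        return f"gs://{bucket}/{blob_name}"

def copy_table(rows, gcs, tmp_gcs_bucket, table_name):
    """Sketch of the staged-load pattern: write the table to a temp CSV,
    stage it in a temporary GCS bucket, and return the URI a BigQuery
    load job would consume."""
    with tempfile.NamedTemporaryFile(
        "w", suffix=".csv", delete=False, newline=""
    ) as f:
        csv.writer(f).writerows(rows)
        local_path = f.name
    try:
        uri = gcs.upload(tmp_gcs_bucket, f"{table_name}.csv", local_path)
    finally:
        os.remove(local_path)
    # In the real connector this step would be something like:
    #   bigquery_client.load_table_from_uri(uri, table_name, job_config=...)
    return uri

gcs = FakeGCS()
uri = copy_table([["id", "name"], [1, "ada"]], gcs, "tmp-bucket", "people")
```

Staging through GCS rather than streaming rows directly keeps the load atomic on the BigQuery side and avoids streaming-insert quotas.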
This is awesome.
parsons/google/google_bigquery.py
```python
gcs_client: object
    The GoogleCloudStorage Connector to use for loading data into Google Cloud Storage.
"""
tmp_gcs_bucket = tmp_gcs_bucket or os.environ.get('GCS_TEMP_BUCKET')
```
Out of curiosity, is there a reason why you didn't use the `check_env` util?
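The `check_env`-style helper being referenced resolves a value from either an explicit argument or an environment variable, and fails loudly if neither is set. A minimal self-contained sketch of that pattern (the function below is illustrative, not Parsons' actual implementation):

```python
import os

def check(env, field):
    # Illustrative stand-in for a check_env-style helper: prefer the
    # explicit argument, fall back to the environment variable, and
    # raise a clear error if neither is available.
    if field is not None:
        return field
    try:
        return os.environ[env]
    except KeyError:
        raise KeyError(
            f"No {env} found. Store as an environment variable or pass explicitly."
        )

assert check('GCS_TEMP_BUCKET', 'my-bucket') == 'my-bucket'  # explicit arg wins
os.environ['GCS_TEMP_BUCKET'] = 'env-bucket'
assert check('GCS_TEMP_BUCKET', None) == 'env-bucket'        # env fallback
```

The advantage over a bare `os.environ.get` is the error path: a missing configuration value surfaces immediately with a message naming the variable, rather than as a `None` that fails later.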
Just forgot about it. I'll swap that out for this.
One question, actually: are there any advanced configurations for copying that the Google Client can take that we could pass as …?
Yeah. There is a …. The other thing I will do is add a ….
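For illustration, one common way to expose advanced load options is a keyword pass-through: the connector handles its own options (such as an `if_exists` flag) and forwards everything else to the underlying job configuration. The sketch below uses a plain dict and hypothetical names; the real google-cloud-bigquery configuration object is `LoadJobConfig`, and `write_disposition` is one of its settings, but nothing here is the actual connector API:

```python
def build_load_config(if_exists="fail", **load_kwargs):
    # Illustrative pass-through pattern (names are assumptions, not the
    # actual connector API): map a connector-level `if_exists` option
    # onto BigQuery's write-disposition values, then forward any extra
    # keyword arguments untouched to the job configuration.
    dispositions = {
        "fail": "WRITE_EMPTY",
        "append": "WRITE_APPEND",
        "truncate": "WRITE_TRUNCATE",
    }
    config = {"write_disposition": dispositions[if_exists]}
    config.update(load_kwargs)
    return config

cfg = build_load_config(if_exists="append", skip_leading_rows=1)
```

This keeps the connector's signature small while still letting callers reach any option the underlying client supports.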