feat: add support for BigQuery DataFrames. Set context.engine to 'bigframes' to support query results larger than 10 GB (#58)
Conversation
setup.py (outdated)

```diff
@@ -28,6 +28,7 @@
 # 'Development Status :: 5 - Production/Stable'``
 release_status = "Development Status :: 4 - Beta"
 dependencies = [
+    "bigframes >= 1.17.0",
```
Please don't. In the design I shared, this should be an optional dependency, which means it should be an "extra".
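For context, an optional dependency in a setuptools-based setup.py lives under the extras mapping rather than in the required dependency list. A minimal sketch, assuming the structure this repo's setup.py uses (only the bigframes pin comes from this PR; everything else is a placeholder):

```python
# Sketch: bigframes as an optional "extra" instead of a core dependency.
dependencies = [
    # required dependencies stay here
]
extras = {
    # installed only with: pip install bigquery-magics[bigframes]
    "bigframes": ["bigframes >= 1.17.0"],
}
```

Users who never set `context.engine = 'bigframes'` then pay no extra install cost.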
Ack. Moved it to extra.
Force-pushed e1dfe18 to fa25c72.
Force-pushed 5376177 to 23630a8.
owlbot.py (outdated)

```diff
@@ -56,6 +56,7 @@
 # Multi-processing note isn't relevant, as bigquery-magics is responsible for
 # creating clients, not the end user.
 "docs/multiprocessing.rst",
+"noxfile.py",
```
Please don't modify the noxfile in this way. There is a feature in the template that does what we want: use unit_test_extras_by_python to make sure we have some Python versions where we test without extras and some where we test with them.
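A hedged sketch of what that could look like in owlbot.py, assuming the synthtool templates' `py_library` keyword (the exact signature should be checked against the synthtool source):

```python
# Sketch (assumed synthtool API): let the template generate the noxfile,
# testing one Python version with the extra and the rest without.
import synthtool as s
from synthtool import gcp

common = gcp.CommonTemplates()
templated_files = common.py_library(
    unit_test_extras_by_python={
        # only 3.12 installs the bigframes extra; other versions
        # exercise the no-extras install path
        "3.12": ["bigframes"],
    },
)
s.move(templated_files)
```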
Change reverted
noxfile.py (outdated)

```diff
@@ -217,6 +224,9 @@ def unit(session, protobuf_implementation):
     if protobuf_implementation == "cpp":
         session.install("protobuf<4")

+    if install_bigframes:
+        session.install("bigframes >= 1.17.0")
```
Please use the "extras". Don't install this manually.
Sounds good. Updated owlbot.py instead
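The generated noxfile then derives the install target from the extras mapping instead of installing packages by hand. A self-contained sketch of that logic (the constant and helper names here are illustrative, not the template's actual code):

```python
# Sketch of the extras-driven install logic a templated noxfile uses.
# Names are assumed; the real file is generated by owlbot/synthtool.
UNIT_TEST_EXTRAS_BY_PYTHON = {"3.12": ["bigframes"]}

def install_target(python_version: str) -> str:
    """Return the pip install target for a unit-test session."""
    extras = UNIT_TEST_EXTRAS_BY_PYTHON.get(python_version, [])
    if extras:
        # e.g. session.install("-e", ".[bigframes]")
        return f".[{','.join(extras)}]"
    return "."  # no extras: plain editable install
```

This keeps the dependency declaration in one place (setup.py) while CI still covers both install shapes.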
setup.py

```diff
@@ -54,6 +54,7 @@
     "grpcio >= 1.47.0, < 2.0dev",
     "grpcio >= 1.49.1, < 2.0dev; python_version>='3.11'",
 ],
+"bigframes": ["bigframes >= 1.17.0"],
```
pip will always install the latest version of bigframes. Let's make sure we test our compatibility with the minimum supported bigframes by adding bigframes==1.17.0 to https://github.com/googleapis/python-bigquery-magics/blob/main/testing/constraints-3.9.txt.
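Constraints files cap pip's resolution in CI: when a test session installs with `-c testing/constraints-3.9.txt`, the oldest supported versions are what actually gets tested. The relevant addition would be a single pinned line (other pins in the file omitted here):

```
# testing/constraints-3.9.txt: pin each dependency to its minimum
# supported version so CI tests the oldest working combination.
bigframes==1.17.0
```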
Done
Thanks!
```python
bpd.options.bigquery.project = context.project
bpd.options.bigquery.credentials = context.credentials
```
I'm worried these might fail if the bigframes session has already started. Maybe that's what we want to happen?
Alternatively, we could explicitly create a bigframes Session object, but then whatever DataFrame we produce won't be compatible with other DataFrames in the notebook.
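A minimal, bigframes-free simulation of the worry being raised: options that become read-only once the global session has started. This only mimics the failure mode under discussion; bigframes' actual exception type and message may differ.

```python
# Simulation only: models options that lock after a session starts,
# which is the failure mode this review comment worries about.
class BigQueryOptions:
    def __init__(self):
        self._session_started = False
        self._project = None

    @property
    def project(self):
        return self._project

    @project.setter
    def project(self, value):
        if self._session_started:
            # assumed behavior: assignment fails once a session exists
            raise ValueError("cannot change 'project' after session start")
        self._project = value

    def start_session(self):
        self._session_started = True

options = BigQueryOptions()
options.project = "my-project"   # fine before the session starts
options.start_session()
try:
    options.project = "other-project"
except ValueError:
    pass  # the path the magics code would need to decide how to handle
```

Whether the magics should surface that error, ignore it, or close and restart the session is exactly the open question above.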