Added tests for different filter types Binary and Numeric #115
Conversation
halio-g
commented
Oct 31, 2023
- Added validation to skip pushdown for Array type columns.
- Added unit tests for the Binary and Decimal data types.
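The pushdown skip described in the first bullet can be sketched as below. This is an illustrative sketch, assuming a schema map from column name to its Spanner type string; the method and parameter names here are hypothetical, not the actual `SparkFilterUtils` API.

```java
import java.util.Map;
import java.util.Set;

public class PushdownSketch {
  // Returns false when any column referenced by the filter has an ARRAY
  // type, so the filter is evaluated by Spark instead of being pushed
  // down to Cloud Spanner.
  public static boolean shouldPushDown(
      Set<String> filterColumns, Map<String, String> columnTypes) {
    for (String col : filterColumns) {
      String type = columnTypes.getOrDefault(col, "");
      if (type.toUpperCase().startsWith("ARRAY")) {
        return false; // skip pushdown for ARRAY-typed columns
      }
    }
    return true;
  }
}
```

A filter on an `ARRAY<STRING(MAX)>` column would then fall back to Spark-side evaluation, while scalar columns remain eligible for pushdown.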
Thanks @halio-g! Just one comment but also did we remember to return the col_type in our schema information query?
spark-3.1-spanner-lib/src/main/java/com/google/cloud/spark/spanner/SparkFilterUtils.java
We only store json and jsonb in the col_type, so we don't need to modify the schema information query. We need the metadata when converting the Spark SQL to GoogleSQL.
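One reason the json/jsonb metadata matters during conversion: in GoogleSQL, JSON values cannot be compared with `=` directly, so an equality filter on a JSON column can be rendered against its canonical string form instead. A minimal sketch of that branch, with hypothetical names (this is not the actual `SparkFilterUtils` implementation):

```java
public class JsonFilterSketch {
  // Renders a Spark EqualTo filter as a GoogleSQL predicate. For json/
  // jsonb columns, compare TO_JSON_STRING(col) rather than the JSON
  // value itself, since GoogleSQL does not define '=' on JSON.
  public static String equalTo(String column, String value, String colType) {
    if ("json".equalsIgnoreCase(colType) || "jsonb".equalsIgnoreCase(colType)) {
      return "TO_JSON_STRING(" + column + ") = '" + value + "'";
    }
    return column + " = '" + value + "'";
  }
}
```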
…o be suffixed with -0 to specify it's utc time in FilterUtil.
…ll check in predicates for array and normal data type. 3. Numeric column out of scope. 4. Test the inf, nan in the filter.
…Spanner exception otherwise.