Add Spark CAST(integral as timestamp) #11089
base: main
Conversation
✅ Deploy Preview for meta-velox canceled.
Hey @rui-mo, could you please help review this?
Thanks.
// Ensure that other integral types cast to int64 work as well.
testIntegralToTimestampCast<int8_t>();
testIntegralToTimestampCast<int16_t>();
testIntegralToTimestampCast<int32_t>();
I wonder why we adopt different test methods for int64_t and the others. Is there any special consideration?
Since the other integral types can't cover large timestamp values, and we only need to ensure that casting the smaller integral types to int64_t works well, I adopted a different test method for them.
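As a rough sketch of that split (all names here are hypothetical, not the actual Velox test utilities), the helper for the narrower types only needs to verify that in-range values behave exactly like their int64_t promotion, while the dedicated int64_t test covers timestamps outside those ranges:

```cpp
#include <cstdint>
#include <limits>
#include <vector>

#include <gtest/gtest.h>

// Hypothetical stand-in for the cast kernel under test: interprets the input
// as seconds since the epoch and returns microseconds (illustrative only).
template <typename T>
int64_t castToTimestampMicros(T seconds) {
  return static_cast<int64_t>(seconds) * 1'000'000;
}

// Narrow integral types (int8_t/int16_t/int32_t) cannot represent large
// second counts, so this helper only checks that in-range values produce
// the same result as their int64_t promotion.
template <typename T>
void testIntegralToTimestampCast() {
  const std::vector<T> values = {
      T(0),
      T(1),
      T(-1),
      std::numeric_limits<T>::max(),
      std::numeric_limits<T>::min()};
  for (T value : values) {
    EXPECT_EQ(
        castToTimestampMicros(value),
        castToTimestampMicros(static_cast<int64_t>(value)));
  }
}

TEST(SparkCastTest, smallerIntegralToTimestamp) {
  testIntegralToTimestampCast<int8_t>();
  testIntegralToTimestampCast<int16_t>();
  testIntegralToTimestampCast<int32_t>();
}
```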
Thanks. Just several nits.
cc: @pedroerp @mbasmanova in case you have any more comments. Thanks!
Add Spark CAST (integral as timestamp). The input value is treated as the
number of seconds since the epoch (1970-01-01 00:00:00 UTC). Supported types
are tinyint, smallint, integer and bigint.
Spark's implementation: https://github.com/apache/spark/blob/v3.5.1/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala#L680
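For reference, here is a minimal standalone sketch of the intended semantics (not the Velox implementation itself): the integral input is interpreted as whole seconds since the Unix epoch and converted to a UTC timestamp.

```cpp
#include <cstdint>
#include <cstdio>
#include <ctime>

// Illustrative only: interprets an integral value as seconds since
// 1970-01-01 00:00:00 UTC, matching what CAST(integral AS TIMESTAMP)
// is expected to produce for tinyint/smallint/integer/bigint inputs.
template <typename T>
void printAsTimestamp(T value) {
  std::time_t seconds = static_cast<std::time_t>(static_cast<int64_t>(value));
  std::tm utc{};
  gmtime_r(&seconds, &utc);
  char buf[32];
  std::strftime(buf, sizeof(buf), "%Y-%m-%d %H:%M:%S", &utc);
  std::printf("%lld -> %s UTC\n", static_cast<long long>(value), buf);
}

int main() {
  printAsTimestamp<int8_t>(100);          // 1970-01-01 00:01:40 UTC
  printAsTimestamp<int32_t>(86400);       // 1970-01-02 00:00:00 UTC
  printAsTimestamp<int64_t>(1609459200);  // 2021-01-01 00:00:00 UTC
  return 0;
}
```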