TIMESTAMP type cannot represent seconds representable in Spark #9904
Comments
@NEUpanning Thanks for opening an issue. I believe Spark's timestamp is in microsecond units, see https://github.com/apache/spark/blob/master/sql/api/src/main/scala/org/apache/spark/sql/types/TimestampType.scala#L23. But the issue in the test of …
The maximum seconds of a valid timestamp in Spark is …
But in the function from_unixtime, a timestamp could be created with a larger number of seconds, which can cause a check failure in debug mode. I wonder if we need to fix that in the function …
You are right. Spark's timestamp does not exceed the range of seconds [INT64_MIN/1000 - 1, INT64_MAX/1000]. In the function from_unixtime, the unix_time argument can be represented as seconds by LongType, DecimalType, etc., so its range exceeds the range limit of Velox's timestamp type.
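As a rough illustration of the kind of guard being discussed here (not Velox's actual from_unixtime implementation; the names `kMinVeloxSeconds`, `kMaxVeloxSeconds`, and `checkedSeconds` are hypothetical), an int64 seconds input could be validated against the bounds mentioned above before a Timestamp is constructed:

```cpp
#include <cstdint>
#include <limits>
#include <optional>

// Hypothetical sketch of the seconds bounds that the Timestamp debug check
// enforces: values that still fit into an int64 when converted to milliseconds.
constexpr int64_t kMinVeloxSeconds =
    std::numeric_limits<int64_t>::min() / 1000 - 1; // INT64_MIN / 1000 - 1
constexpr int64_t kMaxVeloxSeconds =
    std::numeric_limits<int64_t>::max() / 1000;     // INT64_MAX / 1000

// Hypothetical guard a from_unixtime-style function could apply: return
// nullopt (i.e. a NULL result) instead of constructing an out-of-range
// Timestamp from an arbitrary int64 seconds input.
std::optional<int64_t> checkedSeconds(int64_t unixTimeSeconds) {
  if (unixTimeSeconds < kMinVeloxSeconds || unixTimeSeconds > kMaxVeloxSeconds) {
    return std::nullopt;
  }
  return unixTimeSeconds;
}
```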
@rui-mo I'm thinking the same.
Bug description
Presto's Timestamp is stored as a single 64-bit signed integer holding milliseconds, so the TIMESTAMP type limits the range of seconds to [INT64_MIN/1000 - 1, INT64_MAX/1000]. Spark's Timestamp is stored as a 64-bit signed integer holding seconds, so its range of seconds is [INT64_MIN, INT64_MAX]. Therefore, the TIMESTAMP type cannot represent all seconds values that are representable in Spark.
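For concreteness, here is a small standalone C++ sketch of the arithmetic behind the two ranges described above; the constant names are made up for illustration:

```cpp
#include <cstdint>
#include <iostream>
#include <limits>

int main() {
  // Range of seconds the TIMESTAMP type accepts: the value must still be
  // convertible to milliseconds within an int64.
  const int64_t kMinTimestampSeconds =
      std::numeric_limits<int64_t>::min() / 1000 - 1; // INT64_MIN / 1000 - 1
  const int64_t kMaxTimestampSeconds =
      std::numeric_limits<int64_t>::max() / 1000;     // INT64_MAX / 1000

  // A seconds value Spark can hand to from_unixtime as a plain 64-bit long.
  const int64_t sparkSeconds = std::numeric_limits<int64_t>::max();

  std::cout << "TIMESTAMP seconds range: [" << kMinTimestampSeconds << ", "
            << kMaxTimestampSeconds << "]\n";
  std::cout << "Spark long seconds value: " << sparkSeconds << "\n";
  std::cout << "Representable in TIMESTAMP: "
            << (sparkSeconds >= kMinTimestampSeconds &&
                sparkSeconds <= kMaxTimestampSeconds)
            << "\n"; // prints 0: the value falls outside the TIMESTAMP range.
  return 0;
}
```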
System information
Relevant logs
No response