[FEA] Understand why we were able to parse a timestamp correctly for America/Los_Angeles when all we support is UTC.
#10488
Labels: feature request
Is your feature request related to a problem? Please describe.
This is really odd to me. I have a test that is still a WIP. It lets you set the timezone for from_json, which is used to parse timestamps. But when I set the timezone to "America/Los_Angeles" it parses correctly, and I don't know why. It shouldn't, because that zone has ongoing DST rules while all we support is UTC.
We should not be producing the right answer. More likely Spark is somehow producing the wrong answer.
The test is based on
https://github.com/apache/spark/blob/e6a3385e27fa95391433ea02fa053540fe101d40/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/JsonExpressionsSuite.scala#L529-L571
I hope that I am wrong and everything is working fine, but it looks really odd to me.
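For reference, here is a minimal sketch (not the actual WIP test; the input value, schema, and option value are assumptions) of the kind of check involved: parsing a timestamp with from_json while passing a non-UTC timeZone option, to see whether an America/Los_Angeles result can really match a UTC-only implementation.

```scala
// Minimal sketch, not the actual WIP test: the input JSON, schema, and
// timeZone option below are assumptions used only to illustrate the question.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types.{StructType, TimestampType}

object FromJsonTimezoneCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("from_json timezone check")
      .getOrCreate()
    import spark.implicits._

    val schema = new StructType().add("t", TimestampType)
    val df = Seq("""{"t": "2016-01-01 00:00:00"}""").toDF("json")

    // from_json accepts a timeZone option that controls how timestamps
    // without an explicit offset are interpreted. With a DST zone like
    // America/Los_Angeles the interpretation should differ from plain UTC.
    val parsed = df.select(
      from_json(col("json"), schema,
        Map("timeZone" -> "America/Los_Angeles")).as("r"))
    parsed.select(col("r.t")).show(truncate = false)

    spark.stop()
  }
}
```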