Wow, that PR has no config to restore the old behavior at all. Just the advice.
In Spark 3.2, special datetime values such as epoch, today, yesterday, tomorrow, and now are supported in typed literals only, for instance, select timestamp'now'. In Spark 3.1 and 3.0, such special values are supported in any casts of strings to dates/timestamps. To keep these special values as dates/timestamps in Spark 3.1 and 3.0, you should replace them manually, e.g. if (c in ('now', 'today'), current_date(), cast(c as date)).
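To make the removed behavior concrete, here is a minimal Python sketch (illustrative only, not Spark code) of how Spark 3.1/3.0 resolved special datetime strings during a string-to-date cast. The function name and the `today` parameter are hypothetical; returning `None` stands in for falling back to a normal cast.

```python
from datetime import date, timedelta

def resolve_special_date(s, today=None):
    """Mimic Spark 3.1/3.0 resolution of special datetime strings in casts.

    Returns a date for the special values, or None so the caller can fall
    back to an ordinary string-to-date cast.
    """
    today = today or date.today()
    specials = {
        "epoch": date(1970, 1, 1),
        "today": today,
        "yesterday": today - timedelta(days=1),
        "tomorrow": today + timedelta(days=1),
        "now": today,  # when cast to a date, 'now' truncates to the current day
    }
    return specials.get(s.strip().lower())
```

In Spark 3.2 this resolution only happens for typed literals such as timestamp'now'; a plain cast of the string 'now' to a date no longer matches the table above and instead parses (and fails) like any other string.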
It looks like this was a bug fix: "now" was being replaced per row, so the result could differ from row to row and could even depend on the machine the query ran on. I don't think there will be a config to get the old behavior. This is good overall, but we still have to support the old behavior for older Spark versions.
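The per-row nondeterminism described above can be sketched in plain Python (the function names are hypothetical; this is not Spark's implementation): resolving 'now' at each row's evaluation time yields different timestamps within one query, while resolving it once per query does not.

```python
import time
from datetime import datetime

def resolve_now_per_row(rows):
    # Old (buggy) behavior: 'now' is resolved at each row's evaluation time,
    # so two rows in the same query can receive different timestamps.
    out = []
    for value in rows:
        out.append(datetime.now() if value == "now" else value)
        time.sleep(0.01)  # exaggerate the gap between per-row evaluations
    return out

def resolve_now_per_query(rows):
    # Fixed behavior: resolve 'now' once, so every row sees the same instant.
    query_start = datetime.now()
    return [query_start if value == "now" else value for value in rows]
```

This is also why the old behavior cannot simply be kept behind a config: a GPU plugin replaying the query on different hardware, or at a different speed, would produce different per-row values than the CPU run.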
Describe the bug
ParseDateTimeSuite fails when run against Spark 3.2, after the upstream change to special datetime value handling.
Steps/Code to reproduce bug
Run ParseDateTimeSuite against Spark 3.2
Expected behavior
Tests should pass
Environment details (please complete the following information)
N/A
Additional context
Spark PR: apache/spark@a59063d