ce53b7199d
### What changes were proposed in this pull request?

When the SQL function `to_timestamp_ntz` receives an invalid format pattern, throw a runtime exception with hints for the valid patterns, instead of throwing an upgrade exception that suggests using the legacy formatters.

### Why are the changes needed?

As discussed in https://github.com/apache/spark/pull/32995/files#r655148980, there is an error message saying:

> You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'yyyy-MM-dd GGGGG' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0

This advice does not apply to the function `to_timestamp_ntz`, which only uses `Iso8601TimestampFormatter` and was added in Spark 3.2, so there is no pre-3.0 behavior to restore. We should improve the message.

### Does this PR introduce _any_ user-facing change?

No, the new SQL function is not released yet.

### How was this patch tested?

Unit test

Closes #33019 from gengliangwang/improveError.

Authored-by: Gengliang Wang <gengliang@apache.org>
Signed-off-by: Gengliang Wang <gengliang@apache.org>
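A minimal sketch of the kind of call that triggers the error, based on the pattern quoted in the message above (the comments describe the before/after behavior as stated in this PR description; the exact wording of the new exception is not quoted here):

```sql
-- 'GGGGG' is not a recognized era pattern in Spark's DateTimeFormatter,
-- so parsing the format string fails.
-- Before this change: an upgrade exception suggesting
-- spark.sql.legacy.timeParserPolicy=LEGACY, which is irrelevant here
-- because to_timestamp_ntz has no legacy formatter path.
-- After this change: a runtime exception with hints for valid patterns.
SELECT to_timestamp_ntz('2021-06-21', 'yyyy-MM-dd GGGGG');
```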