[SPARK-36415][SQL][DOCS] Add docs for try_cast/try_add/try_divide
### What changes were proposed in this pull request?
Add documentation for new functions try_cast/try_add/try_divide
### Why are the changes needed?
Better documentation. These new functions are useful when migrating to the ANSI dialect.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Build docs and preview:
![image](https://user-images.githubusercontent.com/1097932/128209312-34a6cc6a-a73d-4aed-8646-22b1cb7ce702.png)
Closes #33638 from gengliangwang/addDocForTry.
Authored-by: Gengliang Wang <gengliang@apache.org>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
(cherry picked from commit 8a35243fa7)
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
parent bcf2169bed
commit 87291dced1
@@ -257,6 +257,13 @@ The behavior of some SQL operators can be different under ANSI mode (`spark.sql.

- `map_col[key]`: This operator throws `NoSuchElementException` if the key does not exist in the map.
- `GROUP BY`: aliases in a select list cannot be used in GROUP BY clauses. Each column referenced in a GROUP BY clause shall unambiguously reference a column of the table resulting from the FROM clause.
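As a short illustration of this GROUP BY restriction (the table `t` and its columns are hypothetical, for illustration only):

```sql
-- Hypothetical table t(a INT, b INT).
-- Fails to resolve under ANSI mode: `x` is a select-list alias,
-- not a column of the table resulting from the FROM clause.
SELECT a AS x, SUM(b) FROM t GROUP BY x;

-- Works: GROUP BY references the underlying column directly.
SELECT a AS x, SUM(b) FROM t GROUP BY a;
```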
### Useful Functions for ANSI Mode

When ANSI mode is on, Spark SQL throws exceptions at runtime for invalid operations. You can use the following SQL functions to suppress such exceptions.

- `try_cast`: identical to `CAST`, except that it returns `NULL` instead of throwing an exception on a runtime error.
- `try_add`: identical to the add operator `+`, except that it returns `NULL` instead of throwing an exception on integral value overflow.
- `try_divide`: identical to the division operator `/`, except that it returns `NULL` instead of throwing an exception on dividing by 0.
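A minimal sketch of how these three functions behave, assuming ANSI mode is enabled (the literal values are illustrative):

```sql
SET spark.sql.ansi.enabled=true;

-- CAST throws on a malformed string; try_cast returns NULL instead.
SELECT try_cast('abc' AS INT);   -- NULL
SELECT try_cast('123' AS INT);   -- 123

-- `+` throws on integer overflow; try_add returns NULL instead.
SELECT try_add(2147483647, 1);   -- NULL (INT overflow)

-- `/` throws on division by zero; try_divide returns NULL instead.
SELECT try_divide(10, 0);        -- NULL
```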

### SQL Keywords

When `spark.sql.ansi.enabled` is true, Spark SQL will use the ANSI mode parser.