spark-instrumented-optimizer/sql/catalyst/src/main
Takuya UESHIN 568055da93 [SPARK-23054][SQL][PYSPARK][FOLLOWUP] Use sqlType casting when casting PythonUserDefinedType to String.
## What changes were proposed in this pull request?

This is a follow-up of #20246.

If a UDT defined in Python has no corresponding Scala UDT, casting it to string produces the raw string of the internal value, e.g. `"org.apache.spark.sql.catalyst.expressions.UnsafeArrayDataxxxxxxxx"` when the internal type is `ArrayType`.

This PR fixes the issue by casting via the UDT's `sqlType` instead.
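The idea behind the fix can be sketched in plain Python (this is an illustrative model, not the actual Catalyst implementation; the class and function names below are hypothetical): when the target of a cast-to-string is a user-defined type, dispatch on the UDT's underlying `sqlType` rather than falling back to the raw object's string representation.

```python
# Minimal sketch of "cast via sqlType" dispatch. All names here are
# hypothetical stand-ins for Catalyst's type system, for illustration only.

class DataType:
    pass

class IntegerType(DataType):
    pass

class ArrayType(DataType):
    def __init__(self, element_type):
        self.element_type = element_type

class UserDefinedType(DataType):
    """A UDT that wraps an underlying SQL type."""
    def __init__(self, sql_type):
        self.sqlType = sql_type

def cast_to_string(value, data_type):
    if isinstance(data_type, UserDefinedType):
        # Before the fix: no UDT branch existed, so the cast fell through
        # to the raw internal value's string form, e.g.
        # "org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@...".
        # After the fix: recurse using the UDT's underlying sqlType.
        return cast_to_string(value, data_type.sqlType)
    if isinstance(data_type, ArrayType):
        # Render array elements recursively, matching SQL-style output.
        inner = ", ".join(cast_to_string(v, data_type.element_type)
                          for v in value)
        return "[" + inner + "]"
    return str(value)
```

With this dispatch, a UDT whose `sqlType` is `ArrayType(IntegerType())` renders its elements instead of an opaque internal object reference.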

## How was this patch tested?

Added a new test; existing tests also cover this change.

Author: Takuya UESHIN <ueshin@databricks.com>

Closes #20306 from ueshin/issues/SPARK-23054/fup1.
2018-01-19 11:37:08 +08:00
antlr4/org/apache/spark/sql/catalyst/parser [SPARK-22999][SQL] show databases like command' can remove the like keyword 2018-01-15 02:02:49 +08:00
java/org/apache/spark/sql [SPARK-22825][SQL] Fix incorrect results of Casting Array to String 2018-01-05 14:02:21 +08:00
scala/org/apache/spark/sql [SPARK-23054][SQL][PYSPARK][FOLLOWUP] Use sqlType casting when casting PythonUserDefinedType to String. 2018-01-19 11:37:08 +08:00