spark-instrumented-optimizer/sql/catalyst/src/main
Wenchen Fan 1f1d98c6fa [SPARK-26580][SQL] remove Scala 2.11 hack for Scala UDF
## What changes were proposed in this pull request?

In https://github.com/apache/spark/pull/22732, we tried our best to keep the behavior of Scala UDF unchanged in Spark 2.4.

However, since Spark 3.0, Scala 2.12 is the default, and the trick used to keep the behavior unchanged does not work with Scala 2.12.

This PR proposes to remove the Scala 2.11 hack, as it is no longer useful.

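To make the affected case concrete, below is a minimal, hypothetical sketch (not taken from this PR) of the kind of Scala UDF the hack was concerned with: a closure whose parameter is a primitive type, where Spark needs type information to decide how to treat null inputs. The object name, session setup, column name, and the behavior described in the comments are illustrative assumptions, not code from the change itself.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

// Illustrative only: a Scala UDF with a primitive-typed parameter, the case
// whose null-input handling the Scala 2.11 hack tried to preserve.
object PrimitiveUdfNullSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("primitive-udf-null-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // The parameter is a primitive Int, which cannot hold null, so Spark
    // must rely on captured type information to handle null inputs.
    val plusOne = udf((x: Int) => x + 1)

    // A nullable Int column with one null row.
    val df = Seq(Tuple1(Option(1)), Tuple1(Option.empty[Int])).toDF("i")

    // With the input type known (e.g. via the typed udf API), Spark can treat
    // the null row specially instead of feeding a raw value to the closure;
    // the removed hack tried to keep this behavior when that type information
    // was not otherwise available, which no longer works under Scala 2.12.
    df.select(plusOne($"i").as("plus_one")).show()

    spark.stop()
  }
}
```
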
## How was this patch tested?

Existing tests.

Closes #23498 from cloud-fan/udf.

Authored-by: Wenchen Fan <wenchen@databricks.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
2019-01-11 14:52:13 +08:00
| Directory | Latest commit | Date |
|---|---|---|
| antlr4/org/apache/spark/sql/catalyst/parser | [SPARK-26435][SQL] Support creating partitioned table using Hive CTAS by specifying partition column names | 2018-12-27 16:03:14 +08:00 |
| java/org/apache/spark/sql | [SPARK-26448][SQL] retain the difference between 0.0 and -0.0 | 2019-01-09 13:50:32 -08:00 |
| scala/org/apache/spark/sql | [SPARK-26580][SQL] remove Scala 2.11 hack for Scala UDF | 2019-01-11 14:52:13 +08:00 |