[SPARK-18179][SQL] Throws analysis exception with a proper message for unsupported argument types in reflect/java_method function

## What changes were proposed in this pull request?

This PR proposes throwing an `AnalysisException` with a proper message, rather than a `NoSuchElementException` with the message `key not found: TimestampType`, when unsupported types are given to the `reflect` and `java_method` functions.

```scala
spark.range(1).selectExpr("reflect('java.lang.String', 'valueOf', cast('1990-01-01' as timestamp))")
```

produces

**Before**

```
java.util.NoSuchElementException: key not found: TimestampType
  at scala.collection.MapLike$class.default(MapLike.scala:228)
  at scala.collection.AbstractMap.default(Map.scala:59)
  at scala.collection.MapLike$class.apply(MapLike.scala:141)
  at scala.collection.AbstractMap.apply(Map.scala:59)
  at org.apache.spark.sql.catalyst.expressions.CallMethodViaReflection$$anonfun$findMethod$1$$anonfun$apply$1.apply(CallMethodViaReflection.scala:159)
...
```

**After**

```
cannot resolve 'reflect('java.lang.String', 'valueOf', CAST('1990-01-01' AS TIMESTAMP))' due to data type mismatch: arguments from the third require boolean, byte, short, integer, long, float, double or string expressions; line 1 pos 0;
'Project [unresolvedalias(reflect(java.lang.String, valueOf, cast(1990-01-01 as timestamp)), Some(<function1>))]
+- Range (0, 1, step=1, splits=Some(2))
...
```

The added message is:

```
arguments from the third require boolean, byte, short, integer, long, float, double or string expressions
```
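The check behind this message can be sketched in plain Scala without a Spark dependency. This is a hypothetical, simplified stand-in (the object and method names are illustrative, not Spark's): the first two children are the class and method names, and every argument from the third onward must have a type in a fixed supported set, mirroring `CallMethodViaReflection.typeMapping`.

```scala
// Hypothetical, simplified sketch (not Spark's actual code) of the type check
// that produces the message above.
object ReflectTypeCheck {
  // Stand-in for the key set of CallMethodViaReflection.typeMapping.
  private val supported: Set[String] =
    Set("boolean", "byte", "short", "integer", "long", "float", "double", "string")

  // childTypes(0) = class name, childTypes(1) = method name; the rest are
  // argument types. Returns Some(errorMessage) on an unsupported argument type.
  def check(childTypes: Seq[String]): Option[String] =
    if (childTypes.slice(2, childTypes.length).exists(t => !supported.contains(t)))
      Some("arguments from the third require boolean, byte, short, " +
        "integer, long, float, double or string expressions")
    else None
}
```

With this sketch, `ReflectTypeCheck.check(Seq("string", "string", "timestamp"))` yields the failure message, while an all-supported argument list yields `None`.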

## How was this patch tested?

Tests added in `CallMethodViaReflection`.

Author: hyukjinkwon <gurwls223@gmail.com>

Closes #15694 from HyukjinKwon/SPARK-18179.
This commit is contained in:
hyukjinkwon 2016-11-22 22:25:27 -08:00 committed by Reynold Xin
parent 982b82e32e
commit 2559fb4b40
2 changed files with 13 additions and 0 deletions

```diff
@@ -65,6 +65,10 @@ case class CallMethodViaReflection(children: Seq[Expression])
       TypeCheckFailure("first two arguments should be string literals")
     } else if (!classExists) {
       TypeCheckFailure(s"class $className not found")
+    } else if (children.slice(2, children.length)
+        .exists(e => !CallMethodViaReflection.typeMapping.contains(e.dataType))) {
+      TypeCheckFailure("arguments from the third require boolean, byte, short, " +
+        "integer, long, float, double or string expressions")
     } else if (method == null) {
       TypeCheckFailure(s"cannot find a static method that matches the argument types in $className")
     } else {
```

```diff
@@ -17,6 +17,8 @@
 package org.apache.spark.sql.catalyst.expressions

+import java.sql.Timestamp
+
 import org.apache.spark.SparkFunSuite
 import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.TypeCheckFailure
 import org.apache.spark.sql.types.{IntegerType, StringType}
@@ -85,6 +87,13 @@ class CallMethodViaReflectionSuite extends SparkFunSuite with ExpressionEvalHelp
     assert(createExpr(staticClassName, "method1").checkInputDataTypes().isSuccess)
   }

+  test("unsupported type checking") {
+    val ret = createExpr(staticClassName, "method1", new Timestamp(1)).checkInputDataTypes()
+    assert(ret.isFailure)
+    val errorMsg = ret.asInstanceOf[TypeCheckFailure].message
+    assert(errorMsg.contains("arguments from the third require boolean, byte, short"))
+  }
+
   test("invoking methods using acceptable types") {
     checkEvaluation(createExpr(staticClassName, "method1"), "m1")
     checkEvaluation(createExpr(staticClassName, "method2", 2), "m2")
```
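Once the arguments pass the type check, `reflect`/`java_method` conceptually resolves a matching static method via plain Java reflection. The sketch below is a hypothetical simplification of that lookup (the function name and signature are illustrative, not Spark's): it finds a public static method with the given name and arity.

```scala
import java.lang.reflect.Modifier

// Hypothetical sketch of the method lookup performed after type checking:
// find a public static method with a matching name and parameter count.
def findStaticMethod(className: String, methodName: String,
    argCount: Int): Option[java.lang.reflect.Method] = {
  val clazz = Class.forName(className)
  clazz.getMethods.find { m =>
    m.getName == methodName &&
      Modifier.isStatic(m.getModifiers) &&
      m.getParameterCount == argCount
  }
}
```

For example, `findStaticMethod("java.lang.String", "valueOf", 1)` finds an overload of `String.valueOf`; when no method matches, the `None` result corresponds to the existing `cannot find a static method that matches the argument types` failure branch shown in the diff above.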