[SPARK-28685][SQL][TEST] Test HMS 2.0.0+ in VersionsSuite/HiveClientSuites on JDK 11

## What changes were proposed in this pull request?

It seems Datanucleus 3.x does not support JDK 11:
```java
[info]   Cause: org.datanucleus.exceptions.NucleusException: The java type java.lang.Long (jdbc-type="", sql-type="") cant be mapped for this datastore. No mapping is available.
[info]   at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.getDatastoreMappingClass(RDBMSMappingManager.java:1215)
[info]   at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.createDatastoreMapping(RDBMSMappingManager.java:1378)
[info]   at org.datanucleus.store.rdbms.table.AbstractClassTable.addDatastoreId(AbstractClassTable.java:392)
[info]   at org.datanucleus.store.rdbms.table.ClassTable.initializePK(ClassTable.java:1087)
[info]   at org.datanucleus.store.rdbms.table.ClassTable.preInitialize(ClassTable.java:247)
```

Hive upgraded Datanucleus to 4.x in Hive 2.0 ([HIVE-6113](https://issues.apache.org/jira/browse/HIVE-6113)). This PR therefore makes the suites skip `0.12`, `0.13`, `0.14`, `1.0`, `1.1` and `1.2` when testing with JDK 11.
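The version gating can be sketched in isolation as follows. This is a minimal, dependency-free sketch rather than the suites' actual code: the suites call commons-lang3's `SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)` (see the diff below), which is approximated here with the standard `java.specification.version` property.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class VersionGate {
    // All Hive metastore client versions the suites normally exercise.
    static final List<String> ALL_VERSIONS = Arrays.asList(
        "0.12", "0.13", "0.14", "1.0", "1.1", "1.2",
        "2.0", "2.1", "2.2", "2.3", "3.0", "3.1");

    // java.specification.version is "1.8" (or lower) before JDK 9,
    // then "9", "10", "11", ... afterwards.
    static boolean isJava9OrLater(String spec) {
        return !spec.startsWith("1.");
    }

    // On JDK 9+ keep only HMS 2.0.0+, which ships Datanucleus 4.x.
    static List<String> testableVersions(String jdkSpec) {
        if (isJava9OrLater(jdkSpec)) {
            return ALL_VERSIONS.stream()
                .filter(v -> Integer.parseInt(v.split("\\.")[0]) >= 2)
                .collect(Collectors.toList());
        }
        return ALL_VERSIONS;
    }

    public static void main(String[] args) {
        // On JDK 11 the pre-2.0 versions are skipped.
        System.out.println(testableVersions("11"));   // [2.0, 2.1, 2.2, 2.3, 3.0, 3.1]
        // On JDK 8 all twelve versions are kept.
        System.out.println(testableVersions("1.8").size());   // 12
    }
}
```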

Note that this PR does not fix the `sql read hive materialized view` test failures; that is a separate issue:
```
3.0: sql read hive materialized view *** FAILED *** (1 second, 521 milliseconds)
3.1: sql read hive materialized view *** FAILED *** (1 second, 536 milliseconds)
```

## How was this patch tested?

Manual tests:
```shell
export JAVA_HOME="/usr/lib/jdk-11.0.3"
build/sbt "hive/test-only *.VersionsSuite *.HiveClientSuites" -Phive -Phadoop-3.2
```

Closes #25405 from wangyum/SPARK-28685.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
Yuming Wang 2019-08-10 17:01:15 -07:00 committed by Dongjoon Hyun
parent 47af8925b6
commit 58cc0df59e
2 changed files with 14 additions and 5 deletions


`HiveClientVersions.scala`:
```diff
@@ -19,10 +19,13 @@ package org.apache.spark.sql.hive.client
 import scala.collection.immutable.IndexedSeq
+import org.apache.commons.lang3.{JavaVersion, SystemUtils}
 import org.apache.spark.SparkFunSuite
 private[client] trait HiveClientVersions {
-  protected val versions =
+  protected val versions = if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)) {
+    IndexedSeq("2.0", "2.1", "2.2", "2.3", "3.0", "3.1")
+  } else {
+    IndexedSeq("0.12", "0.13", "0.14", "1.0", "1.1", "1.2", "2.0", "2.1", "2.2", "2.3", "3.0",
+      "3.1")
+  }
 }
```


`VersionsSuite.scala`:
```diff
@@ -20,6 +20,7 @@ package org.apache.spark.sql.hive.client
 import java.io.{ByteArrayOutputStream, File, PrintStream, PrintWriter}
 import java.net.URI
+import org.apache.commons.lang3.{JavaVersion, SystemUtils}
 import org.apache.hadoop.conf.Configuration
 import org.apache.hadoop.hive.common.StatsSetupConst
 import org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
@@ -102,8 +103,11 @@ class VersionsSuite extends SparkFunSuite with Logging {
     assert(getNestedMessages(e) contains "Unknown column 'A0.OWNER_NAME' in 'field list'")
   }
-  private val versions =
+  private val versions = if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)) {
+    Seq("2.0", "2.1", "2.2", "2.3", "3.0", "3.1")
+  } else {
+    Seq("0.12", "0.13", "0.14", "1.0", "1.1", "1.2", "2.0", "2.1", "2.2", "2.3", "3.0", "3.1")
+  }
   private var client: HiveClient = null
@@ -323,7 +327,8 @@ class VersionsSuite extends SparkFunSuite with Logging {
     }
     test(s"$version: dropTable") {
-      val versionsWithoutPurge = versions.takeWhile(_ != "0.14")
+      val versionsWithoutPurge =
+        if (versions.contains("0.14")) versions.takeWhile(_ != "0.14") else Nil
       // First try with the purge option set. This should fail if the version is < 0.14, in which
       // case we check the version and try without it.
       try {
@@ -478,7 +483,8 @@ class VersionsSuite extends SparkFunSuite with Logging {
     test(s"$version: dropPartitions") {
       val spec = Map("key1" -> "1", "key2" -> "3")
-      val versionsWithoutPurge = versions.takeWhile(_ != "1.2")
+      val versionsWithoutPurge =
+        if (versions.contains("1.2")) versions.takeWhile(_ != "1.2") else Nil
       // Similar to dropTable; try with purge set, and if it fails, make sure we're running
       // with a version that is older than the minimum (1.2 in this case).
       try {
```
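The `contains` guard in the two `versionsWithoutPurge` changes matters because `takeWhile(_ != "0.14")` assumes the cutoff version is present in the list: with the JDK 11 list starting at `2.0`, an unguarded `takeWhile` returns the entire list and would wrongly mark every tested version as pre-purge. A dependency-free sketch of the difference (helper names here are illustrative, not from the patch):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class PurgeGuard {
    // Equivalent of Scala's versions.takeWhile(_ != marker):
    // every version strictly before `marker` in the list.
    static List<String> takeWhileNot(List<String> versions, String marker) {
        List<String> out = new ArrayList<>();
        for (String v : versions) {
            if (v.equals(marker)) break;
            out.add(v);
        }
        return out;
    }

    // The fixed logic: if the cutoff version was skipped entirely
    // (as on JDK 11), no tested version predates purge support.
    static List<String> versionsWithoutPurge(List<String> versions, String marker) {
        return versions.contains(marker)
            ? takeWhileNot(versions, marker)
            : Collections.emptyList();
    }

    public static void main(String[] args) {
        List<String> jdk11 = Arrays.asList("2.0", "2.1", "2.2", "2.3", "3.0", "3.1");
        // Unguarded takeWhile never hits "0.14", so it keeps all six
        // versions -- wrongly treating each one as pre-purge.
        System.out.println(takeWhileNot(jdk11, "0.14").size());   // 6
        // The guarded version correctly yields an empty list.
        System.out.println(versionsWithoutPurge(jdk11, "0.14"));  // []
    }
}
```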