[SPARK-28703][SQL][TEST] Skip HiveExternalCatalogVersionsSuite and 3 tests in HiveSparkSubmitSuite at JDK9+

## What changes were proposed in this pull request?
This PR skips more tests when testing with `JAVA_9` or later:
1. Skip `HiveExternalCatalogVersionsSuite` when testing with `JAVA_9` or later, because none of our previous Spark releases supports `JAVA_9` or later.

2. Skip 3 tests in `HiveSparkSubmitSuite` because these tests set `spark.sql.hive.metastore.version` lower than `2.0`, and DataNucleus 3.x does not appear to support `JAVA_9` or later. Hive upgraded DataNucleus to 4.x in Hive 2.0 ([HIVE-6113](https://issues.apache.org/jira/browse/HIVE-6113)); on JDK9+ the old metastore library fails with:

```
[info]   Cause: org.datanucleus.exceptions.NucleusException: The java type java.lang.Long (jdbc-type="", sql-type="") cant be mapped for this datastore. No mapping is available.
[info]   at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.getDatastoreMappingClass(RDBMSMappingManager.java:1215)
[info]   at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.createDatastoreMapping(RDBMSMappingManager.java:1378)
[info]   at org.datanucleus.store.rdbms.table.AbstractClassTable.addDatastoreId(AbstractClassTable.java:392)
[info]   at org.datanucleus.store.rdbms.table.ClassTable.initializePK(ClassTable.java:1087)
[info]   at org.datanucleus.store.rdbms.table.ClassTable.preInitialize(ClassTable.java:247)
```

Please note that this excludes only the tests related to the old metastore library; some other tests in `HiveSparkSubmitSuite` still fail on JDK9+.
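
For reference, both suites use the same guard pattern: compute the JDK version once via commons-lang3's `SystemUtils`, skip the expensive `beforeAll()` setup, and call `assume(...)` inside each affected test so it is reported as canceled rather than failed on JDK9+. Below is a minimal, self-contained sketch of the pattern (the suite name and the `prepare()` body are hypothetical illustrations, not part of this PR; ScalaTest 3.0-style imports assumed):

```scala
import org.apache.commons.lang3.{JavaVersion, SystemUtils}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Hypothetical suite illustrating the JDK9+ guards used in this PR.
class Jdk9GuardExampleSuite extends FunSuite with BeforeAndAfterAll {
  private val isTestAtLeastJava9 = SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)

  // Expensive setup (e.g. downloading old Spark releases) is factored out of
  // beforeAll() so it can be skipped entirely on JDK9+.
  private def prepare(): Unit = {
    // ... download and stage old Spark/Hive artifacts ...
  }

  override protected def beforeAll(): Unit = {
    super.beforeAll()
    if (!isTestAtLeastJava9) {
      prepare()
    }
  }

  test("backward compatibility") {
    // assume() cancels the test instead of failing it when the condition
    // is false, so it shows up as "!!! CANCELED !!!" in the test report.
    assume(!isTestAtLeastJava9)
    // ... assertions that only make sense on JDK8 ...
  }
}
```

The `assume` guard is what produces the `!!! CANCELED !!!` lines in the JDK 11 output below.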

## How was this patch tested?

Manual tests with JDK 11:
```
[info] HiveExternalCatalogVersionsSuite:
[info] - backward compatibility !!! CANCELED !!! (37 milliseconds)

[info] HiveSparkSubmitSuite:
...
[info] - SPARK-8020: set sql conf in spark conf !!! CANCELED !!! (30 milliseconds)
[info]   org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(JAVA_9) was true (HiveSparkSubmitSuite.scala:130)
...
[info] - SPARK-9757 Persist Parquet relation with decimal column !!! CANCELED !!! (1 millisecond)
[info]   org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(JAVA_9) was true (HiveSparkSubmitSuite.scala:168)
...
[info] - SPARK-16901: set javax.jdo.option.ConnectionURL !!! CANCELED !!! (1 millisecond)
[info]   org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(JAVA_9) was true (HiveSparkSubmitSuite.scala:260)
...
```

Closes #25426 from wangyum/SPARK-28703.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>

`sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala`:

```diff
@@ -24,6 +24,7 @@ import java.nio.file.{Files, Paths}
 import scala.sys.process._
 import scala.util.control.NonFatal
 
+import org.apache.commons.lang3.{JavaVersion, SystemUtils}
 import org.apache.hadoop.conf.Configuration
 
 import org.apache.spark.{SecurityManager, SparkConf, TestUtils}
@@ -45,6 +46,7 @@ import org.apache.spark.util.Utils
  * downloading for this spark version.
  */
 class HiveExternalCatalogVersionsSuite extends SparkSubmitTestUtils {
+  private val isTestAtLeastJava9 = SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)
   private val wareHousePath = Utils.createTempDir(namePrefix = "warehouse")
   private val tmpDataDir = Utils.createTempDir(namePrefix = "test-data")
   // For local test, you can set `sparkTestingDir` to a static value like `/tmp/test-spark`, to
@@ -137,9 +139,7 @@ class HiveExternalCatalogVersionsSuite extends SparkSubmitTestUtils {
     new String(Files.readAllBytes(contentPath), StandardCharsets.UTF_8)
   }
 
-  override def beforeAll(): Unit = {
-    super.beforeAll()
-
+  private def prepare(): Unit = {
     val tempPyFile = File.createTempFile("test", ".py")
     // scalastyle:off line.size.limit
     Files.write(tempPyFile.toPath,
@@ -201,7 +201,16 @@ class HiveExternalCatalogVersionsSuite extends SparkSubmitTestUtils {
     tempPyFile.delete()
   }
 
+  override def beforeAll(): Unit = {
+    super.beforeAll()
+    if (!isTestAtLeastJava9) {
+      prepare()
+    }
+  }
+
   test("backward compatibility") {
+    // TODO SPARK-28704 Test backward compatibility on JDK9+ once we have a version supports JDK9+
+    assume(!isTestAtLeastJava9)
     val args = Seq(
       "--class", PROCESS_TABLES.getClass.getName.stripSuffix("$"),
       "--name", "HiveExternalCatalog backward compatibility test",
```

`sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala`:

```diff
@@ -21,6 +21,7 @@ import java.io.{BufferedWriter, File, FileWriter}
 
 import scala.util.Properties
 
+import org.apache.commons.lang3.{JavaVersion, SystemUtils}
 import org.apache.hadoop.fs.Path
 import org.scalatest.{BeforeAndAfterEach, Matchers}
@@ -126,6 +127,7 @@ class HiveSparkSubmitSuite
   }
 
   test("SPARK-8020: set sql conf in spark conf") {
+    assume(!SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9))
     val unusedJar = TestUtils.createJarWithClasses(Seq.empty)
     val args = Seq(
       "--class", SparkSQLConfTest.getClass.getName.stripSuffix("$"),
@@ -163,6 +165,7 @@ class HiveSparkSubmitSuite
   }
 
   test("SPARK-9757 Persist Parquet relation with decimal column") {
+    assume(!SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9))
     val unusedJar = TestUtils.createJarWithClasses(Seq.empty)
     val args = Seq(
       "--class", SPARK_9757.getClass.getName.stripSuffix("$"),
@@ -254,6 +257,7 @@ class HiveSparkSubmitSuite
   }
 
   test("SPARK-16901: set javax.jdo.option.ConnectionURL") {
+    assume(!SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9))
     // In this test, we set javax.jdo.option.ConnectionURL and set metastore version to
     // 0.13. This test will make sure that javax.jdo.option.ConnectionURL will not be
     // overridden by hive's default settings when we create a HiveConf object inside
```