spark-instrumented-optimizer/extras/java8-tests
Sean Owen 6ab96a6fd0 SPARK-2749 [BUILD]. Spark SQL Java tests aren't compiling in Jenkins' Maven builds; missing junit:junit dep
The Maven-based builds in the build matrix have been failing for a few days:

https://amplab.cs.berkeley.edu/jenkins/view/Spark/

On inspection, it looks like the Spark SQL Java tests don't compile:

https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-Master-Maven-pre-YARN/hadoop.version=1.0.4,label=centos/244/consoleFull

I confirmed it by repeating the command against master:

`mvn -Dhadoop.version=1.0.4 -Dlabel=centos -DskipTests clean package`

The problem is that this module doesn't depend on JUnit. In fact, none of the modules do, but `com.novocode:junit-interface` (the SBT-JUnit bridge) pulls it in transitively in most places. However, this module doesn't depend on `com.novocode:junit-interface`.

Adding the `junit:junit` dependency fixes the compile problem. In fact, the other modules with Java tests should probably depend on it explicitly instead of happening to get it via `com.novocode:junit-interface`, since that is a bit SBT/Scala-specific (and I am not even sure it's needed).
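
For illustration, the fix amounts to declaring the dependency directly in the module's pom.xml. A minimal sketch, assuming the version is managed by the parent POM (the exact coordinates and scope in the actual commit may differ):

    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <scope>test</scope>
    </dependency>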

Author: Sean Owen <srowen@gmail.com>

Closes #1660 from srowen/SPARK-2749 and squashes the following commits:

858ff7c [Sean Owen] Add explicit junit dep to other modules with Java tests for robustness
9636794 [Sean Owen] Add junit dep so that Spark SQL Java tests compile
2014-07-30 15:04:33 -07:00
src/test SPARK-1973. Add randomSplit to JavaRDD (with tests, and tidy Java tests) 2014-06-04 11:27:08 -07:00
pom.xml SPARK-2749 [BUILD]. Spark SQL Java tests aren't compiling in Jenkins' Maven builds; missing junit:junit dep 2014-07-30 15:04:33 -07:00
README.md [java8API] SPARK-964 Investigate the potential for using JDK 8 lambda expressions for the Java/Scala APIs 2014-03-03 22:31:30 -08:00

Java 8 Test Suites

These tests require having Java 8 installed and are isolated from the main Spark build. If Java 8 is not your system's default Java version, you will need to point Spark's build to your Java location. The set-up depends a bit on the build system:

  • sbt users can either set `JAVA_HOME` to the location of a Java 8 JDK or explicitly pass `-java-home` to the sbt launch script. If a Java 8 JDK is detected, sbt will automatically include the Java 8 test project.

    $ JAVA_HOME=/opt/jdk1.8.0/ sbt/sbt clean "test-only org.apache.spark.Java8APISuite"

  • Maven users can also refer to their Java 8 directory using `JAVA_HOME`. However, Maven will not automatically detect the presence of a Java 8 JDK, so the special build profile `-Pjava8-tests` must be passed explicitly.

    $ JAVA_HOME=/opt/jdk1.8.0/ mvn clean install -DskipTests
    $ JAVA_HOME=/opt/jdk1.8.0/ mvn test -Pjava8-tests -DwildcardSuites=org.apache.spark.Java8APISuite

    Note that the above commands can only be run from the project root directory, since this module depends on core and on the test jars of core and streaming. This means an install step is required to make the test dependencies visible to the Java 8 sub-project.
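
For context, Java8APISuite exercises Spark's Java API using JDK 8 lambda expressions in place of anonymous inner classes. A minimal sketch of that style of test (the class name and data below are illustrative, not taken from the actual suite):

    import java.util.Arrays;

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.junit.Assert;
    import org.junit.Test;

    public class LambdaStyleExample {
      @Test
      public void mapWithLambda() {
        JavaSparkContext sc = new JavaSparkContext("local", "lambda-example");
        try {
          JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3));
          // Before Java 8, this map required an anonymous Function<Integer, Integer> class
          JavaRDD<Integer> doubled = rdd.map(x -> x * 2);
          Assert.assertEquals(Arrays.asList(2, 4, 6), doubled.collect());
        } finally {
          sc.stop();
        }
      }
    }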