spark-instrumented-optimizer/extras/java8-tests
Prashant Sharma 628932b8d0 [SPARK-1776] Have Spark's SBT build read dependencies from Maven.
This patch introduces the new way of working while retaining the existing ways of doing things.

For example, the Maven build instruction for YARN is
`mvn -Pyarn -Phadoop-2.2 clean package -DskipTests`
In sbt this can become
`MAVEN_PROFILES="yarn, hadoop-2.2" sbt/sbt clean assembly`
It also supports
`sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 clean assembly`
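
As a rough illustration of the mechanism, the following standalone Scala sketch shows one way an sbt build definition could turn the MAVEN_PROFILES variable above into a profile list for the pom reader. The object name, the splitting rules, and the assumption that -P flags are rewritten into the same variable by the launch script are all illustrative, not taken from the actual SparkBuild code.

    // Hypothetical sketch: collect Maven profiles for the sbt/pom-reader build.
    // Assumes profiles arrive via the MAVEN_PROFILES environment variable, as in
    // the example above; -Pyarn style flags are assumed to be rewritten into the
    // same variable by the sbt launch script (an assumption about the launcher).
    object MavenProfiles {
      def selected: Seq[String] =
        sys.env.get("MAVEN_PROFILES")            // e.g. "yarn, hadoop-2.2"
          .toSeq
          .flatMap(_.split("[,\\s]+"))           // split on commas and whitespace
          .filter(_.nonEmpty)

      def main(args: Array[String]): Unit =
        println("Active Maven profiles: " + selected.mkString(", "))
    }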

Author: Prashant Sharma <prashant.s@imaginea.com>
Author: Patrick Wendell <pwendell@gmail.com>

Closes #772 from ScrapCodes/sbt-maven and squashes the following commits:

a8ac951 [Prashant Sharma] Updated sbt version.
62b09bb [Prashant Sharma] Improvements.
fa6221d [Prashant Sharma] Excluding sql from mima
4b8875e [Prashant Sharma] Sbt assembly no longer builds tools by default.
72651ca [Prashant Sharma] Addresses code review comments.
acab73d [Prashant Sharma] Revert "Small fix to run-examples script."
ac4312c [Prashant Sharma] Revert "minor fix"
6af91ac [Prashant Sharma] Ported oldDeps back. + fixes issues with prev commit.
65cf06c [Prashant Sharma] Servlet API jars conflict with the other servlet jars on the class path.
446768e [Prashant Sharma] minor fix
89b9777 [Prashant Sharma] Merge conflicts
d0a02f2 [Prashant Sharma] Bumped up pom versions; since the build now depends on the pom, it is better updated there. + general cleanups.
dccc8ac [Prashant Sharma] updated mima to check against 1.0
a49c61b [Prashant Sharma] Fix for tools jar
a2f5ae1 [Prashant Sharma] Fixes a bug in dependencies.
cf88758 [Prashant Sharma] cleanup
9439ea3 [Prashant Sharma] Small fix to run-examples script.
96cea1f [Prashant Sharma] SPARK-1776 Have Spark's SBT build read dependencies from Maven.
36efa62 [Patrick Wendell] Set project name in pom files and added eclipse/intellij plugins.
4973dbd [Patrick Wendell] Example build using pom reader.
2014-07-10 11:03:37 -07:00
src/test SPARK-1973. Add randomSplit to JavaRDD (with tests, and tidy Java tests) 2014-06-04 11:27:08 -07:00
pom.xml [SPARK-1776] Have Spark's SBT build read dependencies from Maven. 2014-07-10 11:03:37 -07:00
README.md [java8API] SPARK-964 Investigate the potential for using JDK 8 lambda expressions for the Java/Scala APIs 2014-03-03 22:31:30 -08:00

Java 8 Test Suites

These tests require having Java 8 installed and are isolated from the main Spark build. If Java 8 is not your system's default Java version, you will need to point Spark's build to your Java location. The set-up depends a bit on the build system:

  • Sbt users can either set JAVA_HOME to the location of a Java 8 JDK or explicitly pass -java-home to the sbt launch script. If a Java 8 JDK is detected, sbt will automatically include the Java 8 test project (a sketch of this detection follows the list below).

    $ JAVA_HOME=/opt/jdk1.8.0/ sbt/sbt clean "test-only org.apache.spark.Java8APISuite"

  • Maven users can also refer to their Java 8 directory using JAVA_HOME. However, Maven will not automatically detect the presence of a Java 8 JDK, so the special build profile -Pjava8-tests must be used.

    $ JAVA_HOME=/opt/jdk1.8.0/ mvn clean install -DskipTests
    $ JAVA_HOME=/opt/jdk1.8.0/ mvn test -Pjava8-tests -DwildcardSuites=org.apache.spark.Java8APISuite

    Note that the above commands can only be run from the project root directory, since this module depends on core and the test-jars of core and streaming. This means an install step is required to make the test dependencies visible to the Java 8 sub-project.
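
As referenced in the sbt bullet above, here is a small, hypothetical Scala sketch of the kind of Java 8 detection the sbt build is described as performing; the object name and messages are illustrative, not the actual SparkBuild code.

    // Hypothetical sketch of the Java 8 JDK check used to decide whether the
    // java8-tests project is included; not taken from SparkBuild.
    object Java8Check {
      // "java.specification.version" is "1.8" on a Java 8 JVM.
      def isJava8: Boolean =
        sys.props.getOrElse("java.specification.version", "").startsWith("1.8")

      def main(args: Array[String]): Unit =
        if (isJava8)
          println("Java 8 JDK detected: the java8-tests project would be included.")
        else
          println("No Java 8 JDK detected: the java8-tests project would be skipped.")
    }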