---
layout: global
title: Building Spark
redirect_from: "building-with-maven.html"
license: |
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
---
* This will become a table of contents (this text will be scraped).
{:toc}
# Building Apache Spark
## Apache Maven
The Maven-based build is the build of reference for Apache Spark.
Building Spark using Maven requires Maven 3.6.1 and Java 8.
Spark requires Scala 2.12; support for Scala 2.11 was removed in Spark 3.0.0.
### Setting up Maven's Memory Usage
You'll need to configure Maven to use more memory than usual by setting `MAVEN_OPTS`:
    export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
(The `ReservedCodeCacheSize` setting is optional but recommended.)
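
If you build Spark often, you may want to persist this setting; a minimal sketch, assuming a bash shell whose profile file is `~/.bashrc`:

    # Append the recommended Maven options to the bash profile (profile path is illustrative)
    echo 'export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"' >> ~/.bashrc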
If you don't add these parameters to `MAVEN_OPTS`, you may see errors and warnings like the following:
    [INFO] Compiling 203 Scala sources and 9 Java sources to /Users/me/Development/spark/core/target/scala-{{site.SCALA_BINARY_VERSION}}/classes...
    [ERROR] Java heap space -> [Help 1]
You can fix these problems by setting the `MAVEN_OPTS` variable as discussed before.
**Note:**
* If using `build/mvn` with no `MAVEN_OPTS` set, the script will automatically add the above options to the `MAVEN_OPTS` environment variable.
* The `test` phase of the Spark build will automatically add these options to `MAVEN_OPTS`, even when not using `build/mvn`.
### build/mvn
Spark now comes packaged with a self-contained Maven installation, located under the `build/` directory, to ease building and deployment of Spark from source. This script will automatically download and set up all necessary build requirements ([Maven](https://maven.apache.org/), [Scala](http://www.scala-lang.org/), and [Zinc](https://github.com/typesafehub/zinc)) locally within the `build/` directory itself. It honors any `mvn` binary already present; however, it will pull down its own copy of Scala and Zinc regardless, to ensure the proper version requirements are met. `build/mvn` execution acts as a pass-through to the `mvn` call, allowing easy transition from previous build methods. As an example, one can build a version of Spark as follows:
    ./build/mvn -DskipTests clean package
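
Because `build/mvn` forwards its arguments directly to Maven, any standard Maven flag works through it; for example, you can check which Maven and Java versions the wrapper resolved:

    # Standard Maven flag, passed straight through the wrapper
    ./build/mvn --version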
Other build examples can be found below.
## Building a Runnable Distribution
To create a Spark distribution like those distributed by the
[Spark Downloads](https://spark.apache.org/downloads.html) page, and that is laid out so as
to be runnable, use `./dev/make-distribution.sh` in the project root directory. It can be configured
with Maven profile settings and so on like the direct Maven build. Example:
    ./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr -Phive -Phive-thriftserver -Pmesos -Pyarn -Pkubernetes
This will build the Spark distribution along with Python pip and R packages. For more information on usage, run `./dev/make-distribution.sh --help`.
## Specifying the Hadoop Version and Enabling YARN
You can specify the exact version of Hadoop to compile against through the `hadoop.version` property.
You can enable the `yarn` profile and optionally set the `yarn.version` property if it is different
from `hadoop.version`.
Example:
    ./build/mvn -Pyarn -Dhadoop.version=2.8.5 -DskipTests clean package
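
If the YARN version running on your cluster differs from the Hadoop version you compile against, you can set both properties; a sketch with illustrative version numbers:

    # hadoop.version and yarn.version set independently (versions are illustrative)
    ./build/mvn -Pyarn -Dhadoop.version=2.7.7 -Dyarn.version=2.8.5 -DskipTests clean package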
## Building With Hive and JDBC Support
To enable Hive integration for Spark SQL along with its JDBC server and CLI,
add the `-Phive` and `-Phive-thriftserver` profiles to your existing build options.
By default Spark will build with Hive 1.2.1 bindings.
    # With Hive 1.2.1 support
    ./build/mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package
## Packaging without Hadoop Dependencies for YARN
The assembly directory produced by `mvn package` will, by default, include all of Spark's
dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this
causes multiple versions of these to appear on executor classpaths: the version packaged in
the Spark assembly and the version on each node, included with `yarn.application.classpath`.
The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects,
like ZooKeeper and Hadoop itself.
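
A sketch of such a build, combining `hadoop-provided` with the `yarn` profile (the exact profile combination depends on your deployment):

    # Build an assembly that expects the cluster to provide Hadoop and its ecosystem
    ./build/mvn -Pyarn -Phadoop-provided -DskipTests clean package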
## Building with Mesos support

    ./build/mvn -Pmesos -DskipTests clean package
## Building with Kubernetes support

    ./build/mvn -Pkubernetes -DskipTests clean package
## Building submodules individually
It's possible to build Spark submodules using the `mvn -pl` option.
For instance, you can build the Spark Streaming module using:
    ./build/mvn -pl :spark-streaming_{{site.SCALA_BINARY_VERSION}} clean install
where `spark-streaming_{{site.SCALA_BINARY_VERSION}}` is the `artifactId` as defined in the `streaming/pom.xml` file.
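
If Spark's modules have not yet been installed to your local Maven repository, you can also build a submodule together with the Spark modules it depends on using Maven's standard `-am` (also-make) flag; for example:

    # Build spark-streaming plus the Spark modules it depends on
    ./build/mvn -pl :spark-streaming_{{site.SCALA_BINARY_VERSION}} -am clean install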
## Continuous Compilation
We use the scala-maven-plugin, which supports incremental and continuous compilation. E.g.
    ./build/mvn scala:cc
should run continuous compilation (i.e. wait for changes). However, this has not been tested
extensively. A couple of gotchas to note:
* it only scans the paths `src/main` and `src/test` (see
[docs](http://davidb.github.io/scala-maven-plugin/example_cc.html)), so it will only work
from within certain submodules that have that structure.
* you'll typically need to run `mvn install` from the project root for compilation within
specific submodules to work; this is because submodules that depend on other submodules do so via
the `spark-parent` module.
Thus, the full flow for running continuous compilation of the `core` submodule may look more like:
    $ ./build/mvn install
    $ cd core
    $ ../build/mvn scala:cc
## Building with SBT
Maven is the official build tool recommended for packaging Spark, and is the *build of reference*.
But SBT is supported for day-to-day development since it can provide much faster iterative
compilation. More advanced developers may wish to use SBT.
The SBT build is derived from the Maven POM files, and so the same Maven profiles and variables
can be set to control the SBT build. For example:
    ./build/sbt package
To avoid the overhead of launching sbt each time you need to re-compile, you can launch sbt
in interactive mode by running `build/sbt`, and then run all build commands at the command
prompt.
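
For example, a minimal interactive session might look like the following (the tasks at the `>` prompt are illustrative):

    $ ./build/sbt
    > package
    > test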
### Setting up SBT's Memory Usage
Configure the JVM options for SBT in `.jvmopts` at the project root, for example:

    -Xmx2g
    -XX:ReservedCodeCacheSize=512m

For the meanings of these two options, please carefully read the [Setting up Maven's Memory Usage section](http://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage).
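
A quick way to create this file from the shell (a minimal sketch; any editor works just as well):

    # Write the recommended JVM options for SBT into .jvmopts at the project root
    printf -- '-Xmx2g\n-XX:ReservedCodeCacheSize=512m\n' > .jvmopts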
## Speeding up Compilation
Developers who compile Spark frequently may want to speed up compilation; e.g., by using Zinc
(for developers who build with Maven) or by avoiding re-compilation of the assembly JAR (for
developers who build with SBT). For more information about how to do this, refer to the
[Useful Developer Tools page](https://spark.apache.org/developer-tools.html#reducing-build-times).
## Encrypted Filesystems
When building on an encrypted filesystem (if your home directory is encrypted, for example), the Spark build might fail with a "Filename too long" error. As a workaround, add the following in the configuration args of the `scala-maven-plugin` in the project `pom.xml`:
    <arg>-Xmax-classfile-name</arg>
    <arg>128</arg>
and in `project/SparkBuild.scala` add:
    scalacOptions in Compile ++= Seq("-Xmax-classfile-name", "128"),
to the `sharedSettings` val. See also [this PR](https://github.com/apache/spark/pull/2883/files) if you are unsure of where to add these lines.
## IntelliJ IDEA or Eclipse
For help in setting up IntelliJ IDEA or Eclipse for Spark development, and troubleshooting, refer to the
[Useful Developer Tools page](https://spark.apache.org/developer-tools.html).
# Running Tests
Tests are run by default via the [ScalaTest Maven plugin](http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin).
Note that tests should not be run as root or an admin user.
The following is an example of a command to run the tests:
    ./build/mvn test
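
You can also combine this with the `-pl` flag described earlier to run only one module's tests; a sketch, assuming Spark's modules are already installed in your local Maven repository:

    # Run only the tests of the spark-streaming module
    ./build/mvn -pl :spark-streaming_{{site.SCALA_BINARY_VERSION}} test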
## Testing with SBT
The following is an example of a command to run the tests:
    ./build/sbt test
## Running Individual Tests
For information about how to run individual tests, refer to the
[Useful Developer Tools page](https://spark.apache.org/developer-tools.html#running-individual-tests).
## PySpark pip installable
If you are building Spark for use in a Python environment and you wish to pip install it, you will first need to build the Spark JARs as described above. Then you can build an sdist package suitable for pip installation:

    cd python; python setup.py sdist

**Note:** Due to packaging requirements you cannot directly pip install from the Python directory; rather, you must first build the sdist package as described above.
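
Once built, the sdist can be installed with pip; a sketch, run from the project root (the exact artifact name depends on the Spark version being built):

    # Install the sdist produced by setup.py (filename pattern is illustrative)
    pip install python/dist/pyspark-*.tar.gz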
Alternatively, you can also run `./dev/make-distribution.sh` with the `--pip` option.
## PySpark Tests with Maven or SBT
If you are building PySpark and wish to run the PySpark tests, you will need to build Spark with Hive support.

    ./build/mvn -DskipTests clean package -Phive
    ./python/run-tests
If you are building PySpark with SBT and wish to run the PySpark tests, you will need to build Spark with Hive support and also build the test components:

    ./build/sbt -Phive clean package
    ./build/sbt test:compile
    ./python/run-tests
The run-tests script can also be limited to a specific Python version or a specific module:

    ./python/run-tests --python-executables=python --modules=pyspark-sql

## Running R Tests
To run the SparkR tests you will need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
    R -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
    R -e "devtools::install_version('testthat', version = '1.0.2', repos='http://cran.us.r-project.org')"
You can run just the SparkR tests using the command:
    ./R/run-tests.sh

## Running Docker-based Integration Test Suites
In order to run Docker integration tests, you have to install the `docker` engine on your box.
The instructions for installation can be found at [the Docker site](https://docs.docker.com/engine/installation/).
Once installed, the `docker` service needs to be started, if not already running.
On Linux, this can be done by `sudo service docker start`.

    ./build/mvn install -DskipTests
    ./build/mvn test -Pdocker-integration-tests -pl :spark-docker-integration-tests_{{site.SCALA_BINARY_VERSION}}
or

    ./build/sbt docker-integration-tests/test
## Change Scala Version
When other versions of Scala like 2.13 are supported, it will be possible to build for that version.
Change the major Scala version using (e.g. 2.13):
    ./dev/change-scala-version.sh 2.13
For Maven, please enable the profile (e.g. 2.13):
    ./build/mvn -Pscala-2.13 compile
For SBT, specify a complete Scala version using (e.g. 2.13.0):
    ./build/sbt -Dscala.version=2.13.0
Otherwise, the sbt-pom-reader plugin will use the `scala.version` specified in the spark-parent pom.
## Running Jenkins tests with GitHub Enterprise

To run tests with Jenkins:

    ./dev/run-tests-jenkins

If you use an individual repository or a repository on GitHub Enterprise, export the environment variables below before running the above command.
### Related environment variables

<table class="table">
<tr><th>Variable Name</th><th>Default</th><th>Meaning</th></tr>
<tr>
  <td><code>SPARK_PROJECT_URL</code></td>
  <td>https://github.com/apache/spark</td>
  <td>The Spark project URL of GitHub Enterprise.</td>
</tr>
<tr>
  <td><code>GITHUB_API_BASE</code></td>
  <td>https://api.github.com/repos/apache/spark</td>
  <td>The Spark project API server URL of GitHub Enterprise.</td>
</tr>
</table>
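
For example, to point the scripts at a hypothetical GitHub Enterprise instance (the host name is illustrative; GitHub Enterprise installations typically serve their REST API under `/api/v3`):

    export SPARK_PROJECT_URL="https://ghe.example.com/apache/spark"
    export GITHUB_API_BASE="https://ghe.example.com/api/v3/repos/apache/spark"
    ./dev/run-tests-jenkins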