Merge pull request #532 from andyk/master

SPARK-715: Adds instructions for building with Maven to documentation
Matei Zaharia 2013-03-20 19:29:23 -07:00
commit 4c5efcf600
4 changed files with 72 additions and 0 deletions


@@ -17,6 +17,8 @@ which is packaged with it. To build Spark and its example programs, run:
sbt/sbt package
Spark also supports building with Maven. If you would like to do so, see the [instructions for building Spark with Maven](http://spark-project.org/docs/latest/building-with-maven.html) in the Spark documentation.
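For example, a build against Hadoop 1 (one of the profiles covered in those instructions) looks like:
$ mvn -Phadoop1 clean install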
To run Spark, you will need to have Scala's bin directory in your `PATH`, or
you will need to set the `SCALA_HOME` environment variable to point to where
you've installed Scala. Scala must be accessible through one of these


@@ -90,6 +90,7 @@
<li class="dropdown">
<a href="api.html" class="dropdown-toggle" data-toggle="dropdown">More<b class="caret"></b></a>
<ul class="dropdown-menu">
<li><a href="building-with-maven.html">Building Spark with Maven</a></li>
<li><a href="configuration.html">Configuration</a></li>
<li><a href="tuning.html">Tuning Guide</a></li>
<li><a href="bagel-programming-guide.html">Bagel (Pregel on Spark)</a></li>


@@ -0,0 +1,66 @@
---
layout: global
title: Building Spark with Maven
---
* This will become a table of contents (this text will be scraped).
{:toc}
Building Spark using Maven requires Maven 3 (the build process is tested with Maven 3.0.4) and Java 1.6 or newer.
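As a quick sanity check before building, `mvn -version` reports both the Maven and Java versions on your PATH:
$ mvn -version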
Building with Maven requires that a Hadoop profile be specified explicitly on the command line; there is no default. There are two profiles to choose from: one builds against Hadoop 1 and the other against Hadoop 2.
For Hadoop 1 (using 0.20.205.0), use:
$ mvn -Phadoop1 clean install
For Hadoop 2 (using 2.0.0-mr1-cdh4.1.1), use:
$ mvn -Phadoop2 clean install
The build uses the scala-maven-plugin, which supports incremental and continuous compilation. For example:
$ mvn -Phadoop2 scala:cc
…should run continuous compilation (i.e. wait for changes). However, this has not been tested extensively.
## Spark Tests in Maven ##
Tests are run by default via the scalatest-maven-plugin. With this you can do things like:
To skip test execution (but not compilation):
$ mvn -DskipTests -Phadoop2 clean install
To run a specific test suite:
$ mvn -Phadoop2 -Dsuites=spark.repl.ReplSuite test
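To run the full test suite under a given profile (hadoop2 here, matching the examples above):
$ mvn -Phadoop2 test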
## Setting up JVM Memory Usage via Maven ##
You might run into the following errors if you're using a vanilla installation of Maven:
[INFO] Compiling 203 Scala sources and 9 Java sources to /Users/andyk/Development/spark/core/target/scala-2.9.2/classes...
[ERROR] PermGen space -> [Help 1]
[INFO] Compiling 203 Scala sources and 9 Java sources to /Users/andyk/Development/spark/core/target/scala-2.9.2/classes...
[ERROR] Java heap space -> [Help 1]
To fix these, you can do the following:
export MAVEN_OPTS="-Xmx1024m -XX:MaxPermSize=128M"
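A minimal end-to-end sketch with these settings (the sizes above are starting points; increase them if the errors persist):
$ export MAVEN_OPTS="-Xmx1024m -XX:MaxPermSize=128M"
$ mvn -Phadoop2 clean install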
## Using with IntelliJ IDEA ##
This setup works fine in IntelliJ IDEA 11.1.4. After opening the project via the pom.xml file in the project root folder, you only need to activate either the hadoop1 or hadoop2 profile in the "Maven Properties" popout. We have not tried Eclipse/Scala IDE with this.
## Building Spark Debian Packages ##
The Maven build includes support for building a Debian package containing a 'fat-jar' that includes the repl, the examples, and bagel. This can be created by specifying the deb profile:
$ mvn -Phadoop2,deb clean install
The Debian package can then be found under repl/target. We added the short commit hash to the file name so that we can distinguish individual packages built for SNAPSHOT versions.
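As a sketch, you could then locate and install it with standard Debian tooling (the exact file name, including the version and commit hash, will vary, hence the illustrative wildcard):
$ ls repl/target/*.deb
$ sudo dpkg -i repl/target/*.deb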


@@ -22,6 +22,8 @@ Spark uses [Simple Build Tool](https://github.com/harrah/xsbt/wiki), which is bu
sbt/sbt package
Spark also supports building with Maven. If you would like to do so, see the [instructions for building Spark with Maven](building-with-maven.html).
# Testing the Build
Spark comes with a number of sample programs in the `examples` directory.
@@ -72,6 +74,7 @@ of `project/SparkBuild.scala`, then rebuilding Spark (`sbt/sbt clean compile`).
**Other documents:**
* [Building Spark With Maven](building-with-maven.html): Build Spark using the Maven build tool
* [Configuration](configuration.html): customize Spark via its configuration system
* [Tuning Guide](tuning.html): best practices to optimize performance and memory use
* [Bagel](bagel-programming-guide.html): an implementation of Google's Pregel on Spark