# R on Spark

SparkR is an R package that provides a lightweight frontend to use Spark from R.

## SparkR development

### Build Spark

Build Spark with Maven and include the `-Psparkr` profile to build the R package. For example, to use the default Hadoop versions you can run

```sh
build/mvn -DskipTests -Psparkr package
```

### Running sparkR

You can start using SparkR by launching the SparkR shell with

```sh
./bin/sparkR
```

The `sparkR` script automatically creates a SparkContext, in local mode by default. To specify the Spark master of a cluster for the automatically created SparkContext, you can run

```sh
./bin/sparkR --master "local[2]"
```

To set other options, such as driver memory or executor memory, you can pass the usual spark-submit arguments to `./bin/sparkR`.
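For instance, a minimal sketch that combines a master setting with the standard spark-submit `--driver-memory` flag (the `2g` value is just an illustrative choice):

```sh
# Launch the SparkR shell with 2g of driver memory on a 4-core local master
./bin/sparkR --driver-memory 2g --master "local[4]"
```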

### Using SparkR from RStudio

If you wish to use SparkR from RStudio or other R frontends you will need to set some environment variables which point SparkR to your Spark installation. For example:

```R
# Set this to where Spark is installed
Sys.setenv(SPARK_HOME = "/Users/shivaram/spark")
# This line loads SparkR from the installed directory
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
sc <- sparkR.init(master = "local")
```
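A quick way to verify the setup is to turn a local data.frame into a Spark DataFrame. A minimal sketch, assuming the SparkR API of this README's era (Spark 1.4/1.5), where a SQLContext is created from the SparkContext; `faithful` is a dataset that ships with R:

```R
# Create a SQLContext from the existing SparkContext
sqlContext <- sparkRSQL.init(sc)
# Convert a local R data.frame into a distributed Spark DataFrame
df <- createDataFrame(sqlContext, faithful)
# Show the first few rows
head(df)
```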

### Making changes to SparkR

The instructions for making contributions to Spark also apply to SparkR. If you only make R file changes (i.e. no Scala changes), you can just re-install the R package using `R/install-dev.sh` and test your changes. Once you have made your changes, please include unit tests for them and run the existing unit tests using the `run-tests.sh` script as described below.
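As a sketch, one possible edit-test loop using the two scripts this README references (run from the top of the Spark checkout):

```sh
# Re-install the SparkR package after editing files under R/
R/install-dev.sh
# Run the existing SparkR unit tests against the re-installed package
R/run-tests.sh
```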

### Generating documentation

The SparkR documentation (Rd files and HTML files) is not part of the source repository. To generate it, you can run the script `R/create-docs.sh`. This script uses `devtools` and `knitr` to generate the docs, and these packages need to be installed on the machine before using the script.
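For example, the prerequisites can be installed from CRAN and the docs built in two commands (the mirror URL follows the one used in the testing section below):

```sh
# Install the packages create-docs.sh depends on, then generate the docs
R -e 'install.packages(c("devtools", "knitr"), repos = "http://cran.us.r-project.org")'
R/create-docs.sh
```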

### Examples, Unit tests

SparkR comes with several sample programs in the `examples/src/main/r` directory. To run one of them, use `./bin/sparkR <filename> <args>`. For example:

```sh
./bin/sparkR examples/src/main/r/dataframe.R
```

You can also run the unit tests for SparkR by running the following (you need to install the `testthat` package first):

```sh
R -e 'install.packages("testthat", repos = "http://cran.us.r-project.org")'
./R/run-tests.sh
```

### Running on YARN

The `./bin/spark-submit` and `./bin/sparkR` scripts can also be used to submit jobs to YARN clusters. You will need to set the YARN conf dir before doing so. For example, on CDH you can run

```sh
export YARN_CONF_DIR=/etc/hadoop/conf
./bin/spark-submit --master yarn examples/src/main/r/dataframe.R
```
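The interactive shell can be launched against YARN the same way; a minimal sketch, reusing the master and conf dir from the example above:

```sh
# Start the SparkR shell on a YARN cluster instead of submitting a script
export YARN_CONF_DIR=/etc/hadoop/conf
./bin/sparkR --master yarn
```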