[SPARK-8028] [SPARKR] Use addJar instead of setJars in SparkR

This prevents the `spark.jars` configuration from being cleared when `--packages` or `--jars` is used.

cc pwendell davies brkyvz

Author: Shivaram Venkataraman <shivaram@cs.berkeley.edu>

Closes #6568 from shivaram/SPARK-8028 and squashes the following commits:

3a9cf1f [Shivaram Venkataraman] Use addJar instead of setJars in SparkR This prevents the spark.jars from being cleared
Commit: 6b44278ef7 (parent 15d7c90aeb)
Committed: Shivaram Venkataraman, 2015-06-01 21:01:14 -07:00

@@ -355,7 +355,6 @@ private[r] object RRDD {
 
     val sparkConf = new SparkConf().setAppName(appName)
                                    .setSparkHome(sparkHome)
-                                   .setJars(jars)
 
     // Override `master` if we have a user-specified value
     if (master != "") {
@@ -373,7 +372,11 @@ private[r] object RRDD {
       sparkConf.setExecutorEnv(name.asInstanceOf[String], value.asInstanceOf[String])
     }
 
-    new JavaSparkContext(sparkConf)
+    val jsc = new JavaSparkContext(sparkConf)
+    jars.foreach { jar =>
+      jsc.addJar(jar)
+    }
+    jsc
   }
 
   /**
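
For context, a minimal sketch of why the two APIs behave differently: `SparkConf.setJars` overwrites the `spark.jars` property wholesale, while `JavaSparkContext.addJar` registers a jar with an already-running context without touching jars configured earlier via `--jars` or `--packages`. The app name, master, and jar path below are hypothetical and not part of this commit.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.api.java.JavaSparkContext

object AddJarVsSetJars {
  def main(args: Array[String]): Unit = {
    // When launched via spark-submit with --jars or --packages,
    // spark.jars is already populated in the submitted configuration.
    val conf = new SparkConf().setAppName("addjar-sketch").setMaster("local[*]")

    // SparkConf.setJars replaces the spark.jars value wholesale, which is how
    // setting the SparkR jar here could drop jars added via --jars/--packages:
    // conf.setJars(Seq("/path/to/sparkr.jar"))

    val jsc = new JavaSparkContext(conf)

    // JavaSparkContext.addJar adds one more jar to the running context and
    // leaves previously configured jars untouched.
    jsc.addJar("/path/to/sparkr.jar")

    jsc.stop()
  }
}
```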