Set spark.executor.uri from environment variable (needed by Mesos)

The Mesos backend uses this property when setting up a slave process. It is similarly set in the Scala repl (org.apache.spark.repl.SparkILoop), but I couldn't find anything analogous for pyspark.

Author: Ivan Wick <ivanwick+github@gmail.com>

This patch had conflicts when merged, resolved by
Committer: Matei Zaharia <matei@databricks.com>

Closes #311 from ivanwick/master and squashes the following commits:

da0c3e4 [Ivan Wick] Set spark.executor.uri from environment variable (needed by Mesos)
This commit is contained in:
parent 2c557837b4 / commit 5cd11d51c1
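For context, the environment variable this patch reads would typically be exported before launching the PySpark shell against a Mesos master. The URI, hostnames, and version below are illustrative examples only, not values from the patch:

```shell
# Hypothetical setup: point Mesos slaves at a Spark distribution archive,
# then start the PySpark shell. SPARK_EXECUTOR_URI is the variable the
# patched shell.py promotes to the spark.executor.uri system property.
export SPARK_EXECUTOR_URI=hdfs://namenode:8020/dist/spark-0.9.1.tgz
export MASTER=mesos://zk://zkhost:2181/mesos
./bin/pyspark
```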
@@ -29,6 +29,9 @@ from pyspark.storagelevel import StorageLevel
 # this is the equivalent of ADD_JARS
 add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES") != None else None
 
+if os.environ.get("SPARK_EXECUTOR_URI"):
+    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
+
 sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell", pyFiles=add_files)
 
 print """Welcome to
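The pattern in the hunk above — promoting an environment variable to a system property only when it is actually set — can be sketched without a live SparkContext. The `collect_env_props` helper and the sample URI here are illustrative, not part of the patch:

```python
import os


def collect_env_props(env, mapping):
    """Return system properties for environment variables that are set.

    `mapping` pairs an environment variable name with the property key it
    feeds, e.g. SPARK_EXECUTOR_URI -> spark.executor.uri. Unset or empty
    variables are skipped, mirroring the guard in the patched shell.py.
    """
    return {prop: env[var] for var, prop in mapping.items() if env.get(var)}


mapping = {"SPARK_EXECUTOR_URI": "spark.executor.uri"}

# No variable set: nothing is promoted.
assert collect_env_props({}, mapping) == {}

# Variable present: it becomes a spark.executor.uri property.
assert collect_env_props(
    {"SPARK_EXECUTOR_URI": "hdfs://host/spark.tgz"}, mapping
) == {"spark.executor.uri": "hdfs://host/spark.tgz"}
```

In the real shell the resulting property would be applied via `SparkContext.setSystemProperty` before the context is constructed, since executor settings must be in place when the Mesos backend launches slave processes.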