[SPARK-12345][CORE] Do not send SPARK_HOME through Spark submit REST interface
SPARK_HOME is usually an invalid location on the remote machine executing the job. It is picked up by the Mesos support in cluster mode and, most of the time, causes the job to fail. Fixes SPARK-12345

Author: Luc Bourlier <luc.bourlier@typesafe.com>

Closes #10329 from skyluc/issue/SPARK_HOME.
This commit is contained in:
parent 007a32f90a
commit ba9332edd8
@@ -428,8 +428,10 @@ private[spark] object RestSubmissionClient {
   * Filter non-spark environment variables from any environment.
   */
  private[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
-    env.filter { case (k, _) =>
-      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED") || k.startsWith("MESOS_")
+    env.filterKeys { k =>
+      // SPARK_HOME is filtered out because it is usually wrong on the remote machine (SPARK-12345)
+      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
+      k.startsWith("MESOS_")
     }
   }
 }
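To illustrate the effect of the patched predicate, here is a minimal, self-contained sketch of the filtering logic. `FilterExample` and the sample environment map are hypothetical names for illustration, not part of the Spark codebase; the predicate itself mirrors the patched `filterSystemEnvironment` above.

```scala
// Standalone sketch of the patched filtering logic (not the actual Spark class):
// keep SPARK_* and MESOS_* variables, but drop SPARK_ENV_LOADED and SPARK_HOME,
// since SPARK_HOME usually points to the wrong location on the remote machine.
object FilterExample {
  def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
    env.filterKeys { k =>
      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
      k.startsWith("MESOS_")
    }.toMap // materialize; filterKeys returns a view in Scala 2.13
  }

  def main(args: Array[String]): Unit = {
    val env = Map(
      "SPARK_HOME" -> "/opt/spark",            // dropped after this patch
      "SPARK_ENV_LOADED" -> "1",               // always dropped
      "SPARK_EXECUTOR_MEMORY" -> "2g",         // kept
      "MESOS_NATIVE_JAVA_LIBRARY" -> "/lib",   // kept
      "PATH" -> "/usr/bin"                     // non-Spark variable, dropped
    )
    println(filterSystemEnvironment(env).keys.toList.sorted)
  }
}
```

Before the patch, `SPARK_HOME` satisfied the `startsWith("SPARK_")` test and was forwarded through the REST interface, which is why Mesos cluster-mode jobs picked up an invalid path.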