[SPARK-3734] DriverRunner should not read SPARK_HOME from submitter's environment

When using spark-submit in `cluster` mode to submit a job to a Spark Standalone cluster, if the JAVA_HOME environment variable was set on the submitting machine then DriverRunner would attempt to use the submitter's JAVA_HOME to launch the driver process (instead of the worker's JAVA_HOME), causing the driver to fail unless the submitter and worker had the same Java location. This commit fixes this by reading JAVA_HOME from sys.env instead of command.environment.

Author: Josh Rosen <joshrosen@apache.org>

Closes #2586 from JoshRosen/SPARK-3734 and squashes the following commits:

e9513d9 [Josh Rosen] [SPARK-3734] DriverRunner should not read SPARK_HOME from submitter's environment.
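To make the failure mode concrete, here is a minimal Scala sketch of the before/after lookup. It is not Spark's actual code: `Command` is reduced to a one-field stand-in for `org.apache.spark.deploy.Command`, and the two runner functions are illustrative names.

```scala
// Minimal stand-in for org.apache.spark.deploy.Command, keeping only the
// field relevant here. In Spark, `environment` is populated on the
// submitting machine and shipped to the worker with the driver description.
case class Command(environment: Map[String, String])

// Old behavior: prefer the submitter's JAVA_HOME (carried inside the
// Command), falling back to the worker's environment. If the submitter's
// Java lives at a path that does not exist on the worker, launching fails.
def runnerOld(command: Command): String =
  command.environment.get("JAVA_HOME")
    .orElse(sys.env.get("JAVA_HOME"))
    .map(_ + "/bin/java")
    .getOrElse("java")

// New behavior: consult only sys.env, the environment of the JVM running
// this code (the worker), so the worker's Java installation is always used.
def runnerNew(command: Command): String =
  sys.env.get("JAVA_HOME").map(_ + "/bin/java").getOrElse("java")
```

If JAVA_HOME is set only on the submitter (and so serialized into `command.environment`), `runnerOld` returns a `bin/java` path that may not exist on the worker, while `runnerNew` falls back to the worker's own JAVA_HOME or to `java` on the PATH.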
parent: de700d3177
commit: b167a8c7e7
```diff
--- a/core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala
@@ -30,7 +30,7 @@ import org.apache.spark.util.Utils
 private[spark]
 object CommandUtils extends Logging {
   def buildCommandSeq(command: Command, memory: Int, sparkHome: String): Seq[String] = {
-    val runner = getEnv("JAVA_HOME", command).map(_ + "/bin/java").getOrElse("java")
+    val runner = sys.env.get("JAVA_HOME").map(_ + "/bin/java").getOrElse("java")
 
     // SPARK-698: do not call the run.cmd script, as process.destroy()
     // fails to kill a process tree on Windows
@@ -38,9 +38,6 @@ object CommandUtils extends Logging {
       command.arguments
   }
 
-  private def getEnv(key: String, command: Command): Option[String] =
-    command.environment.get(key).orElse(Option(System.getenv(key)))
-
   /**
    * Attention: this must always be aligned with the environment variables in the run scripts and
    * the way the JAVA_OPTS are assembled there.
```
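The deleted `getEnv` helper shows why the old lookup was wrong in `cluster` mode: it consulted `command.environment`, a map built from the submitter's shell, before falling back to `System.getenv`. Scala's `sys.env` is an immutable view of `System.getenv()` for the current JVM, so after this change the lookup always reflects the worker process that actually launches the driver.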