SPARK-2641: Passing num executors to spark arguments from properties file
Since we can set the Spark executor memory and executor cores from a properties file, we should also be able to set the number of executor instances.

Author: Kanwaljit Singh <kanwaljit.singh@guavus.com>

Closes #1657 from kjsingh/branch-1.0 and squashes the following commits:

d8a5a12 [Kanwaljit Singh] SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors

Conflicts:
	core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
parent: 8d932475e6
commit: 1d648123a7
@@ -120,6 +120,8 @@ private[spark] class SparkSubmitArguments(args: Seq[String], env: Map[String, St
     name = Option(name).orElse(sparkProperties.get("spark.app.name")).orNull
     jars = Option(jars).orElse(sparkProperties.get("spark.jars")).orNull
     deployMode = Option(deployMode).orElse(env.get("DEPLOY_MODE")).orNull
+    numExecutors = Option(numExecutors)
+      .getOrElse(defaultProperties.get("spark.executor.instances").orNull)

     // Try to set main class from JAR if no --class argument is given
     if (mainClass == null && !isPython && primaryResource != null) {
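The added lines follow the same fallback pattern the surrounding code uses for `spark.app.name` and `spark.jars`: an explicit command-line value wins, otherwise the value comes from the loaded properties file, otherwise the field stays null and Spark's own defaulting applies later. A minimal standalone sketch of that precedence logic (hypothetical `resolve` helper, string-typed fields as in the branch-1.0 class):

```scala
// Sketch of the resolution order added for --num-executors: CLI flag first,
// then spark.executor.instances from the properties file, else null.
object ResolveNumExecutors {
  def resolve(cliValue: String, sparkProperties: Map[String, String]): String =
    Option(cliValue)
      .getOrElse(sparkProperties.get("spark.executor.instances").orNull)

  def main(args: Array[String]): Unit = {
    val props = Map("spark.executor.instances" -> "4")
    assert(resolve("8", props) == "8")        // explicit flag takes precedence
    assert(resolve(null, props) == "4")       // falls back to properties file
    assert(resolve(null, Map.empty) == null)  // nothing set: stays null
    println("ok")
  }
}
```

Keeping the value null when neither source provides it matters because `SparkSubmit` later distinguishes "unset" from an explicit count when deciding cluster-manager defaults.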