[SPARK-22998][K8S] Set missing value for SPARK_MOUNTED_CLASSPATH in the executors

## What changes were proposed in this pull request?

The environment variable `SPARK_MOUNTED_CLASSPATH` is referenced in the executor's Dockerfile, where its value is appended to the executor's classpath. However, the scheduler backend never set it when creating the executor pods, so jars downloaded to the jars download directory were not picked up. This PR fixes that by setting it to `$executorJarsDownloadDir/*`.
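For context, the way such an env var typically reaches the classpath can be sketched as below. This is a hedged illustration, not the actual Dockerfile contents: the path values and the `SPARK_CLASSPATH` variable here are assumptions; only `SPARK_MOUNTED_CLASSPATH` and the test-suite value `/var/spark-data/spark-jars/*` come from this PR.

```shell
# Illustrative sketch of an executor entrypoint folding the mounted
# classpath into the JVM classpath. Paths are placeholders.
SPARK_CLASSPATH="/opt/spark/jars/*"

# This is the variable the scheduler backend failed to set before this PR;
# the value below matches what the updated unit test expects.
SPARK_MOUNTED_CLASSPATH="/var/spark-data/spark-jars/*"

# Prepend the mounted classpath only when the variable is non-empty.
if [ -n "$SPARK_MOUNTED_CLASSPATH" ]; then
  SPARK_CLASSPATH="$SPARK_MOUNTED_CLASSPATH:$SPARK_CLASSPATH"
fi

# Quoted to avoid glob expansion of the '*' entries.
echo "$SPARK_CLASSPATH"
```

Without the fix, `SPARK_MOUNTED_CLASSPATH` was empty in the pod, so the downloaded jars never made it onto the executor classpath.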

## How was this patch tested?

Unit tested.

@vanzin Can you help take a look? Thanks!
cc @foxish

Author: Yinan Li <liyinan926@gmail.com>

Closes #20193 from liyinan926/master.
Authored by: Yinan Li, 2018-01-09 01:32:48 -08:00; committed by Felix Cheung
Commit: 6a4206ff04 (parent 0959aa581a)
2 changed files with 6 additions and 2 deletions


```diff
@@ -94,6 +94,8 @@ private[spark] class ExecutorPodFactory(
   private val executorCores = sparkConf.getDouble("spark.executor.cores", 1)
   private val executorLimitCores = sparkConf.get(KUBERNETES_EXECUTOR_LIMIT_CORES)
+  private val executorJarsDownloadDir = sparkConf.get(JARS_DOWNLOAD_LOCATION)
+
   /**
    * Configure and construct an executor pod with the given parameters.
    */
```
```diff
@@ -145,7 +147,8 @@ private[spark] class ExecutorPodFactory(
       (ENV_EXECUTOR_CORES, math.ceil(executorCores).toInt.toString),
       (ENV_EXECUTOR_MEMORY, executorMemoryString),
       (ENV_APPLICATION_ID, applicationId),
-      (ENV_EXECUTOR_ID, executorId)) ++ executorEnvs)
+      (ENV_EXECUTOR_ID, executorId),
+      (ENV_MOUNTED_CLASSPATH, s"$executorJarsDownloadDir/*")) ++ executorEnvs)
       .map(env => new EnvVarBuilder()
         .withName(env._1)
         .withValue(env._2)
```


```diff
@@ -197,7 +197,8 @@ class ExecutorPodFactorySuite extends SparkFunSuite with BeforeAndAfter with Bef
       ENV_EXECUTOR_CORES -> "1",
       ENV_EXECUTOR_MEMORY -> "1g",
       ENV_APPLICATION_ID -> "dummy",
-      ENV_EXECUTOR_POD_IP -> null) ++ additionalEnvVars
+      ENV_EXECUTOR_POD_IP -> null,
+      ENV_MOUNTED_CLASSPATH -> "/var/spark-data/spark-jars/*") ++ additionalEnvVars
     assert(executor.getSpec.getContainers.size() === 1)
     assert(executor.getSpec.getContainers.get(0).getEnv.size() === defaultEnvs.size)
```