aab1e09f1c
### What changes were proposed in this pull request?

This PR aims to support Scala 2.13 in `AbstractCommandBuilder.java` and the `load-spark-env` scripts.

### Why are the changes needed?

Currently, only Scala 2.12 is supported, and the following fails.

```
$ dev/change-scala-version.sh 2.13
$ build/mvn test -pl core -am -Pscala-2.13 -DwildcardSuites=none -Dtest=org.apache.spark.launcher.SparkLauncherSuite
...
[ERROR] Failures:
[ERROR]   SparkLauncherSuite.testChildProcLauncher:123 expected:<0> but was:<1>
[ERROR]   SparkLauncherSuite.testSparkLauncherGetError:274
[ERROR] Tests run: 6, Failures: 2, Errors: 0, Skipped: 0
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Tested manually with the above command.

```
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Spark Project Parent POM 3.1.0-SNAPSHOT:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  2.186 s]
[INFO] Spark Project Tags ................................. SUCCESS [  4.400 s]
[INFO] Spark Project Local DB ............................. SUCCESS [  1.744 s]
[INFO] Spark Project Networking ........................... SUCCESS [  2.233 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  1.527 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [  5.564 s]
[INFO] Spark Project Launcher ............................. SUCCESS [  1.946 s]
[INFO] Spark Project Core ................................. SUCCESS [01:21 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  01:41 min
[INFO] Finished at: 2020-07-24T20:04:34-07:00
[INFO] ------------------------------------------------------------------------
```

Closes #29227 from dongjoon-hyun/SPARK-32434.
Authored-by: Dongjoon Hyun <dongjoon@apache.org> Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
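The core of the change in the `load-spark-env` scripts is resolving `SPARK_SCALA_VERSION` from whichever `assembly/target/scala-<version>` directory exists, preferring 2.13 and failing when both are present. Below is a minimal standalone sketch of that detection against a throwaway directory layout; the temporary `SPARK_HOME` and variable names `DIR_213`/`DIR_212` are illustrative, not part of the patch.

```shell
# Sketch of the assembly-directory detection, run against a temporary
# SPARK_HOME so no real Spark build is required.
SPARK_HOME="$(mktemp -d)"
mkdir -p "${SPARK_HOME}/assembly/target/scala-2.13"

DIR_213="${SPARK_HOME}/assembly/target/scala-2.13"
DIR_212="${SPARK_HOME}/assembly/target/scala-2.12"

if [[ -d "${DIR_213}" && -d "${DIR_212}" ]]; then
  # Ambiguous build: the real script prints a hint and exits 1 here.
  echo "multiple Scala versions detected" 1>&2
  exit 1
elif [[ -d "${DIR_213}" ]]; then
  SPARK_SCALA_VERSION=2.13
else
  SPARK_SCALA_VERSION=2.12
fi

echo "${SPARK_SCALA_VERSION}"   # prints "2.13": only that directory exists above
rm -rf "${SPARK_HOME}"
```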
66 lines
2.4 KiB
Bash
```bash
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# This script loads spark-env.sh if it exists, and ensures it is only loaded once.
# spark-env.sh is loaded from SPARK_CONF_DIR if set, or within the current directory's
# conf/ subdirectory.

# Figure out where Spark is installed
if [ -z "${SPARK_HOME}" ]; then
  source "$(dirname "$0")"/find-spark-home
fi

SPARK_ENV_SH="spark-env.sh"
if [ -z "$SPARK_ENV_LOADED" ]; then
  export SPARK_ENV_LOADED=1

  export SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME}"/conf}"

  SPARK_ENV_SH="${SPARK_CONF_DIR}/${SPARK_ENV_SH}"
  if [[ -f "${SPARK_ENV_SH}" ]]; then
    # Promote all variable declarations to environment (exported) variables
    set -a
    . ${SPARK_ENV_SH}
    set +a
  fi
fi

# Setting SPARK_SCALA_VERSION if not already set.

if [ -z "$SPARK_SCALA_VERSION" ]; then
  SCALA_VERSION_1=2.13
  SCALA_VERSION_2=2.12

  ASSEMBLY_DIR_1="${SPARK_HOME}/assembly/target/scala-${SCALA_VERSION_1}"
  ASSEMBLY_DIR_2="${SPARK_HOME}/assembly/target/scala-${SCALA_VERSION_2}"
  ENV_VARIABLE_DOC="https://spark.apache.org/docs/latest/configuration.html#environment-variables"
  if [[ -d "$ASSEMBLY_DIR_1" && -d "$ASSEMBLY_DIR_2" ]]; then
    echo "Presence of build for multiple Scala versions detected ($ASSEMBLY_DIR_1 and $ASSEMBLY_DIR_2)." 1>&2
    echo "Remove one of them or, export SPARK_SCALA_VERSION=$SCALA_VERSION_1 in ${SPARK_ENV_SH}." 1>&2
    echo "Visit ${ENV_VARIABLE_DOC} for more details about setting environment variables in spark-env.sh." 1>&2
    exit 1
  fi

  if [[ -d "$ASSEMBLY_DIR_1" ]]; then
    export SPARK_SCALA_VERSION=${SCALA_VERSION_1}
  else
    export SPARK_SCALA_VERSION=${SCALA_VERSION_2}
  fi
fi
```
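The `set -a` / `set +a` bracket in the script above is what turns every plain assignment in spark-env.sh into an exported environment variable visible to child processes. Here is a minimal standalone sketch of that idiom; the file name `DEMO_ENV_SH` and variable `DEMO_VAR` are illustrative, not Spark's.

```shell
# Demonstrate "source with auto-export": assignments made while set -a is
# active are exported, so child processes inherit them.
DEMO_ENV_SH="$(mktemp)"
echo 'DEMO_VAR=hello' > "${DEMO_ENV_SH}"

set -a               # auto-export every variable assigned from here on
. "${DEMO_ENV_SH}"   # DEMO_VAR becomes an exported variable
set +a               # back to normal assignment semantics

# A separate child shell sees it only because of the auto-export:
sh -c 'printf %s "$DEMO_VAR"'   # prints "hello"
rm -f "${DEMO_ENV_SH}"
```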