ee571d79e5
## What changes were proposed in this pull request?

We use SPARK_CONF_DIR to switch the Spark configuration directory, but the variable is only visible to spark-env.sh and to child processes if we explicitly export it there; with the default settings it is not set in the environment at all. This PR exports SPARK_CONF_DIR when it falls back to the default location.

### Before

```
KentKentsMacBookPro  ~/Documents/spark-packages/spark-2.3.0-SNAPSHOT-bin-master  bin/spark-shell --master local
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/11/08 10:28:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/08 10:28:45 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
Spark context Web UI available at http://169.254.168.63:4041
Spark context available as 'sc' (master = local, app id = local-1510108125770).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0-SNAPSHOT
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_65)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sys.env.get("SPARK_CONF_DIR")
res0: Option[String] = None
```

### After

```
scala> sys.env.get("SPARK_CONF_DIR")
res0: Option[String] = Some(/Users/Kent/Documents/spark/conf)
```

## How was this patch tested?

vanzin

Author: Kent Yao <yaooqinn@hotmail.com>

Closes #19688 from yaooqinn/SPARK-22466.
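The fix relies on Bash's default-value expansion combined with `export`: if `SPARK_CONF_DIR` is unset, it is assigned the default and exported in a single step, while a pre-existing value is left alone but still exported. A minimal standalone sketch of the pattern (the `/opt/spark` and `/etc/spark/conf` paths are illustrative, not from the PR):

```shell
#!/usr/bin/env bash
# Default case: SPARK_CONF_DIR is unset, so the ${VAR:-default} expansion
# falls back to "$SPARK_HOME/conf", and export puts it in the environment.
SPARK_HOME="/opt/spark"        # illustrative install location
unset SPARK_CONF_DIR           # simulate the out-of-the-box case
export SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME}"/conf}"
echo "$SPARK_CONF_DIR"         # -> /opt/spark/conf

# Override case: an existing value wins over the default and is re-exported.
SPARK_CONF_DIR="/etc/spark/conf"
export SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME}"/conf}"
echo "$SPARK_CONF_DIR"         # -> /etc/spark/conf
```

Because the export now happens in this common startup script rather than only when a user adds it to spark-env.sh, every JVM launched by the Spark scripts sees `SPARK_CONF_DIR`, which is what makes `sys.env.get("SPARK_CONF_DIR")` return `Some(...)` in the "After" transcript.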
61 lines
2 KiB
Bash
```bash
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# This script loads spark-env.sh if it exists, and ensures it is only loaded once.
# spark-env.sh is loaded from SPARK_CONF_DIR if set, or within the current directory's
# conf/ subdirectory.

# Figure out where Spark is installed
if [ -z "${SPARK_HOME}" ]; then
  source "$(dirname "$0")"/find-spark-home
fi

if [ -z "$SPARK_ENV_LOADED" ]; then
  export SPARK_ENV_LOADED=1

  export SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME}"/conf}"

  if [ -f "${SPARK_CONF_DIR}/spark-env.sh" ]; then
    # Promote all variable declarations to environment (exported) variables
    set -a
    . "${SPARK_CONF_DIR}/spark-env.sh"
    set +a
  fi
fi

# Setting SPARK_SCALA_VERSION if not already set.

if [ -z "$SPARK_SCALA_VERSION" ]; then

  ASSEMBLY_DIR2="${SPARK_HOME}/assembly/target/scala-2.11"
  ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-2.12"

  if [[ -d "$ASSEMBLY_DIR2" && -d "$ASSEMBLY_DIR1" ]]; then
    echo -e "Presence of build for multiple Scala versions detected." 1>&2
    echo -e 'Either clean one of them or, export SPARK_SCALA_VERSION in spark-env.sh.' 1>&2
    exit 1
  fi

  if [ -d "$ASSEMBLY_DIR2" ]; then
    export SPARK_SCALA_VERSION="2.11"
  else
    export SPARK_SCALA_VERSION="2.12"
  fi
fi
```
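The `set -a` / `set +a` pair around the `source` of spark-env.sh is what lets users write plain `VAR=value` assignments there without an explicit `export`: while the `allexport` option is on, every variable assigned is automatically marked for export to child processes. A minimal standalone sketch of that mechanism (the throwaway env file is hypothetical, created just for the demo):

```shell
#!/usr/bin/env bash
# Write a throwaway env file containing a plain, non-exported assignment.
envfile="$(mktemp)"
echo 'MY_SETTING=hello' > "$envfile"

set -a            # allexport on: assignments become exported variables
. "$envfile"      # source the file, as load-spark-env.sh does for spark-env.sh
set +a            # allexport off again for the rest of the script

# The sourced variable is now visible in the environment of child processes.
bash -c 'echo "$MY_SETTING"'    # -> hello

rm -f "$envfile"
```

Without `set -a`, `MY_SETTING` would exist only in the current shell, and the child `bash -c` would print an empty line; this is the same visibility problem the PR fixes for `SPARK_CONF_DIR` itself.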