#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Figure out where Spark is installed
export SPARK_HOME="$(cd "$(dirname "$0")"/..; pwd)"

source "$SPARK_HOME"/bin/load-spark-env.sh

function usage() {
  if [ -n "$1" ]; then
    echo "$1" 1>&2
  fi
  echo "Usage: ./bin/pyspark [options]" 1>&2
  "$SPARK_HOME"/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
  exit "${2:-0}"
}
export -f usage

if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
  usage
fi

# In Spark <= 1.1, setting IPYTHON=1 would cause the driver to be launched using the `ipython`
# executable, while the worker would still be launched using PYSPARK_PYTHON.
#
# In Spark 1.2, we removed the documentation of the IPYTHON and IPYTHON_OPTS variables and added
# PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS to allow IPython to be used for the driver.
# Now, users can simply set PYSPARK_DRIVER_PYTHON=ipython to use IPython and set
# PYSPARK_DRIVER_PYTHON_OPTS to pass options when starting the Python driver
# (e.g. PYSPARK_DRIVER_PYTHON_OPTS='notebook'). This supports full customization of the IPython
# and executor Python executables.
#
# For backwards-compatibility, we retain the old IPYTHON and IPYTHON_OPTS variables.
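#
# For example, to run the shell under the IPython notebook frontend:
#
#   PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook" ./bin/pyspark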

# Determine the Python executable to use if PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON isn't set:
if hash python2.7 2>/dev/null; then
  # Attempt to use Python 2.7, if installed:
  DEFAULT_PYTHON="python2.7"
else
  DEFAULT_PYTHON="python"
fi

# Determine the Python executable to use for the driver:
if [[ -n "$IPYTHON_OPTS" || "$IPYTHON" == "1" ]]; then
  # If IPython options are specified, assume the user wants to run IPython
  # (for backwards-compatibility).
  PYSPARK_DRIVER_PYTHON_OPTS="$PYSPARK_DRIVER_PYTHON_OPTS $IPYTHON_OPTS"
  PYSPARK_DRIVER_PYTHON="ipython"
elif [[ -z "$PYSPARK_DRIVER_PYTHON" ]]; then
  PYSPARK_DRIVER_PYTHON="${PYSPARK_PYTHON:-"$DEFAULT_PYTHON"}"
fi

# Determine the Python executable to use for the executors:
if [[ -z "$PYSPARK_PYTHON" ]]; then
  if [[ $PYSPARK_DRIVER_PYTHON == *ipython* && $DEFAULT_PYTHON != "python2.7" ]]; then
    echo "IPython requires Python 2.7+; please install python2.7 or set PYSPARK_PYTHON" 1>&2
    exit 1
  else
    PYSPARK_PYTHON="$DEFAULT_PYTHON"
  fi
fi
export PYSPARK_PYTHON

# Add the PySpark classes to the Python path:
export PYTHONPATH="$SPARK_HOME/python/:$PYTHONPATH"
export PYTHONPATH="$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH"

# Load the PySpark shell.py script when ./pyspark is used interactively:
export OLD_PYTHONSTARTUP="$PYTHONSTARTUP"
export PYTHONSTARTUP="$SPARK_HOME/python/pyspark/shell.py"
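# (Python only reads PYTHONSTARTUP for interactive sessions; shell.py is
# expected to run the user's original startup file via OLD_PYTHONSTARTUP.)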

# For pyspark tests
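# Assumed invocation sketch: a test runner sets SPARK_TESTING=1 and passes a
# Python module name as the first argument, which is executed via `python -m`:
#   SPARK_TESTING=1 ./bin/pyspark pyspark.tests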
if [[ -n "$SPARK_TESTING" ]]; then
  unset YARN_CONF_DIR
  unset HADOOP_CONF_DIR
  export PYTHONHASHSEED=0
  exec "$PYSPARK_DRIVER_PYTHON" -m "$1"
  exit
fi

export PYSPARK_DRIVER_PYTHON
export PYSPARK_DRIVER_PYTHON_OPTS
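# Hand the shell off to spark-submit; any remaining command-line arguments
# (e.g. --master, --conf) are forwarded to spark-submit unchanged.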
exec "$SPARK_HOME"/bin/spark-submit pyspark-shell-main "$@"