spark-instrumented-optimizer/bin/find-spark-home
Dongjoon Hyun 4f79b9fffd [SPARK-32447][CORE] Use python3 by default in pyspark and find-spark-home scripts
### What changes were proposed in this pull request?

This PR aims to use `python3` instead of `python` inside the `bin/pyspark`, `bin/find-spark-home`, and `bin/find-spark-home.cmd` scripts.
```
$ git diff master --stat
 bin/find-spark-home     | 4 ++--
 bin/find-spark-home.cmd | 4 ++--
 bin/pyspark             | 4 ++--
```
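
For reference, the substantive hunk in `bin/find-spark-home` (reconstructed from the resulting script; the other two files change analogously) is:
```
-  # Default to standard python interpreter unless told otherwise
+  # Default to standard python3 interpreter unless told otherwise
   if [[ -z "$PYSPARK_DRIVER_PYTHON" ]]; then
-    PYSPARK_DRIVER_PYTHON="${PYSPARK_PYTHON:-"python"}"
+    PYSPARK_DRIVER_PYTHON="${PYSPARK_PYTHON:-"python3"}"
   fi
```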

### Why are the changes needed?

According to [PEP 394](https://www.python.org/dev/peps/pep-0394/), there are four possible behaviors for the `python` command, while `python3` is always available:
```
- Distributors may choose to set the behavior of the python command as follows:
      python2,
      python3,
      not provide python command,
      allow python to be configurable by an end user or a system administrator.
```
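
To make this concrete, a minimal shell probe (a sketch, not part of this patch) shows the asymmetry between the two names:
```
# `python3` is dependable wherever Python 3 is installed.
if command -v python3 > /dev/null 2>&1; then
  python3 --version            # always a 3.x interpreter
fi
# The bare `python` may be 2.x, 3.x, admin-configured, or missing entirely.
if ! command -v python > /dev/null 2>&1; then
  echo "no bare 'python' command on this system" >&2
fi
```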

Moreover, these scripts already depend on `find_spark_home.py`, which uses `#!/usr/bin/env python3`.
```
FIND_SPARK_HOME_PYTHON_SCRIPT="$(cd "$(dirname "$0")"; pwd)/find_spark_home.py"
```
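
So on a system that ships `python3` but no bare `python` (a layout PEP 394 explicitly allows), the helper already runs via its shebang, while the old wrapper default of `python` would fail. A hypothetical transcript:
```
$ ./find_spark_home.py           # works: the shebang resolves through `env python3`
$ python ./find_spark_home.py    # fails: python: command not found
```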

### Does this PR introduce _any_ user-facing change?

No. Apache Spark 3.1 already dropped Python 2.7 support via SPARK-32138.

### How was this patch tested?

Pass the Jenkins or GitHub Actions checks.

Closes #29246 from dongjoon-hyun/SPARK-FIND-SPARK-HOME.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
2020-07-26 15:55:48 -07:00


#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Attempts to find a proper value for SPARK_HOME. Should be included using "source" directive.
FIND_SPARK_HOME_PYTHON_SCRIPT="$(cd "$(dirname "$0")"; pwd)/find_spark_home.py"
# Short circuit if the user already has this set.
if [ ! -z "${SPARK_HOME}" ]; then
  exit 0
elif [ ! -f "$FIND_SPARK_HOME_PYTHON_SCRIPT" ]; then
  # If we are not in the same directory as find_spark_home.py, we are not pip-installed, so we
  # don't need to search the different Python directories for a Spark installation.
  # Note that if the user has pip-installed PySpark but is directly calling pyspark-shell or
  # spark-submit from another directory, we want to use that version of PySpark rather than
  # the pip-installed one.
  export SPARK_HOME="$(cd "$(dirname "$0")"/..; pwd)"
else
  # We are pip-installed; use the Python script to resolve a reasonable SPARK_HOME.
  # Default to the standard python3 interpreter unless told otherwise.
  if [[ -z "$PYSPARK_DRIVER_PYTHON" ]]; then
    PYSPARK_DRIVER_PYTHON="${PYSPARK_PYTHON:-"python3"}"
  fi
  export SPARK_HOME=$($PYSPARK_DRIVER_PYTHON "$FIND_SPARK_HOME_PYTHON_SCRIPT")
fi
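
As the header comment notes, this file is meant to be sourced rather than executed. A typical guard in a sibling launcher, modeled on `bin/pyspark`, looks like this:
```
# Only resolve SPARK_HOME if the user has not already set it.
if [ -z "${SPARK_HOME}" ]; then
  source "$(dirname "$0")"/find-spark-home
fi
```
The outer guard matters: since the sourced script calls `exit 0` when `SPARK_HOME` is already set, sourcing it unconditionally would terminate the calling shell.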