4f79b9fffd
### What changes were proposed in this pull request?

This PR aims to use `python3` instead of `python` inside the `bin/pyspark`, `bin/find-spark-home`, and `bin/find-spark-home.cmd` scripts.

```
$ git diff master --stat
 bin/find-spark-home     | 4 ++--
 bin/find-spark-home.cmd | 4 ++--
 bin/pyspark             | 4 ++--
```

### Why are the changes needed?

According to [PEP 394](https://www.python.org/dev/peps/pep-0394/), the `python` command may behave in four different ways, while `python3` is always available:

> Distributors may choose to set the behavior of the python command as follows: python2, python3, not provide python command, allow python to be configurable by an end user or a system administrator.

Moreover, these scripts already depend on `find_spark_home.py`, which uses `#!/usr/bin/env python3`:

```
FIND_SPARK_HOME_PYTHON_SCRIPT="$(cd "$(dirname "$0")"; pwd)/find_spark_home.py"
```

### Does this PR introduce _any_ user-facing change?

No. Apache Spark 3.1 already dropped Python 2.7 via SPARK-32138.

### How was this patch tested?

Pass the Jenkins or GitHub Action jobs.

Closes #29246 from dongjoon-hyun/SPARK-FIND-SPARK-HOME.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
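The kind of change this PR describes can be sketched as follows. This is an illustrative snippet, not the actual diff (which is not shown in the description): it demonstrates the pattern of defaulting an interpreter variable to an explicit `python3` rather than the ambiguous `python` command, using POSIX parameter expansion. `PYSPARK_DRIVER_PYTHON` is a real Spark environment variable, but its exact handling in `bin/pyspark` is assumed here.

```shell
#!/usr/bin/env bash
# Sketch only: prefer an explicit python3 default instead of relying on
# the `python` command, whose behavior varies by distribution (PEP 394).
# If the user has already exported PYSPARK_DRIVER_PYTHON, respect it;
# otherwise fall back to python3.
PYSPARK_DRIVER_PYTHON="${PYSPARK_DRIVER_PYTHON:-python3}"
echo "$PYSPARK_DRIVER_PYTHON"
```

With the variable unset, the script prints `python3`; exporting `PYSPARK_DRIVER_PYTHON=ipython` beforehand would make it print `ipython` instead.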