spark-instrumented-optimizer/bin
Dongjoon Hyun 4f79b9fffd [SPARK-32447][CORE] Use python3 by default in pyspark and find-spark-home scripts
### What changes were proposed in this pull request?

This PR aims to use `python3` instead of `python` inside the `bin/pyspark`, `bin/find-spark-home`, and `bin/find-spark-home.cmd` scripts.
```
$ git diff master --stat
 bin/find-spark-home     | 4 ++--
 bin/find-spark-home.cmd | 4 ++--
 bin/pyspark             | 4 ++--
```
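
For context, the change in `bin/pyspark` amounts to defaulting the interpreter to `python3` instead of `python`. The snippet below is a simplified sketch of that pattern, not the exact diff (the real script has additional branches, e.g. for IPython):
```
# Sketch only: default the worker interpreter to python3 when unset.
if [[ -z "$PYSPARK_PYTHON" ]]; then
  PYSPARK_PYTHON=python3   # previously defaulted to `python`
fi
```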

### Why are the changes needed?

According to [PEP 394](https://www.python.org/dev/peps/pep-0394/), the `python` command can behave in four different ways, while `python3` is always expected to be present.
```
- Distributors may choose to set the behavior of the python command as follows:
      python2,
      python3,
      not provide python command,
      allow python to be configurable by an end user or a system administrator.
```
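
To make that concrete, a launcher that relies on a bare `python` command has to resolve the interpreter itself. The hypothetical sketch below shows the kind of fallback logic that a `python3`-first default avoids:
```
# Hypothetical fallback if `python` cannot be assumed to exist or to be 3.x.
if command -v python3 > /dev/null 2>&1; then
  PYTHON_EXEC=python3
elif command -v python > /dev/null 2>&1; then
  PYTHON_EXEC=python      # may still resolve to Python 2 on some distributions
else
  echo "No Python interpreter found" >&2
  exit 1
fi
```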

Moreover, these scripts already depend on `find_spark_home.py`, whose shebang is `#!/usr/bin/env python3`.
```
FIND_SPARK_HOME_PYTHON_SCRIPT="$(cd "$(dirname "$0")"; pwd)/find_spark_home.py"
```
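
Because the helper already declares `#!/usr/bin/env python3`, invoking it through `python3` keeps the shell wrappers consistent with its shebang. A hedged sketch of that invocation (simplified from the actual `bin/find-spark-home` logic):
```
# Sketch: run the python3-only helper with python3 explicitly.
SPARK_HOME=$(python3 "$FIND_SPARK_HOME_PYTHON_SCRIPT")
export SPARK_HOME
```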

### Does this PR introduce _any_ user-facing change?

No. Apache Spark 3.1 already drops Python 2.7 support via SPARK-32138.

### How was this patch tested?

Pass the existing Jenkins or GitHub Actions checks.

Closes #29246 from dongjoon-hyun/SPARK-FIND-SPARK-HOME.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
2020-07-26 15:55:48 -07:00
beeline [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
beeline.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
docker-image-tool.sh [SPARK-31934][BUILD] Remove set -x from docker image tool 2020-06-08 16:03:13 -07:00
find-spark-home [SPARK-32447][CORE] Use python3 by default in pyspark and find-spark-home scripts 2020-07-26 15:55:48 -07:00
find-spark-home.cmd [SPARK-32447][CORE] Use python3 by default in pyspark and find-spark-home scripts 2020-07-26 15:55:48 -07:00
load-spark-env.cmd [SPARK-32434][CORE] Support Scala 2.13 in AbstractCommandBuilder and load-spark-env scripts 2020-07-25 08:19:02 -07:00
load-spark-env.sh [SPARK-32434][CORE] Support Scala 2.13 in AbstractCommandBuilder and load-spark-env scripts 2020-07-25 08:19:02 -07:00
pyspark [SPARK-32447][CORE] Use python3 by default in pyspark and find-spark-home scripts 2020-07-26 15:55:48 -07:00
pyspark.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
pyspark2.cmd [SPARK-30884][PYSPARK] Upgrade to Py4J 0.10.9 2020-02-20 09:09:30 -08:00
run-example [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
run-example.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-class [SPARK-28525][DEPLOY] Allow Launcher to be applied Java options 2019-07-30 12:45:32 -07:00
spark-class.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-class2.cmd [SPARK-28302][CORE] Make sure to generate unique output file for SparkLauncher on Windows 2019-07-09 15:49:31 +09:00
spark-shell [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-shell.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-shell2.cmd [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-sql [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-sql.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-sql2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-submit [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-submit.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-submit2.cmd [SPARK-11518][DEPLOY, WINDOWS] Handle spaces in Windows command scripts 2016-02-10 09:54:22 +00:00
sparkR [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
sparkR.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
sparkR2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00