spark-instrumented-optimizer/bin
Dongjoon Hyun fc4e56a54c [SPARK-30884][PYSPARK] Upgrade to Py4J 0.10.9
This PR aims to upgrade Py4J to `0.10.9` for better Python 3.7 support in Apache Spark 3.0.0 (master/branch-3.0). This is not for `branch-2.4`.

- Apache Spark 3.0.0 is using `Py4J 0.10.8.1` (released on 2018-10-21) because `0.10.8.1` was the first official release to support Python 3.7.
    - https://www.py4j.org/changelog.html#py4j-0-10-8-and-py4j-0-10-8-1
- `Py4J 0.10.9` was released on January 25th, 2020 with better Python 3.7 support and a `magic_member` bug fix (a runtime version check is sketched below).
    - https://github.com/bartdag/py4j/releases/tag/0.10.9
    - https://www.py4j.org/changelog.html#py4j-0-10-9

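For reference, a minimal sketch of one way to confirm at runtime which Py4J version a PySpark installation actually picked up. The module paths `py4j.version` and `pyspark.version` are standard, but the expected version string `0.10.9` is assumed from this upgrade; this is an illustration, not part of the change itself.

```python
# Minimal sketch: check the Py4J version bundled with a PySpark install.
# Assumes the py4j-*-src.zip under python/lib is on PYTHONPATH
# (bin/pyspark normally arranges this).
from py4j.version import __version__ as py4j_version
from pyspark.version import __version__ as pyspark_version

EXPECTED_PY4J = "0.10.9"  # assumed target version from SPARK-30884

print(f"PySpark {pyspark_version} is using Py4J {py4j_version}")
if py4j_version != EXPECTED_PY4J:
    raise RuntimeError(
        f"Expected Py4J {EXPECTED_PY4J}, found {py4j_version}; "
        "check the py4j zip referenced by bin/pyspark"
    )
```
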
This PR introduces no user-facing change.

This patch passes the Jenkins CI with the existing tests.

Closes #27641 from dongjoon-hyun/SPARK-30884.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
2020-02-20 09:09:30 -08:00
beeline [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
beeline.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
docker-image-tool.sh [SPARK-27626][K8S] Fix docker-image-tool.sh to be robust in non-bash shell env 2019-05-03 10:13:22 -07:00
find-spark-home [MINOR] Fix a bunch of typos 2018-01-02 07:10:19 +09:00
find-spark-home.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
load-spark-env.cmd [SPARK-26132][BUILD][CORE] Remove support for Scala 2.11 in Spark 3.0.0 2019-03-25 10:46:42 -05:00
load-spark-env.sh [SPARK-26132][BUILD][CORE] Remove support for Scala 2.11 in Spark 3.0.0 2019-03-25 10:46:42 -05:00
pyspark [SPARK-30884][PYSPARK] Upgrade to Py4J 0.10.9 2020-02-20 09:09:30 -08:00
pyspark.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
pyspark2.cmd [SPARK-30884][PYSPARK] Upgrade to Py4J 0.10.9 2020-02-20 09:09:30 -08:00
run-example [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
run-example.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-class [SPARK-28525][DEPLOY] Allow Launcher to be applied Java options 2019-07-30 12:45:32 -07:00
spark-class.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-class2.cmd [SPARK-28302][CORE] Make sure to generate unique output file for SparkLauncher on Windows 2019-07-09 15:49:31 +09:00
spark-shell [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-shell.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-shell2.cmd [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-sql [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-sql.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-sql2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-submit [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-submit.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-submit2.cmd [SPARK-11518][DEPLOY, WINDOWS] Handle spaces in Windows command scripts 2016-02-10 09:54:22 +00:00
sparkR [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
sparkR.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
sparkR2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00