[SPARK-35300][PYTHON][DOCS] Standardize module names in install.rst

### What changes were proposed in this pull request?

Use full names of modules in `install.rst` when specifying dependencies.

### Why are the changes needed?

Using full module names makes the dependency table clearer.
In addition, it helps more people recognize `pandas APIs on Spark` as a new module.
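For readers who have not seen the new name yet, `pandas APIs on Spark` refers to the pandas-like API that ships as `pyspark.pandas`. A minimal sketch, assuming a PySpark build that includes `pyspark.pandas` (it ships starting with Spark 3.2):

```python
# pandas APIs on Spark: a pandas-like API whose operations run on Spark.
# Assumes a PySpark build that includes pyspark.pandas (Spark 3.2+).
import pyspark.pandas as ps

psdf = ps.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
print(psdf.sum())  # familiar pandas-style call, computed by Spark
```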

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manual verification.
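Beyond eyeballing the rendered page, the version floors documented in the table can be spot-checked against a local environment. A minimal sketch, assuming Python 3.8+ for `importlib.metadata`:

```python
# Spot-check installed versions against the minimums documented in install.rst.
# Assumes Python 3.8+ (importlib.metadata is in the standard library).
from importlib.metadata import PackageNotFoundError, version

MINIMUMS = {
    "pandas": "0.23.2",
    "numpy": "1.14",
    "pyarrow": "1.0.0",
    "py4j": "0.10.9.2",
}

for pkg, minimum in MINIMUMS.items():
    try:
        print(f"{pkg} {version(pkg)} (documented minimum: {minimum})")
    except PackageNotFoundError:
        print(f"{pkg} is not installed (optional unless the relevant module is used)")
```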

Closes #32427 from xinrong-databricks/nameDoc.

Authored-by: Xinrong Meng <xinrong.meng@databricks.com>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
@@ -152,17 +152,17 @@ To install PySpark from source, refer to |building_spark|_.
 Dependencies
 ------------
 
-============= ========================= ============================
+============= ========================= ======================================
 Package       Minimum supported version Note
-============= ========================= ============================
-`pandas`      0.23.2                    Optional for SQL
-`NumPy`       1.7                       Required for ML
-`pyarrow`     1.0.0                     Optional for SQL
+============= ========================= ======================================
+`pandas`      0.23.2                    Optional for Spark SQL
+`NumPy`       1.7                       Required for MLlib DataFrame-based API
+`pyarrow`     1.0.0                     Optional for Spark SQL
 `Py4J`        0.10.9.2                  Required
-`pandas`      0.23.2                    Required for pandas-on-Spark
-`pyarrow`     1.0.0                     Required for pandas-on-Spark
-`Numpy`       1.14(<1.20.0)             Required for pandas-on-Spark
-============= ========================= ============================
+`pandas`      0.23.2                    Required for pandas APIs on Spark
+`pyarrow`     1.0.0                     Required for pandas APIs on Spark
+`Numpy`       1.14(<1.20.0)             Required for pandas APIs on Spark
+============= ========================= ======================================
 
 Note that PySpark requires Java 8 or later with ``JAVA_HOME`` properly set.
 If using JDK 11, set ``-Dio.netty.tryReflectionSetAccessible=true`` for Arrow related features and refer
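For the JDK 11 note in the hunk's trailing context, the flag is typically routed through Spark's extra Java options. A minimal sketch from a plain Python process; note that Java options only take effect if they reach the JVM at launch:

```python
# JDK 11: pass the Netty flag to the driver and executor JVMs for
# Arrow-related features. From an already-running shell or notebook the
# driver JVM has started, so pass the flag on the spark-submit command
# line (e.g. --driver-java-options) instead.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.driver.extraJavaOptions", "-Dio.netty.tryReflectionSetAccessible=true")
    .config("spark.executor.extraJavaOptions", "-Dio.netty.tryReflectionSetAccessible=true")
    .getOrCreate()
)
```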