[SPARK-36178][PYTHON] List pyspark.sql.catalog APIs in documentation

### What changes were proposed in this pull request?
The pyspark.sql.catalog APIs were missing from the documentation. This PR fixes the omission by listing them in the Spark SQL API reference.

### Why are the changes needed?
Documentation consistency

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Documentation change only.

Closes #33392 from dominikgehl/feature/SPARK-36178.

Authored-by: Dominik Gehl <dog@open.ch>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
(cherry picked from commit fe4db74da4)
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>


@@ -29,6 +29,7 @@ Core Classes
     :toctree: api/

     SparkSession
+    Catalog
     DataFrame
     Column
     Row
@@ -603,3 +604,30 @@ Grouping
     GroupedData.pivot
     GroupedData.sum
     PandasCogroupedOps.applyInPandas
+
+Catalog APIs
+------------
+
+.. currentmodule:: pyspark.sql
+
+.. autosummary::
+    :toctree: api/
+
+    Catalog.cacheTable
+    Catalog.clearCache
+    Catalog.createExternalTable
+    Catalog.createTable
+    Catalog.currentDatabase
+    Catalog.dropGlobalTempView
+    Catalog.dropTempView
+    Catalog.isCached
+    Catalog.listColumns
+    Catalog.listDatabases
+    Catalog.listFunctions
+    Catalog.listTables
+    Catalog.recoverPartitions
+    Catalog.refreshByPath
+    Catalog.refreshTable
+    Catalog.registerFunction
+    Catalog.setCurrentDatabase
+    Catalog.uncacheTable
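
For context, a minimal sketch of how a few of the newly documented `Catalog` methods are invoked through a `SparkSession`; the local-mode session and the `people` temporary view are illustrative assumptions, not part of this change:

```python
from pyspark.sql import SparkSession

# Illustrative local session; any existing SparkSession exposes the same catalog.
spark = SparkSession.builder.master("local[1]").appName("catalog-demo").getOrCreate()

# Register a hypothetical temporary view so the catalog has something to list.
spark.range(5).createOrReplaceTempView("people")

print(spark.catalog.currentDatabase())   # e.g. 'default'
print(spark.catalog.listDatabases())     # list of Database entries
print(spark.catalog.listTables())        # includes the 'people' temp view
print(spark.catalog.isCached("people"))  # False until cacheTable is called

spark.catalog.cacheTable("people")       # cache the view
spark.catalog.uncacheTable("people")     # release it again
spark.catalog.dropTempView("people")     # clean up

spark.stop()
```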