[SPARK-22369][PYTHON][DOCS] Exposes catalog API documentation in PySpark
## What changes were proposed in this pull request?

This PR proposes to add a link from `spark.catalog(..)` to `Catalog` and to expose the Catalog APIs in the PySpark documentation, as below:

<img width="740" alt="2017-10-29 12 25 46" src="https://user-images.githubusercontent.com/6477701/32135863-f8e9b040-bc40-11e7-92ad-09c8043a1295.png">

<img width="1131" alt="2017-10-29 12 26 33" src="https://user-images.githubusercontent.com/6477701/32135849-bb257b86-bc40-11e7-9eda-4d58fc1301c2.png">

Note that this is not shown in the list at the top of https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#module-pyspark.sql:

<img width="674" alt="2017-10-29 12 30 58" src="https://user-images.githubusercontent.com/6477701/32135854-d50fab16-bc40-11e7-9181-812c56fd22f5.png">

This is basically similar to how `DataFrameReader` and `DataFrameWriter` are exposed.

## How was this patch tested?

Manually built the doc.

Author: hyukjinkwon <gurwls223@gmail.com>

Closes #19596 from HyukjinKwon/SPARK-22369.
parent b2463fad71
commit 41b60125b6
```diff
@@ -46,6 +46,7 @@ from pyspark.sql.types import Row
 from pyspark.sql.context import SQLContext, HiveContext, UDFRegistration
 from pyspark.sql.session import SparkSession
 from pyspark.sql.column import Column
+from pyspark.sql.catalog import Catalog
 from pyspark.sql.dataframe import DataFrame, DataFrameNaFunctions, DataFrameStatFunctions
 from pyspark.sql.group import GroupedData
 from pyspark.sql.readwriter import DataFrameReader, DataFrameWriter
@@ -54,7 +55,7 @@ from pyspark.sql.window import Window, WindowSpec

 __all__ = [
     'SparkSession', 'SQLContext', 'HiveContext', 'UDFRegistration',
-    'DataFrame', 'GroupedData', 'Column', 'Row',
+    'DataFrame', 'GroupedData', 'Column', 'Catalog', 'Row',
     'DataFrameNaFunctions', 'DataFrameStatFunctions', 'Window', 'WindowSpec',
     'DataFrameReader', 'DataFrameWriter'
 ]
@@ -271,6 +271,8 @@ class SparkSession(object):
     def catalog(self):
         """Interface through which the user may create, drop, alter or query underlying
         databases, tables, functions etc.
+
+        :return: :class:`Catalog`
         """
         if not hasattr(self, "_catalog"):
             self._catalog = Catalog(self)
```
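The `catalog` property in the last hunk lazily creates a single `Catalog` instance and caches it on the session. A minimal standalone sketch of that caching idiom (the `Session` and `Catalog` classes below are illustrative stand-ins, not the real PySpark classes):

```python
class Catalog(object):
    """Illustrative stand-in for pyspark.sql.catalog.Catalog."""
    def __init__(self, session):
        self._session = session


class Session(object):
    """Illustrative stand-in for SparkSession, showing the caching idiom."""

    @property
    def catalog(self):
        # Create the Catalog on first access and cache it on the instance,
        # mirroring the hasattr/_catalog check in SparkSession.catalog.
        if not hasattr(self, "_catalog"):
            self._catalog = Catalog(self)
        return self._catalog


session = Session()
# Repeated accesses return the same cached object.
print(session.catalog is session.catalog)  # True
```

Because the cache lives on the instance, each session gets its own `Catalog`, created only if the property is ever used.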
|