spark-instrumented-optimizer/python/docs/source/reference/pyspark.ss.rst
HyukjinKwon 6ab29b37cf [SPARK-32179][SPARK-32188][PYTHON][DOCS] Replace and redesign the documentation base
### What changes were proposed in this pull request?

This PR proposes to redesign the PySpark documentation.

I made a demo site to make it easier to review: https://hyukjin-spark.readthedocs.io/en/stable/reference/index.html.

Here is the initial draft for the final PySpark docs shape: https://hyukjin-spark.readthedocs.io/en/latest/index.html.

In more details, this PR proposes:
1. Use [pydata_sphinx_theme](https://github.com/pandas-dev/pydata-sphinx-theme) theme - [pandas](https://pandas.pydata.org/docs/) and [Koalas](https://koalas.readthedocs.io/en/latest/) use this theme. The CSS overwrite is ported from Koalas. The colours in the CSS were actually chosen by designers to use in Spark.
2. Use the Sphinx option to separate `source` and `build` directories as the documentation pages will likely grow.
3. Port current API documentation into the new style. It mimics Koalas and pandas to use the theme most effectively.

    One disadvantage of this approach is that you have to list APIs or classes explicitly; however, I don't think this is a big issue in PySpark since we're conservative about adding APIs. I also intentionally listed only classes, instead of individual functions, in ML and MLlib to make the pages easier to manage.
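The theme and `autosummary` choices above are wired up in Sphinx's `conf.py`. A minimal, hypothetical sketch of the relevant settings (option names are standard Sphinx/autosummary configuration; the exact values and CSS file names in this PR may differ):

```python
# Hypothetical excerpt of python/docs/source/conf.py illustrating the points above.

# Point 1: use the pydata_sphinx_theme (installed via `pip install pydata-sphinx-theme`),
# with project-specific CSS overrides (e.g. the ones ported from Koalas).
html_theme = "pydata_sphinx_theme"
html_css_files = ["css/pyspark.css"]  # hypothetical file name

# Point 3: autosummary generates one stub page per listed API under the
# :toctree: target directory (api/ in the .rst files below).
extensions = ["sphinx.ext.autodoc", "sphinx.ext.autosummary"]
autosummary_generate = True

# Point 2: with separated source and build directories, the build is invoked as, e.g.:
#   sphinx-build -b html source build
```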

### Why are the changes needed?

Often I hear complaints from users that the current PySpark documentation (https://spark.apache.org/docs/latest/api/python/index.html) is pretty messy to read compared to other projects such as [pandas](https://pandas.pydata.org/docs/) and [Koalas](https://koalas.readthedocs.io/en/latest/).

It would be nicer if we could organise the documentation, instead of just listing all classes, methods and attributes, to make it easier to navigate.

Also, the documentation has been there since almost the very first version of PySpark. Maybe it's time to update it.

### Does this PR introduce _any_ user-facing change?

Yes, PySpark API documentation will be redesigned.

### How was this patch tested?

Manually tested; a demo site was made to show the result.

Closes #29188 from HyukjinKwon/SPARK-32179.

Authored-by: HyukjinKwon <gurwls223@apache.org>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
2020-07-27 17:49:21 +09:00


..  Licensed to the Apache Software Foundation (ASF) under one
    or more contributor license agreements. See the NOTICE file
    distributed with this work for additional information
    regarding copyright ownership. The ASF licenses this file
    to you under the Apache License, Version 2.0 (the
    "License"); you may not use this file except in compliance
    with the License. You may obtain a copy of the License at

..    http://www.apache.org/licenses/LICENSE-2.0

..  Unless required by applicable law or agreed to in writing,
    software distributed under the License is distributed on an
    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
    KIND, either express or implied. See the License for the
    specific language governing permissions and limitations
    under the License.

====================
Structured Streaming
====================

Core Classes
------------

.. currentmodule:: pyspark.sql.streaming

.. autosummary::
    :toctree: api/

    DataStreamReader
    DataStreamWriter
    ForeachBatchFunction
    StreamingQuery
    StreamingQueryException
    StreamingQueryManager

Input and Output
----------------

.. currentmodule:: pyspark.sql.streaming

.. autosummary::
    :toctree: api/

    DataStreamReader.csv
    DataStreamReader.format
    DataStreamReader.json
    DataStreamReader.load
    DataStreamReader.option
    DataStreamReader.options
    DataStreamReader.orc
    DataStreamReader.parquet
    DataStreamReader.schema
    DataStreamReader.text
    DataStreamWriter.foreach
    DataStreamWriter.foreachBatch
    DataStreamWriter.format
    DataStreamWriter.option
    DataStreamWriter.options
    DataStreamWriter.outputMode
    DataStreamWriter.partitionBy
    DataStreamWriter.queryName
    DataStreamWriter.start
    DataStreamWriter.trigger

Query Management
----------------

.. currentmodule:: pyspark.sql.streaming

.. autosummary::
    :toctree: api/

    StreamingQuery.awaitTermination
    StreamingQuery.exception
    StreamingQuery.explain
    StreamingQuery.id
    StreamingQuery.isActive
    StreamingQuery.lastProgress
    StreamingQuery.name
    StreamingQuery.processAllAvailable
    StreamingQuery.recentProgress
    StreamingQuery.runId
    StreamingQuery.status
    StreamingQuery.stop
    StreamingQueryManager.active
    StreamingQueryManager.awaitAnyTermination
    StreamingQueryManager.get
    StreamingQueryManager.resetTerminated