[MINOR][DOC] Use raw triple double quotes around docstrings where there are occurrences of backslashes.
From [PEP 257](https://www.python.org/dev/peps/pep-0257/):

> For consistency, always use `"""triple double quotes"""` around docstrings. Use `r"""raw triple double quotes"""` if you use any backslashes in your docstrings. For Unicode docstrings, use `u"""Unicode triple-quoted strings"""`.

For example, this is what `help(kafka_wordcount)` shows:

```
DESCRIPTION
    Counts words in UTF8 encoded, '
' delimited text received from the network every second.
    Usage: kafka_wordcount.py <zk> <topic>
    To run this on your local machine, you need to setup Kafka and create a producer first, see
    http://kafka.apache.org/documentation.html#quickstart
    and then run the example
       `$ bin/spark-submit --jars external/kafka-assembly/target/scala-*/spark-streaming-kafka-assembly-*.jar examples/src/main/python/streaming/kafka_wordcount.py localhost:2181 test`
```

This is what it shows after the fix:

```
DESCRIPTION
    Counts words in UTF8 encoded, '\n' delimited text received from the network every second.
    Usage: kafka_wordcount.py <zk> <topic>
    To run this on your local machine, you need to setup Kafka and create a producer first, see
    http://kafka.apache.org/documentation.html#quickstart
    and then run the example
       `$ bin/spark-submit --jars \
         external/kafka-assembly/target/scala-*/spark-streaming-kafka-assembly-*.jar \
         examples/src/main/python/streaming/kafka_wordcount.py \
         localhost:2181 test`
```

The thing worth noticing: in the first output the `'\n'` is rendered as an actual line break and the trailing `\` line continuations are consumed, so the `spark-submit` command runs together with no line breaks; with a raw docstring both survive verbatim.

## What changes were proposed in this pull request?

Change triple double quotes to raw triple double quotes where there are occurrences of backslashes in docstrings.

## How was this patch tested?

Manually, as this is a doc fix.

Author: Shashwat Anand <me@shashwat.me>

Closes #20497 from ashashwat/docstring-fixes.
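A minimal sketch of the effect described above (the function names here are illustrative, not taken from the patch):

```python
# Without the r prefix, Python interprets \n inside the docstring as a real
# newline character; with r"""...""", the backslash is kept literally, which
# is what help() should display.

def plain():
    """Counts words in UTF8 encoded, '\n' delimited text."""

def raw():
    r"""Counts words in UTF8 encoded, '\n' delimited text."""

print(repr(plain.__doc__))  # the \n became an actual newline character
print(repr(raw.__doc__))    # the backslash and the 'n' are preserved as-is
```

This is exactly the difference visible in the two `help()` outputs above.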
This commit is contained in:

- parent `522e0b1866`
- commit `4aaa7d40bf`
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Counts words in UTF8 encoded, '\n' delimited text received from the network.
 Usage: structured_network_wordcount.py <hostname> <port>
 <hostname> and <port> describe the TCP server that Structured Streaming
```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Counts words in UTF8 encoded, '\n' delimited text received from the network over a
 sliding window of configurable duration. Each line from the network is tagged
 with a timestamp that is used to determine the windows into which it falls.
```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Counts words in UTF8 encoded, '\n' delimited text directly received from Kafka in every 2 seconds.
 Usage: direct_kafka_wordcount.py <broker_list> <topic>

```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Counts words in UTF8 encoded, '\n' delimited text received from the network every second.
 Usage: flume_wordcount.py <hostname> <port>

```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Counts words in UTF8 encoded, '\n' delimited text received from the network every second.
 Usage: kafka_wordcount.py <zk> <topic>

```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Counts words in UTF8 encoded, '\n' delimited text received from the network every second.
 Usage: network_wordcount.py <hostname> <port>
 <hostname> and <port> describe the TCP server that Spark Streaming would connect to receive data.
```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Shows the most positive words in UTF8 encoded, '\n' delimited text directly received the network
 every 5 seconds. The streaming data is joined with a static RDD of the AFINN word list
 (http://neuro.imm.dtu.dk/wiki/AFINN)
```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Use DataFrames and SQL to count words in UTF8 encoded, '\n' delimited text received from the
 network every second.

```
```diff
@@ -15,7 +15,7 @@
 # limitations under the License.
 #

-"""
+r"""
 Counts words in UTF8 encoded, '\n' delimited text received from the
 network every second.

```