[MINOR] [SQL] Fix sphinx warnings in PySpark SQL

Author: MechCoder <manojkumarsivaraj334@gmail.com>

Closes #8171 from MechCoder/sql_sphinx.
Authored by MechCoder on 2015-08-20 10:05:31 -07:00, committed by Xiangrui Meng
parent b4f4e91c39
commit 52c60537a2
2 changed files with 7 additions and 5 deletions


@@ -302,10 +302,10 @@ class SparkContext(object):
 """
 A unique identifier for the Spark application.
 Its format depends on the scheduler implementation.
-(i.e.
-in case of local spark app something like 'local-1433865536131'
-in case of YARN something like 'application_1433865536131_34483'
-)
+
+* in case of local spark app something like 'local-1433865536131'
+* in case of YARN something like 'application_1433865536131_34483'
+
 >>> sc.applicationId # doctest: +ELLIPSIS
 u'local-...'
 """


@@ -467,9 +467,11 @@ class StructType(DataType):
 """
 Construct a StructType by adding new elements to it to define the schema. The method accepts
 either:
 a) A single parameter which is a StructField object.
 b) Between 2 and 4 parameters as (name, data_type, nullable (optional),
-metadata(optional). The data_type parameter may be either a String or a DataType object
+metadata(optional). The data_type parameter may be either a String or a
+DataType object.
 >>> struct1 = StructType().add("f1", StringType(), True).add("f2", StringType(), True, None)
 >>> struct2 = StructType([StructField("f1", StringType(), True),\
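
As a supplement, a minimal hedged sketch of the two call styles the docstring describes (standalone example, not part of the diff; the field names are illustrative):

from pyspark.sql.types import StructType, StructField, StringType

# a) pass a single StructField object
schema_a = StructType().add(StructField("f1", StringType(), True))

# b) pass (name, data_type, nullable); data_type may be a DataType or a string
schema_b = StructType().add("f1", StringType(), True).add("f2", "string", True)

print(schema_a == StructType([StructField("f1", StringType(), True)]))  # True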