[MINOR] [PYSPARK] [EXAMPLES] Changed examples to use SparkSession.sparkContext instead of _sc

## What changes were proposed in this pull request?

Some PySpark examples need a `SparkContext` and get one by reaching into the session's private `_sc` attribute. These examples should use the public `sparkContext` property provided by `SparkSession` instead.
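For reference, a minimal sketch of the recommended pattern (the app name and the small job below are illustrative, not taken from the patch):

```python
from pyspark.sql import SparkSession

if __name__ == "__main__":
    spark = SparkSession\
        .builder\
        .appName("SparkContextExample")\
        .getOrCreate()

    # Use the public property rather than the private attribute spark._sc
    sc = spark.sparkContext

    # RDD work can now go through the public handle
    print(sc.parallelize(range(10)).count())

    spark.stop()
```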

## How was this patch tested?
Ran the modified examples.
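For instance, a modified example can be exercised locally with `./bin/spark-submit examples/src/main/python/pi.py 2` (an illustrative invocation, assuming Spark's standard examples layout).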

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #13303 from BryanCutler/pyspark-session-sparkContext-MINOR.
Bryan Cutler authored on 2016-05-25 14:29:14 -07:00; committed by Davies Liu
commit 9c297df3d4, parent 698ef762f8
5 changed files with 5 additions and 5 deletions

@@ -67,7 +67,7 @@ if __name__ == "__main__":
         .appName("PythonALS")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     M = int(sys.argv[1]) if len(sys.argv) > 1 else 100
     U = int(sys.argv[2]) if len(sys.argv) > 2 else 500

@@ -70,7 +70,7 @@ if __name__ == "__main__":
         .appName("AvroKeyInputFormat")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     conf = None
     if len(sys.argv) == 3:

@@ -53,7 +53,7 @@ if __name__ == "__main__":
         .appName("ParquetInputFormat")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     parquet_rdd = sc.newAPIHadoopFile(
         path,

@@ -32,7 +32,7 @@ if __name__ == "__main__":
         .appName("PythonPi")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
     n = 100000 * partitions

@@ -46,7 +46,7 @@ if __name__ == "__main__":
         .appName("PythonTransitiveClosure")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
     tc = sc.parallelize(generateGraph(), partitions).cache()