[SPARK-15456][PYSPARK] Fixed PySpark shell context initialization when HiveConf not present

## What changes were proposed in this pull request?

When the PySpark shell cannot find HiveConf, it falls back to creating a SparkSession from a SparkContext. This fixes a bug in that fallback path, which referenced the `sc` variable before it had been initialized; the session is now created with `SparkSession.builder.getOrCreate()`, which does not require a pre-existing SparkContext.
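For illustration, a minimal sketch of the failure mode and the fix; this is a standalone example, not the shell.py code itself:

```python
from pyspark.sql import SparkSession

# Buggy pattern: at this point in shell.py no `sc` variable exists yet,
# so reaching the fallback raised NameError: name 'sc' is not defined.
# spark = SparkSession(sc)

# Fixed pattern: the builder creates (or reuses) the underlying
# SparkContext itself, so no pre-existing `sc` is required.
spark = SparkSession.builder.getOrCreate()

# `sc` is then derived from the session, never the other way around.
sc = spark.sparkContext
```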

## How was this patch tested?

Manually started the PySpark shell and used the resulting SparkContext.
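The exact commands are not recorded in the PR; a plausible manual check (assuming a build where the Hive classes are absent from the classpath, so the fallback path is exercised) would look like:

```python
# In the interactive shell started with bin/pyspark:
sc.parallelize(range(10)).sum()  # 45 -- SparkContext from the fallback path works
spark.range(5).count()           # 5  -- the SparkSession is usable as well
```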

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #13237 from BryanCutler/pyspark-shell-session-context-SPARK-15456.
Authored by Bryan Cutler on 2016-05-20 16:41:57 -07:00, committed by Andrew Or
parent 127bf1bb07
commit 021c19702c


```diff
@@ -44,9 +44,9 @@ try:
         .enableHiveSupport()\
         .getOrCreate()
 except py4j.protocol.Py4JError:
-    spark = SparkSession(sc)
+    spark = SparkSession.builder.getOrCreate()
 except TypeError:
-    spark = SparkSession(sc)
+    spark = SparkSession.builder.getOrCreate()
 
 sc = spark.sparkContext
 atexit.register(lambda: sc.stop())
```
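With both fallback branches going through `SparkSession.builder.getOrCreate()`, the builder initializes the underlying SparkContext on demand and `sc` is derived afterward from `spark.sparkContext`, so the shell exposes consistent `spark` and `sc` variables whether or not Hive support is available.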