[SPARK-15456][PYSPARK] Fixed PySpark shell context initialization when HiveConf not present
## What changes were proposed in this pull request?

When the PySpark shell cannot find HiveConf, it falls back to creating a SparkSession from a SparkContext. This fixes a bug caused by referencing the SparkContext variable before it was initialized.

## How was this patch tested?

Manually starting the PySpark shell and using the SparkContext.

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #13237 from BryanCutler/pyspark-shell-session-context-SPARK-15456.
parent 127bf1bb07
commit 021c19702c
```diff
@@ -44,9 +44,9 @@ try:
         .enableHiveSupport()\
         .getOrCreate()
 except py4j.protocol.Py4JError:
-    spark = SparkSession(sc)
+    spark = SparkSession.builder.getOrCreate()
 except TypeError:
-    spark = SparkSession(sc)
+    spark = SparkSession.builder.getOrCreate()
 
 sc = spark.sparkContext
 atexit.register(lambda: sc.stop())
```
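The diff above can be illustrated with a minimal, Spark-free sketch of the bug class it fixes. `FakeSession` and both `init_*` helpers below are hypothetical stand-ins (no pyspark required): before the patch, `shell.py` only assigned `sc` after the try/except, so the fallback `SparkSession(sc)` referenced a name that was not yet bound.

```python
# FakeSession is a hypothetical stand-in for pyspark.sql.SparkSession,
# used only to demonstrate the control flow; this is not Spark code.

class FakeSession:
    """Stand-in for SparkSession: optionally wraps an existing context."""
    def __init__(self, context=None):
        # Like builder.getOrCreate(): create a fresh context if none is given.
        self.sparkContext = context if context is not None else object()


def init_before_patch():
    """Mirrors the buggy flow: the fallback uses `sc` before it exists."""
    try:
        raise TypeError("simulated: HiveConf not present")
    except TypeError:
        spark = FakeSession(sc)  # UnboundLocalError: `sc` is assigned below
    sc = spark.sparkContext
    return spark


def init_after_patch():
    """Mirrors the fix: the fallback builds a fresh session, no `sc` needed."""
    try:
        raise TypeError("simulated: HiveConf not present")
    except TypeError:
        spark = FakeSession()  # like SparkSession.builder.getOrCreate()
    sc = spark.sparkContext
    return spark
```

Calling `init_before_patch()` raises `UnboundLocalError` (a subclass of `NameError`), which is the failure mode the patch removes; `init_after_patch()` completes normally because the fallback no longer depends on `sc`.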