Commit a432a2b860 by Wenchen Fan: [SPARK-15116] In REPL we should create SparkSession first and get SparkContext from it
## What changes were proposed in this pull request?

See https://github.com/apache/spark/pull/12873#discussion_r61993910 for the original discussion. The problem is that if we create a `SparkContext` first and then call `SparkSession.builder.enableHiveSupport().getOrCreate()`, the builder reuses the existing `SparkContext` and the Hive flag is never applied. Creating the `SparkSession` first and obtaining the `SparkContext` from it ensures that `enableHiveSupport()` takes effect, as the sketch below illustrates.
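
A minimal sketch of the two orderings, assuming a local master, a hypothetical app name, and Hive classes on the classpath; this is illustrative only, not the actual REPL bootstrap code:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession

// Broken ordering (what the REPL did before this patch): a pre-existing
// SparkContext is silently reused by the builder, so enableHiveSupport()
// never takes effect on it.
//
//   val sc    = new SparkContext(conf)
//   val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
//
// Fixed ordering: build the SparkSession first so the Hive flag is
// honored, then obtain the SparkContext from the session.
val spark = SparkSession.builder()
  .master("local[*]")        // assumption: local master, for illustration
  .appName("repl-sketch")    // assumption: hypothetical app name
  .enableHiveSupport()       // requires Hive classes on the classpath
  .getOrCreate()
val sc: SparkContext = spark.sparkContext
```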

## How was this patch tested?

Verified it locally.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #12890 from cloud-fan/repl.
Committed: 2016-05-04 14:40:54 -07:00
Subdirectories of `repl/scala-2.11/src` and their last commits:

- `main/scala/org/apache/spark/repl`: [SPARK-15116] In REPL we should create SparkSession first and get SparkContext from it (2016-05-04 14:40:54 -07:00)
- `test/scala/org/apache/spark/repl`: [SPARK-14828][SQL] Start SparkSession in REPL instead of SQLContext (2016-04-25 15:30:18 -07:00)