eecc43cb52
### What changes were proposed in this pull request?

When a SparkContext has already been initialized and we then call `SparkSession.builder.enableHiveSupport().getOrCreate()`, the SparkSession we create won't have Hive support, because we haven't reset the existing SparkContext conf's `spark.sql.catalogImplementation`. In this PR we use `sharedState.conf` to decide whether we should enable Hive support.

### Why are the changes needed?

We should respect `enableHiveSupport`.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Added UT.

Closes #31680 from AngersZhuuuu/SPARK-34568.

Authored-by: Angerszhuuuu <angers.zhu@gmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
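The scenario the PR describes can be sketched as below. This is a minimal illustration, not the patch itself; it assumes a local-mode Spark build with the Hive classes on the classpath.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

// A SparkContext is created first, before any SparkSession exists.
// Its conf does not set spark.sql.catalogImplementation, so the
// default ("in-memory") applies.
val sc = new SparkContext(
  new SparkConf().setMaster("local[1]").setAppName("hive-support-demo"))

// Before this fix, enableHiveSupport() could be silently ignored here:
// the builder consulted the pre-existing context's conf, where the
// catalog implementation was still "in-memory". With the fix, the
// decision is made from sharedState.conf, so the builder option wins.
val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// Inspect which catalog implementation actually took effect
// ("hive" is expected after the fix, given a Hive-enabled build).
println(spark.conf.get("spark.sql.catalogImplementation"))
```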