windpiger 8f33731e79 [SPARK-19664][SQL] put hive.metastore.warehouse.dir in hadoopconf to overwrite its original value
## What changes were proposed in this pull request?

In [SPARK-15959](https://issues.apache.org/jira/browse/SPARK-15959), we brought back `hive.metastore.warehouse.dir`. However, in that logic, when the value of `spark.sql.warehouse.dir` is used to overwrite `hive.metastore.warehouse.dir`, it is set on `sparkContext.conf`, which does not overwrite the value in hadoopConf. I think it should instead be set on `sparkContext.hadoopConfiguration` so that it overwrites the original hadoopConf value.

https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala#L64
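
For illustration, here is a minimal, self-contained Scala sketch of the distinction the patch addresses (a hypothetical standalone example, not the actual `SharedState` code; the object name and paths are made up):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.SparkConf

object WarehouseConfSketch {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
    val hadoopConf = new Configuration()

    // Simulate a pre-existing value, e.g. one loaded from hive-site.xml.
    hadoopConf.set("hive.metastore.warehouse.dir", "/user/hive/warehouse")

    // Hypothetical spark.sql.warehouse.dir value.
    val sparkWarehouseDir = "/tmp/spark-warehouse"

    // Behavior before this patch: the key is set only on SparkConf,
    // so the Hadoop-side value is left untouched.
    sparkConf.set("hive.metastore.warehouse.dir", sparkWarehouseDir)
    println(hadoopConf.get("hive.metastore.warehouse.dir")) // still /user/hive/warehouse

    // The fix: set it on the Hadoop configuration as well, which
    // actually overwrites the original hadoopConf value.
    hadoopConf.set("hive.metastore.warehouse.dir", sparkWarehouseDir)
    println(hadoopConf.get("hive.metastore.warehouse.dir")) // /tmp/spark-warehouse
  }
}
```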

## How was this patch tested?
N/A

Author: windpiger <songjun@outlook.com>

Closes #16996 from windpiger/hivemetawarehouseConf.
2017-02-23 22:57:23 -08:00
java/org/apache/spark [SPARK-19534][TESTS] Convert Java tests to use lambdas, Java 8 features 2017-02-19 09:42:50 -08:00
resources [SPARK-16031] Add debug-only socket source in Structured Streaming 2016-06-19 21:27:04 -07:00
scala/org/apache/spark/sql [SPARK-19664][SQL] put hive.metastore.warehouse.dir in hadoopconf to overwrite its original value 2017-02-23 22:57:23 -08:00