spark-instrumented-optimizer/sql/hive
Yin Huai 572b62cafe [SPARK-7853] [SQL] Fix HiveContext in Spark Shell
https://issues.apache.org/jira/browse/SPARK-7853

This fixes a problem introduced by my change in https://github.com/apache/spark/pull/6435, which caused HiveContext to fail to initialize in the Spark shell because of a class loader issue (a rough sketch of the jar-discovery idea follows the commit list below).

Author: Yin Huai <yhuai@databricks.com>

Closes #6459 from yhuai/SPARK-7853 and squashes the following commits:

37ad33e [Yin Huai] Do not use hiveQlTable at all.
47cdb6d [Yin Huai] Move hiveconf.set to the end of setConf.
005649b [Yin Huai] Update comment.
35d86f3 [Yin Huai] Access TTable directly to make sure Hive will not internally use any metastore utility functions.
3737766 [Yin Huai] Recursively find all jars.
2015-05-28 17:12:30 -07:00
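As a rough illustration of what "Recursively find all jars" (commit 3737766) might involve, here is a minimal Scala sketch. The object and method names are hypothetical and this is not Spark's actual implementation; it only shows one way to walk a directory tree and collect jar files so they can be added to a class loader.

    import java.io.File

    // Hypothetical helper: recursively collect all jar files under a directory.
    // Illustrative only; not the code from the SPARK-7853 patch.
    object JarFinder {
      def listJars(dir: File): Seq[File] = {
        // listFiles() returns null for unreadable or non-directory paths.
        val entries = Option(dir.listFiles()).map(_.toSeq).getOrElse(Seq.empty)
        val (dirs, files) = entries.partition(_.isDirectory)
        files.filter(_.getName.endsWith(".jar")) ++ dirs.flatMap(listJars)
      }

      def main(args: Array[String]): Unit = {
        // Example usage: print every jar found under the given directory.
        listJars(new File(args(0))).foreach(f => println(f.getAbsolutePath))
      }
    }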
compatibility/src/test/scala/org/apache/spark/sql/hive/execution [SQL] [TEST] udf_java_method failed due to jdk version 2015-05-21 12:31:58 -07:00
src [SPARK-7853] [SQL] Fix HiveContext in Spark Shell 2015-05-28 17:12:30 -07:00
v0.13.1/src/main/scala/org/apache/spark/sql/hive [SPARK-6505] [SQL] Remove the reflection call in HiveFunctionWrapper 2015-04-27 14:08:05 +08:00
pom.xml [SPARK-7850][BUILD] Hive 0.12.0 profile in POM should be removed 2015-05-27 00:18:42 -07:00