# Test Application Builds

This directory includes test applications which are built when auditing releases. You can run them locally by setting appropriate environment variables.

```
$ cd sbt_app_core
$ SCALA_VERSION=2.11.7 \
  SPARK_VERSION=1.0.0-SNAPSHOT \
  SPARK_RELEASE_REPOSITORY=file:///home/patrick/.ivy2/local \
  sbt run
```
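
The other `sbt_app_*` projects (for example `sbt_app_sql` or `sbt_app_hive`) can be exercised the same way. The following is only a sketch: it assumes the same three environment variables apply to every sbt test app, and the repository URL and version numbers shown are illustrative placeholders, not values from a real release.

```
# Sketch: run another test app against a staged release repository.
# Placeholder repository URL and versions; adjust for the release being audited.
$ cd sbt_app_sql
$ SCALA_VERSION=2.11.7 \
  SPARK_VERSION=2.0.0 \
  SPARK_RELEASE_REPOSITORY=https://repository.apache.org/content/repositories/orgapachespark-XXXX \
  sbt run
```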