spark-instrumented-optimizer/python
Michael Armbrust 44233865cf [SQL] Make it possible to create Java/Python SQLContexts from an existing Scala SQLContext.
Author: Michael Armbrust <michael@databricks.com>

Closes #761 from marmbrus/existingContext and squashes the following commits:

4651051 [Michael Armbrust] Make it possible to create Java/Python SQLContexts from an existing Scala SQLContext.
2014-05-13 21:23:51 -07:00
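For context, the commit above lets the Python-side SQLContext wrap an already-constructed JVM/Scala SQLContext instead of always creating its own, so both sides share the same underlying context state. A minimal sketch of that usage, assuming the optional sqlContext constructor parameter this change describes; the JVM handle below is created through py4j purely for illustration and would normally come from existing Scala/Java code:

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext("local", "example")

    # Ordinary construction: PySpark creates its own JVM SQLContext.
    sql_ctx = SQLContext(sc)

    # Sketch of the new capability: pass an existing JVM SQLContext
    # (a py4j JavaObject) so Python wraps it rather than making a new one.
    jvm_sql_ctx = sc._jvm.org.apache.spark.sql.SQLContext(sc._jsc.sc())
    shared_ctx = SQLContext(sc, sqlContext=jvm_sql_ctx)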
lib SPARK-1004. PySpark on YARN 2014-04-29 23:24:34 -07:00
pyspark [SQL] Make it possible to create Java/Python SQLContexts from an existing Scala SQLContext. 2014-05-13 21:23:51 -07:00
test_support License headers 2013-12-09 16:41:01 -08:00
.gitignore SPARK-1004. PySpark on YARN 2014-04-29 23:24:34 -07:00
epydoc.conf [SPARK-1439, SPARK-1440] Generate unified Scaladoc across projects and Javadocs 2014-04-21 21:57:40 -07:00
run-tests FIX: Don't build Hive in assembly unless running Hive tests. 2014-04-17 17:24:00 -07:00