spark-instrumented-optimizer/python
Ewen Cheslack-Postava 56d230e614 Add classmethod to SparkContext to set system properties.
Add a new classmethod to SparkContext for setting system properties, as
is possible in Scala/Java. Unlike the Java/Scala implementations, there
is no access to System until the JVM bridge is created. Since
SparkContext handles that, move the initialization of the JVM
connection into a separate classmethod that can safely be called
repeatedly as long as the same instance (or no instance) is provided.
2013-10-22 00:22:37 -07:00
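A minimal sketch of the pattern the commit describes, assuming PySpark's launch_gateway() helper is importable and treating the internal attribute names (_gateway, _jvm, _active_spark_context) as illustrative rather than exact:

    import threading

    from pyspark.java_gateway import launch_gateway  # creates the Py4J JVM bridge


    class SparkContext(object):
        """Sketch of the idempotent JVM-initialization pattern described above."""

        _gateway = None
        _jvm = None
        _active_spark_context = None
        _lock = threading.Lock()

        @classmethod
        def _ensure_initialized(cls, instance=None):
            # Safe to call repeatedly: the gateway is launched only once, and an
            # instance is accepted as long as it matches the active one (or no
            # instance is given).
            with cls._lock:
                if not cls._gateway:
                    cls._gateway = launch_gateway()
                    cls._jvm = cls._gateway.jvm
                if instance:
                    if (cls._active_spark_context
                            and cls._active_spark_context is not instance):
                        raise ValueError("Cannot run multiple SparkContexts at once")
                    cls._active_spark_context = instance

        @classmethod
        def setSystemProperty(cls, key, value):
            """Set a Java system property, such as spark.executor.memory.
            Must be called before instantiating a SparkContext."""
            cls._ensure_initialized()
            cls._jvm.java.lang.System.setProperty(key, value)

With this in place, SparkContext.setSystemProperty("spark.executor.memory", "2g") can be invoked before a context is constructed, mirroring the Scala/Java API.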
examples Add banner to PySpark and make wordcount output nicer 2013-09-01 14:13:16 -07:00
lib Fix PySpark for assembly run and include it in dist 2013-08-29 21:19:06 -07:00
pyspark Add classmethod to SparkContext to set system properties. 2013-10-22 00:22:37 -07:00
test_support Implementing SPARK-878 for PySpark: adding zip and egg files to context and passing it down to workers which add these to their sys.path 2013-08-16 11:58:20 -07:00
.gitignore Rename top-level 'pyspark' directory to 'python' 2013-01-01 15:05:00 -08:00
epydoc.conf Exclude some private modules in epydoc 2013-09-02 12:22:52 -07:00
run-tests Fix PySpark unit tests on Python 2.6. 2013-08-14 15:12:12 -07:00