spark-instrumented-optimizer/python/pyspark/sql
Yin Huai dc9c9196d6 [SPARK-6366][SQL] In Python API, the default save mode for save and saveAsTable should be "error" instead of "append".
https://issues.apache.org/jira/browse/SPARK-6366

Author: Yin Huai <yhuai@databricks.com>

Closes #5053 from yhuai/SPARK-6366 and squashes the following commits:

fc81897 [Yin Huai] Use error as the default save mode for save/saveAsTable.
2015-03-18 09:41:06 +08:00
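The commit above flips the Python writer's default save mode from "append" to "error", matching Spark SQL's SaveMode semantics. As a rough illustration of what the four modes mean when the target table or path already exists, here is a minimal pure-Python sketch; the `write_table` helper and dict-backed store are hypothetical stand-ins, not Spark APIs:

```python
def write_table(store, name, rows, mode="error"):
    """Write rows under name in store (a dict), honoring a Spark-style save mode."""
    if name in store:
        if mode == "error":      # new PySpark default after SPARK-6366: refuse to clobber
            raise ValueError("table %r already exists" % name)
        if mode == "ignore":     # silently keep the existing data
            return
        if mode == "append":     # old (pre-fix) PySpark default: extend existing data
            store[name] = store[name] + list(rows)
            return
    # "overwrite", or target does not exist yet: just write
    store[name] = list(rows)

store = {"t": [1, 2]}
write_table(store, "t", [3], mode="append")   # store["t"] becomes [1, 2, 3]
```

With "error" as the default, an unqualified `write_table(store, "t", [4])` now raises instead of silently growing the table, which is the safer behavior the fix restores.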
__init__.py [SPARK-5752][SQL] Don't implicitly convert RDDs directly to DataFrames 2015-02-13 23:03:22 -08:00
context.py [SPARK-6055] [PySpark] fix incorrect __eq__ of DataType 2015-02-27 20:07:17 -08:00
dataframe.py [SPARK-6366][SQL] In Python API, the default save mode for save and saveAsTable should be "error" instead of "append". 2015-03-18 09:41:06 +08:00
functions.py [SPARK-5994] [SQL] Python DataFrame documentation fixes 2015-02-24 20:51:55 -08:00
tests.py [SPARK-6055] [PySpark] fix incorrect __eq__ of DataType 2015-02-27 20:07:17 -08:00
types.py [SPARK-6121][SQL][MLLIB] simpleString for UDT 2015-03-02 17:14:34 -08:00