spark-instrumented-optimizer/python/pyspark
Davies Liu 37719e0cd0 [SPARK-8189] [SQL] use Long for TimestampType in SQL
This PR changes TimestampType to use Long as its internal type for efficiency, which means it will lose precision below 100ns.

Author: Davies Liu <davies@databricks.com>

Closes #6733 from davies/timestamp and squashes the following commits:

d9565fa [Davies Liu] remove print
65cf2f1 [Davies Liu] fix Timestamp in SparkR
86fecfb [Davies Liu] disable two timestamp tests
8f77ee0 [Davies Liu] fix scala style
246ee74 [Davies Liu] address comments
309d2e1 [Davies Liu] use Long for TimestampType in SQL
2015-06-10 16:55:39 -07:00
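
For context, a minimal sketch (not the actual PySpark code) of what storing a timestamp as a Long means here: the internal value is a count of 100ns ticks since the Unix epoch, so anything finer than 100ns cannot be represented. The helper names below are illustrative only.

    import calendar
    import time
    from datetime import datetime

    def datetime_to_internal(dt):
        """Illustrative only: encode a datetime as a Long of 100ns ticks since the epoch."""
        if dt.tzinfo is not None:
            seconds = calendar.timegm(dt.utctimetuple())
        else:
            seconds = time.mktime(dt.timetuple())
        # microseconds -> 100ns ticks (x10); sub-100ns precision is truncated
        return int(seconds) * 10000000 + dt.microsecond * 10

    def internal_to_datetime(ticks):
        """Illustrative only: decode the Long back into a (local-time) datetime."""
        dt = datetime.fromtimestamp(ticks // 10000000)
        return dt.replace(microsecond=(ticks % 10000000) // 10)

    # example: round-trips at microsecond granularity
    ts = datetime(2015, 6, 10, 16, 55, 39, 123456)
    assert internal_to_datetime(datetime_to_internal(ts)) == ts
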
ml [SPARK-7432] [MLLIB] fix flaky CrossValidator doctest 2015-06-02 08:51:00 -07:00
mllib [SPARK-7639] [PYSPARK] [MLLIB] Python API for KernelDensity 2015-06-06 14:52:14 -07:00
sql [SPARK-8189] [SQL] use Long for TimestampType in SQL 2015-06-10 16:55:39 -07:00
streaming [SPARK-2808] [STREAMING] [KAFKA] cleanup tests from 2015-06-07 21:42:45 +01:00
__init__.py [SPARK-4172] [PySpark] Progress API in Python 2015-02-17 13:36:43 -08:00
accumulators.py [SPARK-7899] [PYSPARK] Fix Python 3 pyspark/sql/types module conflict 2015-05-29 14:13:44 -07:00
broadcast.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
cloudpickle.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
conf.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
context.py [SPARK-8116][PYSPARK] Allow sc.range() to take a single argument. 2015-06-04 22:22:01 -07:00
daemon.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
files.py [SPARK-3309] [PySpark] Put all public API in __all__ 2014-09-03 11:49:45 -07:00
heapq3.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
java_gateway.py [SPARK-6949] [SQL] [PySpark] Support Date/Timestamp in Column expression 2015-04-21 00:08:18 -07:00
join.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
profiler.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
rdd.py [SPARK-6416] [DOCS] RDD.fold() requires the operator to be commutative 2015-05-21 19:42:51 +01:00
rddsampler.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
resultiterable.py [SPARK-3074] [PySpark] support groupByKey() with single huge key 2015-04-09 17:07:23 -07:00
serializers.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
shell.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
shuffle.py [SPARK-7339] [PYSPARK] PySpark shuffle spill memory sometimes are not correct 2015-05-26 08:35:39 -07:00
statcounter.py [SPARK-4897] [PySpark] Python 3 support 2015-04-16 16:20:57 -07:00
status.py [SPARK-4172] [PySpark] Progress API in Python 2015-02-17 13:36:43 -08:00
storagelevel.py [SPARK-3417] Use new-style classes in PySpark 2014-09-08 15:45:36 -07:00
tests.py [SPARK-7711] Add a startTime property to match the corresponding one in Scala 2015-05-21 14:08:57 -07:00
traceback_utils.py [SPARK-1087] Move python traceback utilities into new traceback_utils.py file. 2014-09-15 19:28:17 -07:00
worker.py [SPARK-6216] [PYSPARK] check python version of worker with driver 2015-05-18 12:55:13 -07:00