spark-instrumented-optimizer/sql/core
Marcelo Vanzin 7f9da2b7f8 [SPARK-28371][SQL] Make Parquet "StartsWith" filter null-safe
Parquet may call the filter with a null value to check whether nulls are
accepted. While Spark seems to avoid that code path with Parquet 1.10, with
Parquet 1.11 it causes Spark unit tests to fail.
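For context, Spark pushes `StringStartsWith` down to Parquet as a `UserDefinedPredicate[Binary]`, and the null check belongs in its `keep` method. The sketch below illustrates that idea only; it is not the actual `ParquetFilters` code, and `NullSafeStartsWith`/`columnPath` are hypothetical names.

```scala
import org.apache.parquet.filter2.predicate.{FilterApi, FilterPredicate, Statistics, UserDefinedPredicate}
import org.apache.parquet.io.api.Binary

// Hypothetical helper (illustration only): builds a Parquet push-down
// predicate for "column starts with prefix" whose keep() tolerates null.
object NullSafeStartsWith {
  def apply(columnPath: String, prefix: String): FilterPredicate = {
    val prefixBytes = prefix.getBytes("UTF-8")
    FilterApi.userDefined(
      FilterApi.binaryColumn(columnPath),
      new UserDefinedPredicate[Binary] with Serializable {
        // Parquet may probe keep(null) to ask whether nulls pass the filter;
        // answering false, rather than dereferencing the value, is the
        // null-safety this change is about.
        override def keep(value: Binary): Boolean = value != null && {
          val bytes = value.getBytes
          bytes.length >= prefixBytes.length &&
            prefixBytes.indices.forall(i => bytes(i) == prefixBytes(i))
        }

        // Conservative min/max handling: never drop a row group based on
        // statistics, so the sketch stays correct without the real
        // comparator logic.
        override def canDrop(statistics: Statistics[Binary]): Boolean = false
        override def inverseCanDrop(statistics: Statistics[Binary]): Boolean = false
      })
  }
}
```

In Spark's real predicate the statistics checks compare the prefix against the row-group min/max; the null guard in `keep` is the part this commit adds.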

Tested with Parquet 1.11 (and a new unit test).

Closes #25140 from vanzin/SPARK-28371.

Authored-by: Marcelo Vanzin <vanzin@cloudera.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
2019-07-13 11:38:54 -07:00
| Path | Latest commit | Date |
| --- | --- | --- |
| benchmarks | [SPARK-27701][SQL] Extend NestedColumnAliasing to general nested field cases including GetArrayStructField | 2019-06-11 20:12:53 -07:00 |
| src | [SPARK-28371][SQL] Make Parquet "StartsWith" filter null-safe | 2019-07-13 11:38:54 -07:00 |
| v1.2.1/src | [SPARK-28108][SQL][test-hadoop3.2] Simplify OrcFilters | 2019-06-24 12:23:52 +08:00 |
| v2.3.5/src | [SPARK-28108][SQL][test-hadoop3.2] Simplify OrcFilters | 2019-06-24 12:23:52 +08:00 |
| pom.xml | [SPARK-27521][SQL] Move data source v2 to catalyst module | 2019-06-05 09:55:55 -07:00 |