spark-instrumented-optimizer/external
DB Tsai a12de29c1a [SPARK-27838][SQL] Support user provided non-nullable avro schema for nullable catalyst schema without any null record
## What changes were proposed in this pull request?

When data is read from a source, the Catalyst schema is always nullable. Since Avro uses a union type to represent a nullable field, any non-nullable Avro file that is read and then written back out ends up with a changed schema.

This PR provides a way for users to keep the original Avro schema without being forced to use a union type.
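
As a minimal sketch of how a caller might use this (the schema string and paths below are illustrative, not taken from the PR; only the built-in Avro source's `avroSchema` option is assumed):

```scala
// Minimal sketch, assuming a SparkSession named `spark` and illustrative paths.
// The JSON schema string is a made-up example; only the `avroSchema` option name
// comes from the built-in Avro data source.
val nonNullableSchema =
  """
    |{
    |  "type": "record",
    |  "name": "Person",
    |  "fields": [
    |    {"name": "name", "type": "string"},
    |    {"name": "age",  "type": "int"}
    |  ]
    |}
  """.stripMargin

// Reading always yields a nullable Catalyst schema ...
val df = spark.read.format("avro").load("/tmp/people-input")

// ... but supplying a non-nullable Avro schema on write keeps the output
// schema free of union types, provided the data contains no null records.
df.write
  .format("avro")
  .option("avroSchema", nonNullableSchema)
  .save("/tmp/people-output")
```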

## How was this patch tested?

One test is added.

Closes #24682 from dbtsai/avroNull.

Authored-by: DB Tsai <d_tsai@apple.com>
Signed-off-by: DB Tsai <d_tsai@apple.com>
2019-05-24 21:47:14 +00:00
avro [SPARK-27838][SQL] Support user provided non-nullable avro schema for nullable catalyst schema without any null record 2019-05-24 21:47:14 +00:00
docker [SPARK-27794][R][DOCS] Use https URL for CRAN repo 2019-05-22 14:28:21 -07:00
docker-integration-tests [SPARK-27596][SQL] The JDBC 'query' option doesn't work for Oracle database 2019-05-05 21:52:23 -07:00
kafka-0-10 [SPARK-27294][SS] Add multi-cluster Kafka delegation token 2019-05-07 11:40:43 -07:00
kafka-0-10-assembly [SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0 2018-11-14 16:22:23 -08:00
kafka-0-10-sql [SPARK-27687][SS] Rename Kafka consumer cache capacity conf and document caching 2019-05-15 10:42:09 -07:00
kafka-0-10-token-provider [SPARK-27680][CORE][SQL][GRAPHX] Remove usage of Traversable 2019-05-14 09:14:56 -05:00
kinesis-asl [SPARK-27610][YARN] Shade netty native libraries 2019-05-07 10:47:36 -07:00
kinesis-asl-assembly [SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0 2018-11-14 16:22:23 -08:00
spark-ganglia-lgpl [SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0 2018-11-14 16:22:23 -08:00