Wenchen Fan dbb824125d [SPARK-21936][SQL] backward compatibility test framework for HiveExternalCatalog
## What changes were proposed in this pull request?

`HiveExternalCatalog` is a semi-public interface. When creating tables, `HiveExternalCatalog` converts the table metadata to the Hive table format and saves it into the Hive metastore. It is very important to guarantee backward compatibility here, i.e., tables created by previous Spark versions should still be readable in newer Spark versions.
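For illustration, a minimal sketch of the invariant this guarantees; the single shared session, the table name `t`, and the local-mode setup are assumptions, since the real check spans two separate Spark builds:

```scala
// Minimal sketch of the compatibility invariant. In the real framework the
// CREATE runs under an *older* Spark binary and the read runs under the
// current build; here both steps share one session purely for illustration.
import org.apache.spark.sql.SparkSession

object CompatInvariantSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .enableHiveSupport() // persist table metadata through HiveExternalCatalog
      .getOrCreate()

    spark.sql("DROP TABLE IF EXISTS t")
    spark.sql("CREATE TABLE t (i INT) USING parquet")
    spark.sql("INSERT INTO t VALUES (1)")

    // Reading back must succeed regardless of which Spark version wrote the
    // table metadata into the Hive metastore.
    assert(spark.table("t").count() == 1)
    spark.stop()
  }
}
```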

Previously we found backward compatibility issues manually, which made it easy to miss bugs. This PR introduces a test framework that automatically tests `HiveExternalCatalog` backward compatibility: it downloads Spark binaries of different versions, creates tables with those Spark versions, and reads the tables back with the current Spark version, as sketched below.
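A hedged sketch of the driver loop the description implies; the version list, the Apache archive URL layout, the `create_tables.py` script name, and the warehouse path are all assumptions for illustration, not the PR's actual identifiers:

```scala
// Hypothetical driver for the cross-version test: fetch old Spark binaries,
// have each one create tables in a shared warehouse, then let the current
// build read them back in an ordinary test suite.
import scala.sys.process._

object BackwardCompatDriverSketch {
  // Older versions whose catalog format we want to verify (assumed list).
  val testingVersions = Seq("2.0.2", "2.1.1", "2.2.0")

  def downloadSpark(version: String): Unit = {
    // Assumed URL layout of the Apache archive for pre-built binaries.
    val tarball = s"spark-$version-bin-hadoop2.7.tgz"
    s"wget -q https://archive.apache.org/dist/spark/spark-$version/$tarball".!
    s"tar -xzf $tarball".!
  }

  def main(args: Array[String]): Unit = {
    testingVersions.foreach { version =>
      downloadSpark(version)
      // Run a table-creating script with the *old* version's spark-submit,
      // pointing it at a warehouse dir shared with the current build.
      Seq(
        s"spark-$version-bin-hadoop2.7/bin/spark-submit",
        "--conf", "spark.sql.warehouse.dir=./warehouse",
        "create_tables.py" // hypothetical script issuing CREATE TABLE statements
      ).!
    }
    // The current Spark build then reads each table (e.g. spark.table(...))
    // and asserts the metadata and data survived the version change.
  }
}
```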

## How was this patch tested?

test-only change

Author: Wenchen Fan <wenchen@databricks.com>

Closes #19148 from cloud-fan/test.
2017-09-07 23:21:49 -07:00
compatibility/src/test/scala/org/apache/spark/sql/hive/execution [SPARK-21831][TEST] Remove spark.sql.hive.convertMetastoreOrc config in HiveCompatibilitySuite 2017-08-25 19:51:13 -07:00
src [SPARK-21936][SQL] backward compatibility test framework for HiveExternalCatalog 2017-09-07 23:21:49 -07:00
pom.xml [SPARK-21936][SQL] backward compatibility test framework for HiveExternalCatalog 2017-09-07 23:21:49 -07:00