spark-instrumented-optimizer/sql/hive
gatorsmile 24c0c94128 [SPARK-18949][SQL] Add recoverPartitions API to Catalog
### What changes were proposed in this pull request?

Currently, we only have a SQL interface for recovering all the partitions in the directory of a table and updating the catalog: `MSCK REPAIR TABLE` or `ALTER TABLE table RECOVER PARTITIONS`. (Actually, I find `MSCK` very hard to remember and have no idea what it stands for.)
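For reference, these existing SQL commands can be issued from Scala through `spark.sql` (a minimal sketch; `testTable` is a hypothetical table name, not part of this PR):
```Scala
// Existing SQL-only ways to recover partitions of a hypothetical table:
spark.sql("MSCK REPAIR TABLE testTable")
// or, equivalently:
spark.sql("ALTER TABLE testTable RECOVER PARTITIONS")
```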

After the new "Scalable Partition Handling" work, repairing a table becomes much more important, since it is what makes the data in a newly created partitioned data source table visible.

Thus, this PR adds it to the Catalog interface. After this PR, users can repair a table by calling
```Scala
spark.catalog.recoverPartitions("testTable")
```
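As a rough end-to-end sketch of where the new API fits (the table name, schema, and location below are hypothetical and only for illustration):
```Scala
// A minimal sketch, assuming a hypothetical table name, schema, and location.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("RecoverPartitionsExample")
  .enableHiveSupport()
  .getOrCreate()

// A partitioned data source table backed by an external location.
spark.sql(
  """CREATE TABLE testTable (id INT, part STRING)
    |USING parquet
    |PARTITIONED BY (part)
    |LOCATION '/tmp/testTable'""".stripMargin)

// Suppose partition directories such as /tmp/testTable/part=a/ are added
// directly on the file system, outside of Spark. They stay invisible to
// queries until the catalog is told about them.
spark.catalog.recoverPartitions("testTable")

// The discovered partitions are now registered in the catalog and queryable.
spark.sql("SELECT * FROM testTable").show()
```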

### How was this patch tested?
Modified the existing test cases.

Author: gatorsmile <gatorsmile@gmail.com>

Closes #16356 from gatorsmile/repairTable.
2016-12-20 23:40:02 -08:00
compatibility/src/test/scala/org/apache/spark/sql/hive/execution [SPARK-16904][SQL] Removal of Hive Built-in Hash Functions and TestHiveFunctionRegistry 2016-11-07 01:16:37 -08:00
src [SPARK-18949][SQL] Add recoverPartitions API to Catalog 2016-12-20 23:40:02 -08:00
pom.xml [SPARK-18695] Bump master branch version to 2.2.0-SNAPSHOT 2016-12-02 21:09:37 -08:00