[SPARK-19905][SQL] Bring back Dataset.inputFiles for Hive SerDe tables

## What changes were proposed in this pull request?

`Dataset.inputFiles` works by matching `FileRelation`s in the query plan. In Spark 2.1, Hive SerDe tables were represented by `MetastoreRelation`, which inherits from `FileRelation`. In Spark 2.2, due to the unification of Hive SerDe tables and data source tables, Hive SerDe tables are represented by `CatalogRelation`, which no longer inherits from `FileRelation`. This change breaks `Dataset.inputFiles` for Hive SerDe tables.
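For reference, a minimal sketch of the still-working file-based case (the path is hypothetical): for such sources the plan contains a `FileRelation`, so `inputFiles` resolves to the underlying data files.

```scala
// Reading a file-based source puts a FileRelation in the plan,
// so inputFiles can list the underlying files.
val df = spark.read.parquet("/tmp/events")  // hypothetical path
df.inputFiles.foreach(println)              // e.g. .../part-00000-*.parquet
```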

This PR fixes the issue by explicitly matching `CatalogRelation`s that are Hive SerDe tables in `Dataset.inputFiles`. Note that we can't simply make `CatalogRelation` inherit from `FileRelation`, since not all `CatalogRelation`s are file-based (e.g., JDBC data source tables).
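To illustrate that last point, a hedged sketch (the connection details are made up): a JDBC-backed relation has no files on disk, so the only sensible result for it is an empty array.

```scala
// Hypothetical JDBC source: nothing file-based in the plan,
// so inputFiles returns an empty array rather than failing.
val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://localhost/testdb")
  .option("dbtable", "public.users")
  .load()
assert(jdbcDF.inputFiles.isEmpty)
```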

## How was this patch tested?

New test case added in `HiveDDLSuite`.

Author: Cheng Lian <lian@databricks.com>

Closes #17247 from liancheng/spark-19905-hive-table-input-files.
Committed by Wenchen Fan on 2017-03-10 15:19:32 -08:00
commit ffee4f1cef (parent bc30351404)
2 changed files with 14 additions and 0 deletions

sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala

@@ -36,6 +36,7 @@ import org.apache.spark.broadcast.Broadcast
 import org.apache.spark.rdd.RDD
 import org.apache.spark.sql.catalyst._
 import org.apache.spark.sql.catalyst.analysis._
+import org.apache.spark.sql.catalyst.catalog.CatalogRelation
 import org.apache.spark.sql.catalyst.encoders._
 import org.apache.spark.sql.catalyst.expressions._
 import org.apache.spark.sql.catalyst.expressions.aggregate._
@@ -2734,6 +2735,8 @@ class Dataset[T] private[sql](
         fsBasedRelation.inputFiles
       case fr: FileRelation =>
         fr.inputFiles
+      case r: CatalogRelation if DDLUtils.isHiveTable(r.tableMeta) =>
+        r.tableMeta.storage.locationUri.map(_.toString).toArray
     }.flatten
     files.toSet.toArray
   }
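Viewed in isolation, the new branch is just this mapping from catalog metadata to input paths (a sketch for illustration; `hiveSerDeInputFiles` is not a real helper in the patch):

```scala
import org.apache.spark.sql.catalyst.catalog.CatalogTable

// A Hive SerDe table reports its storage location URI (if any) as its
// single input path; a table without a location yields an empty array.
def hiveSerDeInputFiles(tableMeta: CatalogTable): Array[String] =
  tableMeta.storage.locationUri.map(_.toString).toArray
```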

sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala

@@ -1865,4 +1865,15 @@ class HiveDDLSuite
       }
     }
   }
+
+  test("SPARK-19905: Hive SerDe table input paths") {
+    withTable("spark_19905") {
+      withTempView("spark_19905_view") {
+        spark.range(10).createOrReplaceTempView("spark_19905_view")
+        sql("CREATE TABLE spark_19905 STORED AS RCFILE AS SELECT * FROM spark_19905_view")
+        assert(spark.table("spark_19905").inputFiles.nonEmpty)
+        assert(sql("SELECT input_file_name() FROM spark_19905").count() > 0)
+      }
+    }
+  }
 }