[SPARK-32959][SQL][TEST] Fix an invalid test in DataSourceV2SQLSuite

### What changes were proposed in this pull request?

This PR addresses two issues related to the `Relation: view text` test in `DataSourceV2SQLSuite`.

1. The test has the following block:
```scala
withView("view1") { v1: String =>
  sql(...)
}
```
Since `withView`'s signature is `withView(v: String*)(f: => Unit): Unit`, the block passed as `f` is the function literal `v1: String => sql(...)`. Scala's value discarding lets this compile, but the literal is only *defined*, never applied, so the test body never actually runs.
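For illustration, here is a minimal, self-contained sketch of the gotcha. The `withView` stand-in and the `println` bodies are simplified stand-ins, not Spark's actual helper:
```scala
// Simplified stand-in mirroring the test helper's shape.
def withView(viewNames: String*)(f: => Unit): Unit = {
  try f finally viewNames.foreach(v => println(s"cleanup: DROP VIEW $v"))
}

// BROKEN: the braces define a function literal of type String => Unit.
// Value discarding converts it to Unit, so this compiles, but the
// literal is never applied and its body never executes.
withView("view1") { v1: String =>
  println(s"never printed: $v1")
}

// FIXED: a plain by-name block, which does run.
withView("view1") {
  println("this runs")
}
```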

2. Once the test is fixed to actually run, it fails. The reason is that the v2 session catalog implementation used in tests does not handle `V1Table` for views correctly in `loadTable`: it wraps every loaded table in its own table type, so views are resolved to `ResolvedTable` instead of `ResolvedView`, causing the test failure: f1dc479d39/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala (L1007-L1011)
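For reference, the linked analyzer check branches on the type of the loaded table; it looks roughly like this (paraphrased from the linked lines, not the verbatim source; `ident` and `catalog` come from the surrounding analyzer code):
```scala
// Only a V1Table whose underlying catalog table is a VIEW resolves to
// ResolvedView; anything else (including the test catalog's wrapper
// around a view) falls through to ResolvedTable.
case v1Table: V1Table if v1Table.v1Table.tableType == CatalogTableType.VIEW =>
  ResolvedView(ident)
case table =>
  ResolvedTable(catalog.asTableCatalog, ident, table)
```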

### Why are the changes needed?

This fixes an invalid test and a bug in the test-only v2 session catalog implementation that the test exposed.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Existing test.

Closes #29811 from imback82/fix_minor_test.

Authored-by: Terry Kim <yuminkim@gmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>


```diff
@@ -758,8 +758,9 @@ class DataSourceV2SQLSuite
   test("Relation: view text") {
     val t1 = "testcat.ns1.ns2.tbl"
+    val v1 = "view1"
     withTable(t1) {
-      withView("view1") { v1: String =>
+      withView(v1) {
         sql(s"CREATE TABLE $t1 USING foo AS SELECT id, data FROM source")
         sql(s"CREATE VIEW $v1 AS SELECT * from $t1")
         checkAnswer(sql(s"TABLE $v1"), spark.table("source"))
```


```diff
@@ -22,8 +22,8 @@ import java.util.concurrent.ConcurrentHashMap
 import scala.collection.JavaConverters._
 import org.apache.spark.sql.catalyst.analysis.NoSuchTableException
-import org.apache.spark.sql.connector.catalog.{DelegatingCatalogExtension, Identifier, Table}
+import org.apache.spark.sql.catalyst.catalog.CatalogTableType
+import org.apache.spark.sql.connector.catalog.{DelegatingCatalogExtension, Identifier, Table, V1Table}
 import org.apache.spark.sql.connector.expressions.Transform
 import org.apache.spark.sql.types.StructType
@@ -47,10 +47,13 @@ private[connector] trait TestV2SessionCatalogBase[T <: Table] extends DelegatingCatalogExtension
       tables.get(ident)
     } else {
       // Table was created through the built-in catalog
-      val t = super.loadTable(ident)
-      val table = newTable(t.name(), t.schema(), t.partitioning(), t.properties())
-      tables.put(ident, table)
-      table
+      super.loadTable(ident) match {
+        case v1Table: V1Table if v1Table.v1Table.tableType == CatalogTableType.VIEW => v1Table
+        case t =>
+          val table = newTable(t.name(), t.schema(), t.partitioning(), t.properties())
+          tables.put(ident, table)
+          table
+      }
     }
   }
```
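With this change, `loadTable` passes the original `V1Table` through untouched when it represents a view, so the analyzer's view check above matches and the relation resolves to `ResolvedView` as the test expects.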