[SPARK-33850][SQL][FOLLOWUP] Improve and cleanup the test code

### What changes were proposed in this pull request?

This PR mainly improves and cleans up the test code introduced in #30855, based on review comments.
The test code was originally adapted from another test, `explain formatted - check presence of subquery in case of DPP`, so this PR also cleans up that test (removing an unnecessary `withTable`).

### Why are the changes needed?

To keep the test code clean.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

`ExplainSuite` passes.

Closes #30861 from sarutak/followup-SPARK-33850.

Authored-by: Kousuke Saruta <sarutak@oss.nttdata.com>
Signed-off-by: Takeshi Yamamuro <yamamuro@apache.org>
Authored by Kousuke Saruta on 2020-12-21 09:40:42 +09:00, committed by Takeshi Yamamuro
parent 13391683e7
commit 3c8be3983c


```diff
@@ -233,7 +233,6 @@ class ExplainSuite extends ExplainSuiteHelper with DisableAdaptiveExecutionSuite
     withSQLConf(SQLConf.DYNAMIC_PARTITION_PRUNING_ENABLED.key -> "true",
       SQLConf.DYNAMIC_PARTITION_PRUNING_REUSE_BROADCAST_ONLY.key -> "false",
       SQLConf.EXCHANGE_REUSE_ENABLED.key -> "false") {
-      withTable("df1", "df2") {
       spark.range(1000).select(col("id"), col("id").as("k"))
         .write
         .partitionBy("k")
@@ -275,25 +274,19 @@ class ExplainSuite extends ExplainSuiteHelper with DisableAdaptiveExecutionSuite
         }
       }
     }
-  }
 
   test("SPARK-33850: explain formatted - check presence of subquery in case of AQE") {
-    withTable("df1") {
-      withSQLConf(SQLConf.ADAPTIVE_EXECUTION_ENABLED.key -> "true") {
-        withTable("df1") {
-          spark.range(1, 100)
-            .write
-            .format("parquet")
-            .mode("overwrite")
-            .saveAsTable("df1")
-          val sqlText = "EXPLAIN FORMATTED SELECT (SELECT min(id) FROM df1) as v"
-          val expected_pattern1 =
-            "Subquery:1 Hosting operator id = 2 Hosting Expression = Subquery subquery#x"
-          withNormalizedExplain(sqlText) { normalizedOutput =>
-            assert(expected_pattern1.r.findAllMatchIn(normalizedOutput).length == 1)
-          }
-        }
+    withSQLConf(SQLConf.ADAPTIVE_EXECUTION_ENABLED.key -> "true") {
+      withTempView("df") {
+        val df = spark.range(1, 100)
+        df.createTempView("df")
+        val sqlText = "EXPLAIN FORMATTED SELECT (SELECT min(id) FROM df) as v"
+        val expected_pattern =
+          "Subquery:1 Hosting operator id = 2 Hosting Expression = Subquery subquery#x"
+        withNormalizedExplain(sqlText) { normalizedOutput =>
+          assert(expected_pattern.r.findAllMatchIn(normalizedOutput).length == 1)
+        }
       }
     }
   }
```
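The test's assertion counts occurrences of the expected subquery line with Scala's `Regex.findAllMatchIn`, requiring exactly one match in the normalized plan output. A minimal sketch of that counting idiom in plain Scala (the `SubqueryPatternCheck` object and the stand-in plan text are assumptions for illustration; a real run would use Spark's `withNormalizedExplain` output):

```scala
object SubqueryPatternCheck {
  // `.r` compiles a String into scala.util.matching.Regex; findAllMatchIn
  // returns an iterator over all non-overlapping matches, so `.length`
  // counts how many times the pattern occurs in the input.
  def countMatches(pattern: String, text: String): Int =
    pattern.r.findAllMatchIn(text).length

  def main(args: Array[String]): Unit = {
    // Stand-in for normalized EXPLAIN FORMATTED output (hypothetical text).
    val normalizedOutput =
      "== Physical Plan ==\n" +
      "Subquery:1 Hosting operator id = 2 Hosting Expression = Subquery subquery#x\n"
    val expectedPattern =
      "Subquery:1 Hosting operator id = 2 Hosting Expression = Subquery subquery#x"
    // Mirrors the test's assertion: the pattern must appear exactly once.
    assert(countMatches(expectedPattern, normalizedOutput) == 1)
  }
}
```

Note that the pattern string is treated as a regular expression, so this idiom works here only because the expected line contains no regex metacharacters that would change its meaning.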