[SPARKR] found some extra whitespace in the R tests

## What changes were proposed in this pull request?

During my Ubuntu-port testing, I found some extra whitespace that, for some reason, wasn't getting caught by the CentOS lint-r build step.
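For context, here's a minimal sketch of the kind of check that should flag this (not Spark's exact `dev/lint-r` invocation, and the file path is only illustrative): lintr's default linter set includes a trailing-whitespace rule.

```r
# Minimal sketch, assuming lintr is installed; the path below is illustrative
# and not necessarily the exact file touched by this PR.
library(lintr)

# The default linters include a trailing-whitespace check, so any stray
# whitespace in the test file should show up in the lint output here.
lint("R/pkg/tests/fulltests/test_eager_execution.R")
```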

## How was this patch tested?

The build system will test this! I used one of my Ubuntu testing builds and scp'ed the modified file over.
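For anyone who wants to reproduce the check locally, a minimal sketch using testthat directly (Spark's own `run-all.R` harness does more setup than this; the path and master URL below are illustrative):

```r
# Minimal sketch, not Spark's actual R test harness. Assumes SparkR and
# testthat are installed and a local Spark build is available.
library(testthat)
library(SparkR)

# The fulltests reference sparkRTestMaster, so define it before running.
sparkRTestMaster <- "local[2]"   # illustrative master URL

# Path is illustrative, not necessarily the exact file touched by this PR.
test_file("R/pkg/tests/fulltests/test_eager_execution.R")
```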

Before my fix:
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/22/console

After my fix:
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/23/console

Closes #22896 from shaneknapp/remove-extra-whitespace.

Authored-by: shane knapp <incomplete@gmail.com>
Signed-off-by: hyukjinkwon <gurwls223@apache.org>

@@ -22,12 +22,12 @@ context("test show SparkDataFrame when eager execution is enabled.")
test_that("eager execution is not enabled", {
  # Start Spark session without eager execution enabled
  sparkR.session(master = sparkRTestMaster, enableHiveSupport = FALSE)
  df <- createDataFrame(faithful)
  expect_is(df, "SparkDataFrame")
  expected <- "eruptions:double, waiting:double"
  expect_output(show(df), expected)
  # Stop Spark session
  sparkR.session.stop()
})
@@ -35,9 +35,9 @@ test_that("eager execution is not enabled", {
test_that("eager execution is enabled", {
  # Start Spark session with eager execution enabled
  sparkConfig <- list(spark.sql.repl.eagerEval.enabled = "true")
  sparkR.session(master = sparkRTestMaster, enableHiveSupport = FALSE, sparkConfig = sparkConfig)
  df <- createDataFrame(faithful)
  expect_is(df, "SparkDataFrame")
  expected <- paste0("(+---------+-------+\n",
@@ -45,7 +45,7 @@ test_that("eager execution is enabled", {
                     "+---------+-------+\n)*",
                     "(only showing top 20 rows)")
  expect_output(show(df), expected)
  # Stop Spark session
  sparkR.session.stop()
})
@@ -55,9 +55,9 @@ test_that("eager execution is enabled with maxNumRows and truncate set", {
  sparkConfig <- list(spark.sql.repl.eagerEval.enabled = "true",
                      spark.sql.repl.eagerEval.maxNumRows = as.integer(5),
                      spark.sql.repl.eagerEval.truncate = as.integer(2))
  sparkR.session(master = sparkRTestMaster, enableHiveSupport = FALSE, sparkConfig = sparkConfig)
  df <- arrange(createDataFrame(faithful), "waiting")
  expect_is(df, "SparkDataFrame")
  expected <- paste0("(+---------+-------+\n",
@@ -66,7 +66,7 @@ test_that("eager execution is enabled with maxNumRows and truncate set", {
                     "| 1.| 43|\n)*",
                     "(only showing top 5 rows)")
  expect_output(show(df), expected)
  # Stop Spark session
  sparkR.session.stop()
})
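
For readers unfamiliar with the feature these tests cover, a hedged sketch of what eager evaluation changes (master URL is illustrative):

```r
# Minimal sketch: with spark.sql.repl.eagerEval.enabled set to "true",
# show() on a SparkDataFrame prints the table contents (top 20 rows by
# default) instead of just the column schema.
library(SparkR)

sparkR.session(master = "local[2]", enableHiveSupport = FALSE,
               sparkConfig = list(spark.sql.repl.eagerEval.enabled = "true"))

df <- createDataFrame(faithful)
show(df)   # prints a "+---------+-------+" style ASCII table

sparkR.session.stop()
```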