[MINOR][TEST] Remove outdated Hive version in run-tests.py

## What changes were proposed in this pull request?

```
========================================================================
Building Spark
========================================================================
[info] Building Spark (w/Hive 1.2.1) using SBT with these arguments:  -Phadoop-3.2 -Pkubernetes -Phive-thriftserver -Pkinesis-asl -Pyarn -Pspark-ganglia-lgpl -Phive -Pmesos test:package streaming-kinesis-asl-assembly/assembly
```

`(w/Hive 1.2.1)` is incorrect when testing with hadoop-3.2; it should be `(w/Hive 2.3.4)`.
This PR removes the hard-coded `(w/Hive 1.2.1)` from the build messages in run-tests.py.
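
For context, the stale label could also have been avoided by deriving the Hive version from the Hadoop profile instead of hard-coding it. Below is a minimal sketch of that idea; the `hive_version_for_profile` and `build_message` helpers, and the `"hadoop3.2"` profile key, are assumptions for illustration only and are not part of the actual patch, which simply drops the label.

```python
# Hypothetical sketch (not part of this PR): derive the Hive label from the
# Hadoop profile instead of hard-coding "(w/Hive 1.2.1)". The actual change
# just removes the label from the messages.

def hive_version_for_profile(hadoop_version):
    # Assumed mapping for this era of the build: the hadoop-3.2 profile pulls
    # in Hive 2.3.4, while the default profile still uses the forked Hive 1.2.1.
    return "2.3.4" if hadoop_version == "hadoop3.2" else "1.2.1"

def build_message(tool, hadoop_version, profiles_and_goals):
    return ("[info] Building Spark (w/Hive %s) using %s with these arguments:  %s"
            % (hive_version_for_profile(hadoop_version), tool, " ".join(profiles_and_goals)))

print(build_message("SBT", "hadoop3.2", ["-Phadoop-3.2", "-Phive", "test:package"]))
# [info] Building Spark (w/Hive 2.3.4) using SBT with these arguments:  -Phadoop-3.2 -Phive test:package
```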

## How was this patch tested?

N/A

Closes #24451 from wangyum/run-tests-invalid-info.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
commit f82ed5e8e0
parent b1c6b60ce7
Date:   2019-04-24 21:22:15 -07:00


dev/run-tests.py

```diff
@@ -297,8 +297,7 @@ def build_spark_maven(hadoop_version):
     mvn_goals = ["clean", "package", "-DskipTests"]
     profiles_and_goals = build_profiles + mvn_goals
-    print("[info] Building Spark (w/Hive 1.2.1) using Maven with these arguments: ",
-          " ".join(profiles_and_goals))
+    print("[info] Building Spark using Maven with these arguments: ", " ".join(profiles_and_goals))
     exec_maven(profiles_and_goals)
@@ -310,8 +309,7 @@ def build_spark_sbt(hadoop_version):
                  "streaming-kinesis-asl-assembly/assembly"]
     profiles_and_goals = build_profiles + sbt_goals
-    print("[info] Building Spark (w/Hive 1.2.1) using SBT with these arguments: ",
-          " ".join(profiles_and_goals))
+    print("[info] Building Spark using SBT with these arguments: ", " ".join(profiles_and_goals))
     exec_sbt(profiles_and_goals)
@@ -323,7 +321,7 @@ def build_spark_unidoc_sbt(hadoop_version):
     sbt_goals = ["unidoc"]
     profiles_and_goals = build_profiles + sbt_goals
-    print("[info] Building Spark unidoc (w/Hive 1.2.1) using SBT with these arguments: ",
+    print("[info] Building Spark unidoc using SBT with these arguments: ",
           " ".join(profiles_and_goals))
     exec_sbt(profiles_and_goals)
@@ -334,7 +332,7 @@ def build_spark_assembly_sbt(hadoop_version, checkstyle=False):
     build_profiles = get_hadoop_profiles(hadoop_version) + modules.root.build_profile_flags
     sbt_goals = ["assembly/package"]
     profiles_and_goals = build_profiles + sbt_goals
-    print("[info] Building Spark assembly (w/Hive 1.2.1) using SBT with these arguments: ",
+    print("[info] Building Spark assembly using SBT with these arguments: ",
           " ".join(profiles_and_goals))
     exec_sbt(profiles_and_goals)
```
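
With this change applied, the SBT line in the build log quoted above would read along these lines (same arguments, just without the hard-coded Hive label):

```
[info] Building Spark using SBT with these arguments:  -Phadoop-3.2 -Pkubernetes -Phive-thriftserver -Pkinesis-asl -Pyarn -Pspark-ganglia-lgpl -Phive -Pmesos test:package streaming-kinesis-asl-assembly/assembly
```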