[SPARK-24573][INFRA] Runs SBT checkstyle after the build to work around a side-effect

## What changes were proposed in this pull request?

Checkstyle seems to affect the build in the Jenkins PR builder. I can't reproduce the failure locally; it appears to be reproducible only in the PR builder.

I traced the code paths involved, and my speculation is that Checkstyle's compilation in SBT has a side effect on the assembly build.

This PR proposes to run the SBT Checkstyle check after the build.
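The workaround boils down to deferring the style check: decide early whether it should run, build the assembly first, and only then run Checkstyle. A minimal sketch of that pattern, with stand-in bodies for the real `dev/run-tests.py` helpers (the prints here are illustrative only, not the actual implementation):

```python
# Sketch of the deferred style-check pattern proposed in this PR.
# The real helpers shell out to SBT; these bodies are stand-ins.

def run_java_style_checks():
    print("running Checkstyle via SBT")

def build_spark_assembly_sbt(hadoop_version, checkstyle=False):
    # Build the assembly first ...
    print("building assembly for Hadoop %s" % hadoop_version)
    # ... and only then run Checkstyle, so its SBT compilation
    # cannot interfere with the assembly build.
    if checkstyle:
        run_java_style_checks()

# In main(): record the decision up front, act on it after the build.
should_run_java_style_checks = True  # e.g. a .java file changed
build_spark_assembly_sbt("2.7", should_run_java_style_checks)
```

The key design point is that the check is no longer invoked where the changed files are detected; only a boolean is carried forward to the post-build step.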

## How was this patch tested?

Jenkins tests.

Author: hyukjinkwon <gurwls223@apache.org>

Closes #21579 from HyukjinKwon/investigate-javastyle.
hyukjinkwon 2018-06-18 15:32:34 +08:00
parent c7c0b086a0
commit b0a9352559

```diff
@@ -357,7 +357,7 @@ def build_spark_unidoc_sbt(hadoop_version):
     exec_sbt(profiles_and_goals)
 
 
-def build_spark_assembly_sbt(hadoop_version):
+def build_spark_assembly_sbt(hadoop_version, checkstyle=False):
     # Enable all of the profiles for the build:
     build_profiles = get_hadoop_profiles(hadoop_version) + modules.root.build_profile_flags
     sbt_goals = ["assembly/package"]
@@ -366,6 +366,9 @@ def build_spark_assembly_sbt(hadoop_version):
                  " ".join(profiles_and_goals))
     exec_sbt(profiles_and_goals)
 
+    if checkstyle:
+        run_java_style_checks()
+
     # Note that we skip Unidoc build only if Hadoop 2.6 is explicitly set in this SBT build.
     # Due to a different dependency resolution in SBT & Unidoc by an unknown reason, the
     # documentation build fails on a specific machine & environment in Jenkins but it was unable
@@ -570,11 +573,13 @@ def main():
             or f.endswith("scalastyle-config.xml")
             for f in changed_files):
         run_scala_style_checks()
+    should_run_java_style_checks = False
     if not changed_files or any(f.endswith(".java")
                                 or f.endswith("checkstyle.xml")
                                 or f.endswith("checkstyle-suppressions.xml")
                                 for f in changed_files):
-        run_java_style_checks()
+        # Run SBT Checkstyle after the build to prevent a side-effect to the build.
+        should_run_java_style_checks = True
     if not changed_files or any(f.endswith("lint-python")
                                 or f.endswith("tox.ini")
                                 or f.endswith(".py")
@@ -603,7 +608,7 @@ def main():
         detect_binary_inop_with_mima(hadoop_version)
         # Since we did not build assembly/package before running dev/mima, we need to
         # do it here because the tests still rely on it; see SPARK-13294 for details.
-        build_spark_assembly_sbt(hadoop_version)
+        build_spark_assembly_sbt(hadoop_version, should_run_java_style_checks)
 
     # run the test suites
     run_scala_tests(build_tool, hadoop_version, test_modules, excluded_tags)
```