From d5cec45c0b0feaf2dd6014cf82bf0d7d25f5ac87 Mon Sep 17 00:00:00 2001
From: William Hyun
Date: Sun, 18 Jul 2021 17:52:28 -0700
Subject: [PATCH] [SPARK-36198][TESTS] Skip UNIDOC generation in PySpark GHA job

### What changes were proposed in this pull request?

This PR aims to skip UNIDOC generation in the PySpark GHA job.

### Why are the changes needed?

PySpark GHA jobs do not need to generate the Java/Scala docs, so skipping this step saves about 13 minutes in total.

- https://github.com/apache/spark/runs/3098268973?check_suite_focus=true

```
...
========================================================================
Building Unidoc API Documentation
========================================================================
[info] Building Spark unidoc using SBT with these arguments: -Phadoop-3.2 -Phive-2.3 -Pscala-2.12 -Phive-thriftserver -Pmesos -Pdocker-integration-tests -Phive -Pkinesis-asl -Pspark-ganglia-lgpl -Pkubernetes -Phadoop-cloud -Pyarn unidoc
...
[info] Main Java API documentation successful.
[success] Total time: 192 s (03:12), completed Jul 18, 2021 6:08:40 PM
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Pass the GHA.

Closes #33407 from williamhyun/SKIP_UNIDOC.

Authored-by: William Hyun
Signed-off-by: Dongjoon Hyun
(cherry picked from commit c336f73ccddc1d163caa0a619919f3bbc9bf34ab)
Signed-off-by: Dongjoon Hyun
---
 .github/workflows/build_and_test.yml | 1 +
 dev/run-tests.py                     | 2 +-
 2 files changed, 2 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 62f37d380e..66a0edaa6e 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -170,6 +170,7 @@ jobs:
       HIVE_PROFILE: hive2.3
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
+      SKIP_UNIDOC: true
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v2
diff --git a/dev/run-tests.py b/dev/run-tests.py
index 3055dcca6f..97523e7483 100755
--- a/dev/run-tests.py
+++ b/dev/run-tests.py
@@ -397,7 +397,7 @@ def build_spark_assembly_sbt(extra_profiles, checkstyle=False):
     if checkstyle:
         run_java_style_checks(build_profiles)
 
-    if not os.environ.get("AMPLAB_JENKINS"):
+    if not os.environ.get("AMPLAB_JENKINS") and not os.environ.get("SKIP_UNIDOC"):
        build_spark_unidoc_sbt(extra_profiles)
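
For readers following along, below is a minimal standalone sketch of the environment-variable gate this patch adds to `dev/run-tests.py`. It is not the actual Spark build script: `maybe_build_unidoc` and its `build_unidoc` callable are hypothetical stand-ins for `build_spark_assembly_sbt` / `build_spark_unidoc_sbt`, used only to show how setting `SKIP_UNIDOC` (as the PySpark GHA job now does) short-circuits the unidoc step.

```python
import os


def maybe_build_unidoc(build_unidoc):
    # Mirrors the gate from the diff above: unidoc is built only when neither
    # AMPLAB_JENKINS nor SKIP_UNIDOC is set to a non-empty value.
    if not os.environ.get("AMPLAB_JENKINS") and not os.environ.get("SKIP_UNIDOC"):
        build_unidoc()
    else:
        print("Skipping Unidoc generation.")


if __name__ == "__main__":
    # The PySpark GHA job exports SKIP_UNIDOC=true, so the builder is not called.
    os.environ.setdefault("SKIP_UNIDOC", "true")
    maybe_build_unidoc(lambda: print("building unidoc..."))
```

Note that, as in the real script, any non-empty value of `SKIP_UNIDOC` (including `false`) disables the unidoc build, since `os.environ.get` only checks for presence of a non-empty string.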