[SPARK-36198][TESTS] Skip UNIDOC generation in PySpark GHA job
### What changes were proposed in this pull request?
This PR aims to skip UNIDOC generation in the PySpark GHA job.
### Why are the changes needed?
PySpark GHA jobs do not need to generate the Java/Scala docs; skipping this step saves about 13 minutes in total.
- https://github.com/apache/spark/runs/3098268973?check_suite_focus=true
```
...
========================================================================
Building Unidoc API Documentation
========================================================================
[info] Building Spark unidoc using SBT with these arguments: -Phadoop-3.2 -Phive-2.3 -Pscala-2.12 -Phive-thriftserver -Pmesos -Pdocker-integration-tests -Phive -Pkinesis-asl -Pspark-ganglia-lgpl -Pkubernetes -Phadoop-cloud -Pyarn unidoc
...
[info] Main Java API documentation successful.
[success] Total time: 192 s (03:12), completed Jul 18, 2021 6:08:40 PM
```
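The skip is wired through a plain environment variable rather than a new build flag: the workflow exports `SKIP_UNIDOC` for the PySpark jobs, and the test driver checks it before invoking the unidoc build (see the diff below). A minimal, self-contained sketch of that gating pattern follows; the stubbed build function and the profile list are illustrative stand-ins, not the actual implementation.

```python
import os

def build_spark_unidoc_sbt(extra_profiles):
    # Stand-in for the real SBT unidoc invocation in dev/run-tests.py.
    print("Building Unidoc with profiles:", " ".join(extra_profiles))

# Mirrors the gate added by this PR: unidoc only runs when neither
# AMPLAB_JENKINS nor SKIP_UNIDOC is set in the environment.
if not os.environ.get("AMPLAB_JENKINS") and not os.environ.get("SKIP_UNIDOC"):
    build_spark_unidoc_sbt(["-Phive", "-Pkubernetes"])
else:
    print("Skipping Unidoc build.")
```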
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the GHA.
Closes #33407 from williamhyun/SKIP_UNIDOC.
Authored-by: William Hyun <william@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit c336f73ccd)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
.github/workflows/build_and_test.yml
```diff
@@ -170,6 +170,7 @@ jobs:
       HIVE_PROFILE: hive2.3
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
+      SKIP_UNIDOC: true
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v2
```
dev/run-tests.py
```diff
@@ -397,7 +397,7 @@ def build_spark_assembly_sbt(extra_profiles, checkstyle=False):
     if checkstyle:
         run_java_style_checks(build_profiles)
 
-    if not os.environ.get("AMPLAB_JENKINS"):
+    if not os.environ.get("AMPLAB_JENKINS") and not os.environ.get("SKIP_UNIDOC"):
         build_spark_unidoc_sbt(extra_profiles)
 
 
```
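One detail worth noting: `os.environ.get` returns the raw string, so any non-empty value of `SKIP_UNIDOC` (even `false`) would disable the unidoc step; `true` in the workflow is just the conventional choice. A quick illustration of that truthiness behavior:

```python
# Any non-empty string is truthy in Python, so even SKIP_UNIDOC=false would
# skip the step; only an unset or empty variable lets the unidoc build run.
for value in ("true", "false", "", None):
    env = {} if value is None else {"SKIP_UNIDOC": value}
    action = "skip unidoc" if env.get("SKIP_UNIDOC") else "run unidoc"
    print(f"SKIP_UNIDOC={value!r}: {action}")
```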