[SPARK-36285][INFRA][TESTS] Skip MiMa in PySpark/SparkR/Docker GHA job
### What changes were proposed in this pull request?
This PR aims to skip MiMa in the PySpark/SparkR/Docker GHA jobs.

### Why are the changes needed?
This will save GHA resources because MiMa is irrelevant to Python.

### Does this PR introduce any user-facing change?
No.

### How was this patch tested?
Pass the GHA.
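The skip is driven by an environment variable: when the workflow sets `SKIP_MIMA: true`, the test runner sees a non-empty string and bypasses the binary-compatibility check. A minimal sketch of that gating pattern (the helper name `should_run_mima` is hypothetical, used here only to illustrate the check added in the diff below):

```python
import os

def should_run_mima() -> bool:
    """MiMa runs only when SKIP_MIMA is unset or empty.

    GHA's `SKIP_MIMA: true` arrives as the string "true", which is
    truthy, so the check is skipped regardless of the exact value.
    """
    return not os.environ.get("SKIP_MIMA")
```

Note that any non-empty value (`true`, `1`, even `false`) disables the check, since only the presence of the variable is tested.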
Closes #33532 from williamhyun/mima.
Lead-authored-by: William Hyun <william@apache.org>
Co-authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
(cherry picked from commit 674202e7b6)
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
parent 8a3b1cd811
commit dfa5c4dadc
.github/workflows/build_and_test.yml
@@ -169,6 +169,7 @@ jobs:
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
       SKIP_UNIDOC: true
+      SKIP_MIMA: true
       METASPACE_SIZE: 128m
     steps:
     - name: Checkout Spark repository
@@ -251,6 +252,7 @@ jobs:
       HIVE_PROFILE: hive2.3
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
+      SKIP_MIMA: true
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v2
@@ -622,6 +624,7 @@ jobs:
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
       ORACLE_DOCKER_IMAGE_NAME: oracle/database:18.4.0-xe
+      SKIP_MIMA: true
     steps:
     - name: Checkout Spark repository
       uses: actions/checkout@v2

dev/run-tests.py

@@ -804,7 +804,8 @@ def main():
     # backwards compatibility checks
     if build_tool == "sbt":
         # Note: compatibility tests only supported in sbt for now
-        detect_binary_inop_with_mima(extra_profiles)
+        if not os.environ.get("SKIP_MIMA"):
+            detect_binary_inop_with_mima(extra_profiles)
         # Since we did not build assembly/package before running dev/mima, we need to
         # do it here because the tests still rely on it; see SPARK-13294 for details.
         build_spark_assembly_sbt(extra_profiles, should_run_java_style_checks)