0feb3cbe77
### What changes were proposed in this pull request?

This PR aims to use GitHub URLs instead of GitBox URLs in the release scripts.

### Why are the changes needed?

Currently, Spark Packaging jobs are broken due to a GitBox issue.

```
fatal: unable to access 'https://gitbox.apache.org/repos/asf/spark.git/': Failed to connect to gitbox.apache.org port 443: Connection timed out
```

- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/job/spark-master-maven-snapshots/2906/console
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/job/spark-branch-3.0-maven-snapshots/105/console
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/job/spark-branch-2.4-maven-snapshots/439/console

### Does this PR introduce _any_ user-facing change?

No. (This is a dev-only script.)

### How was this patch tested?

Manual.

```
$ cat ./test.sh
. dev/create-release/release-util.sh
get_release_info
git clone "$ASF_REPO"

$ sh test.sh
Branch [branch-3.0]:
Current branch version is 3.0.1-SNAPSHOT.
Release [3.0.0]:
RC # [2]:
Full name [Dongjoon Hyun]:
GPG key [dongjoon@apache.org]:
================
Release details:
BRANCH:     branch-3.0
VERSION:    3.0.0
TAG:        v3.0.0-rc2
NEXT:       3.0.1-SNAPSHOT
ASF USER:   dongjoon
GPG KEY:    dongjoon@apache.org
FULL NAME:  Dongjoon Hyun
E-MAIL:     dongjoon@apache.org
================
Is this info correct [y/n]? y
ASF password:
GPG passphrase:
Cloning into 'spark'...
remote: Enumerating objects: 223, done.
remote: Counting objects: 100% (223/223), done.
remote: Compressing objects: 100% (117/117), done.
remote: Total 708324 (delta 70), reused 138 (delta 32), pack-reused 708101
Receiving objects: 100% (708324/708324), 322.08 MiB | 2.94 MiB/s, done.
Resolving deltas: 100% (289268/289268), done.
Updating files: 100% (16287/16287), done.

$ sh test.sh
...
```

Closes #28513 from dongjoon-hyun/SPARK-31687.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
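To make the nature of the fix concrete, here is a minimal sketch of the kind of change this PR describes. The variable name `ASF_REPO` is taken from the test script above (which sources `dev/create-release/release-util.sh`); the exact GitHub mirror URL used by the final patch is an assumption, not a quote of the diff.

```shell
#!/bin/sh
# Before: the release scripts cloned through GitBox, which was timing out:
#   ASF_REPO="https://gitbox.apache.org/repos/asf/spark.git"

# After: point the same variable at the GitHub mirror of the repository,
# so downstream tooling (e.g. `git clone "$ASF_REPO"`) works unchanged:
ASF_REPO="https://github.com/apache/spark"

echo "Release scripts will clone from: $ASF_REPO"
```

Because every consumer goes through the `ASF_REPO` variable rather than a hard-coded URL, swapping the remote is a one-line change and the rest of the release tooling is unaffected.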
### Spark Developer Scripts
This directory contains scripts useful to developers when packaging, testing, or committing to Spark.
Many of these scripts require Apache credentials to work correctly.