6c3d0a4405
### What changes were proposed in this pull request?

1. URL-encode the `ASF_PASSWORD` of the release manager.
2. Update the image to install the `qpdf` and `jq` dependencies.
3. Increase the JVM heap memory for the Maven build.

### Why are the changes needed?

The release script takes hours to run, and if a single failure happens somewhere midway, one either has to finish the remaining steps manually or re-run the entire script. (This is my understanding.) So I have fixed a few of the failures discovered so far.

1. If the release manager's password contains a character that is not allowed in a URL, the build fails at the clone-Spark step:

   `git clone "https://$ASF_USERNAME:$ASF_PASSWORD$ASF_SPARK_REPO" -b $GIT_BRANCH`

   ^^^ Fails with a bad URL. `ASF_USERNAME` may not need URL encoding, but we do need to encode `ASF_PASSWORD`.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

By running the release for branch-2.4, using both types of passwords, i.e. passwords with and without special characters.

Closes #29373 from ScrapCodes/release-script-fix2.

Lead-authored-by: Prashant Sharma <prashant@apache.org>
Co-authored-by: Prashant Sharma <prashsh1@in.ibm.com>
Signed-off-by: Prashant Sharma <prashant@apache.org>
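The fix in point 1 above can be sketched as follows. The actual release script is bash; this is an equivalent Python sketch of the same idea, percent-encoding the password before embedding it in the clone URL. The variable values and the `ASF_SPARK_REPO` host shown here are hypothetical placeholders, not the real credentials or repository address.

```python
# Sketch of the URL-encoding fix, assuming illustrative credential values.
from urllib.parse import quote

ASF_USERNAME = "example-user"            # hypothetical value
ASF_PASSWORD = "p@ss:word/with#chars"    # hypothetical value
ASF_SPARK_REPO = "gitbox.example.org/repos/spark.git"  # hypothetical value

# quote() with safe="" percent-encodes every reserved character,
# including '@', ':', '/', and '#', which would otherwise break
# the "https://user:password@host" URL syntax.
encoded_password = quote(ASF_PASSWORD, safe="")
clone_url = f"https://{ASF_USERNAME}:{encoded_password}@{ASF_SPARK_REPO}"

print(encoded_password)  # → p%40ss%3Aword%2Fwith%23chars
```

With the raw password, the embedded `@` would be parsed as the end of the userinfo section and the rest of the password as the host, which is why the unencoded clone fails with a bad URL.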
Contents of this directory:

- create-release
- deps
- sparktestsupport
- tests
- .gitignore
- .rat-excludes
- .scalafmt.conf
- appveyor-guide.md
- appveyor-install-dependencies.ps1
- change-scala-version.sh
- check-license
- checkstyle-suppressions.xml
- checkstyle.xml
- github_jira_sync.py
- lint-java
- lint-python
- lint-r
- lint-r.R
- lint-scala
- make-distribution.sh
- merge_spark_pr.py
- mima
- pip-sanity-check.py
- README.md
- requirements.txt
- run-pip-tests
- run-tests
- run-tests-jenkins
- run-tests-jenkins.py
- run-tests.py
- sbt-checkstyle
- scalafmt
- scalastyle
- test-dependencies.sh
- tox.ini
Spark Developer Scripts

This directory contains scripts useful to developers when packaging, testing, or committing to Spark. Many of these scripts require Apache credentials to work correctly.