7fcb127674
### What changes were proposed in this pull request?

This PR aims to upgrade `zstd-jni` to 1.5.0-2, which uses `zstd` version 1.5.0.

### Why are the changes needed?

Major improvements to Zstd support are targeted for the upcoming 3.2.0 release of Spark. Zstd 1.5.0 introduces significant compression (+25% to 140%) and decompression (~15%) speed improvements in benchmarks, described in more detail on the releases page:
- https://github.com/facebook/zstd/releases/tag/v1.5.0

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

The build passes its tests, but the benchmark tests seem flaky, and I am unsure if this change is responsible. The error is:

```
Running org.apache.spark.rdd.CoalescedRDDBenchmark:
21/06/08 18:53:10 ERROR SparkContext: Failed to add file:/home/runner/work/spark/spark/./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar to Spark environment
java.lang.IllegalArgumentException: requirement failed: File spark-core_2.12-3.2.0-SNAPSHOT-tests.jar was already registered with a different path (old path = /home/runner/work/spark/spark/core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar, new path = /home/runner/work/spark/spark/./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar
```

https://github.com/dchristle/spark/runs/2776123749?check_suite_focus=true

cc: dongjoon-hyun

Closes #32826 from dchristle/ZSTD150.

Lead-authored-by: David Christle <dchristle@squareup.com>
Co-authored-by: David Christle <dchristle@users.noreply.github.com>
Co-authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
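For context, a build that includes the upgraded codec can be pointed at Zstd through Spark's standard compression settings. This is a hedged sketch of a `spark-defaults.conf` fragment: the property names are Spark's existing compression options, and the level and buffer-size values shown are illustrative defaults, not recommendations from this PR.

```properties
# Use Zstd for internal data such as shuffle and RDD blocks
spark.io.compression.codec            zstd
# Compression level (higher trades speed for ratio); 1 is the default
spark.io.compression.zstd.level       1
# Buffer size used by the Zstd streams
spark.io.compression.zstd.bufferSize  32k
```

Any speed gains from Zstd 1.5.0 would apply transparently to such configurations, since the codec version is determined by the bundled `zstd-jni` library.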
Spark Developer Scripts
This directory contains scripts useful to developers when packaging, testing, or committing to Spark.
Many of these scripts require Apache credentials to work correctly.