spark-instrumented-optimizer/core/benchmarks/ZStandardBenchmark-results.txt

================================================================================================
Benchmark ZStandardCompressionCodec
================================================================================================
[SPARK-35670][BUILD] Upgrade ZSTD-JNI to 1.5.0-2

### What changes were proposed in this pull request?
This PR aims to upgrade `zstd-jni` to 1.5.0-2, which uses `zstd` version 1.5.0.

### Why are the changes needed?
Major improvements to Zstd support are targeted for the upcoming 3.2.0 release of Spark. Zstd 1.5.0 introduces significant compression (+25% to 140%) and decompression (~15%) speed improvements in benchmarks described in more detail on the releases page:
- https://github.com/facebook/zstd/releases/tag/v1.5.0

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
The build passes its tests, but the benchmark tests seem flaky. I am unsure if this change is responsible. The error is:
```
Running org.apache.spark.rdd.CoalescedRDDBenchmark:
21/06/08 18:53:10 ERROR SparkContext: Failed to add file:/home/runner/work/spark/spark/./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar to Spark environment
java.lang.IllegalArgumentException: requirement failed: File spark-core_2.12-3.2.0-SNAPSHOT-tests.jar was already registered with a different path (old path = /home/runner/work/spark/spark/core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar, new path = /home/runner/work/spark/spark/./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar
```
https://github.com/dchristle/spark/runs/2776123749?check_suite_focus=true

cc: dongjoon-hyun

Closes #32826 from dchristle/ZSTD150.

Lead-authored-by: David Christle <dchristle@squareup.com>
Co-authored-by: David Christle <dchristle@users.noreply.github.com>
Co-authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
2021-06-17 14:06:50 -04:00

OpenJDK 64-Bit Server VM 1.8.0_292-b10 on Linux 5.8.0-1033-azure
Intel(R) Xeon(R) Platinum 8171M CPU @ 2.60GHz
Benchmark ZStandardCompressionCodec:                      Best Time(ms)   Avg Time(ms)   Stdev(ms)    Rate(M/s)   Per Row(ns)   Relative
----------------------------------------------------------------------------------------------------------------------------------------
Compression 10000 times at level 1 without buffer pool              444            606         183          0.0       44440.9       1.0X
Compression 10000 times at level 2 without buffer pool              514            527          10          0.0       51421.8       0.9X
Compression 10000 times at level 3 without buffer pool              725            729           6          0.0       72531.4       0.6X
Compression 10000 times at level 1 with buffer pool                 229            235           6          0.0       22886.7       1.9X
Compression 10000 times at level 2 with buffer pool                 288            303          15          0.0       28802.3       1.5X
Compression 10000 times at level 3 with buffer pool                 493            521          26          0.0       49339.5       0.9X
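The "with buffer pool" rows are up to ~2x faster at low compression levels because buffers are reused across calls instead of being allocated fresh each time (Spark delegates this to zstd-jni's `RecyclingBufferPool`). A minimal sketch of the recycling idea only — the class and method names below are hypothetical, not zstd-jni's implementation:

```python
from collections import deque


class RecyclingBufferPool:
    """Illustrative buffer pool: hand out reusable bytearrays instead of
    allocating a new one for every compression call."""

    def __init__(self, buffer_size: int):
        self.buffer_size = buffer_size
        self._free = deque()  # buffers available for reuse

    def get(self) -> bytearray:
        # Reuse a released buffer when one is available; allocate otherwise.
        if self._free:
            return self._free.popleft()
        return bytearray(self.buffer_size)

    def release(self, buf: bytearray) -> None:
        # Return the buffer for later reuse instead of discarding it.
        self._free.append(buf)


pool = RecyclingBufferPool(buffer_size=32 * 1024)
a = pool.get()      # first call allocates
pool.release(a)
b = pool.get()      # second call reuses the same underlying buffer
assert a is b
```

The benchmark's per-call savings come from exactly this: skipping the allocation (and eventual GC) of a fresh native buffer on every compress/decompress call.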
OpenJDK 64-Bit Server VM 1.8.0_292-b10 on Linux 5.8.0-1033-azure
Intel(R) Xeon(R) Platinum 8171M CPU @ 2.60GHz
Benchmark ZStandardCompressionCodec:                          Best Time(ms)   Avg Time(ms)   Stdev(ms)    Rate(M/s)   Per Row(ns)   Relative
--------------------------------------------------------------------------------------------------------------------------------------------
Decompression 10000 times from level 1 without buffer pool             1188           1192           6          0.0      118770.4       1.0X
Decompression 10000 times from level 2 without buffer pool             1176           1199          33          0.0      117574.4       1.0X
Decompression 10000 times from level 3 without buffer pool             1174           1175           1          0.0      117426.0       1.0X
Decompression 10000 times from level 1 with buffer pool                1020           1046          36          0.0      102021.9       1.2X
Decompression 10000 times from level 2 with buffer pool                 996           1005          14          0.0       99561.0       1.2X
Decompression 10000 times from level 3 with buffer pool                1021           1022           1          0.0      102050.9       1.2X
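The derived columns relate to the measured times in a simple way: `Per Row(ns)` is the (unrounded) best time divided by the 10000 iterations, and `Relative` appears to be the first row's per-row cost divided by each row's per-row cost, so the baseline row reads 1.0X. A quick check against the decompression table above, using its printed `Per Row(ns)` values:

```python
# Per Row(ns) values copied from the decompression table above.
per_row_ns = {
    "level 1, no pool": 118770.4,
    "level 2, no pool": 117574.4,
    "level 3, no pool": 117426.0,
    "level 1, pool": 102021.9,
    "level 2, pool": 99561.0,
    "level 3, pool": 102050.9,
}

# The first row is the 1.0X baseline.
baseline = per_row_ns["level 1, no pool"]

# Relative = baseline per-row time / this row's per-row time,
# rounded to one decimal as in the "1.2X" column.
relative = {name: round(baseline / ns, 1) for name, ns in per_row_ns.items()}

print(relative)
# The buffer-pool rows come out around 1.2, matching the table's 1.2X column.
```

This is why a slower row such as level 3 compression can show 0.6X: it is benchmarked relative to the level 1 no-pool row, not to an absolute reference.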