7dc1d8917d
### What changes were proposed in this pull request?

This PR aims to update the docker/spark-test images and clean up unused files.

### Why are the changes needed?

Since Spark 3.0.0, Java 11 is supported. We should use the latest Java and OS versions.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manually, by following the steps described in https://github.com/apache/spark/blob/master/external/docker/spark-test/README.md:

```
docker run -v $SPARK_HOME:/opt/spark spark-test-master
docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://<master_ip>:7077
```

Closes #29150 from williamhyun/docker.

Authored-by: William Hyun <williamhyun3@gmail.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
This directory contains:

- base
- master
- worker
- build
- README.md
Spark Docker files usable for testing and development purposes.

These images are intended to be run like so:

```
docker run -v $SPARK_HOME:/opt/spark spark-test-master
docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://<master_ip>:7077
```

Using this configuration, the containers will have their Spark directories
mounted to your actual SPARK_HOME, allowing you to modify and recompile your
Spark source and have the changes immediately usable in the Docker images
(without rebuilding them).
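As a sketch of how the worker invocation above is composed, the snippet below builds the same `docker run` command from environment variables. The default values (`/path/to/spark`, `192.168.1.10`) are illustrative placeholders, not project defaults; substitute your own Spark checkout path and master address.

```shell
# Illustrative placeholders -- replace with your own values.
SPARK_HOME=${SPARK_HOME:-/path/to/spark}   # local Spark build to mount
MASTER_IP=${MASTER_IP:-192.168.1.10}       # host address of the master container

# -v bind-mounts the host's Spark directory into the container at /opt/spark,
# so a local recompile is visible inside the container without an image rebuild.
WORKER_CMD="docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://$MASTER_IP:7077"
echo "$WORKER_CMD"
```

Because the container reads Spark from the bind mount rather than from a baked-in copy, iterating on Spark source only requires restarting the container, not rebuilding the image.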