spark-instrumented-optimizer/external/docker/spark-test
William Hyun 7dc1d8917d [SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff
### What changes were proposed in this pull request?
This PR aims to update `docker/spark-test` and clean up unused stuff.

### Why are the changes needed?
Since Spark 3.0.0, Java 11 is supported. We should use the latest Java and OS in these test images.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?
Manually do the following as described in https://github.com/apache/spark/blob/master/external/docker/spark-test/README.md .

```
docker run -v $SPARK_HOME:/opt/spark spark-test-master
docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://<master_ip>:7077
```

Closes #29150 from williamhyun/docker.

Authored-by: William Hyun <williamhyun3@gmail.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
2020-07-17 12:05:45 -07:00
| Name | Last commit | Date |
|---|---|---|
| base | [SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff | 2020-07-17 12:05:45 -07:00 |
| master | [SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff | 2020-07-17 12:05:45 -07:00 |
| worker | [SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff | 2020-07-17 12:05:45 -07:00 |
| build | [SPARK-13595][BUILD] Move docker, extras modules into external | 2016-03-09 18:27:44 +00:00 |
| README.md | [SPARK-13595][BUILD] Move docker, extras modules into external | 2016-03-09 18:27:44 +00:00 |

Spark Docker files for testing and development purposes.

These images are intended to be run like so:

```
docker run -v $SPARK_HOME:/opt/spark spark-test-master
docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://<master_ip>:7077
```

With this configuration, each container mounts your actual SPARK_HOME at its Spark directory, so you can modify and recompile your Spark source and have the result immediately usable inside the Docker images, without rebuilding them.
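The master/worker workflow above can be sketched end to end as follows. This is a hedged example: the image names and the `-v $SPARK_HOME:/opt/spark` mount come from this README, but running the containers detached and discovering the master's IP with `docker inspect` are assumptions for illustration, not steps the README documents.

```shell
# Sketch: assumes the spark-test-master / spark-test-worker images have
# already been built (e.g. via the build script in this directory) and
# that SPARK_HOME points at a compiled local Spark checkout.
export SPARK_HOME=/path/to/spark   # hypothetical local path

# Start the master detached; the container sees your SPARK_HOME at /opt/spark.
MASTER_ID=$(docker run -d -v "$SPARK_HOME":/opt/spark spark-test-master)

# Look up the master container's IP (assumes the default bridge network).
MASTER_IP=$(docker inspect -f '{{.NetworkSettings.IPAddress}}' "$MASTER_ID")

# Start a worker pointing at the master's standalone-mode port.
docker run -d -v "$SPARK_HOME":/opt/spark spark-test-worker "spark://$MASTER_IP:7077"
```

Because the Spark directory is a bind mount rather than baked into the image, recompiling under `$SPARK_HOME` on the host is enough; restarting the containers picks up the new build with no image rebuild.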