From 3fd3ee038b89821f51f30a4ecd4452b5b3bc6568 Mon Sep 17 00:00:00 2001
From: bomeng
Date: Sun, 12 Jun 2016 12:58:34 +0100
Subject: [PATCH] [SPARK-15781][DOCUMENTATION] remove deprecated environment variable doc

## What changes were proposed in this pull request?

Like `SPARK_JAVA_OPTS` and `SPARK_CLASSPATH`, we will remove the documentation for `SPARK_WORKER_INSTANCES` to discourage users from using it. If it is actually set, SparkConf will show a warning message as before.

## How was this patch tested?

Manually tested.

Author: bomeng

Closes #13533 from bomeng/SPARK-15781.
---
 docs/spark-standalone.md | 9 ---------
 1 file changed, 9 deletions(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index fd94c34d16..40c72931cb 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -133,15 +133,6 @@ You can optionally configure the cluster further by setting environment variable
     <td><code>SPARK_WORKER_WEBUI_PORT</code></td>
     <td>Port for the worker web UI (default: 8081).</td>
   </tr>
-  <tr>
-    <td><code>SPARK_WORKER_INSTANCES</code></td>
-    <td>
-      Number of worker instances to run on each machine (default: 1). You can make this more than 1 if
-      you have have very large machines and would like multiple Spark worker processes. If you do set
-      this, make sure to also set <code>SPARK_WORKER_CORES</code> explicitly to limit the cores per worker,
-      or else each worker will try to use all the cores.
-    </td>
-  </tr>
   <tr>
     <td><code>SPARK_WORKER_DIR</code></td>
     <td>Directory to run applications in, which will include both logs and scratch space (default: SPARK_HOME/work).</td>
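
For context, SparkConf's deprecation warning steers users toward Spark configuration properties rather than the removed environment variable. Below is a minimal sketch of that configuration-property route; the application name, the chosen properties, and the values are illustrative assumptions and not part of this patch, and the exact property to use depends on your cluster manager:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative values only; pick sizes that match your machines and cluster manager.
val conf = new SparkConf()
  .setAppName("executor-config-example")   // hypothetical application name
  .set("spark.executor.instances", "2")    // number of executors, instead of SPARK_WORKER_INSTANCES
  .set("spark.executor.cores", "4")        // cores per executor, analogous to SPARK_WORKER_CORES

val sc = new SparkContext(conf)
```

The same properties can also be supplied at submit time, e.g. `spark-submit --conf spark.executor.instances=2 --conf spark.executor.cores=4 ...`.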