spark-instrumented-optimizer/bin
Gengliang Wang c34c42234f [SPARK-26076][BUILD][MINOR] Revise ambiguous error message from load-spark-env.sh
## What changes were proposed in this pull request?

When I try to run scripts (e.g. `start-master.sh`/`start-history-server.sh`) in the latest master, I get an error like this:
```
Presence of build for multiple Scala versions detected.
Either clean one of them or, export SPARK_SCALA_VERSION in spark-env.sh.
```

The error message is quite confusing. Without reading `load-spark-env.sh`, I didn't know which directory to remove, or where to find and edit `spark-env.sh`.

This PR makes the error message clearer. It also restructures the script so that it needs less maintenance when Scala versions are added or dropped in the future.
Now that https://github.com/apache/spark/pull/22967 is in, we can revise the error message as follows (output from my local setup):

```
Presence of build for multiple Scala versions detected (/Users/gengliangwang/IdeaProjects/spark/assembly/target/scala-2.12 and /Users/gengliangwang/IdeaProjects/spark/assembly/target/scala-2.11).
Remove one of them or, export SPARK_SCALA_VERSION=2.12 in /Users/gengliangwang/IdeaProjects/spark/conf/spark-env.sh.
Visit https://spark.apache.org/docs/latest/configuration.html#environment-variables for more details about setting environment variables in spark-env.sh.
```
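For illustration, here is a minimal sketch of how such a check could avoid hard-coding the list of Scala versions. This is an assumption-based sketch, not the actual `load-spark-env.sh` implementation: the `assembly/target/scala-<version>` layout and the wording of the messages follow the example output above, while the variable names (`scala_dirs`, `ENV_VAR_DOC`, etc.) are illustrative only.

```
#!/usr/bin/env bash
# Illustrative sketch only (not the real load-spark-env.sh).
# Assumes SPARK_HOME points at a Spark checkout or distribution.

if [ -z "${SPARK_SCALA_VERSION}" ]; then
  ASSEMBLY_DIR="${SPARK_HOME}/assembly/target"
  SPARK_ENV_SH="${SPARK_HOME}/conf/spark-env.sh"
  ENV_VAR_DOC="https://spark.apache.org/docs/latest/configuration.html#environment-variables"

  # Collect every scala-<version> build directory that actually exists.
  scala_dirs=()
  for dir in "${ASSEMBLY_DIR}"/scala-*; do
    [ -d "${dir}" ] && scala_dirs+=("${dir}")
  done

  if [ "${#scala_dirs[@]}" -gt 1 ]; then
    echo "Presence of build for multiple Scala versions detected (${scala_dirs[*]})." 1>&2
    echo "Remove all but one of them, or export SPARK_SCALA_VERSION in ${SPARK_ENV_SH}." 1>&2
    echo "Visit ${ENV_VAR_DOC} for more details about setting environment variables in spark-env.sh." 1>&2
    exit 1
  elif [ "${#scala_dirs[@]}" -eq 1 ]; then
    # Exactly one build present: derive the Scala version from the directory name.
    export SPARK_SCALA_VERSION="${scala_dirs[0]##*scala-}"
  fi
fi
```

Discovering the build directories with a glob rather than an explicit version list is what keeps maintenance low when Scala versions are added or dropped.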

## How was this patch tested?

Manual test
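For reference, one way to exercise the check locally, assuming the detection is based purely on the presence of more than one `scala-*` directory under `assembly/target` (the exact trigger depends on the real script):

```
cd "$SPARK_HOME"
mkdir -p assembly/target/scala-2.11 assembly/target/scala-2.12  # simulate builds for two Scala versions
./sbin/start-master.sh  # expected: the revised multi-Scala-version message is printed
```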

Closes #23049 from gengliangwang/reviseEnvScript.

Authored-by: Gengliang Wang <gengliang.wang@databricks.com>
Signed-off-by: Sean Owen <sean.owen@databricks.com>
2018-11-20 08:29:59 -06:00
beeline [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
beeline.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
docker-image-tool.sh [SPARK-25897][K8S] Hook up k8s integration tests to sbt build. 2018-11-07 13:19:31 -08:00
find-spark-home [MINOR] Fix a bunch of typos 2018-01-02 07:10:19 +09:00
find-spark-home.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
load-spark-env.cmd [SPARK-22466][SPARK SUBMIT] export SPARK_CONF_DIR while conf is default 2017-11-09 14:33:08 +09:00
load-spark-env.sh [SPARK-26076][BUILD][MINOR] Revise ambiguous error message from load-spark-env.sh 2018-11-20 08:29:59 -06:00
pyspark [SPARK-25891][PYTHON] Upgrade to Py4J 0.10.8.1 2018-10-31 09:55:03 -07:00
pyspark.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
pyspark2.cmd [SPARK-25891][PYTHON] Upgrade to Py4J 0.10.8.1 2018-10-31 09:55:03 -07:00
run-example [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
run-example.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-class [SPARK-20546][DEPLOY] spark-class gets syntax error in posix mode 2017-05-05 11:36:51 +01:00
spark-class.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-class2.cmd [SPARK-22495] Fix setup of SPARK_HOME variable on Windows 2017-11-23 12:47:38 +09:00
spark-shell [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-shell.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-shell2.cmd [SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell 2018-11-06 10:39:58 +08:00
spark-sql [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-sql.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-sql2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00
spark-submit [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
spark-submit.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
spark-submit2.cmd [SPARK-11518][DEPLOY, WINDOWS] Handle spaces in Windows command scripts 2016-02-10 09:54:22 +00:00
sparkR [SPARK-1267][SPARK-18129] Allow PySpark to be pip installed 2016-11-16 14:22:15 -08:00
sparkR.cmd [SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts 2017-10-06 23:38:47 +09:00
sparkR2.cmd [SPARK-22597][SQL] Add spark-sql cmd script for Windows users 2017-11-24 19:55:26 +01:00