spark-instrumented-optimizer/bin
jerryshao 8aff36e91d [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen)
This PR is based on the work of roji to support running Spark scripts from symlinks. Thanks for the great work, roji. Would you mind taking a look at this PR? Thanks a lot.

Distributions such as HDP and others normally expose the Spark executables as symlinks placed on `PATH`, but Spark's current scripts do not resolve the real path from a symlink recursively, so Spark fails to execute when invoked through a symlink. This PR tries to solve the issue by finding the absolute path behind the symlink.
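
For illustration (the paths below are hypothetical, not taken from any particular distribution), the failing setup looks like this: the executable on `PATH` is only a symlink, so deriving the Spark home from `dirname "$0"` points at the wrong directory.

```sh
# Hypothetical layout: /usr/bin/spark-submit is a symlink into the real install.
ln -s /usr/hdp/current/spark-client/bin/spark-submit /usr/bin/spark-submit

# When the script is invoked through the symlink, $0 is /usr/bin/spark-submit,
# so a naive dirname "$0" yields /usr/bin instead of the real Spark bin directory.
dirname "$0"   # -> /usr/bin
```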

Unlike the earlier PR (https://github.com/apache/spark/pull/2386), which used `readlink -f`, the `-f` flag is not supported on Mac, so here the path is resolved manually with a loop.
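
A minimal sketch of that loop (hypothetical, not the exact code merged in this PR): follow each symlink with plain `readlink`, resolving relative link targets against the link's own directory, until a regular file is reached.

```sh
# Resolve symlinks without readlink -f (unavailable on Mac): follow one link at a time.
SELF="$0"
while [ -h "$SELF" ]; do
  DIR="$(cd "$(dirname "$SELF")" && pwd)"
  SELF="$(readlink "$SELF")"
  # A relative link target is interpreted relative to the link's directory.
  case "$SELF" in
    /*) ;;
    *)  SELF="$DIR/$SELF" ;;
  esac
done
# The real script lives in <spark home>/bin, so SPARK_HOME is one level up.
SPARK_HOME="$(cd "$(dirname "$SELF")/.." && pwd)"
```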

I've tested on Mac and Linux (CentOS); it looks fine.

This PR does not fix the scripts under the `sbin` folder; I'm not sure whether they need to be fixed as well.

Please help review; any comments are greatly appreciated.

Author: jerryshao <sshao@hortonworks.com>
Author: Shay Rojansky <roji@roji.org>

Closes #8669 from jerryshao/SPARK-2960.
2015-11-04 10:49:34 +00:00
beeline [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
beeline.cmd [SPARK-4683][SQL] Add a beeline.cmd to run on Windows 2014-12-04 10:21:03 -08:00
load-spark-env.cmd [SPARK-6673] spark-shell.cmd can't start in Windows even when spark was built 2015-04-06 10:11:20 +01:00
load-spark-env.sh [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
pyspark [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
pyspark.cmd moved user scripts to bin folder 2013-09-23 12:46:48 +08:00
pyspark2.cmd [SPARK-10447][SPARK-3842][PYSPARK] upgrade pyspark to py4j0.9 2015-10-20 10:52:49 -07:00
run-example [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
run-example.cmd moved user scripts to bin folder 2013-09-23 12:46:48 +08:00
run-example2.cmd [SPARK-6673] spark-shell.cmd can't start in Windows even when spark was built 2015-04-06 10:11:20 +01:00
spark-class [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
spark-class.cmd sbin/spark-class* -> bin/spark-class* 2014-01-03 15:08:01 +05:30
spark-class2.cmd [SPARK-6435] spark-shell --jars option does not add all jars to classpath 2015-04-28 07:56:36 -04:00
spark-shell [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
spark-shell.cmd [Minor] Remove permission for execution from spark-shell.cmd 2015-02-06 09:33:36 +00:00
spark-shell2.cmd [SPARK-9180] fix spark-shell to accept --name option 2015-07-22 16:15:44 -07:00
spark-sql [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
spark-submit [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
spark-submit.cmd [SPARK-3943] Some scripts bin\*.cmd pollutes environment variables in Windows 2014-10-14 18:50:14 -07:00
spark-submit2.cmd [SPARK-6324] [CORE] Centralize handling of script usage messages. 2015-06-05 14:32:00 +02:00
sparkR [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) 2015-11-04 10:49:34 +00:00
sparkR.cmd [SPARK-5654] Integrate SparkR 2015-04-08 22:45:40 -07:00
sparkR2.cmd [SPARK-5654] Integrate SparkR 2015-04-08 22:45:40 -07:00