This allows the spark-shell, spark-class, run-example, make-distribution.sh,
and ./bin/start-* scripts to work under Cygwin. Note that this doesn't add
support for PySpark under Cygwin, since that would require many additional
`cygpath` calls from within Python and would be non-trivial to implement.
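The kind of path conversion these scripts need under Cygwin can be sketched as
follows (the `uname`-based detection and the variable names here are
illustrative, not the exact code in this PR):

```shell
# Detect Cygwin; on other platforms leave paths untouched.
case "$(uname -s 2>/dev/null)" in
  CYGWIN*) cygwin=true ;;
  *)       cygwin=false ;;
esac

CLASSPATH="/home/user/spark/lib/example.jar"
if $cygwin; then
  # cygpath -w rewrites a Unix-style path into the Windows form
  # (e.g. /home/user/... -> C:\cygwin\home\user\...), which the JVM expects.
  CLASSPATH="$(cygpath -w "$CLASSPATH")"
fi
echo "$CLASSPATH"
```

On a non-Cygwin system the `cygpath` branch is skipped entirely, so the scripts
keep working unchanged there.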
This PR was inspired by, and subsumes, #253 (so close #253 after this is merged).
In some environments (for example, when CDPATH is set, `cd` prints the
directory it enters), this command

export SPARK_HOME=$(cd "$(dirname $0)/.."; pwd)

captures two paths: one echoed by the `cd ..` step and one printed by `pwd`.
Note the resulting erroneous -jar paths below:
ctn@ubuntu:~/src/spark$ sbt/sbt
+ EXTRA_ARGS=
+ '[' '' '!=' '' ']'
+++ dirname sbt/sbt
++ cd sbt/..
++ pwd
+ export 'SPARK_HOME=/home/ctn/src/spark
/home/ctn/src/spark'
+ SPARK_HOME='/home/ctn/src/spark
/home/ctn/src/spark'
+ export SPARK_TESTING=1
+ SPARK_TESTING=1
+ java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=128m -jar /home/ctn/src/spark /home/ctn/src/spark/sbt/sbt-launch-0.11.3-2.jar
Error: Invalid or corrupt jarfile /home/ctn/src/spark
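The failure and the fix can be reproduced in a minimal sketch, assuming an
environment where CDPATH makes `cd` echo the directory it enters (one known
trigger; the directory names below are illustrative):

```shell
# Set up a directory layout mirroring spark/sbt.
d=$(mktemp -d)
mkdir -p "$d/sbt"
cd "$d"
export CDPATH=.   # in affected shells, `cd sbt/..` now prints the new cwd

broken=$(cd sbt/..; pwd)              # may capture two lines: cd's echo + pwd
fixed=$(cd sbt/.. > /dev/null; pwd)   # exactly one line: cd's echo is discarded

echo "$fixed"
```

Redirecting the `cd` step to /dev/null leaves `pwd` as the only contributor to
the command substitution, so SPARK_HOME is always a single path.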
Committer: ctn <ctn@adatao.com>
On branch master
Changes to be committed:
- Send output of the "cd .." part to /dev/null
modified: sbt/sbt