## What changes were proposed in this pull request?
All of the Windows command scripts fail to handle quotes in their parameters.
Running any of them with a parameter that contains quotes reproduces the bug:
```
C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7> bin\spark-shell --driver-java-options " -Dfile.encoding=utf-8 "
'C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7\bin\spark-shell2.cmd" --driver-java-options "' is not recognized as an internal or external command,
operable program or batch file.
```
Windows treats `--driver-java-options` as part of the command.
Every Windows command script containing the following pattern has this bug:
```
cmd /V /E /C "<other command>" %*
```
The command and its parameters should be quoted as a whole:
```
cmd /V /E /C ""<other command>" %*"
```
## How was this patch tested?
Tested manually on Windows 10 and Windows 7.
The fix can be verified with the following demo:
```
C:\Users\meng\program\demo>cat a.cmd
echo off
cmd /V /E /C "b.cmd" %*
C:\Users\meng\program\demo>cat b.cmd
echo off
echo %*
C:\Users\meng\program\demo>cat c.cmd
echo off
cmd /V /E /C ""b.cmd" %*"
C:\Users\meng\program\demo>a.cmd "123"
'b.cmd" "123' is not recognized as an internal or external command,
operable program or batch file.
C:\Users\meng\program\demo>c.cmd "123"
"123"
```
Applying this to the `spark-shell.cmd` example, changing it to the following makes the command succeed:
```
cmd /V /E /C ""%~dp0spark-shell2.cmd" %*"
```
```
C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7> bin\spark-shell --driver-java-options " -Dfile.encoding=utf-8 "
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
...
```
Author: minixalpha <xkzalpha@gmail.com>
Closes #19090 from minixalpha/master.
## What changes were proposed in this pull request?
This patch fixes the problem that pyspark fails on Windows because it cannot find `spark-submit2.cmd`.
## How was this patch tested?
Manual tests:
I ran `bin\pyspark.cmd` and checked that pyspark launched correctly after this patch was applied.
Author: Masayoshi TSUZUKI <tsudukim@oss.nttdata.co.jp>
Closes #11442 from tsudukim/feature/SPARK-13592.
Modified the scripts so they no longer pollute environment variables.
The main logic was moved from `XXX.cmd` into `XXX2.cmd`, and `XXX.cmd` now calls `XXX2.cmd` via the `cmd` command.
`pyspark.cmd` and `spark-class.cmd` already use this approach, but `spark-shell.cmd`, `spark-submit.cmd`, and `/python/docs/make.bat` do not.
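The delegation pattern described above can be sketched as follows. This is a minimal illustration, not the actual Spark scripts; the file names `XXX.cmd`/`XXX2.cmd` and the variable `SPARK_EXAMPLE_VAR` are placeholders:
```
:: XXX.cmd -- thin wrapper: run the real logic in a child cmd.exe so that
:: environment variables set by XXX2.cmd do not leak into the caller's shell.
@echo off
rem %~dp0 expands to the directory containing this script.
cmd /V /E /C ""%~dp0XXX2.cmd" %*"

:: XXX2.cmd -- the real logic, free to set environment variables, since it
:: runs in a child interpreter that is discarded when it exits.
@echo off
set SPARK_EXAMPLE_VAR=some_value
echo %*
```
Because the wrapper spawns a child `cmd` process, anything `XXX2.cmd` sets (here `SPARK_EXAMPLE_VAR`) disappears once `XXX.cmd` returns, leaving the caller's environment untouched.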
Author: Masayoshi TSUZUKI <tsudukim@oss.nttdata.co.jp>
Closes #2797 from tsudukim/feature/SPARK-3943 and squashes the following commits:
b397a7d [Masayoshi TSUZUKI] [SPARK-3943] Some scripts bin\*.cmd pollutes environment variables in Windows
This is an effort to bring the Windows scripts up to speed after the recent sweeping changes in #1845.
Author: Andrew Or <andrewor14@gmail.com>
Closes #2129 from andrewor14/windows-config and squashes the following commits:
881a8f0 [Andrew Or] Add reference to Windows taskkill
92e6047 [Andrew Or] Update a few comments (minor)
22b1acd [Andrew Or] Fix style again (minor)
afcffea [Andrew Or] Fix style (minor)
72004c2 [Andrew Or] Actually respect --driver-java-options
803218b [Andrew Or] Actually respect SPARK_*_CLASSPATH
eeb34a0 [Andrew Or] Update outdated comment (minor)
35caecc [Andrew Or] In Windows, actually kill Java processes on exit
f97daa2 [Andrew Or] Fix Windows spark shell stdin issue
83ebe60 [Andrew Or] Parse special driver configs in Windows (broken)
Tested on Windows 7.
Author: Andrew Or <andrewor14@gmail.com>
Closes #745 from andrewor14/windows-submit and squashes the following commits:
c0b58fb [Andrew Or] Allow spaces in parameters
162e54d [Andrew Or] Merge branch 'master' of github.com:apache/spark into windows-submit
91597ce [Andrew Or] Make spark-shell.cmd use spark-submit.cmd
af6fd29 [Andrew Or] Add spark submit for Windows