## What changes were proposed in this pull request?
None of the Windows command scripts can handle quotes in parameters.
Running one of them with a parameter that contains quotes reproduces the bug:
```
C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7> bin\spark-shell --driver-java-options " -Dfile.encoding=utf-8 "
'C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7\bin\spark-shell2.cmd" --driver-java-options "' is not recognized as an internal or external command,
operable program or batch file.
```
Windows treats `--driver-java-options` as part of the command name.
Every Windows command script that contains the following pattern has this bug:
```
cmd /V /E /C "<other command>" %*
```
The command and its parameters should be quoted together:
```
cmd /V /E /C ""<other command>" %*"
```
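This matches cmd.exe's documented quote handling (see `cmd /?`): with `/C`, unless the command string meets the strict conditions for preserving quotes (no `/S`, exactly two quote characters, no special characters between them, and the quoted string names an executable file), cmd strips the first and the last quote character on the line and executes what remains. A comment-only sketch of what cmd ends up executing in each case (using the hypothetical `b.cmd` wrapper from the demo below):

```
:: Buggy pattern, after %* expands to "123":
::   cmd /V /E /C "b.cmd" "123"
:: Four quotes, so cmd falls back to stripping the first and last quote
:: and tries to run:
::   b.cmd" "123        <- broken command name, hence the error message

:: Fixed pattern, after %* expands to "123":
::   cmd /V /E /C ""b.cmd" "123""
:: cmd strips only the outermost pair and runs:
::   "b.cmd" "123"      <- inner quoting preserved
```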
## How was this patch tested?
Tested manually on Windows 10 and Windows 7.
The fix can be verified with the following demo:
```
C:\Users\meng\program\demo>cat a.cmd
echo off
cmd /V /E /C "b.cmd" %*
C:\Users\meng\program\demo>cat b.cmd
echo off
echo %*
C:\Users\meng\program\demo>cat c.cmd
echo off
cmd /V /E /C ""b.cmd" %*"
C:\Users\meng\program\demo>a.cmd "123"
'b.cmd" "123' is not recognized as an internal or external command,
operable program or batch file.
C:\Users\meng\program\demo>c.cmd "123"
"123"
```
Returning to the spark-shell.cmd example, changing the invocation to the following makes the command succeed:
```
cmd /V /E /C ""%~dp0spark-shell2.cmd" %*"
```
```
C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7> bin\spark-shell --driver-java-options " -Dfile.encoding=utf-8 "
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
...
```
Author: minixalpha <xkzalpha@gmail.com>
Closes #19090 from minixalpha/master.