[SPARK-21877][DEPLOY, WINDOWS] Handle quotes in Windows command scripts

## What changes were proposed in this pull request?

None of the Windows command scripts can handle quotes in their parameters.

Running a Windows command script with a parameter that contains quotes reproduces the bug:

```
C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7> bin\spark-shell --driver-java-options " -Dfile.encoding=utf-8 "
'C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7\bin\spark-shell2.cmd" --driver-java-options "' is not recognized as an internal or external command,
operable program or batch file.
```

Windows treats "--driver-java-options" as part of the command to execute. Every Windows command script that contains the following pattern has this bug:

```
cmd /V /E /C "<other command>" %*
```

We should quote the command and its parameters together, like this:

```
cmd /V /E /C ""<other command>" %*"
```
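
The extra outer quotes matter because of how cmd /C handles quote characters. Based on the quote-processing rules described in the output of "cmd /?": when the text after /C contains more than one pair of quotes, cmd falls back to stripping only the first and the last quote character on the line. A rough sketch of what the new cmd instance ends up executing in each case (illustrative only; "arg" stands for any quoted parameter):

```
rem Without the extra outer quotes, cmd strips the first and the last quote:
rem   cmd /V /E /C "<other command>" "arg"     ->  <other command>" "arg     (misparsed)
rem With the extra outer quotes, only the added outer pair is stripped:
rem   cmd /V /E /C ""<other command>" "arg""   ->  "<other command>" "arg"   (works)
```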

## How was this patch tested?

Tested manually on Windows 10 and Windows 7.

The fix can be verified with the following demo:

```
C:\Users\meng\program\demo>cat a.cmd
echo off
cmd /V /E /C "b.cmd" %*

C:\Users\meng\program\demo>cat b.cmd
echo off
echo %*

C:\Users\meng\program\demo>cat c.cmd
echo off
cmd /V /E /C ""b.cmd" %*"

C:\Users\meng\program\demo>a.cmd "123"
'b.cmd" "123' is not recognized as an internal or external command,
operable program or batch file.

C:\Users\meng\program\demo>c.cmd "123"
"123"
```
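
This matches the quote handling described above: in a.cmd the line expands to cmd /V /E /C "b.cmd" "123", the first and last quotes are stripped, and the leftover b.cmd" "123 is treated as a single command name, which is exactly the string in the error message. In c.cmd only the added outer pair is stripped, so "b.cmd" "123" runs b.cmd with its argument intact.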

Applying the fix to spark-shell.cmd, i.e. changing the script to the following, makes the command succeed:

```
cmd /V /E /C ""%~dp0spark-shell2.cmd" %*"
```

```
C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7> bin\spark-shell  --driver-java-options " -Dfile.encoding=utf-8 "
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
...

```
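
The other affected launchers (pyspark, spark-submit, sparkR, run-example, beeline, spark-class) can be spot-checked the same way; for example (illustrative invocation, output omitted):

```
C:\Users\meng\software\spark-2.2.0-bin-hadoop2.7> bin\pyspark --driver-java-options " -Dfile.encoding=utf-8 "
```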

Author: minixalpha <xkzalpha@gmail.com>

Closes #19090 from minixalpha/master.
Authored by minixalpha on 2017-10-06 23:38:47 +09:00, committed by hyukjinkwon. Commit c7b46d4d8a (parent 0c03297bf0): 7 changed files with 22 additions and 7 deletions.

bin/beeline.cmd:

```
@@ -17,4 +17,6 @@ rem See the License for the specific language governing permissions and
 rem limitations under the License.
 rem
-cmd /V /E /C "%~dp0spark-class.cmd" org.apache.hive.beeline.BeeLine %*
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-class.cmd" org.apache.hive.beeline.BeeLine %*"
```

bin/pyspark.cmd:

```
@@ -20,4 +20,6 @@ rem
 rem This is the entry point for running PySpark. To avoid polluting the
 rem environment, it just launches a new cmd to do the real work.
-cmd /V /E /C "%~dp0pyspark2.cmd" %*
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0pyspark2.cmd" %*"
```

bin/run-example.cmd:

```
@@ -19,4 +19,7 @@ rem
 set SPARK_HOME=%~dp0..
 set _SPARK_CMD_USAGE=Usage: ./bin/run-example [options] example-class [example args]
-cmd /V /E /C "%~dp0spark-submit.cmd" run-example %*
+
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-submit.cmd" run-example %*"
```

bin/spark-class.cmd:

```
@@ -20,4 +20,6 @@ rem
 rem This is the entry point for running a Spark class. To avoid polluting
 rem the environment, it just launches a new cmd to do the real work.
-cmd /V /E /C "%~dp0spark-class2.cmd" %*
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-class2.cmd" %*"
```

bin/spark-shell.cmd:

```
@@ -20,4 +20,6 @@ rem
 rem This is the entry point for running Spark shell. To avoid polluting the
 rem environment, it just launches a new cmd to do the real work.
-cmd /V /E /C "%~dp0spark-shell2.cmd" %*
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-shell2.cmd" %*"
```

bin/spark-submit.cmd:

```
@@ -20,4 +20,6 @@ rem
 rem This is the entry point for running Spark submit. To avoid polluting the
 rem environment, it just launches a new cmd to do the real work.
-cmd /V /E /C "%~dp0spark-submit2.cmd" %*
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0spark-submit2.cmd" %*"
```

bin/sparkR.cmd:

```
@@ -20,4 +20,6 @@ rem
 rem This is the entry point for running SparkR. To avoid polluting the
 rem environment, it just launches a new cmd to do the real work.
-cmd /V /E /C "%~dp0sparkR2.cmd" %*
+rem The outermost quotes are used to prevent Windows command line parse error
+rem when there are some quotes in parameters, see SPARK-21877.
+cmd /V /E /C ""%~dp0sparkR2.cmd" %*"
```