[SPARK-18136] Fix SPARK_JARS_DIR for Python pip install on Windows

## What changes were proposed in this pull request?

Fixes the setup of `SPARK_JARS_DIR` on Windows: the launcher script checks for the `%SPARK_HOME%\RELEASE` file when it should check for the `%SPARK_HOME%\jars` directory. The RELEASE file is not included in the `pip` build of PySpark, so the existing check fails and `SPARK_JARS_DIR` falls through to the assembly build path, which does not exist in a `pip` install. An illustration of the failure mode is sketched below.
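
For context, a quick way to see why the old check fails: a `pip`-installed PySpark ships the `jars` directory but no `RELEASE` marker file. A minimal sketch, assuming a default Anaconda `site-packages` layout (the `SPARK_HOME` path below is illustrative only, not part of this patch):

```cmd
rem Illustrative only: adjust SPARK_HOME to wherever pip placed the pyspark package.
set "SPARK_HOME=%CONDA_PREFIX%\Lib\site-packages\pyspark"

rem On a pip install, RELEASE is missing while jars is present, so the old
rem check sent SPARK_JARS_DIR to the non-existent assembly build path.
if exist "%SPARK_HOME%\RELEASE" (echo RELEASE present) else (echo RELEASE missing)
if exist "%SPARK_HOME%\jars"    (echo jars present)    else (echo jars missing)
```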

## How was this patch tested?

Tested with a local `pip` install of PySpark on Anaconda 4.4.0 (Python 3.6.1).
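
In outline, the verification amounts to installing the package and launching the shell on Windows; a rough sketch of the steps (commands only, versions as stated above):

```cmd
rem Sketch of the manual test: install PySpark via pip, then start the shell.
pip install pyspark
rem Before this fix, launching on Windows failed because SPARK_JARS_DIR
rem resolved to the assembly\target path, which a pip install does not contain.
pyspark
```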

Author: Jakub Nowacki <j.s.nowacki@gmail.com>

Closes #19310 from jsnowacki/master.

```diff
@@ -29,7 +29,7 @@ if "x%1"=="x" (
 )
 
 rem Find Spark jars.
-if exist "%SPARK_HOME%\RELEASE" (
+if exist "%SPARK_HOME%\jars" (
   set SPARK_JARS_DIR="%SPARK_HOME%\jars"
 ) else (
   set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars"
```
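
The design choice here is to test for the directory that `SPARK_JARS_DIR` will actually point to, rather than for the `RELEASE` marker file. That way the script works for any distribution that ships jars at the top level, including the `pip` package, while still falling back to the assembly build path in a source checkout.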