#!/usr/bin/env python3

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# This script attempts to determine the correct setting for SPARK_HOME given
# that Spark may have been installed on the system with pip.

import os
import sys


def _find_spark_home():
    """Find the SPARK_HOME."""
    # If the environment has SPARK_HOME set, trust it.
    if "SPARK_HOME" in os.environ:
        return os.environ["SPARK_HOME"]

    def is_spark_home(path):
        """Return True if the provided path could be a reasonable SPARK_HOME."""
        return (os.path.isfile(os.path.join(path, "bin/spark-submit")) and
                (os.path.isdir(os.path.join(path, "jars")) or
                 os.path.isdir(os.path.join(path, "assembly"))))
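    # For illustration: an unpacked binary distribution such as
    # /opt/spark-3.0.0-bin-hadoop2.7 (a hypothetical path) would pass this
    # check, since it ships bin/spark-submit alongside a jars/ directory,
    # while a bare site-packages directory would not.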

    paths = ["../", os.path.dirname(os.path.realpath(__file__))]
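    # "../" is resolved against the current working directory (useful when this
    # script is run from a Spark checkout's python/ directory), while the
    # script's own directory covers the case where it sits directly inside
    # SPARK_HOME; both are assumptions about the install layout.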

    # Add the path of the PySpark module if it exists.
    import_error_raised = False
    from importlib.util import find_spec
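    # find_spec("pyspark") locates the installed package without importing it;
    # the returned spec's .origin attribute is the filesystem path of the
    # package's __init__.py.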
    try:
        module_home = os.path.dirname(find_spec("pyspark").origin)
        paths.append(module_home)
        # If we are installed in edit mode, also look two dirs up.
        paths.append(os.path.join(module_home, "../../"))
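        # (In an editable install the package typically lives inside the
        # source tree as <checkout>/python/pyspark, so Spark's root sits two
        # directories above the module.)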
    except (ImportError, AttributeError):
        # Not pip installed, no worries. (AttributeError covers the case where
        # find_spec returns None because the package is absent.)
        import_error_raised = True

    # Normalize the paths.
    paths = [os.path.abspath(p) for p in paths]
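    # os.path.abspath resolves relative entries such as "../" against the
    # current working directory, so every candidate is checked and reported
    # in absolute form.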

    try:
        # Return the first candidate that passes the is_spark_home check.
        return next(path for path in paths if is_spark_home(path))
    except StopIteration:
        print("Could not find valid SPARK_HOME while searching {0}".format(paths),
              file=sys.stderr)
        if import_error_raised:
            print(
                "\nDid you install PySpark via a package manager such as pip or Conda? If so,\n"
                "PySpark was not found in your Python environment. It is possible your\n"
                "Python environment does not properly bind with your package manager.\n"
                "\nPlease check your default 'python' and, if you set the PYSPARK_PYTHON and/or\n"
                "PYSPARK_DRIVER_PYTHON environment variables, those as well, and see if you can\n"
                "import PySpark, for example, 'python -c \"import pyspark\"'.\n"
                "\nIf you cannot import, you can install by using the Python executable directly,\n"
                "for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also\n"
                "explicitly set the Python executable that has PySpark installed to the\n"
                "PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variable, for example,\n"
                "'PYSPARK_PYTHON=python3 pyspark'.\n", file=sys.stderr)
        sys.exit(-1)


if __name__ == "__main__":
    print(_find_spark_home())
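
# A minimal sketch of how a shell wrapper might consume this script (the
# invocation below is illustrative, not something this file provides):
#
#   SPARK_HOME="$(python3 find_spark_home.py)"
#   export SPARK_HOME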