60a899b8c3
## What changes were proposed in this pull request?

With a large partition, PySpark may exceed the executor memory limit and trigger an out-of-memory error on Python 2.7. This happens because `map()` is used: unlike in Python 3.x, Python 2.7's `map()` builds a full list and therefore reads all data into memory. The proposed fix uses `itertools.imap` on Python 2.7, and it has been verified.

## How was this patch tested?

Manual test.

Closes #23954 from TigerYang414/patch-1.

Lead-authored-by: TigerYang414 <39265202+TigerYang414@users.noreply.github.com>
Co-authored-by: Hyukjin Kwon <gurwls223@apache.org>
Signed-off-by: Sean Owen <sean.owen@databricks.com>
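The eager-vs-lazy distinction behind the fix can be sketched in Python 3 terms (the `process` function is a hypothetical stand-in, not code from this patch; Python 3's built-in `map()` is already lazy, which is the behavior `itertools.imap` provided on Python 2.7):

```python
# Sketch of the eager vs. lazy mapping difference behind the fix.
# On Python 2.7, map() returned a fully materialized list, so mapping a
# large partition pulled every record into memory at once; the patch
# switches to itertools.imap there, which yields results one at a time.

def process(record):
    # Hypothetical stand-in for per-record work on a partition.
    return record * 2

eager = list(map(process, range(5)))  # forces all results into a list
lazy = map(process, range(5))         # lazy iterator; nothing computed yet

first = next(lazy)  # computes only the first result on demand
```

With a lazy iterator, peak memory stays proportional to one record rather than to the whole partition, which is why the switch matters for large partitions.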
Files in this directory:

- ..
- avro
- tests
- __init__.py
- catalog.py
- column.py
- conf.py
- context.py
- dataframe.py
- functions.py
- group.py
- readwriter.py
- session.py
- streaming.py
- types.py
- udf.py
- utils.py
- window.py