spark-instrumented-optimizer/core
Andrew Or 4296d96c82 Assign spill threshold as a fraction of maximum memory
Further, divide this threshold by the number of tasks running concurrently.

Note that this does not guard against the following scenario: a new task
quickly fills up its share of the memory before old tasks finish spilling
their contents, in which case the total memory used by such maps may exceed
what was specified. Currently, spark.shuffle.safetyFraction mitigates
this effect.
2014-01-04 00:00:57 -08:00
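The threshold computation described above can be sketched as follows. This is an illustrative sketch, not the actual Spark internals; the names (`spillThreshold`, `maxMemory`, `memoryFraction`, `safetyFraction`, `numRunningTasks`) are assumptions standing in for the real configuration keys and fields:

```scala
// Hypothetical sketch: derive a per-task spill threshold by taking a
// fraction of maximum memory, scaling by a safety fraction, and dividing
// evenly among concurrently running tasks.
object SpillThresholdSketch {
  def spillThreshold(maxMemory: Long,
                     memoryFraction: Double,  // e.g. spark.shuffle.memoryFraction
                     safetyFraction: Double,  // e.g. spark.shuffle.safetyFraction
                     numRunningTasks: Int): Long = {
    require(numRunningTasks > 0, "at least one running task")
    // Each task gets an equal share of the safety-adjusted memory budget.
    (maxMemory * memoryFraction * safetyFraction / numRunningTasks).toLong
  }

  def main(args: Array[String]): Unit = {
    // 1 GiB max memory, 30% shuffle fraction, 80% safety, 4 concurrent tasks
    val perTask = spillThreshold(1L << 30, 0.3, 0.8, 4)
    println(perTask) // each task's share of the spill budget, in bytes
  }
}
```

Note that, as the message says, this per-task share is computed from the number of tasks running at the moment: a task that starts later does not shrink the shares already claimed by earlier tasks, which is why the aggregate can transiently exceed the budget.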
src      Assign spill threshold as a fraction of maximum memory  2014-01-04 00:00:57 -08:00
pom.xml  restore core/pom.xml file modification                  2014-01-01 11:30:08 +08:00