spark-instrumented-optimizer/core/src/main
Reynold Xin 883e034aeb Merge pull request #245 from gregakespret/task-maxfailures-fix
Fix `spark.task.maxFailures` not being enforced correctly.

Docs at http://spark.incubator.apache.org/docs/latest/configuration.html say:

```
spark.task.maxFailures

Number of individual task failures before giving up on the job. Should be greater than or equal to 1. Number of allowed retries = this value - 1.
```
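As a quick usage sketch, the property can be set programmatically (assuming a Spark version that provides `SparkConf`; earlier incubator releases configured this through Java system properties instead):

```scala
import org.apache.spark.SparkConf

object MaxFailuresConf {
  // Give up on the job after the first task failure (0 allowed retries).
  val conf: SparkConf = new SparkConf()
    .setAppName("max-failures-demo")
    .set("spark.task.maxFailures", "1")
}
```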

The previous implementation enforced this incorrectly. For example, with `spark.task.maxFailures` set to 1, the job was aborted only after the second task failure, not after the first one.
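The bug boils down to an off-by-one in the abort check. A minimal, self-contained sketch of the before/after behavior (illustrative names such as `shouldAbortBuggy` and `numFailures`, not the actual `TaskSetManager` code):

```scala
object MaxFailuresDemo {
  // Buggy check: aborts only once the failure count *exceeds* the limit,
  // so a task effectively gets maxFailures + 1 attempts.
  def shouldAbortBuggy(numFailures: Int, maxFailures: Int): Boolean =
    numFailures > maxFailures

  // Fixed check: aborts as soon as the failure count *reaches* the limit,
  // matching the documented "allowed retries = maxFailures - 1".
  def shouldAbortFixed(numFailures: Int, maxFailures: Int): Boolean =
    numFailures >= maxFailures

  def main(args: Array[String]): Unit = {
    val maxFailures = 1
    // After the first failure with maxFailures = 1:
    println(shouldAbortBuggy(1, maxFailures)) // false -> job wrongly survives
    println(shouldAbortFixed(1, maxFailures)) // true  -> job aborts, as documented
  }
}
```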
2013-12-16 14:16:02 -08:00
| Directory | Last commit | Date |
| --- | --- | --- |
| java/org/apache/spark/network/netty | Merge branch 'master' into scala-2.10 | 2013-11-13 16:55:11 +08:00 |
| resources/org/apache/spark/ui/static | Add missing license headers found with RAT | 2013-09-02 12:23:03 -07:00 |
| scala/org/apache | Merge pull request #245 from gregakespret/task-maxfailures-fix | 2013-12-16 14:16:02 -08:00 |