SPARK-1860: Do not cleanup application work/ directories by default

The old default causes an unrecoverable error for applications that run for longer
than 7 days and have jars added to the SparkContext, as the jars are cleaned up
out of the worker's work/ directory even though the application is still running.
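For context, a minimal sketch of the failure mode (the master URL, app name, and jar path below are hypothetical):

```scala
// Minimal sketch of the failure mode; master URL, app name, and jar path are hypothetical.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://master:7077") // standalone mode, where worker cleanup applies
  .setAppName("long-running-app")
val sc = new SparkContext(conf)

// Executors fetch the jar into the worker's work/<app-id>/<executor-id>/ directory.
sc.addJar("/opt/jobs/extra-udfs.jar")

// With cleanup enabled, the worker deletes the app's work/ subdirectories after the
// TTL (7 days by default) even while the app is running; tasks scheduled after that
// can no longer load classes from the deleted jar.
```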

Author: Aaron Davidson <aaron@databricks.com>

Closes #800 from aarondav/shitty-defaults and squashes the following commits:

a573fbb [Aaron Davidson] SPARK-1860: Do not cleanup application work/ directories by default
Authored by Aaron Davidson on 2014-05-15 21:37:58 -07:00, committed by Patrick Wendell
commit bb98ecafce (parent 94c5139607)
2 changed files with 4 additions and 3 deletions

core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala

@@ -65,7 +65,7 @@ private[spark] class Worker(
   val REGISTRATION_TIMEOUT = 20.seconds
   val REGISTRATION_RETRIES = 3
-  val CLEANUP_ENABLED = conf.getBoolean("spark.worker.cleanup.enabled", true)
+  val CLEANUP_ENABLED = conf.getBoolean("spark.worker.cleanup.enabled", false)
   // How often worker will clean up old app folders
   val CLEANUP_INTERVAL_MILLIS = conf.getLong("spark.worker.cleanup.interval", 60 * 30) * 1000
   // TTL for app folders/data; after TTL expires it will be cleaned up
docs/configuration.md

@@ -390,10 +390,11 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td>spark.worker.cleanup.enabled</td>
-  <td>true</td>
+  <td>false</td>
   <td>
     Enable periodic cleanup of worker / application directories. Note that this only affects standalone
-    mode, as YARN works differently.
+    mode, as YARN works differently. Application directories are cleaned up regardless of whether
+    the application is still running.
   </td>
 </tr>
 <tr>
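Note that these are worker-side settings: operators who want the old behavior back can set spark.worker.cleanup.enabled=true on each worker (for example via SPARK_WORKER_OPTS in conf/spark-env.sh), optionally tuning spark.worker.cleanup.interval and spark.worker.cleanup.appDataTtl. Setting them in an application's SparkConf has no effect, since the worker daemon reads its own configuration.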