a17a0ee776
When we run applications on YARN in cluster mode, resources uploaded to the .sparkStaging directory are not cleaned up if uploading the local resources fails.
You can reproduce this issue by running the following command.
```
bin/spark-submit --master yarn --deploy-mode cluster --class <someClassName> <non-existing-jar>
```
Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>
Closes #6026 from sarutak/delete-uploaded-resources-on-error and squashes the following commits:
caef9f4 [Kousuke Saruta] Fixed style
882f921 [Kousuke Saruta] Wrapped Client#submitApplication with try/catch blocks in order to delete resources on error
1786ca4 [Kousuke Saruta] Merge branch 'master' of https://github.com/apache/spark into delete-uploaded-resources-on-error
f61071b [Kousuke Saruta] Fixed cleanup problem
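The shape of the fix described in the commits above can be sketched as follows. This is an illustrative stand-in, not the actual patch: `StagingCleanupSketch`, `cleanupStagingDir`, and the use of the local filesystem are assumptions for the sake of a self-contained example (the real `Client` in `org.apache.spark.deploy.yarn` works against Hadoop's `FileSystem` API):

```scala
import java.nio.file.{Files, Path}
import java.util.Comparator

// Hypothetical stand-in for org.apache.spark.deploy.yarn.Client,
// using the local filesystem instead of Hadoop's FileSystem.
object StagingCleanupSketch {

  // Recursively delete the staging directory (children before parents).
  def cleanupStagingDir(stagingDir: Path): Unit = {
    if (Files.exists(stagingDir)) {
      Files.walk(stagingDir)
        .sorted(Comparator.reverseOrder[Path]())
        .forEach(p => Files.deleteIfExists(p))
    }
  }

  // Mirrors the pattern of the fix: wrap the submission steps in
  // try/catch and delete already-uploaded resources before re-throwing,
  // so a failed upload does not leave .sparkStaging behind.
  def submitApplication(stagingDir: Path)(upload: => Unit): Unit = {
    try {
      upload // e.g. copying local jars into .sparkStaging fails here
    } catch {
      case e: Throwable =>
        cleanupStagingDir(stagingDir)
        throw e
    }
  }
}
```

The key point is that the cleanup runs only on the error path and the original exception is re-thrown, so callers still see the failure.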
(cherry picked from commit a17a0ee776)