[SPARK-8273] Driver hangs up when yarn shutdown in client mode
In client mode, if YARN is shut down while a Spark application is running, the application hangs after several retries (default: 30) because the exception thrown by YarnClientImpl cannot be caught at the upper level; we should exit so the user can be made aware of it. The exception we want to catch is [here](https://github.com/apache/hadoop/blob/branch-2.7.0/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java#L122), and the fix follows the approach used in [MR](https://github.com/apache/hadoop/blob/branch-2.7.0/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/main/java/org/apache/hadoop/mapred/ClientServiceDelegate.java#L320).

Author: WangTaoTheTonic <wangtao111@huawei.com>

Closes #6717 from WangTaoTheTonic/SPARK-8273 and squashes the following commits:

28752d6 [WangTaoTheTonic] catch the throwed exception
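The pattern this commit applies can be sketched as follows. This is a minimal, self-contained illustration with hypothetical stand-in names (`getReport`, `monitor`); the real code wraps `yarnClient.getApplicationReport` in `Client.scala` and returns YARN enum values rather than strings. `NonFatal` matches the connection-level exception that Hadoop's `RetryInvocationHandler` rethrows after exhausting its retries, so monitoring returns a terminal status instead of hanging.

```scala
import scala.util.control.NonFatal

object MonitorSketch {
  // Stand-in for yarnClient.getApplicationReport, which throws once
  // YARN is unreachable and the client-side retries are exhausted.
  def getReport(): String =
    throw new java.net.ConnectException("Connection refused")

  // Mirrors the commit's error handling: map any non-fatal failure
  // to a terminal (FAILED, FAILED) status so the caller exits.
  def monitor(): (String, String) =
    try {
      val report = getReport()
      ("RUNNING", "UNDEFINED")
    } catch {
      case NonFatal(e) => ("FAILED", "FAILED")
    }
}
```

Without the `NonFatal` case, the `ConnectException` would propagate out of the monitoring loop only after all retries, leaving the driver hung in the meantime.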
parent 568d1d51d6
commit 2846a357f3
```diff
@@ -28,6 +28,7 @@ import scala.collection.JavaConversions._
 import scala.collection.mutable.{ArrayBuffer, HashMap, HashSet, ListBuffer, Map}
 import scala.reflect.runtime.universe
 import scala.util.{Try, Success, Failure}
+import scala.util.control.NonFatal
 
 import com.google.common.base.Objects
 import com.google.common.io.Files
```
```diff
@@ -771,6 +772,9 @@ private[spark] class Client(
       case e: ApplicationNotFoundException =>
         logError(s"Application $appId not found.")
         return (YarnApplicationState.KILLED, FinalApplicationStatus.KILLED)
+      case NonFatal(e) =>
+        logError(s"Failed to contact YARN for application $appId.", e)
+        return (YarnApplicationState.FAILED, FinalApplicationStatus.FAILED)
     }
     val state = report.getYarnApplicationState
 
```