[SPARK-34674][CORE][K8S] Close SparkContext after the Main method has finished

### What changes were proposed in this pull request?
Close SparkContext after the Main method has finished, to allow SparkApplication on K8S to complete.
This is a fixed version of the [merged and reverted PR](https://github.com/apache/spark/pull/32081).

### Why are the changes needed?
Without an explicit call to `sparkContext.stop()`, the Spark driver process on K8S does not terminate even after its Main method has completed. This behaviour differs from Spark on YARN, where stopping the SparkContext manually is not required. The cause appears to be non-daemon threads, which prevent the driver JVM from exiting.
This PR inserts code that stops the SparkContext automatically after the Main method finishes.
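The fix follows the common "stop in `finally`" pattern: after the user's Main method returns (or throws), any still-active SparkContext is stopped so its non-daemon threads no longer keep the driver JVM alive. A minimal sketch of that pattern (the `runUserMain` wrapper is hypothetical; `SparkContext.getActive` and `stop()` are real Spark APIs):

```scala
import org.apache.spark.SparkContext

// Sketch: run the user's main entry point, then stop any SparkContext
// that is still active so the driver JVM can terminate.
def runUserMain(userMain: () => Unit): Unit = {
  try {
    userMain()
  } finally {
    // getActive returns Option[SparkContext]; stop it if present.
    SparkContext.getActive.foreach { sc =>
      try sc.stop()
      catch {
        // A failure to stop should not mask the user code's outcome.
        case e: Throwable => Console.err.println(s"Failed to close SparkContext: $e")
      }
    }
  }
}
```

In the actual patch this logic lives in `SparkSubmit`, guarded so that interactive shells and the Thrift server, which manage their own context lifecycle, are excluded.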

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Manually on the production AWS EKS environment in my company.

Closes #32283 from kotlovs/close-spark-context-on-exit-2.

Authored-by: skotlov <skotlov@joom.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
skotlov 2021-04-21 22:54:16 -07:00 committed by Dongjoon Hyun
parent 548e66c98a
commit b17a0e6931

@@ -955,6 +955,15 @@ private[spark] class SparkSubmit extends Logging {
    } catch {
      case t: Throwable =>
        throw findCause(t)
    } finally {
      if (!isShell(args.primaryResource) && !isSqlShell(args.mainClass) &&
          !isThriftServer(args.mainClass)) {
        try {
          SparkContext.getActive.foreach(_.stop())
        } catch {
          case e: Throwable => logError(s"Failed to close SparkContext: $e")
        }
      }
    }
  }