# [SPARK-16398][CORE] Make cancelJob and cancelStage APIs public

## What changes were proposed in this pull request?

Make the SparkContext `cancelJob` and `cancelStage` APIs public. This allows applications to use a `SparkListener` to do their own management of jobs via events, without using the REST API.

## How was this patch tested?

Existing tests (dev/run-tests).

Author: MasterDDT <miteshp@live.com>

Closes #14072 from MasterDDT/SPARK-16398.
Commit: 69f5391408 (parent: 42279bff68)
```diff
@@ -2011,13 +2011,23 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationClient
     dagScheduler.cancelAllJobs()
   }

-  /** Cancel a given job if it's scheduled or running */
-  private[spark] def cancelJob(jobId: Int) {
+  /**
+   * Cancel a given job if it's scheduled or running.
+   *
+   * @param jobId the job ID to cancel
+   * @throws InterruptedException if the cancel message cannot be sent
+   */
+  def cancelJob(jobId: Int) {
     dagScheduler.cancelJob(jobId)
   }

-  /** Cancel a given stage and all jobs associated with it */
-  private[spark] def cancelStage(stageId: Int) {
+  /**
+   * Cancel a given stage and all jobs associated with it.
+   *
+   * @param stageId the stage ID to cancel
+   * @throws InterruptedException if the cancel message cannot be sent
+   */
+  def cancelStage(stageId: Int) {
     dagScheduler.cancelStage(stageId)
   }

```
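With `cancelJob` now public, an application can react to scheduler events and cancel work it no longer wants. Below is a minimal sketch (not part of this PR) of a `SparkListener` that cancels any job submitted under a hypothetical `"low-priority"` job group. The group name and the listener wiring are illustrative assumptions; `addSparkListener`, `SparkListenerJobStart`, `setJobGroup`, and the `spark.jobGroup.id` property are existing Spark APIs.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobStart}

object CancelLowPriorityJobs {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("cancel-demo").setMaster("local[2]"))

    // Watch job-start events and cancel jobs from an unwanted group.
    // "low-priority" is a made-up group name for this sketch.
    sc.addSparkListener(new SparkListener {
      override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
        val group = Option(jobStart.properties)
          .flatMap(p => Option(p.getProperty("spark.jobGroup.id")))
        if (group.contains("low-priority")) {
          // Callable from application code thanks to this change;
          // previously cancelJob was private[spark].
          sc.cancelJob(jobStart.jobId)
        }
      }
    })

    // Jobs submitted under this group are cancelled as they start.
    sc.setJobGroup("low-priority", "speculative work")
    // sc.parallelize(1 to 1000000).count()  // would be cancelled

    sc.stop()
  }
}
```

Note that listener events are delivered asynchronously on the listener bus, so cancellation can race with very short jobs; the sketch is only meant to show the shape of the newly public API.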