[SPARK-6653] [YARN] New config to specify port for sparkYarnAM actor system
Author: shekhar.bansal <shekhar.bansal@guavus.com>
Closes #5719 from zuxqoj/master and squashes the following commits:
5574ff7 [shekhar.bansal] [SPARK-6653][yarn] New config to specify port for sparkYarnAM actor system
5117258 [shekhar.bansal] [SPARK-6653][yarn] New config to specify port for sparkYarnAM actor system
9de5330 [shekhar.bansal] [SPARK-6653][yarn] New config to specify port for sparkYarnAM actor system
456a592 [shekhar.bansal] [SPARK-6653][yarn] New configuration property to specify port for sparkYarnAM actor system
803e93e [shekhar.bansal] [SPARK-6653][yarn] New configuration property to specify port for sparkYarnAM actor system
(cherry picked from commit fc8feaa8e9)
Signed-off-by: Sean Owen <sowen@cloudera.com>
parent 0634510686
commit 93af96a2f5
@@ -133,6 +133,13 @@ Most of the configs are the same for Spark on YARN as for other deployment modes
     Same as <code>spark.yarn.driver.memoryOverhead</code>, but for the Application Master in client mode.
   </td>
 </tr>
+<tr>
+  <td><code>spark.yarn.am.port</code></td>
+  <td>(random)</td>
+  <td>
+    Port for the YARN Application Master to listen on. In YARN client mode, this is used to communicate between the Spark driver running on a gateway and the Application Master running on YARN. In YARN cluster mode, this is used for the dynamic executor feature, where it handles the kill from the scheduler backend.
+  </td>
+</tr>
 <tr>
   <td><code>spark.yarn.queue</code></td>
   <td>default</td>
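The "(random)" default above means the Application Master binds to port 0 and lets the operating system choose a free ephemeral port. A minimal self-contained sketch of that behavior (plain JDK, not Spark code; the object name is illustrative):

```scala
import java.net.ServerSocket

object RandomPortSketch {
  def main(args: Array[String]): Unit = {
    // Binding to port 0 asks the OS for a free ephemeral port,
    // which is what the AM does when spark.yarn.am.port is unset.
    val socket = new ServerSocket(0)
    try {
      val port = socket.getLocalPort
      println(s"OS-assigned port: $port")
      assert(port > 0)
    } finally socket.close()
  }
}
```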
@@ -285,7 +285,8 @@ private[spark] class ApplicationMaster(
   }

   private def runExecutorLauncher(securityMgr: SecurityManager): Unit = {
-    rpcEnv = RpcEnv.create("sparkYarnAM", Utils.localHostName, 0, sparkConf, securityMgr)
+    val port = sparkConf.getInt("spark.yarn.am.port", 0)
+    rpcEnv = RpcEnv.create("sparkYarnAM", Utils.localHostName, port, sparkConf, securityMgr)
     waitForSparkDriver()
     addAmIpFilter()
     registerAM(sparkConf.get("spark.driver.appUIAddress", ""), securityMgr)
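The change reads the port with `sparkConf.getInt`, falling back to 0 (random port) when the key is unset. A minimal self-contained sketch of that lookup-with-default semantics (a plain Scala `Map` standing in for `SparkConf`; object and value names are illustrative, not Spark code):

```scala
object AmPortSketch {
  // Mimics SparkConf.getInt(key, default): use the configured value
  // when present, otherwise fall back to the default (0 = random port).
  def getInt(conf: Map[String, String], key: String, default: Int): Int =
    conf.get(key).map(_.toInt).getOrElse(default)

  def main(args: Array[String]): Unit = {
    val configured = Map("spark.yarn.am.port" -> "4042")
    println(getInt(configured, "spark.yarn.am.port", 0)) // configured value wins
    println(getInt(Map.empty, "spark.yarn.am.port", 0))  // default 0 -> random port
  }
}
```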