[SPARK-5078] Optionally read from SPARK_LOCAL_HOSTNAME
Spark currently lets you set the IP address using SPARK_LOCAL_IP, but this value is given to Akka only after a reverse DNS lookup, which makes it difficult to run Spark in Docker. You can already change the hostname that is used programmatically, but it would be nice to be able to do this with an environment variable as well.

Author: Michael Armbrust <michael@databricks.com>

Closes #3893 from marmbrus/localHostnameEnv and squashes the following commits:

85045b6 [Michael Armbrust] Optionally read from SPARK_LOCAL_HOSTNAME
This commit is contained in:
parent 13e610b88e
commit a3978f3e15
@@ -701,7 +701,7 @@ private[spark] object Utils extends Logging {
     }
   }
 
-  private var customHostname: Option[String] = None
+  private var customHostname: Option[String] = sys.env.get("SPARK_LOCAL_HOSTNAME")
 
   /**
    * Allow setting a custom host name because when we run on Mesos we need to use the same
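A minimal sketch of the behavior this one-line change enables: the custom hostname is now seeded from the SPARK_LOCAL_HOSTNAME environment variable, so the reverse-DNS fallback is only consulted when neither the variable nor a programmatic override is present. `HostnameResolver` below is a hypothetical stand-in for the relevant part of Spark's `Utils`, not the actual source.

```scala
// Hedged sketch, assuming a simplified stand-in for Spark's Utils object.
object HostnameResolver {
  // Before this commit the default was None; after it, the environment
  // variable (if set) pre-populates the custom hostname, skipping the
  // reverse DNS lookup that breaks inside Docker containers.
  private var customHostname: Option[String] = sys.env.get("SPARK_LOCAL_HOSTNAME")

  // The pre-existing programmatic override still takes effect.
  def setCustomHostname(hostname: String): Unit = {
    customHostname = Some(hostname)
  }

  // Resolution order: custom hostname (env var or programmatic) first,
  // then the caller-supplied fallback (e.g. a reverse-DNS-derived name).
  def localHostName(fallback: => String): String =
    customHostname.getOrElse(fallback)
}
```

In a container you would launch the JVM with `SPARK_LOCAL_HOSTNAME` set to a name the rest of the cluster can resolve; `localHostName` then returns that name without ever touching DNS.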