[SPARK-28525][DEPLOY] Allow Launcher to be applied Java options
## What changes were proposed in this pull request?

The launcher is implemented as a Java application, and sometimes I'd like to apply Java options to it. One situation I have met is attaching a debugger to the launcher. The launcher is started from bin/spark-class, but there is no room to apply Java options:
```
build_command() {
  "$RUNNER" -Xmx128m -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"
  printf "%d\0" $?
}
```
Considering that Java options are rarely applied to the launcher, one compromise would be for users to modify spark-class themselves, as follows:
```
build_command() {
  "$RUNNER" -Xmx128m $SPARK_LAUNCHER_OPTS -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"
  printf "%d\0" $?
}
```
But this doesn't work when any text related to the Java options is written to standard output, because in the current implementation the whole output is used as the command string for spark-shell and spark-submit. One example is jdwp. When the agentlib option is applied to use jdwp for debugging, we get output like the following:
```
Listening for transport dt_socket at address: 9876
```
The output shown above is not part of the command string, so spark-submit and spark-shell fail. To enable Java options for the launcher, we need to distinguish the command string from other output. I changed launcher/Main.java and bin/spark-class to print a separator character and handle it.

## How was this patch tested?

Tested manually using Spark Shell with / without SPARK_LAUNCHER_OPTS, like as follows:
```
SPARK_LAUNCHER_OPTS="-agentlib:jdwp=transport=dt_socket,suspend=y,address=localhost:9876,server=y" bin/spark-shell
```

Closes #25265 from sarutak/add-spark-launcher-opts.

Authored-by: Kousuke Saruta <sarutak@oss.nttdata.com>
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
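The separator protocol this patch introduces can be sketched in Python (an illustration only, not Spark code; the sample diagnostic line and command below are made up): everything the launcher prints before the NUL-plus-newline separator is passthrough output, and everything after it is the NUL-separated command string.

```python
def split_launcher_output(raw: bytes):
    """Split launcher stdout into (diagnostic text, command args).

    The patched launcher prints a NUL followed by a newline before the
    NUL-separated command string, so anything a Java agent (e.g. jdwp)
    wrote earlier can be passed through instead of corrupting the command.
    """
    diagnostics, sep, command = raw.partition(b"\x00\n")
    if not sep:
        raise ValueError("no separator found in launcher output")
    # Each argument is NUL-terminated, so drop the empty trailing piece.
    args = [a.decode() for a in command.split(b"\x00") if a]
    return diagnostics.decode(), args

# Hypothetical stream: a jdwp banner, the separator, then the command.
raw = (b"Listening for transport dt_socket at address: 9876\n"
       b"\x00\n"
       b"java\x00-cp\x00spark.jar\x00org.apache.spark.repl.Main\x00")
print(split_launcher_output(raw))
```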
parent 44c28d7515
commit 121f9338ce
bin/spark-class
```diff
@@ -68,15 +68,27 @@ fi
 # The exit code of the launcher is appended to the output, so the parent shell removes it from the
 # command array and checks the value to see if the launcher succeeded.
 build_command() {
-  "$RUNNER" -Xmx128m -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"
+  "$RUNNER" -Xmx128m $SPARK_LAUNCHER_OPTS -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"
   printf "%d\0" $?
 }

 # Turn off posix mode since it does not allow process substitution
 set +o posix
 CMD=()
-while IFS= read -d '' -r ARG; do
-  CMD+=("$ARG")
+DELIM=$'\n'
+CMD_START_FLAG="false"
+while IFS= read -d "$DELIM" -r ARG; do
+  if [ "$CMD_START_FLAG" == "true" ]; then
+    CMD+=("$ARG")
+  else
+    if [ "$ARG" == $'\0' ]; then
+      # After NULL character is consumed, change the delimiter and consume command string.
+      DELIM=''
+      CMD_START_FLAG="true"
+    elif [ "$ARG" != "" ]; then
+      echo "$ARG"
+    fi
+  fi
 done < <(build_command "$@")

 COUNT=${#CMD[@]}
```
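The parsing loop added above relies on a bash subtlety: bash variables cannot hold NUL bytes, so when the `"\0\n"` separator line is read with a newline delimiter, `read` yields an empty string, and `$'\0'` in the test also expands to the empty string, so the comparison matches. A rough Python model of the loop (illustrative only; `parse_stream` and the sample bytes are not Spark code):

```python
def parse_stream(raw: bytes):
    """Model of the spark-class loop: newline-delimited fields until the
    separator is seen, then NUL-delimited command arguments."""
    diagnostics, args = [], []
    delim, cmd_started = b"\n", False
    rest = raw
    while rest:
        field, sep, rest = rest.partition(delim)
        if cmd_started:
            args.append(field.decode())
        elif field.strip(b"\x00") == b"":
            # bash read drops NUL bytes, so the separator line reads as "",
            # and [ "$ARG" == $'\0' ] is effectively an empty-string test.
            delim, cmd_started = b"\x00", True
        else:
            diagnostics.append(field.decode())  # passed through like `echo "$ARG"`
    return diagnostics, args

stream = (b"Listening for transport dt_socket at address: 9876\n"
          b"\x00\n"
          b"java\x00-cp\x00app.jar\x00Main\x00")
print(parse_stream(stream))
```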
conf/spark-env.sh.template
```diff
@@ -56,6 +56,9 @@
 # - SPARK_DAEMON_CLASSPATH, to set the classpath for all daemons
 # - SPARK_PUBLIC_DNS, to set the public dns name of the master or workers

+# Options for launcher
+# - SPARK_LAUNCHER_OPTS, to set config properties and Java options for the launcher (e.g. "-Dx=y")
+
 # Generic options for the daemons used in the standalone deploy mode
 # - SPARK_CONF_DIR      Alternate conf dir. (Default: ${SPARK_HOME}/conf)
 # - SPARK_LOG_DIR       Where log files are stored. (Default: ${SPARK_HOME}/logs)
```
launcher/src/main/java/org/apache/spark/launcher/Main.java
```diff
@@ -90,6 +90,9 @@ class Main {
     if (isWindows()) {
       System.out.println(prepareWindowsCommand(cmd, env));
     } else {
+      // A sequence of NULL character and newline separates command-strings and others.
+      System.out.println('\0');
+
       // In bash, use NULL as the arg separator since it cannot be used in an argument.
       List<String> bashCmd = prepareBashCommand(cmd, env);
       for (String c : bashCmd) {
```
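The writer side of the protocol, mirroring the Main.java change above, can be sketched like this (illustrative Python, not the actual launcher; `System.out.println('\0')` is what produces the NUL+newline separator, and each argument is then terminated by a NUL):

```python
def emit_launcher_output(cmd):
    """Build the byte stream the patched launcher writes on a POSIX shell:
    the NUL+newline separator, then each argument terminated by a NUL."""
    out = b"\x00\n"                     # System.out.println('\0')
    for arg in cmd:                     # loop over prepareBashCommand(...)
        out += arg.encode() + b"\x00"   # NUL cannot appear inside an argument
    return out

stream = emit_launcher_output(["java", "-cp", "spark.jar",
                               "org.apache.spark.repl.Main"])
print(stream)
```

Any diagnostic output a Java agent writes before this point simply precedes the separator, which is exactly what lets the bash loop in spark-class echo it through instead of treating it as part of the command.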