[SPARK-25906][SHELL] Documents '-I' option (from Scala REPL) in spark-shell
## What changes were proposed in this pull request?

This PR documents the `-I` option from Spark 2.4.x (previously the `-i` option until Spark 2.3.x). After the Scala upgrade to 2.11.12, the `-i` option (`:load`) was replaced by `-I` (SI-7898). The existing `-i` now behaves like `:paste`, which does not respect Spark's implicit imports (for instance `toDF`, symbols as columns, etc.). Therefore, the `-i` option does not work correctly from Spark 2.4.x, and it is not documented. I checked the other Scala REPL options, but from quick tests they look either not applicable or not working. This PR only targets to document `-I` for now.

## How was this patch tested?

Manually tested.

**Mac:**

```bash
$ ./bin/spark-shell --help
Usage: ./bin/spark-shell [options]

Scala REPL options:
  -I <file>                   preload <file>, enforcing line-by-line interpretation

Options:
  --master MASTER_URL         spark://host:port, mesos://host:port, yarn,
                              k8s://https://host:port, or local (Default: local[*]).
  --deploy-mode DEPLOY_MODE   Whether to launch the driver program locally ("client") or
                              on one of the worker machines inside the cluster ("cluster")
                              (Default: client).
  ...
```

**Windows:**

```cmd
C:\...\spark>.\bin\spark-shell --help
Usage: .\bin\spark-shell.cmd [options]

Scala REPL options:
  -I <file>                   preload <file>, enforcing line-by-line interpretation

Options:
  --master MASTER_URL         spark://host:port, mesos://host:port, yarn,
                              k8s://https://host:port, or local (Default: local[*]).
  --deploy-mode DEPLOY_MODE   Whether to launch the driver program locally ("client") or
                              on one of the worker machines inside the cluster ("cluster")
                              (Default: client).
  ...
```

Closes #22919 from HyukjinKwon/SPARK-25906.

Authored-by: hyukjinkwon <gurwls223@apache.org>
Signed-off-by: hyukjinkwon <gurwls223@apache.org>
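To illustrate the behavior being documented, here is a minimal, hypothetical preload script (the file name `init.scala` and its contents are illustrative, not part of this PR). Under `-I`, each line is interpreted as if typed into the REPL, so Spark's implicit imports (which enable `toDF` and symbol-as-column syntax, as noted above) are in effect:

```scala
// init.scala (hypothetical): relies on the REPL's implicit imports,
// which `-I` preserves because the file is interpreted line by line.
val df = Seq((1, "spark"), (2, "shell")).toDF("id", "name")

// Symbol-as-column syntax, also enabled by the REPL's implicits.
df.select('name).show()
```

Loaded with `./bin/spark-shell -I init.scala`, this works; loaded under the old `-i` (now `:paste` semantics), `toDF` would not resolve.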
parent 78fa1be29b, commit cc38abc27a
`bin/spark-shell`:

```diff
@@ -32,7 +32,10 @@ if [ -z "${SPARK_HOME}" ]; then
   source "$(dirname "$0")"/find-spark-home
 fi
 
-export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
+export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
+
+Scala REPL options:
+  -I <file>                   preload <file>, enforcing line-by-line interpretation"
 
 # SPARK-4161: scala does not assume use of the java classpath,
 # so we need to add the "-Dscala.usejavacp=true" flag manually. We
```
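For reference, a standalone sketch of why the POSIX side needs no special machinery: a double-quoted shell assignment preserves embedded newlines as-is. The trailing `printf` is for demonstration only and is not in the actual script:

```bash
#!/usr/bin/env bash
# A double-quoted assignment keeps literal newlines, so the usage text
# can span multiple lines with no escaping tricks.
export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]

Scala REPL options:
  -I <file>                   preload <file>, enforcing line-by-line interpretation"

# Demonstration only: print the value with its embedded newlines intact.
printf '%s\n' "$_SPARK_CMD_USAGE"
```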
`bin/spark-shell2.cmd`:

```diff
@@ -20,7 +20,13 @@ rem
 rem Figure out where the Spark framework is installed
 call "%~dp0find-spark-home.cmd"
 
-set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]
+set LF=^
+
+
+rem two empty lines are required
+set _SPARK_CMD_USAGE=Usage: .\bin\spark-shell.cmd [options]^%LF%%LF%^%LF%%LF%^
+Scala REPL options:^%LF%%LF%^
+  -I ^<file^>                  preload ^<file^>, enforcing line-by-line interpretation
 
 rem SPARK-4161: scala does not assume use of the java classpath,
 rem so we need to add the "-Dscala.usejavacp=true" flag manually. We
```
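The batch side needs the `LF` trick because `cmd` has no direct syntax for embedding a newline in a `set` value. Below is a minimal sketch mirroring the pattern in the diff above; the variable name `MSG` is hypothetical, and delayed expansion is used only so `echo` can demonstrate the result (the real script just passes the variable through the environment):

```cmd
@echo off
setlocal EnableDelayedExpansion

rem A caret at end of line escapes the newline that follows, so %LF%
rem captures a single literal newline; the two empty lines are required.
set LF=^


rem Build a two-line value: in "^%LF%%LF%^", the caret plus the first
rem expanded newline acts as line continuation, while the second newline
rem is kept literally in the value (the same idiom as in the diff above).
set MSG=first line^%LF%%LF%^
second line

rem Delayed expansion keeps the embedded newline intact when echoing.
echo !MSG!
```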