## What changes were proposed in this pull request?
Updated the usage message in `sbin/start-slave.sh`: the `<masterURL>` argument was moved to the first position.
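The reordering can be sketched like this (a sketch only; the function and variable names are illustrative, not the script's actual code): the master URL is consumed as the first positional argument, and any remaining flags are passed through.

```shell
# Illustrative sketch: <masterURL> comes first, remaining options follow.
parse_start_slave_args() {
  master_url="$1"; shift          # first positional argument is the master URL
  echo "master=$master_url options=$*"
}
```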
## How was this patch tested?
Tested locally by:
- starting a master
- starting a slave with `./start-slave.sh spark://<IP>:<PORT> -c 1`
- opening a Spark shell with `./spark-shell --master spark://<IP>:<PORT>`
Closes #24974 from shivusondur/jira28164.
Authored-by: shivusondur <shivusondur@gmail.com>
Signed-off-by: Sean Owen <sean.owen@databricks.com>
## What changes were proposed in this pull request?
Currently, if we run
```
./sbin/start-master.sh -h
```
We get
```
Usage: ./sbin/start-master.sh [options]
18/10/11 23:38:30 INFO Master: Started daemon with process name: 33907C02TL2JZGTF1
18/10/11 23:38:30 INFO SignalUtils: Registered signal handler for TERM
18/10/11 23:38:30 INFO SignalUtils: Registered signal handler for HUP
18/10/11 23:38:30 INFO SignalUtils: Registered signal handler for INT
Options:
-i HOST, --ip HOST Hostname to listen on (deprecated, please use --host or -h)
-h HOST, --host HOST Hostname to listen on
-p PORT, --port PORT Port to listen on (default: 7077)
--webui-port PORT Port for web UI (default: 8080)
--properties-file FILE Path to a custom Spark properties file.
Default is conf/spark-defaults.conf.
```
We can filter out some useless output.
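The filtering idea can be sketched as follows (a sketch under my own assumptions, not the actual patch; the `awk` filter and function name are illustrative): keep the `Usage:` line and everything from the `Options:` block onward, and drop the daemon log lines in between.

```shell
# Illustrative sketch (not the actual patch): given raw help output on stdin,
# print the "Usage:" line and the "Options:" block, dropping log lines
# such as "INFO Master: ...".
filter_usage() {
  awk '/^Usage:/ {print; next} /^Options:/ {found=1} found'
}
```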
## How was this patch tested?
Manual test
Closes #22700 from gengliangwang/improveStartScript.
Authored-by: Gengliang Wang <gengliang.wang@databricks.com>
Signed-off-by: Sean Owen <sean.owen@databricks.com>
Addressing https://issues.apache.org/jira/browse/SPARK-11218; mostly copied from `start-thriftserver.sh`.
```
charlesyeh-mbp:spark charlesyeh$ ./sbin/start-master.sh --help
Usage: Master [options]
Options:
-i HOST, --ip HOST Hostname to listen on (deprecated, please use --host or -h)
-h HOST, --host HOST Hostname to listen on
-p PORT, --port PORT Port to listen on (default: 7077)
--webui-port PORT Port for web UI (default: 8080)
--properties-file FILE Path to a custom Spark properties file.
Default is conf/spark-defaults.conf.
```
```
charlesyeh-mbp:spark charlesyeh$ ./sbin/start-slave.sh
Usage: Worker [options] <master>
Master must be a URL of the form spark://hostname:port
Options:
-c CORES, --cores CORES Number of cores to use
-m MEM, --memory MEM Amount of memory to use (e.g. 1000M, 2G)
-d DIR, --work-dir DIR Directory to run apps in (default: SPARK_HOME/work)
-i HOST, --ip IP Hostname to listen on (deprecated, please use --host or -h)
-h HOST, --host HOST Hostname to listen on
-p PORT, --port PORT Port to listen on (default: random)
--webui-port PORT Port for web UI (default: 8081)
--properties-file FILE Path to a custom Spark properties file.
Default is conf/spark-defaults.conf.
```
Author: Charles Yeh <charlesyeh@dropbox.com>
Closes #9432 from CharlesYeh/helpmsg.
This PR is based on the work of roji to support running Spark scripts from symlinks. Thanks for the great work, roji. Would you mind taking a look at this PR? Thanks a lot.
For releases like HDP and others, the Spark executables are normally exposed as symlinks on the `PATH`, but Spark's current scripts do not resolve the real path behind a symlink recursively, so Spark fails to execute when launched through a symlink. This PR tries to solve the issue by finding the absolute path from the symlink.
Unlike an earlier PR (https://github.com/apache/spark/pull/2386) that used `readlink -f`, this one seeks the path manually in a loop, because `-f` is not supported on Mac.
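The manual resolution loop can be sketched like this (a sketch only; the function name and structure are mine, not necessarily the PR's code): follow each link with plain `readlink`, which is available on Mac, until a non-symlink is reached, handling both absolute and relative link targets.

```shell
# Sketch of resolving a symlink without `readlink -f` (unsupported on Mac):
# loop over plain `readlink`, handling absolute and relative link targets.
resolve_link() {
  target="$1"
  while [ -L "$target" ]; do
    link="$(readlink "$target")"
    case "$link" in
      /*) target="$link" ;;                        # link target is absolute
      *)  target="$(dirname "$target")/$link" ;;   # relative to link's dir
    esac
  done
  echo "$target"
}
```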
I've tested on Mac and Linux (CentOS); looks fine.
This PR does not fix the scripts under the `sbin` folder; I'm not sure whether those need to be fixed as well.
Please help to review, any comment is greatly appreciated.
Author: jerryshao <sshao@hortonworks.com>
Author: Shay Rojansky <roji@roji.org>
Closes #8669 from jerryshao/SPARK-2960.
This refixes #3699 with the latest code.
This fixes SPARK-4848.
I've changed the standalone cluster scripts to allow different workers to have different numbers of instances, with both the port and the web UI port following along appropriately.
I did this by moving the loop over instances from start-slaves and stop-slaves (on the master) to start-slave and stop-slave (on the worker).
While I was at it, I changed SPARK_WORKER_PORT to work the same way as SPARK_WORKER_WEBUI_PORT, since the new methods work fine for both.
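The per-worker instance loop can be sketched as follows (names and the `echo` output are illustrative assumptions; the real script launches a Worker rather than printing): each instance `i` gets `SPARK_WORKER_PORT + i` and `SPARK_WORKER_WEBUI_PORT + i`, so both ports scale the same way.

```shell
# Illustrative sketch: instance i gets base_port + i and base_webui_port + i.
start_worker_instances() {
  base_port="$1"; base_webui_port="$2"; num_instances="$3"
  i=0
  while [ "$i" -lt "$num_instances" ]; do
    echo "worker $((i + 1)): --port $((base_port + i)) --webui-port $((base_webui_port + i))"
    i=$((i + 1))
  done
}
```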
Author: Nathan Kronenfeld <nkronenfeld@oculusinfo.com>
Closes #5140 from nkronenfeld/feature/spark-4848 and squashes the following commits:
cf5f47e [Nathan Kronenfeld] Merge remote branch 'upstream/master' into feature/spark-4848
044ca6f [Nathan Kronenfeld] Documentation and formatting as requested by andrewor14
d739640 [Nathan Kronenfeld] Move looping through instances from the master to the workers, so that each worker respects its own number of instances and web-ui port
Without this change, the error below occurs when I execute `sbin/start-all.sh`:
localhost: /spark-1.3/sbin/start-slave.sh: line 32: unexpected EOF while looking for matching `"'
localhost: /spark-1.3/sbin/start-slave.sh: line 33: syntax error: unexpected end of file
My operating system is Linux Mint 17.1 Rebecca.
Author: Jose Manuel Gomez <jmgomez@stratio.com>
Closes #5262 from josegom/patch-2 and squashes the following commits:
453af8b [Jose Manuel Gomez] Update start-slave.sh
2c456bd [Jose Manuel Gomez] Update start-slave.sh
https://issues.apache.org/jira/browse/SPARK-6552
/cc srowen
Author: WangTaoTheTonic <wangtao111@huawei.com>
Closes #5205 from WangTaoTheTonic/SPARK-6552 and squashes the following commits:
b02263c [WangTaoTheTonic] use less than rather than less equal
f0fa408 [WangTaoTheTonic] expose start-slave.sh
...
Tested! TBH, it isn't a great idea to have a directory with spaces in its name: Emacs doesn't like it, then Hadoop doesn't like it, and so on...
Author: Prashant Sharma <prashant.s@imaginea.com>
Closes #2229 from ScrapCodes/SPARK-3337/quoting-shell-scripts and squashes the following commits:
d4ad660 [Prashant Sharma] SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within.
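The quoting discipline behind this change can be sketched as follows (a minimal sketch; the function is illustrative, not code from the patch): every variable and argument expansion is quoted, so paths containing spaces are passed as single words instead of being split.

```shell
# Illustrative sketch of "paranoid quoting": every expansion is quoted,
# so an install dir with spaces survives word splitting.
run_in_dir() {
  dir="$1"; shift
  ( cd "$dir" && "$@" )   # "$dir" and "$@" quoted; spaces stay intact
}
```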
The lines in `start-master.sh` and `start-slave.sh` no longer work.
In EC2, the hostname has changed, e.g.
ubuntu@ip-172-31-36-93:~$ hostname
ip-172-31-36-93
Also, the URL to fetch the public DNS name has changed, e.g.
ubuntu@ip-172-31-36-93:~$ wget -q -O - http://instance-data.ec2.internal/latest/meta-data/public-hostname
ubuntu@ip-172-31-36-93:~$ (returns nothing)
Since we have the spark-ec2 project, we don't need such EC2-specific lines here; instead, users only need to set this in `spark-env.sh`.
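For illustration, such a `conf/spark-env.sh` fragment might look like this (the DNS name below is a made-up placeholder, not a real host; `SPARK_PUBLIC_DNS` is the variable Spark reads for the externally visible hostname):

```shell
# Hypothetical conf/spark-env.sh fragment; the DNS name is a placeholder.
export SPARK_PUBLIC_DNS="ec2-203-0-113-1.compute-1.amazonaws.com"
```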
Author: CodingCat <zhunansjtu@gmail.com>
Closes #588 from CodingCat/deadcode_in_sbin and squashes the following commits:
e4236e0 [CodingCat] remove dead code in start script, remind user set that in spark-env.sh