Commit graph

36 commits

Author SHA1 Message Date
Matei Zaharia 3db404a43a Run script fixes for Windows after package & assembly change 2013-09-01 23:45:57 +00:00
Matei Zaharia 46eecd110a Initial work to rename package to org.apache.spark 2013-09-01 14:13:13 -07:00
Matei Zaharia 2ee6a7e32a Print output from spark-daemon only when it fails to launch 2013-08-31 17:31:07 -07:00
Matei Zaharia 89a20b83e9 Delete some code that was added back in a merge and print less info in spark-daemon 2013-08-31 16:55:25 -07:00
Matei Zaharia aab345c463 Fix finding of assembly JAR, as well as some pointers to ./run 2013-08-29 21:19:06 -07:00
Matei Zaharia 53cd50c069 Change build and run instructions to use assemblies
This commit makes Spark invocation saner by using an assembly JAR to
find all of Spark's dependencies instead of adding all the JARs in
lib_managed. It also packages the examples into an assembly and uses
that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script
with two better-named scripts: "run-examples" for examples, and
"spark-class" for Spark internal classes (e.g. REPL, master, etc). This
is also designed to minimize the confusion people have in trying to use
"run" to run their own classes; it's not meant to do that, but now at
least if they look at it, they can modify run-examples to do a decent
job for them.

As part of this, Bagel's examples are also now properly moved to the
examples package instead of bagel.
2013-08-29 21:19:04 -07:00
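A minimal sketch of the assembly-based launch this commit describes, assuming the layout of this era (the SCALA_VERSION value, directory paths, and JAR name pattern are assumptions, not the commit's exact logic):

  #!/usr/bin/env bash
  # Locate the single assembly JAR rather than enumerating every JAR in lib_managed
  FWDIR="$(cd "$(dirname "$0")"; pwd)"                        # assumed Spark home
  SCALA_VERSION=2.9.3                                         # assumed Scala version
  ASSEMBLY_DIR="$FWDIR/assembly/target/scala-$SCALA_VERSION"  # assumed output path
  ASSEMBLY_JAR="$(ls "$ASSEMBLY_DIR"/spark-assembly*.jar 2>/dev/null | head -n 1)"
  if [ -z "$ASSEMBLY_JAR" ]; then
    echo "No assembly JAR found in $ASSEMBLY_DIR; build one with 'sbt assembly'" >&2
    exit 1
  fi
  # spark-class style invocation: the class to run is the first argument
  exec java -cp "$ASSEMBLY_JAR" "$@"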
Jey Kottalam 47a7c4338a Don't assume spark-examples JAR always exists 2013-08-18 16:59:02 -07:00
Jey Kottalam ad580b94d5 Maven build now also works with YARN 2013-08-16 13:50:12 -07:00
Jey Kottalam cb4ef19214 yarn support 2013-08-15 16:50:37 -07:00
Patrick Wendell b4905c383b Log the launch command for Spark daemons
For debugging and analysis purposes, it's nice to have the exact command
used to launch Spark contained within the logs. This adds the necessary
hooks to make that possible.
2013-08-02 16:58:19 -07:00
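A hedged illustration of the kind of hook this commit adds; the variable names ($DAEMON_CLASS, $DAEMON_ARGS, $log) are assumptions, not the actual spark-daemon code:

  # In the daemon launcher, record the exact java command before running it,
  # so the log shows precisely how the daemon was started
  command="java -cp $CLASSPATH $DAEMON_CLASS $DAEMON_ARGS"
  echo "Spark Command: $command" >> "$log"
  nohup $command >> "$log" 2>&1 &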
Jey Kottalam 1d10192806 Fix setting of SPARK_EXAMPLES_JAR 2013-07-24 14:04:17 -07:00
Josh Rosen c83680434b Add JavaAPICompletenessChecker.
This is used to find methods in the Scala API that
need to be ported to the Java API.  To use it:

  ./run spark.tools.JavaAPICompletenessChecker
Conflicts:
	project/SparkBuild.scala
	run
	run2.cmd
2013-07-22 16:11:49 -07:00
Ubuntu 88a0823c58 Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817) 2013-07-18 00:51:18 +00:00
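For reference, the portable shebang form SPARK-817 standardizes on:

  #!/usr/bin/env bash    # resolves bash via PATH instead of hard-coding /bin/bash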
Matei Zaharia 4ff494de20 Some missing license headers 2013-07-16 17:26:48 -07:00
Matei Zaharia af3c9d5042 Add Apache license headers and LICENSE and NOTICE files 2013-07-16 17:21:33 -07:00
Matei Zaharia cd28d9c147 Merge remote-tracking branch 'origin/pr/662'
Conflicts:
	bin/compute-classpath.sh
2013-07-13 19:10:00 -07:00
Matei Zaharia 43b24635ee Renamed ML package to MLlib and added it to classpath 2013-07-05 11:38:53 -07:00
Evan Chan 1107b4d55b Merge branch 'master' into 2013-06/assembly-jar-deploy
Conflicts:
	run

Previous changes that I made to run and set-dev-classpath.sh instead
have been folded into compute-classpath.sh
2013-06-28 17:18:35 -07:00
Matei Zaharia 03906f7f0a Fixes to compute-classpath on Windows 2013-06-26 17:40:22 -07:00
Matei Zaharia 6c8d1b2ca6 Fix computation of classpath when we launch java directly
The previous version assumed that a CLASSPATH environment variable was
set by the "run" script when launching the process that starts the
ExecutorRunner, but unfortunately this is not true in tests. Instead, we
factor the classpath calculation into an external script and call that.

NOTE: This includes a Windows version but hasn't yet been tested there.
2013-06-25 18:21:00 -04:00
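A minimal sketch of the factoring described above; the script name bin/compute-classpath.sh appears elsewhere in this log, but its contents and the call site shown here ($FWDIR, $CLASS) are assumptions:

  # compute-classpath.sh assembles the classpath in one place, so the run script
  # and code that launches java directly (e.g. for ExecutorRunner) stay in sync
  CLASSPATH="$("$FWDIR/bin/compute-classpath.sh")"
  exec java -cp "$CLASSPATH" "$CLASS" "$@"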
Evan Chan 4cda8f865a Add simple usage to start-slave script 2013-06-24 15:14:48 -07:00
Matei Zaharia dc4073654b Revert "Fix start-slave not passing instance number to spark-daemon."
This reverts commit a674d67c0a.
2013-06-11 00:08:02 -04:00
Stephen Haberman a674d67c0a Fix start-slave not passing instance number to spark-daemon. 2013-05-28 16:24:19 -05:00
Josh Rosen cda2b15041 Use ec2-metadata in start-slave.sh.
PR #419 applied the same change, but only to start-master.sh,
so some workers were still starting their web UI's using internal
addresses.

This should finally fix SPARK-613.
2013-05-24 13:05:06 -07:00
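A hedged sketch of the address selection this commit (and the ec2-metadata detection commit further down) describes; the SPARK_PUBLIC_DNS variable, the --public-hostname flag, and the output parsing are assumptions:

  if command -v ec2-metadata > /dev/null 2>&1; then
    # The public hostname keeps the web UI reachable from outside EC2's internal network
    export SPARK_PUBLIC_DNS="$(ec2-metadata --public-hostname | cut -d' ' -f2)"
  else
    export SPARK_PUBLIC_DNS="$(hostname)"
  fi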
kalpit aa9134f72a Spark instance number must be present in the log filename to prevent multiple workers from overwriting each other's logs 2013-03-26 17:49:30 -07:00
kalpit f08db010d3 added SPARK_WORKER_INSTANCES: allows spawning multiple worker instances/processes on every slave machine 2013-03-26 17:49:30 -07:00
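A hedged sketch of how the two commits above fit together; the loop, the script paths, the spark-daemon.sh argument order, and the worker class name are assumptions:

  # conf/spark-env.sh: request two worker processes on every slave machine
  export SPARK_WORKER_INSTANCES=2

  # start-slave.sh (sketch): pass each instance number through to spark-daemon.sh
  # so it lands in the log filename and workers stop clobbering each other's logs
  for ((i = 1; i <= ${SPARK_WORKER_INSTANCES:-1}; i++)); do
    "$FWDIR/bin/spark-daemon.sh" start spark.deploy.worker.Worker "$i"
  done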
Shivaram Venkataraman 717b221cca Detect whether we run on EC2 using ec2-metadata as well 2013-01-26 23:03:11 -08:00
Josh Rosen 1948f46093 Use spark-env.sh to configure standalone master. See SPARK-638.
Also fixed a typo in the standalone mode documentation.
2012-12-14 01:20:00 +00:00
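A hedged example of the sort of spark-env.sh settings this commit enables for the standalone master; SPARK_MASTER_IP is referenced elsewhere in this log, while the other values shown are placeholders and assumed defaults:

  # conf/spark-env.sh: sourced by the standalone scripts before launching daemons
  export SPARK_MASTER_IP=master.example.com    # placeholder hostname
  export SPARK_MASTER_PORT=7077                # assumed default master port
  export SPARK_MASTER_WEBUI_PORT=8080          # assumed default web UI port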
Josh Rosen cdaa0fad51 Use external addresses in standalone WebUI on EC2. 2012-12-01 18:19:13 -08:00
Matei Zaharia 59c0a9ad16 Use hostname instead of IP in deploy scripts to let Akka connect properly 2012-11-27 21:00:04 -08:00
Reynold Xin f67bcbed07 Use SPARK_MASTER_IP if it is set in start-slaves.sh. 2012-10-19 01:08:23 -07:00
Matei Zaharia 30362a21e7 Update license info on deploy scripts 2012-09-25 14:43:47 -07:00
Denny 8fb955fd40 Add Apache license to non-trivial scripts taken from Hadoop. 2012-08-04 17:04:33 -07:00
Denny c90c9ec208 Read config variables before getting the master port 2012-08-02 16:12:40 -07:00
Denny 53008c2d8a Settings variables and bugfix for stop script. 2012-08-02 15:59:39 -07:00
Denny 0ee44c225e Spark standalone mode cluster scripts.
Heavily inspired by Hadoop cluster scripts ;-)
2012-08-01 20:38:52 -07:00
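A hedged usage sketch for Hadoop-style cluster scripts like these; start-master.sh and start-slaves.sh appear in later commits in this log, while the bin/ location and the stop-all.sh name are assumptions:

  # On the master host: start the master daemon, then one worker per host listed
  # in conf/slaves; stop-all.sh tears the whole cluster down again
  ./bin/start-master.sh
  ./bin/start-slaves.sh
  ./bin/stop-all.sh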