Commit graph

3292 commits

Author SHA1 Message Date
jerryshao 7fb574bf66 Code clean and remarshal 2013-07-24 14:57:46 +08:00
Andrew xia 4d6dd67fa1 refactor metrics system
1. change source abstract class to support MetricRegistry
2. change master/work/jvm source class
2013-07-24 14:57:46 +08:00
jerryshao 03f9871116 MetricsSystem refactor 2013-07-24 14:57:46 +08:00
jerryshao c3daad3f65 Update metric source support for instrumentation 2013-07-24 14:57:46 +08:00
jerryshao 9dec8c73e6 Add Master and Worker instrumentation support 2013-07-24 14:57:46 +08:00
jerryshao 503acd3a37 Build metrics system framework 2013-07-24 14:57:46 +08:00
Matei Zaharia b011329040 Merge pull request #727 from rxin/scheduler
Scheduler code style cleanup.
2013-07-23 22:50:09 -07:00
Matei Zaharia 876125b997 Merge pull request #726 from rxin/spark-826
SPARK-829: scheduler shouldn't hang if a task contains unserializable objects in its closure
2013-07-23 22:28:21 -07:00
Reynold Xin 3dae1df66f Moved non-serializable closure catching exception from submitStage to submitMissingTasks 2013-07-23 20:29:07 -07:00
Reynold Xin d33b8a2a0f Added comments on task closure serialization. 2013-07-23 20:28:39 -07:00
Reynold Xin 85ab8114bc Moved non-serializable closure catching exception from submitStage to submitMissingTasks 2013-07-23 20:25:58 -07:00
Matei Zaharia 6a31b7191d Small bug fix 2013-07-23 16:20:24 -07:00
Matei Zaharia 2f1736c396 Merge pull request #725 from karenfeng/task-start
Creates task start events
2013-07-23 15:53:30 -07:00
Karen Feng abc78cd331 Modifies instead of copies HashSets, fixes comment style 2013-07-23 15:47:16 -07:00
Karen Feng 383684daaa Replaces Seq with HashSet, removes redundant import 2013-07-23 15:33:27 -07:00
Reynold Xin f2422d4f29 SPARK-829: scheduler shouldn't hang if a task contains unserializable objects in its closure. 2013-07-23 15:30:20 -07:00
Reynold Xin 5ed38b4d1d Scheduler code style cleanup. 2013-07-23 15:28:59 -07:00
Reynold Xin 101b8cc78a SPARK-829: scheduler shouldn't hang if a task contains unserializable objects in its closure. 2013-07-23 15:28:20 -07:00
Karen Feng 9f2dbb2a7c Adds/removes active tasks only once 2013-07-23 15:10:09 -07:00
shivaram 5364f645c5 Merge pull request #723 from rxin/mllib
Made RegressionModel serializable and added unit tests to make sure predict methods would work.
2013-07-23 13:40:34 -07:00
Karen Feng 0200801a55 Tracks task start events and shows number of active tasks on Executor UI 2013-07-23 13:35:43 -07:00
Matei Zaharia f369e0e51b Merge pull request #720 from ooyala/2013-07/persistent-rdds-api
Add a public method getCachedRdds to SparkContext
2013-07-23 13:22:27 -07:00
Reynold Xin 2210e8ccf8 Use a different validation dataset for Logistic Regression prediction testing. 2013-07-23 12:52:15 -07:00
Reynold Xin 87a9dd898f Made RegressionModel serializable and added unit tests to make sure predict methods would work. 2013-07-23 12:13:27 -07:00
Evan Chan efd6418c1b Move getPersistentRDDs testing to a new Suite 2013-07-23 10:40:41 -07:00
Evan Chan 4830e22562 Rename method per rxin feedback 2013-07-23 09:50:13 -07:00
Evan Chan 2c2bfbe294 Add toMap method to TimeStampedHashMap and use it 2013-07-23 01:36:44 -07:00
Matei Zaharia 401aac8b18 Merge pull request #719 from karenfeng/ui-808
Creates Executors tab for Jobs UI
2013-07-22 16:57:16 -07:00
Karen Feng 872c97ad82 Split task columns, memory columns sort by numeric value 2013-07-22 16:54:37 -07:00
Matei Zaharia ea1cfabfdd Merge branch 'master' of github.com:mesos/spark 2013-07-22 16:22:02 -07:00
Josh Rosen e17e1b388e Remove annotation code that broke build. 2013-07-22 16:12:11 -07:00
Josh Rosen c83680434b Add JavaAPICompletenessChecker.
This is used to find methods in the Scala API that
need to be ported to the Java API.  To use it:

  ./run spark.tools.JavaAPICompletenessChecker
Conflicts:
	project/SparkBuild.scala
	run
	run2.cmd
2013-07-22 16:11:49 -07:00
Matei Zaharia 8e38e77232 Fix a test that was using an outdated config setting 2013-07-22 16:05:32 -07:00
Matei Zaharia 8ae1436981 Merge pull request #722 from JoshRosen/spark-825
Fix bug: DoubleRDDFunctions.sampleStdev() computed non-sample stdev()
2013-07-22 16:03:04 -07:00
Karen Feng 2eea974795 Executors UI now calls executor ID from TaskInfo instead of TaskMetrics 2013-07-22 15:15:54 -07:00
Karen Feng 85c4d7bf3b Shows number of complete/total/failed tasks (bug: failed tasks assigned to null executor) 2013-07-22 14:35:47 -07:00
Josh Rosen f649dabb4a Fix bug: DoubleRDDFunctions.sampleStdev() computed non-sample stdev().
Update JavaDoubleRDD to add new methods and docs.

Fixes SPARK-825.
2013-07-22 13:21:48 -07:00
Karen Feng 8901f379c9 Fixed memory used/remaining/total bug 2013-07-22 09:58:03 -07:00
Karen Feng 636b19f833 Merge branch 'master' of https://github.com/mesos/spark into ui-808 2013-07-22 09:53:26 -07:00
Evan Chan 0337d88321 Add a public method getCachedRdds to SparkContext 2013-07-21 18:26:14 -07:00
Matei Zaharia 15fb394833 Merge pull request #716 from c0s/webui-port
Regression: default webui-port can't be set via command line "--webui-port" anymore
2013-07-21 10:33:38 -07:00
Karen Feng 865dc63bac Changed table format for executors 2013-07-19 15:57:01 -07:00
Karen Feng 81bb5dc640 Creates Executors tab for application with RDD block and memory/disk used, solves SPARK-808 2013-07-19 14:08:30 -07:00
Konstantin Boudnik cfce9a6a36 Regression: default webui-port can't be set via command line "--webui-port" anymore 2013-07-19 14:00:58 -07:00
Matei Zaharia c40f0f21f1 Merge pull request #711 from shivaram/ml-generators
Move ML lib data generator files to util/
2013-07-19 13:33:04 -07:00
Matei Zaharia 413b84172e Merge pull request #717 from viirya/dev1
Do not copy local jars given to SparkContext in yarn mode
2013-07-19 13:31:38 -07:00
Liang-Chi Hsieh d1738d72ba also exclude asm for hadoop2. hadoop1 looks like no need to do that too. 2013-07-20 00:37:24 +08:00
Liang-Chi Hsieh 4530e8a9bf fix typo. 2013-07-20 00:04:25 +08:00
Liang-Chi Hsieh aa6f83289b A better fix for giving local jars under Yarn mode. 2013-07-19 22:25:28 +08:00
Liang-Chi Hsieh a613628c50 Do not copy local jars given to SparkContext in yarn mode, since the Context is not running locally. This bug causes failures when jars cannot be found. Example code (such as spark.examples.SparkPi) cannot work without this fix under yarn mode. 2013-07-19 16:59:12 +08:00