[SPARK-19181][CORE] Fixing flaky "SparkListenerSuite.local metrics"

## What changes were proposed in this pull request?

Sometimes "SparkListenerSuite.local metrics" test fails because the average of executorDeserializeTime is too short. As squito suggested to avoid these situations in one of the task a reference introduced to an object implementing a custom Externalizable.readExternal which sleeps 1ms before returning.

## How was this patch tested?

With unit tests (and by checking the effect of this change on the average using a much larger sleep time).
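
For illustration, a rough sketch (not the suite's actual assertions; the listener name is made up) of how the per-task deserialization time can be observed through the listener API:

```scala
import scala.collection.mutable.ArrayBuffer

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Collects executorDeserializeTime (milliseconds) for every finished task.
class DeserializeTimeListener extends SparkListener {
  val times = ArrayBuffer.empty[Long]

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null for failed tasks, so guard against it.
    Option(taskEnd.taskMetrics).foreach(m => times += m.executorDeserializeTime)
  }
}

// sc.addSparkListener(new DeserializeTimeListener)
// ... run a small job, then check that times.sum.toDouble / times.size > 0
```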

Author: “attilapiros” <piros.attila.zsolt@gmail.com>
Author: Attila Zsolt Piros <2017933+attilapiros@users.noreply.github.com>

Closes #21280 from attilapiros/SPARK-19181.
“attilapiros” 2018-05-10 14:26:38 -07:00 committed by Marcelo Vanzin
parent 6282fc64e3
commit 3e2600538e


@@ -17,6 +17,7 @@
 package org.apache.spark.scheduler
+import java.io.{Externalizable, IOException, ObjectInput, ObjectOutput}
 import java.util.concurrent.Semaphore
 import scala.collection.JavaConverters._
@@ -294,10 +295,13 @@ class SparkListenerSuite extends SparkFunSuite with LocalSparkContext with Match
     val listener = new SaveStageAndTaskInfo
     sc.addSparkListener(listener)
     sc.addSparkListener(new StatsReportListener)
-    // just to make sure some of the tasks take a noticeable amount of time
+    // just to make sure some of the tasks and their deserialization take a noticeable
+    // amount of time
+    val slowDeserializable = new SlowDeserializable
     val w = { i: Int =>
       if (i == 0) {
         Thread.sleep(100)
+        slowDeserializable.use()
       }
       i
     }
@@ -583,3 +587,12 @@ private class FirehoseListenerThatAcceptsSparkConf(conf: SparkConf) extends Spar
     case _ =>
   }
 }
+
+private class SlowDeserializable extends Externalizable {
+  override def writeExternal(out: ObjectOutput): Unit = { }
+  override def readExternal(in: ObjectInput): Unit = Thread.sleep(1)
+  def use(): Unit = { }
+}