Super minor: Add require for mergeCombiners in combineByKey
We changed the behavior in 0.9.0 from requiring that mergeCombiners be null when mapSideCombine was false to requiring that mergeCombiners *never* be null, for external sorting. This patch adds a require() to make this behavior change explicitly messaged rather than resulting in an NPE.

Author: Aaron Davidson <aaron@databricks.com>

Closes #623 from aarondav/master and squashes the following commits:

520b80c [Aaron Davidson] Super minor: Add require for mergeCombiners in combineByKey
This commit is contained in:
parent 9e63f80e75
commit 3fede4831e
@@ -77,6 +77,7 @@ class PairRDDFunctions[K: ClassTag, V: ClassTag](self: RDD[(K, V)])
       partitioner: Partitioner,
       mapSideCombine: Boolean = true,
       serializerClass: String = null): RDD[(K, C)] = {
+    require(mergeCombiners != null, "mergeCombiners must be defined") // required as of Spark 0.9.0
     if (getKeyClass().isArray) {
       if (mapSideCombine) {
         throw new SparkException("Cannot use map-side combining with array keys.")
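The reason mergeCombiners can never be null, even with mapSideCombine disabled, is that partial combiners built separately (e.g. per partition, or per spill file during external sorting) must eventually be merged with each other. The sketch below is a hypothetical standalone model of combineByKey's three functions using plain Scala collections, not Spark's actual implementation; the `partitions` parameter and the helper itself are illustrative assumptions:

```scala
// Hypothetical model: each inner Seq plays the role of one partition.
// createCombiner starts a combiner from a value, mergeValue folds a value
// into a combiner, and mergeCombiners merges two partial combiners --
// which is why it must be non-null even without map-side combining.
def combineByKey[K, V, C](
    partitions: Seq[Seq[(K, V)]],
    createCombiner: V => C,
    mergeValue: (C, V) => C,
    mergeCombiners: (C, C) => C): Map[K, C] = {
  require(mergeCombiners != null, "mergeCombiners must be defined")
  // Build per-partition combiners independently.
  val perPartition = partitions.map { part =>
    part.foldLeft(Map.empty[K, C]) { case (acc, (k, v)) =>
      acc.updated(k, acc.get(k).map(mergeValue(_, v)).getOrElse(createCombiner(v)))
    }
  }
  // Merging the partial results requires mergeCombiners.
  perPartition.reduce { (m1, m2) =>
    m2.foldLeft(m1) { case (acc, (k, c)) =>
      acc.updated(k, acc.get(k).map(mergeCombiners(_, c)).getOrElse(c))
    }
  }
}

val parts = Seq(Seq(("a", 1), ("b", 2)), Seq(("a", 3)))
val sums = combineByKey[String, Int, Int](parts, v => v, _ + _, _ + _)
println(sums("a")) // 4
println(sums("b")) // 2
```

Note that key "a" appears in both model partitions, so its two partial combiners (1 and 3) can only be reconciled through mergeCombiners; without it the final reduce step has no way to produce 4.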