[SPARK-1998] SparkFlumeEvent with body bigger than 1020 bytes are not read properly

A Flume event sent to Spark will fail to deserialize if the body is too large and numHeaders is greater than zero.

Author: joyyoj <sunshch@gmail.com>

Closes #951 from joyyoj/master and squashes the following commits:

f4660c5 [joyyoj] [SPARK-1998] SparkFlumeEvent with body bigger than 1020 bytes are not read properly
This commit is contained in:

parent 1abbde0e89
commit 2966044307
@@ -71,12 +71,12 @@ class SparkFlumeEvent() extends Externalizable {
     for (i <- 0 until numHeaders) {
       val keyLength = in.readInt()
       val keyBuff = new Array[Byte](keyLength)
-      in.read(keyBuff)
+      in.readFully(keyBuff)
       val key : String = Utils.deserialize(keyBuff)

       val valLength = in.readInt()
       val valBuff = new Array[Byte](valLength)
-      in.read(valBuff)
+      in.readFully(valBuff)
       val value : String = Utils.deserialize(valBuff)

       headers.put(key, value)
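The fix swaps `in.read(buf)` for `in.readFully(buf)`. The distinction comes from the java.io contract that the Scala code above calls into: `read(byte[])` may return after filling only part of the buffer (common once payloads exceed an underlying chunk size), while `readFully` loops until the buffer is complete. The sketch below is illustrative and not part of the patch; `ChunkedStream` and the 1020-byte chunk size are hypothetical stand-ins for a stream that delivers data in pieces.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;

// Hypothetical stream that hands back at most `chunk` bytes per read()
// call, mimicking a network stream that delivers data in small pieces.
class ChunkedStream extends InputStream {
    private final ByteArrayInputStream inner;
    private final int chunk;

    ChunkedStream(byte[] data, int chunk) {
        this.inner = new ByteArrayInputStream(data);
        this.chunk = chunk;
    }

    @Override public int read() { return inner.read(); }

    @Override public int read(byte[] b, int off, int len) {
        // Honor the InputStream contract: return up to `len` bytes,
        // but never more than `chunk` in a single call.
        return inner.read(b, off, Math.min(len, chunk));
    }
}

public class ReadVsReadFully {
    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[2048];          // body bigger than 1020 bytes
        Arrays.fill(payload, (byte) 7);

        // read() may legally return before the buffer is full:
        // here it stops after one 1020-byte chunk.
        byte[] partial = new byte[2048];
        int n = new DataInputStream(new ChunkedStream(payload, 1020)).read(partial);
        System.out.println("read returned " + n + " bytes");   // 1020

        // readFully() loops internally until all 2048 bytes arrive.
        byte[] full = new byte[2048];
        new DataInputStream(new ChunkedStream(payload, 1020)).readFully(full);
        System.out.println("readFully filled " + full.length + " bytes");
    }
}
```

With the buggy `read`, the tail of `keyBuff`/`valBuff` stays zeroed, so `Utils.deserialize` is handed a truncated serialized string and fails; `readFully` guarantees the whole buffer before deserialization.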