SPARK-1916
The changes could be ported back to 0.9 as well.
Changing in.read to in.readFully fills the whole body buffer rather than only the first 1020 bytes.
This should be OK, considering that Flume caps the body size to 32K by default.

Author: David Lemieux <david.lemieux@radialpoint.com>

Closes apache#865 from lemieud/SPARK-1916 and squashes the following commits:

a265673 [David Lemieux] Updated SparkFlumeEvent to read the whole stream rather than the first X bytes.
David Lemieux authored and pwendell committed May 28, 2014
1 parent 032493e commit 0b769b7
Showing 1 changed file with 1 addition and 1 deletion.
@@ -63,7 +63,7 @@ class SparkFlumeEvent() extends Externalizable {
   def readExternal(in: ObjectInput) {
     val bodyLength = in.readInt()
     val bodyBuff = new Array[Byte](bodyLength)
-    in.read(bodyBuff)
+    in.readFully(bodyBuff)
 
     val numHeaders = in.readInt()
     val headers = new java.util.HashMap[CharSequence, CharSequence]
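Why this matters: ObjectInput.read, like InputStream.read, may return after filling only part of the buffer, whereas readFully keeps reading until the whole array is populated (or throws EOFException if the stream ends early). The sketch below is only an illustration of that contract, not part of the commit; the object name, chunk sizes, and the chunked stream are made up for the example.

```scala
import java.io.{ByteArrayInputStream, DataInputStream, SequenceInputStream}

// Hypothetical illustration (not from the commit): read() vs readFully() on a
// stream whose bytes arrive in more than one chunk.
object ReadVsReadFully {
  // A 4096-byte payload delivered as two chunks of 1020 and 3076 bytes,
  // mimicking a serialized event body that is not available in a single read.
  private def chunkedStream(): DataInputStream = {
    val chunks = java.util.Arrays.asList(
      new ByteArrayInputStream(Array.fill[Byte](1020)(1)),
      new ByteArrayInputStream(Array.fill[Byte](3076)(2)))
    new DataInputStream(new SequenceInputStream(java.util.Collections.enumeration(chunks)))
  }

  def main(args: Array[String]): Unit = {
    val partial = new Array[Byte](4096)
    val n = chunkedStream().read(partial)   // may stop at the first chunk boundary (here: 1020)
    println(s"read() returned $n bytes")

    val full = new Array[Byte](4096)
    chunkedStream().readFully(full)         // loops until all 4096 bytes are read or EOF
    println(s"readFully() filled ${full.length} bytes")
  }
}
```

Because Flume caps the event body at 32K by default, filling the fixed-size buffer with readFully is sufficient here; no special handling for larger bodies is needed.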
