I checked the javacore file; it contains:

Dump Event "systhrow" (00040000) Detail "java/lang/OutOfMemoryError" "Java heap space" received
After checking the failing thread, I found that the failure occurs in the SparkFlumeEvent.readExternal() method:

    71  for (i <- 0 until numHeaders) {
    72    val keyLength = in.readInt()
    73    val keyBuff = new Array[Byte](keyLength)

At line 73 it constructs a byte array whose length is read directly from the stream, and that length can be very large. Are there any limits on the size of what Flume sends, or is there some way to adjust the heap size? Thanks.
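For what it's worth, below is a minimal sketch of the kind of guard I have in mind: reject a suspicious length before allocating the array instead of letting it blow up the heap. The object, method, constant name, and the 1 MiB cap are all my own assumptions, not part of Spark:

    import java.io.{DataInputStream, IOException}

    object HeaderReadSketch {
      // Hypothetical cap on a single header key; the value is an arbitrary guess.
      val MaxHeaderBytes: Int = 1 << 20  // 1 MiB

      def readKey(in: DataInputStream): Array[Byte] = {
        val keyLength = in.readInt()
        // Fail fast on a negative or implausibly large length instead of
        // allocating it blindly, which is what currently triggers the OOM.
        if (keyLength < 0 || keyLength > MaxHeaderBytes) {
          throw new IOException(s"Suspicious header key length: $keyLength")
        }
        val keyBuff = new Array[Byte](keyLength)
        in.readFully(keyBuff)
        keyBuff
      }
    }

On the heap side, the executor heap can be raised with spark.executor.memory (e.g. --conf spark.executor.memory=4g on spark-submit), but that would only mask the problem if the length read from the stream is actually corrupt or unbounded.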