Never mind, one of my peers corrected the driver program for me - all DStream
operations need to be defined within the scope of the getOrCreate API.
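
For anyone hitting the same thing, here is a minimal sketch of the corrected pattern (the object name, checkpoint path, broker address, and topic below are illustrative, not from the original program):

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object RecoverableKafkaJob {
  val checkpointDir = "/tmp/checkpoint"  // illustrative path

  // Everything that builds the DStream graph lives inside this function,
  // so it runs on a fresh start and is replayed from the checkpoint on recovery.
  def createContext(): StreamingContext = {
    val conf = new SparkConf().setAppName("RecoverableKafkaJob")
    val ssc = new StreamingContext(conf, Seconds(2))
    ssc.checkpoint(checkpointDir)

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("test"))
    stream.map(_._2).count().print()

    ssc
  }

  def main(args: Array[String]): Unit = {
    // On recovery, getOrCreate rebuilds the whole graph from the checkpoint;
    // DStreams created outside createContext are not part of that graph and
    // never get zeroTime set, which is what triggers the exception below.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}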
On Wed, Dec 9, 2015 at 3:32 PM, Renyi Xiong wrote:
The following Scala program throws the same exception. I know people are running
streaming jobs against Kafka, so I must be missing something. Any idea why?
package org.apache.spark.streaming.api.csharp

import java.util.HashMap

import kafka.serializer.{DefaultDecoder, Decoder, StringDecoder}
import org.ap  // the remaining imports and the program body were cut off in the archive
hi,
I hit the following exception when the driver program tried to recover from a
checkpoint. It looks like the recovery logic relies on zeroTime being set, which
doesn't seem to happen here. Am I missing something, or is it a bug in 1.4.1?
org.apache.spark.SparkException:
org.apache.spark.streaming.api.csharp.CSharp
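
For reference, the check that throws this SparkException appears to be in org.apache.spark.streaming.dstream.DStream; paraphrased from the 1.4 source (log statements omitted), it matches the symptom above, zeroTime never being set for the recovered DStream:

// paraphrased from DStream.scala in Spark 1.4
private[streaming] def isInitialized = zeroTime != null

private[streaming] def isTimeValid(time: Time): Boolean = {
  if (!isInitialized) {
    // thrown when a DStream was not part of the graph restored from checkpoint
    throw new SparkException(this + " has not been initialized")
  } else if (time <= zeroTime || !(time - zeroTime).isMultipleOf(slideDuration)) {
    false
  } else {
    true
  }
}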