Re: java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext

2017-03-12 Thread Lysiane Bouchard
> => {
>     val assembly = record.topic()
>     val value = record.value
>     val datatime = value.substring(datetime_idx, datetime_length - 1)
>     val level = value.substring(logLevelBeginIdx, logLevelBeginIdx + logLevelMaxLenght - 1)
>     (assembly, value, datatime, level)

Re: java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext

2017-03-11 Thread ??????????
I think the vals you defined are only valid in the driver; you can try a broadcast variable.
---Original---
From: "lk_spark"
Date: 2017/2/27 11:14:23
To: "user.spark";
Subject: java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext
hi,all: I
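A minimal sketch of that suggestion, with invented index values and socketTextStream standing in for the poster's Kafka DStream (the original fields are not shown in this snippet): broadcast the driver-side values and read only the broadcast inside the closure, so Spark never has to serialize the object holding the StreamingContext.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object BroadcastIndices {
      def main(args: Array[String]): Unit = {
        // Hypothetical app name, master, and batch interval; adjust to the real job.
        val conf = new SparkConf().setAppName("log-parser").setMaster("local[2]")
        val ssc  = new StreamingContext(conf, Seconds(5))

        // Driver-side values (invented numbers standing in for the poster's fields).
        val datetimeIdx      = 0
        val datetimeLength   = 19
        val logLevelBeginIdx = 20
        val logLevelMaxLen   = 5

        // Broadcast once from the driver; executors only read idx.value.
        val idx = ssc.sparkContext.broadcast(
          (datetimeIdx, datetimeLength, logLevelBeginIdx, logLevelMaxLen))

        // socketTextStream stands in for the Kafka DStream of log lines.
        val lines = ssc.socketTextStream("localhost", 9999)

        lines.map { value =>
          val (dIdx, dLen, lIdx, lLen) = idx.value
          (value,
           value.substring(dIdx, dLen - 1),
           value.substring(lIdx, lIdx + lLen - 1))
        }.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }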

java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext

2017-02-26 Thread lk_spark
g(logLevelBeginIdx, logLevelBeginIdx + logLevelMaxLenght - 1)
    (assembly, value, datatime, level)
})
I get this error:
Caused by: java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext
Serialization stack:
    - object not serializable (class: org.apache.spar
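For context, a common cause of this exact stack trace is that referencing a class field inside the closure captures the enclosing instance, which drags the non-serializable StreamingContext into the task. A minimal self-contained sketch with invented names (not the poster's actual job) showing the problem and the usual local-val fix:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    class LogJob(conf: SparkConf) {
      val ssc = new StreamingContext(conf, Seconds(5))  // not serializable
      val logLevelBeginIdx = 20                         // driver-side config field

      // Referencing the field inside the closure compiles to this.logLevelBeginIdx,
      // so the whole LogJob (including ssc) would have to be serialized.
      def bad() =
        ssc.socketTextStream("localhost", 9999)
          .map(line => line.substring(logLevelBeginIdx))

      // Copying the field into a local val first means the closure captures only an Int.
      def good() = {
        val idx = logLevelBeginIdx
        ssc.socketTextStream("localhost", 9999)
          .map(line => line.substring(idx))
      }
    }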

Re: java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext

2015-01-23 Thread Sean Owen
Heh, this question keeps coming up. You can't use a context or RDD inside a distributed operation, only from the driver. Here you're trying to call textFile from within foreachPartition.
On Fri, Jan 23, 2015 at 10:59 AM, Nishant Patel wrote:
> Below is code I have written. I am getting NotSeriali
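A minimal sketch of the point being made, with a hypothetical lookup file path and socketTextStream standing in for the actual input stream: read the side file once on the driver, broadcast its contents, and use only the broadcast value inside foreachPartition.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("lookup-example").setMaster("local[2]")  // hypothetical
    val ssc  = new StreamingContext(conf, Seconds(10))
    val sc   = ssc.sparkContext

    // Wrong: sc / ssc exist only on the driver, so calling sc.textFile inside
    // foreachPartition fails with exactly this NotSerializableException.
    // Right: read the side file once on the driver and broadcast its contents.
    val lookup = sc.broadcast(sc.textFile("/data/lookup.txt").collect().toSet)  // hypothetical path

    val stream = ssc.socketTextStream("localhost", 9999)
    stream.foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        // Only the broadcast value is used here; no driver-side context is captured.
        records.filter(lookup.value.contains).foreach(println)
      }
    }

    ssc.start()
    ssc.awaitTermination()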

java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext

2015-01-23 Thread Nishant Patel
Below is the code I have written. I am getting a NotSerializableException. How can I handle this scenario?

kafkaStream.foreachRDD(rdd => {
    println("")
    rdd.foreachPartition(partitionOfRecords => {
        partitionOfRecords.foreach( record => {
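The snippet is cut off above, so the rest of the code is unknown; as a hedged illustration (hypothetical JDBC URL and table name), this is the pattern the foreachRDD / foreachPartition nesting is normally used for: create non-serializable resources per partition on the executor and avoid referencing driver-only objects such as the StreamingContext inside the closure.

    import java.sql.DriverManager

    kafkaStream.foreachRDD { rdd =>
      rdd.foreachPartition { partitionOfRecords =>
        // Opened on the executor, once per partition, so nothing from the driver
        // (such as the StreamingContext) needs to be serialized into this closure.
        val conn = DriverManager.getConnection("jdbc:h2:mem:demo")        // hypothetical URL
        val stmt = conn.prepareStatement("INSERT INTO logs VALUES (?)")   // hypothetical table
        partitionOfRecords.foreach { record =>
          stmt.setString(1, record.toString)
          stmt.executeUpdate()
        }
        conn.close()
      }
    }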