Hi Vaibhav,
As you said, from the second link I can figure out that it is not able to
cast the class when it is trying to read from the checkpoint. Can you try
an explicit cast like asInstanceOf[T] on the broadcast value?
From the bug, it looks like it affects version 1.5. Try the sample wordcount example.
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hello,
I have tried the Direct API, but I am getting an error which is being
tracked here: https://issues.apache.org/jira/browse/SPARK-5594
I also tried the Receiver approach with Write Ahead Logs; then this issue
comes up:
https://issues.apache.org/jira/browse/SPARK-12407
In both cases it see
Regarding the Spark Streaming receiver - can't you just use the Kafka direct
stream with checkpoints? Then when you restart your application it will
read from where it last stopped and continue from there.
Regarding limiting the number of messages - you can do that by setting
spark.streaming.receiver.maxRate
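The restart-and-resume pattern described above can be sketched as below, using the Spark 1.x streaming API. This is a sketch under assumptions, not a drop-in program: the topic name, broker address, checkpoint directory, and batch interval are all placeholders, and it needs the spark-streaming-kafka artifact on the classpath and a running cluster.

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object DirectStreamRestart {
  // Hypothetical checkpoint location; must survive application restarts.
  val checkpointDir = "hdfs:///tmp/checkpoint"

  def createContext(): StreamingContext = {
    val conf = new SparkConf().setAppName("kafka-direct-restart")
    val ssc = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint(checkpointDir)

    // Placeholder broker and topic.
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
    val stream = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("mytopic"))

    stream.map(_._2).count().print()
    ssc
  }

  def main(args: Array[String]): Unit = {
    // On a fresh start this calls createContext(); on restart it rebuilds
    // the context, including Kafka offsets, from the checkpoint, so the
    // stream continues from where it last stopped.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}
```

The key design point is that all stream setup happens inside the function passed to getOrCreate; setup done outside it is silently skipped on recovery, which is a common source of confusion with checkpointed applications.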
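For reference, the receiver rate limit mentioned above is set like any other Spark conf, for example at submit time (the value is illustrative):

```shell
spark-submit \
  --conf spark.streaming.receiver.maxRate=1000 \
  ...
```

This caps the number of records per second that each receiver will accept.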
Hi Vaibhav,
Please try the Kafka direct API approach. Is this not working?
-- Padma Ch
On Tue, Feb 23, 2016 at 12:36 AM, vaibhavrtk1 [via Apache Spark User List] <
ml-node+s1001560n26291...@n3.nabble.com> wrote:
> Hi
>
> I am using kafka with spark streaming 1.3.0 . When the spark applicati
The direct stream will let you do both of those things. Is there a reason
you want to use receivers?
http://spark.apache.org/docs/latest/streaming-kafka-integration.html
http://spark.apache.org/docs/latest/configuration.html#spark-streaming
Look for spark.streaming.kafka.maxRatePerPartition.
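The direct-stream equivalent of the receiver rate limit is set the same way (the value is illustrative):

```shell
spark-submit \
  --conf spark.streaming.kafka.maxRatePerPartition=1000 \
  ...
```

Unlike spark.streaming.receiver.maxRate, this one is applied per Kafka partition, so the effective batch cap is the value times the number of partitions.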
On Mon, Feb 22, 2016 at