Hi Guys,
We need to do some state checkpointing (an RDD that's updated using
updateStateByKey). We would like finer control over the serialization.
Also, this would allow us to do schema evolution in the deserialization
code when we need to modify the structure of the classes associated with
the state.
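For concreteness, here is a rough sketch of the kind of custom serialization we have in mind; the SessionState class, its fields, and the version handling are all hypothetical, not code from our job:

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical state class: writing an explicit format version lets newer
// code read state that was checkpointed by an older version of the class.
public class SessionState implements Serializable {
    private static final long serialVersionUID = 1L;
    private static final int FORMAT_VERSION = 2;

    private long count;        // present since version 1
    private String lastEvent;  // added in version 2

    private void writeObject(ObjectOutputStream oos) throws IOException {
        oos.writeInt(FORMAT_VERSION);
        oos.writeLong(count);
        oos.writeUTF(lastEvent == null ? "" : lastEvent);
    }

    private void readObject(ObjectInputStream ois) throws IOException {
        int version = ois.readInt();
        count = ois.readLong();
        // Older checkpoints lack the new field; fall back to a default.
        lastEvent = version >= 2 ? ois.readUTF() : "";
    }
}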
There is one for the key of your Kafka message and one for its value.
On 26 Jun 2015 4:21 pm, "Ashish Soni" wrote:
> My question is: why are there two similar parameters, String.class and
> StringDecoder.class? What is the difference between them?
>
> Ashish
>
On Fri, Jun 26, 2015 at 8:53 AM, Akhil Das wrote:
import kafka.serializer.StringDecoder;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.kafka.KafkaUtils;

JavaPairInputDStream<String, String> messages =
    KafkaUtils.createDirectStream(
        jssc,
        String.class,
        String.class,
        StringDecoder.class,
        StringDecoder.class,
        kafkaParams,
        topicsSet
    );
Here:
jssc => JavaStreamingContext
String.class => key and value classes (the messages are read as plain strings)
StringDecoder.class => key and value decoder classes (one for the key, one for the value)
kafkaParams => Map of Kafka settings, e.g. metadata.broker.list
topicsSet => Set of topic names to subscribe to
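To see all the arguments in context, here is a minimal, hypothetical end-to-end sketch; the broker address and topic name are placeholders, not from this thread:

import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import kafka.serializer.StringDecoder;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class DirectStreamExample {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("direct-stream-example");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

        Map<String, String> kafkaParams = new HashMap<String, String>();
        kafkaParams.put("metadata.broker.list", "localhost:9092"); // placeholder broker

        Set<String> topicsSet = new HashSet<String>(Arrays.asList("my-topic")); // placeholder topic

        JavaPairInputDStream<String, String> messages = KafkaUtils.createDirectStream(
                jssc,
                String.class,          // key class
                String.class,          // value class
                StringDecoder.class,   // key decoder
                StringDecoder.class,   // value decoder
                kafkaParams,
                topicsSet);

        // Drop the keys and print a few decoded values per batch.
        messages.map(tuple -> tuple._2()).print();

        jssc.start();
        jssc.awaitTermination();
    }
}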
Hi,
If I have the below data format, how can I use the Kafka direct stream to
de-serialize it? I am not able to understand all the parameters I need to
pass. Can someone explain what the arguments should be, as I am not clear
about this?
JavaPairInputDStream<K, V> org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(…)
Hi,
I have a non-serializable class, and as a workaround I'm trying to
re-instantiate it at each deserialization. Thus, I created a wrapper class
and overrode the writeObject and readObject methods as follows:
> private def writeObject(oos: ObjectOutputStream) {
>   oos.defaultWriteObject()
> }
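In case it helps clarify what I mean, here is a self-contained sketch of the re-instantiation pattern, shown in Java with a hypothetical NonSerializableHelper standing in for my real class:

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical wrapper that rebuilds its non-serializable member on deserialization.
public class HelperWrapper implements Serializable {
    private static final long serialVersionUID = 1L;

    // Marked transient so default serialization skips it.
    private transient NonSerializableHelper helper = new NonSerializableHelper();

    public NonSerializableHelper helper() {
        return helper;
    }

    private void writeObject(ObjectOutputStream oos) throws IOException {
        oos.defaultWriteObject(); // writes only the non-transient fields
    }

    private void readObject(ObjectInputStream ois) throws IOException, ClassNotFoundException {
        ois.defaultReadObject();
        helper = new NonSerializableHelper(); // re-instantiate on every deserialization
    }
}

// Stand-in for the real non-serializable class.
class NonSerializableHelper {
    NonSerializableHelper() {
        // imagine expensive, non-serializable setup here (sockets, native handles, ...)
    }
}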