You should be able to cast the object to the real underlying type:
GenericRecord if the decoder is in generic mode (the default), or the
generated class if it is in specific mode. The underlying implementation of
KafkaAvroDecoder seems to return one or the other depending on a config switch:
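As a quick sketch of that cast (the schema and field name below are illustrative, not from the thread):

```scala
import org.apache.avro.generic.GenericRecord

// The decoder hands back Object; what it really is depends on the decoder's
// configuration:
//   - generic mode (the default): a GenericRecord
//   - specific mode: an instance of the class generated from the schema
def asGeneric(value: Object): GenericRecord =
  value.asInstanceOf[GenericRecord]

// In specific mode you would instead cast to the generated class, e.g.
//   val user = value.asInstanceOf[User]  // "User": hypothetical generated class
```

Once you have a GenericRecord, individual fields are available via `get(fieldName)`.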
https://github.com/co
I got it working by using jsonRDD. This is what I had to do to make it work:
val messages = KafkaUtils.createDirectStream[Object, Object,
  KafkaAvroDecoder, KafkaAvroDecoder](ssc, kafkaParams, topicsSet)
val lines = messages.map(_._2.toString)
lines.foreachRDD(jsonRDD =>
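The `foreachRDD` call above is cut off, but presumably each micro-batch of JSON strings is handed to Spark SQL. A standalone sketch of that step (Spark 1.x API; the field name is illustrative):

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{DataFrame, SQLContext}

// Turn an RDD of JSON strings (one per Kafka message) into a DataFrame,
// letting Spark SQL infer the schema from the JSON itself.
def jsonToDataFrame(sqlContext: SQLContext, json: RDD[String]): DataFrame =
  sqlContext.read.json(json)
```

Inside the streaming job this would be called from the `foreachRDD` body, after which the DataFrame can be queried or written out as usual.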
You can use `DStream.map` to transform objects to anything you want.
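Concretely, that map can cast each decoded value and pull out one field (a sketch; assumes generic mode, and "user_id" is an illustrative field name):

```scala
import org.apache.avro.generic.GenericRecord
import org.apache.spark.streaming.dstream.DStream

// Extract a single field from a decoded value. The decoder returns Object,
// so cast to GenericRecord (generic mode, the default).
def field(value: Object, name: String): String =
  value.asInstanceOf[GenericRecord].get(name).toString

// Plug it into DStream.map to turn the raw (key, value) stream into a
// stream of field values.
def toField(messages: DStream[(Object, Object)], name: String): DStream[String] =
  messages.map { case (_, v) => field(v, name) }
```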
On Thu, Feb 25, 2016 at 11:06 AM, Mohammad Tariq wrote:
> Hi group,
>
> I have just started working with confluent platform and spark streaming,
> and was wondering if it is possible to access individual fields from an
> Avro object read from a kafka topic through spark streaming.
Hi group,
I have just started working with confluent platform and spark streaming,
and was wondering if it is possible to access individual fields from an
Avro object read from a kafka topic through spark streaming. As per its
default behaviour *KafkaUtils.createDirectStream[Object, Object,
KafkaAvroDecoder, KafkaAvroDecoder]*