Similarly, a JavaDStream has a dstream() method you can call to get the
underlying DStream.
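Putting the two pieces together, here is a minimal sketch of the unwrap-and-commit pattern with the 0-10 direct stream. This is illustrative only: it assumes `directKafkaStream` is a `JavaInputDStream<ConsumerRecord<String, String>>` created elsewhere via `KafkaUtils.createDirectStream(...)`, and it is a fragment, not a complete program:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.kafka010.CanCommitOffsets;
import org.apache.spark.streaming.kafka010.HasOffsetRanges;
import org.apache.spark.streaming.kafka010.OffsetRange;

// directKafkaStream: JavaInputDStream<ConsumerRecord<String, String>>,
// assumed to be created via KafkaUtils.createDirectStream(...)
directKafkaStream.foreachRDD(rdd -> {
    // rdd is a JavaRDD wrapping the Scala KafkaRDD; only the unwrapped
    // RDD implements HasOffsetRanges, hence the extra rdd.rdd() call.
    OffsetRange[] offsets = ((HasOffsetRanges) rdd.rdd()).offsetRanges();

    // ... process the records in rdd here ...

    // Likewise, cast the underlying input DStream (not the Java wrapper)
    // to CanCommitOffsets before committing the offsets back to Kafka.
    ((CanCommitOffsets) directKafkaStream.inputDStream()).commitAsync(offsets);
});
```

The key point in both casts is the same: the Kafka-specific interfaces are implemented by the underlying Scala objects, so the Java wrappers must be unwrapped first.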
On Oct 11, 2016 2:54 AM, "static-max" wrote:
Hi Cody,
thanks, rdd.rdd() did the trick. I now have the offsets via
OffsetRange[] offsets = ((HasOffsetRanges) rdd.rdd()).offsetRanges();
But how can you commit the offset to Kafka?
Casting the JavaInputDStream throws a ClassCastException (CCE):
CanCommitOffsets cco = (CanCommitOffsets) directKafkaStream; // throws ClassCastException
This should give you hints on the necessary cast:
http://spark.apache.org/docs/latest/streaming-kafka-0-8-integration.html#tab_java_2
The main ugly thing there is that the Java RDD is wrapping the Scala
RDD, so you need to unwrap one layer via rdd.rdd().
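For example, a sketch of reading the per-partition offset ranges after the unwrap (assuming `directKafkaStream` is the `JavaInputDStream` from the 0-10 direct stream API; the print format is just illustrative):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.spark.streaming.kafka010.HasOffsetRanges;
import org.apache.spark.streaming.kafka010.OffsetRange;

directKafkaStream.foreachRDD(rdd -> {
    // rdd.rdd() unwraps the JavaRDD to the underlying Scala RDD,
    // which is the KafkaRDD that implements HasOffsetRanges.
    OffsetRange[] offsets = ((HasOffsetRanges) rdd.rdd()).offsetRanges();
    for (OffsetRange o : offsets) {
        System.out.println(o.topic() + " partition " + o.partition()
            + " offsets [" + o.fromOffset() + ", " + o.untilOffset() + ")");
    }
});
```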
If anyone wants to work on a PR to update the documentation with Java
examples, that would be welcome.
Hi,
by following this article I managed to consume messages from Kafka 0.10 in
Spark 2.0:
http://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html
However, the Java examples are missing and I would like to commit the
offsets myself after processing the RDD. Does anybody have a working example?