Jay, thanks for the response.
Regarding the new consumer API for 0.9, I've been reading through the code
for it and thinking about how it fits into the existing Spark integration.
So far I've seen some interesting challenges, and if you (or anyone else on
the dev list) have time to provide some h
The 0.9 release still has the old consumer, as Jay mentioned, but this
specific release is a little unusual in that it also provides a completely
new consumer client.
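For anyone who hasn't looked at it yet, the new client's basic shape is a single poll loop. This is just a minimal sketch against the 0.9 `KafkaConsumer` API; the broker address, group id, and topic name here are placeholders, not anything from this thread:

```java
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class NewConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder broker address
        props.put("group.id", "example-group");            // placeholder group id
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("example-topic")); // placeholder topic
        try {
            while (true) {
                // poll() drives everything: fetching, group coordination, heartbeats
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records)
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
            }
        } finally {
            consumer.close();
        }
    }
}
```

One consequence for integrations: because group management now happens inside poll(), the consumer is single-threaded by design, which is part of what makes mapping it onto Spark's execution model interesting.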
> Based on what I understand, users of Kafka need to upgrade their brokers to
> Kafka 0.9.x first, before they upgrade their clients to
Thanks, Jay. Yeah, if we were able to use the old consumer API from 0.9
clients to work with 0.8 brokers, that would have been super helpful here. I
am just trying to avoid a scenario where Spark cares about new features
from every new major release of Kafka (which is a good thing) but ends up
having
Hey, yeah, we'd really like to make this work well for you guys.
I think there are actually maybe two questions here:
1. How should this work in steady state?
2. Given that there was a major reworking of the Kafka consumer Java
library for 0.9, how does that impact things right now? (
http://www.co
Hi Kafka devs,
I come to you with a dilemma and a request.
Based on what I understand, users of Kafka need to upgrade their brokers to
Kafka 0.9.x first, before they upgrade their clients to Kafka 0.9.x.
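That ordering matches the documented rolling-upgrade procedure: brokers can run the new code while still speaking the older inter-broker protocol, then be switched over once every broker is upgraded. A sketch of the broker-side setting (`server.properties`), assuming an 0.8.2 starting point:

```properties
# During the rolling upgrade: run 0.9.0.x code but keep the old protocol
# until every broker in the cluster has been upgraded.
inter.broker.protocol.version=0.8.2.X

# After all brokers are on 0.9.0.x, bump the protocol and restart one by one:
# inter.broker.protocol.version=0.9.0.0
```

Clients, by contrast, have no such compatibility knob in this direction, which is exactly the dilemma below.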
However, that presents a problem for other projects that integrate with
Kafka (Spark, Flume, S