Re: migrating offsets for old Scala consumers.

I work in the Python world, so I haven't directly used the old high-level
consumer, but from what I understand the underlying problem remains
migrating ZooKeeper offsets to the __consumer_offsets topic.

We've used a slightly modified version of Grant Henke's script for
migrating offsets here: https://github.com/apache/kafka/pull/2615
It doesn't support rolling upgrades, but other than that it's great. I've
used it for multiple migrations and am very thankful for the time Grant put
into it.

I don't know whether it's worth pulling this into core; maybe, maybe not.
But the procedure is probably worth documenting somewhere at the very
least.
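
For what it's worth, the gist of the procedure (ignoring rolling upgrades)
is: stop the old consumers, read each group's offsets out of ZooKeeper, and
commit them to __consumer_offsets under the same group id via the new
consumer. A minimal Python sketch of that idea, not Grant's actual script,
assuming kafka-python plus kazoo and placeholder group/host names:

    # Sketch: copy old Scala-consumer offsets from ZooKeeper into
    # __consumer_offsets by committing them through the new consumer.
    # GROUP, ZK_HOSTS, and BOOTSTRAP are placeholders; stop all consumers
    # in the group before running anything like this.
    from kazoo.client import KazooClient
    from kafka import KafkaConsumer
    from kafka.structs import TopicPartition, OffsetAndMetadata

    GROUP = "my-group"
    ZK_HOSTS = "zk1:2181"
    BOOTSTRAP = "broker1:9092"

    zk = KazooClient(hosts=ZK_HOSTS)
    zk.start()

    # The old consumer stores offsets at
    # /consumers/<group>/offsets/<topic>/<partition>
    offsets = {}
    base = "/consumers/%s/offsets" % GROUP
    for topic in zk.get_children(base):
        for partition in zk.get_children("%s/%s" % (base, topic)):
            data, _stat = zk.get("%s/%s/%s" % (base, topic, partition))
            tp = TopicPartition(topic, int(partition))
            offsets[tp] = OffsetAndMetadata(int(data), '')
    zk.stop()

    # Commit the collected offsets under the same group id so consumers
    # switched to the new client pick up where the old ones left off.
    consumer = KafkaConsumer(bootstrap_servers=BOOTSTRAP,
                             group_id=GROUP,
                             enable_auto_commit=False)
    consumer.assign(list(offsets))
    consumer.commit(offsets=offsets)
    consumer.close()

The real script handles more edge cases; the above is just to show why a
short maintenance window makes the problem tractable.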

Personally, I suspect that those who absolutely need a rolling migration
and cannot handle a short period of downtime while doing a migration
probably have in-house experts on Kafka who are familiar with the issues
and willing to figure out a solution. The rest of the world can generally
handle a short maintenance window.




On Fri, Nov 10, 2017 at 10:46 AM, Ismael Juma <ism...@juma.me.uk> wrote:

> Hi Gwen,
>
> A KIP has been proposed, but it is stalled:
>
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-125%3A+
> ZookeeperConsumerConnector+to+KafkaConsumer+Migration+and+Rollback
>
> Unless the interested parties pick that up, we would drop support without a
> rolling upgrade path. Users would be able to use the old consumers from
> 1.1.x for a long time. The old Scala clients don't support the message
> format introduced in 0.11.0, so the feature set is pretty much frozen and
> there's little benefit in upgrading. But there is a cost in keeping them in
> the codebase.
>
> Ismael
>
> On Fri, Nov 10, 2017 at 6:02 PM, Gwen Shapira <g...@confluent.io> wrote:
>
> > Last time we tried deprecating the Scala consumer, there were concerns
> > about a lack of upgrade path. There is no rolling upgrade, and migrating
> > offsets is not trivial (and not documented).
> >
> > Did anything change in that regard? Or are we planning on dropping
> support
> > without an upgrade path?
> >
> >
> > On Fri, Nov 10, 2017 at 5:37 PM Guozhang Wang <wangg...@gmail.com>
> wrote:
> >
> > > Thanks Ismael, the proposal looks good to me.
> > >
> > > A side note regarding: https://issues.apache.org/
> jira/browse/KAFKA-5637,
> > > could we resolve this ticket sooner than later to make clear about the
> > code
> > > deprecation and support duration when moving from 1.0.x to 2.0.x?
> > >
> > >
> > > Guozhang
> > >
> > >
> > > On Fri, Nov 10, 2017 at 3:44 AM, Ismael Juma <ism...@juma.me.uk>
> wrote:
> > >
> > > > Features for 2.0.0 will be known after 1.1.0 is released in February
> > > 2018.
> > > > We are still doing the usual time-based release process[1].
> > > >
> > > > I am raising this well ahead of time because of the potential impact
> of
> > > > removing the old Scala clients (particularly the old high-level
> > consumer)
> > > > and dropping support for Java 7. Hopefully users can then plan
> > > accordingly.
> > > > We would do these changes in trunk soon after 1.1.0 is released
> (around
> > > > February).
> > > >
> > > > I think it makes sense to complete some of the work that was not
> ready
> > in
> > > > time for 1.0.0 (Controller improvements and JBOD are two that come to
> > > mind)
> > > > in 1.1.0 (January 2018) and combined with the desire to give advance
> > > > notice, June 2018 was the logical choice.
> > > >
> > > > There is no plan to support a particular release for longer. 1.x
> versus
> > > 2.x
> > > > is no different than 0.10.x versus 0.11.x from the perspective of
> > > > supporting older releases.
> > > >
> > > > [1] https://cwiki.apache.org/confluence/display/KAFKA/Time+
> > > > Based+Release+Plan
> > > >
> > > > On Fri, Nov 10, 2017 at 11:21 AM, Jaikiran Pai <
> > jai.forums2...@gmail.com
> > > >
> > > > wrote:
> > > >
> > > > > Hi Ismael,
> > > > >
> > > > > Are there any new features other than the language specific changes
> > > that
> > > > > are being planned for 2.0.0? Also, when 2.x gets released, will the
> > 1.x
> > > > > series see continued bug fixes and releases in the community or is
> > the
> > > > plan
> > > > > to have one single main version that gets continuous updates and
> > > > releases?
> > > > >
> > > > > By the way, why June 2018? :)
> > > > >
> > > > > -Jaikiran
> > > > >
> > > > >
> > > > >
> > > > > On 09/11/17 3:14 PM, Ismael Juma wrote:
> > > > >
> > > > >> Hi all,
> > > > >>
> > > > >> I'm starting this discussion early because of the potential
> impact.
> > > > >>
> > > > >> Kafka 1.0.0 was just released and the focus was on achieving the
> > > > original
> > > > >> project vision in terms of features provided while maintaining
> > > > >> compatibility for the most part (i.e. we did not remove deprecated
> > > > >> components like the Scala clients).
> > > > >>
> > > > >> This was the right decision, in my opinion, but it's time to start
> > > > >> thinking
> > > > >> about 2.0.0, which is an opportunity for us to remove major
> > deprecated
> > > > >> components and to benefit from Java 8 language enhancements (so
> that
> > > we
> > > > >> can
> > > > >> move faster). So, I propose the following for Kafka 2.0.0:
> > > > >>
> > > > >> 1. It should be released in June 2018
> > > > >> 2. The Scala clients (Consumer, SimpleConsumer, Producer,
> > > SyncProducer)
> > > > >> will be removed
> > > > >> 3. Java 8 or higher will be required, i.e. support for Java 7 will
> > be
> > > > >> dropped.
> > > > >>
> > > > >> Thoughts?
> > > > >>
> > > > >> Ismael
> > > > >>
> > > > >>
> > > > >
> > > >
> > >
> > >
> > >
> > > --
> > > -- Guozhang
> > >
> >
>



-- 

*Jeff Widman*
jeffwidman.com <http://www.jeffwidman.com/> | 740-WIDMAN-J (943-6265)
<><
