> ----- Original message -----
> From: Mario Ds Briggs/India/IBM
> To: Cody Koeninger
> Cc: "dev@spark.apache.org"
> Subject: Re: Spark Streaming Kafka - DirectKafkaInputDStream: Using the
> new Kafka Consumer API
> Date: Mon, Dec 7, 2015 3:58 PM
>
From: Cody Koeninger
Date: 04/12/2015 08:45 pm
Subject: Re: Spark Streaming Kafka - DirectKafkaInputDStream: Using the
new Kafka Consumer API
Brute force way to do it might be to just have a separate
streaming-kafka-new-consumer subproject, or something along those lines.
On Fri, Dec 4, 2015
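[Editorial sketch: a separate subproject as suggested above would presumably sit alongside the existing external modules in the Spark build. The module name, coordinates, and versions below are illustrative assumptions, not actual Spark build files:]

```xml
<!-- Hypothetical external/kafka-new-consumer/pom.xml (illustrative only) -->
<project>
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.10</artifactId>
    <version>1.6.0-SNAPSHOT</version>
    <relativePath>../../pom.xml</relativePath>
  </parent>
  <artifactId>spark-streaming-kafka-new-consumer_2.10</artifactId>
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>${project.version}</version>
      <scope>provided</scope>
    </dependency>
    <!-- The new consumer API ships in kafka-clients as of 0.9.0 -->
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-clients</artifactId>
      <version>0.9.0.0</version>
    </dependency>
  </dependencies>
</project>
```

A separate module like this would let the existing streaming-kafka subproject keep its 0.8.x broker compatibility while the new-consumer integration evolves independently.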
> Cc: "dev@spark.apache.org"
> Date: 04/12/2015 12:15 am
> Subject: Re: Spark Streaming Kafka - DirectKafkaInputDStream: Using the
> new Kafka Consumer API
> Honestly my feeling on any new API is to wait for a point release before
> taking it seriously :)
Honestly my feeling on any new API is to wait for a point release before
taking it seriously :)
Auth and encryption seem like the only compelling reason to move, but
forcing people on kafka 8.x to upgrade their brokers is questionable.
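[Editorial note: the auth/encryption support referred to here is what the 0.9 new consumer enables purely through client configuration against a secured broker. A hedged sketch of the relevant consumer properties — host names, paths, and passwords are placeholders:]

```properties
# Illustrative consumer.properties for the Kafka 0.9 new-consumer API
bootstrap.servers=broker1:9093
security.protocol=SSL
ssl.truststore.location=/path/to/client.truststore.jks
ssl.truststore.password=changeit
# For Kerberos, use security.protocol=SASL_SSL plus a JAAS configuration instead
```

The old 0.8.x SimpleConsumer path has no equivalent knobs, which is why moving to the new API is the only route to encrypted or authenticated connections.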
On Thu, Dec 3, 2015 at 11:30 AM, Mario Ds Briggs wrote:
> H