Currently figuring out what is responsible for a regression I am seeing in
some user-code ScalaUDFs that make use of Timestamps: a NULL from a CSV file
read in via TestHive#registerTestTable is now producing
1969-12-31 23:59:59.99 instead of null.
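For context, a minimal sketch (not the reporter's actual code; the UDF,
table, and column names are made up) of the kind of ScalaUDF-over-Timestamp
usage being described, against the Spark 1.x DataFrame API:

import java.sql.Timestamp
import org.apache.spark.sql.functions.{col, udf}

// A UDF over a Timestamp column, written to pass nulls through.
// With the regression described above, the CSV NULL reportedly arrives as a
// value just before the epoch (1969-12-31 23:59:59.99) instead of null.
val formatTs = udf { ts: Timestamp =>
  if (ts == null) null else ts.toString
}

// Hypothetical usage against a table registered through
// TestHive#registerTestTable:
//   hiveContext.table("test_events")
//     .select(formatTs(col("event_time")))
//     .show()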
On Thu, Dec 3, 2015 at 1:57
A brute-force way to do it might be to just have a separate
streaming-kafka-new-consumer subproject, or something along those lines.
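Roughly what that could look like in sbt terms; this is an illustrative
sketch, not Spark's actual build definition, and the paths and artifact
names beyond the one suggested above are made up:

// The existing module stays on the 0.8 client; a parallel module targets
// the new consumer API (in kafka-clients from 0.9.0 onward), so users pick
// the artifact that matches their brokers.
lazy val streamingKafka = (project in file("external/kafka"))
  .settings(
    name := "spark-streaming-kafka",
    libraryDependencies += "org.apache.kafka" %% "kafka" % "0.8.2.1"
  )

lazy val streamingKafkaNewConsumer = (project in file("external/kafka-new-consumer"))
  .settings(
    name := "spark-streaming-kafka-new-consumer",
    libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.9.0.0"
  )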
On Fri, Dec 4, 2015 at 3:12 AM, Mario Ds Briggs
wrote:
> >>
> forcing people on kafka 8.x to upgrade their brokers is questionable.
> <<
>
> I agree and I was more thinking maybe there is a way to support both for a
> period of time (of course it means some more code to maintain :-)).
I think one problem is that the assembly, by its nature, includes particular
versions of a bunch of dependencies. Only one can be published, but it would
be unlikely to be the right flavor of assembly for any given user.
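To illustrate the "right flavor" point: since no single published assembly
can match everyone's Kafka setup, a user's own build would pin the
streaming-kafka artifact it needs. A hedged sketch (versions are
illustrative, and the new-consumer artifact is hypothetical):

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided",
  // Pick the connector matching the brokers in use:
  "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2"
  // ... or, if a separate new-consumer module existed, something like:
  // "org.apache.spark" %% "spark-streaming-kafka-new-consumer" % "x.y.z"
)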
On Fri, Dec 4, 2015 at 1:27 AM, Matt Cheah wrote:
> Hi everyone,
>
> A very brief
To be clearer, I don't think it's clear yet whether a 1.7 release
should exist or not. I could see both making sense. It's also not
really necessary to decide now, well before a 1.6 is even out in the
field. Deleting the version lost information, and I would not have
done that given my reply. Reyn
>>
forcing people on kafka 8.x to upgrade their brokers is questionable.
<<
I agree and I was more thinking maybe there is a way to support both for a
period of time (of course it means some more code to maintain :-)).
thanks
Mario
From: Cody Koeninger
To: Mario Ds Briggs/India/IBM@IBMI