I tried 21.0.7, but it still had the segfault problem in one of the
indexing module unit tests - https://github.com/apache/druid/pull/18128 -
so I ended up closing it.
On 2025/06/17 19:17:09 Gian Merlino wrote:
> I haven't tried 21.0.7 but IIRC 21.0.6 still had the problem. We should
> figure
It is more aggressive than originally discussed, but only by one release. I
think at this point, given the EOL of Jetty 9, we should be upgrading to Jetty
12 (and therefore raising our min Java version to Java 17) as fast as possible.
Gian
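Raising the minimum Java version is commonly enforced with a small startup
guard so that a too-old JVM fails fast with a clear message instead of a
confusing `UnsupportedClassVersionError` later. A minimal sketch follows; the
`MinJavaCheck` class and its names are hypothetical and are not Druid's actual
code. It relies on `Runtime.version().feature()`, available since Java 10:

```java
// Hypothetical startup guard for a project raising its minimum Java version.
// Not Druid's actual code; illustrative only.
public final class MinJavaCheck
{
    // Minimum Java feature version required (17 here, per the proposal).
    private static final int MIN_JAVA_FEATURE = 17;

    /** Returns true when the running JVM meets the given feature version. */
    public static boolean meetsMinimum(int required)
    {
        return Runtime.version().feature() >= required;
    }

    public static void main(String[] args)
    {
        if (!meetsMinimum(MIN_JAVA_FEATURE)) {
            System.err.println(
                "Java " + MIN_JAVA_FEATURE + " or newer is required, but this JVM is Java "
                + Runtime.version().feature()
            );
            System.exit(1);
        }
        System.out.println("Java version OK: " + Runtime.version().feature());
    }
}
```

Failing at startup like this is what makes a version bump a hard floor rather
than a soft recommendation, which matters once dependencies such as Jetty 12
ship Java 17 bytecode.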
On 2025/06/17 21:41:29 Clint Wylie wrote:
> re Druid 35
"We do not want to be in a boat where a CVE in Jetty requires us to upgrade
to Jetty 12 for a patched release."
Big +1 to Karan's point quoted above. I think most folks in the community
would rather know they have to move to Java 17 for Druid 35 and have X
months to prepare versus having to move une
TBH I also think Druid 35 is a better candidate because Jetty 9 is
close to 9 years old and is EOL. We do not want to be in a boat where a CVE
in Jetty requires us to upgrade to Jetty 12 for a patched release.
Also, we want to move to Kafka 4.0 clients sooner rather than later. Kafka 4
requires d
re Druid 35 - since Hadoop doesn't support Java 17 yet, I think that
means we would have to drop that too. I'm on board, but wondering
if that is too aggressive?
On Tue, Jun 17, 2025 at 2:15 PM Gian Merlino wrote:
>
> Actually, I wonder if Druid 35 would be a better time to drop Java 11. It'
Actually, I wonder if Druid 35 would be a better time to drop Java 11. It's a
little sooner, but, there are reasons to do this earlier because of Jetty 9
being EOL. It's EOL as of this year. If we need any security fixes they will
only be available in Jetty 12, which requires Java 17. We could t
This sounds good to me.
On 2025/06/09 20:11:41 Clint Wylie wrote:
> Following up on this, I want to propose the first release of 2026 for
> removal, which I think would be Druid 36, to give some lead time for
> those affected to prepare (which is the same timeline I proposed for
> Hadoop removal).
I haven't tried 21.0.7 but IIRC 21.0.6 still had the problem. We should figure
something out here, since I think it's one of the last items stopping us from
fully supporting Java 21.
On 2025/06/12 02:53:13 Abhishek Balaji Radhakrishnan wrote:
> Dropping support for Java 11 sounds good to me.
>
Removing support for JDK 11 also looks good to me.
On Thu, Jun 12, 2025 at 1:49 PM Abhishek Balaji Radhakrishnan <
abhishe...@apache.org> wrote:
> Dropping support for Java 11 sounds good to me.
>
> By the way, once Java 11 is dropped, we’ll only have one officially
> supported version, Java 17:
Dropping support for Java 11 sounds good to me.
By the way, once Java 11 is dropped, we’ll only have one officially supported
version, Java 17:
https://druid.apache.org/docs/latest/operations/java/#selecting-a-java-runtime
I think there were some segfaults observed in Java 21 that it's been pinn
Following up on this, I want to propose the first release of 2026 for
removal, which I think would be Druid 36, to give some lead time for
those affected to prepare (which is the same timeline I proposed for
Hadoop removal).
On Fri, Dec 20, 2024 at 1:39 AM Clint Wylie wrote:
>
> I guess we need t
I guess we need to add this to the pile of reasons to drop Java 11:
https://lists.apache.org/thread/y35cxlj90hwx6cv3kds9j8yqnmqgcczv, which,
if I understand correctly, means DataSketches is only doing new
development on Java 17, with older versions only getting fixes.
On Tue, Dec 17, 2024 at 10:36 PM
Oh, good point. I agree then that we should drop Hadoop support. It should
be alarming enough for Hadoop users that it still doesn't support Java 17
while many big data projects have either dropped or are considering dropping
support for Java 11. We will never see zero Hadoop usage in the community.
Wh
Regarding Hadoop: if core Druid code starts requiring Java 17, we might run
into issues with running that core Druid code inside the remote Hadoop M/R
processes. People would need to update their YARN runners to Java 17. And given
Hadoop doesn't officially support Java 17 yet, this might cause p
Do we really need to wait for the Hadoop runtime to support Java 17 if the
Hadoop client jars themselves can be used in a JDK 17 runtime? Spark dropped
support for Java 11, but I think Spark jobs can still use Hadoop client
code. So I am not sure that Hadoop is really a blocker for us to move off
Java 11.
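The Hadoop-client point can be made concrete: Hadoop publishes shaded client
artifacts (`hadoop-client-api` and `hadoop-client-runtime`) that bundle and
relocate Hadoop's transitive dependencies, so an application on a newer JDK
can talk to a cluster without pulling in Hadoop's full (and older) classpath.
A hedged sketch of depending on them in Maven; the version number here is
illustrative, not a recommendation from this thread:

```xml
<!-- Illustrative Maven dependencies only; version is an example. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-api</artifactId>
  <version>3.4.1</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-runtime</artifactId>
  <version>3.4.1</version>
  <scope>runtime</scope>
</dependency>
```

This is the distinction being drawn above: the JVM running the client code is
separate from the JVMs running the remote YARN/M-R processes, and only the
latter are constrained by what the Hadoop runtime officially supports.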