Hi Spark community,
I have a quick question. I am planning to migrate from Spark 3.0.1 to Spark
3.2.
Do I need to recompile my application with the 3.2 dependencies, or will an
application compiled against 3.0.1 work fine on 3.2?
Regards
Pralabh kumar
(Don't cross post please)
Generally you definitely want to compile and test against what you're running on.
There shouldn't be many binary or source incompatibilities -- these are
avoided in a minor release where possible -- so it may need no code change.
But I would certainly recompile just on principle!
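For illustration, a minimal sketch of what the dependency bump might look like in an sbt build; the module list, the Scala version (2.12), and the 3.2.1 patch version are placeholders for whatever your application actually uses:

// Hypothetical build.sbt fragment: bump the Spark dependency line from
// 3.0.1 to a 3.2.x release and rebuild the application against it.
ThisBuild / scalaVersion := "2.12.15"

val sparkVersion = "3.2.1"  // was "3.0.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)

Keeping the Spark artifacts at "provided" scope assumes the cluster supplies Spark at runtime, which is the usual setup for spark-submit deployments.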
Hi Dongjoon,
Raised the JIRA at https://issues.apache.org/jira/browse/SPARK-38824
Thanks,
Souvik
From: Dongjoon Hyun
Sent: Wednesday, March 30, 2022 4:44 AM
To: Paul, Souvik [Engineering]
Cc: dev@spark.apache.org
Subject: Re: Probable bug in async commit of Kafka offset in
DirectKafkaInputDStream
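For context, here is a minimal sketch of the standard async offset commit pattern from the Kafka 0-10 direct stream integration that the subject refers to; the topic, group id, bootstrap servers, and batch interval are placeholders, and this only illustrates the API in question rather than reproducing the bug tracked in SPARK-38824:

// Standard commitAsync usage with DirectKafkaInputDStream (Kafka 0-10 integration).
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._

object CommitAsyncSketch {
  def main(args: Array[String]): Unit = {
    // local[2] is just for trying this out on a laptop; drop it when using spark-submit.
    val conf = new SparkConf().setAppName("commit-async-sketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "example-group",
      // Offsets are committed back to Kafka by commitAsync, not by the consumer itself.
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("example-topic"), kafkaParams)
    )

    stream.foreachRDD { rdd =>
      // Capture the offset ranges before any transformations on the RDD.
      val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
      // ... process the batch here ...
      // Commit the processed offsets back to Kafka asynchronously.
      stream.asInstanceOf[CanCommitOffsets].commitAsync(offsetRanges)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}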