Good to hear :)
On Wed, Dec 9, 2020 at 7:44 PM Eric Beabes wrote:
Gabor,
I waited to revert for a long time to ensure that this is working as
expected. I am VERY HAPPY to tell you that this configuration change has
fixed this issue! Not a single task has failed for over 2 weeks!
THANKS once again. Hopefully, at some point we can switch to Spark 3.0.
On Fri, Nov 20, 2020, Gabor Somogyi wrote:
Happy that saved some time for you :)
We've invested quite a lot of effort into streaming in the latest releases and
hope there will be fewer and fewer headaches like this.
On Thu, Nov 19, 2020 at 5:55 PM Eric Beabes wrote:
THANK YOU SO MUCH! Will try it out & revert.
On Thu, Nov 19, 2020 at 8:18 AM Gabor Somogyi wrote:
> "spark.kafka.producer.cache.timeout" is available since 2.2.1 which can be
> increased as a temporary workaround.
> This is not super elegant but works which gives enough time to migrate to
> Spar
"spark.kafka.producer.cache.timeout" is available since 2.2.1 which can be
increased as a temporary workaround.
This is not super elegant but works which gives enough time to migrate to
Spark 3.
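A minimal, untested sketch of applying it (assuming the job builds its own
SparkSession; the "1h" value and the app name are just examples, not
recommendations):

    // Sketch: extend the cached Kafka producer's expiry as a stopgap.
    // The default is 10 minutes; "1h" is an arbitrary example value.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("MyStreamingJob")  // hypothetical app name
      .config("spark.kafka.producer.cache.timeout", "1h")
      .getOrCreate()

    // Equivalent when submitting the job:
    // spark-submit --conf spark.kafka.producer.cache.timeout=1h ...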
On Wed, Nov 18, 2020 at 11:12 PM Eric Beabes wrote:
I must say, *Spark has let me down in this case*. I am surprised an
important issue like this hasn't been fixed in Spark 2.4.
I am fighting a battle of 'Spark Structured Streaming' vs. 'Flink' at work, and
now, because Spark 2.4 can't handle this, *I've been asked to rewrite the
code in Flink*.
BTW, we are seeing this message as well: *"org.apache.kafka.common.KafkaException:
Producer closed while send in progress"*. I am assuming this happens
because of the previous issue, "producer has been closed", right? Or are
they unrelated? Please advise. Thanks.
On Tue, Nov 10, 2020 at 11:17 AM Eric Beabes wrote:
Thanks for the reply. We are on Spark 2.4. Is there no way to get this
fixed in Spark 2.4?
On Mon, Nov 2, 2020 at 8:32 PM Jungtaek Lim wrote:
Which Spark version do you use? There's a known issue with the Kafka producer
pool in Spark 2.x which was fixed in Spark 3.0, so you may want to check
whether your case is bound to that known issue or not.
https://issues.apache.org/jira/browse/SPARK-21869
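If in doubt, a quick way to confirm the version a job actually runs on
(assuming spark is your SparkSession):

    // Prints the running Spark version, e.g. "2.4.7"
    println(spark.version)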
On Tue, Nov 3, 2020 at 1:53 AM Eric Beabes wrote: