How/what tools can we use to monitor direct memory usage?
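For example, would polling the JVM's buffer pool MXBeans be a reasonable way to watch it? A rough sketch of what I have in mind (the class name is just illustrative):

import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

public class DirectMemoryProbe {
    public static void main(String[] args) {
        // Each pool ("direct", "mapped") reports how much native memory its
        // buffers currently hold and how many buffers are alive.
        for (BufferPoolMXBean pool :
                ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            System.out.printf("%s: count=%d, used=%d bytes, capacity=%d bytes%n",
                    pool.getName(), pool.getCount(),
                    pool.getMemoryUsed(), pool.getTotalCapacity());
        }
    }
}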
On Thu, Aug 29, 2024 at 8:00 AM John Smith wrote:
Also, linger and batch are producer settings; we are getting this error on consumers. In fact, we don't use Kafka as a sink whatsoever in D-Link.
On Thu, Aug 29, 2024, 8:46 AM John Smith wrote:
Maybe the change in direct memory allocation in Java 11 did this?
Java 8: by default, the amount of native memory used for Direct Byte Buffers is limited to 87.5% of the maximum heap size.
Java 11: by default, the amount of native memory used for Direct Byte Buffers is limited to the maximum heap size.
The exact same task/code and the exact same version of Flink had no issues before. The only things that changed: we deployed Flink on Java 11, added more memory to the config, and increased the parallelism of the Kafka source.
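If that default change is the cause, I assume we could take it out of the equation by pinning the limit explicitly on the task managers, e.g. in flink-conf.yaml (the value is purely illustrative):

env.java.opts.taskmanager: "-XX:MaxDirectMemorySize=2g"

though as far as I understand Flink normally derives the task manager's -XX:MaxDirectMemorySize from the taskmanager.memory.* off-heap and network settings anyway.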
On Fri, Aug 23, 2024, 3:46 PM John Smith wrote:
Online resources, including my previous question about this problem, said there was some client bug when connecting to an SSL broker that caused memory issues. As far as memory setup goes, I have the following...
Here is the link, and there's a link to a JIRA...
https://stackoverflow.com/questions/64697973/java-lang-
Hi John,
I've experienced this issue recently; it's likely caused by either:
- the size of the producer record batch, which can be reduced by configuring lower linger.ms and batch.size values (see the sketch below)
- the size of an individual record
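For example, a minimal sketch assuming the standard Java producer client (the class name and values are just illustrative):

import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

public class SmallBatchProducerProps {
    // Builds producer settings that keep record batches small.
    public static Properties smallBatchProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        // linger.ms=0: send as soon as possible instead of waiting to fill a batch.
        props.put(ProducerConfig.LINGER_MS_CONFIG, "0");
        // batch.size is per partition; lowering it shrinks the batch buffers
        // the client allocates (the default is 16384 bytes).
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, "8192");
        return props;
    }
}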
On Fri, Aug 23, 2024 at 7:20 AM Ahmed Hamdy wrote:
> Why do you belie
Why do you believe it is an SSL issue?
The error trace looks like a memory issue. You could refer to the taskmanager memory setup guide [1].
[1] https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/memory/mem_setup_tm/
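For instance, the task manager's direct-memory budget comes from the off-heap and network options; something along these lines in flink-conf.yaml (the sizes are placeholders, not recommendations):

taskmanager.memory.process.size: 4096m
taskmanager.memory.framework.off-heap.size: 128m
taskmanager.memory.task.off-heap.size: 512m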
Best Regards
Ahmed Hamdy
On Fri, 23 Aug 2024 at 13:47, John Smith wrote:
I'm pretty sure it's not SSL. Is there a way to confirm, since the task does work? And/or are there other settings I can try?
On Thu, Aug 22, 2024, 11:06 AM John Smith wrote:
Hi, I'm getting this exception; a lot of resources online point to an SSL misconfiguration.
We are NOT using SSL, neither on the broker nor the consumer side. Our jobs work absolutely fine, as in the Flink task is able to consume from Kafka, parse the JSON and then push it to the JDBC database sink.
I wo