Hi Team,
https://ci.apache.org/projects/flink/flink-docs-release-1.11/api/java/org/apache/flink/streaming/connectors/kinesis/FlinkKinesisConsumer.html
"There is no perfect generic default assignment function. Default shard
to subtask assignment, which is based on hash code, may result in skew,
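The skew caveat quoted above is from the FlinkKinesisConsumer javadoc; the consumer also accepts a custom KinesisShardAssigner to override the default. As a plain-Java illustration (no Flink dependency; the class and method names below are hypothetical, only the shard-ID format is Kinesis-like), the assigner contract boils down to mapping a shard to a subtask index:

```java
// Plain-Java sketch of shard-to-subtask assignment logic. Flink's
// FlinkKinesisConsumer lets you plug in a KinesisShardAssigner whose
// job is essentially this: shard -> subtask index in [0, parallelism).
public class ShardAssignment {

    // Default-style assignment: hash code modulo parallelism. Shard IDs
    // that hash unevenly can pile up on a few subtasks (the "skew" above).
    public static int hashAssign(String shardId, int parallelism) {
        return Math.floorMod(shardId.hashCode(), parallelism);
    }

    // Alternative: if shard IDs carry a monotonically increasing number
    // (e.g. "shardId-000000000007"), spreading by that number is uniform.
    public static int indexAssign(String shardId, int parallelism) {
        int n = Integer.parseInt(shardId.substring(shardId.lastIndexOf('-') + 1));
        return n % parallelism;
    }

    public static void main(String[] args) {
        System.out.println(indexAssign("shardId-000000000007", 4)); // prints 3
    }
}
```

Whether the index-based variant is actually uniform depends on how the stream was resharded, so treat it as a starting point, not a drop-in fix.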
Hello,
With the default memory settings, after about 5000 records through my
FlinkKafkaConsumer and some operators in my pipeline, I get the below error:
Caused by: java.lang.OutOfMemoryError: Direct buffer memory
at java.nio.Bits.reserveMemory(Unknown Source) ~[?:?]
at java.nio.Dire
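For reference, with Flink's unified memory model (1.10 and later) the usual first response to a direct-buffer OOM is to grant the task managers more off-heap memory in flink-conf.yaml. The keys below are real configuration options; the sizes are illustrative guesses that need tuning against the actual workload:

```yaml
# Off-heap memory reserved for user code; direct buffers allocated by
# connectors and serializers count against this (default: 0 bytes).
taskmanager.memory.task.off-heap.size: 256m

# Off-heap memory for Flink's own framework needs (default: 128m).
taskmanager.memory.framework.off-heap.size: 256m
```

If the error persists after raising these, the leak is more likely in user code or a connector holding on to direct buffers.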
Great, glad it was an easy fix :) Thanks for following up!
On Fri, Jul 23, 2021 at 3:54 AM Thms Hmm wrote:
> Finally I found the mistake. I put the "--host 10.1.2.3" param as one
> argument. I think the savepoint argument was not interpreted correctly or
> ignored. Might be that the "-s" param wa
in my app pom.xml:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.13.1</version>
    <scope>provided</scope>
</dependency>
and I have copied flink-connector-kafka_2.11-1.13.1.jar to FLINK_HOME/lib/,
but I still get an error:
ClassNotFoundException: org.apache.kafka.clients.consumer.ConsumerRecord
So how can I fix it?
igyu
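For what it's worth, flink-connector-kafka does not bundle kafka-clients, so copying only the connector jar into lib/ leaves org.apache.kafka.clients.consumer.ConsumerRecord unresolved. A common fix (a sketch, assuming the job is built as a fat/shaded jar) is to drop the provided scope so the connector and its transitive kafka-clients dependency ship inside the job jar:

```xml
<!-- Sketch: default (compile) scope bundles the connector AND its
     transitive kafka-clients dependency into the fat job jar. The
     coordinates are the ones from the mail above. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.13.1</version>
</dependency>
```

The alternative is to also copy the matching kafka-clients jar next to the connector in FLINK_HOME/lib/; either way, the class has to be on the classpath exactly once.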
Hi Dan,
1) If the key doesn’t change in the downstream operators and you want to avoid
shuffling, maybe the DataStreamUtils#reinterpretAsKeyedStream would be helpful.
2) I am not sure if you are saying that the data are already partitioned
in Kafka and you want to avoid shuffling in th
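For context, the first suggestion can be sketched like this (non-runnable fragment; it requires flink-streaming-java on the classpath, and the Event type with its getKey() accessor is hypothetical):

```java
// Non-runnable sketch: if the stream is already partitioned by key,
// reinterpretAsKeyedStream avoids the network shuffle a regular keyBy()
// would add. Caveat: the pre-partitioning must match Flink's internal
// key-group assignment, otherwise keyed state will be incorrect.
DataStream<Event> source = ...; // already partitioned by Event#getKey
KeyedStream<Event, String> keyed =
    DataStreamUtils.reinterpretAsKeyedStream(source, Event::getKey);
```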
Hello,
It’s hard to say what caused the timeout to trigger – I agree with you that it
should not have stopped the heartbeat thread, but it did. The easy fix was to
increase the timeout until we no longer saw our app self-killed. The task was
running a CPU-intensive computation (with a few threads created
Hi Samir,
to unsubscribe please send an empty-body / empty-subject email to
user-unsubscr...@flink.apache.org. You can see a community page [1] in docs
for more details.
[1] https://flink.apache.org/community.html
Best,
D.
On Fri, Jul 23, 2021 at 9:01 AM Samir Vasani wrote:
> Hi,
>
> This is
Hi,
Can you elaborate more on the UDF suggestion? I did not understand it.
Thanks & Regards,
Samir Vasani
On Fri, Jul 23, 2021 at 1:22 PM Caizhi Weng wrote:
> Hi!
>
> In this case it won't work, as JobListener#onJobExecuted will only be
> called when the job finishes, successfully or unsuccessfully.
>
>
Finally I found the mistake. I put the "--host 10.1.2.3" param as one
argument. I think the savepoint argument was not interpreted correctly or
ignored. Might be that the "-s" param was used as value for "--host
10.1.2.3" and "s3p://…" as new param and because these are not valid
arguments they were
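The mistake described here is plain shell word-splitting, which can be demonstrated without Flink at all:

```shell
# A quoted "--host 10.1.2.3" arrives as ONE argument; unquoted, the flag
# and its value arrive as two separate arguments, which is what an
# option parser expects.
count_args() { echo "$#"; }

count_args "--host 10.1.2.3"   # prints 1  (wrong: single argument)
count_args --host 10.1.2.3     # prints 2  (right: flag + value)
```

The same rule applies to arguments in a Kubernetes manifest or Dockerfile: each flag and each value must be its own list element.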
Hi!
In this case it won't work, as JobListener#onJobExecuted will only be
called when the job finishes, successfully or unsuccessfully.
For a forever-running job I would suggest adding a UDF right after the
source and adding a special "EOF" record in each of the csv files. This UDF
monitors the da
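The EOF-record idea can be sketched in plain Java (no Flink dependency; in a real pipeline this logic would sit inside a user-defined function right after the source, and the sentinel value below is hypothetical):

```java
// Plain-Java sketch of the "EOF record" idea: each input file ends with a
// sentinel record, and a small monitor counts sentinels to know when all
// expected files have been fully read.
public class EofMonitor {
    public static final String EOF = "__EOF__"; // hypothetical sentinel

    private int openFiles;

    public EofMonitor(int expectedFiles) {
        this.openFiles = expectedFiles;
    }

    // Process one record; returns true once every file has sent its EOF.
    public boolean process(String record) {
        if (EOF.equals(record)) {
            openFiles--;
        }
        return openFiles <= 0;
    }

    public static void main(String[] args) {
        EofMonitor m = new EofMonitor(2);
        System.out.println(m.process("data-1")); // false
        System.out.println(m.process(EOF));      // false (one file left)
        System.out.println(m.process(EOF));      // true  (all files done)
    }
}
```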
Hi Caizhi Weng,
Thanks for your input.
I would explain the requirement in a little more detail.
The Flink pipeline will be running forever (until some issue happens and we
would need to restart), so it will continuously monitor whether a new file
comes to the *input* folder or not.
In this case will your sugg
Hi!
JobListener#onJobExecuted might help, if your job is not a forever-running
streaming job. See
https://ci.apache.org/projects/flink/flink-docs-master/api/java/org/apache/flink/core/execution/JobListener.html
On Fri, Jul 23, 2021 at 3:22 PM Samir Vasani wrote:
> Hi,
>
> I am a newbie to Flink and facing so
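For readers following along, registering such a listener looks roughly like this (non-runnable sketch; requires flink-streaming-java, API per the JobListener javadoc linked above):

```java
// Non-runnable sketch: a JobListener attached to the execution environment.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.registerJobListener(new JobListener() {
    @Override
    public void onJobSubmitted(JobClient jobClient, Throwable throwable) {
        // fires when submission succeeds (jobClient set) or fails (throwable set)
    }

    @Override
    public void onJobExecuted(JobExecutionResult result, Throwable throwable) {
        // fires only when the job finishes -- hence the caveat in this
        // thread that it never fires for a forever-running streaming job
    }
});
```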
Could this be related to https://issues.apache.org/jira/browse/FLINK-22414?
On Thu, Jul 22, 2021 at 3:53 PM Timo Walther wrote:
> Thanks, this should definitely work with the pre-packaged connectors of
> Ververica platform.
>
> I guess we have to investigate what is going on. Until then, a
> wor
Hi,
I am a newbie to Flink and am facing some challenges solving the below use case.
Use case description:
I will receive a csv file with a timestamp on every single day in some
folder, say *input*. The file format would be
*file_name_dd-mm-yy-hh-mm-ss.csv*.
Now my flink pipeline will read this csv fil
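One hedged sketch of the continuous-monitoring part (non-runnable fragment; requires flink-streaming-java, and the /data/input path is a placeholder): readFile with PROCESS_CONTINUOUSLY re-scans a folder on an interval, picking up newly arriving files:

```java
// Non-runnable sketch: keep watching a folder for newly arriving csv files.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
TextInputFormat format = new TextInputFormat(new Path("/data/input"));
DataStream<String> lines = env.readFile(
        format,
        "/data/input",
        FileProcessingMode.PROCESS_CONTINUOUSLY,
        10_000L); // re-scan the folder every 10 seconds
```

Note that in this mode a file that is modified in place is re-processed in full, so the daily files should be written atomically (e.g. moved into the folder when complete).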
Hi,
This is the user subscription request.
Thanks & Regards,
Samir Vasani