Hi,
No, they don't. Only the job is restarted after that, without any luck. The
exception I provided is added to the exceptions list of the job itself.
On Mon, Aug 23, 2021 at 4:50 AM Caizhi Weng wrote:
> Hi!
>
> This might be that some task managers cannot reach out to the job manager
> in time. [...]
Hi!
You can first use the Table & SQL API to create an RMQ source table[1].
Then you can use the to_data_stream method in TableEnvironment to convert
the table to a data stream.
[1]
https://ci.apache.org/projects/flink/flink-docs-stable/docs/dev/python/table/python_table_api_connectors/
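A hedged sketch of the pattern described above, assuming pyflink is installed and an RMQ-capable table connector jar is on the classpath. The connector name and options in the DDL are placeholders of my own, not a verified connector; check the linked connector documentation for the real table options in your Flink version.

```python
# Sketch only: the 'rabbitmq' connector name and column below are illustrative
# placeholders, not a verified Flink SQL connector.
RMQ_SOURCE_DDL = """
CREATE TABLE rmq_source (
    body STRING
) WITH (
    'connector' = 'rabbitmq'
)
"""

# The conversion itself needs a running pyflink environment, so it is shown
# here as comments (to_data_stream is available on StreamTableEnvironment in
# newer PyFlink releases):
#
#   from pyflink.datastream import StreamExecutionEnvironment
#   from pyflink.table import StreamTableEnvironment
#
#   env = StreamExecutionEnvironment.get_execution_environment()
#   t_env = StreamTableEnvironment.create(env)
#   t_env.execute_sql(RMQ_SOURCE_DDL)
#   ds = t_env.to_data_stream(t_env.from_path("rmq_source"))
```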
Nadia Mos
Hi!
This might be that some task managers cannot reach out to the job manager
in time. Has any of the task manager instance restarted after this failure?
If yes, what does the log (Flink log and kubernetes log) of the failed task
manager say?
Zbyszko Papierski wrote on Friday, August 20, 2021 at 11:07 PM:
> Hi!
>
>
Hi!
If I'm not mistaken, you would like your window to be triggered every 15
minutes, or when there has been no activity for 15 minutes?
This seems like a combination of a tumbling window and a session window. You
can refer to ProcessingTimeSessionWindows for the implementation of a
session window and modi
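The combined behaviour described above can be simulated in plain Python (a sketch of the grouping logic only, not Flink trigger code; the GAP/CAP constants and function name are my own illustrative choices):

```python
# Close a window either after GAP seconds of inactivity (session-style) or
# once it has been open for CAP seconds (tumbling-style cap).
GAP = 15 * 60  # session gap: fire after 15 minutes of inactivity
CAP = 15 * 60  # maximum window length: fire 15 minutes after the first element

def group_into_windows(events):
    """Group (timestamp_s, value) pairs, sorted by timestamp, into windows."""
    windows, current = [], []
    start = last = None
    for ts, value in events:
        # Close the current window on an inactivity gap or when the cap is hit.
        if current and (ts - last > GAP or ts - start >= CAP):
            windows.append(current)
            current, start = [], None
        if start is None:
            start = ts
        current.append(value)
        last = ts
    if current:
        windows.append(current)
    return windows
```

For example, `group_into_windows([(0, "a"), (60, "b"), (2000, "c")])` splits after "b" because the 1940-second silence exceeds the gap, giving `[["a", "b"], ["c"]]`.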
I also noticed that the Flink s3-hadoop plugin has a Hadoop common dependency. I'm
trying this.
From the logs, the plugin is enabled:
Enabling required built-in plugins
Linking flink-s3-fs-hadoop-1.12-SNAPSHOT.jar to plugin directory
Successfully enabled flink-s3-fs-hadoop-1.12-SNAPSHOT.jar
But I a
As far as I know, flink-shaded-hadoop has not been officially supported since Flink 1.11
(https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/deployment/hadoop.html).
Anyway, I installed the Hadoop common package into the Docker images to make Flink
happy. I marked the hadoop dependencies in the i
Hello,
Is there any way to use RMQ as a data source in the DataStream Python API?
Thanks in advance
I prefer using the Flink-bundled Hadoop, such as
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar.
It may help.
L. C. Hsieh wrote on Sunday, August 22, 2021 at 1:40 AM:
>
> BTW, I checked the dependency tree; the flink-iceberg demo only has on
Hi Matthias,
Before the bug is fixed, you could specify the return type explicitly in
the second parameter of the map function.
Before:
    DataStream<Row> rows = integers.map(i -> Row.of("Name" + i, i));
After:
    DataStream<Row> rows = integers.map(i -> Row.of("Name" + i, i),
        new RowTypeInfo(Types.STRING, Types.INT));
Best,
Hi Suman,
I've looked at the provided code and have some questions:
1. Why do we first do a window aggregate with
window(TumblingProcessingTimeWindows.of(Time.minutes(1))), and then do
windowAll(TumblingEventTimeWindows.of(Time.hours(1))).maxBy(2)?
One uses a processing-time window, the other an event-time window.
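To illustrate the distinction in plain Python (not Flink code): a tumbling assigner maps a timestamp to a fixed-size window; with event time that timestamp comes from the record itself, with processing time it is the wall-clock arrival time. The function name is my own.

```python
import time

def tumbling_window_start(timestamp_s, size_s):
    """Start of the size_s-second tumbling window containing timestamp_s."""
    return timestamp_s - (timestamp_s % size_s)

# Event time: use the timestamp carried by the record itself.
event_window = tumbling_window_start(3700, 3600)  # -> 3600

# Processing time: use the wall-clock arrival time instead.
proc_window = tumbling_window_start(int(time.time()), 3600)
```

Mixing the two in one pipeline means the minute-level aggregates and the hour-level maxBy can disagree about which records belong together, which is usually worth double-checking.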