Hi,
We are running Flink applications on version 1.15.4 and we now want
to migrate them to 1.19.1. When I try to start an application on the
newer version, Flink complains about an internal change in the data
structure of JobVertex. I am getting the error below:
JobVertex
*1.19.1
Hi,
I'm trying to send data to a Kafka topic using PyFlink (DataStream API),
while setting a key on the Kafka record. The key is a simple string, the
value is a JSON string. What I have so far basically works, except the
whole record is sent as both the key and the value. How do I specify that I
want the key serialized separately from the value?
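A minimal sketch of this setup, shown with the Java KafkaSink builder that the PyFlink connector wraps (topic and broker names are hypothetical): each serialization schema receives the whole stream element, so using SimpleStringSchema for both the key and the value reproduces exactly this symptom.

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;

public class KeyedKafkaSinkSketch {

    // Hypothetical topic and broker address, for illustration only.
    static KafkaSink<String> buildSink() {
        return KafkaSink.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setRecordSerializer(
                        KafkaRecordSerializationSchema.builder()
                                .setTopic("events")
                                // Both schemas are handed the *entire* stream element,
                                // so with SimpleStringSchema for key and value the full
                                // record is written as both the key and the value.
                                .setKeySerializationSchema(new SimpleStringSchema())
                                .setValueSerializationSchema(new SimpleStringSchema())
                                .build())
                .build();
    }
}

As far as I can tell, the builder has no separate key-selector hook, so a key that differs from the value has to be extracted by the key serialization schema itself (or the element has to be restructured before the sink).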
Thanks Zhanghao and Andreas.
Thanks & Regards,
Sachin Sharma
+1-669-278-5239
On Tue, Sep 17, 2024 at 6:04 AM Zhanghao Chen
wrote:
> In our production environment, it works fine.
>
> Best,
> Zhanghao Chen
> --
> *From:* Sachin Sharma
> *Sent:* Friday, September 13
Hi Lasse,
You may define a type information factory with config [1] starting from
v1.20. We are also working on extending the Flink type system to cover basic
generic collection types and avoid falling back to Kryo.
[1]
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastre
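For illustration, a minimal sketch of such a factory (the type and field names are hypothetical); from 1.20 on it can be bound to the target class through the config option described in [1] instead of the @TypeInfo annotation.

import java.lang.reflect.Type;
import java.util.Map;

import org.apache.flink.api.common.typeinfo.TypeInfoFactory;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;

// Hypothetical custom type and factory, showing the TypeInfoFactory hook
// that keeps a type out of the Kryo fallback path.
public class MyEventTypeInfoFactory extends TypeInfoFactory<MyEventTypeInfoFactory.MyEvent> {

    public static class MyEvent {
        public String id;
        public long timestamp;
    }

    @Override
    public TypeInformation<MyEvent> createTypeInfo(
            Type t, Map<String, TypeInformation<?>> genericParameters) {
        // Describe the type explicitly so Flink uses the POJO serializer.
        return Types.POJO(
                MyEvent.class,
                Map.of("id", Types.STRING, "timestamp", Types.LONG));
    }
}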
In our production environment, it works fine.
Best,
Zhanghao Chen
From: Sachin Sharma
Sent: Friday, September 13, 2024 1:19
To: Oscar Perez via user
Subject: Flink 1.19.1 Java 17 Compatibility
Hi,
We are planning to use Flink 1.19.1 with the Kubernetes operator; I wanted to check whether it is compatible with Java 17.
Flink relies on the JDK's methods to get the amount of resources available to a
TM. If you are using an outdated version of the JDK (for example, JDK 8 before
8u191 with cgroup-v1 based containerization), the exact amount of resources
available in a containerized environment cannot be retrieved, and the reported
values will reflect the host rather than the container limits.
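A rough illustration of the JDK calls in question, as a simple probe class (the exact calls Flink uses internally may differ); on older, non-container-aware JDKs these report the host's resources instead of the container limits.

// Illustration only: the values the JVM itself reports. On old JDKs
// (e.g. JDK 8 before 8u191 with cgroup v1), these reflect the host,
// not the container.
public class ContainerResourceProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("available processors: " + rt.availableProcessors());
        System.out.println("max heap (bytes): " + rt.maxMemory());
    }
}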