Thank you - I'll try.
So there is no UPDATE statement in Flink SQL?
Sent: Monday, September 19, 2022 at 4:09 AM
From: "Shengkai Fang" <fskm...@gmail.com>
To: pod...@gmx.com
Cc: user@flink.apache.org
Subject: Re: INSERT INTO will work faster in Flink than in regular database?
From: "Shengkai Fang" <fskm...@gmail.com>
To: pod...@gmx.com
Cc: user@flink.apache.org
Subject: Re: INSERT INTO will work faster in Flink than in regular database?
Hi. I think you can write a UDF [1] to process the fields and then insert into the sink.
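A minimal sketch of that approach, assuming a source table SomeTable with a STRING column some_column and a sink AnotherTable with a single STRING column; the function name process_field and the transformation inside eval() are hypothetical placeholders, not part of the original thread:

import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.functions.ScalarFunction;

public class UdfInsertExample {

    // Scalar UDF doing the per-field work that was previously done
    // in a client-side loop over collect().
    public static class ProcessField extends ScalarFunction {
        public String eval(String value) {
            // Example transformation only; replace with the real logic.
            return value == null ? null : value.trim().toUpperCase();
        }
    }

    public static void runPipeline(TableEnvironment tEnv) {
        // Make the function available to SQL.
        tEnv.createTemporarySystemFunction("process_field", ProcessField.class);

        // One set-based statement: the transformation runs inside the
        // Flink job rather than row by row on the client.
        tEnv.executeSql(
            "INSERT INTO AnotherTable " +
            "SELECT process_field(some_column) FROM SomeTable");
    }
}

This replaces the collect()-and-insert loop quoted below with a single INSERT INTO ... SELECT, so the rows never have to pass through the client.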
Best.
Shengkai
<pod...@gmx.com> wrote on Thursday, September 15, 2022 at 22:10:
What's the most efficient way, performance-wise, to update a large number of rows?
Presumably this would be something like "INSERT INTO table (column1) SELECT column1 FROM ...". In any case, I do not see any UPDATE statement in Flink?
But sometimes SQL alone is not enough.
Suppose I have code:
TableResult tableResult1 = tEnv.executeSql("SELECT * FROM SomeTable");
try (org.apache.flink.util.CloseableIterator<Row> it = tableResult1.collect()) {
while(it.hasNext()) {
Row row = it.next();
//Treat row:
String x_field = row.getField("some_column").toString();
//Do something with x_field
...
tEnv.executeSql("INSERT INTO AnotherTable (column) VALUES ('new_value')");
}
}
But this per-row INSERT will probably be a performance killer...
Any suggestion on how to do this in a smarter way?
Mike