What's the most effective way (performance-wise) to update a large number of rows?
Sure, this can probably be expressed as "INSERT INTO table (column1) SELECT column1 FROM ...". Still, I don't see any "UPDATE" statement in Flink?
But sometimes SQL is not enough.
Suppose I have code:
 
TableResult tableResult1 = tEnv.executeSql("SELECT * FROM SomeTable");
try (org.apache.flink.util.CloseableIterator<Row> it = tableResult1.collect()) {
    while (it.hasNext()) {
        Row row = it.next();
        // Treat row:
        String x_field = row.getField("some_column").toString();
        // Do something with x_field
        // ...
        tEnv.executeSql("INSERT INTO AnotherTable (column) VALUES ('new_value')");
    }
}

But issuing this INSERT once per row will probably be a performance killer...

Any suggestions on how to do this in a smarter way?
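
For example, would something along these lines be a better direction? The UDF and all the table/column names below are made up; the idea is just to push the per-row logic into a scalar function and submit one set-based statement, so the sink can batch the writes:

import org.apache.flink.table.api.TableResult;
import org.apache.flink.table.functions.ScalarFunction;

// Hypothetical per-row transformation (replaces the loop body above)
public static class TreatColumn extends ScalarFunction {
    public String eval(String someColumn) {
        // Do something with the value and return the new one
        return someColumn == null ? null : someColumn.trim();
    }
}

// Register the function and run a single INSERT ... SELECT
tEnv.createTemporarySystemFunction("treat_column", TreatColumn.class);
TableResult result = tEnv.executeSql(
        "INSERT INTO AnotherTable SELECT treat_column(some_column) FROM SomeTable");
result.await();  // blocks until the job finishes (declares checked exceptions)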

Mike
