Hi all! I'm trying to debug a job by inspecting its savepoints, but I'm getting this error message:
```
Caused by: java.lang.RuntimeException: Record size is too large for CollectSinkFunction.
Record size is 9627127 bytes, but max bytes per batch is only 2097152 bytes.
Please consider increasing max bytes per batch value by setting collect-sink.batch-size.max
```

My code looks like this:

```
import org.apache.flink.runtime.state.hashmap.HashMapStateBackend;
import org.apache.flink.state.api.SavepointReader;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

private static void run(String savepointPath) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    SavepointReader savepoint =
        SavepointReader.read(env, savepointPath, new HashMapStateBackend());
    // Read the keyed state of the operator with uid "uuid" and collect it locally
    var operator = savepoint.readKeyedState("uuid", new MyKeyedOperatorReader());
    var operatorState = operator.executeAndCollect(1000);
}
```

I haven't found a way to increase `collect-sink.batch-size.max` as suggested in the error message. Any help with this would be appreciated!

Regards,

Salva
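P.S. For what it's worth, this is the kind of thing I would try, replacing the first line of `run` so that the option is passed through the environment's `Configuration` (just a guess on my side; `8mb` is an arbitrary value), but I'm not sure this is the right way to set it:

```
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Guess: pass the collect sink batch size through the environment configuration.
// "8mb" is an arbitrary value chosen only for illustration.
Configuration config = new Configuration();
config.setString("collect-sink.batch-size.max", "8mb");
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(config);
```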