lpn666 commented on PR #20192:
URL: https://github.com/apache/flink/pull/20192#issuecomment-1177110023

   > ## What is the purpose of the change
   > When I use the sql-jdbc connector to transfer a large table from MySQL to 
another database, the Flink program loads the entire table into memory. The 
source table is too large (16 GB), and the TaskManager crashed. What can I do? 
Or could a new option be added to limit the read speed (or batch the data)?
   > 
   > ## Brief change log
   > ## Verifying this change
   > ## Does this pull request potentially affect one of the following parts:
   > ## Documentation
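
   For reference, Flink's JDBC connector already documents options that avoid reading the whole table in a single query: the `scan.partition.*` options split the scan into parallel range queries, and `scan.fetch-size` is passed to the JDBC driver's fetch size. A minimal DDL sketch follows; the table name, URL, column, and bound values are illustrative only, and whether `scan.fetch-size` actually streams rows depends on the driver (MySQL Connector/J, for example, buffers full result sets unless cursor fetch or streaming mode is enabled on the JDBC URL):

   ```sql
   -- Hypothetical source table; names and bounds are illustrative.
   CREATE TABLE mysql_source (
     id BIGINT,
     payload STRING
   ) WITH (
     'connector' = 'jdbc',
     'url' = 'jdbc:mysql://localhost:3306/mydb',
     'table-name' = 'big_table',
     -- Split the scan into 64 range queries over `id`
     -- instead of one full-table SELECT:
     'scan.partition.column' = 'id',
     'scan.partition.num' = '64',
     'scan.partition.lower-bound' = '1',
     'scan.partition.upper-bound' = '100000000',
     -- Hint the driver to fetch rows in batches:
     'scan.fetch-size' = '10000'
   );
   ```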
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
