Re: Got exception: unable to create new native thread when using BucketingSink writing to HDFS

2017-06-08 Thread Mu Kong
Hi Till Rohrmann, Thanks for the prompt response! Limiting the number of open connections to HDFS would be great. I created an issue in JIRA: https://issues.apache.org/jira/browse/FLINK-6873 Best regards, Mu

Re: Got exception: unable to create new native thread when using BucketingSink writing to HDFS

2017-06-08 Thread Till Rohrmann
Hi Mu Kong, I think this is a problem with how the BucketingSink works: it keeps a writer with an open file stream for each file. We should limit the number of open writers so that we don't run into the problem of too many open file handles / open HDFS connections. Could you open a JIRA issue for this?
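Until such a limit exists in the sink itself, the writer count can be kept down from the job side. The sketch below is a minimal example against the Flink 1.3-era flink-connector-filesystem API and assumes the setBatchSize, setInactiveBucketThreshold and setInactiveBucketCheckInterval setters available there; the HDFS path, bucket format and thresholds are placeholders, not values from this thread.

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink;
    import org.apache.flink.streaming.connectors.fs.bucketing.DateTimeBucketer;

    public class BucketingSinkWriterLimitSketch {

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment();

            // Placeholder source standing in for the real streaming input.
            DataStream<String> stream = env.fromElements("a", "b", "c");

            // Hypothetical output path; replace with the real HDFS directory.
            BucketingSink<String> sink =
                    new BucketingSink<>("hdfs:///tmp/flink/output");

            // Coarser time buckets mean fewer simultaneously active buckets,
            // i.e. fewer open writers and HDFS output streams.
            sink.setBucketer(new DateTimeBucketer<String>("yyyy-MM-dd--HH"));

            // Roll part files at 64 MB so finished files are closed promptly.
            sink.setBatchSize(64L * 1024 * 1024);

            // Close writers for buckets that received no data for 5 minutes,
            // checked once a minute, so idle file handles are released.
            sink.setInactiveBucketThreshold(5 * 60 * 1000L);
            sink.setInactiveBucketCheckInterval(60 * 1000L);

            stream.addSink(sink);
            env.execute("BucketingSink writer-limit sketch");
        }
    }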

Got exception: unable to create new native thread when using BucketingSink writing to HDFS

2017-06-08 Thread Mu Kong
Hi all, I am new to Flink. I started a cluster across 3 servers, and the processing speed was GREAT. However, after several hours of streaming, I got this error: *java.lang.OutOfMemoryError: unable to create new native thread*, with a stack trace pointing at org.apache.flink.streaming.connectors.fs.StreamWriterBase
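"Unable to create new native thread" is raised when the operating system refuses to allocate another native thread (per-user process/thread limits or thread-stack memory), not when the heap is exhausted; each writer the sink keeps open holds an HDFS output stream, and each open stream typically keeps its own client thread alive. A minimal, generic diagnostic sketch (standard JDK APIs only, not code from this thread) that logs the live thread count so a steady climb can be spotted before the TaskManager dies:

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;

    // Logs the JVM's live thread count every 10 seconds. A count that grows
    // steadily while the job runs is the usual precursor to
    // "unable to create new native thread".
    public class ThreadCountWatcher {

        public static void main(String[] args) throws InterruptedException {
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            while (true) {
                System.out.printf("live=%d peak=%d daemon=%d%n",
                        threads.getThreadCount(),
                        threads.getPeakThreadCount(),
                        threads.getDaemonThreadCount());
                Thread.sleep(10_000L);
            }
        }
    }

Running jstack against the TaskManager PID gives the same information per thread and shows which thread groups dominate as the count grows.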