Sorry, I meant the UID.
On 10/31/16 11:59 AM, Chan Chor Pang wrote:
Actually, if the max user processes limit is not the problem, I have no idea.
But I still suspect the user, as the user who runs spark-submit is not
necessarily the owner of the JVM process.
Can you make sure, when you run "ps -ef | grep {your job}", that the JVM
process actually belongs to the user whose limit you checked?
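For example (a rough sketch; assuming the driver was started through spark-submit and therefore shows up as SparkSubmit in ps, and substituting the real PID):

    ps -ef | grep SparkSubmit    # note the UID column for the JVM process
    cat /proc/<pid>/limits       # "Max processes" is the nproc limit this process actually runs with

The values in /proc/<pid>/limits are the ones that matter, since "ulimit -a" run in your own shell may not reflect the environment the JVM was started in.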
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 120242
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
On Sun, Oct 30, 2016 at 7:01 PM, Chan Chor Pang <chin...@indetail.co.jp> wrote:
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 120242
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
On Sun, Oct 30, 2016 at 6:15 PM, Chan Chor Pang wrote:
the JVM process will still not be able to create new threads.
btw the default limit for CentOS is 1024
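For reference, on CentOS 6 the stock /etc/security/limits.d/90-nproc.conf looks roughly like this (the file name and default differ on other releases, e.g. 20-nproc.conf with 4096 on CentOS 7):

    # Default limit for number of user's processes to prevent accidental fork bombs.
    *          soft    nproc     1024
    root       soft    nproc     unlimited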
On 10/31/16 9:51 AM, kant kodali wrote:
On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang <chin...@indetail.co.jp> wrote:
/etc/security/limits.d/90-nproc.conf
Hi,
I am using Ubuntu
You may want to check the process limit of the user who is responsible for
starting the JVM:
/etc/security/limits.d/90-nproc.conf
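A minimal sketch of raising that limit for a specific user (assuming here that the JVM is started by a user named "spark"; adjust the user name and values to your environment):

    # /etc/security/limits.d/90-nproc.conf
    spark    soft    nproc    32768
    spark    hard    nproc    32768

The new limit only applies to sessions started after the change (pam_limits is applied at login), so re-login or restart the service and verify with "ulimit -u" as that user.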
On 10/29/16 4:47 AM, kant kodali wrote:
"dag-scheduler-event-loop" java.lang.OutOfMemoryError: unable to
create new native thread
at java.lang.Thread.start0(
After upgrading from Spark 1.5 to 1.6 (CDH 5.6.0 -> 5.7.1),
some of our streaming jobs started getting delayed after a long run.
With a little investigation, here is what I found:
- the same program has no problem with Spark 1.5
- we have two kinds of streaming jobs, and only those with
"updateStateByKey" were affected