Yeah, I’m seeing that too, and I’m also running Jenkins 1.656. I’m only running about 400 jobs with 11 slaves, so it’s definitely not the busiest build environment out there. I’ve currently bumped the heap up to 5 GB and still don’t have anywhere near enough to run the backup plugin, for example. I’m thinking of doubling it to see if that helps, but your experience suggests that won’t be enough either.
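Before just doubling the heap again, I’m tempted to have the JVM write a heap dump when it hits an OutOfMemoryError, so we can see what is actually filling it. Assuming a standard RPM install on RHEL, that would mean adding something like the line below to /etc/sysconfig/jenkins (the -Xmx value and dump path are just placeholders):

    JENKINS_JAVA_OPTIONS="-Xmx5g -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/lib/jenkins/heapdumps"

The resulting .hprof file can then be opened in Eclipse MAT or jhat to see which objects dominate the heap.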
This problem also has me worried about whether this kind of resource utilization has been corrected in Jenkins 2.2, which we may be upgrading to in the near future…

From: jenkinsci-users@googlegroups.com [mailto:jenkinsci-users@googlegroups.com] On Behalf Of Ugo Bellavance
Sent: May-11-16 11:44
To: Jenkins Users
Subject: Jenkins using a lot more resources after upgrade

Hi,

After some testing we upgraded Jenkins in production from 1.617-1.1 to 1.656-1.1 on April 20. However, since the upgrade we've been running into problems we never saw before, and it's now getting out of control:

* We had to raise the nofile parameter in limits.conf for the jenkins user because we were getting "Too many open files" errors. We raised it to 8192 soft, 10240 hard on April 28th. By looking into /proc/<pid>/fd/ I found out that most of these files were inet sockets. We got these errors again on May 8th and 9th; I haven't changed anything yet for that issue.

* We had to raise the Java heap size. It was at -Xmx4096m; we raised it to -Xmx7168m, then to -Xmx10752m, and adjusted the RAM allocation for this VM accordingly, but it doesn't solve the problem: we're still getting "java.lang.OutOfMemoryError: Java heap space" errors. When that starts happening, we have to restart Jenkins because the web interface stops responding (which makes it difficult to troubleshoot, because we can't really tell what is running at that moment).

Note that we do run Selenium tests through CI, but nothing changed drastically after the upgrade. RHEL updates were applied a few days before (April 6th), including a minor update to Firefox (38.0 => 38.7). We're using the OpenJDK JRE; it was updated from 1.7.0.79 to 1.7.0.99 on April 6th, then to 1.7.0.101 on May 2nd.

Has anyone experienced something similar? What are my best options? Our workaround for now is to restart Jenkins manually when needed.

* Would it be possible to roll back to 1.617 without breaking anything?
* Try the LTS version (that would mean a downgrade - would it break stuff?)
* Jump to version 2 (2.2-1.1 available)
* Inspect the JVM's memory
* Any other ideas are welcome

Thanks in advance,

Ugo
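For the "Too many open files" side of things, a quick way to keep an eye on what the Jenkins process is holding open is to tally the targets under /proc/<pid>/fd, which is essentially what Ugo did by hand. This is only a rough sketch: it assumes a Linux master with Python on it, and that you run it as root (or as the jenkins user) with the Jenkins PID as the only argument:

    # fd_summary.py -- rough sketch, not production code.
    # Tallies what kinds of file descriptors a process holds by reading
    # the symlinks under /proc/<pid>/fd (Linux only).
    import os
    import sys

    def classify(target):
        # Socket and pipe descriptors show up as "socket:[12345]" etc.
        if target.startswith('socket:'):
            return 'socket'
        if target.startswith('pipe:'):
            return 'pipe'
        if target.startswith('anon_inode:'):
            return 'anon_inode'
        return 'file'

    def main():
        pid = sys.argv[1]
        fd_dir = '/proc/%s/fd' % pid
        counts = {}
        for fd in os.listdir(fd_dir):
            try:
                target = os.readlink(os.path.join(fd_dir, fd))
            except OSError:
                continue  # fd was closed between listdir() and readlink()
            kind = classify(target)
            counts[kind] = counts.get(kind, 0) + 1
        print('open descriptors for pid %s: %d' % (pid, sum(counts.values())))
        for kind in sorted(counts, key=counts.get, reverse=True):
            print('  %-12s %d' % (kind, counts[kind]))

    if __name__ == '__main__':
        main()

Running it every few minutes (from cron, for example) and logging the output should show whether the socket count climbs steadily toward the 8192/10240 limits in the days before the errors reappear.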