Hi Viktor,
We suffered a lot from heavy plugin usage. Yes, some
plugins do leak memory, so kindly review your plugin usage. The only way I
know to find the culprit was trial and error [not the best of methods, but
we didn't have much choice].
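If it helps, the trial and error can at least be scripted: Jenkins skips loading any plugin that has a ".disabled" marker file next to it in the plugins directory. A sketch (the plugin name and the JENKINS_HOME default below are placeholders for your own setup):

```shell
# Disable one suspect plugin at a time, then restart Jenkins and compare
# heap usage. "suspect-plugin" and the JENKINS_HOME default are placeholders.
JENKINS_HOME="${JENKINS_HOME:-/var/lib/jenkins}"
touch "$JENKINS_HOME/plugins/suspect-plugin.jpi.disabled"
# Older plugins ship as .hpi -- use suspect-plugin.hpi.disabled for those.
# Delete the marker file (and restart) to re-enable the plugin.
```

Tedious, but at least repeatable across restarts.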
Thanks,
Subbu
On Mon, Apr 15, 2013 at 4:49
One more important thing to note: the FTP configuration will not hold up [if
you happen to use FTP or CICS copy of build deliverables to the drop
location]. This still persists with 1.510.
Keep this in mind if you restart Jenkins frequently; yep, you have to
re-configure it manually.
Thanks,
Subbu
On Mon, Apr 1
We like to keep the build logs as long as we can because it enables us to
reproduce builds if the need arises (dependencies changing dynamically).
Hence we have jobs with 1000 build logs, ~100k each in size. We do not keep
the artifacts on the server.
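For context, that retention is what the job's "Discard Old Builds" setting controls; in a job's config.xml it comes out roughly like this (-1 meaning "no limit"; field names as of the 1.4xx line):

```xml
<logRotator>
  <daysToKeep>-1</daysToKeep>
  <numToKeep>1000</numToKeep>
  <artifactDaysToKeep>-1</artifactDaysToKeep>
  <artifactNumToKeep>-1</artifactNumToKeep>
</logRotator>
```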
Could there be any other factor which contributes to the memory consumption?
It has been fixed in version 1.485:
What's new in 1.485 (2012/10/07)
Build records are now lazy-loaded, resulting in a reduced startup time (issue
8754)
We're using 1.510 and upgrade every week if it is worth upgrading
(checking the changelog first).
How long do you keep the build history?
In which version was this implemented?
We are using LTS 1.480.3
-Djava.awt.headless=true -Xms1g -Xmx6g -XX:MaxPermSize=512M
-XX:+UseParallelGC -XX:ParallelGCThreads=16 -XX:+PrintGCDetails
-XX:+PrintGCTimeStamps -verbose:gc -Djava.net.preferIPv4Stack=true
-XX:+HeapDumpOnOutOfMemoryError
What Java version are you using?
The newer versions read the build history on demand.
And you can control the memory usage as well via parameters like:
JAVA_ARGS="-Djava.awt.headless=true -Xms1g -Xmx10g -XX:MaxPermSize=1G
-XX:+UseParNewGC"
This setup works very well for us: ~100 nodes, ~200 jobs.
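In case it saves someone a search: with the Debian/Ubuntu package that JAVA_ARGS line goes in /etc/default/jenkins (RPM-based installs use a different file and variable name instead); the exact flags are of course our setup, not a general recommendation:

```shell
# /etc/default/jenkins (Debian/Ubuntu package); RPM-based installs use
# JENKINS_JAVA_OPTIONS in /etc/sysconfig/jenkins instead.
JAVA_ARGS="-Djava.awt.headless=true -Xms1g -Xmx10g -XX:MaxPermSize=1G -XX:+UseParNewGC"
```

Jenkins picks the value up on the next service restart.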
Hi,
Jenkins seems to consume an awful lot of memory. As I understand it, this is
partly due to keeping build logs in memory. My question is: what other
factors influence Jenkins' memory consumption, and what can I do about
them? Also, is there a way to flush the memory on demand or force a GC
without restarting Jenkins?
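For reference on the last part: a full GC can at least be requested from outside the JVM with jcmd (shipped with JDK 7+); a sketch, assuming the master was started from jenkins.war:

```shell
# Ask the Jenkins JVM for a full collection (jcmd ships with JDK 7+).
# Assumes the master runs from jenkins.war; adjust the pgrep pattern if not.
pid=$(pgrep -f jenkins.war)
jcmd "$pid" GC.run
```

The same hint is available from the Jenkins Script Console as System.gc(); either way it is only a request that the JVM is free to ignore.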