William,
On 11/7/23 05:59, William Crowell wrote:
Olaf and Sevendu,
Thank you for your replies. Correct, I sincerely doubt this is a Tomcat class
loading bug.
I am using Tomcat’s normal class loader (webapp/WAR) to load the classes into
memory, and it is a single class loader.
I am going to periodically run: jcmd <pid> GC.class_stats
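One detail worth knowing before relying on that command: on JDK 8, GC.class_stats is a diagnostic command and only works if the target JVM was started with -XX:+UnlockDiagnosticVMOptions. A sketch of the relevant invocations (the pid is illustrative, taken from the crash log below):

```shell
# Illustrative pid from the crash log below; substitute your own.
# GC.class_stats needs -XX:+UnlockDiagnosticVMOptions on the target JVM (JDK 8).
jcmd 191803 GC.class_stats | head -n 20

# A class histogram needs no extra flags and shows per-class instance counts:
jcmd 191803 GC.class_histogram | head -n 20
```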
I am only having the issue in general operation and not on a redeploy, and I
have to restart the server daily. I will find out more details as to how these
classes are loaded into memory.
Here is what is going on…
I have a 16GB Linux instance with one Apache Tomcat 9.0.78 instance running on
it, on JDK 1.8.0_371-b25. The min and max JVM heap settings are both 8GB. JVM
arguments are:
-XX:CICompilerCount=3 -XX:ConcGCThreads=1 -XX:+DisableExplicitGC
-XX:G1HeapRegionSize=4194304 -XX:GCLogFileSize=3145728
-XX:+HeapDumpOnOutOfMemoryError -XX:InitialHeapSize=8589934592
-XX:MarkStackSize=4194304 -XX:MaxHeapSize=8589934592 -XX:MaxNewSize=5150605312
-XX:MinHeapDeltaBytes=4194304 -XX:NativeMemoryTracking=detail
-XX:NumberOfGCLogFiles=25 -XX:+PrintGC -XX:+PrintGCApplicationConcurrentTime
-XX:+PrintGCApplicationStoppedTime -XX:+PrintGCDateStamps -XX:+PrintGCDetails
-XX:+PrintGCTimeStamps -XX:+UseCompressedClassPointers -XX:+UseCompressedOops
-XX:+UseFastUnorderedTimeStamps -XX:+UseG1GC -XX:+UseGCLogFileRotation
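Back-of-the-envelope arithmetic on those settings already frames the problem: with InitialHeapSize equal to MaxHeapSize, the full 8 GiB heap is committed up front, leaving roughly half of the 16GB host for everything outside the Java heap. A quick sketch:

```shell
# The heap reservation from the flags above, in GiB:
heap_bytes=8589934592
echo "heap: $((heap_bytes / 1024 / 1024 / 1024)) GiB"   # prints: heap: 8 GiB
# On a 16GB host, everything outside the Java heap (metaspace, code cache,
# roughly 1 MB of stack per thread, GC bookkeeping, native buffers) must fit
# in the remaining ~8 GiB.
```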
MaxMetaspaceSize is not set, which means metaspace growth is unlimited.
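If class metadata turns out to be the culprit, capping metaspace turns a silent native-memory grab into a diagnosable java.lang.OutOfMemoryError: Metaspace inside the JVM. A hedged example (the 512m value is an illustrative guess, not a sizing recommendation):

```shell
# Illustrative only: cap metaspace so a class-metadata leak fails loudly
# with OutOfMemoryError: Metaspace instead of exhausting native memory.
CATALINA_OPTS="$CATALINA_OPTS -XX:MaxMetaspaceSize=512m"
```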
After the server comes up, within a short period of time I get a native
out-of-memory condition:
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 8589934592 bytes for committing
reserved memory.
# Possible reasons:
# The system is out of physical RAM or swap space
# The process is running with CompressedOops enabled, and the Java Heap may
be blocking the growth of the native heap
# Possible solutions:
# Reduce memory load on the system
# Increase physical memory or swap space
# Check if swap backing store is full
# Decrease Java heap size (-Xmx/-Xms)
# Decrease number of Java threads
# Decrease Java thread stack sizes (-Xss)
# Set larger code cache with -XX:ReservedCodeCacheSize=
# This output file may be truncated or incomplete.
#
# Out of Memory Error (os_linux.cpp:2798), pid=191803, tid=0x000014fff94b7700
#
# JRE version: (8.0_371) (build )
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.371-b25 mixed mode linux-amd64
compressed oops)
# Core dump written. Default location: /hosting/core or core.191803
#
Enable heap dumps on OOME and look at them with a heap analyzer. You may
find that you have some huge things in memory that aren't being released
for other reasons.
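Worth noting as a sketch alongside that advice: since the original flags already include -XX:NativeMemoryTracking=detail, a native-memory breakdown is one command away, which is useful here because the failure is a native allocation, not a Java-heap OOME (the pid is illustrative):

```shell
# Summary of native allocations by category (Java Heap, Class, Thread, Code, GC):
jcmd 191803 VM.native_memory summary
# Record a baseline, then diff later to see which category grows over time:
jcmd 191803 VM.native_memory baseline
jcmd 191803 VM.native_memory summary.diff
```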
If you have to bounce your server every day, I suspect that you have a
"known resource leak" in your web application already. It's very
unlikely to be Tomcat doing this.
What you should NOT do is just keep raising the heap size until it stops
failing, because you will never find out what is taking up all that heap
space.
If you find that nothing is out of place in that heap dump then AND ONLY
then should you raise the heap size. Sometimes you really just need more
heap.
-chri
---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
For additional commands, e-mail: users-h...@tomcat.apache.org