On 12/13/21 8:53 AM, Scott wrote:
I guess my questions are:
- Why does Solr use more than 16g ?
- Why isn't swapped memory released ?

Solr is a Java program, and it is Java that manages the memory.  If the process is going significantly beyond the 16GB max heap that has been configured, and swap is in use, then it is most likely Java that is broken.  Java simply shouldn't use much more memory than it has been told it can use.  Some overhead beyond the configured max heap will always be required (things like metaspace, thread stacks, and the JVM's own code), but that overhead is typically small, maybe a few hundred MB when the heap is 16GB.
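One quick sanity check is to compare the process's resident memory (RSS) against the configured heap.  A rough sketch, demonstrated here on the current shell's PID; in practice substitute the Solr PID (the "start.jar" pgrep pattern is an assumption, adjust it for your install):

```shell
# Resident set size (RSS, in KB) of a process.  For Solr you would use
# something like: pgrep -f start.jar   to find the PID first.
rss_kb=$(ps -o rss= -p $$)
echo "RSS: ${rss_kb} KB"
# With -Xmx16g, an RSS far beyond ~17 GB suggests non-heap memory growth.
```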

Swap is managed by the OS, not Java or Solr.  A significant bug in the OS seems unlikely, but that sort of thing HAS happened before.  Do you have all the packages on the server, including the kernel and Java, fully up to date?
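On Linux you can see swap totals system-wide, and also how much swap is attributed to a single process.  A minimal sketch (substitute the actual Solr PID where indicated):

```shell
# System-wide swap totals, straight from the kernel:
grep -E 'SwapTotal|SwapFree' /proc/meminfo
# Swap charged to one specific process (substitute the Solr PID):
#   grep VmSwap /proc/<PID>/status
```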

All this assumes that there are no other processes on the machine that are using significant amounts of memory.

If you can't get an updated version of Java 11, what I would try next is installing openjdk8 and setting JAVA_HOME in solr.in.sh so it points at that version of Java.  Java 8 is the minimum requirement for Solr 8.x.  It has been out a lot longer than Java 11, which in theory would mean it's less likely to be affected by major bugs.
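Pointing Solr at a different JVM is a one-line change in solr.in.sh.  The path below is only an illustration for a Debian/Ubuntu-style openjdk-8 package; adjust it to wherever your distribution actually installs Java 8:

```shell
# In solr.in.sh (the path is an assumption; verify it on your system):
JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64"
```

Restart Solr afterwards so the new JAVA_HOME takes effect, and confirm the running version on the admin UI's dashboard.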

Something seems to be very broken somewhere, either in Java or the OS.  I would suspect Java before the OS.  It just shouldn't behave like this.  On a well-tuned system that is not suffering from significant bugs, it should never be necessary to restart Solr. Solr has not had valid reports of a memory leak bug in a long time... but even if Solr did have a memory leak, the leaked objects would live inside the Java heap, which is capped at the configured max, so that still would not cause Java to use so much extra memory.
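If you want to see where memory beyond the heap is actually going, the JVM's Native Memory Tracking feature can break it down.  A sketch assuming the standard solr.in.sh; note the flag adds a small runtime overhead, so it is best used temporarily while diagnosing:

```shell
# In solr.in.sh, enable Native Memory Tracking (summary mode):
SOLR_OPTS="$SOLR_OPTS -XX:NativeMemoryTracking=summary"
# After restarting Solr, ask the JVM for the breakdown (substitute the PID):
#   jcmd <PID> VM.native_memory summary
```

The report categorizes committed memory into heap, class metadata, thread stacks, code cache, and so on, which helps distinguish a JVM problem from an OS one.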

Thanks,
Shawn
