On 10/10/23 05:05, John Jackson wrote:
Now for the 20 questions:
What OS do you have this running on?  --> CentOS
How much physical memory is in the machine? --> 370 GB
Is it running anything other than Solr? --> Yes, we have 3 Wildfly
applications (each with 50 GB of memory) and 2 microapps (each with 20 GB).
What is the Solr heap size?  --> 100 GB of memory is assigned to Solr.
Are you running more than one Solr instance on a single machine? --> Yes,
we are running two Solr instances on one machine. We have 2 shards with 2
replicas each, so one shard and the other shard's replica are running on
this server.

Why run two instances, especially if each one is taking 100 GB of memory? One instance can handle multiple replicas with ease.
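For reference, a single Solr instance can host all of this with one Collections API call; a sketch (the collection name "mycoll", host, and port are placeholders for your setup):

```shell
# Create a 2-shard collection with 2 replicas per shard on one Solr node.
# "mycoll" is a hypothetical name; adjust host/port as needed.
# On Solr 8.x and earlier you also need maxShardsPerNode to allow more
# than one core per node (the parameter was removed in 9.x).
curl "http://localhost:8983/solr/admin/collections?action=CREATE&name=mycoll&numShards=2&replicationFactor=2&maxShardsPerNode=4"
```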

100 GB seems like WAY too much heap. Do you have any evidence that you actually need a heap that big? Solr does not load most index data into the heap, it relies on the OS to cache the bulk of the index data.
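If testing shows a smaller heap is enough, it goes in solr.in.sh; a sketch (the path and the 31g value are illustrative, not a recommendation):

```shell
# /etc/default/solr.in.sh (the file's location varies by install method).
# Heaps at or below ~31 GB let the JVM use compressed object pointers;
# the right size for your workload has to be determined by testing.
SOLR_HEAP="31g"
```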

Have you set up custom GC tuning, or are you using Solr's default? What Java vendor and version are you running?
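If you have customized GC settings, they would normally appear as GC_TUNE in solr.in.sh; a sketch of what that looks like (these G1 flags are only an example starting point, not tuning advice):

```shell
# solr.in.sh: example G1GC settings. Recent Solr versions default to G1.
# Flags shown are standard HotSpot options; values are illustrative.
GC_TUNE="-XX:+UseG1GC -XX:MaxGCPauseMillis=250 -XX:+ParallelRefProcEnabled"
```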

Also, each of the replicas of a shard must run on different physical hardware, or you do not actually have redundancy. If you only have one machine, you might as well run with only one replica since the second replica is only draining resources and will not provide redundancy.

I am betting that you are in a situation where there is simply not enough memory for what you are running, especially with 700 GB of index data and multiple processes other than Solr on the machine.
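Adding up the figures you gave (two 100 GB Solr heaps, three 50 GB Wildfly apps, two 20 GB microapps) already exceeds physical RAM before the OS page cache gets a single byte for that 700 GB index:

```shell
# Back-of-the-envelope memory accounting from the numbers above, in GB:
# 2 Solr heaps + 3 Wildfly apps + 2 microapps
echo $((2*100 + 3*50 + 2*20))   # prints 390, against 370 GB of physical RAM
```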

Can you get the screenshot mentioned on the wiki page that I linked in my earlier message? Note that you cannot attach an image file to an email to the list, as it will be deleted. You need to use a file/image sharing site like Dropbox or Imgur and give us a link.

Thanks,
Shawn
