On 1/24/24 01:27, uyil...@vivaldi.net.INVALID wrote:
Is there a general guideline for optimizing Solr for a very small number of documents in the core and low memory? For example, let's say 2000 documents and 100MB of memory. It crashes often with an OOM error under the default configuration.

Are there settings in the Solr config we can adjust so that it needs less heap when the document count is very low? This is just regular indexing and regular searches by the way, nothing fancy like facets.

The default heap size Solr starts with out of the box is 512MB, which is quite small. It is enough to run Solr, but from what I have seen, as soon as you add data and start to actually use it for more than the simplest queries, you'll need to increase the heap.
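If you can spare more memory, the heap can be raised either at startup or in the include script. The values below are only an illustration; pick a size that fits your machine:

    # one-off, at startup
    bin/solr start -m 1g

    # or persistently, in bin/solr.in.sh (solr.in.cmd on Windows)
    SOLR_HEAP="1g"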

It is not going to be possible to run Solr on a system with only 100MB of memory.

I would say that the absolute minimum system memory requirement for running Solr on a non-Windows operating system is going to be about 1GB, and 4GB would be a lot better.

One thing you can do to reduce heap requirements is to disable all the caches: just delete or comment out the cache definitions in solrconfig.xml, as in the sketch below.
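As a rough sketch, the stock cache definitions live in the <query> section and look something like the following (the element names are the real ones, the sizes are just the usual stock examples); wrapping them in a comment disables them:

    <query>
      <!-- caches disabled to reduce heap usage -->
      <!--
      <filterCache size="512" initialSize="512" autowarmCount="0"/>
      <queryResultCache size="512" initialSize="512" autowarmCount="0"/>
      <documentCache size="512" initialSize="512" autowarmCount="0"/>
      -->
      <!-- other settings in this section left as they were -->
    </query>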

Enabling docValues on fields that you use for things other than searching (sorting is one example) can also help, because those operations can then read the on-disk docValues structures instead of building large in-memory structures on the heap.
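In the schema that is just the docValues attribute on the field definition; the field name and type below are only placeholders:

    <field name="price" type="pfloat" indexed="true" stored="true" docValues="true"/>

Keep in mind that changing docValues on an existing field requires reindexing.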

Thanks,
Shawn
