Also, for about $115 I can buy a terabyte Samsung SSD, which helps a lot. At 
some point the money spent on hardware costs less than the engineering hours 
spent tuning around a shortage, and you land on the same conclusion anyway: as 
much RAM as the box can take, and as large and fast a RAID of SSDs as it can 
take. And remember, since a Solr index is meant to be destroyed and recreated, 
you don't have to worry much about hardware failure if you buy two of 
everything and keep a backup server ready to take over while the failed 
original is rebuilt.
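
On the heap itself, the 31G ceiling Shawn mentions below is easy to apply. Just 
a sketch, assuming a stock install where bin/solr and solr.in.sh are in play; 
adjust paths and sizes for your own setup:

    # Cap the heap below 32 GB so the JVM keeps compressed (32-bit) object pointers.
    # Either pass it at start time...
    bin/solr start -m 31g

    # ...or set it once in solr.in.sh (location varies by install,
    # often /etc/default/solr.in.sh for the service installer):
    SOLR_HEAP="31g"

    # Quick check that compressed oops are still enabled at a given heap size:
    java -Xmx31g -XX:+PrintFlagsFinal -version | grep -i usecompressedoops   # true
    java -Xmx32g -XX:+PrintFlagsFinal -version | grep -i usecompressedoops   # false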

> On Jul 4, 2022, at 1:32 PM, Shawn Heisey <apa...@elyograg.org> wrote:
> 
> On 7/4/22 03:01, Mike wrote:
>> My Solr index size is around 500GB and I have 64GB of RAM. Solr eats up all
>> the memory and because of that PHP works very, very slowly. What can I do?
> 
> Solr is a Java program.  A Java program will never directly use more memory 
> than you specify for the max heap size.  We cannot make any general 
> recommendations about what heap size you need, because there is a good chance 
> that any recommendation we make would be completely wrong for your install.  
> I did see that someone recommended not going above 31G ... and this is good 
> advice.  At 32 GB, Java switches to 64-bit pointers instead of 32-bit.  So a 
> heap size of 32 GB actually has LESS memory available than a heap size of 31 
> GB.
> 
> The OS will use additional memory beyond the heap for caching the index data, 
> but that is completely outside of Solr's control. Note that 64GB total memory 
> for a 500GB index is almost certainly not enough memory, ESPECIALLY if the 
> same server is used for things other than Solr.  I wrote the following wiki 
> page:
> 
> https://cwiki.apache.org/confluence/display/SOLR/SolrPerformanceProblems
> 
> Others have recommended that you run Solr on dedicated hardware that is not 
> used for any other purpose.  I concur with that recommendation.
> 
> Thanks,
> Shawn
> 
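
On the page-cache point above, a quick way to see where a box stands. Again 
just a sketch; /var/solr/data is an assumption, so point it at wherever your 
cores keep their data:

    # Compare how much index sits on disk with how much RAM the OS has left to cache it
    du -sh /var/solr/data    # total index size (use your core's actual data dir)
    free -g                  # the "available" column is roughly what's left for the page cache

If the index is several times larger than the available memory, adding RAM or 
splitting the index across machines is usually the only real fix.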
