Search generally trades memory and disk for speed. Solr therefore tends to
use whatever JVM heap it is given, and it also benefits greatly from extra
memory that the OS can devote to caching the index files on disk. For this
reason, while it is certainly *possible* to run Solr on the same machine as
your PHP server, it's a suboptimal setup if you want to get the most out of
Solr. In addition, if your PHP server is serving a user interface to end
users and is exposed to the internet, hosting Solr on it means you have to
be very careful about security, because *by necessary design* the admin
features can do things like delete your index or store arbitrary data in
it. These features are necessary to provide the powerful functionality that
Solr provides, and they are meant to be protected either by properly
configuring Solr's security features or by carefully sequestering Solr
behind a firewall and/or proxy.
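
To make that concrete, here is a sketch of the kind of request anyone who
can reach an unprotected Solr port could send (the host, port, and the
collection name "mycollection" are placeholders, not anything from your
setup):

    # Deletes every document in the collection -- one reason Solr should
    # never be reachable by untrusted clients.
    curl 'http://solr-host:8983/solr/mycollection/update?commit=true' \
         -H 'Content-Type: application/json' \
         -d '{"delete": {"query": "*:*"}}'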

If the server hosting Solr can be reached directly from the internet,
that's one less barrier for attackers. Typically Solr is run on a separate,
internal-only server, both for performance and to simplify the security
picture.
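
As a rough sketch (assuming a Linux host using ufw, with 10.0.0.0/24
standing in for your internal application network), restricting Solr's
default port to internal traffic might look like:

    # Allow Solr's default port only from the internal app network,
    # and refuse it from everywhere else.
    sudo ufw allow from 10.0.0.0/24 to any port 8983 proto tcp
    sudo ufw deny 8983/tcp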

While you can "go for it" if you have lots of memory, it's worth noting
that a very large heap can also lead to very long GC pauses. So unless you
need to do heavyweight analytics/stats/sorts etc., more small machines are
better than one large one (assuming you care about controlling your
maximum latency).
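
As an illustration only (the right number depends entirely on your index
and query load), the heap is capped in solr.in.sh rather than left to grow
toward total RAM, leaving the remainder to the OS page cache:

    # solr.in.sh -- 8g is a placeholder value, not a recommendation.
    # Everything not given to the heap stays available for caching the
    # index files.
    SOLR_HEAP="8g"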

-Gus

On Mon, Jul 4, 2022 at 3:31 PM Dave <hastings.recurs...@gmail.com> wrote:

> Also, for $115 I can buy a terabyte Samsung SSD, which helps a lot. It
> comes to a point where money spent on hardware outweighs money spent on
> engineering man-hours and still reaches the same conclusion: as much RAM
> as your rack can take, and as big and fast a RAID of SSDs as it can take.
> Remember, since Solr is always meant to be destroyed and recreated, you
> don't have to worry much about hardware failure if you just buy two of
> everything and have a backup server ready and waiting to take over while
> the failed original is reconstructed.
>
> > On Jul 4, 2022, at 1:32 PM, Shawn Heisey <apa...@elyograg.org> wrote:
> >
> > On 7/4/22 03:01, Mike wrote:
> >> My Solr index size is around 500GB and I have 64GB of RAM. Solr eats
> >> up all the memory and because of that PHP works very, very slowly.
> >> What can I do?
> >
> > Solr is a Java program.  A Java program will never directly use more
> > memory than you specify for the max heap size.  We cannot make any
> > general recommendations about what heap size you need, because there is
> > a good chance that any recommendation we make would be completely wrong
> > for your install.  I did see that someone recommended not going above
> > 31G ... and this is good advice.  At 32 GB, Java switches to 64-bit
> > pointers instead of 32-bit.  So a heap size of 32 GB actually has LESS
> > memory available than a heap size of 31 GB.
> >
> > The OS will use additional memory beyond the heap for caching the index
> > data, but that is completely outside of Solr's control. Note that 64GB
> > total memory for a 500GB index is almost certainly not enough memory,
> > ESPECIALLY if the same server is used for things other than Solr.  I
> > wrote the following wiki page:
> >
> > https://cwiki.apache.org/confluence/display/SOLR/SolrPerformanceProblems
> >
> > Others have recommended that you run Solr on dedicated hardware that is
> > not used for any other purpose.  I concur with that recommendation.
> >
> > Thanks,
> > Shawn
> >
>


-- 
http://www.needhamsoftware.com (work)
http://www.the111shift.com (play)
