On 14 September 2015 at 16:43, Alexander Popov <mogada...@gmail.com> wrote:

> Yes, there are plenty of errors there, like
> Committed before 500 {msg=GC overhead limit
> exceeded,trace=java.lang.OutOfMemoryError: GC overhead limit exceeded
>  null:org.eclipse.jetty.io.EofException
>
> and so on. This is the reason why I try to restart the node.
>
> My concerns are:
> * search on this node comes to a non-working state and does not repair itself
> * a halted node requires manual actions
> * false positive report, or *riak restart*
>
Hi Alexander,

If you see garbage-collection-related Solr errors, you may want to revisit
your Java VM settings in 'riak.conf'. By default, only 1 GB of heap space
is given to the JVM. This is sufficient for light loads, but in production
you would typically want to increase the heap allocation. Solr memory
tuning may also involve switching to a different GC algorithm. See e.g.
https://wiki.apache.org/solr/ShawnHeisey or
https://wiki.apache.org/solr/SolrPerformanceProblems for more details.
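
As a sketch only (assuming a Riak 2.x 'riak.conf'; please check the option
name and defaults in the config file shipped with your version), raising the
Solr heap from the 1 GB default to 4 GB and selecting the CMS collector
might look like:

    search.solr.jvm_options = -d64 -Xms4g -Xmx4g -XX:+UseConcMarkSweepGC

JVM options only take effect after the node is restarted, and -Xms/-Xmx
should stay within the physical RAM left over after Riak's own processes.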

Regards,

Magnus

-- 
Magnus Kessler
Client Services Engineer @ Basho

Registered Office - 8 Lincoln’s Inn Fields London WC2A 3BP Reg 07970431
_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com