Just a follow-up: since limiting my Solr QuerySet rows to 1000 like
this:

qs.get_results(rows=1000)

...I have not had a single OOM killer issue. I consider this resolved.
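
In case it helps anyone else with the "more than 1000 results" concern
in the thread below, here is a rough sketch of paging on the Solr side
with start/rows instead of asking for everything in one request. It uses
pysolr directly for illustration; my app actually goes through its own
QuerySet-style wrapper, so the exact names differ:

    import pysolr  # illustration only; my app uses its own wrapper

    # Hypothetical core URL - adjust to your own setup.
    solr = pysolr.Solr("http://localhost:8983/solr/mycore", timeout=10)

    def iter_results(query, page_size=100):
        """Yield matching documents page by page, so no single request
        asks Solr for a huge rows= value (the thing that blew the heap)."""
        start = 0
        while True:
            page = solr.search(query, rows=page_size, start=start)
            if not page.docs:
                break
            for doc in page.docs:
                yield doc
            start += page_size
            if start >= page.hits:  # page.hits == numFound
                break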

Side-note:
I tuned the heap and Java memory by watching the Solr dashboard and
seeing how "JVM-Memory" grows under various queries from my web app.
For reference, I keep it set like this (in solr.in.sh):

# Increase Java Heap as needed to support your indexing / query needs
SOLR_HEAP="256m"
SOLR_JAVA_MEM="-Xms256m -Xmx256m"
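
If you want actual numbers rather than eyeballing the dashboard graph,
the same JVM heap figures should also be available from Solr's Metrics
API at /solr/admin/metrics. A rough sketch is below; the exact response
layout and key names vary between Solr versions, so treat it as a
starting point only:

    import requests  # sketch only; metric key names differ across Solr versions

    # Ask the Metrics API for just the JVM heap gauges.
    resp = requests.get(
        "http://localhost:8983/solr/admin/metrics",
        params={"group": "jvm", "prefix": "memory.heap"},
    )
    jvm = resp.json()["metrics"]["solr.jvm"]
    print("heap used:", jvm["memory.heap.used"], "max:", jvm["memory.heap.max"])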

On Fri, 10 May 2024 at 07:14, Imran Chaudhry <ichaud...@gmail.com> wrote:

>
>
> On Wed, 8 May 2024 at 17:53, Rajani M <rajinima...@gmail.com> wrote:
>
>> >I think I have limited the documents returned in each query to 10,000 via
>> the client software.
>>
>> Did you mean the solr query rows param is limited to 10,000? Fetching a
>> large number of records can be inefficient; it should be less than 100, a
>> standard page size. Go through this[1] doc. Hope it helps.
>>
>> https://cwiki.apache.org/confluence/display/solr/solrperformanceproblems#SolrPerformanceProblems-JavaHeap
>>
>>
> Thank you! That looks really useful. I will read it properly later.
>
> In the meantime, I continued to get OOM issues with 256 MB, so I have
> decided to try limiting the rows returned by my queries to 1000:
>
>      # set a sensible limit, like 1000
> -    return qs.get_results(rows=50000)
> +    return qs.get_results(rows=1000)
>
> I already use a paginator [0]. The only problem with limiting the rows is
> that there could be more than 1000 results.
>
>
> [0] django.core.paginator
>
>
