[ https://issues.apache.org/jira/browse/SOLR-11211?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eric Pugh resolved SOLR-11211.
------------------------------
    Resolution: Won't Fix

This may also be a valid bug in non-HDFS use; if so, please reopen. Closing since 
HDFS support has been removed in Solr 10.

> Too many documents, composite IndexReaders cannot exceed 2147483519
> -------------------------------------------------------------------
>
>                 Key: SOLR-11211
>                 URL: https://issues.apache.org/jira/browse/SOLR-11211
>             Project: Solr
>          Issue Type: Task
>         Environment: Hadoop Centos6
>            Reporter: Wael
>            Priority: Major
>
> I am running a single-node Hadoop Solr machine with 64 GB of RAM.
> The issue is that I was using the machine successfully until yesterday, when 
> I restarted it and one of the indexes I am working on would not start, giving 
> the error: "Too many documents, composite IndexReaders cannot exceed 
> 2147483519". 
> I wonder how Solr allowed me to add more documents than a single shard 
> can take. I need a way to start the index back up, and I don't want to lose 
> all the data, as I only have a two-week-old backup. 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
