[
https://issues.apache.org/jira/browse/LUCENE-8043?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16271874#comment-16271874
]
Yonik Seeley edited comment on LUCENE-8043 at 11/30/17 12:32 AM:
-----------------------------------------------------------------
I had worked on tracking this down for a bit before I got pulled off onto
something else...
I remember adding the boolean to drop() just as this patch does, but when using
that I only put the conditional around the pendingNumDocs decrement (in
multiple places). Perhaps that's why it didn't work to fix the issue for me...
edit: Actually, it looks like SegmentInfos.remove(SegmentCommitInfo) was also
part of my attempted fix:
{code}
- public void remove(SegmentCommitInfo si) {
- segments.remove(si);
+ public boolean remove(SegmentCommitInfo si) {
+ return segments.remove(si);
}
{code}
I also exposed pendingNumDocs for testing reasons and then tested it against
expected values, and was able to get tests that reliably failed after a handful
of updates. I'll try digging that up and see if it passes with this patch.
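To illustrate the pattern being discussed, here is a minimal, self-contained sketch (class and field names are illustrative stand-ins, not Lucene's actual code): the removal method returns a boolean so the caller can decrement the pending doc count only when the segment was really present, making a double-drop a no-op instead of an under-count.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical, simplified stand-in for the SegmentInfos/IndexWriter
// bookkeeping; not Lucene's actual implementation.
class SegmentList {
    private final List<String> segments = new ArrayList<>();
    final AtomicLong pendingNumDocs = new AtomicLong();

    void add(String segment, int docCount) {
        segments.add(segment);
        pendingNumDocs.addAndGet(docCount);
    }

    // Returning boolean tells the caller whether the segment was actually
    // present, mirroring the SegmentInfos.remove change shown above.
    boolean remove(String segment) {
        return segments.remove(segment);
    }

    // Decrement the pending doc count only when a segment was really
    // dropped; dropping the same segment twice must not decrement twice.
    void drop(String segment, int docCount) {
        if (remove(segment)) {
            pendingNumDocs.addAndGet(-docCount);
        }
    }
}
```

Without the conditional, a second drop of the same segment would silently push pendingNumDocs below the true count, which is one way the limit check can be fooled.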
> Attempting to add documents past limit can corrupt index
> --------------------------------------------------------
>
> Key: LUCENE-8043
> URL: https://issues.apache.org/jira/browse/LUCENE-8043
> Project: Lucene - Core
> Issue Type: Bug
> Components: core/index
> Affects Versions: 4.10, 7.0, master (8.0)
> Reporter: Yonik Seeley
> Assignee: Simon Willnauer
> Attachments: LUCENE-8043.patch
>
>
> The IndexWriter check for too many documents does not always work, resulting
> in going over the limit. Once this happens, Lucene refuses to open the index
> and throws a CorruptIndexException: Too many documents.
> This appears to affect all versions of Lucene/Solr (the check was first
> implemented in LUCENE-5843 in v4.9.1/4.10, and we've seen it manifest in
> 4.10).
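The failure mode described above is the classic check-then-act race: two threads can both pass a non-atomic "count + n <= limit" test before either one increments the counter, so both reservations succeed and the index ends up over the limit. A hedged sketch of the atomic alternative, a compare-and-set loop (illustrative only, not Lucene's actual code):

```java
import java.util.concurrent.atomic.AtomicLong;

class DocLimit {
    static final long MAX_DOCS = 10; // small limit for illustration
    final AtomicLong count = new AtomicLong();

    // Atomically reserve room for numDocs, or fail without changing count.
    boolean tryReserve(int numDocs) {
        while (true) {
            long current = count.get();
            if (current + numDocs > MAX_DOCS) {
                return false; // would exceed the limit; reject
            }
            if (count.compareAndSet(current, current + numDocs)) {
                return true; // reservation committed atomically
            }
            // lost a race with another thread; retry with a fresh value
        }
    }
}
```

The CAS loop ensures the test and the increment happen against the same observed value, so no interleaving of threads can reserve past MAX_DOCS.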
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)