From: Erick Erickson [mailto:[EMAIL PROTECTED]
Sent: Monday, August 14, 2006 6:20 PM
To: java-user@lucene.apache.org
Subject: Re: 7GB index taking forever to return hits
I actually suspect that your process isn't hung, it's just taking
forever because it's swapping a lot. [...] "quoted" searches are really
just phrase searches ... that should still be faster than what you're
doing now.
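[Archive note: a minimal sketch of the phrase search Erick describes, assuming the Lucene 1.x/2.x-era API used elsewhere in this thread and a tokenized CONTENTS field; the field name and terms are taken from the later messages below, not from Erick's mail.]

```java
// Sketch (Lucene 2.x-era API): a quoted search like "white hard hat"
// is equivalent to a PhraseQuery over the tokenized CONTENTS field.
import org.apache.lucene.index.Term;
import org.apache.lucene.search.PhraseQuery;

PhraseQuery pq = new PhraseQuery();
pq.add(new Term("CONTENTS", "white"));
pq.add(new Term("CONTENTS", "hard"));
pq.add(new Term("CONTENTS", "hat"));
// Default slop is 0: terms must appear adjacent and in order.
```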
:
: Thanks,
:
: Van
:
: -Original Message-
: From: Erick Erickson [mailto:[EMAIL PROTECTED]
: Sent: Monday, August 14, 2006 6:20 PM
: To: java-user@lucene.apache.org
: Subject: Re: 7GB index taking forever to return hits
> >
> > TermQuery t1 = new TermQuery(new Term("COMPANY_CODE", "u1"));
> > q.add(t1, BooleanClause.Occur.MUST);
> >
> > TermQuery t2 = new TermQuery(new Term("LANGUAGE", "enu"));
> > q.add(t2, BooleanClause.Occur.MUST);
To: java-user@lucene.apache.org
Subject: RE: 7GB index taking forever to return hits
Sounds like you want to tokenise CONTENTS, if you are not already doing
so.
Then you could simply have:
+CONTENTS:white +CONTENTS:hard +CONTENTS:hat
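[Archive note: a sketch of that same query built programmatically, assuming the Lucene 1.x/2.x-era API used in this thread and a CONTENTS field indexed with a tokenizing analyzer.]

```java
// Sketch: the query  +CONTENTS:white +CONTENTS:hard +CONTENTS:hat
// expressed as a BooleanQuery of required TermQuery clauses.
import org.apache.lucene.index.Term;
import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.TermQuery;

BooleanQuery q = new BooleanQuery();
q.add(new TermQuery(new Term("CONTENTS", "white")), BooleanClause.Occur.MUST);
q.add(new TermQuery(new Term("CONTENTS", "hard")), BooleanClause.Occur.MUST);
q.add(new TermQuery(new Term("CONTENTS", "hat")), BooleanClause.Occur.MUST);
```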
-Original Message-
From: Van Nguyen [mailto:[EMAIL PROTECTED]
Sent: 15 A
Subject: RE: 7GB index taking forever to return hits
It was how I was implementing the search.
I am using a boolean query. Prior to the 7GB index, I was searching over a
150MB index that consists of a very small part of the bigger index. I was
able to raise my BooleanQuery clause limit via
BooleanQuery.setMaxClauseCount
>
> TermQuery t1 = new TermQuery(new Term("COMPANY_CODE", "u1"));
> q.add(t1, BooleanClause.Occur.MUST);
>
> TermQuery t2 = new TermQuery(new Term("LANGUAGE", "enu"));
> q.add(t2, BooleanClause.Occur.MUST);
> .
> .
> .
> I take it this is not the most optimal way about this.
>
> So that leads me to my next question... What is the most optimal way
> about this?
I take it this is not the most optimal way about this.
So that leads me to my next question... What is the most optimal way
about this?
Van
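[Archive note: the clause limit mentioned above is a static, JVM-wide setting; in Lucene of this era the default is 1024 and exceeding it throws BooleanQuery.TooManyClauses. A sketch, with an arbitrary example value:]

```java
import org.apache.lucene.search.BooleanQuery;

// Raise the BooleanQuery clause limit (default 1024).
// Applies JVM-wide; larger values let expanded queries (e.g. wildcards)
// through at the cost of more memory and CPU per search.
BooleanQuery.setMaxClauseCount(10240);  // example value, not a recommendation
```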
-Original Message-
From: yueyu lin [mailto:[EMAIL PROTECTED]
Sent: Monday, August 14, 2006 11:30 AM
To: java-user@lucene.apache.org
Subject: Re: 7GB index taking forever to return hits
The 2GB limitation only exists when you want to load the index into
memory on a 32-bit box.
Our index is larger than 13 gigabytes, and it works fine.
I think there must be some error in your design. You can use Luke to see
what is happening in your index.
On 8/14/06, Van Nguyen <[EMAIL PROTECTED]> wrote:
Hi,
I have a 7GB index (about 45 fields per document X roughly
5.5 million docs) running on a Windows 2003 32bit machine (dual proc, 2GB
memory). The index is optimized. Performing a search on this index
will just “hang” (wild card query with a sort). At fi