Aha, it's not initially clear, but after looking at it more closely, I see how
it works
now. This is very good to know.
Tony Schwartz
[EMAIL PROTECTED]
> Tony Schwartz wrote:
>> What about the TermInfosReader class? It appears to read the entire term
>> set for the
>> segment into 3 arrays
Tony Schwartz wrote:
What about the TermInfosReader class? It appears to read the entire term set
for the
segment into 3 arrays. Am I seeing double on this one?
p.s. I am looking at the current sources.
see TermInfosReader.ensureIndexIsRead();
The index only has 1/128 of the terms, by default.
Tony Schwartz wrote:
I think you're jumping into the conversation too late. What you have said here
does not
address the problem at hand. That is, in TermInfosReader, all terms in the
segment get
loaded into three very large arrays.
That's not true. Only 1/128th of the terms are loaded by default.
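For readers following along: the 1/128 figure comes from the term index interval. Only every 128th term of the sorted dictionary is kept in memory; a lookup binary-searches that sample and then scans at most one interval of the full term list. A plain-Java sketch of the idea (class and method names are illustrative, not Lucene's actual internals, and the full in-memory array here just simulates the on-disk dictionary):

```java
import java.util.Arrays;

// Illustrative sketch of an interval-sampled term index: hold every
// INTERVAL-th term in memory, binary-search the sample, then do a short
// sequential scan through the full (normally on-disk) term list.
public class SampledTermIndex {
    static final int INTERVAL = 128;

    final String[] allTerms;   // stands in for the on-disk term dictionary
    final String[] indexTerms; // in-memory sample: every 128th term

    SampledTermIndex(String[] sortedTerms) {
        this.allTerms = sortedTerms;
        int n = (sortedTerms.length + INTERVAL - 1) / INTERVAL;
        this.indexTerms = new String[n];
        for (int i = 0; i < n; i++) {
            indexTerms[i] = sortedTerms[i * INTERVAL];
        }
    }

    /** Returns the position of term in the full list, or -1 if absent. */
    int lookup(String term) {
        int pos = Arrays.binarySearch(indexTerms, term);
        if (pos < 0) pos = -pos - 2;        // last sampled term <= term
        if (pos < 0) return -1;             // term sorts before everything
        int start = pos * INTERVAL;
        int end = Math.min(start + INTERVAL, allTerms.length);
        for (int i = start; i < end; i++) { // scan at most one interval
            if (allTerms[i].equals(term)) return i;
        }
        return -1;
    }
}
```

So memory grows with the number of terms divided by 128, at the cost of a bounded sequential scan per lookup.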
On Thursday 18 August 2005 14:32, Tony Schwartz wrote:
> Is this a viable solution?
> Doesn't this make sorting and filtering much more complex and much more
> expensive as well?
Sorting would have to be done on more than one field.
I would expect that to be possible.
As for filtering: would you
www.aviransplace.com
Is this a viable solution?
Doesn't this make sorting and filtering much more complex and much more
expensive as well?
Tony Schwartz
[EMAIL PROTECTED]
On Wednesday 17 August 2005 22:49, Paul Elschot wrote:
> > the index could potentially be huge.
> >
> > So if this is indeed the case, it is a potential scalability
> > bottleneck in lucene index size.
>
> Splitting the date field into century, year in century, month, day, hour,
> seconds, and
>
Thanks,
>
> Tony Schwartz
> [EMAIL PROTECTED]
>
> From: John Wang <[EMAIL PROTECTED]>
> Subject: Re: OutOfMemoryError on addIndexes()
>
> Under many us
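Paul's suggestion above, splitting one date field into coarser and finer components so that each field's term set stays tiny, can be sketched like this (the field names and the `split` helper are made up for illustration; this is plain Java, not Lucene API):

```java
import java.util.Calendar;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.TimeZone;

// Sketch: instead of one field with millions of unique timestamp terms,
// index several fields whose value sets are small (at most ~100 terms each),
// which keeps the overall term dictionary from exploding.
public class DateFields {
    static Map<String, String> split(long epochMillis) {
        Calendar c = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
        c.setTimeInMillis(epochMillis);
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("century", String.valueOf(c.get(Calendar.YEAR) / 100));
        fields.put("yearInCentury", String.format("%02d", c.get(Calendar.YEAR) % 100));
        fields.put("month", String.format("%02d", c.get(Calendar.MONTH) + 1)); // MONTH is 0-based
        fields.put("day", String.format("%02d", c.get(Calendar.DAY_OF_MONTH)));
        fields.put("hour", String.format("%02d", c.get(Calendar.HOUR_OF_DAY)));
        fields.put("minute", String.format("%02d", c.get(Calendar.MINUTE)));
        fields.put("second", String.format("%02d", c.get(Calendar.SECOND)));
        return fields;
    }
}
```

As the thread notes, the trade-off is that sorting and range filtering must then combine several fields instead of comparing one.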
urned me in the past. I am going to start
working on and
testing a solution to this, but was wondering if anyone had already messed with
it or
had any ideas up front?
Thanks,
Tony Schwartz
[EMAIL PROTECTED]
the nature of your indexes?
>
>
> : Date: Fri, 12 Aug 2005 09:45:40 +0200
> : From: Trezzi Michael <[EMAIL PROTECTED]>
> : Reply-To: java-user@lucene.apache.org
> : To: java-user@lucene.apache.org
> : Subject: RE: OutOfMemoryError on addIndexes()
> :
> : I did some m
From: Ian Lea [mailto:[EMAIL PROTECTED]
: Sent: Wed 10.8.2005 12:34
: To: java-user@lucene.apache.org
: Subject: Re: OutOfMemoryError on addIndexes()
:
: How much memory are you giving your programs?
:
: java -Xmx<size>   set maximum Java heap size
:
: --
: Ian.
:
: On 10/08/05, Trezzi Mich
would shrink the java memory pool back down to the min?
Thanks,
Tom
-Original Message-
From: Otis Gospodnetic [mailto:[EMAIL PROTECTED]
Sent: Thursday, August 11, 2005 11:15 AM
To: java-user@lucene.apache.org
Subject: Re: OutOfMemoryError on addIndexes()
> > Is -Xmx case sensitive? Should it be 1000m instead of 1000M? Not
> > sure.
> >
>
> I am starting with:
> java -Xms256M -Xmx512M -jar Suchmaschine.jar
And if you look at the size of your JVM, does it really use all 512 MB?
If it does not, maybe you can try this:
java -Xms256m -Xmx512m -jar Suchmaschine.jar
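A quick way to confirm what `-Xmx` actually took effect is to ask the JVM itself. A small standalone sketch (the class name is hypothetical, and nothing here is Lucene-specific):

```java
// Prints the heap limits the running JVM was actually granted, which is
// useful for verifying that -Xms/-Xmx flags were parsed as intended.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap (MB):   " + rt.maxMemory() / (1024 * 1024));
        System.out.println("total heap (MB): " + rt.totalMemory() / (1024 * 1024));
        System.out.println("free heap (MB):  " + rt.freeMemory() / (1024 * 1024));
    }
}
```

Run it with the same flags as the real program, e.g. `java -Xms256m -Xmx512m HeapCheck`, and compare the reported max heap with what you expected.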
Otis Gospodnetic wrote:
> Is -Xmx case sensitive? Should it be 1000m instead of 1000M? Not
> sure.
>
I am starting with:
java -Xms256M -Xmx512M -jar Suchmaschine.jar
--
The analytical engine (the computer) can only carry out what we are able
to program. (Ada Lovelace)
On 10/08/05, Trezzi Michael <[EMAIL PROTECTED]> wrote:
Hello,
I have a problem and I tried everything I could think of to solve it. To
understand my situation, I create indexes on several computers on our network
and they are copied to one server. There, once a day, they are merged into one
masterIndex, which is then searched. The problem is in merging
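For what it's worth, a merge of several segments does not inherently need to buffer whole indexes: conceptually it is a k-way merge of sorted term streams, whose memory use is proportional to the number of segments, not their size. A plain-Java sketch of that idea (illustrative only, not Lucene's actual merge code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.PriorityQueue;

// Sketch: merge k sorted term lists with a heap of k cursors, so memory
// stays O(k) no matter how large each input segment is.
public class KWayMerge {
    // One cursor per input segment; only the current term is examined.
    static final class Cursor implements Comparable<Cursor> {
        final List<String> terms;
        int pos;
        Cursor(List<String> terms) { this.terms = terms; }
        String current() { return terms.get(pos); }
        public int compareTo(Cursor o) { return current().compareTo(o.current()); }
    }

    static List<String> merge(List<List<String>> segments) {
        PriorityQueue<Cursor> heap = new PriorityQueue<>();
        for (List<String> s : segments) {
            if (!s.isEmpty()) heap.add(new Cursor(s));
        }
        List<String> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            Cursor c = heap.poll();
            String t = c.current();
            // collapse duplicate terms appearing in several segments
            if (out.isEmpty() || !out.get(out.size() - 1).equals(t)) out.add(t);
            if (++c.pos < c.terms.size()) heap.add(c); // advance this segment
        }
        return out;
    }
}
```

An OutOfMemoryError during addIndexes() therefore usually points at heap settings or at per-segment structures being held in memory, rather than at the merge algorithm needing the whole data set at once.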