Alice,

If you have a computer that crashes once you put a lot of load on it, I'd say you have bigger problems than the speed of the indexing. A computer should not crash, no matter how much load you put on it. And if you have such a huge database, I find it hard to believe you don't have access to other computers. You could even use workstations, run the indexing at night, and then merge the results into a central repository.
Lucene does allow for index merging, right?
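Roughly, the merge step looks like this. This is only a sketch using a current Lucene API (the API available back in 2007 differs slightly), and the directory paths are placeholders, not anything from this thread:

import java.nio.file.Paths;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class MergePartialIndexes {
    public static void main(String[] args) throws Exception {
        // Central repository index that receives the partial indexes.
        Directory central = FSDirectory.open(Paths.get("/indexes/central"));
        try (IndexWriter writer = new IndexWriter(central,
                new IndexWriterConfig(new StandardAnalyzer()))) {

            // Partial indexes built overnight on the workstations (placeholder paths).
            Directory partA = FSDirectory.open(Paths.get("/indexes/workstation-a"));
            Directory partB = FSDirectory.open(Paths.get("/indexes/workstation-b"));

            // addIndexes copies the segments of the partial indexes into the
            // central index; the source indexes are left untouched.
            writer.addIndexes(partA, partB);
            writer.commit();
        }
    }
}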

Russ



Alice wrote:
Unfortunately I can't use multiple machines.

And I cannot start lots of threads because the server crashes.

-----Original Message-----
From: Russ [mailto:[EMAIL PROTECTED]]
Sent: Thursday, 11 January 2007 14:33
To: java-user@lucene.apache.org
Subject: Re: Huge Index

Can you use multiple threads/machines to index the data into separate
indexes, and then combine them?
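For example, something along these lines: each worker builds its own index for one slice of the records, and the slices are merged afterwards as in the sketch above. This is only an illustration with a current Lucene API; the partition count, the paths, and the fetchRecords call are assumptions standing in for the real data access layer:

import java.nio.file.Paths;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.FSDirectory;

public class ParallelIndexer {
    public static void main(String[] args) throws Exception {
        int partitions = 4;                 // one partial index per worker
        long total = 37_000_000L;           // rough corpus size from the thread
        long slice = total / partitions + 1;

        ExecutorService pool = Executors.newFixedThreadPool(partitions);
        for (int p = 0; p < partitions; p++) {
            final long from = p * slice, to = Math.min(total, from + slice);
            final String dir = "/indexes/part-" + p;    // placeholder path
            pool.submit(() -> {
                try (IndexWriter w = new IndexWriter(
                        FSDirectory.open(Paths.get(dir)),
                        new IndexWriterConfig(new StandardAnalyzer()))) {
                    // fetchRecords is a hypothetical call that returns the text
                    // of the records whose ids fall in [from, to).
                    for (String text : fetchRecords(from, to)) {
                        Document doc = new Document();
                        doc.add(new TextField("body", text, Field.Store.NO));
                        w.addDocument(doc);
                    }
                }
                return null;
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.DAYS);
        // The partial indexes under /indexes/part-* can now be merged into a
        // central index with IndexWriter.addIndexes.
    }

    // Placeholder: stands in for whatever database access layer is in use.
    static List<String> fetchRecords(long from, long to) {
        return java.util.Collections.emptyList();
    }
}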

Russ
Sent wirelessly via BlackBerry from T-Mobile.
-----Original Message-----
From: "Alice" <[EMAIL PROTECTED]>
Date: Thu, 11 Jan 2007 13:47:36
To: <java-user@lucene.apache.org>
Subject: Huge Index

Hello!

I have to index 37 million documents retrieved from the database.

I was trying to do it by loading intervals of 10,000 records, but it is too slow.

Could anybody suggest a better way to get all the data indexed in a reasonable time?

Thanks
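One common way to speed this up, sketched here with current Lucene and JDBC APIs rather than anything confirmed in this thread, is to stream the rows with a single query instead of re-querying in 10,000-record windows, and to keep one IndexWriter open with a single commit at the end. The connection URL, SQL, column names, and index path below are assumptions:

import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.StringField;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.FSDirectory;

public class IndexFromDatabase {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:...", "user", "pass");
             IndexWriter writer = new IndexWriter(
                     FSDirectory.open(Paths.get("/indexes/docs")),
                     new IndexWriterConfig(new StandardAnalyzer()));
             PreparedStatement ps = conn.prepareStatement("SELECT id, body FROM docs")) {

            // Stream the rows instead of reloading 10,000-record intervals.
            ps.setFetchSize(1000);   // driver hint: keep roughly 1000 rows in memory
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    Document doc = new Document();
                    doc.add(new StringField("id", rs.getString("id"), Field.Store.YES));
                    doc.add(new TextField("body", rs.getString("body"), Field.Store.NO));
                    writer.addDocument(doc);   // no per-batch commit or optimize
                }
            }
            writer.commit();                   // single commit at the end
        }
    }
}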



