aslam bari wrote:
Hi Saikrishna, unluckily my XML structure is not the same: sometimes
the nodes run very long and sometimes very short. One element may run
through the whole document, or many elements of different types may
appear. So I need your help on how to parse it in a good way.
To: java-user@lucene.apache.org
Sent: Monday, 22 January, 2007 10:44:50 AM
Subject: Re: Big size xml file indexing
Hi,
Nothing needs to change in the indexing process. What it requires is a
little pre-processing. If the structure of your XML file is the same as
what I said earlier, then split the 35MB file into smaller files first.
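The splitting step suggested above can be sketched with the JDK's streaming StAX parser, so the large file never has to be held in memory at once. The record element name "doc", the "docs" wrapper element, and the chunk size are assumptions for illustration, not details from the thread; the sketch also drops attributes and does not re-escape text, so it is a starting point rather than a robust splitter.

```java
import javax.xml.stream.*;
import java.io.*;
import java.util.*;

// Sketch: split an XML file of repeated <doc> records into chunks of
// docsPerChunk records each, using the JDK's streaming StAX parser so
// the whole file is never built as a tree in memory.
// Simplifications: attributes are ignored, text is not re-escaped,
// and records are assumed not to nest inside each other.
public class XmlSplitter {
    public static List<String> split(Reader xml, String recordTag, int docsPerChunk)
            throws XMLStreamException {
        XMLStreamReader r = XMLInputFactory.newInstance().createXMLStreamReader(xml);
        List<String> chunks = new ArrayList<>();
        StringBuilder chunk = new StringBuilder("<docs>");
        StringBuilder current = null;   // the record being copied, if any
        int inChunk = 0, depth = 0;
        while (r.hasNext()) {
            int ev = r.next();
            if (ev == XMLStreamConstants.START_ELEMENT) {
                if (r.getLocalName().equals(recordTag)) { current = new StringBuilder(); depth = 0; }
                if (current != null) { current.append('<').append(r.getLocalName()).append('>'); depth++; }
            } else if (ev == XMLStreamConstants.CHARACTERS) {
                if (current != null) current.append(r.getText());
            } else if (ev == XMLStreamConstants.END_ELEMENT) {
                if (current != null) {
                    current.append("</").append(r.getLocalName()).append('>');
                    depth--;
                    if (depth == 0 && r.getLocalName().equals(recordTag)) {
                        chunk.append(current);
                        current = null;
                        if (++inChunk == docsPerChunk) {
                            chunks.add(chunk.append("</docs>").toString());
                            chunk = new StringBuilder("<docs>");
                            inChunk = 0;
                        }
                    }
                }
            }
        }
        if (inChunk > 0) chunks.add(chunk.append("</docs>").toString());
        return chunks;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<docs><doc><t>a</t></doc><doc><t>b</t></doc><doc><t>c</t></doc></docs>";
        List<String> parts = split(new StringReader(xml), "doc", 2);
        System.out.println(parts.size());   // two chunks: records a+b, then c
        System.out.println(parts.get(1));
    }
}
```

For a 35MB or 6.2GB input one would stream from a FileReader and write each chunk to its own file instead of collecting strings, but the event loop stays the same.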
- Original Message
From: saikrishna venkata pendyala <[EMAIL PROTECTED]>
To: java-user@lucene.apache.org
Sent: Monday, 22 January, 2007 10:07:27 AM
Subject: Re: Big size xml file indexing
Hi,
I have indexed a 6.2 GB XML file using Lucene. What I did was:
1. I split the 6.2 GB file into small files, each of size 10 MB.
2. I then wrote a Python script to count the number of documents
in each file.
Structure of my xml file is """
The query loop was:

    Document doc = builder.build(CONTENT);
    loop(---)
    {
        doc.selectNodes(xpathquery);
    }

Thanks...
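The build-once, query-many pattern shown above (parse the document a single time, then run XPath queries against it in a loop) can be sketched with the JDK's built-in DOM and XPath APIs. The thread itself uses JDOM's builder and selectNodes; this stand-in uses only javax.xml so it is dependency-free, and the document and query here are illustrative, not from the thread.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.*;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;
import java.io.StringReader;

// Sketch of build-once / query-many with the JDK's DOM and XPath
// (a stand-in for JDOM's builder.build + selectNodes).
public class XPathLoop {
    public static int countMatches(String xml, String query) throws Exception {
        // Build the in-memory tree ONCE -- this is the expensive,
        // memory-bound step that hurts on very large files.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        // Compile the XPath expression once and reuse it for every query.
        XPathExpression expr = XPathFactory.newInstance().newXPath().compile(query);
        NodeList nodes = (NodeList) expr.evaluate(doc, XPathConstants.NODESET);
        return nodes.getLength();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<docs><doc><t>a</t></doc><doc><t>b</t></doc></docs>";
        System.out.println(countMatches(xml, "//doc"));   // 2
    }
}
```

Compiling the expression outside the loop and reusing the already-built tree is what makes the per-query cost small; the memory cost of the tree itself is why splitting the input into 10 MB files helps.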
Dear all,
I am using Lucene to index XML files. For parsing I am using JDOM to get
XPath nodes, do some manipulation on them, and index them. Everything
works well, but when the file size is very big, about 35-50 MB, it runs
out of memory or takes a lot of time. How can I set some parameter to
avoid this?
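A memory-friendly alternative to building the whole JDOM tree is to stream the file and pull out each record's text as it goes past, so memory use stays flat no matter how large the file is. This sketch uses the JDK's StAX parser; the "doc" record element is an assumption, records are assumed not to be nested, and the Lucene call that would turn each record's text into an indexed document is omitted.

```java
import javax.xml.stream.*;
import java.io.*;
import java.util.*;

// Sketch: stream a large XML file with StAX and collect the text of
// each <doc> record, instead of building one big in-memory tree.
// Assumptions: the record element is not nested inside itself, and
// downstream (e.g. adding a Lucene Document per record) is not shown.
public class StreamingExtractor {
    public static List<String> recordTexts(Reader xml, String recordTag)
            throws XMLStreamException {
        XMLStreamReader r = XMLInputFactory.newInstance().createXMLStreamReader(xml);
        List<String> out = new ArrayList<>();
        StringBuilder current = null;   // non-null while inside a record
        while (r.hasNext()) {
            switch (r.next()) {
                case XMLStreamConstants.START_ELEMENT:
                    if (r.getLocalName().equals(recordTag)) current = new StringBuilder();
                    break;
                case XMLStreamConstants.CHARACTERS:
                    if (current != null) current.append(r.getText());
                    break;
                case XMLStreamConstants.END_ELEMENT:
                    if (current != null && r.getLocalName().equals(recordTag)) {
                        out.add(current.toString().trim());
                        current = null;   // record finished; memory is released
                    }
                    break;
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<docs><doc>hello world</doc><doc>big xml</doc></docs>";
        System.out.println(recordTexts(new StringReader(xml), "doc"));
    }
}
```

With this shape there is no single parameter to tune: the out-of-memory problem disappears because only one record's text is held at a time.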