Yeah, there's a boot setting on Windows that allows you to use up to .. erm 3G I think it was. The limitation there is down to how 32-bit Windows splits the address space between user and kernel mode, not the JVM itself. I don't remember off hand exactly what that setting was, but I'm 100% certain that it's there.

If you do a Google search for "JVM maximum memory settings on Windows" you should be able to find a few articles about it.

(At least that's certainly my recollection)
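(If memory serves, the setting in question is most likely the /3GB switch in boot.ini, which changes the 32-bit user/kernel address split from 2G/2G to 3G/1G; how much of that a given JVM build can actually use depends on whether its executable was built large-address-aware. A sketch, with the OS entry name obviously varying per machine:

```ini
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP" /fastdetect /3GB
```

)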

Secondly, if you have a Linux machine available you should likely just use that, particularly if it's a 64-bit processor, because then a whole ton more memory becomes available to you.

When I'm developing my indexes I do it via Eclipse on my Windows platform, but with the actual directories themselves mounted from a Solaris machine. When I go to actually MAKE the indexes I simply log in to that machine, do a quick ant compile, and run them. Sure, it's an extra step, but the gains are more than worth it in our case.

Matt

Sudarsan, Sithu D. wrote:
Hi Matt,

We use a 32-bit JVM. Though it is supposed to handle up to 4GB, any
assignment above 2GB in Windows XP fails. The machine has dual
quad-core processors.

On Linux we're able to use 4GB though!

If there is any setting that will let us use 4GB do let me know.

Thanks,
Sithu D Sudarsan

-----Original Message-----
From: Matthew Hall [mailto:mh...@informatics.jax.org]
Sent: Friday, May 22, 2009 8:59 AM
To: java-user@lucene.apache.org
Subject: Re: Parsing large xml files

2G... should not be a maximum for any JVM that I know of.

Assuming you are running a 32-bit JVM, you are actually able to address a bit under 4G of memory; I've always used around 3.6G when trying to max out a 32-bit JVM. Technically speaking it should be able to address 4G under a 32-bit OS, however a certain percentage of the address space is set aside for overhead, so you can only really use a bit less than the max.

If you have a 64-bit OS/JVM (which you likely might), you can use the -d64 setting for your runtime environment to set your maximum memory much.. MUCH higher; for example we regularly use 6G of memory on our application servers here at the lab.
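One easy sanity check, whatever flags you end up passing (e.g. something like `java -d64 -Xmx6g ...`), is to ask the runtime what heap it actually got; a minimal sketch:

```java
public class MaxHeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the largest heap the VM will attempt to use,
        // i.e. roughly what -Xmx resolved to, minus some internal overhead.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```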

Hope this helps you a bit,

Matt

crack...@comcast.net wrote:
http://vtd-xml.sf.net
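(VTD-XML is one option; another approach that sidesteps the heap problem entirely is a streaming parser such as SAX, which hands you the document element by element instead of building a full in-memory tree. A minimal sketch; the <record> element name is made up for illustration:

```java
import java.io.StringReader;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class StreamingCount extends DefaultHandler {
    private int records = 0;

    @Override
    public void startElement(String uri, String localName, String qName, Attributes atts) {
        // Count <record> elements as they stream past; nothing else is kept in memory.
        if ("record".equals(qName)) {
            records++;
        }
    }

    public int getRecords() { return records; }

    public static void main(String[] args) throws Exception {
        String xml = "<docs><record/><record/><record/></docs>";
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        StreamingCount handler = new StreamingCount();
        // For a real 50MB file you would pass a FileInputStream here instead.
        parser.parse(new InputSource(new StringReader(xml)), handler);
        System.out.println("records = " + handler.getRecords());
    }
}
```

Memory use stays flat regardless of file size, because the handler only ever sees one event at a time.)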

----- Original Message -----
From: "Sithu D. Sudarsan" <sithu.sudar...@fda.hhs.gov>
To: java-user@lucene.apache.org
Sent: Thursday, May 21, 2009 7:42:59 AM GMT -08:00 US/Canada Pacific
Subject: Parsing large xml files

Hi,

While trying to parse XML documents of about 50MB size, we run into
OutOfMemoryError due to Java heap space. Increasing the JVM to use close
to 2GB (that is the max) does not help. Is there any API that could be
used to handle such large single XML files?

If Lucene is not the right place, please let me know alternate places to
look.

Thanks in advance,
Sithu D Sudarsan
sithu.sudar...@fda.hhs.gov
sdsudar...@ualr.edu






---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
For additional commands, e-mail: java-user-h...@lucene.apache.org





