2G should not be a maximum for any JVM that I know of.
Assuming you are running a 32-bit JVM, you can actually address a bit
under 4G of memory; I've typically used around 3.6G when trying to max
out a 32-bit JVM. Technically a 32-bit OS should be able to address the
full 4G, but a certain percentage of that memory is set aside for
overhead, so in practice you can only use a bit less than the maximum.
If you have a 64-bit OS/JVM (which you likely do), you can use the
-d64 flag for your runtime environment to set your maximum memory
much, MUCH higher; for example, we regularly use 6G of memory on our
application servers here at the lab.
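As a quick sanity check of what the flags actually bought you, you can ask the JVM for the heap ceiling it granted (a minimal sketch; the class name MaxHeap is just for illustration):

```java
// Print the heap ceiling the running JVM actually granted.
// Launch with e.g.:  java -Xmx3600m MaxHeap
// (add -d64 on JVMs that support selecting the 64-bit data model).
public class MaxHeap {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

If the printed value is far below what you passed to -Xmx, the OS or JVM silently capped the request.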
Hope this helps you a bit,
Matt
crack...@comcast.net wrote:
http://vtd-xml.sf.net
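VTD-XML is one option; more generally, any streaming parser avoids the OutOfMemoryError because it never builds a full in-memory tree. A minimal sketch using the JDK's built-in SAX API (not VTD-XML itself; the class name and sample XML are made up for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Streaming (SAX) parse: the parser pushes events to the handler as it
// reads, so memory stays flat no matter how large the file is -- no DOM
// tree is ever built.
public class StreamingCount {
    public static long countElements(InputStream in) throws Exception {
        final long[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(in, new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                count[0]++; // handle or index each element here
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        String xml = "<root><doc/><doc/></root>";
        long n = countElements(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        System.out.println(n); // prints 3 (root + two doc elements)
    }
}
```

The same handler pattern works for feeding individual records out of a 50MB file into Lucene documents one at a time, instead of loading the whole file first.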
----- Original Message -----
From: "Sithu D. Sudarsan" <sithu.sudar...@fda.hhs.gov>
To: java-user@lucene.apache.org
Sent: Thursday, May 21, 2009 7:42:59 AM GMT -08:00 US/Canada Pacific
Subject: Parsing large xml files
Hi,
While trying to parse XML documents of about 50MB in size, we run into
an OutOfMemoryError due to java heap space. Increasing the JVM heap to
close to 2GB (the maximum in our setup) does not help. Is there an API
that could be used to handle such large single XML files?
If Lucene is not the right place for this, please point me to
alternatives.
Thanks in advance,
Sithu D Sudarsan
sithu.sudar...@fda.hhs.gov
sdsudar...@ualr.edu
---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
For additional commands, e-mail: java-user-h...@lucene.apache.org