On 25/09/2012 20:09, Uwe Schindler wrote:
Hi,
Without the full output of "free -h" we cannot say anything. But on a good
server, total Linux memory use should always be close to 100%; otherwise the
memory is simply wasted (the "used" figure includes the filesystem cache,
too). I think -Xmx may be too small for your Java deployment? We have no
information about
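For illustration only, on a modern procps "free -h" a healthy box might look
roughly like this (numbers invented; the point is that "available" stays high
even when "free" is tiny, because buff/cache is reclaimable):

              total        used        free      shared  buff/cache   available
Mem:           31Gi        12Gi       1.0Gi       256Mi        18Gi        18Gi
Swap:         2.0Gi          0B       2.0Gi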
On Sat, 2011-09-03 at 20:09 +0200, Michael Bell wrote:
> To be exact, there are about 300 million documents. This is running on a 64
> bit JVM/64 bit OS with 24 GB(!) RAM allocated.
How much memory is allocated to the JVM?
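If that is not known, the running JVM can be asked directly (jinfo ships with
the JDK, jcmd with JDK 7 and later; <pid> is a placeholder for the process id):

  jinfo -flag MaxHeapSize <pid>
  jcmd <pid> VM.flags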
> Now, their searches are working fine IF you do not SORT the results. If
Michael Bell wrote:
> How best to diagnose?
>
>> Call your java process this way
>> java -XX:HeapDumpPath=. -XX:+HeapDumpOnOutOfMemoryError
>> and drag'n'drop the resulting java_pid*.hprof into Eclipse.
>> You will get an outline by class for the number and size of allocated
>> objects.
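For example, a complete (hypothetical) invocation combining the dump flags
with an explicit heap size; -Xmx8g and yourapp.jar are placeholders:

  java -Xmx8g -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=. -jar yourapp.jar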
On Saturday 03 September 2011 20:09:54 Michael Bell wrote:
> 2011-08-30 13:01:31,489 [TP-Processor8] ERROR
> com.gwava.utils.ServerErrorHandlerStrategy - reportError:
> nastybadthing ::
> com.gwava.indexing.lucene.internal.LuceneSearchController.performSearchOperation:229 :: EXCEPTION : java.lang
There is no difference between 2.9 and 3.0; it is exactly the same code with
only Java 5-specific API modifications and removal of deprecated methods.
The issue you have seems to be that your index has grown beyond some limits
of your JVM.
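A rough back-of-the-envelope illustrates the kind of limit involved (assuming
the sort is loading Lucene's FieldCache, which the posted fragment alone does
not prove): sorting keeps one value per document per sort field in memory, so
~300 million documents times 8 bytes for a single numeric field is already
about 2.4 GB of heap, and sorting on a string field costs considerably more.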
Uwe
-
Uwe Schindler
H.-H.-Meier-Allee 63, D-282