Hi Andre,

> I suppose we should read 1.2 GB here ?

Yes, 1.2 GB.

> Anyway, why do you say "which is enough" ? How do you know ?

From the past test results that we have gathered on each application (see
the short sketch below for how the heap is set and verified).

> And do not top-post. How do we know what you are responding to ? By
> scrolling up and down ?
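For completeness, here is roughly how a 1.2 GB heap is configured for Tomcat
and then double-checked at runtime. This is only a minimal sketch: the
setenv.sh path, the exact -Xms/-Xmx values and <pid> are illustrative
placeholders, not our actual settings.

    # CATALINA_BASE/bin/setenv.sh (sourced by catalina.sh at startup)
    CATALINA_OPTS="$CATALINA_OPTS -Xms1200m -Xmx1200m"

    # confirm what the running JVM actually got (<pid> = Tomcat process id)
    jcmd <pid> VM.flags | tr ' ' '\n' | grep -E 'InitialHeapSize|MaxHeapSize'

The jcmd call has to be run as the same user that owns the Tomcat process.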

On Thu, Mar 30, 2017 at 10:43 AM, André Warnier (tomcat) <a...@ice-sa.com>
wrote:

> On 30.03.2017 19:36, Utkarsh Dave wrote:
>
>> Thanks Olaf and Suvendu for the response.
>> We are using 1.2 MB of heap size which is enough and haven't created an
>> issue so far.
>>
>
> I suppose we should read 1.2 GB here ?
> Anyway, why do you say "which is enough" ? How do you know ?
> And do not top-post. How do we know what you are responding to ? By
> scrolling up and down ?
>
>
>
>
>> On Thu, Mar 30, 2017 at 9:51 AM, Suvendu Sekhar Mondal <suv3...@gmail.com>
>> wrote:
>>
>>>> Memory heap dump generated is of
>>>> Size: 787.3 MB Classes: 139k Objects: 19.3m Class Loader: 1.6k
>>>>
>>>> Overview shows 580.9 MB occupied by remainder.
>>>>
>>>> Problem suspect:
>>>> 465 MB occupied by remainder
>>>>
>>> The Remainder section has retained a good chunk of memory. That indicates
>>> lots of small objects are being created by different apps. Your "Live
>>> Set" is not very big. What is the heap size? You also mentioned that the
>>> Tomcat process was consuming high CPU. If you have a small heap and all
>>> of it is filled up by live objects, then the JVM will run frequent GCs to
>>> free up some space. In that case CPU usage will be high for the Tomcat
>>> process.
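If it helps, frequent collections are easy to confirm with GC logging. A
minimal sketch for a Java 7/8-era Tomcat, assuming a bin/setenv.sh is in use;
the log path is a placeholder:

    # CATALINA_BASE/bin/setenv.sh -- append GC logging to the existing options
    CATALINA_OPTS="$CATALINA_OPTS -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps"
    CATALINA_OPTS="$CATALINA_OPTS -Xloggc:/var/log/tomcat/gc.log"

A restart is needed for the flags to take effect; the resulting log shows how
often GC runs and how long each pause lasts.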
>>>
>>> As Olaf indicated, you can try to increase heap size and see if the
>>> problem goes away. But before that, I am curious to see what heap and
>>> GC settings you are using. Please post that info.
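For reference, the effective heap and GC settings of the running JVM can be
read without a restart. A sketch assuming the JDK tools are on the PATH and
<pid> is the Tomcat process id:

    # non-default VM flags of the running process
    jcmd <pid> VM.flags
    jinfo -flags <pid>

    # the startup command line, including any -Xms/-Xmx/GC options
    ps -o args= -p <pid>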
>>>
>>> Thanks!
>>> Suvendu
>>>
>>> On Thu, Mar 30, 2017 at 2:01 PM, Olaf Kock <tom...@olafkock.de> wrote:
>>>
>>>> On 30.03.2017 at 01:33, Utkarsh Dave wrote:
>>>>
>>>>> Hello all,
>>>>>
>>>>> My Tomcat (7.0.72) hosts several web applications on the server (based
>>>>> on Linux 6.8).
>>>>>
>>>> [...]
>>>>
>>>>> Memory heap dump generated is of
>>>>> Size: 787.3 MB Classes: 139k Objects: 19.3m Class Loader: 1.6k
>>>>>
>>>> The combination of "hosts several web applications" and a heap space of
>>>> this size does not convince me of a leak - it might be the memory
>>>> requirement of one of the webapps. A leak is typically something that
>>>> grows uncontrolled until you run out of heap space, no matter how much
>>>> you grow the available space.
>>>>
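One way to tell a leak from a legitimately large webapp, using only standard
JDK tooling (<pid> and the interval are placeholders): watch the old
generation across full GCs. A leak keeps climbing even right after full
collections; a big but healthy app plateaus.

    # print GC utilisation every 10 seconds; compare the O (old gen %) column
    # each time FGC (full GC count) increments
    jstat -gcutil <pid> 10000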
>>>>> In the thread dumps I see these threads repeatedly. I wonder if these
>>>>> are pointing to com.rsa.sslj.x.
>>>>>
>>>> You seem to be handling https requests from Tomcat. If you're not happy
>>>> with the implementation of this endpoint/protocol you should move this
>>>> to an Apache httpd or similar and just forward to tomcat, so that tomcat
>>>> does not deal with encryption.
>>>>
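For what it is worth, the httpd-in-front setup is only a few lines of
configuration. A hypothetical fragment, assuming mod_ssl and mod_proxy_http
are loaded; hostnames, ports and certificate paths are placeholders:

    <VirtualHost *:443>
        SSLEngine on
        SSLCertificateFile    /etc/pki/tls/certs/example.crt
        SSLCertificateKeyFile /etc/pki/tls/private/example.key
        # TLS ends here; plain HTTP is forwarded to Tomcat on port 8080
        ProxyPass        / http://localhost:8080/
        ProxyPassReverse / http://localhost:8080/
    </VirtualHost>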
>>>> As a conclusion: Your problem might not be poorly designed clients, it
>>>> might be poorly equipped servers - I'd try to double the memory
>>>> allocated to Tomcat's heap and potentially tune the garbage collector.
>>>> If you run into problems, you might also identify one of the
>>>> web applications that eats up your resources (no matter what the clients
>>>> do).
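If it comes to that, a quick way to see which application is holding the
memory is a class histogram, or a fresh heap dump opened in Eclipse MAT and
grouped by class loader. A sketch with standard JDK tools; <pid> and the
output path are placeholders:

    # top classes by live instance count/size (note: forces a full GC)
    jmap -histo:live <pid> | head -n 40

    # full dump for Eclipse MAT ("Group by class loader" view)
    jmap -dump:live,format=b,file=/tmp/tomcat-heap.hprof <pid>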
>>>>
>>>> Olaf
>>>>
>>>
>>>
>>>
>>
>
>
>
