Hi.
K J.Sreekumar wrote:
Hello Andre
TCP 0.0.0.0:8080 0.0.0.0:0 LISTENING 5356
[tomcat6.exe]
Apart from the above and the other ports in LISTEN state, when "tomcat
freezes", do you have any other ports in the netstat listing, shown as
"CLOSE_WAIT" for example ?
If yes, how many ?
These are the ports in the close wait state -
Ok, there do not seem to be a whole bunch of them, which is good.
On some systems, with badly-behaved applications, I have seen hundreds of those, to the
point of rendering the system totally incapable of accepting new TCP connections of any
kind. But that does not appear to be the case here.
TCP 127.0.0.1:3450 127.0.0.1:8080 CLOSE_WAIT 1836
[httpd.exe]
This is one of the connections between the Apache front-end proxy_http module, and the
back-end tomcat port 8080 HTTP Connector.
The fact that it is in the "CLOSE_WAIT" state indicates that one side has closed its end
of the connection, but the other has not yet. It is a normal TCP state, as long as the
number of such connections remains moderate and they do not last too long.
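If you want to check quickly how many of those you have at any given moment, something
like this on the Windows command line would do ("find /c" just counts the matching
lines) :

  netstat -ano | find /c "CLOSE_WAIT"

and without the /c, it lists them, so you can see which PIDs are involved.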
One source of problems - as I believe Mark pointed out recently, maybe in another thread -
is when there is a mismatch between the number of connections which the front-end is
trying to make to Tomcat, and the number of threads available in Tomcat to handle them.
If there are many more client requests than Tomcat threads available to serve them, then a
lot of them will end up in the accept queue of the back-end Tomcat, waiting for a Tomcat
thread to become available. At some point, this queue reaches its maximum size, and
further requests are rejected.
There are a whole bunch of parameters allowing you to control this at the httpd level, see :
http://httpd.apache.org/docs/2.2/mod/mod_proxy.html#proxypass
and at the Tomcat level, the Connector attribute "maxThreads".
The defaults are usually fine however, so I would not start experimenting with them until
you know what the problem really is.
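Just as an illustration of the kind of knobs I mean - the paths and values below are
made-up examples, not recommendations - on the httpd side you can limit the connections
made to the back-end, per ProxyPass :

  ProxyPass /myapp http://localhost:8080/myapp max=150 ttl=60

and on the Tomcat side, in conf/server.xml, the Connector has "maxThreads" (threads
serving requests) and "acceptCount" (the queue of connections waiting for a free
thread) :

  <Connector port="8080" protocol="HTTP/1.1"
             connectionTimeout="20000"
             maxThreads="150" acceptCount="100"
             redirectPort="8443" />

But again, leave the defaults alone until the problem is understood.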
...
TCP 192.168.103.117:1790 184.84.255.35:80 CLOSE_WAIT 6008
[jucheck.exe]
As far as I know, the above is the "Java update scheduler" service.
Nothing to do with the current problem, but I generally dislike these kinds of things on a
server, and turn them off. They use up resources, and ports which you later always wonder
about. Plus, I don't want any server of mine to decide to update itself, or even pop up
annoying dialogs all the time.
A matter of preference.
Then, the next time Tomcat appears to freeze, try with a browser to access
the "freeze" page, as :
http://(hostname)/freeze/freeze.html
and as
http://(hostname):8080/freeze/freeze.html
and let's see what happens.
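(In case that part got lost earlier in the thread : the "freeze" page is nothing more than
a static HTML file in its own webapp directory, something like

  (tomcat)/webapps/freeze/freeze.html

containing for instance

  <html><body>Tomcat is alive</body></html>

Any name would do; "freeze" is just what we picked.)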
We have another test application running on tomcat, which also fails to
respond once tomcat starts ignoring requests; so is the case with tomcat
manager too.
Right. What I was trying to do is to have an application as simple as possible, and
totally independent of your own webapps. A simple HTML page will be served by the Tomcat
embedded "default servlet", using only Tomcat code.
If that one blocks too, then you would know that it has nothing to do with application
code, extra libraries, etc.
Well, not quite, as Tomcat could still be blocked by your own apps.
But if the rest blocks, and this does not, then it would be a clear sign that the block is
in your applications.
It is equivalent to the telnet test done before, just a bit easier to use.
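For reference, that telnet test was roughly : connect with

  telnet (hostname) 8080

then type

  GET /freeze/freeze.html HTTP/1.0

followed by an empty line. If the Tomcat Connector is still accepting and serving
requests, the page comes right back.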
We are trying to replicate this behavior again by directly running tomcat on
80; will be posting the observations here.
That's a good idea, to eliminate the front-end httpd and the proxy connector's
impact.
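(Concretely, that means stopping the front-end httpd, and changing the HTTP Connector port
in (tomcat)/conf/server.xml from 8080 to 80, roughly :

  <Connector port="80" protocol="HTTP/1.1"
             connectionTimeout="20000"
             redirectPort="8443" />

then restarting Tomcat and accessing it directly as http://(hostname)/... )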