Thanks for the input!

It is primarily intended to get rid of Apache modules that did this before, and 
to be more flexible in using that information.

Search engine crawlers can slow down a website a lot - as can experimental 
crawlers by university students, malicious web crawlers, or people who 
download your website for offline viewing. 
I think restricting it to 1 page view per second would be enough - certain IP 
addresses could get a higher "bandwidth".

Instead of sending them to an error page, however, I would just let the request 
time out.....
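
Just to make it concrete, here is a rough sketch of such a limiter as a plain 
servlet filter sitting in front of Tapestry - the one-request-per-second figure, 
the trusted-address whitelist and the 503 response are my own assumptions, not 
anything Tapestry ships with:

import java.io.IOException;
import java.util.Collections;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

/**
 * Hypothetical sketch: at most one page view per second per client IP,
 * with a shorter interval ("higher bandwidth") for trusted addresses.
 */
public class ThrottleFilter implements Filter
{
    private static final long DEFAULT_INTERVAL_MS = 1000;  // 1 request per second
    private static final long TRUSTED_INTERVAL_MS = 100;   // privileged IPs

    // assumed whitelist; in practice this would come from configuration
    private final Set<String> trusted = Collections.singleton("127.0.0.1");

    // time of the last accepted request, per client IP
    private final ConcurrentMap<String, Long> lastRequest =
        new ConcurrentHashMap<String, Long>();

    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
        throws IOException, ServletException
    {
        String ip = request.getRemoteAddr();
        long now = System.currentTimeMillis();
        long interval = trusted.contains(ip) ? TRUSTED_INTERVAL_MS : DEFAULT_INTERVAL_MS;

        Long last = lastRequest.get(ip);
        if (last != null && now - last.longValue() < interval)
        {
            // Over the limit. Sending an error is the simplest reaction;
            // to "just let the request time out" instead, one could park the
            // thread here (at the cost of tying it up) rather than answering.
            ((HttpServletResponse) response).sendError(503, "Too many requests");
            return;
        }

        lastRequest.put(ip, now);
        chain.doFilter(request, response);
    }

    public void init(FilterConfig config) throws ServletException { }

    public void destroy() { }
}

Registered in web.xml ahead of the application servlet, something like this would 
throttle every request without Tapestry itself having to know about it; parking 
the thread instead of calling sendError() would give the "let it time out" 
behaviour, at the price of tying up a worker thread per blocked client.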

The good thing would be that you could write your own profiling application and 
log those IP addresses and clients that are bandwidth-hungry....

The only question is whether such a Tapestry implementation would be efficient 
or not....maybe this would consume too much memory itself if there are many 
visitors on a website?
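
Regarding memory: one way to keep the per-IP table from growing without bound 
would be to cap its size and evict the least-recently-seen address - again only 
a sketch, with the cap of 10,000 entries picked arbitrarily:

import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Sketch of a size-bounded per-IP timestamp table: once MAX_ENTRIES addresses
 * are tracked, the least-recently-seen one is evicted, so memory use stays
 * roughly constant no matter how many visitors the site gets.
 */
public class BoundedIpTable
{
    private static final int MAX_ENTRIES = 10000;  // assumed cap

    private final Map<String, Long> table = Collections.synchronizedMap(
        new LinkedHashMap<String, Long>(MAX_ENTRIES, 0.75f, true)
        {
            protected boolean removeEldestEntry(Map.Entry<String, Long> eldest)
            {
                return size() > MAX_ENTRIES;
            }
        });

    public Long lastRequest(String ip)
    {
        return table.get(ip);
    }

    public void recordRequest(String ip, long timestamp)
    {
        table.put(ip, timestamp);
    }
}

At 10,000 entries the table should stay on the order of a megabyte, so memory 
itself probably isn't the problem; whether the synchronized map becomes a 
bottleneck under heavy load is harder to guess.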


-------- Original Message --------
Date: Sat, 27 Jan 2007 20:38:30 -0800
From: "Patrick Moore" <[EMAIL PROTECTED]>
To: "Tapestry users" <users@tapestry.apache.org>
Subject: Re: Limit the number of clicks per second inside Tapestry

> I would hazard a guess that it would be useful to stop automated spam
> entries in blogs and automated sign-up by robots.....
> 
> anything that goes faster than a human could go could be throttled....
> 
> 
> On 1/27/07, Howard Lewis Ship <[EMAIL PROTECTED]> wrote:
> >
> > I'd love to know what your underlying use-case is.
> >
