On 3/13/07, Christopher Schultz <[EMAIL PROTECTED]> wrote:
> Pierre,
>
> > I am trying to implement a flood control mechanism to prevent robots
> > requesting page after page at an "inhuman" rate.
>
> I know you've gotten lots of feedback already, but there's a
> super-simple way to do this: put a marker in the request attributes the
> first time your filter "sees" the request, and check for it on each pass
> through the filter. When you place the marker, perform all your magic:
> check the queue, add the current request + timestamp, etc. If the marker
> is already there, skip everything.
>
> For redirects, the request should be re-used, so the marker should
> remain until your final response.
You are confusing redirection with forwarding.
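For what it's worth, here is a minimal sketch of the marker approach quoted above, written against the standard javax.servlet Filter API. The attribute name and the commented-out checkAndRecord() helper are placeholders for illustration, not anything from this thread:

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;

    public class FloodControlFilter implements Filter {

        // Placeholder attribute key; any sufficiently unique name will do.
        private static final String MARKER = "com.example.floodcontrol.SEEN";

        public void init(FilterConfig config) throws ServletException {
            // no-op for this sketch
        }

        public void doFilter(ServletRequest request, ServletResponse response,
                             FilterChain chain)
                throws IOException, ServletException {

            // If the marker is already present, this request has been counted
            // already (e.g. we are inside a forward/include), so skip the work.
            if (request.getAttribute(MARKER) == null) {
                request.setAttribute(MARKER, Boolean.TRUE);

                // Flood-control "magic" goes here: identify the client, record
                // the current timestamp, reject if the request rate is too high.
                // checkAndRecord(request, response);  // hypothetical helper
            }

            chain.doFilter(request, response);
        }

        public void destroy() {
            // no-op for this sketch
        }
    }

Note that the marker only survives a RequestDispatcher forward() or include(), where the container passes the same request object back through the filter (assuming the filter-mapping declares those dispatcher types in web.xml, Servlet 2.4+). A response.sendRedirect() tells the client to issue a brand-new request, so all attributes, marker included, are gone by the time it arrives.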