Re: Suggestion on perl handler to reject request with large content length

2010-11-18 Thread Mark J. Reed
Accidental tab strikes again. Anyway, the idea is to set up the handler to run before LimitRequestBody, if possible. I'm not sure what hook that is. If you can do that, then something like this would work:

    sub handler {
        my $r = shift;
        if ($r->uri =~ /abc/) {
            $r->content_type('…
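One way the truncated handler above might continue, sketched under a few assumptions: mod_perl 2, the header-parser phase (PerlHeaderParserHandler, which runs before the request body is read and therefore before LimitRequestBody would be enforced), a hypothetical module name, and placeholder error text since the original message is cut off:

    package My::RejectLarge;             # hypothetical name, not Mark's code

    use strict;
    use warnings;

    use Apache2::RequestRec ();
    use Apache2::RequestIO ();           # for $r->print
    use Apache2::Const -compile =>
        qw(DECLINED DONE HTTP_REQUEST_ENTITY_TOO_LARGE);

    use constant MAX_BYTES => 32 * 1024**3;   # the 32G threshold from the question

    sub handler {
        my $r = shift;

        # Content-Length is known as soon as the headers are in, so the
        # request can be refused without reading any of the body.
        my $len = $r->headers_in->{'Content-Length'} || 0;
        return Apache2::Const::DECLINED if $len <= MAX_BYTES;

        if ($r->uri =~ /abc/) {
            # Generate the application-specific error body ourselves.
            $r->status(Apache2::Const::HTTP_REQUEST_ENTITY_TOO_LARGE);
            $r->content_type('text/plain');
            $r->print("0|...error text from the original post...\n");  # placeholder
            return Apache2::Const::DONE;     # stop the request cycle here
        }

        # Outside /abc/, fall back to Apache's stock 413 response.
        return Apache2::Const::HTTP_REQUEST_ENTITY_TOO_LARGE;
    }

    1;

Wired up with something like "PerlHeaderParserHandler My::RejectLarge" in httpd.conf; PerlAccessHandler would do equally well, since it also runs before the body is read.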

Re: Suggestion on perl handler to reject request with large content length

2010-11-18 Thread Mark J. Reed
On Thu, Nov 18, 2010 at 8:47 PM, Mohit Anchlia wrote:
> I have a requirement to look at content length and if it is greater
> than the desired size then return an error message. So pseudo code is like:

Easier solution: use LimitRequestBody, but have a Perl handler that sets a custom response text for 4…
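A sketch of that easier route, assuming the status being customized is 413 (what Apache returns when LimitRequestBody is exceeded); the location, module name, and message text are placeholders drawn from the original question, not from Mark's message. One caveat: in the Apache releases current at the time of this thread, LimitRequestBody only accepts values up to 2147483647 (2 GB), so the 32 GB threshold from the question cannot be expressed with the directive alone.

    # httpd.conf -- let core Apache enforce the limit, customize the 413 text
    <Location /abc/>
        LimitRequestBody 2147483647          # directive maximum (2 GB)
        PerlHeaderParserHandler My::Custom413
    </Location>

    # My/Custom413.pm
    package My::Custom413;

    use strict;
    use warnings;

    use Apache2::RequestRec ();
    use Apache2::Response ();    # for custom_response()
    use Apache2::Const -compile => qw(DECLINED HTTP_REQUEST_ENTITY_TOO_LARGE);

    sub handler {
        my $r = shift;
        # Register the custom error text; Apache uses it if LimitRequestBody
        # later rejects the body with a 413.
        $r->custom_response(Apache2::Const::HTTP_REQUEST_ENTITY_TOO_LARGE,
                            "0|...error text from the original post...");  # placeholder
        return Apache2::Const::DECLINED;
    }

    1;

If the text is static, ErrorDocument 413 "message" achieves much the same thing without any Perl.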

Suggestion on perl handler to reject request with large content length

2010-11-18 Thread Mohit Anchlia
I have been searching for an answer in the httpd forum, but I think I need to ask here: I have a requirement to look at content length and if it is greater than the desired size then return an error message. So pseudo code is like:

    if content_length > 32G then
        if url contains /abc/ then
            echo "0|ab…
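Concretely, and leaving aside where it gets hooked in (the replies earlier in this digest cover that), the test in the pseudocode reads roughly like this inside a mod_perl handler; the error text is truncated above, so a placeholder stands in for it:

    # $r is the Apache2::RequestRec object passed to the handler
    my $len = $r->headers_in->{'Content-Length'} || 0;

    if ($len > 32 * 1024**3) {          # "content_length > 32G"
        if ($r->uri =~ m{/abc/}) {      # "url contains /abc/"
            # send the "0|..." error body here (text cut off in the archive)
        }
    }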

Re: combining multiple filtered files into a single response

2010-11-18 Thread Brian
On 11/18/10 6:53 PM, André Warnier wrote:
> I'd also like to avoid the last resort which would be to run a long process to process each file, save them to a temporary directory, and then re-read them
Why is that "the last resort"? It seems to me to be the logical way of achieving what you want…

Re: combining multiple filtered files into a single response

2010-11-18 Thread André Warnier
Brian wrote:
> ... I'd also like to avoid the last resort which would be to run a long process to process each file, save them to a temporary directory, and then re-read them (one after the other) at the end (and send them out) in a single output stream.
This defeats the purpose because I'd lik…
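For illustration only, here is roughly what the single-stream alternative looks like when the per-file processing can be done inside a mod_perl response handler, rather than through the Apache filter chain the thread is actually discussing; the file list and the transform are placeholders. Each file is processed and written to the client in turn, so nothing is staged in a temporary directory:

    package My::Concat;                  # hypothetical module name

    use strict;
    use warnings;

    use Apache2::RequestRec ();
    use Apache2::RequestIO ();           # for $r->print
    use Apache2::Const -compile => qw(OK SERVER_ERROR);

    # Placeholder inputs; stand-ins for whatever files and filtering
    # the original poster has in mind.
    my @files = ('/data/part1.txt', '/data/part2.txt');

    sub handler {
        my $r = shift;
        $r->content_type('text/plain');

        for my $path (@files) {
            open my $fh, '<', $path
                or return Apache2::Const::SERVER_ERROR;
            while (my $line = <$fh>) {
                # Transform and stream each chunk as it is produced.
                $r->print(transform($line));
            }
            close $fh;
        }
        return Apache2::Const::OK;
    }

    sub transform {
        my $line = shift;
        return uc $line;                 # placeholder transformation
    }

    1;

If the filtering really has to stay in the Apache filter chain, another avenue sometimes suggested is running a sub-request per file ($r->lookup_uri(...)->run), though whether the per-file filters apply there depends on how they are registered.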