Hello,

In my company we often want to monitor transfer speeds. To do that, we upload a 
1 MB to 10 GB file of zeros or random data to a web server and then set up some 
monitoring to time the download, or we download it by hand during 
troubleshooting sessions.

The downside of this is that we have to upload those files and keep them on 
disk, and disk space is sometimes a very limited resource.

That's why I am wondering if somebody could develop (and include in mainline) a 
new nginx module that, given a configuration similar to this:

location = /100mb.test {
        big_file 100M zero;
}

or:

location = /1m.random {
        big_file 1M random;
}

would serve such a file at the chosen location. Of course the quality of the 
random data does not need to be high - we only need something that compresses 
poorly - so any simple and fast userspace generator should be enough.
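
To illustrate the "simple and fast userspace generator" part, here is a minimal 
sketch of what the random data source could look like, assuming a xorshift64 
PRNG (the names and structure are just an illustration, not code from any 
existing module):

/* Hypothetical sketch of the random data source - not actual nginx
 * module code.  A xorshift64 PRNG is cheap and its output compresses
 * poorly, which is all that is needed here. */

#include <stdint.h>
#include <stddef.h>
#include <string.h>

static uint64_t rng_state = 0x9e3779b97f4a7c15ULL;  /* any non-zero seed */

static uint64_t xorshift64(void)
{
    uint64_t x = rng_state;
    x ^= x << 13;
    x ^= x >> 7;
    x ^= x << 17;
    rng_state = x;
    return x;
}

/* Fill one output buffer; a handler would call this repeatedly while
 * streaming the response body until the configured size is reached. */
static void fill_random(unsigned char *buf, size_t len)
{
    size_t i;

    for (i = 0; i + 8 <= len; i += 8) {
        uint64_t v = xorshift64();
        memcpy(buf + i, &v, 8);
    }
    for (; i < len; i++) {
        buf[i] = (unsigned char) xorshift64();
    }
}

The zero variant would be even simpler - one static zero-filled buffer sent 
repeatedly until the configured size is reached.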

Thank you in advance.

-- 
Grzegorz Kulewski

