Hi,
I am having a strange issue with a mod_perl handler that I wrote recently. A little background: we are using a mod_perl script for our self-developed MS .NET application. The application connects to the frontend server, where the mod_perl "proxy" is running. The script does some kind of load balancing, proxies the request to the backend, and passes the backend's answer back to the application.

This concept works fine, but the initial version of the proxy script is poorly written. It first collects all the data from the client in memory (which can be a fairly large amount, up to 100 MB), then sends it to the backend, and then delivers the answer. This causes high memory consumption, especially when there are more than 100 concurrent users. Also, the script is not a real mod_perl handler, but runs under ModPerl::Registry instead.

So I've rewritten the whole script from scratch as a "real" mod_perl handler module. It now works with chunks of data to be sent and received. Everything seems to be working fine, except for one thing: the .NET app has the ability to upload a file to the backend. With the original script the upload usually gets at least about 1-2 MB/sec of bandwidth, but with the new script I've written it only reaches a throughput of about 300 kb/sec. I've already done a couple of profiling runs, but wasn't able to find the cause.

So I am wondering if someone in the mod_perl community has advice or hints on how to resolve this issue. Any help or thought-provoking impulse is highly appreciated.

Thanks,
Winni
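
P.S. To illustrate what I mean by working with chunks, here is a stripped-down sketch of the kind of handler loop I'm describing. It is not my actual code; the package name, backend address, and buffer size are just placeholders, and the load balancing and response handling are left out.

package My::ProxyHandler;          # placeholder name

use strict;
use warnings;

use Apache2::RequestRec ();
use Apache2::RequestIO  ();
use Apache2::Const -compile => qw(OK);
use IO::Socket::INET ();

use constant BUFSIZE => 64 * 1024; # read/write chunk size (placeholder value)

sub handler {
    my $r = shift;

    # connect to the chosen backend (the load-balancing part is left out here)
    my $backend = IO::Socket::INET->new(
        PeerAddr => 'backend.example.com',  # placeholder
        PeerPort => 8080,                   # placeholder
        Proto    => 'tcp',
    ) or die "cannot connect to backend: $!";

    # stream the client's request body to the backend chunk by chunk
    my $buf;
    while (my $len = $r->read($buf, BUFSIZE)) {
        my $off = 0;
        while ($off < $len) {
            defined(my $wrote = syswrite($backend, $buf, $len - $off, $off))
                or die "write to backend failed: $!";
            $off += $wrote;
        }
    }

    # ... read the backend's response in chunks and return it with $r->print() ...

    return Apache2::Const::OK;
}

1;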