On 2011-07-06 17:47, Guille wrote:
> I'm using:
> - Debian Squeeze
> - Server version: Apache/2.2.16 (Debian)
> - ProFTPD Version 1.3.3c
> - Net2FTP v 0.98
>
> Hi guys, I'm currently using a PHP script (Net2FTP) as a web FTP client, but I'm experiencing some issues when I try to access big files. I know this is an Apache list, but I've already asked on the ProFTPD forums and the net2ftp forums and nobody could help me.

So let me get this straight - you're using Apache to serve a PHP page that connects to an FTP server ON THE SAME MACHINE to transfer files over the network?

Wow.

Really - wow.

> When I try to access a big file (1 GB or more) from my web server directly through Apache, it works well: the download starts immediately. When I access a big file by connecting to my ProFTPD server through a normal client like FileZilla, I have no problem either; it works as intended.
>
> The problem occurs when I connect through the net2ftp script.

Contact its author and ask him to fix it.

> Connection and file listing are OK. But when I try to get a big file is when I get the issue. It takes e.g. 15 minutes to serve a 500 MB file to the net2ftp client.

Obviously the bad script tries to retrieve the FTP file completely before serving it via Apache.
Try it with a file larger than your available virtual memory.
The process should die.
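Net2FTP's actual code isn't shown in this thread, so as a hypothetical illustration only: the difference between buffering a whole remote file before serving it and relaying it in fixed-size chunks can be sketched like this (in-memory buffers stand in for the FTP connection; the function names are made up for the example):

```python
import io

CHUNK_SIZE = 8192  # relay in 8 KiB chunks

def fetch_buffered(source):
    """Anti-pattern: slurp the entire remote file into memory before
    sending anything. RAM use grows with file size, and the client
    sees no data until the whole transfer from FTP has finished."""
    return source.read()  # whole file held in memory at once

def fetch_streaming(source, chunk_size=CHUNK_SIZE):
    """Streaming pattern: yield fixed-size chunks as they arrive,
    so memory stays bounded and the download starts immediately."""
    while True:
        chunk = source.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Simulate a 1 MB "remote" file with an in-memory buffer.
data = b"x" * (1024 * 1024)

whole = fetch_buffered(io.BytesIO(data))
print(len(whole))  # the buffered version holds all 1048576 bytes at once

largest = 0
total = 0
for chunk in fetch_streaming(io.BytesIO(data)):
    largest = max(largest, len(chunk))
    total += len(chunk)
print(largest, total)  # never more than 8192 bytes in flight, same total
```

With the buffered approach, a 500 MB file must be pulled from the FTP server in full before the first byte reaches the browser, which matches the long delay described above; a file larger than available memory would make the process fail outright.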

> I think the problem is the connection between Apache and ProFTPD.

No, it's not. It's the bad script.

--
J.


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscr...@httpd.apache.org
  "   from the digest: users-digest-unsubscr...@httpd.apache.org
For additional commands, e-mail: users-h...@httpd.apache.org