Re: [PHP] handling large files w/readfile

2005-02-23 Thread Marek Kilimajer
A "little" late but: Robin Getz wrote: If this runs for awhile things go very bad. This seems to be related to a specific download manager called NetAnts that seems to be popular in China. http://www.netants.com/ Which attempts to open the same url for downloading 10-15 times at the same instan

Re: [PHP] handling large files w/readfile

2005-01-04 Thread Jason Wong
On Tuesday 04 January 2005 22:04, Robin Getz wrote: > Jason Wong wrote: > > Are you using the above code on its own (i.e. not within some other code that may affect the memory usage)? > Well, here's the entire file (it is pretty short - only 2 pages, but sorry in advance if anyone considers this bad form)…

RE: [PHP] handling large files w/readfile

2005-01-04 Thread Robin Getz
Jason Wong wrote: Are you using the above code on its own (i.e. not within some other code that may affect the memory usage)? Well, here's the entire file (it is pretty short - only 2 pages, but sorry in advance if anyone considers this bad form). The site is called with something like http://blackfin. …

Re: [PHP] handling large files w/readfile

2005-01-04 Thread Jason Wong
On Sunday 02 January 2005 16:43, Robin Getz wrote:
> Rasmus Lerdorf wrote:
> >> $buff = "0";
> >> while (!feof($fp)) {
> >>     $buff = fread($fp, 4096);
> >>     print $buff;
> >> }
> >> unset($buff);
> >> fclose($fp);
> >> ===…
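
For reference, a minimal self-contained version of the loop being quoted might look like the sketch below; $path is a placeholder, not a filename from the thread:

<?php
// Stream a file to the client in 4 KB chunks instead of loading it all at once.
$path = '/path/to/large/file.bin';   // hypothetical path
$fp = fopen($path, 'rb');
if ($fp === false) {
    die('cannot open file');
}
while (!feof($fp)) {
    $buff = fread($fp, 4096);   // at most 4 KB of file data held at a time
    print $buff;                // send the chunk straight to the client
}
fclose($fp);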

Re: [PHP] handling large files w/readfile

2005-01-03 Thread Richard Lynch
Sebastian wrote: > yea, all the files aren't 100MB though.. some are 2MB (even less) while some files are over 300MB as well. > so, does this need to be adjusted depending on the filesize? I believe that at a certain point, your setting there will be worse for the system if you make it too big…
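
One way to see where that point is on a given machine is a quick timing loop such as the sketch below. It is illustrative only: the test file path is a placeholder, and microtime(true) and memory_get_peak_usage() assume a newer PHP build than the one discussed in this thread:

<?php
// Illustrative micro-benchmark: read the same file with several chunk sizes.
$path = '/path/to/large/file.bin';   // hypothetical test file
foreach (array(4096, 65536, 524288, 2097152) as $chunk) {
    $fp = fopen($path, 'rb');
    $start = microtime(true);
    while (!feof($fp)) {
        fread($fp, $chunk);          // discard the data; only the timing matters here
    }
    fclose($fp);
    printf("chunk %7d bytes: %.3f s, peak mem %d bytes\n",
           $chunk, microtime(true) - $start, memory_get_peak_usage());
}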

RE: [PHP] handling large files w/readfile

2005-01-02 Thread Robin Getz
Rasmus Lerdorf wrote:
>> $buff = "0";
>> while (!feof($fp)) {
>>     $buff = fread($fp, 4096);
>>     print $buff;
>> }
>> unset($buff);
>> fclose($fp);
> Well, the above code does not use more than 4K of RAM plus a bit of overhead. So…

Re: [PHP] handling large files w/readfile

2005-01-01 Thread Sebastian
…attachment; filename="' . $file['type'] . '"');
header('Content-Length: ' . filesize($file['path'] . $file['type']));
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
- Original Message - From: …
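
The header block quoted above is cut off by the preview; a download script along those lines might look roughly like the sketch below. The $file array layout is only inferred from the fragment, and the Content-Type and chunk size are guesses:

<?php
// Rough sketch of the kind of download headers being discussed.
$file = array('path' => '/files/', 'type' => 'archive.zip');   // hypothetical values
$full = $file['path'] . $file['type'];

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file['type'] . '"');
header('Content-Length: ' . filesize($full));
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');

// Then stream the file in chunks rather than reading it all into memory.
$fp = fopen($full, 'rb');
while (!feof($fp)) {
    print fread($fp, 8192);
}
fclose($fp);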

Re: [PHP] handling large files w/readfile

2005-01-01 Thread Rasmus Lerdorf
Robin Getz wrote:
> Robin Getz wrote:
> > My next experiment is:
> > $buff = "0";
> > while (!feof($fp)) {
> >     $buff = fread($fp, 4096);
> >     print $buff;
> > }
> > unset($buff);
> > fclose($fp);
> Nope, that doesn't work either - came back and saw apache processes that…

RE: [PHP] handling large files w/readfile

2005-01-01 Thread Robin Getz
Robin Getz wrote:
> My next experiment is:
> $buff = "0";
> while (!feof($fp)) {
>     $buff = fread($fp, 4096);
>     print $buff;
> }
> unset($buff);
> fclose($fp);
Nope, that doesn't work either - came back and saw apache processes that were +450Meg. Cha…
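
One common explanation for a chunked loop like the one above still ballooning the process - not confirmed anywhere in this thread - is that output buffering (or zlib output compression) accumulates the whole file before it is sent. A variant that flushes as it goes would look like this sketch:

<?php
// Same chunked loop, but flushing explicitly so any active output
// buffers cannot accumulate the whole file in memory.
// A possible fix under that assumption, not the thread's diagnosis.
$path = '/path/to/large/file.bin';   // hypothetical path
while (ob_get_level() > 0) {
    ob_end_flush();                  // drop any active output buffers
}
$fp = fopen($path, 'rb');
while (!feof($fp)) {
    print fread($fp, 4096);
    flush();                         // push the chunk out to the client now
}
fclose($fp);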

RE: [PHP] handling large files w/readfile

2005-01-01 Thread Robin Getz
Curt Zirzow wrote: * Thus wrote Richard Lynch: > Sebastian wrote: > > i'm working on an app which outputs files with readfile() and some headers.. > > i read a comment in the manual that says if you're outputting a file php will use the same amount of memory as the size of the file. so, if the…

Re: [PHP] handling large files w/readfile

2005-01-01 Thread Curt Zirzow
* Thus wrote Richard Lynch: > Sebastian wrote: > > i'm working on an app which outputs files with readfile() and some headers.. > > i read a comment in the manual that says if you're outputting a file php will use the same amount of memory as the size of the file. so, if the file is 100MB php will use 100MB of memory…

Re: [PHP] handling large files w/readfile

2005-01-01 Thread Sebastian
…Cc: "Sebastian" <[EMAIL PROTECTED]>; Sent: Friday, December 31, 2004 10:24 PM Subject: Re: [PHP] handling large files w/readfile > I'd go with Richard's basic idea, but if you're outputting a 100MB file I'd use a hell of a lot bigger chunks than 4K. W…

Re: [PHP] handling large files w/readfile

2004-12-31 Thread Rory Browne
I'd go with Richard's basic idea, but if you're outputting a 100MB file I'd use a hell of a lot bigger chunks than 4K. With the syscall and loop overhead, I'd go with at least half a megabyte, or more likely 2MB, depending on your amount of memory. To do this you'd change Richard's echo fread($fp,…
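
That change might look like the sketch below, with the chunk size pulled out into a variable so it can be tuned; the 2 MB figure follows the suggestion above and the path is a placeholder:

<?php
// The same streaming loop with a much larger, tunable chunk size.
$chunkSize = 2 * 1024 * 1024;        // 2 MB per read, per the suggestion above
$path = '/path/to/large/file.bin';   // hypothetical path
$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, $chunkSize);     // fewer syscalls and loop iterations than 4 KB reads
}
fclose($fp);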

Re: [PHP] handling large files w/readfile

2004-12-31 Thread Richard Lynch
Sebastian wrote: > i'm working on an app which outputs files with readfile() and some headers.. > i read a comment in the manual that says if you're outputting a file php will use the same amount of memory as the size of the file. so, if the file is 100MB php will use 100MB of memory.. is this true?…

Re: [PHP] handling large files w/readfile

2004-12-26 Thread Raditha Dissanayake
Sebastian wrote: i'm working on an app which outputs files with readfile() and some headers.. i read a comment in the manual that says if you're outputting a file php will use the same amount of memory as the size of the file. so, if the file is 100MB php will use 100MB of memory.. is this true? I d…

Re: [PHP] handling large files w/readfile

2004-12-26 Thread Heilig \(Cece\) Szabolcs
Hello! > i'm working on an app which outputs files with readfile() and some headers.. > i read a comment in the manual that says if you're outputting a file php will use the same amount of memory as the size of the file. so, if the file is 100MB php will use 100MB of memory.. is this true? > > i…

[PHP] handling large files w/readfile

2004-12-26 Thread Sebastian
i'm working on an app which outputs files with readfile() and some headers.. i read a comment in the manual that says if you're outputting a file php will use the same amount of memory as the size of the file. so, if the file is 100MB php will use 100MB of memory.. is this true? if it is, how can i wo…
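
One workaround that commonly comes up for this situation - not necessarily the one the thread settled on - is to hand the copying to the streams layer with fpassthru() on an already-opened file, or to loop over it in fixed-size chunks as the replies above suggest. A minimal fpassthru() sketch, with a placeholder path and headers omitted:

<?php
// Sketch: serve a file with fpassthru() instead of readfile().
$path = '/path/to/large/file.bin';   // hypothetical path
$fp = fopen($path, 'rb');
fpassthru($fp);   // copy the rest of the stream directly to the output
fclose($fp);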