More info please... how big is a "certain size"? How about running "strace -e trace=net"? Even better, can you sit down and step through it and find out where the read is being cut short?
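For what it's worth, something along these lines (a rough, untested sketch; php.net assumed as the test URL) would answer both questions at once -- how many bytes actually arrive versus the Content-Length the server announced, and how many NUL bytes end up in the buffer:

<?php
$data = file_get_contents("http://php.net/");

// $http_response_header is populated by the http:// wrapper
$expected = 0;
foreach ($http_response_header as $h) {
    if (stripos($h, "Content-Length:") === 0) {
        $expected = (int) trim(substr($h, strlen("Content-Length:")));
    }
}

printf("received %d bytes, Content-Length says %d, NUL bytes found: %d\n",
    strlen($data), $expected, substr_count($data, "\0"));
?>

Running the same script under strace -e trace=net should then show whether the short read happens at the socket level or further up in the streams layer.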
The thing is that file_get_contents() should not have been sensitive to
the greedy read problem (since it reads in chunks anyway), so this
doesn't make too much sense to me right now (but I've only spent 5
minutes looking at it; got a lot of work on).

--Wez.

On Mon, 12 Jul 2004 18:02:28 +0000, Curt Zirzow <[EMAIL PROTECTED]> wrote:
> * Thus wrote Antony Dovgal:
> > On Mon, 12 Jul 2004 15:00:09 +0000
> > Curt Zirzow <[EMAIL PROTECTED]> wrote:
> >
> > > * Thus wrote Andi Gutmans:
> > > > When did you try that? Did you check with latest CVS because Wez
> > > > made some fixes.
> > >
> > > Just rechecked with the current CVS version this morning and the
> > > problem still persists.
> > >
> > > Not only is it coming back with partial content, there seem to be
> > > blocks of ^@ chars between the content.
> >
> > Could you give a small reproduce script?
> > I tried it on local and remote files and it works ok for me.
>
> It's simply:
>
> <?php
> echo file_get_contents("http://php.net/");
> ?>
>
> I rechecked this with a fresh compile from the snapshot of
> php5-200407121630. It still occurs for me.
>
> It seems to work fine for regular files and small web pages, but
> breaks when the content from a webpage reaches a certain size.
>
> Curt
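P.S. One way to narrow it down (again just a rough, untested sketch; same URL assumed) would be to pull the same page with an explicit fread() loop over the http:// wrapper and compare it byte-for-byte with what file_get_contents() returns. If the fread() loop comes back clean, the short read is more likely in the chunked-copy path that file_get_contents() uses than in the socket read itself:

<?php
$url = "http://php.net/";

// explicit chunked read over the same http:// stream wrapper
$fp = fopen($url, "rb");
$manual = "";
while (!feof($fp)) {
    $manual .= fread($fp, 8192);
}
fclose($fp);

// the one-shot read that is reportedly truncated
$fgc = file_get_contents($url);

printf("fread loop: %d bytes, file_get_contents: %d bytes, identical: %s\n",
    strlen($manual), strlen($fgc), ($manual === $fgc) ? "yes" : "no");
?>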