Hi everyone, I have a question about streams and the maximum ‘chunk size’ of 8192.

I’ve read README.STREAMS and found these slides by Wez:

http://netevil.org/blog/2008/07/slides-php-streams

While trying to write an Amazon S3 stream wrapper, I ran into an issue with large files:

$fp = fopen('s3://mvtest/large.html', 'r'); // 30 MB

// This is OK
fseek($fp, 10);
echo fread($fp, 100) . "\n"; // 100 bytes
echo fread($fp, 100) . "\n"; // 100 bytes

// This is also OK (according to the documentation, at most 8192 bytes are returned)
echo fread($fp, 65536) . "\n"; // 8192 bytes

 

My issue is that I would like to request larger ‘chunks’, something like:

stream_set_chunk_size($fp, 65536);

echo fread($fp, 65536) . "\n"; // 65536 bytes
echo fread($fp, 100000) . "\n"; // 65536 bytes
echo fread($fp, 15) . "\n"; // 15 bytes
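Until something like the proposed stream_set_chunk_size() exists, larger reads can at least be emulated in userland by looping fread() until the requested length is filled. A minimal sketch (fread_full() is a hypothetical helper name, not an existing API):

```php
<?php
// Hypothetical helper: fread() may return fewer bytes than requested
// (at most one internal chunk, 8192 bytes for a userspace wrapper),
// so keep reading until $len bytes are collected or EOF is reached.
function fread_full($fp, $len)
{
    $data = '';
    while ($len > 0 && !feof($fp)) {
        $chunk = fread($fp, $len);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $data .= $chunk;
        $len -= strlen($chunk);
    }
    return $data;
}
```

This still pays the per-call overhead for every 8192-byte chunk, which is exactly what a larger chunk_size would avoid.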

 

Then copying to a file while avoiding memory issues:

$wfp = fopen('/tmp/large.html', 'w');
stream_copy_to_stream($fp, $wfp); // read 65536-byte chunks, write default 8192-byte chunks

stream_set_chunk_size($wfp, 65536);
stream_copy_to_stream($fp, $wfp); // read & write 65536-byte chunks

copy('s3://mvtest/large.html', '/tmp/large.html'); // read & write default 8192-byte chunks
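For the copy itself, the memory problem can also be sidestepped in userland today by copying in explicit fixed-size chunks, so only one chunk is ever held in memory. A sketch (copy_in_chunks() is a made-up helper, not an existing function):

```php
<?php
// Hypothetical workaround: copy a stream in fixed-size chunks so the
// whole file is never buffered in memory at once.
function copy_in_chunks($srcPath, $dstPath, $chunkSize = 8192)
{
    $src = fopen($srcPath, 'r');
    $dst = fopen($dstPath, 'w');
    while (!feof($src)) {
        $buf = fread($src, $chunkSize);
        if ($buf === false || $buf === '') {
            break;
        }
        fwrite($dst, $buf);
    }
    fclose($src);
    fclose($dst);
}

// e.g. copy_in_chunks('s3://mvtest/large.html', '/tmp/large.html');
```

This mirrors what stream_copy_to_stream() does internally, except the chunk size is under the caller's control.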

 

Going through the PHP 5.2 source, it looks like there’s support for it, but in some places the 8192-byte chunk is hardcoded:

#define CHUNK_SIZE     8192

PHPAPI size_t _php_stream_copy_to_stream(php_stream *src, php_stream *dest, size_t maxlen STREAMS_DC TSRMLS_DC)
{
        char buf[CHUNK_SIZE]; /* <-- Is there any reason src->chunk_size isn't used here? */

stream_set_chunk_size($fp, 65536); // would mean src->chunk_size = 65536;

 

I’d like to try writing a patch for this. Is there anything I should know about streams, and is there a reason the limit is there in the first place?
