If I try to create a binary io.BufferedReader with a 1GB buffer, either with

    open(foo, "rb", buffering=1024*1024*1024)

or with

    io.BufferedReader(open(foo, "rb", buffering=0), buffer_size=1024*1024*1024)

(wrapping the raw, unbuffered file object, since open(foo, "rb") by itself
already returns a BufferedReader), and then read some data from a 4GB file,
why does Python's memory usage not go up immediately by 1GB? Is there a much
smaller maximum buffer size? Does it not pre-load data to fill the buffer if I
only request a few bytes or kilobytes at a time? Am I doing something wrong?
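
For concreteness, here is a minimal sketch of how one might observe this;
it assumes a Unix system (the stdlib resource module) with Linux's KiB units
for ru_maxrss, and "foo" is just a placeholder for the ~4GB test file:

    import io
    import resource

    def max_rss_mib():
        # ru_maxrss is reported in KiB on Linux (bytes on macOS)
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

    path = "foo"  # placeholder for the ~4GB test file

    print("before open:", max_rss_mib(), "MiB")
    # wrap the raw (unbuffered) file so only one buffer is in play
    f = io.BufferedReader(open(path, "rb", buffering=0),
                          buffer_size=1024 * 1024 * 1024)
    print("after open: ", max_rss_mib(), "MiB")

    f.read(4096)  # request only a few KiB
    print("after read: ", max_rss_mib(), "MiB")
    f.close()

If the buffer were allocated and filled eagerly, I would expect the last
figure to jump by about 1GB; it doesn't.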