Josh Rosenberg added the comment:

+1. I've been assuming writelines handled arbitrary generators without issue; 
I guess I've just gotten lucky and only used the file types that do. I've fed 
it enormous (though not infinite) generators built from things like 
itertools.product, on the assumption that it would write them out safely 
without materializing len(seq) ** repeat values in memory.
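
Roughly the kind of thing I mean (the alphabet and repeat count here are 
purely illustrative, not from any real code of mine):

import itertools

# 26 ** 6 candidate lines; far too many to hold in memory at once.
lines = (''.join(chars) + '\n'
         for chars in itertools.product('abcdefghijklmnopqrstuvwxyz',
                                         repeat=6))

with open('candidates.txt', 'w') as f:
    # The assumption: writelines() pulls items one at a time (or in small
    # buffered chunks) and writes as it goes, never materializing the
    # whole generator.
    f.writelines(lines)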

I'd definitely appreciate a documented guarantee of this. It doesn't need to 
explicitly guarantee that each item is written before the next one is pulled 
from the iterator; buffering a reasonable amount of data in memory before 
triggering real I/O is fine (generators that return mutable objects and 
mutate them when the next item is produced are evil anyway, and forcing 
strict one-by-one output would prevent some useful optimizations). But 
anything that uses argument unpacking, collection into a list, or ''.join 
(or, at the C level, PySequence_Fast and the like), forcing the whole 
generator to be exhausted before a single byte is written, is a bad idea.
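
To illustrate the distinction, a rough pure-Python sketch (the function names 
are made up for this comment; this isn't what CPython actually does 
internally):

def writelines_lazy(fileobj, lines):
    # What I'm hoping is guaranteed: pull one item at a time and write as
    # we go, so memory use stays bounded for huge iterables.
    for line in lines:
        fileobj.write(line)

def writelines_eager(fileobj, lines):
    # The bad pattern: exhaust the whole iterator up front (list(),
    # ''.join(), or PySequence_Fast at the C level all do this), which
    # blows up memory before a single byte is written.
    fileobj.write(''.join(lines))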

----------
nosy: +josh.rosenberg

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue21910>
_______________________________________