On 4/22/2011 4:01 AM, Thomas Rachel wrote:
On 4/22/2011 09:01, Wolfgang Rohdewald wrote:
On Friday 22 April 2011, Terry Reedy wrote:
When returning from the function, g, if local, should
disappear.
yes - it disappears in the sense that it is no longer
accessible, but
AFAIK Python makes no guarantees as to when an object
is destroyed - CPython counts references and destroys
an object when no reference is left, but that is
implementation-specific
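A minimal sketch of that implementation-specific behaviour (the generator
name and the comments are illustrative only):

def counter():
    try:
        yield 1
        yield 2
    finally:
        print("finalized")   # runs when the generator is closed or destroyed

g = counter()
next(g)
del g   # CPython: the reference count drops to zero and "finalized" is
        # printed right away; other implementations may delay this until
        # a later garbage-collection pass.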
.close() methods that release operating system resources are needed
*because* there is no guarantee of immediate garbage collection. They
were not needed when CPython was the only Python. The with statement was
added partly to make it easier to make sure that .close() was called.
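For a file, that guarantee amounts to roughly the following (the filename
and the process() call are placeholders):

with open("data.txt") as f:
    for line in f:
        process(line)        # f.close() runs on block exit, even on error

# versus the manual spelling:
f = open("data.txt")
try:
    for line in f:
        process(line)
finally:
    f.close()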
Right - that's what I was thinking about when writing the OP. But Terry is
right - often the generator doesn't need to know that it is being
closed/terminated, which has only been possible since 2.5 anyway.
But in these (admittedly rare) cases, it is better practice to close
explicitly, isn't it?
If by 'rare case' you mean a generator that opens a file or socket, or
something similar, then yes. One can think of an opened file object as an
iterator (it has __iter__ and __next__) with lots of other methods.
Instead of writing such a generator, though, I might consider an
iterator class with __enter__ and __exit__ methods so that it was also a
with-statement context manager, just like file objects and other such
things, so that closing would be similarly automatic. Or easier:
from contextlib import closing

def generator_that_needs_closing(args): ...

with closing(generator_that_needs_closing(args)) as g:
    for item in g:
        ...  # do stuff
and g.close() will be called on exit from the statement.
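A minimal sketch of such an iterator class, with made-up names, might look
like this:

class LineReader:
    """Iterates over the lines of a file and doubles as a context manager."""
    def __init__(self, path):
        self._file = open(path)
    def __iter__(self):
        return self
    def __next__(self):
        line = self._file.readline()
        if not line:
            raise StopIteration
        return line
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc, tb):
        self._file.close()
        return False          # do not suppress exceptions

with LineReader("data.txt") as reader:
    for line in reader:
        ...  # do stuff; the file is closed when the block is left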
The 'closing' example in the docs uses urlopen, which similarly has a
close() method.
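In its Python 3 form (where urlopen lives in urllib.request), that example
is roughly:

from contextlib import closing
from urllib.request import urlopen

with closing(urlopen('http://www.python.org')) as page:
    for line in page:
        print(line)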
--
Terry Jan Reedy