On 23.04.2011 04:15, Terry Reedy wrote:

.close() methods that release operating system resources are needed
*because* there is no guarantee of immediate garbage collection. They
were not needed when CPython was the only Python. The with statement was
added partly to make it easier to make sure that .close() was called.

I was already aware that "with:" saves me a "finally: close(...)" and replaces the "try:" line with a slightly more complicated one, but on the whole it is much easier to remember.
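
For the record, the rough equivalence I mean (a minimal sketch, file name made up):

f = open("data.txt")
try:
    for line in f:
        print line
finally:
    f.close()

# ... which, as far as the closing is concerned, does the same as:

with open("data.txt") as f:
    for line in f:
        print line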

I really like "with" and use it wherever sensible and practical.


But in these (admittedly rare) cases, it is better practice to close
explicitly, isn't it?

If by 'rare case' you mean a generator that opens a file or socket, or
something similar, then yes. One can think of an opened file object as an
iterator (it has __iter__ and __next__) with lots of other methods.

Oh, ok. I never thought about it that way - and it is not exactly the same: a file object already has __enter__ and __exit__ methods, which make closing() unnecessary, while a generator object lacks them.
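
A quick check in the interpreter shows the difference (file name made up):

f = open("data.txt")
print hasattr(f, "__exit__")   # True  - file objects are context managers
g = (c for c in "ab")
print hasattr(g, "__exit__")   # False - generator objects are not
print hasattr(g, "close")      # True  - but they do have close()
f.close()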

But thanks for pointing out that it is only necessary for generators that do real cleanup work when reacting to GeneratorExit or in their finally clause. A programmer should then hand out exactly such generators (or their generator functions) not "as they are", but as context manager objects which yield a generator that gets closed on exit [1].
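
For example (a toy generator, file name made up), the finally clause runs only when the generator is exhausted, garbage collected, or explicitly closed:

def read_lines(path):
    f = open(path)
    try:
        for line in f:
            yield line.rstrip()
    finally:
        f.close()     # runs on exhaustion, or on close() via GeneratorExit

it = read_lines("data.txt")
print next(it)        # consume only part of the generator
it.close()            # raises GeneratorExit inside read_lines, so the
                      # finally closes the file right here, not "eventually"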

Instead of writing such a generator, though, I might consider an
iterator class with __enter__ and __exit__ methods so that it was also a
with-statement context manager, just like file objects and other such
things, so that closing would be similarly automatic. Or easier:

from contextlib import closing

Yes, I mentioned that in the original posting, and I think my question is answered now as well - unlike file objects, generators were not designed to have __enter__/__exit__ themselves, but instead require the use of closing(), probably because this feature is needed so rarely (although it would have been nice to have it always...).
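
That is, something like this is already enough (a toy generator again, just to show where the cleanup happens):

from contextlib import closing

def numbers():
    try:
        yield 1
        yield 2
    finally:
        print "cleanup"

with closing(numbers()) as it:
    for i in it:
        print i
# closing()'s __exit__ calls it.close() here, so "cleanup" is printed
# even if the loop is left early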


Thanks for the answers,

Thomas

[1] Maybe in this way:

class GeneratorClosing(object):
    """Take a parameterless generator function and make it a context
    manager which yields a fresh iterator on each with: invocation.
    That iterator is meant to be used (once, obviously) inside the
    with: block and is automatically closed on exit. Nested usage is
    supported as well."""
    def __init__(self, gen):
        self._genfunc = gen     # the generator function, called per with:
        self._stack = []
    def __enter__(self):
        it = iter(self._genfunc())
        self._stack.append(it)  # a stack, to allow nested usage
        return it
    def __exit__(self, *exc):
        self._stack.pop().close()  # remove and close the innermost iterator

(which can even be used as a decorator for such a generator function, as long as the function doesn't take arguments).

So I can do

@GeneratorClosing
def mygen():
    try:
        yield 1
        yield 2
    finally:
        print "cleanup"

and then use

with mygen as it:
    for i in it: print i
# Now, have the __enter__ call the generator function again in order to
# produce a new generator.
with mygen as it:
    for i in it: print i

and be sure that cleanup happens for every generator created in that way.