Robert Elsner <robert.elsn...@googlemail.com> added the comment:

Well, the problem is that performance is severely degraded when calling unpack multiple times. I do not know the size of the files in advance, and they vary from 1M to 1G. I could use a fixed-size buffer, but that is inefficient depending on the file size (too big or too small), and if I change the buffer size on the fly, I end up with the memory leak. I think the caching should take the available memory on the system into account.

The no_leak function has comparable performance without the leak. And I see no point in caching Struct instances once they have gone out of scope and can no longer be accessed; if I let one slip out of scope, I do not want to use it afterwards. Especially considering that struct.Struct behaves as expected, as do array.fromfile and numpy.fromfile.
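To illustrate the pattern being discussed (a minimal sketch only; the function names and the '%dB' format are made up for illustration, not taken from the attached script):

    import struct

    def cached_read(f, n):
        # The module-level struct.unpack() compiles the format string and
        # keeps the compiled Struct object alive in struct's internal cache
        # after the call returns, so varying record sizes accumulate large
        # compiled formats that the caller cannot release.
        return struct.unpack('%dB' % n, f.read(n))

    def no_leak_read(f, n):
        # Compiling the Struct explicitly keeps its lifetime in the caller's
        # hands: it is freed as soon as it goes out of scope.
        s = struct.Struct('%dB' % n)
        return s.unpack(f.read(n))

The second variant is the struct.Struct usage that behaves as expected.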
----------
_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue14596>
_______________________________________