Kristján Valur Jónsson added the comment:

Yes, the old memory argument.
But is it valid? Is there a conceivable application where a dict of weak 
references would be storing a large chunk of the application's memory?
Remember, all of the data must be referred to from elsewhere, or else the 
weak refs would not exist.  An extra list of pointers is unlikely to make a 
difference.
I think the chief reason to use iterators is performance, avoiding the 
creation of temporary objects, not saving memory per se.
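
To put a rough number on that "extra list of pointers" (just a sketch, not 
from either patch; Blob, the payload size, and the counts are made up, and 
getsizeof figures are CPython implementation details that vary by version 
and platform):

    import sys
    import weakref

    class Blob(object):
        def __init__(self):
            self.payload = b"x" * 10000      # ~10 KB of real data

    blobs = [Blob() for _ in range(1000)]    # owned elsewhere, as argued above
    d = weakref.WeakValueDictionary()
    for i, blob in enumerate(blobs):
        d[i] = blob

    keys = list(d.keys())                    # the "extra list of pointers"
    print(sys.getsizeof(keys))               # ~8 KB: one pointer per entry,
                                             # against ~10 MB of payload

The snapshot list scales with len(d) pointers, not with the size of the 
referents, so it is dwarfed by the data the weak refs point at.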

Before the invention of "iteritems()" and friends, all such iteration went 
through lists (and hence cost memory).  We should stay nimble enough to undo 
a previous optimization if the requirements merit it.
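
For reference, the two Python 2 iteration protocols being contrasted (a 
minimal sketch; under 3.x, iteritems() is gone and items() returns a view 
instead of a list):

    d = {1: 'a', 2: 'b', 3: 'c'}

    for k, v in d.items():       # Python 2: builds a full temporary list,
        pass                     # so the loop survives d shrinking

    for k, v in d.iteritems():   # lazy, no temporary list, but the loop
        pass                     # breaks if d changes size while it runs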

As a completely unrelated example of such nimbleness:  faced with stricter 
regulations in the '70s, American car makers had to sell their muscle cars 
with increasingly less powerful engines, effectively rolling back previous 
optimizations :)

Anyway, it's not for me to decide.  We currently have three options:
a) my first patch, which duplicates the 3.x work but is non-trivial and 
could bring stability issues;
b) my second patch, which will increase memory use, but to no more than 
what previous versions of Python used while iterating (see the sketch 
after this list);
c) do nothing, and have iteration over weak dicts randomly break when an 
underlying cycle is unraveled during iteration.
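
To make options (b) and (c) concrete, here is a minimal sketch (not taken 
from the patches; Obj and the counts are made up, and note that current 
CPython guards weak-dict iteration with exactly the kind of machinery the 
first patch adds, so the hazard is shown with a plain dict, where it 
triggers deterministically):

    import weakref

    # The underlying hazard: in the weak-dict case, the deletion happens
    # at an unpredictable moment, whenever GC frees a cycle mid-loop.
    d = {1: 'a', 2: 'b', 3: 'c'}
    try:
        for k in d:
            del d[k]          # an entry "dies" during iteration
    except RuntimeError as e:
        print(e)              # dictionary changed size during iteration

    # Option (b) amounts to snapshotting the keys first -- one pointer
    # per entry, and the loop survives entries dying underneath it:
    class Obj(object):
        pass

    keep = [Obj() for _ in range(10)]        # strong refs, held elsewhere
    wd = weakref.WeakValueDictionary()
    for i, o in enumerate(keep):
        wd[i] = o

    for key in list(wd.keys()):
        obj = wd.get(key)     # None if the referent died meanwhile
        if obj is not None:
            pass              # safe to use obj here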

Cheers!

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue7105>
_______________________________________