Change by Paul Ellenbogen:
Removed file: https://bugs.python.org/file48278/dump.py
___
Python tracker
<https://bugs.python.org/issue36694>
___
___
Python-bugs-list mailing list
Change by Paul Ellenbogen:
Removed file: https://bugs.python.org/file48281/dump.py
Change by Paul Ellenbogen:
Added file: https://bugs.python.org/file48282/dump.py
Paul Ellenbogen added the comment:
Good point. I have created a new version of dump.py that uses random() instead.
Float reuse explains the getsizeof difference, but there is still a significant
memory usage difference. This makes sense to me, because the original code I saw
this issue in is
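The float-reuse point about getsizeof can be demonstrated directly: a list built by repeating one literal holds a million references to a single float object, while a list of random() values holds a million distinct objects. A minimal sketch (not taken from the attached files; sizes are CPython implementation details):

```python
import random
import sys

shared = [0.5] * 1_000_000                               # 1M references, one float object
distinct = [random.random() for _ in range(1_000_000)]   # 1M separate float objects

# getsizeof sees only the list header plus its pointer array in both cases;
# the real memory difference is the number of element objects behind the pointers.
print(sys.getsizeof(shared), sys.getsizeof(distinct))
print(len({id(x) for x in shared}))    # 1 unique element object
print(len({id(x) for x in distinct}))  # 1_000_000 unique element objects
```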
Change by Paul Ellenbogen:
Added file: https://bugs.python.org/file48280/common.py
New submission from Paul Ellenbogen:
Python encounters significant memory fragmentation when unpickling many small
objects.
I have attached two scripts that I believe demonstrate the issue. Running
"dump.py" will generate a large list of namedtuples, then write that list to
disk with pickle.
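The attached scripts are not reproduced in this digest. A minimal sketch of what such a dump/load pair could look like follows; the file name "data.pickle", the Point record type, and the element count are assumptions for illustration, not the actual attachment contents:

```python
# Sketch of dump.py: build a large list of small namedtuples and pickle it.
import pickle
import random
from collections import namedtuple

Point = namedtuple("Point", ["x", "y"])  # hypothetical record type

data = [Point(random.random(), random.random()) for _ in range(1_000_000)]
with open("data.pickle", "wb") as f:
    pickle.dump(data, f)

# Sketch of load.py: unpickling allocates the 1M small objects one by one,
# which is where heap fragmentation would show up in the process RSS.
with open("data.pickle", "rb") as f:
    loaded = pickle.load(f)
print(len(loaded))
```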
Change by Paul Ellenbogen:
Added file: https://bugs.python.org/file48279/load.py
Paul Ellenbogen added the comment:
I think this behavior is due to the underlying dbm module rather than shelve
itself. The same code using dbm directly, rather than shelve, also throws
KeyErrors:

from multiprocessing import Process
import dbm

db = dbm.open("example.dbm", "c")
for i in range(100):
    db[str(i)] = str(i)
# Reading via the handle inherited by a forked child intermittently raises KeyError:
Process(target=lambda: [db[str(i)] for i in range(100)]).start()
New submission from Paul Ellenbogen:
If a shelve is opened and the process then forked, sometimes the shelve will
appear to work in the child, and other times it will throw a KeyError. I
suspect the order of element access may trigger the issue. I have included a
Python script that exhibits the behavior.
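The attached script is not reproduced in this digest. A minimal sketch of the reported pattern might look like the following; the shelf filename is hypothetical, and it assumes the fork start method (the Linux default), since the child must inherit the open handle rather than re-pickle it:

```python
import shelve
from multiprocessing import Process

def reader(db):
    # Child: read every key through the shelf handle inherited across fork().
    # Per the report, this sometimes works and sometimes raises KeyError.
    for i in range(100):
        db[str(i)]

db = shelve.open("fork_demo")  # hypothetical filename
for i in range(100):
    db[str(i)] = i

p = Process(target=reader, args=(db,))  # relies on fork inheriting the handle
p.start()
p.join()
print("child exit code:", p.exitcode)
```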