On 21/07/2022 23:39, Marco Sulla wrote:
I've run this other simple test:
#!/usr/bin/env python3

import tracemalloc
import gc
import pickle

tracemalloc.start()

snapshot1 = tracemalloc.take_snapshot().filter_traces(
    (tracemalloc.Filter(True, __file__), )
)

for i in range(10000000):
    pickle.dumps(iter([]))

gc.collect()

snapshot2 = tracemalloc.take_snapshot().filter_traces(
    (tracemalloc.Filter(True, __file__), )
)

top_stats = snapshot2.compare_to(snapshot1, 'lineno')
tracemalloc.stop()

for stat in top_stats:
    print(stat)
The result is:
/home/marco/sources/test.py:14: size=3339 B (+3339 B), count=63 (+63), average=53 B
/home/marco/sources/test.py:9: size=464 B (+464 B), count=1 (+1), average=464 B
/home/marco/sources/test.py:10: size=456 B (+456 B), count=1 (+1), average=456 B
/home/marco/sources/test.py:13: size=28 B (+28 B), count=1 (+1), average=28 B
It seems that, after 10 million loops, only 63 allocations remain, totalling
only ~3 KB. It seems to me that we can't call it a leak, no? Probably
pickle needs a lot more iterations to be sure there's actually a real leak.
If it were a leak, then the amount of memory used or the counts would
increase with increasing iterations. If that's not happening, i.e. the
memory used and the counts stay roughly the same, then it's probably
not a leak, unless it's a leak of something that happens only once, such
as creating a cache or buffer on first use.
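
One way to check that directly (a minimal sketch, not part of Marco's script;
the iteration counts and the residual_bytes helper are just illustrative) is
to rerun the same measurement at a few different iteration counts and see
whether the residual size grows with them:

import gc
import pickle
import tracemalloc

def residual_bytes(iterations):
    # Bytes still attributed to this file after `iterations` dumps.
    tracemalloc.start()
    before = tracemalloc.take_snapshot().filter_traces(
        (tracemalloc.Filter(True, __file__), )
    )
    for _ in range(iterations):
        pickle.dumps(iter([]))
    gc.collect()
    after = tracemalloc.take_snapshot().filter_traces(
        (tracemalloc.Filter(True, __file__), )
    )
    tracemalloc.stop()
    return sum(stat.size_diff for stat in after.compare_to(before, 'lineno'))

# A real leak should grow roughly linearly with the iteration count;
# a one-time cache or buffer shows up as a constant offset.
for n in (10_000, 100_000, 1_000_000):
    print(n, residual_bytes(n))

If the reported sizes stay flat across those runs, it's almost certainly a
first-use allocation rather than a per-iteration leak.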