On Jun 22, 2017 9:32 AM, "Chris Angelico" <ros...@gmail.com> wrote:
> On Thu, Jun 22, 2017 at 11:24 PM, CFK <cfkar...@gmail.com> wrote:
>> When I draw memory usage graphs, I see sawtooth waves in the memory
>> usage, which suggests that the garbage builds up until the GC kicks
>> in and reaps the garbage.
>
> Interesting. How do you actually measure this memory usage? Often,
> when a GC frees up memory, it's merely made available for subsequent
> allocations, rather than actually given back to the system - all it
> takes is one still-used object on a page and the whole page has to
> be retained. As such, a "create and drop" usage model would tend to
> result in memory usage going up for a while, but then remaining
> stable, as all allocations are being fulfilled from
> previously-released memory that's still owned by the process.

I'm measuring it using a bit of a hack; I use psutil.Popen
(https://pypi.python.org/pypi/psutil) to open the simulation as a
child process, and in a tight loop gather the size of the child's
resident set and the number of virtual pages it currently has in use
(a rough sketch of the loop is at the end of this message). The
sawtooths are about 10% (and decreasing) of the overall memory usage,
and are probably due to different stages of the simulation doing
different things. That is an educated guess, though; I don't have
strong evidence to back it up.

And, yes, what you describe is pretty close to what I'm seeing. The
longer the simulation has been running, the smoother the memory usage
gets.

Thanks,
Cem Karan
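
P.S. For concreteness, the measurement loop looks roughly like this.
It's a minimal sketch rather than my exact code: "./simulation" stands
in for the real command line, the 0.1 s poll interval is arbitrary,
and the page-size lookup assumes a POSIX system.

    import os
    import time

    import psutil

    # Launch the simulation as a child process; psutil.Popen wraps
    # subprocess.Popen and adds psutil's process-inspection methods.
    proc = psutil.Popen(["./simulation"])

    page_size = os.sysconf("SC_PAGE_SIZE")  # bytes per virtual page
    samples = []

    while proc.poll() is None:           # sample until the child exits
        try:
            info = proc.memory_info()    # rss and vms are in bytes
        except psutil.NoSuchProcess:     # child exited between checks
            break
        samples.append((time.time(), info.rss, info.vms // page_size))
        time.sleep(0.1)                  # poll interval; tune to taste

    # Each sample is (timestamp, resident set in bytes, virtual pages);
    # plotting the rss column over time is what shows the sawtooth.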