On Wed, 29 Sep 2010 06:50:05 -0700 (PDT)
Tom Conneely <tom.conne...@gmail.com> wrote:
> 
> My original plan was to have the data processing and data acquisition
> functions running in separate processes, with a multiprocessing.Queue
> for passing the raw data packets. The raw data is read in as a char*
> of non-constant length, so I have allocated memory using
> PyMem_Malloc, and the acquisition function returns a PyCObject
> containing a pointer to this char* buffer, along with a destructor.

That sounds like overkill, and I also wonder how you plan to pass that
object through a multiprocessing Queue, which relies on objects being
picklable (a PyCObject isn't). Why not simply create a PyString object
instead?
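For instance (an untested sketch; acquire_data() here is a stand-in for
your real acquisition routine, assumed to allocate a buffer with
PyMem_Malloc, fill it, and return its length):

    /* Sketch: copy the acquired bytes into a PyString, which is
       picklable and manages its own memory. */
    static PyObject *
    retrieve_buffer(PyObject *self, PyObject *noargs)
    {
        char *buf = NULL;
        Py_ssize_t len;
        PyObject *result;

        len = acquire_data(&buf);   /* hypothetical acquisition call */
        if (len < 0 || buf == NULL)
            return PyErr_NoMemory();
        result = PyString_FromStringAndSize(buf, len);
        PyMem_Free(buf);            /* the string holds its own copy */
        return result;
    }

The resulting str object can then be put on the Queue directly, and its
memory is managed by Python with no destructor of your own.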

> So if I call these functions in a loop, e.g. the following will
> generate ~10 GB of data:
> 
>     x = MyClass()
>     for i in xrange(0, 10 * 2**20):
>         c = x.malloc_buffer()
>         x.retrieve_buffer(c)
> 
> All my memory disappears, until Python crashes with a MemoryError. By
> placing a print in the destructor function I know it's being called,
> but it's not actually freeing the memory. So, in short, what am I
> doing wrong?

Python releases memory by calling free(). Not all platforms actually
return free()d memory to the OS; many C allocators simply set it aside
for the next allocation, so the process footprint doesn't shrink even
though the memory was freed.
A related issue is memory fragmentation. Again, it depends on the
memory allocator.
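Also note that memory obtained with PyMem_Malloc() must be released
with PyMem_Free(), not plain free(); mixing the two is undefined, and
if your destructor does the latter that alone could explain odd
behaviour. A minimal sketch of a matching destructor for
PyCObject_FromVoidPtr():

    /* Destructor passed to PyCObject_FromVoidPtr(); it must use the
       PyMem_* function matching the original allocation. */
    static void
    free_buffer(void *buf)
    {
        PyMem_Free(buf);
    }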

Regards

Antoine.


-- 
http://mail.python.org/mailman/listinfo/python-list
