On Jan 8, 2:59 am, "Diez B. Roggisch" <de...@nospam.web.de> wrote:
> casevh schrieb:
> > I'm working with a C extension that needs to rapidly create and delete
> > objects. I came up with an approach to cache objects that are being
> > deleted and resurrect them instead of creating new objects. It appears
> > to work well, but I'm afraid I may be missing something (besides
> > heeding the warning in the documentation that _Py_NewReference is for
> > internal interpreter use only).
> >
> > Below is a simplified version of the approach I'm using:
> >
> > MyType_dealloc(MyTypeObject *self)
> > {
> >     if (I_want_to_save_MyType(self)) {
> >         // Save the object pointer in a cache
> >         save_it(self);
> >     } else {
> >         PyObject_Del(self);
> >     }
> > }
> >
> > MyType_new(void)
> > {
> >     MyTypeObject *self;
> >     if (there_is_an_object_in_the_cache) {
> >         self = get_object_from_cache;
> >         _Py_NewReference((PyObject *)self);
> >     } else {
> >         if (!(self = PyObject_New(MyTypeObject, &MyType)))
> >             return NULL;
> >         initialize_the_new_object(self);
> >     }
> >     return self;
> > }
> >
> > The objects referenced in the cache have a reference count of 0 and I
> > don't increment the reference count until I need to resurrect the
> > object. Could these objects be clobbered by the garbage collector?
> > Would it be safer to create the new reference before stuffing the
> > object into the cache (even though it will look like there is a memory
> > leak when running under a debug build)?
>
> Deep out of my guts I'd say keeping a reference, and using your own
> LRU scheme, would be the safest approach without resorting to dark magic.
>
> Diez
Thanks for the reply. I realized that I missed one detail: the objects are
created by the extension, but they are deleted by Python. I don't know that
an object is no longer needed until its tp_dealloc is called, and at that
point its reference count is already 0.

casevh
--
http://mail.python.org/mailman/listinfo/python-list
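For what it's worth, the approach in the quoted post is essentially the
free-list pattern CPython itself uses for floats and frames: tp_dealloc parks
the dead object in a cache instead of calling PyObject_Del, and the
constructor pops it and reinitializes the object header with PyObject_INIT
(which resets ob_type and the reference count via _Py_NewReference). Below is
a minimal sketch of that pattern, not a tested implementation; MyTypeObject,
MyType (the type object), and initialize_the_new_object() are the placeholder
names from the original post, and the cache size is arbitrary.

#include <Python.h>

/* Sketch only: a small, bounded cache of dead objects.  Cached objects
   are invisible to Python until they are handed out again. */

#define MYTYPE_CACHE_SIZE 100

static MyTypeObject *cache[MYTYPE_CACHE_SIZE];
static int cache_len = 0;

static void
MyType_dealloc(MyTypeObject *self)
{
    if (cache_len < MYTYPE_CACHE_SIZE) {
        /* Park the pointer; its refcount is 0, but Python no longer
           holds any reference to it, so nothing else will touch it. */
        cache[cache_len++] = self;
    }
    else {
        PyObject_Del(self);
    }
}

static MyTypeObject *
MyType_new(void)
{
    MyTypeObject *self;

    if (cache_len > 0) {
        self = cache[--cache_len];
        /* PyObject_INIT resets ob_type and the reference count (it calls
           _Py_NewReference internally); this is how the float free list
           revives a cached object. */
        (void)PyObject_INIT(self, &MyType);
    }
    else {
        self = PyObject_New(MyTypeObject, &MyType);
        if (self == NULL)
            return NULL;
        initialize_the_new_object(self);
    }
    return self;
}

Two caveats: the cyclic garbage collector only visits GC-tracked objects, so
if MyType does not use PyObject_GC_New/PyObject_GC_Track the cached objects
should never be touched by the collector (if it does, they would need to be
untracked before caching). And deferring PyObject_INIT/_Py_NewReference until
an object is actually handed back out keeps the debug-build reference totals
honest, at the cost of the cached objects looking "dead" in the meantime.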