I'm working with a C extension that needs to rapidly create and delete objects. I came up with an approach that caches objects as they are deleted and resurrects them instead of creating new objects. It appears to work well, but I'm afraid I may be missing something (besides ignoring the warning in the documentation that _Py_NewReference is for internal interpreter use only).
Below is a simplified version of the approach I'm using:

static void
MyType_dealloc(MyTypeObject *self)
{
    if (I_want_to_save_MyType(self)) {
        // Save the object pointer in a cache
        save_it(self);
    }
    else {
        PyObject_Del(self);
    }
}

static MyTypeObject *
MyType_new(void)
{
    MyTypeObject *self;

    if (there_is_an_object_in_the_cache) {
        self = get_object_from_cache();
        _Py_NewReference((PyObject *)self);
    }
    else {
        if (!(self = PyObject_New(MyTypeObject, &MyType)))
            return NULL;
        initialize_the_new_object(self);
    }
    return self;
}

The objects referenced in the cache have a reference count of 0, and I don't increment the reference count until I need to resurrect the object. Could these objects be clobbered by the garbage collector? Would it be safer to create the new reference before stuffing the object into the cache (even though that will look like a memory leak when running under a debug build)?

Thanks in advance for any comments,
casevh
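P.S. For anyone who wants to experiment, here is a compilable sketch of the scheme above with the placeholders filled in. The free_list array, the MYTYPE_MAXFREE limit, and the MyType details are made up for illustration; only PyObject_New, PyObject_Del, and _Py_NewReference are the real API calls. Keeping dead objects around at refcount 0 is also what CPython's own floatobject.c does with its free list.

#include <Python.h>

/* A stand-in type; the real MyTypeObject lives elsewhere. */
typedef struct {
    PyObject_HEAD
    long value;
} MyTypeObject;

static void MyType_dealloc(MyTypeObject *self);

static PyTypeObject MyType = {
    PyVarObject_HEAD_INIT(NULL, 0)
    "mymodule.MyType",             /* tp_name */
    sizeof(MyTypeObject),          /* tp_basicsize */
    0,                             /* tp_itemsize */
    (destructor)MyType_dealloc,    /* tp_dealloc */
};

/* A fixed-size LIFO cache of dead objects.  The names and the
   size limit are invented for this example. */
#define MYTYPE_MAXFREE 100
static MyTypeObject *free_list[MYTYPE_MAXFREE];
static int numfree = 0;

static void
MyType_dealloc(MyTypeObject *self)
{
    if (numfree < MYTYPE_MAXFREE) {
        /* The refcount is already 0 here; remember the pointer
           instead of freeing the memory. */
        free_list[numfree++] = self;
    }
    else {
        PyObject_Del(self);
    }
}

static MyTypeObject *
MyType_new(void)
{
    MyTypeObject *self;

    if (numfree > 0) {
        self = free_list[--numfree];
        /* Resurrect: resets the refcount to 1 and, in a debug
           build, relinks the object into the interpreter's chain
           of live objects. */
        _Py_NewReference((PyObject *)self);
    }
    else {
        self = PyObject_New(MyTypeObject, &MyType);
        if (self == NULL)
            return NULL;
        self->value = 0;    /* stand-in for real initialization */
    }
    return self;
}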