I have an NSOperation that opens a CGImageRef from a digital camera file (a large file). It creates a full-size TIFFRepresentation in RAM, scales that down into a smaller CGContextRef, and creates an NSImage from the scaled context. The same NSOperation then adds that NSImage to a QTMovie with the -addImage:forDuration:withAttributes: method.
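For reference, the frame-adding step is the standard QTKit -addImage:forDuration:withAttributes: call, roughly as sketched below. This is only a sketch, not the actual body of my -setImageAsMPEG4: method: the @"mp4v" codec type is an example value, and "movie" stands in for my QTMovie instance variable.

// Sketch only -- not the real -setImageAsMPEG4:.
// @"mp4v" is an example codec value; "movie" and "anImage" are placeholders
// for my instance variables.
NSDictionary *attrs = [NSDictionary dictionaryWithObject:@"mp4v"
                                                  forKey:QTAddImageCodecType];
QTTime frameDuration = QTMakeTime(1, 2);   // half a second per frame
[movie addImage:anImage forDuration:frameDuration withAttributes:attrs];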
Once the image is added, I send off an NSNotification and the thread/NSOperation dequeues (I'm assuming). Once the frame-adding operation is complete, the movie that has been generated (originally created with a temp file on the main thread) is added to another NSOperationQueue, which just writes it to a file.

However, I find that I run out of RAM very quickly during the image-creation phase. When analysing it in Instruments, objects don't appear to be getting deallocated (I'm not sure which objects). After about 120 images, my 4 GB MacBook Pro runs out of RAM, with an mmap error stating that it couldn't allocate memory.

I thought encapsulating the image creation and movie creation together in a separate thread would guarantee that, once that operation got dequeued, at least the NSImage objects associated with that autorelease pool would be sent a -release message and be freed. It appears the NSImage is being properly disposed of, because when I try:

NSLog(@"[anImage retainCount] %d", [anImage retainCount]);
[self setImageAsMPEG4:anImage];
NSLog(@"[anImage retainCount] %d", [anImage retainCount]);

the second call to -retainCount causes a crash:

objc[4573]: FREED(id): message retainCount sent to freed object=0x24114470

So the NSImage must be gone? Why does my RAM fill up, then? I keep the full movie in memory because I need to save it to disk later, but it shouldn't hog all of the system's resources after 120 half-second frames, should it?

Here is the -main method of the NSOperation subclass that takes a CGImageRef, converts it to an NSImage, then adds it as a frame to a QTMovie:

-(void)main
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Register this thread with QTKit (thread-safety protection disabled).
    [QTMovie enterQTKitOnThreadDisablingThreadSafetyProtection];
    //[self.movieExport.movie attachToCurrentThread];

    NSAssert(m_fileURL != nil, @"file url is nil!");

    NSImage *anImage = nil;
    if (![self isCancelled]) {
        // Scale the camera image, wrap it in an autoreleased NSBitmapImageRep,
        // and build an NSImage (alloc/init'd here) from the rep's TIFF data.
        CGImageRef scaledImage = [self scaledImageForURL:m_fileURL];
        NSBitmapImageRep *bitmapRep =
            [[[NSBitmapImageRep alloc] initWithCGImage:scaledImage] autorelease];
        CFRelease(scaledImage);
        anImage = [[NSImage alloc] initWithData:[bitmapRep TIFFRepresentation]];
    }

    NSLog(@"[anImage retainCount] %d", [anImage retainCount]);
    [self setImageAsMPEG4:anImage];
    //NSLog(@"[anImage retainCount] %d", [anImage retainCount]);

    //[self.movieExport.movie detachFromCurrentThread];
    [QTMovie exitQTKitOnThread];

    [pool release];
}

I don't think I have enough room in this email to include -scaledImageForURL: or -setImageAsMPEG4:, but I can supply them in a subsequent posting.

Michael
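P.S. The later write-to-file step is essentially just a -writeToFile:withAttributes: call along these lines (again only a sketch; the destination path and the attribute dictionary are placeholders):

// Sketch of the write-out step on the second NSOperationQueue.
// The path below is a placeholder.
NSDictionary *writeAttrs =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                forKey:QTMovieFlatten];
[movie writeToFile:@"/path/to/output.mov" withAttributes:writeAttrs];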
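P.P.S. Would it be expected to make a difference if I released anImage explicitly and drained a narrower pool around the per-frame work? Something like this sketch (same placeholder names as in -main above):

// Sketch of a per-frame pool plus an explicit release of the alloc'd NSImage.
NSAutoreleasePool *framePool = [[NSAutoreleasePool alloc] init];

CGImageRef scaledImage = [self scaledImageForURL:m_fileURL];
NSBitmapImageRep *bitmapRep =
    [[[NSBitmapImageRep alloc] initWithCGImage:scaledImage] autorelease];
CFRelease(scaledImage);

NSImage *anImage = [[NSImage alloc] initWithData:[bitmapRep TIFFRepresentation]];

// Free the rep and the autoreleased TIFF data as soon as the NSImage exists.
[framePool release];

[self setImageAsMPEG4:anImage];
[anImage release];   // balance the alloc above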