Re: Blob Detection with Core Image

2008-05-06 Thread Jean-Daniel Dupas
You have to properly configure your QTVisualContext to get this. By default, most of the CoreVideo sample code uses QTOpenGLTextureContextCreate(), and so you get CVOpenGLTextureRefs. If you want to retrieve CVPixelBuffers, you have to create your QTVisualContext using the QTPixelBufferContext…
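
For reference, here is a minimal sketch in C (QuickTime 7 era API) of what that configuration looks like, assuming you want 32-bit BGRA frames; any other CVPixelBuffer attributes can go into the same dictionary:

#include <QuickTime/QuickTime.h>
#include <CoreVideo/CoreVideo.h>

static QTVisualContextRef CreatePixelBufferContext(void)
{
    // Ask for 32-bit BGRA frames (an assumption; pick whatever format you need).
    SInt32 pixelFormat = kCVPixelFormatType_32BGRA;
    CFNumberRef formatNumber =
        CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &pixelFormat);

    CFMutableDictionaryRef bufferAttributes =
        CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(bufferAttributes,
                         kCVPixelBufferPixelFormatTypeKey, formatNumber);

    CFMutableDictionaryRef contextAttributes =
        CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(contextAttributes,
                         kQTVisualContextPixelBufferAttributesKey,
                         bufferAttributes);

    QTVisualContextRef context = NULL;
    OSStatus err = QTPixelBufferContextCreate(kCFAllocatorDefault,
                                              contextAttributes, &context);

    CFRelease(contextAttributes);
    CFRelease(bufferAttributes);
    CFRelease(formatNumber);

    return (err == noErr) ? context : NULL;
}

The resulting context is then attached to the movie in place of the OpenGL texture context, e.g. with SetMovieVisualContext() on the QTMovie's underlying QuickTime Movie.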

Re: Blob Detection with Core Image

2008-05-06 Thread Raphael Sebbe
I understand; processing is done on the CPU anyway. The overhead you get is because you duplicate (or redraw) the image before processing it. I believe you actually get a CVImageBufferRef from QTKit, not a CIImage, and it resides in main memory (not VRAM, since it comes from a camera anyway). You could get acc…
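
A rough sketch of the per-frame pull, assuming the pixel-buffer visual context shown above and a CVTimeStamp coming from something like a CVDisplayLink output callback. With a QTPixelBufferContext the image that comes back is a CVPixelBufferRef already sitting in main memory, so nothing has to be read back from the GPU:

#include <QuickTime/QuickTime.h>
#include <CoreVideo/CoreVideo.h>

static CVPixelBufferRef CopyCurrentFrame(QTVisualContextRef context,
                                         const CVTimeStamp *timeStamp)
{
    CVImageBufferRef frame = NULL;

    if (QTVisualContextIsNewImageAvailable(context, timeStamp)) {
        // The caller owns the returned buffer and must CVBufferRelease() it.
        QTVisualContextCopyImageForTime(context, kCFAllocatorDefault,
                                        timeStamp, &frame);
    }

    // Let the visual context do its per-frame housekeeping.
    QTVisualContextTask(context);

    // With a QTPixelBufferContext this is a CVPixelBufferRef in main memory.
    return (CVPixelBufferRef)frame;
}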

Re: Blob Detection with Core Image

2008-05-06 Thread Bridger Maxwell
I think I was unclear about where I was lost. I didn't think I would be able to use the OpenTouch blob detection framework, because I couldn't pass it a CIImage, and converting the CIImage to an NSBitmapImageRep was too slow. The only way to pass the image data to the blob detection library was t…
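
One way around the NSBitmapImageRep round trip, assuming the frames arrive as CVPixelBufferRefs as described elsewhere in this thread, is to lock the buffer and hand its raw bytes to the blob-detection code directly. DetectBlobsBGRA() below is a hypothetical placeholder for whatever entry point the library actually exposes:

#include <CoreVideo/CoreVideo.h>

// Hypothetical blob-detection entry point taking raw BGRA pixel data.
void DetectBlobsBGRA(const unsigned char *pixels,
                     size_t width, size_t height, size_t bytesPerRow);

static void ProcessFrame(CVPixelBufferRef frame)
{
    CVPixelBufferLockBaseAddress(frame, kCVPixelBufferLock_ReadOnly);

    const unsigned char *pixels = CVPixelBufferGetBaseAddress(frame);
    size_t width       = CVPixelBufferGetWidth(frame);
    size_t height      = CVPixelBufferGetHeight(frame);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(frame);

    // Rows may be padded, so always step through the buffer with
    // bytesPerRow rather than width * 4.
    DetectBlobsBGRA(pixels, width, height, bytesPerRow);

    CVPixelBufferUnlockBaseAddress(frame, kCVPixelBufferLock_ReadOnly);
}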

Blob Detection with Core Image

2008-05-06 Thread Bridger Maxwell
Hello, I am trying to write a program that will detect bright "blobs" of light in an image and then track those blobs. It would be a Cocoa version of OpenTouch at http://code.google.com/p/opentouch/. I am wondering what the best way to do this sort of image processing with Cocoa frameworks is. I ha…
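
For what it's worth, once the frame's bytes are available on the CPU, the simplest possible "bright blob" detector is just a brightness threshold plus a centroid. The sketch below assumes 32-bit BGRA pixels and is only an illustration, not the OpenTouch algorithm; a real tracker would label connected components and match blobs between frames rather than collapsing everything to one centroid:

#include <stddef.h>
#include <stdbool.h>

typedef struct { double x, y; size_t pixelCount; } BrightCentroid;

static bool FindBrightCentroid(const unsigned char *pixels,
                               size_t width, size_t height, size_t bytesPerRow,
                               unsigned char threshold, BrightCentroid *out)
{
    double sumX = 0.0, sumY = 0.0;
    size_t count = 0;

    for (size_t y = 0; y < height; y++) {
        const unsigned char *row = pixels + y * bytesPerRow;
        for (size_t x = 0; x < width; x++) {
            const unsigned char *p = row + x * 4;   // BGRA: p[0]=B, p[1]=G, p[2]=R
            // Crude luma approximation, weighting green double.
            unsigned int luma = (p[0] + 2u * p[1] + p[2]) / 4u;
            if (luma > threshold) {
                sumX += x;
                sumY += y;
                count++;
            }
        }
    }

    if (count == 0)
        return false;

    out->x = sumX / count;
    out->y = sumY / count;
    out->pixelCount = count;
    return true;
}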