I tweaked one of my format parameters somewhere in the processing chain, and it started working. I think I was accidentally initializing my NSBitmapImageRep in two places. I can't explain why it was behaving the way it was, but it works now.

On 26/04/2008, at 1:49 AM, Jean-Daniel Dupas wrote:

What do you want to do with your captured images? Just display them, or record them?

Both. At the moment I'm probably doing things very inefficiently, but I looked at the OpenGL examples and got very lost very quickly, so I decided to stick with classes I had some idea of how to use. I want to display the original image, process the bitmap data, then display some resulting images as well.

If you just want to display it, you should use an NSOpenGLView and draw the CVBuffer or the CIImage directly. If this is for recording or processing the pixels (without CoreImage), you can try to generate a CVPixelBuffer (using the ARGB pixel format) instead of CVOpenGLTextureBuffer.

You can do this by using QTPixelBufferContextCreate() instead of QTOpenGLTextureContextCreate() to create your drawing context. A CVPixelBuffer allows you to access the bitmap data directly (without going through CIImage and NSBitmapImageRep).
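A rough sketch of what that could look like with the QuickTime 7-era C API. This is an illustrative assumption, not tested code: it assumes you already have a QuickTime movie or capture source to attach the visual context to, and that frames are pulled with QTVisualContextCopyImageForTime() elsewhere in your code.

```objc
#import <QuickTime/QuickTime.h>
#import <CoreVideo/CoreVideo.h>

// Ask for a pixel-buffer visual context that vends 32-bit ARGB CVPixelBuffers.
NSDictionary *pixelAttributes = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB]
    forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
NSDictionary *contextAttributes = [NSDictionary dictionaryWithObject:pixelAttributes
    forKey:(NSString *)kQTVisualContextPixelBufferAttributesKey];

QTVisualContextRef pixelBufferContext = NULL;
OSStatus err = QTPixelBufferContextCreate(kCFAllocatorDefault,
                                          (CFDictionaryRef)contextAttributes,
                                          &pixelBufferContext);
// (check err, then attach the context to your movie/capture source)

// Later, when a frame arrives:
CVPixelBufferRef frame = /* obtained from QTVisualContextCopyImageForTime() */ NULL;
if (frame && CVPixelBufferLockBaseAddress(frame, 0) == kCVReturnSuccess) {
    uint8_t *pixels     = CVPixelBufferGetBaseAddress(frame);
    size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(frame);
    // ... read/process the ARGB pixel data directly,
    //     no CIImage or NSBitmapImageRep round-trip needed ...
    CVPixelBufferUnlockBaseAddress(frame, 0);
}
```

The lock/unlock pair around the base-address access is required: Core Video may not keep the pixels in addressable memory unless the buffer is locked.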

You can check the QTPixelBufferVCToCGImage sample code to see how to set up this kind of context.

What would the "best" technique (in terms of execution speed) be to both display AND process the frames? Is there any reason not to create both a CVPixelBuffer AND a CVBuffer?

Thanks,
Nick
_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
