Hello, all ...

I'm trying to create a sample buffer from an OpenGL view using glReadPixels(), 
so I can write it out with an AVAssetWriter I've set up. I'm recording both 
audio and video, and I'm quite confused about how to do this. So far, I've 
been recording audio and video straight from the sample buffers passed to the 
sample buffer delegate's captureOutput:didOutputSampleBuffer:fromConnection: 
method, using an AVAssetWriter I set up beforehand. However, since I've got a 
shader processing the video, I actually want to write the shader-modified 
frame instead of the one passed to the delegate. The examples I've seen so far 
use AVAssetWriterInputPixelBufferAdaptor, but they only handle video, not 
video and audio together. How do I take what glReadPixels() returns and get it 
into a CMSampleBuffer (or a CVPixelBuffer for the adaptor), so I can write it 
with my AVAssetWriter?
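
In case it helps clarify what I'm after, here's a rough sketch of what I've 
been trying. It's untested, and `adaptor`, `width`, `height`, and `frameTime` 
are just placeholders for my own pixel buffer adaptor, the view's backing 
dimensions, and the frame's presentation timestamp. I'm not at all sure this 
is the right approach:

    #import <AVFoundation/AVFoundation.h>
    #import <OpenGLES/ES2/gl.h>

    // Rough sketch (untested): pull the rendered frame into a CVPixelBuffer
    // from the adaptor's pool, then append it with a presentation time.
    static void writeGLFrame(AVAssetWriterInputPixelBufferAdaptor *adaptor,
                             GLint width, GLint height, CMTime frameTime)
    {
        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn err = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                          adaptor.pixelBufferPool,
                                                          &pixelBuffer);
        if (err != kCVReturnSuccess || pixelBuffer == NULL) {
            NSLog(@"Couldn't get a pixel buffer from the pool (%d)", err);
            return;
        }

        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        // I'm assuming bytes-per-row == width * 4 here; if
        // CVPixelBufferGetBytesPerRow() reports padding, the rows would
        // need to be copied one at a time instead.
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE,
                     CVPixelBufferGetBaseAddress(pixelBuffer));

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        // Caveats I already know about: glReadPixels() returns RGBA while
        // the adaptor's pool is typically 32BGRA, and GL's origin is
        // bottom-left, so the frame may come out channel-swapped and
        // upside down.
        if (![adaptor appendPixelBuffer:pixelBuffer
                   withPresentationTime:frameTime]) {
            NSLog(@"appendPixelBuffer failed at time %lld", frameTime.value);
        }
        CVPixelBufferRelease(pixelBuffer);
    }

Even if something like this works for the video side, I still don't see how 
the audio sample buffers fit in alongside it.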

Any help would be quite appreciated :-)

Regards,

John


