I have been digging into this for a couple of weeks now, and I have
progressed to the point where I have the AVCaptureDevice and the session.

Here's what I need help/guidance/assistance on:

I want to capture the video images, in real time, from whichever camera
device I select, and pass the bytes of those images to a Java program.

What I can't seem to figure out is how to get the images in real time,
the way the preview does.  I am trying to emulate how the QuickTime API
worked, using AVFoundation to capture the images and pass them to my
Java program.

The JNI portion I already have figured out; it looks roughly like the
sketch below.
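For context, here is a minimal sketch of the kind of JNI hand-off I
mean, assuming a Java-side receiver object with an onFrame(byte[])
callback.  The method name and signature are hypothetical, not anything
from my actual code:

    #include <jni.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Copy a captured frame into a Java byte[] and hand it to a
       Java-side callback.  "onFrame" and its ([B)V signature are
       hypothetical -- substitute whatever the Java receiver defines. */
    static void sendFrameToJava(JNIEnv *env, jobject receiver,
                                const uint8_t *bytes, size_t length)
    {
        jbyteArray frame = (*env)->NewByteArray(env, (jsize)length);
        (*env)->SetByteArrayRegion(env, frame, 0, (jsize)length,
                                   (const jbyte *)bytes);

        jclass cls = (*env)->GetObjectClass(env, receiver);
        jmethodID mid = (*env)->GetMethodID(env, cls, "onFrame", "([B)V");
        if (mid != NULL) {
            (*env)->CallVoidMethod(env, receiver, mid, frame);
        }
        (*env)->DeleteLocalRef(env, frame);
    }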

I have initialized the AVCaptureSession.  I have the list of
AVCaptureDevices, and I can select one of them.  I can start the
AVCaptureSession using [session startRunning].  What I can't figure
out is the next step: how to get the images from the device.
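For reference, here is a minimal sketch of the setup I have working so
far.  The device choice here is a placeholder; in my code the user
picks the device from the list:

    #import <AVFoundation/AVFoundation.h>

    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Enumerate the available video capture devices.
    NSArray *devices =
        [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *device = [devices objectAtIndex:0]; // placeholder

    // Wrap the chosen device in an input and attach it to the session.
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input && [session canAddInput:input]) {
        [session addInput:input];
    }

    // Start the capture session running.
    [session startRunning];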

Can anyone tell me what the next steps are?  Any snippets of code that I
can look at?  I'm sure I'm not the first to try to do this.

Thanks in advance.