I'm trying to capture video. During capture, the user can select a specific area
of the full video frame to capture.
That much works. The problem is that although the image I get has the
correct size, I cannot figure out how to draw that source image into an
NSImage correctly scaled. In other words, the video gets pulled out of its
correct aspect ratio...
This is what I have:
NSSize imSize = NSMakeSize(CGImageGetWidth(cgImage),
                           CGImageGetHeight(cgImage));
NSImage *screenImage = [[NSImage alloc] initWithCGImage:cgImage
                                                   size:imSize];
[screenImage setScalesWhenResized:YES];  // note: deprecated as of OS X 10.6
[screenImage setSize:movieFrame.size];
I did try drawInRect:fromRect:, but that did not produce any better results,
even though I locked focus on the destination, filled it with black pixels, and
then composited the source over it. Any ideas on what I'm doing wrong here?
_______________________________________________
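For reference, here is a minimal sketch of the drawInRect:fromRect: attempt described above, assuming the same cgImage and movieFrame variables as in the earlier snippet (the variable name scaledImage is mine):

#import <Cocoa/Cocoa.h>

// Sketch of the attempt: lock focus on a destination image of the target
// size, fill it with black, then composite the full source image over it.
NSSize imSize = NSMakeSize(CGImageGetWidth(cgImage),
                           CGImageGetHeight(cgImage));
NSImage *sourceImage = [[NSImage alloc] initWithCGImage:cgImage
                                                   size:imSize];

NSImage *scaledImage = [[NSImage alloc] initWithSize:movieFrame.size];
[scaledImage lockFocus];
[[NSColor blackColor] set];
NSRectFill(NSMakeRect(0, 0, movieFrame.size.width, movieFrame.size.height));
[sourceImage drawInRect:NSMakeRect(0, 0,
                                   movieFrame.size.width,
                                   movieFrame.size.height)
               fromRect:NSMakeRect(0, 0, imSize.width, imSize.height)
              operation:NSCompositeSourceOver
               fraction:1.0];
[scaledImage unlockFocus];

Note that drawing into the full destination rect like this will stretch the source to fill movieFrame.size regardless of the source's aspect ratio, which matches the distortion I'm seeing.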
Cocoa-dev mailing list ([email protected])