In a photography app, I'm chaining a series of CIFilters to process an image. The chain itself works fine, but I'm only using a few filters at the moment. As the chain grows longer, am I going to run into performance issues? Here's my code:

exposureFilter = [CIFilter filterWithName:@"CIExposureAdjust"
                            keysAndValues:
                                @"inputImage", inputImage,
                                @"inputEV", [NSNumber numberWithFloat:exposureValue],
                                nil];
outputImage = [exposureFilter valueForKey:@"outputImage"];
[exposureFilter retain];

zoomFilter = [CIFilter filterWithName:@"CILanczosScaleTransform"
                        keysAndValues:
                            @"inputImage", outputImage,
                            @"inputScale", [NSNumber numberWithFloat:mappedZoom],
                            @"inputAspectRatio", [NSNumber numberWithFloat:1.0f],
                            nil];
outputImage = [zoomFilter valueForKey:@"outputImage"];
[zoomFilter retain];


Obviously it's easy to keep adding to this chain. Is this the most efficient way to string multiple filters together? I could probably use Quartz Composer to build a single Image Unit that does all of the processing instead. Would that be any more efficient, or would the result be essentially the same?
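In case it helps frame the question, here's roughly how I'd generalize the chain as it grows, driving it from a description array so adding a filter is just adding an entry. Just a sketch, untested; the `filterSpecs` array and its @"name"/@"params" keys are made up for illustration:

exposureValue and mappedZoom are the same floats as above.

// Sketch (untested): build the chain from an array of
// { filter name, parameter dictionary } descriptions.
NSArray *filterSpecs = [NSArray arrayWithObjects:
    [NSDictionary dictionaryWithObjectsAndKeys:
        @"CIExposureAdjust", @"name",
        [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:exposureValue]
                                    forKey:@"inputEV"], @"params",
        nil],
    [NSDictionary dictionaryWithObjectsAndKeys:
        @"CILanczosScaleTransform", @"name",
        [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithFloat:mappedZoom], @"inputScale",
            [NSNumber numberWithFloat:1.0f], @"inputAspectRatio",
            nil], @"params",
        nil],
    nil];

CIImage *currentImage = inputImage;
for (NSDictionary *spec in filterSpecs) {
    CIFilter *filter = [CIFilter filterWithName:[spec objectForKey:@"name"]];
    // CIFilter inputs are KVC-compliant, so setValue:forKey: works here.
    [filter setValue:currentImage forKey:@"inputImage"];
    NSDictionary *params = [spec objectForKey:@"params"];
    for (NSString *key in params)
        [filter setValue:[params objectForKey:key] forKey:key];
    currentImage = [filter valueForKey:@"outputImage"];
}
// currentImage is the chained result; nothing is actually rendered
// until it's drawn through a CIContext.

But I don't know whether restructuring it this way (or collapsing everything into one Image Unit) changes what Core Image actually does at render time.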

Thanks,

Josh
_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
