Hi.

I’m capturing live video on the Mac using the AVFoundation APIs, for display only. It’s a live microscope view of lake water, in which scientists identify and measure organisms. No compression, no saving. I need to draw grid lines, polygons, and text over the captured video. For that I create two CALayers, in addition to the one used to preview the video. This has been working fine for a few years.

Recently I tried to apply a modest image enhancement to the captured video. For that I introduced a Core Image CIFilter (namely CISharpenLuminance), like so:

1. I call setLayerUsesCoreImageFilters:YES on the NSView hosting the video capture.
2. I create the filter programmatically and apply it to the AVCaptureVideoPreviewLayer.

Here’s the code (quite simple, straight from an old Apple sample):
 
    // Get our videoView and its CALayer, to draw video and other stuff on it.
    CALayer *videoViewLayer = [[delegate videoView] layer];   // The delegate is an NSWindowController.
    NSRect bounds = [videoViewLayer bounds];
    [[delegate videoView] setLayerUsesCoreImageFilters:YES];  // <-- new code

    // Create the video-capture preview layer and add it to our videoView.
    AVCaptureVideoPreviewLayer *newPreviewLayer =
        [[AVCaptureVideoPreviewLayer alloc] initWithSession:[self session]];
    [newPreviewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorClear)];
    [newPreviewLayer setVideoGravity:AVLayerVideoGravityResize];
    [newPreviewLayer setFrame:bounds];
    [newPreviewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];

    CIFilter *sharpenLuminance = [CIFilter filterWithName:@"CISharpenLuminance"];  // <-- new code
    [sharpenLuminance setValue:@0.5 forKey:kCIInputSharpnessKey];                  // <-- new code
    [newPreviewLayer setFilters:@[sharpenLuminance]];                              // <-- new code

    [videoViewLayer addSublayer:newPreviewLayer];

    // Add the delegate's grid layer.
    CALayer *gridLayer = delegate.gridLayer;
    gridLayer.frame = bounds;
    gridLayer.backgroundColor = [NSColor clearColor].CGColor;
    gridLayer.autoresizingMask = kCALayerWidthSizable | kCALayerHeightSizable;
    gridLayer.needsDisplayOnBoundsChange = YES;
    [videoViewLayer addSublayer:gridLayer];

    // Add the delegate's overlay layer.
    CALayer *overlayLayer = delegate.overlayLayer;
    overlayLayer.frame = bounds;
    overlayLayer.backgroundColor = [NSColor clearColor].CGColor;
    overlayLayer.autoresizingMask = kCALayerWidthSizable | kCALayerHeightSizable;
    overlayLayer.needsDisplayOnBoundsChange = YES;
    [videoViewLayer addSublayer:overlayLayer];


This seems to work too, filter and all, but I get a nasty error logged on EVERY captured frame:

    Fallingback to pbuffer. FBO status is 36054

Capturing at 30 fps, these log lines are intolerable.

I googled around and found other people complaining about the same error, but they narrowed the scenario down further:
1. It only happens on OS X 10.11 El Capitan.
2. It only happens when the video's pixel format is a YUV variant.

My (and my client's) Macs can only run 10.11, and the video cameras attached to the microscope only support specific YUV pixel formats. What to do?
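
One workaround I'm considering (untested, entirely my own guess): bypass the preview layer's filter path altogether by pulling frames through an AVCaptureVideoDataOutput, asking AVFoundation to deliver BGRA instead of YUV, applying the CIFilter myself, and drawing the result into an ordinary CALayer. The sharpenFilter property below is hypothetical; the AVFoundation and Core Image calls themselves are real API. A sketch:

    // Untested sketch: pull BGRA frames via AVCaptureVideoDataOutput and
    // filter them manually, instead of setting filters on the preview layer.

    // --- In the session setup code ---
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    dataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                      @(kCVPixelFormatType_32BGRA) };   // ask for BGRA, not YUV
    [dataOutput setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("video.filter",
                                                              DISPATCH_QUEUE_SERIAL)];
    if ([[self session] canAddOutput:dataOutput])
        [[self session] addOutput:dataOutput];

    // --- AVCaptureVideoDataOutputSampleBufferDelegate callback ---
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *frame = [CIImage imageWithCVImageBuffer:imageBuffer];
        [self.sharpenFilter setValue:frame forKey:kCIInputImageKey];    // my CISharpenLuminance
        CIImage *sharpened = [self.sharpenFilter outputImage];
        // Render 'sharpened' into a bitmap or a plain CALayer's contents on
        // the main thread; the grid and overlay layers stay as they are.
    }

No idea yet whether the BGRA conversion avoids the fallback path, or what it costs at 30 fps.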

Does anyone have any hints or leads on where to go from here?

1. What is an FBO, and what are we falling back from to the pbuffer?
2. Who emits this error, and where are these error codes documented?
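
For what it's worth, my current guess (unconfirmed) is that the status code is an OpenGL framebuffer status: 36054 is 0x8CD6, which matches GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT in the OpenGL headers. A trivial check:

    // Quick check of my guess that 36054 is an OpenGL framebuffer status.
    // (That the message means GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT is my
    // assumption; I haven't found it documented by Apple anywhere.)
    #include <OpenGL/gl3.h>
    #include <stdio.h>

    int main(void) {
        printf("36054                                 = 0x%X\n", 36054);
        printf("GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT  = 0x%X\n",
               GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT);   // both print 0x8CD6
        return 0;
    }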