I've got the following code snippet, which is part of code that creates a 
CGImage. The CGImage is created, but the last line of the snippet triggers 
an error message on the console. self._movieAsset is an AVURLAsset created 
from a local file on disk. I've tried some variations on the properties of 
the videoSettings dictionary without making a difference; for example, 
whether kCVPixelBufferOpenGLCompatibilityKey is present in the dictionary at 
all, or has a value of @(YES) or @(NO), makes no difference.
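
Concretely, the variations I tried all amount to one of the following (none of them changes the console output):

```objc
NSDictionary *videoSettings;

// 1. OpenGL compatibility key present and YES (as in the snippet below)
videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
    (id)kCVPixelBufferOpenGLCompatibilityKey : @(YES)
};

// 2. OpenGL compatibility key present and NO
videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
    (id)kCVPixelBufferOpenGLCompatibilityKey : @(NO)
};

// 3. OpenGL compatibility key omitted entirely
videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB)
};
```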

The console message is:
[18:45:59.824] openglVideoCompositor_SetProperty signalled err=-12784 (kFigBaseObjectError_PropertyNotFound) (unrecognised property) at /SourceCache/CoreMedia/CoreMedia-1562.19/Prototypes/MediaConverter/VideoCompositing/FigVideoCompositor_OpenGL.c line 1424
[18:45:59.924] <<<< Boss >>>> figPlaybackBossPrerollCompleted: unexpected preroll-complete notification

==========================

        CMTime frameTime = ...;  // the time I want the frame taken at
        NSArray *tracks = ...;   // the asset's video AVAssetTracks
        AVAssetReader *assetReader;
        assetReader = [[AVAssetReader alloc] initWithAsset:self._movieAsset
                                                     error:nil];
        if (!assetReader)
        {
            return nil;
        }

        NSDictionary *videoSettings = @{
                (id)kCVPixelBufferPixelFormatTypeKey : 
@(kCVPixelFormatType_32ARGB),
                (id)kCVPixelBufferOpenGLCompatibilityKey : @(YES)
            };
        AVAssetReaderVideoCompositionOutput *vidComp;
        vidComp = [AVAssetReaderVideoCompositionOutput
            assetReaderVideoCompositionOutputWithVideoTracks:tracks
                                               videoSettings:videoSettings];
        AVVideoComposition *avVidComp;
        avVidComp = [AVVideoComposition
                        videoCompositionWithPropertiesOfAsset:self._movieAsset];
        vidComp.videoComposition = avVidComp;

        if (![assetReader canAddOutput:vidComp])
        {
            return nil;
        }

        [assetReader addOutput:vidComp];
        // We need a duration for the time range; to get one I'm pulling
        // the min frame duration from the first video track. Need to think
        // about what alternatives we might have.
        AVAssetTrack *track = tracks.firstObject;
        CMTimeRange range = CMTimeRangeMake(frameTime, track.minFrameDuration);
        assetReader.timeRange = range;
        if ([assetReader startReading])  // <-- this call triggers the console error
        {
            // ... read the sample buffer and build the CGImage ...
        }
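
For context, the rest of the code (the part that actually produces the CGImage) works roughly along these lines; this is a sketch, assuming the 32ARGB format requested in videoSettings above:

```objc
// Pull one decoded sample buffer and wrap its pixel data in a CGImage.
CMSampleBufferRef sampleBuffer = [vidComp copyNextSampleBuffer];
if (!sampleBuffer)
{
    [assetReader cancelReading];
    return nil;
}
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(pixelBuffer),
    CVPixelBufferGetWidth(pixelBuffer),
    CVPixelBufferGetHeight(pixelBuffer),
    8,                                          // bits per component
    CVPixelBufferGetBytesPerRow(pixelBuffer),
    colorSpace,
    kCGImageAlphaNoneSkipFirst);                // xRGB: ignore the alpha byte
CGImageRef image = CGBitmapContextCreateImage(context);

CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
CFRelease(sampleBuffer);
```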

_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)