Hello,

I am currently working on a patch that implements DXGI desktop capturing and 
WASAPI audio capturing for Windows.

For this, I implemented two new devices in libavdevice.


The basics work, but now I'm not sure how to synchronize the audio stream with 
the video stream. Specifically, the two devices obviously maintain independent 
pts, each starting from the point in time when it was first read. So while in 
most cases there is no human-noticeable offset, this does occasionally - with 
some bad luck - produce a rather large error.


So is there anything I'm missing here? How can I make sure the two streams sync 
up (other than hardcoding some kind of communication between them)?


Thanks,

Noah Bergbauer
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel
