On Tue, 2 Apr 2024 00:18:04 GMT, Alexander Matveev <almat...@openjdk.org> wrote:
> - Added support for the #EXT-X-MEDIA tag to HTTP Live Streaming.
> - The following audio renditions via the #EXT-X-MEDIA tag will be supported (see the CSR for more details):
>   - MP2T streams with one H.264/AVC video track and an elementary AAC audio stream via the #EXT-X-MEDIA tag.
>   - fMP4 streams with one H.264/AVC or H.265/HEVC video track and an elementary AAC audio stream via the #EXT-X-MEDIA tag.
>   - fMP4 streams with one H.264/AVC or H.265/HEVC video track and fMP4 streams with one AAC audio track via the #EXT-X-MEDIA tag.
> - The separate audio stream will be played back via a separate chain of GStreamer elements inside one pipeline. This means two "javasource" elements will be used inside one pipeline, reading data independently of each other via two separate HLSConnectionHolders. GStreamer will handle audio and video synchronization based on PTS, as it does for other streams. Other solutions were considered, such as one "javasource" with multiple source pads, but such an implementation would be more complex and does not provide any benefits.
> - The HLSConnectionHolder that handles the video stream will also parse all information for the separate audio stream and then create a child HLSConnectionHolder for it, which will be responsible for downloading audio segments and seeking in the audio stream.
> - The parser in HLSConnectionHolder was reworked to make it more readable and easier to maintain and extend.
> - JavaDoc was updated to point to the latest HLS specification instead of the old draft. It was also updated with information on the #EXT-X-MEDIA tag, and missing information on AAC elementary streams and fMP4 from previous fixes was added.
> - Fixed and improved debug output in the Linux AV plugins.
> - Added a new property to "dshowwrapper" to disable the PTS reset for each new segment, since with the #EXT-X-MEDIA tag audio and video segments are not aligned and can start at different times.
> - Fixed missing PTS on the first buffer after seek in the MP2T demuxer in "dshowwrapper". Without it, audio and video synchronization breaks with two separate streams.
> - Removed dead code from MediaManager.
> - Added handling for GST_MESSAGE_LATENCY. Based on the GStreamer documentation, we need to call gst_bin_recalculate_latency() when such a message is received. It is not clear whether we really need to do this, but with separate video and audio streams we do receive this message when a seek is done, most likely because video and audio are not aligned perfectly when we seek. For other streams this message is not received in most cases.

Reviewers: @kevinrushforth @arapte

-------------

PR Comment: https://git.openjdk.org/jfx/pull/1435#issuecomment-2033203914
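For readers unfamiliar with the tag: an #EXT-X-MEDIA line in a master playlist declares an alternative rendition (e.g. a separate audio stream) as a comma-separated attribute list, where commas may also appear inside quoted values, so a naive split on ',' is not enough. The sketch below is only an illustration of that parsing problem under RFC 8216 syntax; the class and method names are made up and this is not the actual HLSConnectionHolder parser from the PR:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch: parse the attribute list of a #EXT-X-MEDIA tag
// (RFC 8216, section 4.2). Quoted values may contain commas, so we track
// quote state while scanning instead of splitting on ','.
public class ExtXMediaParser {

    static Map<String, String> parseAttributes(String line) {
        String prefix = "#EXT-X-MEDIA:";
        if (!line.startsWith(prefix)) {
            throw new IllegalArgumentException("Not an EXT-X-MEDIA tag: " + line);
        }
        String attrs = line.substring(prefix.length());
        Map<String, String> result = new LinkedHashMap<>();
        StringBuilder token = new StringBuilder();
        boolean inQuotes = false;
        // Iterate one past the end so a synthetic ',' flushes the last token.
        for (int i = 0; i <= attrs.length(); i++) {
            char c = (i < attrs.length()) ? attrs.charAt(i) : ',';
            if (c == '"') {
                inQuotes = !inQuotes;          // strip quotes, track quoted state
            } else if (c == ',' && !inQuotes) {
                int eq = token.indexOf("=");    // flush "KEY=value" into the map
                if (eq > 0) {
                    result.put(token.substring(0, eq), token.substring(eq + 1));
                }
                token.setLength(0);
            } else {
                token.append(c);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        String line = "#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID=\"aud1\",NAME=\"English\","
                + "DEFAULT=YES,LANGUAGE=\"en\",URI=\"audio/en/prog.m3u8\"";
        Map<String, String> m = parseAttributes(line);
        System.out.println(m.get("TYPE"));      // AUDIO
        System.out.println(m.get("GROUP-ID"));  // aud1
        System.out.println(m.get("URI"));       // audio/en/prog.m3u8
    }
}
```

With attributes parsed this way, the video-side connection holder can read GROUP-ID and URI to locate the audio rendition playlist that the child connection holder then downloads.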