Quoting Kieran Kunhya via ffmpeg-devel (2024-09-23 16:45:30)
> On Mon, Sep 23, 2024 at 3:27 PM Anton Khirnov <an...@khirnov.net> wrote:
> >
> > Quoting Antoni Bizoń (2024-09-23 10:09:51)
> > > I understand that the r_frame_rate is the lowest framerate with which
> > > all timestamps can be represented accurately. And I know it is just a
> > > guess. But why did the logic behind the calculation change?
> >
> > Because you're most likely using a codec like H.264 or MPEG-2 that
> > allows individually coded fields. In that case the timebase must be
> > accurate enough to represent the field rate (i.e. double the frame
> > rate), but the code doing this was previously unreliable, so you'd
> > sometimes get r_frame_rate equal to the frame rate rather than field
> > rate. That is not the case anymore.
>
> This is bizarre and kafkaesque to say the least.
As far as I'm concerned, r_frame_rate is a mistake and should never have
existed. But since it does exist, it's better for the calculation to at
least not depend on whether the lavf internal decoder has been opened
during avformat_find_stream_info() or not (which used to be the case).

--
Anton Khirnov