On Mon, 22 Jan 2024, Tomas Härdin wrote:
> > > > -    if (frame->quality)
> > > > -        enc->lambda = frame->quality - 1;
> > > > -    else
> > > > -        enc->lambda = 2*ROQ_LAMBDA_SCALE;
> > > > +    if (avctx->bit_rate <= ROQ_DEFAULT_MIN_BIT_RATE) {
> > > > +        /* no specific bit rate desired, use frame quality */
> > > > +        if (frame->quality)
> > > > +            enc->lambda = frame->quality - 1;
> > > > +        else
> > > > +            enc->lambda = 2*ROQ_LAMBDA_SCALE;
> > > > +    }
> > >
> > > This looks like a bit of a janky way to switch between qscale and
> > > bitrate. Isn't there a way to detect whether an option has been set
> > > explicitly? At the very least this behavior should be documented in
> > > doc/encoders.texi
> >
> > Originally, the code just checked for bit_rate !=
> > AV_CODEC_DEFAULT_BITRATE, which required including options_table.h,
> > which in turn produced a bunch of compilation warnings about certain
> > fields being deprecated. None of the other codecs include that file,
> > and many simply check the bit_rate field against magic constants.
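
A side note, not necessarily something this patch needs: libavutil has
av_opt_is_set_to_default_by_name(), which would avoid including
options_table.h. A rough, untested sketch of how that could look in
roqvideoenc.c - the helper name is made up, and it only tells you
whether the current value still equals the option's default, so an
explicit "-b 200k" is still indistinguishable from no -b at all:

#include "libavutil/opt.h"

/* hypothetical helper, for illustration only: nonzero if the "b"
 * option still sits at its library-wide default */
static int roq_bitrate_is_default(AVCodecContext *avctx)
{
    /* av_opt_is_set_to_default_by_name() returns >0 if "b" equals its
     * default, 0 if it was changed, <0 on error (treated as default) */
    return av_opt_is_set_to_default_by_name(avctx, "b", 0) != 0;
}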
> Grepping for 200000 didn't reveal anything like that. Do you have a
> specific example of an encoder that does this?
>
> Perhaps we could move AV_CODEC_DEFAULT_BITRATE somewhere else, to
> avoid pulling in a bunch of unrelated stuff. Maybe that doesn't need
> to hold up this patch though. Tbh the way bitrate is defaulted to a
> value, which makes it impossible to differentiate between a
> user-supplied -b 200k and no -b at all, is even more janky. The
> default is also ridiculously low...
>
> I know some encoders like libvpx allow specifying both quality (-crf)
> and bitrate at the same time.
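
Indeed - with libvpx-vp9 that combination selects its constrained
quality mode. Very roughly, and untested, on the API side it looks
something like this (the function name is just for illustration):

#include "libavcodec/avcodec.h"
#include "libavutil/opt.h"

/* illustration only: libvpx-vp9 takes a quality target and a bit rate
 * cap at the same time ("constrained quality") */
static AVCodecContext *open_vp9_cq(void)
{
    const AVCodec *codec = avcodec_find_encoder_by_name("libvpx-vp9");
    AVCodecContext *ctx  = avcodec_alloc_context3(codec);
    if (!ctx)
        return NULL;
    ctx->bit_rate = 2000000;                     /* -b:v 2M */
    av_opt_set(ctx->priv_data, "crf", "30", 0);  /* -crf 30 */
    return ctx;
}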
FWIW, it's possible for an encoder to individually override the defaults
for fields like these. See e.g. x264_defaults in libx264.c, where it
overrides the default bitrate to zero.
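
For the RoQ encoder that could be as little as the following (untested
sketch, array name just for illustration); with the default forced to
0, any nonzero bit_rate would mean the user actually asked for one:

#include "codec_internal.h"

/* illustration: per-encoder default overriding the library-wide 200k,
 * hooked up with .defaults = roq_defaults in ff_roq_encoder */
static const FFCodecDefault roq_defaults[] = {
    { "b", "0" },
    { NULL },
};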
// Martin