On 29/09/2020 18:14, Pavel Koshevoy wrote:
On Tue, Sep 29, 2020 at 10:09 AM Mark Thompson <s...@jkqxz.net> wrote:
On 28/09/2020 02:17, Pavel Koshevoy wrote:
On Wed, Sep 23, 2020 at 1:32 PM Paul B Mahol <one...@gmail.com> wrote:
On Mon, Sep 21, 2020 at 09:47:40PM -0600, Pavel Koshevoy wrote:
Allow setparams to be used with hw backed frames and
avoid an assertion failure in avfilter_config_links.
---
libavfilter/vf_setparams.c | 3 +++
1 file changed, 3 insertions(+)
LGTM
applied, pushed.
I reverted this: setparams is not hwframe-aware, so adding the flag does
not make sense. It broke all attempts to use the filter with hardware
frames, because the default context passthrough is required.
E.g.
$ ./ffmpeg_g -y -i in.mp4 -init_hw_device vaapi:/dev/dri/renderD128 -an
-vf hwupload,setparams=range=full,hwdownload out.mp4
...
[hwdownload @ 0x55916d90a580] The input must have a hardware frame
reference.
[Parsed_hwdownload_2 @ 0x55916dcab140] Failed to configure input pad on
Parsed_hwdownload_2
Error reinitializing filters!
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0
Conversion failed!
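To spell out what "default context passthrough" means here (a rough
paraphrase from memory, not the actual avfilter_config_links code): for a
filter that is not marked hwframe-aware, libavfilter forwards the input
link's hw_frames_ctx to the output link while the graph is being
configured, and asserts that such a filter has not set one itself. Marking
setparams hwframe-aware disables that forwarding, and since setparams never
sets the context itself, hwdownload then sees no hardware frames reference:
```
/* rough paraphrase only -- not the real libavfilter source */
#include <libavfilter/avfilter.h>
#include <libavutil/avassert.h>
#include <libavutil/buffer.h>

static void default_hwframes_passthrough(AVFilterLink *inlink,
                                         AVFilterLink *outlink,
                                         int filter_is_hwframe_aware)
{
    if (!filter_is_hwframe_aware) {
        /* a non-aware filter must not have set this itself */
        av_assert0(!outlink->hw_frames_ctx);

        /* the framework forwards the input frames context for it */
        if (inlink && inlink->hw_frames_ctx)
            outlink->hw_frames_ctx = av_buffer_ref(inlink->hw_frames_ctx);
    }
    /* an hwframe-aware filter is expected to manage outlink->hw_frames_ctx
     * on its own -- setparams does not, so with the flag set nothing is
     * propagated and hwdownload has nothing to work with */
}
```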
Perhaps it would help if you explained what your original problem was,
because it definitely wasn't this.
- Mark
It's pretty much this use case, except I'm not using the ffmpeg CLI but the
avfilter API to configure the filter chain, and I'm working with
AV_PIX_FMT_CUDA frames.
Perhaps I am misusing the API, but my patch was sufficient for my needs:
```
bool
VideoFilterChain::setup_filter_links(int num_threads)
{
  graph_ = avfilter_graph_alloc();
  graph_->nb_threads = num_threads;

  int err = avfilter_graph_parse2(graph_, filters_.c_str(), &in_, &out_);
  UL_FAIL_IF_AVERROR(err);

  if (hw_frames_.ref_)
  {
    AVHWFramesContext * hw_frames_ctx = hw_frames_.get<AVHWFramesContext>();
    AVBufferRef * device_ref = hw_frames_ctx->device_ref;

    for (int i = 0; i < graph_->nb_filters; i++)
    {
      AVFilterContext * filter_ctx = graph_->filters[i];
      UL_ASSERT(!filter_ctx->hw_device_ctx);
      filter_ctx->hw_device_ctx = av_buffer_ref(device_ref);

      bool found_hwdownload =
        strcmp(filter_ctx->filter->name, "hwdownload") == 0;
      if (found_hwdownload)
      {
        break;
      }

      for (int j = 0; j < filter_ctx->nb_outputs; j++)
      {
        AVFilterLink * link = filter_ctx->outputs[j];
        UL_ASSERT(!link->hw_frames_ctx);
        link->hw_frames_ctx = av_buffer_ref(hw_frames_.ref_);
```
<http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavfilter/avfilter.h;h=99297ae798aa325ac37836a3a90d9a3f8e1e7a95;hb=HEAD#l497>
Don't write to internal fields.
I'm not sure exactly what you're trying to do by writing the internal
context fields in this section, but I suspect that if you just remove it
entirely then the expected context propagation will happen and it will work
(there is a rough sketch of what I mean below, after your snippets).
```
      }
    }
  }

  err = avfilter_graph_config(graph_, NULL);
  UL_FAIL_IF_AVERROR(err);

  src_ = lookup_src(graph_->nb_filters ? graph_->filters[0] : NULL, "buffer");
  sink_ = lookup_sink(src_, "buffersink");
  UL_FAIL_UNLESS(src_ && sink_);
  return true;
}
```
The filters_ data member could contain the setparams filter, like this:
```
  if (src_pix_fmt != dst_pix_fmt)
  {
    oss << ",format=pix_fmts=" << dst_pix_fmt_txt;
  }

  if (!(same_color_specs || csp_xform_.zscale_transform()))
  {
    // a bold faced lie ... it's either that
    // or do the actual colorspace conversion:
    oss << ",setparams"
        << "=range="
        << av_color_range_name(dst_specs_.color_range)
        << ":color_primaries="
        << av_color_primaries_name(dst_specs_.color_primaries)
        << ":color_trc="
        << to_colorspace_trc_name(dst_specs_.color_trc)
        << ":colorspace="
        << av_color_space_name(dst_specs_.colorspace);
  }
}

oss << ",buffersink";
filters_ = oss.str().c_str();
```
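For what it's worth, here is a rough, untested sketch of what I mean,
reusing the members and helpers from your snippet (graph_, filters_,
hw_frames_, lookup_src/lookup_sink and the UL_* macros are yours; the only
extra header it needs is libavfilter/buffersrc.h): hand the frames context
to the buffer source with av_buffersrc_parameters_set() before configuring
the graph, keep setting the public hw_device_ctx field on the filter
contexts, and let avfilter_graph_config() propagate hw_frames_ctx through
setparams and into hwdownload by itself:
```
#include <cstring>

extern "C"
{
#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavutil/buffer.h>
#include <libavutil/hwcontext.h>
#include <libavutil/mem.h>
}

bool
VideoFilterChain::setup_filter_links(int num_threads)
{
  graph_ = avfilter_graph_alloc();
  graph_->nb_threads = num_threads;

  int err = avfilter_graph_parse2(graph_, filters_.c_str(), &in_, &out_);
  UL_FAIL_IF_AVERROR(err);

  if (hw_frames_.ref_)
  {
    // find the buffer source created by avfilter_graph_parse2:
    AVFilterContext * src = NULL;
    for (unsigned int i = 0; i < graph_->nb_filters; i++)
    {
      if (strcmp(graph_->filters[i]->filter->name, "buffer") == 0)
      {
        src = graph_->filters[i];
        break;
      }
    }
    UL_FAIL_UNLESS(src);

    // give the frames context to the source; av_buffersrc_parameters_set
    // takes its own reference, so hw_frames_.ref_ stays yours:
    AVBufferSrcParameters * par = av_buffersrc_parameters_alloc();
    UL_FAIL_UNLESS(par);
    par->hw_frames_ctx = hw_frames_.ref_;
    err = av_buffersrc_parameters_set(src, par);
    av_freep(&par);
    UL_FAIL_IF_AVERROR(err);

    // hw_device_ctx is a public field meant to be set by the caller,
    // for filters that need a device; the others ignore it:
    AVHWFramesContext * hw_frames_ctx = hw_frames_.get<AVHWFramesContext>();
    for (unsigned int i = 0; i < graph_->nb_filters; i++)
    {
      graph_->filters[i]->hw_device_ctx =
        av_buffer_ref(hw_frames_ctx->device_ref);
    }
  }

  // the links' hw_frames_ctx get filled in here by libavfilter itself:
  err = avfilter_graph_config(graph_, NULL);
  UL_FAIL_IF_AVERROR(err);

  src_ = lookup_src(graph_->nb_filters ? graph_->filters[0] : NULL, "buffer");
  sink_ = lookup_sink(src_, "buffersink");
  UL_FAIL_UNLESS(src_ && sink_);
  return true;
}
```
The link-level hw_frames_ctx fields belong to libavfilter;
AVBufferSrcParameters.hw_frames_ctx and AVFilterContext.hw_device_ctx are
the supported ways to feed hardware contexts in from the application side.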
- Mark