On Thu, 22 Sep 2016 18:45:43 +0000 "Priebe, Jason" <jpri...@cbcnewmedia.com> wrote:
> This patch adds a new filter that allows you to drive dynamic graphic overlays
> on a live encoding by creating/updating/deleting a specified 32-bit PNG.
> This is very different from the overlay filter, because it lets you change
> the overlay in real time during a live stream. It doesn't allow you to
> overlay video on top of video, but you can overlay still images over video,
> which is useful for things like lower-thirds and fullscreen graphics.

This seems like a very odd approach, and the filter seems to have very
limited usefulness. Why can't this be a bit more general? Like providing
an API to set overlay images, and letting ffmpeg.c use it to e.g. load
from a file at intervals, rather than hardcoding yet another CLI
use-case in the libs. We already have a long list of filters that seem
to hardcode such things. Most importantly, it definitely shouldn't
duplicate any blending code.

> +
> +static unsigned long long get_current_time_ms (void)
> +{
> +    unsigned long long ms_since_epoch;
> +    struct timespec spec;
> +
> +    clock_gettime(CLOCK_REALTIME, &spec);

Unportable.

> +
> +    ms_since_epoch =
> +        (unsigned long long)(spec.tv_sec) * 1000 +
> +        (unsigned long long)(spec.tv_nsec) / 1.0e6;
> +
> +    return ms_since_epoch;
> +}
> +
> +static int load_overlay (AVFilterContext *fctx)
> +{
> +    DynOverlayContext *ctx = fctx->priv;
> +
> +    AVFrame *rgba_frame;
> +
> +    struct stat attrib;
> +    int ret;
> +
> +    if ((ret = stat(ctx->overlayfile, &attrib)) != 0)

Not sure if portable.
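To illustrate the portability concern: clock_gettime() is POSIX and not
available everywhere FFmpeg builds. Inside FFmpeg the usual choice would
be av_gettime() from libavutil/time.h (wall-clock microseconds since the
epoch); the sketch below instead uses C11 timespec_get() purely so it
stays self-contained. This is an illustrative rewrite, not part of the
submitted patch.

```c
#include <stdint.h>
#include <time.h>

/* Sketch: a portable millisecond clock using C11 timespec_get()
 * instead of POSIX clock_gettime(CLOCK_REALTIME, ...). Within the
 * FFmpeg tree one would more likely call av_gettime() / 1000. */
static uint64_t get_current_time_ms(void)
{
    struct timespec ts;

    /* TIME_UTC gives calendar time, matching CLOCK_REALTIME */
    timespec_get(&ts, TIME_UTC);

    /* integer arithmetic throughout; the original patch's division
     * by 1.0e6 silently went through floating point */
    return (uint64_t)ts.tv_sec * 1000 + (uint64_t)ts.tv_nsec / 1000000;
}
```

Note this also avoids the original's mixed integer/double expression,
which converted tv_nsec through a double before truncating back.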