On Thu, Aug 28, 2014 at 02:42:45PM +0200, Andreas Cadhalpun wrote:
> On 28.08.2014 14:29, Derek Buitenhuis wrote:
> > On 8/28/2014 12:28 PM, Andreas Cadhalpun wrote:
> >>>>>> ffmpeg -f data -i http://samples.ffmpeg.org/image-samples/lena.pnm -c
> >>>>>> copy -f data -map 0 -y lena.pnm
> >>>>
> >>>> From:
> >>>> <https://lists.ffmpeg.org/pipermail/ffmpeg-devel/2014-August/161881.html>
> >>>
> >>> possible
> >>>
> >>> but would this make andreas / debian happy ?
> >>
> >> No.
> >
> > I think you've all missed Lou's point here. The point isn't to use the URL
> > via -i <url> during testing. The point was that you can use ffmpeg itself
> > to download the file to disk, instead of wget, which saves a dependency
> > from being added.
>
> That's clear. But it doesn't work without an internet connection.
> So why not use fate-rsync, which is only run manually, when internet is
> available?
I completely lost track of the discussion, but this all shouts bikeshed and overkill.

If I may suggest another colour for the shed: put some random, freely licensed image where the Lena image was, put the Lena one in the samples repository, and run the tests with either just one or both, depending on what is available.

Disadvantages: increases runtime.

Advantages: everyone running proper tests gets increased coverage (especially if the image has quite different characteristics), and everyone else still gets a good bit of testing.

_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-devel
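A minimal sketch of the "run with whichever samples exist" idea. The paths, the `FATE_SAMPLES` layout, and the `run_available_tests` helper are illustrative assumptions, not actual FATE code; the echo stands in for the real test invocation:

```shell
#!/bin/sh
# Hypothetical sketch: run a test once per sample image that is
# actually present, skipping any that are missing.

run_available_tests() {
    for img in "$@"; do
        if [ -f "$img" ]; then
            # Stand-in for the real test command, e.g.
            # ffmpeg -i "$img" -f framemd5 -
            echo "testing $img"
        fi
    done
}

# Example: a free image shipped in the source tree plus the Lena image
# from the (optionally rsynced) samples repository.
run_available_tests tests/data/free-image.pnm \
    "${FATE_SAMPLES:-/samples}/image-samples/lena.pnm"
```

With this shape, a tree without the samples repository still tests the free image, while a full FATE setup covers both.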