On Thu, May 28, 2015 at 05:52:35PM +0000, Urvang Joshi wrote:
> On Wed, May 27, 2015 at 5:33 PM Michael Niedermayer <michae...@gmx.at>
> wrote:
>
> > On Wed, May 27, 2015 at 03:10:05PM -0700, Urvang Joshi wrote:
> > > All the frames that the native muxer gets are fully reconstructed
> > > frames,
> >
> > wrong
> >
> > > and they should not be alpha-blended with the previous frames.
> > >
> > > As per the WebP container spec
> > > https://developers.google.com/speed/webp/docs/riff_container#animation,
> > > the ANMF chunk should specify blending method = do not blend (and
> > > disposal method = do not dispose).
> > >
> > > However, the native muxer was wrongly setting blending method = use
> > > alpha blending.
> > > This bug can be reproduced by converting a source with transparency
> > > (e.g. an animated GIF with transparency) to an animated WebP, and
> > > viewing it with vwebp.
> > > ---
> > >  libavformat/webpenc.c | 2 +-
> > >  1 file changed, 1 insertion(+), 1 deletion(-)
> >
> > this breaks the encoder completely
> > the testcase is the same as previously, but probably any testcase that
> > enables encoding multi-frame animations will do
> > try -cr_threshold 10000 -cr_size 16 for example
>
> Ah, the problem seems to be with sources which don't have alpha, where
> alpha is then introduced by 'cr_threshold' and 'cr_size', for example.
> [There are other cases too, but this is one example.]
>
> I believe the logic for cr_threshold / cr_size is incorrect then,
> unfortunately. Here's why:
>
> 1. The original frame that the encoder gets (before possibly being modified
> based on cr_threshold and cr_size) is fully reconstructed, and should NOT
> be alpha-blended with the previous frame.
> [Yes, this is true. You can reproduce this bug by converting this GIF to
> WebP before this patch: http://dhelemann.de/images/Flug1.gif]
>
> For example, if this frame had a transparent pixel, it should be shown as a
> transparent pixel and should NOT let the corresponding pixel from the
> previous frame show through. This would be achieved by setting blending
> method = "do not blend".
>
> 2. On the other hand, based on cr_threshold and cr_size, some pixels which
> are 'similar' to the corresponding pixels in the previous frame are
> modified to be transparent. So this logic expects that the frame IS
> alpha-blended with the previous frame (so that pixels from the previous
> frame show through).
>
> Clearly, the two requirements conflict and cannot both be met.
>
> One solution I can think of:
> (1) By default, set blending method = "do not blend".
> (2) Some pixels can be modified to become transparent ONLY IF the original
> frame doesn't have any transparent pixels. And if some pixels are made
> transparent, we set blending method = "blend".

sounds possible, unless I am too tired and misunderstand

>
> Thoughts?

WebP allows updating the last frame by using alpha, but it does not allow a
normal RGBA difference update like a P frame with 0,0 MVs would be, IIUC.
Supporting something similar to P frames would be better than trying to
emulate GIF.

I never tried to reencode a GIF to WebP and don't plan to in the future
either; I think the whole WebP design is too much based on GIF, not
considering that it could be used for material that is not from a GIF file.

[...]

-- 
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB

DNS cache poisoning attacks, popular search engine, Google internet authority
dont be evil, please
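For reference, a rough sketch of the decision rule Urvang proposes, written as a
standalone C fragment. The SketchFrame struct, the helper names, and the naive
per-pixel distance check are placeholders for illustration only; they are not the
actual webpenc.c code, and the real cr_threshold/cr_size option works on blocks
rather than single pixels.

```c
#include <stdint.h>
#include <string.h>

/* Symbolic blending methods for this sketch; see the ANMF chunk definition
 * in the WebP container spec for the actual bit encoding. */
enum BlendMethod { BLEND_NONE, BLEND_ALPHA };

/* Minimal frame description used only for this sketch (not the structures
 * actually used in libavformat/webpenc.c). */
typedef struct {
    uint8_t *rgba;   /* packed RGBA, 4 bytes per pixel */
    int      w, h;
} SketchFrame;

/* Does the source frame already contain transparency of its own? */
static int has_transparency(const SketchFrame *f)
{
    for (int i = 0; i < f->w * f->h; i++)
        if (f->rgba[4 * i + 3] != 0xFF)
            return 1;
    return 0;
}

/* Pick the ANMF blending method for 'cur' and, only when it is safe to do
 * so, drop pixels that are close to 'prev' by making them transparent
 * (the cr_threshold idea, reduced here to a naive per-pixel check). */
static enum BlendMethod pick_blend_method(SketchFrame *cur,
                                          const SketchFrame *prev,
                                          int threshold)
{
    /* (1) Default: the frame is fully reconstructed -> "do not blend". */
    enum BlendMethod blend = BLEND_NONE;
    int dropped = 0;

    /* (2) Transparency may be introduced ONLY IF the source frame has no
     * transparent pixels of its own; otherwise introduced transparency
     * could not be told apart from genuine source transparency. */
    if (!prev || has_transparency(cur))
        return blend;

    for (int i = 0; i < cur->w * cur->h; i++) {
        int diff = 0;
        for (int c = 0; c < 4; c++) {
            int d = cur->rgba[4 * i + c] - prev->rgba[4 * i + c];
            diff += d * d;
        }
        if (diff <= threshold) {
            memset(&cur->rgba[4 * i], 0, 4); /* reuse the previous pixel */
            dropped = 1;
        }
    }

    /* If any pixels were dropped, the decoder must blend this frame with
     * the previous one, so switch to "blend" for this frame only. */
    if (dropped)
        blend = BLEND_ALPHA;
    return blend;
}
```

The point of the rule is that transparency is only ever introduced into frames
that were fully opaque to begin with, so the per-frame blending flag can be
chosen without the two requirements above ever conflicting.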