On Thu, Dec 08, 2016 at 09:47:53PM +0100, Nicolas George wrote:
> On octidi 18 Frimaire, year CCXXV, Michael Niedermayer wrote:
> > A. Is a heap limit for av_*alloc*() acceptable ?
> > B. Are case based limits acceptable ?
>
> No. This is the task of the operating system.
> > also even if C is chosen, a small set of limits on the main parameters
> > still is needed to allow efficient fuzzing, all issues reported
> > by oss-fuzz recently are "hangs" due to slow decoding,
>
> Then set a limit at the operating system level.

I think you are misunderstanding the problem.

The goal of a fuzzer is to find bugs: crashes, undefined behavior and
other bad things, OOM, hangs. If the code under test can allocate
arbitrary amounts of memory and take arbitrary amounts of time in a
significant number of non-bug cases, then the fuzzer cannot reliably
find the corresponding bugs. Moving the threshold of where to declare
something an OOM or a hang around will not solve this. Blocking
high-resolution, high-channel-count and high-stream-count cases, on
the other hand, should reduce the rate of false positives.

Also, secondarily, resources spent waiting for hangs to separate from
slow decoding, and for real OOM to separate from cases that simply need
a lot of memory, are resources that could be used for other things, like
fuzzing more separate cases.

But either way, I am the wrong person to discuss changes to oss-fuzz
with, if you do have ideas that would improve it ...

[...]

--
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB

If you drop bombs on a foreign country and kill a hundred thousand
innocent people, expect your government to call the consequence
"unprovoked inhuman terrorist attacks" and use it to justify dropping
more bombs and killing more people. The technology changed, the idea
is old.
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-devel