On 11/22/2017 05:26 PM, Carl Eugen Hoyos wrote:
> 2017-11-23 1:30 GMT+01:00 John Stebbins <stebb...@jetheaddev.com>:
>> On 11/22/2017 02:36 PM, Carl Eugen Hoyos wrote:
>>> 2017-08-24 0:39 GMT+02:00 Dale Curtis <dalecur...@chromium.org>:
>>>
>>>> -            sc->ctts_data[ctts_count].count = count;
>>>> -            sc->ctts_data[ctts_count].duration = duration;
>>>> -            ctts_count++;
>>>> +            /* Expand entries such that we have a 1-1 mapping with samples. */
>>>> +            for (j = 0; j < count; j++)
>>>> +                add_ctts_entry(&sc->ctts_data, &ctts_count, &sc->ctts_allocated_size, 1, duration);
>>>
>>> count is a 32-bit value read from the file, so this hunk makes
>>> the demuxer allocate huge amounts of memory for some files.
>>>
>>> Is there an upper limit for count?
>>
>> In practice, if a valid mp4 blows up due to this ctts allocation,
>> it's also going to blow up when AVIndexEntries is allocated
>> for the samples.
>> An invalid mp4 can do anything of course.
>
> This is about invalid files allocating >1GB.
Ah, ok. The practical limit would be the number of samples (sc->sample_count). But you can't be certain this is set before the ctts box is parsed, since the value is determined while parsing the stsz box. You *can* be certain it is set before mov_build_index is called.

So perhaps revert this part, and then add code to mov_build_index to expand the ctts_data entries there instead. This would solve the invalid-mp4 allocation issue while still preserving the fix for trampling of ctts.

-- 
John                 GnuPG fingerprint: D0EC B3DB C372 D1F1 0B01 83F0 49F1 D7B2 60D4 D0F7
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-devel