2012/2/8 Caolán McNamara <caol...@redhat.com>:
> On Tue, 2012-02-07 at 22:05 +0100, Markus Mohrhard wrote:
>> But keep in mind that this is nothing that will be fast. I needed
>> around one week to check a bit more than 4000 files
>
> My times are dramatically faster than this for .doc files in writer at
> least. I can load 4000 .doc files *under valgrind* overnight in about 10
> hours. So apparently mileage varies quite a bit depending on hardware
> and file format and debugging level.
>
That only works if you have no crashes or looping documents. Looping in
particular is a big problem in calc. And if we want to be fast, a
debug/dbgutil build is the wrong way to go, but then we lose the
advantages of gcc's safe iterators. So I think 4000 known-good documents
can easily be tested in one day, or even faster on a "decent" machine,
but taking 4000 random documents from bugzilla needs some manual
interaction and will therefore take more time.

As mentioned in the last mail, I think we could speed this up by copying
the test code and the makefile several times and running the tests in
parallel. That way we could use more cores and would be more resilient
against crashes. (We should of course not commit this stuff then.)

Markus
_______________________________________________
LibreOffice mailing list
LibreOffice@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/libreoffice
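[Editor's sketch] The scheme Markus describes, running many document
loads in parallel so one crash or import loop cannot stall the whole
batch, can be sketched roughly as below. This is a minimal illustration,
not the actual LibreOffice test makefile; the command line passed in
(e.g. a headless soffice invocation) and the helper names are assumptions.

```python
#!/usr/bin/env python3
"""Sketch: load many documents in parallel, classifying each run as
ok / crash / loop. Hypothetical helper, not the real test harness."""
import subprocess
from concurrent.futures import ThreadPoolExecutor

def check_document(path, command, timeout_s=60):
    """Run `command` with `path` appended as its last argument.
    A timeout is treated as a looping import; a non-zero exit as a crash."""
    try:
        result = subprocess.run(command + [path], timeout=timeout_s,
                                capture_output=True)
    except subprocess.TimeoutExpired:
        return (path, "loop")   # killed after timeout: likely an import loop
    return (path, "ok" if result.returncode == 0 else "crash")

def check_documents(paths, command, timeout_s=60, jobs=4):
    """Check documents in parallel; a crash or loop in one document
    does not abort the rest of the batch."""
    with ThreadPoolExecutor(max_workers=jobs) as pool:
        return dict(pool.map(
            lambda p: check_document(p, command, timeout_s), paths))
```

For example, one might call
`check_documents(files, ["soffice", "--headless", "--convert-to", "pdf"], jobs=8)`
(the soffice flags here are only illustrative). Because each document is an
independent subprocess, the number of parallel jobs can simply be matched
to the number of cores available.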