[CC to -devel]
(nobody checks the regression tests for each release, for example
-- and that's trivially done with a web browser!)
That reminds me of an idea I recently had: wouldn't it be possible
to automatically generate a sort of "checksum" for each
regression-test output file and compare it against the previous
release? I know that every little change in the PDF's appearance
would then trigger a warning. But I'm sure there is a way not just
to run "md5sum regression-test-foo.pdf" but to compute a
"difference score" by comparing the two PDFs.
Ideally, this would produce a list like:
foo1.pdf - 0
foo2.pdf - 13
foo3.pdf - 142
foo4.pdf - 0
(...)
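
Just to illustrate how such a score might be computed -- a rough
sketch, not a finished tool; it assumes poppler's pdftoppm and the
Python Imaging Library are installed, and the old/ and new/
directory layout is made up for the example:

  #!/usr/bin/env python3
  # Rough illustration only: rasterize each PDF with pdftoppm
  # (poppler) and count how many pixels changed between the old
  # and new rendering.  old/ and new/ are hypothetical directories
  # holding the previous and current regression-test output.

  import subprocess
  import sys
  from PIL import Image, ImageChops

  def rasterize(pdf, prefix):
      # -singlefile writes prefix.png with no page-number suffix
      # (assumes the regression-test PDFs are single-page)
      subprocess.check_call(
          ["pdftoppm", "-png", "-r", "72", "-singlefile", pdf, prefix])
      return Image.open(prefix + ".png").convert("L")

  def score(old_pdf, new_pdf):
      a = rasterize(old_pdf, "old-page")
      b = rasterize(new_pdf, "new-page")
      if a.size != b.size:
          return -1  # page size changed: always worth a manual look
      diff = ImageChops.difference(a, b)
      # the "difference score": number of pixels that changed at all
      return sum(1 for px in diff.getdata() if px)

  if __name__ == "__main__":
      for name in sys.argv[1:]:
          print("%s - %d" % (name, score("old/" + name, "new/" + name)))

Called as, say, "python3 pdfdiff.py foo1.pdf foo2.pdf ..." (the
script name is made up, of course), it would print exactly the kind
of list above.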
Then the devs can mark foo3.pdf like Jerry R. Ehman's "Wow!"
signal... ;) I mean, this would give a priority for checking:
values above, say, 100 would be urgent to look at.
Cheers,
Michael