[EMAIL PROTECTED] said:

> I ran some more tests, some of which might be more significant:
>
>                   time (sec)   db size (kB)    peak RAM (MB)
> no coverage           15          ---             ~ 10
> Data::Dumper+eval    246          245             ~ 23.4
> Storable             190           60             ~ 19.7
> no storage           184          ---             ~ 18
>
> The 'no coverage' run is to provide a baseline.
>
> For the 'no storage' test, I ran with Devel::Cover enabled, but modified
> the read() and write() methods to be essentially no-ops. I did this to
> isolate the time overhead of coverage itself, as opposed to the time
> spent reading and writing the db.

Thanks.  This is interesting.
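
I take it the 'no storage' patch was something along the lines of the
sketch below, just stubbing out the db I/O so the data is still collected
but never touches the disk.  (The package and method names here,
Devel::Cover::DB::read and write, are my assumption rather than whatever
you actually changed.)

    use Devel::Cover;
    use Devel::Cover::DB;    # make sure the DB class is loaded before patching it

    # Replace the db I/O with no-ops: coverage is still collected in
    # memory, but nothing is read from or written to the coverage db.
    no warnings 'redefine';
    *Devel::Cover::DB::read  = sub { return $_[0] };
    *Devel::Cover::DB::write = sub { return $_[0] };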

Was this using all the coverage criteria?  I suspect so.  At some point I
need to do some tests to determine the average overhead of the different
criteria, but in general running subroutine and statement coverage will
give the lowest overhead, adding branch coverage will bump up the overhead
significantly, and putting condition coverage on top will get you to the
12x slowdown seen above.  All of this depends on the actual code being
covered, of course.
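
If you want to measure the criteria separately before I get around to it,
runs along these lines should do it, selecting the criteria explicitly
(quoting the option from memory, so check the documentation):

    perl -MDevel::Cover=-coverage,statement,subroutine yourprog args
    perl -MDevel::Cover=-coverage,statement,subroutine,branch yourprog args
    perl -MDevel::Cover=-coverage,statement,subroutine,branch,condition yourprog args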

There is overhead in the time needed to collect coverage, in the memory
required whilst it is being collected, in the disk space required to store
it, and in the size of the (HTML) report files.
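
The db goes into cover_db by default, so something like this gives a rough
feel for the last two (assuming a unixish du):

    du -sh cover_db        # size of the coverage db alone
    cover -report html     # generate the html report into cover_db
    du -sh cover_db        # db plus report files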

-- 
Paul Johnson - [EMAIL PROTECTED]
http://www.pjcj.net
