Hello.

> Testing of special functions involves comparing actual values returned
> by CM with expected values as computed with arbitrary precision
> software (I use Maxima [1] for this purpose). As I intend these tests
> to be assessments of the overall accuracy of our implementations, the
> number of test values is quite large.
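For concreteness, a test of the kind described above might look like the
following minimal sketch, assuming JUnit 4 and the Gamma class from
org.apache.commons.math3.special (package names differ between CM
versions); the test class name, the tolerance and the reference values
are only illustrative here (exact log-factorials rather than actual
Maxima output).

import static org.junit.Assert.assertEquals;

import org.apache.commons.math3.special.Gamma;
import org.junit.Test;

public class LogGammaAccuracyTest {
    /** Each row holds { x, expected logGamma(x) }.  The expected values
     *  here are exact logs of small factorials, standing in for values
     *  that would normally come from an arbitrary precision tool. */
    private static final double[][] REFERENCE = {
        { 1.0, 0.0 },
        { 2.0, 0.0 },
        { 3.0, 0.6931471805599453 }, // ln(2!)
        { 4.0, 1.791759469228055  }, // ln(3!)
    };

    @Test
    public void testLogGammaAgainstReference() {
        for (double[] row : REFERENCE) {
            final double actual = Gamma.logGamma(row[0]);
            // Tolerance chosen loosely for the sketch; a real accuracy
            // assessment would rather track the error in ulps.
            assertEquals("x = " + row[0], row[1], actual, 1e-13);
        }
    }
}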
How large?

> For the time being, I've inlined the reference values in double[][]
> arrays, in the test classes.

A priori, that's fine.

> This clutters the code, and I will move these reference values to
> resource files.

I'm not fond of this idea. I prefer unit test classes to be
self-contained as much as possible.

> In order to limit the size of these files, I'm considering binary
> files, the obvious drawback being the lack of readability (for those
> of us who haven't entered the Matrix yet).
> So what I would propose is to add a readme.txt file in the same
> resource file directory, where the content of each binary file would
> be detailed.
> Would you object to that?

Why do you want to test a very large number of values? Isn't it enough
to select problematic cases (near boundaries, very small values, very
large values, etc.)? I'm not sure that unit tests should aim at testing
all values exhaustively. That might be a side project, maybe to be
included in the user guide (?).

> I'm thinking of reserving the *.dat extension for these binary files.
> This would entail renaming a few resource files from *.dat (which I
> had myself introduced in the optimization.general package) to *.txt.
> Is that OK?

Let's first decide about the break-up...

Regards,
Gilles
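For illustration only, a reader for such a binary reference file might
look like the sketch below; the class name, the resource layout (rows of
big-endian doubles as written by DataOutputStream.writeDouble) and the
column convention are assumptions, not something settled in this thread.

import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

/** Reads a reference-value resource assumed to contain a flat sequence
 *  of big-endian doubles, grouped in rows of {@code columns} values
 *  (e.g. x followed by the expected function value).  The actual layout
 *  of each file would be documented in the accompanying readme.txt. */
public final class BinaryReferenceReader {

    private BinaryReferenceReader() {}

    public static List<double[]> read(String resource, int columns)
        throws IOException {
        final InputStream in =
            BinaryReferenceReader.class.getResourceAsStream(resource);
        if (in == null) {
            throw new IOException("resource not found: " + resource);
        }
        final DataInputStream data = new DataInputStream(in);
        final List<double[]> rows = new ArrayList<double[]>();
        try {
            while (true) {
                final double[] row = new double[columns];
                row[0] = data.readDouble(); // throws EOFException at end of file
                for (int i = 1; i < columns; i++) {
                    row[i] = data.readDouble();
                }
                rows.add(row);
            }
        } catch (EOFException endOfFile) {
            // reached the end of the reference file
        } finally {
            data.close();
        }
        return rows;
    }
}

A test could then iterate over the rows returned by, say,
BinaryReferenceReader.read("/gamma-reference.dat", 2) (a hypothetical
resource name) exactly as in the inlined-array sketch above.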