On Mar 13, 4:34 pm, Julien PUYDT <julien.pu...@laposte.net> wrote:
> Hi,
>
> among the few failing tests with my ARM build, two are failing for
> accuracy reasons:
>
> File "/home/jpuydt/sage-4.6.2/devel/sage/sage/functions/other.py", line 497:
>      sage: gamma1(float(6))
> Expected:
>      120.0
> Got:
>      119.99999999999997

Why are we even testing the behaviour on floats? Isn't Sage's default
floating-point type RDF?

sage: a = float(119.99999999999997)
sage: a                  # Python's float repr shows all the digits
119.99999999999997
sage: RDF(a)             # RDF prints with fewer digits, so this rounds to 120.0
120.0
sage: float(RDF(a))      # the underlying double is unchanged
119.99999999999997

CPython explicitly defers to the C library it is compiled against to define
the behaviour of "float"; see sys.float_info for more information. It's a
bit ridiculous to doctest the verbatim output of something that is allowed
to vary by definition (although you may want to doctest exactly that in
order to be made aware of possible sources of different behaviour on
different platforms).
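
For instance, one could inspect the compile-time float characteristics
directly. The values below are what a typical IEEE-754 double platform
reports, so take them as an assumption rather than a guarantee:

sage: import sys
sage: sys.float_info.dig        # decimal digits a C double round-trips reliably
15
sage: sys.float_info.mant_dig   # bits of mantissa precision
53
sage: float(repr(119.99999999999997)) == 119.99999999999997   # repr round-trips
True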

Anyway, "RDF" has a lot more tolerant printing behaviour than "float"
-- I guess Python's printing produces enough decimal digits to
faithfully reconstruct the binary representation? Doctesting
RDF(gamma1(float(6.0))) would probably test float arithmetic but print
the result in a way that it is easier to pass doctests, without
allowing for very big inaccuracies.
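
A sketch of what the adjusted doctest might look like (purely
illustrative -- assuming gamma1 returns the same 1-ulp-off value as in
the ARM report above, RDF's rounding on output would still print 120.0):

sage: RDF(gamma1(float(6)))   # tolerant printing hides a 1-ulp error
120.0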
