I would say that computing more bits would be less confusing. I use the
general rule of thumb that 10 bits equals 3 decimal digits. At present,
SAGE seems to be out on the last digits; I think the answer is R(10/3)
= 3.33333333333335 bits. :-)
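The rule of thumb is just log10(2) ~ 0.301; a quick check in plain Python (not Sage-specific):

```python
import math

# n bits carry n * log10(2) decimal digits of precision.
print(10 * math.log10(2))   # ~3.01: 10 bits is almost exactly 3 decimal digits
```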

On a more serious note, Sage currently claims to be working to 53 bits
of accuracy, which is about 16 decimal digits, but it actually prints 17
significant digits. It is possible that, were this fixed, the problem
would vanish.
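The 16-vs-17 tension is inherent to 53-bit floats: 53 bits is just under 16 decimal digits, yet 16 significant digits are not always enough to round-trip a double, while 17 always are. A small illustration in plain Python:

```python
import math

# 53 bits corresponds to 53 * log10(2) ~ 15.95 decimal digits.
print(53 * math.log10(2))          # ~15.95

# But 16 significant digits do not always round-trip a 53-bit double,
# whereas 17 always do.
x = 0.1 + 0.2                      # actually 0.30000000000000004...
print(float('%.16g' % x) == x)     # False: 16 digits lose information
print(float('%.17g' % x) == x)     # True: 17 digits round-trip exactly
```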

Incidentally, is there a way of changing the *default* precision in
SAGE (at runtime) from 53 bits? I couldn't find this in the manual
after an extensive search.
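For comparison, plain Python's decimal module does expose exactly this kind of runtime-changeable default precision via its context object (in decimal digits rather than bits); the snippet below is Python's decimal API, not Sage's, shown only as an analogy for the behaviour I'm asking about:

```python
from decimal import Decimal, getcontext

# The context precision is a mutable runtime default (decimal digits).
getcontext().prec = 10
print(Decimal(1) / Decimal(3))   # 0.3333333333 (10 digits)

getcontext().prec = 30
print(Decimal(1) / Decimal(3))   # 0.333333333333333333333333333333 (30 digits)
```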

Bill.


