Recently, in the calculator group, a fellow had an older calculator and was 
using the power function, you know, x^y. He'd put in 2^2 and gotten 3.99998 or 
something, where the last digit was off by 2. He was worried that the 
calculator was bad. I explained to him that the calculator was fine; it was 
simply a rounding error. The calculator was not doing 2*2; it used a series or 
some such to calculate the general case. Most calculators today carry about 
two extra guard digits, not shown, to hide most rounding errors. Early 
calculators didn't have any extra digits.
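
To make that concrete, here is a minimal C sketch of the usual exp/ln 
decomposition. The six-digit truncation of the logarithm is just an 
illustrative stand-in for a short internal word length, not any particular 
calculator's algorithm:

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Calculators generally compute x^y as exp(y * ln(x)) rather than
       by repeated multiplication.  Truncating the intermediate log to
       six digits (a crude stand-in for a short internal word length)
       shows how 2^2 can come out just under 4. */
    double ln2 = log(2.0);
    double ln2_short = trunc(ln2 * 1e6) / 1e6;   /* 0.693147 */

    printf("full precision: 2^2 = %.6f\n", exp(2.0 * ln2));
    printf("truncated log:  2^2 = %.6f\n", exp(2.0 * ln2_short));
    return 0;
}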
The Intel case (the Pentium FDIV bug) was actually worse: when its errors 
happened, they were often large, not just in the last digits. The errors also 
tended to cluster around odd integer values. Looking at all possible operands, 
the odds of hitting one were small; looking at integer values, the odds were a 
lot larger, and so were the errors.
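
For anyone curious, the widely circulated test for that bug fits in a few 
lines of C. On a flawed Pentium the remainder below famously came out as 256; 
on a correct FPU it should print 0:

#include <stdio.h>

int main(void)
{
    /* The widely circulated FDIV test case: on a flawed Pentium the
       quotient was wrong around the 5th significant digit, so the
       remainder came out as 256 rather than 0. */
    double x = 4195835.0;
    double y = 3145727.0;

    printf("x/y         = %.15f\n", x / y);
    printf("x - (x/y)*y = %g\n", x - (x / y) * y);
    return 0;
}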
Dwight


________________________________
From: cctalk <cctalk-boun...@classiccmp.org> on behalf of Paul Koning via 
cctalk <cctalk@classiccmp.org>
Sent: Wednesday, January 9, 2019 5:49 AM
To: Tony Duell; General Discussion: On-Topic and Off-Topic Posts
Subject: Re: Teaching Approximations (was Re: Microcode, which is a no-go for



> On Jan 8, 2019, at 11:58 PM, Tony Duell via cctalk <cctalk@classiccmp.org> 
> wrote:
>
> ...
> IIRC one of the manuals for the HP15C had a chapter on 'Why this
> calculator gives the wrong answers'. It covered things like rounding
> errors.
>
> -tony

That reminds me of a nice old quote.

"An electronic pocket calculator when used by a person unconversant with it 
will most probably give a wrong answer in 9 decimal places" -- Dr. Anand 
Prakash, 9 May 1975

Understanding rounding errors is perhaps the most significant part of 
"numerical methods", a subdivision of computer science not as widely known as 
it should be.  I remember learning of a scientist at DEC whose work was all 
about this: making the DEC math libraries not only efficient but accurate to 
the last bit.  Apparently that isn't anywhere near as common as it should be.  
And I wonder how many computer models are used for answering important 
questions where the answers are significantly affected by numerical errors.  
Do the authors of those models know about these considerations?  Maybe.  Do 
the users of those models know?  Probably not.
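
As a small illustration of why that last-bit care matters (a toy example, not 
the DEC work): in single precision, a naive sum of ten million 0.1s drifts 
visibly, while Kahan's compensated summation holds the error down.

#include <stdio.h>

/* Naive running sum: each add discards the low-order bits of the
   small addend once the total grows large. */
static float naive_sum(int n, float x)
{
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += x;
    return sum;
}

/* Kahan compensated sum: a correction term recovers the bits each
   add would otherwise lose. */
static float kahan_sum(int n, float x)
{
    float sum = 0.0f, c = 0.0f;
    for (int i = 0; i < n; i++) {
        float y = x - c;       /* apply stored correction       */
        float t = sum + y;     /* low bits of y are lost here   */
        c = (t - sum) - y;     /* ...and recovered here         */
        sum = t;
    }
    return sum;
}

int main(void)
{
    int n = 10000000;          /* ten million terms of 0.1 */
    printf("naive : %.2f\n", naive_sum(n, 0.1f));
    printf("kahan : %.2f\n", kahan_sum(n, 0.1f));
    printf("ideal : %.2f\n", n * 0.1);
    return 0;
}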

        paul
