I used a #define to set a value, set one declared variable to the same
literal text, and set a second declared variable to the #define'd value.
The declared values agree with each other exactly; expressions using the
#define differ from them.
If this were simple round-off error, all three values would behave the
same way. Since they do not, the #define must be producing a different
value. I think the #define'd expression might have access to all of the
guard bits in the FPU, whereas the declared values are truncated to
64-bit doubles.
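
One way to check the stored values directly is to print them bit-exactly;
if a, b, and VALUE all print the same bits, the difference has to come
from how the expressions are evaluated rather than from the stored values.
A minimal sketch (assuming a C99 printf that supports %a):

#include <stdio.h>

#define VALUE 1e-4

int
main(void)
{
    double a = 1e-4;
    double b = VALUE;

    /* %a prints the exact value as a hexadecimal float (C99) */
    printf("a     = %a\n", a);
    printf("b     = %a\n", b);
    printf("VALUE = %a\n", (double)VALUE);

    return 0;
}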
Also, please note that this problem does not happen with gcc 2.9.2. I'm
running on Red Hat Linux.
==============================================================================
cat prec.c
==============================================================================
#include <stdio.h>

#define VALUE 1e-4

int
main(void)
{
    double v = 1.0;
    double a = 1e-4;   /* same literal text as VALUE */
    double b = VALUE;  /* initialized from the #define */

    printf("a = %e\n", a);
    printf("def/var - 1 = %e, var/def - 1 = %e\n",
           (VALUE * VALUE) / (a * a) - v, (a * a) / (VALUE * VALUE) - v);
    printf("def/var2 - 1 = %e, var2/def - 1 = %e\n",
           (VALUE * VALUE) / (b * b) - v, (b * b) / (VALUE * VALUE) - v);
    printf("var/var - 1 = %e, def/def - 1 = %e\n",
           (a * a) / (a * a) - v, (VALUE * VALUE) / (VALUE * VALUE) - v);
    printf("var2/var1 - 1 = %e, var2/var2 - 1 = %e\n",
           (b * b) / (a * a) - v, (b * b) / (b * b) - v);

    return 0;
}
==============================================================================
$ gcc prec.c
$ ./a.out
a = 1.000000e-04
def/var - 1 = -7.486416e-17, var/def - 1 = 7.491837e-17
def/var2 - 1 = -7.486416e-17, var2/def - 1 = 7.491837e-17
var/var - 1 = 0.000000e+00, def/def - 1 = 0.000000e+00
var2/var1 - 1 = 0.000000e+00, var2/var2 - 1 = 0.000000e+00
$
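
If the guard-bit hypothesis is right, forcing each product out of the FPU
registers and into a 64-bit double in memory should make the difference
vanish. A sketch of such a test (volatile forces the store; compiling the
original program with gcc's -ffloat-store should have a similar effect):

#include <stdio.h>

#define VALUE 1e-4

int
main(void)
{
    double a = 1e-4;

    /* volatile forces each product to be stored as a 64-bit double,
       discarding any extra precision kept in FPU registers */
    volatile double num = VALUE * VALUE;  /* folded at compile time */
    volatile double den = a * a;          /* computed at run time   */

    printf("def/var - 1 = %e\n", num / den - 1.0);

    return 0;
}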
--
Summary: Numerical error--#define value differs from declared variable value
Product: gcc
Version: 3.4.5
Status: UNCONFIRMED
Severity: normal
Priority: P3
Component: c
AssignedTo: unassigned at gcc dot gnu dot org
ReportedBy: kevin dot glass at pnl dot gov
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=30813