Help required in gcc 3.2.3 optimization
From: Pradeep Kumar
Subject: Help required in gcc 3.2.3 optimization
Date: 29 Sep 2004 10:52:34 -0000
Hi,
We are seeing output differences between an -O2 optimized build and a
-g3 -gdwarf-2 debuggable build of a C++ application that does some very
heavy floating-point number crunching.
The differences appear in very small numbers (close to zero), such as 1.234E-11.
The optimized and debuggable binaries built on the same platform
generate different outputs. We observe this on both of the platforms
mentioned below:
Platform A: Linux 2.1 AS, gcc 2.96, RW SourcePro 3.0
Platform B: Linux 3.0 AS, gcc 3.2.3, RW SourcePro 6.1
However, the debuggable binaries generated on these two platforms give
the same output.
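One possible explanation we are considering is x87 excess precision on
x86: at -O2, intermediate results can stay in 80-bit FPU registers, while
a -g/-O0 build stores them back to 64-bit doubles after each operation,
so the two builds round differently. Below is only an illustrative sketch
under that assumption (it is not taken from our application); the last
digits it prints may differ between an -O2 build and a debuggable build:

    // Sketch: accumulate many tiny values; whether the running total
    // stays in an 80-bit x87 register (-O2) or is spilled to a 64-bit
    // double after every addition (-O0/-g) can change the final rounding.
    #include <cstdio>

    int main()
    {
        double sum = 0.0;
        for (int i = 0; i < 1000000; ++i)
            sum += 1.234e-11;  // same magnitude as the values we see differ
        std::printf("%.17g\n", sum);
        return 0;
    }

Building the sketch (saved as, say, test.cpp) with "g++ -O2 test.cpp"
versus "g++ -g3 -gdwarf-2 test.cpp" and comparing the printed values may
reproduce the effect; adding -ffloat-store (or -mfpmath=sse -msse2 on
SSE2-capable hardware) usually makes the optimized and debuggable
outputs agree.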
The link http://gcc.gnu.org/bugzilla/show_bug.cgi?id=8613 discusses a
bug related to the -O2 flag.
We need detailed information on any issues related to the -O2 flag
used for optimization on either of the above platforms.
If anybody has faced similar issues or has come across any documentation
that discusses this, please share it with us.
Thanks,
Pradeep.