
RE: [avr-gcc-list] More results from the testsuite with avrtest


From: Paulo Marques
Subject: RE: [avr-gcc-list] More results from the testsuite with avrtest
Date: Sun, 20 Jan 2008 04:21:08 +0000
User-agent: Internet Messaging Program (IMP) H3 (4.1.2)

Quoting "Weddington, Eric" <address@hidden>:
[...]

FAIL: gcc.c-torture/execute/built-in-setjmp.c execution,  -O2

Now reported as bug #34879:

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=34879


FAIL: gcc.c-torture/execute/builtin-bitops-1.c compilation,  -O0
[...]

The "undefined reference to __ffshi2" had already been reported:

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=34210

The other errors seemed similar enough to be added to the same report, so I just added a new comment.
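For context, the missing `__ffshi2` symbol is the libgcc routine GCC falls back on when it does not expand `__builtin_ffs` inline on a 16-bit-int target like the AVR. A minimal host-side sketch of the builtin's semantics (standard GCC behavior, shown here only to illustrate what the failing tests exercise):

```c
/* __builtin_ffs(x) returns one plus the index of the least
 * significant set bit of x, or 0 if x is zero.  On a target with
 * 16-bit int, GCC may emit a call to the libgcc helper __ffshi2
 * instead of inline code -- the undefined reference behind
 * bug #34210. */
int ffs_demo(int x)
{
    return __builtin_ffs(x);
}
```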

<more identical problems snipped>

FAIL: gcc.c-torture/execute/float-floor.c execution,  -O0
[...]
This test only seems to work if we have an 8-byte "double" type. So,
this should probably be unsupported for the avr.

Agreed. Although at some point it would be nice to have 8 byte double
types for the AVR.

Yes, it would be nice, but in the meantime, the test should be fixed so that we can run the test suite with no failures:

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=34880
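One way such a test could be guarded (a sketch only; not necessarily how bug #34880 was actually resolved) is to check the mantissa width from `<float.h>` rather than assuming binary64. avr-gcc's default 4-byte `double` has a 24-bit mantissa, so `DBL_MANT_DIG >= 53` cleanly separates the two cases:

```c
#include <float.h>

/* Report whether this target's double has the 53-bit mantissa
 * that tests like float-floor.c silently assume.  With avr-gcc's
 * 4-byte double (24-bit mantissa) the expected values are simply
 * not representable, so the test should be skipped. */
int double_is_binary64(void)
{
#if DBL_MANT_DIG >= 53
    return 1;
#else
    return 0;
#endif
}
```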

FAIL: gcc.c-torture/execute/multi-ix.c compilation,  -O0
FAIL: gcc.c-torture/execute/multi-ix.c compilation,  -O1
[...]
If I change STACK_SIZE to 1000, the test runs successfully. Since
the atmega128 has 4 KB of RAM, maybe we should increase our
requirements a little.

Agreed.

From what I remember, there are some other tests in the test suite that
fail on the AVR because it does not have enough RAM to support
them. In those cases, the tests should be disabled for the AVR.

For avrtest, I could set the RAM to 64 KB and pretend I had external RAM by giving the proper parameters to the compiler / linker. This should allow us to run more of the RAM-demanding tests. I'll give it a try.
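The STACK_SIZE convention mentioned above can be sketched roughly like this (the CHUNK macro and function name are made up for illustration; the real multi-ix.c scales its tables its own way):

```c
/* Sketch of the STACK_SIZE convention used by the gcc.c-torture
 * tests: a small-RAM target defines STACK_SIZE on the command
 * line (e.g. -DSTACK_SIZE=1000 for an atmega128 with 4 KB of
 * RAM), and the test sizes its buffers from it instead of
 * hard-coding them. */
#ifndef STACK_SIZE
#define STACK_SIZE 16000          /* generous default for hosts */
#endif

#define CHUNK (STACK_SIZE / 16)   /* per-frame scratch space */

int chunk_size(void)
{
    char scratch[CHUNK];          /* scaled, not hard-coded */
    scratch[0] = 0;
    return (int) sizeof scratch + scratch[0];
}
```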

FAIL: gcc.c-torture/execute/pr17377.c execution,  -O0
[...]
This also uses __builtin_return_address. Still tracking down this one.

See AVR GCC bug #21080
<http://gcc.gnu.org/bugzilla/show_bug.cgi?id=21080>
It references __builtin_return_address. A comment in that bug also
references bug #21078:
<http://gcc.gnu.org/bugzilla/show_bug.cgi?id=21078>
which I mentioned earlier.

So it's possible that these could all be related.

Ok. I'll wait for those bugs to be closed and retry later.


FAIL: gcc.c-torture/execute/pr22493-1.c execution,  -O1
FAIL: gcc.c-torture/execute/pr22493-1.c execution,  -O2
FAIL: gcc.c-torture/execute/pr22493-1.c execution,  -Os

This is an actual bug. This function:

void f(int i)
{
  if (i>0)
    abort();
  i = -i;
  if (i<0)
    return;
  abort ();
}

is compiled to:

void f(int i)
{
  abort ();
}

because "if (i <= 0)", then "(-i >= 0)" must be true, right?

Unfortunately this is wrong for the corner case where "i = INT_MIN",
because "-i" is also INT_MIN.
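The corner case can be demonstrated without invoking the signed-overflow UB itself, by doing the negation in unsigned arithmetic (which is defined to wrap) and converting back. A host-side sketch, assuming GCC's modulo behavior for the out-of-range conversion back to int:

```c
#include <limits.h>

/* On two's-complement hardware INT_MIN has no positive
 * counterpart: negating it wraps right back to INT_MIN.  Doing
 * the negation through unsigned arithmetic keeps this
 * demonstration free of the very signed-overflow UB that lets
 * GCC fold f() down to a plain abort().  (The conversion of an
 * out-of-range unsigned value back to int is
 * implementation-defined; GCC defines it as modulo 2^N.) */
int wrapped_negate(int i)
{
    return (int)(0u - (unsigned)i);
}
```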

Then it probably needs a bug report.

After looking at it some more, I'm starting to wonder: i = -i is actually a signed overflow for the case where i = INT_MIN. So, by the standard, the result is undefined, and the compiler is free to optimize this.

Now, gcc has a couple of flags to deal with this sort of optimization: -fwrapv and -fno-strict-overflow. They do produce different code, but still assume that "i = -i; if (i < 0)" is equivalent to "if (i >= 0)".

So, is it the testsuite that needs fixing, or gcc?


FAIL: gcc.c-torture/execute/pr27364.c execution,  -O1
FAIL: gcc.c-torture/execute/pr27364.c execution,  -O2
FAIL: gcc.c-torture/execute/pr27364.c execution,  -Os

This test assumes 32-bit integers.

Then the test itself needs to be fixed. There have been a number of
cases like this recently where the test would fail on another 16-bit
integer target, and the test was fixed so as to not make that
assumption.

Unfortunately, there is more to this test than meets the eye :(

Even after I changed it to:

int f(unsigned number_of_digits_to_use)
{
  if (number_of_digits_to_use > 1294)
    return 0;
  return ((number_of_digits_to_use * 3321928L) / 1000000L + 1) / 16;
}

the test still fails with -O1, -O2 and -Os. With -O0 it produces correct code, and with -O3 the function f is still miscompiled (as with -O2), but the call is optimized away in main.

With -O2 we get this:

int f(unsigned number_of_digits_to_use)
{
 if (number_of_digits_to_use > 1294)
206:    65 e0           ldi     r22, 0x05       ; 5
208:    8f 30           cpi     r24, 0x0F       ; 15
20a:    96 07           cpc     r25, r22
20c:    c0 f4           brcc    .+48            ; 0x23e <f+0x38>
   return 0;
 return ((number_of_digits_to_use * 3321928L) / 1000000L + 1) /16;
20e:    bc 01           movw    r22, r24
210:    80 e0           ldi     r24, 0x00       ; 0
212:    90 e0           ldi     r25, 0x00       ; 0
214:    0e 94 6c 06     call    0xcd8   ; 0xcd8 <__mulsi3>
[...]

It "forgets" to load r18:r19:r20:r21 with 3321928 before calling __mulsi3.

So maybe I should file 2 bug reports... :P
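For reference, a host-checkable restatement of the rewritten f using fixed-width types from `<stdint.h>` (an assumed portable variant, not the actual pr27364.c source) pins the multiply to 32 bits regardless of the target's int width; exact for inputs up to about 1292, after which the 32-bit product wraps:

```c
#include <stdint.h>

/* Same computation as the rewritten f() above, but with operand
 * widths spelled out: the multiply must happen in (at least)
 * 32 bits even where int is 16 bits wide -- which is exactly the
 * operand setup the AVR code above fails to do before calling
 * __mulsi3. */
int32_t f32(uint16_t number_of_digits_to_use)
{
    if (number_of_digits_to_use > 1294)
        return 0;
    return (int32_t)(((uint32_t)number_of_digits_to_use * 3321928uL
                      / 1000000uL + 1) / 16);
}
```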


<snipped bug reporting instructions>

A VERY BIG THANK YOU! for doing all of this! This work will go a long
way towards improving the quality of the AVR toolchain!

Thank you for your support :)

--
Paulo Marques






