avr-libc-dev


From: David Brown
Subject: Re: [avr-libc-dev] [bug #34695] fixed width int types without __attribute__()
Date: Tue, 01 Nov 2011 14:23:58 +0100
User-agent: Mozilla/5.0 (Windows NT 5.1; rv:6.0.1) Gecko/20110830 Thunderbird/6.0.1

On 01/11/2011 13:45, Joerg Wunsch wrote:
> As David Brown wrote:
>
>> It is even more ironic that there are already standard definitions in
>> stdint.h precisely to support tools other than avr-gcc, namely doxygen.
>
> OK, so what's wrong with writing
>
> #if defined(__DOXYGEN__) || defined(LINT)
>
> then?


That might be okay.  However, why not do this:

#if defined(__DOXYGEN__)
#define __attribute__(discard)
#endif

Then you can have a single set of definitions:

typedef unsigned char uint8_t __attribute__((__mode__(__QI__)));
typedef unsigned int uint16_t __attribute__((__mode__(__HI__)));
typedef unsigned long int uint32_t __attribute__((__mode__(__SI__)));
#if !__USING_MINT8
typedef unsigned long long int uint64_t __attribute__((__mode__(__DI__)));
#endif


This means your real-world definitions are the same as the DOXYGEN ones (always a good idea, if possible). Tools that don't like __attribute__ will get the correct sizes, except in the odd case of people using -mint8 (which is already asking for trouble with third-party tools, since it is not standard C). avr-gcc will always see the correct size, even with -mint8, since the "mode" attributes will override the sizes of the types.

Surely that would keep everyone happy?
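
As a quick sanity check on that last point, one could compile something like the following with avr-gcc, both with and without -mint8 (just a sketch - the typedef and array names here are made up for illustration):

typedef unsigned int  check_u16 __attribute__((__mode__(__HI__)));
typedef unsigned long check_u32 __attribute__((__mode__(__SI__)));

/* Old-style compile-time asserts: the array size goes negative, and
   the build fails, if either type does not have the expected width. */
typedef char check_u16_is_2_bytes[(sizeof(check_u16) == 2) ? 1 : -1];
typedef char check_u32_is_4_bytes[(sizeof(check_u32) == 4) ? 1 : -1];

If the mode attributes really do override -mint8, both checks should pass either way.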


>> In particular, in C a
>> "unsigned char" is /not/ identical to an "8-bit unsigned int".  The
>> "unsigned char" has different aliasing properties than an "unsigned int"
>> of any size.
>
> In which respect?


A "char *" can alias any other type, but an "int *" cannot - it will only alias compatible types (signed and unsigned ints, const versions, etc.). I'm a little unclear about whether it is only plain char* and unsigned char* that have this aliasing property, or if it also applies to signed char* (I've seen conflicting references). But the use of mode-sized "ints" opens up two possible issues:

If "8-bit QI int" does not alias any type (other than compatible ones), then code that relies on aliasing will be incorrect.

On the other hand, it may be that the different mode-sized int types will all alias each other, and standard int (and compatible types). This could then lead to less optimal code.

It may also be that a 32-bit mode-size int would not alias a normal "long int", which could again lead to incorrect code.
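
To make the concern concrete, here is a minimal sketch (the typedef and function names are invented for illustration) of the kind of code where the answer matters:

typedef unsigned char u8_char;                                    /* character type: allowed to alias anything */
typedef unsigned int  u8_mode __attribute__((__mode__(__QI__)));  /* mode-sized type: aliasing rules unclear   */

unsigned int counter;

unsigned int poke_char(u8_char *p)
{
    counter = 1;
    *p = 0;           /* may alias 'counter', so ...                */
    return counter;   /* ... the compiler must re-read it here      */
}

unsigned int poke_mode(u8_mode *p)
{
    counter = 1;
    *p = 0;           /* if u8_mode is treated as a distinct non-character type,
                         the compiler may assume this cannot touch 'counter' ... */
    return counter;   /* ... and simply return the constant 1       */
}

If the caller really does pass a pointer into 'counter' (via a cast), the second version could be miscompiled; conversely, if the mode-sized types are made to alias everything, the forced re-read spreads to all of them and code gets slower.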


It is not very often that the aliasing rules actually influence the generated code. But it /can/ happen, and it is very important that they are correct - and that they are guaranteed to stay correct with future versions of the compiler. Someone familiar with the insides of gcc may be able to give a definitive answer here, but I would feel happier using standard types for the typedefs - then we are definitely on safe ground.

mvh.,

David





