Re: [avr-libc-dev] RFC: avr/bits.h
From: Bob Paddock
Subject: Re: [avr-libc-dev] RFC: avr/bits.h
Date: Tue, 01 Mar 2005 13:42:35 -0500
User-agent: Opera M2/7.54u1 (Win32, build 3918)
On Tue, 01 Mar 2005 11:24:18 -0700, E. Weddington <address@hidden> wrote:
> Is there some case where the construct ((uint32_t)1<<(bitpos))
> is really going to promote to 32 bit code?
Well it won't "promote" to 32 bits as it's there already. The other
macros that I defined will then trim that value down to the specified
length (8/16 bits).
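For illustration, a minimal sketch of the idea (the macro names here are
placeholders of my own, not necessarily the ones in the proposed bits.h):

  #include <stdint.h>

  /* Build the mask in 32-bit space so bit positions up to 31 are legal. */
  #define BIT_MASK32(bitpos)  ((uint32_t)1 << (bitpos))

  /* The narrower variants just trim that value back down to 16/8 bits. */
  #define BIT_MASK16(bitpos)  ((uint16_t)BIT_MASK32(bitpos))
  #define BIT_MASK8(bitpos)   ((uint8_t)BIT_MASK32(bitpos))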
> Now the real question is whether we can just get away with having
> one version of this and typecasts will fix everything up.
That does seem to be the case.
In my test case everything looks like it should; there are no
odd bits of 32-bit stuff lying around where they should not be.
GCC was smart enough to NOT turn everything into unsigned longs.
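For what it's worth, the test case was along these lines (the variable
names below are mine, not the actual test code): with a compile-time
constant bit position the whole cast-and-shift folds away, leaving plain
8-bit code.

  #include <stdint.h>

  volatile uint8_t port;          /* stand-in for an 8-bit I/O register */

  void set_bit3(void)
  {
      /* (uint8_t)((uint32_t)1 << 3) folds to the constant 0x08,
         so GCC emits an ordinary 8-bit OR, no 32-bit temporaries. */
      port |= (uint8_t)((uint32_t)1 << 3);
  }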
To rephrase my question:
Is there any place where using this "((uint32_t)1<<(bitpos))"
is going to generate real, Flash-occupying 32-bit code
when it is not needed?
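To make the question concrete, the kind of spot where it might matter
(a hypothetical example, not from my actual test case) is when bitpos is
only known at run time; C says the shift happens at uint32_t width before
the cast narrows it, so the compiler would have to prove the upper bytes
are dead to avoid a 32-bit shift:

  #include <stdint.h>

  volatile uint8_t port;

  void set_bit(uint8_t bitpos)
  {
      /* The shift is performed in 32-bit arithmetic before the cast,
         so a real 32-bit shift loop could end up in Flash here unless
         the compiler can show the high bytes are never used. */
      port |= (uint8_t)((uint32_t)1 << bitpos);
  }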