lwip-devel

Re: [lwip-devel] Noise from sockets.c


From: Gisle Vanem
Subject: Re: [lwip-devel] Noise from sockets.c
Date: Mon, 08 Aug 2011 16:25:59 +0200

"Kieran Mansley" <address@hidden> wrote:

On Sun, 2011-08-07 at 02:03 +0200, Gisle Vanem wrote:
A lot of noise coming from my MingW-gcc 4.5.0 when compiling sockets.c:

sockets.c: In function 'lwip_accept':
sockets.c:462:5: warning: format '%u' expects type 'unsigned int', but argument 
2 has type
'long unsigned int'

That is an odd error, because the value being printed is being
explicitly cast to the right type.  E.g. ip4_addr1_16(ipaddr) is defined
as ((u16_t)ip4_addr1(ipaddr)).  What is your definition of U16_F?  Your
port of lwIP should make sure that the definition of U16_F matches the
type width on your platform for u16_t.

My 'U16_F' is "hu". But the problem is not with IPv4 addresses. AFAICS the noise comes from printing an IPv6 address with 'X16_F'; mine is "hx". I build with '-DLWIP_IPV6=1' in my makefile. So stuff like 'IP6_ADDR_BLOCK1(ipaddr)' is supposed to return a 16-bit value from htonl(). Right? I.e.:
 #define IP6_ADDR_BLOCK1(ip6addr) ((htonl((ip6addr)->addr[0]) >> 16) & 0xffff)

But the shifting and and-ing into a 16-bit range does nothing to silence gcc: the expression still has the 32-bit type of htonl()'s return value, so "%hx" does not match it. gcc is pretty stupid in this regard. Hence I suggested the explicit cast to a 16-bit value. That shuts gcc up here. How about it?

--gv


