octave-maintainers

Re: signbit and logical tests


From: Michael D. Godfrey
Subject: Re: signbit and logical tests
Date: Sat, 16 Feb 2013 00:42:22 -0500
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:17.0) Gecko/20130110 Thunderbird/17.0.2

On 02/10/2013 02:16 PM, Daniel J Sebald wrote:
On 02/10/2013 12:54 PM, Daniel J Sebald wrote:
On 02/10/2013 10:05 AM, Mike Miller wrote:
On Sun, Feb 10, 2013 at 12:48 AM, Daniel J Sebald
<address@hidden> wrote:
On 02/09/2013 09:02 PM, Michael D. Godfrey wrote:
This does not quite appear to be the case, because this problem started
with bug #38291, which showed that on at least one 32-bit system (Ubuntu)
the returned value for true is 512. That is why my test for == 1 failed.

Just curious how that is coming about. Any guess? I would think that C
defines a logical true as 1.

The return value for signbit is defined as zero for false, non-zero
for true. You can easily verify this with a C program:

address@hidden:~/src$ cat signbit.c
#include <math.h>
#include <stdio.h>
int main()
{
  printf ("signbit(-12.0) = %d\n", signbit(-12.0));
}
address@hidden:~/src$ gcc -m64 -o signbit signbit.c -lm; ./signbit
signbit(-12.0) = 128
address@hidden:~/src$ gcc -m32 -o signbit signbit.c -lm; ./signbit
signbit(-12.0) = -2147483648

As usual, this is a bit more complex/confusing than first meets the eye.

I see now that it looks like C (not C++, but C) defined "signbit()" so that
the compiler can report the sign bit with a minimal number of assembly
instructions. That is, whatever combination of shifts, register
manipulations, etc. is cheapest is adequate. There is no need to force the
signbit() result to be 1, because more than likely the programmer writing
C code will do something like

if (signbit(x)) {
}

So, if the compiler does some extra step to make signbit() 0 or 1, it's
a bit like doing the same conditional test twice.
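
To make that concrete, here is a minimal C sketch of the two tests (the
variable x is just an illustrative value): on an implementation where
signbit() returns 128 or -2147483648 for a negative argument, the first
comparison is false even though the sign bit is set, while the second is
always safe, since the standard only promises zero vs. nonzero.

#include <math.h>
#include <stdio.h>

int main (void)
{
  double x = -12.0;   /* illustrative test value */

  /* Fragile: true only if the implementation happens to return 1.  */
  printf ("signbit(x) == 1  -> %d\n", signbit (x) == 1);

  /* Portable: the standard only guarantees zero vs. nonzero.  */
  printf ("signbit(x) != 0  -> %d\n", signbit (x) != 0);

  return 0;
}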

Now, here is the monkey wrench. There is a standard C++ (not C, but C++)
library function std::signbit which DOES produce a logical 0 or 1. Observe:

address@hidden cat signbit.c
#include <math.h>
#include <stdio.h>
int main()
{
  printf ("signbit(-12.0) = %d\n", signbit(-12.0));
}
[1]+ Done gvim signbit.c
address@hidden gcc -m64 -o signbit signbit.c
address@hidden ./signbit
signbit(-12.0) = 128


address@hidden cat signbit.c
/*#include <math.h>*/
#include <stdio.h>
int main()
{
  printf ("signbit(-12.0) = %d\n", signbit(-12.0));
}
address@hidden gcc -m64 -o signbit signbit.c
address@hidden ./signbit
signbit(-12.0) = 1

Actually, I'm not sure what the second result is returning, because the C++
signbit routine is declared in <cmath>, which isn't being included here.
Perhaps, without a declaration, the compiler treats signbit() as some kind
of implicitly declared function and then resolves the call against whatever
"signbit" symbol it can find at link time?

Dan
I took a look at mappers.cc, and it appears that if people can accept
signbit() returning a logical (0,1), then the attached patch will do it.
Could it be this simple?  Works for me.
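
For readers without the attachment, here is a small, hypothetical C sketch
of the general idea only (it is not the attached mappers.cc diff): wrap the
zero/nonzero result so that callers always see a strict logical 0 or 1.

#include <math.h>
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical helper, not the attached patch: convert signbit()'s
   platform-dependent zero/nonzero result into a strict 0/1.  */
static bool signbit_logical (double x)
{
  return signbit (x) != 0;
}

int main (void)
{
  printf ("%d %d\n", signbit_logical (-12.0), signbit_logical (12.0));
  return 0;
}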

Michael

Attachment: signbit_logical.diff
Description: Text Data

