Re: "cannot coerce inexact literal to fixnum"
From: Al
Subject: Re: "cannot coerce inexact literal to fixnum"
Date: Sat, 10 Feb 2024 14:32:16 +0200
User-agent: Betterbird (Linux)
On 2024-02-10 13:00, Peter Bex wrote:
> These so-called "big-fixnums" are compiled into a string literal which gets
> decoded on-the-fly at runtime into either a fixnum (on 64-bit) or a bignum
> (on 32-bit).
That would be fine, but where does that happen? csc actually barfs on my
Scheme code (as per the subject line) instead of encoding the literal into a
string and emitting C code to decode it at runtime, as you describe. It won't
even let me use string->number by hand. The only thing that worked was:
(cond-expand
  (csi
   (define INT32_MAX #x7fffffff)
   (define INT32_MIN #x-80000000)
   (define UINT32_MAX #xffffffff))
  (else
   ;; chicken csc only does 31-bit literals in fixnum mode
   (import (chicken foreign))   ; needed for foreign-value
   (define INT32_MIN (foreign-value "((int32_t) 0x80000000)" integer32))
   (define INT32_MAX (foreign-value "((int32_t) 0x7fffffff)" integer32))
   (define UINT32_MAX (foreign-value "((uint32_t) 0xffffffff)"
                                     unsigned-integer32))))
... and I'm not sure what the implications are of using a "foreign value"
further down in my program. If I assign one to another variable, does that
variable also become a "foreign value"? How about if I do
(bitwise-and UINT32_MAX int) to truncate a signed number to unsigned32
(which is what I'm actually using them for)?
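
For concreteness, the kind of helper I'd like to be able to write is
something like this (just a sketch; it assumes the UINT32_MAX from the
cond-expand above and CHICKEN 5's (chicken bitwise) module, and
->unsigned32 is a name I made up):

(import (chicken bitwise))

;; keep only the low 32 bits of a (possibly negative) integer,
;; e.g. (->unsigned32 -1) => 4294967295
(define (->unsigned32 n)
  (bitwise-and n UINT32_MAX))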
> There's (currently) no option to force fixnum mode in a way that ignores
> the existence of 32-bit platforms. Theoretically, it should be possible to
> compile your code assuming fixnums (so it emits C integer literals) and
> make it barf at compilation time if one tried to build for a 32-bit
> platform using a #ifdef or something. We just don't have the required
> code to do this, and I'm not sure this is something we'd all want.
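
(I suppose that kind of build-time barf can already be hand-rolled today
with foreign-declare and the C_SIXTY_FOUR macro from chicken.h; an untested
sketch, and I'm only assuming that macro is the right thing to test:)

(import (chicken foreign))

;; make the generated C refuse to compile on a 32-bit build
(foreign-declare "
#ifndef C_SIXTY_FOUR
#error this unit assumes 64-bit fixnums
#endif
")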
Well, if csc emitted string->number code in fixnum mode when necessary,
that would at least work. Although if I'm using fixnum mode, I'm
probably looking for performance, and I'm not sure the subsequent C
compiler is smart enough to optimize the "atoi" or whatever away into a
constant. Maybe it is nowadays.
Otherwise, how do I write Scheme code to truncate a signed number to
unsigned32? Resort to foreign values as I did above (or write foreign
functions)?
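
If it has to be a foreign function, I imagine something like this would do
(again just a sketch; truncate-to-u32 is a name I made up):

(import (chicken foreign))

;; do the truncation in C, so no 32-bit literal appears in the Scheme source
(define truncate-to-u32
  (foreign-lambda* unsigned-integer32 ((integer64 n))
    "C_return((uint32_t) n);"))

;; (truncate-to-u32 -1)  => 4294967295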
-- Al