emacs-devel

Re: JSON/YAML/TOML/etc. parsing performance


From: Paul Eggert
Subject: Re: JSON/YAML/TOML/etc. parsing performance
Date: Fri, 6 Oct 2017 12:36:17 -0700
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Thunderbird/52.3.0

On 10/06/2017 12:40 AM, Eli Zaretskii wrote:
> Those valid-but-enormous values are either invalid (if they are larger
> than PTRDIFF_MAX), or easily uncovered by looking at higher call-stack
> frames in the debugger
I'm not quite following, since "valid-but-enormous values" cannot be invalid. However, I don't normally use a debugger to find or debug integer-overflow problems. I more often use static analysis, for which signed integers work better, since static analysis can more easily detect signed operations that look dubious. When I do run-time checking, I normally use -fsanitize=undefined or something like that, instead of GDB; and here again, signed integers work better than unsigned integers do.
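
To illustrate with a minimal standalone sketch (not code from the patches; the file and variable names are made up): under -fsanitize=undefined the signed addition below is reported at run time, because signed overflow is undefined behavior, whereas the equivalent unsigned addition wraps silently and produces no diagnostic. The volatile qualifiers just keep the compiler from folding the arithmetic at compile time.

  #include <stdint.h>   /* PTRDIFF_MAX, SIZE_MAX */
  #include <stddef.h>   /* ptrdiff_t, size_t */
  #include <stdio.h>

  /* Compile with: cc -fsanitize=undefined overflow-demo.c && ./a.out  */
  int
  main (void)
  {
    volatile ptrdiff_t d = PTRDIFF_MAX;
    volatile size_t s = SIZE_MAX;

    printf ("%td\n", d + 1);   /* reported: signed integer overflow */
    printf ("%zu\n", s + 1);   /* wraps to 0; no report */
    return 0;
  }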

> I don't envision many primitives to need this kind of change
At present zero primitives need this kind of change for JSON, since the JSON code doesn't need to do any overflow checking for sizes under the currently-proposed patches. If we run across the problem in the future for other libraries, we can revisit the issue then.

> I'm not sure that experience is 100% applicable to Emacs, because
> Emacs has special needs due to the fact that our integers are narrower
> than the corresponding C integral types.
That problem is separate from the ptrdiff_t vs size_t problem, which is the issue at hand here, and which corresponds directly to the experience I've had with ptrdiff_t and size_t in other GNU programs. Preferring ptrdiff_t to size_t (or vice versa) does not affect whether code needs to check for fixnum overflow.
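
To make that concrete (an illustrative sketch, not code from the proposed patches; the helper name is hypothetical, while MOST_POSITIVE_FIXNUM comes from lisp.h): the fixnum range check is the same explicit comparison whether a length is carried in ptrdiff_t or in size_t; the signed/unsigned choice only affects how overflow while computing that length is detected.

  #include <config.h>
  #include <stdbool.h>
  #include "lisp.h"              /* MOST_POSITIVE_FIXNUM */

  /* Hypothetical helper: does a parsed object's length fit in a Lisp
     fixnum?  The comparison below is needed regardless of whether LEN
     arrives as ptrdiff_t or size_t; preferring a signed type changes
     only how overflow in computing LEN is caught (static analysis,
     -fsanitize=undefined), not whether this range check exists.  */
  static bool
  len_fits_fixnum (ptrdiff_t len)
  {
    return 0 <= len && len <= MOST_POSITIVE_FIXNUM;
  }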


