freetype-devel

From: Hin-Tak Leung
Subject: Re: [ft-devel] maxp.maxStackElements (Re: Freetype-devel Digest, Vol 141, Issue 4)
Date: Sat, 8 Oct 2016 19:38:50 +0000 (UTC)

Hiya -

My impression of this part of FreeType's code is that FreeType treats the
numbers in maxp as advisory. It reserves an extra amount (32 elements, if I
remember correctly) on top of what maxp says; then at run time, when the value
is exceeded, it either silently ignores the discrepancy and revises the
internal value or, if ->pedantic is on, throws an error and returns. So there
are two values in FreeType: a soft limit (which can be exceeded unless in
pedantic mode) and a hard limit (which can't).
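To make that concrete, here is a minimal sketch of the two-tier, grow-or-error
pattern; the names and layout are illustrative only, not FreeType's actual
internals:

```
#include <stdlib.h>

/* Illustrative only -- not FreeType's real code.  A buffer sized from
 * maxp plus some slack; exceeding it either grows the buffer (the soft
 * limit) or fails outright in pedantic mode.
 */
typedef struct  Stack_
{
  long*  elems;
  long   size;      /* the current soft limit              */
  int    pedantic;  /* if non-zero, exceeding it is fatal  */

} Stack;

static int
ensure_depth( Stack*  st,
              long    needed )
{
  long*  grown;

  if ( needed <= st->size )
    return 0;                 /* within the soft limit */

  if ( st->pedantic )
    return -1;                /* pedantic mode: throw an error */

  /* otherwise silently revise the internal value */
  grown = realloc( st->elems, (size_t)needed * sizeof ( long ) );
  if ( !grown )
    return -1;                /* the hard limit: out of memory */

  st->elems = grown;
  st->size  = needed;
  return 0;
}
```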

Maybe VTT is effectively doing what FreeType's extra reservation does.

Also - maybe the Microsoft rasterizer assumes some of the structures/values
from CVT or fpgm stay on the stack? This is really more a question for the
author of VTT about how VTT or the Microsoft rasterizer works (if you can get
an answer from Microsoft..).


 Message: 1
 Date: Sat, 8 Oct 2016 11:01:43 +0100
 From: Cosimo Lupo <address@hidden>
 To: address@hidden
 Subject: [ft-devel] how to compute maxp.maxStackElements?
 
 Hello list,
 
 I would like to use a debug hook with the TrueType bytecode interpreter
 to calculate the maxp table's `maxStackElements` field, which is defined
 as the "maximum stack depth", including "Font and CVT Programs, as well
 as the instructions for each glyph".
 
 I'm trying to emulate what MS Visual TrueType (VTT) does when one runs
 the command "Recalc maxp values" from its graphical interface.
 
 I've looked at how freetype2-demos' ttdebug and FontForge's TT debugger
 work, and below is what I've got so far. Please note that my C is quite
 rudimentary -- I plan to move this code to Python at some point -- but I
 think you get the point:
 
 
 ```
 #define CUR  (*exc)
 
 struct debugger_context {
     FT_Long maxStackSize;
 };
 
 static struct debugger_context *global_debugger_context;
 
 static FT_Error test_debug_hook( TT_ExecContext exc ) {
     struct debugger_context *dc = global_debugger_context;
     FT_Error error = 0;
 
     CUR.instruction_trap = 1;
 
     while ( CUR.IP < CUR.codeSize ) {
         error = TT_RunIns( exc );
         if ( error != 0 )
             break;
         if ( CUR.top > dc->maxStackSize )
             dc->maxStackSize = CUR.top;
     }
 
     return error;
 }
 ```
 
 This seems to work, in the sense that the computed value is big enough
 for both the FreeType and MS rasterizers to render the font. If I set
 it to anything less than that value, for example, the interpreter
 probably runs out of memory and rejects the instructions.
 
 I noticed, however, that the value as computed by VTT seems to always
 be a bit greater than the one I get from the code above, usually
 around 80-90 elements greater (it varies from font to font).
 I don't know why that is the case, and I wonder if someone else on
 this list could shed some light on this?
 
 Thank you for your support.
 All best,
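
P.S. For anyone wanting to try the quoted approach: such a hook is installed
with FT_Set_Debug_Hook() before loading glyphs, which is how freetype2-demos'
ttdebug drives the interpreter. A minimal sketch follows; the hook body here
is a stand-in with the generic signature (the quoted test_debug_hook() would
go in its place, built against FreeType's internal truetype headers as
ttdebug does), and the font path and pixel size are placeholders:

```
#include <ft2build.h>
#include FT_FREETYPE_H

/* Stand-in with the generic FT_DebugHook_Func signature; the `arg`
 * pointer passed in is the interpreter's TT_ExecContext.
 */
static FT_Error
stack_probe_hook( void*  arg )
{
  (void)arg;
  return 0;  /* FT_Err_Ok */
}

int
main( void )
{
  FT_Library  library;
  FT_Face     face;
  FT_UInt     gid;

  if ( FT_Init_FreeType( &library ) )
    return 1;

  /* Route runs of the TrueType interpreter through the hook. */
  FT_Set_Debug_Hook( library, FT_DEBUG_HOOK_TRUETYPE, stack_probe_hook );

  /* "test.ttf" and the pixel size are placeholders. */
  if ( FT_New_Face( library, "test.ttf", 0, &face ) )
    return 1;
  FT_Set_Pixel_Sizes( face, 0, 16 );

  /* Loading each glyph with hinting enabled sends fpgm, prep, and the
   * glyph programs through the interpreter, so the hook sees them all. */
  for ( gid = 0; gid < (FT_UInt)face->num_glyphs; gid++ )
    FT_Load_Glyph( face, gid, FT_LOAD_DEFAULT );

  FT_Done_Face( face );
  FT_Done_FreeType( library );
  return 0;
}
```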
 

