The graphic shows that your fonts are being rendered at their point size
instead of their pixel size, which is how GNUstep (incorrectly) used to
handle fonts.
pixelSize = round (pointSize * (dpi / 72))
As a user and/or as an application writer, I don't understand how this is
supposed to work.
We draw lines and rectangles and fill colors using pixels. Granted, in
PostScript, drawing a line of width 1 and length 3 at the standard scaling
is supposed to produce a line xxx millimeters wide and 3*xxx millimeters
long; but in practice we ignore that and always draw a line 1 pixel wide
and 3 pixels long. I seriously doubt you can draw lines in millimeters on
the screen anyway: the physical pixel size depends on the monitor, and I
don't think there is any way for software to know the exact pixel size.
And if you can't know the exact pixel size, there is no way to draw in
millimeters; you can only draw in pixels.