[EMAIL PROTECTED] writes:
> I have been doing a lot of reading on de-uglifying fonts on X. I
> have installed truetype fonts, worked on xfs, tried different
> configurations in my applications.

Personally, I don't believe in this TrueType thing; PostScript Type 1
fonts scale perfectly well, and a reasonable set of them is installed
on most systems.

> In the end it worked quite well, but nowhere near to what I was used
> to on that Other System. I was running 1280x1024 on my 19 inch
> monitor. Just for the heck of it I tried 1600x1200. Automagically
> all my fonts are now perfect. In fact browsing looks at least as
> good as IE/Mozilla on Win.
...
> If I decide that the higher refresh rate is more important than the
> 1600x1200 resolution and go back to 1280x1024, I would like my
> fonts to stay the same. Does anyone know a way to do this?

I do disgusting things with my .Xresources file.  There seems to be no
reasonable way to do this with GNOME, sadly, and it doesn't work for
remote GNOME apps (though it does work for other remote apps, e.g.
Emacs and xterm).  That having been said...

xrdb can run an arbitrary preprocessor over your .Xresources file.  It
normally runs the C preprocessor, cpp, but that's really not suited
for this, since X resource files aren't C source.  Instead, I use m4:
rename .Xresources to .Xresources.m4, and add 'xrdb -cpp m4
.Xresources.m4' to your .xsession file.
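For example (a sketch; the exact session file and paths vary by
setup), the relevant .xsession lines might look like:

```shell
# Load X resources through m4 instead of the default cpp.
xrdb -cpp m4 "$HOME/.Xresources.m4"
```

You can then check what actually got loaded with 'xrdb -query'.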

xrdb(1) documents a number of macros that are defined by xrdb before
including the X resources file.  These include, among other things,
the display resolution in pixels per meter.  We can use m4 macros to
simplify this, though.  Add to the top of your .Xresources.m4 file:

define(X_DPI,eval(X_RESOLUTION*100/3937))
define(Y_DPI,eval(Y_RESOLUTION*100/3937))
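The arithmetic here just converts pixels per meter to dots per inch
(1 inch = 0.0254 m, and 100/3937 is the same ratio in integer form).
As a sanity check in plain shell, with a made-up resolution of 3780
pixels/meter (what a 96 dpi display would report):

```shell
# Convert X's pixels-per-meter resolution to DPI, using the same
# integer arithmetic as the m4 macros above.
res=3780                    # example value only: 96 dpi, in pixels/m
dpi=$(( res * 100 / 3937 ))
echo "$dpi"
# -> 96
```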

Now we just need a standard way of getting a font name for a
particular foundry, face, weight/slant, and size.  You want to use
full X font names here (since you don't get a scalable font unless all
of the pixel size, point size, X resolution, Y resolution, and
character width are defined, possibly to zero!).  My world only has
normal and bold fonts, so I define:

define(FONT,`$1-0-$2-X_DPI-Y_DPI-*-0-iso8859-1')
define(NORMALFONT,`FONT(-$1-$2-medium-r-normal-,$3)')
define(BOLDFONT,`FONT(-$1-$2-bold-r-normal-,$3)')

Then, whenever I need to define a font, I use one of these macros (the
size is in tenths of a point, so 110 means 11pt):

XTerm*font: NORMALFONT(adobe,courier,110)
Emacs.default.attributeFont: NORMALFONT(adobe,courier,110)
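To see what these macros actually expand to, here is a small
plain-shell mock-up of NORMALFONT (the function name and the
hard-coded 96 dpi are my own illustration, not part of the
.Xresources.m4 setup):

```shell
# Mimic the m4 NORMALFONT macro: build a full XLFD name with pixel
# size 0 (scalable), point size in tenths of a point, and explicit
# X/Y resolution.  96 dpi is assumed for the example.
dpi=96
normalfont() {
  # $1 = foundry, $2 = family, $3 = size in tenths of a point
  printf -- '-%s-%s-medium-r-normal--0-%s-%s-%s-*-0-iso8859-1\n' \
    "$1" "$2" "$3" "$dpi" "$dpi"
}
normalfont adobe courier 110
# -> -adobe-courier-medium-r-normal--0-110-96-96-*-0-iso8859-1
```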


<rant><para>
I consider it a major bug in GNOME that there's no good way to do
this.  I think in my ideal world GNOME apps would magically figure out
the resolution of the display they're running on and scale their own
fonts correctly.  (From playing around briefly with KDE apps, there
are suggestions that it might do this...)  While X resources are a bit
clunky for ordinary users, it'd be nice if there were some way for
power users to get things to work "correctly", for values of "correct"
that are "consistent with the rest of their setup".
</para></rant>

-- 
David Maze         [EMAIL PROTECTED]      http://people.debian.org/~dmaze/
"Theoretical politics is interesting.  Politicking should be illegal."
        -- Abra Mitchell

