On Sun, Dec 05, 2004 at 10:12:08AM -0600, Billy Biggs wrote:
> Anders Karlsson ([EMAIL PROTECTED]):
>
> > The best scenario would be if X + Desktop Environment picked up on the
> > actual DPI of the screen(s) and adjusted for that automatically.
>
>   I disagree. If I hook my laptop up to a data projector, I do not want
> all of my fonts to shrink to nothingness.

I think that the data projector would have to lie about its DPI. It
doesn't even make sense for a data projector to tell anyone about its
DPI. It certainly doesn't know.
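To be concrete about what "the actual DPI of the screen(s)" is: X just
derives it from the pixel size and the millimetre size the server
reports for the screen, which normally come from the monitor's EDID.
Here is a rough sketch in plain Xlib (an illustration, not anything a
toolkit actually ships); a projector would have to invent the
millimetre numbers:

  /* Print the DPI implied by the sizes the X server reports for the
   * default screen. Build with something like: cc dpi.c -lX11 */
  #include <stdio.h>
  #include <X11/Xlib.h>

  int main(void)
  {
      Display *dpy = XOpenDisplay(NULL);
      if (!dpy)
          return 1;
      int scr = DefaultScreen(dpy);
      int wmm = DisplayWidthMM(dpy, scr), hmm = DisplayHeightMM(dpy, scr);
      if (wmm <= 0 || hmm <= 0) {
          puts("server reports no physical size");
          return 1;
      }
      double xdpi = DisplayWidth(dpy, scr)  * 25.4 / wmm;
      double ydpi = DisplayHeight(dpy, scr) * 25.4 / hmm;
      printf("reported DPI: %.0fx%.0f\n", xdpi, ydpi);
      XCloseDisplay(dpy);
      return 0;
  }

If the server has no real physical size to report, as with a projector,
those millimetre values are whatever somebody made up, and so is the
DPI computed from them.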
>  I believe that:
>  - "Screen DPI" as a concept is really just a "scale factor" for user
>    interfaces and should almost never represent the literal meaning

Screen DPI is actually meant to have exactly its literal meaning. If
you want a scale factor, it should be set somewhere else, since it is
a different concept. That way, if you want all your widgets super big
because you have bad vision, they will stay exactly the same size even
if you switch monitors.

>  - Many fonts design their hints for specific point sizes at specific
>    DPIs. Using "99x98" DPI or "102x102" DPI, even if accurate, is
>    not productive.

Hrm... I always thought that since fonts were hinted algorithmically,
the minor differences between those DPIs would be indistinguishable. I
didn't know that hinting engines were that fragile.

Simon