Simon Law ([EMAIL PROTECTED]):

> > I disagree. If I hook my laptop up to a data projector, I do not
> > want all of my fonts to shrink to nothingness.
>
> I think that the data projector would have to lie about its DPI. It
> doesn't even make sense for a data projector to tell anyone about its
> DPI. It certainly doesn't know.
Agreed. :)

> > I believe that:
> >
> >   - "Screen DPI" as a concept is really just a "scale factor" for user
> >     interfaces and should almost never represent the literal meaning
>
> Screen DPI is actually meant to mean exactly that. If you want a
> scale factor, this should be set somewhere else, since this is a
> different concept. That way, if you want all your widgets super big
> because you have bad vision, even if you switch monitors, they will
> stay exactly the same size.

This is exactly what I am proposing, basically. Screen DPI today is
meaningless on Windows because they use it as a scale factor,
meaningless on MacOS because it is always 72, and meaningless on Linux
because it is set basically randomly unless someone manually figures it
out.

If you want a DPI setting that is correct, but must always be set
manually, fine, use DPI. But please set it to something sane as a
default!

> >   - Many fonts design their hints for specific point sizes at specific
> >     DPIs. Using "99x98" DPI or "102x102" DPI, even if accurate, is
> >     not productive.
>
> Hrm... I always thought that since fonts were hinted algorithmically,
> the minor differences between those DPIs would be indistinguishable. I
> didn't know that hinting engines are that fragile.
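To put rough numbers on that second point: X derives its DPI from the
monitor's reported physical size, and the rasterizer converts point
sizes to pixels at 72 points per inch. A quick Python sketch of the
arithmetic (the function names and the example panel are mine, not
from any real toolkit):

    def measured_dpi(pixels, size_mm):
        # DPI as derived from the monitor's reported physical size.
        return pixels / (size_mm / 25.4)

    def pixels_per_em(point_size, dpi):
        # Pixel size the rasterizer uses: 72 points per inch.
        return point_size * dpi / 72.0

    # A hypothetical 1280x1024 panel reporting 330x264 mm measures out
    # to about 98.5 DPI, so a 10 pt font lands at ~13.7 px instead of
    # the ~13.3 px it gets at an assumed 96 DPI: close, but not the
    # size the hand-tuned hints were built for.
    print(measured_dpi(1280, 330.0))    # ~98.5
    print(pixels_per_em(10.0, 96.0))    # ~13.3
    print(pixels_per_em(10.0, 98.5))    # ~13.7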