Hi Mike, I think I understand your point of view. Please correct me where I am wrong.
- You feel that font sizes should be based on how large the fonts will
  physically be -- the DPI used for UI rendering should be the "real" DPI.

- You think that having two DPI parameters is silly, and that everything
  should track the DPI reported by the X server.

- You believe that fonts should be improved to look good at _any_ size
  wherever possible, or at least snap to good-looking sizes.

That's fair enough. I think the font design problem is somewhat
intractable, so you'll never get great-looking text at small pixel sizes,
but we can happily disagree on this point. I don't want to stop you from
configuring your system this way.

My priority is solving the practical problem we have today: many Linux
users are given, by default, systems with seemingly random DPI values,
and then have to go configure all of their fonts. Can we agree that this
is a problem worth solving? Standardizing the default DPI value at the
Xft level rather than the X server level seems to have better consensus,
so I think it is a good start.

If you want to promote your method, I think there are two changes to
make: first, change GNOME to track Xft.dpi when it is set (your first
email suggested this); second, add a parameter to the X server that seeds
Xft.dpi from the X server's DPI. I do not think my proposal makes things
any worse for your setup.

Mike Hommey ([EMAIL PROTECTED]):

> > 1. Other operating systems do not use the screen's DPI when
> > rendering fonts. On Windows, there is a different function to
> > determine the real DPI of the display, separate from the DPI
> > used in text rendering. This seems to work well in practice.
>
> It's not because other systems do stupid things that we must do the
> same. With such way of thinking, we would end up with a huge amount of
> crap.

My point was simply that there is plenty of practical evidence that this
method works well, especially on the Mac, which is quite popular for
desktop publishing.

> > 2. Font hints are designed for specific font sizes at certain
> > common DPIs. There is value in using a small set of "standard"
> > DPI values for UI rendering. (see
> > http://scanline.ca/dpi/fonts.html)
>
> Then fix font hinting. While differences of 2 dpi seem to make an ugly
> difference, i'm pretty sure a difference of 10 does not.

Agreed, but that does not bear on whether two DPI values are worse than
one. I am not advocating that we make DPI non-configurable.

> > 3. DPI becomes more complicated given different display devices
> > such as data projectors.
>
> I don't see why. The only problem that could happen is that when
> plugging in the new display, X doesn't know instantly that the dpi
> changed.

The problem is that these display devices reduce the value of using the
"real" DPI for UI rendering.

-Billy
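
P.S. Purely to illustrate the two values we keep talking about (not part
of the proposal), here is a minimal, untested C sketch that prints both
the DPI the X server derives from the screen's reported physical size and
the Xft.dpi resource, if one has been set. It only uses stock Xlib calls
(XOpenDisplay, DisplayWidth/DisplayWidthMM, XGetDefault); the file name
and output format are of course arbitrary.

/* dpicheck.c -- compare the X server's physical DPI with Xft.dpi.
 * Build: cc -o dpicheck dpicheck.c -lX11
 */
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int screen   = DefaultScreen(dpy);
    int width_px = DisplayWidth(dpy, screen);
    int width_mm = DisplayWidthMM(dpy, screen);

    /* DPI as the X server reports it, from pixel and millimetre widths. */
    if (width_mm > 0)
        printf("X server DPI : %.1f\n", width_px * 25.4 / width_mm);
    else
        printf("X server DPI : unknown (no physical size reported)\n");

    /* DPI from the Xft.dpi resource, if the user or desktop set one. */
    const char *xft_dpi = XGetDefault(dpy, "Xft", "dpi");
    printf("Xft.dpi      : %s\n", xft_dpi ? xft_dpi : "(not set)");

    XCloseDisplay(dpy);
    return 0;
}

Running it should make it obvious, on any given setup, whether text
rendering is being driven by the server-reported DPI, by Xft.dpi, or by
neither.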