On Wed, Jun 09, 2004 at 10:47:30PM -0700, Erik Steffl wrote:
> I wonder why edid info (that comes directly from monitor, as far as
> I can tell) would result in worse settings than what X does based on
> vert/horiz frequencies.
I don't know. It's an 18" flat panel monitor whose "natural resolution" is [EMAIL PROTECTED] -- if you run it at other sizes, the pixels look funny because they are fixed-size, and I don't know exactly what happens when you run at other frequencies, but they are similarly "synthetic." You observe this as "tearing" of full-motion video, and DirectX always runs at 60 anyway.

In Windows, the "Modes this monitor can support" are 60 and 70 Hz, and if you uncheck "Hide modes this monitor can't support" you can run at 85.

Maybe the EDID lists frequencies the monitor *can* support, and this is the first time that the highest possible frequency is not the best one -- in fact the *lowest* supported frequency is the best.
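For what it's worth, part of that "modes this monitor can support" list comes straight out of EDID's "established timings" bitmap. Below is a rough Python sketch of how those bits decode, going by my reading of the VESA EDID 1.3 layout (bytes 35-37 of the 128-byte base block); the sample EDID bytes are made up for illustration, not from a real monitor:

```python
# Sketch: decoding the "established timings" bitmap from an EDID base
# block (EDID 1.3, byte offsets 35-37). Each set bit advertises one
# legacy mode the monitor claims to support. Offsets/bit meanings are
# per my reading of the VESA spec; the sample bytes below are invented.

ESTABLISHED_TIMINGS = {
    # (byte offset, bit) -> "WIDTHxHEIGHT@REFRESH"
    (35, 7): "720x400@70",   (35, 6): "720x400@88",
    (35, 5): "640x480@60",   (35, 4): "640x480@67",
    (35, 3): "640x480@72",   (35, 2): "640x480@75",
    (35, 1): "800x600@56",   (35, 0): "800x600@60",
    (36, 7): "800x600@72",   (36, 6): "800x600@75",
    (36, 5): "832x624@75",   (36, 4): "1024x768@87i",
    (36, 3): "1024x768@60",  (36, 2): "1024x768@70",
    (36, 1): "1024x768@75",  (36, 0): "1280x1024@75",
    (37, 7): "1152x870@75",  # remaining byte-37 bits are vendor-specific
}

def established_modes(edid: bytes) -> list[str]:
    """Return the legacy modes whose bits are set in an EDID block."""
    modes = []
    for (offset, bit), name in ESTABLISHED_TIMINGS.items():
        if edid[offset] & (1 << bit):
            modes.append(name)
    return modes

# Hypothetical 128-byte EDID advertising only 1024x768 at 60 and 70 Hz:
fake_edid = bytearray(128)
fake_edid[36] = 0b00001100          # bits 3 and 2 of byte 36
print(established_modes(bytes(fake_edid)))
# → ['1024x768@60', '1024x768@70']
```

Note this bitmap only covers the old legacy modes -- the panel's preferred/native mode (and frequency ranges) live in the detailed timing descriptors and range-limits blocks elsewhere in the EDID, which is presumably where X's 60-vs-85 decision actually comes from.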