On Thu, 2007-08-23 at 18:30 +0200, Michael Biebl wrote:
> Screen 0: minimum 320 x 200, current 1280 x 768, maximum 1600 x 1200
> VGA-0 disconnected (normal left inverted right)
> LVDS connected 1280x768+0+0 (normal left inverted right) 0mm x 0mm
>    1680x1050      60.7 +
>    1280x800       60.0
>    1280x768       60.0*
>    1024x768       60.0
>    800x600        60.3
>    640x480        59.9
> S-video disconnected (normal left inverted right)
>
> So 1280x768 is chosen by the xserver, although 1680x1050 is my default.

Others have explained this; I'm not sure why it doesn't choose 1280x800,
though.

> I can switch to my preferred resolution via xrandr -s 1680x1050, but
> that doesn't work e.g. for gdm.

Funny, that shouldn't work given the above, as 1680x1050 is larger than
the reported maximum screen size of 1600x1200...
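FWIW, with RandR 1.2 the per-output form is preferable to -s anyway;
untested on your setup, but using the LVDS output name from your xrandr
output above it would be:

  xrandr --output LVDS --mode 1680x1050

If the server refuses because the screen can't grow beyond the reported
1600x1200 maximum, you'd need a larger Virtual size in the Display
subsection of the Screen section in xorg.conf.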
> Apart from this, the new driver seems to work stably so far, and my
> glxgears rates (I know it's not a benchmark ^_^) are even 10% higher.

Maybe the older driver unnecessarily enabled the second CRTC, wasting
memory bandwidth.

> Will test the TV-Out and VGA output later. Do I have to use xrandr to
> enable them? The radeon manpage doesn't list any options anymore on
> how to enable them.

You can use xrandr at runtime, or see the xorg.conf manpage about the
Monitor section (I'm not sure this is already documented in the 1.3.0
manpage though).
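E.g. to bring up the VGA output at runtime, to the right of the panel
(output names as in your xrandr output above):

  xrandr --output VGA-0 --auto --right-of LVDS

And to make the panel's preferred mode stick for gdm as well, something
along these lines in xorg.conf should do (untested sketch; the "Panel"
and "radeon" identifiers are just examples):

  Section "Monitor"
          Identifier "Panel"
          Option     "PreferredMode" "1680x1050"
  EndSection

  Section "Device"
          Identifier "radeon"
          Driver     "radeon"
          Option     "Monitor-LVDS" "Panel"
  EndSection

As the server itself then picks that mode at startup, it applies to gdm
like any other X client.


-- 
Earthling Michel Dänzer           |          http://tungstengraphics.com
Libre software enthusiast         |          Debian, X and DRI developer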