I raised this question before without success, so let me rephrase it here.
Normally xrandr reports that VGA-1 is disconnected and DVI-I-1 is connected; currently, however, both are reported as connected. The problem is that the display comes up at a low resolution. I'm running Jessie, but a Wheezy installation on another drive in the same machine automatically gives me the 1920x1080 resolution with the same nouveau driver. I'm apparently using DVI-I-1, but the highest available resolution is not being selected.

In the output of xrandr I get, for DVI-I-1:

    1920x1080 59.83 +
    ...
    1024x768 60.00 *
    ...

and for VGA-1:

    1024x768 60.00 *
    ...

What is the mechanism/script that automatically selects my card's optimal resolution? And how can I disconnect VGA-1, so that its current mode does not conflict with the current mode on DVI-I-1?
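To be concrete about what I'm after: I assume something like the following should turn VGA-1 off and force the native mode on DVI-I-1 (the output names and the 1920x1080 mode are taken from the xrandr output above), but what I'd really like to understand is why this isn't happening automatically at startup:

    # turn the VGA output off so its current mode no longer interferes
    xrandr --output VGA-1 --off
    # explicitly request the display's native mode on the DVI output
    xrandr --output DVI-I-1 --mode 1920x1080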
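I suppose I could also pin this in an X configuration snippet, something like the sketch below (the file name /etc/X11/xorg.conf.d/10-monitor.conf is my guess, and I haven't verified that this works with nouveau on Jessie), telling X to ignore the VGA output and prefer 1920x1080 on the DVI output:

    Section "Device"
        Identifier "card0"
        Driver     "nouveau"
        # map each RandR output to a Monitor section below
        Option     "Monitor-DVI-I-1" "dvi"
        Option     "Monitor-VGA-1"   "vga"
    EndSection

    Section "Monitor"
        Identifier "dvi"
        Option     "PreferredMode" "1920x1080"
    EndSection

    Section "Monitor"
        Identifier "vga"
        # drop the VGA output entirely
        Option     "Ignore" "true"
    EndSection

But that feels like papering over the problem: Wheezy picks the right mode on the same hardware without any of this, so I'd rather find out what differs between the two installs.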