On Sat, 26 Aug 2023, Ahmad Nouralizadeh wrote:

> > However, I would have expected that VLC would produce a lot
> > GPU/iGPU accesses even without drawing anything, because it would
> > try to use GPU decoder.

For the discrete GPU, turning the screen off reduces bandwidth drastically in 
every benchmark I tried (from about 2 GB/s down to a few KB/s). The same seems 
to be true for the iGPU. Of course, some DRAM accesses
originating from the GPU/iGPU may remain, but the main traffic seems to 
fade. These observations are based on my own experiments, so I could be wrong.
(P.S.: VLC seems to be aware of the screen state. Its rendering thread 
stops when the screen is off, as mentioned 
here: https://stackoverflow.com/q/76891645/6661026.)

> > Displaying video is also often done using GL or Xvideo - plain X is
> > too slow for this.
I'm looking for a simpler solution; I'm not familiar with these Xorg-related 
concepts. It seems a bit strange that turning off the screen requires so much 
effort! If `xset dpms force off` didn't
reactivate the screen on user input, or `xrandr --output...` didn't cause a 
segfault, everything would be fine.

Here is a simplified explanation:

In order to display anything on the screen, the video card needs an array of data giving the color of each pixel. This is usually called a "framebuffer" because it buffers the data for one frame of video.
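To get a feel for the numbers, here is a minimal back-of-the-envelope sketch (1920x1080 at 60 Hz with 4 bytes per pixel is an assumption for illustration, not anyone's actual setup; real traffic is higher once compositing and copies are involved, which is consistent with the ~2 GB/s figure mentioned above):

```shell
# Size of one framebuffer frame, and the bandwidth of scanout alone.
WIDTH=1920; HEIGHT=1080; BPP=4; HZ=60
FRAME_BYTES=$((WIDTH * HEIGHT * BPP))   # bytes in one frame (~8.3 MB)
BANDWIDTH=$((FRAME_BYTES * HZ))         # bytes per second (~0.5 GB/s)
echo "one frame: $FRAME_BYTES bytes; scanout: $BANDWIDTH bytes/s"
```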

For every monitor you plug in there is a separate framebuffer, unless they display the same thing (mirroring).

To draw, the CPU either writes data directly into the framebuffer, asks the video card to pull data from RAM, or uses some more complicated combination of the two (this includes GL and Xvideo acceleration).

So you have a system CPU -> Video Card -> Monitor

When you request "dpms off", all this does is tell the monitor to turn off its backlight and save power. Everything that would normally be drawn is still drawn, as you can verify using x11vnc and vncviewer.
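The check described above can be sketched like this (a hedged sketch: it assumes a running X session with `xset` and `x11vnc` installed, and does nothing otherwise):

```shell
# Turn the backlight off via DPMS, then export the still-live framebuffer
# over VNC to confirm drawing continues while the physical screen is dark.
if [ -n "$DISPLAY" ] && command -v xset >/dev/null && command -v x11vnc >/dev/null; then
    xset dpms force off                   # monitor goes dark; framebuffer untouched
    x11vnc -display "$DISPLAY" -bg -nopw  # serve the framebuffer in the background
    MSG="connect with vncviewer to see the desktop still being drawn"
else
    MSG="no X session available; skipping"
fi
echo "$MSG"
```

In a headless environment the guard simply prints the skip message instead of touching the display.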

When you request "xrandr ... --off", you are requesting the equivalent of physically unplugging the monitor cable. The framebuffer associated with that monitor gets destroyed. That's likely why you saw that gnome-panel error - some library it relies on could not cope with the framebuffer it was supposed to draw into suddenly disappearing.

From the point of view of a benchmark, you need to be very careful not to
alter the task, as modern systems love to optimize.

For example, many applications will stop drawing when their window is fully obscured (I don't know about vlc specifically, but it likely does).

However, this behaviour will change depending on whether a compositor is enabled, and even on how many windows are open, as the compositor has limits.

best

Vladimir Dergachev



> edit to add: google suggests another candidate might be something
> called pin-instat

Pin works at the source-code level. It counts source-level accesses, which 
might not reach DRAM (e.g., those serviced by caches).

