On Tue, 2012-03-06 at 10:46 -0800, Alon Zakai wrote:
> On Tue, Mar 6, 2012 at 6:41 AM, Michel Dänzer <mic...@daenzer.net> wrote:
> > On Tue, 2012-03-06 at 05:11 -0800, Benoit Jacob wrote:
> >>
> >> Do you think that the translation of Gallium3D state to GL state could
> >> be efficient (given that this is the converse of the primary use case,
> >> which is IIUC to convert GL state to Gallium3D state)? Just checking
> >> if this has a reasonable chance of performing well.
> >
> > It does. VMware uses Gallium3D for the guest drivers to achieve OpenGL
> > hardware acceleration in virtual machines, and it performs well under
> > presumably worse circumstances (there's a virtual machine barrier
> > between the Gallium3D driver in the guest and the graphics stack in the
> > host).
> >
>
> Thanks for the information!
>
> If that approach is fast enough for VMWare to run games with,
> then it sounds pretty good. I assume btw that that work is not
> open source?
Yes, it is. See src/gallium/drivers/svga/, src/gallium/winsys/svga/ and
src/gallium/targets/dri-vmwgfx/.

> Would be nice if it were, it sounds like the closest thing to what we
> are doing here...

I'm afraid it may not be as close as you hope though, as the VMware
virtual GPU is probably quite different from the WebGL environment.


--
Earthling Michel Dänzer   |   http://www.amd.com
Libre software enthusiast |   Debian, X and DRI developer
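
For anyone following along, the translation being discussed would live in
the pipe_context hooks of a GL/WebGL-backed Gallium3D driver, which replay
bound state objects as GL calls. The following is only a rough sketch of
that idea, not code from the svga driver or from the Emscripten work:
struct pipe_context, struct pipe_blend_state and the PIPE_BLENDFACTOR_*
values come from Gallium's headers, while glpipe_bind_blend_state and
translate_blend_factor are hypothetical names, and the sketch assumes
create_blend_state() simply stored a copy of the pipe_blend_state it was
given.

/* Rough illustration of Gallium3D -> GL state translation.  Gallium types
 * and enums are from p_context.h / p_state.h / p_defines.h; everything
 * else is hypothetical. */
#include "pipe/p_context.h"
#include "pipe/p_state.h"
#include "pipe/p_defines.h"
#include <GLES2/gl2.h>

static GLenum
translate_blend_factor(unsigned factor)
{
   switch (factor) {
   case PIPE_BLENDFACTOR_ONE:           return GL_ONE;
   case PIPE_BLENDFACTOR_SRC_ALPHA:     return GL_SRC_ALPHA;
   case PIPE_BLENDFACTOR_INV_SRC_ALPHA: return GL_ONE_MINUS_SRC_ALPHA;
   default:                             return GL_ZERO; /* incomplete table */
   }
}

/* Hypothetical pipe_context::bind_blend_state hook for a GL/WebGL backend,
 * assuming create_blend_state() returned a plain copy of the state. */
static void
glpipe_bind_blend_state(struct pipe_context *pipe, void *state)
{
   const struct pipe_blend_state *blend = state;

   (void)pipe;

   if (!blend)
      return;

   if (blend->rt[0].blend_enable) {
      glEnable(GL_BLEND);
      glBlendFuncSeparate(translate_blend_factor(blend->rt[0].rgb_src_factor),
                          translate_blend_factor(blend->rt[0].rgb_dst_factor),
                          translate_blend_factor(blend->rt[0].alpha_src_factor),
                          translate_blend_factor(blend->rt[0].alpha_dst_factor));
   } else {
      glDisable(GL_BLEND);
   }
}

The per-bind cost of a hook like this is a handful of GL calls, which is
essentially the overhead the thread is asking about.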