On Wed, Mar 7, 2012 at 12:00 AM, Michel Dänzer <mic...@daenzer.net> wrote:
> On Die, 2012-03-06 at 10:46 -0800, Alon Zakai wrote:
>> On Tue, Mar 6, 2012 at 6:41 AM, Michel Dänzer <mic...@daenzer.net> wrote:
>> > On Die, 2012-03-06 at 05:11 -0800, Benoit Jacob wrote:
>> >>
>> >> Do you think that the translation of Gallium3D state to GL state could
>> >> be efficient (given that this is the converse of the primary use case,
>> >> which is IIUC to convert GL state to Gallium3D state)? Just checking
>> >> if this has a reasonable chance of performing well.
>> >
>> > It does. VMware uses Gallium3D for the guest drivers to achieve OpenGL
>> > hardware acceleration in virtual machines, and it performs well under
>> > presumably worse circumstances (there's a virtual machine barrier
>> > between the Gallium3D driver in the guest and the graphics stack in the
>> > host).
>> >
>>
>> Thanks for the information!
>>
>> If that approach is fast enough for VMware to run games with,
>> then it sounds pretty good. I assume btw that that work is not
>> open source?
>
> Yes, it is. See src/gallium/drivers/svga/, src/gallium/winsys/svga/ and
> src/gallium/targets/dri-vmwgfx/.
>
>> Would be nice if it were, it sounds like the closest thing to what we
>> are doing here...
>
> I'm afraid it may not be as close as you hope though, as the VMware
> virtual GPU is probably quite different from the WebGL environment.
>
Ok, thanks. I was hoping it would be similar, since it takes OpenGL and
eventually converts it back to OpenGL in the host environment (I am
assuming). But I guess the intermediate step might be very different...
Definitely worth us taking a look at the code there though, thanks for
the pointers.

Best,
  Alon Zakai
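
P.S. For the sake of discussion, here is a very rough sketch of the kind of
round trip I have in mind: GL blend state folded into a small Gallium3D-style
state object, then re-emitted as plain GL calls on the other side. The struct
and function names below are made up for illustration (the real Gallium
interface, e.g. pipe_blend_state, is much richer), so treat this as a sketch
of the idea rather than how Mesa actually does it.

    /* Illustrative only: a toy "CSO" struct, not the real pipe_blend_state. */
    #include <GL/gl.h>

    struct toy_blend_cso {
       GLboolean blend_enable;    /* is GL_BLEND on? */
       GLint     rgb_src_factor;  /* e.g. GL_SRC_ALPHA */
       GLint     rgb_dst_factor;  /* e.g. GL_ONE_MINUS_SRC_ALPHA */
    };

    /* "Guest" side: fold the currently bound GL blend state into the CSO. */
    void capture_blend_state(struct toy_blend_cso *cso)
    {
       cso->blend_enable = glIsEnabled(GL_BLEND);
       glGetIntegerv(GL_BLEND_SRC, &cso->rgb_src_factor);
       glGetIntegerv(GL_BLEND_DST, &cso->rgb_dst_factor);
    }

    /* "Host" side: turn the CSO back into plain GL calls. */
    void emit_blend_state(const struct toy_blend_cso *cso)
    {
       if (cso->blend_enable)
          glEnable(GL_BLEND);
       else
          glDisable(GL_BLEND);
       glBlendFunc((GLenum) cso->rgb_src_factor, (GLenum) cso->rgb_dst_factor);
    }

Obviously the real state trackers don't go through glGet round trips like
this, but hopefully it shows why I expect the translation back to GL state
not to be inherently expensive.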