Quoth sirjofri <sirjofri+ml-9f...@sirjofri.de>:
>
> 22.08.2021 18:41:06 o...@eigenstate.org:
> Basically do software rendering on the GPU?
Yes. Or software neural net evaluation on the GPU. Or software video
decoding on the GPU. Or software image transforms on the GPU. Or
software signal processing on the GPU.

If there's an interface to be selected, it needs to be tractable to
implement, *and* general purpose enough for everything that wants to
use it.

> Well, it's totally possible. Even Nanite (the new system in Unreal Engine
> 5) has its own rasterizer and I believe even its own base pass. Also
> Lumen does software rendering for global illumination.
>
> But for serious 3d AAA stuff we'd have to consider: Lumen is for next-gen
> GPUs and Nanite for newer GPUs. We'll never reach their quality in
> realtime if we don't use the GPU features (built-in rasterizer, ...) to
> have enough free power for crazy software calculation.

By the time any code is written, next-gen GPUs will be previous-gen
GPUs. General compute is what any hardware you buy a few years from
now will be doing -- and it's far more interesting in terms of the
capabilities it allows.

> I like that /dev/compute approach, but may I suggest putting it below
> another directory /dev/gpu/compute so we have the ability to add
> /dev/gpu/{vertex,geometry,fragment,tessellation,pixel} later?

I think supporting those is a cost we should not pay. It's
fundamentally solving a less general problem, and it adds a lot of
complexity for the potential of a small performance boost. We already
have zero people sinking time into the slim interface; sinking time
into a fatter interface seems like a bad idea.
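
To make the shape of that concrete, here's a rough sketch of how a
program might drive a slim compute device. None of this exists: the
paths under /dev/compute, the ctl messages, and the data layout are
all invented for illustration, not a proposal for the actual
interface.

	/*
	 * Rough sketch only: /dev/compute does not exist, and every
	 * path and ctl message below is invented to show the shape of
	 * a slim, general-purpose interface.
	 */
	#include <u.h>
	#include <libc.h>

	void
	main(int, char**)
	{
		int ctl, data;
		char msg[64];
		static float in[1024], out[1024];

		/* grab a compute context from whatever device backs it */
		ctl = open("/dev/compute/ctl", ORDWR);
		if(ctl < 0)
			sysfatal("open ctl: %r");

		/* upload a device-independent kernel blob (e.g. SPIR-V) */
		/* ... open /dev/compute/code, write the blob, close ... */

		/* stage the input data */
		data = open("/dev/compute/data", ORDWR);
		if(data < 0)
			sysfatal("open data: %r");
		if(write(data, in, sizeof in) != sizeof in)
			sysfatal("upload: %r");

		/* kick off the kernel; plain text keeps the ctl file simple */
		snprint(msg, sizeof msg, "run 1024");
		if(write(ctl, msg, strlen(msg)) < 0)
			sysfatal("run: %r");

		/*
		 * read results back, assuming the kernel left them after
		 * the inputs; a real interface needs a sync story here
		 */
		if(pread(data, out, sizeof out, sizeof in) != sizeof out)
			sysfatal("readback: %r");

		close(data);
		close(ctl);
		exits(nil);
	}

The point is that the whole surface is a handful of files and a way to
run code against some buffers; anything pipeline-shaped lives in the
kernel you upload, not in the driver.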