Hi Keiichi,
Thanks for the update.
> > Thanks for providing this detailed overview. But again, we have already
> > discussed this in a similar way and it does not answer the questions. Ok,
> > suppose we set bitrate to 0x as I assumed above. Then the decoder
> > code should ideally wait u
The key feature is a pipeline "X framebuffer -> FBC -> VRAM -> NVENC". This is done via hardware plus the nvidia driver. NVENC can get its data via CUDA, OpenGL or DirectX. CUDA is not an option for several reasons (it requires Quadro vGPU configurations only, which need more expensive licenses, and some hard
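The capture half of that pipeline can be sketched against NVIDIA's Linux NvFBC API. This is only a hedged outline: the structure and function names (NvFBCCreateInstance, nvFBCToGLGrabFrame, NVFBC_CAPTURE_TO_GL) come from NVIDIA's Capture SDK headers, exact struct fields may differ between SDK versions, error handling is elided, and it cannot run without the proprietary SDK and a supported GPU with an active X session:

```c
/* Sketch only: assumes NVIDIA's Capture SDK header NvFBC.h and a GPU
 * with an active X server; not buildable or runnable without them. */
#include "NvFBC.h"

int main(void)
{
    NVFBC_API_FUNCTION_LIST fbc = { .dwVersion = NVFBC_VERSION };
    if (NvFBCCreateInstance(&fbc) != NVFBC_SUCCESS)
        return 1;

    NVFBC_SESSION_HANDLE session;
    NVFBC_CREATE_HANDLE_PARAMS create = {
        .dwVersion = NVFBC_CREATE_HANDLE_PARAMS_VER,
    };
    fbc.nvFBCCreateHandle(&session, &create);

    /* Capture straight into an OpenGL texture in VRAM, so the frame can
     * be handed to NVENC without a round trip through system memory. */
    NVFBC_CREATE_CAPTURE_SESSION_PARAMS cap = {
        .dwVersion     = NVFBC_CREATE_CAPTURE_SESSION_PARAMS_VER,
        .eCaptureType  = NVFBC_CAPTURE_TO_GL,
        .eTrackingType = NVFBC_TRACKING_DEFAULT,
    };
    fbc.nvFBCCreateCaptureSession(session, &cap);

    NVFBC_TOGL_SETUP_PARAMS setup = {
        .dwVersion = NVFBC_TOGL_SETUP_PARAMS_VER,
    };
    fbc.nvFBCToGLSetUp(session, &setup);

    NVFBC_TOGL_GRAB_FRAME_PARAMS grab = {
        .dwVersion = NVFBC_TOGL_GRAB_FRAME_PARAMS_VER,
    };
    fbc.nvFBCToGLGrabFrame(session, &grab);
    /* The grabbed frame now lives in one of the textures created during
     * setup; NVENC's OpenGL input path can register and encode it. */
    return 0;
}
```

This is also why display=off breaks the plugin: with no active display there is nothing for FBC to attach to, so the capture session cannot be created.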
On 4/27/20 3:24 PM, ole-kru...@yandex.ru wrote:
Hi,
I can set display=off, of course. But in that case Nvidia FBC
(FrameBuffer Capturing) is unusable, and we are trying to make and use a
plugin that uses that feature. I use GRID vGPUs. xorg.conf is set to use
the nvidia card -- it's the same config that works with gst-plugin.
27.04.2020, 15:20, "Uri Lublin"
On 4/27/20 11:31 AM, Oleg Krutov wrote:
We are trying to make a plugin which uses nvidia FBC + NVENC instead of
gst-plugin. When using FBC, I must set "display" to "on", else FBC
reports itself as not supported. I can't do the trick with qxl+nvidia
with display off as with gst-plugin. Thus, two spice windows appear, one
with the main display cha
On 4/27/20 6:17 AM, Tomasz Chmielewski wrote:
Is it possible to configure a spice X session, so that the X session is
available remotely?
Tomasz
Hi Tomasz,
There are two Spice projects that do that: x11spice and Xspice.
Uri.
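For context, a hedged example of how Xspice is typically launched (the exact flags depend on the installed version, so check Xspice --help; the port number and display name here are arbitrary):

```shell
# Start an X server whose display is exported over Spice on port 5900.
# Flag names are from the Xspice wrapper script and may vary by version.
Xspice --port 5900 --disable-ticketing :1.0

# Then point any spice client at it, e.g.:
remote-viewer spice://localhost:5900
```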
Hi Keiichi,
One more addition to the comments below. Currently the spec does not
define units for bitrate (are units needed for anything else except
that?). Let's explicitly state that bitrate is provided in bits per
second, so any implementation can do a proper conversion if needed.
Best regards,
Signed-off-by: Vasily Averin
---
drivers/gpu/drm/qxl/qxl_image.c | 1 +
1 file changed, 1 insertion(+)
diff --git a/drivers/gpu/drm/qxl/qxl_image.c b/drivers/gpu/drm/qxl/qxl_image.c
index 43688ecdd8a0..7270da62fc29 100644
--- a/drivers/gpu/drm/qxl/qxl_image.c
+++ b/drivers/gpu/drm/qxl/qxl_image.