Gunnar,

> You’re greatly cutting down the rendering required per frame by putting
> the video on an overlay. Just putting it into a window which is composed
> in lipstick is an improvement. Right now we get the texture, then blend it
> into the window surface through the scene graph, then send the result to
> lipstick and render it again there. So the video buffer is drawn as a
> texture twice before it is presented to the screen.

Am I understanding correctly that this would mean the window would be
rendered solely on the lipstick side?

Best,
tk

On Fri, May 22, 2015 at 6:21 PM, Gunnar Sletta <gunnar.sle...@jolla.com>
wrote:

>
> > On 22 May 2015, at 14:42, Mohammed Hassan <mohammed.has...@jolla.com>
> wrote:
> >
> > On Fri, 22 May 2015 16:31:35 +0800
> > Halley <halley_z...@sina.com> wrote:
> >
> >> After some thought, I think the overlay can be added back in the
> >> following way:
> >> 1. wayland-android-client-protocol.h supports passing an
> >> ANativeWindowBuffer from the wayland client to the server.
> >> 2. An object (WindowSurface) with the same interface as ANativeWindow
> >> can be constructed on the host; this meets the requirement of the
> >> android codecs.
> >> 3. Then we can create a gst video sink element which accepts an
> >> ANativeWindow, so gst can feed the WindowSurface to the android codecs.
> >> Finally, weston receives the ANativeWindowBuffer from the media client
> >> and has the option to assign this ANativeWindowBuffer to an overlay
> >> plane.
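
To make step 2 a bit more concrete: the WindowSurface essentially has to
offer the codec the usual dequeue/queue buffer cycle and forward each
queued buffer to the compositor. A purely illustrative C++ sketch follows;
the interface is hypothetical and heavily simplified, not the real
ANativeWindow vtable:

    // Hypothetical sketch only -- not the real ANativeWindow interface.
    // It just illustrates the buffer cycle the codec needs from step 2.
    struct ANativeWindowBuffer;   // opaque gralloc-backed buffer from Android

    class WindowSurface {
    public:
        virtual ~WindowSurface() {}

        // The codec asks for a free buffer to decode the next frame into.
        virtual ANativeWindowBuffer *dequeueBuffer() = 0;

        // The codec hands back a filled buffer; the implementation forwards
        // it over the wayland connection so weston/lipstick can either
        // compose it or assign it to an overlay plane (the "finally" step).
        virtual void queueBuffer(ANativeWindowBuffer *buffer) = 0;

        // The codec gives up on a dequeued buffer without displaying it.
        virtual void cancelBuffer(ANativeWindowBuffer *buffer) = 0;
    };
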
> >
> > There is no need to add anything to wayland because all the needed
> > functionality is already there.
> >
> > Clients already communicate with the compositor using android native
> > buffers.
> >
> > The only thing we need is:
> > 1) a way to tell the sink to use the overlay
> > 2) a way to tell the compositor to use the overlay for a certain buffer
> > 3) a way to tell the compositor the "position" of rendering (x, y).
> > 4) expose that to Qt in a simple way.
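
(Just to underline the point that nothing needs to be added to wayland
itself: once a wl_buffer wrapping the android native buffer exists, the
per-frame client side is only the standard attach/damage/commit cycle.
How that wl_buffer gets created from an ANativeWindowBuffer is the
libhybris side and is left out of this sketch.)

    #include <stdint.h>
    #include <wayland-client.h>

    // Per-frame submission with plain wayland-client calls; `surface` and
    // `buffer` are assumed to exist already (the wl_buffer wrapping the
    // android native buffer comes from the libhybris side).
    void submit_frame(wl_surface *surface, wl_buffer *buffer,
                      int32_t width, int32_t height)
    {
        wl_surface_attach(surface, buffer, 0, 0);
        wl_surface_damage(surface, 0, 0, width, height);
        wl_surface_commit(surface);
    }
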
> >
> > #1 is easy to achieve with a simple property.
> > #2 is already implemented, but I am not entirely sure how complete it is.
>
> It is in part at least. The lipstick bits are in place:
>
>
> https://github.com/nemomobile/lipstick/blob/master/src/compositor/hwcrenderstage.cpp
>
> https://github.com/nemomobile/lipstick/blob/master/src/compositor/lipstickcompositorwindow.cpp#L434
>
> Implementation for the Jolla phone (which is a research project for now)
> is here:
> https://github.com/sletta/qt5-qpa-hwcomposer-plugin/tree/sbj-hwcinterface
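
Coming back to #1: on the pipeline side it really could be as small as one
property on the sink. A sketch, assuming GStreamer 1.x; the element name is
assumed to be gst-droid's droideglsink here, and the "use-overlay" property
is invented for illustration:

    #include <gst/gst.h>

    // Assumes gst_init() has already been called. "droideglsink" and
    // "use-overlay" are placeholders for whatever gst-droid ends up
    // exposing.
    GstElement *make_overlay_sink()
    {
        GstElement *sink = gst_element_factory_make("droideglsink", "videosink");
        if (sink)
            g_object_set(G_OBJECT(sink), "use-overlay", TRUE, NULL);
        return sink;
    }
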
>
> > #3 is something that might or might not be there. Or maybe the overlay
> > can only be used for full screen rendering?
>
> I’ve always thought that we should have some special QWindow type for
> media surfaces that allows posting the media stream buffers directly to
> the compositor without any extra composition.
>
> > #4 is something I have no idea how to implement.
>
> something with QWindow with backing in the platform plugin and in
> libhybris, I’d say :)
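
One low-tech way #4 could surface in application code, while the real
backing lives in the platform plugin and libhybris as suggested above,
would be tagging the QWindow with a window property the compositor
understands. The Qt calls below are real API, but the "MEDIA_OVERLAY"
property name and the overall contract are invented for illustration:

    #include <QGuiApplication>
    #include <QString>
    #include <QVariant>
    #include <QWindow>
    #include <qpa/qplatformnativeinterface.h>   // needs QT += gui-private

    // Sketch: mark a QWindow as a media/overlay surface. The property name
    // is made up; the platform plugin and lipstick would have to agree on
    // the real contract.
    void markAsMediaSurface(QWindow *window)
    {
        window->create();   // make sure the QPlatformWindow exists
        QPlatformNativeInterface *native =
            QGuiApplication::platformNativeInterface();
        if (native)
            native->setWindowProperty(window->handle(),
                                      QStringLiteral("MEDIA_OVERLAY"), true);
    }
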
>
> >
> > The question still remains: Does using the overlay bring any
> > improvement? Is it really needed?
>
> You’re greatly cutting down the rendering required per frame by putting
> the video on an overlay. Just putting it into a window which is composed
> in lipstick is an improvement. Right now we get the texture, then blend it
> into the window surface through the scene graph, then send the result to
> lipstick and render it again there. So the video buffer is drawn as a
> texture twice before it is presented to the screen.
>
> By having a media-surface-style QWindow, you would skip the client-side
> composition step done by the scene graph, basically halving the work the
> GPU needs to do.
>
> If the buffer coming into lipstick were hardware compositor compatible and
> picked up by the hardware compositor, we could ditch that step as well,
> letting the GPU be idle while playing video (assuming no overlaid controls
> or subtitles).
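
Rough back-of-the-envelope numbers, assuming a 1280x720 screen fully
covered by 30 fps video (real costs also depend on overdraw and on the
video size, so this only shows the shape of the saving):

    one full-screen pass: 1280 * 720 = 921,600 px  ->  ~27.6 Mpx/s at 30 fps
    today (scene graph + lipstick):   2 passes     ->  ~55.3 Mpx/s of blending
    media-surface QWindow:            1 pass       ->  ~27.6 Mpx/s
    hwcomposer overlay:               0 passes     ->  GPU idle for video frames
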
>
> You probably wouldn’t notice too much in terms of performance on the
> phone, as we can already do the two GL passes at the frame rate of the
> video, but it should benefit battery life during playback quite a bit.
>
> cheers,
> Gunnar
>
> >
> > Cheers,
> >
> >
> >>
> >> ----- Original Message -----
> >> From: Mohammed Hassan <mohammed.has...@jolla.com>
> >> To: Sailfish OS Developers <devel@lists.sailfishos.org>
> >> Subject: Re: [SailfishDevel] Re: could we
> >> support_hw_overlay_from_gst-droid?
> >> Date: 2015-05-19 21:27
> >>
> >>
> >> On Tue, 5 May 2015 23:09:40 +0300
> >> Tone Kastlunger <users.giulie...@gmail.com> wrote:
> >>> Hi,
> >>> apologies for dropping the mailing list - it appears Gmail does not
> >>> reply correctly to the mailing list but only to the sender.
> >>> Qt 5.1 was my typo; it should have been 5.2.
> >>>
> >>> Point being, does lipstick currently handle wayland subsurfaces?
> >> Unfortunately not, but it can be done if there is a need ;-)
> >> The point is: does it really improve the rendering/playback
> >> performance?
> >>
> >> Cheers,
> >
>
>
_______________________________________________
SailfishOS.org Devel mailing list
To unsubscribe, please send a mail to devel-unsubscr...@lists.sailfishos.org
