> On 26 Oct 2015, at 10:52, Kimmo Lindholm <kimmo.lindh...@eke.fi> wrote:
> 
> I think I found a way to do this.
>  
> I draw the painting tool to a separate preview canvas and use this as an 
> alpha-mask input for the shader. Then I combine this with the processed 
> drawing-canvas contents and view the shader output in realtime.
> When the user accepts the result, the whole thing is grabbed as an image and 
> drawn on top of the main drawing canvas.
>  
> PoC with a simple shader that swaps the RGB --> BRG channels on the 
> user-selected area: https://twitter.com/LiKimmo/status/658559356221440000/photo/1
>  
> Shader code:
>     gl_FragColor = texture2D(source, qt_TexCoord0).brga * texture2D(mask, qt_TexCoord0).a;
> where source is the whole drawing canvas and mask is the preview canvas where 
> the actual drawing happens (hidden during this process).
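> In QML terms, roughly like this (a minimal sketch; the ids drawingCanvas and 
> previewCanvas and the geometry are illustrative, not the exact code from the app):
> 
>     ShaderEffectSource { id: canvasSource; sourceItem: drawingCanvas }
>     ShaderEffectSource { id: maskSource; sourceItem: previewCanvas; hideSource: true }
> 
>     ShaderEffect {
>         anchors.fill: drawingCanvas
>         // 'source' and 'mask' are exposed to the shader as sampler2D uniforms
>         property variant source: canvasSource
>         property variant mask: maskSource
>         fragmentShader: "
>             varying highp vec2 qt_TexCoord0;
>             uniform sampler2D source;
>             uniform sampler2D mask;
>             void main() {
>                 gl_FragColor = texture2D(source, qt_TexCoord0).brga
>                                * texture2D(mask, qt_TexCoord0).a;
>             }"
>     }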

Nice one :)

>  
> grabToImage is still slow, but that delay is suffered only when applying the 
> results. Maybe this is a place for a Qt feature request, Item::toCanvasImageData.

Yeah, grabToImage() is pretty costly due to the pixel readback. What you want is 
to draw your ShaderEffect into a texture and then draw that texture directly 
onto the canvas without going through the pixel readback / CPU memory code path 
that grabToImage implies. Not sure how exactly we would swing that with the 
current Canvas design (at least if we take the threaded/non-threaded and Image 
code paths into account), but feel free to open a suggestion for that on 
bugreports.qt.io. Something might happen :)
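
For what it's worth, the texture half of that already exists: a ShaderEffectSource 
renders its sourceItem, including a ShaderEffect, into a GPU texture and displays 
it without any readback, so the live preview never has to touch CPU memory. It is 
only the final step of baking that texture into the Canvas that currently forces 
grabToImage. A rough sketch of the readback-free preview side, with illustrative ids:

    // Renders myShaderEffect into an offscreen GPU texture and displays that
    // texture itself; no pixels are copied back to CPU memory while previewing.
    ShaderEffectSource {
        id: previewTexture
        anchors.fill: drawingCanvas      // hypothetical id of the main canvas
        sourceItem: myShaderEffect       // hypothetical id of the effect item
        hideSource: true                 // show only this textured copy
        live: true                       // follow changes to the effect in realtime
    }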

>  
> -kimmo
>  
> From: devel-boun...@lists.sailfishos.org 
> [mailto:devel-boun...@lists.sailfishos.org] On Behalf Of Kimmo Lindholm
> Sent: 26 October 2015 8:55
> To: devel@lists.sailfishos.org
> Subject: [SailfishDevel] Drawing ShaderEffect to Canvas
>  
> Hi,
>  
> This relates to my Paint app. I am making tools that modify the image (blur, 
> invert, color-channel swap, etc.) in the area under your finger, like a 
> realtime brush.
>  
> Doing the bit manipulation with JavaScript in the Canvas onPaint handler works, 
> but even simple things like invert are utterly slow.
>  
> Then I decided to give ShaderEffect a try. I can successfully get the image 
> data under the mouse from the Canvas:
>  
> ShaderEffectSource {
>     sourceItem: myCanvas
>     sourceRect: Qt.rect(x,y,w,h)
> …
>  
> The shader outputs the selected part of the canvas nicely on screen with the 
> fragmentShader applied. Looks great.
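> Roughly, the effect consuming that source looks like this (a sketch only; it 
> assumes the ShaderEffectSource above gets id: brushSource, and the invert 
> shader is just one example tool):
> 
>     ShaderEffect {
>         width: brushSource.sourceRect.width
>         height: brushSource.sourceRect.height
>         property variant source: brushSource
>         fragmentShader: "
>             varying highp vec2 qt_TexCoord0;
>             uniform sampler2D source;
>             void main() {
>                 lowp vec4 c = texture2D(source, qt_TexCoord0);
>                 gl_FragColor = vec4(1.0 - c.rgb, c.a);   // realtime invert brush
>             }"
>     }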
>  
> Now I would like to draw the shader output to my second canvas, which is later 
> drawn on top of the actual canvas if accepted by the user.
>  
> But putImageData(myShader, x, y) does not draw on the canvas and does not give 
> any error (other attempts complain that the ImageData object is invalid).
>  
> I managed to do this by calling myCanvas.grabToImage(…) and drawImage(result.url), 
> but this is again as slow as the JavaScript implementation, so it is nothing I 
> can use for realtime editing. It could still be used after masking the area 
> where the shader should be applied.
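> Roughly like this (a sketch; which item is grabbed and which canvas it is 
> drawn onto are just illustrative here):
> 
>     myShaderEffect.grabToImage(function (result) {
>         var ctx = secondCanvas.getContext("2d")
>         ctx.drawImage(result.url, 0, 0)   // result.url works as an image source
>         secondCanvas.requestPaint()
>     })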
>  
> Does anyone have any ideas?
>  
> -kimmo

_______________________________________________
SailfishOS.org Devel mailing list
To unsubscribe, please send a mail to devel-unsubscr...@lists.sailfishos.org
