Hi!

So I found a bunch of examples of how to convert YUV to RGB in OpenGL
ES 2.0, but they all require me to pass two buffers (luminance and
chrominance) as input to the shaders.
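
For reference, those examples all look roughly like the sketch below:
the Y plane goes into a GL_LUMINANCE texture, the interleaved VU plane
of NV21 into a half-resolution GL_LUMINANCE_ALPHA texture, and the
fragment shader does the matrix math. The sampler names and the BT.601
constants here are my own reconstruction, not from any one example.

    // Two-sampler NV21 shader, embedded as a Java string.
    // yTexture: GL_LUMINANCE texture holding the Y plane.
    // vuTexture: GL_LUMINANCE_ALPHA texture holding the interleaved
    // V/U bytes, so V ends up in .r and U in .a.
    static final String NV21_FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "varying vec2 vTexCoord;\n" +
        "uniform sampler2D yTexture;\n" +
        "uniform sampler2D vuTexture;\n" +
        "void main() {\n" +
        "  float y = texture2D(yTexture, vTexCoord).r;\n" +
        "  vec2 vu = texture2D(vuTexture, vTexCoord).ra - 0.5;\n" +
        "  float v = vu.x;\n" +
        "  float u = vu.y;\n" +
        // Full-range BT.601 YUV -> RGB.
        "  gl_FragColor = vec4(y + 1.402 * v,\n" +
        "                      y - 0.344 * u - 0.714 * v,\n" +
        "                      y + 1.772 * u,\n" +
        "                      1.0);\n" +
        "}\n";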

What I want to do is use the SurfaceTexture that I initialise from an
OpenGL ES 2.0 texture I allocated myself. The camera delivers its frames
to the SurfaceTexture correctly, but I can't draw the texture in the
fragment shader. I'm assuming this is because the format is NV21 (YUV)
and the normal texture2D() shader function can't process that correctly.
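
My current shader just uses a plain sampler2D. From what I've read, a
SurfaceTexture is an external texture and has to be sampled through the
GL_OES_EGL_image_external extension instead, something like this
(a sketch; the variable names are mine):

    static final String EXTERNAL_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTexCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        // texture2D() on an external sampler is supposed to return
        // RGB, with any YUV conversion handled by the driver.
        "  gl_FragColor = texture2D(sTexture, vTexCoord);\n" +
        "}\n";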

Now, the question is: how do I make this work? Is it possible to convert
NV21 to RGB in a shader without first splitting the NV21 buffer into two
separate buffers? That would sort of defeat the purpose, since I want to
process the entire frame in OpenGL. Having to copy buffers manually when
I can get the entire frame into OpenGL seems like a waste of CPU.
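
For context, this is roughly how I set things up (simplified sketch,
error handling omitted):

    import android.graphics.SurfaceTexture;
    import android.hardware.Camera;
    import android.opengl.GLES11Ext;
    import android.opengl.GLES20;

    import java.io.IOException;

    class PreviewSetup {
        SurfaceTexture surfaceTexture;
        Camera camera;

        // Called on the GL thread once the EGL context is current.
        void start() throws IOException {
            int[] tex = new int[1];
            GLES20.glGenTextures(1, tex, 0);
            // SurfaceTexture needs the external texture target.
            GLES20.glBindTexture(
                    GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

            surfaceTexture = new SurfaceTexture(tex[0]);
            camera = Camera.open();
            camera.setPreviewTexture(surfaceTexture);
            camera.startPreview();
        }

        // Before each draw, latch the most recent camera frame.
        void onDrawFrame() {
            surfaceTexture.updateTexImage();
        }
    }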

I'm pretty sure the underlying camera framework in Android converts NV21
to RGB on the fly, since I can draw the preview to a SurfaceTexture that
I receive from a TextureView. But is that conversion happening on the
CPU, or is there some secret yuv2rgb shader in the Android source code
that I can't find?

Thanks!

// Erik

