I’ve ported some code to use the new WebGLMultipleRenderTargets, but one piece still needs to read a pixel back from one of my texture buffers. Previously that buffer lived in a single WebGLRenderTarget and I used WebGLRenderer.readRenderTargetPixels.
What is the right way to do this now? I couldn’t see an easy way to share a texture between a WebGLMultipleRenderTargets and a WebGLRenderTarget, but is that even the correct strategy?
readRenderTargetPixels doesn’t give me a way to tell it which of the textures in my WebGLMultipleRenderTargets to read from.
I’m afraid this use case is not yet supported. readRenderTargetPixels() only supports normal render targets; more specific ones like WebGLMultipleRenderTargets can’t be used with this method.
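For reference, the single-target readback that readRenderTargetPixels() does support looks roughly like this. This is a sketch assuming an existing `renderer` and an unsigned-byte RGBA render target; the `readPixel` and `clampCoord` helper names are mine, not three.js API:

```javascript
// Pure helper: clamp a read coordinate into the target's bounds.
function clampCoord(value, size) {
  return Math.min(Math.max(value, 0), size - 1);
}

// Sketch of the single-target readback path. `renderer` is a live
// THREE.WebGLRenderer and `target` an unsigned-byte RGBA WebGLRenderTarget.
function readPixel(renderer, target, x, y) {
  const buffer = new Uint8Array(4); // one RGBA texel, UnsignedByteType
  renderer.readRenderTargetPixels(
    target,
    clampCoord(x, target.width),
    clampCoord(y, target.height),
    1, 1,
    buffer
  );
  return buffer;
}
```

The method takes the whole render target, with no parameter to select an attachment, which is why a WebGLMultipleRenderTargets can’t be passed to it.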
The use case here is hit testing (mouse cursor coordinates) on my deferred object id buffer.
Short of dropping into the raw WebGL API, I imagine my implementation could be:

- Create a 1x1 WebGLRenderTarget and keep it around.
- Render into it from the objectID texture living inside the WebGLMultipleRenderTargets, sampling at the correct coordinates.
- Then use readRenderTargetPixels. If I remember correctly, glReadPixels forces a flush of any pending draws into the FBO(?)
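The steps above could be sketched like this against the three.js API. Everything here is an assumption from the thread: `renderer`, `mrt` (the WebGLMultipleRenderTargets), the objectID attachment index, unsigned-byte output, and the helper names are all mine:

```javascript
// Pure helper: convert mouse coords (origin top-left) to texture uv
// (origin bottom-left). Testable without a WebGL context.
function mouseToUv(mouseX, mouseY, width, height) {
  return { u: mouseX / width, v: 1 - mouseY / height };
}

// Builds a picker that copies the one texel we care about from the
// objectID attachment into a persistent 1x1 target, then reads it back.
// `idIndex` is the (assumed) attachment index of the objectID texture.
function createIdPicker(THREE, renderer, mrt, idIndex) {
  const target = new THREE.WebGLRenderTarget(1, 1); // step 1: kept around
  const material = new THREE.ShaderMaterial({
    uniforms: {
      idTexture: { value: null },
      pickUv: { value: new THREE.Vector2() }
    },
    // Pass-through quad; the fragment shader samples one uv.
    vertexShader: 'void main() { gl_Position = vec4( position.xy, 0.0, 1.0 ); }',
    fragmentShader: [
      'uniform sampler2D idTexture;',
      'uniform vec2 pickUv;',
      'void main() { gl_FragColor = texture2D( idTexture, pickUv ); }'
    ].join('\n')
  });
  const scene = new THREE.Scene();
  scene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), material));
  const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
  const pixel = new Uint8Array(4);

  return function pick(mouseX, mouseY, width, height) {
    const uv = mouseToUv(mouseX, mouseY, width, height);
    material.uniforms.idTexture.value = mrt.texture[idIndex];
    material.uniforms.pickUv.value.set(uv.u, uv.v);

    const previous = renderer.getRenderTarget();
    renderer.setRenderTarget(target); // step 2: copy the sampled texel
    renderer.render(scene, camera);
    renderer.setRenderTarget(previous);

    // Step 3: readPixels is a synchronization point, so pending draws
    // into the FBO complete before the read.
    renderer.readRenderTargetPixels(target, 0, 0, 1, 1, pixel);
    return pixel;
  };
}
```

Usage would be something like `const pick = createIdPicker(THREE, renderer, mrt, 1);` once, then `pick(event.clientX, event.clientY, canvas.width, canvas.height)` on mouse move.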