Reading / writing floating point texture

I didn’t find any examples of working with unclamped 4-byte float textures (gl.R32F) for parallel computing, and I needed it for something else, so I made a small example. Maybe it’ll be useful for someone (I’m only interested in WebGL2):

https://jsfiddle.net/f9meyb1c/


The idea is to have two textures A & B: initialize A with some values in JS, then read A, do some math, render to B, swap input and output, rinse and repeat. A separate shader program reads the output texture and draws a visual.
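A minimal sketch of just the ping-pong part, assuming WebGL2 and a recent three.js. `simulationMaterial` and its `uInput` sampler uniform are placeholders, not the fiddle’s actual code (a fuller sketch of that material comes after the notes below):

```js
import * as THREE from 'three';

// Rendering into a float target on WebGL2 needs EXT_color_buffer_float.
// RedFormat + FloatType ends up as an R32F color attachment.
function makeTarget(size) {
  return new THREE.WebGLRenderTarget(size, size, {
    format: THREE.RedFormat,
    type: THREE.FloatType,
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    depthBuffer: false,
    stencilBuffer: false,
  });
}

const SIZE = 256;
let readTarget = makeTarget(SIZE);   // "A": current values
let writeTarget = makeTarget(SIZE);  // "B": next values

function step(renderer, simScene, simCamera, simulationMaterial, inputTexture) {
  // On the very first step inputTexture is the initial DataTexture (note 2 below);
  // afterwards it's readTarget.texture.
  simulationMaterial.uniforms.uInput.value = inputTexture;
  renderer.setRenderTarget(writeTarget);
  renderer.render(simScene, simCamera);
  renderer.setRenderTarget(null);
  // Swap: what was just written becomes the input of the next step.
  [readTarget, writeTarget] = [writeTarget, readTarget];
}
```

After the swap, `readTarget.texture` holds the latest result, so the separate display material just samples that each frame.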

Notes:

  1. I had a look at the birds example; it seems to use normalized values [0…1], so it reads texture channels as usual and isn’t really a float texture.

  2. I didn’t find a way to set initial values on the WebGLRenderTarget texture, even though it’s configured as gl.R32F / gl.FLOAT, so I had to swap in a DataTexture on the fly, which looks like a hack but works (sketched after these notes).

  3. I had to go with RawShaderMaterial and manually wire up a quad for reading/writing the float textures, because three.js’s generated shader code seems to always declare the fragment shader’s out variable as a vec4 (here it needs to be a float) and I don’t know how to change that (also sketched below).
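Roughly, a minimal version of what notes 2 and 3 describe could look like this (uniform/variable names like `uInput` / `outValue` and the shader math are just placeholders, not the fiddle’s actual code). The initial values live in a DataTexture that is fed to the first simulation step instead of being written into the render target, and the simulation material is a RawShaderMaterial with `glslVersion: THREE.GLSL3` so the fragment output can be declared as a single float:

```js
import * as THREE from 'three';

// Note 2: initial values go into a DataTexture used as the first input.
const SIZE = 256;
const initial = new Float32Array(SIZE * SIZE);
for (let i = 0; i < initial.length; i++) initial[i] = Math.random();

const initialTexture = new THREE.DataTexture(
  initial, SIZE, SIZE, THREE.RedFormat, THREE.FloatType
);
initialTexture.needsUpdate = true;

// Note 3: RawShaderMaterial leaves the GLSL untouched, so the fragment
// output can be a single float matching the R32F target.
const simulationMaterial = new THREE.RawShaderMaterial({
  glslVersion: THREE.GLSL3,          // three.js prepends "#version 300 es"
  uniforms: { uInput: { value: initialTexture } },
  vertexShader: /* glsl */ `
    in vec3 position;
    void main() {
      gl_Position = vec4(position.xy, 0.0, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    precision highp float;
    uniform sampler2D uInput;
    out float outValue;              // float, not vec4, for the R32F target
    void main() {
      float v = texelFetch(uInput, ivec2(gl_FragCoord.xy), 0).r;
      outValue = v + 0.01;           // placeholder for the real math
    }
  `,
});

// Fullscreen quad with positions already in clip space.
const simScene = new THREE.Scene();
const quad = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), simulationMaterial);
quad.frustumCulled = false;
simScene.add(quad);
const simCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
```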
