BufferGeometry disappears when the resolution for the DataTexture increases

I recently covered the topic of how to load vertex data from a BufferGeometry into a DataTexture and thus have full access to all vertex coordinates in the shader. Here is the link to that topic, and in it there is a GitHub link to the repository below (BufferGeometry-from-DataTexture).

If I increase the resolution of the BufferGeometry, I notice that something happens between 60 and 64. With a resolution of 60, i.e. 60x60 vertices, everything still works. But if I set the resolution to 64, the geometry disappears and all vertices end up at the origin. Does anyone know an elegant way to improve my repository so that it also works with much higher resolutions?
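One thing that might be worth checking (purely a guess on my part, not something confirmed here): a 1-pixel-high DataTexture gets very wide very quickly, so the required width can be compared against the GPU's maximum texture size, e.g. via three.js's `renderer.capabilities.maxTextureSize`:

```js
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();

// Assumption for illustration: a resolution x resolution grid packed into a
// 1-pixel-high texture needs a width of resolution * resolution texels
// (60x60 -> 3600, 64x64 -> 4096).
const resolution = 64;
const requiredWidth = resolution * resolution;
const maxSize = renderer.capabilities.maxTextureSize;

console.log(`required texture width: ${requiredWidth}, GPU limit: ${maxSize}`);
if (requiredWidth > maxSize) {
  console.warn('The 1-pixel-high DataTexture is wider than this GPU allows.');
}
```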

If I wanted to use a texture to store vertex data, I'd make it not a 1-pixel-high line, but more rectangular :slight_smile: (or I don't know how to explain it better)
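For instance, something along these lines (a rough sketch only, not the exact code from the demo; the function name is made up): pack one vertex position per texel into a roughly square RGBA float texture, so the width stays far below the GPU's texture size limit.

```js
import * as THREE from 'three';

// Illustrative helper: copy the position attribute of a geometry into a
// (roughly) square float DataTexture, one vertex per RGBA texel.
function positionsToDataTexture(geometry) {
  const position = geometry.attributes.position;
  const count = position.count;

  // Smallest square texture that holds all vertices.
  const size = Math.ceil(Math.sqrt(count));

  const data = new Float32Array(size * size * 4); // RGBA per texel
  for (let i = 0; i < count; i++) {
    data[i * 4 + 0] = position.getX(i);
    data[i * 4 + 1] = position.getY(i);
    data[i * 4 + 2] = position.getZ(i);
    data[i * 4 + 3] = 1.0;
  }

  const texture = new THREE.DataTexture(data, size, size, THREE.RGBAFormat, THREE.FloatType);
  texture.needsUpdate = true;
  return texture;
}
```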

Here is a very rough example that works with WebGL2 only, as it utilizes texelFetch and gl_VertexID. It's pretty much doable with WebGL1 too, just with more calculations.
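The core idea in the vertex shader looks roughly like this (a sketch only, assuming the square texture layout from above and a three.js ShaderMaterial with glslVersion: THREE.GLSL3, which injects projectionMatrix and modelViewMatrix; uPositions is an illustrative uniform name, not necessarily the one used in the demo):

```glsl
// GLSL ES 3.00 (WebGL2) vertex shader sketch — not the exact demo code.
uniform sampler2D uPositions; // square DataTexture, one vertex position per texel

void main() {
  // textureSize gives the texture dimensions, so no extra uniform for the
  // texture width is needed.
  int size = textureSize(uPositions, 0).x;

  // Convert the built-in linear vertex index into 2D texel coordinates.
  ivec2 texel = ivec2(gl_VertexID % size, gl_VertexID / size);

  // texelFetch reads the texel directly by integer index — no filtering
  // and no normalized UVs involved.
  vec3 pos = texelFetch(uPositions, texel, 0).xyz;

  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}
```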

Demo: https://codepen.io/prisoner849/full/GRYLOGV

PS: I made the plane wavy in the shader to be sure that we're working with the data from the texture (see the comments in the code).


Now it works with any number of vertices. With your few lines of code, I learned several new things at once.
