I’m trying to create an integer-valued DataTexture:
```js
const myTexture = new DataTexture(myU32Array, width, height, THREE.RedIntegerFormat, THREE.UnsignedIntType);
```
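For context, the fuller setup looks roughly like this (the dimensions and fill value are just placeholders):

```js
import * as THREE from 'three';

// Placeholder data: a small unsigned-integer grid.
const width = 4, height = 4;
const myU32Array = new Uint32Array(width * height).fill(12345);

const myTexture = new THREE.DataTexture(
  myU32Array, width, height,
  THREE.RedIntegerFormat, THREE.UnsignedIntType
);

// Integer textures aren't filterable in WebGL2, so sample with nearest filtering.
myTexture.magFilter = THREE.NearestFilter;
myTexture.minFilter = THREE.NearestFilter;
myTexture.needsUpdate = true; // trigger the GPU upload
```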
But this doesn’t seem to work. In particular, when the texture gets uploaded, three ends up calling gl.texStorage2D with an invalid internalFormat === 36244 (“RED_INTEGER”), as shown in the call stack below.
What’s the deal here? Have I created the texture wrong, or is WebGLTextures.getInternalFormat broken (shouldn’t it be able to infer a sized internal format from the format/type I passed in)?
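For what it’s worth, explicitly forcing the sized internal format looks like it should sidestep the texStorage2D error; I’m assuming the string form is the right way to set it:

```js
// Workaround sketch (assumption on my part): tell three.js the sized internal
// format explicitly, since texStorage2D rejects the unsized RED_INTEGER enum.
myTexture.internalFormat = 'R32UI'; // sized pairing of RED_INTEGER + UNSIGNED_INT
myTexture.needsUpdate = true;       // re-upload with the corrected format
```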
Thanks! BTW, after some bug fixing, this is the intended output:
Incidentally, since a sized internal format uniquely determines its base format, it might make sense to allow the Texture constructor to take the sized internal format instead of the unsized one (and serialize that to JSON as well). Is there some reason this wasn’t done back when internalFormat was added?
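Concretely, I’m imagining something like this (hypothetical signature, not the current API):

```js
// Hypothetical constructor form: the sized internal format 'R32UI' already
// implies base format RED_INTEGER and type UNSIGNED_INT, so both could be derived.
const myTexture = new THREE.DataTexture(myU32Array, width, height, 'R32UI');
```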