How to use an integer texture?

I’m trying to create an integer-valued DataTexture:

const myTexture = new DataTexture(myU32Array, width, height, THREE.RedIntegerFormat, THREE.UnsignedIntType)

But this doesn’t work. When the texture gets uploaded, three.js ends up calling gl.texStorage2D with an invalid internalFormat === 36244 (“RED_INTEGER”), per the call stack below.

What’s the deal here? Have I created the texture incorrectly, or is WebGLTextures.getInternalFormat broken (shouldn’t it be able to infer the internal format from my input)?

getInternalFormat (three.module.js:22605)
uploadTexture (three.module.js:23169)
setTexture2D (three.module.js:22922)
setValueT1 (three.module.js:17632)
upload (three.module.js:18187)
setProgram (three.module.js:28713)
WebGLRenderer.renderBufferDirect (three.module.js:27632)
renderObject (three.module.js:28263)
renderObjects (three.module.js:28232)
renderScene (three.module.js:28154)
WebGLRenderer.render (three.module.js:27974)

Any chance you could demonstrate the issue with a live example? Try using three.js dev template - module - JSFiddle - Code Playground as a foundation.


Absolutely. Here’s a small example reproducing the problem:


You can fix the issue by manually defining the internal format of the texture with the Texture.internalFormat property. Setting it to R32UI does fix it for me: threejs integer texture - JSFiddle - Code Playground

three.js automatically derives the internal format for the default format/type combinations, but not yet for more special ones; see the linked PR.
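For intuition, the missing piece is essentially a small lookup keyed on (format, type). Here’s a minimal sketch of such a mapping for RED_INTEGER textures, using raw WebGL2 enum values rather than three.js’s constants — `deriveInternalFormat` is a hypothetical helper, not the actual `getInternalFormat` implementation:

```javascript
// Raw WebGL2 enum values (the same numbers a browser's gl object exposes).
const RED_INTEGER    = 0x8D94; // 36244 — the invalid internalFormat from the stack trace
const UNSIGNED_BYTE  = 0x1401;
const UNSIGNED_SHORT = 0x1403;
const UNSIGNED_INT   = 0x1405;
const R8UI  = 0x8232;
const R16UI = 0x8234;
const R32UI = 0x8236;

// Hypothetical helper: pick a sized internal format for a RED_INTEGER
// texture based on the texel type, the way getInternalFormat would need
// to in order to cover this combination.
function deriveInternalFormat(format, type) {
  if (format === RED_INTEGER) {
    if (type === UNSIGNED_BYTE)  return R8UI;
    if (type === UNSIGNED_SHORT) return R16UI;
    if (type === UNSIGNED_INT)   return R32UI;
  }
  // Fallback: passing an unsized format like RED_INTEGER straight
  // through is exactly what makes texStorage2D reject it.
  return format;
}

deriveInternalFormat(RED_INTEGER, UNSIGNED_INT) === R32UI; // true
```

Note that three.js expects `Texture.internalFormat` as a string (`'R32UI'`), which it resolves to the numeric enum internally; the table above works directly on the GL values only for illustration.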


Thank you! The docs are a bit fuzzy but the linked PR gives some good insight!

Is there a reason the internalFormat property is omitted from both the Texture constructor and Texture.toJSON?

I guess because of the special nature of internalFormat, it hasn’t been added to the already long ctor signature.

However, the missing support in toJSON() should be fixed. I’ll file a PR.
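A minimal sketch of what that fix could look like — `serializeTexture` is a hypothetical stand-in for the relevant part of `Texture.toJSON`, assuming only that `internalFormat` defaults to `null`:

```javascript
// Hypothetical sketch: only emit internalFormat when it was set explicitly,
// so textures using the derived default keep their current JSON shape.
function serializeTexture(texture) {
  const output = {
    format: texture.format,
    type: texture.type,
  };

  // null means "let three.js derive it" — nothing worth serializing.
  if (texture.internalFormat !== null && texture.internalFormat !== undefined) {
    output.internalFormat = texture.internalFormat;
  }

  return output;
}
```

For example, `serializeTexture({ format: 36244, type: 5125, internalFormat: 'R32UI' })` carries the `internalFormat` field, while a texture with the default `null` produces the same JSON as before.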


Thanks! BTW after some bug fixing, this is the intended output:

Incidentally, since a sized internal format uniquely specifies a base format, it may make sense to allow the Texture constructor to take that internal format instead of the unsized format (and save the same to JSON). Is there some reason this wasn’t done back when internalFormat was added?
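To illustrate the point: the sized-format → (base format, type) relation is fixed by the WebGL2 spec, so a constructor taking only the sized name could recover the other two fields from a table. A minimal sketch with a few entries — `baseFormatFor` and `SIZED_TO_BASE` are hypothetical, not three.js API:

```javascript
// Per the WebGL2 spec, each sized internal format has exactly one base
// format and a fixed set of compatible types (one canonical choice here).
const SIZED_TO_BASE = {
  R32UI: { format: 0x8D94 /* RED_INTEGER */, type: 0x1405 /* UNSIGNED_INT */ },
  R8:    { format: 0x1903 /* RED */,         type: 0x1401 /* UNSIGNED_BYTE */ },
  RGBA8: { format: 0x1908 /* RGBA */,        type: 0x1401 /* UNSIGNED_BYTE */ },
};

// Hypothetical helper: what a sized-format-first ctor could use internally.
function baseFormatFor(internalFormat) {
  const entry = SIZED_TO_BASE[internalFormat];
  if (!entry) throw new Error(`unknown internal format: ${internalFormat}`);
  return entry;
}
```

So `baseFormatFor('R32UI')` yields RED_INTEGER / UNSIGNED_INT — the two values the original DataTexture call had to spell out by hand.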

I don’t know, sorry. Maybe this should be asked in the PR which introduced internalFormat.