Does threejs support integer position attributes?

Hi Everyone,

I stream encoded point clouds using Google Draco. My three.js version is 0.182.0. Draco has an option to skip vertex data dequantization during decoding.

Dequantization is the process of converting the integer coordinates back to floats using a formula like this:

vec3 decompressedPos = uMin + (position * uRange / uMaxQuantized);
where uMin, uRange, uMaxQuantized are constant parameters of quantization (per frame).
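For reference, the CPU-side equivalent of that formula would look something like this (a minimal sketch; the function name and the interleaved [x, y, z] layout are my assumptions, not Draco’s API):

```javascript
// CPU-side sketch of the dequantization formula above (names are illustrative).
// quantized: Uint32Array of length 3*N, laid out [x0, y0, z0, x1, y1, z1, ...].
// uMin/uRange: per-axis quantization parameters; uMaxQuantized: e.g. 2^bits - 1.
function dequantize(quantized, uMin, uRange, uMaxQuantized) {
  const out = new Float32Array(quantized.length);
  for (let i = 0; i < quantized.length; i += 3) {
    for (let axis = 0; axis < 3; axis++) {
      out[i + axis] =
        uMin[axis] + (quantized[i + axis] * uRange[axis]) / uMaxQuantized;
    }
  }
  return out;
}
```

Doing this per vertex on the CPU every frame is exactly the cost you avoid by moving the formula into the shader.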

I thought my code could benefit from doing the dequantization in the shader, so I defined this formula in a ShaderMaterial, as shown above.

Geometry data is set like this:

mesh.geometry.setAttribute('position', new Uint32BufferAttribute(positions, 3, false))

The positions array is what comes from Draco, in uint32 format. I’m getting an error:

[.WebGL-0x19ac0011f800] GL_INVALID_OPERATION: glDrawArrays: Vertex shader input type does not match the type of the bound vertex attribute.

I see there was a pull request regarding integer attributes (“Examples: Add integer attributes demo.” by Mugen87 · Pull Request #19024 · mrdoob/three.js · GitHub), but the attached application example does not demonstrate an integer position attribute for this reason: “We can’t use ShaderMaterial in this case since it’s necessary to overwrite the default position attribute (which is vec3 but we would need ivec3).”

Surprisingly, I do not get the error if I use Int16BufferAttribute, but the picture is corrupted because Draco outputs uint32 data.

Is my understanding correct that I don’t have any options here, except maybe using WebGL directly?

It’s common to use int8, uint8, int16, or uint16 vertex attributes with three.js, typically for the memory savings. You don’t necessarily need a custom shader for those; you can just apply a scale and offset on the mesh to compensate. I find it convenient to make them ‘normalized’ attributes so that they’re automatically remapped to [-1, 1] or [0, 1], but that’s up to you.
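For what it’s worth, the arithmetic behind the normalized-attribute trick looks like this (plain JS for illustration; in three.js you would bake the min into mesh.position and the range into mesh.scale rather than looping on the CPU):

```javascript
// Illustrative sketch: a 'normalized' uint16 attribute is remapped by the GPU
// to [0, 1] as value / 65535. Setting the mesh offset to min and the mesh scale
// to range then reconstructs the original coordinate without a custom shader:
//   world = min + (value / 65535) * range
function decodeNormalizedUint16(value, min, range) {
  const normalized = value / 65535; // what the GPU does for a normalized attribute
  return min + normalized * range;  // what the mesh transform compensates for
}
```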

I haven’t tried with int32 or uint32, but you might need to assign attribute.gpuType in that case? If that’s not working then I think it might be worth filing an issue and/or updating the documentation.

I successfully used RawShaderMaterial with a little code in its onBeforeRender callback to set the model and projection matrices. The shader in my RawShaderMaterial declares ‘uvec3 position’ instead of ‘vec3’, and the geometry was set using a BufferAttribute with uint32 data.
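For anyone else trying this, a minimal vertex shader for that setup might look like the sketch below (assuming WebGL2 / GLSL ES 3.00; the uMin/uRange/uMaxQuantized uniform names come from the formula earlier in the thread, everything else is illustrative):

```glsl
#version 300 es
// RawShaderMaterial vertex shader sketch: position is declared as uvec3 so it
// matches a uint32 buffer attribute bound as an integer vertex attribute.
in uvec3 position;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform vec3 uMin;
uniform vec3 uRange;
uniform float uMaxQuantized;

void main() {
  // Dequantize in the shader: cast the integer coordinates to float first.
  vec3 decompressedPos = uMin + (vec3(position) * uRange / uMaxQuantized);
  gl_Position = projectionMatrix * modelViewMatrix * vec4(decompressedPos, 1.0);
}
```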

I also discovered that when we use (u)int8 or (u)int16 attributes, the values are implicitly converted to float by glVertexAttribPointer (this likely happens on the GPU side). So you are right that these can be used with standard three.js shaders that expect float vectors.

Can you share some code? Why did you need to use onBeforeRender with the RawShaderMaterial? You should be able to (I think) set it up synchronously.