Support for EXT_texture_norm16

Hi,

Is there a reason why EXT_texture_norm16 is not supported? If not, would you be willing to accept a PR?

I have large 2D and 3D textures in uint16 format. The extension would let me upload this data without conversion while still getting linear interpolation in the shader.

Cheers,
Tom

To add some context: The extension would allow you to

  • use 16-bit integers (signed or unsigned) for 2D or 3D textures,
  • have those values normalized in the shader, and
  • choose the interpolation type, in particular linear.

This is very useful for visualizing data that typically comes in 16 bit, such as medical image data.
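For reference, here is a minimal sketch of what using the extension looks like in raw WebGL2. It assumes a browser-provided WebGL2 context `gl` and a driver that exposes EXT_texture_norm16; the function name `uploadNorm16Texture` is just illustrative. The "normalized" part means a stored uint16 value v is sampled as v / 65535, a regular float in [0, 1], which is what makes linear filtering legal (unlike the integer formats R16UI etc., which forbid it).

```javascript
// The "normalized" mapping: a stored uint16 v is sampled as v / 65535.
function unorm16ToFloat(v) {
  return v / 65535;
}

// Sketch only (assumption: `gl` is a WebGL2 context and `data` is a
// Uint16Array of length width * height). Not a tested implementation.
function uploadNorm16Texture(gl, width, height, data) {
  const ext = gl.getExtension('EXT_texture_norm16');
  if (!ext) {
    throw new Error('EXT_texture_norm16 not supported on this device');
  }
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // R16_EXT: single-channel 16-bit unsigned normalized internal format,
  // defined on the extension object; uploaded as RED / UNSIGNED_SHORT.
  gl.texImage2D(gl.TEXTURE_2D, 0, ext.R16_EXT, width, height, 0,
                gl.RED, gl.UNSIGNED_SHORT, data);
  // Linear filtering is allowed because the format is normalized,
  // not an integer format.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  return tex;
}
```

In the fragment shader the texture is then sampled with a plain `sampler2D` and `texture()`, no integer sampler or manual normalization needed.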

This might be better off posted on GitHub?

@dubois, You might be right. I’ll give it a try.
