Support for EXT_texture_norm16

Hi,

Is there a reason why EXT_texture_norm16 is not supported? If not, would you be willing to accept a PR?

I have large 2D and 3D textures in uint16 format. The extension lets you use this data without conversion and still get linear interpolation on the shader side.

Cheers,
Tom

To add some context: The extension would allow you to

  • use 16 bit integers (signed or unsigned) for 2D or 3D textures,
  • have those values normalized in the shader, and
  • choose the interpolation type, in particular linear.

This is very useful for visualizing data that typically comes in 16 bit, such as medical image data.
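
For anyone wondering what this enables at the raw WebGL2 level (independent of how three.js would wire it up), here is a minimal sketch; the sizes and data are placeholders:

    const canvas = document.createElement('canvas');
    const gl = canvas.getContext('webgl2');

    const ext = gl.getExtension('EXT_texture_norm16');
    if (ext === null) throw new Error('EXT_texture_norm16 not supported');

    // Placeholder size and data; imagine one slice of a 16 bit medical scan.
    const width = 256, height = 256;
    const data = new Uint16Array(width * height);

    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1); // safe for arbitrary row widths

    // R16_EXT is a normalized internal format: the shader reads it through a
    // plain sampler2D and gets a float in [0, 1] (value / 65535), so no
    // CPU-side conversion to float is needed.
    gl.texImage2D(gl.TEXTURE_2D, 0, ext.R16_EXT, width, height, 0,
                  gl.RED, gl.UNSIGNED_SHORT, data);

    // This is the point of the extension: norm16 textures are filterable,
    // unlike the unnormalized integer formats (R16UI), which require NEAREST.
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);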

This might be better off posted on GitHub?

@dubois, you might be right. I’ll give it a try.

To wrap up this subject: This PR introduces the extension. The PR has already been merged and will be included in r184. Kudos to @Mugen87 for his support!

For people working with large datasets, especially volumetric or scientific data, support for EXT_texture_norm16 in Three.js is genuinely important. A lot of real-world data already comes in 16 bit formats, and converting everything to floats on the CPU can be expensive in both memory and preprocessing time.

Being able to upload the data directly and have it normalized on the shader side, while still allowing linear interpolation, makes workflows much cleaner (see the volume sketch after the list). This is particularly useful for things like:

  • medical imaging volumes
  • terrain or density fields
  • simulation grids
  • large scientific datasets
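
For the volumetric case, the usage I have in mind looks roughly like the sketch below. I have not verified the exact internal-format spelling the merged PR expects, so treat the texture.internalFormat line as an assumption and check the PR for the details:

    import * as THREE from 'three';

    // Hypothetical volume dimensions; a real dataset (e.g. a CT scan)
    // would be loaded from disk instead of allocated empty.
    const width = 128, height = 128, depth = 128;
    const data = new Uint16Array(width * height * depth);

    const texture = new THREE.Data3DTexture(data, width, height, depth);
    texture.format = THREE.RedFormat;
    texture.type = THREE.UnsignedShortType;

    // Assumption: this name resolves to the norm16 internal format
    // (R16_EXT from EXT_texture_norm16); see the PR for the exact API.
    texture.internalFormat = 'R16';

    // Linear filtering is what norm16 buys you; the integer format R16UI
    // would be limited to NearestFilter.
    texture.minFilter = THREE.LinearFilter;
    texture.magFilter = THREE.LinearFilter;
    texture.unpackAlignment = 1; // safe row alignment for arbitrary widths
    texture.needsUpdate = true;

    // In GLSL the texture binds to a regular sampler3D; texture() returns
    // the stored value normalized to [0, 1] (i.e. value / 65535).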

I work a lot with procedural environments and data-driven systems, and avoiding unnecessary conversions can make a big difference when dealing with large textures or 3D volumes.

Nice to see this improvement making its way into the engine. Kudos to the contributors who helped push the PR forward.
