WebGLMultipleRenderTargets with different texture format/types

I am trying to create a WebGLMultipleRenderTargets (MRT) with three textures.

My goal is for two of the MRT's textures to be RGBA8 and the third to be RG32F:

// Create multipleRenderTarget
var mrt = new THREE.WebGLMultipleRenderTargets(window.innerWidth, window.innerHeight, 3, {
  format: THREE.RGBAFormat,
  type: THREE.UnsignedByteType
});

mrt.texture[0].format = THREE.RGBAFormat;
mrt.texture[0].type = THREE.UnsignedByteType;
mrt.texture[1].format = THREE.RGBAFormat;
mrt.texture[1].type = THREE.UnsignedByteType;
mrt.texture[2].format = THREE.RGFormat;
mrt.texture[2].type = THREE.FloatType;

However, I am getting these errors when I try to use the MRT to render.

[.WebGL-0x12400d1fc00] GL_INVALID_OPERATION: Invalid combination of format, type and internalFormat.
[.WebGL-0x10000d4bd00] GL_INVALID_FRAMEBUFFER_OPERATION: Framebuffer is incomplete: Attachment has zero size.

Here is a codepen demonstrating my setup with the resulting error https://codepen.io/RemusW/pen/WNWBLeN

Is it possible to have differently formatted textures in an MRT, or even just to update their settings after initialization? The code runs fine if I don’t change the MRT’s textures after it is initialized.

Note: I know that in the most recent versions of three.js, WebGLMultipleRenderTargets has been removed, with its functionality now built into WebGLRenderTarget. I’ve added another codepen that tries the same thing with the latest WebGLRenderTarget, but I get the same errors.

Hi Romulus,
I’ve been working a bit with render targets lately because they’re of interest to me as well. The format and type are defined by the render target’s constructor parameters; changing them on the textures themselves leads to errors, as far as I understand the render target code. If you want a texture with a different format and type, you need a new render target.

What I’m writing now is a guess: the render targets have to transport the textures to the GPU. To do this, they need exact instructions about how the shader must be structured (format, type, etc.). If you don’t specify all the options, the render targets fall back on their internal defaults so that everything is fully parameterized. With all these details, a shader is then created, and with it the textures are rendered into the GPU.
However, these shaders cannot be changed once they have been created, because they tell the GPU exactly how to reserve memory. The GPU is very picky and doesn’t tolerate changes after the fact.
I work almost exclusively with WebGPU, and there I have to specify the texture type exactly when I create textures for compute shaders; it can no longer be changed afterwards. If I want a different texture format, I need a new compute shader.

I was able to assign different formats to individual textures inside .textures.

Ok then my hypothesis was wrong.

Could you provide an example? .textures isn’t defined in WebGLMultipleRenderTargets, but I know it exists in the newest version of three.js, where WebGLRenderTarget now supports multiple-render-target functionality.

I’ve made another codepen that uses the latest version of three.js and WebGLRenderTarget in the same scenario, but I get the same error.

You are right that this may require the more recent versions. I updated to r163 just to try it out.

I think you’re misunderstanding the relationship between shaders and other resources on the GPU. A shader is concerned with its inputs, one kind of which can be a sampler. Once the shader is created, you can reuse it, e.g. to draw different meshes with different textures while applying the exact same logic to them (e.g. lighting).

I didn’t dive too deep, but at a glance it’s still one framebuffer, which then has this arbitrary number of color attachments, the same as always.

Snapshot of my demo above:

Hey dubois,

Thanks for the help! It turns out I needed to specify glslVersion: THREE.GLSL3 on the shader material.

I was unsure how some of the internals were working, because layout(location = 0) already seemed to be defined as pc_fragColor, so I wasn’t sure how that interacted with my own layout declarations. After specifying GLSL3 it was no longer predefined, and everything seems to work!

I’ve updated the codepens so that they run without any warnings now.

Oh ouch, yeah, that gave me a massive headache.

Another way of explaining that came to mind:

In GLSL you write gl_FragColor = myValue; and don’t care too much about the precision and format of myValue; it will still be written, at least somehow, to a target regardless of how you set things up (at least I think so). In that sense, you could run the exact same shader with the exact same uniforms and meshes twice, but write to two completely different targets (float, unsigned byte, etc.).

In WGSL things are a little different. You have to specify exactly what you want in the shader:

texture_storage_2d<rgba32float, write>

I’m probably already too focused on WGSL.