I would like to use two fragment shaders with the outputStruct node. This works without errors at first.
Now I would like to save the output of the second fragment shader in a texture. For this I simply use a RenderTarget. However, I must be making a mistake somewhere, because I get error messages in the console:
const dpr = window.devicePixelRatio;
renderTarget = new THREE.RenderTarget( window.innerWidth * dpr, window.innerHeight * dpr, { count: 2 } );
const fragmentShader1 = wgslFn(`
	fn main1() -> vec4<f32> {
		return vec4<f32>( 1, 0, 0, 1 ); // red: first struct output
	}
`);

const fragmentShader2 = wgslFn(`
	fn main2() -> vec4<f32> {
		return vec4<f32>( 0, 1, 0, 1 ); // green: second struct output
	}
`);
const material = new MeshBasicNodeMaterial();
material.outputNode = outputStruct(
	fragmentShader1(), // color target 0
	fragmentShader2(), // color target 1
);
quad = new QuadMesh(material);
// in the render loop
quad.render(renderer);
renderer.setRenderTarget( renderTarget );
renderer.render( scene, camera );
renderer.setRenderTarget( null );
Error messages in the console:
index.html:1 Color target has no corresponding fragment stage output but writeMask (ColorWriteMask::(Red|Green|Blue|Alpha)) is not zero.
- While validating targets[1] framebuffer output.
- While validating fragment state.
- While calling [Device].CreateRenderPipeline([RenderPipelineDescriptor]).
250[Invalid RenderPipeline] is invalid.
- While encoding [RenderPassEncoder].SetPipeline([Invalid RenderPipeline]).
249[Invalid CommandBuffer from CommandEncoder "renderContext_0"] is invalid.
- While calling [Queue].Submit([[Invalid CommandBuffer from CommandEncoder "renderContext_0"]])
I see a red screen, i.e. the output of the first shader. I need the output of the second shader in a texture for further calculations in another shader, which is why I would like to render it into a texture.
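Roughly, this is how I intend to use the second texture afterwards (just a sketch; texture() is the TSL texture node, imported like wgslFn and outputStruct, and postMaterial is a placeholder name):

// sketch: sample the second color attachment in a follow-up material
const postMaterial = new MeshBasicNodeMaterial();
postMaterial.colorNode = texture( renderTarget.textures[ 1 ] );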
Does anyone already have experience with this?
Here is a CodePen example:
I don’t know the exact nature of those error messages, but wouldn’t this require multiple render targets?
I have managed to render to multiple render targets on WebGPU using the node system by using WebGLMultipleRenderTargets. (However, notice that I am still on three.js version r157.)
I define it like this:
const multiRenderTarget = new WebGLMultipleRenderTargets( undefined, undefined, 0 ); // size is set later
multiRenderTarget.texture = [ colorTexture, normalTexture, idTexture ]; // three color attachments
multiRenderTarget.depthTexture = depthTexture;
and obviously also set its correct size later.
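For example, something like:

multiRenderTarget.setSize( window.innerWidth, window.innerHeight ); // resizes the target and all its textures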
My image is rendered into colorTexture, which I later render to screen in a different pass, while my other textures contain the normals, IDs and depth values that I write into them from the fragment shader.
My output node declaration looks like this, which is similar to yours, so I think that should work:
material.outputNode = Nodes.outputStruct(
	mainColorNode( colorNodeParams ), // Color (wgslFn node)
	Nodes.vec4( Nodes.normalView, 1.0 ), // Normal
	idNode( idNodeParams ), // ID (wgslFn node)
);
WebGLMultipleRenderTargets will disappear with r172. It is also a WebGL target, which I would not like to use in a WebGPU environment. I'll take a look at the render targets in the three.js code; maybe that will help me. But thank you for your effort. If I find something, I'll let you know so you're up to date, because a lot has happened in three.js in terms of WebGPU since r157.
Solved with the example:
https://threejs.org/examples/webgpu_multiple_rendertargets.html
The textures do not need to be specified separately.
For the WebGPURenderer you should use RenderTarget. With a count > 1 it automatically becomes a multiple render target; if no count is specified, it defaults to 1.
const renderTargetOptions = {
	count: 2,
	minFilter: THREE.NearestFilter,
	magFilter: THREE.NearestFilter
};

renderTarget = new THREE.RenderTarget( width, height, renderTargetOptions );
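The automatically created textures can then be accessed by index, for example:

const firstTexture = renderTarget.textures[ 0 ]; // first outputStruct entry
const secondTexture = renderTarget.textures[ 1 ]; // second outputStruct entry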
Since you also work with readRenderTargetPixelsAsync: with r164 comes a small extension that lets you specify which of the render target's textures you want to read.
renderer.readRenderTargetPixelsAsync( renderTarget, x, y, width, height, index );
However, the index is optional and can also be omitted if you want to read the first texture.
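Reading back the second texture could then look like this (a sketch; width and height are the target's dimensions, and index 1 selects the second texture):

const pixelData = await renderer.readRenderTargetPixelsAsync( renderTarget, 0, 0, width, height, 1 ); // resolves with the pixel data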