I want to make a custom effect using WebGLRenderTarget, let’s say I’m implementing my own Shadow Mapping, where I basically store the depth of the drawn surface for each pixel of the scene.
What would be the best approach for doing it?
I would need a custom ShaderMaterial that stores the depth in the texture, but should I clone all my Meshes in the scene and apply this custom Material to those copies? I guess I would also need a specific Scene object with an OrthographicCamera at the light's point of view. Is there a better approach?
three.js does not provide an API for performing depth pre-passes so far. However, you can do this at the application level. First, set up a render target and an override material:
var pars = { minFilter: THREE.LinearFilter, magFilter: THREE.LinearFilter, format: THREE.RGBAFormat };
var renderTargetDepth = new THREE.WebGLRenderTarget( x, y, pars ); // x, y: dimensions of the depth texture
var depthMaterial = new THREE.MeshDepthMaterial( { depthPacking: THREE.RGBADepthPacking } ); // packs depth into RGBA
I don’t think it’s necessary to have a separate scene object. However, a camera representing the light’s shadow frustum does make sense (three.js uses the same approach).
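A minimal sketch of the resulting render loop, assuming a lightCamera placed at the light and a recent three.js where render targets are set via renderer.setRenderTarget() (older releases passed the target to render() directly):

scene.overrideMaterial = depthMaterial; // every object is drawn with the depth material
renderer.setRenderTarget( renderTargetDepth );
renderer.render( scene, lightCamera );
renderer.setRenderTarget( null ); // back to the default framebuffer
scene.overrideMaterial = null; // restore the original materials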
Thanks a lot for your answer, that is really helpful. I did not notice this overrideMaterial attribute in the documentation. That is exactly what I need.
Now, ideally I would only override the fragment shader, because some of my meshes use a height map that modifies the vertex position in the vertex shader, and this changes the depth value. Do you think I could override the fragment shader only? I suppose ThreeJS does it internally, or do shadows not render that well when using a custom ShaderMaterial that modifies the vertex position?
It might be difficult to achieve though… You might end up with a shader program that does not compile because the varyings do not match, right?
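One possible angle, assuming a three.js version that supports Material.onBeforeCompile, is to keep the built-in depth material and patch only its fragment source; this is a sketch, not the internal shadow mechanism:

var patchedDepth = new THREE.MeshDepthMaterial( { depthPacking: THREE.RGBADepthPacking } );
patchedDepth.onBeforeCompile = function ( shader ) {
	// shader.vertexShader is left untouched; only the fragment source is edited
	shader.fragmentShader = shader.fragmentShader.replace(
		'void main() {',
		'void main() {\n\t// custom per-fragment logic could be injected here'
	);
};

The catch is exactly the concern above: any varying the fragment stage reads must still be declared and written by the vertex stage, otherwise the program will not link.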
A scene-wide material override won't account for material features such as skinning or morph targets the way the internal shadow map system does. Supporting them would require creating (and caching) different material variants for each feature combination in case there is more than one.
@martinRenou may I ask why you actually need a custom one?
That’s what I feared. Ok, I’ll try to use the overrideMaterial for now. It might be enough.
may I ask why you actually need a custom one?
I actually don't need a custom shadow map, I want to create a normal map of a mesh seen from the light's point of view. So it would be just like a shadow map, but instead of storing the fragment depth I would store the fragment normals.
EDIT: I am trying to improve this demo: ThreeJS Water, where the caustics are computed from the water height map and everything is "hardcoded" in the shaders, so the demo only works with a plane mesh for the water and a box for the pool. I would like to build a generalized solution for caustics computation, maybe similar to shadow mapping in its approach. I'll see how far I can get with this. I might have other ThreeJS questions later.
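A minimal sketch of such a normal pass, assuming the same lightCamera and a dedicated render target (renderTargetNormals is an assumed name). Note that MeshNormalMaterial outputs normals in the view space of whichever camera is used for rendering, here the light camera:

var normalMaterial = new THREE.MeshNormalMaterial();
scene.overrideMaterial = normalMaterial; // every mesh renders its normals into RGB
renderer.setRenderTarget( renderTargetNormals );
renderer.render( scene, lightCamera );
renderer.setRenderTarget( null );
scene.overrideMaterial = null;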
Hopefully we can avoid such issues as soon as depth pre-passes are supported.
Right, but it doesn't address the OP's actual concern:
I actually don't need a custom shadow map, I want to create a normal map of a mesh seen from the light's point of view. So it would be just like a shadow map, but instead of storing the fragment depth I would store the fragment normals.
Issue 14577, which I opened, discusses how this can be solved a bit more generally to address the OP's use case as well as the shadow case. I've started using a small utility that creates replacement materials for all objects in the scene, copies all the necessary uniforms over before the render, and resets them afterwards. It requires a bit of hacking per material, though, because of all the magic and setting of defines that happens in the WebGLProgram class.
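A hypothetical sketch of that swap-and-restore pattern (renderWithReplacement and makeMaterial are assumed names, not an existing three.js API):

function renderWithReplacement( renderer, scene, camera, target, makeMaterial ) {
	var originals = new Map();
	scene.traverse( function ( object ) {
		if ( object.isMesh ) {
			originals.set( object, object.material );
			// makeMaterial copies whatever uniforms the pass needs from the source material
			object.material = makeMaterial( object.material );
		}
	} );
	renderer.setRenderTarget( target );
	renderer.render( scene, camera );
	renderer.setRenderTarget( null );
	originals.forEach( function ( material, object ) {
		object.material = material; // restore the original materials
	} );
}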
The idea is that a depth pre-pass honors all built-in material settings.
Of course, but how does that address the OP's interest in rendering normals? (Sorry, I see that I quoted the wrong part of the message above.)
I actually don't need a custom shadow map, I want to create a normal map of a mesh seen from the light's point of view. So it would be just like a shadow map, but instead of storing the fragment depth I would store the fragment normals.
I suppose normal and depth are the most widely used passes for effects, but if the effort is being put into replacing the scene's shaders for depth and normal passes that reuse uniforms, it seems it should be done in a way that works for other passes as well. A bloom pass would want access to an emissive pass, and reflection effects would want roughness and metalness passes. I know you don't want a deferred renderer in core (I agree), but done correctly this feature would let people build their own deferred system and get a lot more flexibility and correctness in their postprocessing effects.
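To illustrate, building on the hypothetical renderWithReplacement() above, each pass could simply supply its own material factory (all target names here are assumptions):

var passes = [
	{ target: depthTarget, makeMaterial: function () {
		return new THREE.MeshDepthMaterial( { depthPacking: THREE.RGBADepthPacking } );
	} },
	{ target: normalTarget, makeMaterial: function () {
		return new THREE.MeshNormalMaterial();
	} },
	{ target: emissiveTarget, makeMaterial: function ( source ) {
		// a basic material that shows only the source material's emissive term
		return new THREE.MeshBasicMaterial( {
			color: source.emissive !== undefined ? source.emissive : 0x000000,
			map: source.emissiveMap || null
		} );
	} }
];
passes.forEach( function ( pass ) {
	renderWithReplacement( renderer, scene, camera, pass.target, pass.makeMaterial );
} );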