I have attempted to use this fiddle (which is basically a working version of this one), this example and this explanation to integrate into my own fiddle a post-processing shader that redraws the scene using its depth, but all I get is a white canvas and I don't know what is needed to make it work.
Can someone help me identify what is missing or done wrong in my fiddle (maybe with a working fork, if that's not too much to ask)? For the record, I use a single camera and want that simple shader to modify the original image presented to the user based on depth. The final aim is to combine the original color, the depth value and some other parameters (to be added later) into a "volumetric layer" that simulates an atmosphere.
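For reference, this is a minimal sketch of the approach I'm trying to reproduce, modeled on the official depth-texture example rather than on my actual fiddle code: render the scene into a target that also captures depth, then draw a full-screen quad whose shader reads both the color and the depth. The `fogColor` uniform and the final `mix()` are just placeholders for the atmosphere parameters I plan to add later.

```js
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;
const scene = new THREE.Scene();
scene.add(new THREE.Mesh(
  new THREE.TorusKnotGeometry(1, 0.3),
  new THREE.MeshBasicMaterial({ color: 0x44aa88 })
));

// Render target that also captures the scene's depth.
const target = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);
target.depthTexture = new THREE.DepthTexture(window.innerWidth, window.innerHeight);
target.depthTexture.type = THREE.UnsignedShortType;

// Full-screen quad for the post-processing pass.
const postCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
const postMaterial = new THREE.ShaderMaterial({
  uniforms: {
    tDiffuse:   { value: target.texture },
    tDepth:     { value: target.depthTexture },
    cameraNear: { value: camera.near },
    cameraFar:  { value: camera.far },
    fogColor:   { value: new THREE.Color(0x8888ff) } // placeholder "atmosphere" tint
  },
  vertexShader: /* glsl */`
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position, 1.0); // full-screen quad in clip space
    }`,
  fragmentShader: /* glsl */`
    #include <packing>
    varying vec2 vUv;
    uniform sampler2D tDiffuse;
    uniform sampler2D tDepth;
    uniform float cameraNear;
    uniform float cameraFar;
    uniform vec3 fogColor;

    float readDepth(sampler2D depthSampler, vec2 coord) {
      float fragCoordZ = texture2D(depthSampler, coord).x;
      float viewZ = perspectiveDepthToViewZ(fragCoordZ, cameraNear, cameraFar);
      return viewZToOrthographicDepth(viewZ, cameraNear, cameraFar); // 0 near, 1 far
    }

    void main() {
      vec3 color = texture2D(tDiffuse, vUv).rgb;
      float depth = readDepth(tDepth, vUv);

      // Equivalent of my "Working Shader Test": uncomment to confirm the pass runs at all.
      // gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); return;

      // Blend the original color toward the fog color with distance.
      gl_FragColor = vec4(mix(color, fogColor, depth), 1.0);
    }`
});
const postScene = new THREE.Scene();
postScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), postMaterial));

function animate() {
  requestAnimationFrame(animate);
  renderer.setRenderTarget(target);   // pass 1: scene color + depth into the target
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);     // pass 2: full-screen depth shader to the canvas
  renderer.render(postScene, postCamera);
}
animate();
```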
P.S. No need to bother with the terrain, clone and raycaster parts in my code; they're working fine. You can easily check that things work at a basic level by either uncommenting the `/* Working Shader Test */` part in the fragment shader (this draws a red "layer" over the scene) or initializing the global `compose` variable to `false` (to toggle the post-processing/composer stuff off and draw the original outcome).
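For completeness, the `compose` toggle boils down to something like the loop below. The flag name is taken from my fiddle; the rest of the objects are the ones from the sketch above, not my actual composer setup.

```js
// Assumed to mirror the fiddle's global `compose` flag; renderer, scene, camera,
// target, postScene and postCamera are the objects from the sketch above.
let compose = true;

function render() {
  requestAnimationFrame(render);
  if (compose) {
    renderer.setRenderTarget(target);      // post-processed path: scene -> target -> depth shader
    renderer.render(scene, camera);
    renderer.setRenderTarget(null);
    renderer.render(postScene, postCamera);
  } else {
    renderer.render(scene, camera);        // post-processing off: draw the original outcome
  }
}
render();
```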