Shader: use the scene's depth map to displace a plane

The plane needs to always sit above the background landscape. How can I use the scene's depth map to displace the geometry?

You can’t access the depth buffer directly. If you want to read it, you have to create a WebGLRenderTarget with a depth attachment, render the scene into it, then pass the target's depth texture as a uniform to a custom shader that does the displacement. You might even get away with using it directly as plane.material.displacementMap = depthTarget.depthTexture.
Your plane may need many subdivisions to get something interesting.
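A minimal sketch of that setup (variable names like depthTarget and the tDepth uniform are illustrative, not from this thread):

```javascript
import * as THREE from 'three';

// Create a render target and attach a DepthTexture to it.
const depthTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
depthTarget.depthTexture = new THREE.DepthTexture( window.innerWidth, window.innerHeight );

// 1. Render the landscape (without the plane) into the target to fill the depth texture.
plane.visible = false;
renderer.setRenderTarget( depthTarget );
renderer.render( scene, camera );
renderer.setRenderTarget( null );
plane.visible = true;

// 2. Feed the depth texture to the plane's custom shader…
planeMaterial.uniforms.tDepth.value = depthTarget.depthTexture;
// …or try it directly as a displacement map:
// plane.material.displacementMap = depthTarget.depthTexture;
```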


Thanks! So I add a camera to the scene and render a depth texture to use? Wouldn't there be a simpler way to get it (like in Unity/Unreal)?

I just re-read your question and I might be overthinking what you need.

You sure you don’t just want:

depthTest: false, depthWrite: false, transparent: true
and mesh.renderOrder = 10;

on your plane? (causing it to always render last, and on top of what was there before)
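In code, that overlay setup would look roughly like this (a sketch, assuming a plane mesh already exists):

```javascript
// Make the plane ignore the depth buffer and draw after everything else,
// so it always appears on top of the landscape.
plane.material.depthTest = false;
plane.material.depthWrite = false;
plane.material.transparent = true;
plane.renderOrder = 10; // higher renderOrder = rendered later
```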

Or are you trying to actually warp a mesh to a depth buffer?

By default in WebGL you can’t access the contents of the main display buffer, only render targets.

Sorry for the late reply. Yes, I am trying to warp the mesh to the depth buffer. Is there an example of the depth target used for displacement?

I'm using a ShaderMaterial, so I will have to implement this in the shader. How can I get the height from the depth texture? The following code seems to return the Z, but I need Y!

  • the camera is a Perpective Camera
  • the Depth Texture has the default values

float readDepth( sampler2D depthSampler, vec2 coord ) {
	float fragCoordZ = texture2D( depthSampler, coord ).x;
	float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
	return viewZToOrthographicDepth( viewZ, cameraNear, cameraFar );
}

A depth texture only stores a single value per pixel… but you can use that value as Y.

Are you reading this in your vertex shader? You would just add this to position.y before you run it through the modelView and projection matrices.
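In the vertex shader that could look something like this (a sketch; tDepth and heightScale are assumed uniform names, and the camera uniforms must be supplied by your material):

```glsl
#include <packing> // provides perspectiveDepthToViewZ / viewZToOrthographicDepth

uniform sampler2D tDepth;
uniform float heightScale;
uniform float cameraNear;
uniform float cameraFar;

float readDepth( sampler2D depthSampler, vec2 coord ) {
	float fragCoordZ = texture2D( depthSampler, coord ).x;
	float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
	return viewZToOrthographicDepth( viewZ, cameraNear, cameraFar );
}

void main() {
	vec3 displaced = position;
	// Use the single-channel depth value as a height offset on Y.
	displaced.y += readDepth( tDepth, uv ) * heightScale;
	gl_Position = projectionMatrix * modelViewMatrix * vec4( displaced, 1.0 );
}
```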

It seems I don't understand what a depth texture is.

A depth texture is just a texture that you happen to store depth information in.

It’s just a name. The texture itself isn’t much different than other textures… except…

it might be a single channel… instead of RGB

or it might be a float channel… whereas textures are often 8-bit RGB

Thanks, so… I should render the depth texture with the camera pointed specifically at what I need, so that depth equates to elevation in my case?
Wouldn't it be easier to save the scene's vertex positions?

I don’t know what you’re building, so it’s hard to say.
I’m not sure what’s hard/easy about rendering a depth buffer.
A shader can't effectively read from an array unless it's written to a texture.
If you have such an array, you can create a single-channel DataTexture from it?

https://threejs.org/docs/#api/en/textures/DataTexture
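For example, packing a height array into a single-channel float DataTexture might look like this (a sketch; the array contents and the tHeight uniform name are illustrative):

```javascript
import * as THREE from 'three';

// heights: one float per grid cell, e.g. sampled vertex Y positions.
const width = 256, height = 256;
const heights = new Float32Array( width * height ); // fill with your data

const heightTexture = new THREE.DataTexture(
	heights, width, height,
	THREE.RedFormat, // single channel
	THREE.FloatType  // full float precision
);
heightTexture.needsUpdate = true;

// Then sample texture2D( tHeight, uv ).r in the shader.
planeMaterial.uniforms.tHeight.value = heightTexture;
```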

Thanks, interesting ideas. I will look into this.