Hello,
I’m building an application for visualizing scientific data. I have a scene set up, along with the essentials (camera, renderer, controls, etc.). I would like to render a real-time depth map of the camera view in a second canvas, and I’m unable to figure out how to do that.
What I would like is for elements on screen at the same depth to have the same color, with some color scheme indicating which objects are closer and which are further away.
How can I accomplish this in threejs? I’ve been stuck on this a while and would really appreciate help!
Thanks!
Depends on which part you got stuck at - there are quite a few places to get blocked when working with depth maps. The basic steps (sketched in code right after this list):
- Render the scene’s depth into a depth texture.
- Pass that depth texture to either a material or a post-processing pass.
- Apply color to the material (or screen pixel, if post-processing) based on the depth value.
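If it helps, here’s a minimal sketch of those three steps, modeled on the official `webgl_depth_texture` example. It assumes you already have `renderer`, `scene`, `camera` and a `width`/`height` for the target; all names here are placeholders:

```js
import * as THREE from 'three';

// 1. A render target with a depth texture attached.
const target = new THREE.WebGLRenderTarget(width, height);
target.depthTexture = new THREE.DepthTexture(width, height);
target.depthTexture.format = THREE.DepthFormat;
target.depthTexture.type = THREE.UnsignedShortType;

// 2. A full-screen quad whose material reads the depth texture.
const postScene = new THREE.Scene();
const postCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
const postMaterial = new THREE.ShaderMaterial({
  uniforms: {
    tDepth: { value: target.depthTexture },
    cameraNear: { value: camera.near },
    cameraFar: { value: camera.far },
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position.xy, 0.0, 1.0); // quad already covers the screen
    }
  `,
  // 3. Color by depth: a simple grayscale ramp, white = near, black = far.
  fragmentShader: /* glsl */ `
    #include <packing>
    varying vec2 vUv;
    uniform sampler2D tDepth;
    uniform float cameraNear;
    uniform float cameraFar;
    void main() {
      float fragCoordZ = texture2D(tDepth, vUv).x;
      float viewZ = perspectiveDepthToViewZ(fragCoordZ, cameraNear, cameraFar);
      float depth = viewZToOrthographicDepth(viewZ, cameraNear, cameraFar); // linear 0..1
      gl_FragColor = vec4(vec3(1.0 - depth), 1.0);
    }
  `,
});
postScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), postMaterial));

// Per frame: render the scene into the target, then render the depth view.
// Kick it off with render(); resize handling (target.setSize) is left out.
function render() {
  renderer.setRenderTarget(target);
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);
  renderer.render(postScene, postCamera);
  requestAnimationFrame(render);
}
```

One note on the “second canvas” part: a WebGL texture can’t be shared between two canvases (they’re separate GL contexts), so a second renderer can’t sample this depth texture directly. Common workarounds are rendering both views into one canvas with `setViewport()`/`setScissor()`, or copying the finished depth view onto a plain 2D canvas with `ctx.drawImage(renderer.domElement, 0, 0)` right after rendering.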
Two important things to take into account (see the shader sketch after this list):
- you have to unpack the “depth color” in your shader to get the actual depth value (there’s a helper function, `unpackRGBAToDepth`, in the built-in `packing` shader chunk that will do that for you.)
- depth is based on the camera’s near/far values; it’s not normalised to the contents of the scene. So if your camera has a large distance between its near and far plane, the depth map will have very small deltas between different depth levels. It’s also non-linear for perspective cameras, so most of the precision sits close to the near plane.
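To make the first caveat concrete: if you can’t use a real `DepthTexture` and instead render the scene with a `MeshDepthMaterial` using `depthPacking: THREE.RGBADepthPacking`, the fragment shader has to reassemble the depth from the four color channels first. A sketch of that shader (same placeholder uniform names as above):

```js
const unpackFragmentShader = /* glsl */ `
  #include <packing> // provides unpackRGBAToDepth() and the depth conversions
  varying vec2 vUv;
  uniform sampler2D tDepth;
  uniform float cameraNear;
  uniform float cameraFar;
  void main() {
    // Reassemble the depth value that was spread across the RGBA channels.
    float fragCoordZ = unpackRGBAToDepth(texture2D(tDepth, vUv));
    // Undo the non-linear (perspective) distribution so equal steps in the
    // output correspond to equal distances between near and far.
    float viewZ = perspectiveDepthToViewZ(fragCoordZ, cameraNear, cameraFar);
    float depth = viewZToOrthographicDepth(viewZ, cameraNear, cameraFar);
    gl_FragColor = vec4(vec3(depth), 1.0);
  }
`;
```

And for the second caveat: tightening `camera.near`/`camera.far` around your actual data is usually the single biggest improvement - the narrower the range, the more distinct the depth levels.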
Thank you so much for your kind answer. Would you be able to guide me with a few code snippets? I’m revisiting this answer and feeling a little clueless about where to begin… sorry, I’m a newbie with three.js.