Rendering a depth map (heat map) using depth buffer information


I’m building an application for visualization of scientific data. I have a scene created, along with the essentials (camera, renderer, controls, etc.). I would like to render a real-time depth map of the camera view in a second canvas, and am unable to figure out how to do that.

What I would like is for elements of the screen at the same depth to have the same color, with some color scheme to indicate objects that are closer and farther respectively.

How can I accomplish this in threejs? I’ve been stuck on this a while and would really appreciate help!


It depends on which part you got stuck at, since there are quite a few places to get blocked when working with depth maps.

  1. You render the depth map to a depth texture.
  2. Pass that depth texture to either a material or a post processing pass.
  3. Apply color to the material (or screen pixel, if postprocessing) based on the depth value.
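Here's a minimal sketch of those three steps, assuming you already have `renderer`, `scene`, `camera`, `width`, `height`, and a second renderer (`depthRenderer`, a name I made up) bound to your second canvas:

```javascript
import * as THREE from 'three';

// Step 1: a render target with an attached depth texture.
const depthTarget = new THREE.WebGLRenderTarget(width, height);
depthTarget.depthTexture = new THREE.DepthTexture(width, height);

// Steps 2-3: a fullscreen quad whose material reads the depth texture.
const quadMaterial = new THREE.ShaderMaterial({
  uniforms: { tDepth: { value: depthTarget.depthTexture } },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position.xy, 0.0, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D tDepth;
    varying vec2 vUv;
    void main() {
      float d = texture2D(tDepth, vUv).x;  // raw (non-linear) depth
      gl_FragColor = vec4(vec3(d), 1.0);   // grayscale; swap in your color ramp
    }
  `,
});
const quadScene = new THREE.Scene();
quadScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), quadMaterial));
const quadCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

function renderDepthView() {
  renderer.setRenderTarget(depthTarget);  // fills depthTarget.depthTexture
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);
  depthRenderer.render(quadScene, quadCamera);  // draw quad to second canvas
}
```

Call `renderDepthView()` from your animation loop. This mirrors the approach in the official `webgl_depth_texture` example, just targeting a second canvas instead of the main one.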

Two important things to take into account:

  1. You have to unpack the packed "depth color" in your shader to get the actual depth value (there are helper functions for this in the built-in three.js `packing` shader chunk, e.g. `unpackRGBAToDepth`, that'll do it for you).
  2. Depth is based on the camera's near and far planes; it's not normalised to the contents of the scene. So if your camera has a large distance between its near and far plane, the depth map may have veeery small deltas between different depth levels.
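For point 1, a fragment shader that unpacks and linearizes depth could look like this (a sketch; the uniform names are my own, and it assumes depth was packed with `RGBADepthPacking`, e.g. via `MeshDepthMaterial`):

```glsl
#include <packing>

uniform sampler2D tDepth;
uniform float cameraNear;
uniform float cameraFar;
varying vec2 vUv;

void main() {
  // unpackRGBAToDepth comes from the <packing> chunk.
  float fragCoordZ = unpackRGBAToDepth(texture2D(tDepth, vUv));
  // Convert to a linear 0..1 value between the near and far planes.
  float viewZ = perspectiveDepthToViewZ(fragCoordZ, cameraNear, cameraFar);
  float depth = viewZToOrthographicDepth(viewZ, cameraNear, cameraFar);
  gl_FragColor = vec4(vec3(1.0 - depth), 1.0); // white = near, black = far
}
```

If you're sampling a `DepthTexture` directly (as opposed to an RGBA-packed depth render), you can skip `unpackRGBAToDepth` and read `texture2D(tDepth, vUv).x` straight into the linearization helpers.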