Using a displacementMap to modify the (top) face of a cube

Hello, I’m probably going about this all wrong, but was hoping I’m at least somewhat on the right path.

Context: I’m building something that will scan the height of an (IRL) object using a machine that can collect height points with a laser distance sensor. I then want to export a representation of that scan as an STL (or some other object format) that I can import into Fusion 360 to offset a CNC path, so I could e.g. carve into the complex face of some object. I was hoping to use three.js to build the digital model (I’d like to do this in a browser, since that makes sharing the tool much easier).

History: I started out trying to just make a plane, use a displacementMap to modify it, and then export the plane as an STL that I could import into Fusion. Unfortunately, that didn’t work at all: the exported mesh was just a flat plane. I figured that STLs really don’t like representing a planar object as anything but a flat plane, so I tried making a cube whose displaced top face represents my plane.

I’ve gotten to the point of building the cube, and at least in the three.js renderer it’s doing mostly what I want: I see a cube with my displaced plane as the top face (codepen: https://codepen.io/nivekmai/pen/VwBBwPq):

However, when I then export an STL and import it in a model viewer, it seems the top plane is totally flat:

I’m guessing this is because the displacementMap I apply to the top face’s Material isn’t actually updating the vertices of the Mesh; it’s just changing how the top face is displayed rather than modifying the geometry of the object. Is there some way to make the displacementMap actually update the vertices?

Also, a side problem: I can’t get the top face to take the color I want; no matter what I do, it’s always pure black. You can see in my codepen that I tried to make the walls and top surface the same color, but the top is just black (hence the need for a white background and using wireframe for the top face).

The black color:

  • your wireframe uses MeshLambertMaterial
  • add some light to the scene, otherwise it will render black
  • set the color in the material, e.g. color: 'red'

The displacement map:

  • you already have an image with heightmap
  • you can modify the vertices themselves in the geometry: use the values from the heightmap or, better, use your getPoints() directly
  • a PlaneGeometry is OK, no need to use BoxGeometry

Here is an example with red color and random heights of vertices:

If any of the steps above are unclear, let me know.

Awesome, thanks for the answer.

So by this you mean I shouldn’t bother with displacementMap, and should instead go in myself and manually update all the points in the geometry, right? Or do you mean I can use getPoints() on my Material to get the points, and then just update the positions in the PlaneGeometry with the Material’s points?

But then: what is the point of the displacementMap? It claims it updates the vertices, but it doesn’t? Or is position different from what a vertex’s position in space is?

If you modify the vertices in the geometry (i.e. you modify them in JavaScript), they can be exported modified.

The displacement map really does modify the vertices … but this happens down in the shader (i.e. on the GPU). It is very fast because the GPU exploits a significant level of parallelism. However, because the modification happens on the GPU, the data that you have (up in JavaScript) is unmodified, and if you export it, it will be exported unmodified.

Ah, that makes sense. Curious then whether there’s a feature request to “export the mesh back to JS data” (obviously you wouldn’t want that to happen automatically, but a simple method to do it would be nice).

There is a way to do this, but I’d not consider it simple. WebGL is optimized to be one-directional, CPU → GPU. Sending data backwards, GPU → CPU, is cumbersome. One approach is to ‘draw’ the data into an off-screen buffer, which can then be read back from JavaScript and decoded. Another approach is to use WebGPU instead of WebGL.

You should not be afraid to modify the vertices in JavaScript. It will take a fraction of a second at run time. You may also consider building the geometry directly, instead of creating a stock geometry and modifying its vertices.

Just posting back here: what PavelBoytchev suggested worked out great. After a few more modifications to get the STL oriented correctly on export, I’ve now got a working version that takes the list of points (soon to come from the machine) and properly builds the plane with correct geometry in the JS layer for exporting. Hopefully my comments are helpful to anyone else coming along with the same question: Snapmaker Laser Scanner · GitHub
