[SOLVED] Can't display displaced BufferGeometry using RaytracingRenderer

I am working on a feature where I can switch between WebGLRenderer and RaytracingRenderer. So far it is working, but with a few issues:

  1. The intensity of lights has to be incredibly high, which means that my lighting looks quite a bit different in one renderer than the other.
  2. My PlaneBufferGeometry, which I displace using its position buffer, shows up fine in the WebGLRenderer but renders as a flat plane in the RaytracingRenderer.

This is how I’m displacing the mesh. I’ll be happy to provide more code, but it’s all becoming hard to separate from the larger class around it, so I’ve pulled out what I imagine would be most helpful.

  initGeometry(v) {
    this.geometry = new THREE.PlaneBufferGeometry(
      v,
      v,
      this.state.detail,
      this.state.detail
    );

    this.initializeMesh();
    this.displaceGeometry();
  }

  initializeMesh() {
    this.material = new THREE.MeshNormalMaterial();

    var mirrorMaterialSmooth = new THREE.MeshPhongMaterial({
      color: 0xffaa00,
      specular: 0x222222,
      shininess: 10000,
      vertexColors: THREE.NoColors,
      flatShading: false
    });
    mirrorMaterialSmooth.mirror = true;
    mirrorMaterialSmooth.reflectivity = 0.3;

    this.mesh = new THREE.Mesh(this.geometry, mirrorMaterialSmooth);
    this.scene.add(this.mesh);
  }

  displaceGeometry() {
    const displacement_buffer = this.elevation.getBufferArray();
    const positions = this.geometry.getAttribute('position').array;
    const uvs = this.geometry.getAttribute('uv').array;
    const count = this.geometry.getAttribute('position').count;
    
    for (let i = 0; i < count; i++) {
      const u = uvs[i * 2];
      const v = uvs[i * 2 + 1];
      const x = Math.floor(u * (this.width - 1.0));
      const y = Math.floor(v * (this.height - 1.0));
      // Row stride of the RGBA buffer is its width, not its height
      const d_index = (y * this.width + x) * 4;
      const r = displacement_buffer[d_index]; // red channel holds the height

      positions[i * 3 + 2] = r * this.amplitude;
    }

    // Only the position attribute was modified; flag it for re-upload
    this.geometry.getAttribute('position').needsUpdate = true;

    // computeVertexNormals() updates the normal attribute itself;
    // computeFaceNormals() is a no-op for BufferGeometry, so it is omitted
    this.geometry.computeVertexNormals();
    this.geometry.computeBoundingBox();
    this.geometry.computeBoundingSphere();

    this.geometry.translate(0, 0, -this.geometry.boundingBox.min.z);
  }

Displaced test mesh

Same mesh through RaytracingRenderer

The raycaster seems to be ignoring any changes to a BufferGeometry, such as:

this.geometry.translate(100, 100, -this.geometry.boundingBox.min.z);

Are there any known issues relating to Raycaster and BufferGeometry? Am I failing to set some necessary update flag?

I’m afraid you are mixing up two different things: the experimental THREE.RaytracingRenderer on the one side and THREE.Raycaster on the other side.

Are you referring to THREE.Raycaster or THREE.RaytracingRenderer? I’m afraid this is not clear from your previous posts.

Thanks for catching that! I am talking about the RaytracingRenderer. The renderer’s internal raycaster is failing to recognize changes to my geometry.

But it does work if you render with THREE.WebGLRenderer, right?


Yep, fully functional in WebGLRenderer. I am using an EffectComposer to generate a displacement map, and then am reading from that texture to displace the geometry on the CPU. This all works as expected in the WebGLRenderer, but when I switch to the RaytracingRenderer, none of the displacement (or even translation) is recognized.

Normally, THREE.RaytracingRenderer should always work with the latest scene object when calling its render method. So I can’t tell you what’s going wrong in your application.

It will be easier to investigate the issue if you can share a live example that demonstrates the issue or your code as a github repository.

@Mugen87 I managed to solve the mystery while preparing an example! It came down to the issue you commented on here (thank you).

I was using PlaneBufferGeometry, which, when serialized by toJSON(), is stripped of its buffer attributes. The solution was to copy the PlaneBufferGeometry into a plain BufferGeometry, and then displace that BufferGeometry.

So, to sum it up: if you are using RaytracingRenderer, parametric geometries like PlaneBufferGeometry, SphereBufferGeometry, etc. serialize only their constructor parameters via toJSON(), so any modified buffer attributes never reach the renderer. The geometry must first be copied into a plain BufferGeometry.

    // PlaneBufferGeometry serializes via toJSON() as just its constructor
    // parameters; copy it into a plain BufferGeometry before displacing it
    let plane = new THREE.PlaneBufferGeometry(
      width,
      height,
      segments,
      segments
    );

    let geometry = new THREE.BufferGeometry();
    geometry.copy(plane);

    this.displaceGeometry();
