Is using Mesh instead of Group bad practice for changing the opacity of a 3D object?

Hello, this question follows Brabbit_640’s topic about changing the opacity of an object group.
I understood that in order to change the opacity of an object group, I have to loop through the group’s children and change the opacity of their materials.
But I was wondering: what if I replace the object group with another mesh?
I could replace this code:

<group ref={ref}>
  <mesh geometry={nodes.Cube001.geometry} material={materials.IPHONE} />
  <mesh geometry={nodes.Cube001_1.geometry} material={materials['iphone-x-screenshot']} />
  <mesh geometry={nodes.Cube001_2.geometry} material={materials.glass} />
</group>

to this:

<mesh ref={ref}>
  <mesh geometry={nodes.Cube001.geometry} material={materials.IPHONE} />
  <mesh geometry={nodes.Cube001_1.geometry} material={materials['iphone-x-screenshot']} />
  <mesh geometry={nodes.Cube001_2.geometry} material={materials.glass} />
</mesh>

Now, using the ref, I can change the opacity of the parent mesh’s material, making all the children fade away.
Do you think this can work, or do you think it’s a bad idea and not optimal?
Thanks in advance for your response.

Mmmm… No! It’s not like CSS!

Materials are independent objects; they do not inherit any property from their parent or their parent’s parent. They aren’t aware of their parent and they don’t need to be.

The best approach in your case is to loop through the materials object and set the opacity, something like the following:

const opacity = 0.5;
Object.values(materials).forEach((material) => {
    material.opacity = opacity;
    material.transparent = true;
});

there must also be a misunderstanding: wrapping meshes in a mesh does nothing. the wrapper has no geometry, it doesn’t render, and its material (most likely some default that three pre-emptively created) isn’t going to affect anything. for all intents and purposes it will behave as a normal group.

is it simply a burden to traverse materials, or do you have other reasons to seek parental opacity? i know that for transitions or cross-fades traversal is pretty bad, because the objects will not fade as one but each on its own, with terrible-looking overlap. if it’s that, there are ways.

it would probably be easier now using ScreenQuad + MeshPortalMaterial:

import { ScreenQuad, MeshPortalMaterial } from '@react-three/drei'

ref.current.opacity = opacity
...
<ScreenQuad>
  <MeshPortalMaterial transparent ref={ref}>
    <mesh geometry={nodes.Cube001.geometry} material={materials.IPHONE}/>
    <mesh geometry={nodes.Cube001_1.geometry} material={materials['iphone-x-screenshot']}/>
    <mesh geometry={nodes.Cube001_2.geometry} material={materials.glass}/>
  </MeshPortalMaterial>
</ScreenQuad>

i don’t know if it makes sense, it would be like an extra render i guess, and you would lose depth. if you have a real-world use case i could try it.

but if it’s really only to cut some boilerplate i wouldn’t bother, just traverse.
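for reference, traversing from the group ref would look roughly like this (a sketch, assuming ref points at the group from your first snippet):

ref.current.traverse((child) => {
  // only meshes carry a material; groups and other nodes are skipped
  if (child.isMesh) {
    child.material.transparent = true
    child.material.opacity = opacity
  }
})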

Hello, thank you for your reply. My question is linked to my other post.
I want to make a dynamic background where multiple 3D objects fade in and fade out randomly.
Kinda like this
I tried using a delay to desync the animations, but they always play at the same time for all the 3D objects.
I came across the gsap library with its timeline object, but it can’t modify the opacity of a 3D object through a single property; I have to traverse all the mesh children and change their material opacity.
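Roughly what I mean, as a sketch (materials comes from useGLTF; the duration and delay values are placeholders):

import gsap from 'gsap'

// make every material of the model tweenable
const mats = Object.values(materials)
mats.forEach((m) => (m.transparent = true))

// gsap accepts an array of targets, so one tween drives all the materials together
gsap.timeline().to(mats, { opacity: 0, duration: 1, delay: Math.random() * 2 })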
Maybe your use of MeshPortalMaterial and ScreenQuad can help me with gsap.

Have you considered using a fog (either the simple linear fog or the exponential fog)? Depending on your overall scene, a fog might be sufficient to make objects pop up and pop away without all the issues caused by transparency.

Hey, that could be a really nice hack, but the background of my page needs to be transparent so I can see the text.
This could solve my problem, but I don’t know how to create the fog. Do you have an example?
That way I would only need to play with the positions of the 3D objects to make them fade in and fade out by entering/exiting the fog.
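Something like this is what I have in mind, as a sketch (the component name, axis and values are placeholders):

import { useRef } from 'react'
import { useFrame } from '@react-three/fiber'

function FloatingObject({ phase = 0, children }) {
  const groupRef = useRef()
  useFrame(({ clock }) => {
    // drift back and forth along z so the object sinks into and out of the fog
    groupRef.current.position.z = Math.sin(clock.elapsedTime + phase) * 10
  })
  return <group ref={groupRef}>{children}</group>
}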

You can see this thread for a material-based distance fog…

The codesandbox doesn’t work; it gives a fetch error.
I also tried to convert it to a TypeScript React component, but I’m struggling.

i updated the box, it should be fine now


You are incredible!
I’m struggling to convert this to TypeScript; here is my code so far:

import { Mesh, Vector3, Matrix4, Color, ShaderMaterial } from 'three';
import { useRef } from 'react';
import { extend, useFrame, useThree } from '@react-three/fiber';
import { shaderMaterial, useFBO } from '@react-three/drei';
import { DirectionalFogProps } from '@/constants/interface';

const FogMaterial = shaderMaterial(
  {
    tDiffuse: null,
    depthTexture: null,
    projectionMatrixInverse: new Matrix4(),
    viewMatrixInverse: new Matrix4(),
    cameraPos: new Vector3(),
    depthColor: new Color('black'),
    fogNormal: new Vector3(0, 1, 0),
    transitionLength: 1.0,
    fogOffset: 0,
    near: 0.025,
    far: 0.1
  },
  ` varying vec2 vUv;
    varying vec2 vTexCoords;
    varying vec4 vWorldCoords;
    void main() {
      vUv = uv;      
      vec4 modelPosition = modelMatrix * vec4(position, 1);
      vec4 viewPosition = viewMatrix * modelPosition;
      vec4 projectionPosition = projectionMatrix * viewPosition;
      vTexCoords = uv;
      vWorldCoords = modelPosition;
      gl_Position = vec4(2.0 * uv - 1.0, 0.0, 1.0);
    }`,
  ` uniform highp sampler2D tDiffuse;
    uniform highp sampler2D depthTexture;
    uniform mat4 projectionMatrixInverse;
    uniform mat4 viewMatrixInverse;
    uniform vec3 cameraPos;
    uniform vec3 depthColor;
    uniform vec3 fogNormal;
    uniform float fogOffset;
    uniform float transitionLength;
    uniform float near;
    uniform float far;
    varying vec2 vUv;
    varying vec2 vTexCoords;
    varying vec4 vWorldCoords;    
    vec3 worldCoordinatesFromDepth(float depth, vec2 vUv) {
      float z = depth * 2.0 - 1.0;
      vec4 clipSpaceCoordinate = vec4(vUv * 2.0 - 1.0, z, 1.0);
      vec4 viewSpaceCoordinate = projectionMatrixInverse * clipSpaceCoordinate;
      viewSpaceCoordinate /= viewSpaceCoordinate.w;
      vec4 worldSpaceCoordinates = viewMatrixInverse * viewSpaceCoordinate;
      return worldSpaceCoordinates.xyz;
    }
    #include <common>
    #include <dithering_pars_fragment>
    void main() {
      vec2 uv = vTexCoords;

      vec4 diffuse = texture2D(tDiffuse, uv);
      float depth = texture2D(depthTexture, uv).x;
      vec3 worldPos = worldCoordinatesFromDepth(depth, uv);
      vec3 rayDir = normalize(worldPos - cameraPos);
      if (depth == 1.0) {
        worldPos = cameraPos + rayDir * 1e6;
      }
      worldPos += fogNormal * -fogOffset;
      vec3 offsetCameraPos = cameraPos + fogNormal * -fogOffset;
      float a = near;
      float b = far;
      float camStartAlongPlane = dot(offsetCameraPos, fogNormal);
      float rayAlongPlane = dot(rayDir, fogNormal);
      float fogAmount =  (a/b) * max(abs(exp(-camStartAlongPlane*b)), 1e-20) * (1.0-exp( -distance(cameraPos, worldPos)*rayAlongPlane*b ))/rayAlongPlane;
      diffuse.rgb = mix(diffuse.rgb, depthColor, 1.0 - exp(-fogAmount));

      gl_FragColor = vec4(diffuse.rgb, 1.0);
      #include <dithering_fragment>
      #include <tonemapping_fragment>
      #include <encodings_fragment>
    }`
)

export function DirectionalFog({
  samples = 8,
  args = [1, 1],
  scale = 10,
  speed = 10,
  color = 'black',
  near = 0.025,
  far = 0.1,
  animate = false,
  ...meshProps
} : DirectionalFogProps) {
  extend({ FogMaterial })
  const meshRef = useRef<ShaderMaterial>(null);
  const camera = useThree((state) => state.camera)
  const target = useFBO({ depth: true, samples })

  const camPos = new Vector3();
  const meshPos = new Vector3();
  const meshDir = new Vector3();
  const meshOffset = new Vector3();
  const zero = new Vector3();

  useFrame((state) => {
    if (meshRef.current) {
        meshRef.current.visible = false
        state.gl.setRenderTarget(target)
        state.gl.render(state.scene, state.camera)
        meshRef.current.visible = true

        meshRef.current.getWorldPosition(meshPos);
        (meshRef.current.material as FogMaterial).cameraPos = camera.getWorldPosition(camPos)
        meshRef.current.material.fogNormal = meshRef.current.getWorldDirection(meshDir)
        //meshRef.current.material.fogOffset = meshOffset.copy(meshPos).length()
        meshRef.current.material.fogOffset = meshOffset.copy(meshPos).distanceTo(zero)
        //console.log(meshRef.current.material.fogOffset)
        state.gl.setRenderTarget(null)
    }
  })
  return (
    <mesh ref={meshRef} rotation-x={-Math.PI / 2} scale={scale} {...meshProps}>
      <planeGeometry args={args} />
      <fogMaterial
        transparent
        depthColor={color}
        near={near}
        far={far}
        tDiffuse={target.texture}
        depthTexture={target.depthTexture}
        projectionMatrixInverse={camera.projectionMatrixInverse}
        viewMatrixInverse={camera.matrixWorld}
        cameraPos={camera.position}
        defines={{ WAVES: animate }}
      />
    </mesh>
  )
}

It does not recognize fogMaterial as being part of type JSX.IntrinsicElements.
I also struggle with meshRef: I should type it as Mesh for the ref prop of the mesh, but if I do that I can’t access the custom properties cameraPos, fogNormal and fogOffset on the material object of meshRef.
What should I do?
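From what I can tell, the usual fix is to declare the element yourself and keep a separate, typed ref for the material, something like the sketch below (the prop and field lists are just my guesses, mirroring the uniforms), but I’m not sure it’s the right way:

import { ShaderMaterial, Texture, Matrix4, Vector3, Color } from 'three'

// instance type of the generated material, with the custom uniforms added
type FogMaterialImpl = ShaderMaterial & {
  cameraPos: Vector3
  fogNormal: Vector3
  fogOffset: number
}

declare global {
  namespace JSX {
    interface IntrinsicElements {
      // expose the custom uniforms as optional JSX props
      fogMaterial: JSX.IntrinsicElements['shaderMaterial'] & {
        tDiffuse?: Texture | null
        depthTexture?: Texture | null
        projectionMatrixInverse?: Matrix4
        viewMatrixInverse?: Matrix4
        cameraPos?: Vector3
        depthColor?: Color | string
        fogNormal?: Vector3
        fogOffset?: number
        near?: number
        far?: number
      }
    }
  }
}

// then two refs: useRef<Mesh>(null) for the <mesh>, and useRef<FogMaterialImpl>(null) passed to
// <fogMaterial ref={...}>, so the useFrame code can read cameraPos, fogNormal and fogOffset without casts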

I just realised: why can’t I just use ThreeElements.fog instead of the complicated approach from this thread?
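Something like this is what I mean, as a sketch (the colour and distances are placeholders), although built-in fog blends objects toward the fog colour rather than making them transparent, so I’m not sure how it plays with my transparent background:

import { Canvas } from '@react-three/fiber'

export function Background() {
  return (
    <Canvas>
      {/* built-in linear fog: colour, near distance, far distance */}
      <fog attach="fog" args={['#000000', 5, 15]} />
      {/* exponential variant: <fogExp2 attach="fog" args={['#000000', 0.15]} /> */}
      {/* ...objects that drift in and out of the fog... */}
    </Canvas>
  )
}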