Rendering pictures captured by the camera as a texture (React Native and React Three Fiber)

Hello, I have been struggling with this one for a while. I want to capture images with the camera using expo-camera and render them as a texture, but so far all I am getting is a black texture or an error. Here is the current code:

const captureImage = async () => {
  if (cameraRef.current) {
    // Take the picture and return its URI; base64 is included in the result
    const photo = await cameraRef.current.takePictureAsync({
      base64: true,
      shutterSound: false,
      skipProcessing: true,
    });

    return photo?.uri;
  }
};

function ImageT({ src }: { src: string }) {
  try {
    // Load the captured image as a three.js texture
    const texture = useLoader(TextureLoader, src);
    console.log(texture);

    return (
      <mesh>
        <planeGeometry attach="geometry" args={[2, 2]} />
        <meshBasicMaterial
          side={THREE.DoubleSide}
          attach="material"
          map={texture}
        />
      </mesh>
    );
  } catch (e) {
    // Swallow load errors and render nothing
    console.log(e);
    return null;
  }
}

You need to share more code; there’s too much missing to get an idea of what’s going wrong. Ideally in a live example on CodePen / JSFiddle.

For debugging, I would also add the generated image to the DOM, so you can figure out whether the problem is with taking the picture or with rendering it in your 3D scene.

Ah I didn’t realise it was React Native.
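In React Native terms, the same debug step would be rendering the captured URI in an <Image> component before involving three.js at all. A minimal sketch, assuming the URI is kept in state (the names here are illustrative, not from your code):

import React from "react";
import { Image } from "react-native";

// Render the captured photo directly so you can verify the capture step
// on its own. `uri` would come from captureImage().
function DebugPreview({ uri }: { uri: string }) {
  return (
    <Image
      source={{ uri }}
      style={{ width: 200, height: 200 }}
      resizeMode="contain"
    />
  );
}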

Since there’s no problem with the image itself, the issue could be with your app’s React lifecycle.

What does the console show when you log the texture? What does it show if you try to load the src prop passed to the ImageT function?

If you pass the loader a base64 string that you’ve created beforehand, or an external URL, does it work? E.g.
const texture = useLoader(TextureLoader, aBase64String);
const texture = useLoader(TextureLoader, anImageUrl);
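For instance, a minimal sketch of that test (the URL is just a placeholder, swap in any image you know loads):

// Hard-code a known-good source to take your capture/state flow
// out of the equation entirely.
function TextureTest() {
  const texture = useLoader(TextureLoader, "https://example.com/test.jpg");
  return (
    <mesh>
      <planeGeometry args={[2, 2]} />
      <meshBasicMaterial map={texture} side={THREE.DoubleSide} />
    </mesh>
  );
}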

If it does, then you need to look into your state / props management.
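A rough sketch of what that might look like (assuming the newer expo-camera CameraView API; the component and variable names are mine, not from your code): keep the captured URI in state, and only mount ImageT once it exists. Note that useLoader suspends while the texture loads, so a Suspense boundary is the usual pattern rather than a try/catch:

import React, { Suspense, useRef, useState } from "react";
import { CameraView } from "expo-camera";
import { Canvas } from "@react-three/fiber/native";

function CaptureScene() {
  const cameraRef = useRef<CameraView>(null);
  const [photoUri, setPhotoUri] = useState<string | null>(null);

  // Wire this up to a button press or similar
  const captureImage = async () => {
    const photo = await cameraRef.current?.takePictureAsync({
      skipProcessing: true,
    });
    // Storing the URI in state triggers a re-render with the new src
    if (photo?.uri) setPhotoUri(photo.uri);
  };

  return (
    <>
      <CameraView ref={cameraRef} style={{ flex: 1 }} />
      <Canvas>
        {/* Mount ImageT only once a URI exists; Suspense covers the async load */}
        <Suspense fallback={null}>
          {photoUri && <ImageT src={photoUri} />}
        </Suspense>
      </Canvas>
    </>
  );
}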