My Screenshot saver doesn't apply any Effect (from EffectComposer)

I have a WebGL widget (built with R3F) that lets the user take a screenshot, which is then saved and can be displayed elsewhere on the page.
I do this by grabbing the RootState of the current scene, creating a brand-new WebGLRenderTarget, and rendering the scene into it. This mostly works alright (all geometry and lights in the room come through), however, none of the Effects in the scene are applied. I don’t know how to grab these effects from the RootState and render them (in subsequent passes?) onto the new render target I created.

I have a minimum working example here: https://codesandbox.io/p/sandbox/effect-screenshot-clpfrv

The relevant screenshot rendering happens in Widget::getCurrentImage().

import React from "react";

import { Canvas, RootState } from "@react-three/fiber";
import { EffectComposer, Bloom } from "@react-three/postprocessing";
import { OrbitControls } from "@react-three/drei";
import { Color, WebGLRenderTarget } from "three";

interface IWidgetProps {}

interface IWidgetState {
  rootState: RootState | null;
}

class Widget extends React.Component<IWidgetProps, IWidgetState> {
  constructor(props: IWidgetProps) {
    super(props);

    this.state = {
      rootState: null,
    };
  }

  public getCurrentImage(): string | null {
    const WIDTH = 512;
    const HEIGHT = 512;

    const gl = this.state.rootState?.gl;
    const camera = this.state.rootState?.camera;
    const scene = this.state.rootState?.scene;

    if (gl && scene && camera) {
      const renderTarget = new WebGLRenderTarget(WIDTH, HEIGHT);
      gl.setRenderTarget(renderTarget);
      gl.render(scene, camera);

      // readRenderTargetPixels needs a plain Uint8Array here, and WebGL
      // hands the rows back bottom-to-top, so flip vertically while
      // copying into the ImageData (which is top-to-bottom).
      const buffer = new Uint8Array(WIDTH * HEIGHT * 4);
      gl.readRenderTargetPixels(renderTarget, 0, 0, WIDTH, HEIGHT, buffer);
      gl.setRenderTarget(null);
      renderTarget.dispose();

      const imageData = new ImageData(WIDTH, HEIGHT);
      const rowBytes = WIDTH * 4;
      for (let y = 0; y < HEIGHT; y++) {
        imageData.data.set(
          buffer.subarray(y * rowBytes, (y + 1) * rowBytes),
          (HEIGHT - 1 - y) * rowBytes
        );
      }

      const canvas = document.createElement("canvas");
      canvas.width = WIDTH;
      canvas.height = HEIGHT;
      const ctx = canvas.getContext("2d");
      ctx?.putImageData(imageData, 0, 0);
      return canvas.toDataURL("image/png");
    }
    return null;
  }

  render() {
    return (
      <Canvas
        className="canvas"
        // Store the RootState on creation so getCurrentImage() can reach
        // the renderer, scene and camera later.
        onCreated={(renderState: RootState) =>
          this.setState({
            ...this.state,
            rootState: renderState,
          })
        }
        style={{
          width: "80vw",
          height: "80vw",
          background: "black",
        }}
      >
        <OrbitControls />
        <ambientLight intensity={0.95} />
        <pointLight position={[3, 2, 0]} intensity={0.5} />
        <mesh rotation={[0, 10, 0]}>
          <boxGeometry attach="geometry" args={[1, 1, 1]} />
          <meshStandardMaterial
            attach="material"
            color={new Color(5, 3, 1.5)}
          />
        </mesh>
        <EffectComposer>
          <Bloom
            mipmapBlur={true}
            luminanceThreshold={1}
            levels={8}
            intensity={1}
          />
        </EffectComposer>
      </Canvas>
    );
  }
}

export default Widget;

The Widget here will render the following:

Whereas the saved-out image looks the same, but without the applied effect (Bloom in this case):

You’re adding postprocessing as JSX, which has its consequences: to apply any of that postprocessing, you’d need to use JSX to render the screenshot as well. It’s a slight misconception introduced by react-three-fiber — postprocessing is not an element in the scene, even if it’s added there via JSX. In vanilla three.js (used by R3F behind the scenes), postprocessing effectively replaces the renderer (the same caveat applies when you add an environment to your scene via the Environment component).
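One way to make the screenshot go through the same effect chain is to get hold of the composer instance itself: @react-three/postprocessing forwards a ref to the underlying `postprocessing` EffectComposer. You could then resize it, render one frame, and read the pixels straight off the canvas. A rough sketch — untested, and the ref-forwarding behaviour plus the restore-size dance are assumptions worth verifying against your library versions:

```tsx
import * as THREE from "three";
import { EffectComposer as EffectComposerImpl } from "postprocessing";

// composerRef comes from useRef<EffectComposerImpl>(null) in the component
// that renders <EffectComposer ref={composerRef}>...</EffectComposer>.
function screenshotViaComposer(
  gl: THREE.WebGLRenderer,
  composer: EffectComposerImpl,
  width: number,
  height: number
): Uint8Array {
  // Remember the live canvas size so it can be restored afterwards.
  const prevSize = gl.getSize(new THREE.Vector2());

  gl.setSize(width, height, false);
  composer.setSize(width, height);
  composer.render(0); // renders the scene plus all passes to the canvas

  // Read the freshly drawn buffer back in the same task, before the
  // browser has a chance to clear the drawing buffer.
  const ctx = gl.getContext();
  const pixels = new Uint8Array(width * height * 4);
  ctx.readPixels(0, 0, width, height, ctx.RGBA, ctx.UNSIGNED_BYTE, pixels);

  gl.setSize(prevSize.x, prevSize.y, false);
  composer.setSize(prevSize.x, prevSize.y);
  return pixels; // rows come back bottom-to-top; flip before use
}
```

Note the pixels are read from the default framebuffer right after the render, so `preserveDrawingBuffer` shouldn’t be needed as long as everything happens in one synchronous task.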

If you’d like the entire JSX tree to be taken into account, RenderTexture from drei could be helpful.
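For reference, RenderTexture portals a secondary scene into a texture, and its `frames` prop can limit how often it re-renders (e.g. `frames={1}` renders it only once). A minimal sketch — whether an EffectComposer inside the portal actually composes into the texture target is something to verify:

```tsx
import { RenderTexture, PerspectiveCamera } from "@react-three/drei";

// A plane whose material samples a texture rendered from a nested scene.
<mesh>
  <planeGeometry args={[1, 1]} />
  <meshBasicMaterial>
    {/* frames={1} renders the inner scene once instead of every frame */}
    <RenderTexture attach="map" width={512} height={512} frames={1}>
      <PerspectiveCamera makeDefault position={[0, 0, 5]} />
      <color attach="background" args={["black"]} />
      {/* scene contents (and possibly the EffectComposer) go here */}
    </RenderTexture>
  </meshBasicMaterial>
</mesh>
```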


Can you use renderer.preserveDrawingBuffer = true and just gl.domElement.toDataURL() to grab the current buffer?

@manthrax that’s a fair suggestion, but no: I specifically want to save this picture at a different (fixed, predefined) resolution, so just saving the drawing buffer doesn’t work here. :confused:

@mjurczyk thanks for the explanation and the RenderTexture suggestion… I’m not 100% sure how this would work. I think what you mean is that it would be part of the scene, correct? Meaning it would render every frame? That would be an option, but it sounds like bad performance for something I only need at a few moments.
Is there no way to get the postprocessing from somewhere in the root and apply it to the screenshot’s render target (as in the example above)? Or, in the worst case, copy/instance the whole EffectComposer from somewhere and apply it just for the screenshot? I’m not quite sure how to write that, though, if it’s even possible.
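For the "copy/instance the whole EffectComposer" route, one possibility is to build a throwaway composer from the vanilla `postprocessing` package just for the screenshot, duplicating the effect settings from the JSX by hand. A sketch, not tested — the BloomEffect option names mirror the JSX props above, but double-check them against your `postprocessing` version:

```tsx
import * as THREE from "three";
import {
  EffectComposer,
  RenderPass,
  EffectPass,
  BloomEffect,
} from "postprocessing";

function screenshotWithThrowawayComposer(
  gl: THREE.WebGLRenderer,
  scene: THREE.Scene,
  camera: THREE.Camera,
  width: number,
  height: number
): Uint8Array {
  // Duplicate the configuration of the JSX <Bloom /> by hand.
  const bloom = new BloomEffect({
    mipmapBlur: true,
    luminanceThreshold: 1,
    levels: 8,
    intensity: 1,
  });

  const composer = new EffectComposer(gl);
  composer.addPass(new RenderPass(scene, camera));
  composer.addPass(new EffectPass(camera, bloom));

  const prevSize = gl.getSize(new THREE.Vector2());
  gl.setSize(width, height, false);
  composer.setSize(width, height);
  composer.render(0); // final pass draws to the canvas

  // Read back immediately, in the same task as the render.
  const ctx = gl.getContext();
  const pixels = new Uint8Array(width * height * 4);
  ctx.readPixels(0, 0, width, height, ctx.RGBA, ctx.UNSIGNED_BYTE, pixels);

  gl.setSize(prevSize.x, prevSize.y, false);
  composer.dispose();
  return pixels; // rows come back bottom-to-top
}
```

The obvious downside is keeping the duplicated effect settings in sync with the JSX, which is why getting at the live composer via a ref is probably the nicer option.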