Need help understanding an issue in using the SSR effect


I’m very new to 3D modeling and UI and wanted to ask for help understanding an issue I’m seeing while using the SSR (screen space reflections) effect. For full context on my setup, I’ve taken the following steps so far to generate the model and render it in a simple React + Vite app using react-three-fiber:

  1. Modeled the object using Blender. The model looks like the following (screenshot from Blender using the Eevee render mode):

  2. Exported the model to glb/gltf format.

  3. Used gltfjsx to transform the exported model from #2 into JSX.

  4. Applied the Bloom effect to the desired meshes by modifying the generated JSX from #3. The following is an example of a modified mesh:

<mesh name="Plane007_2" geometry={nodes.Plane007_2.geometry} material={materials['Material.007']} material-toneMapped={false} material-emissiveIntensity={4} />

  5. Applied the SSR effect from @react-three/postprocessing.
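For anyone wondering why the Bloom step above works: with `luminanceThreshold={1}`, the bloom pass only picks up pixels brighter than 1, which a material with `emissiveIntensity={4}` and tone mapping disabled can produce. A minimal sketch of that threshold logic (an illustration only, not the actual shader):

```javascript
// Simplified luminance threshold check, mirroring Bloom's
// luminanceThreshold={1} from the setup (illustrative only).
function passesBloomThreshold(luminance, threshold = 1) {
  return luminance > threshold;
}

console.log(passesBloomThreshold(0.8)); // false: an ordinary tone-mapped surface
console.log(passesBloomThreshold(4));   // true: the emissiveIntensity={4} mesh blooms
```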

The following snippet outlines the details of the Canvas I’m using:

import { EffectComposer, Bloom, ToneMapping, SSR } from '@react-three/postprocessing'

<Canvas gl={{ antialias: false }} shadows camera={{ position: [20, 20, 20] }}>
  <Suspense fallback={null}>
    {/* position={[-10, 20, -10]} rotation={[-45, 0, 0]} (props of an element trimmed in the original paste) */}
    <ambientLight intensity={0.5} />
    <directionalLight position={[-50, 0, -40]} intensity={0.7} />
    <CameraControls />
    <AccumulativeShadows position={[0, -2.24, 0]} temporal frames={100} alphaTest={0.9} opacity={1.5} scale={12}>
      <RandomizedLight amount={8} radius={2} ambient={0.5} intensity={1} position={[-2.5, 5, 0.5]} bias={0.001} />
    </AccumulativeShadows>
    <EffectComposer disableNormalPass multisampling={4}>
      <SSR {...props} />
      <Bloom mipmapBlur luminanceThreshold={1} intensity={0.5} />
      <ToneMapping adaptive resolution={256} middleGrey={0.4} maxLuminance={16.0} averageLuminance={1.0} adaptationRate={1.0} />
    </EffectComposer>
  </Suspense>
</Canvas>
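Since `<SSR {...props} />` spreads a props object that isn’t shown, here is a sketch of what such an object might look like. The parameter names are taken from the old SSR effect’s demos; the exact names and defaults vary by version, and every value here is an illustrative assumption, not a recommendation:

```javascript
// Illustrative SSR props object (names assumed from the old
// @react-three/postprocessing SSR demos; verify against your version).
const ssrProps = {
  temporalResolve: true,    // blend with previous frames (the source of ghosting trails)
  temporalResolveMix: 0.9,  // how much history is kept; lower values mean less ghosting
  intensity: 1,
  maxRoughness: 0.1,
  blurMix: 0.5,
  thickness: 10,
  rayStep: 0.1,
  MAX_STEPS: 20,
  maxDepthDifference: 3,
};

// Usage in JSX: <SSR {...ssrProps} />
```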

My goal is to add postprocessing in three.js to achieve as much parity as possible between the three.js and Blender renders, so I enabled SSR to make the three.js render look more realistic by adding screen space reflections, which are currently enabled in Blender.

After enabling SSR, I’m noticing a weird artifact where trails of “light particles” remain when changing the zoom or rotating the viewport. Specifically, I’ve noticed the artifact when I either 1) zoom in very close to the emission sources (e.g. the door and windows) then zoom out, or 2) rotate the viewport to face beneath the plane. Please see the included screenshots below for examples. The former leaves a trail behind the building and the latter leaves one below it. I’m a bit lost and wanted help from experts in understanding what could be going wrong. Thanks a lot for any help in advance!
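For what it’s worth, trails like these are characteristic of temporal accumulation: the effect blends each new frame with a history buffer, so when the camera moves, stale bright samples fade out over several frames instead of disappearing instantly. A scalar sketch of that blend (an illustration of the general technique, not the effect’s actual shader):

```javascript
// Exponential history blend used by temporal resolve, reduced to a
// single scalar pixel (illustrative; the real effect also reprojects).
function temporalResolve(current, history, historyMix) {
  return current * (1 - historyMix) + history * historyMix;
}

// A bright emissive pixel (value 4) leaves the view, so `current`
// drops to 0, but with a high history mix the old value lingers:
let history = 4;
const trail = [];
for (let i = 0; i < 5; i++) {
  history = temporalResolve(0, history, 0.9);
  trail.push(Number(history.toFixed(3)));
}
console.log(trail); // [ 3.6, 3.24, 2.916, 2.624, 2.362 ]
```

With a history mix of 0.9, the stale value is still over half its original brightness four frames later, which reads on screen as exactly the kind of trail described above.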

Example 1 -

Example 2 -

Leaving another screenshot capturing more examples in case it helps:

  1. The light from the lamp post creates an unexpected cylinder of light particles when rotating the camera around.
  2. Similar to the earlier examples, traces of light particles are left in the background as the camera zooms in.

Thanks! :slight_smile:

The SSR effect is not finished; it will be replaced by realism-effects. The author is currently fighting the last remnants of ghosting. You can track the status on his Twitter.

The last demos I have seen looked really promising; it’s night and day compared to the early ssr library you’re using. But we’re all waiting for the release. :grin:

I see, thanks a lot for the details. realism-effects looks very cool! I’ll be eagerly waiting for the release :smiley:

Are there any currently available alternative approaches you’d recommend looking into while waiting for the release? I’m wondering if experimenting with different materials in Blender could give better results when using the “early” SSR library. Not sure if I’m thinking in the right direction there. Any advice would be very appreciated :pray:

The older ssr versions were OK, but the later updates introduced a lot of ghosting again.

Ah bummer. Guess I can take a look at how the older library is approaching it and experiment from there. Out of curiosity, is it common for version updates in threejs to break existing postprocessing effects? Perhaps more complex effects such as SSR are less future proof?

i think the new stuff will finally do it, but the older attempts changed drastically from version to version. getting this right across different hardware has been a real challenge, but yannic is almost done it seems.

Although postprocessing is allegedly more maintained than the native three post effects, it may be best to fall back to native three SSR: the three examples and library addons cover 99.9% of use cases, are tried and tested across a diverse set of hardware and devices, are stable, and simply work…
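For anyone who wants to try that route, here is a sketch of wiring three’s built-in SSRPass. The import paths, constructor options, and tuning fields are taken from the three.js examples and may differ across releases, so treat this as an assumption to verify against your three version (it is browser-only code and the values are illustrative):

```javascript
// Sketch: native three.js SSR via SSRPass from the examples
// (browser-only; option names assumed from the three.js example code).
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { SSRPass } from 'three/examples/jsm/postprocessing/SSRPass.js';

function createSsrComposer(renderer, scene, camera, selects) {
  const composer = new EffectComposer(renderer);
  const ssrPass = new SSRPass({
    renderer,
    scene,
    camera,
    width: window.innerWidth,
    height: window.innerHeight,
    selects, // array of meshes that should reflect; null selects everything
  });
  ssrPass.thickness = 0.018; // illustrative tuning values
  ssrPass.maxDistance = 0.1;
  composer.addPass(ssrPass);
  return composer;
}

// Then, in the render loop, call composer.render() instead of renderer.render().
```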

Sounds good on the advice to fall back on native three ssr. Will take a look at that and experiment in the meantime. Thanks for the guidance guys :slight_smile:

that effect imo runs very, very slow. i don’t mean to downplay the efforts made, it’s just that yannics effect is considerably faster. older releases are fine; the new stuff (global illumination) will take some time.


it would imo help threejs a lot because the current model is clearly holding it back. people don’t use effects because with jsm/ec even adding one or two passes halves the framerate, as it renders the full scene for each pass. :hot_face:
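To put a number on that claim: if every pass re-renders the full scene, frame time scales roughly linearly with the pass count, so a single extra full-scene pass halves the framerate. A toy cost model (an illustrative assumption, not a profile of any real engine):

```javascript
// Naive cost model: each full-scene pass costs the base frame time again.
function estimatedFps(baseFrameMs, fullScenePasses) {
  return 1000 / (baseFrameMs * fullScenePasses);
}

console.log(estimatedFps(8, 1)); // 125  (an 8 ms scene on its own)
console.log(estimatedFps(8, 2)); // 62.5 (one extra pass: framerate halved)
```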


Yeah, I’m just suggesting there could be fallback flags set. The native three SSR can be slow with dense meshes, I agree, but of course at this stage it’s a weigh-up between speed and multi-platform bugs…

I did see doob’s post a few weeks ago; it would be amazing to see a stable incorporation of the postpro lib, but as of yet I’ve not seen any viable PRs or merges as such. Hopefully soon? It would be amazing and exciting to see for sure!

fwiu three’s examples tend to lean towards (cross-platform) stability rather than speed. The work going into the postpro libs and such is amazing, and I agree it’s a shame there’s not yet been the push required to make these libs watertight for cross-compatibility stability…

You should be lobbying browser makers and Apple specifically if you expect cross-platform stability from anyone, whether an open-source volunteer or a mega corporation. IDK where that narrative comes from, but it’s plain wrong and disconnected from the graphics landscape. If you encounter a device issue, you’re expected to take it upstream.

In an ideal world, maybe, but with Apple specifically, what works today may well break tomorrow due to the unstable updates they make, sometimes breaking some of the simplest graphical processes…

I’m not sure about this statement; it seems a bit naive. Suggesting a fallback is not wrong in terms of web graphics, and what you’ve suggested in terms of taking things upstream has been for the most part impossible in some cases. Making tools and plugins that function under the lowest cross-compatibility circumstances is the difference between things working and not working. For example, up until very recently the memory limit of the iOS Safari browser was no more than 384MB (just to clarify: 384MB!), with next to no previous intention to increase that limit. Good luck getting cross-compatible solutions for medium- to high-memory processes such as a few 4K textures in a shader, which would otherwise run fluidly on any Android or non-Apple browser :thinking:
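For a sense of scale on that 384MB figure: a single uncompressed 4K RGBA8 texture is 64MiB before mipmaps, so a handful of them alone approaches the budget. A quick back-of-envelope calculation (assuming uncompressed RGBA8 with a full mip chain; compressed formats would be much smaller):

```javascript
// Back-of-envelope GPU memory for one uncompressed 4K RGBA8 texture.
const baseBytes = 4096 * 4096 * 4;      // 67108864 bytes = 64 MiB
const withMips = (baseBytes * 4) / 3;   // a full mip chain adds about 1/3
const mib = withMips / (1024 * 1024);   // ~85.33 MiB per texture

console.log(baseBytes / (1024 * 1024)); // 64
console.log(Math.round(4 * mib));       // 341 -- four textures vs. a 384 MB limit
```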

Cross compatibility is the wrong term here; this isn’t about implementing or using new features but about ensuring the stability of the browser under ANGLE and, in some cases, the OS. It’s a moving target even for previously stable API surfaces. I don’t see how that’s naïve if you’ve worked in graphics for any amount of time, whether web or native. If you do anything more than a basic pixel shader, you’ve been snagged by this several times over the last year, with increasing frequency. Pinning this on anyone other than browser/OS makers is simply misinformed; everything in question is well within device limits and spec guarantees. You can ask myself, 0beqz, gkjohnson, mrdoob, or anyone who actually maintains this stuff how many times this has come up this week in triage, versus this month, versus this year. We can’t fix driver behavior in three.js, and many workarounds aren’t accessible from the perspective of WebGL, so you can only take this upstream, as you’re expected to. This affects everybody, not just the postprocessing or three.js libraries, and if you want things to improve you have to do your due diligence, otherwise we may never know about it – browser makers surely aren’t.


Although I agree with your sentiment in an ideal world, I think the point still stands that…

Case in point, and as ambiguous a reason for a reversion as any: the following was reverted because fog bugged out on a 7-year-old device, with little to no actual evidence that it doesn’t work, because again, it’s the difference between something working across the board and working on one person’s setup…

Yes, it’s good to be as ambitious as possible with effects and addons, but if the idea breaks somewhere along the line, it’s probably best to weigh up whether it’s intended for complete inclusivity or as a fun, exclusive effect/addon for a select few who use the same setup. Not to say I’m disagreeing with you, but from what you’re saying, until something works or is fixed upstream, a product is practically useless in terms of its broadest reach? At least that’s my interpretation of what you’re saying…

I don’t think you understood what I said at either point, nor do I know what you want here in being what I can only guess is contrarian, but if there’s anything to take away, it’s to tell people, whether library maintainers or vendors, when you encounter a problem. Had that happened for this thread specifically, it might have been resolved to begin with, and it would have hardened the stack all around should it be a known issue. Some issues are commonplace enough for there to be something to revert or refactor, and others are completely project- or ecosystem-killing, where there is no agency. I trust that nobody will know the difference as a user at any nontrivial amount of complexity, and considering how temporally inconsistent this is, we maintainers usually have to work together to get more eyes on an issue, since it’s usually not isolated. This only happens when there is user feedback, because yes, of course we do due diligence in device testing when investing years into a project, but that doesn’t guarantee stability or even feasibility in the future.

Yes, you said to lobby browser devs to have issues resolved…

As I said, I didn’t disagree with your sentiment, so I’m not sure what you’re referring to here. I simply provided examples of how three usually manages its library and said that it’s OK to provide a fallback if something doesn’t work from one device specification to the next, which you’ve told me is outright wrong, delusional, and misinformed, which I personally don’t think is correct.

I’m pretty sure that’s what the OP is/was reporting, or trying to report, right? Instead of finding out whether the cross-browser issue in the SSR effect has been resolved and providing details of the fix, you’ve chosen to basically say “reporting it upstream and waiting is better than a fallback or nothing at all”, which seems counterproductive.

Anyone should report issues on GitHub or to vendors (or both; we’ll do so on your behalf) so the people who actually do the work in question can see them. That is the proper place for such issues. This isn’t a matter of sentiment or opinion but common sense about how to resolve an issue. You specifically have been quite unhelpful in that, and I chose to comment to redirect the issue. You’ve had the audacity to belittle and debate the people who do this work (for free), to the benefit of no one. I’ll entertain this no further.