I came across an unusual issue when animating a PointLight’s intensity with a fixed decay value. At random, the light renders without the correct decay applied and appears much brighter than it should.
What’s really unusual is that the incorrect rendering is consistent across browsers: when the issue appears, the same scene shows the same (wrong) brightness in Safari, Chrome, and Firefox. But if I load it in a virtual machine or on a different computer, it renders at a different brightness.
Sometimes a computer that renders incorrectly one day will render correctly on another, but I can’t find any consistent reason why it works sometimes.
When the issue appears, the decay and distance values seem to be ignored at render time; changing them at runtime makes no difference. I believe these values are passed to the shaders as uniforms, so could the GPU be compiling and caching the shaders incorrectly? If so, is there a way to clear the GPU shader cache and force a rebuild?
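I don’t think the driver’s shader cache can be cleared from JavaScript, but as a debugging step I can at least force three.js to rebuild its shader programs. A rough sketch, assuming the usual scene/camera/renderer setup (`forceProgramRebuild` is just a name I made up):

```ts
import * as THREE from 'three';

// Flag every mesh material so three.js recompiles its shader program,
// then optionally pre-compile everything instead of waiting for the next frame.
function forceProgramRebuild(
  scene: THREE.Scene,
  camera: THREE.Camera,
  renderer: THREE.WebGLRenderer
): void {
  scene.traverse((obj) => {
    const mesh = obj as THREE.Mesh;
    if (!mesh.isMesh) return;
    const materials = Array.isArray(mesh.material) ? mesh.material : [mesh.material];
    // needsUpdate makes the renderer rebuild the material's program on the next render
    materials.forEach((m) => { m.needsUpdate = true; });
  });
  renderer.compile(scene, camera);
}
```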
I’m using a couple of standard PointLights with decay set to 2, animating their intensity, and running the scene on different machines. I don’t know what causes it to break, but I’m wondering if anyone has seen this issue or has any suggestions on how to fix it, or how to debug it to see what’s going wrong.
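Here’s a stripped-down sketch of the setup. The geometry and exact values are placeholders, but the lights match what I described: two PointLights with decay 2, one with its intensity animated every frame.

```ts
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(0, 2, 5);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// PointLight(color, intensity, distance, decay) — decay fixed at 2 for both lights
const animatedLight = new THREE.PointLight(0xffffff, 1, 10, 2);
animatedLight.position.set(2, 2, 0);
scene.add(animatedLight);

const staticLight = new THREE.PointLight(0xffaa66, 1, 10, 2);
staticLight.position.set(-2, 2, 0);
scene.add(staticLight);

// Something for the lights to fall off across
const floor = new THREE.Mesh(
  new THREE.PlaneGeometry(20, 20),
  new THREE.MeshStandardMaterial({ color: 0x808080 })
);
floor.rotation.x = -Math.PI / 2;
scene.add(floor);

const clock = new THREE.Clock();
renderer.setAnimationLoop(() => {
  // Only the intensity is animated; decay and distance never change
  animatedLight.intensity = 0.5 + 0.5 * Math.sin(clock.getElapsedTime() * 2);
  renderer.render(scene, camera);
});
```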
What could cause a PointLight to ignore its decay and distance values, even when they’re changed at runtime? Lights without animated intensity seem to render fine in the same scene.
It looks like it’s an issue with the GPU. My computer has both Intel integrated graphics and a dedicated Nvidia GPU (GT 650M). The lights render correctly on the Intel GPU but not on the Nvidia one. I found an app to switch between them manually, and I can now reproduce the problem consistently.
This explains why it was happening at random: the system switches GPUs depending on load.
I tried switching the renderer to use the physicallyCorrectLights property and that seems to render correctly on the Nvidia GPU. I have to change my light intensities when I use it: with decay 1 I need roughly 10x the intensity, and with decay 2 roughly 100x.
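For reference, this is roughly the change, as a sketch. It assumes the three.js version I’m on still exposes physicallyCorrectLights on WebGLRenderer (I believe newer releases replaced it with a useLegacyLights flag), and the intensity values are just examples from my scene:

```ts
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ antialias: true });
// Switch to the physically based lighting path
renderer.physicallyCorrectLights = true;

// Intensities have to be scaled way up to compensate:
// roughly 10x for decay = 1, roughly 100x for decay = 2 in my scene.
const light = new THREE.PointLight(0xffffff, /* intensity */ 100, /* distance */ 10, /* decay */ 2);
```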
I wonder why the Nvidia GT 650M doesn’t respect the decay value unless I enable physicallyCorrectLights.
Do you know what the physicallyCorrectLights setting does differently compared to having it turned off, and does it come with a performance hit? It’s strange that this setting works fine while the default doesn’t.