I really like WebGL and three.js.
Among the reasons: I was pretty sure that as long as something works, it works the same on every platform.
Yet yesterday I had an issue in my project…
In a fragment shader, I was using a PNG texture whose colors mark out different areas, so that a different effect can be applied to each area.
I use PNG because it preserves the exact colors (I haven’t found a way to do that with JPEG).
On Chrome it works as expected, but it turns out the colors Edge and Firefox return are not exact.
I worked around it by spreading my index values further apart and testing with a tolerance instead of an exact match, but I would still like to understand the issue and see if there is a way to avoid it.
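Roughly, the comparison now looks like this in the shader (a sketch, not the actual project code; the uniform name, the texture path, the reference colors and the tolerance value are placeholders):

```js
import * as THREE from 'three';

// Placeholder: the PNG whose flat colors mark out the different areas.
const indexTexture = new THREE.TextureLoader().load('areas.png');

// Sketch: the PNG's color is read as an index, and instead of an exact
// equality test the comparison allows a tolerance, with the index colors
// spread far apart so they cannot be confused with each other.
const effectMaterial = new THREE.ShaderMaterial({
  uniforms: {
    indexMap: { value: indexTexture },
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D indexMap;
    varying vec2 vUv;

    // True when the sampled color is close enough to the reference color.
    bool matches(vec3 sampled, vec3 reference) {
      const float tolerance = 0.1; // generous, since the index colors are spread apart
      return all(lessThan(abs(sampled - reference), vec3(tolerance)));
    }

    void main() {
      vec3 index = texture2D(indexMap, vUv).rgb;
      if (matches(index, vec3(1.0, 0.0, 0.0))) {
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // effect for the "red" area
      } else {
        gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0); // effect for the other areas
      }
    }
  `,
});
```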
So, I don’t do anything fancy there.
I left the code at work, but basically, to pick the color I render to a render target and read the color back with renderer.readRenderTargetPixels at the mouse position.
Much like what is done there for GPU ColorPicking:
If you check with a color picker, you will see it is the same color that is output to the default render target (on screen).
And as I said earlier, when reading the value directly in a shader, I would get the same results.
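Since the code is at work, this is only from memory, but the picking step is roughly the following (the render-target size and variable names are placeholders):

```js
import * as THREE from 'three';

// Sketch: render the scene into an off-screen target, then read back the
// single pixel under the mouse with readRenderTargetPixels.
const pickingTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);
const pixelBuffer = new Uint8Array(4); // RGBA of one pixel

function pickColor(renderer, scene, camera, mouseX, mouseY) {
  renderer.setRenderTarget(pickingTarget);
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);

  // readRenderTargetPixels uses a bottom-left origin, mouse events a top-left one.
  renderer.readRenderTargetPixels(
    pickingTarget,
    mouseX,
    pickingTarget.height - mouseY,
    1,
    1,
    pixelBuffer
  );
  return [pixelBuffer[0], pixelBuffer[1], pixelBuffer[2]]; // 0–255 RGB
}
```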
Also, just to be clear: the mesh I display uses MeshBasicMaterial (so lighting doesn’t matter), I don’t touch the gamma or the tone mapping, and the texture uses NearestFilter for both its min and mag filters.
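For reference, the setup is roughly this (a sketch; the texture path and the geometry are placeholders, and `scene` is assumed to exist already):

```js
import * as THREE from 'three';

// Sketch of the setup: no lights involved (MeshBasicMaterial), no custom
// tone mapping or gamma, and nearest-neighbour filtering so the sampler
// never blends neighbouring index colors together.
const texture = new THREE.TextureLoader().load('areas.png'); // placeholder path
texture.minFilter = THREE.NearestFilter;
texture.magFilter = THREE.NearestFilter;
texture.generateMipmaps = false; // mipmaps are not used with NearestFilter

const mesh = new THREE.Mesh(
  new THREE.PlaneGeometry(1, 1), // placeholder geometry
  new THREE.MeshBasicMaterial({ map: texture })
);
scene.add(mesh);
```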
So it’s not, I think, a matter of rendering.
I think this is really what the GPU is fed.
Interesting. I ran into a similar issue, also with a PNG, just yesterday. I still have to make 100% sure it’s not my code, but knowing that you hit the same thing is “positive”.