I just realized that my website/app, developed using three.js r123, is running under WebGL2 on modern systems. So far, it works fine, but I'm curious:
- Are there any reasons I should be forcing my older code to WebGL1Renderer? Any gotchas?
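By "forcing" I mean constructing the renderer explicitly, roughly like this (WebGL1Renderer was added in r118 when WebGL2 became the default, so it should exist in r123 - just a sketch, not my actual setup):

```js
import * as THREE from 'three';

// Default since r118: WebGLRenderer asks for a WebGL2 context when the
// browser supports one, and falls back to WebGL1 otherwise.
const renderer = new THREE.WebGLRenderer({ antialias: true });

// Pinning the legacy code path explicitly:
const legacyRenderer = new THREE.WebGL1Renderer({ antialias: true });
```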
I think in today's three.js WebGL1 is pretty much a second-class citizen, so you'd better avoid it when you can. Some things work properly only with WebGL2, and some other things were deliberately sacrificed on the WebGL1 path. Apart from browser penetration, the one time WebGL1 performed better than WebGL2 for me was with a very specific shader that did not compile under WebGL2 and, once rewritten so it would compile, ran slower - but with three.js built-in materials this would not happen.
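If you want to see which path three.js actually picked on a given machine, the renderer reports it - a quick sketch:

```js
// capabilities.isWebGL2 is true when a WebGL2 context was actually obtained.
const renderer = new THREE.WebGLRenderer();
console.log('Running on', renderer.capabilities.isWebGL2 ? 'WebGL2' : 'WebGL1');
```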
WebGL2 has good browser support across the board now that Safari has got its act together, so I doubt you have much to worry about.
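You can also check support at runtime instead of trusting the support tables - a minimal sketch, no three.js involved:

```js
// True when the browser can hand out a WebGL2 context at all.
const hasWebGL2 = !!document.createElement('canvas').getContext('webgl2');
console.log('WebGL2 supported:', hasWebGL2);
```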
Unfortunately, I'm in the situation of using the ancient Win32 SDK WebView control, which only supports the Internet Explorer 11 renderer. So I'm stuck on WebGL1 on that side. My app also runs on macOS, where WebGL2 is now becoming available.
My main worry is: what if WebGL2 has some bug or performance regression… would it be better to force my app to stick to WebGL1? Or is it more likely that WebGL2 would be the better choice, e.g. fixing bugs or performing better?
If you’re really worried, just use the WebGL1 renderer. It’ll work just as well.
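And if you want an escape hatch without shipping a new build, a runtime switch is cheap. To be clear, forceWebGL1 below is a made-up query parameter for illustration, not anything built into three.js:

```js
// Hypothetical kill switch: load the page with ?forceWebGL1 to pin the old
// code path. indexOf keeps this working even in the IE11 WebView mentioned above.
const forceWebGL1 = window.location.search.indexOf('forceWebGL1') !== -1;
const renderer = forceWebGL1
  ? new THREE.WebGL1Renderer({ antialias: true })
  : new THREE.WebGLRenderer({ antialias: true }); // WebGL2 where available
```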