I think a lack of RAM is causing this, because when usage is at 99% I get it about once an hour.
My use case requires high stability, because the Three.js elements are embedded in a native iOS/Android app: any “context lost” is unacceptable, degrades the experience for many users, and may even lead to Google/Apple disabling the application.
With 1,000,000+ users, I’m pretty sure “context lost” will happen over and over to many of them on Android and iPhone devices if it’s already happening to one person during testing.
Can someone from the Three.js dev team answer:
How big a problem is “context lost” globally?
If I render a pixel bitmap in JavaScript (Three.js without a renderer), send it to Java/Kotlin, and render it there in native OpenGL, is that a valid solution?
Off the bat, there should be no reason your scene would trigger the webglcontextlost event (especially with your hardware) unless you have some sort of major memory leak, such as recreating geometries and/or materials in the render loop, not reusing geometries and materials where possible, or generating/applying unnecessarily large textures. It’s too broad a scenario to diagnose from the simple description provided. Any chance you can share a live, editable example of your current environment on CodePen / JSFiddle / CodeSandbox?
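For what it’s worth, context loss can at least be observed and handled gracefully rather than silently killing the app. A minimal sketch; `installContextLossHandlers` and its callbacks are hypothetical helper names, not three.js API, but the two DOM event names are the standard WebGL ones:

```javascript
// Sketch: listen for WebGL context loss/restore on a canvas.
// `installContextLossHandlers` is a hypothetical helper that works with
// any canvas-like object exposing addEventListener.
function installContextLossHandlers(canvas, onLost, onRestored) {
  canvas.addEventListener('webglcontextlost', (event) => {
    // preventDefault() signals that we intend to handle restoration
    // ourselves, which allows 'webglcontextrestored' to fire later.
    event.preventDefault();
    onLost(event);
  });
  canvas.addEventListener('webglcontextrestored', (event) => {
    // Recreate GPU resources (textures, buffers, programs) here.
    onRestored(event);
  });
}
```

With three.js this would typically be attached to `renderer.domElement`, e.g. to log the loss and re-upload assets once the context comes back.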
Thanks. My app has no memory leaks, and indeed it doesn’t crash at all when my RAM usage is at 50%.
But when 99% of my RAM is occupied by:
Photoshop,
Illustrator,
Unity,
Android Studio,
100 tabs open in browsers,
Node.js that I sometimes run on all processor cores to complete some task,
Etc.
then my Chrome tabs crash constantly and all software lags badly until I free RAM back below 90%.
So, this seems to be caused entirely by RAM being at 99%, not by my WebGL/Three.js setup being in any way wrong.
Android and iOS users currently use devices ranging from the iPhone 7 (iOS 15, about 20% market share) to the iPhone 15 (iOS 16+, about 80% market share), and the same goes for Android, where old versions like 9.0 still hold a 10–20% market share.
So, these older Android and iOS devices don’t have much CPU/GPU/RAM, and I suspect they will constantly crash WebGL.
It seems it would be necessary to create a new custom Three.js renderer that doesn’t rely on WebGL and the HTML <canvas>, because by default only the WebGL and WebGPU renderers are available here:
There seems to be no way to get raw pixel data back without sending it to WebGL/WebGPU for rendering.
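To be precise, three.js can read rendered pixels back from a render target via the documented `renderer.readRenderTargetPixels` method; the catch is that the rendering still goes through WebGL first, and the readback stalls the GPU pipeline, so it is unlikely to sustain 60 FPS streaming to native code. A sketch, assuming `renderer` and `target` are an existing `THREE.WebGLRenderer` and `THREE.WebGLRenderTarget`:

```javascript
// Sketch: read back one RGBA8 frame from a three.js render target.
// readRenderTargetPixels(target, x, y, width, height, buffer) is real
// three.js API, but note it is a synchronous GPU readback (a stall),
// so this is not a 60-FPS-friendly path to a native bridge.
function readTargetPixels(renderer, target, width, height) {
  const pixels = new Uint8Array(width * height * 4); // 4 bytes per RGBA pixel
  renderer.readRenderTargetPixels(target, 0, 0, width, height, pixels);
  return pixels; // raw bytes that could then be handed to native code
}
```

So the “render in JS, draw in native OpenGL” idea is technically possible, but the copy cost makes it a fallback rather than a fix.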
So, while this seems possible with moderate effort (perhaps 1,000 man-hours of work), I see little hope that the Three.js dev team will decide to build it. Perhaps they will in 2025–2030 (since WebGPU is practically Vulkan/Metal/DirectX, transitioning between environments will probably become easy soon), but not in the near future (not in 2023–2024).
I tested this a little more, and here are my findings for anyone trying to use Three.js for Android or iOS (the news is not good):
TLDR: Don’t use WebGL for an Android/iOS native application (in a WebView), because you will not get stable, smooth 60 FPS.
Even an empty Three.js scene will stutter occasionally, because the amount of sandboxing (the WebView) and indirection (JavaScript rendering to WebGL, and the browser engine handling the WebGL rendering) is too much for 90% of Android devices to handle.
Even the newest iPhone 15 Pro, Samsung S23, or Google Pixel 8 will not run it smoothly, so forget about this.
Setting the rendering priority to “high” for the WebView will not help you.
So, as of October 2023, the WebGL/WebGPU/Three.js combination is unfortunately a toy, not production-ready.
Three.js would need to introduce native renderers in addition to WebGL and WebGPU to fix this.
You need to go with Unity, native Java/Kotlin, or Swift to get stable, smooth 60 FPS rendering.
That’s neither surprising nor a shocking new discovery: every system becomes unstable when driven up to its limits, from human creations to humans themselves.
You took 0.001% of cases and drew a generalized conclusion; well, that conclusion is only valid (and valuable) for those 0.001% of cases…
I have 64 GB of RAM, and my WebGL context crashed many times with a very basic 2D scene that has probably less than 10 MB of assets loaded.
True, I overloaded my RAM, and only then did I discover this limitation.
Many Android phones still ship with 2–3 GB of RAM in 2023, so my initial post and concern are valid: it would not even be stable for 50% of users (not to mention it would not reach and maintain 60 FPS).
“0.001%”? Funny, now you are downplaying the situation. Where is any proof that it’s 0.001%?
I’m speaking for 50%+ of users.
50%+ of Three.js users need stable, smooth 60 FPS rendering on Android/iOS for their companies, employers, and users.
Three.js/WebGL/WebGPU doesn’t offer this. It’s not production-ready.
Make a native OpenGL/Vulkan renderer for Three.js, and only then will Three.js become truly useful.
Also, Kotlin ↔ JavaScript integration exists, and various other such solutions exist (like GitHub - gfx-rs/naga: universal shader translation in Rust). So instead of the Three.js community waking up in 2030 and realizing “maybe we need to make a Vulkan renderer because WebGPU will never give stable 60 FPS” (which will happen), there is nothing wrong with being ahead and building it already in 2023.
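On the JavaScript side, such a Kotlin ↔ JavaScript bridge usually boils down to an object injected into the page via Android’s `WebView.addJavascriptInterface`, which only accepts primitives and strings, so binary frames have to be chunked and base64-encoded. A rough sketch of that constraint; `bridge` and its `onFrameChunk(index, total, base64)` method are hypothetical names for whatever the Kotlin side injects, not a real API:

```javascript
// Sketch: push binary frame data across a string-only WebView bridge.
// `bridge` stands in for a hypothetical object injected from Kotlin via
// WebView.addJavascriptInterface; onFrameChunk's signature is an assumption.
function postFrameToNative(bridge, bytes, chunkSize = 64 * 1024) {
  const total = Math.ceil(bytes.length / chunkSize);
  for (let i = 0; i < total; i++) {
    const chunk = bytes.subarray(i * chunkSize, (i + 1) * chunkSize);
    // Base64-encode: btoa in browsers, Buffer as a fallback in Node.
    let b64;
    if (typeof btoa === 'function') {
      let bin = '';
      for (const byte of chunk) bin += String.fromCharCode(byte);
      b64 = btoa(bin);
    } else {
      b64 = Buffer.from(chunk).toString('base64');
    }
    bridge.onFrameChunk(i, total, b64);
  }
  return total; // number of chunks sent across the bridge
}
```

The per-frame string encoding is exactly why this path struggles to hit 60 FPS, which is the argument for a native renderer rather than a bridge.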