THREE.WebGLRenderer Context Lost_Performance_RAM

Hi, I have a problem with WebGL in three.js.
I have a scene in three.js, and I import 3D models and add them to the scene.
(I download them from Sketchfab in glTF format; some of them have animations and some do not.)
My PC has 8 GB RAM, an Intel Core i5 CPU, and a 128 GB SSD. (I also tested with a laptop that has 16 GB RAM, but I ran into the same problem.)

When I run my program in Chrome, after adding 3-6 models (with sizes like 30 MB, 5 MB, 25 MB, 7 MB, … for example)
(2 of them have animations; in the GLTFLoader callback I copy the clips from gltf.animations to gltf.scene.animations, traverse the scene, create a mixer, and add gltf.scene to the main scene as a group Object3D with its animations)
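Roughly, the loading code looks like this (a simplified sketch, not my exact code; the file name and variable names are just placeholders):

```javascript
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const loader = new GLTFLoader();
const mixers = []; // updated with delta time in the render loop

loader.load('models/example.glb', (gltf) => {
  // Keep the clips together with the loaded group.
  gltf.scene.animations = gltf.animations;

  // One mixer per model; clips would be started via mixer.clipAction(clip).
  mixers.push(new THREE.AnimationMixer(gltf.scene));

  // Add the whole glTF scene to the main scene as a group.
  scene.add(gltf.scene);
});
```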

Sometimes these operations fill about 4 GB of my RAM (just adding and showing 5 animated 3D models, without even playing the animations).
And sometimes I get these errors:

  • react_devtools_backend.js:4026 THREE.WebGLProgram: Shader Error 0 - VALIDATE_STATUS false
  • WebGL: INVALID_OPERATION: useProgram: program not valid
  • react_devtools_backend.js:4026 THREE.WebGLProgram: Shader Error 1282 - VALIDATE_STATUS false
  • WebGL: CONTEXT_LOST_WEBGL: loseContext: context lost
  • THREE.WebGLRenderer: Context Lost.
  • Sometimes: three.module.js:27177 THREE.WebGLRenderer: Context Restored.
  • Failed to execute ‘shaderSource’ on ‘WebGL2RenderingContext’: parameter 1 is not of type ‘WebGLShader’.
  • three.module.js:27169 THREE.WebGLRenderer: Context Lost.

Can you please help me figure out how to solve this problem?

Is it possible for you to demonstrate the issue with a live example? Alternatively, can you share a link to your application? That would make it easier to investigate the errors.

Can I record my screen and send it to you?

A screen recording isn’t very helpful, since it’s necessary to debug the app and take a closer look at its resource consumption.

Context loss problems are in general hard to debug since they often appear only on a group of devices. So depending on your computer, the application might run fine.
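What you can at least do is detect when it happens: the browser fires events on the canvas whenever the context is lost or restored. A minimal sketch (assuming `renderer` is your THREE.WebGLRenderer, so its canvas would be `renderer.domElement`):

```javascript
// Minimal sketch: observe WebGL context loss/restore on a canvas-like
// event target. In a real app, pass renderer.domElement (assumption:
// `renderer` is your THREE.WebGLRenderer instance).
function installContextLossHandlers(canvas, { onLost, onRestored } = {}) {
  canvas.addEventListener('webglcontextlost', (event) => {
    // preventDefault() signals that we want the browser to attempt
    // restoring the context (it then fires 'webglcontextrestored').
    event.preventDefault();
    if (onLost) onLost(event);
  });
  canvas.addEventListener('webglcontextrestored', (event) => {
    if (onRestored) onRestored(event);
  });
}
```

This won't prevent the loss, but logging both events helps you correlate it with what your app was doing at the time.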

Have you ever experienced a WebGL context loss like this yourself?

The few occasions I have encountered a WebGL context loss were caused by resource-intensive apps. Simplifying models, reducing the number of displayed models, and especially avoiding high texture resolutions can easily mitigate context loss.
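Also make sure to free GPU resources when you remove a model from the scene, since three.js does not do that automatically. A rough sketch (it assumes the usual three.js conventions: geometries, materials and textures expose a .dispose() method, and textures are flagged with .isTexture):

```javascript
// Rough sketch: release GPU memory held by a single object.
// Assumes three.js conventions: geometries, materials and textures
// expose .dispose(), and textures carry an .isTexture flag.
function disposeObject(object) {
  if (object.geometry) object.geometry.dispose();

  const materials = Array.isArray(object.material)
    ? object.material
    : object.material ? [object.material] : [];

  for (const material of materials) {
    // Dispose any textures referenced by the material (map, normalMap, …).
    for (const value of Object.values(material)) {
      if (value && value.isTexture && typeof value.dispose === 'function') {
        value.dispose();
      }
    }
    material.dispose();
  }
}
```

You would call it on a whole subtree via object.traverse(disposeObject) before scene.remove(object).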

Do you mean that we cannot use perfect 3D models on a three.js website?

Well, what do you mean by “perfect”?

Properly authored models will look good and also be compatible with low-end devices.

Rendering models with millions of vertices and 4K textures with no compression techniques pushes many devices to their limits. This is not a three.js or WebGL problem but a problem caused by inexperienced developers.

I mean high-resolution models.

Before editing the models in a content creation tool like Blender, you might want to try optimizing your glTF assets with gltfpack - npm.

Make sure to use compressed textures and also to compress the geometry data. Both can be done with the above tool. Such a compressed asset can then be loaded as demonstrated in the following example: three.js examples
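For instance, a typical invocation looks like this (an illustrative command line with placeholder file names; see the gltfpack README for the full option list):

```bash
# -cc: higher-ratio meshopt geometry compression
# -tc: convert textures to compressed KTX2/BasisU
npx gltfpack -i scene.gltf -o scene.glb -cc -tc
```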

Nevertheless, even with compression techniques you will hit hardware limitations at some point.

Thanks, I installed gltfpack and use
loader.setMeshoptDecoder(MeshoptDecoder);

but I do not know how to use the options, or how to check whether a glTF file has been optimized.

I installed gltfpack, but the gltfpack command cannot be found, even though it was added to package.json.
Do you have an example for this?


I’m also facing this problem, but I think it’s because we are using too much RAM.

Keep in mind that PNG and JPEG textures cost a lot more memory than their original file size, because they’re fully decompressed on the GPU. Use only a small number of textures, no larger than 2K–4K resolution.
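As a back-of-the-envelope estimate, an uncompressed RGBA texture costs width × height × 4 bytes on the GPU, plus roughly a third more once mipmaps are generated. A small helper to illustrate (my own sketch, not a three.js API):

```javascript
// Estimate GPU memory of an uncompressed RGBA texture:
// width * height * 4 bytes, times ~4/3 when mipmaps are generated.
function estimateTextureBytes(width, height, mipmaps = true) {
  const base = width * height * 4;
  return mipmaps ? Math.round(base * 4 / 3) : base;
}

// A single 4096×4096 texture already costs 64 MiB before mipmaps,
// regardless of how small the PNG or JPEG file was on disk.
```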

Ideally this should be done in Blender, with UV layouts that suit the texture resolution. But if you need to reduce texture resolution later, there are options:

npm install --global @gltf-transform/cli

gltf-transform optimize input.glb output.glb --texture-size 2048