Webglcontextrestored event not fired

Hi,

I’m making a kind of small test-flight app for my users that measures their computer’s performance before they use the main app, so I can tailor the quality settings to their hardware.

In one of the tests, I fill the scene with a lot of large textures. If their GPU can’t handle the memory pressure, I’d like to handle the resulting WebGL context loss and continue my tests (to see, for example, whether their GPU can handle my “medium”-sized textures).
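Roughly what that test does (simplified sketch; the sizes and counts here are just placeholders for my real values):

```js
// Allocate a bunch of large textures to create GPU memory pressure.
const SIZE = 4096;

for (let i = 0; i < 32; i++) {
  const data = new Uint8Array(SIZE * SIZE * 4); // RGBA
  const texture = new THREE.DataTexture(data, SIZE, SIZE, THREE.RGBAFormat);
  texture.needsUpdate = true;

  const mesh = new THREE.Mesh(
    new THREE.PlaneGeometry(1, 1),
    new THREE.MeshBasicMaterial({ map: texture })
  );
  scene.add(mesh);
}
```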

My problem is that after the context is lost, it never seems to be restored. I’m not very familiar with the context loss/restore topic, but as I understand it, restoration should happen automatically. It doesn’t, at least not on my computer. Any thoughts?
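For reference, this is roughly how I listen for the events (simplified; `renderer` is my `THREE.WebGLRenderer` and `resumeTests()` stands in for my actual test logic):

```js
const canvas = renderer.domElement;

canvas.addEventListener('webglcontextlost', (event) => {
  // preventDefault() signals that we want the context back once it can be restored.
  // (As far as I can tell, three.js already does this in its own internal handler.)
  event.preventDefault();
  console.log('context lost');
});

canvas.addEventListener('webglcontextrestored', () => {
  // This never fires on my machine.
  console.log('context restored');
  resumeTests();
});
```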

Thanks

When three.js detects a context loss, it prints the following warning in the browser console:

THREE.WebGLRenderer: Context Lost.

However, when the user agent restores the context, you should see:

THREE.WebGLRenderer: Context Restored.

If this log does not appear, it seems the browser is not willing to restore the WebGL rendering context.

In general, three.js is able to restore its internal state when a context restore happens, so it should normally render the scene again in the next valid frame.
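One thing you could try to narrow this down is forcing a context loss and restore yourself via the WEBGL_lose_context extension. A rough sketch (assuming `renderer` is your WebGLRenderer instance):

```js
// Force a context loss and then a restore to verify the restore path works in principle.
const ext = renderer.getContext().getExtension('WEBGL_lose_context');

ext.loseContext(); // should trigger "THREE.WebGLRenderer: Context Lost."

setTimeout(() => {
  ext.restoreContext(); // should trigger "THREE.WebGLRenderer: Context Restored."
}, 1000);
```

If the restore log shows up with this approach but not after a real GPU crash, it is most likely the browser refusing the restore, rather than something in your code or in three.js.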

But why would a browser (standard Chrome in my case) be unwilling to restore the context?

Sorry, I don’t know. I can only assume that web apps which cause GPU crashes are potentially blacklisted by the browser, so that a context restore is not allowed.