Three.js Possibly Not Using GPU?

It’s been a few months since I did anything with Three.js, but I hopped back in today because I had some ideas I wanted to flesh out. Before I even got started, I noticed something odd about the amount of resources Three.js seems to be using. I’m not very knowledgeable about hardware or how software interacts with it, but it looks to me like Three.js isn’t using my GPU to render anything at all and is instead falling back to my CPU?

This example on the Three.js site is where the issue is most obvious: as soon as I load it and try to rotate the camera or change the scene settings in the UI, everything slows down dramatically. With “Render Continuously” turned on, the FPS monitor shows around 4 frames per second, and the slowdown hits the whole machine; even something as simple as switching tabs in Chrome (my browser, which may or may not be relevant here) takes way longer than normal. In Task Manager, that one Chrome tab is using roughly 80% of my CPU, and when I close it, total CPU usage drops back to normal, so the Three.js page is clearly the culprit. It’s also not specific to this example; I’ve gone through many of the examples on the official site, as well as some of the external sites listed on the main Three.js page, and they all behave the same way.

For reference, I’m running Windows 10. My GPU is an RX 470 that I’ve never had issues with; it handles whatever I throw at it, whether that’s 3D rendering, gaming, or anything else. I have 16 GB of RAM that I installed under a year ago, in case that matters, and it has never given me trouble either. My CPU is an Intel Core i5-4590 @ 3.30 GHz from around 2014, so it’s fairly dated, but I don’t rely on it for anything computationally heavy, so that normally isn’t a problem; it’s just obviously not what I’d want to be doing 3D work on.

Am I correct in assuming that Three.js is using my CPU instead of my GPU? From the bit of searching I’ve done, it sounds like Three.js should be using the GPU, but I don’t know much about the nitty-gritty of how it works, so I could be misinformed. Is there anything I can do to fix this? I can provide more info if needed.

It’s not possible to render with WebGL without using the GPU, so if you see anything at all (and it isn’t a CanvasRenderer example), then your GPU is working. That particular demo is expensive (I think it’s meant to show something that would be prohibitively expensive on the CPU, but it still uses quite a bit of CPU time anyway). If you open Dev Tools, start a Performance profile recording, and snapshot the result, you should see something like this:

There’s both GPU and CPU time there. As for why it might be slow on your computer, I’m not sure, but it’s not particularly fast on my machine (2014 MBP) either. 🙂
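If you want a rough number without opening DevTools, you can also time the render call yourself. This is just a minimal sketch, assuming you have your own `scene`, `camera`, and `renderer` already set up; it only measures main-thread JavaScript time, not actual GPU time (for that you still need the Performance profile above).

```js
// Minimal sketch: log how much main-thread (CPU/JavaScript) time each frame spends
// inside renderer.render(). `scene`, `camera`, and `renderer` are assumed to already
// exist in your own setup; this is not code taken from the demo itself.
let lastLog = performance.now();
let frames = 0;
let renderMs = 0;

function animate() {
  requestAnimationFrame(animate);

  const start = performance.now();
  renderer.render(scene, camera); // issues the GL commands; heavy CPU-side work shows up here
  renderMs += performance.now() - start;
  frames++;

  // Print an average once per second.
  const now = performance.now();
  if (now - lastLog >= 1000) {
    console.log(`${frames} fps, avg ${(renderMs / frames).toFixed(1)} ms of JS time per render`);
    frames = 0;
    renderMs = 0;
    lastLog = now;
  }
}
animate();
```

If that average is a large chunk of your 16 ms frame budget, the bottleneck is on the CPU side, which would match the ~80% CPU usage you’re seeing in Task Manager.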


readRenderTargetPixels? Is this a synchronous read?

Yes. WebGLRenderer.readRenderTargetPixels() internally uses WebGLRenderingContext.readPixels(). This API method sends a request for pixel data to the GPU and then waits for the response. More information:
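For context, a read-back like that looks roughly like the sketch below. The render-target size, format, and the `scene`/`camera`/`renderer` variables are placeholders, not the demo’s actual code; the point is only that the final call blocks until the GPU has produced the requested pixels.

```js
// Sketch of a synchronous GPU read-back with three.js. readRenderTargetPixels()
// ends up calling gl.readPixels(), which stalls until the GPU work is done.
const target = new THREE.WebGLRenderTarget(512, 512);

renderer.setRenderTarget(target);
renderer.render(scene, camera);
renderer.setRenderTarget(null);

const buffer = new Uint8Array(512 * 512 * 4); // RGBA, 1 byte per channel
renderer.readRenderTargetPixels(target, 0, 0, 512, 512, buffer); // synchronous

// `buffer` now holds the pixels on the CPU; getting them forced a pipeline flush.
```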

Ah, I didn’t realize this isn’t the OP’s screenshot.

Actually, on some Windows systems, Chrome uses SwiftShader (a software renderer) for WebGL.
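One quick way to check what Chrome is actually giving you is to query the renderer string; this is a rough sketch, and the debug extension can be unavailable or masked in some browsers, so treat the result as a hint rather than proof.

```js
// Check which renderer the browser is using for WebGL. If the string mentions
// "SwiftShader" or "Software", you're on the CPU fallback; if it names the RX 470,
// hardware acceleration is active.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl');

if (!gl) {
  console.log('WebGL is not available at all.');
} else {
  const ext = gl.getExtension('WEBGL_debug_renderer_info');
  const rendererString = ext
    ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER); // may only say something generic if masked
  console.log('WebGL renderer:', rendererString);
}
```

You can also open chrome://gpu and look at the “Graphics Feature Status” section, which says whether WebGL is hardware accelerated, and make sure “Use hardware acceleration when available” is enabled in Chrome’s settings.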


One may also encounter this: