How to scale down a rendered texture image

In my webapp, when a button is clicked, a collection of many (~200) images is displayed sequentially.
The images are 4160x3120 JPEG files (roughly 0.5 MB to 3 MB each).
Memory is managed so that only one image is kept in memory at a time.

The webapp runs fine on a variety of other OSs and browsers, e.g. Windows/Linux/macOS with Firefox/Chrome/Safari.
But on iOS devices (on iOS 15.2 and on iOS 14.8.1) I’m getting memory errors after playing only some of the images.
Maybe the problem exists everywhere but is hidden, and is simply exacerbated on the weaker iOS devices.

I checked for memory leaks and things look fine.
Adjusting WebGL experimental features in Safari (specifically disabling the flags “GPU Process: Canvas Rendering” and “GPU Process: WebGL”) helps, but not all the way: the error still happens, just less frequently.

I found that for some people the solution is to downscale the rendered image, e.g. from 4160x3120 to 2080x1560.
Using a texture and a canvas, what would I do to scale down the rendered texture image to relieve pressure on WebGL rendering?


As far as I know, textures are automatically resized to the nearest power of 2 in THREE, so something like 4096x4096 in your case. iOS devices are renowned for being a bit temperamental when handling 4K textures. One option could be to create a new canvas element with the appropriate size (e.g. 2048 on the longest side), draw your bigger image onto that canvas, and then either create a new image from that canvas to use as an image texture, or pass the canvas itself as a canvas texture…
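A minimal sketch of that approach, assuming three.js is available and `image` is an already-loaded `HTMLImageElement` (the `maxDim` limit of 2048 is just an illustrative choice, not a hard requirement):

```javascript
// Compute a scaled size whose longest side fits within maxDim,
// preserving the aspect ratio of the source image.
function fitWithin(width, height, maxDim) {
  const scale = Math.min(1, maxDim / Math.max(width, height));
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}

// Draw the full-size image onto a smaller canvas, then hand that
// canvas to three.js as the texture source instead of the original image.
function makeDownscaledTexture(image, maxDim = 2048) {
  const { width, height } = fitWithin(
    image.naturalWidth,
    image.naturalHeight,
    maxDim
  );
  const canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  const ctx = canvas.getContext('2d');
  // The downscale happens here: drawImage resamples the source
  // into the smaller destination rectangle.
  ctx.drawImage(image, 0, 0, width, height);
  return new THREE.CanvasTexture(canvas);
}
```

For your 4160x3120 sources this yields a 2048x1536 canvas, which stays under the 2048 power-of-two boundary instead of being rounded up to 4096x4096. Remember to call `texture.dispose()` on the previous texture before swapping in the next image, so the GPU copy is actually released.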