Offscreen Canvas with Textures

Hi,

Is there a way to render a scene with textures (images) on an offscreen canvas?

In the WebGLState.js file, this line:
gl.texImage2D.apply( gl, arguments );
expects arguments[5] to be an HTMLImageElement, but such an element cannot exist in a web worker because the document (and the DOM) doesn’t exist there.
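To illustrate, this is roughly what I'm trying inside the worker (the names and URL are just placeholders); TextureLoader goes through ImageLoader, which needs the DOM:

// inside the worker: this is where it breaks, because ImageLoader creates an <img> element
const canvas = new OffscreenCanvas(512, 512);
const renderer = new THREE.WebGLRenderer({ canvas });
const texture = new THREE.TextureLoader().load('diffuse.jpg'); // throws: document is not defined
const material = new THREE.MeshBasicMaterial({ map: texture });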

Thanks.

Um, I’m not sure what you mean by that :thinking:. Normally you perform RTT by rendering a scene into a render target. I don’t understand the purpose of OffscreenCanvas in this context. Can you maybe demonstrate what you want to do with a code example? Maybe things will become clearer then…
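For reference, RTT normally looks something like this (renderer, scene and camera are assumed to already exist):

const renderTarget = new THREE.WebGLRenderTarget(512, 512);
renderer.setRenderTarget(renderTarget);
renderer.render(scene, camera);
renderer.setRenderTarget(null); // back to the default framebuffer
// renderTarget.texture can now be used as the map of another material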

Not sure that RTT is what the OP is asking about? OffscreenCanvas can be used in a lot of ways, even to render into a canvas shown on the page, controlled by an OffscreenCanvas managed in a worker.

@ranbuch I think you’d need to get the raw data of the image — as an ArrayBuffer or Blob — and then create a THREE.DataTexture, THREE.CanvasTexture, or THREE.CompressedTexture from that.
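For example, a minimal sketch of a DataTexture built from raw pixel data, which needs no DOM at all (the size and pixel values are just illustrative):

const width = 2, height = 2;
const data = new Uint8Array(width * height * 4).fill(255); // opaque white RGBA pixels
const texture = new THREE.DataTexture(data, width, height, THREE.RGBAFormat);
texture.needsUpdate = true;
const material = new THREE.MeshBasicMaterial({ map: texture });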


Right, this is what the following example does:

https://threejs.org/examples/webgl_worker_offscreencanvas

Still, I don’t get the reference to texImage2D(). I’m sure a live example would make things clearer.


@Mugen87 sorry for not being clear enough. Your example is indeed relevant, but there’s no textured material - which is what I find challenging to load and render.

As @donmccurdy stated, maybe I should use a different texture type.
Let’s say I want to load a glTF scene (one that has textured materials). Where should I load it? Is it possible to load it inside the worker? How can you convert an image URL (or a base64 string) to a THREE.DataTexture, THREE.CanvasTexture, or THREE.CompressedTexture under the web worker limitations (no Image, no document)?

Should I load it from the main thread and postMessage it to the worker? If so, how should I send the image data (keeping in mind the limitations of the structured clone algorithm)?

Would I be able to create a new Texture() instance out of the transferred data from within the worker?
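For example, would something along these lines work? An ImageBitmap is transferable, so the structured clone algorithm wouldn't have to copy the pixels (the file names and message shape are just placeholders):

// main thread
const worker = new Worker('viewer-worker.js');
fetch('diffuse.jpg')
  .then((response) => response.blob())
  .then((blob) => createImageBitmap(blob))
  .then((bitmap) => worker.postMessage({ type: 'texture', bitmap }, [bitmap])); // transferred, not cloned

// viewer-worker.js (THREE assumed to be imported in the worker)
self.onmessage = (event) => {
  if (event.data.type === 'texture') {
    const texture = new THREE.CanvasTexture(event.data.bitmap);
    // assign the texture to a material here
  }
};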

In this case, you should use ImageBitmapLoader. The respective ImageBitmap API does work in a worker context.
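A minimal sketch of that approach inside the worker (the URL and the material are placeholders; the flipY option reproduces what TextureLoader does by default, since ImageBitmaps are uploaded as-is):

const loader = new THREE.ImageBitmapLoader();
loader.setOptions({ imageOrientation: 'flipY' }); // flip at decode time instead of at upload time
loader.load('textures/diffuse.jpg', (imageBitmap) => {
  material.map = new THREE.CanvasTexture(imageBitmap);
  material.needsUpdate = true;
});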

@Mugen87 it looks like it is indeed possible to load textures using ImageBitmapLoader, but for some reason all the textures are black.

I’ve altered GLTFLoader so it will use ImageBitmapLoader, and I’m iterating over all of the ImageBitmap maps and converting them to Textures in the GLTFLoader success callback:

material.map = new this.app.window.THREE.CanvasTexture(material.map);
The image field on the new texture is 1 instead of an image element.

There are no errors but, as I’ve mentioned, all of the textures are black.

Anything else I should do in order to render a CanvasTexture?

The official offscreen canvas demo does this (it loads its texture with ImageBitmapLoader and wraps the resulting ImageBitmap in a CanvasTexture):
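Roughly like the following, from memory, so the exact file and material in the demo may differ:

const loader = new THREE.ImageBitmapLoader().setPath('textures/');
loader.setOptions({ imageOrientation: 'flipY' });
loader.load('matcap-porcelain-white.jpg', (imageBitmap) => {
  const texture = new THREE.CanvasTexture(imageBitmap);
  const material = new THREE.MeshMatcapMaterial({ matcap: texture });
  // ... build the mesh with this material
});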

I’m afraid you need to share your code as a live example or git repository so it’s possible to debug the issue.


Turns out I was converting fields like envMapIntensity to a Texture.

Now everything works great.

Thank you for your help guys, that was really helpful!
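For anyone hitting the same issue, a minimal sketch of the fix (the property list is just an example): only wrap values that are actually ImageBitmaps, so scalar properties like envMapIntensity are left alone.

const mapProps = ['map', 'normalMap', 'roughnessMap', 'metalnessMap', 'aoMap', 'emissiveMap'];
for (const prop of mapProps) {
  if (material[prop] instanceof ImageBitmap) {
    material[prop] = new THREE.CanvasTexture(material[prop]);
  }
}
material.needsUpdate = true;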

We can’t use the ordinary TextureLoader() because it uses ImageLoader, which has a DOM dependency. But it is possible to fetch a Blob from a web worker and create an ImageBitmap, which is what ImageBitmapLoader() actually does internally. The right answer was posted by @Mugen87.
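In other words, something along these lines works in a worker (the URL is a placeholder; this is roughly what ImageBitmapLoader does for you):

async function loadWorkerTexture(url) {
  const response = await fetch(url);
  const blob = await response.blob();
  const imageBitmap = await createImageBitmap(blob, { imageOrientation: 'flipY' });
  return new THREE.CanvasTexture(imageBitmap);
}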

^Does that work in browsers other than Chrome? I’d thought browsers’ createImageBitmap support was pretty bad…

According to CanIUse, support is good enough. The main problem is OffscreenCanvas support, but there is an old-fashioned way to get it running in Firefox: duplicate almost the entire WebGL 1/2 API inside a worker and pass the data to the main thread.

I’m not sure, but I’m using this solution only if the browser supports OffscreenCanvas:
if ('transferControlToOffscreen' in document.createElement('canvas'))

I guess I should run some more tests.
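For reference, the branch looks roughly like this (the worker file name is just a placeholder):

const canvas = document.createElement('canvas');
document.body.appendChild(canvas);

if ('transferControlToOffscreen' in canvas) {
  // render in a worker
  const offscreen = canvas.transferControlToOffscreen();
  const worker = new Worker('viewer-worker.js');
  worker.postMessage({ canvas: offscreen }, [offscreen]);
} else {
  // fall back to rendering on the main thread
  const renderer = new THREE.WebGLRenderer({ canvas });
}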

I’ve got to admit, writing a fully functional web viewer that runs on an OffscreenCanvas is not an easy task. Some of the three.js components have to be changed and sometimes rewritten.