I am trying to create a three.js texture from a VideoFrame object, so that three.js can be used to manipulate WebRTC video via the WebRTC Insertable Streams API.
I am following this WebRTC Insertable Streams example, which uses vanilla WebGL to create a texture from a VideoFrame: demo here / code here.
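For context, the relevant part of that vanilla-WebGL approach is (as I understand it) roughly the following. A VideoFrame is a valid TexImageSource, so texImage2D accepts it directly. This is my own simplified sketch, not the example's exact code, and it assumes a WebGL2 context named gl already exists:

```javascript
// Sketch of the vanilla-WebGL upload path, assuming an existing
// WebGL2RenderingContext `gl`. VideoFrame implements TexImageSource,
// so it can be passed straight to texImage2D.
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

// Called once per incoming frame from the insertable-streams transform:
function uploadFrame(videoFrame) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Upload the VideoFrame's pixels into the bound texture.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, videoFrame);
  videoFrame.close(); // release the frame once its pixels are on the GPU
}
```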
Is this possible to recreate in three.js?
I tried using the WebGLRenderer.copyTextureToTexture() method, but it throws: Error from stream transform: TypeError: Cannot read properties of undefined (reading 'width').
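For reference, my attempt looks roughly like this. The names renderer, targetTexture, and the transform wiring come from my own setup and are shown here only for illustration; the signature is the one from the three.js version I'm using:

```javascript
// Illustrative sketch of my three.js attempt inside the
// insertable-streams transform. `renderer` is a THREE.WebGLRenderer and
// `targetTexture` is a previously allocated THREE.Texture (both names
// are from my own setup).
function transform(videoFrame, controller) {
  // Wrap the VideoFrame in a THREE.Texture and flag it for upload:
  const srcTexture = new THREE.Texture(videoFrame);
  srcTexture.needsUpdate = true;

  // This is the call that throws:
  // "TypeError: Cannot read properties of undefined (reading 'width')"
  renderer.copyTextureToTexture(new THREE.Vector2(0, 0), srcTexture, targetTexture);

  // Pass the frame through unchanged for now.
  controller.enqueue(videoFrame);
}
```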
Any advice would be helpful!