DataTexture as RenderTarget

I’m trying to blur a dynamic texture before passing it to a material - kind of like this:

However the texture map I’m using is a THREE.DataTexture (from a Kinect depth camera feed), and the render target’s texture only ever seems to be a regular Texture (not a DataTexture) - see the attached screenshot for the console readout.

Question: Is there any way to get a WebGLRenderTarget to behave as a DataTexture? And if not, is there a simple way to convert a DataTexture to a regular Texture for use in this scenario?

I’ve copied the relevant code below:


depth_texture = new THREE.DataTexture(new Float32Array(DEPTH_WIDTH * DEPTH_HEIGHT * 4), DEPTH_WIDTH, DEPTH_HEIGHT , THREE.RGBAFormat);
depth_texture.type = THREE.FloatType;

kinectMesh = new THREE.Mesh( new THREE.PlaneBufferGeometry( DEPTH_WIDTH, DEPTH_HEIGHT, DEPTH_WIDTH, DEPTH_HEIGHT ), new THREE.MeshStandardMaterial( { map: renderTarget.texture } ) );

scene.add( kinectMesh );

var renderTargetParameters = { minFilter: THREE.LinearFilter, magFilter: THREE.LinearFilter, format: THREE.RGBAFormat, stencilBuffer: false };
var renderTarget = new THREE.WebGLRenderTarget( DEPTH_WIDTH, DEPTH_HEIGHT, renderTargetParameters );
var fxComposer = new THREE.EffectComposer( renderer , renderTarget );
var texturePass = new THREE.TexturePass( depth_texture );
fxComposer.addPass( texturePass );
fxComposer.addPass( hblur );             
fxComposer.addPass( vblur );

Sorry, but I’m afraid I don’t understand what you mean by that. For a shader, it does not matter if you use an instance of Texture or DataTexture. The only difference between both classes is how the data source of a texture is specified. And when using an instance of WebGLRenderTarget as a texture (via renderTarget.texture), the texture data are already on the GPU. So in this case it does not matter what type renderTarget.texture actually is.
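To illustrate (a minimal sketch, not from your app - the file name, data array and renderTarget here are placeholders): all three of these end up as an ordinary sampler in the shader, so the material treats them identically.

var fromFile = new THREE.TextureLoader().load( 'image.png' ); // regular Texture (placeholder URL)
var fromData = new THREE.DataTexture( someFloat32Array, width, height, THREE.RGBAFormat ); // DataTexture, data built in JS
var fromRT = renderTarget.texture; // render target texture, data already lives on the GPU

var material = new THREE.MeshStandardMaterial( { map: fromRT } ); // swap in fromFile or fromData - nothing else changes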

Can you please share your entire code as an editable live demo? Or a git repository? Your code snippet does not yet explain what you are actually doing in your application.

Hi Mugen, thanks for the quick reply

Essentially the problem I’m having is that renderTarget.texture renders black. I assume the issue has something to do with the renderTarget parameters not matching the depth_texture params. If I bypass the blurPass/renderTarget and map the Kinect “depth_texture” directly on the material, it works fine.
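For reference, this is how the render target type could be matched to the DataTexture if that were the cause (just a sketch, I haven’t verified it):

var renderTargetParameters = {
    minFilter: THREE.LinearFilter,
    magFilter: THREE.LinearFilter,
    format: THREE.RGBAFormat,
    type: THREE.FloatType, // match depth_texture.type so values aren't clamped to 8 bits
    stencilBuffer: false
};
var renderTarget = new THREE.WebGLRenderTarget( DEPTH_WIDTH, DEPTH_HEIGHT, renderTargetParameters );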

I’m afraid I still don’t understand what the code is supposed to do. I can only say that creating a render target for EffectComposer's ctor AND using it as a texture for a material at the same time is not supported.

The render target that is passed in to EffectComposer represents an internal entity that will be cloned and then used as read and write buffers for the FX passes. It’s not valid to use this render target in another context, since you never know what EffectComposer is going to do with it.
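If you need the composer’s output as a texture for a material, one supported pattern is to let EffectComposer manage its own buffers and copy the final result into a separate render target, e.g. with THREE.SavePass from the postprocessing examples. A rough sketch along those lines (material, scene and camera are assumed, and SavePass.js / CopyShader.js have to be included):

// let EffectComposer create and manage its internal read/write buffers
var fxComposer = new THREE.EffectComposer( renderer );
fxComposer.addPass( new THREE.TexturePass( depth_texture ) );
fxComposer.addPass( hblur );
fxComposer.addPass( vblur );

// copy the blurred result into a render target that is safe to use elsewhere
var outputTarget = new THREE.WebGLRenderTarget( DEPTH_WIDTH, DEPTH_HEIGHT );
fxComposer.addPass( new THREE.SavePass( outputTarget ) );

// map the copy, not the composer's internal buffers
material.map = outputTarget.texture;

// per frame
fxComposer.render();
renderer.render( scene, camera );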

Imagine this simplified example, except that instead of “vidTexture” I’m using the Kinect feed “depth_texture”:


texturePass( depth_texture )
↓
blurPass
↓
renderTarget
↓
material ( map: renderTarget.texture )


I’m not passing the render target to the EffectComposer; I’m passing the result of the EffectComposer to the render target.

The EffectComposer doesn’t have anything going into it; the texture is generated inside the composer using THREE.TexturePass.
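For context, the per-frame order I’m aiming for is roughly this (latestDepthFrame is a placeholder for the Kinect callback data):

function animate() {
    requestAnimationFrame( animate );

    // copy the latest Kinect depth frame into the DataTexture and flag it for upload
    depth_texture.image.data.set( latestDepthFrame );
    depth_texture.needsUpdate = true;

    // run the blur chain first, then draw the scene that maps its result
    fxComposer.render();
    renderer.render( scene, camera );
}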


So the issue didn’t have anything to do with DataTexture or the render target settings - I had my mesh initialization in a separate function, and I think that was tripping up the data flow of the render target.

If anyone comes across the same issue, just set everything up like this codepen example.

Need to learn to keep things simple and not clutter so much :roll_eyes:

At any rate, thanks for the help @Mugen87, you actually clarified a few things that led to the solution!
