Using THREE.RGBAFormat instead of THREE.RGBFormat

The RGBFormat has been removed in favor of using the RGBAFormat.
https://github.com/mrdoob/three.js/pull/23228

I previously did things like this:

var count = vectorData.length;
var data = new Float32Array(count * 3);
var index;
for (var i = 0; i < count; i++) {
    index = 3 * i;

    data[index + 0] = vectorData[i].x;
    data[index + 1] = vectorData[i].y;
    data[index + 2] = vectorData[i].z;
}
myDataTexture = new THREE.DataTexture(data, count, 1, THREE.RGBFormat, THREE.FloatType);

Do I understand correctly that now I’m supposed to do it like this:

var count = vectorData.length;
var data = new Float32Array(count * 4);
var index;
for (var i = 0; i < count; i++) {
    index = 4 * i;

    data[index + 0] = vectorData[i].x;
    data[index + 1] = vectorData[i].y;
    data[index + 2] = vectorData[i].z;
    data[index + 3] = [some arbitrary number];
}
myDataTexture = new THREE.DataTexture(data, count, 1, THREE.RGBAFormat, THREE.FloatType);

Basically just ask for a larger array to store the data and pad every fourth element with some arbitrary, never to be used values?

I would use “1” (or 255 for byte arrays) instead of some arbitrary number for alpha depending on how it is used since I believe that was the implicit alpha value for textures before.
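Putting the two posts together, the packing loop might look like this; a minimal sketch, where the function name `packVectorsRGBA` is mine and `vectorData` is assumed to be an array of objects with `x`/`y`/`z` properties, as in the original snippet:

```javascript
// Pack an array of {x, y, z} vectors into an RGBA Float32Array,
// filling the unused alpha channel with 1 (the implicit alpha that
// RGB textures previously had).
function packVectorsRGBA(vectorData) {
    var count = vectorData.length;
    var data = new Float32Array(count * 4);
    for (var i = 0; i < count; i++) {
        var index = 4 * i;
        data[index + 0] = vectorData[i].x;
        data[index + 1] = vectorData[i].y;
        data[index + 2] = vectorData[i].z;
        data[index + 3] = 1; // padding; never sampled if only .rgb is read
    }
    return data;
}
```

The returned array can then be passed to `new THREE.DataTexture(data, count, 1, THREE.RGBAFormat, THREE.FloatType)` as before.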


Alright. Thank you! 🙂
I’m not actually using that value anywhere at all (I only need 3 floats per texel). But if it automatically used to be “1” before, I guess assigning “1” makes sense.

Basically just ask for a larger array to store the data and pad every fourth element with some arbitrary, never to be used values?

Worth noting that the extra padding would be added behind the scenes in many devices anyway, within the WebGL implementation. We consider it better practice (and better for performance) to use RGBA.


I recently migrated to a newer three.js version with RGBFormat removed. I want to display 4K color images in a texture, so we’re talking about roughly 30 MB of data. I can’t control the input format; it’s always RGB data. Converting it to RGBA would mean a lot of overhead. Is there a performant way to do this? I tried using a custom fragment shader to extract the RGB data from an UnsignedInt248Type/DepthStencilFormat texture, but it’s not working properly. I either get an INVALID_OPERATION message or a black texture. Any suggestions?


I’m not sure, but you might still be able to use UnsignedInt248Type. See the workaround here:

Well… it may do that. WebGL offers these formats so they can be used; the driver then decides how best to handle them, and it may even be able to add the padding on the GPU during upload, which should be much faster than anything one can do in JS.
Even if that’s not the case, I’d much rather rely on someone with lots of experience to write fast padding code instead of writing it myself, so I really don’t see how RGBA should be best practice if one actually has RGB input data.

I just upgraded to the newest three.js and I’m really annoyed by this change, especially since it seems like zero effort for three.js to just keep allowing the RGB format.

Do I really now need to write some potentially buggy/slow JS code, to convert my RGB data to RGBA?

edit: sorry for the rant, but this upgrade has taken more time than it needed to…
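For what it’s worth, the conversion itself can be a short loop. A minimal sketch for byte data (the function name `rgbToRgba` is mine, not a three.js API, and the input is assumed to be a tightly packed RGB `Uint8Array` such as decoded image data):

```javascript
// Expand tightly packed RGB bytes into RGBA by inserting 255 for alpha,
// matching the implicit alpha that RGB textures previously had.
function rgbToRgba(rgb) {
    var pixels = rgb.length / 3;
    var rgba = new Uint8Array(pixels * 4);
    for (var i = 0; i < pixels; i++) {
        rgba[4 * i + 0] = rgb[3 * i + 0];
        rgba[4 * i + 1] = rgb[3 * i + 1];
        rgba[4 * i + 2] = rgb[3 * i + 2];
        rgba[4 * i + 3] = 255; // fully opaque padding
    }
    return rgba;
}
```

For a 4K image this allocates one extra RGBA buffer and does a single linear pass, which is usually fast enough to do once per texture upload.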


Sorry that the upgrade has been a challenging one!

The larger reasons for the change are described at the top of https://github.com/mrdoob/three.js/pull/23228 … the fact that WebGL might pad the data isn’t the motivation here. We’re working toward WebGPU support, and it doesn’t allow RGB8. We’ve also made color management improvements that wouldn’t be possible with RGB8 in WebGL, or would require slow and complex duplicate color pipelines.