Creating a DataTexture produces weird warnings in the console

Hello all, first time posting a topic here. I am really desperate about this.

I tried to create a simple 32×32 DataTexture with the data in a Float32Array, populated with random numbers. My code looks like this:
const width = 32
const height = 32

const size = width * height // number of pixels in the image
const data = new Float32Array(3 * size)
const color = new THREE.Color(0xffffff)

for (let i = 0; i < size; i++) {

 const stride = i * 3

 data[stride] = Math.random()
 data[stride + 1] = Math.random()
 data[stride + 2] = Math.random()

}

this.texture = new THREE.DataTexture(data, width, height, THREE.RGBFormat, THREE.FloatType, THREE.UVMapping, THREE.ClampToEdgeWrapping, THREE.ClampToEdgeWrapping, THREE.NearestFilter, THREE.NearestFilter, 1)
this.texture.needsUpdate = true

Then I used this texture on two meshes: on the first I mapped it in a MeshBasicMaterial, and on the second I passed it as a uniform to the fragment shader. Both render completely black.

I am not getting any errors in the console, just these two warnings that I can't work out how to fix:

  1. [.WebGL-0x62e400db0e00] GL_INVALID_ENUM: Invalid internal format 0x1907.
  2. [.WebGL-0x62e400db0e00] GL_INVALID_OPERATION: Level of detail outside of range.

I hope someone can understand this and solve it.

I think RGBFormat is deprecated… do you get the warning if you use RGBAFormat?
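For reference, a sketch of the RGBA version of your loop (four channels per pixel instead of three; the DataTexture call is shown as a comment since it needs three.js loaded):

```javascript
const width = 32
const height = 32
const size = width * height

// RGBAFormat expects 4 components per pixel
const data = new Float32Array(4 * size)

for (let i = 0; i < size; i++) {
  const stride = i * 4
  data[stride] = Math.random()     // R
  data[stride + 1] = Math.random() // G
  data[stride + 2] = Math.random() // B
  data[stride + 3] = 1.0           // A (fully opaque)
}

// const texture = new THREE.DataTexture(data, width, height, THREE.RGBAFormat, THREE.FloatType)
// texture.needsUpdate = true
```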


I didn’t get any errors with that. I gave up on FloatType; in the end I switched to integers, like in the DataTexture documentation example on threejs.org. I couldn’t get past those weird warnings.
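In case it helps anyone else, this is roughly the integer variant I ended up with, following the docs example (the DataTexture line is commented out since it needs three.js):

```javascript
const width = 32
const height = 32
const size = width * height

// Uint8Array with values 0–255 per channel, matching UnsignedByteType
const data = new Uint8Array(4 * size)

for (let i = 0; i < size; i++) {
  const stride = i * 4
  data[stride] = Math.floor(Math.random() * 256)     // R
  data[stride + 1] = Math.floor(Math.random() * 256) // G
  data[stride + 2] = Math.floor(Math.random() * 256) // B
  data[stride + 3] = 255                             // A
}

// const texture = new THREE.DataTexture(data, width, height)
// (the defaults are THREE.RGBAFormat and THREE.UnsignedByteType)
// texture.needsUpdate = true
```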

Maybe Math.random is returning 64-bit numbers?

I’ve run into JS producing 64-bit numbers before, and had to account for it when sending data to a web worker.
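For what it's worth, Math.random() does return a 64-bit double, but assigning into a Float32Array rounds the value to 32-bit automatically, so that by itself shouldn't blank the texture. A quick check:

```javascript
const arr = new Float32Array(1)
const x = Math.random() // 64-bit double

arr[0] = x // stored as a 32-bit float

// The stored value equals Math.fround(x), the nearest 32-bit float
console.log(arr[0] === Math.fround(x)) // true
console.log(Float32Array.BYTES_PER_ELEMENT) // 4
```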