Orthographic camera for 2D visualisation: issues with scale and positions in shader

This is a test for a more complex 2D data visualisation. I am trying to place 200 particles/points on the screen with a shader (there will be many more in the real application). My data contains 200 values between $9000 and $11000, in $10 increments. I simply need to centre the particles on the X axis (at 0) and scale them on the Y axis so that a certain range ($1000) fits within the viewport height.

const RANGE = 1000; // visible $ range within the viewport height
const MID_VALUE = 10000; // value mapped to the centre of the screen
const PARTICLES_COUNT = 200;
const SIZE = 16; // the data texture is SIZE x SIZE texels
const FRUSTUM = 2; // frustum height in world units

So I have a DataTexture with all the values that I will send to the shader:

array = new Float32Array(SIZE * SIZE * 4);

// store each value in the green channel of the corresponding RGBA texel
for (var i = 0; i < PARTICLES_COUNT * 4; i += 4) {
  array[i + 1] = 9000 + (i / 4) * 10;
}
 
dataTexture = new THREE.DataTexture(
  array,
  SIZE,
  SIZE,
  THREE.RGBAFormat,
  THREE.FloatType
);
dataTexture.needsUpdate = true;

And an orthographic camera:

const aspect = window.innerWidth / window.innerHeight;
camera = new THREE.OrthographicCamera(
  FRUSTUM * aspect / -2,
  FRUSTUM * aspect / 2,
  FRUSTUM / 2,
  FRUSTUM / -2,
  1,
  1000
);
camera.position.z = 500;

And this is how I calculate the scale applied to the data inside the vertex shader:

scale = FRUSTUM / RANGE;
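
In other words, the mapping I expect is the plain-JS equivalent below (just a sketch for clarity; valueToY is an illustrative name, not code from the fiddle):

// With FRUSTUM = 2 and RANGE = 1000, scale = 0.002 world units per $.
function valueToY(value) {
  return (value - MID_VALUE) * scale; // $9500 -> -1, $10000 -> 0, $10500 -> +1
}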

The particles are correctly scaled and centred, but they are not evenly spaced on the Y axis as they should be according to the data. Could this be a precision issue, or is something else wrong?

You can see the code here: http://jsfiddle.net/bze7sgmy/4/

You need to use uv coordinates when sampling your data texture.

http://jsfiddle.net/bze7sgmy/6/
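
In short, the vertex shader should read each particle's value from the data texture via the geometry's uv attribute. A rough sketch of that idea (uniform names like uData, uScale and uMid are placeholders, not the exact code of the fiddle), assuming the value is stored in the green channel as in your setup code:

const material = new THREE.ShaderMaterial({
  uniforms: {
    uData: { value: dataTexture },       // the 16x16 float DataTexture
    uScale: { value: FRUSTUM / RANGE },  // world units per $
    uMid: { value: MID_VALUE }
  },
  vertexShader: `
    uniform sampler2D uData;
    uniform float uScale;
    uniform float uMid;
    void main() {
      // 'uv' is declared automatically by ShaderMaterial (the data come from the geometry)
      float value = texture2D(uData, uv).g;                // read the stored $ value
      vec3 pos = vec3(0.0, (value - uMid) * uScale, 0.0);  // centre X, scale Y
      gl_PointSize = 4.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
    }
  `,
  fragmentShader: `
    void main() {
      gl_FragColor = vec4(1.0);
    }
  `
});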


That’s great, thanks @Mugen87, but where does the uv variable come from? I suppose three.js adds it.

How does it work if I have to add other textures? How does three.js name the UVs in the shader?

Correct. The following page lists the built-in uniforms and attributes available when using ShaderMaterial. The actual uv data come from the PlaneBufferGeometry:

https://threejs.org/docs/index.html#api/renderers/webgl/WebGLProgram

I’m not sure I understand. The attribute name uv stays the same, no matter how many textures you are going to use.
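
For illustration, a sketch with a made-up second texture (uGradients is an assumed name): every sampler can be read with the same built-in uv attribute, usually passed on to the fragment shader as a varying:

const material = new THREE.ShaderMaterial({
  uniforms: {
    uData: { value: dataTexture },
    uGradients: { value: gradientsTexture } // assumed second texture
  },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv; // 'uv' comes from the geometry, whatever the number of textures
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform sampler2D uData;
    uniform sampler2D uGradients;
    varying vec2 vUv;
    void main() {
      // any number of samplers can be read with the same coordinates
      vec4 data = texture2D(uData, vUv);      // read here only to show both lookups
      vec4 colour = texture2D(uGradients, vUv);
      gl_FragColor = colour;
    }
  `
});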


Ok I see. uv comes from the mesh that contains the material.

My question was about textures that I need to add to the material that are not directly related to the mesh: for example, a texture (DataTexture or image) with some arbitrary data, or a colour map.

I suppose texture UVs are always normalised (0 to 1), right? So I can find the coordinates by normalising against the sides of the texture in the shader, like 1.0 / width and 1.0 / height?

For example, I have this 16 x 64 png that contains several gradients I need to use in my 3D data viz, so I’d need to find a pixel colour from the coordinates:
[image: gradients texture, 16 x 64 px]
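
Something like this lookup is what I have in mind (names are made up; the + 0.5 is meant to hit texel centres):

// GLSL snippet to paste into a fragment shader, assuming the 16 x 64 png
// is bound to a sampler called uGradients.
const gradientLookupSnippet = `
  uniform sampler2D uGradients;
  const vec2 GRADIENTS_SIZE = vec2(16.0, 64.0);

  vec4 gradientPixel(float px, float py) {
    vec2 lookupUv = (vec2(px, py) + 0.5) / GRADIENTS_SIZE;
    return texture2D(uGradients, lookupUv);
  }
`;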

No. But if you use PlaneBufferGeometry, uvs are in the range [0,1].
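
To make that concrete (a sketch, not necessarily how the fiddle builds its geometry): a PlaneBufferGeometry with SIZE - 1 segments per side has exactly SIZE x SIZE vertices, each with its own uv in [0,1], so each vertex can address one texel of the data texture:

const geometry = new THREE.PlaneBufferGeometry(2, 2, SIZE - 1, SIZE - 1);
console.log(geometry.attributes.position.count); // 256 vertices = 16 x 16 texels
console.log(geometry.attributes.uv.count);       // same count, uvs in [0, 1]
// 'material' would be a ShaderMaterial like the sketch above
const points = new THREE.Points(geometry, material);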

I’m not sure I understand your intended approach. Maybe it’s clearer if you show your code in a live demo.
