Texture3d for representing points?

Hi.

In my current application I am using an array of preloaded textures so that I can represent multiple types of THREE.Points objects.

The idea is very simple: I pass the array of textures in a uniform, and each point vertex has an attribute telling which texture to apply.

As I am using WebGL2 and Texture3D is available, I wonder whether I could use the same approach but with an array of 3D textures instead of standard 2D textures, and so give a 3D appearance to my points.
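For context, a minimal sketch of the per-point index part of that setup (attribute name and values are hypothetical; the three.js call is commented out since it needs the library loaded):

```javascript
// Build a per-point attribute holding the index of the texture to apply.
// Here 5 points cycle through 3 preloaded textures.
const count = 5;
const texIndex = new Float32Array(count);
for (let p = 0; p < count; p++) {
  texIndex[p] = p % 3; // which texture this point uses
}
// geometry.setAttribute('texIndex', new THREE.BufferAttribute(texIndex, 1)); // three.js
console.log(Array.from(texIndex)); // [0, 1, 2, 0, 1]
```

The fragment shader would then read the forwarded index to pick the matching texture.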

I am not familiar with Texture3D at all; the examples I have seen are all tied to the NRRD format.

I want to be able to represent my points with very basic 3D geometries like boxes or spheres.

Is it possible?

Is there any Texture3D example that does not use an NRRD file to obtain the texture data and size?

Many thanks in advance.

Best regards

I think you can create the texture directly with var texture = new THREE.DataTexture3D( myData, sizeX, sizeY, sizeZ ), where myData is a TypedArray holding the voxel data. DataTexture3D is itself a texture, so there is no need to wrap it in a separate THREE.Texture.
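A minimal sketch of allocating the backing buffer (sizes arbitrary; RGBA assumed at 4 bytes per voxel, adjust for other formats; the THREE.DataTexture3D call is commented out since it needs three.js loaded):

```javascript
// Allocate the raw voxel buffer for a 3D texture.
// RGBA layout assumed: 4 bytes per voxel.
const sizeX = 8, sizeY = 8, sizeZ = 8;
const myData = new Uint8Array(4 * sizeX * sizeY * sizeZ);
// var texture = new THREE.DataTexture3D(myData, sizeX, sizeY, sizeZ); // three.js
console.log(myData.length); // 2048
```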

I think it is an interesting question. Surely, it is possible to sample a 3D texture in the fragment shader for a point. It could even perhaps give a performance benefit compared to using an array of 2D textures. In WebGL2, textures can also be sampled by pixel coordinates instead of relative coordinates.

But representing points with very basic 3D geometries like boxes or spheres… Boxes can be rendered efficiently in quite large numbers by joining them into a single BufferGeometry. See some of the three.js examples for that. No need to try to use points, I think…

But spheres… High-resolution spheres have a lot of vertices, so rendering points with the appearance of spheres could be interesting. What you basically want to do is to find, for every pixel in the point, first whether it “hits” the sphere, and second where on the sphere it is. When you know that, you can use the u,v coordinates (basically the angles) to apply a texture, and use the corresponding calculated normal and position to compute lighting. There is absolutely no guarantee of improved rendering performance, but it would be an interesting experiment to try.

The hardest problem is mapping from pixel position to spherical coordinates under a perspective projection (you would have to control the scale and shifting of points in the vertex shader to accommodate the extreme cases where points are close to the camera and near the edge of the screen). Orthographic projection is much easier, though, because the spheres then have constant size and a circular shape in screen space.
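A sketch of that hit test under orthographic projection, written in plain JavaScript for clarity (in practice this math would live in the fragment shader; the function and names are my own):

```javascript
// For a GL point covering a unit sphere under orthographic projection:
// given point-local coordinates (px, py) in [-1, 1], decide whether the
// fragment hits the sphere, and recover the surface normal and (u, v).
function sphereHit(px, py) {
  const r2 = px * px + py * py;
  if (r2 > 1) return null;             // outside the silhouette: discard
  const z = Math.sqrt(1 - r2);         // front-facing depth on the sphere
  const normal = [px, py, z];          // already unit length: px²+py²+z² = 1
  const u = 0.5 + Math.atan2(px, z) / (2 * Math.PI); // longitude
  const v = 0.5 + Math.asin(py) / Math.PI;           // latitude
  return { normal, u, v };
}
```

The center of the point maps to the "front pole" of the sphere (normal pointing at the camera), and fragments outside the unit circle would be discarded.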

Hi Elias.

Thanks for your feedback.

The point is that I have no idea how to create the data for my Texture3D.

The docs say:
DataTexture3D( data : TypedArray, width : Number, height : Number, depth : Number )
**data** – data of the texture.
**width** – width of the texture.
**height** – height of the texture.
**depth** – depth of the texture.

But it doesn’t explain how to create this data.
What I want to achieve is the same as in this fiddle, but using a Texture3D (sphere data) instead of the 2D image (disc.png).

Here you can have a look:

https://jsfiddle.net/b4qewc3n/

So basically my question is how to create the data for a sphere Texture3D distribution.

Many thanks in advance.

Best regards

The fiddle doesn’t show anything here.

OK, I made it work locally with a few fixes, and then discovered that I had to roll the mouse wheel for anything to show. That works in the fiddle too. :slight_smile:

GL points are basically two-dimensional. The vertex shader decides their screen position and size, and the fragment shader colors them. You get two-dimensional relative coordinates in the fragment shader, and can also use whatever you manage to forward from the vertex shader. But you cannot make a point into a sphere using a texture. You can, however, add a static rendering of a sphere seen from one direction under one set of lighting, and apply that as a 2D texture to the point, as is done in some of the three.js examples.

There is no obvious way to map a 3D texture to a GL point, as you don’t get any local depth information for free. You can do stuff like coloring the point according to where it is located spatially “in” a 3D texture, by using position as coordinates into the texture. You can also take u,v from the local coordinates of the GL point and w as an extra attribute (per point).
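A sketch of that (u, v, w) mapping in plain JavaScript (the x-fastest, z-slowest layout is an assumption matching WebGL's texImage3D; in the shader, the texture() call does this lookup for you):

```javascript
// Map a fragment's point-local coordinates (u, v) plus a per-point depth
// attribute w into the flat index of a voxel in a width*height*depth RGB
// volume. Layout assumed: x varies fastest, then y, then z.
function voxelIndex(u, v, w, width, height, depth) {
  const x = Math.min(width - 1, Math.floor(u * width));
  const y = Math.min(height - 1, Math.floor(v * height));
  const z = Math.min(depth - 1, Math.floor(w * depth));
  return 3 * ((z * height + y) * width + x);
}
console.log(voxelIndex(0, 0, 0, 4, 4, 4));       // 0
console.log(voxelIndex(0.5, 0.5, 0.5, 4, 4, 4)); // 126
```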

Do you understand the format of the data array that is to be passed to DataTexture3D? Basically, the data is stored sequentially, R,G,B,R,G,B,R,G,B… Three consecutive entries form a pixel, or rather a voxel, since it is a 3D texture. Besides that, I don’t know right now the order of the dimensions. But I think you will end up with something like this:

for (let k = 0; k < depth; k++) {
    for (let j = 0; j < height; j++) {
        for (let i = 0; i < width; i++) {
            let offset = 3*((k*height + j)*width + i); // x varies fastest
            data[offset] = redLevel(i,j,k);
            data[offset+1] = greenLevel(i,j,k);
            data[offset+2] = blueLevel(i,j,k);
            /*an alternative to calculating offset by multiplication is to
            just increment it by 3 here at the end of the inner loop*/
        }
    }
}

Where redLevel, greenLevel, blueLevel are functions of position in the texture.
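A concrete instance of that loop, making the color functions a simple sphere test: voxels inside a centered sphere become white, the rest black (sizes arbitrary; the x-fastest layout matches WebGL's texImage3D):

```javascript
// Fill a small RGB volume where voxels inside a centered sphere are white.
const width = 16, height = 16, depth = 16;
const data = new Uint8Array(3 * width * height * depth);
const cx = (width - 1) / 2, cy = (height - 1) / 2, cz = (depth - 1) / 2;
const radius = width / 2 - 1;
let offset = 0;
for (let k = 0; k < depth; k++) {
  for (let j = 0; j < height; j++) {
    for (let i = 0; i < width; i++) {
      const dx = i - cx, dy = j - cy, dz = k - cz;
      const inside = dx * dx + dy * dy + dz * dz <= radius * radius;
      const level = inside ? 255 : 0;
      data[offset++] = level; // R
      data[offset++] = level; // G
      data[offset++] = level; // B
    }
  }
}
// const texture = new THREE.DataTexture3D(data, width, height, depth); // three.js
```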

Hi Elias.

I hadn’t seen this example before.

This is exactly what I wanted.

So I guess I don’t really need a 3d texture to achieve what I want.

I will have a look into the code.

Many thanks for your help.

No problem. :slight_smile: