I have an STL model of a cube that was heated. I wrote software that calculates the temperature distribution at every point/voxel in the cube, so it is essentially a heat map where blue marks the colder regions and red marks the warmer regions of the cube.
The user specifies how many voxels the cube is divided into, and the software calculates the temperature at each voxel. This information comes to me as an HDF5 file, which I can turn into a 3-dimensional numpy array giving the temperature at each x, y, z point. I am trying to figure out how to overlay this HDF5 data, or convert the numpy array into something that can be overlaid on top of the STL cube, so that the user can see the temperature distribution on the cube in Three.js.
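For reference, I imagine reading the raw values in the browser with something like this untested sketch. It assumes I dump the numpy array as a raw little-endian float32 blob (e.g. with temps.astype('float32').tofile('temps.raw')) and know the grid dimensions from a small JSON sidecar; the file name and sidecar are just placeholders:

```js
// Untested sketch: 'temps.raw' and the (nx, ny, nz) sidecar are hypothetical names.
async function loadTemperatureVolume(url, nx, ny, nz) {
  const buffer = await (await fetch(url)).arrayBuffer();
  const temps = new Float32Array(buffer); // length should equal nx * ny * nz

  // Index helper matching a C-ordered array of shape (nz, ny, nx);
  // swap the strides if the axes are ordered differently.
  const at = (x, y, z) => temps[z * ny * nx + y * nx + x];

  return { temps, at, nx, ny, nz };
}
```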
My original idea was to create a bunch of cube particles representing the voxels, assign each one a color indicating its temperature, and overlay them as transparent objects on top of the STL cube. I don’t think this will work well when there are 100 million voxels, since that would mean rendering 100 million cubes. So I was wondering if there is a better way to approach this, or if anyone knows how to display HDF5 data on the canvas?
If you have too much data for a geometry to represent it, you could go with textures. Instead of a cube, you could have a bunch of parallel square quads (one set parallel to the X axis, one to Y, and one to Z). You can sample the temperature on each quad’s surface, make the quads semi-transparent, and texture them. Together they cut the space into cells/voxels without modelling each voxel individually, so this “cube” can hold a lot more data.
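Something along these lines, as an untested sketch. It assumes the temperatures arrive as a flat Float32Array laid out slice by slice (z slowest) with known dimensions nx, ny, nz, and that the min/max temperatures for the color scale are known:

```js
import * as THREE from 'three';

// Build one RGBA DataTexture per Z slice, mapping temperature linearly
// from blue (tMin) to red (tMax).
function sliceTextures(temps, nx, ny, nz, tMin, tMax) {
  const textures = [];

  for (let z = 0; z < nz; z++) {
    const rgba = new Uint8Array(nx * ny * 4);

    for (let y = 0; y < ny; y++) {
      for (let x = 0; x < nx; x++) {
        const t = (temps[z * ny * nx + y * nx + x] - tMin) / (tMax - tMin);
        const i = (y * nx + x) * 4;
        rgba[i + 0] = Math.round(255 * t);       // red grows with temperature
        rgba[i + 2] = Math.round(255 * (1 - t)); // blue fades with temperature
        rgba[i + 3] = 255;                       // opacity is handled by the material
      }
    }

    const tex = new THREE.DataTexture(rgba, nx, ny); // defaults: RGBA, unsigned byte
    tex.needsUpdate = true;
    textures.push(tex);
  }

  return textures;
}
```

The same loop, run with the roles of the axes swapped, would give the X- and Y-parallel sets.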
I thought textures only wrap on meshes, so are you suggesting I create a bunch of plane geometries parallel to x-axis, bunch along y-axis, and bunch along z-axis which all overlap to look like a 3d grid. Then apply a heatmap texture to each one of them so that when they overlap transparently, it will appear as if it is voxelated heatmap?
Not exactly a voxel map, but kind of, since you have too much data to make individual voxels.
A mesh is a combination of a geometry and a material, so you can make a mesh out of any geometry suitable for texturing (i.e. with UV coordinates assigned to its vertices).
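For example (untested, assuming one of the slice textures from above): PlaneGeometry already comes with UVs, so mapping a DataTexture onto it is just a material setting:

```js
import * as THREE from 'three';

// One semi-transparent slice: a plane with the heat-map texture mapped onto it.
function slicePlane(tex, width, height) {
  const material = new THREE.MeshBasicMaterial({
    map: tex,
    transparent: true,
    opacity: 0.3,
    side: THREE.DoubleSide,
    depthWrite: false, // keeps transparent slices from occluding each other
  });
  return new THREE.Mesh(new THREE.PlaneGeometry(width, height), material);
}
```

Positioning each plane at its slice’s depth and adding them all to a THREE.Group gives the stack.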
I was thinking about something like this:
but now that I’ve made it… it looks pretty complex, so I’m not sure how useful it will be once a different texture is assigned to each plane.
You will also need custom code to render the half-transparent planes in a specific order.
Or maybe don’t make them transparent at all, and instead turn their visibility on and off so you can look inside the cube.
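A rough, untested sketch of both options, assuming the slice meshes are kept in an array:

```js
import * as THREE from 'three';

// Before each render, sort the slice meshes back-to-front relative to the
// camera and encode that order in renderOrder, so alpha blending composites
// them correctly.
function sortSlices(slices, camera) {
  const camPos = camera.position;
  slices
    .slice() // sort a copy, don't reorder the caller's array
    .sort((a, b) =>
      b.position.distanceToSquared(camPos) - a.position.distanceToSquared(camPos)
    )
    .forEach((mesh, i) => { mesh.renderOrder = i; }); // farthest drawn first
}

// Alternative (the non-transparent variant): only show the slices up to a
// cutting position, so the user can look inside the cube.
function showSlicesUpTo(slices, cutIndex) {
  slices.forEach((mesh, i) => { mesh.visible = i <= cutIndex; });
}
```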
That is certainly an interesting approach. I would just have slices of the heat map stacked together in one direction, and similarly for the other directions, so when the user sections through it they would see the temperature distribution on that slice.
I also found an example where they appear to do the same thing but use NRRD files: three.js webgl - volume rendering example
Maybe this is also a good approach?
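If I went with that example, I suppose I could skip NRRD entirely and wrap my array in a 3D texture, something like this (untested; Data3DTexture is the class name in recent three.js releases, older ones call it DataTexture3D):

```js
import * as THREE from 'three';

// Wrap the flat Float32Array in a single-channel float 3D texture, which is
// what the volume rendering example feeds to its raymarching shader.
function makeVolumeTexture(temps, nx, ny, nz) {
  const tex = new THREE.Data3DTexture(temps, nx, ny, nz);
  tex.format = THREE.RedFormat;
  tex.type = THREE.FloatType;
  tex.minFilter = THREE.LinearFilter;
  tex.magFilter = THREE.LinearFilter;
  tex.unpackAlignment = 1;
  tex.needsUpdate = true;
  return tex;
}
```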
You could make the slice under the mouse opaque while keeping all the others faint, so you could “thumb” through them.
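Something like this untested sketch, assuming the slice meshes are in an array and the mouse position is already converted to normalized device coordinates:

```js
import * as THREE from 'three';

// Raycast from the mouse into the slice stack and make the hit slice opaque
// while keeping the rest faint.
const raycaster = new THREE.Raycaster();

function highlightSliceUnderMouse(slices, mouseNDC, camera) {
  raycaster.setFromCamera(mouseNDC, camera);
  const hit = raycaster.intersectObjects(slices)[0];

  slices.forEach((mesh) => {
    mesh.material.opacity = hit && mesh === hit.object ? 1.0 : 0.05;
  });
}
```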
I’m not familiar with the volumetric example, but the main idea is that textures can hold a lot of data, so you keep the data in textures and then pull them over whatever geometry suits the purpose.
Well, from a data visualization standpoint, I would suggest avoiding the transparent-material approach. Blending is going to mix values, so when two cubes stack together you kind of lose the ability to read absolute values. Instead, I was about to suggest the very same volume rendering example, because it lets the user clip the volume while still reading absolute values from the inside.
Anyway, I see an opportunity here to reflect on whether seeing inside is actually needed. Without knowing much about the final user, the real value of that seems vague, and it may not be worth the effort of rendering a full volume of data (3D textures tend to be very heavy).
For instance, in applications for designers and architects, it is common to have 3D simulations of solar energy incidence on walls and floors because, well, only those surfaces can be manipulated; even from a radiometric standpoint (estimating how energy propagates), values for the space/air are irrelevant in that field, so ‘inside’ data is not calculated or depicted in any way in those applications.