Hello everyone, I am learning to implement Voxel Cone Tracing (VCT). I still haven’t figured out how to visualize voxels based on textureLod reads from the mipmap (stored in a 2D or 3D texture). I usually use ray marching to visualize voxels, but it seems that isn’t necessary with VCT. In the video, the voxels appear to be rendered as solid volumes. If anyone knows how to do this, please help.
(The post I’m learning from: miaumiau.cat – GPU gems)
The video in the left corner seems to show the same process: visualizing the voxels produced by voxelizing a model. I would like to visualize voxels in three.js like that, to check the data in the mipmap after voxelizing my model.
Thank you for your response, this is actually quite similar to my approach. But I don’t know whether this visualization matches what VCT actually needs to operate (for example, the limits of the cone range, the voxel size, and the ratio of the voxel to the bounding box). The main reason I want to visualize exactly what VCT needs in order to work correctly is to understand the method better.
I guess you’ll have to recreate the Blender nodes programmatically; the steps are in the video you’ve shared, and you’ll have to adapt them somehow for InstancedMeshes, although that approach has its limitations.
I’m not sure which part of the problem you’re referring to. From my understanding, you want to control the size of the voxels and the total count of displayed voxels relative to that size?
If so, the article I’ve shared uses a grid system with a `params.gridSize`. You can adapt it to compute the voxel size and the total count, then use `InstancedMesh.setMatrixAt` to set the scale of each voxel.
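An untested sketch of what I mean (the `params.gridSize` value, the bounding box, and the `MeshNormalMaterial` are just placeholders, swap in your own):

```js
import * as THREE from 'three';

// Placeholder values – replace with your grid size and the bounding box
// of the voxelized model.
const params = { gridSize: 32 };
const box = new THREE.Box3(
  new THREE.Vector3(-1, -1, -1),
  new THREE.Vector3( 1,  1,  1)
);

// Voxel size and total displayed count derived from the grid.
const size = new THREE.Vector3();
box.getSize(size);
const voxelSize = Math.max(size.x, size.y, size.z) / params.gridSize;
const count = params.gridSize ** 3;

const voxels = new THREE.InstancedMesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial(),
  count
);

// Position and scale every instance with setMatrixAt.
const matrix = new THREE.Matrix4();
const position = new THREE.Vector3();
const quaternion = new THREE.Quaternion();
const scale = new THREE.Vector3(voxelSize, voxelSize, voxelSize);

let i = 0;
for (let x = 0; x < params.gridSize; x++) {
  for (let y = 0; y < params.gridSize; y++) {
    for (let z = 0; z < params.gridSize; z++) {
      position
        .set(x + 0.5, y + 0.5, z + 0.5)
        .multiplyScalar(voxelSize)
        .add(box.min);
      matrix.compose(position, quaternion, scale);
      voxels.setMatrixAt(i++, matrix);
    }
  }
}
voxels.instanceMatrix.needsUpdate = true;
// scene.add(voxels);
```

If you only want to show occupied voxels, write matrices for those cells only and lower `voxels.count` to the number you actually wrote.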
Yes, three-mesh-bvh is a great tool. However, in my case I only want to voxelize particles, so I don’t need it; I just need to map each particle position into the mipmap.
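Roughly what I have in mind (just a sketch; the function name, the single-channel density format, and the resolution are my own assumptions, not from any library):

```js
import * as THREE from 'three';

// particles: Float32Array of xyz triplets, box: voxelization bounds.
function particlesToVoxelTexture(particles, box, res = 64) {
  const size = new THREE.Vector3();
  box.getSize(size);

  // One float per cell – here just an accumulated density value.
  const data = new Float32Array(res * res * res);

  for (let p = 0; p < particles.length; p += 3) {
    // Normalize the particle position into [0, 1) inside the bounding box,
    // then convert to an integer cell index.
    const x = Math.floor(((particles[p]     - box.min.x) / size.x) * res);
    const y = Math.floor(((particles[p + 1] - box.min.y) / size.y) * res);
    const z = Math.floor(((particles[p + 2] - box.min.z) / size.z) * res);
    if (x < 0 || y < 0 || z < 0 || x >= res || y >= res || z >= res) continue;

    data[x + y * res + z * res * res] += 1; // accumulate density
  }

  const texture = new THREE.Data3DTexture(data, res, res, res);
  texture.format = THREE.RedFormat;
  texture.type = THREE.FloatType;
  texture.needsUpdate = true;
  return texture; // coarser mip levels would be downsampled separately
}
```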
However, I also have another problem with rendering the SDF gradient from this tool (some issues converting the SDF to an SDF gradient), which I will ask about in a separate post.