Three-mesh-bvh: A plugin for fast geometry raycasting and spatial queries!

Hi @gkjohnson,

It’s a wonderful library that you have developed! Kudos!

I have been looking at this library for a week and have a couple of questions that I have been pondering over.

In the case of a WebXR-based VR application, how would I use this library if I had to constantly check for intersections between the “gamepad controllers” and a few “interactive” grabbable objects in the scene?

Let’s consider that in my application I have a few objects that I can interact with. Other objects are static and form part of the environment. The only objects of interest are the ones the user can interact with, and they are spread across the scene.

In that case, should I cast multiple rays from the gamepad controllers (to get better intersection results, not just in a single direction), or should I check for the intersection of the gamepad’s bounding box/sphere with the BVH of every interactable object?

Any help regarding this would be greatly appreciated, as I have yet to decide on an approach.

Thanks!

Regards,
Srinivas


Successfully modified rollup.config.js and built a JS module bundle file. It works well, thanks!

Made a PR: Add `esm` (module) build output by gonnavis · Pull Request #240 · gkjohnson/three-mesh-bvh · GitHub


@gkjohnson I had a need for spatial indexing in a little physics engine I’m making for a game. I knew about your lib but I’m using three.js math primitive classes for collision, not meshes, so I assumed I couldn’t use three-mesh-bvh.

I coded my own makeshift (and very suboptimal, I’m sure) binary tree, but now I’m wondering if I should have asked you before whether I could use your lib for this. Is it possible to index objects such as OBB and Sphere and speed up collision detection between them?


@Srinivas_Prabhu

In that case, should I cast multiple rays from the gamepad controllers (to get better intersection results, not just in a single direction), or should I check for the intersection of the gamepad’s bounding box/sphere with the BVH of every interactable object?

Either one would work – it really depends on what you want. Raycasting with a visual indicator can be good for interacting with objects at far distances, but if you want users to interact with objects at closer distances, using something like a sphere attached to the controller to detect intersections can work well too.

The project doesn’t intend to dictate or suggest interactions as much as it enables interactions that may not have been possible or performant previously. I think the best thing to do is experiment and find what you like best for your project!
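For the sphere approach, a minimal sketch of the check might look something like this, assuming a recent version of the library where `boundsTree.intersectsSphere` takes a `THREE.Sphere` in the mesh’s local space (`controller` and `interactables` here are just placeholder names):

```js
import * as THREE from 'three';
import { MeshBVH } from 'three-mesh-bvh';

// Build a BVH for every interactable mesh up front.
for ( const mesh of interactables ) {

	mesh.geometry.boundsTree = new MeshBVH( mesh.geometry );

}

const sphere = new THREE.Sphere();
const inverseMatrix = new THREE.Matrix4();

function getHoveredObject( controller, interactables ) {

	for ( const mesh of interactables ) {

		// Place a small sphere at the controller and transform it into the
		// mesh's local space, since the BVH lives in the geometry's frame.
		sphere.center.setFromMatrixPosition( controller.matrixWorld );
		sphere.radius = 0.05;
		inverseMatrix.copy( mesh.matrixWorld ).invert();
		sphere.applyMatrix4( inverseMatrix );

		if ( mesh.geometry.boundsTree.intersectsSphere( sphere ) ) {

			return mesh;

		}

	}

	return null;

}
```

For distant objects you could still fall back to a raycast from the same controller.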

@felixmariotto

I coded my own makeshift (and very suboptimal, I’m sure) binary tree, but now I’m wondering if I should have asked you before whether I could use your lib for this. Is it possible to index objects such as OBB and Sphere and speed up collision detection between them?

No, the project is built to index the triangles of a mesh with a small memory footprint by rearranging the index buffer data and storing the bounds information in an array buffer as well, which means it’s not well suited for storing other types of dynamic objects (though static point and line geometry may be supported more fully at some point).

I haven’t had a need myself, but I would think there’s a space for an efficient, stand-alone, dynamic spatial index of scene objects for uses like physics, other geometric spatial queries, or optimized raycasting in scenes with lots of objects, as @prominent mentioned above. A spatial scene index like that could be used in conjunction with this one, too, with the dynamic index being used for broad queries / detection while three-mesh-bvh is used for individual triangle intersection detection.


Thanks to a recent contribution, an example showing how to use the library to improve point cloud raycasting performance by multiple orders of magnitude has been added! Right now it works by creating a separate mesh of degenerate triangles which can be treated as points, which means some extra index memory and a bit of custom raycasting logic are required. In the future a more robust, built-in implementation could be added but it will require some reworking of the indexing functions.
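The core of the trick, as a rough sketch rather than the demo’s exact code, is just generating an index where every triangle references the same vertex three times so each BVH leaf bound wraps a single point:

```js
import * as THREE from 'three';
import { MeshBVH } from 'three-mesh-bvh';

// pointsGeometry is the BufferGeometry backing a THREE.Points object.
function buildPointCloudBVH( pointsGeometry ) {

	const count = pointsGeometry.attributes.position.count;

	// Each "triangle" is ( i, i, i ), i.e. degenerate, so it behaves like a point.
	const index = new Uint32Array( count * 3 );
	for ( let i = 0; i < count; i ++ ) {

		index[ 3 * i + 0 ] = i;
		index[ 3 * i + 1 ] = i;
		index[ 3 * i + 2 ] = i;

	}

	const indexedGeometry = new THREE.BufferGeometry();
	indexedGeometry.setAttribute( 'position', pointsGeometry.attributes.position );
	indexedGeometry.setIndex( new THREE.BufferAttribute( index, 1 ) );

	// Queries against this BVH then need a bit of custom logic to treat each
	// degenerate triangle as a point (e.g. distance from the ray to the point).
	return new MeshBVH( indexedGeometry );

}
```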

Check the demo out here:

https://gkjohnson.github.io/three-mesh-bvh/example/bundle/pointCloudIntersection.html


I just recently released v0.4.0, which made some API improvements and added a couple of new features to enable a bit more flexibility in the BVH. I also added a new geometry sculpting demo inspired by tools like SculptGL and ZBrush. There are still some improvements that could be made to performance and features, but it was a fun little demo to make. WebXR sculpting would be a cool addition.

You can check it out here:

https://gkjohnson.github.io/three-mesh-bvh/example/bundle/sculpt.html



That’s a really fine thing you’ve done there!! Well done @gkjohnson


this is an amazing library, thank you for sharing it!


Let me join the cheer party too!
Great plugin! A must-have for any real-time raycasting. :+1:


Hello @gkjohnson! Will your library work well on non-manifold meshes, or does it require “clean” geometry to function correctly? I’m thinking about the “shape intersection demo” as a use case, but with dodgy imported/converted meshes.

Will your library work well on non-manifold meshes, or does it require “clean” geometry to function correctly?

The package doesn’t make any structural assumptions about the mesh. There are some things to keep in mind listed in the gotchas section, but generally a “triangle soup” model will work just fine.

I’m thinking about the “shape intersection demo” as a use case, but with dodgy imported/converted meshes.

That should work – the geometry intersection demo works by detecting intersection of triangles between the two meshes.

alright, thank you :slight_smile:

I just added a new demo showing how the BVH can be used to speed up extraction of edges along clip planes for things like CAD applications. Using the BVH, the clipped edges can be generated in real time on the 2-million-polygon demo model.
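The rough idea, sketched here with placeholder names (`bvh` and `clipPlane`, both assumed to be expressed in the geometry’s local frame) and assuming a version where `shapecast` accepts a callbacks object: visit only the triangles whose bounds cross the plane, then clip each one to collect edge segments.

```js
import * as THREE from 'three';

const tempLine = new THREE.Line3();
const segments = [];

bvh.shapecast( {

	// Only traverse into nodes whose bounds actually touch the clip plane.
	intersectsBounds: box => clipPlane.intersectsBox( box ),

	// For each candidate triangle, intersect its three edges with the plane
	// and keep the two crossing points as one segment of the clipped outline.
	intersectsTriangle: tri => {

		const points = [];
		const edges = [ [ tri.a, tri.b ], [ tri.b, tri.c ], [ tri.c, tri.a ] ];
		for ( const [ start, end ] of edges ) {

			tempLine.start.copy( start );
			tempLine.end.copy( end );
			const hit = clipPlane.intersectLine( tempLine, new THREE.Vector3() );
			if ( hit ) points.push( hit );

		}

		if ( points.length === 2 ) segments.push( points );

	}

} );
```

The collected segments can then be rendered as a line geometry overlaid on the clipped mesh.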

Check it out here:

https://gkjohnson.github.io/three-mesh-bvh/example/bundle/clippedEdges.html


And thank you again @prisoner849 for the suggestion!


There have been a couple of new releases recently that add some new features and a new demo!

InterleavedBufferAttributes are now supported for geometry position attributes, meaning glTF models generated from the gltf-transform and MeshOpt projects will now work out of the box without having to convert them.

And with the latest version the Surface Area Heuristic (SAH) BVH generation strategy has (finally) been made viable and produces a more optimal BVH, typically with a smaller memory footprint. From the following demo it seems raycasting is improved by 10-20% overall, but it would probably be improved further in some of the worst-case scenarios on a model where deep hierarchy traversals are required. You pay for it in generation cost, though, as it can take several times longer to produce compared to the CENTER strategy.
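For reference, opting into it is just a constructor option (a minimal sketch, with `geometry` standing in for whatever geometry you’re indexing):

```js
import { MeshBVH, SAH } from 'three-mesh-bvh';

// Slower to generate, but typically yields a tighter tree and faster raycasts.
geometry.boundsTree = new MeshBVH( geometry, { strategy: SAH } );
```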

I’ve put together this demo to help show the impact of different MeshBVH options and hopefully help people understand a bit more about how the whole project works!

https://gkjohnson.github.io/three-mesh-bvh/example/bundle/inspector.html


I came across Peter Shirley’s Ray Tracing in One Weekend series and figured it would be a neat use case for the project and a good opportunity to learn about path tracing, so I’ve just added a single-threaded CPU path tracing demo to the repo. I used a number of other papers, open path tracer implementations, and the PBR Book to help wrap my head around everything, but I think it came out pretty well. The demo includes some more advanced features like a GGX roughness model, direct area light sampling, and multiple importance sampling. Next up I’ll be looking into enabling raycasting / raytracing in a shader. Check out the new demo below!

CPU Path Tracing Demo


I just released v0.5.0 this weekend, which includes signature changes to a number of the BVH functions to help simplify function arguments and make the results more consistent, as well as the ability to use SharedArrayBuffers so the same BVH memory can be used across multiple WebWorkers. The functions will log a warning if you’re using an old signature, so the transition should hopefully be easy if you’re using a function that’s changed. If you’ve just been using three-mesh-bvh to override and accelerate Mesh.raycast then no changes will be needed!
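For reference, that common raycast-override setup looks roughly like this (a minimal sketch based on the README usage pattern):

```js
import * as THREE from 'three';
import { computeBoundsTree, disposeBoundsTree, acceleratedRaycast } from 'three-mesh-bvh';

// Patch three.js so Mesh.raycast uses a BVH whenever one has been computed.
THREE.BufferGeometry.prototype.computeBoundsTree = computeBoundsTree;
THREE.BufferGeometry.prototype.disposeBoundsTree = disposeBoundsTree;
THREE.Mesh.prototype.raycast = acceleratedRaycast;

// Then, per mesh:
mesh.geometry.computeBoundsTree();
```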

Release notes here:

And for fun a couple more path traced images made with the code from the new demo!


This library is awesome. I downloaded it for some collision detection, but once I’d tried the demos…

I make room/product configurators, and it would be fantastic to offer some browser-based raytracing. At the moment I download a scene and take it through Blender/Cycles.
This is the most realistic lighting I’ve ever seen in three.js. Props, @gkjohnson!

I’ve uploaded a test room. It still needs some tweaking, but I’m amazed at the difference some ambient shadows can make.
I used the GPU path tracing example as a starting point, and added a color pass.
https://gkjohnson.github.io/three-mesh-bvh/example/bundle/gpuPathTracing.html

https://devine.pub/raytrace/example/

Can’t wait to see what you do next! :smiley:

What is the name of the example with the rabbit? Could you share the link?

In the PointCloudIntersection example, is that just intersection with the points?

@Peter_Devine this is awesome! And what a clever approach to getting color working without full material support. Thanks for sharing. After a few more changes the raycasting shader code will be available to use in the next release and I’m pretty excited about the possibilities of raytracing in a shader.

@Zvyozdo4ka

What is the name of the example with the rabbit? Could you share the link?

If you take a look at the list of demos it’s listed as the lasso selection example. The demo uses a torus knot geometry instead, though.

In the PointCloudIntersection example, is that just intersection with the points?

As explained above, the point intersection example is just using degenerate triangles to emulate point queries. The demo shows raycasting, but with the shapecast function you can implement any kind of spatial query you want.
