How can I combine shader-based rendering with the three.js scene graph?

Hi! I am looking to use a shader to ray march a scene - for simplicity, we can assume it's just a sphere.

Now I would like to draw a cube the regular three.js way - by adding it to the scene graph and letting it rasterize as it normally does.

Is there a way to combine these two methods, so that I have both a sphere and a cube, with occlusion and other effects still working? In other words, can I generate vertices or otherwise create something mesh-like that does the same thing as the ray-march step, so that occlusion and other effects work as normal in a three.js scene graph?
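From what I've read, one answer may be to skip the compositing entirely: if the ray-march shader writes `gl_FragDepth` for each hit, the hardware depth test should resolve occlusion against rasterized meshes in a single pass. A fragment-shader sketch of what I mean, untested, assuming GLSL3 (e.g. a `ShaderMaterial` with `glslVersion: THREE.GLSL3` on a fullscreen quad) - all uniform and varying names here are illustrative and would have to be wired up from the JS side:

```glsl
precision highp float;

uniform vec3 uCamPos;        // camera.position
uniform mat4 uProjection;    // camera.projectionMatrix
uniform mat4 uView;          // camera.matrixWorldInverse
in vec3 vRayDir;             // world-space ray direction from the vertex shader
out vec4 fragColor;

float sdSphere(vec3 p, float r) { return length(p) - r; }

void main() {
  vec3 rd = normalize(vRayDir);
  float t = 0.0;
  bool hit = false;
  for (int i = 0; i < 64; i++) {              // sphere tracing
    float d = sdSphere(uCamPos + t * rd, 1.0);
    if (d < 0.001) { hit = true; break; }
    t += d;
    if (t > 100.0) break;
  }
  if (!hit) discard;                          // missed: leave the pixel alone

  // Project the hit point to clip space and write the matching depth value,
  // so the hardware depth test resolves occlusion against rasterized meshes.
  vec3 p = uCamPos + t * rd;
  vec4 clip = uProjection * uView * vec4(p, 1.0);
  float ndc = clip.z / clip.w;                // NDC depth in [-1, 1]
  gl_FragDepth = ndc * 0.5 + 0.5;             // default depth range [0, 1]

  fragColor = vec4(1.0, 0.3, 0.2, 1.0);       // flat color for brevity
}
```

That said, I'm not sure this covers every effect, so the two-pass idea below still seems worth spelling out.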

I currently have a method in mind but am unable to realize it in three.js. Hope somebody can clarify.

Generate a framebuffer and a depth buffer from the ray-marching shader (one pass or two, it does not matter much, since the result is deterministic). Rendered to textures, these two give a complete description of the ray-marched scene.
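On the GPU this pass would be a fragment shader, but the bookkeeping is easier to show on the CPU. A sketch of the idea under assumed conventions (pinhole camera at the origin looking down -z, a unit sphere at z = -5; all names illustrative):

```javascript
// CPU sketch of the ray-march pass producing a color buffer and a depth
// buffer. In practice this runs in a fragment shader.
function renderRaymarch(size) {
  const color = new Float32Array(size * size * 3);            // RGB framebuffer
  const depth = new Float32Array(size * size).fill(Infinity); // eye-space depth
  const sdSphere = (x, y, z) => Math.hypot(x, y, z + 5) - 1;  // signed distance

  for (let py = 0; py < size; py++) {
    for (let px = 0; px < size; px++) {
      // Ray through the pixel of a simple pinhole camera at the origin.
      const u = ((px + 0.5) / size) * 2 - 1;
      const v = ((py + 0.5) / size) * 2 - 1;
      const len = Math.hypot(u, v, 1);
      const dir = [u / len, v / len, -1 / len];

      let t = 0;
      for (let i = 0; i < 128 && t < 20; i++) {               // sphere tracing
        const d = sdSphere(dir[0] * t, dir[1] * t, dir[2] * t);
        if (d < 1e-3) {                                       // surface hit
          const idx = py * size + px;
          color.set([1, 0.3, 0.2], idx * 3);                  // flat color
          depth[idx] = t;                                     // ray distance as depth
          break;
        }
        t += d;                                               // safe step
      }
    }
  }
  return { color, depth };
}
```

Pixels the rays miss keep `Infinity` in the depth buffer, so the rasterized pass would always win there.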

Similarly, generate a texture containing the color data of the rasterized scene and another containing its depth buffer.
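For this half I believe three.js can render color and depth to textures in one pass, via a `WebGLRenderTarget` with an attached `DepthTexture`. A sketch (untested) assuming an existing `renderer`, `scene`, and `camera`, with `THREE` imported as usual:

```javascript
// Render the rasterized scene into a color texture plus a depth texture.
const size = renderer.getSize(new THREE.Vector2());
const target = new THREE.WebGLRenderTarget(size.x, size.y, {
  depthTexture: new THREE.DepthTexture(size.x, size.y), // depth as a texture
});

renderer.setRenderTarget(target);
renderer.render(scene, camera);
renderer.setRenderTarget(null);

// target.texture      -> color of the rasterized pass
// target.depthTexture -> depth of the rasterized pass
// Both could then be bound as uniforms in a compositing shader.
```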

Given these four textures, maybe we can compare the two depth maps per pixel and take the color value of whichever sample is closer to the camera, i.e. has the lower depth value.
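The comparison itself seems simple. A CPU sketch of the per-pixel pick I have in mind (on the GPU this would be a fullscreen pass sampling both depth textures; names are illustrative):

```javascript
// Per-pixel composite: keep whichever sample is closer to the camera.
// colorA/colorB are RGB Float32Arrays (3 floats per pixel); depthA/depthB
// hold one depth value per pixel, with Infinity meaning "no hit".
function compositeByDepth(colorA, depthA, colorB, depthB) {
  const n = depthA.length;
  const outColor = new Float32Array(n * 3);
  const outDepth = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    const takeA = depthA[i] <= depthB[i];      // lower depth wins
    const src = takeA ? colorA : colorB;
    outColor.set(src.subarray(i * 3, i * 3 + 3), i * 3);
    outDepth[i] = takeA ? depthA[i] : depthB[i];
  }
  return { color: outColor, depth: outDepth };
}
```

One thing I'm unsure about is that the two passes would have to agree on what "depth" means (both linear eye-space, or both NDC) before the comparison is valid.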

I would like to know whether this would work, and any ideas on how to get it working.