Wormhole renderer

I found this amazing video:

Would anyone have any clue on how to reproduce such a “renderer” in Three.js?

2 Likes

a fairly easy but heavy way would be to render 2 scenes using CubeCameras and then just draw a full-screen plane with a shader mixing them. as you can see, to get a decent resolution this method requires rendering 12 times more than a normal scene would (6 cube faces × 2 scenes). to get better performance and use normal cameras you would need to carefully track the visible areas in both scenes, which kind of triples the effort math-wise (unless maybe you are willing to cut down on the “lensing” effects)
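roughly, the setup could look something like this — a minimal sketch, assuming two populated scenes; sizes, positions and names are just placeholders:

```js
// Two worlds, each observed by its own CubeCamera sitting at the wormhole;
// the resulting cube maps get mixed later in a fullscreen shader pass.
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const sceneA = new THREE.Scene(); // the "near" world
const sceneB = new THREE.Scene(); // the world seen through the wormhole
// ...populate both scenes...

// one cube render target / camera per scene, both placed at the wormhole
const targetA = new THREE.WebGLCubeRenderTarget(1024);
const targetB = new THREE.WebGLCubeRenderTarget(1024);
const cubeCamA = new THREE.CubeCamera(0.1, 1000, targetA);
const cubeCamB = new THREE.CubeCamera(0.1, 1000, targetB);
const wormholePosition = new THREE.Vector3(0, 0, -10); // illustrative
cubeCamA.position.copy(wormholePosition);
cubeCamB.position.copy(wormholePosition);
sceneA.add(cubeCamA);
sceneB.add(cubeCamB);

function updateCubeMaps() {
  // 6 faces per cube camera => 12 scene renders per frame, hence "heavy"
  cubeCamA.update(renderer, sceneA);
  cubeCamB.update(renderer, sceneB);
}
```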

1 Like

@makc3d would you again do your magic with a quick drawing? It’s hard for me to figure it out…

Where do you position the 2 CubeCameras, and how exactly do you mix the render targets, I guess?

Thank you

at exactly the same place but one in each of your 2 scenes. then you can have a typical fullscreen quad setup where you mix the pixels from the resulting two cube maps in the shader depending on how close you are to the wormhole.

you literally just mix them: mix(textureCube(map1, ray), textureCube(map2, ray), ratio); — where ray is your final camera’s 3D ray shooting through that particular pixel

now the way you construct the ray is totally up to you, though — you can just emulate a PerspectiveCamera, add some curvature to make it nicer, or even do it in a physically near-accurate way as they did for the Interstellar movie.
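a bare-bones version of that mixing pass might look something like this — just a sketch, with a plain perspective unprojection for the ray (no curvature) and illustrative uniform names:

```js
// Fullscreen quad (PlaneGeometry(2, 2)) drawn with this material; per frame,
// copy camera.projectionMatrixInverse into invProjection and camera.matrixWorld
// into invView, and drive `ratio` from the camera's distance to the wormhole.
const mixMaterial = new THREE.ShaderMaterial({
  uniforms: {
    map1: { value: targetA.texture },  // cube map of scene A
    map2: { value: targetB.texture },  // cube map of scene B
    ratio: { value: 0.0 },             // 0 = all scene A, 1 = all scene B
    invProjection: { value: new THREE.Matrix4() },
    invView: { value: new THREE.Matrix4() },
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4( position.xy, 0.0, 1.0 ); // fullscreen quad
    }
  `,
  fragmentShader: /* glsl */ `
    uniform samplerCube map1;
    uniform samplerCube map2;
    uniform float ratio;
    uniform mat4 invProjection;
    uniform mat4 invView;
    varying vec2 vUv;
    void main() {
      // reconstruct the camera ray for this pixel (plain PerspectiveCamera emulation)
      vec4 ndc = vec4( vUv * 2.0 - 1.0, 1.0, 1.0 );
      vec4 viewPos = invProjection * ndc;
      vec3 ray = normalize( ( invView * vec4( viewPos.xyz / viewPos.w, 0.0 ) ).xyz );
      gl_FragColor = mix( textureCube( map1, ray ), textureCube( map2, ray ), ratio );
    }
  `,
  depthTest: false,
  depthWrite: false,
});

// e.g. per frame:
// mixMaterial.uniforms.ratio.value =
//   1.0 - THREE.MathUtils.clamp(camera.position.distanceTo(wormholePosition) / maxDistance, 0, 1);
```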

edit: another way would be to just cheat and look it up on github :smiley:

2 Likes

Related example I found:

2 Likes

a conversation on the subject in the postprocessing channel of the Poimandres Discord

Now that is cool! Really lovely transitions there!

and

4 Likes

still collecting resources…

3 Likes

As a physicist, I find this very interesting. I understand the general theory of relativity and Einstein’s field equations very well. This was one of the reasons I studied physics. I once had a homepage for this, but it was far too mathematical.
Unfortunately, I’m already fully booked. Because of the ocean project that Phil Crowther and I are working on together, I have to learn WebGPU, since we have reached the limits of WebGL2. The space-time of the 3D world is Euclidean; simulating a Riemannian curved geometry is a great project. A physicist colleague discovered a new warp metric in 2021 that caused a stir in the professional world because it is the first of its kind. I will follow your project.

3 Likes

i was just going to suggest a portal

and a magnifier thrown on top for refraction similar to (although shaped in a continuous way)

but now that i read this i just feel silly for even thinking about that. :sweat_smile:

2 Likes

I like the decomposition, non-physical approach :slight_smile:

According to the last video, there are 2 parts in the wormhole:

  1. the mouth
  2. the throat

@drcmda do you think the “mouth deformation” could be handled with refraction only?

Or even when getting closer and closer like in the first video? :
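(For the record, a “refraction only” mouth in the spirit of the magnifier idea above could be as small as this — a minimal sketch using MeshPhysicalMaterial’s transmission; the geometry and values are placeholders, and an existing scene is assumed:)

```js
import * as THREE from 'three';

// hypothetical "mouth": a transmissive lens-like mesh that refracts whatever
// is rendered behind it (three.js handles the transmission pass automatically)
const mouthMaterial = new THREE.MeshPhysicalMaterial({
  transmission: 1.0, // fully see-through, like glass
  roughness: 0.0,
  thickness: 2.0,    // how "deep" the refraction volume is
  ior: 1.5,          // higher = stronger bending of the background
});
const mouth = new THREE.Mesh(new THREE.SphereGeometry(1, 64, 64), mouthMaterial);
scene.add(mouth); // assumes an existing scene
```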

1 Like

I know the picture in the middle. It is a real simulation of a wormhole. The building behind it is the physics institute in Tübingen.

But to each his own! You don’t have to like physics.
It’s just my personal obsession since I’m a physicist. I have the Earth, Mars and the Moon at full scale, and I can land on all of them. The map material uses many GB. Simulating a wormhole from Earth to Mars would be awesome.

Your project is very interesting, whether physically or not, and I will follow it.

yup, they actually had a 2.0-ish video with the colored rods going through the wormhole which would be kind of hard to imitate in 3js :sweat_smile:

1 Like

I didn’t know the video. If you want to simulate something like that, I don’t think you can avoid a mathematically correct approach. The same can be done in 3js; you just have to work with the Schwarzschild metric like the simulation does. My cloud shader produces very real-looking clouds because I mathematically replicate the light scattering of reality. I have a simplified repo on github if anyone is interested.
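(Numerically, “working with the Schwarzschild metric” for light rays can be as small as integrating the photon orbit equation d²u/dφ² + u = 3Mu², with u = 1/r and geometric units G = c = 1 — a rough sketch, not taken from the simulation or the dissertation:)

```js
// Bend one light ray in the equatorial plane of a Schwarzschild geometry.
// r0: starting radius, dr0: initial dr/dphi, M: mass in geometric units.
function traceLightRay(r0, dr0, M, steps = 2000, dPhi = 0.001) {
  let u = 1 / r0;            // u = 1/r
  let du = -dr0 / (r0 * r0); // du/dphi from the initial radial direction
  const points = [];
  for (let i = 0; i < steps; i++) {
    points.push({ r: 1 / u, phi: i * dPhi });
    // semi-implicit Euler step of u'' = 3*M*u^2 - u
    du += (3 * M * u * u - u) * dPhi;
    u += du * dPhi;
    if (u <= 0) break;           // ray escaped to infinity
    if (u > 1 / (2 * M)) break;  // ray fell inside the horizon (r < 2M)
  }
  return points;
}
```

Each traced ray then tells you which direction to sample the background (or the second cube map) for that pixel.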

But whether it’s physically correct or done without physics where possible — if you don’t like the physics approach, I don’t mind. I always find it exciting when someone does something out of passion. And I can see that Abernier enjoys doing it :+1::blush:

The big weakness that 3js had was the lack of imageStore in shaders, but that’s fixed with WebGPU.
Phil Crowther’s ocean also uses mathematically accurate wave descriptions. Combined with my multithreaded quadtree geometry, this runs much more resource-efficiently. But the many intermediate renderings that are necessary because we can’t keep textures on the GPU with imageStore in WebGL2 are a very big handicap. They consume a lot of valuable resources.

Luckily WebGPU is now available, and Phil and I are now learning to use it. I’m obviously still using the computePass function incorrectly. There is still no documentation and very few examples. In any case, WebGPU gives 3js enormous possibilities.

Is it possible from within Three.js to use WebGL for graphics and WebGPU for computation at the same time?

@Abernier: I found the dissertation on the wormhole in front of the institute in Tübingen. Unfortunately, it doesn’t contain any usable code; it only describes the theory.

https://publikationen.uni-tuebingen.de/xmlui/handle/10900/48958?show=full

Perhaps one of the references in the dissertation will be useful.

You would have to use both renderers for this. I’m not a 3js developer, but I don’t think that’s advisable.
Two different render worlds sound like a dangerous solution to me, because both would claim the GPU independently.

1 Like

It’s possible to use GPUComputationRenderer in conjunction with WebGLRenderer, yes — this gpgpu_birds example uses a combination of both. I’m not sure how this would compare to computing and rendering everything with WebGPU though…
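For reference, the pairing looks roughly like this — a stripped-down sketch of the gpgpu_birds pattern, with a placeholder compute shader; renderer, drawMaterial, scene and camera are assumed to exist:

```js
import * as THREE from 'three';
import { GPUComputationRenderer } from 'three/addons/misc/GPUComputationRenderer.js';

const SIZE = 64; // a 64x64 texture = 4096 "particles"
const gpuCompute = new GPUComputationRenderer(SIZE, SIZE, renderer); // the normal WebGLRenderer

// initial data lives in a DataTexture created by the helper
const positionTexture = gpuCompute.createTexture(); // fill .image.data with start values
const positionVariable = gpuCompute.addVariable(
  'texturePosition',
  /* glsl */ `
    void main() {
      vec2 uv = gl_FragCoord.xy / resolution.xy;
      vec4 pos = texture2D( texturePosition, uv );
      // ...update pos here (this placeholder shader just copies it)...
      gl_FragColor = pos;
    }
  `,
  positionTexture
);
gpuCompute.setVariableDependencies(positionVariable, [positionVariable]);

const error = gpuCompute.init();
if (error !== null) console.error(error);

function animate() {
  gpuCompute.compute(); // GPGPU pass, ping-ponging render targets
  // feed the result into a regular material and render the scene as usual
  drawMaterial.uniforms.texturePosition.value =
    gpuCompute.getCurrentRenderTarget(positionVariable).texture;
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
}
```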

1 Like

I kinda doubt it, since it would not be so easy for a (webgl) shader to know when the ray hits any geometry. Perhaps with something like this.

2 Likes