I’m experimenting for an upcoming client project that involves building an AR camera on the web, one that creates a sort of displacement effect (wave, swirl, ripple) in the user’s sky, emitting from a particular point. I’ve got a hacked solution working, but I was wondering whether there’s a smarter, more performant way of achieving this before I commit to making this hack part of the final build. Any advice is much appreciated. Here’s a video of the hack I put together, and I’ll break the solution down below.
Basically, I have a simple displacement texture of a static ripple which I use on the material of a sphere. This sphere is rendered to a renderTarget, let’s call it `displacementTarget`. Then I use that `displacementTarget` texture as part of a displacement shader, alongside the sky texture, to create the displacement effect in the sky. This is also rendered to a renderTarget, called `backgroundTarget`. Finally, the `backgroundTarget` texture is used as the scene background of my final scene, which also contains a sphere wireframe and a single sprite at the point of displacement. This is what you see rendered in the video above. I’m using `OrbitControls` to control both the displacement sphere scene and the final scene so they stay in sync. In reality, I’d swap `OrbitControls` for `DeviceOrientationControls` and let the user navigate the scene with their device orientation.
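To make the pipeline concrete, here’s a rough sketch of the three-pass render loop described above, assuming three.js. All the names (`displacementScene`, `skyScene`, `finalScene`, `uDisplacement`, the target sizes) are my own placeholders, and scene/shader setup is elided — it’s a sketch of the structure, not the actual build:

```javascript
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
const camera = new THREE.PerspectiveCamera(60, 1, 0.1, 100);

// Pass 1 target: the sphere carrying the static ripple displacement texture.
const displacementTarget = new THREE.WebGLRenderTarget(1024, 1024);
const displacementScene = new THREE.Scene();

// Pass 2 target: sky texture distorted by the displacement pass.
const backgroundTarget = new THREE.WebGLRenderTarget(1024, 1024);
const skyScene = new THREE.Scene();
const displacementMaterial = new THREE.ShaderMaterial({
  uniforms: {
    uSky: { value: null },          // the sky texture
    uDisplacement: { value: null }, // filled in each frame from pass 1
  },
  // vertexShader / fragmentShader elided
});
skyScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), displacementMaterial));

// Final scene: wireframe sphere + sprite, with the distorted sky behind it.
const finalScene = new THREE.Scene();

function render() {
  // 1. Render the displacement sphere into its own target.
  renderer.setRenderTarget(displacementTarget);
  renderer.render(displacementScene, camera);

  // 2. Distort the sky using that target's texture.
  displacementMaterial.uniforms.uDisplacement.value = displacementTarget.texture;
  renderer.setRenderTarget(backgroundTarget);
  renderer.render(skyScene, camera);

  // 3. Use the result as the background of the final scene.
  finalScene.background = backgroundTarget.texture;
  renderer.setRenderTarget(null);
  renderer.render(finalScene, camera);

  requestAnimationFrame(render);
}
```

The same camera drives all three passes, which is what keeps the displacement sphere and the final scene in sync under OrbitControls.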
What do you think of this multi-render-target approach? I’m choosing to render the displacement onto a sphere because I want the curvature a sphere provides, so the displacement effect sits more naturally in the user’s sky.
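For what it’s worth, the core of a displacement pass like the one described is usually just offsetting the sky UV by a re-centred sample from the displacement target. A minimal sketch of that per-pixel math in plain JavaScript — the `strength` parameter and the 0.5 midpoint are my assumptions, not values from this build:

```javascript
// Hypothetical sketch of what the displacement shader does per pixel:
// sample the displacement render target, re-centre it, and use it to
// offset the UV used to sample the sky texture.
function displaceUv(uv, displacementSample, strength) {
  // Displacement textures commonly encode "no displacement" as 0.5,
  // so subtract the midpoint before applying the sample as an offset.
  return {
    u: uv.u + (displacementSample.r - 0.5) * strength,
    v: uv.v + (displacementSample.g - 0.5) * strength,
  };
}

// A neutral sample (0.5, 0.5) leaves the UV untouched.
const neutral = displaceUv({ u: 0.25, v: 0.75 }, { r: 0.5, g: 0.5 }, 0.1);
```

In GLSL this would be a couple of lines in the fragment shader of the second pass, but the arithmetic is the same.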
Thanks for taking a look.