Immersiv: A Next-Gen VJ & Visual Performance Tool Built with Three.js
Immersiv is a browser-based, real-time visual performance suite designed for musicians, streamers, and visual artists. It leverages the power of Three.js to bridge the gap between static 3D models and dynamic, audio-reactive live performances.
By combining GLTF asset loading, dynamic video textures, and a robust post-processing stack, Immersiv transforms any web browser into a professional VJ console.
Key Features & Three.js Implementation
1. Dynamic Video Texturing
At its core, Immersiv allows users to map live video or prerecorded clips onto 3D geometry.
- Implementation: We use THREE.VideoTexture to treat HTML5 video elements as dynamic textures. These are applied to loaded GLTF models or to the scene background/environment, allowing seamless integration of 2D footage into 3D space (see the sketch below).
- Use Case: A user can drag and drop a looping sci-fi tunnel video, and it instantly wraps around a spinning 3D skull or geometric shape, creating complex composite visuals in real time.
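A minimal sketch of this approach, assuming a playing video element with id source-video, an illustrative model path, and a scene created elsewhere in the usual Three.js setup:

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

// Wrap a playing <video> element (live stream or looping clip) as a texture.
const videoEl = document.querySelector('#source-video');
const videoTexture = new THREE.VideoTexture(videoEl);
videoTexture.colorSpace = THREE.SRGBColorSpace;

// Apply the video texture to every mesh in a loaded GLTF model.
new GLTFLoader().load('/models/skull.glb', (gltf) => {
  gltf.scene.traverse((child) => {
    if (child.isMesh) {
      child.material = new THREE.MeshBasicMaterial({ map: videoTexture });
    }
  });
  scene.add(gltf.scene); // 'scene' assumed from the surrounding setup
});
```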
2. Cinematic Post-Processing Stack
To achieve a “broadcast-ready” look, Immersiv relies heavily on THREE.EffectComposer. The rendering pipeline (sketched after this list) includes:
- UnrealBloomPass: Adds a high-dynamic-range glow, essential for neon and cyber-aesthetic visuals.
- FilmPass: Introduces film grain and scanlines for a tactile, analog feel.
- AfterimagePass: Creates “light trails” and motion blur, making fast-moving objects feel fluid and organic.
- RGB Shift & Kaleidoscope: Custom shader passes (using ShaderPass) allow users to fracture the screen or split color channels for glitch effects on the fly.
3. Audio Reactivity 
Immersiv isn’t just a visualizer; it listens.
- Tech: Using the Web Audio API alongside Three.js, we analyze frequency data in real time.
- Reaction: This data modulates mesh.scale, light.intensity, and shader uniforms. Bass kicks can trigger bloom flashes, while hi-hats might jitter the camera position (see the sketch below).
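A minimal sketch of the analyser-to-scene plumbing, assuming an audio element with id track; the band splits, modulation ranges, and the uLevel uniform are illustrative rather than Immersiv's actual values:

```js
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;

// Route an <audio> element (or live input) through the analyser.
const source = audioCtx.createMediaElementSource(document.querySelector('#track'));
source.connect(analyser);
analyser.connect(audioCtx.destination);

const freqData = new Uint8Array(analyser.frequencyBinCount);

function animate() {
  requestAnimationFrame(animate);
  analyser.getByteFrequencyData(freqData);

  // Rough band averages: low bins ~ bass, high bins ~ hats/cymbals.
  const bass = freqData.slice(0, 8).reduce((a, b) => a + b, 0) / 8 / 255;
  const treble = freqData.slice(96, 128).reduce((a, b) => a + b, 0) / 32 / 255;

  mesh.scale.setScalar(1 + bass * 0.5);                       // kick drives scale
  keyLight.intensity = 1 + bass * 4;                          // and light/bloom flashes
  camera.position.x += (Math.random() - 0.5) * treble * 0.1;  // hats jitter the camera
  shaderMaterial.uniforms.uLevel.value = bass;                // feed shader uniforms

  composer.render(); // mesh, keyLight, camera, shaderMaterial, composer assumed from setup
}
animate();
```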
4. AI-Powered “Auto-VJ” Mode
We implemented an AutoRunManager that acts as a virtual lighting operator.
- Logic: Instead of simple linear interpolation, the camera follows compound sine-wave motion to orbit and zoom in an “organic,” handheld style (sketched below).
- Procedural Generation: The system automatically cycles through models, video textures, and effect palettes on a configurable interval, ensuring the visual stream never gets stale during long DJ sets.
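A sketch of the compound-sine idea; the frequencies and amplitudes here are illustrative, not the values AutoRunManager actually uses:

```js
// Layered sine waves at unrelated frequencies avoid an obvious loop
// and give the orbit a loose, "handheld" feel.
function updateAutoCamera(camera, t) {
  const radius = 6 + Math.sin(t * 0.13) * 1.5 + Math.sin(t * 0.041) * 0.8;
  const angle  = t * 0.07 + Math.sin(t * 0.23) * 0.4;
  const height = 1.5 + Math.sin(t * 0.17) * 0.9;

  camera.position.set(
    Math.cos(angle) * radius,
    height + Math.sin(t * 0.31) * 0.2, // slow vertical drift
    Math.sin(angle) * radius
  );
  camera.lookAt(0, 0, 0);
}

// Called each frame with elapsed time in seconds, e.g.:
// updateAutoCamera(camera, clock.getElapsedTime());
```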
5. AI Integration (Gemini)
We’ve integrated Google’s Gemini AI to analyze scene composition.
- AutoMask Pro: Uses AI to detect structural elements in uploaded photos (like windows on a building) and generates masking coordinates for projection-mapping simulations within the Three.js scene (rough flow sketched below).
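As a rough illustration of that flow, assuming the @google/generative-ai SDK; the prompt, response schema, model name, and key handling here are assumptions rather than the production code:

```js
import { GoogleGenerativeAI } from '@google/generative-ai';

// Hypothetical helper: ask Gemini for mask regions in an uploaded photo.
async function detectMaskRegions(base64Jpeg) {
  const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY); // key handling simplified
  const model = genAI.getGenerativeModel({ model: 'gemini-1.5-flash' });

  const result = await model.generateContent([
    'List the windows and other flat structural surfaces in this photo as ' +
      'normalized polygons, returned as JSON: [{ "label": string, "points": [[x, y], ...] }]',
    { inlineData: { data: base64Jpeg, mimeType: 'image/jpeg' } },
  ]);

  // The polygons can then be turned into THREE.Shape outlines for projection masks.
  return JSON.parse(result.response.text());
}
```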
Why Three.js?
Three.js was the obvious choice for Immersiv (and is thanked on our credits page; we're big fans and users) for a few reasons:
- Performance : The WebGLRenderer handles high-poly models and complex shader chains at 60fps directly in the browser.
- Ecosystem : Loaders like GLTFLoader and the extensive example library for post-processing meant we could focus on building the tool rather than the engine.
- Accessibility : It allows us to deliver a professional VJ app via a simple URL, with no installation required.
- Stack: Three.js, React, Vite, TailwindCSS, Google Gemini AI.
- Check it out: https://immersiv.haawke.com
