I want to use a single canvas for several totally different scenes on a website. The cleanest solution I found is to make custom scene classes, each holding its own camera, and to render the corresponding scene through a public "camera" property of my custom scene. A basic implementation looks like this:
CustomScene1.js

import * as THREE from 'three';

export class CustomScene1 extends THREE.Scene {
    constructor() {
        super();
        this.camera = new THREE.PerspectiveCamera(25, window.innerWidth / window.innerHeight, 0.1, 1000);
        // instantiate this scene's 3D objects (meshes, lights, ...)
        this.add(meshes);
    }

    // Do the scene stuff...
}
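CustomScene2 follows the same pattern but with its own camera behaviour. Here is a rough sketch of what I mean (the orbiting update() method is just illustrative, not my actual code):

CustomScene2.js

import * as THREE from 'three';

export class CustomScene2 extends THREE.Scene {
    constructor() {
        super();
        this.camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 1000);
        this.camera.position.set(0, 2, 8);
        // instantiate this scene's own 3D objects...
    }

    // Scene-specific camera behaviour: a slow orbit around the origin
    update(time) {
        this.camera.position.x = Math.sin(time * 0.0005) * 8;
        this.camera.position.z = Math.cos(time * 0.0005) * 8;
        this.camera.lookAt(0, 0, 0);
    }
}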
Main.js

import * as THREE from 'three';
import { CustomScene1 } from '../CustomScene1';
import { CustomScene2 } from '../CustomScene2';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const cScene1 = new CustomScene1();
const cScene2 = new CustomScene2();
let showScene1 = true;

const camera = new THREE.PerspectiveCamera(25, window.innerWidth / window.innerHeight, 0.1, 1000);

const animate = () => {
    requestAnimationFrame(animate);
    if (showScene1) {
        renderer.render(cScene1, cScene1.camera); // NOT OK: nothing shows
        renderer.render(cScene1, camera);         // OK: renders as expected
    } else {
        renderer.render(cScene2, cScene2.camera);
    }
};
animate();
Embedding the camera inside the scene feels much cleaner to me, as each camera has its own specific behaviour that I don't want to mix with the others.
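For instance, the render loop could simply delegate to whichever scene is active, without Main.js knowing anything about that scene's camera (again just a sketch, assuming each custom scene exposes an update() method like the one sketched above):

const animate = (time) => {
    requestAnimationFrame(animate);
    const active = showScene1 ? cScene1 : cScene2;
    if (active.update) active.update(time); // the scene drives its own camera
    renderer.render(active, active.camera);
};
requestAnimationFrame(animate);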
PROBLEM: if I use the camera contained in the custom scene in the render() call, nothing shows, but if I create a camera in Main.js and use that one instead, it works…
Any idea why? Or should I use a different structure?
Thanks a lot for your help!