I have a scene that I would like to optimize as far as lighting is concerned.
I read that example and understand that it uses a generator from a cubemap.
Now my question is: is it possible to place light probes in the scene, bake the scene, store the result in, say, JSON, and reinject that data into the scene when needed?
I actually tried to add the light probes, but the helper shows them as black. I use RectAreaLight, HemisphereLight, and AmbientLight.
The scene is static. That would lead to more accurate lighting than an unrelated image.
Serialization and deserialization of light probes is supported. Meaning: calling
scene.toJSON() works if light probes are part of the scene, and
ObjectLoader can handle light probes, too. Is that your question?
My goal is to improve my scene's performance, because my lighting makes it slower.
So I wanted a way to generate some kind of lightmap of my static scene and save that information.
Next time I load my scene without any lights, I would still see the lighting effect on the meshes, without the realtime cost.
It basically means that I render offline first, and people only ever see the optimized version.
And I understood that lightprobes could help in that regard.
Is scene serialization the approach I need? Where do I reinject the lighting information afterwards?
I am currently reading this and it might be of help.
Ok, so light probes can export their lighting info? → yes
But back to my initial issue: my light probes are all black; they don't catch anything (according to the helper). Why is that?
I can confirm that when I console.log my light probe objects, no data is registered, and the helper shows them as black.
Can someone give me a pointer to improve my understanding of these beasts, and tell me whether a light probe without a cubemap is indeed possible?
Trial and error for the win: I needed a CubeCamera.
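For anyone landing here later: a LightProbe does not sample the scene on its own. You have to capture the surroundings (e.g. with a CubeCamera) and convert that capture into SH coefficients with LightProbeGenerator. A rough sketch of the fix, assuming `scene` and a WebGLRenderer `renderer` already exist (the import path and API shape vary slightly between three.js versions, so treat this as illustrative):

```javascript
import * as THREE from 'three';
import { LightProbeGenerator } from 'three/addons/lights/LightProbeGenerator.js';

// Render target + cube camera used to capture the environment around the probe.
const cubeRenderTarget = new THREE.WebGLCubeRenderTarget(256);
const cubeCamera = new THREE.CubeCamera(0.1, 1000, cubeRenderTarget);

const lightProbe = new THREE.LightProbe();
scene.add(lightProbe);

// Position the capture at the probe and render the six cube faces.
cubeCamera.position.copy(lightProbe.position);
cubeCamera.update(renderer, scene);

// Convert the captured cubemap into SH coefficients for the probe.
// Without this step the probe stays black, which is what the helper was showing.
lightProbe.copy(LightProbeGenerator.fromCubeRenderTarget(renderer, cubeRenderTarget));
```

After this step the probe holds real data, so scene.toJSON() / ObjectLoader can carry the baked result to a lights-free version of the scene.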