I am building a furniture configurator and would like the user to be able to request a more realistic image than the three.js renderer produces. I found this thread,
which offers exactly this functionality, but there is no more recent information on it. My idea is to export a GLB file and load it automatically into Blender. Has anyone implemented a similar feature and has tips or solutions?
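For what it's worth, the Blender leg of that idea can be driven headlessly from a server process. Below is a minimal sketch, assuming Blender is on the PATH and that the GLB (or a template .blend) already contains a camera and lighting; the file paths are placeholders. Nothing is executed here except assembling the command line:

```python
import subprocess

# Inline bpy script Blender runs in background mode.
# bpy.ops.import_scene.gltf is Blender's built-in glTF/GLB importer;
# arguments after "--" on the Blender command line show up in sys.argv.
BLENDER_EXPR = (
    "import bpy, sys; "
    "glb, out = sys.argv[-2], sys.argv[-1]; "
    "bpy.ops.import_scene.gltf(filepath=glb); "
    "bpy.context.scene.render.engine = 'CYCLES'; "
    "bpy.context.scene.render.filepath = out; "
    "bpy.ops.render.render(write_still=True)"
)

def build_blender_cmd(glb_path, out_path, blender="blender"):
    """Assemble a headless Blender invocation (does not run anything)."""
    return [
        blender,
        "--background",               # no GUI
        "--python-expr", BLENDER_EXPR,
        "--", glb_path, out_path,     # passed through to the expression
    ]

cmd = build_blender_cmd("/tmp/scene.glb", "/tmp/render.png")
# subprocess.run(cmd, check=True)  # uncomment on a machine with Blender installed
```

The awkward part is everything around this: queueing render jobs, render time per request, and shipping the finished PNG back to the browser.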
Personally, I’d first check whether three-gpu-pathtracer would yield results realistic enough to accept - that would let you avoid quite a bit of complexity and stack mixing, since you could keep everything within JS and three.js.
If neither of these works, you can hop on the AI train: first render your scene in three.js in lower-grade 3D, save it as a PNG, then pipe it through Stable Diffusion / DALL·E / Midjourney (MJ doesn’t have an API, though) and ask the model to convert it into a realistic rendering with a proper prompt.
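To make that last option concrete, here’s a rough sketch of the server-side step, assuming a locally hosted Stable Diffusion behind the AUTOMATIC1111 web UI and its `/sdapi/v1/img2img` endpoint; the prompt, URL and strength values are just illustrative placeholders. Only the request payload is built and serialized here:

```python
import base64
import json

def build_img2img_payload(png_bytes, prompt, denoising_strength=0.45):
    """Build the JSON body for an img2img call: the three.js screenshot goes
    in as a base64 init image; a low denoising strength keeps the furniture
    geometry close to the original render."""
    return {
        "init_images": [base64.b64encode(png_bytes).decode("ascii")],
        "prompt": prompt,
        "denoising_strength": denoising_strength,
        "steps": 30,
    }

payload = build_img2img_payload(
    b"\x89PNG\r\n\x1a\n...",  # bytes captured client-side, e.g. from renderer.domElement.toDataURL()
    "photorealistic product shot of a wooden armchair, studio lighting",
)
body = json.dumps(payload)
# import requests
# requests.post("http://127.0.0.1:7860/sdapi/v1/img2img", data=body).json()
```

The trade-off versus the Blender route: you get an image in seconds instead of a full path-traced render, but the model may hallucinate details, so for a configurator you’d want to keep `denoising_strength` low enough that the chosen materials and proportions survive.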