SOON: Adding a Three.js Renderer to Blender

Long story short: I’m adding Three.js as a render engine inside Blender, and there are all kinds of interesting things that could be done once this linkage is complete.

My question to the community is, what would you like to see?

I’ve built a few engines in Three.js and Babylon.js and was always frustrated by an art workflow that involved saving and re-checking assets. It’s just damn slow to save something as glTF thousands of times while you tweak a scene and perfect a look, especially if you’re playing with transmission and are curious how something will actually render, or what performance to expect (which is important when optimizing for the web).

To summarize how this actually works in Blender:

  1. Blender render engines can be thought of as separate “applications” or “modules” that are fed all the data structures and property changes happening in the Blender viewport, so the renderer can produce the image.
  2. Adding features to Blender 3.x is pretty easy because of how well they have abstracted everything. Essentially, you register your render engine with Blender, then provide a texture that Blender paints into the viewport window (and it announces what resolution it wants, and so on).
  3. Three.js is a browser technology, which is where this gets complicated: it doesn’t have a nice C API we can use to hook into Blender. So I have done the legwork to encapsulate an instance of Three.js in the Chromium Embedded Framework (CEF). I have shared memory sections (Windows-only right now, but Mac/Linux will come in time), and Chromium renders into a shared memory section that the Blender render Python script reads and feeds into the OpenGL context. (A minimal sketch of the page that lives inside that embedded browser follows this list.)
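
Conceptually, the page inside the embedded browser is just a normal three.js app whose frames CEF captures off-screen. A minimal sketch, illustrative only and not the actual code:

```js
// Minimal sketch of the three.js side that lives inside the CEF process.
// It is just a normal render loop; CEF's off-screen rendering is what
// copies the composited frames into the shared memory section.
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  50, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(0, 1, 5);

// Objects are created/updated later from Blender scene-update messages.
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```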

Without going into too much detail: each Blender render view spawns its own Chromium process, and they’re tied together with a job object on Windows, so if Blender closes there are no orphaned renderer processes left open. The more time-consuming part is efficiently serializing the Blender objects we’re interested in rendering (meshes, lights, material params, textures, channels) and sending them into Three.js, which lives in another address space inside a JavaScript VM (a shared-memory ring buffer is probably required just to get the data from one process to the other). This part is important because keeping latency down while moving large arrays around is what will make the whole thing actually run well rather than be a laggy mess.
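
To make the serialization side concrete, here is a rough sketch of what consuming one mesh update could look like inside the JavaScript VM. The wire format here (an object id, a vertex count, then raw float32 positions) is purely hypothetical:

```js
import * as THREE from 'three';

const scene = new THREE.Scene(); // the scene from the render loop above
const meshes = new Map();        // Blender object id -> THREE.Mesh

// Hypothetical wire format: [uint32 objectId][uint32 vertexCount]
// followed by vertexCount * 3 float32 positions.
function applyMeshUpdate(buffer /* ArrayBuffer pulled off the ring buffer */) {
  const [objectId, vertexCount] = new Uint32Array(buffer, 0, 2);
  const positions = new Float32Array(buffer, 8, vertexCount * 3);

  let mesh = meshes.get(objectId);
  if (!mesh) {
    mesh = new THREE.Mesh(new THREE.BufferGeometry(),
                          new THREE.MeshStandardMaterial());
    meshes.set(objectId, mesh);
    scene.add(mesh);
  }
  // Hand the typed array straight to the geometry to avoid an extra copy.
  mesh.geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
  mesh.geometry.computeVertexNormals();
}
```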

Good News
I have the remote rendering done already, meaning I can see Three.js rendering in Blender, but there is still work to do on serializing objects so we can render what’s actually supposed to be there :grinning:

Once this is done, you should be able to preview a scene (or at least whatever is compatible with the Three.js renderer) the same way you can in EEVEE or Cycles. But the question should be asked: what’s next? This seems to have the potential for a whole tool ecosystem, and I’m curious to hear what you all think.

10 Likes

I’ve seen people complaining that what they see in Blender is not what they see later on in Three.js. With your tool it will be easier to distinguish the two cases:

  • the difference is because Blender and Three.js use incompatible (or different) properties
  • the difference is because the intermediate 3D model file format has restrictions

I’m somewhat curious about animations and also about incompatible features/properties between Blender and Three.js → will you ignore them, or will you try to mimic them as much as possible?

Disclaimer: I’m not a good (or even an average) Blender user. I have used it on several occasions, but usually my time with Blender is spent like this: 10% doing what I need, 90% reading tutorials and watching videos on how to do what I need.

3 Likes

I should clarify that I’m planning to support only the Principled BSDF material shader, because it is the only one that is compatible with the glTF exporter, and it has pretty close parity across quite a few properties. Properties that aren’t part of what glTF supports, I’m going to ignore for now.

TL;DR: the whole point is optimizing the glTF model/scene creation workflow, so I plan to support at least anything you can save into glTF out of Blender.

Extra notes: regarding displaying animations, animations wouldn’t actually be sent into Three.js as animations; rather, position updates are sent to the renderer as scene updates while you’re in the editor (as far as I know). To be transparent, though, I haven’t dug into the animation side of it yet; it would come after getting the rendering working correctly.
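
For illustration, applying one of those scene updates on the Three.js side could be as simple as this (the update shape is hypothetical):

```js
// Hypothetical per-frame scene update pushed from Blender's depsgraph,
// rather than a baked AnimationClip.
function applyTransformUpdate(object3d, update) {
  object3d.position.fromArray(update.position);     // [x, y, z]
  object3d.quaternion.fromArray(update.quaternion); // [x, y, z, w]
  object3d.scale.fromArray(update.scale);           // [x, y, z]
}
```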

3 Likes

My thoughts are: since glTF can so easily be extended to carry additional properties on any node, it should be possible to configure and save custom settings in future versions of this ecosystem.

Imagine a menu in Blender, specific to Three.js glTF scenes, that lets you select meshes and set their render order, add custom flags, or really anything else you can think of. As long as there’s some code in place on the glTF loader side to handle these extra parameters, you can expand the scope of what you can author inside Blender.
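
For example (the property names here are purely hypothetical), the loader-side handling could be as small as:

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

const scene = new THREE.Scene(); // or your existing scene

// glTF "extras" land in object.userData after loading, so a Blender menu
// could write e.g. a renderOrder value or a noFrustumCull flag per mesh.
new GLTFLoader().load('scene.glb', (gltf) => {
  gltf.scene.traverse((o) => {
    if (o.userData.renderOrder !== undefined) o.renderOrder = o.userData.renderOrder;
    if (o.userData.noFrustumCull) o.frustumCulled = false;
  });
  scene.add(gltf.scene);
});
```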

I don’t want to personally promise this, but I imagine a world in the near future where my project is the first brick in making Blender a Web 3D IDE: a sandbox where you could code and create applications while leveraging the mature power of Blender as a tool.

3 Likes

You can use the Custom Properties panel on both materials and objects to add whatever extra properties you need in Blender. Make sure to check “Custom Properties” in the glTF export dialog (under “Include”, I think).

Then, in your glTF loader, add a traverse loop that reads every entry in the o.userData object and applies it to o itself.

I’ve been using this to apply things like reflectivity and envMapIntensity in my materials in Blender.
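
Roughly, the loop looks like this (untested sketch; note that material custom properties come through on o.material.userData rather than o.userData):

```js
// Inside the GLTFLoader onLoad callback:
gltf.scene.traverse((o) => {
  // Copy every exported custom property onto the object itself...
  Object.assign(o, o.userData);
  // ...and material custom properties (reflectivity, envMapIntensity, etc.)
  // onto the material. No type checking here, so the property names set in
  // Blender must match real three.js property names.
  if (o.material) Object.assign(o.material, o.material.userData);
});
```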

The only caveat to this, obviously, is that you don’t actually see the effects of these properties in Blender, only in Three.js.
So to that point, a Three.js renderer for Blender would be very welcome.

3 Likes

This is pretty cool! I think the Chromium foundation could become more generic, and then it could work for all the web engines, e.g. Three.js, Babylon.js, PlayCanvas, r3f, A-Frame, Lume, etc.

Each framework or engine author could use an API on the JS side that you’ve mapped to/from the Blender data, then attach their engine that way.

It would be the ultimate Blender-to-web preview system for all web engines/frameworks.

5 Likes

@Nowayz Is this work visible somewhere? GitHub?

2 Likes

@trusktr I’ve been out of town for another project recently, but I am back now and trying to find some free time so I can finish the first release of this. It’s on GitHub, and I will open the public repo and make an announcement when I’ve got the first revision working completely.

4 Likes

This is precisely my thought as well :slight_smile:
I have posted here in the three.js forum because that’s the first integration I’m planning, but the code that powers this integration would work for any 3D platform on the web.

3 Likes

How’s this project going? Very excited about it!

2 Likes