A working example that renders a .glb file on the server side (node)

Hi. Newbie here. I’m trying to render my first image from a Node.js script (no browser). After lots of hassle, I’ve managed to eliminate all the errors, but it seems like I eliminated too much! Here’s my code:

import jsdom from 'jsdom';
import * as fs from 'fs';
import { createCanvas, Image, loadImage } from 'canvas';
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader';
import bufferToArrayBuffer from 'buffer-to-arraybuffer';
import Blob from 'cross-blob';

const {JSDOM} = jsdom;
const dom = new JSDOM('<!DOCTYPE html><html><body></body></html>');

// Fake the browser globals that three.js and its loaders expect.
global.document = dom.window.document;
global.window = dom.window;
global.THREE = THREE;
global.self = global;
global.Blob = Blob;

// GLTFLoader calls URL.createObjectURL for embedded textures;
// emulate it by building a data URI instead.
global.URL = {
    createObjectURL: (blob) => {
        return new Promise(resolve => {
            blob.arrayBuffer().then(buffer => {
                const base64 = Buffer.from(buffer).toString('base64');
                resolve(`data:image/jpeg;base64,` + base64);
            });
        });
    }
};

async function main() {
    const w = 200;
    const h = 200;

    const scene = new THREE.Scene();

    const camera = new THREE.PerspectiveCamera(70, 1, 1, 10000);
    camera.position.y = 150;
    camera.position.z = 400;

    const loader = new GLTFLoader();

    const glbBuffer = fs.readFileSync(__dirname + '/Avocado.glb');
    const loadPromise = new Promise((resolve, reject) => {
        loader.parse( bufferToArrayBuffer(glbBuffer), '', function (gltf) {
            scene.add(gltf.scene);
            resolve();
        }, undefined, function (error) {
            console.error( error );
            reject(error);
        });
    });

    await loadPromise;

    const canvas = createCanvas(w, h);
    canvas.style = {};
    const renderer = new THREE.CanvasRenderer({
        canvas: canvas
    });

    renderer.setClearColor(0xffffff, 0);
    renderer.setSize(w, h);

    renderer.render(scene, camera);

    fs.writeFileSync('out.png', canvas.toBuffer())
}

main().then(_ => console.log('All done'));

with a package.json like this:

{
  "name": "3d-node",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "buffer-to-arraybuffer": "0.0.5",
    "canvas": "^2.6.1",
    "cross-blob": "^2.0.0",
    "esm": "^3.2.25",
    "jsdom": "^16.4.0",
    "three": "^0.119.1"
  }
}

And I’ve got the Avocado.glb file from here:

https://github.com/KhronosGroup/glTF-Sample-Models/blob/0bbfedda698c95d280c78e8dae1c587f1a54276d/2.0/Avocado/glTF-Binary/Avocado.glb

The problem is that when I execute this script like this:

$ node -r esm ./index.js

It prints no errors, but at the same time it doesn’t do anything, and it never even prints the All done. If it helps, I’m using Node v14.8.0.

To make sure there’s no misunderstanding, I’m trying to render a model (in .glb format) into a png file. I have no idea if the scene setup is right or if the camera is actually pointing at the object. But that’s an issue for the future. Right now, I’m focused on saving the file.
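One thing I’ve since added while debugging (a minimal sketch, with a stand-in main that just throws): in Node 14 an unhandled rejection only prints a warning, and a promise that never settles prints nothing at all, so a bare main().then(...) can fail in complete silence.

```javascript
// Sketch: surface errors that a bare `main().then(...)` would swallow.
// This `main` is a stand-in that just throws, to show the pattern.
async function main() {
  throw new Error('renderer failed'); // pretend something inside broke
}

main()
  .then(() => console.log('All done'))
  .catch(err => console.error('main() failed:', err.message));

// If neither line ever prints, main() is hanging on a promise that
// never settles (e.g. a loader callback that never fires).
```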


There’s code on the three.js repo that does this for e2e testing using Puppeteer. That’s probably the easiest way to go about this. You can see the generated images here.


Thanks, @looeee. I was hoping there would be a way to render using three.js without a browser involved (not even a headless one). Performance is a concern for me.

I’ve done a little bit of profiling. I’ve set up a scene with one object in it, the canvas is 1280×720 pixels, and my computer has a GPU.

As instructed by @looeee, I’ve run the browser in headless mode using Puppeteer. After setting up the scene, I rotate the object one degree in a loop and re-render the scene over and over again. Each time I render the scene, I call the canvas’s toDataURL method to convert the rendered scene into a string (PNG format). Then the PNG string is returned from the browser context to the Node.js context, where the PNG is reconstructed. I’ve profiled all these steps to see how long each one takes:

Rendering the scene: 0.04 ms
Calling the toDataURL method: 0.21 ms
Transferring the string from the browser to Node.js memory: 116.22 ms
Reconstructing the PNG in Node.js: 5.13 ms

But again, I’m not sure about the first two timings, since they seem a little too good to be true. There might be a problem with calling performance.mark() in Puppeteer (from within the browser context). In any case, it takes more than 120 ms per frame (that much I’m sure of), which is not good. I wish there was a way to render a scene in Node.js without needing a browser.
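To sanity-check at least the Node-side numbers without trusting performance.mark() in the browser context, I’ve been timing those steps with process.hrtime.bigint(). A minimal sketch (timeit is my own helper, not a library function):

```javascript
// Time a synchronous step on the Node side with a monotonic clock,
// independent of performance.mark() inside the browser context.
function timeit(label, fn) {
  const start = process.hrtime.bigint();
  const result = fn();
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(2)} ms`);
  return result;
}

// e.g., the PNG-reconstruction step:
// const png = timeit('Reconstructing the PNG',
//   () => Buffer.from(dataUrl.split(',')[1], 'base64'));
```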