Building an asset pipeline in JavaScript

So, here’s a question that is somewhat tangentially related to three.js.

In any serious game development, a major component is the asset pipeline - the build process which takes the raw exported files from the 3D artists and performs the various offline processing steps and bundles them into resource files that are consumable by the game engine. Because many resources such as scripts, sounds, materials and textures are shared between different models, the linking process can get quite complex. (Fun fact: when I worked at EA in 2006, one of the engineers working on the Godfather project told me that a full resource build took over 12 hours of real time!)

So, my question is this: what node.js build system would be suitable for building an asset pipeline? There are a zillion bundlers out there (webpack, rollup, parcel, and many more), but almost all of them are targeted at creating websites, not games. The central thing that they do is create bundles of obfuscated JavaScript code; all of the other assets that they process are secondary. I have no need to create a JS bundle (well, I do, but that’s a completely separate build system - I’m using Snowpack for that.)

The only build tool that I have discovered that is not opinionated in this way is Gulp - and in fact, I’ve been using Gulp for about 6 months for this, and I’ve developed a modestly complex pipeline which processes GLB files, textures and so on.

However, Gulp is comparatively old and clunky - it’s based on the old pre-Promise Node.js stream pipelines, and most of the modern bundlers provide a much friendlier interface for defining custom build steps.

Essentially what I am looking for is a modern, JavaScript-hosted equivalent of “make”.

Any ideas?

Not any sort of comprehensive answer here, but a few ideas:

  • Caporal: Helpful library for building CLI tools in JS or TS. Can be used as a thin wrapper around any number of async functions, like this example.
  • glTF-Transform: Lots of tools for processing glTF/GLB, runs in the browser or Node.js.

Thanks! I hadn’t seen Caporal before.

I am using glTF-Transform as one of the builder methods in the Gulp pipeline - I use it to strip out unnecessary objects (cameras, lights) that are inserted by Blender, as well as to combine many small .glb files into larger catalogs that can be loaded with a single HTTP call (I use this for trees and flora models).
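For anyone curious what this kind of pipeline step is actually operating on, here is a minimal sketch of the GLB container layout itself, using only Node’s Buffer API. glTF-Transform does this (and vastly more) for you; this toy parser just lists the chunks in a .glb file, and the synthetic buffer at the end is invented purely for demonstration:

```javascript
// List the chunks inside a GLB container: a 12-byte header (magic, version,
// total length) followed by 4-byte-aligned chunks, each with its own
// length/type header. The JSON chunk holds the glTF scene description.
function listGlbChunks(buf) {
  if (buf.readUInt32LE(0) !== 0x46546c67) throw new Error('not a GLB file'); // magic 'glTF'
  const version = buf.readUInt32LE(4);
  const totalLength = buf.readUInt32LE(8);
  const chunks = [];
  let offset = 12;
  while (offset < totalLength) {
    const length = buf.readUInt32LE(offset);
    const type = buf.toString('ascii', offset + 4, offset + 8);
    chunks.push({ type, length });
    offset += 8 + length + ((4 - (length % 4)) % 4); // chunks are 4-byte aligned
  }
  return { version, chunks };
}

// Build a tiny synthetic GLB (JSON chunk only) to demonstrate:
const json = Buffer.from(JSON.stringify({ asset: { version: '2.0' } }).padEnd(28, ' '));
const header = Buffer.alloc(12);
header.writeUInt32LE(0x46546c67, 0);           // magic 'glTF'
header.writeUInt32LE(2, 4);                    // container version
header.writeUInt32LE(12 + 8 + json.length, 8); // total file length
const chunkHeader = Buffer.alloc(8);
chunkHeader.writeUInt32LE(json.length, 0);     // chunk length
chunkHeader.write('JSON', 4, 'ascii');         // chunk type
const glb = Buffer.concat([header, chunkHeader, json]);

console.log(listGlbChunks(glb)); // version 2, one JSON chunk of length 28
```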


all bundlers handle non-js assets. they don’t turn assets into “obfuscated js” they hash and copy them into the dist folder, so that they become fast-access cached items. bundlers are not meant for websites per se, they are asset managers. if you want to alter assets at build time then that’s what loaders are for (image-loader compresses, glsl-loader takes care of imports, etc). otherwise you’ll receive the url and the asset is still managed, typesafe and controlled. using gulp for anything would be crazy imo.

you might also want to look into react, this is the closest thing to a controlled asset pipeline you will encounter. especially gltfjsx: GitHub - pmndrs/gltfjsx: 🎮 Turns GLTFs into JSX components. this solves many of the issues that are plaguing three projects.

how do i go from a blender model to a dynamic asset that can be used/animated/interacted with in my game/project: https://twitter.com/akella/status/1341811097905586178

how do i cache,
how do i re-use models,
how do i alter models without destroying the source data: reuse gltf - CodeSandbox

how do i deal with async loading,
fallbacks and level-of-detail: https://twitter.com/0xca0a/status/1358789334145658882

it handles all other loaders (textures, …) in the same manner.


Again, those are all things necessary for creating a web bundle, which is not what I need. Converting images and other assets into something that can be bundled into a JavaScript file is not helpful in this case, because my assets are not part of the client bundle.

All of my graphical assets are managed on the server. The game is designed to support huge worlds, where only a small part of the world is loaded at any given time. As you move around the world, the client is continually loading and unloading scenery which is requested from the server.

Effectively what this means is that my assets are using the filesystem as a database. Building that filesystem structure is one of the jobs of the asset pipeline.

None of the bundlers that I looked at are helpful for this use case. I don’t want to create a dependency on “make”, “cmake” or any non-npm resource, but that’s the kind of functionality I need.

I’m really surprised that the npm ecosystem doesn’t have something like this - people are competing so much to produce the best website bundler that they don’t seem to be aware of other use cases.

To be honest I’m used to running “make” periodically, but not writing any serious systems with it, so “make, but JS” might not be as clear to me as it is to you. :slight_smile: I guess I’d summarize my interpretation of this as:

“Define many tasks, potentially with other tasks as prerequisites. Since this is JS, those tasks would ideally be async functions. Evaluate the whole dependency graph, or any subtree of it, generating arbitrary outputs (not necessarily webpage assets) efficiently.”

Does that sound right?

Also just found this: https://jakejs.com/. Pretty surprised to see it has more weekly NPM downloads than Grunt, and I’d never heard of it.
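Just to make that “make, but JS” interpretation concrete, here is a rough sketch of the semantics: named tasks with prerequisites, each an async function, evaluated with memoization so shared prerequisites run only once. All names here are invented for illustration (Jake, Gulp, etc. provide real versions of this, with cycle detection and much more):

```javascript
// Tiny dependency-graph task runner: task(name, deps, action) registers a
// task; run(name) resolves its prerequisites first, and each task runs at
// most once per build (no cycle detection in this sketch).
const tasks = new Map();
const results = new Map();

function task(name, deps, action) {
  tasks.set(name, { deps, action });
}

async function run(name) {
  if (results.has(name)) return results.get(name); // memoized: run at most once
  const t = tasks.get(name);
  if (!t) throw new Error(`unknown task: ${name}`);
  const promise = Promise.all(t.deps.map(run)).then(() => t.action());
  results.set(name, promise);
  return promise;
}

// Example: textures and models both depend on a shared manifest step.
const order = [];
task('manifest', [], async () => { order.push('manifest'); });
task('textures', ['manifest'], async () => { order.push('textures'); });
task('models', ['manifest'], async () => { order.push('models'); });
task('build', ['textures', 'models'], async () => { order.push('build'); });

run('build').then(() => console.log(order.join(' -> ')));
```

Note that `manifest` appears in the output exactly once even though two tasks depend on it, which is the key property that distinguishes this from naively chaining async functions.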

Yep, that sounds right. I think the only item I would want beyond that - and this is strictly a nice-to-have, not a requirement - is the ability to write processing functions as plugins which would run inline, rather than forking a separate process (similar to the way other bundlers do - for example, you can write callbacks to HtmlWebpackPlugin).

Jake looks intriguing, I will check it out.


this does not happen. bundlers don’t do that. they merely manage the asset, be it an image, a texture, a gltf model, a hdri map, etc. and by manage i mean that the asset is available on the server, can be imported, cached, has a version and is type safe. bundlers don’t “make websites”, they just pack assets, for the web, but also node and from there to other platforms.

As you move around the world, the client is continually loading and unloading scenery which is requested from the server.

dynamic imports. bundlers all support that.

Effectively what this means is that my assets are using the filesystem as a database.

that is what bundlers do:

if (level !== currentLevel) {
  const asset = await import(`./assets/stage-${level}.glb`)
}
at build time it will AST-analyze that piece of code and ready the contents of assets, preparing them to be available. they are not part of a bundle, they will not be converted to javascript. but they are part of the distro and can be dynamically fetched and used whenever the app needs it.

as for assets that are not part of the distro, it’s the same thing. just fetch requests and/or dynamic esm imports.

could be i don’t understand, but i really don’t see the problem. we deploy similar strategies at work successfully.

There are specialized pipelines for game development, like Houdini PDG, that don’t just pack existing assets, but do significant processing of mesh and texture data to create the final assets.

Consider a pipeline that converts source models to glTF, compresses mesh vertex data, optimizes UV layouts for GPU texture formats, resamples animation, or splits a large world mesh into smaller tiles appropriate for streaming. Perhaps Webpack could be extended to do these things, but I’m not sure it’s the most natural choice.

So, I should probably mention that I’ve used Webpack in my day job for the last seven years, and have even written a custom plugin for it. I’ve also used Rollup, Parcel and Snowpack. In fact, my current side project uses Snowpack for the game editor, since the faster build times are important with a large code base. Although, sadly, I have not figured out how to get hot reloading working with three.js - that is, if I change some code in a React component, it hot reloads just fine, but if I edit a shader or a class that generates meshes, the game just hangs, so for now I have hot reloading disabled.

While it is technically possible to use Webpack and other bundlers for other things, they are primarily designed to create resources that are importable via the JavaScript import mechanism. They are fairly opinionated in this respect.

The problem with that approach is that you lose a lot of control over loading lifecycles. For one thing, three.js resources need to be properly disposed - they can’t just be garbage collected. This is because buffers and shaders allocate resources on the GPU, and if you want your app to run for more than 30 minutes you had better not leak this stuff. So you need an explicit unload mechanism - I tend to use the lru-cache package for this, which lets me tweak and tune how long the resources live in memory.
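The lifecycle idea here can be sketched in a few lines: evicted cache entries get an explicit dispose callback instead of waiting for garbage collection. The real lru-cache package provides this through its `dispose` option; the toy class and cell names below are invented just to show the mechanic:

```javascript
// Minimal LRU cache with a dispose hook. A Map preserves insertion order,
// so the first entry is always the least recently used one.
class DisposingLRU {
  constructor(max, onDispose) {
    this.max = max;
    this.onDispose = onDispose;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key); // re-insert to mark as most recently used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      const [oldestKey, oldestValue] = this.map.entries().next().value;
      this.map.delete(oldestKey);
      this.onDispose(oldestKey, oldestValue); // e.g. call geometry.dispose() here
    }
  }
}

const disposed = [];
const cache = new DisposingLRU(2, (key) => disposed.push(key));
cache.set('cell-0-0', 'meshA');
cache.set('cell-0-1', 'meshB');
cache.get('cell-0-0');          // touch: cell-0-1 is now least recently used
cache.set('cell-0-2', 'meshC'); // evicts cell-0-1, triggering dispose
console.log(disposed);          // → [ 'cell-0-1' ]
```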

The game world is organized into a grid of cells, each 16 x 16 meters - there are thousands of cells in a typical overland map. The resources for each cell are fetched via window.fetch, with an API path like /api/scenery/&lt;realm&gt;/&lt;cell-x&gt;/&lt;cell-y&gt;. The server (written in node.js/Koa) also supports area queries, so you can load the entire surrounding terrain in a single HTTP request. Some of these resources are .glb files, but a lot of it is game data that is encoded in msgpack format - another task that the build pipeline needs to accomplish. So for example, the metadata for an actor is a msgpack structure that includes character behavior data, color remappings, and so on - as well as a reference to a .glb skin and armature, which are loaded separately since they are shared between many actors.
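The client-side cell math for a scheme like this is straightforward; here is a hypothetical sketch (the function names and the exact path shape are my assumptions, not the real API) of mapping a world position to a cell and enumerating the surrounding cells for an area query:

```javascript
// 16 m cells: world position -> cell coordinates -> per-cell URL, plus a
// helper that collects every cell within a radius for one area query.
const CELL_SIZE = 16; // meters

function cellForPosition(x, z) {
  return { cx: Math.floor(x / CELL_SIZE), cy: Math.floor(z / CELL_SIZE) };
}

function cellUrl(realm, cx, cy) {
  return `/api/scenery/${realm}/${cx}/${cy}`;
}

// All cell coordinates within `radius` cells of the given position.
function surroundingCells(x, z, radius) {
  const { cx, cy } = cellForPosition(x, z);
  const cells = [];
  for (let dy = -radius; dy <= radius; dy++) {
    for (let dx = -radius; dx <= radius; dx++) {
      cells.push({ cx: cx + dx, cy: cy + dy });
    }
  }
  return cells;
}

console.log(cellUrl('overland', 3, -2));         // → /api/scenery/overland/3/-2
console.log(surroundingCells(40, 40, 1).length); // → 9 (a 3x3 block of cells)
```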

I haven’t gone too far into mesh vertex optimization yet, although I may reach that point. I’ve made the deliberate choice to do most of my coding on a 2015 MacBook Air, which gives me a framerate of about 20 fps - but when I run it on a more modern machine, I get a smooth 60fps. The reason for coding on the older platform is to keep myself from getting too fancy with the graphical detail: I basically keep adding features until I can’t stand the frame rate, and then go optimize for a while until I feel comfortable again :slight_smile:

I’m also experimenting with “reactive game scripting” - taking the basic ideas of systems like MobX or Recoil and applying them to character and scenery behavior rather than HTML widgets. (I have authored a number of obscure game scripting languages, the primary one being SAGA - Scripts for Animated Graphic Adventures, which is still supported by ScummVM.)

Reactive scripting means that scripts are not written in an imperative language like Lua/Python/C#/Papyrus/etc. but are more like a spreadsheet formula, one that recalculates automatically when its dependencies change - so it’s relatively easy to attach an expression like “door.open = button1.active” and have the door open or close whenever the toggle button changes state - without all that tedious mucking about with subscribing and unsubscribing to event channels. I’m in the middle of constructing a general framework for this, which I call “reflex”.
