Browser-based module support

Nothing serious here, just looking to satisfy my curiosity.

The background of this question is similar to this one, but a little different, because it targets a browser’s built-in module support, rather than a bundler or a CDN. There is also a little confusion between what is in GitHub vs. what is in npm, which I’ll detail in a moment.

First, here are the basics of the environment:

<!-- path = /index.html -->
<script src="main.js" type="module"></script>
// path = /main.js
import { OrbitControls } from './node_modules/three/examples/jsm/controls/OrbitControls.js'

This fails in the browser because the npm version of three.js (0.130.1) imports its dependencies with the bare specifier from 'three';. The browser can’t resolve ‘three’ to a path, so the import fails.

Most of my confusion came when I was trying to reference the code while answering a SO question. On GitHub, the dependency import is: from '../../../build/three.module.js'; (and this holds up through the most recent version). Using that relative path fixes the resolution for the browser.

Now that I’m done pointing out the obvious, I guess my questions are:

Is the npm version of three.js specifically built with the assumption that you’ll be using it with a bundler? I guess that’s fine, if true, because you’re in the environment, so the assumption could be made that a bundler will be involved. But using es modules (not to be confused with node/npm modules) is becoming better supported, and it’s easier to use a manager like npm to keep dependencies up-to-date, rather than manually downloading each one from GitHub.

Is any consideration made for browser-based module support (beyond the core libraries*), or is the “roadmap” simply that three.js is always intended to be used with a bundler?

* making this distinction, because the examples are, after all, only meant to be examples.

P.S. I also noticed the npm version is 0.130.1, but there’s no associated tag in GitHub. Are patches not tracked on GH?

The NPM package is now intended to be used in an environment that can resolve NPM package names. See #21654. In short, if you are using npm install three, then you will also need a build tool or bundler that can resolve imports from 'three'. The source files on GitHub still use relative imports, if you need those, e.g. with a shallow git clone, but I’d really advise using a bundler if you are using NPM. Snowpack and Vite provide a good developer experience with less complexity than Webpack or Parcel, if that’s a concern.

You can also now use import maps in your HTML. These tell the browser where to find any module specifiers it encounters in your imported scripts. You can host the linked scripts yourself or use a CDN. No bundler required.

e.g., pointing to self-hosted scripts:

    <!-- ... some other html head tags ... -->
    <script type="importmap">
      {
        "imports": {
          "three": "./build/three.module.js",
          "three/examples/jsm/": "./jsm/"
        }
      }
    </script>

and in your main JavaScript module you reference modules using the named module specifiers instead:

import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
import Stats from 'three/examples/jsm/libs/stats.module.js';

const scene = new THREE.Scene();
// etc, the rest of your code
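For a concrete picture of what the browser does with those specifiers, here is a simplified model of import-map resolution. This is my own sketch, not spec code: an exact key wins, otherwise the longest key ending in "/" is matched as a prefix. The jsm mapping is an assumption that matches the boilerplate’s /jsm/ static route.

```javascript
// Simplified model of how the browser resolves a specifier against an import
// map. Assumption: two rules only (exact match, then longest "/"-suffixed
// prefix); the real spec has more machinery (scopes, URL normalization, etc.).
const importMap = {
  imports: {
    "three": "./build/three.module.js",
    "three/examples/jsm/": "./jsm/",
  },
};

function resolveSpecifier(map, specifier) {
  const imports = map.imports;
  // Rule 1: an exact key wins outright.
  if (Object.prototype.hasOwnProperty.call(imports, specifier)) {
    return imports[specifier];
  }
  // Rule 2: longest key ending in "/" that prefixes the specifier;
  // the remainder of the specifier is appended to the mapped address.
  let best = null;
  for (const key of Object.keys(imports)) {
    if (key.endsWith("/") && specifier.startsWith(key)) {
      if (best === null || key.length > best.length) best = key;
    }
  }
  if (best !== null) {
    return imports[best] + specifier.slice(best.length);
  }
  return null; // unmapped bare specifier: the browser throws a TypeError
}

console.log(resolveSpecifier(importMap, "three"));
// → "./build/three.module.js"
console.log(resolveSpecifier(importMap, "three/examples/jsm/controls/OrbitControls.js"));
// → "./jsm/controls/OrbitControls.js"
```

This is also why the prefix entry in the map matters: without it, only the bare 'three' import would resolve, and the jsm examples would still fail.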

JSFiddle : Threejs using import-maps

I have also created a Threejs boilerplate that demonstrates this technique using Three r130

GitHub : Threejs Boilerplate using import-maps

The r130 scripts in the GitHub boilerplate are served via Node.js (app.js) using static routes pointing to the three.js install in node_modules.

see app.js

app.use('/build/', express.static(path.join(__dirname, 'node_modules/three/build')));
app.use('/jsm/', express.static(path.join(__dirname, 'node_modules/three/examples/jsm')));

To try it

git clone
cd Threejs-Boilerplate
npm install
npm start

Visit in browser

everything on npm is specifically built for node resolution. that is the whole point: modules that declare their dependencies.

the es module spec only works for modules without dependencies, and import maps are as of yet just a rough draft. pointing to ../../dist contradicts package resolution. the only thing to be done here imo is to wait for the specs to complete.

That essentially answers my questions. And I get it: use appropriate tools when appropriate.

My only remaining concern is: should this be called out somewhere? In the docs? In the npm README? Based on StackOverflow topics I’ve seen, I’m not the only one who looked at npm as a much less rigid environment.

I’m just looking to forestall assumptions (like mine), and instill some confidence that a developer has the right package for the right purpose, since even npm itself hedges its bets by saying it is (despite having “Node” in its name) simply a “JavaScript package manager”/“registry of JavaScript software,” and only “the default package manager bundled with Node.”

Good to know, but I err on the side of established specs. Thanks!

since even npm itself hedges its bets by saying it is (despite having “Node” in its name) simply a “JavaScript package manager”

Libraries on NPM may be used in Node.js environments or on the Web, hence the “N” in NPM being a misnomer. But anything on NPM with dependencies is still expected to import those dependencies with a bare identifier, not a relative path. And so most libraries on NPM do require one of the following:

  • (a) a bundler
  • (b) a no-bundler dev server like Snowpack or Vite
  • (c) a bundling CDN like Skypack

I don’t think it is necessary to put a disclaimer on the NPM readme that the project relies on NPM’s module resolution – this applies to anything that has dependencies.

Aside: building an application without any build tooling seems uniquely popular in the three.js community, as opposed to any other JS community I’ve participated in. I’m honestly pretty curious why that is: is it historical, or something about the web graphics use cases?

I’m going to make a few generalizations and assumptions (based purely on my observations over on SO), so take this for what you will. I believe a lot of three.js “work” is done within the context of:

  1. Rapid prototyping
  2. Academia (learning/teaching 3D graphics)
  3. Beginners (perhaps a subset of 2, but includes DIY-ers wanting to make games and websites)

This does not speak negatively of these groups, nor of three.js; quite the opposite! It should be taken as a compliment that three.js is such an appealing and approachable library. But all of these groups have something in common: “getting to the point.”

  • Setting up a POC to show a client their site can show off GLTF files
  • Putting a spotlight on the meat of the lesson
  • Excitement driving someone to jump the gun and dive in

All of these things can lead someone who is perhaps new to JavaScript development, or at least to web 3D, to ignore (or not know about) modern tooling and jump right in. It’s great that the core of three.js lets that happen. I’m certain that its accessibility is a major reason why it is so popular.

But as you say, this is an aside. Thanks again for your input. My curiosity is satisfied. :slight_smile:


it’s all beside the point now but well, it reads very romantic, :hibiscus: no build tools :butterfly:, but …

you can’t load a gltf, because browsers don’t fetch local files. you can’t prototype rapidly by refreshing the browser on every change. incomplete specs such as ESM make it impossibly hard to move forward because nothing will work. you can’t share your work or show anyone; no one is going to load up an unminified 10-megabyte website.

it only takes a few minutes to understand vite or parcel, which enables people to easily build, share, re-use, prototype and deploy. :man_shrugging:

Thanks for the thoughts @TheJim01! Makes sense, I expect those are likely to be a good approximation of the causes and reasons here.

To your comment @drcmda, it’s not that the workflows aren’t (IMHO) better with these tools, but that this value isn’t quickly legible to someone unused to the prevailing web development tools. I won’t necessarily say “beginners”; I think many experienced developers at larger companies (working with custom or older build systems) also find npm/node.js/javascript to be an inscrutable blob of technologies. For educators, just getting npm installed on all of your students’ various operating systems can be a risk in a 1-2 week unit.

it only takes a few minutes to understand vite or parcel…

Vite doesn’t provide installation instructions on its website, only boilerplate initialization for a new project. Parcel requires an external plugin just to serve static files (like glTF) in its dev server, and without it you get 404 errors that are too vague to debug easily. :man_facepalming: I use these tools, but the onramp is steeper than most of us remember after a year or two of using them, I think.