Minimize main-thread work

I’m trying to improve the performance of a website because Google PageSpeed Insights is complaining. If I leave out the three.js part I get a performance score of 100; with the three.js part it drops to 59. I don’t know how to improve it, or what options there are for improving the performance.

import * as THREE from "../lib/three.module.js";
import { GLTFLoader } from "../lib/GLTFLoader.js";
import { OrbitControls } from "../lib/OrbitControls.js";

const canvas = document.querySelector(".webgl");
const renderer = new THREE.WebGLRenderer({
  canvas,
  antialias: true,
  alpha: true,
});
renderer.setPixelRatio(window.devicePixelRatio);
renderer.shadowMap.enabled = true;
renderer.shadowMap.type = THREE.PCFSoftShadowMap;

const scene = new THREE.Scene();
scene.background = null;

const camera = new THREE.PerspectiveCamera(45, getAspectRatio(), 1, 1000);
camera.position.set(2, 2.5, 5.5);

const controls = new OrbitControls(camera, renderer.domElement);
controls.enableDamping = true;
controls.enablePan = false;
controls.minDistance = 6;
controls.maxDistance = 15;
controls.minPolarAngle = 0.5;
controls.maxPolarAngle = 1.5;
controls.autoRotate = true;
controls.autoRotateSpeed = 1;
controls.target = new THREE.Vector3(0, 1, 0);
controls.update();

const ambientLight = new THREE.AmbientLight(0xffffff, 0.4);
scene.add(ambientLight);

const directionalLight = new THREE.DirectionalLight(0xffffff, 0.8);
directionalLight.position.set(5, 10, 7.5);
directionalLight.castShadow = true;
directionalLight.shadow.bias = -0.0001;
directionalLight.shadow.mapSize.width = 2048;
directionalLight.shadow.mapSize.height = 2048;
scene.add(directionalLight);

const spotLight = new THREE.SpotLight(0xffffff, 1);
spotLight.position.set(-5, 10, -5);
spotLight.castShadow = true;
spotLight.shadow.bias = -0.0001;
spotLight.shadow.mapSize.width = 2048;
spotLight.shadow.mapSize.height = 2048;
scene.add(spotLight);

let mesh;
let animationRunning = false;
const loader = new GLTFLoader().setPath("llLogoThreeJs/");

// The canvas only renders on viewports wider than 700px.
function setRendererSize() {
  if (window.innerWidth > 700) {
    renderer.setSize(window.innerWidth / 4, window.innerHeight / 2);
  }
}

function getAspectRatio() {
  return window.innerWidth > 700
    ? window.innerWidth / 4 / (window.innerHeight / 2)
    : 1;
}

loader.load("scene.gltf", (gltf) => {
  mesh = gltf.scene;
  mesh.traverse((child) => {
    if (child.isMesh) {
      child.castShadow = true;
      child.receiveShadow = true;
    }
  });
  mesh.position.set(0, 1.05, -1);
  scene.add(mesh);
});

window.addEventListener("resize", () => {
  if (window.innerWidth > 700) {
    camera.aspect = getAspectRatio();
    camera.updateProjectionMatrix();
    setRendererSize();
    if (!animationRunning) animate();
  } else {
    renderer.setSize(0, 0);
    animationRunning = false;
  }
});

function animate() {
  if (window.innerWidth > 700) {
    animationRunning = true;
    requestAnimationFrame(animate);
    controls.update();
    renderer.render(scene, camera);
  } else {
    animationRunning = false;
  }
}

if (window.innerWidth > 700) {
  setRendererSize();
  animate();
}

I wouldn’t trust Google’s PageSpeed for a 3D web app.
It dings you for load times… which for 3D usually means a bunch of large-ish assets. It dings you for GPU/CPU/JavaScript usage, which again, for a 3D web app, is like 90% of the workload.

It might be dinging you for the size of the “scene.gltf” … which is probably easy to compress and optimize a bit. .gltf is the text version of the glTF format… for delivery you want the compressed binary version (.glb), and probably run it through some compression tools if it’s over 5 MB in size.


I’m also skeptical about Google’s speed analysis, but I want to improve the site’s performance to get a better ranking.

I temporarily fixed the issue by artificially delaying the loading process by 2100 ms, which keeps it out of the window Google’s crawler measures. However, this is a suboptimal solution: it simply delays the appearance of the 3D element.
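Instead of a fixed 2100 ms delay, one common pattern is to start the three.js work only when the canvas actually scrolls into view, and to keep three.js out of the initial bundle with a dynamic import(). A sketch, where initScene and ./logo-scene.js are hypothetical names, not from the code above:

```javascript
// Run onVisible() the first time `el` enters the viewport, instead of
// after an arbitrary timeout.
function whenVisible(el, onVisible) {
  if (typeof IntersectionObserver === "undefined") {
    onVisible(); // no observer support: just start immediately
    return;
  }
  const observer = new IntersectionObserver((entries) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      observer.disconnect(); // fire once, then stop observing
      onVisible();
    }
  });
  observer.observe(el);
}

// Usage sketch: the dynamic import() keeps three.js and the scene code
// out of the initial page load entirely.
// whenVisible(document.querySelector(".webgl"), () => {
//   import("./logo-scene.js").then((m) => m.initScene());
// });
```

The difference from a timer is that nothing 3D-related is fetched or parsed until the visitor can actually see the canvas.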

From my understanding, converting the .gltf file to .glb format could solve the underlying problem. Is that correct?

Don’t know if it will fix Google’s issue, but it’s good practice generally.

How big is your gltf?

Now it is 3.7 KB

I’ve now tried it with .glb, but unfortunately the performance score doesn’t improve.

Now I’ve tried a compressed .glb loaded with Draco, but unfortunately the score remains bad.

Have you tried removing specific parts of your three.js code to figure out where the issue is? E.g. if you remove the model-loading code and just display a basic cube, does that affect the page speed measurement?

In my opinion there isn’t an issue…

But I have the feeling that Google’s crawler simply can’t tell the difference between a website with genuinely poor performance and a website that just needs a little longer to show certain things properly.

@Locis1 is your website public and if so, can you provide a link to it?

Also, is 59% related to Mobile or Desktop testing (Desktop results are usually higher than Mobile)?

A simple way of pleasing the PageSpeed Insights “overlord” is to actually pay attention to its suggestions and then try to rearrange the code.

You could always check the code of my webpage, but just to show off its PageSpeed Insights results, check the attached picture. This does not and cannot really reflect the complexity of my page or yours.

Hey, yeah the website for my customer is online, you can check the site under:

But the website performance is only perfect because I load the logo 2100 ms later, so the Google crawler has stopped testing the website by then, and that’s not the best solution…

The PageSpeed Insights numbers look good when testing that website from my location.

Also, your website loads pretty quickly, and there is that popup about cookies, which is distracting enough that the 2100 ms late loading goes unnoticed. For me at least, there was also a popup offering to translate the page, which added further distraction, so I never really noticed the late loading.

I don’t really have any further suggestion for you.

definitely do not use this one tho :shushing_face:


Yes, I don’t think there will be a good solution for this until Google changes their crawlers.

Use Web Workers, like this example here, which gets a good mobile page speed.
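The Web Worker route for three.js usually means OffscreenCanvas: the main thread hands the canvas over to a worker, the worker does all the rendering, and PageSpeed’s “main-thread work” metric improves. A rough sketch, where render-worker.js is a hypothetical worker file that would run the three.js code from above:

```javascript
// Hand rendering of `canvas` off to a Web Worker via OffscreenCanvas.
// Returns the worker, or null when the browser doesn't support it
// (in which case you keep rendering on the main thread as before).
function startOffscreenRendering(canvas) {
  if (
    typeof OffscreenCanvas === "undefined" ||
    typeof canvas.transferControlToOffscreen !== "function"
  ) {
    return null; // fall back to normal main-thread rendering
  }
  const offscreen = canvas.transferControlToOffscreen();
  // "render-worker.js" is a placeholder: it would import three.js and
  // create the WebGLRenderer against the transferred canvas.
  const worker = new Worker("render-worker.js", { type: "module" });
  worker.postMessage({ canvas: offscreen }, [offscreen]);
  return worker;
}
```

One caveat: OrbitControls needs DOM events, so input has to be forwarded to the worker via postMessage, which adds some plumbing.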

Mainly to narrow things down: if you replace the entire model with just a THREE.BoxGeometry, does the result improve? Either way, that may help tell you where to focus.

The model is a bit high on vertex count for what it does, see wireframe view…

… but I’m not necessarily sure that’s the cause of the issue.


Yes, there are many vertices, but I don’t think that’s the main issue. I believe the real problem is that the Google crawler struggles to differentiate between a poorly optimized website with a lot of unnecessary code that hurts performance, and a well-designed website that needs a bit more performance to deliver a better user experience.

Nope, importing megabytes of code and assets just to render a rotating logo (that has way too many faces btw), thereby wasting bandwidth and draining the visitor’s battery for no good reason, is pretty much the epitome of “poorly optimized.”


Yes, I’ve noticed that the logo has too many faces. But that’s not really the point here. I’ve tested many websites and have never found one that gets good performance from Google when it includes Three.js elements. (Maybe I’ve only tested “poorly optimized” sites.)

So please tell me how to generally address this issue instead of just telling me again that my logo has too many faces.

I can’t tell you what specific thing on your website leads PageSpeed to believe the site is slow; I can only tell you how you might find out. This requires experimentation on your part, which might include:

  • (1) import three.js, but do not use it at all
  • (2) replace the model with something trivial like a BoxGeometry

Those tests would immediately tell you whether PageSpeed is penalizing you for the size of the JS bundle or for the size of the model.
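Test (2) can be as small as swapping the loader.load("scene.gltf", …) call for a box placed where the logo was. A sketch against the code from the first post; addDebugBox is a hypothetical helper, not a three.js API, and the imported THREE namespace is passed in explicitly:

```javascript
// Diagnostic stand-in for the GLTF model: a plain box at the same spot
// the logo occupied. If PageSpeed improves with this, the model is the
// problem; if not, suspect the three.js bundle itself.
function addDebugBox(THREE, scene) {
  const box = new THREE.Mesh(
    new THREE.BoxGeometry(1, 1, 1),
    new THREE.MeshStandardMaterial({ color: 0xff8800 })
  );
  box.position.set(0, 1.05, -1); // same position as the loaded mesh
  scene.add(box);
  return box;
}

// In the original code, comment out the loader.load(...) call and use:
// const mesh = addDebugBox(THREE, scene);
```

For test (1), simply keep the three imports at the top of the file and comment out everything below them.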


Thank you!

That sounds like the easiest way to find out why Google is whining.

I’m going to try that out; maybe I’ll get smarter because of it.