Complicated Issue

Update: I fixed it by adding the following after the home plate renderer renders its first frame:

renderer.compile( scene, camera );

I can’t believe I fixed it myself, because I’ve been trying off and on for two weeks. I’ve read the WebGLRenderer documentation before and I thought I had tried that. I guess not.

I’ll do my best to explain the problem I’m having. I create a ShaderMaterial with the following code:

let textures = [];

if ( shader == "dirt" ) {

	textures = [
		{ "file": "dirt.jpg", "t": null },
		{ "file": "dirtDark.jpg", "t": null },
		{ "file": "dirtN.jpg", "t": null }
	];

	for ( let l = 0; l < textures.length; l++ ) {
		textures[l].t = new THREE.TextureLoader().load( "textures/" + textures[l].file );
		textures[l].t.anisotropy = aF;
		textures[l].t.wrapS = THREE.RepeatWrapping;
		textures[l].t.wrapT = THREE.RepeatWrapping;
	}

	let standard = THREE.ShaderLib[ 'physical' ];
	let theUniforms = THREE.UniformsUtils.merge( [
		standard.uniforms,
		{
			"randomSeed": { value: Math.random() + Math.random() * Math.random() },
			"map2": { value: textures[1].t }
		}
	] );

	let theMaterial = new THREE.ShaderMaterial({
		lights: true,
		uniforms: theUniforms,
		defines: {
			PHYSICAL: true,
			USE_CLEARCOAT: false,
			USE_SHEEN: false,
			USE_ENVMAP: false,
			USE_SPECULAR: false
		},
		vertexShader: document.getElementById( 'dirtVS' ).textContent,
		fragmentShader: document.getElementById( 'dirtFS' ).textContent
	});

	theMaterial.uniforms.map.value = textures[0].t;
	theMaterial.map = textures[0].t;
	theMaterial.map2 = textures[1].t;

	theMaterial.uniforms.normalMap.value = textures[2].t;
	theMaterial.normalMap = textures[2].t;
	theMaterial.uniforms.normalScale.value = new THREE.Vector2( 1.4, 1.4 );
	theMaterial.normalScale = new THREE.Vector2( 2.0, 2.0 );

	// theMaterial.uniforms.diffuse.value = new THREE.Color( 0xBBBBBB );
	theMaterial.uniforms.roughness.value = 1.0;
	theMaterial.uniforms.metalness.value = 0.0;
	theMaterial.uniforms.ior.value = 1.0;
	theMaterial.uniforms.reflectivity.value = 0.0;
	theMaterial.uniforms.iridescence.value = 0.0;
	theMaterial.uniforms.iridescenceIOR.value = 1.0;
	// theMaterial.uniforms.specularIntensity.value = 0.0;
	// theMaterial.uniforms.specularColor.value = new THREE.Color( 0x000000 );
	return theMaterial;
}
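For anyone unfamiliar with it, `THREE.UniformsUtils.merge` deep-clones each uniforms object in the array and folds them into a single object, with later entries overriding earlier ones — that's why merging `standard.uniforms` first and the custom entries second works. A simplified plain-JS illustration of that behavior (not the real three.js implementation, which also clones `Color`/`Vector`/`Texture` values via their own `.clone()` methods):

```javascript
// Simplified sketch of what THREE.UniformsUtils.merge does:
// every uniforms object in the list is cloned and merged into one,
// so later entries override earlier ones, and the merged result
// never shares state with its sources.
function mergeUniforms( list ) {
	const merged = {};
	for ( const uniforms of list ) {
		for ( const name of Object.keys( uniforms ) ) {
			// structuredClone keeps the merged uniforms independent
			// of the source objects (three.js uses its own clone logic)
			merged[ name ] = structuredClone( uniforms[ name ] );
		}
	}
	return merged;
}

const standard = { roughness: { value: 0.5 }, metalness: { value: 0.5 } };
const custom = { roughness: { value: 1.0 }, randomSeed: { value: 0.42 } };
const merged = mergeUniforms( [ standard, custom ] );
// merged.roughness.value === 1.0 (custom overrides standard)
// merged.randomSeed.value === 0.42
```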

That material is applied to a mesh in an imported glTF. It works, BUT I have another renderer beside the main one, which points straight down at home plate from above. That renderer is only activated once per swing, but as soon as it renders one frame, my ShaderMaterial does this:

What I’m conveying with that image is that the ShaderMaterial — the dirt — looks as intended when the camera points straight down at it, but at any other angle it’s dark, like you see on the right. I don’t understand why the other renderer is causing the main renderer to do this.

I’m hoping someone has some input on this. I’ve given it a lot of thought before posting here.

Here’s another oddity about it all. I have a couple of CubeCameras and cube render targets for reflective materials. There’s a statue and a pool in CF that use envmaps. When I update those, it corrects the ShaderMaterial issue completely — but I added a bloom pass to the game today, and now even when I update the reflections, it doesn’t correct the ShaderMaterial problem.

If you wanna see it in action, the link is Home Run Derby.

Update: I removed all the reflections (cubecameras and cuberendertargets) and it didn’t change anything.

Update: It definitely has something to do with the separate renderer looking down at home plate, but I don’t understand why.

Glad you figured out a solution! Your app looks really cool. :smiley: I love the details (water, flag, fireworks! etc)…
I do notice your stadium .bin file is a bit chonky, leading to slightly long load times. You might want to consider using the .glb (binary glTF) format, and perhaps DRACO mesh compression or meshopt, to crush that model down.

The next biggest offenders are the dirt normalmap… and some other textures… but overall really cool project. :smiley:


Hey, man, I appreciate that. Thank you. I’m aware some of the textures are unnecessarily big, and I’ll add it to my to-do list now.

I do a lot of work with the model in the JavaScript code - things like replacing a material with a custom shader material, using certain meshes as collision meshes only, then hiding them, etc.

If I use GLB format with compression, would I still be able to do such things?

.glb and .gltf are identical, .glb just embeds binary resources into the same file. Once loaded in three.js the data will be the same.
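If it helps to see the difference concretely: a .glb file starts with a fixed 12-byte binary header whose magic bytes spell “glTF”, while a .gltf file is plain JSON. Here’s a small Node sketch that tells the two apart by checking that magic word (the `stadium.glb` path in the usage comment is just an example):

```javascript
// Per the glTF 2.0 spec, a GLB file begins with a 12-byte header:
//   uint32 magic   = 0x46546C67 ("glTF" in ASCII, little-endian)
//   uint32 version = 2
//   uint32 length  = total file size in bytes
// A plain .gltf file is JSON text, so it never starts with that magic.
function isGLB( buffer ) {
	return buffer.length >= 4 && buffer.readUInt32LE( 0 ) === 0x46546c67;
}

// Usage (path is just an example):
// const fs = require( 'node:fs' );
// console.log( isGLB( fs.readFileSync( 'stadium.glb' ) ) );
```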

npm install --global @gltf-transform/cli

# convert to glb only
gltf-transform cp stadium.gltf stadium.glb

# convert and optimize
gltf-transform optimize stadium.gltf stadium_opt.glb --no-simplify --no-flatten --no-join --texture-compress webp --compress meshopt

Thank you. I will definitely look into this when I can. For now, it’s added to the list.

I could just export the stadium model from Blender in GLB format too, right?