Orange/Encoded Normal Map Support?

Does three.js support Orange Normal Maps as shown in the image? I wasn’t able to get it working on the engine.

(image: normaldetail)

I’ve never seen this type of normal map. Can you please explain a bit why the encoding results in an orange color? I assume the components of a normal vector are mapped to RGB values in a different way. If so, this type of normal map is not supported.

BTW: Where do you get this map? Generated by yourself?

It seems like it's only encoded in the R and G channels; I use that kind of encoding for storing normals in a geometry buffer.

@tomgie this kind of encoding isn’t supported by default, is there a reason you need to use it?

@Fyrestar This texture is from another game, ROBLOX. The textures are going to be used with the three.js engine for a project involving ROBLOX. Plus, many game engines support RG textures, as you called them.

While trying to research this type of normal map I found this on the Blender Stack Exchange: https://blender.stackexchange.com/questions/105032/trying-to-convert-a-dxt5nm-normal-map-for-use-in-my-scene

It talks a lot about the texture format and how it works.
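For reference, here's that decode math in plain JavaScript, a minimal sketch assuming the DXT5nm convention described in the linked answer (x stored in the alpha channel, y in green, z reconstructed from the unit-length constraint; the channel choice is an assumption from that link, not something three.js defines):

```javascript
// Minimal sketch of DXT5nm decoding, assuming the convention from the link
// above: x is stored in alpha, y in green, both remapped from [0,1] to [-1,1],
// and z is reconstructed from x^2 + y^2 + z^2 = 1.
function decodeDXT5nm(g, a) {
  const x = a * 2.0 - 1.0;
  const y = g * 2.0 - 1.0;
  // The clamp guards against slightly negative values caused by
  // block-compression artifacts.
  const z = Math.sqrt(Math.max(0.0, 1.0 - x * x - y * y));
  return [x, y, z];
}

// A "flat" texel (0.5 in both channels) decodes to the straight-up normal:
console.log(decodeDXT5nm(0.5, 0.5)); // [0, 0, 1]
```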

Yes, though it isn't implemented in three.js. I also noticed this one seems different from the one I use (mine is from Far Cry). To add this feature you can either use the onBeforeCompile callback on the material and replace the chunk there, or do it the clean way in the source files.

The file for it is src/renderers/shaders/ShaderChunk/normalmap_pars_fragment.glsl, on line 22. Replacing it with the following should add it, I guess (not tested, and I haven't looked through those visual-coding screencaps; maybe some axis flip is needed). You then need to define the constant DXT5NM in your material's defines.

	#ifdef DXT5NM
		vec3 mapN = texture2D( normalMap, vUv ).xyz;
		mapN.xy = mapN.xy * 2.0 - 1.0;
		mapN.z = sqrt( 1.0 - ( mapN.x * mapN.x + mapN.y * mapN.y ) );
	#else
		vec3 mapN = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;
	#endif
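The onBeforeCompile route mentioned above boils down to a string replacement on the fragment shader. A sketch of just the replacement mechanics on a mock shader string (the real callback receives the compiled shader object from three.js and you'd operate on shader.fragmentShader; the mock string and the exact-sqrt reconstruction here are illustrative assumptions):

```javascript
// Sketch of the string replacement onBeforeCompile would perform. The mock
// string below stands in for the relevant line of three.js's real
// normalmap_pars_fragment chunk.
const mockFragmentShader =
  'vec3 mapN = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;';

const dxt5nmDecode = [
  '#ifdef DXT5NM',
  '\tvec3 mapN = texture2D( normalMap, vUv ).xyz;',
  '\tmapN.xy = mapN.xy * 2.0 - 1.0;',
  '\tmapN.z = sqrt( 1.0 - ( mapN.x * mapN.x + mapN.y * mapN.y ) );',
  '#else',
  '\tvec3 mapN = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;',
  '#endif'
].join('\n');

// In a real material this would run inside material.onBeforeCompile,
// replacing the line in shader.fragmentShader instead of a local string:
const patched = mockFragmentShader.replace(
  'vec3 mapN = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;',
  dxt5nmDecode
);

console.log(patched.includes('#ifdef DXT5NM')); // true
```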

I was able to replace the line you mentioned with the code you wrote up, but I have no clue how to use it; I haven't messed with GLSL shaders. Also, that last sentence doesn't really make sense to me.

You use the map just like a normal normal map (pun intended). In the defines property you set the constant so the shader knows what to do. Like this:

const material = new THREE.MeshLambertMaterial({
	normalMap: yourNormalMap,
	defines: {
		DXT5NM: true
	}
});

Um, this would actually be a nice three.js example. I'm thinking of a simple model that uses this special normal map and a modified material (maybe via onBeforeCompile()) to illustrate the different sampling approach.

Well, if I use this, the Lambert material doesn't use normal maps, and defines is not a property of MeshLambertMaterial. I tried the Phong material, but defines is again not a property. Also, for the GLSL I ended up replacing line 31, which seemed like the right line, instead of line 22. Are you on a different version that would let you use the defines property?

Yes, defines isn't available on the standard materials, so the constant should be set internally anyway, since it should be defined by the texture, not the material. You could also use ShaderMaterial and extend MeshStandardMaterial, or any of the materials that support normal maps.

Here's a quick pen; I don't know why I needed to flip both axes. I also noticed it uses the A and G channels; I was already wondering why your image turned darker when opened in a new tab.



You can always consult this for how to extend materials. onBeforeCompile is iffy even in the best-case scenario.

I believe that you are suggesting a mix of various official, unofficial, and to-be-deprecated approaches :slight_smile:

I’d advise consistency here.

It would be nice if baked normal maps actually worked :confused:

I’ve been sharing this demo for years:

http://192.241.199.119:8080/dev/testwp/

I would advise avoiding onBeforeCompile() for many reasons. It's bugged, it's not very flexible, mrdoob doesn't like it, and he wants to replace the whole thing with NodeMaterial. It doesn't seem to be a very future-proof solution.

As is, I've also posted many examples of how to modify the materials. A custom tangent/normal map space should not be, and is not, the only thing you might want to change in a material.
Nor is there just one way to render normal maps. As is, three's normal mapping is suitable for walls and floors and nothing more.

So I’m really curious why this approach makes more sense to be documented than countless others.

No, of course not mixing things; this was just to demo and test it. It can be integrated into the core, while the encoding method should be set on the normal map texture. The constant can then be set in WebGLProgram.

The internal materials can be easily extended with a ShaderMaterial; technically there is no difference, except that the uniforms are set each frame from e.g. material.normalMap. Setting the corresponding isMeshStandardMaterial flag might already take care of that too.

I use a custom system for shaders in a module system, not dealing with manually including header and body chunks, and also to plug in code like in this case; the code is assembled in order without regex hacks. I actually kicked out the standard materials since they had an enormous condition forest in the renderer :yum:

How would you integrate it into the core? Can you instance your custom normal map materials with onBeforeCompile?

I'm under the impression that as people encounter random normal mapping schemes it will just bloat three.js.

I.e. I'd rather have a way for a texture to carry over some define or some encoding than have a hardcoded set of them. Object space normal maps were not available for years; now they are, but I don't think the normal map texture carries any information on how it's encoded.

I think though that we are in agreement:

It seems that map encodings go against three’s design.

As the code before with the #ifdef DXT5NM condition shows, it's just a few lines, and this encoding seems commonly used and is supported by other engines too. By "in texture" I mean on the Texture object, like you can set other format options there.

#ifdef DXT5NM
	vec3 mapN = texture2D( normalMap, vUv ).agr;

	mapN.z = 1.0 - ( mapN.x * mapN.x + mapN.y * mapN.y ) * 0.5;
	mapN.xy = 1.0 - mapN.xy;
	mapN = mapN * 2.0 - 1.0;
#else
	vec3 mapN = texture2D( normalMap, vUv ).rgb * 2.0 - 1.0;
#endif
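If the encoding lived on the Texture object as suggested, the renderer could derive the define from it. A hypothetical sketch in plain JavaScript (normalEncoding is an invented property name; three.js has no such option, this just models what WebGLProgram could do when assembling defines):

```javascript
// Hypothetical sketch: WebGLProgram could build material defines from an
// (invented) normalEncoding flag on the texture instead of on the material.
function normalMapDefines(normalMap) {
  const defines = {};
  if (normalMap && normalMap.normalEncoding === 'DXT5nm') {
    defines.DXT5NM = ''; // expands to `#define DXT5NM` in the shader header
  }
  return defines;
}

// A texture tagged with the encoding switches the shader path on:
console.log(normalMapDefines({ normalEncoding: 'DXT5nm' })); // { DXT5NM: '' }
console.log(normalMapDefines({}));                           // {}
```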

You'd call this one a dxt5nm texture? I.e. there would be a property on the texture object that describes what the pixels encode?

I’m really confused by all this :confused:

What is confusing about it? You could call it RGB, AG, and such too; that's not really the issue here. But naming by channels only wouldn't be consistent of course; I've seen two others before that use RG but are encoded and decoded differently.

Well, object space is confusing me; that probably can't be encoded with just two channels, but I'm not sure. It seems like you can make quite a mix of the encodings and spaces. There's also a whole bunch of DXT formats.
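For what it's worth, the reason two channels can't cover object space: the sqrt reconstruction assumes z >= 0, which holds for tangent-space normals (they point out of the surface) but not for object-space ones, which cover the whole sphere. A quick sketch of how the sign gets lost:

```javascript
// Two-channel reconstruction always picks the positive hemisphere, so any
// object-space normal with z < 0 decodes to the wrong direction.
function reconstructZ(x, y) {
  return Math.sqrt(Math.max(0, 1 - x * x - y * y));
}

const front = [0, 0, 1];
const back = [0, 0, -1];

// Both normals store the same (x, y) pair, so the decoder cannot tell
// them apart and reconstructs +1 for both:
console.log(reconstructZ(front[0], front[1])); // 1
console.log(reconstructZ(back[0], back[1]));   // 1 (sign of "back" is lost)
```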