I have a PNG image texture with an alpha channel. When I set alphaTest to 0.5, it works when:
- I export the GLB file from Blender with no materials.
- I use TextureLoader to load the image texture separately and then attach it to the loaded GLB.
All the transparent areas are cut out properly, roughly as sketched below.
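A minimal sketch of that working path (not my exact code; the file names, the MeshBasicMaterial choice and the pre-existing scene are placeholders):

import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

// Load the PNG separately and apply the usual glTF texture settings.
const texture = new THREE.TextureLoader().load("/Grass.png");
texture.flipY = false; // glTF geometry expects un-flipped UVs
texture.encoding = THREE.sRGBEncoding; // colorSpace = THREE.SRGBColorSpace on newer three.js

new GLTFLoader().load("/Grass.glb", (gltf) => {
  gltf.scene.traverse((child) => {
    if (child.isMesh) {
      // Attach the separately loaded texture with an alpha test.
      child.material = new THREE.MeshBasicMaterial({
        map: texture,
        alphaTest: 0.5, // fragments with alpha below 0.5 are discarded
        side: THREE.DoubleSide,
      });
    }
  });
  scene.add(gltf.scene); // "scene" is assumed to exist already
});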
It does not work when:
- I export the GLB file from Blender with the PNG image texture included.
- I find that material in the loaded GLB, update it, and set its alphaTest to 0.5, roughly as in the sketch below.
I’ve checked that the material really is updated, and it always reports alphaTest as 0.5, but the transparent areas render black instead of being cut out. I’ve also tried setting transparent to true, but that did not change the result.
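A sketch of that failing path (same imports and scene as the sketch above; my real code is React and uses the actual material name):

// The texture is already baked into the GLB, so I only tweak the material
// that GLTFLoader created for it.
new GLTFLoader().load("/Grass.glb", (gltf) => {
  gltf.scene.traverse((child) => {
    if (child.isMesh) {
      child.material.alphaTest = 0.5;
      child.material.transparent = true; // tried with and without this
    }
  });
  scene.add(gltf.scene);
});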
So my question:
Does alphaTest simply not work when an image texture with an alpha channel is exported inside the GLB model?
Or is it still possible I’m just missing something in my code?
Try material.needsUpdate=true
Doesn’t seem to work, unfortunately. I swear I got it working before, but now I can’t.
Send the GLB here, I will try to export the image to see how it really looks after Blender.
Awesome, thank you so much! Here it is, exported with the PNG. It’s too big to attach here, so here’s a Google Drive link:
Yes, the texture is a transparent PNG.
So it’s possible to get it working with alphaTest if I export the PNG inside the GLB model rather than separately?
I tried to view the model in the three.js example “webgl_loader_gltf.html” but can’t see it. I can see the model in an online GLB viewer, but there it has no transparency and I can’t access the mesh. Can you share your code for loading the model in HTML?
Here it is!
/*
Auto-generated by: https://github.com/pmndrs/gltfjsx
Command: npx gltfjsx@6.5.2 Grass.glb
*/
import React, { useEffect } from "react";
import { useGLTF } from "@react-three/drei";
import * as THREE from "three";

export default function Model(props) {
  const { nodes, materials } = useGLTF("/Grass.glb");

  useEffect(() => {
    // Modify the material to support transparency
    if (materials["MergedBake_Baked.025"]) {
      materials["MergedBake_Baked.025"].transparent = true;
      materials["MergedBake_Baked.025"].alphaTest = 0.5; // Adjust this value as needed
      materials["MergedBake_Baked.025"].side = THREE.DoubleSide;

      // If using a texture
      if (materials["MergedBake_Baked.025"].map) {
        materials["MergedBake_Baked.025"].map.encoding = THREE.sRGBEncoding;
        materials["MergedBake_Baked.025"].map.flipY = false;
      }
    }
  }, [materials]);

  return (
    <group {...props} dispose={null}>
      <mesh
        geometry={nodes.grass_block_first_half_right_Baked.geometry}
        material={materials["MergedBake_Baked.025"]}
        position={[5.421, 67.409, -0.825]}
        rotation={[Math.PI / 2, 0, 0]}
      />
    </group>
  );
}

useGLTF.preload("/Grass.glb");
Thank you, I already found the model. It was at the top.
Your texture ends up in material.emissiveMap instead of material.map, and the material has transparent = false and alphaTest = 0.
OHHHHHHHHHHHHHHHHH now I get it, thank you! I used an emission node in Blender. In the past it would still attach the image to “map” and create a basic material; for some reason it now attaches it to the emissiveMap property instead, so map is empty and alphaTest has no alpha to test against.
Thank you so much!!!
Optimised the PNG: it is now 6 MB instead of 55 MB.
// Grab the texture the exporter put on emissiveMap and rebuild the
// material so that texture is used as the regular colour map instead.
let map = model.children[0].material.emissiveMap;
model.children[0].material = new THREE.MeshBasicMaterial();
model.children[0].material.map = map;
model.children[0].material.transparent = true;
model.children[0].material.alphaTest = 0.5; // set the public property, not _alphaTest
model.children[0].material.needsUpdate = true;
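(The swap is needed because three.js runs the alpha test against the alpha of material.map, the colour map; the alpha channel of an emissiveMap is never tested, which is why the cut-out only works once the texture is moved over to map.)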