I’ve created a 3D model, along with UV mappings, in Blender:
Using the Blender-to-Three exporter (https://github.com/mrdoob/three.js/tree/master/utils/exporters/blender), I then export my geometry to Three:
After doing that, I can then import my JSON:
It looks like:
{
  "uvs": [[0.775568,0.779311,...
  "faces": [41,0,527,843,84...
  "normals": [0.0973846,-0.111087...
  "vertices": [0.173144,0.225188,...
  "metadata": {
    "faces": 660,
    "normals": 156,
    "generator": "io_three",
    "version": 3,
    "vertices": 1078,
    "uvs": 1,
    "type": "Geometry"
  }
}
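For reference, the leading integer in each face record (the 41 above) is a per-face bitmask in the three.js JSON Geometry format (version 3) telling the loader which attributes follow — including the per-vertex UV indices that carry my Blender mapping. A small decoder based on my reading of that format (illustrative only, not the actual JSONLoader code):

```javascript
// Decode the per-face type bitmask from the three.js JSON Geometry
// format (version 3). Bit positions follow the format spec.
function decodeFaceType(type) {
  return {
    isQuad:              (type & 1)   !== 0, // bit 0
    hasMaterial:         (type & 2)   !== 0, // bit 1
    hasFaceUv:           (type & 4)   !== 0, // bit 2 (deprecated)
    hasFaceVertexUv:     (type & 8)   !== 0, // bit 3 — per-vertex UVs from Blender
    hasFaceNormal:       (type & 16)  !== 0, // bit 4
    hasFaceVertexNormal: (type & 32)  !== 0, // bit 5
    hasFaceColor:        (type & 64)  !== 0, // bit 6
    hasFaceVertexColor:  (type & 128) !== 0  // bit 7
  };
}

console.log(decodeFaceType(41));
// 41 = 0b101001 → a quad with per-vertex UVs and per-vertex normals
```

So the UVs are in the file; the loader reads them out of "faces" using indices into the flat "uvs" array.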
I can then pull that into Three.js:
const parsed = jsonLoader.parse(json);
const geometry = parsed.geometry; // The only thing I use from the parsed result is `geometry`
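As a sanity check before parsing, I can confirm the flat arrays line up with the metadata counts — in this format "vertices" is a flat x,y,z array and "uvs" is an array of flat UV layers (metadata.uvs counts layers, not pairs). The helper name here is my own:

```javascript
// Hypothetical sanity check (helper name is mine): confirm the exported
// JSON's metadata matches its flat arrays before handing it to the loader.
function checkExport(modelJson) {
  const meta = modelJson.metadata;
  return {
    vertexCountOk: modelJson.vertices.length === meta.vertices * 3,
    uvLayerCountOk: modelJson.uvs.length === meta.uvs
  };
}

// e.g. a two-vertex stub with one UV layer:
console.log(checkExport({
  metadata: { vertices: 2, uvs: 1 },
  vertices: [0, 0, 0, 1, 1, 1],
  uvs: [[0, 0, 1, 1]]
})); // → { vertexCountOk: true, uvLayerCountOk: true }
```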
What I’d like to do now is take an image and apply it to this geometry, using the same UV mappings I set up in Blender.
Note that when I say “an image”, I don’t mean the exact image that was used in Blender. Users of this application will fill in details of that particular template image and upload it to my application. However, since it follows the same template from Blender, the same UV mappings will apply.
I’m stuck on this piece of it… what would the code look like to use the UV mappings from Blender to create and apply a texture to this geometry, given a provided image?
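For context, my understanding is that each UV pair maps a face vertex into the [0,1]² space of the texture image, with v measured from the bottom of the image — so any upload that follows the template should line up with the same mapping. A minimal sketch of what one UV pair means in pixel terms (my own helper, not a three.js API):

```javascript
// Map a (u, v) pair to a pixel in a width×height image. v runs bottom-up
// in UV space, so flip it for top-down image coordinates; clamp u = 1 / v = 0
// to the last valid pixel.
function uvToPixel(u, v, width, height) {
  return {
    x: Math.min(width - 1, Math.floor(u * width)),
    y: Math.min(height - 1, Math.floor((1 - v) * height))
  };
}

// The first UV pair from my export, against a 1024×1024 template:
console.log(uvToPixel(0.775568, 0.779311, 1024, 1024)); // → { x: 794, y: 225 }
```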
Right now, I do this, but the UV mappings from Blender seem to have no bearing on how this is rendered:
const loader = new THREE.TextureLoader();
loader.load('http://www.example.com/some-image-url.png', artworkTexture => {
  const parsed = jsonLoader.parse(json); // The JSON from Blender
  const geometry = parsed.geometry;

  // Most likely unrelated code omitted here
  artworkTexture.anisotropy = renderer.getMaxAnisotropy();
  const artworkMaterial = new THREE.MeshBasicMaterial({
    side: THREE.FrontSide,
    map: artworkTexture,
    depthWrite: false,
    depthTest: false
  });
  objectMaterials.push(artworkMaterial);
  const mainObject = THREE.SceneUtils.createMultiMaterialObject(
    geometry,
    objectMaterials
  );
});