I have this example here, where a buffer that is set in the onBeforeRender callback doesn’t take effect. Why doesn’t this work?
It seems to work if you increment currentIndex inside onBeforeRender (it cycles through all the attributes: they are all black, then on the next cycle they all work). I wonder if there is some missing “initialization” step here.
If you move setting the buffer to the animate() call instead, it seems to work? I don’t know enough about the internals of the render lifecycle here; it’s possible that when you set it in that callback, the attributes aren’t uploaded to the GPU until the next time the geometry is used (as if it somehow “missed” the upload pass).
This is part of a much bigger project and I would need it to be initialized in the onBeforeRender callback. Looking at the source code of the Renderer was difficult and I couldn’t find the reason why this wouldn’t work. But yeah, I would really like a concrete answer to this, because I am assuming that something similar is happening. So possibly:
- the attribute update gets called before onBeforeRender;
- needsUpdate gets cleared somewhere after onBeforeRender.
This is the only reasonable explanation that comes to mind, but I am not completely sure and would love someone to confirm it.
Attributes are updated before onBeforeRender, here:
I’m not an expert, but I think you can update attributes inside the onBeforeRender callback using GLBufferAttribute.
Example:
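Something along these lines (just a rough sketch, assuming renderer, geometry and instanceCount are in scope; the aDrawingColorId name and the one-float-per-instance layout are only placeholders borrowed from this thread):

const gl = renderer.getContext();

// create and fill a raw WebGL buffer yourself
const glBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, glBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(instanceCount), gl.DYNAMIC_DRAW);

// wrap it so three.js uses this buffer as-is instead of managing the upload itself
// GLBufferAttribute(buffer, type, itemSize, elementSize, count)
const attribute = new THREE.GLBufferAttribute(glBuffer, gl.FLOAT, 1, 4, instanceCount);
geometry.setAttribute("aDrawingColorId", attribute);

// later (e.g. inside onBeforeRender) you update the data yourself
gl.bindBuffer(gl.ARRAY_BUFFER, glBuffer);
gl.bufferSubData(gl.ARRAY_BUFFER, 0, someNewFloat32Array); // placeholder for your new data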
I think to make it work as if it were an InstancedBufferAttribute, you will have to manually copy some properties or create your own class.
I’m waiting for an expert opinion.
I was thinking that attributes are updated before onBeforeRender, but then it is a bit weird why it seems like the needsUpdate flag gets cleared after onBeforeRender. If this is correct, I’d probably need some lower level call to create the buffer manually.
What do you think about this?
https://codesandbox.io/p/sandbox/idmaterial-test-forked-r2783m?file=%2Fsrc%2Findex.ts%3A50%2C25
This looks like a neat workaround, but again, because this is part of a much bigger project, I am at the moment not sure it will suffice. It seems like I would now have to manage much lower level calls, and some unexpected behaviour could potentially arise.
I would be looking more for a solution where I could monkey patch the renderer so that it updates the attribute at the correct place, or even just manually call an existing method that uploads attributes to the GPU.
If I fail at everything I just might try this out.
The other direction is potentially a PR/GitHub issue: if there is something three.js can expose at the engine level to make this use case easier, and it doesn’t break existing usage, I think they would consider it!
Why are you creating a buffer on each render? Why are you not disposing of the old ones before creating new ones?
const newAttribute = new THREE.InstancedBufferAttribute(newArray, 1); //this should never be called in loops
attributes.push(newAttribute); //this is weird, geometries are already lists of attributes.
//newAttribute.needsUpdate = true; //this is true already, it's redundant
That’s fine. It just means they would then render in the next frame, not the current one, right? It would work, just be delayed; it wouldn’t stop working altogether.
You can do something like this.
const ON_BEFORE_ON_BEFORE_RENDER = () => {
  // do stuff here before the before-render event
  // e.g. you can create a new buffer
  const newBuffer = new BufferAttribute(new Float32Array(3), 3);
  // e.g. you can add that buffer to some geometry
  someGeometry.setAttribute("someAttribute", newBuffer);
};

function animate() {
  requestAnimationFrame(animate);
  ON_BEFORE_ON_BEFORE_RENDER(); // by putting this line above the line below, it will be executed first
  renderer.render(scene, camera);
}
I’d probably need some lower level call to create the buffer manually.
Nah, no need for low level calls, new BufferAttribute is fine.
…I would need it to be initialized in the onBeforeRender callback.
This is likely not true. No matter what you do, it will always be possible to do some work outside of onBeforeRender.
Sorry, but maybe I didn’t quite understand what you mean.
Attributes are sent/updated to the GPU while all objects are iterated to determine which ones to render (a renderList is created).
For each object in the renderList, onBeforeRender is then called, uniforms are updated, and the object is drawn.
This is what I remember.
However, I still think the best way to update attributes inside onBeforeRender is using GLBufferAttribute; otherwise, use an update function before calling render (as @dubois said).
Ps. I studied your three-instanced-mesh repo <3
Your understanding of this pipeline is not correct.
Attributes are sent/updated to the GPU while all objects are iterated to determine which ones to render (a renderList is created).
No.
- they are uploaded once, the first time the renderer encounters an object with some geometry containing attributes
- render lists are created using Sphere and Frustum and certain flags (e.g. frustumCulled); there are no buffers involved.
E.g.:
geometry.attributes.position.setX(1, 0);
geometry.attributes.position.needsUpdate = true;
would upload something to the GPU, but unless you call geometry.computeBoundingSphere(), it would still be culled the same as in the previous frame.
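I.e., if you do want culling to follow the edited positions, the full pattern would be something like:

geometry.attributes.position.setX(1, 0);
geometry.attributes.position.needsUpdate = true;
geometry.computeBoundingSphere(); // so frustum culling sees the new extents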
For each object in the renderList, onBeforeRender is then called, uniforms are updated, and the object is drawn.
The callback is called; uniforms are unrelated to this. You should be able to update uniforms in onBeforeRender.
GLBufferAttribute, however, is something new; I don’t know much about it.
This is what I read:
WebGLRenderer.render → projectObject
projectObject → recursively iterates all objects, checks whether each is inside the frustum, calls objects.update(object) and adds it to the renderList.
objects.update → creates the attributes on the first render, or updates them if necessary (needsUpdate flag).
When all objects have been iterated, the projectObject function ends and we enter renderScene.
renderScene → renderObjects → renderObject → onBeforeRender
Being quite complex, maybe I misunderstood something.
Maybe you do understand it, but didn’t express it right:
Attributes are sent/updated to the GPU while all objects are iterated to determine which ones to render (a renderList is created).
More like:
“Attributes are sent/updated to the gpu, while all the objects are iterated, to have them available on the graphics card for rendering”.
By design, sometimes one happens right after the other. The object could already be on the GPU, but something else determines whether it renders or not, e.g. object.material.visible, object.visible, object.frustumCulled, object.geometry.boundingSphere.
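To illustrate with a rough sketch (mesh here is just any Mesh that is already in the scene):

// the geometry data may already be on the GPU; these decide whether the object gets drawn
mesh.visible = false;                   // skipped entirely
mesh.material.visible = false;          // skipped at render time
mesh.frustumCulled = false;             // skips the frustum test, always submitted
mesh.geometry.computeBoundingSphere();  // keeps frustum culling in sync after editing positions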
Yes, you are right. I have a hard time explaining myself even in my first language.
There is no such call that creates a buffer on each render; I create 5 InstancedBufferAttributes in a loop here:
const attributes: THREE.InstancedBufferAttribute[] = [];
for (let i = 0; i < 5; i++) {
  const newArray = new Float32Array(instanceCount);
  for (let j = 0; j < instanceCount; j++) {
    newArray[j] = Math.random() * (255 << 16);
  }
  const newAttribute = new THREE.InstancedBufferAttribute(newArray, 1);
  attributes.push(newAttribute);
  newAttribute.needsUpdate = true;
}
What I do have in a loop is setting the first created InstancedBufferAttribute on the geometry through geometry.setAttribute(…).
So ultimately I would want to have an onBeforeRender where I loop through those 5 buffers.
Why doesn’t this work? onBeforeRender in my case doesn’t create a new buffer; it should just set one of the 5 existing buffers, like so:
instancedMesh.onBeforeRender = (
  renderer: THREE.WebGLRenderer,
  scene: THREE.Scene,
  camera: THREE.Camera,
  geometry: THREE.BufferGeometry,
  material: THREE.Material,
  group: THREE.Group
) => {
  instancedMesh.geometry.setAttribute(
    "aDrawingColorId",
    attributes[currentIndex]
  );
  attributes[currentIndex].needsUpdate = true;
};
Also, I didn’t understand your comment on
//this is weird, geometries are already lists of attributes
Do you mean that a geometry contains a list of attributes? To my knowledge you can only have a limited number of attributes, depending on the WebGL limits of your machine? What I would want is to quickly update a new buffer from the CPU to the GPU. This exact case works, as you explained, if I have an ON_BEFORE_ON_BEFORE_RENDER function. But that is exactly my question: why doesn’t it work in the onBeforeRender function?
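For reference, the variant that does work for me looks roughly like this (a sketch of my test code; cycling currentIndex is just there to exercise all 5 buffers):

function animate() {
  requestAnimationFrame(animate);

  // swapping the attribute here, before render(), picks up the new buffer this frame
  currentIndex = (currentIndex + 1) % attributes.length;
  instancedMesh.geometry.setAttribute("aDrawingColorId", attributes[currentIndex]);

  renderer.render(scene, camera);
}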
To be fair, I actually don’t understand what is supposed to happen here. The different buffers will be referenced by the same name.
The weird thing (to me) is that this exists at all, and in onBeforeRender.
Say you have buffers A, B, C, and then you also have buffers 0, 1, 2.
You can make 3 geometries with buffers ABC0, ABC1, ABC2, and just draw those. I think this would potentially make three.js less confused.
const ig0 = instanceGeometry.clone();
const ig1 = instanceGeometry.clone();
const ig2 = instanceGeometry.clone();

ig0.setAttribute("aDrawingColorId_0", attributes[0]);
ig1.setAttribute("aDrawingColorId_1", attributes[1]);
ig2.setAttribute("aDrawingColorId_2", attributes[2]);
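Then, for example, you could just point the mesh at whichever clone you need before rendering: instancedMesh.geometry = ig1;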
What would not work for you about this approach?
Alternatively, why exactly do you need it to happen in onBeforeRender? You can set it before your call to renderer.render(scene, camera) and see what happens?
The project I am working on is quite big, and it would require some time to implement a non-onBeforeRender approach. Also, I was curious whether I am doing something wrong.
I’ll try to explain what I am trying to achieve with multiple buffers.
Say I have 3 buffer instances, like so: A, B, C. Each of them is filled with color information, specific color information that I want to initialize only once at the start of the program (this is crucial, because you have to consider that my instanced mesh could have 100000 instances, which means that having one buffer and replacing all the colors every frame would be suboptimal).
Now, if I had 3 prefabricated instance buffers, I could just set whichever one I need, whenever I need it.
For example, buffer A holds red and I need it, so I’d call setAttribute(name, A). The same for buffer B, which holds blue, etc.
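Roughly, the intent is something like this (a sketch only; the real color encoding in my project is different, the point is that the arrays are filled once):

// filled once, at startup
const bufferA = new THREE.InstancedBufferAttribute(new Float32Array(instanceCount).fill(0xff0000), 1); // "red"
const bufferB = new THREE.InstancedBufferAttribute(new Float32Array(instanceCount).fill(0x0000ff), 1); // "blue"

// whenever a different color set is needed, swapping is just a reference change
instancedMesh.geometry.setAttribute("aDrawingColorId", bufferA);
// ...later...
instancedMesh.geometry.setAttribute("aDrawingColorId", bufferB);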
Also, if you’re interested, here is what was explained on GitHub when I opened an issue about it: https: Issue