I am brand new to three.js (as of today), but I love it already.
At the moment I am trying to figure out how to read realtime sensor data into a VR website, and I am unsure where to start researching… like, how can data be read into three.js? Where does it interface on the three.js side?
I read something about MySQL and D3.js?
Can anyone maybe point me in a direction?
I am good with Arduino, OSC, Processing, etc., but I am not very experienced with the web and three.js…
Well, since the goal is usually to visualize data in some way, you need to represent the data as an instance of BufferGeometry. Think of this class as a container that groups together logically related geometry data like vertices, normals, or texture coordinates. Such data are represented as instances of BufferAttribute.
So the idea is to create buffer attributes for all of your sensor data, add them to your instance of BufferGeometry, use this geometry to create a mesh (probably a point cloud or lines) and then render this 3D object with a custom shader. You probably want to start with ShaderMaterial for this use case.
Thank you so much! Those are super helpful directions. I will look them up in the documentation in more detail!
Do you maybe also have a hint on how to get the data from the website's database?
(I guess I have to upload it from the Arduino, for instance, to the website's database and then read it into the BufferGeometry.)
Maybe I am just too tired, but I only find examples where predefined values are written into the BufferGeometry; I will need to link it to the matrix of values that the sensor gives.
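One common web pattern for the transport side is to have the Arduino (or a small server in front of it) expose the readings as JSON over HTTP, poll it with `fetch()` from the page, and flatten the matrix into the typed array a BufferAttribute needs. (WebSockets are the usual upgrade when you need continuous streaming.) This is only a sketch: the endpoint URL and JSON shape are invented, and a `data:` URL stands in for the real server so the snippet is self-contained:

```javascript
// In a real setup this would be something like 'https://your-server/sensors',
// serving e.g. {"values": [[x,y,z], ...]} that the Arduino keeps updated.
const SENSOR_URL = 'data:application/json,' +
  encodeURIComponent(JSON.stringify({ values: [[0, 0, 0], [1, 2, 0], [2, 1, 1]] }));

async function readSensorMatrix(url) {
  const response = await fetch(url);
  const data = await response.json();
  // Flatten the matrix of readings into the flat Float32Array
  // that THREE.BufferAttribute expects.
  return new Float32Array(data.values.flat());
}

const positions = await readSensorMatrix(SENSOR_URL);
console.log(positions.length); // 9 floats = 3 samples × 3 components

// In the page you would then copy these into the geometry and flag an update:
// geometry.getAttribute('position').set(positions);
// geometry.getAttribute('position').needsUpdate = true;
```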
Thanks a ton for the links!
And sorry for my late reply; I was sick for a couple of days.
I love the microphone example :))
Seems like the AnalyserNode is the way to go here…
I assume they use the internal mic; I didn't see whether that was assigned somewhere, or whether it is possible to take input from an external mic. (Probably possible, but it likely requires more work…)
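For the external-mic part: by default `getUserMedia` picks the system default input, but you can enumerate the available inputs and request a specific one via the `deviceId` constraint. A browser-only sketch (the label-matching helper is my own invention, not something from the examples):

```javascript
// Browser-only: pick a specific (e.g. external) microphone for the
// Web Audio AnalyserNode instead of the default input device.
async function analyserForMic(labelFragment) {
  // In many browsers device labels are only visible after permission is granted.
  await navigator.mediaDevices.getUserMedia({ audio: true });

  const devices = await navigator.mediaDevices.enumerateDevices();
  const mic = devices.find(
    (d) => d.kind === 'audioinput' && d.label.includes(labelFragment)
  );

  // Fall back to the default input if no label matched.
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: mic ? { deviceId: { exact: mic.deviceId } } : true,
  });

  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  source.connect(analyser);
  return analyser; // read with analyser.getByteFrequencyData(...) each frame
}
```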
Thank you for sending the links!
I am excited to play with it more