Reading real-time sensor data into a website

Hi!

I am new to three.js as of today, but I love it already.
I am currently trying to figure out how to read real-time sensor data into a VR website, and I am unsure where to start researching… like, how can data be read into three.js? Where does it interface on the three.js side?

I read something about MySQL and D3.js?

Can anyone maybe point me in a direction?
I am good with Arduino, OSC, Processing, etc., but I am not very experienced with web development or three.js…

It would be great if anyone could give me a lead!

Thank you for reading! :slight_smile:

Best!
Blou

Well, since the goal is usually to visualize the data in some way, it's necessary to represent them as an instance of BufferGeometry. Think of this class as a container that groups together logically related geometry data like vertices, normals or texture coordinates. The actual data are represented as instances of BufferAttribute.

So the idea is to create buffer attributes for all of your sensor data, add them to your instance of BufferGeometry, use this geometry to create a mesh (probably a point cloud or lines) and then render this 3D object with a custom shader. You probably want to start with ShaderMaterial for this use case.
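
A minimal sketch of that pipeline, assuming a scene, camera and renderer already exist (the sensor values and the shader below are just placeholders to illustrate the idea):

import * as THREE from 'three';

// placeholder sensor readings, interpreted as one x/y/z position per reading
const sensorValues = new Float32Array( [
	0.0, 0.5, 0.0,
	0.3, 0.1, 0.2,
	0.7, 0.9, 0.4
] );

// group the data in a BufferGeometry via a BufferAttribute
const geometry = new THREE.BufferGeometry();
geometry.setAttribute( 'position', new THREE.BufferAttribute( sensorValues, 3 ) );

// trivial custom shader: fixed point size and color
const material = new THREE.ShaderMaterial( {
	vertexShader: `
		void main() {
			gl_PointSize = 4.0;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}
	`,
	fragmentShader: `
		void main() {
			gl_FragColor = vec4( 1.0, 0.5, 0.0, 1.0 );
		}
	`
} );

// render the sensor data as a point cloud
const points = new THREE.Points( geometry, material );
scene.add( points );

When new readings arrive, you only update the attribute's array and set its needsUpdate flag instead of rebuilding the geometry.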

If you are new, you may find the official three.js examples, as well as simpler examples that deal with data, helpful. :slightly_smiling_face:

Hi Mugen87,

Thank you so much! Those are super helpful directions. I will look them up in the documentation in more detail. :slight_smile: !!!

Do you maybe also have a hint on how to get the data from the website's database?
(I guess I have to upload it from the Arduino, for instance, to the website's database and then read it into the BufferGeometry.)

Maybe I am just too tired, but I only find examples where predefined values are written into the BufferGeometry, while I will need to link it to the matrix of values that the sensor provides. :thinking:
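
This is roughly what I imagine so far, though I am not sure it is the right approach (the endpoint and the JSON format are completely made up):

async function updateFromServer( geometry ) {

	// fetch the latest readings from the server (endpoint is made up)
	const response = await fetch( '/api/sensor-data' );
	const readings = await response.json(); // assuming a flat array of numbers, 3 per point

	// copy the values into the existing position attribute and flag it for re-upload
	const position = geometry.getAttribute( 'position' );
	position.array.set( readings );
	position.needsUpdate = true;

}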

Thank you again for your help!

Blou

Thank you hofk!

I wish I had a week of holidays to try all of these :smirk:
But I'll definitely check them out in more detail :slight_smile:

Do you remember an example of reading data from an external sensor? I checked but didn't see one, though maybe I overlooked it.

In any case, super cool collection!
Thank you for doing the work and for sharing it :slight_smile: :partying_face:

Best,
Blou

In the example, data is read in from the microphone. However, the data is only displayed and converted into a simple rotary movement.

if ( magnitude > 128 ) {

	mesh1.rotation.y += magnitude / 128;
	//mesh1.rotation.z = magnitude / 128;

}

You can also use the data to manipulate the buffer values.
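
For example, roughly like this (only a sketch, assuming mesh1 uses a BufferGeometry with a position attribute and that magnitude comes from the analyser):

// push every vertex along z depending on the current magnitude (scale factor chosen arbitrarily)
const position = mesh1.geometry.getAttribute( 'position' );

for ( let i = 0; i < position.count; i ++ ) {

	position.setZ( i, position.getZ( i ) + magnitude / 1280 );

}

position.needsUpdate = true;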

https://hofk.de/main/threejs/audioBasic/index.html
from
https://hofk.de/main/discourse.threejs/2018/Xindex2018.html


In this example I manipulate the data of an indexed BufferGeometry.
The geometry is defined in geo.js.

https://hofk.de/main/threejs/modifyGeo/modifyGeo.html
from
https://hofk.de/main/discourse.threejs/2018/Xindex2018.html

Note that starting with r110, .setAttribute is used instead of .addAttribute.
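
The pattern looks roughly like this (not the actual code from geo.js, just a sketch, using .setAttribute):

import * as THREE from 'three';

// two triangles that share two vertices via the index
const geometry = new THREE.BufferGeometry();

const vertices = new Float32Array( [
	0, 0, 0,
	1, 0, 0,
	1, 1, 0,
	0, 1, 0
] );

geometry.setIndex( [ 0, 1, 2,  2, 3, 0 ] );
geometry.setAttribute( 'position', new THREE.BufferAttribute( vertices, 3 ) ); // .addAttribute before r110

// later, e.g. driven by incoming values:
const position = geometry.getAttribute( 'position' );
position.setY( 2, 1.5 ); // move a single vertex
position.needsUpdate = true;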


In this example, the data is stored in local memory.
In 2017 only Geometry was used, not BufferGeometry. That can be adjusted.

https://hofk.de/main/threejs/raycaster/raycaster.html
from
https://hofk.de/main/discourse.threejs/2017/Xindex2017.html


Apart from three.js for a moment: do you know this?

https://d3js.org/
https://dev.to/flippedcoding/starting-with-d3-js-for-data-visualization-4bbo
https://github.com/d3/d3/blob/1126611a8972244ba2e876f57a71c82c3098331b/API.md

Hi hofk!

Thanks a ton for the links!
And sorry for my late reply, I fell sick for a couple of days :stuck_out_tongue:

I love the microphone example :))
Seems like the analyser nodes are the way to go here…
I assume the example uses the internal mic; I didn't see whether that is assigned somewhere or whether it is possible to take input from an external mic. (Probably possible, but it likely requires more work…)
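
From what I found so far, it might work by picking a device with getUserMedia and a deviceId constraint, something like this (untested, just a sketch; the device index is a placeholder):

async function openMic() {

	// list all audio inputs and pick one (index 0 is just a placeholder for the external mic)
	const devices = await navigator.mediaDevices.enumerateDevices();
	const mics = devices.filter( ( d ) => d.kind === 'audioinput' );

	const stream = await navigator.mediaDevices.getUserMedia( {
		audio: { deviceId: { exact: mics[ 0 ].deviceId } }
	} );

	// feed the stream into an analyser node
	const audioContext = new AudioContext();
	const source = audioContext.createMediaStreamSource( stream );
	const analyser = audioContext.createAnalyser();
	source.connect( analyser );

	return analyser;

}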

Thank you for sending the links!
I am excited to play with it more :slight_smile:

Best!
Blou