🎵 particles driven by music

I’ve created a particle simulation driven by music.

live: Particles Music
repo: GitHub - polygonjs/tutorial_audio_analysers: 🎵 Tutorial showing how to use audio analysers to update a WebGL scene 🔊

It is based on the GPUComputationRenderer, Tone.js, lots of GLSL, and Polygonjs to build the scene and make the creation of shaders much easier. In the coming days I hope to release a tutorial on how to create scenes like this.

I’ve tested it on quite a few browsers and devices, and it runs fine even on a 7-year-old Android, but I’d be curious to hear how it runs on your devices. Any frame drops or obvious bugs?
One thing that surprised me, for instance, is that for iPhones, even recent ones, I had to set the data type of the GPUComputationRenderer to HalfFloat. I would have expected this not to be necessary since iOS supports WebGL2. And I have not tried on Windows at all, so I’m curious to hear from anyone how it works there.
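For anyone hitting the same thing, the workaround is essentially a one-liner on the GPUComputationRenderer. A minimal sketch (not the exact repo code: `renderer` and `TEXTURE_SIZE` are assumed to exist, and the iOS check is just a crude user-agent heuristic for illustration):

```js
import * as THREE from 'three';
import { GPUComputationRenderer } from 'three/examples/jsm/misc/GPUComputationRenderer.js';

const gpuCompute = new GPUComputationRenderer(TEXTURE_SIZE, TEXTURE_SIZE, renderer);

// iOS can fail with full-float render targets even under WebGL2,
// so fall back to half-float textures there
const isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent);
if (isIOS) {
  gpuCompute.setDataType(THREE.HalfFloatType);
}
```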

7 Likes

Win10, GTX1060. Works as smooth as butter.

Love the idea. Love the realization. Love the atmosphere.
Everything is top notch. Just epically wow. :metal:

1 Like

Ah, thanks a ton for taking the time to check, and for such a positive reaction. That’s great to read.

And it’s also great to know it works well on your Windows setup.

1 Like

And here are the tutorials I mentioned a few days ago on creating this type of scene, where particles are driven by music:

part 1 (2min): Create Real time Particle Simulations From Music - introduction [part 1/4] - YouTube

A quick summary where I show snippets of what this tutorial covers, namely how to use audio to drive elements like material properties, lights, and forces applied to particles.

part 2 (16min): Create Real time Particle Simulations From Music - connect volume to any parameter [part 2/4] - YouTube

Here I show the basics of how to import and manipulate audio in Polygonjs.
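To give an idea of what the node setup does under the hood, here is a rough standalone equivalent in plain Tone.js (a sketch, not Polygonjs code; the file path and the `material` property it drives are made up for illustration):

```js
import * as Tone from 'tone';

const player = new Tone.Player('audio/track.mp3').toDestination(); // hypothetical path
const meter = new Tone.Meter({ normalRange: true }); // report 0..1 instead of decibels
player.connect(meter);

// browsers require a user gesture before audio can start
document.addEventListener('click', async () => {
  await Tone.start();
  player.start();
}, { once: true });

// in the render loop, map the current level onto any parameter
function update() {
  const level = meter.getValue(); // number in [0, 1]
  material.emissiveIntensity = level * 5; // any parameter would do
}
```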

part 3 (16min): Create Real time Particle Simulations From Music - use waveform data into shaders [part 3/4] - YouTube

This explains how to use waveform data inside a material.
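In raw three.js terms (Polygonjs wires this up through nodes), one way to get waveform samples into a shader is to write the analyser output into a 1×N float texture each frame. A sketch, reusing the `player` from the previous snippet:

```js
import * as Tone from 'tone';
import * as THREE from 'three';

const SIZE = 256;
const waveform = new Tone.Waveform(SIZE);
player.connect(waveform);

const data = new Float32Array(SIZE);
const waveTexture = new THREE.DataTexture(data, SIZE, 1, THREE.RedFormat, THREE.FloatType);

// call once per frame, before rendering
function updateWaveTexture() {
  data.set(waveform.getValue()); // samples in [-1, 1]
  waveTexture.needsUpdate = true;
}

// the material then samples the texture through a uniform, e.g.
// uniforms: { uWaveform: { value: waveTexture } }
```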

part 4 (57min): Create Real time Particle Simulations From Music - connect FFT to particles [part 4/4] - YouTube

This last part explains how to use FFT audio analysis to drive particles.
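The core of the analysis side in plain Tone.js, for reference (a sketch; Tone.FFT reports decibels, so the -100 dB floor used for normalization here is a rough assumption):

```js
import * as Tone from 'tone';

const fft = new Tone.FFT(64); // 64 frequency bins
player.connect(fft);

// map decibel values (~ -100..0 dB) into 0..1 per bin
function getBands() {
  const values = fft.getValue(); // Float32Array of decibels
  const bands = new Float32Array(values.length);
  for (let i = 0; i < values.length; i++) {
    bands[i] = Math.min(Math.max((values[i] + 100) / 100, 0), 1);
  }
  return bands; // e.g. feed the low bins into the particle force uniform
}
```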

Hope that’s interesting. And feel free to chime in if you have other experiences or techniques for using audio to drive WebGL elements.

2 Likes

[meme image: “want it”]

I like it! Very creative.
Would be awesome if you could play your own music :wink:
10/10

1 Like

Haha, I had not seen this meme yet :slight_smile:

Thanks a lot for the kind words.
And yes, I do plan an update where you can drag and drop your own mp3/wav files.

2 Likes

Here you go, you can now drag and drop your own music files:

same link: Particles Music

2 Likes

Haha, yes! Now it’s the real deal :wink:

Thumbs up!

P.S. I noticed that some of my music is too “heavy” in bass, so the dots fly away too far.
Try: Slow Hours - Endless - YouTube

1 Like

Yes, you’re right, thanks a lot for the feedback. I have not yet found a good way to automatically modulate the beat detection for every type of music.

I’ll probably add a “sensitivity” slider to allow fine-tuning (not quite sure when though :grimacing:)

And thank you for the link to that song, lots of good ones on this channel (and plenty of test cases to use!)

1 Like

First of all I wanna say great work,
and the problem of the particles flying away could be solved by normalizing all your values between 0 and 1, then multiplying by a set of scalars, so that you offset the range/radius of the particles from your sphere based on that normalizedValue * scalarValue. That way you can add a certain amount of randomness too, without worrying too much about particles flying away.
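Roughly like this (a sketch of the idea; the function name, bounds, and scalar are made up for illustration):

```js
const MAX_OFFSET = 1.5; // scalar controlling how far particles may leave the sphere

function particleOffset(rawValue, minValue, maxValue) {
  const normalized = (rawValue - minValue) / (maxValue - minValue); // 0..1
  const jitter = Math.random() * 0.2; // bounded randomness
  return Math.min(normalized + jitter, 1) * MAX_OFFSET;
}
```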

Hey, thanks a lot for the feedback, I appreciate it.

And yes, I definitely agree normalizing values + adding randomness is very often a good way to control things. Although in this case, there are a couple of gotchas:

  • we need to know what values to normalize. In this case, the root data is the audio analysis, and as I understand it, the output of the FFT depends on the frequency as well as the volume. The question is: how do you normalize volume on a realtime analysis? Some songs may have an overall low volume, some an overall high volume, and we can’t know that in advance. But we need a pleasant amount of particle movement in both situations.

  • at the moment, a force is applied to each particle when a beat is detected, and beats may be detected once every 5 seconds or several times per second. In the first case, a force is applied at the moment of the beat, and the particles then have time to go back to their rest position. In the second case, the applied forces keep accumulating, and this is when the particles tend to fly away. I’m not entirely sure (yet) what data can be normalized here (see the sketch below).
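That said, here is a rough sketch of two directions I could try (assumptions on my side, not tested code): a slowly decaying running maximum so the normalization adapts to the song’s overall level, and damping plus clamping on the velocity so repeated impulses cannot accumulate without bound:

```js
// adaptive normalization: track a slowly decaying running maximum
let runningMax = 0.001;
function normalizeLevel(level) {
  runningMax = Math.max(level, runningMax * 0.999); // decay so quiet songs recover
  return level / runningMax; // ~0..1 whatever the overall volume
}

// damped, clamped velocity so beat impulses cannot accumulate forever
// (shown in JS for illustration; in the project this would live in the particle GLSL)
const DAMPING = 0.95;
const MAX_SPEED = 2.0;
function stepVelocity(velocity, impulse) {
  const v = (velocity + impulse) * DAMPING;
  return Math.min(Math.max(v, -MAX_SPEED), MAX_SPEED);
}
```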

Let me know if I’ve misunderstood what you mean.

1 Like