Possible to calculate linear regression on GPU?

I have a point cloud that is used to display a scatterplot. The point cloud can contain an arbitrary number of points, depending on the data being brought in, and it can be crossfiltered by the user.

I’d like to have a 3D regression plane that is recalculated in real time as the user filters with slicers, but because there can potentially be many points, I figure the only way to do that at 60 fps or more would be to leverage the GPU.

For example, let’s say there are 30,000 vertices; I’d want to do a least-squares fit over those (via the GPU) and ultimately output the vertices that make up the plane geometry.
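
Here’s roughly what that fit boils down to, as a plain CPU sketch: it assumes a plane of the form z = a*x + b*y + c and positions stored in a flat array (the function name and layout are only illustrative):

```js
// Minimal least-squares fit of a plane z = a*x + b*y + c over a flat
// [x0, y0, z0, x1, y1, z1, ...] position array (layout assumed).
function fitPlane(positions) {
  const n = positions.length / 3;
  let sx = 0, sy = 0, sz = 0, sxx = 0, sxy = 0, syy = 0, sxz = 0, syz = 0;

  // These running sums over every point are the expensive part.
  for (let i = 0; i < positions.length; i += 3) {
    const x = positions[i], y = positions[i + 1], z = positions[i + 2];
    sx += x; sy += y; sz += z;
    sxx += x * x; sxy += x * y; syy += y * y;
    sxz += x * z; syz += y * z;
  }

  // Normal equations:
  //   [sxx sxy sx] [a]   [sxz]
  //   [sxy syy sy] [b] = [syz]
  //   [sx  sy  n ] [c]   [sz ]
  // Solved here with Cramer's rule, which is fine for a 3x3 system.
  const det = m =>
    m[0] * (m[4] * m[8] - m[5] * m[7]) -
    m[1] * (m[3] * m[8] - m[5] * m[6]) +
    m[2] * (m[3] * m[7] - m[4] * m[6]);

  const d = det([sxx, sxy, sx, sxy, syy, sy, sx, sy, n]);
  const a = det([sxz, sxy, sx, syz, syy, sy, sz, sy, n]) / d;
  const b = det([sxx, sxz, sx, sxy, syz, sy, sx, sz, n]) / d;
  const c = det([sxx, sxy, sxz, sxy, syy, syz, sx, sy, sz]) / d;
  return { a, b, c };
}
```

The loop is the part that scales with the point count and is what I’d want to push to the GPU; the 3×3 solve at the end is trivial.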

I’m thinking it’s not possible in a standard vertex shader, because there’s no way to sum over the vertices; they don’t “know” about one another. My only other thought is that some sort of GPGPU approach might work?

Is something like this possible? Thank you!

This sounds like the sort of thing that tensorflow.js would help with: https://github.com/tensorflow/tfjs (a WebGL-accelerated JavaScript library for training and deploying ML models). Here’s a short tutorial for linear regression: “Basic Tutorial with TensorFlow.js: Linear Regression” by Tristan Sokol on Medium.
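
For a plane z ≈ a*x + b*y + c it could look roughly like this, a sketch in the spirit of that tutorial (the tensor names, learning rate, and step count are placeholders, not something taken from the tutorial itself):

```js
import * as tf from '@tensorflow/tfjs';

// Trainable plane coefficients for z ≈ a*x + b*y + c.
const a = tf.variable(tf.scalar(Math.random()));
const b = tf.variable(tf.scalar(Math.random()));
const c = tf.variable(tf.scalar(Math.random()));

const optimizer = tf.train.sgd(0.1);

// xs, ys, zs: 1-D tensors built from the filtered point cloud,
// e.g. const xs = tf.tensor1d(xValues);
function predict(xs, ys) {
  return xs.mul(a).add(ys.mul(b)).add(c);
}

function loss(pred, zs) {
  return pred.sub(zs).square().mean();
}

function fitPlane(xs, ys, zs, steps = 100) {
  for (let i = 0; i < steps; i++) {
    // Each step runs the per-point math on the WebGL backend.
    optimizer.minimize(() => loss(predict(xs, ys), zs));
  }
  // Read the fitted coefficients back to build the plane geometry.
  return { a: a.dataSync()[0], b: b.dataSync()[0], c: c.dataSync()[0] };
}
```

A plane fit also has a closed-form solution via the normal equations, so you don’t strictly have to iterate every frame; the trainable-variable version just mirrors the tutorial’s pattern and generalizes to other regressions.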


This is fascinating! Had no idea it existed… I will look into it! Thanks!