How to use console.log() in a shader

Hello WebGL experts.
I would like to use console.log() in three.js shader code, or something similar to it.
Of course, I know that console.log() is impossible in GLSL code.

My idea is to use a uniform:
First, declare a uniform in JavaScript.
Second, change its value in the shader to something you want to show in the console.
Then, detect in JavaScript that the uniform value has changed.

Please let me know if this is possible.

I think the best method is often to output a meaningful color. The shader cannot write to a uniform, so your proposed approach won't work.
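For example, here is a minimal sketch of the "meaningful color" approach with a ShaderMaterial. It is only an illustration: valueToInspect stands in for whatever quantity you actually want to inspect, and the material names are placeholders.

    import * as THREE from 'three';

    const debugMaterial = new THREE.ShaderMaterial( {

        vertexShader: /* glsl */`
            varying vec2 vUv;
            void main() {
                vUv = uv;
                gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
            }
        `,

        fragmentShader: /* glsl */`
            varying vec2 vUv;
            void main() {
                // The quantity you would have liked to console.log();
                // here it is just the horizontal texture coordinate.
                float valueToInspect = vUv.x;

                // Encode it as a color instead: red rises with the value, green falls with it.
                // Values outside [0, 1] are clamped on screen, so scale or offset them into range first.
                gl_FragColor = vec4( valueToInspect, 1.0 - valueToInspect, 0.0, 1.0 );
            }
        `

    } );

Assign debugMaterial to the mesh you are working on and read the value off the color you see on screen.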


Yes, donmccurdy is correct. The easiest way (if we can call it easy) to debug is by purposely setting colors and looking for meaningful changes on screen that reflect the values you are testing. It takes some getting used to. The other thing you can do, which is more complex and may not be appropriate to your use case, is to output your shader to a render target and read that render target back (internally this is a gl.readPixels call), which retrieves only the numerical data from the shader's output. Three.js exposes this through the WebGLRenderer API; to use it, look at readRenderTargetPixels( renderTarget, x, y, width, height, buffer ) in the WebGLRenderer class. I use this for debugging compute shaders in WebGL/three.js and it is indispensable.
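Roughly, the pattern looks like this. This is only a sketch: it assumes you already have a renderer, scene, and camera set up, and that your hardware supports float render targets.

    import * as THREE from 'three';

    // Render into an offscreen float render target instead of the canvas.
    const renderTarget = new THREE.WebGLRenderTarget( 64, 64, { type: THREE.FloatType } );

    renderer.setRenderTarget( renderTarget );
    renderer.render( scene, camera );
    renderer.setRenderTarget( null ); // restore rendering to the canvas

    // Read back one RGBA pixel at ( x = 0, y = 0 ) into a typed array.
    const pixel = new Float32Array( 4 );
    renderer.readRenderTargetPixels( renderTarget, 0, 0, 1, 1, pixel );
    console.log( 'Shader output at ( 0, 0 ):', pixel );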

I’ve used it with the jsm/GPUComputationRenderer addon that is used in the compute shader examples, which do GPGPU computations via the old-school ping-pong method. I usually set up a simple utility function bound to a keydown event listener and let it spill data to the console on demand when I press the assigned key (see the sketch after the loop below). Remember that we are dealing with rows and columns of data, so you have to adjust to debugging data like this: you need to know the coordinates of the pixel holding the data you are looking for. In lieu of having the exact coordinates, I almost always just output each row and column via a loop. I also output the row and column coordinates ( j, i in the example below) that hold the value, so I can keep track of the data across different testing phases. If the render target is a small texture, I just let it dump the entire contents to the console. In the example below, ‘10’ is used just to get a small sample of the data; to look at all of the RGBA data on a 64x64 render target, you would change renderTargetHW to 64.

const read = new Float32Array( 4 );
const renderTargetHW = 10; // sample size; set to 64 to dump an entire 64x64 target

for ( let j = 0; j < renderTargetHW; j ++ ) {

    for ( let i = 0; i < renderTargetHW; i ++ ) {

        // read a single RGBA pixel at column i, row j into `read`
        renderer.readRenderTargetPixels( renderTarget, i, j, 1, 1, read );

        console.log( 'RenderTarget Data: row: ' + j + ', column: ' + i +
            '. Values are: ' + read[ 0 ] + ', ' + read[ 1 ] + ', ' + read[ 2 ] + ', ' + read[ 3 ] );

    }

}
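As a rough sketch of the keydown-bound utility mentioned above: the names gpuCompute and positionVariable are placeholders for your own GPUComputationRenderer instance and one of its variables, and 'd' is an arbitrary key choice.

    // Wrap the read-back loop from above in a function and bind it to a key press.
    function dumpComputeTexture() {

        // The addon ping-pongs between two render targets per variable;
        // getCurrentRenderTarget() returns the one holding the latest result.
        const renderTarget = gpuCompute.getCurrentRenderTarget( positionVariable );
        const read = new Float32Array( 4 );
        const renderTargetHW = 10;

        for ( let j = 0; j < renderTargetHW; j ++ ) {

            for ( let i = 0; i < renderTargetHW; i ++ ) {

                renderer.readRenderTargetPixels( renderTarget, i, j, 1, 1, read );
                console.log( 'row ' + j + ', column ' + i + ': ' + Array.from( read ).join( ', ' ) );

            }

        }

    }

    window.addEventListener( 'keydown', ( event ) => {

        if ( event.key === 'd' ) dumpComputeTexture();

    } );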

Yes, it is complicated, but if you are doing GPGPU computations and trying to verify your work, you will eventually end up using this, because it is the only way to get the values back from the GPU in WebGL, at least as far as I know. In any case, keep in mind that if you do use readRenderTargetPixels(), you only get back a Float32Array( 4 ) for each pixel: no text, no special characters, just numbers.