I am trying to make a ray-sphere intersection function work for a post-processing effect in three.js, but I am stuck

I’ve been trying to add a post-processing effect (taken from a Sebastian Lague video, which I am trying to convert from Unity to three.js) where any ray that hits the ocean on my mesh (the blue part) is colored white (just like in his video), and everywhere else the original color is returned. But for the life of me I can’t seem to figure out the problem. I assume my ray origin or direction might be wrong, but nothing seems to work; I get an entirely white screen instead of just the ‘ocean parts’ :confused: Here’s the code that I pass to the ray-sphere intersection function, and the function itself.

vec2 raySphere(vec3 centre, float radius, vec3 rayOrigin, vec3 rayDir) {
    vec3 offset = rayOrigin - centre;
    float a = 1.0; // set to dot(rayDir, rayDir) instead if rayDir may not be normalized
    float b = 2.0 * dot(offset, rayDir);
    float c = dot(offset, offset) - radius * radius;

    float discriminant = b * b - 4.0 * a * c;
    // No intersection: discriminant < 0
    // 1 intersection: discriminant == 0
    // 2 intersections: discriminant > 0
    if (discriminant > 0.0) {
        float s = sqrt(discriminant);
        float dstToSphereNear = max(0.0, (-b - s) / (2.0 * a));
        float dstToSphereFar = (-b + s) / (2.0 * a);

        if (dstToSphereFar >= 0.0) {
            return vec2(dstToSphereNear, dstToSphereFar - dstToSphereNear);
        }
    }
    // Ray missed the sphere
    return vec2(99999999.0, 0.0);
void main() {
    float depth = readDepth(tDepth, vUv);
    vec4 ro = inverse(modelMatrix) * vec4(cameraPosition, 1.0);
    // position in object space
    vec3 rd = normalize(position - ro.xyz);
    vec3 oceanCentre = vec3(0.0, 0.0, 0.0);
    float oceanRadius = 32.0;
    vec2 hitInfo = raySphere(oceanCentre, oceanRadius, ro.xyz, rd);
    float dstToOcean = hitInfo.x;
    float dstThroughOcean = hitInfo.y;

    vec3 rayOceanIntersectPos = ro.xyz + rd * dstToOcean - oceanCentre;
    // dst that view ray travels through ocean (before hitting terrain / exiting ocean)
    float oceanViewDepth = min(dstThroughOcean, depth - dstToOcean);
    vec4 oceanCol;
    float alpha;

    if (oceanViewDepth > 0.0) {
        gl_FragColor = vec4(vec3(1.0), 0.1);
    } else {
        gl_FragColor = texture2D(tDiffuse, vUv);
    }
}

If anyone can help point out where I might be messing up, or point me to some resources, that’d be much appreciated. Thanks!
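To narrow things down, the quadratic in raySphere can be run on the CPU; here is a plain-JavaScript port of the function above (a test sketch only, not part of the shader):

```javascript
// CPU port of the GLSL raySphere, for sanity-checking the math outside the shader.
// Returns [distance to near hit, distance through sphere], or [Infinity, 0] on a miss.
function raySphere(centre, radius, rayOrigin, rayDir) {
  const dot = (u, v) => u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
  const offset = rayOrigin.map((v, i) => v - centre[i]);
  const a = dot(rayDir, rayDir); // 1.0 when rayDir is normalized
  const b = 2.0 * dot(offset, rayDir);
  const c = dot(offset, offset) - radius * radius;
  const discriminant = b * b - 4.0 * a * c;
  if (discriminant > 0.0) {
    const s = Math.sqrt(discriminant);
    const near = Math.max(0.0, (-b - s) / (2.0 * a));
    const far = (-b + s) / (2.0 * a);
    if (far >= 0.0) return [near, far - near];
  }
  return [Infinity, 0.0];
}

// Camera 100 units out on +z, looking at a radius-32 sphere at the origin:
console.log(raySphere([0, 0, 0], 32, [0, 0, 100], [0, 0, -1])); // → [ 68, 64 ]
```

With origin (0, 0, 100) and direction (0, 0, −1) this returns [68, 64] (100 − 32 = 68 to the surface, then 64 through the sphere), so the intersection math itself checks out; the problem is more likely in the ray origin/direction setup.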


I was also struggling with this code, but with his project on planet atmospheres (see here).

In Unity the direction away from the camera (in view space) is z+ and in Three.js it’s z-. This might mess things up in:
a) giving the right ray direction vector (the normalized viewVector in his code)
b) comparing the found intersection points
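A tiny numeric illustration of that sign difference (assuming a camera at the origin with the default orientation, i.e. an identity view matrix):

```javascript
// In three.js / OpenGL view space the camera looks down -z, so a point in
// front of the camera has a negative viewZ (Unity's view space gives it +z).
// Distances compared against ray-intersection results must be positive,
// so the sign has to be flipped first:
function viewZToDistance(viewZ) {
  return -viewZ;
}

console.log(viewZToDistance(-5)); // → 5: a point 5 units in front of the camera
```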

After some trial-and-error I got the following to work:

Vertex shader

varying vec4 vPos;
varying vec3 viewVector;

void main() {
    vPos = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    viewVector = vec3( modelMatrix * vec4(position, 1.0) ) - cameraPosition;
    // ...etc
}

Fragment shader
Then, in the fragment shader, I put a different sphere intersection function, which I found on this page:

vec2 RaySphereIntersection(
	in vec3 spherePos, in float sphereRadius,
	in vec3 rayPos, in vec3 rayDir
) {
	float dist1 = 0.0;
	float dist2 = 0.0;
	vec3 o_minus_c = rayPos - spherePos;

	float p = dot(rayDir, o_minus_c);
	float q = dot(o_minus_c, o_minus_c) - (sphereRadius * sphereRadius);
	float discriminant = (p * p) - q;
	if (discriminant <= 1e-7) {
		// No intersections or one (a tangent point on the sphere)
		return vec2(-1.0);
	}

	// Two intersections

	float dRoot = sqrt(discriminant);
	dist1 = -p - dRoot;
	dist2 = -p + dRoot;

	return vec2(dist1, dist2);
}

This function returns the two distances to the intersection points, or vec2(-1.0) if no intersection was found or the ray only barely grazes the sphere.
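The same function ported to plain JavaScript, with a quick check of the returned distances (a test sketch; like the GLSL version, it assumes rayDir is normalized):

```javascript
// CPU port of RaySphereIntersection above; rayDir must be normalized.
// Returns [near distance, far distance], or [-1, -1] on a miss or tangent hit.
function raySphereIntersection(spherePos, sphereRadius, rayPos, rayDir) {
  const dot = (u, v) => u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
  const oMinusC = rayPos.map((v, i) => v - spherePos[i]);
  const p = dot(rayDir, oMinusC);
  const q = dot(oMinusC, oMinusC) - sphereRadius * sphereRadius;
  const discriminant = p * p - q;
  if (discriminant <= 1e-7) return [-1.0, -1.0];
  const dRoot = Math.sqrt(discriminant);
  return [-p - dRoot, -p + dRoot];
}

// Camera 100 units out on +z, sphere of radius 32 at the origin:
console.log(raySphereIntersection([0, 0, 0], 32, [0, 0, 100], [0, 0, -1])); // → [ 68, 132 ]
```

Note that it returns (near, far) distances, whereas the raySphere in the question returns (near, distance through the sphere); the comparison logic has to account for that.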

I also added the readDepth function from the depth texture example, but changed it a bit so that it returns the z distance in view space (so z- in the case of Three.js). The uniforms are supplied in JS just as in the example, using a THREE.WebGLRenderTarget's texture and depthTexture.

#include <packing>
uniform sampler2D tDiffuse;
uniform sampler2D tDepth;
uniform float cameraNear;
uniform float cameraFar;

float readDepth( sampler2D depthSampler, vec2 coord ) {
	float fragCoordZ = texture2D( depthSampler, coord ).x;
	float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
	return viewZ;
}

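For reference, perspectiveDepthToViewZ (from three.js' <packing> shader chunk) is the standard perspective-depth linearization; the same formula in plain JavaScript shows why the result needs the sign flip mentioned above:

```javascript
// Convert a [0, 1] depth-buffer value back to view-space z for a perspective
// camera (same formula as perspectiveDepthToViewZ in three.js' <packing> chunk).
// The result is negative because three.js view space looks down -z.
function perspectiveDepthToViewZ(depth, near, far) {
  return (near * far) / ((far - near) * depth - far);
}

console.log(perspectiveDepthToViewZ(0.0, 0.1, 100)); // ≈ -0.1 (near plane)
console.log(perspectiveDepthToViewZ(1.0, 0.1, 100)); // ≈ -100 (far plane)
```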
Now, in the fragment shader, I use it like this:

varying vec4 vPos;
varying vec3 viewVector;

void main() {
	// Get depth and original color
	vec2 vCoords = (vPos.xy / vPos.w) * 0.5 + 0.5;
	vec4 originalColor = texture2D( tDiffuse, vCoords ).rgba;
	// Get the depth value and invert z-direction for compatibility with calculations
	// (so z+ is in front of the camera and z- is behind the camera, out of sight)
	float sceneDepth = -1.0 * readDepth( tDepth, vCoords );

	vec3 rayOrigin = cameraPosition;
	vec3 rayDir = normalize(viewVector);

	vec2 test = RaySphereIntersection(planetCentre, atmosphereRadius, rayOrigin, rayDir);

	if (test.y > 0.0) {
		// If the ray intersected
		float dstFar = min(sceneDepth, test.y);
		float dstNear = max(0.0, test.x);
		float dstThroughAtmosphere = (dstFar - dstNear);

		// Uncomment this line to see if it works correctly:
		// gl_FragColor = vec4(dstThroughAtmosphere / (atmosphereRadius * 2.0));

		vec3 pNear = rayOrigin + rayDir * dstNear;
		vec3 pFar = rayOrigin + rayDir * dstFar;

		// ..etc
	} else {
		gl_FragColor = originalColor;
	}
}
Of course, planetCentre and atmosphereRadius would be oceanCentre and oceanRadius in your case.
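One detail worth noting: the vCoords line performs the perspective divide on the interpolated clip-space position and remaps the resulting NDC xy from [-1, 1] to the [0, 1] range needed to sample tDiffuse and tDepth. The same mapping in JavaScript:

```javascript
// Clip space -> UV: divide by w (perspective divide to NDC), then remap [-1, 1] to [0, 1].
function clipToUv(clipX, clipY, clipW) {
  return [(clipX / clipW) * 0.5 + 0.5, (clipY / clipW) * 0.5 + 0.5];
}

console.log(clipToUv(0, 0, 1));  // → [ 0.5, 0.5 ] (screen centre)
console.log(clipToUv(-2, 2, 2)); // → [ 0, 1 ] (top-left corner)
```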

Hope this helps!

Note: the intersection function I used seems to have fewer issues with z-fighting on render (which Sebastian also encounters in his video).

Sebastian implemented this sphere intersection calculation, I think. However, I’m not sure whether your raySphere is correct…
