Grass Shader Confusion

I am attempting to implement this technique of rendering grass into my three.js app.

On level terrain at y position 0, everything looks absolutely fantastic!

The problem is that my app (a game) modifies the terrain with a heightmap, so very few (if any) positions on that terrain are at y position 0.

It seems this animation code assumes the grass object is sitting at y position 0 for the following vertex shader code to work as intended:

	if (pos.y > 1.0) {
		float noised = noise(pos.xy);
		pos.y += sin(globalTime * magnitude * noised);
		pos.z += sin(globalTime * magnitude * noised);
		if (pos.y > 1.7) {
			pos.x += sin(globalTime * noised);
		}
	}

This condition works on the assumption that the terrain is flat and at position 0, so that only vertices above the ground animate. Well… umm… since with a heightmap (mostly) all vertices are above 1, some strange effects occur, such as grass sliding all over the place lol.

Is there a way to do this where I can specify a y position threshold based on the grass sprite itself rather than its world position? Or is there a better way altogether to deal with this “slidy” problem?

I am an extreme noobie when it comes to shader code =]

Any help would be greatly appreciated.

I have no idea what I’m doing.

Edit* Ok, I think the issue is that I am altering the y position of each mesh merged into the main grass container geometry based on the y position of the terrain it sits on. I guess the shader is looking at the local position, but since the geometry itself is vertically displaced, the shader doesn’t know how to compensate. Hmm…

Ok, I made a fiddle that demonstrates the issue:

Change the value on line# 128 to a 1 instead of 2 and everything looks fine. Not sure how to go about resolving this =[

Also, I have no idea why the colors are doing that, they look fine in my app.

One way to solve the problem is to create a new buffer attribute and fill it with the height of each grass element. But this means you have to port loadGeometry() to BufferGeometry first.

In your vertex shader, you would then have an additional attribute float height that you can use as a replacement for the hardcoded value of 1.0.
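The idea above can be sketched like this. This is a minimal, hedged example: `buildHeightAttribute`, `bladeHeights`, and `verticesPerPlane` are hypothetical names, assuming each grass plane contributes a fixed number of vertices to the merged non-indexed geometry.

```javascript
// Sketch: build a per-vertex "height" attribute for a merged,
// non-indexed BufferGeometry. bladeHeights holds one terrain offset
// per grass plane; every vertex of that plane gets the same value,
// so the vertex shader can subtract it to recover local height.
function buildHeightAttribute(bladeHeights, verticesPerPlane) {
  const heights = new Float32Array(bladeHeights.length * verticesPerPlane);
  for (let i = 0; i < bladeHeights.length; i++) {
    heights.fill(bladeHeights[i], i * verticesPerPlane, (i + 1) * verticesPerPlane);
  }
  // In three.js you would then attach it with something like:
  // geometry.setAttribute('height', new THREE.BufferAttribute(heights, 1));
  return heights;
}
```

The shader condition then becomes `if (pos.y - height > 1.0)` instead of comparing against the world-displaced y directly.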


Ah ok, interesting. So those “position” and “normal”, “uv” attributes in the shader come directly from the BufferGeometry attributes. That makes sense =]

The reason the loadGeometry function waits till the end to convert to BufferGeometry, as opposed to using it all along (as I’d prefer), is that BufferGeometry has neither the “mergeMesh” method nor the secondary matrix parameter on its merge function.

Is there a way to merge buffer geometries with translated meshes the way you can with regular old Geometry?

Regarding the double post, sorry, was not 100% sure where this belonged (def not three.js issues section hehe, got yelled at for that already =]). For future reference what is the preferred method of asking these app specific questions?

Also, thank you for your response =]

Ok sigh…

So I got it to where the app at least does not break, but my limited knowledge of BufferGeometry in general seems to be holding me back.

In this modified fiddle:

I am attempting to add a “heightmap” attribute that includes the height offset for each vertex. Each grass plane contains 30 vertices in regular Geometry, so I figured I would add 30 values of each height offset per plane.

Turns out that resulted in an “out of range vertices” error. Apparently 30 was not enough as the BufferGeometry “position” attribute has a count of 42,000, and the “height” attribute had a count of 15,000. I tripled it to 90 just to test and it at least didn’t break but obviously the mapping is way off.

lol, I’m trying =[


Ok, lol I “solved” it in a super low-brow way (I don’t really understand why it works). I simply created one plane, which told me that the position/normal/uv attribute count was 84 for a single plane. I changed my arbitrary “90” value to 84 and it’s working like a charm lol.

I guess a single plane split into 14 sections creates 84 vertices in non-indexed BufferGeometry, which makes sense: 2 triangles per section × 14 sections = 28 triangles, × 3 vertices each = 84… edited to correct stupidness
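That count generalizes to any plane grid. A tiny sketch (the function name is made up for illustration), assuming a non-indexed geometry where every triangle stores its own three vertices:

```javascript
// Non-indexed vertex count for a PlaneGeometry-style grid:
// each grid cell is split into 2 triangles, and de-indexing
// stores 3 separate vertices per triangle.
function nonIndexedVertexCount(widthSegments, heightSegments) {
  return widthSegments * heightSegments * 2 * 3;
}
```

For the 1 × 14 plane above this gives 84, matching the attribute count observed in the merged geometry.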

I’m getting it!!!

Now… on to figuring out how the hell to get fog working on ShaderMaterial. Will try to hunt down an answer; if not, will be back to bother y’all =]

You can use BufferGeometry.applyMatrix() to apply the world matrix of the mesh to the geometry. Then you can use BufferGeometryUtils.mergeBufferGeometries() for merging. Just pass all the existing geometries as an array to the function call.
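To make the mechanics concrete, here is a self-contained sketch of what that transform-then-merge step does to the raw position buffers, assuming column-major 4×4 matrices (the three.js convention) and non-indexed positions. In three.js itself you would simply call `geometry.applyMatrix(mesh.matrixWorld)` followed by `BufferGeometryUtils.mergeBufferGeometries(geometries)` rather than doing this by hand; the function names here are illustrative.

```javascript
// Transform a flat [x, y, z, x, y, z, ...] position array by a
// column-major 4x4 matrix (translation lives in m[12..14]).
function applyMatrixToPositions(positions, m) {
  const out = new Float32Array(positions.length);
  for (let i = 0; i < positions.length; i += 3) {
    const x = positions[i], y = positions[i + 1], z = positions[i + 2];
    out[i]     = m[0] * x + m[4] * y + m[8]  * z + m[12];
    out[i + 1] = m[1] * x + m[5] * y + m[9]  * z + m[13];
    out[i + 2] = m[2] * x + m[6] * y + m[10] * z + m[14];
  }
  return out;
}

// Merging non-indexed geometries is then just concatenating the
// (already world-transformed) attribute arrays.
function mergePositions(arrays) {
  const total = arrays.reduce((n, a) => n + a.length, 0);
  const merged = new Float32Array(total);
  let offset = 0;
  for (const a of arrays) {
    merged.set(a, offset);
    offset += a.length;
  }
  return merged;
}
```

Applying the matrix first is what lets each grass patch keep its terrain offset after the meshes are merged into one geometry.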

Ohhh. interesting thank you!

Getting there =]

Got it converted to using solely BufferGeometry. Loads about 100x faster lol. Thank you so much.


I cannot find the working sample on jsfiddle… can you share the link? The one with BufferGeometry and instanceBuffer?

I just want to add that one can leverage an existing attribute. For example, if the terrain is just deformed in Z, the XY coordinates probably overlap the UVs, freeing them up. You could encode your local height in the U channel, for example. If you do need the UVs, you can use vertex colors.
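A quick sketch of that suggestion, assuming the shader no longer needs real texture coordinates in U (the function name is hypothetical):

```javascript
// Reuse the U channel of a flat [u, v, u, v, ...] uv buffer to carry
// a per-vertex height offset, leaving V untouched. The vertex shader
// can then read the offset from uv.x instead of a dedicated attribute.
function encodeHeightInU(uvArray, heights) {
  for (let i = 0; i < heights.length; i++) {
    uvArray[i * 2] = heights[i]; // overwrite U, keep V
  }
  return uvArray;
}
```

This avoids adding a new attribute entirely, at the cost of giving up the U coordinate (or, with vertex colors, one color channel).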


Cool idea. I took the instancing route, which you have been helpful with in the past. Thank you.
