Minecraft Clone – Lighting and Shaders

In this post I discuss the techniques I used to improve the visuals of my Minecraft clone: a mix of built-in Unity tools and custom shaders.

Built-in Lighting

In a previous post I covered how voxels are converted to a native Unity triangle mesh before rendering. The main benefit of this is that I can take full advantage of Unity’s built-in render pipeline.

This means I get basic transparency, shading, and shadows for free. This alone produces some nice results, as seen in the before/after example below:

While the scene is now passable, some areas still look “flat” compared to the real Minecraft. For example, the corners in the ravine are hard to distinguish. This is fixed using a post processing effect.

Post Processing

Unity supports a post processing stage that runs after the initial rendering of the scene, but before the pixels are displayed to the screen. This is useful for making final adjustments to the image, such as adding bloom or motion blur.

In this case, I wanted to add an effect called ambient occlusion. This is a phenomenon where certain parts of a surface receive less ambient light than others, causing them to appear darker. This is why the corners of a room typically appear darker than the center of the walls.

There are a few common methods for calculating ambient occlusion. I opted to use screen-space ambient occlusion (SSAO) since Unity already provides it as a post processing effect. The results are shown below:

Now the corners are easily visible, giving the scene much more depth and realism.
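To make the idea concrete, here is a heavily simplified Python sketch of how SSAO estimates occlusion from the depth buffer: for each pixel, sample nearby depths and count how many neighbors sit in front of it. Unity’s actual implementation is far more sophisticated (it samples a hemisphere oriented around the surface normal and blurs the result), so treat the function and parameters below as my own illustration, not Unity’s API.

```python
import random

def ssao_factor(depth, x, y, radius=2, samples=8, bias=0.01):
    """Estimate an occlusion factor for pixel (x, y) of a 2D depth buffer.

    Returns a value in [0, 1]: 1 = fully lit, lower = more occluded.
    """
    h, w = len(depth), len(depth[0])
    occluded = 0
    for _ in range(samples):
        # Pick a random nearby pixel, clamped to the buffer bounds.
        sx = min(max(x + random.randint(-radius, radius), 0), w - 1)
        sy = min(max(y + random.randint(-radius, radius), 0), h - 1)
        # A neighbor noticeably closer to the camera occludes this pixel.
        if depth[sy][sx] < depth[y][x] - bias:
            occluded += 1
    return 1.0 - occluded / samples

# A flat wall (uniform depth) receives no occlusion:
flat = [[5.0] * 4 for _ in range(4)]
assert ssao_factor(flat, 2, 2) == 1.0
```

The darkened pixels in a corner fall out of this naturally: geometry meeting at a corner guarantees some neighbors are closer to the camera, so the factor drops below 1.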

How does Minecraft implement ambient occlusion?

Minecraft bakes the ambient occlusion values directly into the vertices of its chunk meshes. This adds some additional logic to the meshing and shading algorithms, but means the expensive post processing stage can be skipped entirely. See more details here.
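The per-vertex calculation widely used for baked voxel AO can be sketched in a few lines: each face vertex looks at its two edge-adjacent blocks and the corner block, and darkens accordingly. This is the commonly cited scheme for voxel engines; Minecraft’s actual implementation may differ in details.

```python
def vertex_ao(side1, side2, corner):
    """Baked AO level (0 = darkest, 3 = fully lit) for one face vertex,
    given whether the two edge-adjacent blocks and the corner-adjacent
    block are solid (booleans)."""
    if side1 and side2:
        # Both sides solid: the corner is fully enclosed either way.
        return 0
    return 3 - (side1 + side2 + corner)

assert vertex_ao(False, False, False) == 3  # open air: no occlusion
assert vertex_ao(True, True, False) == 0    # enclosed corner: darkest
assert vertex_ao(True, False, True) == 1    # one side plus the corner
```

These levels get baked into the chunk mesh during meshing, so the shader just interpolates them across each face at no extra per-frame cost.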

Displacement Shader

After getting the terrain generation up and running, I noticed that my voxel world felt rather rigid and lifeless. To help with this, I decided to add some wind and wave effects. Both of these boil down to nudging mesh vertices back and forth along some direction. For example, a wave should nudge vertices up and down, while wind should nudge them side to side.

This is accomplished using a displacement shader – a special type of shader that changes the position of vertices before rendering them to the screen. Technically this nudging could be applied to the mesh data itself on the CPU, but it’s much more performant to translate the vertices on the GPU inside a shader.

The distance a vertex is nudged is determined by an offset function that takes in the vertex’s world position and spits out a value between -1 and 1. An output of 1 causes the vertex to move the maximum distance in the specified direction, while an output of -1 moves it the maximum distance in the opposite direction. If the output is 0, the vertex does not move at all. As time passes, we can “slide” this function across our world to animate the vertices.

Below is a simplified 2D example that moves vertices in the vertical direction according to a sine function. If I were to slide the sine function along the x-axis, it would create the appearance of waves moving across the surface of the line.

Diagram showing vertices being offset vertically by the sine function.
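In code, that sliding sine offset boils down to a couple of small functions. This is a Python sketch of the math only (the real version lives in a shader, and the parameter names here are placeholders of my own):

```python
import math

def offset(world_x, time, frequency=0.5, speed=1.0):
    """Offset function: maps a world position (and time) to [-1, 1].
    Subtracting speed * time slides the sine wave along the x-axis."""
    return math.sin(frequency * world_x - speed * time)

def displaced_height(base_y, world_x, time, max_distance=0.25):
    """An offset of 1 moves the vertex the full max_distance upward,
    -1 the full distance downward, and 0 leaves it in place."""
    return base_y + offset(world_x, time) * max_distance

# A vertex sitting on a zero crossing of the sine wave doesn't move:
assert displaced_height(10.0, 0.0, 0.0) == 10.0
```

Evaluating this per vertex each frame, with `time` ticking forward, is exactly the wave animation described above.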

Instead of using sine for my offset function, I sample a smooth and tileable 2D noise texture. This produces somewhat random movements in the vertices, which appears more natural.

2D noise texture sampled by my displacement shader.
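Sampling a tileable texture like this comes down to wrapping the coordinates and remapping the stored [0, 1] values into the [-1, 1] offset range. Here is a minimal nearest-neighbor sketch of that lookup (real texture sampling would also interpolate smoothly between texels):

```python
def sample_noise(noise, u, v):
    """Sample a tileable 2D noise grid at coordinates (u, v).
    `noise` holds values in [0, 1]; the result is remapped to [-1, 1]
    so it can be used directly as a displacement offset."""
    h, w = len(noise), len(noise[0])
    x = int(u * w) % w  # wrap so the texture tiles seamlessly
    y = int(v * h) % h
    return noise[y][x] * 2.0 - 1.0

noise = [[0.5, 1.0],
         [0.0, 0.5]]
assert sample_noise(noise, 0.0, 0.0) == 0.0  # mid-gray -> no offset
assert sample_noise(noise, 0.5, 0.0) == 1.0  # white -> full positive offset
assert sample_noise(noise, 1.0, 0.0) == 0.0  # wraps back to the first texel
```

The wrap in both axes is what lets the shader slide the sample coordinates forever without visible seams.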

If we take a look at a cross section running through the middle of the noise texture (red line) and apply it to the simplified example, we get the following displacement:

Displacement caused by noise texture cross section. Notice the negative displacement caused by the darker pixels in the center of the noise texture.

The final step to complete my displacement shader is to selectively apply offsets to different vertices. As I mentioned in a previous post, I save on draw calls by rendering each chunk using a single material (i.e. shader). Therefore, this shader runs on the vertices of every block type, some of which we want to offset and some we don’t.

To differentiate these, I assign different vertex colors during the meshing process. I ended up with four offset categories:

Vertex Color | Offset                | Example Blocks
Black        | None                  | Stone, Dirt, Wood
Blue         | Vertical              | Water
Red          | Horizontal            | Grass, Flowers
Green        | Vertical + Horizontal | Leaves
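Here is a sketch of how such vertex-color masks can gate the offset. Reading the table as “blue or green enables vertical movement, red or green enables horizontal” is my interpretation of the scheme, and the Python below only mirrors the logic the shader would perform:

```python
def displace(pos, color, offset, max_dist=0.25):
    """Apply a displacement offset to one vertex based on its color mask.
    pos = (x, y, z); color = (r, g, b) per the categories above."""
    x, y, z = pos
    r, g, b = color
    vertical = b + g    # blue or green channel enables vertical movement
    horizontal = r + g  # red or green channel enables horizontal movement
    d = offset * max_dist
    return (x + d * horizontal, y + d * vertical, z + d * horizontal)

# Stone (black) never moves; water (blue) only bobs vertically:
assert displace((0, 0, 0), (0, 0, 0), 1.0) == (0, 0, 0)
assert displace((0, 0, 0), (0, 0, 1), 1.0) == (0, 0.25, 0)
```

Because the mask rides along in the vertex data, the whole chunk still renders with one material and one draw call.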

If I return the vertex color directly from the shader, the offset categories become clear:

Vertex colors used by displacement shader.

Putting all these techniques together, I get the following result:

Example of displacement shader creating a wind effect on the leaves and grass.