Tuesday, March 16, 2010

GPU Procedural Planet

This example does not use any textures to colour the terrain, so it is 100% procedurally generated. In basic terms, no media is loaded: everything is generated in the engine.

Let’s take some time to recap what needs to be procedurally generated in order to render the planet (shown in the video below).

Permutations Texture
The topography of the planet is created using noise algorithms. This example uses ridged multifractal noise, a variant of fractional Brownian motion (fBm). At runtime this noise is built from a series of permutations which are stored in a texture so that they can be accessed in a shader. The texture data doesn't make a lot of visual sense, but here is an example of what it looks like.
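As a rough illustration, here is how such a permutation table might be built on the CPU before being uploaded as a 256 x 1 single-channel texture. This is a minimal sketch, not Decade Engine's actual code; the function name and the choice of std::mt19937 are illustrative.

    #include <algorithm>
    #include <cstdint>
    #include <numeric>
    #include <random>
    #include <vector>

    // Build a 256-entry permutation table (a shuffle of 0..255) and return
    // it as byte data suitable for uploading as a 256 x 1 texture. The
    // shader samples it with point filtering and wrap addressing, so
    // perm[i & 255]-style lookups behave exactly as they would on the CPU.
    std::vector<std::uint8_t> BuildPermutationTextureData(unsigned seed)
    {
        std::vector<std::uint8_t> perm(256);
        std::iota(perm.begin(), perm.end(), 0);             // fill with 0..255
        std::shuffle(perm.begin(), perm.end(), std::mt19937(seed));
        return perm;                                        // one byte per texel
    }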


1 Vertex Buffer
The planet is rendered as a series of patches. It is this patch structure that allows recursive subdivision, increasing or decreasing the visible level of detail. Whereas the CPU Planet generates a unique vertex buffer for each patch (the noise is calculated when the patch is created and baked into the height data of its vertex buffer), the GPU Planet uses a single, procedurally generated vertex buffer of X * X vertices, which is displaced in a shader at runtime for each patch to be rendered.
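A sketch of how that single grid might be generated (names are illustrative; the engine presumably fills a D3D/OpenGL vertex buffer with equivalent data). Each vertex only stores its normalised grid position; the shader maps it onto the patch, projects it onto the sphere and applies the height noise.

    #include <cstddef>
    #include <vector>

    struct PatchVertex { float u, v; };   // grid position in [0,1] x [0,1]

    // Build the single X * X vertex grid shared by every patch. Patch
    // placement, sphere projection and height displacement all happen in
    // the vertex shader, so this buffer is built once and never changes.
    std::vector<PatchVertex> BuildPatchGrid(int x)   // x = vertices per side, >= 2
    {
        std::vector<PatchVertex> verts;
        verts.reserve(static_cast<std::size_t>(x) * x);
        for (int row = 0; row < x; ++row)
            for (int col = 0; col < x; ++col)
                verts.push_back({ col / float(x - 1), row / float(x - 1) });
        return verts;
    }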


16 Index Buffers
An index buffer is used along with a vertex buffer to render geometry. In many cases one vertex buffer is paired with one index buffer. As described in previous posts, a terrain requires 16 index buffers, generated procedurally, so that there are no terrain cracks: the edges of adjacent terrain patches with different levels of detail must join together seamlessly.
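The 16 buffers correspond to the 2^4 combinations of the four patch edges, each of which may border a coarser neighbour. The sketch below shows one simple way such buffers could be built: along a coarser edge, odd boundary vertices are snapped onto their even neighbours, so the edge follows the coarse neighbour exactly (the resulting degenerate triangles draw nothing). This assumes an odd vertex count per side (e.g. 33) and is only an illustration of the idea; the engine's actual stitching scheme may differ.

    #include <cstdint>
    #include <vector>

    enum Edge { North = 1, South = 2, West = 4, East = 8 };

    std::vector<std::uint32_t> BuildPatchIndices(int x, int mask)
    {
        // Snap odd boundary vertices toward even ones on coarser edges.
        auto remap = [&](int col, int row) -> std::uint32_t {
            if ((mask & North) && row == 0)     col &= ~1;
            if ((mask & South) && row == x - 1) col &= ~1;
            if ((mask & West)  && col == 0)     row &= ~1;
            if ((mask & East)  && col == x - 1) row &= ~1;
            return static_cast<std::uint32_t>(row * x + col);
        };

        std::vector<std::uint32_t> indices;
        for (int row = 0; row < x - 1; ++row)
            for (int col = 0; col < x - 1; ++col) {
                std::uint32_t i0 = remap(col,     row);
                std::uint32_t i1 = remap(col + 1, row);
                std::uint32_t i2 = remap(col,     row + 1);
                std::uint32_t i3 = remap(col + 1, row + 1);
                indices.insert(indices.end(), { i0, i2, i1, i1, i2, i3 });
            }
        return indices;
    }

    // All 16 buffers, one per edge configuration:
    // for (int mask = 0; mask < 16; ++mask)
    //     buffers[mask] = BuildPatchIndices(x, mask);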



The video above shows a basic GPU Planet. There is quite an obvious bug visible as the camera moves. Because all noise is generated on the GPU, the Decade Engine running on the CPU has no knowledge of how far a vertex is displaced. All distance checking from the camera to the planet is therefore measured from the camera position to the underlying sphere of the planet (vertices displaced to the planet's radius, but without height noise applied). This is fine while the camera is high above the terrain, but as it moves close to the surface, especially over ground at high altitude, the sphere position may still be some distance beneath, so terrain subdivision does not occur properly.
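To make the problem concrete, the subdivision test effectively behaves like the sketch below (names and the split threshold are made up for illustration, not Decade Engine's actual code): the CPU can only measure distance to the base sphere, so it under-estimates how close the camera really is to high ground.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    float Length(Vec3 v)        { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }
    Vec3  Sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    Vec3  Scale(Vec3 v, float s){ return { v.x * s, v.y * s, v.z * s }; }

    bool ShouldSubdivide(Vec3 camera, Vec3 patchCentreDir,
                         float planetRadius, float patchSize)
    {
        // Patch centre on the base sphere: radius only, no height noise,
        // because the CPU never sees the GPU-generated displacement. Over
        // high terrain the real surface is well above this point.
        Vec3 onSphere  = Scale(patchCentreDir, planetRadius);
        float distance = Length(Sub(camera, onSphere));
        return distance < patchSize * 2.0f;   // illustrative split threshold
    }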

I am considering 2 possible techniques to overcome this:
  1. Generate the same noise values on the CPU as are generated on the GPU. Since all (pseudo) random data is stored in the permutations texture, this should be possible (see the sketch after this list).
  2. Render the height data to a texture instead of generating it as required each frame, then use this texture both for shader vertex displacement and for calculating the height of key vertices on the CPU.
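For option 1, the CPU-side noise would have to match the shader bit-for-bit: same permutation table, same gradient and fade functions, same octave combination. As a rough sketch of what that might look like, assuming the shader implements improved Perlin noise with a ridged multifractal combiner (function names, octave weighting and parameters here are illustrative, not necessarily what the Decade Engine shader does):

    #include <algorithm>
    #include <cmath>
    #include <cstdint>
    #include <numeric>
    #include <random>
    #include <vector>

    // The same 256-entry permutation that was uploaded to the texture,
    // duplicated to 512 entries so the p[A + 1] lookups below never wrap.
    std::vector<std::uint8_t> BuildPermutation512(unsigned seed)
    {
        std::vector<std::uint8_t> perm(256);
        std::iota(perm.begin(), perm.end(), 0);
        std::shuffle(perm.begin(), perm.end(), std::mt19937(seed));
        std::vector<std::uint8_t> table(perm);
        table.insert(table.end(), perm.begin(), perm.end());
        return table;
    }

    static float Fade(float t) { return t * t * t * (t * (t * 6 - 15) + 10); }
    static float Lerp(float a, float b, float t) { return a + t * (b - a); }

    // Improved-Perlin gradient function.
    static float Grad(int h, float x, float y, float z)
    {
        h &= 15;
        float u = h < 8 ? x : y;
        float v = h < 4 ? y : (h == 12 || h == 14 ? x : z);
        return ((h & 1) ? -u : u) + ((h & 2) ? -v : v);
    }

    // 3D Perlin noise over the 512-entry permutation table p.
    static float Noise3(const std::uint8_t* p, float x, float y, float z)
    {
        int X = (int)std::floor(x) & 255;
        int Y = (int)std::floor(y) & 255;
        int Z = (int)std::floor(z) & 255;
        x -= std::floor(x); y -= std::floor(y); z -= std::floor(z);
        float u = Fade(x), v = Fade(y), w = Fade(z);
        int A = p[X] + Y,     AA = p[A] + Z, AB = p[A + 1] + Z;
        int B = p[X + 1] + Y, BA = p[B] + Z, BB = p[B + 1] + Z;
        return Lerp(Lerp(Lerp(Grad(p[AA], x, y, z),         Grad(p[BA], x - 1, y, z), u),
                         Lerp(Grad(p[AB], x, y - 1, z),     Grad(p[BB], x - 1, y - 1, z), u), v),
                    Lerp(Lerp(Grad(p[AA + 1], x, y, z - 1), Grad(p[BA + 1], x - 1, y, z - 1), u),
                         Lerp(Grad(p[AB + 1], x, y - 1, z - 1),
                              Grad(p[BB + 1], x - 1, y - 1, z - 1), u), v), w);
    }

    // One common ridged multifractal formulation: invert and square each
    // octave, and weight it by the previous octave's value so ridges stay
    // sharp at high ground and smooth in valleys.
    float RidgedMultifractal(const std::uint8_t* p, float x, float y, float z,
                             int octaves, float lacunarity, float gain)
    {
        float sum = 0.0f, amplitude = 0.5f, frequency = 1.0f, prev = 1.0f;
        for (int i = 0; i < octaves; ++i) {
            float n = 1.0f - std::fabs(Noise3(p, x * frequency, y * frequency, z * frequency));
            n *= n;
            sum += n * amplitude * prev;
            prev = n;
            frequency *= lacunarity;
            amplitude *= gain;
        }
        return sum;
    }

In practice even this is only half the battle: the GPU evaluates the shader in floating point with its own precision and instruction ordering, so exact agreement between CPU and GPU heights would still need careful verification.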