Displacement
Background
Displacement is a technique used to give greater shape and detail to surfaces without using precalculated geometry. This means you can take an object made up of relatively few polygons and add much more detail. During rendering, scene elements such as the terrain or objects are broken up into micropolygons. Displacement moves these micropolygons in 3D space to create more detailed shapes.
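To illustrate the basic idea, here is a minimal Python sketch of what displacing a micropolygon amounts to: each vertex is moved along a chosen direction by an amount supplied by a shader. This is a conceptual illustration only, not Terragen's renderer code, and the values are made up.

    # Conceptual sketch: displacement moves micropolygon vertices in 3D space.
    def displace_vertex(position, direction, amount):
        # Return the vertex moved along 'direction' by 'amount'.
        return tuple(p + d * amount for p, d in zip(position, direction))

    # A flat micropolygon on the XZ plane, displaced upwards (+Y) by a
    # hypothetical shader value of 0.25 at each corner.
    flat_quad = [(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)]
    up = (0, 1, 0)
    displaced_quad = [displace_vertex(v, up, 0.25) for v in flat_quad]
    print(displaced_quad)  # each corner is now 0.25 units higher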
Displacement is a fundamental part of the Terragen 4 rendering engine. All of the terrain is created by applying displacement to the smooth sphere of the underlying planet, even when using heightfields. Displacement can create features from the size of mountain ranges down to little pebbles. Displacement can also be applied to other parts of a scene, such as objects like rocks or imported 3D models.
You might be familiar with bump mapping, another technique which can be used to give surfaces the appearance of more detail. It uses a bump texture to simulate lighting effects that give the impression of a surface having more shape, making it "bumpier". The difference between displacement and bump mapping is that displacement creates real 3D geometry whereas bump mapping fakes the appearance. With displacement a flat surface takes on a real 3D shape and looks 3D from all viewing angles. With bump mapping the flat surface stays flat, and this is obvious from many viewing angles, especially when looking from the side.
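The contrast can be sketched in a few lines of Python. This is an illustration with made-up numbers and a very simple diffuse shading term, not how any particular renderer implements either technique: bump mapping leaves the geometry where it is and only changes the normal used for lighting, while displacement moves the geometry itself.

    # Bump mapping vs displacement (conceptual illustration only).
    def lambert(normal, light_dir):
        # Very simple diffuse term: dot(normal, light direction).
        return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

    position = (0.0, 0.0, 0.0)      # a point on a flat surface
    true_normal = (0.0, 1.0, 0.0)   # the surface's real normal
    light_dir = (0.0, 0.707, 0.707)

    # Bump mapping: the point stays put; only the shading normal is perturbed,
    # so the surface merely looks bumpy under this light.
    bumped_normal = (0.6, 0.8, 0.0)
    print(lambert(true_normal, light_dir), lambert(bumped_normal, light_dir))

    # Displacement: the point itself moves, so silhouettes and shadows change
    # too, from every viewing angle.
    displaced_position = (0.0, 0.2, 0.0)
    print(position, "->", displaced_position)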
Displacement is generated at render time by mathematical calculations performed by shaders. The geometry generated by displacement is not saved in project files or exported with terrain or object files.
Using Displacement
As mentioned above, displacement can be used to create very large and very small features. Any large features should be created as part of the terrain rather than as a surface layer; you can create them as a surface layer, but we don't recommend it. For best results you should connect any shaders that generate large displacements into the terrain part of the network, above the Compute Terrain node.
Most of the shaders which can generate displacement have a common set of parameters for controlling it. Here's a rundown of how those parameters work:
Displacement direction
This popup list allows you to choose the direction that displacement is applied in. Any options in the popup list marked "(requires computed normal)" need a Compute Terrain or Compute Normal node connected somewhere above this node in the network to work properly. The popup has the following options (a simplified sketch comparing them follows the list):
- Along vertical: Displacement will happen along the normal of the underlying object (i.e. the planet or a model) as it was before any displacement was applied. Usually on a planet this means that positive displacement will occur upwards and negative displacement will occur downwards.
- Along normal: Displacement will happen along the current surface normal. This may be the same as "Along vertical" if the normal has not been modified by any other shaders before this one, but there are many situations where other shaders have recalculated the current surface normal so that it changes according to the shape of the surface. One way this can happen is if you have a Compute Terrain or Compute Normal shader somewhere upstream in the shader network.
- Vertical only (requires computed normal): Displacement only happens along the normal of the underlying object (i.e. the planet or a model) as it was before any displacement was applied. Usually on a planet this means that positive displacement will occur upwards and negative displacement will occur downwards. It differs from "Along vertical" in that the displacement is scaled by how similar the surface normal is to the object normal. On a planet this is how much "upwards" the surface normal is (how horizontal the surface is), so horizontal surfaces will be most affected by displacement and side-facing surfaces will be least affected. On other objects "upwards" means the normal of the surface of the underlying object as it was before any displacement.
- Lateral only (requires computed normal): Displacement only occurs in the lateral plane, or in other words perpendicular to the normal of the underlying object. Usually on a planet this means it will only displace sideways, not up or down. The displacement is scaled by the difference between the object normal and the surface normal. On a planet this is how much "sideways" the surface normal is, so side-facing surfaces will be most affected by displacement and horizontal surfaces will be least affected. On other objects "sideways" means any tangent to the surface of the underlying object as it was before any displacement.
- Lateral normalized (requires computed normal): This is the same as Lateral only but the normal is normalised (scaled so it has a length of 1).
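The following Python sketch compares the options above, assuming unit-length normals. The function names and formulas are illustrative interpretations of the descriptions in this list, not Terragen's actual implementation.

    # Illustrative comparison of the "Displacement direction" options.
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def scale(v, s):
        return tuple(x * s for x in v)

    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def displacement_vector(mode, amount, object_normal, surface_normal):
        # Lateral part of the surface normal: what remains after removing its
        # component along the object ("vertical") normal.
        lateral = sub(surface_normal,
                      scale(object_normal, dot(surface_normal, object_normal)))
        if mode == "along_vertical":
            return scale(object_normal, amount)
        if mode == "along_normal":
            return scale(surface_normal, amount)
        if mode == "vertical_only":
            # Scaled by how similar the surface normal is to the object normal.
            return scale(object_normal, amount * dot(surface_normal, object_normal))
        if mode == "lateral_only":
            # Direction and scale both come from the lateral component.
            return scale(lateral, amount)
        if mode == "lateral_normalized":
            # Same direction, but renormalised to unit length first.
            length = dot(lateral, lateral) ** 0.5
            return scale(lateral, amount / length) if length > 0 else (0.0, 0.0, 0.0)
        raise ValueError(mode)

    # A 45-degree slope on a planet whose local "up" is +Y:
    up = (0.0, 1.0, 0.0)
    slope_normal = (0.707, 0.707, 0.0)
    for mode in ("along_vertical", "along_normal", "vertical_only",
                 "lateral_only", "lateral_normalized"):
        print(mode, displacement_vector(mode, 1.0, up, slope_normal))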
Displacement multiplier
This multiplies the displacement values coming from the Displacement function input. A value of 1 leaves the incoming values unchanged. A value of 2 would make the incoming values twice as large. A value of 0.5 would make them half as large. Negative values will invert the displacement.
Displacement function
This parameter is where you connect the node(s) used to generate displacement for the layer. It expects scalar input. This means some nodes which create displacement themselves may not give the results you expect, for example no displacement at all. This is because those nodes displace scene geometry directly rather than outputting values which can be used to generate displacement in another node. You can connect nodes which create colour, however; the colour will be automatically converted to a scalar.
An example of this situation is using the Simple Shape Shader to generate displacement for another node. If you just turn on displacement for the shape shader you won't get any displacement in the node it's connected to. However, if you turn on colour for the shape shader you will see displacement.
From v2.4 on you can use the Displacement Shader to Vector node to convert the output of a displacement shader to a vector which can be connected to the Displacement function input. The vector gets converted to a scalar.
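For illustration only, a conversion of this kind might look like the sketch below. The averaging used here is an assumption made for the example; Terragen's actual colour-to-scalar and vector-to-scalar conversion formulas may differ.

    # Assumed (not documented here): converting colour or vector input into the
    # scalar expected by the Displacement function input, using a simple average.
    def colour_to_scalar(rgb):
        r, g, b = rgb
        return (r + g + b) / 3.0

    def vector_to_scalar(xyz):
        x, y, z = xyz
        return (x + y + z) / 3.0

    print(colour_to_scalar((0.2, 0.5, 0.8)))  # -> 0.5
    print(vector_to_scalar((0.0, 3.0, 0.0)))  # -> 1.0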
Displacement offset
This value is added to incoming displacement values after they are multiplied by the Displacement multiplier parameter. This creates the effect of offsetting the displacement by a set amount along the Displacement direction. Positive values push the displacement out so it looks almost as if it was sitting on a plinth. Negative values will sink the displacement back into the surface. It doesn't reverse the displacement, it's more like creating a hole in the surface and then applying the displacement to the bottom of the hole.
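Putting the multiplier and offset together, the value applied along the displacement direction works out as in this small sketch (illustrative Python based on the parameter descriptions above, not Terragen code):

    # The incoming value is multiplied first, then the offset is added.
    def final_displacement(input_value, multiplier=1.0, offset=0.0):
        return input_value * multiplier + offset

    print(final_displacement(2.0))                   # 2.0  (unchanged)
    print(final_displacement(2.0, multiplier=0.5))   # 1.0  (halved)
    print(final_displacement(2.0, multiplier=-1.0))  # -2.0 (inverted)
    print(final_displacement(2.0, offset=5.0))       # 7.0  (pushed out, as if on a plinth)
    print(final_displacement(2.0, offset=-5.0))      # -3.0 (sunk into the surface)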
You might find surfaces with rough or spiky displacement occasionally show problems, such as being cut off at bucket edges or causing gaps in ray traced shadows. Some nodes, such as the Planet, have Displacement tolerance parameters which can help to improve this. Changing this parameter can greatly increase render times, so you should only change it if you have a specific need. The default value is 1. If you're having problems, try increasing it to 2. If that improves things but doesn't completely resolve them, try increasing it in small increments. You would not generally want to go above a value of 4 or 5, however.
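As a rough mental model only (this is an assumption about why the parameter helps, not a description of Terragen's internals), the tolerance can be thought of as padding the bounds the renderer expects displaced geometry to reach, so spiky displacement is less likely to be clipped at bucket edges. Larger padding means more work, which is why raising the value increases render times.

    # Hypothetical illustration of a displacement tolerance: pad the height
    # range assumed for a patch by its estimated displacement, scaled by the
    # tolerance. Bigger tolerance = bigger bounds = fewer clipped spikes, but
    # more work per bucket.
    def padded_bounds(min_height, max_height, estimated_displacement, tolerance=1.0):
        pad = estimated_displacement * tolerance
        return (min_height - pad, max_height + pad)

    print(padded_bounds(0.0, 100.0, 25.0, tolerance=1.0))  # (-25.0, 125.0)
    print(padded_bounds(0.0, 100.0, 25.0, tolerance=2.0))  # (-50.0, 150.0)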
Glossary
- Displace: Literally, to change the position of something. In graphics terminology to displace a surface is to modify its geometric (3D) structure using reference data of some kind. For example, a grayscale image might be taken as input, with black areas indicating no displacement of the surface, and white indicating maximum displacement. In Terragen 2 displacement is used to create all terrain by taking heightfield or procedural data as input and using it to displace the normally flat sphere of the planet.
- Node: A single object or device in the node network which generates or modifies data and may accept input data or create output data or both, depending on its function. Nodes usually have their own settings which control the data they create or how they modify data passing through them. Nodes are connected together in a network to perform work in a network-based user interface. In Terragen 2 nodes are connected together to describe a scene.
- Shader: A shader is a program or set of instructions used in 3D computer graphics to determine the final surface properties of an object or image. This can include arbitrarily complex descriptions of light absorption and diffusion, texture mapping, reflection and refraction, shadowing, surface displacement and post-processing effects. In Terragen 2 shaders are used to construct and modify almost every element of a scene.
- Parameter: A parameter is an individual setting in a node parameter view which controls some aspect of the node.
- Scalar: A scalar is a single number. 1, 200.45, -45, -0.2 are all examples of scalar values.
- Vector: A vector is a set of three scalars, normally representing X, Y and Z coordinates. It also commonly represents rotation, where the values are pitch, heading and bank.
- Bucket: When Terragen renders, it divides the image up into buckets or tiles. Each bucket is rendered separately, allowing multiple buckets to be rendered at once. It also allows memory to be used more efficiently.