Quote from: keyframe on January 18, 2011, 03:41:01 PM
Hello Everyone,
My apologies for the somewhat vague posting - I'm trying really hard to wrap my head around some of the concepts of the TG2 demo - and it seems that I'm failing. Perhaps someone could shed some light on this for me. I am familiar with PRMan shader writing, and with manipulating noise functions to create 'stuff'.
Hi keyframe,
Hope you're still checking in.
As you're coming from a shader-writing background, I know I can skip past the basics and explain things a bit differently.

Quote
-What data type is returned by the displacement functions? (Alpine Fractal, for example)
-What data type is returned by the 'color functions' (Power Fractal for instance)?
The shader nodes that are coloured red in the network view manipulate a wide band of data (the render/shading state), which includes things like P, Pg, Pt, N, Ng, Cd, etc. They are in some ways analogous to displacement shaders or surface shaders in PRMan, except that both displacement and surface shading are encapsulated in the same node. Some shaders, e.g. the Alpine Fractal, perform only displacement, while others such as the Lambert Shader perform only surface shading. Many do both, including the Power Fractal Shader and the Image Map Shader. (Displacement and shading are performed at different stages of the render pipeline, however.)
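To make that concrete, here is a toy Python sketch of the idea (my own illustration with hypothetical names, not Terragen's actual internals): a shading state carrying P, N, Cd, and a combo shader that displaces in one stage and colours in the other.

```python
from dataclasses import dataclass

# Toy model of the render/shading state that the red shader nodes
# manipulate. Field names follow the conventions above (P, N, Cd);
# this is an illustration, not Terragen's real API.
@dataclass
class ShadingState:
    P: tuple = (0.0, 0.0, 0.0)    # surface position (modified by displacement)
    N: tuple = (0.0, 1.0, 0.0)    # shading normal
    Cd: tuple = (1.0, 1.0, 1.0)   # diffuse colour

def combo_shader(state: ShadingState, displacing: bool) -> ShadingState:
    """Like a Power Fractal: displaces during the displacement stage,
    sets diffuse colour during the shading stage."""
    if displacing:
        offset = 0.5  # stand-in for a fractal noise value at state.P
        state.P = (state.P[0], state.P[1] + offset, state.P[2])
    else:
        state.Cd = (0.4, 0.5, 0.3)
    return state
```

A displacement-only shader such as the Alpine Fractal would use only the first branch; a surface-only shader such as the Lambert Shader would use only the second.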
Low-level functions are handled by the "function" nodes, which are coloured dark blue in the network view. They don't become displacement or colour or anything else until they are fed into a shader, as only the shaders modify the shading state.
Since a lot of the interesting functionality of Terragen is wrapped up in the shaders without there being equivalent functions, shaders can also pass as functions, but only if they generate and store diffuse colour (Cd). Most shaders which are not purely displacement shaders do this. For example, the Power Fractal and the Image Map Shader can be used as functions; only their colour settings are used, not their displacements. In various situations where one shader can call another shader or a function, this has led to some inconsistency in terminology, with parameter names using "shader" and "function" interchangeably.
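The shader-to-function cast can be sketched like this (a minimal toy in Python; all the names are hypothetical stand-ins, and Terragen evaluates this per shading point):

```python
# Sketch of the shader-to-function cast. A function node is just
# values in, values out; a shader is usable as a function because its
# diffuse colour (Cd) can be read back as the value.

def power_fractal_colour(p):
    # Stand-in for a Power Fractal's colour output at point p;
    # the real node would evaluate fractal noise here.
    return (0.4, 0.5, 0.3)

def shader_as_function(colour_shader, p):
    # The cast: run only the colour half of the shader and return Cd.
    # The shader's displacement, if any, plays no part in the value.
    return colour_shader(p)

def multiply(a, b):
    # A Multiply-style function node working on colour triples.
    return tuple(x * y for x, y in zip(a, b))

value = multiply(shader_as_function(power_fractal_colour, (0.0, 0.0, 0.0)),
                 (2.0, 2.0, 2.0))
```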
Quote
-What data type is expected as input to the displacement functions (They seem to be able to operate both with, and without input -- does having an input imply Base_fractal+New_fractal?)
When shaders are chained together through the main input labeled "input", roughly speaking they are executed in series. Most displacement shaders apply their displacement additively, but this is entirely up to the shader. Compute Terrain and Compute Normal are a bit different: they call the input displacement three times in order to calculate the normal.
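The three-call trick can be sketched as a finite-difference scheme (my own toy version; Terragen's exact sampling may differ): displace the point itself plus two nearby points, and take the cross product of the resulting tangents.

```python
import math

# Toy version of calling the input displacement three times to get a
# normal. This assumes a finite-difference scheme; the real nodes may
# sample differently.
def displace(p):
    x, y, z = p
    return (x, math.sin(x) * 0.5, z)  # stand-in for the input displacement

def compute_normal(p, eps=1e-4):
    x, y, z = p
    p0 = displace((x, y, z))
    px = displace((x + eps, y, z))
    pz = displace((x, y, z + eps))
    tx = tuple(a - b for a, b in zip(px, p0))  # tangent along +x
    tz = tuple(a - b for a, b in zip(pz, p0))  # tangent along +z
    # normal = tz x tx (right-handed, so y points up for a heightfield)
    n = (tz[1] * tx[2] - tz[2] * tx[1],
         tz[2] * tx[0] - tz[0] * tx[2],
         tz[0] * tx[1] - tz[1] * tx[0])
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)
```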
Quote
-Can that data type (Displacement? Vector? Scalar? foo?) be manipulated by using the 'Functions' (for instance, can I multiply the result of Alpine Fractal, by 2), and still have it be a 'valid displacement'? (my tests seem to indicate that it cannot - the minute that I append any multiply node after Alpine Fractal, my terrain disappears).
Yes, but not with the Multiply function. When you did that, the Multiply function cast your shader to a value, and the rule for this cast is to take the current diffuse colour, i.e. whatever diffuse colour the upstream nodes had set. The Alpine Fractal's displacement never entered into the result, which is why your terrain disappeared.
Taking displacement shaders and manipulating them with functions like this is a current weakness of Terragen. You can do it, but it's not intuitive, because the design is biased towards chaining displacements in series. The basic idea is given here:
http://forums.planetside.co.uk/index.php?topic=7548.msg80786#msg80786
Then again, if you build your displacement from scratch using functions, you can simply use them to displace the surface using the Displacement Shader. It takes a scalar function and displaces along the normal.
If you want to displace by a vector, that is possible using the Redirect Shader and three Displacement Shaders. Redirect Shader takes 3 other shaders and "redirects" them along each axis by giving them a modified normal to work with. Each of those shaders could be a Displacement Shader that takes a scalar.
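In code form, the Redirect Shader arrangement looks roughly like this (a toy Python sketch with made-up names, not Terragen's internals): three scalar Displacement Shaders, each handed a normal pointing along one axis.

```python
# Sketch of vector displacement built from three scalar displacements,
# as with a Redirect Shader feeding three Displacement Shaders.
def displacement_shader(scalar_fn, normal):
    # A Displacement Shader displaces along its normal by a scalar amount.
    def apply(p):
        s = scalar_fn(p)
        return tuple(c + s * n for c, n in zip(p, normal))
    return apply

def redirect(p, fn_x, fn_y, fn_z):
    # Each child shader runs with its normal "redirected" along one axis.
    p = displacement_shader(fn_x, (1.0, 0.0, 0.0))(p)
    p = displacement_shader(fn_y, (0.0, 1.0, 0.0))(p)
    p = displacement_shader(fn_z, (0.0, 0.0, 1.0))(p)
    return p

# Displace the origin by the vector (1, 2, 3) using three scalar functions.
p = redirect((0.0, 0.0, 0.0), lambda p: 1.0, lambda p: 2.0, lambda p: 3.0)
```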
Quote
-What is the relationship between the 'combo nodes' color and displacement parameters? (PF, for instance, has both color AND displacement controls -- which suggests to me that they are handled separately -- gut instinct: displacement scalar, color vector)
Displacement can happen in any direction. A shader takes a point P and modifies it however it likes. The shader returns the modified shading state.
All displacement occurs before any shading is done: the shader tree is called twice, once for the displacement pass and then again for shading. Because of this, the state of the surface being shaded might be different when performing shading than when performing displacement. For shaders which aim to link displacement with shading this can be a problem, but it is often solved by choosing a suitable point in the displacement pipeline at which to calculate texture coordinates which will later be used for shading. The "Tex Coords From XYZ" shader does this. The Compute Terrain shader does this too, and also calculates the normal of the incoming displacement.
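Here is a toy sketch of the two-pass idea and the texture-coordinate fix (the structure and names are my own illustration, not Terragen's internals): the same chain runs twice, and a Tex-Coords-style node freezes coordinates partway through the displacement pipeline so later displacement doesn't shift the shading pattern.

```python
class State:
    def __init__(self, p):
        self.P = p            # surface position
        self.tex = None       # texture coords, frozen mid-pipeline
        self.Cd = (1.0, 1.0, 1.0)

def hill(state, displacing):
    # Large-scale displacement.
    if displacing:
        state.P = (state.P[0], state.P[1] + 5.0, state.P[2])

def tex_from_xyz(state, displacing):
    # Like "Tex Coords From XYZ": capture coords at this pipeline point.
    if displacing:
        state.tex = state.P

def bumps(state, displacing):
    if displacing:
        # Small displacement applied after the coords were captured.
        state.P = (state.P[0], state.P[1] + 0.2, state.P[2])
    else:
        # Shading reads the frozen coords, so the pattern ignores the
        # bump displacement that happened after the capture.
        state.Cd = state.tex

chain = [hill, tex_from_xyz, bumps]
state = State((1.0, 0.0, 2.0))
for shader in chain:              # first pass: displacement only
    shader(state, displacing=True)
for shader in chain:              # second pass: shading only
    shader(state, displacing=False)
```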
A bit more info here:
http://forums.planetside.co.uk/index.php?topic=9745.msg101996#msg101996
I hope this helps. Please continue to ask questions.
Matt