Functions, Colors And Displacement, a.k.a. confusion, pain, agony, and despair

Started by keyframe, January 18, 2011, 03:41:01 PM


Volker Harun

Quote from: keyframe on January 19, 2011, 02:50:29 PM
At some point both Martin and Volker may have mentioned the 'compute terrain' node, which allows one access to the Y-term (among other useful things like the normal, the slope, and, I'm assuming, the surface derivatives dPds/dPdt). I've attempted to add it between 'AF' and 'merge shader', but it didn't seem to affect the result returned by 'get position'.

Well, the Compute terrain node would be nice to have below the Merge shader.
Between the Compute terrain and the Planet come the surface shaders, and maybe additional functions with a separate 'Get position'.

By the way ... this is an unusual and interesting way to make a crater :)

keyframe

Quote from: Volker Harun on January 19, 2011, 03:18:59 PM
Well, the Compute terrain node would be nice to have below the Merge shader.

By the way ... this is an unusual and interesting way to make a crater :)

Heya Volker,

Is there a way to make the y-term (altitude?) that results from the AF available BEFORE the merge shader?

G

ps: there is another issue puzzling me a little bit. If you roll the mouse over the inside of the crater in the 3D viewport, it seems that the height inside the crater is roughly -60 m. I would expect this height to be 0. Any idea where the height difference is coming from?

Volker Harun

I doubt that there is a convenient way to do it ... maybe there is no way at all.
One thing to try is:
Put a Compute terrain below the AF and make a second connection from the AF's output to this Compute terrain node. After this you can use Get position.
Maybe this works, or maybe a simple Function node like 'Convert -> Length to scalar' does it, but I cannot give it a try myself right now.

The height of the crater: the default setting of the Merge shader gives you an average of the two inputs. The AF has its own height range, while the Smooth step scalar gives you a colour range from 0 to 1. So the -60 m is the average of the AF's output and the Smooth step scalar. For example, if the AF puts the crater floor at roughly -120 m and the Smooth step contributes roughly 0, the mix averages to about -60 m.

Tangled-Universe

As far as I know you would need a 'get position in texture' to get the y-term as well.
Beware that this other type of 'get' function behaves differently, which is logical, as it provides different data.
There are of course also 'get altitude (in texture)' functions, which output only the y-term.

FYI:
Compute terrain computes the normals and the texture coordinates ==> allows for slope and altitude restriction.
Compute normal computes the normals but NOT the texture coordinates ==> allows for slope but NOT altitude restriction.
Tex coords from XYZ computes only the texture coordinates, NOT the normals ==> allows for altitude but NOT slope restriction.
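
To sketch why each node enables which restriction (illustrative Python, not Terragen's internals; the state fields and thresholds here are made up):

import math

def slope_mask(state, max_slope_deg=30.0):
    # Needs state.normal, so a Compute terrain or Compute normal
    # must have run upstream.
    cos_limit = math.cos(math.radians(max_slope_deg))
    return 1.0 if state.normal[1] >= cos_limit else 0.0

def altitude_mask(state, min_alt=0.0, max_alt=500.0):
    # Needs state.tex_position, so a Compute terrain or
    # Tex coords from XYZ must have run upstream.
    y = state.tex_position[1]
    return 1.0 if min_alt <= y <= max_alt else 0.0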

Quote from: keyframe on January 19, 2011, 02:50:29 PM
At some point both Martin and Volker may have mentioned the 'compute terrain' node, which allows one access to the Y-term (among other useful things like the normal, the slope, and, I'm assuming, the surface derivatives dPds/dPdt). I've attempted to add it between 'AF' and 'merge shader', but it didn't seem to affect the result returned by 'get position'.

I still have no idea, though, how to make use of these blue function nodes together with the normal red ones.

Quote from: keyframe on January 19, 2011, 02:50:29 PM
- one can separate the X and Z axis terms from 'get position' by using the 'x to scalar' and 'z to scalar' functions. (Note that I've also attempted to use 'Red to scalar' and got the same result as with 'x to scalar', which leads me to believe that vector data (color, position, normal, etc.) is in fact one and the same, despite there being specific functions to deal with each.)

I think this is intentional, because it allows vector data to be split into colour, position, etc., so that you can use it with non-vector data and then rebuild it into a completely new function.
It adds flexibility and possibilities.

Quote from: keyframe on January 19, 2011, 02:50:29 PM
[edit: just for completeness' sake, in most systems the reason a distinction exists between different types of vectors (color-vector, point-vector, normal-vector, etc.) is that certain operations 'transform' certain types of vectors (imagine a rotation of the terrain by 45 degrees). You would want point and normal vectors to be transformed along with the operations, while 'other vectors' (i.e. color) are left alone. I have no idea whether this is the case with TG or not. Just speculating, and trying to fill in some blanks. ;)]

You might find these interesting:
http://forums.planetside.co.uk/index.php?topic=2220.msg23499#msg23499
http://forums.planetside.co.uk/index.php?topic=4047.msg42267#msg42267
http://forums.planetside.co.uk/index.php?topic=3227.msg33372#msg33372
http://forums.planetside.co.uk/index.php?topic=5664.msg60374#msg60374

Matt

Quote from: keyframe on January 18, 2011, 03:41:01 PM
Hello Everyone,

My apologies for the somewhat vague posting - I'm trying really hard to wrap my head around some of the concepts of the TG2 demo - and it seems that I'm failing. Perhaps someone could shed some light on this for me. I am familiar with PRMan shader writing and with manipulating noise functions to create 'stuff'.

Hi keyframe,

Hope you're still checking in.

As you're coming from a shader-writing background I know I can skip past the basics and explain things a bit differently :)

Quote
-What data type is returned by the displacement functions? (Alpine Fractal, for example)
-What data type is returned by the 'color functions' (Power Fractal for instance)?

The shader nodes that are coloured red in the network view manipulate a wide band of data (the render/shading state), which includes things like P, Pg, Pt, N, Ng, Cd, etc. They are in some ways analogous to displacement shaders or surface shaders in PRMan, except that both displacement and surface shading are encapsulated in the same node. Some shaders, e.g. the Alpine Fractal, perform only displacement, while others such as the Lambert Shader perform only surface shading. Many do both, including the Power Fractal Shader and the Image Map Shader. (Displacement and shading are performed at different stages of the render pipeline, however.)

Low level functions are handled using the "function" nodes which are coloured dark blue in the network view. They don't become displacement or colour or anything else until they are fed into a shader, as only the shaders modify the shading state.

Since a lot of the interesting functionality of Terragen is wrapped up in the shaders without there being equivalent functions, shaders can also pass as functions, but only if they generate and store diffuse colour (Cd). Most shaders which are not solely displacement shaders will do this. For example the Power Fractal and the Image Map Shader can be used as functions, with only their colour settings and not their displacements being used. In various situations where one shader can call another shader or function, this has led to some inconsistency in terminology, with parameter names using "shader" and "function" interchangeably.
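
For shader writers, a minimal sketch of that model (hypothetical Python, not Terragen's internals; 'fractal' and the state fields are stand-ins):

import math
from dataclasses import dataclass

def fractal(p):                      # stand-in for a real noise function
    return math.sin(p[0]) * math.cos(p[2])

@dataclass
class ShadingState:                  # the "wide band of data" shaders manipulate
    P: tuple = (0.0, 0.0, 0.0)       # position
    N: tuple = (0.0, 1.0, 0.0)       # normal
    Cd: tuple = (1.0, 1.0, 1.0)      # diffuse colour

def alpine_fractal(state):           # displacement-only shader: modifies P
    x, y, z = state.P
    state.P = (x, y + fractal(state.P), z)
    return state

def lambert_shader(state):           # surface-only shader: modifies Cd
    n_dot_l = max(0.0, state.N[1])   # light straight down, for simplicity
    state.Cd = tuple(c * n_dot_l for c in state.Cd)
    return state

def shader_as_value(shader, state):
    # "Casting" a shader to a value: run it, then read the diffuse colour
    # it stored. This is also why a Multiply after a displacement-only
    # shader sees colour, not displacement (see below).
    return shader(state).Cd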

Quote
-What data type is expected as input to the displacement functions? (They seem to be able to operate both with and without input -- does having an input imply Base_fractal+New_fractal?)

When shaders are chained together through the main input labeled "input", roughly speaking they are executed in series. Most displacement shaders apply displacement additively, but this is entirely up to the shader. Compute Terrain and Compute Normal are a bit different in that they call the input displacement 3 times in order to calculate the normal.
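
A sketch of those 3 calls (illustrative Python; the real sample offsets and implementation details are not documented here):

def compute_normal(displace, p, eps=0.1):
    # Displace the point and two nearby points -- three calls to the
    # input displacement -- then cross the resulting edge vectors.
    p0 = displace(p)
    px = displace((p[0] + eps, p[1], p[2]))
    pz = displace((p[0], p[1], p[2] + eps))
    ex = tuple(a - b for a, b in zip(px, p0))
    ez = tuple(a - b for a, b in zip(pz, p0))
    n = (ez[1] * ex[2] - ez[2] * ex[1],   # cross(ez, ex), so the normal
         ez[2] * ex[0] - ez[0] * ex[2],   # points up (+Y) on a flat
         ez[0] * ex[1] - ez[1] * ex[0])   # surface
    length = sum(c * c for c in n) ** 0.5
    return tuple(c / length for c in n)

# On an undisplaced surface this returns (0.0, 1.0, 0.0):
# compute_normal(lambda p: p, (0.0, 0.0, 0.0))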

Quote
-Can that data type (Displacement? Vector? Scalar? foo?) be manipulated by using the 'Functions' (for instance, can I multiply the result of Alpine Fractal by 2) and still have it be a 'valid displacement'? (My tests seem to indicate that it cannot - the minute I append any Multiply node after Alpine Fractal, my terrain disappears.)

Yes, but not using the Multiply function. When you did that, the Multiply function would have cast your shader to a value, and the rule for this is to take the current diffuse colour. That would be whatever diffuse colour the upstream nodes had set.

Taking displacement shaders and manipulating them with functions like this is a current weakness of Terragen. You can do it, but it's not intuitive, because the design is biased towards chaining displacements in series. The basic idea is given here: http://forums.planetside.co.uk/index.php?topic=7548.msg80786#msg80786

Then again, if you build your displacement from scratch using functions, you can simply use them to displace the surface using the Displacement Shader. It takes a scalar function and displaces along the normal.

If you want to displace by a vector, that is possible using the Redirect Shader and three Displacement Shaders. Redirect Shader takes 3 other shaders and "redirects" them along each axis by giving them a modified normal to work with. Each of those shaders could be a Displacement Shader that takes a scalar.
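
In sketch form (hypothetical Python, reusing the ShadingState idea from above; the function names are made up):

def displacement_shader(state, scalar_fn):
    # Displacement Shader: move P along N by a scalar amount.
    s = scalar_fn(state.P)
    state.P = tuple(p + n * s for p, n in zip(state.P, state.N))
    return state

def redirect_shader(state, fn_x, fn_y, fn_z):
    # Redirect Shader: run three Displacement Shaders, each given a
    # modified normal along one axis, building a vector displacement
    # out of three scalar functions.
    axes = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
    for axis, fn in zip(axes, (fn_x, fn_y, fn_z)):
        saved_n = state.N
        state.N = axis
        state = displacement_shader(state, fn)
        state.N = saved_n
    return state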

Quote
-What is the relationship between the 'combo nodes' color and displacement parameters?  (PF, for instance, has both color AND displacement controls -- which suggests to me that they are handled separately -- gut instinct: displacement scalar, color vector)

Displacement can happen in any direction. A shader takes a point P and modifies it however it likes. The shader returns the modified shading state.

All displacement occurs before any shading is done. The shader tree is called twice. Because of this, the state of the surface being shaded might be different when performing shading than when performing displacement. For shaders which aim to link displacement with shading this can be a problem, but is often solved by choosing a suitable point in the displacement pipeline to calculate texture coordinates which will be used for shading. The 'Tex Coords From XYZ' shader does this. The Compute Terrain shader does this too, and also calculates the normal of the incoming displacement.
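
A sketch of that two-pass flow (illustrative Python; the real pipeline is more involved, and these node objects are hypothetical):

from dataclasses import dataclass
from typing import Callable

@dataclass
class PipelineNode:
    displace: Callable               # state -> state, pass 1
    shade: Callable                  # state -> state, pass 2
    captures_tex_coords: bool = False

def render_point(chain, state):
    # Pass 1: displacement only. A node like Tex Coords From XYZ or
    # Compute Terrain snapshots the position partway through the chain.
    for node in chain:
        state = node.displace(state)
        if node.captures_tex_coords:
            state.tex_position = state.P     # frozen for the shading pass
    # Pass 2: shading. Texturing reads the frozen tex_position rather
    # than the final, fully displaced P.
    for node in chain:
        state = node.shade(state)
    return state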

A bit more info here:

http://forums.planetside.co.uk/index.php?topic=9745.msg101996#msg101996

I hope this helps. Please continue to ask questions.

Matt
Just because milk is white doesn't mean that clouds are made of milk.

Matt

Quote from: keyframe on January 18, 2011, 08:37:27 PM
I think part of my confusion is that in 'our' world (Houdini), data is a little bit more fluid.

A function returns 'values' - how you map them is completely up to the user. You can use the result of Perlin noise (whether 1D, 2D, or 3D) and map the results to displacement or color, or use them as a basis for yet another function. I suppose I was looking for an analogue in the TG world, but perhaps that's a bad approach on my part.

The same is possible using the function nodes rather than the shaders. They can then be mapped to displacement, diffuse colour, etc. using some of the shaders. For example, the "Default Shader" has connections for diffuse colour, luminosity, two specularity attributes, displacement, and opacity (the latter being limited to either on or off right now). The "Surface Layer" has some, but not all, of these attributes, and adds features like "blend shader" (i.e. a mask) and some built-in slope-based and altitude-based masking.
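
As a sketch of that mapping (hypothetical Python; the parameter names are illustrative, not the real connection names):

def default_shader(state, diffuse_fn=None, displacement_fn=None):
    # A blue function node only becomes displacement or colour once it
    # is plugged into a shader input like one of these.
    if displacement_fn is not None:
        s = displacement_fn(state.P)
        state.P = tuple(p + n * s for p, n in zip(state.P, state.N))
    if diffuse_fn is not None:
        state.Cd = diffuse_fn(state.P)
    return state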
Just because milk is white doesn't mean that clouds are made of milk.

Matt

Quote from: Tangled-Universe on January 19, 2011, 11:45:33 AM
Well, to be honest, I never really understood the fundamental difference between a vector and a scalar. Maybe you're wondering how the hell I know the other stuff then - well, please don't ask :) lol
Anyway, you've made that clear to me now, thanks :)

TU, a vector contains multiple values. In TG2 vectors are always 3-dimensional. The components of the 3D vector are x, y and z. As you know, we use this to represent positions or directions in 3D space.

A scalar is just a single value.
Just because milk is white doesn't mean that clouds are made of milk.

Matt

Quote from: keyframe on January 19, 2011, 02:50:29 PM
I think I might be getting somewhere ;)

I've attached a simple file showing a crater carved into an alpine terrain.

From what i'm able to gather at the moment:

- 'get position' returns X and Z coordinates by default

It's also returning Y, but if you are starting with a nearly flat surface then the Y component will be very close to 0.

EDIT: Oh, in your case 'get position' Y would be changed by the Alpine Fractal, but I think the Merge Shader operates with two independent render states so that the results of the first branch don't pollute the second branch. If you want to modify the Alpine Fractal by a function that is aware of the result of the fractal, you might have to use the trick I posted in my first reply.

Quote
- one can separate the X and Z axis terms from 'get position' by using the 'x to scalar' and 'z to scalar' functions. (Note that I've also attempted to use 'Red to scalar' and got the same result as with 'x to scalar', which leads me to believe that vector data (color, position, normal, etc.) is in fact one and the same, despite there being specific functions to deal with each.)

Data can be cast between colours and vectors silently. However, the distinction exists so that the right thing is done when casting either of these to a scalar. For example, casting a colour to a scalar is done by taking the luminance of the colour using:


luminance = 0.2125f * r + 0.7152f * g + 0.0724f * b


A vector would be cast to a scalar by taking its magnitude (length).

The type of data output by a function is indicated by the last word in the function name, although I think there might be some nodes that omit that.
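
As a sketch of those casting rules (illustrative Python; the colour weights are the ones quoted above):

def to_scalar(v, kind):
    x, y, z = v
    if kind == "colour":
        # colour -> scalar: take the luminance
        return 0.2125 * x + 0.7152 * y + 0.0724 * z
    # spatial vector -> scalar: take the magnitude (length)
    return (x * x + y * y + z * z) ** 0.5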

Quote
- the Y term is 'undefined' at this stage (you can confirm this by plugging the output of 'y to scalar' into the first input of 'smooth step').

It's not undefined; it's just very nearly 0, because of the shape of the surface you are shading.

Quote
At some point both Martin and Volker may have mentioned the 'compute terrain' node, which allows one access to the Y-term (among other useful things like the normal, the slope, and, I'm assuming, the surface derivatives dPds/dPdt). I've attempted to add it between 'AF' and 'merge shader', but it didn't seem to affect the result returned by 'get position'.

The updated position generated by Compute Terrain is the texture position. This is retrieved using 'get position in texture'. Before Compute Terrain or Tex Coords From XYZ, 'get position in texture' will return the same as 'get position'.

As a general rule, you usually want to use the texture coordinates for any kind of texturing or shading. This also allows other shaders like the Transform Shader and Warp Shader to move your shader around.

EDIT: However, this still won't allow the second branch of the Merge Shader to see the results of the first branch.

Quote
[edit: just for completeness' sake, in most systems the reason a distinction exists between different types of vectors (color-vector, point-vector, normal-vector, etc.) is that certain operations 'transform' certain types of vectors (imagine a rotation of the terrain by 45 degrees). You would want point and normal vectors to be transformed along with the operations, while 'other vectors' (i.e. color) are left alone. I have no idea whether this is the case with TG or not. Just speculating, and trying to fill in some blanks. ;)]

Yeah, with arbitrary attributes on some geometry this is useful. We don't really have anything like that. In the context of these function networks we only ever operate on one vector at a time, and it's nice to have it quietly cast to the type expected by the function requesting the data. The distinction between colour and spatial vector is kept so that these casts do the appropriate thing. Perhaps this will become an issue in the future, I don't know.

Matt
Just because milk is white doesn't mean that clouds are made of milk.