Visualize Normals -> erratic render

Started by KlausK, August 08, 2019, 07:26:22 PM


KlausK

Hi everyone,
any idea what is going on here?
I tried changing the Gradient Patch Size in the Compute Terrain node and it does something, but... that was just a shot in the dark  :-[

Any hints / help / solvers appreciated very much.

Cheers, Klaus
/ ASUS WS Mainboard / Dual XEON E5-2640v3 / 64GB RAM / NVIDIA GeForce GTX 1070 TI / Win7 Ultimate . . . still (||-:-||)

WAS

#1
No real clue, but I imagine the shaders are augmenting texture/displacement space, and then you're merging those results, and somehow it's interpreting different patch sizes per bucket.

As a note, you should not be merging Compute Terrains like this -- not only because of weird results like this, but because you're essentially creating duplicate merged Compute Terrains, which will increase render time.

Matt

I'm not sure why it's messing up the buckets like that, but I suspect something strange is happening with the displacement in Merge Shader 02. If you just want to show the normals, I would suggest simply using the Visualise Normal shader like any other surface layer and putting it at the end of the main shader chain.
Just because milk is white doesn't mean that clouds are made of milk.

KlausK

#3
Thank you WASasquatch and Matt.

@WASasquatch: There is only one Compute Terrain. How could there suddenly be different Gradient Patch Sizes in the mix?

@Matt: It is the second Merge for sure which is somehow causing this.
I also was wondering why the 3D preview shows a "good" result and the render is messed up later.
 
Now I fed the nodes into a Surface Layer first, then merged the two network strands, and the messed-up buckets are gone.
Edit: Re-opening the scene and rendering again gave me the same bucket render errors... again. So that's not it.
What I get is not exactly the same result as before, but at least it's a "good" render.
+++   +++   +++   +++   +++   +++   +++   +++   +++   +++   +++   +++   +++   +++   +++   +++
I was trying to get a grayscale image from the Visualize Normals together with the Contours shader.
But I only got a flat colour trying all conversions I could think of.
Is that not possible?

Cheers, Klaus

Dune

What do you want to achieve? Furthermore, I don't think the Visualize Normals shader needs an input, and I think it outputs three components: red, green (up/down) and blue, i.e. x, y, z. So you could try converting from these colours to scalar and then multiplying by a contour... or something like that.

KlausK

@Dune: nothing special. Just trying to figure out what else the node could be used for.

Not feeding data into the input of the Visualize Normals results in a flat colour.
Feeding in the Compute Terrain gives a usable visual representation of the normals.
Looks a little bit like a Normal Map.

The idea behind connecting it to a Contour shader later was to somehow produce terraces from this, not just contour lines. It works ;)
I'll post a picture of the network later. Have to go.
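As an aside, the terracing effect described above can be sketched numerically. This is a hypothetical illustration of the idea (quantising a continuous scalar into flat steps), not Terragen's actual Contour shader:

```python
import math

def terrace(value, steps):
    """Snap a value in [0, 1] down to the nearest of `steps` discrete levels,
    turning a smooth gradient into flat terraces."""
    return math.floor(value * steps) / steps

# A smoothly varying scalar (e.g. greyscale from the normals) becomes stepped:
print(terrace(0.42, 5))  # 0.4
print(terrace(0.79, 5))  # 0.6
```

Feeding a stepped scalar like this into displacement is one way to get terraces rather than just contour lines.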

Cheers, Klaus

Matt

#6
You are merging the terrain into itself, indirectly through the Visualise Normal shader and through the Merge Shader. That may be causing unexpected results.

Visualise Normal doesn't need to be merged in, it can simply sit in the main network like a regular shader. If you need it to be mixed in a more complicated way, it can be a child of a surface layer, but in that case you really should not link the terrain as its input because that will merge the terrain into itself.

Visualise Normal colours the surface with a colour that represents the surface normal. The red/green/blue values derive from the X/Y/Z components of the surface normal, in other words they represent the direction the surface is facing in. R to Scalar, G to Scalar and B to Scalar can be used to extract the individual components as greyscale values.
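As an illustration of that mapping, here is a hypothetical sketch. It assumes the common convention of remapping each normal component from [-1, 1] to [0, 1]; it is not Terragen's actual implementation:

```python
def normal_to_colour(nx, ny, nz):
    """Map a unit surface normal (x, y, z) to an (r, g, b) colour,
    remapping each component from [-1, 1] to [0, 1]."""
    return ((nx + 1) / 2, (ny + 1) / 2, (nz + 1) / 2)

def colour_to_scalar(colour, channel):
    """Extract one channel as a greyscale value, like R/G/B to Scalar."""
    return colour["rgb".index(channel)]

# A surface facing straight up, normal (0, 1, 0):
colour = normal_to_colour(0.0, 1.0, 0.0)   # (0.5, 1.0, 0.5)
green = colour_to_scalar(colour, "g")      # 1.0 -- maximal "up-ness"
```

Under this convention the green channel alone acts like a slope mask: 1.0 for flat ground, lower values for steeper surfaces.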
Just because milk is white doesn't mean that clouds are made of milk.

KlausK

@Matt: thanks, I realize that I am merging the terrain into itself, but so far I have not managed to reproduce the resulting displacement in another setup.
I'll try to use it in the main network and try to separate the components as you described.
For now, here is the network as it is. Still terrain into terrain.

Cheers, Klaus

KlausK

Just a quick example of what you can do with the Visualize Normals node.
Cheers, Klaus

WAS

#9
Sorry Klaus, I meant that your terrain and the Compute Terrain are being augmented and merged together, creating multiple Compute Terrains in use: one is augmented by a merged PF, it seems, and the other by the visualize normal vector.

Matt

Have you tried it without the Visualise Normals shader? I don't think it is necessary for the displacement effect you're seeing. Any simple white surface would probably have the same effect. The colour information is lost when you convert it to displacement, so I don't see why it's useful here.
Just because milk is white doesn't mean that clouds are made of milk.