Vertical problem with cloud warping

Started by Denis Sirenko, March 05, 2018, 01:30:01 PM


Denis Sirenko

Hi, guys!

I have a problem with a simple setup: warping one fractal through another.

[Attachment 1]

I sometimes see overly strong vertical elements, but I never see anything similar in the horizontal direction.

[Attachment 2]

The camera looks sideways, so the Y axis points vertically. I tried switching various parameters of the Power Fractal Shader v3 on/off and adjusting their strength, but I cannot figure out where the problem is. Is the warper unable to warp correctly along all three axes? I spent all day trying to solve it without success. Maybe someone has already solved this?

j meyer

I don't know if it solves your problem, but I would try putting a Vector Displacement Shader
between the Power Fractal (warper) and the Warp Input Shader.
At least it gives you control over all three axes.

Matt

#2
Warp is based on the displacement generated by the warper. Displacement usually happens in only one direction, which by default is vertical.

To create displacement along all 3 axes you can use the Redirect Shader and plug 3 displacement shaders into its X, Y and Z inputs. Alternatively, as j meyer mentioned, you can use a Vector Displacement Shader instead. This will do a similar thing, but there's a very important difference: the Vector Displacement Shader takes colour or scalar inputs (which can be the colour output from a fractal) and converts them to displacement, whereas the Redirect Shader takes displacement inputs and simply changes their direction. A Power Fractal can output both displacement and colour, but be aware of the difference, because they can look different and have different settings in the fractal.

You probably want to give different seeds to each of the 3 fractals. If you don't, you'll just produce a displacement that goes along a diagonal vector, not much more interesting than what you have now.
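
For readers who think in code, here is a minimal Python sketch of the idea above (illustrative names only, not Terragen's internals): a warp evaluates the warped shader at a position shifted by the warper's displacement, so a warper whose displacement is vertical-only can only push features vertically.

    # Illustrative sketch only, not Terragen code.
    import math

    def warper_displacement(p):
        """Stand-in for a Power Fractal used as the warper, displacing along Y only
        (the default 'vertical' displacement direction)."""
        x, y, z = p
        amount = math.sin(0.31 * x) * math.cos(0.17 * z)   # some scalar noise
        return (0.0, amount, 0.0)                          # vertical displacement only

    def warp(p, warped_fn):
        """The warp shifts the sample position by the warper's displacement."""
        dx, dy, dz = warper_displacement(p)
        return warped_fn((p[0] + dx, p[1] + dy, p[2] + dz))

    def cloud_fractal(p):
        """Stand-in for the fractal being warped (e.g. a cloud density shader)."""
        x, y, z = p
        return max(0.0, math.sin(0.5 * x) * math.sin(0.5 * z))

    print(warp((3.0, 500.0, -1.0), cloud_fractal))

Because the warper's displacement vector has no X or Z component, the sample position only ever moves up or down, which is why the features stretch vertically and never horizontally.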

Matt
Just because milk is white doesn't mean that clouds are made of milk.

Denis Sirenko

J meyer, Matt, thanks!

Both ways helped; they really do give similar but slightly different results. So far I can't decide which one suits me better, but that is a more pleasant problem to have.

Denis Sirenko

I'm continuing to experiment with the Redirect Shader and the Vector Displacement Shader.

From the pictures attached below you can see that the resulting direction of the displacement is in any case taken from the Vector Displacement Shader, not from the Redirect Shader, which comes last in the node network. I expected to be able to connect to the X, Y or Z inputs of the Redirect Shader and change the direction of the displacement, but it did not work. Does anyone know why this is happening? Thank you in advance!

Dune

What do you want to achieve? This is not the usual way of using these shaders (not how I use them, anyway). As Matt wrote, vector displacement takes colour (either + or -) as input and displaces in +X+Y+Z or -X-Y-Z, or any combination of those. This can be rotated/mixed afterwards.
The Redirect Shader takes displacement, and + displacement of a PF plugged into the X input gives a +X displacement, etc. I don't think it's wise to take a Vector Displacement Shader, which points in one direction, and then use a Redirect Shader to make it point in another direction. It is likely it won't work.
You can use a Vector Displacement Shader by itself (with a colour input), or use an additional warp shader, like Jochen wrote, which works differently.

Matt

#6
You should either use the Redirect Shader method or the Vector Displacement Shader method, but not both.

I forgot to mention that you need a Build Vector node if you use the Vector Displacement method.

Use one of the following methods, but not both.


Method 1 - Redirect Shader:

Use 3 fractals that generate displacements. Connect fractal 1 to the 'X' input of the Redirect Shader, connect fractal 2 to the Y input, and connect fractal 3 to the Z input. The Redirect Shader is now producing 3D displacement (vector displacement). You can then use this as a warper for the Warp Shader.

Fractal 1 (disp) --\
Fractal 2 (disp) --> Redirect Shader --> Warp Shader
Fractal 3 (disp) --/


Method 2 - Build Vector and Vector Displacement Shader:

Use 3 fractals that generate colour or greyscale. Connect fractal 1 to the 'X' input of the Build Vector, connect fractal 2 to the 'Y' input, and connect fractal 3 to the Z input. Connect the Build Vector to the 'Vector function' input of a Vector Displacement Shader. The Vector Displacement Shader is now producing 3D displacement (vector displacement). You can then use this as a warper for the Warp Shader.

Fractal 1 (colour) --\
Fractal 2 (colour) --> Build Vector --> Vector Displacement Shader --> Warp Shader
Fractal 3 (colour) --/


Method 1 is simpler for your needs because there are fewer nodes and conversions. Method 2 is simpler if you already have a vector to use as displacement, e.g. a vector displacement map, in which case you don't need a Build Vector.
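
Here is a rough Python sketch of the two wirings above; the function names are illustrative stand-ins, not Terragen's API.

    # Minimal sketch of the two methods (illustrative only).
    import math

    def fractal(p, seed):
        """Stand-in for one Power Fractal; returns a scalar we treat either as a
        displacement amount (Method 1) or as a greyscale colour (Method 2)."""
        x, y, z = p
        return math.sin(x * 0.9 + seed) * math.cos(z * 1.1 + seed)

    # Method 1: Redirect Shader -- each input is already displacement; the node
    # just assigns it to an axis.
    def redirect_shader(p, seeds):
        return (fractal(p, seeds[0]),   # X input
                fractal(p, seeds[1]),   # Y input
                fractal(p, seeds[2]))   # Z input

    # Method 2: Build Vector packs three colours into one vector; the Vector
    # Displacement Shader then converts that vector into displacement (here
    # simply scaled by a multiplier).
    def build_vector(p, seeds):
        return (fractal(p, seeds[0]), fractal(p, seeds[1]), fractal(p, seeds[2]))

    def vector_displacement_shader(vec, multiplier=1.0):
        return tuple(multiplier * c for c in vec)

    p = (12.0, 300.0, -7.0)
    seeds = (101, 202, 303)
    print(redirect_shader(p, seeds))
    print(vector_displacement_shader(build_vector(p, seeds)))

With the same three fractals and a multiplier of 1, both pipelines produce the same vector here, which is why the two methods look so similar in practice; the difference is only in what kind of data each node expects.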

Matt
Just because milk is white doesn't mean that clouds are made of milk.

Matt

This is Method 1, which is all you need if your fractal inputs are generating displacement.
Just because milk is white doesn't mean that clouds are made of milk.

bobbystahr

That's just awesome, thanks for the great explanation Matt...
something borrowed,
something Blue.
Ring out the Old.
Bring in the New
Bobby Stahr, Paracosmologist

Dune

Yes, Matt is more thorough in his explanation. Mine was harder to understand I guess.

Denis Sirenko

#10
Dune, Matt, thanks for your explanations.

Quote from: Dune on March 14, 2018, 02:16:03 PM
What do you want to achieve?

This is an experiment. I have some trouble understanding what kind of mathematics Terragen uses and how it works. In this experiment my question was about how the Redirect Shader works, although that was not the original goal: I originally wanted to understand how the Warp Shader behaves with three-dimensional forms, i.e. clouds. For that, I first made simple cylinders with a Vector Displacement Shader to see how they would behave when sent through the Warp Shader.

Matt, as far as I understand, the Build Vector is needed here only because it gives an extra capability: sending different fractal shaders to the different directions (X, Y and Z), thus avoiding displacement that only goes diagonally (which is what you get with a single greyscale noise and a value of 1 for the X, Y and Z multipliers of the Vector Displacement Shader). Correct me, please, if I'm wrong. But in principle, at the output of the Vector Displacement Shader we get the same kind of information as when using it without the Build Vector: a field of vectors (though the vector values differ, because we have 3 PFs rather than 1). Where this field of vectors intersects a surface or volumetric form, it tells each point of that surface or volume in which direction it should be moved.

As for my story with the Redirect Shader: it seems I've understood why the Redirect Shader does not change the direction of displacement generated with a Vector Displacement Shader. Apparently the Vector Displacement Shader does not use information about the normals of the surface to which the displacement is applied; it works in global coordinates. Likewise, if you tell a PF to generate its displacement not along the normals (the default) but along the vertical (which I understand as "along the vertical in global coordinates"), then the Redirect Shader is also unable to change the direction of that displacement. That is what my experiments show. I hope I'm done with at least one problem; understanding this will let me work with Terragen more freely.

Correct me, please, if I'm somewhere wrong.
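
One way to picture that guess in a small Python sketch (my own illustration, not Terragen's actual implementation): if the Redirect Shader re-points only the along-the-normal component of a displacement, then a world-space displacement vector has nothing for it to redirect.

    # Hedged illustration of the guess above, not Terragen internals.
    def redirect_along_normal(disp, normal, new_axis):
        """Re-point only the along-normal component of a displacement."""
        amount = sum(d * n for d, n in zip(disp, normal))           # along-normal amount
        rest = tuple(d - amount * n for d, n in zip(disp, normal))  # everything else unchanged
        return tuple(r + amount * a for r, a in zip(rest, new_axis))

    normal = (0.0, 1.0, 0.0)
    x_axis = (1.0, 0.0, 0.0)

    along_normal_disp = (0.0, 2.0, 0.0)    # e.g. a fractal displacing along the normal
    world_space_disp  = (1.5, 0.0, -0.5)   # e.g. a Vector Displacement Shader output

    print(redirect_along_normal(along_normal_disp, normal, x_axis))  # (2.0, 0.0, 0.0): redirected
    print(redirect_along_normal(world_space_disp, normal, x_axis))   # unchanged: nothing to redirect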

Matt

#11
I think you understand this correctly. The only point I should perhaps clarify is the mixing of the Redirect Shader and the Vector Displacement Shader. There is no reason to feed a Redirect Shader into the vector input of a Vector Displacement Shader, because the Vector Displacement Shader expects that input to be a colour or a vector function(*), yet the Redirect Shader's output is displacement (any colour would simply pass through unchanged).

Matt

(*) Colours and vectors are interchangeable raw data. But displacement is different: it is a modification of the description of a surface. Shaders can be used to generate displacement (displace a surface) using vectors, and other shaders can extract vectors from displaced surfaces.
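
As a loose illustration of that footnote (my analogy, not Terragen code): colours and vectors are plain values you can pass around, while displacement is the act of applying such a value to a point of a surface.

    Colour = tuple          # raw data: three numbers, interchangeable with a vector

    def displace(point, vector):
        """Displacement: a modification of the surface description -- moving a point."""
        return tuple(p + v for p, v in zip(point, vector))

    colour: Colour = (0.2, 0.8, 0.1)            # usable as a vector-function input
    print(displace((0.0, 100.0, 0.0), colour))  # the same data, now driving a displacement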
Just because milk is white doesn't mean that clouds are made of milk.

Matt

Something about how Redirect Shader only affects shaders that displace along the normal:

https://planetside.co.uk/wiki/index.php?title=Redirect_Shader
Just because milk is white doesn't mean that clouds are made of milk.

Denis Sirenko

#13
Thanks for the explanation, Matt. I'm glad that my guesswork was confirmed.

Quote from: Matt on March 17, 2018, 07:33:15 AM
Something about how Redirect Shader only affects shaders that displace along the normal:
https://planetside.co.uk/wiki/index.php?title=Redirect_Shader

It was a little difficult for me to understand, perhaps because it is written in terms of "surface normals", whereas I don't have any surfaces; I work exclusively with clouds.

Hetzen

#14
Clouds work differently from ground displacement. The ground works with surface normals; clouds work on the volume defined by the white value you're plugging in.

So in the example you posted above, your SSS tubes define what is white, and you are trying to redirect/vector-displace that in X and Z. What you should be doing is warping the texture space using a Warp Input node. Y/altitude will be defined by your cloud layer's altitude and depth.
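
A rough Python sketch of that idea (illustrative only; the density and warp functions are made up): with clouds there is no surface to displace, so the warp shifts the position at which the density function, the "white value", is evaluated.

    # Illustrative sketch of warping texture space for a cloud density function.
    import math

    def tube_density(p):
        """Stand-in for the shader defining what is 'white' (e.g. simple tube shapes)."""
        x, y, z = p
        return 1.0 if (x % 10.0) < 1.0 else 0.0   # vertical tubes repeating along X

    def warp_vector(p):
        """Stand-in for the warper: a horizontal (X/Z) offset that varies with position."""
        x, y, z = p
        return (2.0 * math.sin(0.2 * z), 0.0, 2.0 * math.cos(0.2 * x))

    def warped_density(p):
        """Warp Shader / Warp Input idea: evaluate the density at a shifted position."""
        dx, dy, dz = warp_vector(p)
        return tube_density((p[0] + dx, p[1] + dy, p[2] + dz))

    print(warped_density((5.0, 1200.0, 40.0)))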