Planetside Software Forums

Support => Terragen Support => Topic started by: Kadri on October 07, 2015, 03:30:54 PM

Title: Cloud quality changes by itself
Post by: Kadri on October 07, 2015, 03:30:54 PM

I am not sure if this is a problem at all, actually. I am more curious about what is happening.

When I opened some scenes before, I saw strangely high numbers like 60 in the quality options.
At first I wasn't sure what had happened.
Today I opened another scene and saw a number like 25 in there. It should have been 1.

I don't know how to reproduce it, but I think it is related to the animated "cloud depth" option.
The cloud depth begins at zero and gets higher over time.

I didn't see much of a difference in render time between 1 and 25 either (just testing this once more now).
Looked strange to me. Is this normal behavior?
Title: Re: Cloud quality changes by itself
Post by: Oshyan on October 07, 2015, 03:33:19 PM
Strange indeed. We'd be very interested in finding out more about how to reproduce this.

As for why it may not be affecting render time as much as you expect, are you using Defer Atmosphere?

- Oshyan
Title: Re: Cloud quality changes by itself
Post by: Kadri on October 07, 2015, 03:35:34 PM

I just edited my post, Oshyan. I might have rendered a frame where the cloud depth is still zero.
Title: Re: Cloud quality changes by itself
Post by: Kadri on October 07, 2015, 03:51:57 PM
There is no "Defer atmo" in this scene Oshyan.

Another possibility that comes to my mind is that this is actually a very old scene (4-5 years?) where I used unnecessarily extreme settings.
Some cloud altitude settings are, for example, 200,000 m. The camera is there too. Some kind of too-far-from-the-origin problem?

I am still testing.
Title: Re: Cloud quality changes by itself
Post by: Kadri on October 07, 2015, 04:10:33 PM

When I open the scene, the number I see in the cloud quality is "29.3333".
I rendered without changing anything. The small crop render time was 1.30 or so.

Then I tried a basic thing. I copied and pasted the same number "29.3333" once more into the same input field.
Now suddenly the render time is three times longer.

Then I used "1" as input. The render time was roughly the same as when I rendered it as-is after opening the file.

Basically it shows "29.3333" but acts as if it were "1" or so when you open and render the file.

I think there is no harm done (at least in my tests). But it looks frightening :)
Interesting.
Title: Re: Cloud quality changes by itself
Post by: Oshyan on October 07, 2015, 04:21:53 PM
Ah. If it's a very old scene then it's probably using the older "number of samples" parameter instead of "quality". We found that, because the # of samples needed for low-noise results could vary so much with different settings of density, edge sharpness, etc., just specifying the # of samples wasn't the best way to present the option to the user. So that's why it was changed to Quality. I would think that value should be automatically translated from # of samples into "quality", but perhaps it's not. It's an issue that not many people should run into, though. So I'd suggest returning that value to 1 whenever you see it, at least to start.
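The translation Oshyan describes can be sketched roughly like this. This is purely an illustrative guess at the idea, not Terragen's actual loader code: the parameter names and the baseline mapping of samples to quality 1.0 are assumptions invented for this example.

```python
# Illustrative sketch only: convert a legacy "number of samples" cloud
# parameter into the newer relative "quality" value when loading an old
# scene. All key names and the BASELINE_SAMPLES ratio are hypothetical,
# not Terragen's real internals.

LEGACY_KEY = "number_of_samples"   # old per-cloud parameter (assumed name)
QUALITY_KEY = "quality"            # newer relative parameter (assumed name)
BASELINE_SAMPLES = 16              # assumed sample count equal to quality 1.0

def migrate_cloud_params(params: dict) -> dict:
    """Return a copy of a cloud node's parameters with any legacy
    sample count translated into a quality value."""
    migrated = dict(params)
    if LEGACY_KEY in migrated and QUALITY_KEY not in migrated:
        samples = migrated.pop(LEGACY_KEY)
        migrated[QUALITY_KEY] = samples / BASELINE_SAMPLES
    return migrated

# An old scene storing a raw sample count becomes a quality value;
# a scene that already uses "quality" is left untouched.
print(migrate_cloud_params({"number_of_samples": 48}))  # {'quality': 3.0}
print(migrate_cloud_params({"quality": 1.0}))           # {'quality': 1.0}
```

If a loader skipped this step, the raw legacy number would be displayed in the quality field while the renderer still behaved as if quality were 1, which would match the symptom described in this thread.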

- Oshyan
Title: Re: Cloud quality changes by itself
Post by: Kadri on October 07, 2015, 04:25:54 PM

Ahh...that makes sense. Thanks :)