Nebulae

Started by Denis Sirenko, July 26, 2017, 07:40:59 AM


Denis Sirenko

#150
Maybe Oshyan will correct me, but it isn't even necessary to use non-adaptive max samples. It's enough to set the First sampling level to, say, 1/64, and control the noise with the Pixel noise threshold parameter next to it. Most likely it can be lowered to around 0.03-0.02. That has a very strong effect on render time, but it can eliminate the noise completely.
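To put rough numbers on it (only my sketch of how the sample budget works; the "max samples = AA level squared" relationship is my assumption, not something from the Terragen docs):

[code]
aa_level = 8                                  # Anti-aliasing setting on the render node
max_samples = aa_level ** 2                   # assumed budget: 64 samples/pixel if fully refined
first_level = 1 / 64                          # "First sampling level"
initial_samples = max(1, round(max_samples * first_level))   # adaptive pass starts from 1 sample
noise_threshold = 0.02                        # "Pixel noise threshold": a pixel keeps getting more
                                              # samples until its noise estimate falls below this
print(initial_samples, max_samples, noise_threshold)
[/code]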

Quote from: Oshyan on March 10, 2018, 03:24:40 PM
...Also if you're using v3 cloud layers, take a look at the newer Voxel Scattering Quality setting in the GI in Clouds tab of the Render GI Settings node...

Does increasing this parameter always increase the render time? For some reason, in my test scene (a few nested v3 cloud layers, no geometric objects) I once got the opposite result: the render time dropped by about 13% when I changed the value from 25 to 50. At the same time I saw no noise reduction at all, but perhaps I had already reduced the noise enough by other means.

Quote from: WASasquatch on March 10, 2018, 06:44:20 PM
The fountains of knowledge that spring up around here...

WASasquatch, there is another way to reduce noise: use Photoshop as your AA :). Just render the image larger than necessary and then scale it down in PS with suitable resampling settings. True, if the 1280 × 900 resolution limit is too small for this, you will have to do several renders. (I probably don't need to mention denoisers like Neat Image for PS - such advice might offend :))
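If you prefer to script that downscale instead of doing it in PS, a minimal sketch with Pillow (the file names and the 2x factor are just examples):

[code]
from PIL import Image

src = Image.open("nebula_2560x1800.png")         # rendered at twice the target size
dst = src.resize((1280, 900), Image.LANCZOS)     # a good resampling filter averages out the noise
dst.save("nebula_1280x900.png")
[/code]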

Oshyan

I mainly suggested non-adaptive just so he can be sure that he is getting the maximum number of samples for a given AA level, i.e. the highest quality. This avoids having to spend time fine-tuning the noise threshold.

The increase in voxel scattering quality does not always increase render time, no. There are also other sources of noise which it might not have any effect on.

- Oshyan

WAS

Will have to give the methods a try at the very least. I know noise reduction in Photoshop is possible, but it can reduce fine detail if you're trying to remove a fair amount of noise.

Denis Sirenko

#153
Quote from: Oshyan on March 13, 2018, 05:39:38 PM
I mainly suggested non-adaptive just so he can be sure that he is getting the maximum number of samples for a given AA level, i.e. the highest quality. This avoids having to spend time fine-tuning the noise threshold.

OK, that's clear, thanks!

Quote from: Oshyan on March 13, 2018, 05:39:38 PM
The increase in voxel scattering quality does not always increase render time, no. There are also other sources of noise which it might not have any effect on.

It just seemed amusing to me that increasing this parameter can sometimes REDUCE the render time. Although I admit I could have made a mistake in the optimization test itself.

Denis Sirenko

Hello! I experimented with explosions in nebulae, for example supernova explosions scattering their innards outward. Here's what I got.

[attachimg=1]

[attachimg=2]

[attachimg=3]

[attachimg=4]

[attachimg=5]

[attachimg=6]

[attachimg=7]

Dune


Hannes

This is so outstanding and unusual. Incredibly beautiful!

luvsmuzik

Excellent representations! Any jellyfish look-alikes yet? I want to ask whether you find version 2, version 3, or the new easy clouds easier to control in your warping. I don't want a debate, just a general answer if it is not too much disclosure.  :)

WAS

Quote from: Denis Sirenko on March 29, 2018, 07:33:57 AM
Hello! I experimented with explosions in nebulae, for example supernova explosions scattering their innards outward. Here's what I got.

Are you serious? These are amazing. I would love to see these animated lol. Really want to know the trick with the hard details.

Does grayscale rendering save on time?

Denis Sirenko

Thanks guys.

Quote from: luvsmuzik on March 29, 2018, 01:42:42 PM
Excellent representations! Any jellyfish look-a-likes yet?

No, no jellyfish yet, maybe it needs another seed :) In general, I don't need one yet. A jellyfish would need a craftier warper. Right now I simply have matter being "stretched" outward from the center of a sphere of a set radius (see the sketch below). Has anyone on the forum already made an explosion in the form of a jellyfish?
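Roughly, the principle of that warp looks like this (just a Python sketch of the idea, not my actual Terragen nodes; the function and parameter names are made up):

[code]
import math

def explode_warp(p, centre, radius, strength):
    """Push a point away from the centre; the push fades to nothing at the sphere's surface."""
    dx, dy, dz = p[0] - centre[0], p[1] - centre[1], p[2] - centre[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
    falloff = max(0.0, 1.0 - dist / radius)   # full effect at the centre, none beyond the radius
    scale = 1.0 + strength * falloff          # how far outward this point is stretched
    return (centre[0] + dx * scale,
            centre[1] + dy * scale,
            centre[2] + dz * scale)

# Example: a point 2 km from the centre of a 10 km sphere gets pushed outward.
print(explode_warp((2000.0, 0.0, 0.0), (0.0, 0.0, 0.0), 10000.0, 3.0))
[/code]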

Quote from: luvsmuzik on March 29, 2018, 01:42:42 PM
I want to ask if you find version2, version3, or the new easy clouds easier to control in your warping. I don't want a debate, just a general answer if it is not too much disclosure.  :)

Oh, I don't know. I only used v3 clouds because they work better with light, with the "Move textures with cloud" parameter turned off. Does the kind of cloud make any difference?

Quote from: WASasquatch on March 29, 2018, 06:00:50 PM
Are you serious? These are amazing. I would love to see these animated lol. Really want to know the trick with the hard details.
Does grayscale rendering save on time?

Animated? Yes, yes, very funny :)

About the details: my cloud is 15 kilometers in size, made specifically to be able to get small details. I don't think I have any problems there. If anything, sometimes the details are too small :)

About grayscale: no, I just haven't worked on the color settings yet; it's a separate big task. The nebulae really are just in grayscale for now.

Denis Sirenko

I calculated that 1 second of video (800x800 px) would take about a day to render, i.e. an explosion lasting 1 minute = 2 months of rendering.
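The arithmetic behind that, assuming 25 frames per second (the frame rate is just an assumption for the example):

[code]
fps = 25                                     # assumed frame rate
hours_per_video_second = 24                  # 1 second of video takes about a day to render
hours_per_frame = hours_per_video_second / fps            # roughly an hour per 800x800 frame
days_for_one_minute = 60 * hours_per_video_second / 24    # 60 days, i.e. about 2 months
print(hours_per_frame, days_for_one_minute)
[/code]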

WAS

Quote from: Denis Sirenko on March 30, 2018, 03:42:03 AM
Animated? Yes, yes, very funny :)

Aww, why not? ;)

Quote from: Denis Sirenko on March 30, 2018, 03:42:03 AM
The nebulae really are just in grayscale for now.

Well, Hubble uses a series of measurements to predict the correct colours (unless they're being represented in false colour to show something specific). Users on Earth use real-world colour images and compositing programs to bring out the real colour.

Denis Sirenko

Quote from: WASasquatch on March 30, 2018, 03:54:35 AM
Well, Hubble uses a series of measurements to predict the correct colours (unless they're being represented in false colour to show something specific). Users on Earth use real-world colour images and compositing programs to bring out the real colour.

All of that is exactly right. But the thing is that Hubble takes its measurements in different wavelength ranges and gets slightly different grayscale images. The pattern in each of these measurements is determined by the chemical composition of the nebula and the physical processes in it. Then you simply map these different components to colors, and you get natural color variations based on natural processes and differences. In my case, there is no attribute by which I could get different grayscale renders.
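If I did have several grayscale renders standing in for different emission lines, combining them into a colour image would be simple; for example with Pillow (the SII / H-alpha / OIII file names are hypothetical, and the channel mapping is just the usual Hubble-palette convention):

[code]
from PIL import Image

sii  = Image.open("render_sii.png").convert("L")    # hypothetical grayscale render for SII
ha   = Image.open("render_ha.png").convert("L")     # hypothetical grayscale render for H-alpha
oiii = Image.open("render_oiii.png").convert("L")   # hypothetical grayscale render for OIII

rgb = Image.merge("RGB", (sii, ha, oiii))           # map the three "lines" to R, G, B
rgb.save("nebula_colour.png")
[/code]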

Nacer Eddine

#163
Quote from: Denis Sirenko on March 30, 2018, 03:48:07 AM
I calculated that 1 second of video (800x800 px) would take about a day to render, i.e. an explosion lasting 1 minute = 2 months of rendering.

Good job. For me personally, doing this kind of work in Terragen takes a lot of time, especially the render. I prefer doing this work in After Effects; it's faster. But I enormously respect what you do.
For me it's just a comparison of time, especially when you said: 1 min = 2 months of rendering.

Look at a similar job done in just a few minutes:
https://www.youtube.com/watch?v=2FNUi3_8_80

luvsmuzik

#164
Quote from: Denis Sirenko on March 30, 2018, 04:35:20 AM
Quote from: WASasquatch on March 30, 2018, 03:54:35 AM
Well, Hubble uses a series of measurements to predict the correct colours (unless they're being represented in false colour to show something specific). Users on Earth use real-world colour images and compositing programs to bring out the real colour.

All of that is exactly right. But the thing is that Hubble takes its measurements in different wavelength ranges and gets slightly different grayscale images. The pattern in each of these measurements is determined by the chemical composition of the nebula and the physical processes in it. Then you simply map these different components to colors, and you get natural color variations based on natural processes and differences. In my case, there is no attribute by which I could get different grayscale renders.

I go to this site often; here is the jellyfish from the archives. Don't we all wish we had the power to determine the chemical composition of our own grayscale images. ;D
https://apod.nasa.gov/apod/ap180323.html

As for animating, I actually made a shockwave object from a torus in Blender a while back. I think an object might render faster than the clouds; I shall have to dig that up. That would give you a way to do a flight path, I think.