How do you create physically accurate HDRI skies in Terragen 4?

Started by eapilot, May 29, 2017, 06:31:33 PM


eapilot

I am researching a good way to create physically accurate HDRI panoramas for real-time game engines.  From what I've seen of Terragen 4, it can render very dynamic HDRIs with an unclamped sun intensity that spans the whole range of a 32-bit float texture.  This is better than most commercially available panoramic HDRIs, which are limited by the technical constraints of physical cameras (unless you use multiple ND filters). However, Terragen's rendering camera isn't a physical camera model.  Can you import a physical camera with shutter speed, aperture, and exposure settings?  How can you render a CG sky in Terragen and match desired real-world measurements like lux values and sun-to-shadow ratios, at least to gauge the accuracy of the CG sky?

I saw that the skies linked below are for sale.  The product claims a dynamic range of 25-30 stops.  I don't really know how that can be measured accurately without real-world values.

Ultimately, I would like to find

link below
http://www.sastudios.tv/cghdri/cg-hdri-v2


Oshyan

Those HDRIs at SAStudios are created in Terragen 4. ;)

Here is some info on real-world lighting equivalence for Terragen, quoted from Matt Fairclough, Lead Developer/Software Architect:

Quote
Luminance values in Terragen are based on pixel values without any real-world units, because the camera exposure is also without units. But it should be possible to work out the scaling factor with the information we have. I've aimed to make all the ratios in Terragen physically plausible with default atmosphere settings, so the ratio of sun to sky luminance should be about right, and the surface of the sun should be the physically correct ratio to the illumination it provides. We just need to figure out the constant scalar to match the target of 2.43e9 nits. In theory...

Allow me to get into the math :)

In the absence of atmosphere, a Terragen sunlight strength of 1 falling directly (perpendicularly) onto a diffuse surface with albedo 1 seen with camera exposure 1 produces a pixel of value 1 in the EXR. That's the scale that everything else derives from.

With the sun's angular diameter at 0.5 degrees (approximately correct for Earth), the math I use in Terragen creates a pixel luminance ratio of approximately 26263. That means if you use a sunlight strength 1 and camera exposure 1, pre-atmosphere the sun will appear with values of about 26263 in the EXR. If you change the sun's angular diameter you get a different value. That's because luminance is inversely proportional to the square of the angular diameter (for relatively small angles) so that the total energy from the sun remains the same.

That's for sun strength and exposure of 1. In the default scene the sunlight intensity is actually 5, so this creates pixels with 5 times that value. Incidentally, that's beyond what 16-bit EXR can store, but if you increase the diameter to 0.75 that brings the luminance back into range.
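In Python terms, that relationship looks roughly like this (a sketch for reference only; the 26263 constant is the figure quoted above, and 65504 is the largest finite value a 16-bit half float can store):

    HALF_FLOAT_MAX = 65504.0  # largest finite value in a 16-bit EXR channel
    L_REF = 26263.0           # pre-atmosphere sun pixel value at strength 1, exposure 1, 0.5 deg
    D_REF = 0.5               # reference angular diameter in degrees

    def sun_pixel_value(strength=1.0, diameter_deg=D_REF, exposure=1.0):
        """Pre-atmosphere sun pixel value; luminance scales with 1/diameter^2."""
        return L_REF * strength * exposure * (D_REF / diameter_deg) ** 2

    print(sun_pixel_value(strength=5, diameter_deg=0.5))   # ~131315, beyond the half-float range
    print(sun_pixel_value(strength=5, diameter_deg=0.75))  # ~58362, back within half-float range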


Therefore:

    If the pre-atmosphere sun needs to be 2.43e9 nits, and you rendered with sun intensity 1 and exposure 1, you should scale the EXR output by 2.43e9 / 26263 ~= 92525.
    If the pre-atmosphere sun needs to be 2.43e9 nits, and you rendered with sun intensity 5 and exposure 1, you should scale the EXR output by 2.43e9 / (26263 * 5) ~= 18505.


Unfortunately because Terragen outputs only 16-bit EXR at the moment, you can't create an EXR with actual values in nits, if that's what you wanted. But I would try to standardize on either 1 or 5 for the sun intensity and then you have a constant scaling factor.

This might be the first time I've derived these figures, or it was so long ago that I don't remember doing it. So I wanted to do a sanity check against the other number you gave, the 8000 nits for a blue sky on a clear day at noon. The ratio of sun luminance to blue sky luminance should be 2.43e9 / 8000 = 303,750. If I want to expose the sky so it has a value of 0.5 in the image, the sun should appear with a pixel value of 303,750 * 0.5 = 151,875. If I divide that by 26263 to get the sunlight strength parameter, I get 5.783. That's very close to the default sunlight strength of 5, which does indeed give us a nicely exposed clear blue sky at noon.
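The same arithmetic, roughly, in Python (a sketch; all constants are the ones quoted above):

    SUN_NITS = 2.43e9   # target pre-atmosphere sun luminance
    SKY_NITS = 8000.0   # clear blue sky at noon
    L_REF = 26263.0     # pre-atmosphere sun pixel value at strength 1, exposure 1

    def exr_to_nits_scale(sun_strength):
        """Multiply EXR pixel values by this to express them in nits."""
        return SUN_NITS / (L_REF * sun_strength)

    print(exr_to_nits_scale(1))  # ~92525
    print(exr_to_nits_scale(5))  # ~18505

    # Sanity check: expose the sky to a pixel value of 0.5 and see what
    # sunlight strength that implies.
    sun_pixel = (SUN_NITS / SKY_NITS) * 0.5  # ~151875
    print(sun_pixel / L_REF)                 # ~5.78, close to the default of 5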

I hope that helps.

- Oshyan

eapilot

Thanks Oshyan!  That is very helpful, but a little beyond my expertise.  If you leave the default sunlight value of 5 in Terragen, is that equivalent to a luminance of 1 once the angular diameter is increased from 0.5 to 0.75?  If not, is it more accurate to set the sun default to 1?

Matt

Quote from: eapilot on May 29, 2017, 07:50:09 PM
Thanks Oshyan!  That is very helpful, but a little beyond my expertise.  If you leave the default sunlight value of 5 in Terragen, is that equivalent to a luminance of 1 once the angular diameter is increased from 0.5 to 0.75?  If not, is it more accurate to set the sun default to 1?

Without metadata, EXRs are essentially 'unitless'. This means that when you bring the EXR into another renderer, you have to decide how bright to make it according to how brightly you want to light the scene. I think it's usually assumed that you'll want to adjust the intensity of an HDRI when loading it into any renderer, and I haven't found any information on mapping HDRIs to physical units yet. Perhaps someone else here knows more about this?

Unfortunately I don't have experience in loading EXRs into Unreal and mapping them to physical units, but I can say that with default settings in Terragen (sunlight 5, exposure 1) the EXR will contain a sunlight luminance of approximately 26263 (of "something") when outside the atmosphere. When using this EXR in another renderer, if you know what luminance the sun (before being filtered by atmosphere) should be in that renderer's choice of units (e.g. 1361 W/m²), then you can work out what intensity multiplier to use when loading the EXR into the renderer. E.g. if the renderer expects values in W/m² then you could set the intensity to 0.0518, because 1361/26263 = 0.05182195484. But to be honest I have no idea whether this will give the correct results, or how game engines work with physical units when loading HDRIs.
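Roughly, in Python (a sketch that assumes the target renderer really does want the pre-atmosphere sun at 1361 W/m², which, as noted above, is an open question):

    TERRAGEN_SUN_PIXEL = 26263.0  # pre-atmosphere sun value at default settings (sunlight 5, exposure 1)
    TARGET_SUN_VALUE = 1361.0     # e.g. the solar constant in W/m^2, if that's the renderer's unit

    intensity_multiplier = TARGET_SUN_VALUE / TERRAGEN_SUN_PIXEL
    print(intensity_multiplier)   # ~0.0518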

Even though the EXRs are unitless, they can still represent physically accurate dynamic ranges. Dynamic range is the ratio of the brightest to the darkest pixel.
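If you wanted to check a claim like the 25-30 stops mentioned earlier, a rough way to measure it in Python (assuming the EXR has already been loaded into a float NumPy array, e.g. via OpenEXR) would be:

    import numpy as np

    def dynamic_range_stops(pixels):
        """log2 ratio of the brightest to the darkest non-zero pixel value."""
        values = pixels[pixels > 0]
        return float(np.log2(values.max() / values.min()))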

Matt
Just because milk is white doesn't mean that clouds are made of milk.

eapilot

Quote
but I can say that with default settings in Terragen (sunlight 5, exposure 1) the EXR will contain a sunlight luminance of approximately 26263 (of "something") when outside the atmosphere

Is there somewhere I can look up the publicly available real-world sun measurements that you used to calibrate the intensity of the sun in Terragen?  I could chart the approximate sun-to-sky values in a default Terragen sky at different times of day. Then I could approximate the metadata needed to map the intensity values into Unreal units, like luminance and the sun-to-sky ratio.

Quote
Without metadata, EXRs are essentially 'unitless'. This means that when you bring the EXR into another renderer, you have to decide how bright to make it according to how brightly you want to light the scene.

My initial strategy was to shoot my own HDRI panorama skies, because you can collect metadata and other real-world measurements on location.  Shooting them properly still requires purchasing the right equipment, and there is also quite a lot of cleanup. My other thought is to use Terragen: it allows creative control while still producing an unclamped HDRI. The problem with that is deriving metadata from the EXR or TGD file.

Ethrieltd

The sun produces about 100,000 lux (lumens per square meter) at sea level. So if you can work out the level of atmospheric attenuation in Terragen, multiplying the (atmosphere-adjusted) 26263 figure by 4 should give you the approximate figure in lux. Whether this information is useful isn't obvious to me; it would be a ballpark figure at best, and that's before taking clouds, Rayleigh scattering, etc. into account.
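In other words (just a sketch of that ballpark arithmetic, nothing more rigorous):

    SEA_LEVEL_LUX = 100_000.0  # rough full-sun illuminance at sea level
    SUN_PIXEL = 26263.0        # Terragen's pre-atmosphere sun value at strength 1

    print(SEA_LEVEL_LUX / SUN_PIXEL)  # ~3.8, i.e. the "multiply by 4" ballpark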

Almie

So now that Terragen supports 32-bit EXRs, how does this affect the physical accuracy of the values? Using the default atmosphere with no clouds gave me luminance values of about 51k in the hottest spot and it looked decent enough when used as IBL. I'm just wondering if this is physically accurate, or if there are any other steps I need to take to make sure it is.

Oshyan

It's pretty physically accurate out of the box. 32-bit EXR support just allows better representation of really extreme values within the sun, but even with 16-bit EXRs they were already high enough and accurate enough to produce illumination that was essentially indistinguishable from real-world sun illumination.

- Oshyan

N-drju

Funny you should ask this, because over Christmas I played a bit with making my own, pretty basic sphericals and then turning them into HDRs using Picturenaut (thanks digitalguru). I am quite content with the lighting from even the most basic (empty) scenes, and totally agree with Oshyan. I guess light colors and exposure need some adjustments, but other than that? Me like.
"This year - a factory of semiconductors. Next year - a factory of whole conductors!"

KirillK

I honestly never understood why anyone needs true HDR images, rendered or captured, for lighting. The results are never what you wanted. IMO you could just paint such an HDR in Photoshop, or convert an LDR image to whatever light intensity and tint you need, and it's usually just fine. Moreover, they only work for the static scene point where they were taken.

N-drju

Hehe, in a way you are right, KirillK. ;) The first time I heard about HDR images and what they are used for, I was quite confused. I also thought it would be better to simply add a distant light and an "omni"-type fill light, because that is easy and allows a tremendous deal of control. You can't change the color and intensity of the light in the HDR image, much less the fill lights used (if any).

Still, I like the idea of using HDRs in my art, that is, if I make them myself. :D That way I have exactly the parameters that I expect. HDRs are quick and let you employ automatic lighting instead of doing everything by hand, provided you are willing to sacrifice control for speed. Besides, I find it fascinating that solutions like these exist.

However, I cannot agree with you as far as the "static scene point" is concerned. Many CG programs actually let you rotate the HDR sphere (and hence the light sources) in whatever fashion you want. So if what you mean is that a picture taken at sunset will always shine from 270 degrees in your scene, it does not necessarily have to. All you need to do is rotate the image. Many programs, like DAZ Studio for that matter, allow you to adjust the orientation of the light source with ease.
"This year - a factory of semiconductors. Next year - a factory of whole conductors!"

KirillK

Rotation is fine. But if you have a specific detail in your HDR (architecture, a tree, or a strongly illuminated object casting strong indirect light), it no longer works once it doesn't match the actual scene. So in practice you need either a pattern/sequence of HDRIs along a camera path, which is too complicated IMO, or a very simple hand-painted HDRI consisting of just sky/ground gradients and sun/light-source dots, with no specific details.


N-drju

Oh, ok I see what you mean.

In that case - yes. If the illumination is complicated, like rainbow-lit fountains at night, this will probably be useless unless you are after a very similar scene.

I am in the comfortable situation that I usually just have to paste DAZ characters into TG renders. So the natural workflow for me would be to make a copy of the (still empty) render, convert this copy into an HDR, and later render the DAZ scene using the TG-made HDR. Then it's just a copy-paste job and finishing touches in Photoshop. :)
"This year - a factory of semiconductors. Next year - a factory of whole conductors!"

digitalguru

Quote
You can't change the color and intensity of the light in the HDR image, much less the fill lights used (if any).

Yes you can - it's pretty much standard practice for studios to make copies of an HDR to isolate various elements:

For instance, for an outdoor scene, one copy would be made with the sun painted out and another copy with just the sun isolated. These two maps can then be loaded into two separate HDR or dome lights in a 3D app and balanced as desired. To go even further, other elements in an HDR can be masked out and copied; strong sunlight on a road could be isolated to reduce the effect of its bounce light in a scene.
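A rough sketch of that split in Python/NumPy, assuming the HDR is already loaded as a float array; the luminance threshold here is a hypothetical stand-in for the painted masks a studio would actually use:

    import numpy as np

    def split_sun(hdr, threshold=1000.0):
        """Return (sun_only, sun_removed) copies of an HDR environment map."""
        luminance = hdr.max(axis=-1)                   # crude per-pixel brightness
        sun_mask = (luminance > threshold)[..., None]
        sun_only = np.where(sun_mask, hdr, 0.0)
        # Crude fill: clamp the sun region rather than painting it out properly.
        sun_removed = np.where(sun_mask, threshold, hdr)
        return sun_only, sun_removed

The two resulting maps can then drive two separate dome lights and be balanced independently, as described above.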

But you are right, illumination in a shot is very complex, and using an HDR as a light source, which basically fires its rays of illumination from an infinite sphere, doesn't take into account the realities of a scene, where objects that block or bounce light can be near or far from the object being rendered.

ILM experimented with this on Iron Man 2: they took HDR stills of objects on a set and mapped them onto proxy objects in a 3D scene, so if Iron Man got close to an object that was emitting (or bouncing) a lot of light, it could be reproduced in 3D.

Quote
So in practice you need either a pattern/sequence of HDRIs along a camera path, which is too complicated IMO

Did a test of this in this video - https://www.youtube.com/watch?v=vS21UjyQpWU ( about 10 mins in )

It works much better than I thought it would, and it can reproduce the dynamic light as the object passes through the scene. Unless you're looking for some very specific reflections on your object, the rendered spherical sequence can have lower quality and resolution and thus be less expensive to render.






N-drju

Quote from: digitalguru on January 11, 2018, 06:56:31 AM

Yes you can - it's pretty much standard practise for studios to make copies of an HDR to isolate various elements -


::) Bahhh... Of course. You are right. Given enough resources and time you can edit the output image and, like you said, mask items away. Thank you for this explanation. :) Most of us are simply on a rather tight schedule and don't have much time to make such improvements to our content.

This is a lot of work right there in that video. I can't possibly imagine how long it took to get all the details right. :) And all this for a four second biplane flyover. ;D
"This year - a factory of semiconductors. Next year - a factory of whole conductors!"