VDB workflow testing

Started by Matt, June 13, 2018, 07:18:11 AM


Matt

I'd like some help to test VDBs exported from Terragen clouds. Who here has experience using VDBs in other software? Better yet, have any of you rendered VDB clouds and understand how to manipulate their scale, density and shading parameters to make realistic renders?

At the moment I have basic VDB export capability working in our Linux build, and the next step is to work out how to scale them appropriately in space in a variety of situations so that they can sit correctly alongside cameras imported and exported between Terragen and other major 3D software that supports VDB files. We also need to figure out how to set density parameters to try to get 1:1 match of optical density (alpha / opacity) for shadow casting, regardless of scale. Eventually we want to be able to take VDBs in the other direction too, i.e. import VDBs to Terragen, but we are not working on that just now.

If you're interested in helping us to test this, please let me know what software you've used VDB with. It would be really great if you can also show examples of any prior work you've done.

You can either post a reply here or send me a private message or email if you prefer. Maybe we can start a VDB testing topic here on the forum, or collaborate offline if that's better for whatever reason.

Thanks! 😀

Matt
Just because milk is white doesn't mean that clouds are made of milk.

pokoy

I could do some tests now that Corona in 3ds Max supports VDB. I have no experience with VDB yet, but I know that different packages do things differently (Blender and PhoenixFD have different density/temperature channel scaling, for example), so testing a variety of VDB file examples would be needed to find a middle ground that works for most users.
From what I know, most packages allow arbitrary scaling of any channel, so users have all the control; I guess it's 'just' a matter of finding a good default.
I could do some tests starting next week until the end of June. I'm away on holiday through July and could continue testing after that.

I haven't rendered VDBs yet since my renderer of choice didn't support them until recently, but I'm pretty sure I can get something usable in a short time, and I could test the VDBs in Arnold and Corona in 3ds Max.

SILENCER

We'd love to give it a shot in LW2018 with Octane.
LW2018 has VDB import, and Octane renders them like a boss.
The Linux thing, however, is a barrier for us.

Matt

Pokoy,

That sounds great. Yeah, the channel scaling differences are something I hope to minimise, or at least understand well enough across the various software that we can provide clear advice for users of each program or renderer.

SILENCER,

For this early testing phase I will generate the VDBs, so you won't need Linux. Later on we'll get it running on Windows and Mac.


My plan is to generate some VDBs next week and share them with you. I'll also include cameras (FBX) and terrain export so we can try to get parity between renderers.

Matt
Just because milk is white doesn't mean that clouds are made of milk.

WAS

No way! I only just found out about OpenVDB clouds recently thanks to your v3 clouds, Matt. This is so exciting to hear about.

mdharrington

Would love to do some beta tests, as I was one of the originals asking for this. I'd be using Octane/Houdini/LightWave.

I think it's important to remember that some key render-scale parameters will be outside the scope of the exported VDB fields, step length being the most important.

Octane has its step length parameter in meters, as does Houdini (AFAIK). This will greatly affect the rendered density appearance, overriding whatever you put in the VDB file for density (as far as rendered appearance goes). If not precisely matched, this parameter will greatly affect the apparent density/shadowing/detail, but it is not stored in the VDB container (AFAIK).

If TG can have an adaptable control (or equivalent) over the ray step length through the volume, then, all things being equal, the density should appear the same in both TG renders and external ones.

Users just need to be aware that they should scale the step length to match TG's world size on export (many apps may prefer to shrink the world scale rather than work at TG's scale). See the sketch below.
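
To make that concrete, here's a tiny back-of-envelope sketch. The numbers are made up; only the ratio matters:

```python
# Back-of-envelope: matching an external renderer's step length to a
# Terragen scene that has been shrunk on import. All values hypothetical.
tg_step_m = 20.0     # a step length suited to a ~5 km TG cloud scene
world_scale = 0.01   # shrink factor applied to the TG scene on import

# If the whole scene is scaled by world_scale, the step length must be
# scaled by the same factor or the rendered density will not match.
external_step_m = tg_step_m * world_scale
print(external_step_m)  # 0.2 (meters)
```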

pokoy

Quote from: mdharrington on June 13, 2018, 05:48:10 PM
Octane has its step length parameter in meters, as does Houdini (AFAIK). This will greatly affect the rendered density appearance, overriding whatever you put in the VDB file for density (as far as rendered appearance goes). If not precisely matched, this parameter will greatly affect the apparent density/shadowing/detail, but it is not stored in the VDB container (AFAIK).

Step length - I guess this is a ray-marching accuracy parameter on the renderer side only, totally independent of the VDB data. We should test VDBs from TG and other packages with the same step length settings for consistent and meaningful testing.

pokoy

What would be nice is a VDB viewer app, to be able to analyze the data... I've found this, but it has to be compiled if I understood correctly:
https://nidclip.wordpress.com/2014/02/26/openvdb-standalone-gui-with-opengl/

Anyone know of a standalone tool independent from any DCC host app?
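
In the meantime, if your OpenVDB build happens to include the optional pyopenvdb Python bindings (an assumption on my part; not every build ships them, and method names may differ slightly between versions), you can dump the basics of a file from a few lines of script, something like:

```python
import pyopenvdb as vdb

# Load every grid from a .vdb file and print some basic stats.
# 'cloud.vdb' is just a placeholder filename.
grids, file_metadata = vdb.readAll('cloud.vdb')
for grid in grids:
    print(grid.name, grid.gridClass)
    print('  active voxels:', grid.activeVoxelCount())
    print('  voxel size (world units):', grid.transform.voxelSize())
    print('  active bbox (index space):', grid.evalActiveVoxelBoundingBox())
```

Not a viewer, but enough to check grid names, voxel sizes and bounds without any DCC host app.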

ajcgi

Hey Matt. It's a yes to testing that in Houdini here!

mdharrington

Quote from: pokoy on June 14, 2018, 06:46:01 AM

Step length - I guess this is a ray-marching accuracy parameter on the renderer side only, totally independent of the VDB data. We should test VDBs from TG and other packages with the same step length settings for consistent and meaningful testing.

Exactly... it is precisely the ray-marching distance between samples (some renderers may refer to it as 'quality', but the values should be normalized to meters). A VDB that is 1 m across with a 5 m step will have the ray pass right through it.

Normally there is zero advantage to making the step length shorter than the voxel size, but there are many advantages to going longer (speed being the main one).
For a comparable appearance and a fair density comparison, the step length must be the same.

Step lengths of 0.5 m or 1 m are common render defaults, but they will bring renders to a crawl in a 5 km cloud scene. At such scales TG may benefit from sampling at a 20-100 m step. This is the main difference in values I can foresee, as TG will commonly work at massive scales.

EDIT:
Houdini's step length is normalized to voxels, not meters: 1 means all voxels are sampled, 0.5 means every second voxel is sampled.
Arnold's step size is in object space, not world space... likely voxels as well.
Octane's is 100% in meters.
LightWave's is in meters.

One really has to pay attention to get meaningful comparisons. Octane and LW will need to know the voxel size in order to set a step length that matches Arnold and Houdini renders.
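
A quick arithmetic sketch of what that means in practice (my own summary of the above, not from any renderer's docs; the voxel size is made up and would normally be read from the VDB's transform):

```python
# Converting between the two step conventions described above.
voxel_size_m = 4.0  # hypothetical: a coarse cloud grid with 4 m voxels

def houdini_rate_to_meters(rate: float) -> float:
    """Per the above: 1 samples every voxel, 0.5 every second voxel,
    so the actual step is (1 / rate) voxels, times the voxel size."""
    return (1.0 / rate) * voxel_size_m

def meters_to_houdini_rate(step_m: float) -> float:
    """Inverse: turn a metric step (Octane/LightWave style) into a rate."""
    return voxel_size_m / step_m

# To match a Houdini render sampling every second voxel (rate 0.5),
# a meters-based renderer would need a step length of:
print(houdini_rate_to_meters(0.5))  # 8.0 (meters)
```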

SILENCER

Sounds great, Matt.
It'll be interesting to see the TG clouds out of their comfort zone for a little cardio.


Matt

Mike and Alex, I look forward to seeing your tests too. Thanks for the info about step lengths, Mike. It sounds like there's a lot to learn here.

Matt
Just because milk is white doesn't mean that clouds are made of milk.

Matt

Quote from: mdharrington on June 13, 2018, 05:48:10 PM
If TG can have an adaptable control (or equivalent) over the ray step length through the volume, then, all things being equal, the density should appear the same in both TG renders and external ones.

TG renders more detail than will be exported in the VDB, because its density functions are evaluated on the fly. So if TG had a step length property it would be measured in metres in world space rather than in voxels. However, it isn't an exposed parameter because when TG renders it changes step length dynamically to optimise quality, with "ray march quality" being a master control.

The way TG thinks about rendering volumes, the optical density of a voxel is measured in m^-1 in world space. As long as the ray march quality is not too low to miss important details, ray march quality shouldn't have much effect on mean density. This density will be written directly into each voxel. If some external renderer expects an opacity value rather than density, I'll add an option for that, taking into consideration the expected world size when converting from density to opacity. If some other renderer works with density but has odd behaviour when the voxel sizes are different (which they will be in a multi-resolution VDB) then I'll have to adjust for that when exporting.
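
To make the density-to-opacity option concrete, here's a minimal sketch of the conversion I have in mind. It's just standard Beer-Lambert attenuation; the path length is a stand-in for whatever world-space distance the target renderer assumes:

```python
import math

def density_to_opacity(density_per_m: float, path_length_m: float) -> float:
    """Beer-Lambert: fraction of light attenuated over path_length_m
    through a medium with optical density in m^-1."""
    return 1.0 - math.exp(-density_per_m * path_length_m)

# Example: a density of 0.05 m^-1 seen across a 20 m voxel
print(density_to_opacity(0.05, 20.0))  # ~0.632
```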
Just because milk is white doesn't mean that clouds are made of milk.

Matt

I don't anticipate changing TG's render output to match an external renderer (unless we discover something wrong or significantly lacking compared to another renderer), but I do want to try to get the external renderers to match TG on shadow opacity, if not other measures. To test for equivalent volume density we'll render a pure black cloud casting a shadow onto a white surface (pure black to eliminate any scattering/GI from the cloud onto the surface). It should be possible to make these black shadow tests look almost the same in multiple packages.

After that, we'll add some scattering (albedo) to the cloud, and this is where it gets complicated. With a non-black cloud the shadows on the ground will receive some light scattered through the cloud, and that's where I expect the shadows to look a little different between renderers. Some unbiased renderers might do this better than TG.

And then finally we'll look at the light reflected from the cloud into the camera, which is where most volumetric renderers struggle to create realistic clouds. It's going to be really interesting to see how this looks in different renderers, and I don't expect to make them look similar. The basic measure of equivalent volume density will come from the black shadow tests.
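
As a rough sketch of how the black shadow tests could be quantified (my own back-of-envelope, assuming simple Beer-Lambert transmittance and linear pixel values): read the shadowed and unshadowed ground values from each render, solve for the effective optical depth, and compare that number between renderers.

```python
import math

def optical_depth_from_shadow(shadow_px: float, open_px: float) -> float:
    """Infer effective optical depth tau from a black-cloud shadow test.
    Transmittance T = shadowed / unshadowed ground, and T = exp(-tau)."""
    return -math.log(shadow_px / open_px)

# If the shadowed ground reads 0.135 and the open ground 1.0 (linear),
# the effective optical depth along the light direction is about 2.0:
print(optical_depth_from_shadow(0.135, 1.0))
```

Two renderers whose black shadow tests give the same tau for the same VDB are, by this measure, interpreting the density equivalently.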

Matt
Just because milk is white doesn't mean that clouds are made of milk.