PBR Displacement Advice

Started by WAS, July 04, 2019, 03:44:09 AM

WAS

I just wanted to share some advice for those of us making PBR materials, or downloading them. I've found that approximated PBR materials (especially free ones) often have a displacement map that doesn't have appropriate detail for its texture.

When making your material displacement you need to think somewhat like Terragen does about displacement. We'd start with our base, smoother geometry, then our geometry's basic definition, and then finally the fine, low-level surface detail. How you build each displacement map really depends on the base texture itself.

Below is a PBR I found a long time ago. I always liked the texture, but I figured that at only 1600x1600 it was just too small for use in Terragen. It doesn't look right at all, and the shapes are so smooth. I thought it was because of the texture it was based on, because of shadows and highlights, like Ulco mentioned before.

[attachimg=1]

However, below I have generated 4 new displacement maps and combined them according to intensity: the base geometry (large shapes), a secondary base geometry (medium shapes), then geometry definition (defining the shape detail of the geometry), and then finally the surface detail displacement map.

[attachimg=2]

Even at 1600x1600 the results look pretty darn good in Terragen (I mean, seriously), though mind you, I optimized for Terragen, including lowering the finalized displacement intensity to fit the default shader's 0.1 displacement.
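A minimal sketch of that kind of intensity-weighted mix (not the exact tool workflow; the maps and weights here are illustrative stand-ins, assuming grayscale float arrays in [0, 1]):

```python
import numpy as np

def mix_displacement(maps, weights):
    """Blend several grayscale displacement maps (float arrays in [0, 1])
    by intensity weight, then renormalize the result back to [0, 1]."""
    mixed = np.zeros_like(maps[0], dtype=np.float64)
    for m, w in zip(maps, weights):
        mixed += w * m
    mixed -= mixed.min()
    peak = mixed.max()
    if peak > 0:
        mixed /= peak
    return mixed

# Illustrative stand-in maps: large shapes dominate, fine detail stays subtle.
rng = np.random.default_rng(0)
base    = rng.random((64, 64))   # base geometry (large shapes)
medium  = rng.random((64, 64))   # secondary base geometry (medium shapes)
defn    = rng.random((64, 64))   # geometry definition
surface = rng.random((64, 64))   # fine surface detail

result = mix_displacement([base, medium, defn, surface],
                          [0.6, 0.25, 0.1, 0.05])
```

The normalized map can then be scaled down in the shader (e.g. to the default 0.1 displacement) rather than baking a large intensity into the image itself.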

Here is a comparison of the two displacement maps, the second being the one I newly generated.

[attachimg=3]

I hope this helps some people when making their new approximated PBRs for TG or otherwise. Feel free to ask any questions; it's late, but I remembered I told Bobby I'd make a post about this.

Oshyan

How did you generate the new maps?

- Oshyan

RichTwo

PBR?    I don't even know what that refers to - Pabst Blue Ribbon?  Thus spoke the Terragen Village Idiot...
They're all wasted!

Oshyan

No, no Rich, Professional Bull Riders, jeez! ;)

- Oshyan

archonforest

Dell T5500 with Dual Hexa Xeon CPU 3Ghz, 32Gb ram, GTX 1080
Amiga 1200 8Mb ram, 8Gb ssd

Matt

#5
Thank you Rich for giving me an excuse to say what I think Physically Based Rendering is all about. In my opinion there are two separate things which both call themselves PBR.

https://en.wikipedia.org/wiki/Physically_based_rendering

Historically, "physically based rendering" was about rendering techniques. Path tracing is an example of a physically based rendering technique.

More recently, game engines made an important shift to using the same kinds of shaders and textures that had previously only been possible in non-real-time photorealistic renderers. Some people who spearheaded this in the real-time world were calling this Physically Based Shading (PBS) because they recognized that they were not using true physically based rendering (although some game engines are converging on that goal now). Most people now call this PBR even if it only applies to the shaders/textures.

Bringing the game engine world into using the same kinds of shaders and textures as other renderers has made it much easier to share assets between real-time and non-real-time engines. The whole computer graphics world has benefited from this and there has been an explosion in the number of available assets and the tools to make them and work with them.

This has all fallen under the banner of "PBR".

And that's where we are today, with the term PBR applying to certain rendering methods (e.g. path tracing) but also some loosely-defined conventions about what PBR shaders and textures are. But there are some themes that are present in most PBR assets. Most importantly, they should react realistically to many different lighting conditions. When it comes to textures, you will usually see texture maps for roughness (or gloss), albedo (or base color), and either a reflection, specular or metalness map. Different engines/shaders prefer different combinations of these maps. You'll also see displacement maps, normal maps and bump maps. But these things were around long before we called them PBR, and they really shouldn't be called that, in my opinion.

It's ironic, but things like AO maps are actually the *opposite* of PBR, yet they have become associated with the term because they are necessary for game engines to make some assets look good, so they are bundled with assets.
Just because milk is white doesn't mean that clouds are made of milk.

WAS

#6


Quote from: Oshyan on July 04, 2019, 02:14:09 PM
How did you generate the new maps?

- Oshyan

Honestly, I had planned to use Materialize and PS, but I forgot I bought CrazyBump and should probably get my money's worth back, as I remember I made a big stink about needing to purchase it a while back.  :-X

Thanks for the comments. I also learned a bit about delighting, and about repairing a delighted image (most processes create artifacts, and levels 1 (white) and 0 (black) in their absolute forms aren't delighted properly in all the programs I've tried).

So I just went and grabbed a "bad" texture, one you probably wouldn't want to use, and then got to work. Unfortunately, I cannot share it due to the website's policy for their "free" textures. In fact, I can't even link to the texture (no joke), but I can link to the main website (yeah, really, by their own technicality): https://www.highqualitytextures.com

But it goes to show what can be accomplished with 1k-2k images, not even upscaled besides being transformed to a 1:1 ratio for seamless texturing. This uses 3 displacement maps mixed together: your large geometry (the lifts), the lifts' definition (medium shapes), and then the surface detail of the stone itself (fine, sharp detail).

The texture itself is truly 1k, but has been made 2k by conforming to 1:1 aspect ratio. Additionally, I of course performed some quality magic. :P

Quote from: Matt on July 04, 2019, 06:24:02 PM
Thank you Rich for giving me an excuse to say what I think Physically Based Rendering is all about. In my opinion there are two separate things which both call themselves PBR.

https://en.wikipedia.org/wiki/Physically_based_rendering

Historically, "physically based rendering" was about rendering techniques. Path tracing is an example of a physically based rendering technique.

More recently, game engines made an important shift to using the same kinds of shaders and textures that had previously only been possible in non-real-time photorealistic renderers. Some people who spearheaded this in the real-time world were calling this Physically Based Shading (PBS) because they recognized that they were not using true physically based rendering (although some game engines are converging on that goal now). Most people now call this PBR even if it only applies to the shaders/textures.

Bringing the game engine world into using the same kinds of shaders and textures as other renderers has made it much easier to share assets between real-time and non-real-time engines. The whole computer graphics world has benefited from this and there has been an explosion in the number of available assets and the tools to make them and work with them.

This has all fallen under the banner of "PBR".

And that's where we are today, with the term PBR applying to certain rendering methods (e.g. path tracing) but also some loosely-defined conventions about what PBR shaders and textures are. But there are some themes that are present in most PBR assets. Most importantly, they should react realistically to many different lighting conditions. When it comes to textures, you will usually see texture maps for roughness (or gloss), albedo (or base color), and either a reflection, specular or metalness map. Different engines/shaders prefer different combinations of these maps. You'll also see displacement maps, normal maps and bump maps. But these things were around long before we called them PBR, and they really shouldn't be called that, in my opinion.

It's ironic, but things like AO maps are actually the *opposite* of PBR, yet they have become associated with the term because they are necessary for game engines to make some assets look good, so they are bundled with assets.

I believe the full term is technically "Physically Based Renderer Materials". Historically, the sections of websites referred to them as "PBR Materials", and the materials themselves were designed off of real-world values and textures. It seems in the last few years "Material" has been dropped altogether instead of PBRM being used or something. Then we have "Photogrammetry PBR", which I think would be more appropriately called PBM or TOFBM; respectively, Photogrammetry Based Material and Time-of-Flight Based Material.

I am actually thinking of picking up a Huawei P30 phone because it has a 5-meter ToF sensor that works with several AR/photogrammetry apps. My LG G8 has a ToF sensor, but it's much smaller, has maybe a quarter-meter full-resolution distance, and has no API hook.

Matt

Quote from: WASasquatch on July 04, 2019, 06:26:38 PM
I believe the full term is technically "Physically Based Renderer Materials". Historically, the sections of websites referred to them as "PBR Materials", and the materials themselves were designed off of real-world values and textures. It seems in the last few years "Material" has been dropped altogether instead of PBRM being used or something. Then we have "Photogrammetry PBR", which I think would be more appropriately called PBM or TOFBM; respectively, Photogrammetry Based Material and Time-of-Flight Based Material.

Yeah, there's nothing quite like a catchy TLA for everyone to jump onto!
Just because milk is white doesn't mean that clouds are made of milk.

WAS

#8
Quote from: Matt on July 04, 2019, 06:39:41 PM
Quote from: WASasquatch on July 04, 2019, 06:26:38 PM
I believe the full term is technically "Physically Based Renderer Materials". Historically, the sections of websites referred to them as "PBR Materials", and the materials themselves were designed off of real-world values and textures. It seems in the last few years "Material" has been dropped altogether instead of PBRM being used or something. Then we have "Photogrammetry PBR", which I think would be more appropriately called PBM or TOFBM; respectively, Photogrammetry Based Material and Time-of-Flight Based Material.

Yeah, there's nothing quite like a catchy TLA for everyone to jump onto!

Seriously. I still end up calling them just "textures" by mouth, which confuses my friends when I'm working with all the other maps. Lol

I just wanted to fiddle around with some method to create nice displacement from approximation. After reading some documentation on Google Scholar, I found most people aren't even approaching it right -- which is why programs like CrazyBump have "Displacement Mixers", though I'll be honest, I use Photoshop, as layer settings such as "Overlay" come in handy.
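For reference, the Photoshop-style "Overlay" blend mentioned here can be sketched in a few lines. This is my own approximation of the standard formula, not any tool's actual code; arrays are assumed to be floats in [0, 1]:

```python
import numpy as np

def overlay_blend(base, blend):
    """Photoshop-style Overlay: multiply where the base is dark,
    screen where it is light. Inputs are float arrays in [0, 1]."""
    return np.where(base < 0.5,
                    2.0 * base * blend,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))

coarse = np.full((4, 4), 0.3)   # stand-in coarse displacement layer
detail = np.full((4, 4), 0.5)   # a 50% gray detail layer is neutral
out = overlay_blend(coarse, detail)
```

A handy property for layering detail: where the detail layer is exactly 50% gray, Overlay leaves the base map unchanged, so deviations from mid-gray push the displacement up or down.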

Most of the good "PBR" materials you see these days are generated in Substance Designer, which is why they are so crisp and clean. I find that annoying when I see "Physical" involved, as it's all fractal-based generation.

Oshyan

I'm still confused about your workflow though. If that's the source of textures you used, all the ones I can see are just a photo of a real rock or something. So you'd have to derive the bump/displacement maps, correct? If that's the case, then the issue may really lie in how the software generates maps and what it expects apps to do with them. As you've found, even somewhat lower-resolution textures can give a decent result; it's partly a matter of the amount of detail and contrast encoded in the bump/displacement map.

In the case of downloaded textures that do include bump/displacement already, sometimes they will include both a Displacement Map *and* a Bump or Normal Map, the intention being that the larger-scale displacement is separated from the finer detail in the Bump/Normal Map. This is not how you want to do it in Terragen, and so those kinds of setups can be problematic when trying to bring that straight in. As I *think* you have found, you want to find a way to combine those multiple surface shape (rather than color or reflectivity) textures into one.

- Oshyan

WAS

#10
Quote from: Oshyan on July 04, 2019, 06:58:06 PM
I'm still confused about your workflow though. If that's the source of textures you used, all the ones I can see are just a photo of a real rock or something. So you'd have to derive the bump/displacement maps, correct? If that's the case, then the issue may really lie in how the software generates maps and what it expects apps to do with them. As you've found, even somewhat lower-resolution textures can give a decent result; it's partly a matter of the amount of detail and contrast encoded in the bump/displacement map.

In the case of downloaded textures that do include bump/displacement already, sometimes they will include both a Displacement Map *and* a Bump or Normal Map, the intention being that the larger-scale displacement is separated from the finer detail in the Bump/Normal Map. This is not how you want to do it in Terragen, and so those kinds of setups can be problematic when trying to bring that straight in. As I *think* you have found, you want to find a way to combine those multiple surface shape (rather than color or reflectivity) textures into one.

- Oshyan

Oh no, the first image is an old free PBR I got; I'm actually not sure where it's from, but it's clearly approximated (from a bitmap/PNG/etc.). The second image is my attempt at using any old image that has bad lighting.

And yes, you are correct, normal maps are used for fine-level displacement; however, that isn't a feature of Terragen for some reason. This is about optimizing for Terragen, which is perfectly capable of reading low-level displacement in a file; it just needs to be mixed in accordingly.

This is how new PBRs are made in programs like Substance Designer: all your layers and masked fractals are baked into the displacement map. The normal map is pretty much redundant legacy support and really just "amplifies" the surface detail already there (game engines heavily rely on parallax now).

I honestly don't know what you're talking about; we all know detail and contrast make your displacement, and all these applications, including PS, are capable of any level of detail from diffuse. It's about how you mix different levels of displacement to create an appropriate vanilla map to use at a standard displacement value or thereabouts. That noted, with a lot of free materials that are approximated (materials usually note this, like on CC0 Textures), there isn't much time spent here, or the makers aren't aware of this workflow. So you often have one displacement map that was exported based on one level of detail/contrast.

Matt

Aside from why most displacement maps suck, I think Oshyan was just asking how you made the extra detail. I'm curious too.
Just because milk is white doesn't mean that clouds are made of milk.

Oshyan

Yes. A basic summary of my thought/point is this: the original displacement map shown above seems to simply be low detail and/or low contrast. The image resolution does not appear to be the main problem or limitation. Thus it makes sense to generate more data in some way, sure. BUT if you start with a high-detail displacement map in the first place, it doesn't seem like there needs to be any special workflow. So I'm just trying to understand what the intention of this advice is here, particularly the "think like Terragen" bit. This just seems like low-detail displacement maps, nothing specific to do with Terragen *as far as I can see*. But maybe I'm missing something.

- Oshyan

Dune

I think he added some detail based on the level of whiteness of the original 'soft' image in PS. Maybe a random noise, or same texture shifted, tiled and reduced, or something.
If so, you can do the same in TG; just add some noise based on the color values of a low quality image. Or a 'low quality' noise, for that matter (like wind wave ripples, based on the upper whites of a color driven wave displacement, resulting in smooth windstill wave valleys, for instance).

WAS

#14
Quote from: Dune on July 05, 2019, 01:40:11 AM
I think he added some detail based on the level of whiteness of the original 'soft' image in PS. Maybe a random noise, or same texture shifted, tiled and reduced, or something.
If so, you can do the same in TG; just add some noise based on the color values of a low quality image. Or a 'low quality' noise, for that matter (like wind wave ripples, based on the upper whites of a color driven wave displacement, resulting in smooth windstill wave valleys, for instance).

Nope. All approximated from diffuse. What I do in PS is adjust the maximum whiteness, since CrazyBump, which I'm currently using, only has overall intensity; you cannot adjust levels by channel (white/black). Other than that, I just repair the diffuse after de-shadowing/de-lighting (blur/heal hard single pixels that are white/black).
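Adjusting the maximum whiteness separately can be sketched as a simple Levels-style remap. This is my own illustration of the idea, not CrazyBump's or Photoshop's actual code; arrays are assumed to be floats in [0, 1]:

```python
import numpy as np

def levels(img, black=0.0, white=1.0):
    """Photoshop-style Levels remap: values at or below `black` map to 0,
    values at or above `white` map to 1, linear in between."""
    return np.clip((img - black) / (white - black), 0.0, 1.0)

disp = np.linspace(0.0, 1.0, 5)    # 0.0, 0.25, 0.5, 0.75, 1.0
capped = levels(disp, white=0.8)   # pull the white point down to 0.8
```

Pulling the white point down pushes the brighter mid-tones to full white, which is one way to strengthen a displacement map's peaks without touching its blacks.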

Quote from: Oshyan on July 05, 2019, 12:23:12 AM
Yes. A basic summary of my thought/point is this: the original displacement map shown above seems to simply be low detail and/or low contrast. The image resolution does not appear to be the main problem or limitation. Thus it makes sense to generate more data in some way, sure. BUT if you start with a high detail displacement map in the first place, it doesn't seem like there needs to be any special workflow. So I'm just trying to understand what the intention of this advise is here. Particularly the "think like Terragen" bit. This just seems like low detail displacement maps, nothing specific to do with Terragen *as far as I can see*. But maybe I'm missing something.

- Oshyan

I'm not sure what programs you use, but of the 3 free ones I've tried, they only ever achieve one level of detail intensity based on the original image. So a highly detailed displacement map alone would be nothing but jagged peaks in Terragen. This is why these same programs have built-in displacement map mixers, as I mentioned before, to do what I am explaining here, which seems to somehow be getting lost in translation. Yes, you can get more detail into one displacement map, but that's by shifting texture influence and detail, which tightens slopes the more you go -- so of course, why not mix different levels for more realism, closer to what the texture is based on?

To try to illustrate what I am doing, here are 3 displacement maps, before mixing or Photoshop leveling, from a random image, that I would mix together to form a map for Terragen...

Image: https://pixnio.com/textures-and-patterns/rock-stone-texture/rock-texture-land

[attach=1] [attach=2] [attach=3]

The first map may be duplicated and leveled multiple times and mixed depending on the lift definition you want....

[attach=4] [attach=5]
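That duplicate-and-relevel step can be sketched as producing a few intensity variants of one map. Here I use gamma curves as a simple stand-in for repeated Levels passes (my own illustration; arrays assumed to be floats in [0, 1]):

```python
import numpy as np

def relevel_variants(base, gammas=(0.5, 1.0, 2.0)):
    """Return several intensity variants of one displacement map by
    applying different gamma curves (soft, unchanged, hard)."""
    return [np.power(base, g) for g in gammas]

base = np.linspace(0.0, 1.0, 5)
soft, mid, hard = relevel_variants(base)   # mix these by weight afterwards
```

Gamma below 1 lifts the mid-tones (broader, softer lifts); gamma above 1 crushes them toward black (sharper, narrower peaks), which is roughly what repeated leveling of the duplicated map achieves.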

And finally mix them into something...

[attach=6]

(mind you, I didn't delight, so there are holes from the shadows)

[attach=7]