Quote from: Tangled-Universe on August 23, 2019, 02:56:21 PM
"Can you tell how you came up with 0.3?"
I first added adaptive sampling in Terragen 2.1. The original algorithm simply looked at 4 samples, calculated their average, then took the differences between each sample and that average and added them up to produce a total contrast value. It's a bit more complex than that, but I don't think the details are important here. What I've described is what happens when the samples are 1 pixel apart, which is what we usually think of when we're looking at pixel noise. At subpixel levels the threshold is adjusted by another factor.
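To make the idea concrete, here's a rough sketch of that contrast test. This is not Terragen's actual code; the function name and the sample values are just my assumptions for illustration:

```python
def needs_more_samples(samples, threshold):
    """Return True if the contrast among the samples exceeds the threshold.

    Illustrative only: compute the average, sum the absolute differences
    of each sample from that average, and compare against the threshold.
    """
    mean = sum(samples) / len(samples)
    contrast = sum(abs(s - mean) for s in samples)
    return contrast > threshold

# A flat region passes; a noisy region trips the test and gets refined.
flat = [0.50, 0.51, 0.50, 0.49]    # contrast = 0.02
noisy = [0.10, 0.90, 0.20, 0.85]   # contrast = 1.45
```

So a smaller threshold means more regions trip the test and receive extra samples.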
Starting from that basic algorithm, I tried it on lots of different scenes and decided what range of values I thought made a good default. As you know, it's sometimes difficult to predict whether it's better to increase AA, decrease PNT, or do both. The default value is based on the results I got at the time. I don't remember the full range of defaults I was considering, but I homed in on 0.3/AA. It could have been 0.25, but 0.3/AA produces nicer-looking numbers than 0.25/AA for most of the AA values between 2 and 6. (You can see for yourself by calculating 0.25/AA for different AA values.) I couldn't decide between 0.25 and 0.3 on any other basis, so I chose 0.3 for this aesthetic reason.
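You can see the "nicer numbers" point with a quick calculation comparing the two candidate defaults:

```python
# Compare the shipped default (0.3/AA) with the rejected candidate (0.25/AA)
# for the common AA values mentioned above.
for aa in range(2, 7):
    print(f"AA {aa}: 0.3/AA = {round(0.3 / aa, 4)}, 0.25/AA = {round(0.25 / aa, 4)}")
```

0.3/AA gives 0.15, 0.1, 0.075, 0.06, 0.05, while 0.25/AA gives 0.125, 0.0833..., 0.0625, 0.05, 0.0417... for the same AA values.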
Decisions made at the algorithm level would have changed these numbers. For example, if I had looked at 9 samples instead of 4, the default PNT would be a larger number. On the other hand, if I'd used 9 samples I might have decided to use the mean of the differences rather than the sum, which would make the equivalent PNT values smaller.
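The sum-versus-mean point is just a rescaling: for N samples the sum of the differences is exactly N times their mean, so a threshold tuned for one metric maps to the other by a factor of N. A tiny sketch (again, illustrative, not Terragen's code):

```python
def contrast_sum(samples):
    """Total contrast: sum of absolute differences from the average."""
    mean = sum(samples) / len(samples)
    return sum(abs(s - mean) for s in samples)

def contrast_mean(samples):
    """Mean contrast: the same quantity divided by the sample count."""
    return contrast_sum(samples) / len(samples)

# A threshold tuned for the sum metric over 9 samples would need to be
# divided by 9 to behave the same way under the mean metric.
nine = [0.1, 0.9, 0.2, 0.8, 0.3, 0.7, 0.4, 0.6, 0.5]
```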
In Terragen 4.3 I added "Robust adaptive sampling" and improved upon it for 4.4. I wanted to keep the default values the same to reduce confusion and to minimize the number of things you have to change when switching between legacy and robust adaptive sampling. But there's a problem: robust adaptive sampling is a very different algorithm, and the old thresholds don't apply directly.

To solve this, there is an internal mapping from PNT to the specific threshold that the robust sampler uses. That mapping sometimes has to change as I improve the algorithm, and it's very specific to the algorithm. I had to calibrate it based on lots of tests, with two major goals: the robust sampler should produce visibly better images than the legacy sampler for any given PNT (within reasonable values), and it should render the image faster. It doesn't always meet both of these goals, but they help me to constrain the range of possible ways to map PNT to the internal threshold used by the robust sampler. This might be tweaked again in future when improvements are made to the robust sampler.
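The real mapping is internal to Terragen and gets recalibrated, so I can only sketch the shape of the interface: one user-facing PNT, translated into whatever threshold the active sampler actually compares against. The function name and constants below are invented purely for illustration:

```python
def robust_threshold(pnt):
    """Hypothetical mapping from the user-facing Pixel Noise Threshold
    to the internal threshold used by the robust sampler.

    The constants here are made up; the real curve was calibrated over
    many test renders so that, at the same PNT, the robust sampler
    produces visibly better images than the legacy sampler and renders
    faster. Whatever its exact shape, a lower PNT should still map to a
    stricter (lower) internal threshold.
    """
    return 0.5 * pnt ** 1.2  # invented constants, illustration only
```

The point is just that the user-visible PNT scale stays stable across samplers while the internals are free to change underneath it.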