Camera Zoom Formula and Aperture

Started by WAS, April 18, 2022, 07:22:54 PM


WAS

So, you may have noticed my topic about creating an importer for FBX cameras into Terragen. I've hit numerous roadblocks and figured out many of them, but one I can't figure out is zoom and aperture. FBX cameras in Three.js don't seem to have an aperture, so I don't know how to derive a value to set in TG.

My main issue, though, is zoom. In an FBX file it's just a factor, default 1, which is then obviously applied to FOV and focal length. How can I apply the zoom factor to these values? It appears Terragen does this internally, so there's no clue to be found in the shader XML.

Additionally, FBX cameras in Three.js only seem to use vertical FOV. Nothing about horizontal FOV in the docs that I can find. :\

You can see the object values I have to work with by uploading an FBX with cameras in it (one is attached) and opening the developer tools console. Click convert and scroll to the top; the first array is the data from the FBX camera.

https://nwdagroup.com/terragen-tools/fbxcam2terragen/

pokoy

#1
The missing FOV angle can be derived from the image aspect ratio. So if the image aspect is 3:2, a vertical FOV of 25° gives roughly 25° x 3/2 = 37.5° for the horizontal FOV (the linear scaling is an approximation; the exact relation scales the tangent of the half-angle, as in the sketch below).

Zoom factor:
Focal length x zoom factor = effective focal length
FOV / zoom factor = effective FOV

FOV is calculated from focal length and aperture size (one dimension is enough, as the other is derived from the image aspect as noted above). So in the simple case of 3D cameras, for any given camera all you really need to know is the FOV, even if you don't know the focal length and aperture size (plus position and vector/target position, of course).
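
To put that into code, here's a minimal JavaScript sketch of those relations (the function names are just illustrative, not anything from FBXLoader or the TG shader; lengths in mm, angles in degrees):

// Minimal sketch of the relations above.
const DEG = 180 / Math.PI;

// FOV from focal length and one film dimension (pinhole model)
function fovFromFocal(filmSizeMm, focalLengthMm) {
    return 2 * Math.atan(filmSizeMm / (2 * focalLengthMm)) * DEG;
}

// Zoom factor scales the effective focal length, which narrows the FOV
function effectiveFocal(focalLengthMm, zoomFactor) {
    return focalLengthMm * zoomFactor;
}

// Exact horizontal <-> vertical conversion goes through the half-angle tangents
function horizontalFromVertical(vFovDeg, imageAspect) {
    return 2 * Math.atan(Math.tan(vFovDeg / (2 * DEG)) * imageAspect) * DEG;
}

// Example: a 50mm lens on a 36mm wide film back gives fovFromFocal(36, 50) ≈ 39.6 degrees,
// and with zoom = 2: fovFromFocal(36, effectiveFocal(50, 2)) ≈ 20.4 degrees.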

pokoy

Also, looking at the FBX data you posted in the other thread it looks like FBX assumes a horizontal film width of 35mm while TG uses 36mm.

WAS

Thanks for that. I can apply those calculations for zoom at least. I believe there are width/height attributes, though I wasn't sure what they were for; obviously not the resolution found in the renderer?

I dug into the FBX loader and forced it to spit out a list of raw attributes, to see what's there that the loader maybe isn't using (limitations of web-based 3D stuff), and I did find an ApertureMode. It has 3 types, and the FBX files I import are on type 3, which uses focal length as the aperture diameter. So that should be easy to port. But the other two modes require calculations.

See here under Aperture: https://download.autodesk.com/us/fbx/sdkdocs/fbx_sdk_help/files/fbxsdkref/class_k_fbx_camera.html

As far as 35 vs 36mm goes, that is to be expected, and may be wildly different between exports, like a 60mm camera.

pokoy

Didn't know FBX has specs for all these... I'd go with FOV in all cases, as that's the final result of all the possible combinations of variables.
I'm a bit puzzled by aperture - it's used for depth of field only, and I'm pretty sure most 3D apps don't really care about it unless you specify these values in physically modeled cameras that actually support advanced real-world parameters like aperture, film offset, etc. I guess Maya exports most of it, but Max... not so sure.

There's the phenomenon of 'lens breathing', where different aperture numbers affect focal length (and subsequently FOV) differently - for example, the physical camera in Max has this built in. But it's really arbitrary, since the maximum strength of the effect was set by some developer who 'thought' it was the right amount... If someone uses it, you wouldn't be able to get a perfect match in TG. Same for distortion, film tilt/shift...

WAS

#5
I do notice discrepancies between the raw camera attributes and what Three.js uses. For example, the aspect ratio is 1.5, but it uses a "SafeAspectRatio" of 1.7777777777777777.

As for aperture, I'm just puzzled by what they're doing computationally for the first two modes. The third mode just uses focal length as the aperture, which is fine, easy enough to do. Blender seems to export using mode 3, but I'm not sure about other software. I could just default to focal length and forget the computations, but that may create issues with someone's camera literally using mode 1 or 2 and liking their DOF results. I know the first mode uses the near and far planes in its calculations, but I'm not sure how.

As for people using different settings: FBX writes unitless values, so anything special they're doing wouldn't be in the FBX; that's done by the program using the raw data. This is why most FBX files are unitless and don't declare a scale factor on units, because most programs handle it themselves, like I do for TG by dividing positions by 100 to get meters.
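
The unit conversion itself is trivial - roughly this, assuming the FBX positions are authored in centimetres (which is what dividing by 100 implies):

// Convert an FBX position (assumed to be in centimetres) to Terragen meters.
function fbxPositionToMeters(position) {
    return position.map(component => component / 100);
}
// e.g. fbxPositionToMeters([735.889, 495.831, 692.579]) -> [7.35889, 4.95831, 6.92579]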

Edit: Oh, there is horizontal FOV stuff in the attributes, so either Three.js defaults to vertical FOV, or Blender does.

WAS

#6
So... I said screw it, modified FBXLoader.js, and created a global variable, cameraAttributesArray[]. It gets loaded with the raw data from the FBX when an FBX is loaded, so I don't have to rely on just what FBXLoader itself exposes.

If you load an FBX now, it will have an array filled with camera attributes. This contains more data than I thought, such as the resolution set in Blender (1920x1080) as AspectWidth and AspectHeight.

This project is starting to get a bit overwhelming, lol. It seems that if I wanted the "best" export experience, I'd probably need to create the render nodes as well, to at least keep to their resolution/aspect settings.

Here is an example of all this extra data:
Camera: Object { singleProperty: false, Position: {...}, UpVector: {...}, ... }
ApertureMode: { type: "enum", value: 3 }
AspectHeight: { type: "double", type2: "Number", value: 1080 }
AspectRatioMode: { type: "enum", value: 2 }
AspectWidth: { type: "double", type2: "Number", value: 1920 }
AudioColor: { 0: { singleProperty: false, id: 0, attrName: 1, ... } }
BackPlaneDistance: { type: "double", type2: "Number", flag: "A", value: 10000 }
BackgroundColor: { type: "Color", flag: "A", value: [0, 0, 0] }
CameraOrthoZoom: 1
DisplayTurnTableIcon: { type: "bool", value: 1 }
FarPlane: { type: "double", type2: "Number", value: 10000 }
FieldOfView: { type: "FieldOfView", flag: "A", value: 39.597755335771296 }
FieldOfViewX: { type: "FieldOfViewX", flag: "A", value: 39.597755335771296 }
FieldOfViewY: { type: "FieldOfViewY", flag: "A", value: 26.991466429975517 }
FilmAspectRatio: { type: "double", type2: "Number", value: 1.5 }
FilmHeight: { type: "double", type2: "Number", value: 0.9448818897637795 }
FilmOffsetX: { type: "double", type2: "Number", flag: "A", value: 0 }
FilmOffsetY: { type: "double", type2: "Number", flag: "A", value: 0 }
FilmWidth: { type: "double", type2: "Number", value: 1.4173228346456692 }
FocalLength: { type: "double", type2: "Number", flag: "A", value: 50 }
GateFit: { type: "enum", value: 2 }
GeometryVersion: 124
InterestPosition: { type: "Vector", flag: "A", value: [735.2032470703125, 495.8309326171875, 693.3067626953125] }
LookAt: { "-0.685920774936676": { singleProperty: false, id: -0.685920774936676, attrName: -6.866099511171342e-8, ... } }
NearPlane: { type: "double", type2: "Number", value: 10.000000149011612 }
OrthoZoom: { type: "double", type2: "Number", value: 7.314285755157471 }
Position: { type: "Vector", flag: "A", value: [735.88916015625, 495.8309326171875, 692.5791015625] }
SafeAreaAspectRatio: { type: "double", type2: "Number", value: 1.7777777777777777 }
ShowAudio: 0
ShowInfoOnMoving: 1
TypeFlags: "Camera"
Up: { "-0.32401350140571594": { singleProperty: false, id: -0.32401350140571594, attrName: 0.8953956365585327, ... } }
UpVector: { type: "Vector", flag: "A", value: [-0.32401350140571594, 0.8953956365585327, -0.3054208755493164] }
attrName: "Camera"
attrType: "Camera"
id: 549278231
name: "NodeAttribute"
propertyList: [549278231, "Camera", "Camera"]
singleProperty: false
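
Reading values back out of this structure is simple enough, since each attribute wraps its data in a .value field. Something like this (the helper is just illustrative, not part of FBXLoader):

// Pull a raw value out of one of the camera attribute objects shown above.
function getAttr(cameraAttributes, name, fallback) {
    const attr = cameraAttributes[name];
    return attr && attr.value !== undefined ? attr.value : fallback;
}

// Against the dumped data:
// getAttr(camera, "FocalLength", 50)   -> 50
// getAttr(camera, "FieldOfViewX", 0)   -> 39.597755335771296
// getAttr(camera, "AspectWidth", 1920) -> 1920
// getAttr(camera, "ApertureMode", 3)   -> 3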

WAS

#7
So... Blender says it's 36mm, but the values I get for horizontal/vertical don't match? Or do they, when in millimeters (the default)? I am confused.

These three values correspond to the settings you have in Blender, which are Auto (the standard FieldOfView), plus the ability to set it to Horizontal or Vertical.

Another issue is that there is no flag in the attributes, that I can see, for which mode the end user is using... so I guess I can just assume either horizontal or vertical. Probably horizontal.

But I don't know how these values relate to 36mm???

...
type: "FieldOfView"
type2: ""
value: 39.597755335771296
...
type: "FieldOfViewX"
type2: ""
value: 39.597755335771296
...
type: "FieldOfViewY"
type2: ""
value: 26.991466429975517



To be clear, I'm specifically trying to figure out how to set this in the TG shader. I still don't know how to derive these values from the data I have.

Terragen seems to calculate this itself, going by the clarification below about f-stop and focal length. Do I do focal length / FieldOfViewX (or Y if in that mode) to get the f-stop, and then use that on something else?
film_aperture_in_mm = "36 24"

WAS

#8
So, trying to figure out the vertical field of view. I am using this formula, but the result is not right. I think I'm mixing up values, I don't know. The camera only provides one FOV, but the attributes have more data; I'm just not sure what to use.

VFOV = ( 2 * arctan( tan( 22.275333337289535 / 2 ) * 1.7777777777777777 ) fbxcam2terragen.js:199:13
Effective Vertical FOV: -2.9810001665442383
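
Most likely that -2.98 comes from passing degrees straight into Math.tan/Math.atan, which work in radians (and the result comes back in radians too). The same conversion with explicit degree/radian handling would look something like this, as written it multiplies the half-angle tangent by the aspect, which converts a vertical FOV to a horizontal one (assuming the 22.275... input really is in degrees; names are just illustrative):

// Convert a vertical FOV (degrees) to horizontal FOV (degrees) via the half-angle tangent.
const toRad = deg => deg * Math.PI / 180;
const toDeg = rad => rad * 180 / Math.PI;

function verticalToHorizontalFov(vFovDeg, aspect) {
    return toDeg(2 * Math.atan(Math.tan(toRad(vFovDeg) / 2) * aspect));
}

// verticalToHorizontalFov(22.275333337289535, 1.7777777777777777) ≈ 38.6 degrees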


I have updated the script online at https://nwdagroup.com/terragen-tools/fbxcam2terragen/ When you click convert, it will show the camera attributes, camera node info, and the parsed shader (so far) in the console. It won't download the shader, as I haven't finished figuring out how to parse all the data.

pokoy

#9
I think it's simpler than that; maybe you are confused by different words for the same thing.

Let's go with this Wikipedia page, which assumes a pinhole camera (the one used in CG):
https://en.wikipedia.org/wiki/Angle_of_view

To get FOV_H with the given values:
Focal_Length
Film_Width

The formula for FOV_H would be:
FOV_H = 2 *  ARCTAN (Film_Width / (2 * Focal_Length))

If you want the zoom factor to be considered, all you need to do is divide FOV_H by the zoom factor:
FOV_H = 2 *  ARCTAN (Film_Width / (2 * Focal_Length)) / Zoom_Factor

Now that we have FOV_H, getting the vertical FOV is (approximately) a matter of dividing it by the image aspect:
Image_Aspect = Image Width / Image Height
FOV_V = FOV_H / Image_Aspect
(Strictly speaking it's the tangents of the half-angles that scale with the aspect, but for typical FOVs the difference is small.)

One important thing - according to this wiki page, it appears that the 35mm standard assumes a film/sensor size of 36mm x 24mm. Why it's called 35mm and not 36mm... I don't know; probably historical reasons, since the film stock was 35mm wide but the frame settled on 36mm x 24mm dimensions. So it's possible that the 35mm film width defined in FBX is actually 36mm, and no further adjustment needs to be made here.

Also, you don't really need to bother with vertical FOV anywhere, as it's derived from horizontal FOV depending on the image aspect. Horizontal FOV is given most of the time anyway, and it's enough to get everything that's needed.
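
As a sanity check against the attribute dump you posted, assuming the film back really is 36mm x 24mm and the focal length is the dumped 50mm:

FOV_H = 2 * ARCTAN(36 / (2 * 50)) = 2 * ARCTAN(0.36) ≈ 39.598° -> matches FieldOfViewX (39.597755...)
FOV_V = 2 * ARCTAN(24 / (2 * 50)) = 2 * ARCTAN(0.24) ≈ 26.991° -> matches FieldOfViewY (26.991466...)

So the dumped FieldOfView values line up with a 36 x 24 frame at 50mm, which supports reading the FBX film width as 36mm here.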


Aperture Diameter
This one's easier than I thought. According to this wiki page - https://en.wikipedia.org/wiki/F-number - the formula would be:
Aperture Diameter = Focal_Length / F_Number
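
For example, just plugging in typical numbers (not values from the FBX), a 50mm lens at f/2.8 gives:
Aperture Diameter = 50 / 2.8 ≈ 17.9mm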

But f-stop/f-number doesn't seem to be included in the FBX camera data from the other thread anyway.


For other parameters in the FBX camera data:
- 'Near' and 'Far' are probably just clipping distances, not related to any of the lens-defining variables
- 'Focus' would probably be Focus Distance in TG (or Target Distance in Max and maybe other apps)

Attaching a zip of a simple Excel sheet here with all of these formulas included. I hope it works correctly on your system - mine is set to use a comma as the decimal separator; hopefully it'll automatically use a decimal point on a US/English OS. The yellow fields are variables you can change; the white fields with bold text are calculated from the formulas mentioned above.

Hope this helps!