Virtual Light Meter?

Started by PabloMack, August 29, 2014, 09:23:38 AM


PabloMack

As Terragen is used more frequently to produce its amazing landscapes as backdrops for live-action productions, it is going to become increasingly important to be able to automatically adjust live studio lighting so that it closely matches the lighting in the surrounding CG. Though one can already export channel files of the light sources themselves, these, as everyone knows, only indirectly contribute to the ambient light at the virtual location(s) where the live actors are supposed to stand in the virtual landscape. A solution to this problem would be a virtual light meter: say, a sphere of specified diameter, placed into the TG environment wherever the virtual light is to be sampled. When exported as a channel file, it would contain both the color and the intensity of light passing through the "meter".

One option that might limit the amount of data produced, to avoid key-framing every frame, would be to write to the channel file only those frames where the light sources and the meter itself are key-framed at the "beginnings" and "ends" of their animation ramps. If the meter is moving, or something like passing clouds starts changing the lighting too frequently, the user might want the option to turn on reporting to the channel file for every frame. Still, it would be desirable to limit the data with an algorithm that does a best fit of a series of ramps, reducing the number of key frames going into the channel file. This best-fit analysis could also be done in the software that imports the data into the lighting controller system; in that case one would probably report lighting data for every frame and let the importer reduce it.
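For illustration, that best-fit pass could be as simple as a one-dimensional Ramer-Douglas-Peucker-style reduction over per-frame samples. A minimal sketch in C++ (all names here are hypothetical; none of this is Terragen code):

// Reduce per-frame light samples to a few key frames by recursively
// keeping only samples that deviate from a straight ramp by more than
// a tolerance (1-D Ramer-Douglas-Peucker). Hypothetical, not Terragen.
#include <cmath>
#include <cstddef>
#include <vector>

struct Sample { int frame; double value; }; // one channel, e.g. R of RGB

static void simplify(const std::vector<Sample>& s, std::size_t lo,
                     std::size_t hi, double tol, std::vector<std::size_t>& keep)
{
    // Find the sample between lo and hi farthest from the ramp lo->hi.
    double maxDev = 0.0;
    std::size_t maxIdx = lo;
    for (std::size_t i = lo + 1; i < hi; ++i) {
        double t = double(s[i].frame - s[lo].frame) /
                   double(s[hi].frame - s[lo].frame);
        double onRamp = s[lo].value + t * (s[hi].value - s[lo].value);
        double dev = std::fabs(s[i].value - onRamp);
        if (dev > maxDev) { maxDev = dev; maxIdx = i; }
    }
    if (maxDev > tol) {                  // ramp not good enough: split here
        simplify(s, lo, maxIdx, tol, keep);
        keep.push_back(maxIdx);
        simplify(s, maxIdx, hi, tol, keep);
    }
}

// Returns the indices of the samples worth key-framing.
std::vector<std::size_t> reduceToKeyFrames(const std::vector<Sample>& s,
                                           double tol)
{
    std::vector<std::size_t> keep;
    if (s.size() < 2) return keep;
    keep.push_back(0);
    simplify(s, 0, s.size() - 1, tol, keep);
    keep.push_back(s.size() - 1);
    return keep;
}

The tolerance controls how far the reconstructed ramps are allowed to drift from the true per-frame values; it would be run once per color channel.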

This metering could be done with a camera that is already a part of Terragen. However, cameras actually receive more light than they "see" because of light coming in from out of frame. But the FoV angles can be changed to obtain the desired effect. And with the new 360-degree panoramic camera, lighting can be sampled completely, from every direction, at any point in the scene. In this light (pun intended), the lighting values reported to the channel file could be obtained by simply averaging the camera's image.

If TG already has this capability then I would like to know how it is done; I have searched this forum for "light meter" and it brings up nothing. Such a device could be made from a camera in TG if there were a way to write a channel file from the average of its image, reduced to a single pixel value. In the Camera "Export" tab, alongside the existing "Export chan file" and "Export FBX file" options, an "Export Light Metering" option could be added, taken as the overall lighting as seen by the camera. With the panoramic camera now available, the overall environmental lighting coming from every direction could be sampled at any point in a virtual TG scene by placing the camera there. The real "gotcha" is that this information can't be obtained without rendering. And when the render is done to individual still frames, where can the data be stored pending export to a channel file?

It is my guess that this will become a feature request to the TG development team. As I am a developer and a C++ programmer, I could consider using the SDK for this purpose. However, I do not have, and have no plans to purchase, the somewhat pricey Microsoft C++ compiler. The toolchain I use is the most recent release of the Open Watcom compiler, which does not yet support the 64-bit environment. Purportedly that support is in the works, as is ARM7 as a target, but it will take some time to come out and there may be incompatibilities between the two toolchains.

Perhaps the best way to solve this problem is to just write a program that averages a series of BMP files. Now that I am thinking about it, I don't think it will be very difficult. The only problem is that I will have to render a sequence just to do the metering. But I can render only at the key frames, which will cut down on the amount of rendering as well as limit the amount of data in the lighting sequence.
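Here is roughly what such an averager could look like, assuming uncompressed 24-bit BMPs (that assumption, and all the names, are mine):

// Average every pixel of an uncompressed 24-bit BMP down to one RGB
// triad. Minimal error handling; assumes a little-endian BMP layout.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

static uint32_t rd32(const uint8_t* p)       // little-endian 32-bit field
{
    return p[0] | p[1] << 8 | uint32_t(p[2]) << 16 | uint32_t(p[3]) << 24;
}

bool averageBMP(const char* path, double rgb[3])
{
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return false;

    uint8_t hdr[54];                         // file header + BITMAPINFOHEADER
    if (std::fread(hdr, 1, 54, f) != 54) { std::fclose(f); return false; }

    uint32_t offset = rd32(hdr + 10);        // start of pixel data
    int32_t  width  = int32_t(rd32(hdr + 18));
    int32_t  height = int32_t(rd32(hdr + 22));
    uint16_t bpp    = uint16_t(hdr[28] | hdr[29] << 8);
    if (bpp != 24 || width <= 0) { std::fclose(f); return false; }
    if (height < 0) height = -height;        // top-down BMPs store it negative

    std::size_t rowBytes = std::size_t((width * 3 + 3) / 4) * 4; // rows 4-byte padded
    std::vector<uint8_t> row(rowBytes);
    double sum[3] = { 0, 0, 0 };

    std::fseek(f, long(offset), SEEK_SET);
    for (int32_t y = 0; y < height; ++y) {
        if (std::fread(row.data(), 1, rowBytes, f) != rowBytes) break;
        for (int32_t x = 0; x < width; ++x) { // pixels are stored B,G,R
            sum[2] += row[x * 3 + 0];
            sum[1] += row[x * 3 + 1];
            sum[0] += row[x * 3 + 2];
        }
    }
    std::fclose(f);

    double n = double(width) * double(height);
    for (int c = 0; c < 3; ++c) rgb[c] = sum[c] / n;  // averages, 0..255
    return true;
}

int main(int argc, char** argv)
{
    double rgb[3];
    if (argc < 2 || !averageBMP(argv[1], rgb)) return 1;
    std::printf("avg RGB = %.2f %.2f %.2f\n", rgb[0], rgb[1], rgb[2]);
    return 0;
}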

Matt

Why not use the spherical renders as IBLs? You don't just want to capture light intensity, you want direction information too. The spherical image captures this. If you want to average this and compress your data into a chan file, this could still be done using an external tool - it doesn't necessarily have to happen in the application. So your Watcom compiler might come in useful.

Matt
Just because milk is white doesn't mean that clouds are made of milk.

PabloMack

What is an IBL? I think this is where I am headed. Thanks.

bobbystahr

Quote from: PabloMack on August 29, 2014, 09:36:22 AM
What is an IBL? I think this is where I am headed. Thanks.

IBL = image-based lighting
something borrowed,
something Blue.
Ring out the Old.
Bring in the New
Bobby Stahr, Paracosmologist

goldfarb

you may also want to have a look at the lightGen plugin for HDRshop
http://gl.ict.usc.edu/HDRShop/lightgen/
I haven't used it in a long time so it may be a bit funky... and it's Windows only AFAIK...
it will take an HDR and export lights for a few 3D apps...

I've long thought that this could be used to determine real life light setups from CG environments...
--
Michael Goldfarb | Senior Technical Director | SideFX | Toronto | Canada

PabloMack

This is my plan. When I want to control the physical stage lights from a virtual scene in TG, I will set up a virtual camera in TG just to capture the lighting where the stage is supposed to be located. I will then render those frames that are used as key frames for the animated channels likely to change the lighting at the subject location, using the same kind of file naming that TG uses to identify frame numbers (such as Image0000~Image0120). For a sequence where the camera is flying around I can simply render every Nth frame. The ends of the lighting ramps will be created automatically from the files found in the specified directory (i.e. folder). Even if the animation will be done in HD, I will only need low-definition renders to obtain the lighting information, because each frame is averaged into a single triad of RGB values.

Upon user prompting, the controller host program (running on Windows) will ingest these images and create lighting ramps from the points generated from them (a sketch of that ingestion loop is below). The BMP file format is easy to read and process; I have done it a couple of times for other projects. These lighting sequences will be stored per scene as a studio lighting file and can be downloaded to the actual controllers for use in a shoot. The trick will be translating RGB values to control "white" lights of different color temperature ranges. I can see that it would actually be simpler to use R, G, and B LEDs for stage lighting, and I may want to lay out a board carrying RGB LEDs for customers who want a more psychedelic color range for stage lights. I can foresee specifying an equation that determines the intensity of each LED channel within what I am calling a "Zone" of control.
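To make that ingestion step concrete, here is a sketch of the loop, reusing the averageBMP routine from my earlier sketch (the frame range, key spacing, and output format are made-up placeholders):

// Walk every Nth TG-style frame (Image0000.bmp, Image0010.bmp, ...),
// average each one, and append one ramp point per frame found to a
// chan-style text file. Hypothetical names and made-up key spacing.
#include <cstdio>

bool averageBMP(const char* path, double rgb[3]); // from the earlier sketch

int main()
{
    const int first = 0, last = 120, step = 10;   // placeholder key spacing
    std::FILE* out = std::fopen("lighting.chan", "w");
    if (!out) return 1;

    for (int frame = first; frame <= last; frame += step) {
        char name[64];
        std::snprintf(name, sizeof name, "Image%04d.bmp", frame);
        double rgb[3];
        if (!averageBMP(name, rgb))               // skip frames not rendered
            continue;
        // One ramp point per line: frame number, then averaged R G B.
        std::fprintf(out, "%d %.4f %.4f %.4f\n", frame, rgb[0], rgb[1], rgb[2]);
    }
    std::fclose(out);
    return 0;
}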

Question: Does the Mac version of TG also generate BMP files as an option, or does it use some other format to store rendered frames?

PabloMack

I just got the LED lighting controller working today. It now seems to run lighting programs flawlessly. Getting data into my host program seems to be the next logical step.

Thinking about it some more, it appears that I will need to use HDRI. The reason is that light received from high-intensity sources (such as direct sunlight) normally saturates to "white" in low-dynamic-range image formats such as 8-bit-per-channel RGB, which means most of the bright light is thrown away. It also means that the small area of the image coming from a bright light source will be under-represented in the overall lighting, and consequently the rest of the image (such as clouds and blue sky) will be over-represented in its light contribution. I discovered that Terragen supports OpenEXR, an open-source format developed at ILM. I have downloaded the basic viewer source kit, and my next step will be to make sense of the format and put it to use. Looks like my work is cut out for me.
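For the averaging itself, OpenEXR's RGBA convenience interface keeps the reader short. A sketch compiled against the OpenEXR library (the Imf/Imath calls follow the library's documented reading idiom; exception handling is omitted and the rest is mine):

// Average an OpenEXR image down to one linear RGB triad. Because EXR
// stores unclipped floating-point values, bright sources like the sun
// contribute their full intensity instead of saturating to white.
#include <ImfRgbaFile.h>
#include <ImfArray.h>
#include <ImathBox.h>
#include <cstdio>

int main(int argc, char** argv)
{
    if (argc < 2) return 1;

    Imf::RgbaInputFile file(argv[1]);
    Imath::Box2i dw = file.dataWindow();
    int width  = dw.max.x - dw.min.x + 1;
    int height = dw.max.y - dw.min.y + 1;

    // Standard OpenEXR framebuffer setup: read the whole data window.
    Imf::Array2D<Imf::Rgba> px(height, width);
    file.setFrameBuffer(&px[0][0] - dw.min.x - dw.min.y * width, 1, width);
    file.readPixels(dw.min.y, dw.max.y);

    double r = 0, g = 0, b = 0;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            r += px[y][x].r;                 // half floats, linear light
            g += px[y][x].g;
            b += px[y][x].b;
        }

    double n = double(width) * double(height);
    std::printf("avg linear RGB = %g %g %g\n", r / n, g / n, b / n);
    return 0;
}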