How to Simulate Various Light Sources

Started by PabloMack, September 18, 2014, 10:11:19 AM


PabloMack

As some of you know, I have been developing an LED studio light controller, and I am now asking the question: "How can I simulate different virtual light sources on a live green screen stage?" In order to light my live actors in a way that simulates how they would be lit if they were CG characters on set in the virtual environment, I will need to control my real lights based on the lighting seen in the virtual set. Of course, the ideal solution is to have a dome over the actors and project an HDRI image around them while still maintaining a usable chromakey backdrop, which itself could conceivably be part of the surrounding image used to light the live action. Short of that luxury, I plan to use a handful of discrete lights to simulate the lights that illuminate the spot in the CG virtual set where the live action is to take place.

One of the most common situations I will face is an outdoor scene where the actors are standing in direct sunlight, a brilliant light source. Depending on the time of day, this light could range from orange to yellow to white. Additional indirect light will come from everywhere else, and it might range from pink to red to orange to yellow to white to blue. So the simplest plan would be to have overhead lights that simulate skylight scatter while one or more flood lights simulate the direct sunlight.

Using Terragen to generate the virtual environment, then, I will be using two different virtual cameras (or renders from two different configurations of the same virtual camera) as light meters (a sort of IBL). These will ultimately drive two different light sources through a lighting controller. What I need to do is make each "light meter" camera see only the light it is supposed to see. I know I can turn off the disk of the sun in order to capture only skylight; this signal will be used to drive the overhead skylights. My question is: "Is it possible to turn off indirect light sources (i.e. reflection) and only see the light coming directly from the sun?" I know that the amount of direct sunlight coming from the disk of the sun will probably make the skylight insignificant. Still, the skylight theoretically should not be represented in the flood lights that are there to simulate direct sunlight. Ideally I could turn off reflection/scattering from any single light source individually, but I don't know if TG can do that. I tried turning off the "Envirolight" and that doesn't do it. Perhaps it is doable with the new "Layers" feature, I don't know. I suppose I could render the images to be used for IBL twice, one set with the disk of the sun turned on and one set with it turned off. The skylight signal would be derived directly from the image set with the disk of the sun turned off, while the direct-sunlight signal would be derived indirectly by subtracting the skylight-only images from those that have the disk of the sun turned on.
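For what it's worth, a minimal sketch of that subtraction step, assuming the two sets of renders have already been decoded into linear float arrays (however the two passes end up being produced, the decoding and file handling are glossed over here; the function and array names are just placeholders):

```python
import numpy as np

def split_sky_and_sun(full, sky_only):
    """Derive two per-frame drive signals from a pair of linear HDR renders:
    'full' includes the direct-sun contribution, 'sky_only' excludes it.
    Both are float32 arrays of shape (H, W, 3) in linear light."""
    # Skylight signal: average colour of the render without the sun.
    sky_signal = sky_only.reshape(-1, 3).mean(axis=0)
    # Direct-sun signal: whatever is in the full render but not in the
    # sky-only render; clamp at zero to ignore small negative noise.
    sun_signal = np.clip(full - sky_only, 0.0, None).reshape(-1, 3).mean(axis=0)
    return sky_signal, sun_signal
```

Averaging each frame down to a single RGB value per light source is obviously a simplification; the point is only that the direct-sun signal would come from the difference between the two render sets.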

I would like for someone to give me their wisdom so that I can head off in the right direction.

Much Thanks, Pablo.

Oshyan

I'm not entirely sure I understand what you're aiming for, but turning off the Enviro Light disables Global Illumination, which is responsible for handling light bouncing, so I think it should do what you need. What are you expecting to see that you didn't see when you disabled it? Regarding layers, there *is* a separate direct and indirect pass, so yes you could use that.

Also, turning the disc of the sun on or off just toggles a visible circle where the sun is; it doesn't otherwise affect the light/brightness being generated.

- Oshyan

PabloMack

#2
Quote from: Oshyan on September 18, 2014, 03:00:22 PMI'm not entirely sure I understand what you're aiming for, but turning off the Enviro Light disables Global Illumination, which is responsible for handling light bouncing, so I think it should do what you need.

When I uncheck the box labeled "Enable" in the Envirolight node and then render, the sky is just as bright and blue as it is when the box is checked. This means I am still getting scattered (i.e. reflected) light from the sun and I still appear to have full skylight. The sky should go black (like it is on the moon) when the atmosphere is not reflecting and dispersing light. Yet the atmosphere still needs to be there, because it does act to block some if not all of the direct sunlight, so I can't just disable it. What I need is for the light that is absorbed or scattered to simply disappear, and for only the direct light that remains after filtering through the atmosphere to reach the camera.

Quote from: Oshyan on September 18, 2014, 03:00:22 PMAlso turning on or off the disc of the sun just turns on or off a visible circle where the sun is, it doesn't otherwise affect the light/brightness being generated.

When I turn on the disk of the sun, render, and then save to an EXR file, will the pixels representing the disk of the sun retain their full super-bright light levels that would otherwise be clipped to white if saved as, say, a BMP image? If the sun's disk is only a flat white disk that doesn't produce light far above and beyond what a non-luminous white disk would produce, then it isn't going to give me what I need.

Oshyan

Light diffusion/dispersion in a *volume* (the atmosphere) is different than light reflection/interreflection on/between surfaces, or between surfaces and atmosphere. It sounds like you just want no effect from the Atmosphere (black sky), so just disable it, right?

My point with the Sun Disc is that the Visible Disc does not change the representation of light/brightness values, you get "beyond white" values whether or not it is visible. If it *is* visible then the entire disc has the same ultra-bright values uniformly, which is realistic. Just make sure the disc size is realistic.
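If it helps, a quick way to confirm this on a saved render (just a sketch; decoding the EXR into a float array is assumed and left out here, since that's a separate problem):

```python
import numpy as np

def beyond_white_stats(pixels):
    """Check a decoded HDR render for 'beyond white' values.

    pixels: float32 array of shape (H, W, 3) in linear light, e.g. the
    decoded contents of an EXR. An 8-bit format such as BMP clips these
    values at 1.0, so this count would come back as zero there."""
    over = np.any(pixels > 1.0, axis=-1)
    return int(over.sum()), float(pixels.max())
```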

- Oshyan

PabloMack

#4
Quote from: Oshyan on September 18, 2014, 04:12:05 PMLight diffusion/dispersion in a *volume* (the atmosphere) is different than light reflection/interreflection on/between surfaces, or between surfaces and atmosphere. It sounds like you just want no effect from the Atmosphere (black sky), so just disable it, right?

I disabled the atmosphere and the clouds are still there. That's good and bad. The good is that clouds block the sun's rays where they should. The bad is that they catch and diffuse the sun's light indirectly, and this will show up as part of the lighting. I can't disable the clouds or they will not block direct sunlight as they should. I think I can live with the compromise, since direct sun will dominate over the reflected light when not blocked by clouds. When blocked by clouds, the lights controlled by this "camera" that is there to measure direct sunlight should dim to levels comparable to the overhead skylighting. That seems reasonable.

Quote from: Oshyan on September 18, 2014, 04:12:05 PMMy point with the Sun Disc is that the Visible Disc does not change the representation of light/brightness values, you get "beyond white" values whether or not it is visible. If it *is* visible then the entire disc has the same ultra-bright values uniformly, which is realistic. Just make sure the disc size is realistic.

This is what I wanted to hear. It means I am not headed down a dead-end path. Thanks so much. :D

Oshyan

For your cloud problem, try turning off Enable Primary (leave Enable Secondary checked). This will make the clouds not visible but they will still cast shadows.

- Oshyan


PabloMack

#7
I got the LED controller running programs last week. In a working system, each "Zone" will usually be composed of more than one "Channel". Each "Channel" is basically a string of LEDs of the same "color", which can be a color temperature or even a saturated color such as Red, Green or Blue. In a program, points in time will be controlled using TG renders that are generated expressly for that purpose. I used a JVC HD-7, which uses three CCDs at 30 FPS, to create the following video. Though the camera was set to "Manual Exposure", I rather doubt that it has a true manual exposure mode, as the other parts of the image grew darker as the LEDs got brighter. Maybe some sort of current parasitism is going on in the sensor. I connected two strings of LEDs to two channels from different zones to show how they can be started together but then run independently off different programs. In a complete system, each zone will usually have more than one channel so that both the brightness and the color of stage lighting are under program control. You will notice that the program controlling the reel on the right is shorter (in time) and ends first:

https://www.youtube.com/watch?v=Tl_UzsaHpgw

The next step will be to actually set up a virtual camera in TG to be used as a light meter and render some key frames. I will then bring those into the controller's host software for (1) file storage in a format that is readily accessible for a production and (2) play the program to light a live shoot. This second part involves writing a lot of code.

bobbystahr

Quote from: PabloMack on September 23, 2014, 08:52:11 AM
I got the LED controller running programs last week. ... The next step will be to actually set up a virtual camera in TG to be used as a light meter and render some key frames.

This is a most interesting thread. Though I'm mostly not sure what's happening yet, lighting a real scene with TG-generated light settings is bordering on groundbreaking... if that's your goal... well, awesome.
something borrowed,
something Blue.
Ring out the Old.
Bring in the New
Bobby Stahr, Paracosmologist

PabloMack

#9
Quote from: bobbystahr on September 23, 2014, 02:27:46 PM...lighting a real scene with TG generated light settings is bordering on ground breaking...if that's your goal....well awesome.

That's the goal.

I guess the bad news for Europeans will be that they will not be able to legally purchase such a product unless it has certification. The CE mark can more than double the price, depending on volume.

PabloMack

#10
I have been working on the host program a lot and it has become quite useful now. The host program can now use a series of BMP images to automatically produce a lighting sequence for each of the four zones supported by a controller. The host supports up to 32 controllers, so I could have up to 128 different zones (and 448 individual channels) controlled from one console program running on a PC. With 14 channels per controller board and each channel driving up to 3 A, that's a theoretical total of 1344 amps at 12 V, or about 16 thousand watts per installation. That's a lot of light when it's coming from LEDs. The host program can also save/load sequence files to/from disk storage.

The main limitations of this system are: (1) There is a maximum of 2978 ramps per zone. At 30 FPS, if you explicitly store one ramp per frame, you can only store a 99-second sequence per zone, but most sequences will not need a ramp for every frame. Each ramp can be up to 36 minutes long, so sequences are effectively not limited by time. Ramp beginning and ending points are set by 8-bit values, but the points between endpoints are smoothly interpolated to 16-bit values. This saves storage in the program without losing the benefit of 16-bit resolution during a ramp. (2) The host communicates via a serial port and the download rate is about 15 ramps per second. I anticipate that download time for typical lighting sequences will only take seconds if the light-sampling frames are chosen well and kept to a minimum. Downloads are done ahead of time and the sequences are "played" in real time at full speed.

There is a set of broadcast commands so that all sequences can be started and stopped simultaneously, so there is effectively no time skew between controllers. I have put in a number of features anticipating that users will need them. Among these is a special "Intermission" lighting arrangement that can be turned on or off with the click of a button. This can be used during breaks between shots/takes and as extra lighting for the crew when they are working on the set.
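To give an idea of how 8-bit endpoints can still give smooth 16-bit ramps, here is a rough sketch of the interpolation (this is my own guess at the arithmetic; the controller's firmware may well do it differently):

```python
def ramp_value(start_8bit, end_8bit, t):
    """Interpolate between two 8-bit ramp endpoints, returning a 16-bit level.

    start_8bit, end_8bit: stored endpoint levels, 0..255
    t: position within the ramp, 0.0 at the start, 1.0 at the end

    Multiplying by 257 maps 0 -> 0 and 255 -> 65535 exactly, so the
    endpoints land on full scale while the in-between samples get the
    full 16-bit resolution."""
    start = start_8bit * 257
    end = end_8bit * 257
    return int(round(start + (end - start) * t))

# Example: halfway through a ramp from level 10 to level 200:
# ramp_value(10, 200, 0.5) -> 26985
```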

I am already thinking about how to implement an Ethernet version.

PabloMack

As discussed in my EXR thread, I am having some technical difficulties implementing the use of EXR files. Now that I am cruising along with BMP files, I am interested in solving a problem with them, since that is the only format my lighting system currently supports. I want to use a camera as a general light meter, which means I want to sample ALL light hitting the lens and not just the part that is in the field of view (i.e. hitting the sensor). What would be ideal is if I could place a frosted pane of virtual glass in front of the camera so that it catches not only light sources that are in-frame but also those that are out-of-frame. This piece of virtual frosted glass would scatter all of the light striking its surface, and the camera would see it as one very diffuse image representing the overall lighting. If I could create a CG object that is a white translucent pane of "glass" and parent it to the camera, I could position it so that it stays in front of the camera no matter where I turn it. Can you parent objects to a camera in TG? I have looked at both the Object panels and the Camera panels and I see no such option.

Of course, the ideal solution would be a "Light Meter" camera type (as an alternative to Perspective, Orthographic and Spherical) in the TG Camera panel, selectable by the radio buttons. You might have a parameter to filter incident light by the angle at which it strikes relative to the normal of the front of the lens. Just a Christmas wish from Santa... :)

Matt

#12
Please forgive the dumb question, but if you are not interested in the directional information in the lighting (i.e. you just want to capture a "frosted glass" version of the light coming from all directions), then what is the purpose of the "light stage" you are building? If you compress the information down to a single RGB triplet, then you can simply shoot your live action with static lighting and do the lighting modulation in post, using a simple multiply/gain on the footage.

(I'm using the term "light stage" pretty loosely here)

Matt
Just because milk is white doesn't mean that clouds are made of milk.

Matt

#13
Quote from: PabloMack on October 03, 2014, 01:04:53 PM
Of course, the ideal solution would be a "Light Meter" camera type (as an alternative to Perspective, Orthographic and Spherical) in the TG Camera panel, selectable by the radio buttons. You might have a parameter to filter incident light by the angle at which it strikes relative to the normal of the front of the lens. Just a Christmas wish from Santa... :)

This sort of light meter that weights all directions equally could be implemented by rendering using a spherical camera and doing a weighted average of all the pixels in the image. You'd need to take care to use smaller weights on the pixels near the poles. But you'd need to work with an HDR format (e.g. EXR) to capture the sun.
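Something like this, roughly (a sketch only, assuming the spherical render has been decoded from EXR into a linear float array in equirectangular, latitude-longitude, layout; the cos(latitude) factor is the smaller weighting near the poles):

```python
import numpy as np

def light_meter_average(equirect):
    """Average a spherical (equirectangular) HDR render like a light meter,
    weighting each pixel row by the solid angle it actually covers.

    equirect: float32 array of shape (H, W, 3), linear light, with rows
    running from +90 degrees latitude at the top to -90 at the bottom.
    Returns a single RGB triplet."""
    h = equirect.shape[0]
    # Latitude at the centre of each pixel row.
    lat = (0.5 - (np.arange(h) + 0.5) / h) * np.pi
    weights = np.cos(lat)          # solid angle per row falls off toward the poles
    weights /= weights.sum()
    row_means = equirect.mean(axis=1)            # shape (H, 3)
    return (row_means * weights[:, None]).sum(axis=0)
```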

Matt
Just because milk is white doesn't mean that clouds are made of milk.

PabloMack

#14
Quote from: Matt on October 04, 2014, 03:39:09 AMPlease forgive the dumb question, but if you are not interesting in the directional information in the lighting (i.e. you just want to capture a "frosted glass" version of the light coming from all directions), then what is the purpose of the "light stage" you are building? If you compress the information down to a single RGB triplet, then you can simply shoot your live action with static lighting and do the lighting modulation in post, using a simple multiply/gain on the footage. (I'm using the term "light stage" pretty loosely here)

It's not a dumb question. The only way I can see accurately lighting a live stage (with real lights) from HDRI images captured with the spherical camera is to have the whole stage under a dome where the inside of the dome is basically a curved display panel facing inward. The virtual light sources coming from those portions of the dome occupied by the green screen are obviously not available for lighting the live stage. However, surfaces lit from those regions of the dome are always on the far side of the live-action camera, so they are mostly out of view. Still, building such a light stage would be challenging and expensive, to say the least.

As a much cheaper and easier alternative, I can see where portions of the surrounding spherical image could be used and analyzed to control discrete lighting panels that are strategically positioned to approximate the virtual environment. Deriving this information from a single large spherical image could be daunting. I can imagine some visual tools where you would load the image and then carve it up and assign each region to a zone of control. I will have to think about that some more. But, as I said in my other thread, I am having technical difficulties making use of the EXR libraries, and I will have to solve those problems before I can even do something as simple as loading and displaying an EXR image.

An easy way to tackle the (diffuse) discrete lighting approach is to render a series of images for each "zone-of-control". Each zone would be driven by a virtual discrete light meter that is looking at a different part of the dome. The resolution used to represent each region of the dome lighting the scene could be tiny, since each one is boiled down to a single pixel. But you would have as many of them as you have zones of control, in the attempt to adequately light the whole "stage". I can see satisfying the requirements for many virtual outdoor scenes just by mimicking skylight with overhead lights as one "zone of control" and direct sunlight with some directional flood lights as another. You do it all live and don't have to fix anything in post except for overall 2D color correction and brightness, which are almost trivial.
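To make the carving-up idea concrete, one way the zone signals might be pulled out of a single equirectangular render is sketched below (the zone layout here is arbitrary and hypothetical; real rectangles would be chosen to match where the physical light panels sit, and the image would of course have to come from an HDR source, not a BMP):

```python
import numpy as np

def zone_signals(equirect, zones):
    """Boil rectangular regions of an equirectangular HDR render down to
    one RGB driving value per lighting zone.

    equirect: float32 array of shape (H, W, 3), linear light.
    zones: dict mapping a zone name to a (row0, row1, col0, col1) rectangle
           covering the part of the dome assigned to that zone of control."""
    h = equirect.shape[0]
    lat = (0.5 - (np.arange(h) + 0.5) / h) * np.pi
    row_weight = np.cos(lat)                     # solid-angle weight per row

    signals = {}
    for name, (r0, r1, c0, c1) in zones.items():
        region = equirect[r0:r1, c0:c1]
        w = row_weight[r0:r1]
        row_means = region.mean(axis=1)          # average each row: (rows, 3)
        signals[name] = (row_means * w[:, None]).sum(axis=0) / w.sum()
    return signals

# Hypothetical layout for a 1024 x 2048 render: the top quarter of the
# sphere drives the overhead "skylight" zone, and a small patch around
# the sun's position drives the directional "sun" floods.
# zones = {"skylight": (0, 256, 0, 2048), "sun": (300, 420, 900, 1100)}
```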

As you know, real lighting has very complex 3D interactions, and it is very labor-intensive to try to accurately approximate reality in post. It is extremely difficult to get it right using 2D video processing tools such as After Effects. On the other hand, real light does its accurate 3D differential lighting on its own, because it is real light, as long as it is coming from the right general directions and has the right color and intensity (reflected images are another matter). I think most post-production houses do the lighting effects in post because (1) they have little control over production, (2) there are no good, affordable programmable lights on the market, and (3) they have a lot more talented people, and larger budgets to pay them to do post-production work, than I have. Often the production planners don't have such things as an automatic programmable lighting process in their pipelines, so they throw the footage over the wall at the post people and let them deal with it in the ways they know best. Some things are far easier to do as part of production than to "fix in post". It is my belief that there's a whole lot you can do very easily using permanent programmable overhead lights to provide the "skylight" and one discrete programmable directional light to mimic direct sunlight, for instance.

I'll be working to solve the EXR library problems. Once that is done, I will have a challenging task to try to use a single dome image to drive discrete lights. I know it can be done but I have the feeling it won't be easy.

Quote from: Matt on October 04, 2014, 03:43:23 AMYou'd need to take care to use smaller weights on the pixels near the poles.

I guess this is because a rectangular image is wrapped around a sphere and the pixels squeeze together (i.e. each pixel represents less angular area) as they approach the poles.
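Working it out: in an equirectangular image, a pixel spanning angular steps \(\Delta\phi\) of latitude by \(\Delta\lambda\) of longitude covers a solid angle of roughly

\[ \Delta\Omega \approx \cos\phi \,\Delta\phi \,\Delta\lambda \]

so the weight given to each pixel row should be proportional to cos(latitude), falling to zero at the poles, which matches the intuition above.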