Blender to Terragen Camera Import/Export???

Started by acolyte, February 15, 2010, 10:53:09 PM


acolyte

Hey guys. I've recently been planning to beef up my reel by combining Blender and Terragen 2, and I'm hoping to get a workflow going between the two packages. After searching the forums and the wiki for a while, I couldn't really tell whether this is possible yet. What I'd like to do is figure out a way to export camera animation from Blender to Terragen, or the other way around. Of course, to do this there would probably need to be a version of the terrain or some reference objects in both packages so the camera animation has a point of reference. My main concerns are matching scene scale, the different camera conventions the two packages may use (XYZ vs. ZYX rotation order, for example), and how to physically get the keyframe data out of Blender and into Terragen 2. I know this is a lot, but can I ask the community or Oshyan to point me in the right direction for a solution on this one?

Henry Blewer

You will really have a lot of trouble with Blender camera paths in Terragen 2. Blender does not use real-world measurements; Terragen 2 does. Objects created in Blender can be converted over to Terragen 2 quite easily. I would use the Terragen 2 renderer and set up the scene in Terragen 2, then create the camera path there.

I have not tried this. The Lightwave terrain exporter would allow you to bring a heightfield into Blender, and then you could do the animation and the render in Blender. I have never been able to make a convincing sky in Blender, but Blender's compositor is quite powerful, and great-looking sky maps can be rendered with Terragen 2.
http://flickr.com/photos/njeneb/
Forget Tuesday; It's just Monday spelled with a T

acolyte

First. Thanks so much for the quick reply. :)

The good news is that the camera path probably won't matter anyway. Instead of relying on the interpolation of the curves, I'm going to bake the animation onto the camera once it's finished. That essentially creates a keyframe on every frame, so I can move reliable camera animation from one program to the other. However, I still have the problem of translating and scaling my camera accurately from Blender to Terragen. OK, everything is relative, right? So if I just say one Blender unit is equal to one Terragen meter, wouldn't that be a simple way to come up with a scale factor? I'm not really worried about anything other than the camera animation. If there's a predictable way to carry the camera movement over, then even with only a proxy version of the terrain in Blender I should be able to animate the camera there predictably, provided I can get proper XYZ and rotation values from Blender into Terragen 2. That's really my main concern, because if that's possible, then there's no reason a script can't be written to do the tedious work.
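Roughly, the bake I have in mind would look something like this (just a sketch, assuming a recent bpy API and an unparented, unconstrained camera; 2.4x Python would look different):

import bpy

scene = bpy.context.scene
cam = scene.camera  # active camera object

for f in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(f)
    # With no parent or constraints, matrix_world is the camera's own transform.
    loc, rot, _scale = cam.matrix_world.decompose()
    cam.location = loc
    cam.rotation_euler = rot.to_euler('XYZ')
    cam.keyframe_insert(data_path="location", frame=f)
    cam.keyframe_insert(data_path="rotation_euler", frame=f)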

Oshyan

You can export terrain data in LWO format using two methods; I recommend the heightfield-based "Heightfield To LWO" method. You can use this as a proxy. For the camera data, you will need to find a way to convert the Blender camera path to the Nuke CHAN format, taking the different coordinate systems into account, of course.

- Oshyan

acolyte

Hey Oshyan, thanks for the terrain solution. Is there a way to do this so the terrain scales predictably between the two programs, or does it depend on multiple factors? I've looked in the Blender forums, and while some people have asked for a Nuke export, no one has written an export script as far as I know. I'm a bit of a coder, so I'm not opposed to doing it myself, but I'm wondering whether the CHAN file format is complicated, or simple to script for on the level of the original Terragen camera script files. Also, if anybody here knows the CHAN format or the coordinate system behind Blender, can you fill me in on what I'd need to know to start coding an exporter, or a simple script that produces a Nuke file from Blender? (i.e. Blender's camera rotation order, a basic rundown of the CHAN syntax for a keyframe or where to find it, and anything else that would help get the ball rolling so it's easier to go between Terragen 2 and Blender.) :)
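To make the question concrete, here's the rough skeleton I'm picturing for the exporter (only a sketch, assuming a recent bpy API; the .chan layout I've seen described elsewhere is one whitespace-separated line per frame: frame, translate x/y/z, rotate x/y/z, with an optional field-of-view column, and the axis swap and raw Euler angles below are guesses I'd verify against a simple test scene first):

import math
import bpy

scene = bpy.context.scene
cam = scene.camera  # active camera

with open("camera.chan", "w") as out:  # output path is just an example
    for f in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(f)
        loc, rot, _scale = cam.matrix_world.decompose()
        # Assumed axis mapping: TG x = Blender x, TG y (up) = Blender z, TG z = Blender y.
        x, y, z = loc.x, loc.z, loc.y
        # The rotation columns almost certainly need their own conversion
        # (different order and axes); plain Blender XYZ Euler degrees for now.
        rx, ry, rz = (math.degrees(a) for a in rot.to_euler('XYZ'))
        out.write("%d %.6f %.6f %.6f %.6f %.6f %.6f\n" % (f, x, y, z, rx, ry, rz))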

Oshyan

The CHAN format is very simple as far as I know, but I'm afraid I don't have details. It is described in some posts on the forum here, which should turn up in a search on "CHAN".

As far as the scaling of the terrain goes, I'm not certain, but I think TG probably exports it at the same scale it uses internally, which is measured in meters. How that maps to external apps I can't say, however.

- Oshyan

Henry Blewer

There may be a Python script which would do the conversion from Blender to NUKE. I would try BlenderNation and see if anything is posted.
http://flickr.com/photos/njeneb/
Forget Tuesday; It's just Monday spelled with a T

acolyte

OK, I found a CHAN exporter for Blender at http://www.virtualmatter.org/stardrive/2009/12/blender-nuke/. It looks like it works for all versions up to 2.5, which is fine for now. It works OK, so I guess the most important thing to figure out now is how to convert Terragen meter units to Blender units. If I'm ballparking the scale factor, it's as good as being back in the Stone Age, because the framing won't line up when I render separate passes in Blender to match the background plates from Terragen. I haven't yet tried just outputting an LWO terrain, importing it into Blender and seeing if it just works. Experience tells me it probably won't, but I'll give it a try tonight or tomorrow and see what results I get. Oh, Oshyan, any news on where you guys are with updating the animation tools in Terragen 2 (i.e. a graph editor)?

red_planet

If I remember correctly from my own experiments, 1 Blender unit = 1 Terragen metre...

I have successfully comped a Blender animation with a Terragen 2 background (camera and light settings the same in both apps) and it all lined up nicely...

Rgds

Chris

acolyte

Thanks Chris. Would you mind providing a simple explanation of your workflow between the two programs? Also, were there any problems you encountered or things to look out for?

red_planet

Hi..

I'll have to dig back through my files; I did it a couple of years ago and haven't needed to do anything like it since.

It was fairly basic though.

Getting the camera "lenses" to match was the hardest part, if I remember. I arrived at the camera settings by placing a 1-metre cube at the origin (0, 0, 0) and raising it vertically by 0.5 m so it sat on the ground plane. I then tweaked the camera settings in Blender until the rendered output from both applications matched. Elevation and rotation values for the Blender camera were taken directly from the Terragen camera values (old-fashioned method: write it down and retype it in the target application). There is a direct correlation in one particular parameter, but I don't have the detail here at the moment. I used the elevation and direction angles of the Terragen sun to provide the lighting angle and direction for the animated elements in Blender.
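For anyone repeating this, the reference cube on the Blender side amounts to a couple of lines (a sketch assuming a recent bpy API; the primitive_cube_add keywords have changed between versions, with older releases taking a half-size 'radius' argument instead of 'size'):

import bpy

# 1 m cube at the origin, raised 0.5 m so it sits on the ground plane.
bpy.ops.mesh.primitive_cube_add(size=1.0, location=(0.0, 0.0, 0.5))
cube = bpy.context.active_object
cube.name = "TG_reference_cube"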

The Blender output was an image sequence in PNG format with a transparent alpha channel. The Terragen output was just a backplate.

The Blender image sequence was converted to a .mov in QuickTime and then the backplate and animation were comped in Final Cut Pro.

The last stages can be accomplished with a variety of tools; I use QT and FCP because I have them, but the whole process can be done in Blender.

This is a bit rambly but I hope it helps.

You can always pm me if you need more info.

Rgds

Chris

acolyte

Thanks again Chris. I've done some things like this before with the original Terragen and Blender, and I do remember the hardest part was getting the lenses to match up.

I'm most interested in the actual lens conversion, in terms of making a lens with a certain focal length in Blender match up in Terragen 2. I know Blender uses a focal length in mm just like a real camera, but the Nuke export script somehow converts this into a value for the vertical field of view and passes that information to Terragen 2. My hope is that they simply line up, but that remains to be seen because I haven't been able to run a test yet.
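For the record, the usual pinhole relation is vfov = 2 * atan(sensor_height / (2 * focal_length)), which is presumably what the export script is doing; a tiny check (the 24 mm sensor height is only an example value, the real number depends on the camera's sensor settings and fit mode):

import math

focal_length_mm = 35.0
sensor_height_mm = 24.0  # example: full-frame 36 x 24 mm sensor

vfov_deg = math.degrees(2.0 * math.atan(sensor_height_mm / (2.0 * focal_length_mm)))
print(round(vfov_deg, 1))  # about 37.8 for these example numbers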

I seem to remember having a lot of issues with converting the rotation values from Blender back to the original Terragen, because once you pass the 180-degree threshold on an axis, it starts to wrap to a negative value from the other direction. Does anyone know if this is still the case in Terragen 2, or will a number like -325 read the same as it does in other 3D packages?
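If the wrapping does turn out to be a problem again, one way to sidestep it in an exporter is to unwrap each rotation channel so consecutive frames never jump by more than 180 degrees; a rough sketch:

def unwrap_degrees(angles):
    # Shift each angle by multiples of 360 so the curve stays continuous
    # instead of jumping when it crosses the +/-180 boundary.
    out = [angles[0]]
    for a in angles[1:]:
        prev = out[-1]
        while a - prev > 180.0:
            a -= 360.0
        while a - prev < -180.0:
            a += 360.0
        out.append(a)
    return out

# e.g. a heading that crosses the boundary keeps turning smoothly:
print(unwrap_degrees([170.0, 179.0, -179.0, -170.0]))  # [170.0, 179.0, 181.0, 190.0]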

Chris, if at all possible I would like to animate in Blender and continue that workflow by exporting my camera animations back to Terragen. Do you see any reason why this wouldn't be possible? I'd like to stay away from animating anything in Terragen and only tweak things on the Blender side.

Oshyan

TG2 should properly interpret negative rotation values. Rotation order may be an issue though.

- Oshyan