Hi, I just thought I'd throw in my two penn'orth.
I have used camera tracking data in Terragen2, but to do it you need a workaround. The info is in the forums; you just have to hunt it down. I was using XSI.
The way I did it was to output the live-action plate's camera tracking data from the tracking program in .xsi format, use a plugin to convert it into .chan format, and then load it into the camera in Terragen2.
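If you ever need to roll your own conversion step, the .chan format is simple enough to write by hand. Below is a minimal sketch, assuming the common Nuke-style layout that Terragen2's camera import reads: one whitespace-separated line per frame of frame number, translation (tx ty tz), rotation (rx ry rz), and optionally focal length. The `camera_frames` data here is made up purely for illustration; in practice you would pull these values out of your tracking or 3D package.

```python
# Minimal .chan exporter sketch.
# Assumed line layout (Nuke-style, one line per frame):
#   frame  tx ty tz  rx ry rz  [focal]
# The sample camera_frames values below are invented for illustration.

def write_chan(path, frames):
    """Write per-frame camera data to a .chan text file.

    frames: iterable of (frame, tx, ty, tz, rx, ry, rz, focal) tuples.
    """
    with open(path, "w") as f:
        for fr in frames:
            f.write("\t".join(str(v) for v in fr) + "\n")

# Two hypothetical frames of a slowly moving camera.
camera_frames = [
    (1, 0.0, 150.0, -500.0, 0.0, 0.0, 0.0, 35.0),
    (2, 0.5, 150.2, -498.7, 0.1, 0.3, 0.0, 35.0),
]
write_chan("shot01_cam.chan", camera_frames)
```

Check the axis conventions and rotation order against whatever package you export from; different apps disagree on those, which is where most of the trial and error comes in.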
It took a lot of trial and error, as the XSI plugin was created by another user in JScript and was a work in progress.
I have also used the plugin available for 3ds Max; it is amazing and hassle-free.
I feel that animation is better done in another 3D program and then brought across to Terragen2, using the Terragen2 mesh export as a guide. An FBX importer/exporter would be great, and I understand one is in the pipeline, but who knows how long before it will be available.
For live action, only very close-up work would need to be rendered directly in Terragen2 and matched with live-action matchmove data: for example, a fully greenscreened actor whose feet make contact with a Terragen surface. For most set-extension stuff, though, an environment sphere/cube or an image matte could be used, with a camera projection set up in a compositing program such as Nuke to match the near-field depth perception.
Here are a couple of the things I did.
This one is a simple test of live-action camera tracking data fed into Terragen2 via the XSI plugin:
http://vimeo.com/8652516

This one is at the beginning of my current show reel, where the base jumper jumps off a tower. I was pushed for time and could only render the scene at a much reduced detail level (rendered frame by frame in Terragen2), but you get the idea. I should also have taken my own advice here and used a BG matte within XSI, although I was using the shot to learn the workflow between the matchmove software, XSI, and Terragen2:
http://www.youtube.com/watch?v=nKILGFFbWcQ

The truth is out there, you just got to dig.