Large renders are generally not a problem, and in fact 11x17 at 300dpi (5100 pixels on the longest side) is in the middle of the range for resolutions I've seen people rendering at with Terragen. The highest I've seen so far is 16,000 pixels on the longest side, but that was rendered in tiles on a render farm. Forum user "Dune" has done a number of high resolution renderings for large print installations as well. Ranch Computing has a benchmarks page that shows a 12,000 pixel image and gives you an idea of render time as well:
http://www.ranchcomputing.com/terragen-benchmarks-produit.en.html

Memory usage for rendering depends on a variety of factors, particularly the size in memory of the assets you're using in your scene (objects and textures), the detail settings for rendering and GI, and the antialiasing level. The number of cores/threads you render with is also potentially important, since you need a minimum of 50MB (and ideally more) per thread for the render subdivision caches. But on normal home machines this usually isn't as big a factor as the others, since most such systems have 4 or at most 6 cores and use up to 8-12 render threads. At 100MB/thread (the recommended setting), even a 12-thread system will only use 1200MB for the subdivision cache. 1200MB may seem like a lot, but a good, modern system can easily have 16GB or more, so it's not necessarily huge.
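To make the per-thread arithmetic concrete, here's a small sketch of that subdivision cache estimate. The 100MB-per-thread figure is the recommended setting mentioned above; the function name is just illustrative, not anything from Terragen itself:

```python
def subdivision_cache_mb(threads, mb_per_thread=100):
    """Rough subdivision cache memory in MB: threads x MB per thread.

    mb_per_thread defaults to the recommended 100MB; the practical
    minimum mentioned above is about 50MB per thread.
    """
    return threads * mb_per_thread

# A 12-thread system at the recommended setting:
print(subdivision_cache_mb(12))  # 1200 (MB)
```

On a 16GB system that 1200MB leaves plenty of headroom for scene assets, which is the point of the paragraph above.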
Estimating render time can be done with a reasonable degree of accuracy in many cases, based on lower-resolution renders of the same scene. Use the same render settings as you will for the final render, but render at some fraction of the final size. So if you're aiming for 11x17 at 300dpi, that's 3,300x5,100; you might render a test at 825x1,275, which is 1/16th of the pixel area of the larger render and so should take *approximately* 1/16th the time. If your render at that resolution takes an hour, you can reasonably estimate that the higher-resolution render will take roughly 16-20 hours. You can use these kinds of estimates to predict cost on a render farm; just keep in mind they are only estimates. You'll want to test the effectiveness of that kind of estimation on your own scenes, of course.
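That scaling rule is simple enough to sketch in a few lines. This is just the pixel-area proportion described above, not a Terragen feature, and the numbers are the 11x17/300dpi example:

```python
def estimate_render_minutes(test_w, test_h, test_minutes, final_w, final_h):
    """Scale a test render's time by the ratio of pixel areas.

    This is only a rough approximation; actual time also depends on
    scene content, so treat the result as a ballpark, not a promise.
    """
    area_ratio = (final_w * final_h) / (test_w * test_h)
    return test_minutes * area_ratio

# A 60-minute test at 825x1,275 scaled up to 3,300x5,100:
hours = estimate_render_minutes(825, 1275, 60, 3300, 5100) / 60
print(hours)  # 16.0
```

In practice you'd pad that 16-hour figure toward the 16-20 hour range mentioned above, since overhead doesn't always scale linearly with resolution.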
If you want to build your own system, I would recommend a minimum of 16GB of RAM; get 24 or 32GB if you can. Fortunately RAM is quite cheap right now, so it's not difficult to get that much or more in a fairly reasonably priced system. If you anticipate doing a lot of high-resolution renders and don't mind your system being tied up rendering regularly, then buying a machine is probably the most economical choice over the long term. However, if you'll be selling the results and thus have a budget for each piece, a final render at high quality on the Ranch render farm might be the more convenient and responsive option. The 12,000 pixel render mentioned above would have taken several days on a fast home workstation, whereas it took less than an hour on the render farm.
In general Terragen 2 is fairly economical in its memory usage when rendering (with a few exceptions, such as high GI settings). If you loaded similar objects and textures into another application to render a similar scene, you'd probably see similar memory usage, perhaps even greater, depending on the application. Which is to say, if you can't render a particular scene at high resolution in Terragen, it's quite possible you'd have the same problems in many other 3D applications, assuming the memory usage in the scene is driven largely by the assets in use and not the render settings. Making sure that's the case is really the goal, and it should be fairly easy if you just follow some simple guidelines about render settings. One of the most important, as Martin mentioned, is that a detail setting of 1 is seldom necessary; 0.75 usually provides very good detail and will use less memory.
- Oshyan