It appears that the corners are breaking apart because the surface is being displaced outwards along the face normals. In the test where you pushed the displacement up really high and said that the corners were not exploding, it looks to me like it is still breaking apart: there's a bright band in the middle of the pillar in the foreground, as if we can see through to the other side where it's in sunlight.
At every position on the surface, the surface is displaced (moved) in the direction of the normal. If your model has built-in normals and "Use smooth normals" is checked in TG, the model's normals are used and interpolated across each face. If not, normals are derived automatically from the faces and are not smoothed, although we plan to add a smoothing feature in future. More on that below.
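To make that concrete, here's a minimal sketch of the displacement rule itself (my own illustration, not Terragen's actual code; the point values are made up):

```python
import numpy as np

def displace(p, n, h):
    """Move point p along normal n by displacement amount h.
    The normal is normalized first so h is a true distance."""
    n = n / np.linalg.norm(n)
    return p + h * n

# Hypothetical surface point and its (interpolated) normal:
p = np.array([1.0, 0.0, 0.0])
n = np.array([0.0, 1.0, 0.0])
print(displace(p, n, 0.5))  # the point moves 0.5 units along +Y
```

Every point on a face gets this treatment, so the direction of `n` at each point is what decides whether neighbouring faces stay attached.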
Face normals aren't the same as vertex normals. We need vertex normals to tell the renderer how to shade the edges and vertices identically across neighbouring faces. Otherwise each face ends up doing its own thing and can break away from the others when displaced.
If the normals are continuous across the edges of faces, displacement shouldn't break the surface, but often the normals are not continuous.
When you generate these normals in another program, the creasing threshold or creasing angle (it may go by a different name) determines which edges the program decides to smooth. If this threshold is set to leave some edges sharp/creased, the program does this by creating different vertex normals on each face. Across such an edge the displacement will not be continuous, and the surface will break apart.
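The crease-angle decision can be sketched like this (an assumed version of the logic, not any particular modeler's code): compare the angle between the two adjacent face normals against the threshold.

```python
import numpy as np

def is_creased(n_a, n_b, crease_angle_deg):
    """True if the edge between faces with unit normals n_a and n_b
    should be left hard (separate vertex normals per face)."""
    cos_angle = np.clip(np.dot(n_a, n_b), -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))
    return angle > crease_angle_deg

top  = np.array([0.0, 1.0, 0.0])   # top face of a pillar
side = np.array([1.0, 0.0, 0.0])   # side face, 90 degrees away
print(is_creased(top, side, 45.0))  # True: hard edge, will tear when displaced
print(is_creased(top, top, 45.0))   # False: coplanar faces stay smooth
```

With a 45-degree threshold, a square corner ends up creased, which is exactly the case where displacement pulls the two faces apart.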
There may also be different smoothing groups involved; I'm not sure how those affect the normals that Terragen sees.
We are aiming to add an auto-generate-smooth-normals option in future, so you don't necessarily have to do this when you build your model. However, even with such a feature, you need to model your corners in such a way that the smoothing does what you want.
IANAM (I am not a modeler), but...
At these corners you need to model in some extra polygons to round/fillet/chamfer (or radius) the corners (I see that you've done this). This is a good way to tell a renderer exactly how sharp or smooth you want the corner to be and which normals it should displace along. Then use your modeler, or PoseRay, or in future Terragen, to generate smooth normals.
If this isn't working, try displaying the normals in the Maya viewport and see what they are doing. Face normals aren't the same as vertex normals. If it's showing you one normal per face, that won't give Terragen enough information to smoothly transition across edges of faces. You want to see a normal at each vertex. If there is more than one normal at a vertex and they point in different directions, then that will produce a sharp edge when shaded and will tear apart when displaced.
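For anyone curious what "one smooth normal per vertex" means in practice, here is a rough sketch of the usual construction (my own illustration of the general technique, not what any particular program does): accumulate the normal of every face touching a vertex, then normalize. Because the cross product's length is proportional to face area, larger faces get more weight.

```python
import numpy as np

def smooth_vertex_normals(vertices, faces):
    """One normal per vertex: the normalized sum of the normals of
    all faces that use that vertex (area-weighted via the cross product)."""
    vnormals = np.zeros_like(vertices)
    for i0, i1, i2 in faces:
        fn = np.cross(vertices[i1] - vertices[i0],
                      vertices[i2] - vertices[i0])
        for i in (i0, i1, i2):
            vnormals[i] += fn
    lengths = np.linalg.norm(vnormals, axis=1, keepdims=True)
    return vnormals / np.where(lengths == 0, 1.0, lengths)

# A flat quad split into two triangles: every vertex normal comes out +Z.
V = np.array([[0.0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]])
F = [(0, 1, 2), (0, 2, 3)]
print(smooth_vertex_normals(V, F))
```

When every vertex carries a single normal like this, adjacent faces agree at their shared edge and displacement stays continuous across it.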
I'm not a modeler, but IMHO you should take care of normals as part of the modeling process, and this only becomes more important when you intend to render with displacements. Think about what those faces will do when they are displaced along the normals. As long as "Use smooth normals" is checked in TG, these normals will change smoothly across the surface when rendered by Terragen. If neighbouring faces share the same normals along an edge, it should render smoothly without breaking apart.
Let's look at what happens when you don't have these vertex normals, or when the vertex normals are set to produce sharp edges. Try to imagine you are a renderer (!) Here is a face that you've been asked to displace. Everything you know about the face (the polygon and the normals) tells you it faces in direction X, so you are going to move all of its surface in that direction. Any other direction would probably be wrong, because you want the bumps in the surface to displace in and out along that normal. Then there's another face adjacent to the first one. The second face has a different normal, so you displace it along the second face's normal. This makes the two faces move away from each other, creating a gap.

Let's try to patch up the gap. How? Add more polygons at render time? OK, that might work. But the texture on the patched-up polygons will be stretched across the gap, because both edges share the same UV, and that won't look good. It would be better if the model had these "patch" polygons built in to begin with. They would have UVs and normals, so we could simply displace those surfaces with the displacement texture.
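The size of that gap is easy to put numbers on. Here's a toy calculation (hypothetical values) for a shared vertex at a hard 90-degree corner, where each face displaces its own copy of the vertex along its own normal:

```python
import numpy as np

shared_vertex = np.array([0.0, 0.0, 0.0])
n_face_a = np.array([0.0, 1.0, 0.0])   # top face points up
n_face_b = np.array([1.0, 0.0, 0.0])   # side face points out
h = 0.2                                 # the same displacement amount for both

# Each face moves its copy of the shared vertex along its own normal:
pa = shared_vertex + h * n_face_a
pb = shared_vertex + h * n_face_b

gap = np.linalg.norm(pa - pb)
print(gap)  # h * sqrt(2): the two copies are no longer coincident
```

With shared (smoothed) vertex normals, `n_face_a` and `n_face_b` would be the same vector, both copies would land on the same point, and the gap would be zero.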
Perhaps the renderer could automatically generate these patch/chamfer/rounding polygons for you. This is possible. The user would need to tell the renderer the radius of the corner. Maybe this could be set per model or per material, but there will be many times where you want to control this at different parts of the model. If the model already has these round edges built in, it would be wasteful to add additional polygons. It would be difficult to detect whether they already exist, although I can imagine some kind of "minimum radius" approach to analyzing corners prior to rendering.
I've seen that V-Ray has a "keep continuity" option for this kind of situation. I'm not sure how it works, but I guess that it might smoothly interpolate the displacement normal across the whole face and ensure continuity with the neighbours, decoupled from the shading normal which is kept sharp. I've thought about offering that as an option. It might work well for organic models but I don't think it's ideal for architecture.
Do other renderers have better ways of handling this? I don't mean the "auto generate normals" idea. We're already planning to add that in future, but you still need to be careful about how you model corners in architecture.
Matt