If my assumption is correct (and we all know what happens when one assumes...), Compute Terrain determines the positional coordinates (x, y, z) of a given bit of terrain. I imagine its output is originally determined by the 'planet' radius and then modified by the displacement values of the shaders you heap on the surface.
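If that guess is anywhere near right, the arithmetic might look something like the sketch below: push a point outward from the planet's center by the radius plus whatever displacement the shaders have piled on. To be clear, `compute_terrain_point`, `planet_radius`, and `displacement` are all my own hypothetical names for illustration, not anything from the actual node.

```python
import math

def compute_terrain_point(direction, planet_radius, displacement):
    """Hypothetical sketch: terrain position = (radius + displacement),
    pushed outward along the direction from the planet's center."""
    # Normalize the direction vector (assumes it is nonzero).
    length = math.sqrt(sum(c * c for c in direction))
    unit = tuple(c / length for c in direction)
    # Offset the point by the base radius plus accumulated displacement.
    return tuple(c * (planet_radius + displacement) for c in unit)

# Example: a point straight "up" on a ~6,370 km planet, displaced 1200 m.
print(compute_terrain_point((0.0, 1.0, 0.0), 6.37e6, 1200.0))
```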
Vectors, without getting into the mathematical definition, may be thought of for this purpose as simple arrows, perpendicular to the surface at every point. Typically one composes/defines a surface with a series of vertices (positions) and normals (one at each vertex). The normals are all the same length (1.0 unit) and vary only in their direction, which is determined by the slope of the surface at that point (remember, the normals are always perpendicular to the surface). The normals are used primarily in lighting computations: if the normal vector points directly at the light source, the surface squarely faces the light and receives the maximum amount of illumination. As the surface tilts away, the normal points further and further from the light, and the lighting calculations leave the surface at that point less and less illuminated.
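For what it's worth, the standard diffuse-lighting calculation those normals feed into boils down to a dot product (Lambert's cosine law). Here's a tiny sketch of that idea, assuming both vectors are unit length; it's not any particular renderer's code, just the textbook formula:

```python
def lambert_intensity(normal, to_light):
    """Diffuse intensity from the angle between a unit surface normal
    and a unit vector pointing toward the light (Lambert's cosine law)."""
    # Dot product of two unit vectors = cosine of the angle between them.
    cos_angle = sum(n * l for n, l in zip(normal, to_light))
    # Surfaces tilted away from the light receive no direct illumination.
    return max(0.0, cos_angle)

# Normal pointing straight at the light: full illumination.
print(lambert_intensity((0.0, 1.0, 0.0), (0.0, 1.0, 0.0)))   # 1.0
# Normal tilted 60 degrees away from the light: about half.
print(lambert_intensity((0.0, 1.0, 0.0), (0.866, 0.5, 0.0)))  # ~0.5
```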
Clear as mud, right?