I have a small sphere (radius 50) inside a larger sphere (radius 200). Both use the Water shader and are semi-transparent. I have moved the render camera into the larger sphere to view the small one. When I render, the outer sphere does not show at all. It seems to me you should be able to look out from within the sphere; if it were solid the render would be black, but it is not. The small sphere renders as I want, but there is no outer sphere, only landscape. What am I doing wrong?
Thanks,
JR
The sphere is apparently a one-sided object. It can't be rendered from the inside; from there, it's invisible to the camera.
If it is only that the sphere is one-sided, you could make it double-sided in the object dialogue. However, it may have something to do with the Water shader treating the sphere as a solid volume.
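For anyone curious about the mechanics, "one-sided" here is essentially backface culling: a surface is only drawn when its normal faces back toward the camera. This is a minimal sketch in plain Python, not Terragen code; the function names are made up for illustration.

```python
# Backface-culling sketch: a surface point is considered visible only
# when its normal points back toward the camera, i.e. when the dot
# product of the normal and the view direction is negative.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def is_front_facing(surface_point, surface_normal, camera_pos):
    # View direction runs from the camera toward the surface point.
    view_dir = normalize(tuple(p - c for p, c in zip(surface_point, camera_pos)))
    return dot(surface_normal, view_dir) < 0

# A point on a sphere of radius 200 centered at the origin,
# with its normal pointing outward as usual:
point = (0.0, 0.0, 200.0)
outward_normal = (0.0, 0.0, 1.0)

print(is_front_facing(point, outward_normal, (0.0, 0.0, 500.0)))  # camera outside: True
print(is_front_facing(point, outward_normal, (0.0, 0.0, 0.0)))    # camera at center: False
```

From outside the sphere the normal faces the camera, so the surface renders; from the center, every normal points away from the camera, so the whole sphere is culled and you see straight through it.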
Greg, not in this case. The built-in sphere is a procedural object, unlike an imported mesh object.
What's the difference? The procedural math just creates the geometry on the fly. What functional difference is there between that and a model built externally?
That seemed to be the problem. The native sphere object is one-sided only; while you can make it transparent like glass, you cannot move the camera inside it. I imported a sphere I created in PoseRay, which is double-sided, and now it works. You can have the camera within the big sphere, closed off from the outside world.
Thanks for the ideas!
JR
Glad to help.
Right, I'm now confused. The background node is a TG native sphere object, and we're inside it in every scene. I often place procedural stars on its surface. How so?
Its radius is negative, therefore its normals face inward.
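To spell that out: for a procedural sphere the shading normal can be taken as (point − center) / radius, so a negative radius flips every normal inward, and the inside of the surface becomes the front face. A rough illustration in plain Python (an assumption about the math, not Terragen's actual internals):

```python
# Sketch: the normal of a procedural sphere as (point - center) / radius.
# With a negative radius the normals flip inward, so the surface faces
# a camera placed inside the sphere instead of being culled.

def sphere_normal(point, center, radius):
    return tuple((p - c) / radius for p, c in zip(point, center))

center = (0.0, 0.0, 0.0)
point = (0.0, 0.0, 200.0)  # a point on the surface, |point - center| = 200

print(sphere_normal(point, center, 200.0))   # positive radius: normal along +z (outward)
print(sphere_normal(point, center, -200.0))  # negative radius: normal along -z (inward)
```

With the inward normal, the backface test from earlier in the thread passes for a camera at the center, which is why the background sphere renders from inside.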
:)
Richard
Ah-ha! Thanks, Richard. :)
Doh! ::)
So, on topic, couldn't you negate the radius of the outer sphere that you're inside, JR?
Tested. Yes, you can, but it removes the atmosphere's visibility outside the sphere: you can see solid objects outside, but not the sky.