Space rendering is full of interesting problems like this. Another is that single-precision floating point doesn't have enough precision to represent both planet scale and human scale in the same coordinate system (let alone solar-system or galaxy scale), yet GPUs don't support double precision well. So you have to do the calculations that need high precision on the CPU in double precision, and send the GPU only coordinates it can handle, or your 3D models will get crunched by precision errors.<p>Another is that a planet sphere renderer will often tessellate the sphere into quads in lat-lon space, with each quad split into two triangles for rendering. At the poles, though, one of the two triangles has zero area because two of its vertices coincide at the pole. Then when you texture-map that "quad" with a square texture, half of the texture is not shown and you get visible seams (Google Earth suffers from this artifact, or at least it did in the past). What's less obvious is that the same problem is present, to a lesser extent, in every quad on the sphere: the triangle whose horizontal edge is nearer the pole is smaller than the other, so half of the texture is stretched and half is shrunk. The fix is to use homogeneous texture coordinates.
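The first problem is easy to reproduce numerically. Here's a toy numpy sketch (the Earth-radius value and the subtract-a-nearby-origin scheme are illustrative, not any particular engine's code): float32 spacing at ~6.4e6 m is 0.5 m, so sub-metre detail survives only if the big subtraction happens in double precision on the CPU.

```python
import numpy as np

# A vertex 0.3 m from a reference point on a planet surface
# (~6.371e6 m from the planet centre), held in double precision.
surface = 6_371_000.0
vertex = surface + 0.3

# float32 spacing at this magnitude is 0.5 m: sub-metre detail is gone
assert np.spacing(np.float32(surface)) == 0.5

# Naive: convert absolute coordinates to float32 and subtract on the
# "GPU" -- the 0.3 m offset rounds to the nearest representable 0.5 m
naive = float(np.float32(vertex) - np.float32(surface))    # 0.5, not 0.3

# Better: subtract a nearby origin (e.g. the camera position) in
# double precision on the CPU, then send only the small offset down
relative = float(np.float32(vertex - surface))             # ~0.3
```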
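The second problem, and the homogeneous-coordinate fix, can also be checked at a single point. Below is a sketch under toy assumptions: a symmetric trapezoid stands in for a lat-lon quad whose pole-side edge is half as wide, and `q` is assigned per vertex as the edge-width ratio (one common way to set up projective texture coordinates; real renderers pass these as a 4th texcoord component and let the rasterizer do the divide).

```python
import numpy as np

def barycentric(p, a, b, c):
    """Barycentric coordinates of 2D point p in triangle (a, b, c)."""
    T = np.array([[b[0] - a[0], c[0] - a[0]],
                  [b[1] - a[1], c[1] - a[1]]], dtype=float)
    l1, l2 = np.linalg.solve(T, np.asarray(p, dtype=float) - np.asarray(a, dtype=float))
    return np.array([1.0 - l1 - l2, l1, l2])

# A lat-lon "quad" near the pole: a trapezoid whose top (pole-side)
# edge is half as wide as its bottom edge.
quad = [(0.0, 0.0), (2.0, 0.0), (1.5, 1.0), (0.5, 1.0)]  # P0..P3
u = np.array([0.0, 1.0, 1.0])        # texture u at P0, P1, P2

# Rasterizers interpolate per triangle; take triangle P0-P1-P2 and
# sample at the quad's centre, where u should clearly be 0.5
w = barycentric((1.0, 0.5), quad[0], quad[1], quad[2])

# Plain (affine) interpolation of u: visibly wrong at the centre
u_affine = float(w @ u)              # 0.625, not 0.5

# Homogeneous fix: give the narrow-edge vertex q = w_top / w_bottom,
# interpolate (u*q, q) linearly, divide per pixel
q = np.array([1.0, 1.0, 0.5])        # q per vertex of the triangle
u_persp = float(w @ (u * q)) / float(w @ q)   # 0.5, as it should be
```

The same per-vertex `q` also removes the degenerate-triangle seam at the pole itself, since the texture gradient no longer kinks across the quad's diagonal.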