We're using Blender to pre-render 3D assets into 2D ones for our upcoming top-down survival-horror ARPG. (Link in bio for the curious.) It's worked well for us overall. That said, it definitely adds complexity: if you're ultimately making a 2D game, it's a whole new tool to learn and integrate. Case in point: the incredible efforts described in this article to create the cube map and depth buffer!<p>I will say, we solved the draw-order problem a different, easier way by adopting a fixed ortho perspective and rendering objects as spritesheets/tiles. We can then author levels using conventional 2D methods. Objects sort by their pivot's Y value, more or less, so walking behind things isn't an issue.<p>Seeing the Cave Story screenshot has me thinking there might be an especially good opportunity for pre-rendered side-scrollers... You'd entirely eliminate whole classes of problems, plus you could easily bring back some of that animation-inspired back/mid/foreground goodness.
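The pivot-Y sorting described above can be sketched roughly like this (names like `Sprite` and `pivot_y` are hypothetical, not from any particular engine; this assumes screen-space Y grows downward, so sprites higher on screen draw first and sprites lower on screen draw over them):

```python
from dataclasses import dataclass

@dataclass
class Sprite:
    name: str
    pivot_y: float  # world/screen Y of the sprite's pivot (e.g. its base)

def draw_order(sprites):
    # Back-to-front: ascending pivot Y means objects nearer the top of the
    # screen are drawn first, so you can walk "behind" them naturally.
    return sorted(sprites, key=lambda s: s.pivot_y)

scene = [Sprite("player", 120.0), Sprite("tree", 80.0), Sprite("rock", 150.0)]
print([s.name for s in draw_order(scene)])  # → ['tree', 'player', 'rock']
```

Engines like Unity expose the same idea built in (a custom transparency sort axis on the 2D renderer), so in practice you may not need to hand-roll the sort at all.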
Not sure how the author wrote an entire article on pre-rendered backgrounds in gaming without once mentioning the Baldur's Gate / Infinity Engine games, whose developers largely pioneered the technique with the DirectDraw API.
Great write-up! I still don't really see the point of pre-rendered backgrounds for PC/console games anymore, though. GPUs nowadays are powerful enough, and storage space limited enough, that you can just render everything in realtime and it will look great.<p>The technique does seem like it would be a great fit for mobile, where users have limited control and efficiency is really important.