Improbable's tech intrigues me, but I agree with many of the comments that technical information on how it actually works is scarce, and most of the articles are puff pieces focusing on aspects of Bossa Studios' game (the emergent gameplay, for example) that have nothing to do with Improbable. According to Glenn Fiedler (www.gafferongames.com) in his recent GDC talk on networking physics (<a href="http://gdcvault.com/play/1022195/Physics-for-Game-Programmers-Networking" rel="nofollow">http://gdcvault.com/play/1022195/Physics-for-Game-Programmer...</a>), this paper by Insomniac Games (<a href="https://d3cw3dd2w32x2b.cloudfront.net/wp-content/uploads/2011/06/introductiontosynchost.pdf" rel="nofollow">https://d3cw3dd2w32x2b.cloudfront.net/wp-content/uploads/201...</a>) details a system similar to what Improbable is working on.<p>In this system the server is basically a distributed database plus a message-passing layer: it grants different clients authority over specific simulation objects or fields, and whichever client holds authority runs the simulation logic for that object. As long as you can scale your message passing and your DB, most of the work can be offloaded onto clients, so complex simulation is effectively "free". I'd be concerned about cheating in more competitive games, but it should work well for social or creative games, and for non-game simulations where you know the clients are trustworthy.
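<p>To make the idea concrete, here's a minimal sketch of that authority scheme: the server is just a registry plus a state store that only accepts writes from whichever client currently holds authority over an object. All names here are hypothetical illustrations, not Improbable's or Insomniac's actual API, and real message broadcasting is left out.

```python
# Hypothetical sketch of server-side authority tracking.
# The server doesn't simulate anything; it records who is allowed
# to write each object's state and rejects everyone else.

class AuthorityServer:
    def __init__(self):
        self.authority = {}  # object_id -> client_id that simulates it
        self.state = {}      # object_id -> last accepted state

    def grant(self, object_id, client_id):
        """Hand simulation of an object to a specific client."""
        self.authority[object_id] = client_id

    def submit(self, client_id, object_id, new_state):
        """Accept a state update only from the object's authority."""
        if self.authority.get(object_id) != client_id:
            return False  # non-authoritative write: reject (anti-cheat hook)
        self.state[object_id] = new_state
        # A real server would now broadcast new_state to interested clients.
        return True


server = AuthorityServer()
server.grant("crate_1", "client_A")
server.submit("client_A", "crate_1", {"x": 1.0})  # accepted
server.submit("client_B", "crate_1", {"x": 9.9})  # rejected
```

The cheating concern falls directly out of that `submit` check: the server can refuse non-authoritative writes, but it has no way to validate that the authoritative client's state is honest.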