> The more logic happens near the db, the more surface for errors and edge cases can be reduced and reliability issues avoided. The reduction in distances may also result in improved abstractions and in making the computations efficient in a fundamental thermodynamic way.<p>Fundamentally this sounds great, but isn't this rather painful with Postgres?<p>In my DB lectures at uni I was informally told to avoid managing logic from within Postgres, as the development and debugging experience for stored procedures is rather poor. I would add to this the painfully slow development cycle for third-party language implementations such as plv8. In addition, platform support is next to none, because language extensions cannot load external code given the trusted nature of the execution environment.
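To make the pain concrete, here is a minimal PL/pgSQL sketch of the kind of in-database logic in question (the accounts table and the transfer function are hypothetical, not from the article). Even at this size, debugging mostly means RAISE NOTICE and re-running; there is no stepping debugger in the loop:<p><pre><code>-- Hypothetical: move money between accounts entirely inside the db.
CREATE OR REPLACE FUNCTION transfer(src int, dst int, amount numeric)
RETURNS void
LANGUAGE plpgsql
AS $$
BEGIN
  UPDATE accounts SET balance = balance - amount WHERE id = src;
  IF NOT FOUND THEN
    RAISE EXCEPTION 'unknown source account %', src;
  END IF;
  UPDATE accounts SET balance = balance + amount WHERE id = dst;
  IF NOT FOUND THEN
    RAISE EXCEPTION 'unknown destination account %', dst;
  END IF;
  -- Poor man's debugging: print state, re-run, repeat.
  RAISE NOTICE 'moved % from % to %', amount, src, dst;
END;
$$;</code></pre>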
<i>>The fact that persistent storage has to happen at the same place to be reliable is only incidental, not a defining attribute of a relational database.</i><p>Persistence happens to processes whose state is expensive to reconstruct. Measurements of the world require time travel to recover, and so are quite expensive to acquire! "Incidental" implies a weaker correlation than is warranted.<p>As an aside, ~100% of data models are incapable of modelling contradictory or ambiguous measurements, and they fail to adequately model alternative normalizations and integration into whatever model of "truth" you pick. These systems fall over entirely when any part of those tacit and underspecified constraints changes.
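A sketch of the contradictory-measurements point (table names are hypothetical, assuming Postgres): the usual schema stores "the" value per entity and so cannot hold two disagreeing observations, while keeping every observation pushes the choice of "truth" into a query-time policy that silently changes meaning when the tacit constraints change:<p><pre><code>-- Usual model: one value per entity; a conflicting measurement
-- can only overwrite or be rejected.
CREATE TABLE sensor (
  id          int PRIMARY KEY,
  temperature numeric NOT NULL
);

-- Model that admits contradiction: keep every observation and its source.
CREATE TABLE sensor_observation (
  sensor_id   int,
  observed_at timestamptz,
  source      text,
  temperature numeric,
  PRIMARY KEY (sensor_id, observed_at, source)
);

-- One possible normalization into a "truth": latest reading wins,
-- ties broken by source name. Change this policy and "truth" changes.
SELECT DISTINCT ON (sensor_id) sensor_id, temperature
FROM sensor_observation
ORDER BY sensor_id, observed_at DESC, source;</code></pre>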