It looks like a rather oversimplified model: the examples assign the same overhead to any component, transformation, or feature, so anything monolithic, or anything that can be grouped together, is favoured, even if it's buggy, clunky, or poorly supported. Transformations may be awkward (lossy, forced to make assumptions and invent data, or dealing with poorly specified formats), which makes them unequal as well, though it's often arguable which ones are simpler. Features may have different priorities, but even if they don't, I'd expect them to come from requirements, not technical decisions; if they were up to technical decisions, that'd be another way to hack the resulting value by adding useless ones. And ignoring all that, the model just says that systems with fewer components are generally simpler.<p>That seems to be the case with most talk about simplicity: pretty much everyone agrees that it's good, but disagrees on what it means and how to estimate it.
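To make the "hack the value by adding useless features" point concrete, here's a toy sketch. I don't know the article's exact formula, so this function and its weighting are my own guess at the kind of uniform-overhead scoring being criticised, not the real IMS:

```python
def naive_score(components: int, transformations: int, features: int) -> float:
    # Every component, transformation, and feature counts the same,
    # regardless of how buggy, lossy, or poorly supported it actually is.
    return features / (components + transformations)

# A focused design: 2 components, 1 transformation, 3 genuinely useful features.
focused = naive_score(components=2, transformations=1, features=3)

# The same design padded with 7 useless checkbox "features": the score
# improves even though nothing of value was added.
padded = naive_score(components=2, transformations=1, features=10)

assert padded > focused  # the metric rewards feature padding
```

Any metric that counts features without weighting their usefulness is gameable in exactly this way.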
I don't really understand the need for developers to find some real-world scenario or situation to use as an analogy.<p>Impedance mismatch is a physical thing; it creates physical problems grounded in the laws of nature. Having to map your database entities to a model that is shaped differently has nothing to do with anything physical; it's a mental construct.<p>If you have an impedance mismatch, what do you do? You insert something between the two mismatched devices to match the impedance.
Isn't that what all software does already anyways? Insert some mapping, converting, transforming layer between the two pieces?
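Pretty much, yes: an ORM, or any hand-written mapper, is that inserted matching layer. A minimal sketch with hypothetical types (the names here are made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class UserRow:            # the flat shape the database hands back
    id: int
    street: str
    city: str

@dataclass
class Address:
    street: str
    city: str

@dataclass
class User:               # the nested shape the application wants
    id: int
    address: Address

def to_domain(row: UserRow) -> User:
    # The layer inserted "between the two mismatched devices":
    # it absorbs the shape difference so neither side has to change.
    return User(id=row.id, address=Address(street=row.street, city=row.city))

user = to_domain(UserRow(id=1, street="Main St", city="Springfield"))
assert user.address.city == "Springfield"
```

The mapping layer doesn't remove the mismatch; it just concentrates the cost of it in one place.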
> You have just one system ( Kafka + KSqlDB). Note that we are ignoring Zookeeper.<p>That's not really something you can ignore, unless someone else manages it for you. (And then, of course, you need to take cost into account.)<p>Really, the model is completely ignoring the cost (time or money) of operating the storage layer.<p>It's an interesting idea, but without taking the full picture into account, I don't think it's very useful.
The Impedance Mismatch Score (IMS) notion is worth exploring. Novel, right? Off the top of my head, I don't recall a similar complexity measurement applied this way. Neat.<p>I like that IMS could potentially be applied to more than ORMs.<p>Someone should probably link to Ted Neward's (?) Vietnam of Computer Science.<p>Happily, there's a resolution to this paradox. There was a time before ORMs, without an impedance mismatch. The key insight is realizing that client/server is the root cause of the imperative-code-to-RDBMS impedance mismatch.<p>Other mismatches are likewise rooted in suboptimal mental models (metaphors).<p>That's just a teaser, since I'm using a pseudonym account. Maybe some day I'll get my shit together, polish and promote my FOSS projects.
How is adding features a reduction in impedance mismatch just because they come from the same vendor? For example, Redis TimeSeries is a separate setup — how does that reduce complexity?<p>Edit: fixed typo