I would also recommend the first chapter of "Principles of Computer System Design: An Introduction" by Saltzer & Kaashoek for a more general discussion of complexity in digital systems.
We have to delineate between environmental entropy and the entropy within the system. As for adding entropy, every action has effects that can simultaneously decrease entropy in one variable while increasing it in another.<p>Any system's future state is not entirely predictable. When making decisions, we weigh actions and their impact on the current entropy.
I feel like either<p>1) the author thinks there are obvious ways to decrease the entropy of the coin-flipping example and expects these to be so obvious to the reader that they don't need enumerating to exemplify the approach he has in mind<p>or<p>2) the author is pointing out that in the coin-flipping example the entropy is already so close to zero as to render efforts to reduce it absurd, and believes that this is so self-evident as to need no explanation<p>In other words, the entropy of this article is ~ln(2). I suspect some energy could be usefully devoted to reducing it.
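For concreteness, the ~ln(2) figure above is the Shannon entropy of a choice between two equally likely interpretations, measured in nats. A quick sketch (my own illustration, not from the article or the comment):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in nats: H = -sum(p * ln(p))."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Two equally likely readings of the article: ln(2) ~= 0.693 nats.
print(shannon_entropy([0.5, 0.5]))  # ~0.693, i.e. math.log(2)

# If one reading were far more likely, the entropy would drop;
# that is the sense in which clearer writing "reduces" it.
print(shannon_entropy([0.9, 0.1]))  # ~0.325, less than ln(2)
```

Entropy is maximized when the interpretations are equally likely, which is the joke: the article leaves the reader split 50/50.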
It's interesting joining a new startup and seeing how clean and organized it is, then joining a mature startup and seeing how disorganized it is. There are layers and layers of interpretations of what systems should look like, and they are all different. We often refer to this as tech debt, but there is always that one box no one wants to touch because of its age and potential importance, kind of like a time capsule from when the company was once clean and orderly. I've seen so much entropy.<p>But to the point of the article, entropy only enters when external factors are introduced, like new talent or shifting tech paradigms changing the landscape of a startup.