So there are two forces at work when it comes to evolution. The first is natural selection: changes may start out random, but the structures that happen to be beneficial tend to survive, hence there is a selection for them. The second is that those random changes are restricted to changes that can physically happen, which depends on the laws of physics and on the structure itself.<p>The laws of physics themselves favor the selection of certain structures, and then those structures favor specific selections of their own. It all keeps building up to more and more specific selections.<p>Thermodynamics itself always favors the creation of structures that dissipate free energy reservoirs, such as a star. Because heat death can't happen all at once, the system must move from one meta-stable state to another, with dissipating structures feeding other dissipating structures and so on. This is how entropy increases from minimum to maximum. There is, however, a minimum cost to maintaining a dissipating structure, so the feeding stops at structures that are left with barely enough free energy to maintain themselves. The maintenance cost in fact increases the further the structure is from the initial disequilibrium, the star in this case. These bottom-feeder structures are still thermodynamically favored, since there is free energy left to do something, but now the selection changes. Now it's the structures that create only a minimal amount of entropy, the thermodynamically efficient ones, that tend to survive and dissipate another day, however slowly.<p>And here comes the magic. It turns out that selection for thermodynamic efficiency is a selection for information processing, namely prediction. When a system interacts with its environment it has a state, and that state is changed by the interaction. Some of that change correlates with past interactions, some with future ones. It turns out that how much of that state, or rather that information (which is a physical thing like heat, because it needs a medium, and the same statistical mechanics describes it), correlates with future interactions is what thermodynamic efficiency comes down to. This is called information thermodynamics, or the thermodynamics of prediction (the key inequality is sketched further down).<p><a href="https://arxiv.org/abs/1203.3271" rel="nofollow">https://arxiv.org/abs/1203.3271</a><p><a href="https://arxiv.org/abs/2009.04006" rel="nofollow">https://arxiv.org/abs/2009.04006</a><p>When there is enough state space for complex structures, this thermodynamic pressure on bottom feeders is what creates information processing capabilities, including self-organizing structures like molecular machines that happen to approach 100% thermodynamic efficiency, and eventually things like DNA and nervous systems.<p>If you zoom out, it's all lower entropy fighting its way to higher entropy, creating things like stars, planets and us as a side effect. We are not the point, heat death is; it just happens to be the case that you need structures like us for heat death to happen eventually. Not specifically us, but dissipating structures. We are bottom-feeder dissipating structures, the worst of the bunch, but that also means we get to think about it, which is cool. We get to predict the heat death. Or is it the heat death predicting itself? :-)
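<p>To make the "efficiency is prediction" link concrete, the first paper above (Still et al., "Thermodynamics of prediction") proves a bound roughly of this form, where s_t is the system's state, x_t the signal driving it, I[.;.] is mutual information and beta = 1/kT (this is a sketch from memory, see the paper for the exact statement):

    \beta \, \langle W_{\mathrm{diss}}[x_t \to x_{t+1}] \rangle \;\ge\; I[s_t; x_t] - I[s_t; x_{t+1}]

<p>The right-hand side is the "nostalgia": the information the state keeps about the current signal that tells it nothing about the next one. The only way to push dissipation toward its floor is to throw away everything non-predictive, so minimizing waste and maximizing prediction turn out to be the same selection pressure.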
<p>To answer the question: we feel forces other than sound waves traveling in the medium we are in, and once a selection for prediction is going on, sooner or later you wind up with organs that help build computational models of the environment, to aid prediction, to aid thermodynamic efficiency, even at the cost of increased complexity.
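<p>If it helps to see that pressure with numbers, here is a toy sketch (my own, not from the papers above; the environment and the two "sensor" strategies are made up purely for illustration). The environment emits a sticky "signal" bit plus a fresh "noise" bit every step. A sensor that memorizes everything it sees carries about one extra bit of nostalgia per step compared to one that keeps only the predictive part, so its floor on dissipation (kT ln 2 per bit) is correspondingly higher:

    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(0)
    T = 100_000
    p_stay = 0.9  # how "sticky" (predictable) the signal bit is

    # Environment: x_t = (signal, noise); the signal bit tends to persist,
    # the noise bit is a fresh fair coin every step (pure unpredictable detail).
    signal = np.zeros(T, dtype=int)
    for t in range(1, T):
        signal[t] = signal[t - 1] if rng.random() < p_stay else 1 - signal[t - 1]
    noise = rng.integers(0, 2, size=T)

    def mutual_information(a, b):
        """Empirical mutual information (in bits) between two symbol sequences."""
        n = len(a)
        joint, pa, pb = Counter(zip(a, b)), Counter(a), Counter(b)
        return sum(c / n * np.log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
                   for (x, y), c in joint.items())

    x_now = list(zip(signal[:-1], noise[:-1]))   # input at time t
    x_next = list(zip(signal[1:], noise[1:]))    # input at time t+1

    # Sensor A memorizes the whole input; sensor B keeps only the signal bit.
    sensors = {"A (memorizes everything)": x_now,
               "B (keeps only the signal)": list(signal[:-1])}

    for name, s in sensors.items():
        i_mem = mutual_information(s, x_now)    # information about the present
        i_pred = mutual_information(s, x_next)  # information about the future
        print(f"sensor {name}: I_mem={i_mem:.2f} bits, "
              f"I_pred={i_pred:.2f} bits, nostalgia={i_mem - i_pred:.2f} bits")

<p>Sensor A ends up with roughly 1.5 bits of nostalgia per step and sensor B with roughly 0.5, even though both predict the next input equally well; the extra bit is the memorized noise, and all it does is raise A's minimum dissipation. That is the "keep only what predicts" pressure in miniature.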