So the argument is that machine intelligence isn't going to happen, but it's bad that we're trying?<p>Still, this idea is interesting: "In short, we can make the Singularity more likely by stupefying ourselves into becoming machines instead of simply seeing machines for what they are — useful tools." PBS Idea Channel just released a video that I think is related: <a href="https://www.youtube.com/watch?v=FLieeAUQWMs" rel="nofollow">https://www.youtube.com/watch?v=FLieeAUQWMs</a><p>What the author of this post and the Idea Channel video have in common is that both think of humans (or animals, for that matter) as something different from a computer - but what are we if not the excitation and suppression of electrical signals? Does the fact that we were grown rather than constructed make a difference?<p>In reality, we are just very complicated machines - learn enough about how we work and you could build your own. And the Singularity, specifically the melding of mind and machine, is a logical step along that road. Are there real concerns we need to be thinking about? Sure. Will the 2045 timeframe be accurate? Who knows. But I think we are going to see the lines blurred more and more as we approach it.