    Neuralink co-founder Flip Sabes doesn't get it.
It's not so much that I'm worried about *me* getting hold of one of these things. I'm really just worried about other people having them. Rotten people. Terrible people. The sort of people out on the street that I'd jump out a window rather than share a room with.

There's this idea that these things will make them better people. What a can of worms that concept is.

    But to Elon, the scariest thing the Human Colossus is doing is teaching the Computer Colossus to think.
This is the brute-force hack to achieve the same outcome. One way or another, the equation is the same: humans and machines collaborating to produce a result.

We can either level the playing field, so that each of us has implicit, private, immediate interaction with our own implanted systems, or we can defer to intermediary system administrators holding the keys to machine rooms around the world, portioning out bandwidth that lets each of us siphon off some compute time from a massive cluster, be it for algorithmic day trading, personal medical diagnostics, nuclear fission/fusion detonation simulations, image processing for astronomical observations, winning a Go tournament, or whatever.

But if an implant changes a person, from what do we derive our concepts of self and authenticity? Without those, how do we know we haven't died? How do we know that others are not animated corpses? What prevents us from becoming p-zombies?

What's the difference between a truly convincing RealDoll and a person, after this?