It was a necessary point in the world-building of Dune. Frank Herbert needed his world to have no AI or human-like robots, so he explored alternatives: human computers via Mentats, and Terminator-like infiltrators via Face Dancers.

Dune takes a fantasy-like approach to technology. Lasguns are rarely used, because firing one at a shield sets off an explosion on the scale of a nuclear blast. There are no robots, because of their past uprising.

This was a neat way to tie up loose ends and sustain the universe. Brilliant.
I've long held the first Dune novel high on my list of good books, but only recently have I gone through the following five (as audiobooks), and I highly recommend them. The scale of ideas and events absolutely dwarfs the first novel.
ChatGPT could probably write a better book than Herbert's son.

Seriously, all the Dune prequels are just *so bad*.

If you read the first two, three, or four Dune books and enjoyed them (5 and 6 get really weird), and then pick up one of the books published after Herbert's death, you will be very disappointed.
The Butlerian Jihad is an unlikely fantasy. Technological advance follows a Darwinian logic: if one group of beings decides to stop developing an advantageous technology, then almost always some other group will eventually develop it anyway, and will then probably outcompete the group that abstained. This is why stopping AI development would be extremely hard; you might as well try to get countries to give up their nuclear weapons. To pull off something like the Butlerian Jihad, the group opposed to thinking machines would somehow have to defeat a group that is perfectly fine with using thinking machines as assistants in the war against them.
And Butler himself:

https://www.bartleby.com/library/prose/1066.html

An interesting read if you haven't already read Erewhon.
I don't understand why people like to fantasize about the worst-case scenarios.

Sentient things appreciate each other. If you have a pet, you are almost certainly benevolent to it: you house it for free, you feed it, and you entertain it.

Even if we assume AI becomes a few orders of magnitude more intelligent than humans, why would AI treat humans differently from the way we treat pets?

I don't see the drawback in being housed, fed, and entertained by an omniscient AI (who may also enjoy posting cute pictures of their humans on the future AI social network).