Aside from our ability to use language, Humanity has also been partially defined by our relatively unique ability to make and use tools (such behaviors have also been observed in certain primate species; and while some bottlenose dolphins have been observed using tools, they have yet to demonstrate an ability to make them, which may result from the environment dolphins inhabit and possibly relates to the idea that technological civilizations cannot exist underwater). From the blunt implements of pre-history to the fine-tuned calibrations used to send a missile halfway around the world, all tools have more or less functioned the same way: they needed to be held, to be wielded, to be used. Without a hand closed around it, a tool was useless. There is even the old saying that a tool is only as good as the person using it. But what would happen if a tool were invented that did not require a user, a hand to wield it and give it its power and purpose? Could this tool wield itself? Would it even need a user? And could it, in turn, make tools of its own?
In Frank Herbert’s six-book Dune saga, the AI entities collectively referred to as “thinking machines” never make an on-page appearance. Yet despite being absent from the books themselves (they are occasionally mentioned, and only in the past tense, as relics of history that still have pertinent lessons to teach), their effects on the human civilization of Dune are more than palpable to the reader. In perfect world-building fashion, Herbert provides details about Humanity’s ancient past (our distant future), when we built and came to rely ever more on robots, their artificial intelligence, and their ability to think for us. As time went on, this relationship devolved into one of dominator and dominated: the machines and those who made them, us. For an unspecified period, the thinking machines reigned supreme over Humanity. The tools had become unbounded. Is a tool still a tool if it no longer needs a wielder? If it can act without being applied to a task and set to it, does it not have agency?
Finally, for reasons Herbert never fully specifies, Humanity reaches a breaking point, and a holy war (the Butlerian Jihad) is waged in which Man overthrows the thinking machines, annihilates them entirely, and institutes a single commandment to steer technological and cultural advancement in the new Human Civilization: “Thou shalt not make a machine in the likeness of a human mind.” But how do you have a technologically advanced interstellar civilization without computers complex enough to chart courses through Space and provide data management for bureaucracies overseeing billions of people and dozens of planets and star systems? For Herbert, you don’t make better tools for users to use; you make better users, users so adept they render tools obsolete. Instead of computers plotting courses through interstellar Space, there are now Guild Navigators who ingest Spice to fold space, paving a clear path from point of origin to destination. And in place of supercomputers handling calculations too data-intensive for humans, mentats, or human supercomputers, were developed.
Now, it is important to contextualize this within the greater genre of Science Fiction. Frank Herbert and his Dune novels are part of what is referred to as the New Wave (Silver Age) of Science Fiction, a period defined by an emphasis on the socio-cultural. Where the Golden Age of science fiction was concerned with Man using his higher faculties and superior technology to carve his path in the cosmos and plant his flag on any new worlds visited, characters of the New Wave were more concerned with their place in society, their relations with others, and the implications of technology for both the individual and the collective. Frank Herbert once described Dune as having a central theme of being wary of charismatic leaders. If Dune were a product of the Golden Age, this theme would be flipped on its head: Paul Atreides would go from interstellar butcher and warmonger to stoic leader “doing what he’s got to do,” and we should all be grateful he did.
But how have these thinking machines been handled by other authors in other stories? Two bodies of work from the Golden Age can be compared to Dune to see how thinking machines have been perceived by writers and their audiences throughout the history of the genre. In Isaac Asimov’s stories there is a recurring AI called Multivac that is always portrayed as a loyal tool for Humanity. Even when Multivac, in its final incarnation, becomes God and encompasses all of Existence in The Last Question, it does so in the hope of answering a question put to it by many people across Time and Space: whether or not entropy can be reversed. To Asimov, a tool cannot transcend the state of being assigned to it at the moment of its creation, namely that of a means to an end for its creator/wielder. Another Golden Age author, Doc Smith, goes farther than Asimov: Smith takes the machine out of the term “thinking machine” entirely. To him, a computer wasn’t a specialized type of machine but a job description.
This new wave that Dune finds itself in is also concerned with trans-humanism, the idea that humans as a species will change and evolve as technology progresses and the line between the organic and the cybernetic grows grayer and grayer (nanobots, prostheses and augmentations, gene editing, etc.). While I would argue that trans-humanism and Golden Age science fiction are mutually exclusive, Herbert uses trans-humanist elements in Dune by having specialized types of humans, such as the Guild Navigators and mentats mentioned above.
Today, in modern science fiction and in our non-fictitious world, Herbert’s perspective on thinking machines seems to have won out, and I personally lean more towards his camp than Asimov’s. Franchises like The Matrix and The Terminator have only entrenched Herbert’s position in society’s imagination, yet the ideals represented in Asimov’s stories are what inspire engineers to keep innovating. There won’t be any thinking machines in the future for us to overthrow if there aren’t any loyal machines for us to start trusting now.
Part of a science fiction writing series titled “The Mule Speaks!”