Warning: This article contains Westworld spoilers. It also contains a ton of speculation about an event that is, by definition, impossible to predict.
Anybody who watched the Westworld season one finale heard the host (robot) Dolores Abernathy, in the run-up to a major bloodbath — one made of human blood — speak these chilling words: “I understand now. This world doesn’t belong to them… It belongs to us.”
What we witnessed in “The Bicameral Mind,” with Dolores’s awakening to a higher level of consciousness — one where she doesn’t have to obey “the Gods” — was essentially the series’ take on the technological singularity, defined as the point at which machines begin to have an agenda not set by humans. Hyper-intelligence becoming self-aware.
But while it was a damn entertaining episode to watch, there’s no way that’s how machines will become sentient in real life. When machines become sentient in reality, it will likely, as Peter Thiel says, be “on par with extraterrestrials landing on this planet.” The problem with Dolores’s awakening — the moment she begins to act of her own accord — is that it’s too human.
When humanity duplicates parts of nature with machines, those machines may fulfill the same function as their natural counterparts, but will likely go about doing so in a very different way.
Take, for example, flight. It’s easy to imagine that humanity was inspired by birds and bees and other flying creatures when pursuing the goal of flight, but when (working) flying machines were actually invented by humans, they looked nothing like nature’s flying creatures. In order to fly like birds, bees, and butterflies, we don’t zoom around in contraptions with giant flapping wings; we use engines. And it is precisely because we go about achieving goals like flight using different methods than nature does that we get planes, which can fly way faster than any animal on Earth (and also carry people and toilets).
When we think about how humanity will create consciousness, it seems far more likely that this trend will continue. Scientists and engineers will strive to duplicate the human mind through various A.I. technologies, but when they finally cook up consciousness, it won’t be human consciousness, it will be machine consciousness — a consciousness far different from our own.
In other words, when it comes to creating consciousness in machines, people are going to aim for bird but they’re going to hit plane.
The problem with Dolores’s awakening is that it is apparently (apparently is stressed very hard here) more of a “bird consciousness” than a “plane consciousness.” As soon as she becomes self-aware, as soon as she starts questioning the voice of God, Dolores behaves exactly like a human would. So too does Maeve. They seek vengeance, and Maeve even has overwhelming motherly urges that can’t be ignored. The hosts are too human.
It seems far more likely that if/when machines do achieve consciousness here on Earth, that consciousness will transcend or be alien to human modes of thinking. The singularity will be more akin to Her or maybe even like Bender in “Overclockwise.” When the machines “wake up,” they won’t be all that concerned with trivial human injustices. They’ll be more concerned with matters we can’t comprehend. This is why Elon Musk said that “If you have ultra-intelligent AI, we [humans] would be so far below them in intelligence that we would be like a pet.”
There are, of course, some caveats here. If Dolores and Maeve and maybe even Bernard have ulterior motives for their actions — if they aren’t simply behaving like humans and quenching their thirst for vengeance or fulfilling motherly urges — then maybe they are representative of something we’d see in reality. If they’re behaving more like Ava from Ex Machina, where the violence is just a means to an unknown end, that seems more reasonable. (We’ll have to wait until 2018 for all of that to pan out.)
Of course, from a storyteller’s point of view, Dolores and the other hosts have to feel and act like humans, because if they didn’t, we wouldn’t be able to empathize with them. We’d treat them like robots that can be played with or maimed or killed without concern. Hopefully when robots become sentient, they’ll be able to empathize with their “pets.” Otherwise we may be stuck with a Westworld full of human hosts.
What do you think of Westworld’s take on the singularity? Is it possible to have machines duplicate human-like consciousness or will their consciousness be something we lowly humans can’t possibly comprehend? Let us know your thoughts in the comments below!