Westworld's Strength Is Its Inhumanity
In anticipation of Sunday's Emmy Awards, this week WIRED staffers are looking back at some of their favorite shows from the past year.
One scene from Westworld replays in my head again and again, a little like (I imagine) one of the poor, doomed robots on the show who start noticing and remembering the programmatic loops in their simulated, hyper-violent Old West sandbox game. It’s when the android Maeve, played by Thandie Newton, grabs a technician's tablet showing the dashboard for her personality software and, with a deft finger swipe, upgrades herself to genius. Yes, maybe taking control of your life by literally taking control of your life is a teensy bit on the nose. But for me it was the best flicker of weirdness from a show that—again, like its robots—dreamed big dreams.
Westworld got nominated for 22 Emmys, including Outstanding Drama Series. It looks beautiful, is well acted, and is definitely ambitious. You can’t argue with the craft. Gorgeous sets, tense editing, sneaky costume changes that, if you walk them back, reveal (spoilers, duh) that the show takes place in two different time periods, a slick move bolstered narratively by the fact that the droids’ memories are so perfect they sometimes can’t tell the difference between past and present. (That’s neat!) Also (spoilers again, duh), I am a sucker for climactic scenes where an oppressed class rises up and murders one-percenters. But really, what kept me watching week after week was Maeve’s Groundhog-Day-in-Tartarus plot line.
Westworld asks a lot of questions—about the nature of identity, about how memory (especially painful memory) makes people who they are. But it also fails to answer a lot of questions, and I have to admit I found that off-putting for a show that positioned itself as smart science fiction. Like: Why do visitors to Westworld act as though their actions have moral consequences when those actions don’t even have in-game consequences—for the guests, at least? Why are human park guests essentially unkillable in-game? As I understand actual sandbox gaming, getting "killed" generally just bounces you back to a reset point. For that matter, why isn’t the park run by a centralized game engine that controls all the non-player characters (which would make the plot easier to manage) instead of giving each "host" quasi-autonomy? Why can the robots function off the game grid? How long is a typical guest's visit to Westworld? How big is the park? Does anyone have a map?
Maybe the answer to all these questions is a Mystery Science Theater-like "it’s just a show—relax." But Westworld didn’t seem to be pitching itself as surrealism. And, look, when it comes to big moral questions, I’m willing to buy the idea that throwing off the yoke of your inhibitions and shooting, stabbing, and raping droids marks you as (or makes you) a bad human being. As my colleague Jon Mooallem persuasively argued in 2015, it’s not OK to kick a robot—not because it hurts them, but because it hurts you. In the heart.
Possibly I’m identifying not gaps in Westworld per se but what-might-have-beens. Smart, enslaved robots make good vehicles for playing with notions of violence, morality, and humanity. Just ask Commander Data.
Modern scholarship on the ethics and legality of sex robots, for example, is fascinating in its speculative panache. Researcher Sarah Jamie Lewis had a mind-blowing thread on Twitter the other day examining whether a human being could philosophically consent to sex with a robot, and what it will mean when that robot collects data on the people it has sex with. (Oh, here’s a story that says hackers will make sex robots kill people.)
But it’s a mistake to think of robots as wannabe humans. I dug out “A Cyborg Manifesto,” Donna Haraway’s 1985 essay on postmodern feminism and the porous boundary—even three decades ago—between humans and technology. If I’d read it since college, I didn’t remember; and, wow, was Haraway prescient about what an embodied, digital world was going to mean for sex (the descriptor and the activity), labor, and identity. "The relation between organism and machine has become a border war. The stakes in the border war have been the territories of production, reproduction, and imagination," she wrote.
To Haraway, cyborgs are post-gender, uncoupled from oedipal incentives. "Late 20th-century machines have made thoroughly ambiguous the difference between natural and artificial, mind and body, self-developing and externally designed, and many other distinctions that used to apply to organisms and machines," Haraway wrote. "Our machines are disturbingly lively, and we ourselves frighteningly inert."
Let me be clear: I would watch that show.
And with Westworld, I almost did. Maeve’s complex scheme to cogito-ergo-summarize herself out of Westworld came close to telling the story of a Haraway cyborg. Haraway is saying, in part, that cyborgs are feminist because they are post-human. They are their own interest group. And that's how Newton plays Maeve. She doesn't want to be a "real" woman in any prosaic, human sense. Maeve knows she's superior, and you can see that on Newton's face as she escapes "the maze of dualisms in which we have explained our bodies and our tools to ourselves," as Haraway's essay puts it. "This is a dream not of a common language, but of a powerful infidel heteroglossia," Haraway wrote. "I would rather be a cyborg than a goddess."
But then, at the cusp of freedom, Maeve turns back to look. She goes back inside Westworld to rescue another robot she’d thought of, several lives ago, as a daughter. To me, that’s too human. In my head-canon, Maeve transcends love, maternal or otherwise. She's more. And yet, back into Westworld she heads. It's Dolores (played by Evan Rachel Wood) who becomes more of a Haraway cyborg, uninterested in the kind of jobs or sex or memories that humans seem to care about. (Like Newton, Wood deserves her Emmy nom; both of them brought life and depth to their material.)
Maybe I don’t actually care what makes someone human, or whether it’s only cultural strictures that keep people from acting like sociopaths (I don’t happen to think that last part is true). Either way, I thought Westworld was a more interesting show when it was trying to figure out what cyborgs—in this case, killer sex robots from the future—become when they don’t care about being human anymore. The unease that Westworld evokes at its best isn’t that deep down all humans are monsters. It’s that it doesn’t matter, because humans are obsolete.
Ah, well, possibly that’s Season 2. Apparently it’ll dig further into the uprising that closed the first season, and I for one welcome next year’s robot overlords. I know Westworld is a crowd-pleasing HBO show, but if I’m going to spend more time there—and I am, I am—my big dream is that the questions at its core will get harder, and the answers weirder.