We’re comparing HBO’s fall series Westworld with Ronald D. Moore’s Battlestar Galactica and Joss Whedon’s least famous (but perhaps most thought-provoking) show Dollhouse.

Westworld is, hands down, the 2016 fall series I am most excited about. (Pitch comes in a close second.)

Debuting on HBO in October, the series is loosely based on Michael Crichton’s 1973 movie of the same name. It takes the basic premise — a Western theme park populated by robots — but humanizes the robots and amps up their implied mistreatment, with the intent of making viewers question where technology ends and humanity begins.

The series boasts an impressive cast led by Anthony Hopkins and Evan Rachel Wood, and comes from J.J. Abrams’ Bad Robot production house. And it seems to lean heavily on sci-fi show conventions that I already adore.

While I recognize that the pop culture exploration of AI vs. humanity is vast and all-encompassing, I want to specifically break down how I believe Westworld is building on the moral dilemmas set up by two recent TV series: Dollhouse and Battlestar Galactica. (Note: I could have chosen to talk about a number of other series as well/instead, but today the comparison is restricted to these two, to better highlight certain themes.)

The BSG connection should be fairly obvious: Both the original Westworld movie and the Battlestar Galactica (1978-79) TV series were born of the ’70s and ’80s fascination with robot technology and the (seemingly) imminent AI takeover of humanity (see also: Blade Runner). And if that wasn’t enough, Lisa Joy, co-showrunner of HBO’s Westworld, is set to write the Battlestar Galactica reboot movie! I think it’s fair to assume we can expect some creative synergy here.

The connection to Dollhouse perhaps doesn’t seem as straightforward — and one could argue that, if anything, it was Dollhouse that took its cues from the original Westworld. But I believe some of the questions raised by the underrated Whedon series will echo through the HBO show in significant ways.

So consider this my official plea to everyone who’s interested in Westworld: go back and watch these two excellent series, which (I believe) tackle similar questions, ahead of the HBO series’ October premiere.

On the edge of humanity

There are many ways to tell stories about humanoid robots, including last year’s insightful study in artificial intelligence Ex Machina.

But Westworld‘s robots aren’t robots at all; like the second-generation Cylons in Battlestar Galactica, they are designed to have humanesque bodies. As explained in a big LA Times feature introducing the show, “Grown in milky vats, their muscles genetically grafted strand by strand, they can drink, stutter, sweat, cough, bleed and ‘die.'”

While neither Westworld‘s hosts nor BSG‘s humanoid Cylons are human to the extent of Dollhouse‘s dolls (who were actual human beings with re-programmed minds and transplanted ‘souls’), Westworld‘s hosts appear to transcend robotics, much as the second-generation Cylons evolved beyond their robot predecessors. Essentially, unlike in the original movie/TV series, you can’t cut their faces off and expect to find wiring underneath.

Where the original Battlestar Galactica‘s Cylons were pure machine, the 2004 reboot series re-imagined its main antagonists as partly composed of flesh and blood, with robotics woven into their nervous systems and brains, linking them to a central hub of knowledge and memory.

In both cases, the fact that the hosts/Cylons’ emotions are programmed does not necessarily make them less valid than human emotions. Just different. It was in this complex definition of ‘humanity,’ and where the show’s characters chose to draw the line between human and machine, that BSG found its central conflict.

We were, for example, subjected to a scene of brutal mistreatment of one of the humanoid Cylons, justified by (some of) the human characters because of her lack of humanity.

By the sounds of it, Westworld will delve into a similar grey area (more on that later), challenging both the characters and the audience to find pity and sympathy for a being that was initially set up as ‘less than’ human because of its status as a human-made, technological construct.

Real or not real?

Which leads us to the big, central questions: Are AI emotions real? Are they as real as ours? Are they even comparable — and should they be?

In BSG, some Cylons were programmed to think they were human, and experienced — seemingly — genuine distress and confusion when they found out they were not.

Late in the series came an additional shock when the audience discovered that several key human characters had, in fact, been Cylons all along. The show had thus also ‘programmed’ the audience to think they were human before the big reveal, and this game-changer forced everyone (on screen and off) to re-evaluate what ‘humanity’ even meant, when illusion had so effectively replaced the ‘real’ thing for everyone involved.

Joss Whedon’s Dollhouse, albeit working with a slightly different premise, set up a similarly world-shattering surprise. In Dollhouse, real humans had their brains wiped and were implanted with digitally built personalities, so-called Imprints, that gave them moldable identities.

One of the series’ biggest shocks came when one character was revealed to have been a doll the whole time; once again, the audience was invited to ponder the blurry line between ‘real’ and ‘not real.’

Intriguingly, it appears Westworld could be setting up shockers in the same vein, to further blur the lines between human and host.

“It’s a show that you really want to pay attention to,” teases Evan Rachel Wood in the LA Times feature. “When this series is done… everyone is immediately going to go back and re-watch the whole thing.”

Could it be because key human characters are, in fact, hosts programmed to serve key functions within the theme park? That is definitely a tried and tested way of challenging the audience’s perception of man vs. machine (again, see: Blade Runner), and I wouldn’t be surprised if Westworld ventured down this same path.

A virtual playhouse with living sex dolls

The Westworld theme park runs on desire and fantasy; the hosts are designed to interact with visitors in the way that will bring paying customers pleasure.

Dollhouse operated on a very similar premise: They imprinted human hosts with tailor-made personalities to act out fantasies for paying customers.

(Unlike in Westworld, customers didn’t come to the dolls; instead, the dolls were kept in a neutral, passive state while in the Dollhouse, then sent out into the world to take on the ‘lives’ of their renters’ choosing.)

Echo (Eliza Dushku) was made to be many people, not all of them someone’s sexual fantasy, but there was never any doubt that the Dollhouse’s primary purpose was prostitution, the inhabitants of the bodies chillingly programmed to enjoy the interactions (with one disturbing exception).

Battlestar Galactica‘s Cylons differed here because they were generally considered ‘human’ enough to have free will, individual agency and awareness. (And the characters who thought otherwise, like the crew of the Pegasus, were generally construed as villains.)

But discussions of coercion and consent were central to the plot of Dollhouse, as the dolls were repeatedly made to do all kinds of degrading and illegal things, only to have their memories wiped and wake up unaware of what had happened to them. Some characters thought this an inherent moral violation, while others, with whom we were also made to sympathize, thought that, since the dolls’ original personalities had signed a contract, they were well within their moral rights to use the dolls’ bodies freely.

It was a complex issue, which the show never attempted to provide a singularly ‘right’ answer to, and it was therefore endlessly fascinating to explore as the show went on.

A question of consent

The idea explored in Dollhouse — of creating living sex dolls with personalities to match — echoes in Westworld, where the purpose of the humanoid hosts is exactly the same: to fulfill paying visitors’ fantasies, whatever those fantasies may be.

Says co-creator Lisa Joy, “I think it goes back to the notion of romantic love, from the earliest myths, from Pygmalion and Galatea. You fall in love with this inanimate creature that you imbue with all your hopes and dreams. [The guests] come to feel love, to feel special… Another human can’t supply that because humans aren’t made to service each other’s fantasies.”

Intentionally or not, Joy’s words echo the description of the Dollhouse offered by its leader, Adelle DeWitt, who described a doll as, “A friend, a lover, a confidant in a sea of enemies. Your heart’s desire made flesh.”

While Dollhouse certainly grappled with questions of consent, it seems like Westworld — allowed to get much darker and more explicit on HBO than Dollhouse was on Fox — will take this issue several steps further.

One of the hosts will be subjected to sexual assault, according to the LA Times, as part of a guest’s fantasy. Dollhouse carefully toed that line by establishing a policy that the dolls must never be hurt or agitated by their clients, and its tackling of rape storylines was therefore much less gratuitous or visual.

But rape storylines were tackled, and one main character had not one, but two separate, very different storylines in which what happened to her was identified in the narrative as rape.

In season 1, Echo discovered that her fellow doll Sierra (Dichen Lachman) was raped by her handler while in her doll state; this act was universally deplored and condemned by all other characters on the show, DeWitt had the handler killed, and the story beat ultimately served to cement the dolls’ right to humanity and dignity even in their wiped state.

The second storyline was both much darker and more complex. In season 2, we learned that before Sierra had her mind wiped to become a doll, her ‘real’ self, Priya, had rejected a wealthy man, who had then forced her into the Dollhouse against her will. (Meaning that, unlike the other dolls, she had never in fact agreed to anything that happened to her once ‘owned’ by the Dollhouse.)

From that point, he would ‘rent’ her original imprint regularly, with one key alteration: She now worshiped him, allowing him to fulfill his twisted fantasy and establish a sense of ownership over her.

It was one of the show’s darkest storylines, culminating in a horrified Priya (now fully aware) confronting and ultimately killing him, getting her revenge but unable to erase everything that had been done to her.

By all accounts, Westworld will travel down the same narrative path, but it’ll be much more on-the-nose about this aspect of humanoid servitude due to HBO’s higher violence/nudity tolerance.

I’m not sure what kind of effect a rape scene will have on the audience in the wake of Game of Thrones‘ recent controversies (and reportedly, reviewers didn’t react positively to the scene), but Joy asserts that it is not about “fetishizing” assault.

“I think part of the way you tackle [violence toward women] is by addressing it,” Joy tells the LA Times. “It’s hard to watch and it should be because it’s terrible and because it really happens and it happens a lot.”

Carving a personality from nothing

The central question of consent in Dollhouse was framed by the show’s treatment of ‘the self,’ repeatedly asking what gives a human her humanity: Is it the body, or the mind?

After all, it was the doll-state of Echo that ended up becoming sentient enough to be considered her own person; her original personality ended up as just one of many imprints she could access at will, but it was no longer her.

Echo was a blank slate — a body without a ‘soul’ — who carved out a ‘self’ from what the Dollhouse creators would have called ‘nothing,’ filling herself up with personalities, most of them manufactured, all of them a part of her being.

In Westworld, it seems our central protagonist Dolores (Wood) finds herself facing a similar dilemma. She is a host who gains consciousness beyond that which her ‘maker’ has given her for guest interactions; it is Dolores, not an imprint her body has been given, that begins the fight to obtain free will.

Welcome to ‘Westworld’: Continuing the conversation about free will, consent and humanity

Both Battlestar Galactica and Dollhouse explored deep questions about the nature of humanity by pitting the virtues and flaws of actual humans against AI beings whose personalities had been manufactured. What is ‘real’? What is ‘legitimate’? What is ‘human’? Those were just some of the complicated questions posed and explored — but not definitively answered — by these series.

What intrigues me about Westworld isn’t that it’s reinventing the wheel in this regard (although I do love the Western twist), but that it will likely go much deeper and darker than either BSG or Dollhouse did, adding yet another perspective to the ongoing conversation.

The infantilization of AI in pop culture is something I find endlessly fascinating: because self-aware (or seemingly self-aware) mechanical beings often lack the means of self-preservation, we are inclined to feel sympathy for them, much as we would for a child, a pet or even a very elderly person. We feel protective of those we sense need help to survive, and we dislike seeing them suffer, even if, in the case of AI, that ‘hurt’ is not necessarily comparable to the experience of a living creature.

Because, as humanity’s ability to simulate reality increases, we are forced to constantly question what the word ‘real’ even means, and to re-examine our place in the universe. Are we ‘more’ sentient than AI simply because our sense of self came about in a different way? Are programmed emotions less valid than ‘real’ emotions just because they’re different?

Dollhouse and Battlestar Galactica are two of my all-time favorite TV shows because they pose these questions and explore them in intelligent, layered ways. There are no right or wrong answers to find; both Cylons and dolls are made to evoke sympathy, and yet the ‘traditional’ humans in the stories — while imperfect and sometimes downright evil — are right to fear and question the unreality of AI, especially when it comes into direct conflict with the lives and freedoms of actual human beings.

How will Westworld add to the conversation? What will the series borrow from existing AI fiction, and what new questions will it pose and ponder?

I can’t wait to find out!

‘Westworld’ premieres Sunday, October 2 on HBO

Feel free to share your own recommendations for sci-fi series that tackle similar themes in the comments!