Recommendation: Dollhouse

"When altering one's mind becomes as easy as programming a computer, what does it mean to be human?"
"Ghost in the Shell"

“Dollhouse” is an interesting and undervalued series. Strangely enough, it is another of Joss Whedon’s short-lived ones (a bit like “Firefly”).

The series deals with an organization that uses actual human beings (“dolls” or “actives”) and imprints them with specifically designed personalities, memories, and skills for the “needs” of its clients. A person who works for the organization as a doll has agreed to have his/her “self” downloaded and stored. When on assignment, the person’s self is completely replaced with whatever the clients want for the job. When not on assignment, the doll is reduced to a docile being and cared for. After five years of working for the organization, the original personality with all its memories and skills is restored and the person leaves with a lot of money.

“We’re pimps and killers. But in a philanthropic way.”
“Dollhouse”

Sounds like an interesting way to make money and a very nice service for the clients. Imagine you had lots of money and you could get a person to be exactly the kind of person you “need”. Fulfill a fantasy, service an emotional need, or get an incredibly qualified professional. A person who is not playing a role, but actually is that person. Only temporarily, not in a “The Stepford Wives” kind of way. But despite the temporary nature, they aren’t lying. They are convinced that they are that person, which includes emotional reactions and skills. It is real for the dolls and indistinguishable from a “real” person for the client.

This scenario allows for some interesting views on human nature (mostly ethical ones). And there are multiple perspectives here — the “doll” who volunteers and then sacrifices everything for 5 years without noticing it, the clients who are able to use these services for their personal needs, the other employees of the organization trying to ensure everything runs smoothly, a corporation having access to this technology (which gets more advanced over time), the general public dealing with this technological breakthrough, etc. Some of these issues are current ones and some might become actual issues in the future.

For example (spoilers):

What makes a person a person (incl. immortality & multi-presence)

One of the basic questions is what makes a person a … well, person — when you can just create/rewrite personality, memories (incl. feelings), and skills. Is that all there is, or is there something like a soul? Something that remains?

It’s a bit like the transporter in Star Trek — is it just travel (or rather: a way to facilitate the plots), or is it the destruction of one person for the creation of another?

And if we aren’t more than what can be downloaded and copied in other people’s bodies (or on hard-drives) — does it matter when the body dies? As long as the scan survives as a backup? Can you achieve immortality this way by going from body to body?

Or can you simply create duplicates of yourself — at least the personality, not the body? Spreading one personality/consciousness to multiple bodies? Living multiple lives at the same time that start from the same point of your life?

And what about issues like dementia — suddenly fixable? Or can you only go back to prior backups of your “self”? Which would result in an increasing difference between the world you know and what the world is (imagine going to bed and waking up 40 years in the future). What can you offer in the remaining years before the brain deteriorates and new scans won’t be useful?

And do artificially constructed personalities have the right to live — even if they know they were constructed? They still might want to continue to exist. For them, their lives with all the constructed memories and emotions (and needs) are real. That’s the point of creating them.

And can these issues be solved with artificial bodies? Perhaps by limiting the number of bodies you can use at the same time (a one-body policy akin to China’s one-child policy)?

But wouldn’t this reduce a human being to just a few ones and zeroes?

Self-determination and personal responsibility

What can people decide for themselves? What should other people decide for them?

Can a person sign away his/her self for a 5-year contract and be nothing more than a doll that is programmed to serve other people with everything they have? Who genuinely strives to be the best person they can be in service of paying customers (no employer could hope for more)? To experience created emotions as real? Even love, or hate?

To enter into what could be viewed as consensual slavery (which would be an oxymoron in most contexts)? To kill on demand, or prostitute themselves? BTW, is it prostitution when the person is living a constructed life (I really think the free-will decision should take precedence here)? Is it possible to make a voluntary decision to have no more voluntary decisions for 5 years, yet think of all future decisions as voluntary? What if you have reached a low point in your life where you crave a way out, out of your life, out of your consciousness? And that low point suddenly becomes 5 years?

Is that a free decision? Or is it just a way to make money by escaping one’s memories and one’s past? To give oneself up for 5 years for money? To wake up after that time, perhaps with painful memories removed (e.g., taking the sting out of the memory of losing one’s child shortly after birth)? Can one ditch responsibility for one’s past actions? When you can just … stop going on and not really kill yourself, but put your mind in storage and give your body to people who care for it?

And how do people who work as dolls deal with the realization of what they are — that they are constructed, not actually “real”? And if the constructed personalities are just disassembled or deleted later, can you really trust an organization to restore your “self” after the contract is finished? And not to leave, e.g., backdoors, or worse, continue using you? Esp. when no-one knows where you are, or that you still exist somewhere?

Determination of others (incl. mind control)

What about the consequences when you can control others, make them into the person you wish, or need, to have? Not only the temptation of wish-fulfillment. There’s also the way to deal with the past by reliving it, by having that conversation with a highly relevant person(ality) you could never have. Or a person changing him- or herself by getting knowledge and skills this way?

And when does persuasion blur into manipulation? What happens when interactions can be staged without any chance of detection, because the actors/actresses really are the roles they are playing? When no lie can be detected, because the people act genuinely?

And — to address the elephant in the room — what about outright mind control? When not only personality, memories, and skills can be programmed, but also emotions like trust? What about people who misuse that trust — again and again?

How does it compare to more traditional ways of controlling people, e.g., force, financial coercion, or social influence? What about the ways cults use? Or the tactics used by sexual predators — who might get their victims to stay put even when they have the chance to flee? Not to mention socially accepted ones like alcohol, or even … tea(1). And how about more socially praiseworthy ones like helping people to move beyond an addiction?

And what are the consequences of treating human beings as things? When you dehumanize them? When free will turns into clay? When you treat them as toys, or as a way to deal with personal problems? When you can use the mind and soul of another human being to completely fulfill your needs? Whether that is having the perfect “date”, getting a loving mother for your child, or getting someone who questions your decisions to keep you on your toes?

But what if you want more than a short-term arrangement? When it’s so easy to program the perfect partner? To get a person to be what you want, while this person is convinced it’s his/her free decision to act this way? That this person has the illusion of personal self-determination and control?

And what happens when you go beyond single individuals, when the whole world becomes a dollhouse to play with? When the technology gets perfected to control minds? More than mere mind control, actually — it goes beyond controlling an existing personality and creates a new personality that overwrites the old. When you can reprogram minds with the push of a button, as technology gets more and more powerful and scales quite nicely? When consent seems so optional, given that you can make people consent so easily?

“You don’t need a translator when speaking directly to the brain. … And it was just one phone call. One robo call to a city. That’s all it takes. An entire army in a single instant in the hands of any government, and boom. We went boom. Millions programmed to kill anyone who’s not programmed to kill anyone. And then the war has two sides, those who answered the phone and those who didn’t. You know what, don’t answer the phone. … So which is worse? Pick up the phone or don’t pick up the phone? I can’t tell. It’s an interesting question. An entire army in a single instant. That’s all it takes. That’s brilliant. That’s so brilliant. Why didn’t I think of that? Did I think of that? Did I? Oh, God. Oh, my God. Oh, my God. Oh, my God. If I can think I can figure things out, is that curiosity or arrogance? Oh, my God. I know what I know. I know what I know. I know what I know. I know what I know.”
“Dollhouse”

Technological development and Profit/Power

The idea of cross-financing world domination via mind control by serving the needs of people with lots of money and influence is a nice one. Especially given the high demands of these (stereotypical) clients to continuously improve the technology. Going beyond what was previously possible and doing the work in a cost-effective way. Perhaps sometimes actually better than publicly funded research, with few checks and balances.

And it’s interesting to consider what happens when tech that works well on an individual level and is limited in scope (e.g., due to high costs and expertise requirements) gets “mainstream”. What happens when playthings for the rich become mass mind control and a way of cheating death? And when that technology becomes widely available for many players, including “highly motivated individuals”?

“Maybe I should just go above ground.”
“Don’t talk like that. You know how dangerous it is. The tech’s gone wireless. People are stealing bodies left and right.”
“Dollhouse”

And what about corporate misuse of personal data? When companies collect data and use it for other purposes? Here it’s brain scans, used to get other people’s personalities and memories without their knowledge and consent. A real-world example would be access to genetic information via blood samples (e.g., because you have access to blood banks or lab samples).

“We have access to over a hundred thousand brain models. Every scanner Rossum makes sends a full 3D map to our mainframe. In three years, we’ll have a million.”
“Dollhouse”

Conclusions

So, among other things, the series raises some interesting ethical questions and perhaps even offers some lessons. About the power of wishes (or “needs”), people’s ability to choose … unwisely, and how wishes can be fulfilled in unexpected ways (e.g., the main character wishes to travel and save the world, and does so as other persons). About solutions besides that seemingly straight line to the fulfillment of needs — e.g., therapy, conflict resolution, being open about one’s needs, and being able to see what other people “need”. And, of course, what happens when you play with matches in a complex environment with many competing agendas and flawed individuals.

[seeing a post-apocalyptic city]
“They really thought they were helping, huh? Giving people what they needed. Is this what we needed?”
“No. Kids playing with matches and they burned the house down.”
“Dollhouse”

All these topics aren’t really new and have been addressed in other media. There are a couple of interesting influences (I think), e.g., Alias, Nikita, The 6th Day, Ghost in the Shell, Pretender, Eternal Sunshine of the Spotless Mind, and a few others. But I like their take on it. Esp. how these things could get started and improved to “mass-market” capabilities using rich people’s desires.

And of course, there are writing issues. Most characters are tropes (ex-cop, agent, nerdy amoral scientist, nerdy doctor, bitchy boss, “the rich” in general, etc.) and there are the usual stereotypes. For example, men only thinking with their dicks, being unable to handle, e.g., nudity as distinct from sex (really, co-ed showers a problem?) and having problems with intimacy. Expect the usual collection of stereotypical scumbags and assholes. *yawn, really nothing else but white knights and scum of the earth?* Then there are plot holes, as with any use of a secret society in a story (really, all those service and maintenance personnel able to keep the existence of that facility secret?). The motivations also are fuzzy (on the one side, pro-bono work and the overall intention of doing good, on the other side treating people — well, as things and putting them “in the attic” if they get damaged or broken). And of course, I wonder about the return on investment. With all the technology requirements and expenses, I doubt it would keep a facility afloat. But then again, I never understood how airlines can make money considering the costs that are involved and the prices of tickets.

Most suspension of disbelief is required for the technology, but as with any science fiction, there are issues. And it’s social science fiction, so how the tech is supposed to work is not really the issue. Still, given what little I know about the brain, I’m not sure how the retrieval and storage of memories could work, or how something like a personality functions and would be amenable to edits (not to mention skills). Then brain development is an issue (e.g., imprinting an adult brain on the brain of a child could be difficult), and I don’t think there is something like a blank slate. And as with much of neuroscience, it’s pretty hyped (well, they have pretty pictures, I guess they learned from astronomy). It’s the Nuremberg Funnel in overdrive, with a suction pump first. Something that would rather fit into a 2360s “Star Trek TNG” world than the present.

But if you suspend your disbelief, it’s interesting, and in this sense even the tropes have their uses: they let you focus on the ethical questions.

So really worth a look.

 

(1) Haven’t finished watching the second (and unfortunately final) season yet, but I wouldn’t be surprised if the tea the leader of the house serves her guests contains some … calming or influencing drugs as well. Would put a nice spin on the issue of the people signing up to be a doll for five years “voluntarily”.

2 Comments

  1. Daniel,

    Dollhouse was a fabulous series. I somehow never realized that it was from the creator of Firefly (which I also liked!)

    Luc

  2. Apparently he is loyal to his actors and created the series for (with) Buffy’s “Faith” actress (I think at least one actor from Firefly also appeared). Sad that it was cancelled after two seasons; there was a lot of unused potential still in it. I think the main criticism I have is the strong focus on fighting (which makes sense given the actress’s prior role). Instead, I would have liked to see more of how different personalities deal with different situations (and only a few with violence). But still, I highly enjoyed the series and, even better, it made me think.
