We sometimes hear stories of innocent suspects falsely confessing to crimes. They fail to get a lawyer in time and become entangled in probings and accusations until fact and fiction blur and their minds create a new reality for them – one in which they’re guilty.
People can experience this self-delusion in a less drastic way, as well. How many times have you stood in front of the mirror and told yourself you’re a strong, independent woman, when in reality you don’t earn enough to live alone and can barely deadlift fifty pounds? The idea is that if you say it enough, you’ll start to believe it.
It works with negative things, too. When people ask me how wedding planning is going, especially in terms of decorations, I have found myself repeating the mantra, “I have poor taste in design” (not because it’s true, but partly because I don’t enjoy decorating and am trying to slough the burden off onto another person, and partly because I want an excuse for not having gotten further along on this project), until I am now fully convinced that I’m the last person in the world who should be making decisions about my own wedding decor.
It’s especially easy to rewrite a child’s self-esteem. Even simple suggestions can cause lasting damage. For example, if you imply to a child that everyone in the family is bad at math, you’ve given them reason to think they will be bad at math too, and it can teach them to expect failure. If this supposed obstacle is repeated to them often enough, math might seem too daunting a task to bother overcoming at all. And that is how you end up with a lot of English majors.
With these things in mind, I would like to make a proposition regarding a well-loved character from a well-loved sci-fi show. A few years ago, Netflix graced us with the full body of Star Trek shows, including all seven seasons of The Next Generation. Being a Trekkie and infatuated with Commander William T. Riker from childhood, but bound by the restraints of cable, reruns, and bedtime, I was excited to finally see the episodes in order and in their entirety (and commercial-free!). My friend beat me to the punch and I ended up watching many episodes sporadically during his binge sessions; however, my partner and I started from the beginning last fall and are now on Season 6.
*Be advised: spoilers and extreme, unabashed geekiness of the philosophi-sci-fi variety to follow!*
There are certain TNG episodes that immediately get you excited (for example, Q episodes), bummed (Geordi episodes), or scared (Borg episodes). But nothing beats the realization that you’re about to watch a Data episode.
Lieutenant Commander Data is an android and the first artificially intelligent lifeform to become a Starfleet officer. He put himself through Starfleet Academy and over time earned his position as third in command aboard the starship Enterprise, where he serves as Chief Operations Officer. He was given human functions by his maker, as well as the ability to reprogram himself, i.e., to adapt to his surroundings. One of the very few things he is incapable of is human emotion – a fact he reminds people of at the slightest provocation. Data makes attachments (read: friends) by growing accustomed to people’s presence and idiosyncrasies. He has sex. He, in a manner of speaking, reproduces. He mimics art and music and even laughter (sort of…), but he is not capable of experiencing or sharing the passions behind them. And in this one aspect Data falls hopelessly short of humanity.
Or so he is convinced.
I, however, am convinced that Data can – and does – feel.
In Season 4, Episode 3 (“Brothers”), Data finds out that his maker has built an emotion chip specifically for him, one that will finally allow him to feel the human emotions he has wished his whole existence to experience. This immediately becomes problematic for me. If he’s programmed to be human but is not given human emotions, is it also part of his programming to notice that lack and to strive to fill it as something that would make him complete? And if that is the case, in what way does this noticing-a-lack-and-striving-to-fill-it differ from desire, the essential human urge at the root of emotions like love, lust, and loneliness?
The beauty of a story built over the course of 178 episodes is that you really get to know the characters. And the beauty of this story also being something that you watch is that you can pick up on visual hints from the show which, in writing, would be implicit at best. You get to watch Data’s face as his maker offers him the emotion chip. And you get to watch his face when, in Jacob-and-Esau fashion, he finds out that his evil brother has duped their maker into giving it to him instead.
Did I simply impose that flicker of hope on Data’s face? Did I project his devastation? Maybe so. In the most recent episode we watched, Data asks his best friend, Geordi, whether his original poetry elicited an emotional response, and Geordi doesn’t answer right away. Data says, “Your hesitation suggests you are trying to protect my feelings. However, since I have none, I would prefer you to be honest.” After hearing him repeat this spiel again and again over the course of the show, even to people like Geordi who surely know it by heart by now, it begins to sound like self-delusion.
To be human comprises what one does and what one feels; to be a robot comprises function without feeling, utility without self-consciousness. As an experiment, I asked Siri a few formulations of the same question.
What differs between Siri and Data is that Data is sentient. He’s alive and he knows it. He has thought about having human emotions; in fact, his pursuit of them consumes the majority of his free time (which is considerable, because he doesn’t need to eat or sleep). We often find him in his quarters painting or talking to his cat, Spot, or working on his Shakespearean acting skills or practicing his laugh.
One reason Data episodes are so beloved in our household is that they’re extremely touching. Despite being rather stoic, my partner usually tears up during these episodes and expresses his love for the android. Are we touched because Data can’t experience love even though he is undeniably lovable? Are we sad because he can’t feel how tragic his story is? I, for one, am touched and sad because he has convinced himself that he cannot have the full human experience, when it is clear to those close to him that, despite all his automated claims to the contrary, he already experiences a variety of emotions: friendship, loyalty, amusement; desire, disappointment, isolation; and affection toward his cat.