I would definitely say that the first episode of Star Trek that really stuck with me was “The Measure of a Man,” and it’s only come to mean more to me over time. Those of you who know me personally probably won’t find it surprising that one of my favourite Star Trek characters, and one I identified with as a child, was Mr Data. My favourite episodes of TNG as a child featured Data, and a lot of them are still personal favourites today. I first saw “The Measure of a Man” when I was five years old, and while I didn’t understand everything that was going on at the time, I knew I was watching some amazing television.
I don’t think a discussion of an episode of TNG that’s nearly thirty years old can be spoiled, though I’ll still add a spoiler warning for the sake of politeness. After all, Vaka Rangi and TARDIS Eruditorum don’t worry about spoilers, so neither will I. Starfleet robotics scientist Cmdr Maddox visits the Enterprise with a request: that Data take part in an experiment to replicate his positronic brain, which would require downloading his memory banks to an external hard drive and disassembling his brain and body. When Maddox can’t guarantee that he’d be able to put Data back together again, the android rightfully refuses, at which point Maddox produces a transfer order and refuses to let Data resign to escape the experiment, on the grounds that, as a machine, he is the property of Starfleet and has no rights.
When Picard asks the local starbase JAG for a trial to decide Data’s fate, he’s pressed into service as defence advocate, and Riker is ordered to prosecute. Although Riker successfully argues that Data is a machine, Picard and Data demonstrate his sentience and self-awareness. Maddox justifies his desire to build a race of thousands of Data-like androids by casting them as probes, to be sent into hazardous environments to do the dirty and dangerous work that organic beings could not rightfully be pressed into. And because the hearing would have ruled them machines, they’d have no right to object. Since building a race of robotic slaves would go against all the basic ethical principles of the Federation, the JAG grants Data his rights, and he immediately and formally refuses to participate in Maddox’s work.
|Cmdr Maddox is certain that a bundle of electrical nets and gears is incapable of true thought, of understanding anything but the syntax and functions of poetry.
One aspect of the philosophical issues that inform this story is terribly obvious: the basic questions of philosophy of mind. All of the explicit conversation throughout most of the episode revolves around whether Data, as a machine, is really capable of thinking. At one point, Maddox thumbs through Data’s copy of the complete works of Shakespeare and asks him, fascinated and pathetically pleading at once, whether they are just words to him, or whether he genuinely understands their meaning and significance. Data responds with a chipper affirmative, but Maddox still calls him ‘it’ for the rest of the episode.
This is the oldest issue in philosophy of mind. Indeed, it’s probably the definitive question of the whole sub-discipline, which was founded as the philosophical wing of the cybernetics research community, where the possibility of creating a genuine artificial intelligence with computer technology was paramount. For decades, its major problems revolved around what constituted the mind, because answering this ontological question would determine whether passing the Turing Test could count as proof of a mind.
The Turing Test, named of course after its first formulator Alan Turing, was a thought experiment holding that a machine would have achieved genuine intelligence if it were capable of pulling off a lie to a person about what it was, via exchange through a medium, like text messages, that keeps the human from seeing it. The lie is usually depicted as the computer convincing a human conversation partner that it’s a human too. So a lot of the debate revolved around whether external activity was enough to guarantee the existence of a mind. Many thought experiments produced in response attempted to pose the question with computers, (very technically specified) zombies, non-human animals, extra-terrestrials, convoluted translation apparatuses, and, naturally, androids.
Maddox treats Data just as some philosophers have conceived of mind: if it isn’t human (or at least organic), then it’s just a bundle of gears and mechanisms that spits out very convincing responses to stimuli. Unfortunate for Data. But fortunately for him, this conception rests on a premise that is itself up for philosophical debate: whether some aspect of the physical human organism (our squishier gears and mechanisms) produces all that we call mind, or whether there is some immaterial aspect to the human mind’s constitution. If we’re entirely material, then a creature like Data could exist, provided a machine (like the fictional positronic brain and neural net) could accomplish all that the human brain and perceptual apparatus does, or more.
Since all of the empirical evidence for the existence of human minds is our behaviour, Data’s behavioural demonstration of sentience and self-awareness is enough to declare him truly self-aware and sentient.
And the best part is, these ontological arguments are thrown out entirely for an ethical question. If we had an army of androids available to do terrifyingly risky tasks and labour, to be treated as disposable people, they would essentially be slaves, demarcated by their race: android. For people who know the horror of slavery, creating such a system with full knowledge of what you were doing would be intolerable in the deepest sense.
There is another ethical question in this episode, though, one that animates my own plans for fiction about my Alice character. She is an android who has overcome the human morality of resentment, who feels no impulse to punish offence or wrongdoing, and seeks only to repair the damage to both the wronged and the wrongdoer. Well, in this episode at least, Data got there first.
|Will Riker is forced to argue that one of his best friends is not even a person.
Remember that Cmdr Riker was drafted to play the part of the prosecution, and he played it so well that if Picard and Guinan hadn’t thought of shifting the case from ontological to ethical ground, Data would have been taken apart and died. By the end of the episode, Riker is miserable, because his actions nearly cost his friend’s life. But Data is not offended; he’s grateful. If Riker had refused to prosecute, or to do his level best to prove Maddox’s case, the JAG would have ruled against Data’s having any right to self-determination. So Data tells him, “That action injured you, and saved me. I will not forget it.”
That’s a greater wisdom than most humans possess: the ability to see past the immediate appearance of acts to their larger consequences and results. And with these in mind, he takes no offence, and praises his friend for the openly offensive acts that the situation had made necessary to save his life. His morality is guided by his knowledge. Data the character often spoke of his desire to become more human. Well, in this episode at least, he became more than human, better than human.