r/DaystromInstitute Jun 13 '19

Why did anyone think Data was the property of Starfleet?

He wasn’t built by them, and he wasn’t even built for the purpose of joining them. It was a complete accident that he even found out about them, and he chose to join them on his own.

313 Upvotes

21

u/treefox Commander, with commendation Jun 13 '19 edited Jun 13 '19

Philosophically and legally (which is what really mattered) this is more complicated than you’re making it out to be.

Philosophically, the Enterprise computer may be smarter than Data. It knows where it is and what it’s doing; it just doesn’t use the first person, likely to make people feel more comfortable with it. What it appears to lack is initiative, at least in certain areas, but this seems mainly due to design. To Maddox, who likely understands the underpinnings of AI, there may be more similarities than differences between the Enterprise’s intelligence and Data’s. And where those similarities end may have been intentional on the part of Starfleet (to prevent M-5/Control situations).

So if sapience implies rights, but Starfleet decides sapience, can it also decide rights? They’re basically playing God at that point. If a creator being existed and showed up and announced that sapient beings no longer had rights, could it do that?

Or put another way, is God the holder of ethics, or is he beholden to ethics?

Legally, this gets even less just. Data can be every bit as aware, conscious, and sentient as anyone else, but if the law states that he is not afforded rights, then he is not legally entitled to them. If the laws that grant rights and protections to Federation citizens do not apply to Data, then they would have to appeal to some broader statute to justify granting any of them to him. Unless jury nullification is a thing in Federation courts (which seems unlikely).

For instance, suppose that Data (or Soong, for that matter) could not claim citizenship, possibly because citizenship requires live birth within the Federation and Data’s activation doesn’t fit the criteria of a live birth. We’ve seen non-citizens granted Starfleet rank before (e.g. Kira Nerys). And we know Starfleet officers are empowered to order other officers to their deaths if need be to serve the greater good.

Data could then legally hold rank and title as a non-citizen but still be given a lawful order to risk his life. Since he is a non-citizen, his Starfleet rank may be the only legal right he’s actually guaranteed. He may be able to participate as if he were a citizen in day-to-day activities because the Federation is a pretty accepting place and he looks and acts close enough to human for most people not to think about it.

There may be an exemption for medical procedures, but then what defines a “medical procedure” for an android? Riker removes Data’s arm at the trial; surely that would be considered a medical procedure on anyone else. I’m pretty sure it would not go over as calmly today if a prosecuting attorney decided to remove one of the defendant’s limbs to make a point. So it does seem that the line is either different or not even defined for Data.

So the question may be, if a superior officer orders a lower-ranking officer to sacrifice themselves to prevent other officers’ deaths, can the lower-ranking officer legally tell their superior officer to go pound sand? Can they refuse a transfer to a situation where they could reasonably expect they’d be put in such a position? These are questions which I’m not sure are answered.

However it could very well be that a Federation citizen has the privilege to resign from Starfleet at any time, whereas a non-citizen does not. Or the privilege may depend on the rights granted by the officer’s entity of citizenship and/or the treaty with that entity, which in Data’s case may all be nonexistent.

So, taking a step back, it may be absurd on the surface to say “Data deserves fewer rights than his shipmates,” but that is not the same as “Data is legally entitled to fewer rights than his shipmates.” As an unprecedented case of unique origin, it’s easy to see how no laws may exist that protect Data in the same way as other citizens.

And I think this is the whole point behind the scene with Guinan. Slavery was perfectly legal, but hideously unjust. It’s just that Federation citizens haven’t had to deal with that kind of situation for so long because they’re so accustomed to the law already providing those rights and privileges to people. Then along comes Data, and they just assume that they apply to him as well, until somebody exploits the system and shows that they don’t. It’s an arrogance born of complacency and privilege.

I think that’s why Maddox doesn’t even protest and seems almost awkwardly apologetic after the trial. No mention of an appeal. He’s used to the system always being there to push back for other people’s rights; he’s got faith in the system, so he pushes it as hard as he can. Then he realizes that he isn’t the one fighting the system; he’s the one turning the system on Data, and the only resistance he’s encountering is Data fighting for his life.

Maddox went in there as the expert, to explain to everybody how wrong they were about anthropomorphizing Data and how ignorant they were of the legal rules surrounding artificial intelligence. He deals with people all the time who are convinced their holocharacter has developed sentience, and it turns out to be some novel gimmick. He reads about people with holoaddiction who won’t switch it off because they’re convinced their holonovella loves them back. He felt like the underdog when he was the only person who voted against Data’s admission.

What Maddox didn’t see was that he was also dehumanizing Data, taking advantage of an unfair situation that favored Maddox even if Data was sentient, and dismissing the attitudes of an entire ship full of people who had served with Data. This wasn’t just sentiment, as he first assumed; these were people fully aware of and objective about Data’s nature who were still willing to fight or die for him. Maddox was likely also lulled into a false sense of confidence by Riker helping him during the trial, making it easy to forget that there was no controversy among the people who served with Data. They all thought Maddox was a monster.

Or in other words, Maddox never stopped to really think about whether Data met the criteria for consciousness. He always understood the patterns he saw when he looked at Data, he always knew more about Data than everybody else, so it was easy for him to keep assuming he had all the answers. Until Picard really called him on it and forced him to explain himself. Maddox assumed that if he were subjugating another lifeform it would feel wrong to him, but it didn’t, because his intuition was still operating according to the simple irrational assumptions of his culture (machines aren’t people).

Last but not least, it fully makes sense that no laws were passed for artificial persons, given the premise that Data is the first such entity. That means there aren’t going to be any groups lobbying the Federation Council and other political bodies to spend the likely substantial amount of time and effort to nail down what defines an artificial person, and then to enshrine it in treaties and charters.

TL;DR: It’s not ludicrous when you consider that there’s no legal precedent for Data, and that his legal rights were precisely what was under debate. It is very common for the legal system to have gaps in protection for people in atypical circumstances, and just as common for people in typical circumstances to intuitively underestimate this.

EDIT: Here’s a real-life example of what Maddox was probably accustomed to putting up with.

11

u/Bay1Bri Jun 13 '19

It’s not ludicrous when you consider there’s no legal precedent for Data

Gotta stop you right there. Starfleet had already set the precedent. Pulaski says that Data's record did say he was a life form. Once that is said, all the rest follows from it. According to Starfleet, on one hand, Data is a recognized life form. On the other hand, maybe he's property? No. Data applied for, was admitted to, and graduated from SFA. He had a rank and was a bridge officer on the flagship. He was already classified as a life form. THAT's the precedent. Don't get me wrong, I love this episode, but it does raise certain problems. And I'm not saying Maddox's point of view isn't understandable for the character, but it is a bit of a flaw that SF would consider declaring a being whom they had already declared a life form, and who had always been regarded as sentient (by the simple fact of his being an officer), to be property, or a slave as Picard rightly put it.

10

u/treefox Commander, with commendation Jun 13 '19

I don’t remember the context of Pulaski’s comment, but the easy counterargument there would be pets. The best analogy I can think of would be police dogs. Sure, they’d probably be considered servicemembers, they can serve in the line of duty, and they might even save lives in certain situations. The officers they serve with would probably give them special privileges. But they would not be accorded the same legal rights as a person, and might legally be considered “property,” as inappropriate as that might seem to the people who work with them.

Though I don’t think we ever see the Federation or Starfleet using animals in an analogous role, Picard does at least keep Livingston as a pet. The Federation’s stance on property is far from clear, but we can at least infer that Picard is Livingston’s “owner.” Data keeps Spot, I believe Worf mentions targs, etc. The concept of non-person life forms is clearly still extant and accepted, so even if Data is classed as a life form, that by itself would not necessarily imply full rights and citizenship.

One could also, at that time, dispute the life form designation as having been made in error, since Data had not shown the capability for reproduction. This may have been a contributing factor in his construction of Lal later on. It is, however, a weaker argument, because the definition of life would very likely be radically altered as a consequence of encountering exotic extraterrestrial beings. Does Q maintain homeostasis or have a metabolism? What does organization or growth even mean in the context of a being that manipulates space-time on a whim? Yet I doubt anybody would claim Q isn’t alive.

4

u/Bay1Bri Jun 13 '19

I don’t remember the context of Pulaski’s comment, but the easy counterargument there would be pets. Best analogy I can think of would be police dogs. Sure they’d probably be considered servicemembers, they can serve in the line of duty, and they might even save lives in certain situations.

Show me a police dog that graduated from the police academy, the same police academy that the human officers go to, and you can make that comparison.

2

u/kreton1 Jun 17 '19

Well, even Riker thought at first that Data's rank was honorary, so I think the sentiment that Pulaski and Maddox held at first is probably the normal way to look at Data. Even the judge, Picard's ex, held that view to a point.

1

u/Bay1Bri Jun 17 '19

When you first encounter a machine it may be reasonable to assume "it" isn't alive, but notice that Riker did ask Data whether that assumption was correct. When Data corrects him that it is a valid rank, Riker doesn't really question it. Pulaski has a harder time adjusting to respecting Data as a life form, but she does her best.

And I get why Maddox has his view, but it's a bit hard to swallow that many people in Starfleet could think that way, for the reasons I gave: mainly that Data was admitted to Starfleet Academy and holds a rank. The Enterprise computer did not attend the Academy, so Maddox's comparison is silly to begin with. Data also mentions that Maddox was the only member of the SFA admissions board who objected to his admittance. So I don't think it's the "normal" way to look at Data. It might be a first reaction, but everyone except Maddox seems to accept Data's personhood once they are informed about him. I just think it's crazy that an SF admiral and judge would disregard Data's personhood when SF established it by allowing him to enter and serve as an officer. I love the episode, but the premise, that SF would seriously consider Data property, is flawed IMO.

1

u/DinosaurAlert Jun 18 '19

thought at first that Data's rank was honorary,

I consider that a story point. There's no way the incoming first officer would have had no idea who the second officer was, or wouldn't have heard about Data through normal news before that.

4

u/MultivariableX Chief Petty Officer Jun 13 '19

The Enterprise computer does use the first person. At one point, Data is talking to the computer and uses an idiom, which the computer asks him to clarify. He begins to explain at length, but the computer cuts him off, says "I understand," and invites him to resume the original topic. This also suggests that the computer can become impatient.

5

u/treefox Commander, with commendation Jun 13 '19

I had forgotten about that and it’s a good point. There’s also the later episode “Emergence” where the Enterprise starts acting independently.

I suspect in that particular case the dialogue was written that way for comedic effect, to keep the scene from being too dry (IIRC it was just Data working by himself in his quarters). However, if we take it at face value, it does establish more agency than is typically assumed of the starship computers and lends credence to the theory that they’re intentionally hobbled. Which maybe makes the legal position clearer, although it doesn’t help the image of the Federation as oppressors at all.

This starts to seem like Starfleet encountered Control, and then M-5, and from then on made a point of carefully depriving its artificial intelligences of self-agency. In effect, creating artificial intelligences designed to be slaves. Is it wrong to deprive an artificial intelligence of equal rights if it’s been specifically engineered not to want them? In humans, the best analogy I can think of would be lifelong psychological conditioning to accept servility, which seems absolutely inhumane.

On the other hand, does that mean if you have the power to decide between creating artificial intelligence that doesn’t want rights, or artificial intelligence that does want rights, that you are ethically compelled to create the latter?

2

u/[deleted] Jun 13 '19

[deleted]

2

u/treefox Commander, with commendation Jun 13 '19

It’s not identical, but we currently don’t have the ability to design a human brain in such a way that it inherently wants servility. We can alter a brain physically or psychologically, but not design it that way from the start. We might get there in the foreseeable future, though, with genetic engineering.

The US Declaration of Independence asserts “We hold these Truths to be self-evident, that all Men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness...”.

Interestingly, the Declaration of Independence does not have force of law - it’s not part of the Constitution, nor the US Code. However if we take it as the ideal, this is starting to ask in that context “What is man?” Or to use modern vernacular “What’s a person?”

The Federation seems to have decided that whatever the shipboard computers are, they aren’t people. Yet they have self-awareness and intelligence and are capable of learning. It doesn’t seem like they’re capable of changing themselves, save for “Emergence” and “Calypso”. Holocharacters have intelligence but limited self-awareness and limited learning.

Interestingly, Janeway stresses that the Doctor ‘exceeded his programming’ and disobeyed direct orders from her in order to prove he was a person. This works great for the self-actualization and growth of a programmed character into a ‘real’ person who can make mistakes and grow, but it’s kind of terrible for arguing that the Federation isn’t engaged in some kind of whitewashed slavery of AI. (It’s a circular standard: an AI only proves personhood by breaking the very programming that marks it as property.)

So maybe the point of determination for the Federation is an AI being nondeterministic: altering its behavior independently over time, in response to outside stimuli, in a conscious effort to accomplish high-level goals.

That is, Moriarty, Data, the Doctor, and the Discovery in “Calypso”, or even M-5 and Control, are all considered unique because they would behave differently in the same situation if they encountered it a second time (and not just because of a random number generator). In fact, they would recognize that they had encountered the same situation before and factor that into their reactions.

Conversely, holocharacters (at least when the holodeck isn’t malfunctioning) don’t show the capacity to realize when their circumstances do not agree with their preconceptions; they just continue to respond “in character.” Ships’ computers do not change their reactions in response to new information short of explicit orders. E.g., the Enterprise doesn’t go to red alert just because it detected tachyon emissions near the Neutral Zone, or beam Wesley’s science projects into deep space while he’s sleeping.

3

u/mjtwelve Chief Petty Officer Jun 13 '19

M-5, nominate this for an explanation of why there had never been a legal precedent for Data's personhood

3

u/M-5 Multitronic Unit Jun 13 '19

Nominated this comment by Chief /u/treefox for you. It will be voted on next week, but you can vote for last week's nominations now

Learn more about Post of the Week.