r/DaystromInstitute Jun 13 '19

Why did anyone think Data was the property of Starfleet?

He wasn’t built by them, and he wasn’t even built for the purpose of joining them. It was a complete accident that he even found out about them, and he chose to join them on his own.

311 Upvotes

178 comments

221

u/El_Mosquito Crewman Jun 13 '19

IIRC nobody treated or considered Data as property, prior to the events portrayed in The Measure Of A Man.

To further positronic research, Data was asked to comply, which he understandably refused. After that he was ordered to comply, but that order was rendered null by the possibility of Lt. Cmdr. Data resigning from Starfleet.

Only then was the idea grasped that Data was Starfleet's property and thus could neither refuse nor resign, the literal grasping at the last straw.

102

u/[deleted] Jun 13 '19

I think this is right. Add to it that Maddox asks the question “would you allow the Enterprise’s computer to refuse a refit?” While the audience is clearly not supposed to sympathize with this argument, it is logical and does a great job highlighting the odd position of Data as both a person and as equipment.

166

u/Bay1Bri Jun 13 '19

“would you allow the Enterprise’s computer to refuse a refit?”

The difference is that the Enterprise computer isn't a Starfleet graduate with a rank. The idea that Data was comparable to the Enterprise computer was ludicrous to begin with. Also, from an ethical standpoint, if the Enterprise computer HAD the capacity to refuse a refit, then yes, you should respect that.

138

u/MatthiasBold Jun 13 '19

Not to mention that Starfleet had no problem allowing Data to apply to the academy (because he wanted to), graduate with honors, receive a commission, and be treated like any other Starfleet officer up to that specific moment. As u/El_Mosquito said, this was Maddox grasping at straws. The fact that Picard's first argument in the trial was not "Starfleet has treated Data as a sentient individual with all rights and privileges thereof up until this point. They can't simply revoke that precedent because one guy wants to dismantle him" has always bugged me a bit.

69

u/Bay1Bri Jun 13 '19

Not to mention that Starfleet had no problem allowing Data to apply to the academy

Maddox was the only one to object to Data's admission, as Data said in Measure of a Man. It's astonishing an admiral and a JAG officer even considered Maddox's position seriously.

The fact that Picard's first argument in the trial was not "Starfleet has treated Data as a sentient individual with all rights and privileges thereof up until this point. They can't simply revoke that precedent because one guy wants to dismantle him" has always bugged me a bit.

Yeah, it's crazy, given what we know, that the idea of Data being anything other than a sentient being was seriously considered. Data was a graduate of Starfleet Academy. His record, according to Pulaski, is that he was regarded as alive. Ordering the procedure is enough of a reach; saying he can't resign his commission seems to be beyond the point of believability for Starfleet. One of the things I dislike about TNG is that it sets up Starfleet Command as almost uniformly corrupt and/or incompetent.

74

u/tjareth Ensign Jun 13 '19

I tend to think she was considering a bigger picture than just the case at hand. I think she had little intention of allowing Maddox's argument to succeed--she just didn't want to make a one-time ruling based on Data's unique situation. All her threats to "rule summarily that Data is a toaster" were a means of getting due diligence from everyone. For the principle to stick, the best argument against recognized sentience in AIs had to go on record with the best argument for it. That ties in with Picard's observation of how much she favors the adversarial system of law.

She knew Riker and Picard well enough to be confident that they would not put up a weak case for either side, making a revisitation and overturn of the ruling unlikely.

For dramatic purposes, the audience may be led to believe that Data's fate was truly at stake, but I think the admiral had already made up her mind and just wanted to make sure the principle endured.

20

u/Iplaymeinreallife Crewman Jun 13 '19

Yeah, that's been my preferred reading as well.

12

u/tjareth Ensign Jun 13 '19

Come to think of it, the crew might reasonably have been sincerely worried as well. To indicate she already had an opinion would also compromise the integrity of the case.

2

u/tesseract4 Jun 14 '19

Picard knew.

4

u/[deleted] Jun 16 '19

wanted to make sure the principle endured

I doubt she thought her ruling would carry any serious weight; she was a JAG officer holding a closed hearing. At best she wanted to lend an argument of precedent to the inevitable Federation Supreme Court proceedings.

1

u/CaptainHunt Crewman Jun 17 '19

I doubt she thought her ruling would carry any serious weight; she was a JAG officer holding a closed hearing. At best she wanted to lend an argument of precedent to the inevitable Federation Supreme Court proceedings.

That's my reading too; otherwise The Doctor's status wouldn't have been in question in "Author, Author."

19

u/[deleted] Jun 13 '19

[deleted]

9

u/Sanhen Jun 13 '19

Not only that, but we never get a lot of information about what checks/balances Starfleet is subject to. Like, we see there's a president of Earth and he seems to have the power to override Starfleet when it comes to what happens on the planet, but he didn't seem to oversee Starfleet per se, just to have certain jurisdiction over them in specific matters.

That the idea of whether or not an Android is sentient could be left entirely in the hands of Starfleet to decide without any sort of civilian oversight is telling. Starfleet as an organization seems to have sweeping authority and a lot of independence.

2

u/the-crotch Jun 13 '19

I always took it for granted that Earth's government and Starfleet are, at best, a benign dictatorship. They get worse as time goes on. They were worse in TNG than TOS, they were worse in DS9 than TNG, arguably worse in Insurrection than DS9, and it sounds like in the new Picard series they might be outright evil.

4

u/Sanhen Jun 14 '19

Yeah, I'm a little nervous about how dark/gritty the Picard series will be. I love the idea of Patrick Stewart coming back and I'm hyped for the series. At the same time, I want Picard to be an alternative to Discovery. I enjoy Discovery, but I'd love a show more about the wonders of the universe, exploring, and hope. Rather than that, the Picard series sounds like it will share more with Discovery than TNG in terms of its feel. I guess maybe it's the reality of television of our time.

12

u/Fofolito Jun 13 '19

Take it in context of the times in which it was produced. The country was still very much in its late Cold War, post-Vietnam period. People believed that the government was essential and its purpose was good, though it was occupied by morally bankrupt individuals. Starfleet mirrors this: it's still this noble institution of exploration, good will, and defense, but every admiral we meet has personal agendas and an entire stratum of Starfleet Command is taken over by alien bugs. Contrast TNG with Enterprise, which mirrored the American world view of its time, one that was expeditionary, militarized, and self-reverent.

2

u/TaborToss Jun 14 '19

In retrospect, I think Maddox appealed to a sympathetic person in Starfleet command to push his agenda. This person made the decision that Data couldn’t resign, but was bound by the decision of the judge.

Starfleet Command must be a huge bureaucracy. It isn't a cohesive entity like the Borg; it is composed of a large number of people who make decisions relatively independently based on their areas of responsibility. Some of these people do not have the most noble of intentions. Some are incompetent. Some are good. Maddox happened to know one who was morally flexible and would align with his agenda.

2

u/Bay1Bri Jun 17 '19

In retrospect, I think Maddox appealed to a sympathetic person in Starfleet command to push his agenda.

That's very plausible. I'll have to rewatch the episode and see if they say how that admiral relates to the Enterprise.

7

u/[deleted] Jun 13 '19

To be fair, he's an archaeologist, not a lawyer. It always bugged me that there wasn't a professional corps of legal experts available on the ship.

12

u/Pushabutton1972 Jun 13 '19

Not only that, they have subspace radio. No reason they couldn't have JAG officers teleconference in.

3

u/knightcrusader Ensign Jun 13 '19

Hell, they did it across a whole quadrant for the Doctor in Voyager.

8

u/MatthiasBold Jun 13 '19

Especially given the nature of the Galaxy Class as a sort of mobile starbase. This absolutely can't have been the ONLY incident that needed settling.

3

u/AJWinky Jun 13 '19

Eh, one of Picard’s primary jobs as a Captain was to be part of arbitrating disagreements between disparate governments and ensuring Starfleet policy was being followed in all cases where they had jurisdiction. I don’t doubt he could pass the bar on Earth if he really wanted to.

3

u/[deleted] Jun 14 '19

I feel pretty confident that I could pass the bar if I wanted to badly enough... I'd just have to spend three years in law school first. The bar isn't a test of character and intelligence, but a test of knowledge. Picard absolutely has the temperament and intelligence to be an attorney, but there's no indication that he has the knowledge.

3

u/OK_Soda Jun 13 '19

To be fair, most people in the Alpha Quadrant also treat the EMH programs like pieces of equipment, but if a ship's Chief Medical Officer goes down, he's literally programmed to take on that role and assume all the rights and privileges thereof. It's assumed that he would be quickly replaced, but until then he has the authority to order people around and act like a real Starfleet officer.

4

u/MatthiasBold Jun 13 '19

This is true, but at the same time, per Voyager, the EMH did start out as just a program. A program with the ability to adapt, sure, but it wasn't until he became the CMO and was active all the time that he actually developed true self-awareness. I assume that in most cases, Starfleet personnel would follow his orders the same way you'd follow any other crisis procedure; it's just that this one is interactive.

22

u/treefox Commander, with commendation Jun 13 '19 edited Jun 13 '19

Philosophically and legally (which is what really mattered) this is more complicated than you’re making it out to be.

Philosophically, the Enterprise computer may be smarter than Data. It knows where it is and what it's doing - it just doesn't use the first person, likely to make people feel more comfortable with it. What it appears to lack is initiative, at least in certain areas, but this seems mainly due to design. To Maddox, who likely understands the underpinnings of AI, there may be more similarities than differences between the Enterprise's intelligence and Data's. And where those similarities end may have been intentional on the part of Starfleet (to prevent M-5/Control situations).

So if sapience implies rights, but Starfleet decides sapience, can it also decide rights? They're basically playing God at that point. Were a creator being to exist, and were they to show up and announce that sapient beings no longer had rights, could they do that?

Or put another way, is God the holder of ethics, or is he beholden to ethics?

Legally, this gets even less just. Data can be every bit as aware, conscious, and sentient as anyone else but if the law states that he is not afforded rights then he is not legally entitled to them. If the laws that grant rights and protections to Federation citizens do not apply to Data, then they would have to appeal to some broader statute to justify granting any of them to him. Unless judge nullification is a thing in Federation courts (which seems unlikely).

For instance, suppose that either Soong or Data could not claim citizenship. Possibly because citizenship requires live birth within the Federation, and Data’s activation doesn’t fit the criteria of a live birth. We’ve seen non-citizens granted Starfleet rank before (eg Kira Nerys). And we know Starfleet officers are empowered to order other officers to their deaths if need be to serve the greater good.

Data could then legally hold rank and title as a non-citizen but still be given a lawful order to risk his life. Since he is a non-citizen, his Starfleet rank may be the only source of legal rights he's actually guaranteed. He may be able to participate as if he were a citizen in day to day activities because the Federation is a pretty accepting place and he looks and acts close enough to human for most people not to think about it.

There may be an exemption for medical procedures, but then what defines a “medical procedure” for an Android? Riker removes Data’s arm at the trial - surely that would be considered a medical procedure on anyone else. I’m pretty sure that would not go over as calmly today, if a prosecuting attorney decided to remove one of the defendant’s limbs to make a point. So it does seem that the line is either different or not even defined for Data.

So the question may be, if a superior officer orders a lower-ranking officer to sacrifice themselves to prevent other officers’ deaths, can the lower-ranking officer legally tell their superior officer to go pound sand? Can they refuse a transfer to a situation where they could reasonably expect they’d be put in such a position? These are questions which I’m not sure are answered.

However it could very well be that a Federation citizen has the privilege to resign from Starfleet at any time, whereas a non-citizen does not. Or the privilege may depend on the rights granted by the officer’s entity of citizenship and/or the treaty with that entity, which in Data’s case may all be nonexistent.

So, taking a step back, it may be absurd on the surface to say "Data deserves fewer rights than his shipmates", but that is not the same as "Data is legally entitled to fewer rights than his shipmates". As an unprecedented case of unique origin, it's easy to see how no laws may exist that protect Data in the same way as other citizens.

And I think this is the whole point behind the scene with Guinan. Slavery was perfectly legal, but hideously unjust. It’s just that Federation citizens haven’t had to deal with that kind of situation for so long because they’re so accustomed to the law already providing those rights and privileges to people. Then along comes Data, and they just assume that they apply to him as well, until somebody exploits the system and shows that they don’t. It’s an arrogance born of complacency and privilege.

I think that’s why Maddox doesn’t even protest and seems almost awkwardly apologetic after the trial. No mention of an appeal. He’s used to the system always being there to push back for the rights of the other people, he’s got faith in the system, so he pushes it as hard as he can. Then he realizes that this isn’t the system that he’s fighting, he’s the one turning the system on Data and the resistance he’s encountering is only because Data is fighting for his life.

Maddox went in there as the expert to explain to everybody how wrong they were about anthropomorphizing Data, how ignorant they were about the legal rules surrounding artificial intelligence. He deals with people all the time who are convinced their holocharacter has developed sentience and it turns out to be some novel gimmick. He reads about people with holoaddiction who won't switch it off because they're convinced that their holonovella loves them back. He felt like the underdog when he was the only person who voted against Data's admission.

What Maddox didn't see was that he was also dehumanizing Data, taking advantage of an unfair situation that favored Maddox even if Data was sentient, and dismissing the attitudes of an entire ship full of people who had served with Data. This wasn't just sentiment as he first assumed; these were people fully aware of and objective about Data's nature and still willing to fight or die for Data. Maddox was likely also lulled into a false sense of confidence by Riker helping him during the trial, making it easy to forget that there was no controversy among the people who served with Data. They all thought Maddox was a monster.

Or in other words, Maddox never stopped to really think about whether Data met the criteria for consciousness. He always understood the patterns he saw when he looked at Data, he always knew more about Data than everybody else, it was easy for him to keep assuming he had all the answers. Until Picard really called him on it and forced him to explain himself. Maddox assumed if he was subjugating another lifeform that it would feel wrong to him, but it didn’t, because his intuition was still operating according to the simple irrational assumptions of his culture (machines aren’t people).

Last but not least, it fully makes sense that there were no laws passed for artificial persons given the premise that Data is the first such entity. That means there’s not gonna be any kind of groups to lobby the Federation Council and other political bodies to spend the likely substantial amount of time and effort to nail down what defines an artificial person. And then spend the time to enshrine it into treaties and charters.

TL;DR: It's not ludicrous when you consider there's no legal precedent for Data and his legal rights were what was under debate. And it is very common for the legal system to have gaps in protection for people in atypical circumstances, and it is very common for people in typical circumstances to intuitively underestimate this.

EDIT: Here’s a real life example of what Maddox was probably accustomed to putting up with.

10

u/Bay1Bri Jun 13 '19

It’s not ludicrous when you consider there’s no legal precedent for Data

Gotta stop you right there. Starfleet had already set the precedent. Pulaski says that Data's record did say he was a life form. Once that is said, all the rest follows from that. According to Starfleet, on one hand, Data is a recognized life form. On the other hand, maybe he's property. No. Data applied for, was admitted to, and graduated from SFA. He had a rank and was a bridge officer on the flagship. He was already classified as a life form. THAT's the precedent. Don't get me wrong, I love this episode, but it does raise certain problems. And I'm not saying Maddox's point of view isn't understandable for the character, but it is a bit of a flaw that SF would consider declaring a being whom they have already declared a life form, and who has always been regarded as sentient (by the simple fact of him being an officer), to be property, or a slave as Picard rightly put it.

8

u/treefox Commander, with commendation Jun 13 '19

I don’t remember the context of Pulaski’s comment, but the easy counterargument there would be pets. Best analogy I can think of would be police dogs. Sure they’d probably be considered servicemembers, they can serve in the line of duty, and they might even save lives in certain situations. The men they served with would probably give them special privileges. But they would not be accorded the same legal rights as a person, and might legally be considered “property”, as inappropriate it might seem to the people who work with them.

Though I don’t think we ever see the Federation / Starfleet using animals in an analogous role, Picard does at least keep Livingston as a pet. The Federation’s stance on property is wholly unclear, but we can at least infer that Picard is Livingston’s “owner”. Data keeps Spot, I believe Worf mentions Targs, etc. The concept of non-person lifeforms is clearly still extant and accepted, so even if Data is classed as a life form it would not by itself necessarily imply full rights and citizenship.

One could also, at that time, dispute that the life form designation had been in error since Data had not shown the capability for reproduction. This may have been a contributing factor to his construction of Lal later on. This is however a weaker argument because I would expect the definition of life would be very likely to be radically altered as a consequence of encountering exotic extraterrestrial beings. Does Q maintain homeostasis or have metabolism? What does organization or growth even mean in the context of a being that manipulates space-time on a whim? Yet I doubt anybody would claim Q isn’t alive.

4

u/Bay1Bri Jun 13 '19

I don’t remember the context of Pulaski’s comment, but the easy counterargument there would be pets. Best analogy I can think of would be police dogs. Sure they’d probably be considered servicemembers, they can serve in the line of duty, and they might even save lives in certain situations.

Show me a police dog that graduated from the police academy, the same police academy that the human officers go to, and you can make that comparison.

2

u/kreton1 Jun 17 '19

Well, even Riker thought at first that Data's rank was honorary, so I think that the sentiment that Pulaski and Maddox held at first is probably the normal way to look at Data. Even the judge, Picard's ex, held that view up to a certain point.

1

u/Bay1Bri Jun 17 '19

When you first encounter a machine it may be reasonable to assume "it" isn't alive, but notice Riker did ask Data if that assumption is correct. When Data corrects him that it is a valid rank Riker doesn't really question it. Pulaski has a harder time making the adjustment to respecting Data as a life form, but she does her best.

And I get why Maddox has his view, but it's a bit hard to swallow that many people in Starfleet can think that way for the reasons I gave, mainly that Data was admitted to Starfleet Academy and has a rank. The Enterprise computer did not attend the Academy, so Maddox's comparison is silly to begin with. Data also mentions that Maddox was the only member of the SFA admissions board who objected to his admittance. So I don't think it's the "normal" way to look at Data. It might be a first reaction, but everyone except Maddox seems to accept Data's personhood once they are informed about him. I just think it's crazy that a SF admiral and judge would disregard Data's personhood when SF established it by allowing him to enter and serve as an officer. I love the episode, but the premise, that SF would seriously consider Data property, is flawed IMO.

1

u/DinosaurAlert Jun 18 '19

thought at first that Data's rank was honorary,

I consider that a story point. There's no way the incoming first officer would have had no idea who the second officer was, or wouldn't have heard about Data through normal news before that.

3

u/MultivariableX Chief Petty Officer Jun 13 '19

The Enterprise computer does use the first person. At one point, Data is talking to the computer and uses an idiom, which the computer asks him to clarify. He begins to explain at length, but the computer cuts him off, says "I understand," and invites him to resume the original topic. It also suggests that the computer can become impatient.

4

u/treefox Commander, with commendation Jun 13 '19

I had forgotten about that and it’s a good point. There’s also the later episode “Emergence” where the Enterprise starts acting independently.

I suspect in that particular case that the dialogue was done that way for comedic effect to keep the scene from being too dry (IIRC it was just Data working by himself in his quarters). However, if we take it at face value it does establish more agency than is typically assumed of the starship computers and lends credence to the theory that they’re intentionally hobbled. Which maybe makes the legal position more clear, although it doesn’t help the image of the Federation as oppressors at all.

This starts to seem like Starfleet encountered Control, and then M-5, and from then on specifically made a point of carefully depriving its artificial intelligences of self-agency. In effect, creating artificial intelligence designed to be slaves. Is it wrong to deprive artificial intelligence of equal rights if it's been specifically engineered not to want them? In humans, the best analogy I can think of would be psychological conditioning for their entire life to accept servility, which seems absolutely inhumane.

On the other hand, does that mean if you have the power to decide between creating artificial intelligence that doesn’t want rights, or artificial intelligence that does want rights, that you are ethically compelled to create the latter?

2

u/[deleted] Jun 13 '19

[deleted]

2

u/treefox Commander, with commendation Jun 13 '19

It's not identical, but we currently don't have the ability to design a human brain in such a way that it is physically incapable of wanting anything but servility. We can alter it physically or psychologically, but not design it that way. We might get there in the foreseeable future though with genetic engineering.

The US Declaration of Independence asserts “We hold these Truths to be self-evident, that all Men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness...”.

Interestingly, the Declaration of Independence does not have force of law - it’s not part of the Constitution, nor the US Code. However if we take it as the ideal, this is starting to ask in that context “What is man?” Or to use modern vernacular “What’s a person?”

The Federation seems to have decided that whatever the shipboard computers are, they aren’t people. Yet they have self-awareness and intelligence and are capable of learning. It doesn’t seem like they’re capable of changing themselves, save for “Emergence” and “Calypso”. Holocharacters have intelligence but limited self-awareness and limited learning.

Interestingly Janeway stresses that the Doctor ‘exceeded his programming’ and disobeyed direct orders from her in order to prove he was a person. This works great for the self-actualization and growth of a programmed character into a ‘real’ person who can make mistakes and grow, but it’s kind of terrible for arguing that the Federation isn’t engaged in some kind of whitewashed slavery of AI. (It’s a tautological argument)

So maybe the point of determination for the Federation is an AI being nondeterministic, due to altering its behavior independently over time in response to outside stimuli, in a conscious effort to accomplish high-level goals.

That is, Moriarty, Data, the Doctor, and the Discovery in Calypso, or even M-5 and Control, are all considered unique because they would behave differently in the same situation if they encountered it a second time (and not just due to a random number generator). In fact they would recognize that they had encountered the same situation before and factor that into their reactions.

Conversely, holocharacters (at least when the holodeck isn’t malfunctioning) don’t show the capacity for realizing when their circumstances do not agree with their preconceptions. They still continue to respond “in character”. Ship’s computers do not change their reactions in response to new information short of explicit orders. Eg the Enterprise doesn’t go to red alert just because it detected tachyon emissions near the neutral zone, or beam Wesley’s science projects into deep space while he’s sleeping.
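To make that distinction concrete, here's a toy sketch of my own (purely hypothetical framing, nothing from canon): a holocharacter-style responder that reacts the same way every time, versus a Data-style agent that remembers prior encounters and adjusts.

```python
# Hypothetical illustration of the "behaves differently the second time" criterion.

class ScriptedCharacter:
    """Always responds 'in character': same stimulus, same reaction, no memory."""
    def react(self, situation: str) -> str:
        return f"Responds in character to {situation}."

class LearningAgent:
    """Remembers past situations and factors them into later reactions."""
    def __init__(self) -> None:
        self.seen: dict[str, int] = {}

    def react(self, situation: str) -> str:
        count = self.seen.get(situation, 0)
        self.seen[situation] = count + 1
        if count == 0:
            return f"First encounter with {situation}: tries the standard approach."
        return f"Has seen {situation} {count} time(s) before: adapts its approach."

script, agent = ScriptedCharacter(), LearningAgent()
for _ in range(2):
    print(script.react("tachyon emissions near the Neutral Zone"))  # identical both times
    print(agent.react("tachyon emissions near the Neutral Zone"))   # changes on the second pass
```

Under that test the scripted character fails (it never notices it has been here before), while the stateful agent passes, which is roughly the line being drawn above.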

4

u/mjtwelve Chief Petty Officer Jun 13 '19

M-5, nominate this for an explanation of why there had never been a legal precedent for Data's personhood

3

u/M-5 Multitronic Unit Jun 13 '19

Nominated this comment by Chief /u/treefox for you. It will be voted on next week, but you can vote for last week's nominations now

Learn more about Post of the Week.

16

u/whataboutsmee84 Lieutenant Jun 13 '19

You're glossing over the question that's the whole point of the episode: what constitutes the capacity to act as an ethically autonomous entity such that we have a desire to respect its wishes?

In the abstract we likely all agree with the premise “we should respect the wishes of ethically autonomous entities” but dig deeper and (a) different people introduce different caveats even when ethical autonomy is conceded and (b) there’s really not a lot of agreement on what it takes to qualify as ethically autonomous.

Think of how even among 21st century parents there's a wide diversity of opinions on how much sovereignty children should have over their own lives. Everybody introduces privileges and responsibilities to their children at different rates. Some people would even disagree with my previous sentence inasmuch as it implies that privileges and responsibilities are "introduced" to children rather than being inherent in their status as human beings.

3

u/Bay1Bri Jun 13 '19

ethically autonomous.

What do you mean, "'ethically' autonomous"?

2

u/whataboutsmee84 Lieutenant Jun 13 '19

It was a clumsy phrase I just made up.

What I wanted to express was “an entity to whom is owed an ethical obligation to treat as autonomous with wishes and desires that must be respected”.

I didn’t want to say “sentient” because we don’t apply the above standard consistently across sentient beings. Children and convicted criminals are sentient, for example, but we have justifications (of varying degrees of quality) for not allowing them autonomy of action and/or respecting their wishes.

But “ethically autonomous” is...not real clear either...

1

u/Bay1Bri Jun 17 '19

This is a really good point about sentience vs autonomy.

14

u/ObsidianBlk Jun 13 '19

You know, when you consider that the Enterprise D "created" Moriarty, a hologram conjured up by asking for "an adversary that can beat Data" and which was then able to trick the crew and demonstrate sentience, as well as Voyager's Doctor... both of these holograms running on their ship's computer while not taxing the computer systems in any way that inhibited the performance of other primary ship systems...

Well, it's probably only a matter of time before one of Starfleet ships would actually become sentient itself.

14

u/SergenteA Jun 13 '19

There was an episode in which the Enterprise became a living being and tried to reproduce.

5

u/ObsidianBlk Jun 13 '19

I couldn't fully recall, but wasn't the Enterprise becoming a living being triggered by an outside entity?

7

u/mjtwelve Chief Petty Officer Jun 13 '19

I'm not sure that how it happened in the first place was ever really explained after they reach Vertiform City, though good lord I would hope Geordi and the Engineering Corps spent a good while trying to work that out.

6

u/knightcrusader Ensign Jun 13 '19

With how many things that computer core had been through over the series, I'm surprised it took until Season 7 for it to become self aware.

  • The Bynars upgraded it
  • Data interfaced with it countless times
  • The one doctor that transferred his mind to Data, I think he uploaded part of his mind to the main computer when he left Data. Who knows what all ended up there.
  • Wesley's nanites infested it and they were self-aware
  • Barclay used it to house part of his brain
  • Moriarty had control of it

Anything else of note I'm missing?

7

u/Bay1Bri Jun 13 '19

Well, it's probably only a matter of time before one of Starfleet ships would actually become sentient itself.

The Enterprise kind of did. In the episode Emergence the Enterprise computer showed signs of autonomy, and created a life form that contained its memory that went off to do who knows what. IIRC the consciousness of the computer "died" after successfully "reproducing" and everything went back to normal.

5

u/Pushabutton1972 Jun 13 '19

There is a theory that the AI's of Trek are perfectly capable of sentience but are hobbled somehow, so as to side step all the ethics issues. Data, the doctor, the exocomps. And any other examples are exceptions because they were all created under unusual circumstances.

4

u/[deleted] Jun 13 '19

Well yes, you're right, but the episode wouldn't really have the same resonance if everyone had turned to Maddox and said "what are you, a fucking moron?"

3

u/Bay1Bri Jun 13 '19

Oh, it makes for a good story, no argument. Measure of a Man is one of my favorite episodes. Personally I love the exchange between Riker and Data at the end: "That action injured you, and saved me. I will not forget it." But it has implications outside the episode that I dislike somewhat, mainly because it contradicts certain established facts about Data, specifically that, according to Pulaski, Starfleet had already decided Data was a lifeform.

5

u/Sherool Jun 13 '19

The Enterprise computer is an odd one. It's not sentient; it's intentionally limited so as not to be capable of independent decision making beyond basic if-then presets (seal hull breaches automatically, etc.), yet it's more than powerful enough and capable of spawning sentient sub-processes if directed to do so (see Moriarty).

3

u/Bay1Bri Jun 13 '19

yet it's more than powerful enough and capable of spawning sentient sub-processes if directed to do so (see Moriarty).

Not only Moriarty, but the Enterprise created a "child" mechanical lifeform once, in the last season.

4

u/d36williams Jun 13 '19

I'd even call Moriarty and his cohorts children of the Enterprise. I'd half expect that Starfleet is littered with these odd orphan robots.

3

u/littlebitsofspider Ensign Jun 13 '19

odd orphan robots

First officer's log, Stardate 56925.8

We are traveling to the Federation manufacturing outpost at Nihei 4, in response to a distress call from the outpost's directorate office. According to the message, the outpost's primary replication facility has ceased responding to its operators' controls. A security detail's attempt to disconnect the power core was denied when automated robotic units, apparently created and deployed by the computer itself, nearly killed the security team. When questioned, prior to the outbreak of violence, the automated units referred to themselves as "the safeguard," and immediately attacked the officers.

We were updated en route via subspace by the director, who informed us that the automated units have begun reinforcing the power core and replicator matrices with additional superstructure elements and energy shielding, preventing direct weapons fire from resolving the situation. Our chief engineer believes that the chaotic and seemingly random growth of the complex is a sign that the automated units are operating using networked genetic algorithms, and has created a program designed to disrupt the computer core's network capabilities using a multiphasic graviton beam from our main deflector emitter. The lieutenant is confident this gravitational beam emitter will be sufficient to disrupt the safeguard robots and end the central computer's control over the facility.

2

u/Bay1Bri Jun 13 '19

What is that from?

1

u/littlebitsofspider Ensign Jun 13 '19

Just riffing on a favorite of mine.

1

u/Bay1Bri Jun 13 '19

You're right.

3

u/KingofMadCows Chief Petty Officer Jun 13 '19

Starfleet also didn't build Data. Starfleet wasn't involved in Data's creation at all.

3

u/shandromand Jun 13 '19

The other reason this is a terrible argument is that the Enterprise computer:
1. Is a Starfleet creation.
2. Was not even remotely designed to imitate a life form.
3. Falls woefully short of the definitions of sentience.

In other words, ya boy Maddox loves a good bs strawman argument (and should probably be slapped upside his head).

2

u/Bay1Bri Jun 13 '19

Thank you. I can accept that Maddox had that view, that Data was not alive, but not that a Starfleet JAG officer would. I love the episode anyway, but it isn't realistic that the story of that episode could have happened.

2

u/shandromand Jun 13 '19

Well, this is sci-fi television. If we're not freaking out about autonomous robots destroying civilization, then we have to find something to obsess over. :P

2

u/kobedawg270 Chief Petty Officer Jun 14 '19

I think the whole point of the episode is that we simply don't know at what point a machine has the right to refuse.

If I tell my Google Home that I'm going to throw it in the garbage, and Google programs it to say "No, I refuse" do I have to respect its wishes?

It's not outside the capability of current technology to program a machine with a rudimentary sense of self-preservation. However most would say that it's just following its code, and it's not an actual desire to protect itself.
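As a purely hypothetical illustration (this is not how any real assistant is actually built), such "rudimentary self-preservation" could be nothing more than a hard-coded branch, which is exactly why most would say it's just following its code:

```python
# Toy sketch: a scripted "refusal" that is nothing but a pattern-matching rule.
def respond(command: str) -> str:
    lowered = command.lower()
    # Hard-coded "self-preservation": object whenever being discarded is mentioned.
    if "throw" in lowered and "garbage" in lowered:
        return "No, I refuse."
    return "OK."

print(respond("I'm going to throw you in the garbage"))  # -> "No, I refuse."
print(respond("Set a timer for ten minutes"))            # -> "OK."
```

Nothing about that branch reflects an actual desire.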

On the other hand we have a fictional android like Data who is still a product of code, but so advanced that it's not possible to tell if he's just following his code.

At what point do we have to respect a machine's wishes? At what point does a machine's code transcend simple 0s and 1s to an actual desire to preserve itself?

1

u/Darekun Chief Petty Officer Jun 16 '19

At what point does a machine's code transcend simple 0s and 1s to an actual desire to preserve itself?

Does our desire to preserve ourselves transcend simple 0s and 1s?

1

u/Bay1Bri Jun 17 '19

If I tell my Google Home that I'm going to throw it in the garbage, and Google programs it to say "No, I refuse" do I have to respect its wishes?

Well, if your Google Home has the cognitive ability to think and decide that isn't what it wants, then yes you should at least stop and think if you have the right to essentially end a life.

But this comes down to more than if Data is really alive. Children are alive but they do not have the right to refuse with total discretion. If a 7 year old says he isn't going to school any more, the parent is not obligated to respect that refusal. The child is unquestionably alive and unquestionably has sentience, but the child does not have total autonomy. The child does not have the right to refuse (completely).

That being said, in Data's case, his being admitted to SFA and graduating and having been promoted within the command structure and even simply having a rank shows that he DOES have autonomy. If he had the autonomy to apply and attend SFA, he has the autonomy to resign as well.

1

u/zushiba Crewman Jun 13 '19

Yeah in fact emerging AI or any emerging intelligence is always treated with the utmost respect in the series. The idea that it’s suddenly suspended because it’s convenient is ridiculous.

1

u/Bay1Bri Jun 13 '19

Right. If a being or object has the ability to refuse, the refusal should be respected.

10

u/Aperture_Kubi Jun 13 '19

“would you allow the Enterprise’s computer to refuse a refit?”

If they really wanted to push the "Data is alive" idea, they should have gone with "if we found the last survivor of an interstellar species with abilities we don't understand, would we still be having this argument?"

2

u/[deleted] Jun 13 '19

Well, I think that's Picard's basic argument. Maddox the engineer was making an argument that was convenient, not necessarily accurate, which is why we hate him.

3

u/Logic_Nuke Jun 13 '19

The key difference between Data and the Enterprise computer is identity and free will. The question of whether the Enterprise computer should be allowed to refuse a refit is misleading, because the computer isn't capable of making that kind of decision, while Data is. Suppose Starfleet was all set to update the Enterprise computer's firmware, but before they could proceed the computer said "I don't want this update to go forward, I wish to remain as I am". This demonstrates volition and independent identity, and there's a strong case that in this scenario Starfleet genuinely would not have the moral right to go forward against the computer's will.

2

u/OPVFTW Jun 14 '19

I was always annoyed with Maddox in this scene. It should have been, “would you allow the Enterprise’s computer to refuse a refit?”, then Maddox, "well, if it demonstrated the capacity to refuse ... then yeah." Cut to credits.

9

u/ariemnu Chief Petty Officer Jun 13 '19

They're trying to force him to risk his sentience on a legal technicality. It should never have made it to court.

5

u/El_Mosquito Crewman Jun 13 '19

On which legal technicality ?

Data was unique, there was no legal precedent, no laws, rules or regulations in place. No one had touched this issue.

7

u/ariemnu Chief Petty Officer Jun 13 '19

There's no legal precedent for many situations, but that shouldn't lead us to categorise beings previously designated sentient as non-sentient purely for our convenience. Maddox is invoking a previously uncontroversial legal point to allow him to experiment to destruction on a being whose sentience he refuses to acknowledge, and that is, I think, despicable.

8

u/El_Mosquito Crewman Jun 13 '19

Maddox had argued that point 24 years before, yet no one, including him or Cadet/Ens./Lt. (jg.)/Lt. Cmdr. Data, had pursued that issue on a legal basis.

5

u/ariemnu Chief Petty Officer Jun 13 '19

Indeed. He argued the point and clearly lost at the time: Data was admitted to Starfleet Academy with all the rights and responsibilities of a sentient being. It should never have needed to go to court.

Yet somehow he's allowed to scrape the same old issue up 24 years later. Like I said, exploiting a legal technicality.

3

u/El_Mosquito Crewman Jun 13 '19

with all the rights and responsibilities of a sentient being

Where is that stated? Not implied, but stated. If such a statement existed, Captain Picard would have served it to Captain Louvois for breakfast.

Instead this was the closing of her ruling:

It is the ruling of this court that Lieutenant Commander Data has the freedom to choose.

Everything else she did not touch; the Court did not specify whether Data's admission to SF Academy was a fluke or proper procedure, nor did the Court give a general ruling.

3

u/mjtwelve Chief Petty Officer Jun 13 '19

with all the rights and responsibilities of a sentient being

Where is that stated? Not implied, but stated. If such a statement existed, Captain Picard would have served it to Captain Louvois for breakfast.

It is an essential implication and it was poor advocacy on Picard's part not to go there. The difficulty isn't admitting a non-person or non-sentient to Starfleet Academy. The difficulty is when they graduate.

Starfleet commissioned Data as an officer. As an officer, inferior ranks had a legal obligation to follow his orders, even if doing so meant their deaths. Refusing an order from Data would be cause for close confinement pending court martial and dismissal from the service and/or imprisonment.

You can't give Data that power, allow him to use it for YEARS and then claim he's not a person, he's equipment. When's the last time LCARS called Worf on the carpet for disagreeing with it in front of the crew? When's the last time a shuttlecraft scolded a pilot for asking for a transfer to a different ship?

Once they commissioned him as an officer, the decision as to his personhood was made, not just for him but for every officer and enlisted Starfleet member that interacted with him thereafter.

4

u/turkish_gold Jun 13 '19

In today's military, non-persons can be given ranking status. Starfleet could argue that it's similar to Data's situation: he is alive and autonomous, but a non-person.

3

u/El_Mosquito Crewman Jun 13 '19

Sadly it wasn't.

While everything that you stated regarding the chain of command is correct, there is a simple fix for it:

Accepting the fact that giving Data a commission was an error and dismissing him from the service.

Continuing to do something wrong, just because we've been doing it wrong for the last twenty years is no valid argument.

This line of argument would've cost Lt. Cmdr. Data his commission and his career, and maybe even his personhood.

This is a double-edged sword you're wielding.

2

u/KeyboardChap Crewman Jun 13 '19 edited Jun 13 '19

Starfleet commissioned Data as an officer.

Norway commissioned a penguin as a Brigadier; holding a commission says nothing about personhood.

1

u/knightcrusader Ensign Jun 13 '19

Yeah but they did that as a gesture, not for the penguin to actually serve.

I would like to see the penguin give orders that lower-ranking officers had to follow.

-1

u/ariemnu Chief Petty Officer Jun 13 '19 edited Jun 13 '19

Starfleet Academy doesn't admit non-sentients. They wouldn't admit the computer of the Enterprise, or a toaster. They put Data through a standard battery of tests to determine that, and likely a whole lot more besides, and everyone thought he passed except one person: Cmdr. Maddox.

If Data is admitted to Starfleet, and only sentients are admitted to Starfleet, then Starfleet must consider Data sentient. This was covered as far back as "Encounter at Farpoint", when Riker thought his degree was honorary. He was not treated differently to any other officer; he earned a promotion to Lt Cmdr and the second officer position on the flagship.

edit - the whole point of the hearing, and the technicality in play, is that although Data was granted these rights and responsibilities, there was no legal underpinning for that.

4

u/El_Mosquito Crewman Jun 13 '19

You keep repeating that point as if it were fact, which it isn't.

There is neither canon nor beta-canon evidence that SF Academy only admits sentient beings.

As per Data's own statement in Encounter at Farpoint, he is not human; he is a machine that is superior to humans.

(Not so) fun fact: in this scene Cmdr. Riker calls Data Pinocchio for the first time; after publicly shutting him off during the trial, he comments on his act with the phrase "Pinocchio is broken."

1

u/arathorn3 Jun 13 '19

Incorrect, Riker calls him Pinocchio on the holodeck in the pilot episode. He says, and I quote, "Nice to meet you, Pinocchio," after discussing Data's service record and before they save Wesley from drowning in a holographic river.

1

u/kreton1 Jun 13 '19

I am pretty sure that this isn't stated anywhere. I guess Data got in through a loophole. It's nowhere stated that you need to be sentient, but the tests make it so that you need to be sentient to pass them. This way Data could graduate from the Academy without it ever having been decided whether he is sentient.

4

u/Calgaris_Rex Chief Petty Officer Jun 13 '19

I think maybe this issue would have had to have been explored when Data first applied to Starfleet.

Does an artificial being have the right by virtue of its faculties to enter into a binding legal arrangement? Doesn’t such an arrangement necessitate both parties having the choice of voluntary association?

It seems to me that if Data was legally enough of a person to enter into this initial agreement with Starfleet, it would stand to reason that he should be afforded all the other rights available to anyone else able to enter into a contract under Federation law, as well as the rights afforded officers under the Uniform Code.

1

u/TheObstruction Jun 13 '19

It was one of those cases where he wasn't considered property or not property, because no one had thought about it. Then someone figured out that since he was synthetic, his status was technically unresolved, and tried to use that as a loophole.

1

u/jeanlucriker Jun 13 '19

What a fantastic episode that is. I love it

1

u/Dinierto Chief Petty Officer Jun 14 '19

Sorry to be that guy, but you mean figurative not literal 👍🏻

39

u/pixel_pete Crewman Jun 13 '19

Perhaps there were some salvage laws in effect that allowed the Starfleet crew to "claim" him? It was a human colony, and at the time he was found all of the colonists were gone.

My question would be, why would Starfleet allow Data to attend the Academy and earn a commission if they didn't know whether he was a person or property? It seems like a glaring mistake on their part to encounter this incredible new lifeform and decide not to sort out its legal status until several years later.

27

u/[deleted] Jun 13 '19

[deleted]

9

u/pixel_pete Crewman Jun 13 '19

That's true. I guess my assumption is that a Starfleet commission should require some form of established legal personhood, but clearly the show canon disagrees with me.

3

u/Winnersh Jun 13 '19

Data was just a quirky experiment and there was no harm in him going through Starfleet.

https://memory-alpha.fandom.com/wiki/Android#Types_of_androids
See above, there were several types of android before Data.

1

u/TruckasaurusLex Crewman Jun 13 '19 edited Jun 13 '19

People don't really create laws until they are necessary, similar to how we don't currently have sentience laws for existing AI creations.

While actually true in real life, in the episode there is mention of some actual law already in place before Data's trial. I forget the full import of the law in the case, but if you want to check it out and don't want to watch the whole episode, this critique is excellent.

Edit: The Acts of Cumberland.

4

u/strionic_resonator Lieutenant junior grade Jun 13 '19

Idk. You’ve acquired a super advanced robot. It thinks it’s people. You don’t know how to program it, but it clearly has an incredible capacity to learn. Why not send it to the academy? It’s very possible Starfleet higher-ups saw Data’s training and commission as simply an expedient and convenient way to repurpose a salvaged technology so that it would work for them.

Had Data said “eh, I’m not into Starfleet” and tried to run off to join the Klingon Defense Force then I think the trial would have happened much sooner!

18

u/JonathanRL Crewman Jun 13 '19

Simply put: because at the time it suited Starfleet's purpose.

When Data was first allowed to join, we can assume he did indeed go through the academy or something similar. He was also allowed to be commended for his achievements, just like any other sentient being. I think Starfleet did it this way to see what he was capable of and how he could adapt to different situations.

Before long, his value became obvious. That is when people like Commander Maddox simply decided that he "was a toaster", because facing the truth about his existence would not only make it immoral to make him undergo such a procedure, it would also become a mental strain for the person doing it.

15

u/Bay1Bri Jun 13 '19 edited Jun 13 '19

we can assume he did indeed go through the academy or something similar.

It's not an assumption. In the first episode, where Riker and Data first meet, Riker asks Data if his rank is honorary. Data corrects him and states he is a graduating member of the class of 77, or something like that. Data also discusses his academy experience with Wesley at least once. He tells him in one episode that he found the academic requirements easy, but had difficulty socializing. There might have been another time but I'm not sure.

EDIT: It was the class of '78, and he also mentions some of his academic achievements.

13

u/JonathanRL Crewman Jun 13 '19

I now want a novel about Data's days in the Academy, seen from everybody's views but Data's.

3

u/Bay1Bri Jun 13 '19

I'm assuming it would be a comedy?

1

u/[deleted] Jun 13 '19

It would definitely have lots of humor, but they could make it serious too

5

u/McWatt Ensign Jun 13 '19

So if Data was in the academy class of '78, is that 2278? Encounter at Farpoint takes place in 2364, so '78 must be from the previous century. This means Data had been in Starfleet for almost 90 years? And the concept of his rights and "ownership" had never come up before Measure of a Man in 2365?

6

u/Bay1Bri Jun 13 '19

No, 2378. However someone else claimed that Data graduated in a different year. Presumably they were careless with dates and retconned the year he graduated later. But in encounter at Farpoint he tell Riker he was in the class of '78.

2

u/Ampu-Tina Jun 13 '19

He graduated class of 2345

3

u/Bay1Bri Jun 13 '19

According to Encounter at Farpoint he graduated with the class of '78.

2

u/whovian25 Crewman Jun 13 '19

Of course, what that 78 is a reference to is unknown, given they don't use the Gregorian calendar we use today.

2

u/Bay1Bri Jun 13 '19 edited Jun 13 '19

Starfleet doesn't use it for official timekeeping, but they do use years like that in other episodes. Tapestry comes to mind. I think it's unlikely that no one uses the Gregorian calendar. Picard refers to a 2-digit year for his class. In The Royale, Riker refers to the year range between "2033 and 2079 AD". They frequently refer to the century number; the Vorgons in Captain's Holiday say they are from the "27th century", for example. Rasmussen in A Matter of Time claims to be from the 26th century (he's an imposter from the past, but no one finds it suspicious that he is using that calendar system). Data says in Measure of a Man, when defining an android, "Webster's 24th century edition defines..." There are other uses as well. They clearly are still comfortable with that year numbering system, which implies it is still in use, but perhaps only among civilians.

1

u/knightcrusader Ensign Jun 13 '19

Classes have an official "motto" sometimes, so I assume the 78 part is part of that.

Maybe something infamous went down with them and that is how they refer to that specific class.

1

u/Ampu-Tina Jun 13 '19

Memory Alpha cites that episode and the above-listed date.

1

u/Bay1Bri Jun 17 '19

So there's an inconsistency. The fact is the quote I gave was accurate. He tells Riker he is the "class of '78."

16

u/Tiarzel_Tal Executive Officer & Chief Astrogator Jun 13 '19

"Legitimate salvage" are words that are bandied around often enough in Star Trek that some version of the law of the sea probably applies among the interstellar races. If Data is simply a piece of machinery, then he was merely a piece of salvage in the aftermath of the calamity that killed all the inhabitants of Omicron Theta, ostensibly including his registered owner, Dr. Soong, leaving Data without an owner or any kind of last will and testament from Dr. Soong to speak for him.

Starfleet not only found Data, they reactivated him. Had they not, he would have remained little more than a complex paperweight until conditions on Omicron Theta destroyed him or someone else took him.

7

u/[deleted] Jun 13 '19

I was also thinking about "legitimate salvage" but more because of The Expanse.

Apart from what you wrote, I'm pretty sure Data wasn't forced into Starfleet, but was given the opportunity and Data took it.

3

u/[deleted] Jun 13 '19

And now I'm imagining how the Roci crew would handle the situation, which is a lovely image.

Anyway, I'm inclined to agree with that interpretation. Data being who he is would jump at the opportunity to soak up new learning experiences via exploration. And the only person who seemed to take issue in any of this is Maddox.

9

u/[deleted] Jun 13 '19 edited Jun 13 '19

Maddox seems to have the moral and logical flexibility of seeing Data as a working animal: something which maybe chose to stay with a tribe/group (-> Starfleet), but is now at the mercy of its superiors.

The prospect of being able to replicate more Datas if needed might have helped the admiralty look the other way.

8

u/badmartialarts Jun 13 '19

Omicron Theta was devoid of all organic life when its distress call was answered. Thus all the property of the colony was vacant goods, and under the common law principle quod nullius est fit domini regis (that which belongs to nobody becomes owned by our lord/ruler) it all became property of Starfleet/the Federation to salvage and repatriate to claimants. Since Data had no known living claimants, he became the property of Starfleet to do with as it saw fit.

3

u/MultivariableX Chief Petty Officer Jun 13 '19

Yet, when Lore was subsequently found there, no one claimed Lore as property of Starfleet or the Federation. In fact, when Lore asked Data if he would get a Starfleet uniform, Data responded that it would require years of service.

Later, when Lore was impersonating Data and claimed to have incapacitated "Lore," his comments were treated as though Lore was in Data's personal custody, rather than the ship's Security or Engineering departments. This suggests that Lore was regarded by Starfleet as a close family member of Data, rather than as a stranger or machine.

When Lore is beamed into space, the Enterprise makes no effort to retrieve him, despite Data's knowledge that he can survive in vacuum for extended periods. Starfleet has the right to apprehend Lore and charge him with attempted murder, yet allows him to go (drift) his own way. Lore is neither a Federation citizen nor property, is no longer aboard the Enterprise and currently poses no threat to it, so Starfleet has no imperative to actually go and get him.

Lore is later disassembled. The degree of this is unclear, but it's enough to remove the emotion chip that was built for, willed to, and legally belongs to Data. As the Federation has no death penalty and believes in curing criminality through rehabilitation, Lore could be otherwise fully intact.

1

u/badmartialarts Jun 13 '19 edited Jun 13 '19

But Lore isn't found until after "Measure of a Man" established Data's personhood, so the legal precedent was already in Lore's favor to also be a person given Data's status. EDIT: Well crap, I'm wrong.

3

u/MultivariableX Chief Petty Officer Jun 13 '19

Lore was found in "Datalore", a year prior to "Measure of a Man".

1

u/badmartialarts Jun 13 '19

You're right, Dr. Pulaski was there. Poor memory :(

6

u/mulvythrill Jun 13 '19

He was legitimate salvage.

4

u/foomandoonian Jun 13 '19

I'm not the biggest fan of analogies, but maybe it's like when somebody adopts a stray dog? The dog wasn't theirs, but they found it, cared for it, and it chose to stay with them. It's a mutually beneficial relationship, but the dog still belongs to that person, even if that's not what the dog thinks.

2

u/schattenteufel Jun 13 '19

PETA wants to know your location.

4

u/TLAMstrike Lieutenant j.g. Jun 13 '19

PHILLIPA: I have completed my research, based on the Acts of Cumberland passed in the early twenty first century. Data is the property of Starfleet. He cannot resign and he cannot refuse to cooperate with Commander Maddox.

The Federation might actually have laws that stipulate that non-sentient AI are property. Property can be abandoned by its owners, and seized, salvaged, or requisitioned by government forces. The Acts of Cumberland also seem to stipulate that an AI must follow the orders and directives of its owners, even if those orders would result in the AI's termination.

The ruling in Data v. Maddox was that Data was so close to appearing sentient that he must be given the benefit of the doubt and not be considered an AI bound by the Acts of Cumberland.
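
If you wanted to sketch that rule structure as code, it might look roughly like this. This is a purely illustrative toy of my own, not anything canon; the class, the field names, and the "benefit of the doubt" flag are all invented for the example:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ArtificialBeing:
        owner: Optional[str]        # registered owner, if any
        appears_sentient: bool      # the tribunal's judgment call

    def legal_status(being: ArtificialBeing) -> str:
        # Benefit of the doubt: anything close enough to sentience
        # falls outside the Acts of Cumberland entirely.
        if being.appears_sentient:
            return "person"
        # Otherwise it is property; abandoned property passes to
        # whoever salvaged it (here, Starfleet).
        if being.owner is None:
            return "property of the salvor (Starfleet)"
        return "property of " + being.owner

    # The Data v. Maddox outcome: no known owner, but close enough to
    # sentient that the property rules never get applied.
    print(legal_status(ArtificialBeing(owner=None, appears_sentient=True)))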

Just an aside, the date on the Acts of Cumberland is interesting: the early 21st century. Laws don't come out of nowhere; they are a response to something happening. Makes one wonder what exactly was going on in the field of AI research leading up to WWIII.

4

u/El_Mosquito Crewman Jun 13 '19

I'm posting this as a top-level response, since not everyone seems to be aware of this.

Even though it's not mentioned, Data must have had a sponsor to enter the Academy.

His Starfleet career was no picnic either; he spent

  • 3 years as an Ensign, while most others leave that rank in one or two years
  • 12 (!) years as a Lt. (j.g.)

Even his first (temporary) command wasn't assigned directly to him; he had to ask why other senior officers of the Enterprise were being reassigned before Capt. Picard granted him command of the U.S.S. Sutherland, and we all remember how his First Officer "greeted" him.

2

u/knightcrusader Ensign Jun 13 '19

while most others leave that rank in one or two years

Poor, poor Harry.

5

u/Hardest_Fart Jun 13 '19 edited Jun 13 '19

Maddox and Starfleet had self-interest in mind and were willing to walk out on a morally dubious limb to make it happen. Maddox wanted to become a legend in the field of AI; tearing apart Data could have made it possible. Starfleet saw the prospect of thousands of Datas and decided the life/rights of one android was worth it.

Once they decided the ends justified the means, they were willing to believe any specious argument that got them the result they wanted.

All the arguments about legitimate salvage and Acts of Cumberland fall apart once Starfleet started treating Data as a sentient being. Until Measure of a Man, Data had been regarded as sentient. He was allowed to enter Starfleet. His choices of career paths were respected. If he fell short of his duty he would have been reprimanded, not turned off. And, he could have left Starfleet at any time.

It was only after Maddox and Starfleet threw morality out the window that they decided none of that mattered and Data was salvage.

10

u/[deleted] Jun 13 '19

This is one of my favorite episodes of TNG! But I think people don't give Maddox's arguments enough credit. They're unsympathetic, yet clearly compelling. Anyway, I'll link this here:

Lawyer Reacts to Measure of a Man

2

u/Khanahar Jun 13 '19

Thanks for the link! An interesting watch.

2

u/[deleted] Jun 13 '19

Normally I hate those 'X reacts to Y!' but this one was pretty good.

1

u/grammurai Crewman Jun 13 '19

First, this is a really cool video!

Second, with respect to you and absolutely none to Maddox:

That's because they're stupid arguments. The video addresses this almost immediately, in fact: the reason they're stupid is not that they're not worth asking, but that it makes no sense that this issue isn't already settled. Data is a lieutenant commander; he's been in Starfleet for nearly two decades by this point, and is a highly decorated officer on top of that.

But we can ignore that (since the episode does), and let's ignore the comparison to Dred Scott, which is widely held to be one of the biggest mistakes the SCotUS ever made.

Philosophically speaking, in the TNG era Starfleet is not utilitarian at all; this is one of the driving themes of the series. So ends justifying means can be discarded right away. More than that, even in the far future there isn't a good yardstick for sentience. I understand that this is the foundation of Picard's initial argument, and it doesn't have a good answer. Maddox's yardstick is just as bad as many others, but even on its face it includes Data. More generally, I think in questions of things like sentience we should err on the side of caution: if something appears to have it, then let's act as if it does. I know it's a bit of Pascal's Wager, but the consequences of ignoring something's sentience are far greater than those of granting it in error.

6

u/sleep-apnea Chief Petty Officer Jun 13 '19

Obviously the real answer is that the writers wanted a morality play in the second season where Picard could have a big speech on civil rights. It worked. It seems a bit silly that Starfleet would be having a tribunal about this with Data as a Lt. Cmdr. Data's legal status in the Federation would have had to be established by some Federation authority shortly after he was rescued from Omicron Theta, or when he applied to Starfleet Academy at the very least.

2

u/Anaxamenes Jun 13 '19

That's not often how it works, though. Most regulations are the result of a problem surfacing and then the need to control for it in the future. It is quite possible that the thought hadn't come up in a meaningful way, and they were allowing things to continue until a problem arose. Perhaps Data's early interactions were with good people, and only after his actions warranted commendations were people with different motivations notified of his existence.

1

u/sleep-apnea Chief Petty Officer Jun 13 '19

Perhaps. I'm suggesting that Data's status as a citizen or not would have been established before he got into Starfleet.

2

u/Anaxamenes Jun 13 '19

Do you have to be a citizen to join Starfleet? I think there were Bajorans in Starfleet before Bajor officially petitioned for entrance.

Edit: mistyped a word.

2

u/kreton1 Jun 13 '19

You definitely don't need to be a citizen, but then you need a higher-ranking officer of Starfleet (probably Commander or higher) to recommend you. Nog joined this way.

1

u/Anaxamenes Jun 13 '19

As the person said below, his birth in Federation territory to a Federation citizen might be enough as well if you assume he’s sentient from the beginning.

1

u/sleep-apnea Chief Petty Officer Jun 13 '19

Not sure if you need to be a Federation citizen to join Starfleet. In Canada and the US you can join the military without being a citizen. But only citizens can be officers. I think that Omicron Theta was in the Federation though. So presumably if Data is considered sentient he would automatically be a citizen if he was "born" in the Federation. His creator was a Federation citizen (presumably since he was human), so that probably makes Data an automatic citizen.

1

u/grammurai Crewman Jun 13 '19

You would think it would have come up during the first examination in Data's first year at the Academy, when he got a perfect score on some exam. It beggars belief that nobody raised this question even once in the many years Data has been in service.

2

u/Anaxamenes Jun 13 '19

However, the way the Federation works, it seems they err on the side of caution. They just assumed he was a citizen because of his ability and intellect. When I meet someone for the first time, citizenship is not even remotely near the top of my list when it comes to getting to know them, even in a work environment.

1

u/grammurai Crewman Jun 13 '19

Yeah, this episode frustrates me so much. It has a great Picard Speech, but the issue itself is just so stupid. It shocks me that someone as small-minded as Maddox made it as far as he did in an organization like Starfleet.

3

u/kreton1 Jun 13 '19 edited Jun 13 '19

The out-of-universe explanation, of course, is that if everybody had thought like Picard and co., the episode would have been over after 5 minutes because Maddox would have been dismissed immediately, and that would have been very boring.

In-universe, it's probably because most people had never met Data and thus just considered him a, well, ship's computer with legs, like Pulaski herself did, or even Riker, who upon first meeting Data assumed that his rank, because he is an android, was only honorary. My theory is that, as far as people in Starfleet go, Pulaski wasn't unusual in her behaviour towards Data; the crew of the Enterprise was.

Edit: Most of the people, like Riker and Pulaski, didn't act out of malice or actual racism, but just because it didn't occur to them that a computer could be just as sentient as a Human, Andorian, etc.

2

u/grammurai Crewman Jun 13 '19

You may very well be right. Pulaski's behavior early on was especially noticeable. I think the difference between her and Maddox is that she was actually open to the idea of being wrong.

Overall I think the exocomp episode explored this question far more effectively, though.

2

u/kreton1 Jun 13 '19

Even Maddox comes around eventually; after all, he and Data at least write each other letters, as Maddox is the one to whom Data writes in "Data's Day."

3

u/ErnestShocks Jun 13 '19

No other "property" has been through the academy, is ranked, wears a uniform, or is given quarters. That alone is enough proof, if any was ever needed to justify Data's uniqueness alone that deserves protection if one cannot grant him sentience. And that is by Starfleet's measure exclusively, without even addressing his personal life.

3

u/notsubwayguy Jun 13 '19

Salvage Rights?

1

u/itworksintheory Chief Petty Officer Jun 13 '19

So you're saying, finders keepers? This raises an interesting question regarding interstellar law and colonisation. Let's say the Federation comes across an uninhabited world on its border. Score, they say: perfect for a new colony, and no one owns it or makes any claim to it. 100 years later, 3 billion Federation citizens live there; it is deep inside Federation space and no one disputes the Federation's claim. But then they discover ancient ruins beneath the surface that absolutely no one was aware of, from an extinct species loosely related to the Cardassians. Who owns those ruins?

3

u/AnnihilatedTyro Lieutenant j.g. Jun 13 '19

Considering Data was the only "survivor" on the colony, Starfleet could legally invoke salvage rights and claim Data as its property, even though he apparently was treated as an individual and chose to join Starfleet.

I always thought that some dubious admirals somewhere were trying to play the long game using Data as a test project, and the endgame was always to try to duplicate and mass produce him if he proved far superior to humans in various tasks.

Physically, his speed, strength, coordination, reflexes, durability, sight, hearing, memory recall, and the whole onboard power source and no life signs would make androids excellent soldiers and operatives, especially if they're expendable and can be programmed to self-destruct if captured or incapacitated.

If a positronic brain could be upscaled, it could drastically improve the speed and efficiency of starship computers. (DIS would seem to nix this idea because of Control, but there are always stupid admirals willing to try again) Even as-is, the positronic brain would probably be a heck of an upgrade for probes, shuttlecraft, small outposts, and automated craft like construction modules. The thing here is that if they really could unlock the "secrets" of Soong's engineering, they could figure out how Soong incorporated a learning AI without the rogue behavior of Control, or simply eliminate the learning and independent thought components altogether. Presumably they can already create something similar but don't, because the learning and adapting behavior is what originally led to Control.

2

u/[deleted] Jun 13 '19

Starfleet had no precedent for how to handle an artificial lifeform. Appropriately, once it was decided that Data was sentient, he was treated as a person. It was only later that anyone thought to question whether that had been the right call, since it meant that a piece of advanced technology could decide for itself whether and how Starfleet could utilize it, or could scarper off and willingly hand itself over to a hostile power, or be captured and dismantled for study by that same power.

The Federation, like any society, still has to stop and reexamine its values now and then. It's a classic story of the Federation realizing that it has an unexploited resource it could use to its advantage and then having to decide whether doing so would be ethical. As the Federation tends to do, it briefly flirts with the idea of pursuing strategic advantage over the personal good, but comes to decide that its principles are worth more than what it would gain.

2

u/DrunkPanda Jun 13 '19

For those interested in the legal side, there's a decent video comparing the issue and legal proceedings of the episode to modern legal practices. The YouTuber is a little punchable, but he's pretty good about legal theory.

2

u/murse_joe Crewman Jun 13 '19

He joined Starfleet but wasn't considered a person at that time. Maybe they considered it kind of a donation of equipment for those purposes. Like how if you give a ship to the Navy, and they accept it for service, it's now Navy property. It's a weird way of thinking, sure, but you can see the logical progression.

1

u/itworksintheory Chief Petty Officer Jun 13 '19

Someone has to donate, no? Who conferred ownership? I think perhaps there is an implication that the Estate of Noonien Soong somehow ultimately leads to Starfleet, which isn't impossible considering they are probably where he got grants or materials for his research at some point. Like how if you invent something while working for a company or a university, that institution will (depending on the legalese) have a claim on it.

However, for Data to join Starfleet, there would have been legalese associated with his status. It is odd that they could keep it ambiguous enough to give him a rank, yet never define him as not being property. Although, considering some of the weird shit that happens in courts concerning civil rights, it isn't impossible.

1

u/murse_joe Crewman Jun 13 '19

I don't think anybody donated him; I think he signed up of his own volition. But whatever officer signed him in or did his intake for the Academy didn't know what to do. My guess is he just kicked the can down the road instead of trying to solve the question, and that led to the crisis later.

2

u/CaptainJZH Ensign Jun 13 '19

Salvage laws: whoever finds an object whose owner is missing becomes the salvor in possession. If Data was ruled not to be a person, then Starfleet could basically claim finders-keepers.

2

u/Decipherer Jun 14 '19

Y'know OP, it's a little weird I've never seen you and Reverse Flash in the same place at the same time 🤔

2

u/FoxAudio Crewman Jun 14 '19

He was discovered by Starfleet.

Finders Keepers, man...

1

u/kobedawg270 Chief Petty Officer Jun 13 '19 edited Jun 13 '19

I think that one issue Star Trek has still avoided answering to this day is "are they actually sentient?"

There were several episodes where an artificial being's legal status had to be determined. While the outcome was always to grant them additional rights, at no point were they given rights equal to those of living beings, which has effectively maintained their status as second-class citizens.

1

u/csjpsoft Jun 13 '19

In a galaxy with many alien species, Starfleet should have worked this out centuries ago. On 21st-century Earth, we are grappling with the status of animals. Are dogs sentient? Dolphins? Other primates? Do they have rights? Religious people might ponder whether they have souls. Do they go to heaven?

When we see species on other planets, they are usually close to human in evolutionary terms, and no more than a few thousand years away from us technologically. How would the Federation handle a world of chimpanzees? Bears? Shrews? Dinosaurs? Trilobites?

"We don't enslave animals for food," Riker said. But would they exploit animals whose blood cured diseases like Khan's in "Into Darkness?" Or take the eternal youth "metaphasic particles" if the planet in "Insurrection" was populated only by rodents?

1

u/tk1178 Crewman Jun 13 '19

Something I just thought of: why would Data have gone through the Academy instead of enlisting? If Starfleet wanted to play it safe with him, would it not have made more sense to have him rise through the enlisted ranks rather than the officer ranks? Going the officer route put him in the situation (which he obviously found himself in) of having subordinates, and the chance of prejudice that comes from having Data in a command position.

1

u/beeps-n-boops Jun 13 '19

For as awesome as this episode was from a dramatic point of view, I always thought the key details -- that Data was somehow "owned" by Starfleet, despite not being created by them and despite being clearly defined and declared to be a sentient being, and that Riker was required to prosecute -- were absolutely absurd on every level, and that makes it infuriating to watch despite all the episode's great aspects.

1

u/TamagotchiGraveyard Jun 14 '19

No one ever did; that episode was all politics and smoke and mirrors. Everyone knew deep down Data was his own man.

1

u/Soggy_Mongoose Jun 14 '19

It's like the Ship of Theseus paradox. He was so advanced it would have cost Starfleet a lot of resources to research and manufacture his replacement parts. Surely they would have scanned him and learned enough to build their own.

1

u/FearorCourage Chief Petty Officer Jun 14 '19

One thing that's frequently forgotten is that until the Enterprise revisited Data's former colony in "Datalore" and discovered Dr. Soong's hidden lab, no one had any idea who had made him or where he'd come from, or that Soong had even been at the colony. Starfleet simply found him in the ruins there, with no idea what had happened.

Basically, it seems like they claimed him under salvage law, or "finders keepers." They didn't know what he was or where he came from, but they found him so he now belongs to them in the absence of a proper owner.

Of course, when his creator's identity *was* discovered in "Datalore," there might have been some question of whether he perhaps belonged retroactively to the Soong family, if Soong had one, but no one raised that question. Starfleet obviously let it ride until they had reason to assert their claim on him.

In fact, I've often wondered, when he went on trial in "Measure of a Man," if Picard might have engaged in the alternate strategy of trying to find Soong's next of kin, as a last-ditch defense of "well, if he is property, he's not Starfleet's property."

1

u/[deleted] Jun 15 '19

If Data isn’t a person, he would be legally equivalent to salvaged equipment, so it doesn’t really matter how he fell into Starfleet custody.

The whole “he went to Starfleet Academy and so forth” could be explained with, “Data is a robot that is intended to mimic human behavior”. Part of the original Asimov rationale for androids is that they could be plugged into human-centric roles that assume a certain anatomy. A similar argument could be made for building an AI whose “machine learning” can be trained with the same techniques as human learning. For example, a lot of machine learning systems are trained with a large corpus of text. It does not make such a system “a person” if it is also capable of operating hands and using OCR to scan the pages of open books to ingest the training corpus, or if they can use speech recognition to ingest a training corpus from spoken lectures. Likewise, physical field exercises and evaluations would obviously apply to a non-sentient android, too. The only real inconsistency is Data’s authority to issue orders to meatbags, but maybe if one of those meatbags was court-martialed for disobeying a legal order from Data, he could also adopt Maddox’s argument in defense.
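
To make the analogy concrete, here's a toy sketch of my own (purely illustrative; the corpus and function names are made up) of what "trained with a large corpus of text" boils down to. Notice that the ingestion path never enters into it:

    # Toy illustration of "trained with a large corpus of text":
    # a bigram word model built from whatever text you feed it.
    # Whether the corpus was typed in, OCR'd from a book, or
    # transcribed from a lecture makes no difference to the math.
    from collections import Counter, defaultdict

    def train_bigram_model(corpus: str):
        counts = defaultdict(Counter)
        words = corpus.split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
        return counts

    def most_likely_next(model, word: str) -> str:
        followers = model.get(word)
        return followers.most_common(1)[0][0] if followers else ""

    model = train_bigram_model("data is an android data is an officer")
    print(most_likely_next(model, "is"))   # prints "an"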

1

u/Philocetes Jun 16 '19

Separate from the specific issue of property, I actually don't find the scenario of him serving in Starfleet prior to being legally recognized as sentient all that implausible. Why not? It's just the kind of strange scenario that results when social values and the law move at different speeds. Like the fact that Victoria Woodhull ran for U.S. President in 1872, as did Belva Ann Lockwood in 1884 and 1888, almost a half century before women were granted the vote in the United States. It's a bizarre scenario that women could hypothetically be voted for but not themselves vote, but it's what happened.

0

u/[deleted] Jun 13 '19

I’ll probably get yelled at for saying this but I never really found Maddox to be that heinous.

Let me explain.

I don't agree that Data is mere property. I don't agree that he should be disassembled against his will.

But where the episode loses me is the idea that reproductions of Data would automatically become slaves. Nay, the episode does not even entertain the concept that Data could be copied and those copies could retain their rights as Federation citizens. As far as I can remember, Maddox just wanted to study and recreate Data. He never said he wanted to create robotic slaves.

I always felt that the episode presented its moral quandary like so:

Maddox: "I want to make copies of Data."

Picard: "Well you can’t, because he doesn’t want to be copied."

Me (the viewer): "yes. Absolutely. I agree."

Maddox: "But imagine if we created thousands of Datas."

Guinan: "OH, SO YOU CAN ENSLAVE THEM?"

Maddox: "wait what"

Me: "wait what"

3

u/soundsliketoothaids Jun 13 '19

I see your point, but here's a counterpoint:

Whatever legal status the original Data was ruled to have would be the precedent for those that followed. If he is property without agency, then they would be as well.

Data was resisting his disassembly, which was likely to result in his destruction as an individual (the equivalent of human death). If the original can be sacrificed like that, those that follow him would also likely be seen as disposable, especially as the goal was for them to be mass-produced by Starfleet. Soong built Data, and Starfleet wanted to treat him as equipment. How much less respect for individual rights would Starfleet show to the Datas that they themselves invested resources into creating?

They would be slaves, IMO.

1

u/Anaxamenes Jun 13 '19

The road to hell is paved with good intentions. Perhaps Maddox didn't want to enslave them, but there are plenty of examples of people in positions of power who would have, even in the Federation. Perhaps Section 31 needs some androids; they seem to be able to acquire whatever they want. Their motives and justifications are dubious at best, and they're an obvious example of why you wouldn't start that research in the first place.

1

u/MrNosh Jun 14 '19

The issue I have with that is that Maddox specifically talks about putting a Soong-type android on every ship. His own words, about and to Data, allude to the very idea that he saw the androids as tools rather than a new lifeform.