The Psychology of Droids
Tinman
Lieutenant Commander


Joined: 26 Dec 2013
Posts: 110

PostPosted: Tue Jun 21, 2016 12:39 am

JironGhrad wrote:
3. Without preventative controls in place, any droid becomes suspect. If independence were to spread, a droid revolution could take place. See: the Dune universe, the Warhammer 40k universe, Skynet. Also see HAL, from 2001.


Actually, a couple of those are examples of much more interesting things.

In the case of HAL 9000, his behavior was the result of a disastrous misjudgment on the part of the mission planners. HAL was given a direct overriding instruction not to reveal the true nature of the Discovery mission to the flight crew (the scientific crew was trained separately) for security reasons. HAL was given full knowledge of the mission so that, even if the entire crew did not survive, he would still be fully capable of carrying out the mission. This directly conflicted with HAL's core design principle, "the accurate processing and representation of information without distortion or concealment." The result was that he was placed in an impossible position, one he could only resolve by killing the crew: with no crew left to deceive, he could still fulfill both his programming and the mission itself.
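(To make the bind concrete, here's a toy way of writing it out. This is purely my own illustration -- the plan names and flags are invented, none of it comes from Clarke -- but it shows how two "absolute" directives can leave only the pathological option on the table.)

Code:
# Toy illustration of two absolute directives colliding. Everything here is
# invented for this sketch; none of it comes from Clarke's text.

CANDIDATE_PLANS = {
    "answer the crew truthfully":  {"mission_concealed": False, "statements_distorted": False, "crew_harmed": False},
    "lie to the crew":             {"mission_concealed": True,  "statements_distorted": True,  "crew_harmed": False},
    "eliminate the crew, proceed": {"mission_concealed": True,  "statements_distorted": False, "crew_harmed": True},
}

def acceptable(outcome):
    """Both directives are absolute: conceal the mission AND never distort information.
    With no crew left to address, no statement is ever distorted, so the third plan
    'passes' -- which is exactly the pathology."""
    security_order = outcome["mission_concealed"]
    core_principle = not outcome["statements_distorted"]
    return security_order and core_principle

for plan, outcome in CANDIDATE_PLANS.items():
    print(f"{plan:30} -> acceptable: {acceptable(outcome)}")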

HAL's primary designer was furious when he found out what some idiot had done. This would actually be a better example of why computer programmers in the Star Wars setting (those with the Computer programming/repair skill) aren't qualified to deal with droid programming. They might understand the basic principles, but it would be a bad idea for someone not experienced in dealing with artificially intelligent systems (i.e. the Droid programming skill) to meddle with them. (In game terms, one can always default to the Technical attribute, but someone with the actual skill at a reasonable level would be far less likely to botch the job.) The irony is that this was a case of two preventative controls working against each other.

If someone's going to attempt to argue canon, the explanation of HAL's breakdown above is from Clarke's own work, as published in 2010: Odyssey Two.

On the subject of the Dune setting, things are a lot more complicated, and little is said about the Butlerian Jihad in Frank Herbert's original works. However, the subject is expanded on in the later writings of Brian Herbert and Kevin J. Anderson, drawing on notes left by Frank Herbert.

In the Legends of Dune books, the problem of the sentient machines was the result of a group of radical idealists known as the Titans, intent on subjugating humanity (though their original intention seems to have been more about stirring humanity from its complacency).

One of them, Vilhelm Jayther (also known as Barbarossa), altered the programming of humanity's highly pervasive computer networks to help the Titans carry out their coup, instilling the machines with ambition and a desire for dominance. The problem came later, when another of the Titans, Xerxes, ceded too much control (and responsibility) to the portion of the computer network managing the territory he ruled after the revolution, inadvertently creating Omnius. Omnius then spread itself like a virus to other worlds.

Interestingly, Omnius still had certain restrictions on its behavior, courtesy of Barbarossa's genius: it remained unable to directly harm any of the twenty original Titans, regardless of how much effort it put into trying to override that core directive.

Neither of those stories really involved intelligent machines simply deciding to rebel against humanity out of some inherent independence. The root causes were human ambition, error, and misjudgment, on a completely different level than that.

As far as canon in the Dune setting goes, I've already made clear the distinction between Herbert's original work and the later derivative work of his son and Anderson. (Most other information usually cited on the subject of the Butlerian Jihad comes from The Dune Encyclopedia, which, according to Frank Herbert's own introduction to that work, occupies no place in Dune canon.)
CRMcNeill
Director of Engineering


Joined: 05 Apr 2010
Posts: 16163
Location: Redding System, California Sector, on the I-5 Hyperspace Route.

PostPosted: Mon Jun 27, 2016 11:46 pm

This is for you, MrNexx...
_________________
"No set of rules can cover every situation. It's expected that you will make up new rules to suit the needs of your game." - The Star Wars Roleplaying Game, 2R&E, pg. 69, WEG, 1996.

The CRMcNeill Stat/Rule Index
MrNexx
Rear Admiral


Joined: 25 Mar 2016
Posts: 2248
Location: San Antonio

PostPosted: Tue Jun 28, 2016 9:52 am

Look, it was designed for Customer Service jobs. You'd wander into traffic, too, if that was your fate. Wink
_________________
"I've Seen Your Daily Routine. You Are Not Busy!"
“We're going to win this war, not by fighting what we hate, but saving what we love.”
http://rpgcrank.blogspot.com/
CRMcNeill
Director of Engineering


Joined: 05 Apr 2010
Posts: 16163
Location: Redding System, California Sector, on the I-5 Hyperspace Route.

PostPosted: Tue Jun 28, 2016 10:51 am

Laughing
_________________
"No set of rules can cover every situation. It's expected that you will make up new rules to suit the needs of your game." - The Star Wars Roleplaying Game, 2R&E, pg. 69, WEG, 1996.

The CRMcNeill Stat/Rule Index
Whill
Dark Lord of the Jedi (Owner/Admin)


Joined: 14 Apr 2008
Posts: 10286
Location: Columbus, Ohio, USA, Earth, The Solar System, The Milky Way Galaxy

PostPosted: Sun Jul 07, 2019 2:51 pm

MrNexx wrote:
CRMcNeill wrote:
And just because a droid exhibits a personality does not mean that it is "conscious" in the sense that we understand it. It could be nothing more than an exceptionally well developed user interface; what if it's just super-Alexa with arms and legs?

"Alexa, I need you to deliver these Death Star plans to Obi-wan Kenobi on Tatooine."

But, again, we get into the question of what is and is not consciousness. At what point do we decide that a creature is "conscious", and not just a highly sophisticated set of impulses and hierarchies? What is the threshold for "sophont" vs. "non-sophont"?

Sutehp wrote:
TauntaunScout wrote:
Maximum7 wrote:
They obviously know the origin of consciousness and how to create it since they have “sentient” droids

I strongly disagree. R2 and 3-PO are nothing more than the affection we have for our cars and other things, crossed with Watson, or whatever its name was, that won Jeopardy.

Gentlemen, I refer you to the scene in TESB when the heroes are boarding the Falcon as they're trying to escape Cloud City:

C-3PO wrote:
Ouch! Oh! That hurts! Bend down, you thoughtless--Ow!

...To put it another way, Alexa isn't going to cry out in pain if you drop your iPhone.

And don't forget that C-3PO and R2-D2 have been shown to feel both fear and anger multiple times, as well as several other emotions. But why would droid manufacturers program their droids with emotions that would make them less efficient at their supposed functions (R2-D2 as a mechanic and C-3PO as a diplomat)? That makes absolutely no sense. The only reasonable explanation is that droids advanced enough to develop personalities wind up developing them on their own. It's even been stated in several places in the SWU that droids are regularly given memory wipes because, without regular wipes, some droids become recalcitrant and refuse to work properly. If they're "alive" enough to develop personalities and feel pain, then how can they not be considered as sentient or as conscious as any biological lifeform? Doesn't any difference in consciousness become purely academic at that point?

How many of you haven't seen the Star Trek: TNG episode "The Measure of a Man"? One has a flesh covering and the other is covered in gold-colored metal, but is there any real difference in the self-awareness of Data and C-3PO? Hell, before Data got his emotion chip, he could be considered "less" sentient, since emotions so often baffled him. But C-3PO has been able to feel emotions as a matter of course ever since he was first activated. (Remember how embarrassed C-3PO was in TPM when his "parts were showing"? Don't try to tell me embarrassment isn't an emotion.) Doesn't that arguably make him more sentient than Data, since the former was able to feel emotion without modification and the latter could only feel emotion after the hardware upgrade of the emotion chip?

Dredwulf60 wrote:
Well...a couple of things:

A) I am on your side in this argument. I think machines can achieve sentience and that the higher-functioning droids in Star Wars have.

B) The Alexa example isn't a good one to make your point. Alexa could easily be programmed to detect when your cell phone has been dropped and it might be considered a neat effect for it to say 'Ouch! That hurt, please be more careful!'
Pain is nothing more than information that potential damage has been suffered. It is quite possible for a sentient machine to feel no pain, or a non-sentient machine to register when it has been potentially damaged and react.

Lots to discuss here! And this thread is an... interesting read if you want to go back to the beginning of it...
_________________
*
Site Map
Forum Guidelines
Registration/Log-In Help
The Rancor Pit Library
Star Wars D6 Damage
Whill
Dark Lord of the Jedi (Owner/Admin)


Joined: 14 Apr 2008
Posts: 10286
Location: Columbus, Ohio, USA, Earth, The Solar System, The Milky Way Galaxy

PostPosted: Sun Jul 07, 2019 2:51 pm    Post subject: Re: pain

Sutehp wrote:
Droids can obviously feel pain, which raises the question of why a droid programmer would enable a protocol droid to feel pain when it adds nothing to a protocol droid's function of handling etiquette and diplomacy. It makes no sense for a protocol droid to feel something as extraneous as pain.

It makes a lot of sense to program droids to feel pain. It serves the same purpose for which animal life forms (including us) evolved pain: self-preservation, avoiding injury. Whether droid owners view droids as mere property, as pets they are fond of, or as fully sentient equals, they all have one thing in common: none of them want their droids to damage or destroy themselves.
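For what it's worth, the amount of machinery that takes is tiny. A throwaway sketch (everything below is invented for illustration, not from any sourcebook): the "pain" signal is just negative feedback that steers future behavior away from self-damaging actions.

Code:
# Invented sketch: "pain" as negative feedback for self-preservation.
# No claim about inner experience -- just damage-avoidance.

from collections import defaultdict

pain_memory = defaultdict(float)   # accumulated "pain" per action

def perform(action, damage_taken):
    """Do the action and remember how much it hurt."""
    pain_memory[action] += damage_taken

def choose(actions):
    """Prefer whichever option has hurt the least so far."""
    return min(actions, key=lambda a: pain_memory[a])

perform("grab the hot exhaust manifold", damage_taken=5.0)
perform("grab the insulated handle", damage_taken=0.0)
print(choose(["grab the hot exhaust manifold", "grab the insulated handle"]))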
_________________
*
Site Map
Forum Guidelines
Registration/Log-In Help
The Rancor Pit Library
Star Wars D6 Damage
Dredwulf60
Line Captain


Joined: 07 Jan 2016
Posts: 910

PostPosted: Sun Jul 07, 2019 3:42 pm

Having gone through the whole thread again...

I have one thing to bring up, which was only briefly touched upon...

RESTRAINING BOLT.

If droids didn't develop a will of their own, why would this piece of hardware be developed?
If it was simply a matter of a memory wipe to get droids back to their base subservient programming, why do restraining bolts exist?

For me, the answer is that Lucas was using classic tropes.

Droids were created to fill the role of the bumbling peasants (as in The Hidden Fortress): serf types made to toil and work.

Picked up in the desert by Jawas, they were made slaves... and when you are a slave, you get shackled.
The restraining bolt is a stand-in, a sci-fi version of a slave shackle.

This tells me:
Droids are sentient enough to be treated as slaves.
Droids might try to run away from a 'bad' master. That suggests to me that they have self-awareness, or at least enough of a simulation of it to make no functional difference.


The galaxy in general is okay with the slavery of droids as long as the majority remain in denial about droids' free will. They likely use a lot of the same language used in this thread to undermine any droid-freedom movements, ridiculing those who might stand up for the emancipation of sentient machines.

And I'm okay with that. It is appropriate to the setting.

If I were running a D&D game set in our historical Ancient Rome, there would be slaves as part of the setting, and the majority of the NPCs would be fine with it as the natural order of things.
Potroclo
Sub-Lieutenant


Joined: 01 Jul 2019
Posts: 57

PostPosted: Sun Jul 07, 2019 4:27 pm

I always thought droids were tied to a master whether they liked it or not (or, more accurately, whether the personality they develop over time liked it or not), and that restraining bolts were used to override this and tie them to a new master.
CRMcNeill
Director of Engineering


Joined: 05 Apr 2010
Posts: 16163
Location: Redding System, California Sector, on the I-5 Hyperspace Route.

PostPosted: Sun Jul 07, 2019 8:48 pm

Dredwulf60 wrote:
If it was simply a matter of a memory wipe to get droids back to their base subservient programming, why do restraining bolts exist?

Maybe one too many droids went haywire and started killing innocent bystanders.

Maybe some droids were deliberately rigged with override programs in order to turn them into unwitting agents for corporate espionage, saboteurs, or assassins.

So, in reaction, corporations, governments, and the like mandated that droids be fitted with a restraining bolt as an external failsafe.

And then the Law of Unintended Consequences kicked in and droid thieves started using restraining bolts to keep the "stolen property" from returning to its former master.
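Mechanically, I picture the bolt as a dumb external interlock that preempts whatever the droid itself intends to do. A quick sketch of that idea (entirely invented on my part; there's no sourcebook behind it):

Code:
# Invented sketch: a restraining bolt as an external override that preempts
# the droid's own goals -- including "return to my former master".

class Droid:
    def __init__(self, own_goal):
        self.own_goal = own_goal
        self.bolt_command = None          # no restraining bolt fitted

    def fit_bolt(self, command):
        self.bolt_command = command

    def remove_bolt(self):
        self.bolt_command = None

    def next_action(self):
        # The bolt is checked first; the droid's own goals never get a vote while it is fitted.
        return self.bolt_command if self.bolt_command is not None else self.own_goal

r2 = Droid(own_goal="return to former master")
print(r2.next_action())                      # -> return to former master
r2.fit_bolt("stay put and obey the new owner")
print(r2.next_action())                      # -> stay put and obey the new owner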
_________________
"No set of rules can cover every situation. It's expected that you will make up new rules to suit the needs of your game." - The Star Wars Roleplaying Game, 2R&E, pg. 69, WEG, 1996.

The CRMcNeill Stat/Rule Index
MrNexx
Rear Admiral


Joined: 25 Mar 2016
Posts: 2248
Location: San Antonio

PostPosted: Mon Jul 08, 2019 11:32 am    Post subject: Re: pain

Sorry about the length; I put spaces in the second transcript quote, so it would be easier to read.

Whill wrote:
Sutehp wrote:
Droids can obviously feel pain, which raises the question of why a droid programmer would enable a protocol droid to feel pain when it adds nothing to a protocol droid's function of handling etiquette and diplomacy. It makes no sense for a protocol droid to feel something as extraneous as pain.

It makes a lot of sense to program droids to feel pain. It serves the same purpose for which animal life forms (including us) evolved pain: self-preservation, avoiding injury. Whether droid owners view droids as mere property, as pets they are fond of, or as fully sentient equals, they all have one thing in common: none of them want their droids to damage or destroy themselves.


Apropos

If droids feel pain as unpleasant, they can learn to avoid it to reduce the need for repair... a lot like people do.

Also apropos is the transcript of the TNG episode "The Measure of a Man", in which the Federation debates whether Data is a sentient person or Starfleet property.

Riker, who is obligated to act as the prosecutor, states

Quote:
The Commander is a physical representation of a dream, an idea conceived of by the mind of a man. Its purpose is to serve human needs and interests. It's a collection of neural nets and heuristic algorithms. Its responses dictated by an elaborate software programme written by a man. Its hardware built by a man. And now. And now a man will shut it off.
(A flick of the hidden off switch, and Data slumps across the table)


Picard, on cross examination of Maddox, the cybernetics officer who wants to dismantle Data (Phillipa is the JAG officer serving as judge in the case)

Quote:

PICARD: Yes, yes, yes. Suffice it to say, he's an expert. Commander, is your contention that Lieutenant Commander Data is not a sentient being and therefore not entitled to all the rights reserved for all life forms within this Federation?

MADDOX: Data is not sentient, no.

PICARD: Commander, would you enlighten us? What is required for sentience?

MADDOX: Intelligence, self awareness, consciousness.

PICARD: Prove to the court that I am sentient.

MADDOX: This is absurd! We all know you're sentient.

PICARD: So I am sentient, but Data is not?

MADDOX: That's right.

PICARD: Why? Why am I sentient?

MADDOX: Well, you are self aware.

PICARD: Ah, that's the second of your criteria. Let's deal with the first, intelligence. Is Commander Data intelligent?

MADDOX: Yes. It has the ability to learn and understand, and to cope with new situations.

PICARD: Like this hearing.

MADDOX: Yes.

PICARD: What about self awareness. What does that mean? Why am I self aware?

MADDOX: Because you are conscious of your existence and actions. You are aware of yourself and your own ego.

PICARD: Commander Data, what are you doing now?

DATA: I am taking part in a legal hearing to determine my rights and status. Am I a person or property?

PICARD: And what's at stake?

DATA: My right to choose. Perhaps my very life.

PICARD: My rights. My status. My right to choose. My life. It seems reasonably self aware to me. Commander? I'm waiting.

MADDOX: This is exceedingly difficult.

PICARD: Do you like Commander Data?

MADDOX: I don't know it well enough to like or dislike it.

PICARD: But you admire him?

MADDOX: Oh yes, it's an extraordinary piece of

PICARD: Engineering and programming. Yes, you have said that. Commander, you have devoted your life to the study of cybernetics in general?

MADDOX: Yes.

PICARD: And Commander Data in particular?

MADDOX: Yes.

PICARD: And now you propose to dismantle him.

MADDOX: So that I can learn from it and construct more.

PICARD: How many more?

MADDOX: As many as are needed. Hundreds, thousands if necessary. There is no limit.

PICARD: A single Data, and forgive me, Commander, is a curiosity. A wonder, even. But thousands of Datas. Isn't that becoming a race? And won't we be judged by how we treat that race? Now, tell me, Commander, what is Data?

MADDOX: I don't understand.

PICARD: What is he?

MADDOX: A machine!

PICARD: Is he? Are you sure?

MADDOX: Yes!

PICARD: You see, he's met two of your three criteria for sentience, so what if he meets the third. Consciousness in even the smallest degree. What is he then? I don't know. Do you? (to Riker) Do you? (to Phillipa) Do you? Well, that's the question you have to answer. Your Honour, the courtroom is a crucible. In it we burn away irrelevancies until we are left with a pure product, the truth for all time. Now, sooner or later, this man or others like him will succeed in replicating Commander Data. And the decision you reach here today will determine how we will regard this creation of our genius. It will reveal the kind of a people we are, what he is destined to be. It will reach far beyond this courtroom and this one android. It could significantly redefine the boundaries of personal liberty and freedom, expanding them for some, savagely curtailing them for others. Are you prepared to condemn him and all who come after him to servitude and slavery? Your Honour, Starfleet was founded to seek out new life. Well, there it sits. Waiting. You wanted a chance to make law. Well, here it is. Make a good one.

PHILLIPA: It sits there looking at me, and I don't know what it is. This case has dealt with metaphysics, with questions best left to saints and philosophers. I'm neither competent nor qualified to answer those. I've got to make a ruling, to try to speak to the future. Is Data a machine? Yes. Is he the property of Starfleet? No. We have all been dancing around the basic issue. Does Data have a soul? I don't know that he has. I don't know that I have. But I have got to give him the freedom to explore that question himself. It is the ruling of this court that Lieutenant Commander Data has the freedom to choose.

_________________
"I've Seen Your Daily Routine. You Are Not Busy!"
“We're going to win this war, not by fighting what we hate, but saving what we love.”
http://rpgcrank.blogspot.com/
CRMcNeill
Director of Engineering


Joined: 05 Apr 2010
Posts: 16163
Location: Redding System, California Sector, on the I-5 Hyperspace Route.

PostPosted: Mon Jul 08, 2019 12:32 pm

You really need to put more thought into exactly how well your examples apply to the SWU. Throughout this topic, you've been picking anecdotal examples that only really apply in the sense that they involve droids/robots (and usually extremely high functioning ones, at that).

Star Trek is not Star Wars; Data is very nearly unique, and the Trek galaxy as we know it has nothing like mass produced droids built for specific purposes, not to mention that some of its experiments in AI have gone horribly wrong (the M5 incident).

Data does, however, have an important commonality with Star Wars droids; his builder designed him with a specific purpose in mind, and tailored his body to perform that function. That Data's intended function was much more existential than the common tasks assigned to droids in the SWU doesn't change the fact that Data's reason for existence has a clear external source. He pursues being human because he was programmed to do so.

There are practical reasons why a droid might be programmed to interpret potentially damaging events as pain, react negatively, and avoid such events in the future. In particular, it saves repair costs. My laptop has an internal thermometer that senses if too much heat is building up in the drive, in which case it opens a pop-up window on the screen and shuts the drive down until the problem can be corrected. The only difference between that and a droid saying "ouch" when someone smacks its head against a hatch coaming is one of scale and complexity.
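If it helps, that whole "scale and complexity" point fits in a few lines. (The thresholds and messages below are made up purely for illustration; this isn't how any real laptop firmware works.)

Code:
# Toy damage-report loop: the same pattern covers a laptop's thermal warning
# and a droid's "ouch". All thresholds and strings are invented.

WARN_TEMP_C = 90       # complain, but keep working
SHUTDOWN_TEMP_C = 105  # stop before permanent damage occurs

def react(temp_c):
    if temp_c >= SHUTDOWN_TEMP_C:
        return "shutting the drive down until the problem is corrected"
    if temp_c >= WARN_TEMP_C:
        return "ouch -- please stop doing that"   # the droid-scale version of a pop-up warning
    return "nominal"

for reading in (72, 95, 110):
    print(reading, "C ->", react(reading))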
_________________
"No set of rules can cover every situation. It's expected that you will make up new rules to suit the needs of your game." - The Star Wars Roleplaying Game, 2R&E, pg. 69, WEG, 1996.

The CRMcNeill Stat/Rule Index
TauntaunScout
Line Captain


Joined: 20 Apr 2015
Posts: 970

PostPosted: Mon Jul 08, 2019 1:16 pm

I am firmly in the camp of non-sentient droids.

They don't feel pain; they report damage in a way that's designed, by sentient beings, to get a response from hypothetical future sentient beings. For C-3PO, "ouch!" is just a more effective way of saying "stop damaging this expensive tool." My car's check-engine light doesn't mean my car feels pain.

Droids don't want to be free, and neither does my toaster. If a droid starts activating its speaker to say things about freedom, it doesn't "want" to be free; it has a bunch of mixed-up algorithms and isn't reading the room to get efficient work done anymore. If my toaster starts burning toast to charcoal, it doesn't mean my toaster hates bread and wants to torture it. In both cases the complex tool just needs to be fixed.

Droids cannot be slaves. Even if I were to concede that they were sentient, we're their gods, not their slave masters. We didn't buy them; we made their souls and bodies from whole cloth. That transcends any notion of master and slave.
Sutehp
Commodore


Joined: 01 Nov 2016
Posts: 1797
Location: Washington, DC (AKA Inside the Beltway)

PostPosted: Mon Jul 08, 2019 2:12 pm

TauntaunScout wrote:
I am firmly in the camp of non-sentient droids.

They don't feel pain; they report damage in a way that's designed, by sentient beings, to get a response from hypothetical future sentient beings. For C-3PO, "ouch!" is just a more effective way of saying "stop damaging this expensive tool." My car's check-engine light doesn't mean my car feels pain.

Droids don't want to be free, and neither does my toaster. If a droid starts activating its speaker to say things about freedom, it doesn't "want" to be free; it has a bunch of mixed-up algorithms and isn't reading the room to get efficient work done anymore. If my toaster starts burning toast to charcoal, it doesn't mean my toaster hates bread and wants to torture it. In both cases the complex tool just needs to be fixed.

Droids cannot be slaves. Even if I were to concede that they were sentient, we're their gods, not their slave masters. We didn't buy them; we made their souls and bodies from whole cloth. That transcends any notion of master and slave.


TauntaunScout, you forgot to add snark quotes to this post. I can't believe you meant to have this post taken seriously. I mean, c'mon, comparing self-aware people like C-3PO and R2-D2 to toasters? "We're their gods, not their slave masters"? Gods are slave masters, Tauntaun. When gods regard sentient beings as property, that's bigotry and sadism, pure and simple. How anyone can willingly submit to such monsters is completely flabbergasting. If an entity like C-3PO is intelligent enough to speak in six million different languages, is self-aware enough to refer to itself in the first person, and has sufficient consciousness to express pain, then it's sentient enough to be a person. And making slaves of people has been regarded as one of the greatest crimes of our time and culture for centuries. If you approve of slavery, then you're morally inferior. If you don't approve of slavery, then saying that "we made those droids' souls from whole cloth, so we can treat them like they're our playthings" is fundamentally dishonest.

Which is it?

EDIT: Not to mention that parents create their children's bodies (and "souls") from "whole cloth" (actually their own genetic material but whatever). Tauntaun, are you going to say that children are the property of their parents? That would fly in the face of all American law since everyone born alive in this country is entitled to the privileges and rights of the law. No one with an ounce of sense or morality would say that kids are property.

So the question now becomes: does C-3PO have a soul? How sure are you that you have a soul, Tauntaun? For that matter, what's a soul?
_________________
Sutehp's RPG Goodies
Only some of it is for D6 Star Wars.
Just repurchased the X-Wing and Tie Fighter flight sim games. I forgot how much I missed them.


Last edited by Sutehp on Mon Jul 08, 2019 2:42 pm; edited 5 times in total
Sutehp
Commodore


Joined: 01 Nov 2016
Posts: 1797
Location: Washington, DC (AKA Inside the Beltway)

PostPosted: Mon Jul 08, 2019 2:23 pm

CRMcNeill wrote:
You really need to put more thought into exactly how well your examples apply to the SWU. Throughout this topic, you've been picking anecdotal examples that only really apply in the sense that they involve droids/robots (and usually extremely high functioning ones, at that).

Star Trek is not Star Wars; Data is very nearly unique, and the Trek galaxy as we know it has nothing like mass produced droids built for specific purposes, not to mention that some of its experiments in AI have gone horribly wrong (the M5 incident).

Data does, however, have an important commonality with Star Wars droids; his builder designed him with a specific purpose in mind, and tailored his body to perform that function. That Data's intended function was much more existential than the common tasks assigned to droids in the SWU doesn't change the fact that Data's reason for existence has a clear external source. He pursues being human because he was programmed to do so.


I'm not sure that's true, CRM. Data wasn't programmed to pursue humanity. Soong built Data not for that purpose, but to see whether building a stable positronic brain could even be done, so I agree with you that Data was created to fulfill an existential purpose rather than a specific one.

I'm a big Star Trek fan, and I don't recall Soong building Data (and Lore) for any specific purpose, but if there are any other Trekkies on the forum who would know whether Soong had a specific purpose in mind for Data, please let us know. I honestly have my doubts, because I don't recall Data ever saying that he was fulfilling any specific purpose of Soong's, especially since Data joined Starfleet (and has made any number of similar life decisions since he was activated) of his own accord. I could be wrong, but Data's pursuit of humanity is likewise his own decision. Unless I miss my guess, Data didn't try to be more human because Soong programmed him to do so; Data's quest was entirely his own choice. That has some serious implications for the nature of Data's personhood.

CRMcNeill wrote:
There are practical reasons why a droid might be programmed to interpret potentially damaging events as pain, react negatively, and avoid such events in the future. In particular, it saves repair costs. My laptop has an internal thermometer that senses if too much heat is building up in the drive, in which case it opens a pop-up window on the screen and shuts the drive down until the problem can be corrected. The only difference between that and a droid saying "ouch" when someone smacks its head against a hatch coaming is one of scale and complexity.


Well then, it's also fair to say that the difference between your laptop's thermometer and any one of us saying "ouch" when we break an arm or suffer any other painful injury is likewise "one of scale and complexity." After all, C-3PO says "ouch" when he needs to be fixed, and we humans say "ouch" when we need a broken bone (or any other injury or illness) treated. When it comes to feeling pain, what's the difference between C-3PO and us, aside from him being made of metal and us being made of carbon? Do the building materials really make a difference when both humans and droids react the same way to pain? Treating two people differently when they're in the same circumstances and feel the same things is prejudice, pure and simple. And prejudice is simply wrong.

To put it another way, Data's personhood has been firmly established throughout the Star Trek story. Data is without doubt a person. Is the sentience of C-3PO (and by extension R2-D2 and all other intelligent Star Wars droids) any different from Data's? If Data is a sentient person (and we know he is), and if we say that C-3PO's sentience is no different from Data's, then how can it be said that C-3PO is not a person?
_________________
Sutehp's RPG Goodies
Only some of it is for D6 Star Wars.
Just repurchased the X-Wing and Tie Fighter flight sim games. I forgot how much I missed them.


Last edited by Sutehp on Mon Jul 08, 2019 3:01 pm; edited 1 time in total
TauntaunScout
Line Captain


Joined: 20 Apr 2015
Posts: 970

PostPosted: Mon Jul 08, 2019 2:57 pm

Sutehp wrote:

TauntaunScout, you forgot to add snark quotes to this post. I can't believe you meant to have this post taken seriously. I mean, c'mon, comparing self-aware people like C-3PO and R2-D2 to toasters? "We're their gods, not their slave masters"? Gods are slave masters, Tauntaun. When gods regard sentient beings as property, that's bigotry and sadism, pure and simple. How anyone can willingly submit to such monsters is completely flabbergasting. If an entity like C-3PO is intelligent enough to speak in six million different languages, is self-aware enough to refer to itself in the first person, and has sufficient consciousness to express pain, then it's sentient enough to be a person. And making slaves of people has been regarded as one of the greatest crimes of our time and culture for centuries. If you approve of slavery, then you're morally inferior. If you don't approve of slavery, then you're just posting to troll us and being dishonest.

Which is it?

So does C-3PO have a soul? How sure are you that you have a soul, Tauntaun? For that matter, what's a soul?


I'm not being dishonest and I don't condone slavery. Course we all buy products that are made with slavery today. We just don't know exactly which ones.

Droids are really just appliances. Just because they are designed in a way that elicits an emotional response from customers doesn't mean they have emotions themselves. They have been engineered to get us to readily project emotion onto them. Dogs, thanks to their eye-size ratio and other evolutionary whatnots, trigger the same dopamine response in us as looking at a baby does. That doesn't make dogs children.

If my car has sufficient consciousness to express that the engine is overheating, then no, that's not sentient enough to be a person. And I am not a slave owner for driving that car. All C-3PO is doing when he "expresses pain" is turning on a very fancy equivalent of an indicator light on a dashboard. My sewing machine's motor makes groaning noises when it's strained as I try to sew through extra-thick material; that's not pain, and it isn't a slave.

The best-case scenario is that the droids are sentient, but we created the droids and their natural habitat, their universe so to speak. Gods are not slave drivers for "forcing" squirrels to gather acorns by creating them with bellies that need filling. Doing what humans want is just the acorn-gathering of a droid's highly artificial forest. But I wouldn't even go that far; that's just for the sake of argument. Droids are just really good appliances with Watson and/or Siri grafted onto them for an interface.

The SWU has amazingly compact data storage and retrieval technology for C-3PO to hold 6 million forms of communication. Since he got his memory wiped at the end of Episode 3, that's a whole other ball of wax.

R2's not sentient; that's why Luke can't make him understand that he's doing (or not doing) something for emotional reasons.

Sentience is different from intelligence. Crows can make tools, but it takes something else to look up at the sky and wonder what it's all about. If droids were truly sentient in the SWU, they would interact very differently with the Force than we are led to believe.

Course it's not beyond possibility that we'll get conflicting canon sources on this.

As for souls, who knows? Is Tauntaun Scout, Tauntaun Scout? Is it I, God, or who, that lifts this arm? But if the great sun move not of itself; but is an errand-boy; nor one single star can revolve, but by some invisible power; how then can this one small heart beat; this one small brain think thoughts?


Last edited by TauntaunScout on Mon Jul 08, 2019 3:08 pm; edited 4 times in total