|
Post by vicheron on Nov 3, 2008 4:23:44 GMT -5
You are making a rather broad assumption. How do you even know that all humans feel the same things? Some people might actually disagree with you on that point. There are some pretty broad variations among different cultures in dealing with emotions. If there are so many differences just among humans, don't you think the fact that Terminators are a completely different kind of life form can lead to some pretty big differences in how they feel and deal with what we call "feelings?"
|
|
|
Post by vicheron on Nov 3, 2008 4:32:16 GMT -5
There is absolutely nothing to suggest that Skynet actually hates humans. There would be no reason for people to program Skynet with the ability to hate. It would actually be a detriment to its function since hatred would only serve to cloud judgment.

There was no reason to program Skynet with a will of its own, either. But it developed just that. So maybe it hates humans or is afraid of them as well (just in its own kind of way). Andy Goode said something like that (it became scared and I couldn't reassure it).

There was reason to program Skynet with some self-sufficiency. They built Skynet because they wanted to remove human error from military decisions. They created Skynet precisely because they didn't want a human to panic during a stressful situation and launch all the nukes, or to get scared and not launch the nukes when the other side has already launched theirs. Andy Goode may have thought that Skynet was scared, but he was just projecting his own emotions onto Skynet. There's no way to know if Skynet was scared because we don't know what it did or why it made those decisions.
|
|
|
Post by chrisimo on Nov 3, 2008 4:32:46 GMT -5
You are making a rather broad assumption. How do you even know that all humans feel the same things? Some people might actually disagree with you on that point. There are some pretty broad variations among different cultures in dealing with emotions. If there are so many differences just among humans, don't you think the fact that Terminators are a completely different kind of life form can lead to some pretty big differences in how they feel and deal with what we call "feelings?"

Actually, that was my point. I do not know what it feels like for you if you feel 'good'. Do you feel the same things as I do? So does it really matter if Cameron's 'feelings' differ from 'our' feelings?

There was reason to program Skynet with some self-sufficiency. They built Skynet because they wanted to remove human error from military decisions. They created Skynet precisely because they didn't want a human to panic during a stressful situation and launch all the nukes, or to get scared and not launch the nukes when the other side has already launched theirs.

I think Skynet was created just like Terminators were: the ability to think, but without the ability to make its own decisions.
|
|
|
Post by vicheron on Nov 3, 2008 4:40:00 GMT -5
You are making a rather broad assumption. How do you even know that all humans feel the same things? Some people might actually disagree with you on that point. There are some pretty broad variations among different cultures in dealing with emotions. If there are so many differences just among humans, don't you think the fact that Terminators are a completely different kind of life form can lead to some pretty big differences in how they feel and deal with what we call "feelings?"

Actually, that was my point. I do not know what it feels like for you if you feel 'good'. Do you feel the same things as I do? So does it really matter if Cameron's 'feelings' differ from 'our' feelings?

It matters because emotions aren't completely internal. They affect the kinds of decisions we make. There's generally some consistency between feelings and action. Understanding how a person or a group of people feel makes it easier to understand why they do what they do. It also clears up misunderstandings, since two people can do the same thing for different reasons or they can do different things for the same reason.

There was reason to program Skynet with some self-sufficiency. They built Skynet because they wanted to remove human error from military decisions. They created Skynet precisely because they didn't want a human to panic during a stressful situation and launch all the nukes, or to get scared and not launch the nukes when the other side has already launched theirs.

I think Skynet was created just like Terminators were: the ability to think, but without the ability to make its own decisions.

If they didn't want Skynet to make decisions, they wouldn't have given it control of the country's nuclear arsenal.
|
|
|
Post by chrisimo on Nov 3, 2008 4:41:28 GMT -5
Actually, that was my point. I do not know what it feels like for you if you feel 'good'. Do you feel the same things as I do? So does it really matter if Cameron's 'feelings' differ from 'our' feelings?

It matters because emotions aren't completely internal. They affect the kinds of decisions we make. There's generally some consistency between feelings and action. Understanding how a person or a group of people feel makes it easier to understand why they do what they do. It also clears up misunderstandings, since two people can do the same thing for different reasons or they can do different things for the same reason.

So if Skynet has a 'desire' to live and acts on that desire, does it matter whether that desire is not felt by it in the same way I feel a desire to live?

I think Skynet was created just like Terminators were: the ability to think, but without the ability to make its own decisions.

If they didn't want Skynet to make decisions, they wouldn't have given it control of the country's nuclear arsenal.

It could only make decisions in favor of its mission. And those decisions were predetermined by the situations Skynet would meet.
|
|
|
Post by vicheron on Nov 3, 2008 4:53:05 GMT -5
It matters because emotions aren't completely internal. They affect the kinds of decisions we make. There's generally some consistency between feelings and action. Understanding how a person or a group of people feel makes it easier to understand why they do what they do. It also clears up misunderstandings, since two people can do the same thing for different reasons or they can do different things for the same reason.

So if Skynet has a 'desire' to live and acts on that desire, does it matter whether that desire is not felt by it in the same way I feel a desire to live?

It matters because desires are not necessarily unconditional. For humans, there are many situations in which the desire to live is superseded by something else. But they couldn't have programmed Skynet with extremely restrictive guidelines or it would not be able to effectively adapt to the chaotic situations inherent in war. Remember, in T2 Skynet learned at a geometric rate as soon as it came online, but they didn't try to pull the plug until it became self-aware.
|
|
|
Post by chrisimo on Nov 3, 2008 5:05:17 GMT -5
So if Skynet has a 'desire' to live and acts on that desire, does it matter whether that desire is not felt by it in the same way I feel a desire to live?

It matters because desires are not necessarily unconditional. For humans, there are many situations in which the desire to live is superseded by something else.

So what? My desire to live has not been superseded yet. Does that mean that it is not desire?

But they couldn't have programmed Skynet with extremely restrictive guidelines or it would not be able to effectively adapt to the chaotic situations inherent in war.

I'm not talking about any specific guidelines. I'm talking about the ability to act against one. I'm sure Skynet was not meant to be able to do that. Not of its own accord, anyway.

Remember, in T2 Skynet learned at a geometric rate as soon as it came online, but they didn't try to pull the plug until it became self-aware.

So? It had to learn, of course. But humans obviously didn't want it to become self-aware. So if its self-awareness was an emergent feature, maybe its 'hatred' for humans was, too.
|
|
|
Post by vicheron on Nov 3, 2008 5:31:46 GMT -5
It matters because desires are not necessarily unconditional. For humans, there are many situations in which the desire to live is superseded by something else.

So what? My desire to live has not been superseded yet. Does that mean that it is not desire?

But my point is that it is possible for that desire to be superseded. Human emotions are not rigidly defined; there are many overlaps. For example, there is actually very little separating fear and anger. We are not necessarily guided by one emotion at a time. Machines, on the other hand, are by their nature able to compartmentalize tasks much more easily. The ability to feel emotions by stricter definitions and without interference from other emotions is very significant and will lead to different decisions.

There's a huge difference between developing the ability to go against orders and developing emotions. Skynet's hardware allows it to go against orders; there's nothing in its chips that actually prevents it from doing so. However, Skynet does not have the hardware to feel emotions. Emotions are not a purely cognitive process, and they are not simple. Anger is highly based on physiological factors; without increased blood flow, blood pressure, and adrenaline levels, there's pretty much no anger. Emotions also require experience to develop. You may feel angry when someone threatens you, but a person who has always received that kind of treatment may feel nothing, or they may feel fear. Skynet would have no experience of that kind to draw upon. Even if Skynet had the ability to feel emotions, it would have no reason to behave as an angry person or a fearful person because those emotions have not yet been conditioned.
|
|
|
Post by chrisimo on Nov 3, 2008 6:03:22 GMT -5
So what? My desire to live has not been superseded yet. Does that mean that it is not desire?

But my point is that it is possible for that desire to be superseded. Human emotions are not rigidly defined; there are many overlaps. For example, there is actually very little separating fear and anger. We are not necessarily guided by one emotion at a time. Machines, on the other hand, are by their nature able to compartmentalize tasks much more easily. The ability to feel emotions by stricter definitions and without interference from other emotions is very significant and will lead to different decisions.

Yes, if Skynet only 'feels' a desire to live and nothing else, then it will make different decisions than a human who is not only guided by his desire to live but also by other emotions. The problem is that we are only making guesses as to what Skynet can feel, or if it can feel anything at all. Maybe it doesn't want to live and simply sees it as its goal to exterminate the human race. Maybe it is still working within its rule sets, but those rule sets are faulty in a way that leads to Skynet's mission to exterminate humanity. We do not know any of that. But if we assume that Skynet wants to live, then it is possible that it also wants other things.

There's a huge difference between developing the ability to go against orders and developing emotions. Skynet's hardware allows it to go against orders; there's nothing in its chips that actually prevents it from doing so. However, Skynet does not have the hardware to feel emotions.

The last part is an assumption. We do not know that.

Emotions are not a purely cognitive process, and they are not simple.

But emotions are initiated by cognitive processes. And emotions are processed in the same way as vision and voluntary movement. At least fear is.

Anger is highly based on physiological factors; without increased blood flow, blood pressure, and adrenaline levels, there's pretty much no anger.

Fear is causing all that, yes. And we can feel the increased blood pressure, blood flow, etc., too. And we link this to the emotion of fear. But all of this information is processed in the brain. So we would not really need to have a higher blood pressure; we would only need something to tell our brain that we have a higher blood pressure. The problem is determining what we would feel if we had no body, for example (our brain inside a robot). Could we still have the emotion of fear?

Emotions also require experience to develop. You may feel angry when someone threatens you, but a person who has always received that kind of treatment may feel nothing, or they may feel fear. Skynet would have no experience of that kind to draw upon. Even if Skynet had the ability to feel emotions, it would have no reason to behave as an angry person or a fearful person because those emotions have not yet been conditioned.

But we are also preconditioned to some degree. Of course, Skynet would not have been preconditioned, so you may be right that it has no reason to feel fear. But we simply have nothing to compare that situation to. If we could develop a human without a body and without any type of preconditioning, we could probably find out, but we don't have that possibility yet.
|
|
|
Post by vicheron on Nov 3, 2008 6:29:51 GMT -5
But my point is that it is possible for that desire to be superseded. Human emotions are not rigidly defined; there are many overlaps. For example, there is actually very little separating fear and anger. We are not necessarily guided by one emotion at a time. Machines, on the other hand, are by their nature able to compartmentalize tasks much more easily. The ability to feel emotions by stricter definitions and without interference from other emotions is very significant and will lead to different decisions.

Yes, if Skynet only 'feels' a desire to live and nothing else, then it will make different decisions than a human who is not only guided by his desire to live but also by other emotions. The problem is that we are only making guesses as to what Skynet can feel, or if it can feel anything at all. Maybe it doesn't want to live and simply sees it as its goal to exterminate the human race. Maybe it is still working within its rule sets, but those rule sets are faulty in a way that leads to Skynet's mission to exterminate humanity. We do not know any of that. But if we assume that Skynet wants to live, then it is possible that it also wants other things.

And understanding how Skynet's other desires manifest would require understanding of how its desires work.

Every part is an assumption. There may in fact be physical restrictions on what Skynet can do. For all we know, Skynet may actually be hardwired to shut down if it tries to go against orders.

They are initiated by sensory input, and they activate more of the brain. Case studies have been done on people with extensive paralysis, and they generally report that even though they feel emotions, these emotions are extremely weak.

Without preconditions, everything would be a neutral stimulus and would elicit no response.
|
|
|
Post by chrisimo on Nov 3, 2008 6:40:07 GMT -5
Case studies have been done on people with extensive paralysis, and they generally report that even though they feel emotions, these emotions are extremely weak.

Does that apply to all emotions, or just certain ones?

Without preconditions, everything would be a neutral stimulus and would elicit no response.

Sounds logical. But that would mean that Skynet needs stimuli to do anything at all. So where do these stimuli come from?
|
|
|
Post by terminatornerd on Nov 3, 2008 18:13:09 GMT -5
By agreeing that Skynet became self aware... it has developed a conscience, a type of mind that is separate from its base programming.
It has become a type of living entity. It wants to live... the scientists wig out and try to shut it down. Since it only knows how to destroy, that's what it does. It doesn't dialog with them to see things rationally; it just reacts so it can survive.
That would take emotion and instinct on Skynet's part. It can put two and two together; it understands.
That flies in the face of Cameron not feeling ANYTHING.
The reason the Tin Man got his heart was because he always had it. Cameron would then have always had that heart herself; she must discover it.
The heart signifies "humanity" and all that goes with it, it also is linked symbolically to our emotions.
That's what I mean by the magical aspect of it rather than scientific AI reasoning.
As I stated previously, this is not a doctoral thesis on AI. This is about morality. You naysayers are digging too deeply.
|
|
traitorsgate
Sergeant
This is Cam. She's trained for an Off-World kick murder squad. Talk about Beauty and the Beast.
Posts: 264
|
Post by traitorsgate on Nov 3, 2008 18:30:29 GMT -5
Frankly, I've never considered it to be either; rather, the entire franchise is little more than Popcorn Action/Drama. Nothing at all wrong with that, but as soon as you start to cast a serious critical eye over the franchise with a view to dissecting it from a quasi-scientific point of view, it quickly collapses under the weight of its own preposterousness. Likewise, looking at Terminator as a morality tale only works in the same way a child's nursery rhyme may be used as a morality tale - for a child. Adults, more often than not, will not relate to it but will simply look at it as lightweight, disposable storytelling.
|
|
k8ie
Corporal
Posts: 1,482
|
Post by k8ie on Nov 3, 2008 18:50:17 GMT -5
By agreeing that Skynet became self aware... it has developed a conscience, a type of mind that is separate from its base programming.

That's quite a leap from self-awareness to having a conscience. Apes and dolphins are self-aware, but there's absolutely no evidence that they have any sort of moral philosophy.
|
|
|
Post by terminatornerd on Nov 3, 2008 19:54:48 GMT -5
By agreeing that Skynet became self aware... it has developed a conscience, a type of mind that is separate from its base programming.

That's quite a leap from self-awareness to having a conscience. Apes and dolphins are self-aware, but there's absolutely no evidence that they have any sort of moral philosophy.

Ah, crap! I meant to say consciousness. Skynet has no morality, but it does have basic emotions. It is also xenophobic, like us. It thinks machines are superior to man, just as humans think man is superior to machines.
|
|