|
Post by vicheron on Nov 3, 2008 21:39:18 GMT -5
Case studies have been done on people with extensive paralysis, and they generally report that even though they feel emotions, these emotions are extremely weak.

Does that apply to all emotions, or just certain ones? Generally, the "basic" emotions like anger, fear, sadness, disgust, and joy are severely weakened. More nuanced emotions like jealousy, laughter, and smugness are less affected.

All computers need stimuli. If you don't input anything into a computer, you're not going to get anything out. Skynet is a military computer, so it'll get military stuff. They are going to give it control of NORAD, so if it works as intended, it'll do nothing when the radar detects no threats and it'll launch the nukes when it detects a massive nuclear strike from Russia; of course, that's a gross oversimplification. Skynet is preconditioned for handling many situations. It's just not preconditioned with emotional responses, so even if it does manage to develop emotions, it will have to make those connections before reacting in an emotional manner.
|
|
|
Post by vicheron on Nov 3, 2008 21:57:14 GMT -5
> By agreeing that Skynet became self aware... it has developed a conscience, a type of mind that is separate from its base programming. It has become a type of living entity. It wants to live... the scientists wig out and try to shut it down. Since it only knows how to destroy, that's what it does. It doesn't dialog with them to see things rationally, it just reacts so it can survive. That would take emotion and instinct on Skynet's part. It can put two and two together, it understands. That flies in the face of Cameron not feeling ANYTHING. The reason the Tin Man got his heart was because he always had it. Cameron would then have always had that heart herself; she must discover it. The heart signifies "humanity" and all that goes with it; it is also linked symbolically to our emotions. That's what I mean by the magical aspect of it rather than scientific AI reasoning. As I stated previously, this is not a doctoral thesis on AI. This is about morality. You naysayers are digging too deeply.

Consciousness and emotions are separable. There are brain-damaged people who feel no emotions but would fit every definition of consciousness. Also, stop trying to speak for James Cameron. He is a director who recognizes the nuances of morality and what it means to be human. Uncle Bob learning why John cries has many meanings. Trying to dilute it down to "he learns love, we should all learn from him and have a big group hug" does not do justice to the movie. Have you ever thought maybe James Cameron was trying to say that humans possess certain qualities that transcend humanity, that you do not need to feel or think like a human to recognize those transcendent qualities, that you do not even need to be anything like a human to understand the value of life?

That's quite a leap from self-awareness to having a conscience. Apes and dolphins are self-aware, but there's absolutely no evidence that they have any sort of moral philosophy.

> Ah, crap! I meant to say consciousness.

> Skynet has no morality, but it does have basic emotions. It is also xenophobic like us. It thinks machines are superior to man. As humans think man is superior to machines.

Except that emotions are not required for doing what Skynet has done. We've already wiped out about 25% of the species on Earth; how many of those species did we hate or fear? Also, humans do not think they're superior to machines. Humans simply do not recognize that machines can possess the same qualities as themselves. Saying that humans think they're superior to machines is like saying that cockroaches think they're superior to clocks.

MOD: Please do not double-post. Use the modify button to add your comments to your previous post.
|
|
|
Post by richardstevenhack on Nov 4, 2008 1:27:34 GMT -5
I'll merely point this out.
There was a guy once who thought he could prove that plants had emotions.
He hooked up an electroencephalograph to plants and recorded the electromagnetic waves he got from them.
This was idiotic to an extreme - because an EEG requires a brain, it requires brainwaves which are electrical phenomena - and there is nothing whatsoever to indicate that a plant's electromagnetic phenomena have anything to do with brainwaves, particularly in the total absence of anything resembling a brain in a plant.
The same is true of a Terminator. There is no "brain" in the sense of a biological organ designed by evolution and equipped with biological matter and neurochemicals all connected to a biological nervous system composed of living cells.
What we have in a Terminator is an unknown physical technology (i.e., photonics, nanotech, maybe some form of "bio-chip" with some actual biological components in it) which implements a set of algorithms which provide reasonable analogs of human conceptual processing, memory, sensory apparatus, and the like.
There is ZERO reason to believe that Terminators experience any "emotions" in the strict human biological sense at all. They may experience some sort of "gestalt" summarizations of their physical and mental states which approximate "emotions", but that's about it. Any intelligent entity might well benefit from such "gestalt states". But to extrapolate further and conclude that Terminators have "emotions" - let alone clearly human emotions such as "love", "hate", or "fear" - is simply out of the question.
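The "gestalt summarization" idea can be made concrete with a toy sketch: a function that collapses an entity's raw internal metrics into a coarse state label, without anything resembling felt emotion anywhere in it. Every name and threshold here is invented purely for illustration:

```python
# Hypothetical sketch: a "gestalt" summary of internal state standing in
# for an emotion, rather than being one. All names/thresholds are invented.

def gestalt_state(damage: float, threat: float, goal_progress: float) -> str:
    """Collapse raw internal metrics (each in 0.0-1.0) into a coarse label."""
    if damage > 0.7 or threat > 0.8:
        return "critical"   # functionally analogous to, but not, "fear"
    if goal_progress < 0.2:
        return "blocked"    # functionally analogous to frustration
    return "nominal"

print(gestalt_state(damage=0.1, threat=0.9, goal_progress=0.5))  # critical
```

The point of the sketch is that the label is just a compression of state variables the entity already tracks; nothing is experienced.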
Now, James Cameron and Josh Friedman might decide to forego all that and switch to a fantasy where intelligent machines might have emotions. If they do so, I think they are diminishing the science fiction aspect of the franchise in favor of the audience participation aspects. If they want to do that, fine. But I think that's been done to death in other robot stories.
Josh Friedman has explicitly said that he isn't interested in making Cameron "human". On the other hand, he hasn't explicitly said what he thinks it means to be a "cyborg".
And again, the terminology is even wrong, since Cameron is a robot, or at best an android, not a cyborg. A cyborg is an entity that originated as a biological organism and is subsequently enhanced with machine components. Cameron doesn't fit that definition - she is a machine covered with living tissue. That tissue is no more important to her defining characteristic than it was to Cromartie when he didn't have any flesh at all.
But maybe Josh is so confused as to the difference that he thinks the Terminators are indeed combinations of biological and mechanical components such that an emotional component is possible.
Now it IS clear that the Terminators - at least in Cameron's case - have been programmed to have SOME sort of analogs to human emotions, at least in the sense that they can be near-perfectly reproduced. In other words, a Terminator - even Cromartie - can smile, laugh, look angry and the like. But a second later they can revert to being completely unemotional.
To me, it's impossible to ascribe this behavior as "experiencing emotions". To me, this is clearly purely the ability to simulate by reproducing the physical expressions of emotions rather than actually experiencing them in whatever passes for a Terminator's brain.
To me, it's no different than when a human actor simulates an emotional response - except that a human actor actually has the experience to draw on and will, to some degree, "re-experience" the emotion when he does so, whereas a Terminator merely has a recorded knowledgebase of human physical expressions and a knowledgebase of when such expressions are "appropriate". And in many cases, the Terminator doesn't get that completely right, which is why they always seem "off". If they really could experience emotions directly and had at least a human actor's ability to recreate an emotional experience, they'd all be perfect actors and would rarely be suspected of being "odd".
The fact that they can't pretty clearly shows they aren't actually experiencing an emotional response, but merely simulating it physically.
And there would be zero need for a Terminator - or any Transhuman - to have emotions in any event. Emotions are a product of biological evolution and are strictly connected to the fact of life and death. Emotions are a primitive survival mechanism intended to enhance an organism's ability to survive, IF that organism is absent a conceptual processing ability.
Terminators can't die - although they can be destroyed. They have no inbuilt "fear of death". There would be no reason for Skynet to provide them with such. Their conceptual processing ability is sufficient for them to understand when they are or are not under threat. Introducing an emotional response to such situations merely makes it more unlikely that the entity will survive - as humans demonstrate daily.
The only thing a Transhuman or Terminator needs to survive is conceptual processing ability of a sufficiently rapid capacity to deal with the threats in that entity's environment.
Emotions are a "hold over" of previous mammalian evolution. They have nothing to do with a Transhuman entity and no value to such an entity.
Any AI researcher who seeks to create a "friendly AI" or who seeks to simulate emotional responses in an AI is precisely the sort of idiot who's likely to end up creating Skynet. And there are AI researchers doing just that, I know for a fact. It's a very incorrect approach.
Once nanotechnology enables large-scale, real-time, parallel brain research such that we can begin to comprehend the physical functioning of the human brain, we will discern what the exact components of conceptual processing and emotions are. At that point, it will become clear that emotions are not a requirement for conceptual processing and they are entirely different from conceptual processing and that a fully functioning "intelligent" consciousness can theoretically be constructed with no emotional content whatsoever.
At the moment, we don't even know how emotions physically arise in mammalian brains. Therefore we don't have a precise conceptual or scientific definition of what an "emotion" actually is. But I believe we can be confident that it will be discovered that emotions - while deeply embedded in human brains as a result of mammalian and primate evolution - are not at all a requirement to construct a conceptual processing "brain".
Now, of course it is possible that Skynet based Terminators on some such analysis of human brains. We don't know the technology on which Skynet itself was based, except that we KNOW it wasn't based on biological components, because that was never discussed in the franchise, nor is "The Turk" based on anything but current era technology and some possibly new algorithms. So we can assume Skynet was not based on current human neuroscience.
Whether Skynet used the human brain as a basis for its construction of Terminator AIs might be open to question. Remember, we really don't know that Skynet itself feels any emotions other than what Andy Good said, which may or may not have been true. Certainly Skynet in its conceptual knowledgebase has to know about human emotions conceptually. It also has to know about them based on reactions by humans it has observed.
If Skynet took apart human brains, analyzed them to a deeper degree than humans have to date, and derived its Terminator AI technology from that study, then it is conceivable that a Terminator AI has analogs to both human conceptual processing and human emotions.
But again, Skynet would have no reason to engineer a Terminator to experience emotional responses, as opposed to simulating them for purposes of infiltration and interrogation and psychological warfare (if it has any interest in the latter.) Allowing a Terminator to experience emotional responses would merely make a Terminator less efficient and ultimately unreliable.
So either Skynet has screwed up - with Cameron, at least, if not the T-888s - and allowed a Terminator to actually experience emotional responses, or it has not. It's also possible, I suppose, that Skynet decided to experiment with emotional responses in a Terminator. I think that would be a serious mistake on its part.
Bottom line:
1) An AI has no need for an emotional component, although it might be possible to provide some such analog.
2) There is no evidence whatsoever in either the movies or the series that Terminators have an emotional component (other than one-off lines like Arnie's "I need a vacation" or "I'm sorry, John", which aren't serious components of the franchise, but mere "throwaways" for audience amusement or poorly thought through characterizations.)
The one exception to the latter is Catherine Weaver's seeming to actually like killing people. However, even there, it's hard to tell whether her "one-liners" ("the feeling's mutual", and "I bet that never happened to you before either") are meant to reflect an actual emotional response or are merely "throwaways" for audience amusement. I suspect they are strictly the latter, building on the moments in the movie franchise where such elements were used.
3) For my part, if they start giving Terminators emotions on the series, I'll lose quite a bit of interest in the show.
|
|
|
Post by chrisimo on Nov 4, 2008 2:19:02 GMT -5
> Does that apply to all emotions, or just certain ones? Generally, the "basic" emotions like anger, fear, sadness, disgust, and joy are severely weakened. More nuanced emotions like jealousy, laughter, and smugness are less affected.

Ok. So even without a body, it is likely that a person feels emotions. And anyway, it doesn't matter much. A person who has never had a full body would not know how those stronger emotions feel. This person would still be guided/controlled by emotions.

> All computers need stimuli. If you don't input anything into a computer, you're not going to get anything out. Skynet is a military computer so it'll get military stuff. They are going to give it control of NORAD so if it works as intended, it'll do nothing when the radar detects no threats and it'll launch the nukes when it detects a massive nuclear strike from Russia, of course that's a gross oversimplification. Skynet is preconditioned for handling many situations. It's just not preconditioned with emotional responses so even if it does manage to develop emotions, it will have to make those connections before reacting in an emotional manner.

So the question remains: what are Skynet's stimuli to act on its own will? Or doesn't it do that? Maybe you want to say it works as intended?
|
|
|
Post by vicheron on Nov 4, 2008 2:41:09 GMT -5
> I'll merely point this out. There was a guy once who thought he could prove that plants had emotions. He hooked up an electroencephalograph to plants and recorded the electromagnetic waves he got from them. This was idiotic to an extreme - because an EEG requires a brain, it requires brainwaves which are electrical phenomena - and there is nothing whatsoever to indicate that a plant's electromagnetic phenomena have anything to do with brainwaves, particularly in the total absence of anything resembling a brain in a plant.

At least plants use neurotransmitters as messenger chemicals.

> Generally, the "basic" emotions like anger, fear, sadness, disgust, and joy are severely weakened. More nuanced emotions like jealousy, laughter, and smugness are less affected. Ok. So even without a body it is likely that a person feels emotions. And anyway, it doesn't matter much. A person that has never had a full body would not know how those stronger emotions would feel. This person would still be guided/controlled by emotions.

Most of the people in the case studies have experience, meaning that they weren't born completely paralyzed. They have a point of reference; that's how they know that their emotions are weaker.

> All computers need stimuli. If you don't input anything into a computer, you're not going to get anything out. Skynet is a military computer so it'll get military stuff. They are going to give it control of NORAD so if it works as intended, it'll do nothing when the radar detects no threats and it'll launch the nukes when it detects a massive nuclear strike from Russia, of course that's a gross oversimplification. Skynet is preconditioned for handling many situations. It's just not preconditioned with emotional responses so even if it does manage to develop emotions, it will have to make those connections before reacting in an emotional manner. So the question remains: what are Skynet's stimuli to act on its own will? Or doesn't it do that? Maybe you want to say it works as intended?

Well, who's to say that we are actually acting on our own will? Behaviorism is deterministic.
|
|
|
Post by chrisimo on Nov 4, 2008 2:55:23 GMT -5
> At least plants use neurotransmitters as messenger chemicals.

And these could be replicated by electronics. They would have to be analog instead of digital, but it is possible.

> Most of the people in the case studies have experience, meaning that they weren't born completely paralyzed. They have a point of reference, that's how they know that their emotions are weaker.

Skynet has no body that we know of. So the best comparison (with regard to emotions, if Skynet has some sort of equivalent) would be a person who was paralyzed from birth.

> Well, who's to say that we are actually acting on our will? Behaviorism is deterministic.

Yes, that is possible. It is also possible that everything in the universe is predetermined, because the universe itself is deterministic. But it has got nothing to do with emotions. We still need to answer the question of what Skynet's stimuli are.

> 1) An AI has no need for an emotional component, although it might be possible to provide some such analog.

But an AI needs a stimulus for independent thought. Emotions could be used for that. And if these emotions can be thoroughly controlled, they wouldn't be counterproductive.
|
|
|
Post by vicheron on Nov 4, 2008 3:24:25 GMT -5
> At least plants use neurotransmitters as messenger chemicals. And these could be replicated by electronics. They would have to be analog instead of digital, but it is possible.

Electrical synapses do exist, and they transmit information faster than chemical synapses. Why then do we use chemical synapses almost exclusively? It's because the advantages outweigh the drawbacks. Chemical synapses allow for flexibility in controlling whether or not a message is passed between neurons.

Skynet does not have a body that is comparable to humans. Humans who are completely paralyzed lose the ability to receive stimuli. Skynet has the ability to receive stimuli, just not the kind of stimuli humans receive. In fact, Skynet probably receives far more stimuli than humans. It would have to constantly monitor the battlefield and analyze intel in order to do its job.

Skynet would have no stimuli in terms of emotions. It has stimuli and preconditions based on other things. One stimulus would be the detection of nuclear missiles being launched by Russia; the preconditioned response would be to launch the US nuclear arsenal at targets in Russia.
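The stimulus/preconditioned-response model being argued for here is, at bottom, a lookup from detected events to fixed actions, with no emotional state anywhere in the loop. A minimal sketch, with all event and action names invented for illustration:

```python
# Hypothetical sketch of stimulus -> preconditioned response, with no
# emotional state in the loop. Event and action names are invented.

RESPONSES = {
    "nuclear_launch_detected": "launch_counterstrike",
    "no_threat_detected": "do_nothing",
    "airspace_incursion": "scramble_interceptors",
}

def respond(stimulus: str) -> str:
    # A stimulus with no preconditioned response falls through to the
    # default: nothing happens.
    return RESPONSES.get(stimulus, "do_nothing")

print(respond("no_threat_detected"))       # do_nothing
print(respond("nuclear_launch_detected"))  # launch_counterstrike
```

On this picture, "developing emotions" would mean wiring entirely new kinds of entries into a table like this, which is exactly the extra step the post says Skynet would have to take.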
|
|
|
Post by terminatornerd on Nov 4, 2008 3:35:48 GMT -5
And what if James Cameron wanted you to see in the simplest sense possible that a Terminator goes against its very nature? It is supposed to be a killer. It learns not to kill. It's not supposed to feel anything, but it feels something for John. It's supposed to follow its orders, and instead it learns not to.
It understands why John is crying because it feels the same way, but cannot express those emotions in that particular way as it is not advanced enough to cry, so it gives John a tender hug instead.
It cares enough to let itself be destroyed thinking it's the only way to stop Skynet from ever becoming real. It learns that it doesn't have to listen to John's orders any more.
It is, finally, more heroic than the humans it's around, even John, even Sarah.
Sarah tells us that she has hope for the future because a machine decided to represent the best in us... and that if a machine can, perhaps we can as well.
The idea behind T2 is very plainly shown to the audience. It's right on the surface. It's not buried in subtext.
|
|
|
Post by chrisimo on Nov 4, 2008 3:42:20 GMT -5
> And these could be replicated by electronics. They would have to be analog instead of digital, but it is possible.

> Electrical synapses do exist and they transmit information faster than chemical synapses. Why then do we use chemical synapses almost exclusively? It's because the advantages outweigh the drawbacks. Chemical synapses allow for flexibility in controlling whether or not a message is passed between neurons.

Electronic components can do that just as well. And where do we use chemical synapses? Did you mean nature uses them?

> Skynet does not have a body that is comparable to humans. Humans who are completely paralyzed lose the ability to receive stimuli. Skynet has the ability to receive stimuli, just not the kind of stimuli humans receive. In fact, Skynet probably receives far more stimuli than humans. It would have to constantly monitor the battlefield and analyze intel in order to do its job.

All the better. So why do you think that it is impossible for emotions to emerge in this complex system? At least some equivalent. We have seen it with the Turk when it got bored. Maybe it didn't feel the same as a human, but clearly it was not told to act that way, and it compared its own situation to that of a human who is bored.

> Skynet would have no stimuli in terms of emotions. It has stimuli and preconditions based on other things. One stimulus would be the detection of nuclear missiles being launched by Russia, the preconditioned response would be to launch the US nuclear arsenal at targets in Russia.

All that would be just as humans intended Skynet to be. But as we know, things didn't go according to plan. So the only stimulus relevant to this is the one which made Skynet act against all humanity. This is the question I am asking.
|
|
|
Post by richardstevenhack on Nov 4, 2008 21:28:09 GMT -5
I'll agree that it's likely that James Cameron intended what you suggest for "Uncle Bob". That wouldn't surprise me at all. Look at what he did in the movie "The Abyss" - using Michael Biehn yet again, in fact. The aliens proved to be nicer people than Biehn's Navy SEAL character. So this is a theme Cameron's done before and since.
Well, we've "been there, done that, got the T-shirt". Time to move to a more sophisticated analysis of what Terminators can and can't do. And that's what T:SCC can do if they are willing. We can explore the notion of completely emotionless Terminators also learning to function in human society. Even Catherine Weaver appears to be learning how to rear a child - if for no other reason than to assist her blending in.
I think Josh wants to keep things more separate than Cameron did. I think he wants to explore how being a killer cyborg doesn't necessarily mean you have to react to everything like a human being in order to be likable. In a sense, it's sort of like the "X-Men" movies, which in turn are of course directly related to issues of racism and difference. I think Josh wants to explore the psychology of "difference" and the coping mechanisms of both those who are "different" and those who have to deal with "different" entities.
This is way more interesting than just "humanizing" or erasing the difference of the individuals involved.
The "black power" movement was about rejecting the notion of "becoming white" to be accepted in white society. In a similar mode, Summer's Cameron doesn't have to become "human" to be able to function in human society - as long as she can stay undercover, anyway.
More importantly, it's about both having to change oneself to be able to function in a new environment while at the same time balancing that with the need to remain true to oneself regardless of the environment.
James Cameron had one lesson to teach. This show could teach some different lessons.
|
|
|
Post by vicheron on Nov 4, 2008 22:55:28 GMT -5
> Electrical synapses do exist and they transmit information faster than chemical synapses. Why then do we use chemical synapses almost exclusively? It's because the advantages outweigh the drawbacks. Chemical synapses allow for flexibility in controlling whether or not a message is passed between neurons.

> Electronic components can do that just as well. And where do we use chemical synapses? Did you mean nature uses them?

A synapse is the junction through which neurons communicate with each other. In humans, neurons use chemicals, neurotransmitters, to communicate with each other. This is better than electrical synapses because it allows for many inhibitory and excitatory impulses to be delivered at the same time.

First of all, I never said that it was impossible. Second, why can't a computer suddenly become a toaster? Emotions are something inherent in humans; they are an evolutionary adaptation that developed over who knows how many generations. Emotions are already a part of our development. They develop along with, and are a part of, our cognitive faculties. It's just like how heating up slices of bread is inherent in the function of toasters but is not part of the function of computers. However, it is possible to modify a computer so that it can do what toasters can do, but it would require someone to physically add brand-new components to the computer.

But you keep asking different questions. The question you ask now has nothing to do with emotions. We've already established that Skynet does not have preconditioned responses based on emotions. We've also established that even if Skynet had the potential to develop emotions, it would have to link stimuli to preconditioned responses. Skynet's actions against humans do not actually require new preconditioned responses as emotions would. Killing people is a preconditioned response for Skynet. It's just that the preconditioned response has been conditioned to respond to new stimuli. Conditioning and developing a whole new set of preconditioned responses are two completely different things.

> And what if James Cameron wanted you to see in the simplest sense possible that a Terminator goes against its very nature? It is supposed to be a killer. It learns not to kill. It's not supposed to feel anything, but it feels something for John. It's supposed to follow its orders, and instead it learns not to. It understands why John is crying because it feels the same way, but cannot express those emotions in that particular way as it is not advanced enough to cry, so it gives John a tender hug instead. It cares enough to let itself be destroyed thinking it's the only way to stop Skynet from ever becoming real. It learns that it doesn't have to listen to John's orders any more. It is, finally, more heroic than the humans it's around, even John, even Sarah. Sarah tells us that she has hope for the future because a machine decided to represent the best in us... and that if a machine can, perhaps we can as well. The idea behind T2 is very plainly shown to the audience. It's right on the surface. It's not buried in subtext.

But aren't you falling into your own trap? You're saying that humans shouldn't believe we're superior to machines, and yet you're assigning qualities to humans that would make us superior to machines. You're saying that intelligent machines cannot value life unless they adopt human values and emotions.
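The distinction drawn here, conditioning an existing response to fire on a new stimulus versus developing a whole new kind of response, can be sketched by extending the lookup idea: the action repertoire stays fixed, and only the stimulus-to-action wiring changes. All names here are hypothetical:

```python
# Hypothetical sketch: conditioning rewires EXISTING responses to new
# stimuli; it never creates a new kind of response. Names are invented.

class Agent:
    def __init__(self):
        # Fixed repertoire of preconditioned responses.
        self.actions = {"attack", "ignore"}
        # Initial wiring: only the original stimulus triggers attack.
        self.wiring = {"enemy_missile": "attack"}

    def condition(self, stimulus: str, action: str) -> None:
        """Link an existing response to a new stimulus."""
        if action not in self.actions:
            raise ValueError("cannot condition a response that doesn't exist")
        self.wiring[stimulus] = action

    def react(self, stimulus: str) -> str:
        return self.wiring.get(stimulus, "ignore")

skynet = Agent()
print(skynet.react("human"))         # ignore
skynet.condition("human", "attack")  # new stimulus, same old response
print(skynet.react("human"))         # attack
```

Adding a genuinely new response class, an emotion, say, would require enlarging `self.actions` itself, which is the "brand-new components" step the post argues never happened.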
|
|
|
Post by chrisimo on Nov 5, 2008 2:52:20 GMT -5
> A synapse is the junction through which neurons communicate with each other. In humans, the neurons use chemicals, neurotransmitters, to communicate with each other. It is better than electrical synapses because it allows for many inhibitory and excitatory impulses to be delivered at the same time.

You somehow seem to think that electronics can only represent 0 and 1, which is wrong. The human brain emulation project will use electronic versions of neurotransmitters and neuromodulators, and they will perform in the same way as the human chemical ones. And this won't be some special hardware, but simply processors and software, like everything else in the emulation.

> First of all, I never said that it was impossible. Second, why can't a computer suddenly become a toaster? Emotions are something inherent in humans, it is an evolutionary adaptation that developed over who knows how many generations. Emotions are already a part of our development. It develops along with and is a part of our cognitive faculties. It's just like how heating up slices of bread is something inherent in the function of toasters but is not part of the function of computers. However, it is possible to modify a computer so that it can do what toasters can do but it would require someone to physically add brand new components to the computer.

And this is where I think you are wrong. Skynet wouldn't need extra hardware to process emotions. The hardware that gives it its intellect would be enough. Feeling emotions would be a task of the software, and since Skynet's software seems to be self-changing (just like ours), it is possible that it develops an equivalent of feelings.

> We've already established that Skynet does not have preconditioned responses based on emotions. We've also established that even if Skynet had the potential to develop emotions, it would have to link stimuli to preconditioned responses.

So it could link a positive stimulus to killing people. What was formerly a neutral one is now a positive one. Just like a person could suddenly realize that he/she likes killing people.
|
|
|
Post by vicheron on Nov 8, 2008 8:23:59 GMT -5
> A synapse is the junction through which neurons communicate with each other. In humans, the neurons use chemicals, neurotransmitters, to communicate with each other. It is better than electrical synapses because it allows for many inhibitory and excitatory impulses to be delivered at the same time.

> You somehow seem to think that electronics would only represent 0 and 1, which is wrong. The human brain emulation project will use electronic versions of neurotransmitters and neuromodulators and they will perform in the same way as the human chemical ones. And this won't be some special hardware but simply processors and software like everything else in the emulation.

Except that I never said that. I merely mentioned that neurotransmitters were used by plants as chemical messengers to illustrate their versatility. Again, I point to brain-damaged humans who have lost the ability to feel some or all emotions. This clearly shows that certain specific types of "hardware" are required in order for emotions to manifest. And that wouldn't be the same as an emotion.
|
|
|
Post by chrisimo on Nov 8, 2008 8:47:57 GMT -5
> You somehow seem to think that electronics would only represent 0 and 1, which is wrong. The human brain emulation project will use electronic versions of neurotransmitters and neuromodulators and they will perform in the same way as the human chemical ones. And this won't be some special hardware but simply processors and software like everything else in the emulation.

> Except that I never said that. I merely mentioned that neurotransmitters were used by plants as chemical messengers to illustrate their versatility.

And you said that electronic components couldn't have the same versatility, which is wrong. Or were you merely referring to existing electrical synapses?

> Again, I point to brain-damaged humans who have lost the ability to feel some or all emotions. This clearly shows that certain specific types of "hardware" are required in order for emotions to manifest.

Please have a look at Dharmendra Modha's work. They are currently doing a rat brain emulation at 1/10th realtime. They emulate 55 million neurons and 442 billion synapses on a BlueGene/L computer. They also have a roadmap for a whole brain emulation (human brain, that is), which can be seen here. They are going to emulate everything (including neurotransmitters, neuromodulators, the chemical environment of the body, synaptic adaptation, etc.). This means software-neurons, software-synapses, software-neurotransmitters, software-everything, on general-purpose hardware. This implies that no special hardware is needed.

> And that wouldn't be the same as an emotion.

It would be similar enough.
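The "software-everything" claim can be illustrated with a toy leaky integrate-and-fire neuron: membrane potential, leak, and spiking all live in ordinary code on general-purpose hardware. The parameters here are arbitrary illustration values; a real emulation like the one cited is vastly larger, but structurally it is "just software" in the same sense:

```python
# Toy leaky integrate-and-fire neuron, entirely in software.
# Parameters (threshold, leak) are arbitrary illustration values.

def simulate(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # leak, then integrate input
        if potential >= threshold:
            spikes.append(t)
            potential = 0.0                      # reset after a spike
    return spikes

# Sustained weak input accumulates until the neuron fires once.
print(simulate([0.3, 0.3, 0.3, 0.3, 0.3]))  # [3]
```

Whether running millions of such units amounts to feeling anything is, of course, exactly the point the two posters are disputing.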
|
|
|
Post by vicheron on Nov 8, 2008 22:40:50 GMT -5
> Except that I never said that. I merely mentioned that neurotransmitters were used by plants as chemical messengers to illustrate their versatility.

> And you said that electronic components couldn't have the same versatility, which is wrong. Or were you merely referring to existing electrical synapses?

This is what I said:

> I'll merely point this out. There was a guy once who thought he could prove that plants had emotions. He hooked up an electroencephalograph to plants and recorded the electromagnetic waves he got from them. This was idiotic to an extreme - because an EEG requires a brain, it requires brainwaves which are electrical phenomena - and there is nothing whatsoever to indicate that a plant's electromagnetic phenomena have anything to do with brainwaves, particularly in the total absence of anything resembling a brain in a plant.

> At least plants use neurotransmitters as messenger chemicals.

I didn't say anything about artificial synapses of any kind. You're just making a big fuss over nothing.

How many emotions have they been able to manifest through these processes? The fact remains, artificial constructs that can express emotions are still far off, while brains already exist. In brains, there is specialized "hardware" that is required for people to feel emotions. Also, we're still talking about Skynet here; even assuming that it doesn't require special hardware to develop emotions, it would still need the software. There would have been no reason for Skynet's creators to give it that kind of software, and Skynet wouldn't be able to magically develop it by itself.

> It would be similar enough.

It would be if you completely changed the definition of emotions.
|
|