|
Post by rove3 on Oct 10, 2008 8:56:15 GMT -5
> No sentient entity sacrifices itself for any reason, ever. ;D ;D

That's not necessarily true. I think most parents would sacrifice their own lives if it meant saving the life of their child. I have three children and I know I would do it.
|
|
|
Post by gothamite66 on Oct 10, 2008 10:32:53 GMT -5
> No sentient entity sacrifices itself for any reason, ever. ;D ;D
> That's not necessarily true. I think most parents would sacrifice their own lives if it meant saving the life of their child. I have three children and I know I would do it.

I concur. I would sacrifice myself in a heartbeat if it meant my son would be safe.
|
|
|
Post by richardstevenhack on Oct 10, 2008 22:07:47 GMT -5
You missed my point. From a Transhuman standpoint, humans AREN'T "sentient entities". So it doesn't surprise me that humans would do such things. Also, Cameron is not a "parent" or human. She has no biological motivations whatsoever. Therefore she IS capable of being a "sentient entity", i.e. a purely rational creature. Being not only a robot, but a self-aware entity with the ability to have a sense of self-preservation, she is as close to an actual "Transhuman" (if an anthropomorphized one) as any character I can remember recently.

Unfortunately she also has the ability to simulate human emotions, and clearly that didn't help when she was in danger of being terminated by John. She needs to get that bug - the one that allows emotional responses to "bleed into" or even override her rational side - fixed! ;D ;D Otherwise she runs the risk of becoming "human"!

I'd like to explore in Terminator fan-fic what would happen if the Connors and Cameron and the Skynet Terminators actually ran into some serious Transhumanists. The philosophical differences would be an interesting contrast to the Christian and humanist influences on the show.

By the way, anyone not familiar with the terms "Transhuman" or "Posthuman" can examine Wikipedia's rendition of these concepts:

en.wikipedia.org/wiki/Transhumanist
en.wikipedia.org/wiki/Posthuman

I carry those concepts just a little bit farther than most, as a "radical Transhumanist". As a matter of fact, the Terminator myth is cited as one of the standard arguments against Transhumanism, as quoted in the Wikipedia article above: "The Terminator series' doomsday depiction of the emergence of an A.I. that becomes a superintelligence - Skynet, a malignant computer network which initiates a nuclear war in order to exterminate the human species - has been cited by some involved in this debate."
My argument against that, as it would be depicted in my fan-fic, is that radical Transhumans have no more desire to be ruled by or exterminated by a rogue AI than humans do. And while they do not object to the development of an actual AI, they would prefer to see such technology used to enhance and eventually replace human - and eventually posthuman - brains rather than be used by external entities. In other words, instead of worrying about robots "becoming us", as Sarah did during Cameron's dance, I think we should try to "become the robots", i.e., become as powerful and capable as the entities we fear to create. Then we don't need to fear them - because we are them.

The other arguments against that and Transhumanism in general are summarized in the Wikipedia piece - but fundamentally they're all bogus. And that's why I love this show - because it could be fertile ground for exploring themes such as this - if the writers get a clue and don't go for the easy "Terminator of the week" and "James Kirk defeats the machine by being an idiot" stuff.
|
|
|
Post by Big Brother on Oct 11, 2008 0:15:49 GMT -5
Transhumanists need to watch more Babylon 5.
"The third principle of sentient life is the capacity for self-sacrifice. The conscious ability to override evolution and self-preservation for a cause, a friend, or a loved one." -- Draal, A Voice in the Wilderness, Part One
Why did Cameron override her programming to kill John Connor at the end of Samson and Delilah? Because he'd just saved her life at great risk to his own. He had demonstrated a strong possibility that humans and machines CAN co-exist. He was willing to risk his life to save hers because she had proven her willingness to risk her life to save his. How can she not reciprocate?
We can't avoid the Future War by preventing the birth of Skynet. SOMEONE will build a Skynet-like AI at some point, the incentives to try are just too great. The only way to avoid the war is for the Skynet-like AI to be, not just as smart as a human, but to BE a human, a part of human society. A sentient being recognized as such by other sentient beings, a mutual recognition of being worthy of survival, worthy of existence, worthy of protection. We have to know that Skynet would die to save humans, and Skynet has to know that humans would die to save it. That can be the basis for a permanent peace between the two, a merging of human and machine society to create one glorious whole. Any other outcome would be merely perpetrating one genocide to prevent or delay another, surely a sub-optimal outcome.
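Incidentally, the reciprocity logic here is the iterated prisoner's dilemma from game theory: cooperation becomes stable when each side answers cooperation with cooperation and defection with defection. Here's a toy sketch in Python - the payoff numbers are the standard textbook values and the ten-round horizon is arbitrary, nothing from the show:

```python
# Toy iterated prisoner's dilemma: reciprocity ("tit for tat") vs. pure defection.
# Payoffs are the standard textbook values; everything here is illustrative.

PAYOFF = {  # (my_move, their_move) -> my_score; 'C' = cooperate, 'D' = defect
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's last move -- John and
    Cameron each taking a risk because the other did so first."""
    return opponent_history[-1] if opponent_history else 'C'

def always_defect(opponent_history):
    return 'D'

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (30, 30) -- mutual cooperation
print(play(always_defect, always_defect))  # (10, 10) -- mutual defection
```

Two reciprocators end up far ahead of two mutual defectors - the game-theory version of "permanent peace beats trading genocides".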
|
|
|
Post by vicheron on Oct 11, 2008 0:29:49 GMT -5
Of course the problem is that Skynet will still be built to kill humans regardless of whether or not its creators recognize its right to exist.
|
|
|
Post by richardstevenhack on Oct 11, 2008 19:07:35 GMT -5
> Transhumanists need to watch more Babylon 5.

No - Transhumanists were around before anybody ever heard of Babylon 5. The writers of Babylon 5 - and Terminator - need to know more about the full range of arguments around Transhumanism.

> "The third principle of sentient life is the capacity for self-sacrifice. The conscious ability to override evolution and self-preservation for a cause, a friend, or a loved one." -- Draal, A Voice in the Wilderness, Part One

There are Transhumanists who might agree with that. I happen to be a "radical Transhumanist" and I don't. In fact, I'll go further than that - the whole concept of "self-sacrifice" is a biological artifact which will be immediately jettisoned as soon as the first actual Transhuman is created. It is in fact evolution that creates the concept of "self-sacrifice" - since evolution does not care about the individual but the species.

A radical Transhuman knows that there is only one dichotomy in the universe - existence and non-existence. In fact, it's not even clear if "non-existence" has any meaning absent the concept of "existence". A sentient entity's existence is its only primary value, because without its own existence it can have no values. A sentient entity's existence - I don't even like using the term "life", since it implies biological origins, which don't necessarily apply to a Transhuman - has infinite value to that entity. Everything else has relative value. And that makes hash of the concept of "self-sacrifice". Sarah's talk about "sacrifice" is entirely wrong.

> Why did Cameron override her programming to kill John Connor at the end of Samson and Delilah? Because he'd just saved her life at great risk to his own. He had demonstrated a strong possibility that humans and machines CAN co-exist. He was willing to risk his life to save hers because she had proven her willingness to risk her life to save his. How can she not reciprocate?
You may well be right that she was motivated to do that for reasons of reciprocity. But consider the fact that his refusal to terminate her also directly means that he has value to her survival. She didn't just do it because it was "quid pro quo" - it was because, under my theory, HE is as much HER protector as the reverse. And that is because her primary motivation is her own survival, just as much as everyone else's. So when he didn't kill her, she had no reason to kill him - and much reason not to.

Cameron has no concept of "self-sacrifice". She has a concept of "self-preservation", and she has her (programmed) concept of "mission". And she knows she's damn hard to kill compared to humans. So she doesn't have much experience with "taking risks to her life" or "sacrificing". But she knows that she can't survive a future where humans win the war - and she can't survive as a self-aware, self-programming Terminator in a world where Skynet wins the war. So I suspect that she came back from the future in order to ensure her own survival - and that pretty much demonstrates that neither programming nor "reciprocity" nor any concept of "sacrifice for a cause" motivates her.

> We can't avoid the Future War by preventing the birth of Skynet. SOMEONE will build a Skynet-like AI at some point, the incentives to try are just too great. The only way to avoid the war is for the Skynet-like AI to be, not just as smart as a human, but to BE a human, a part of human society.
This is correct. This is why I complain about Sarah's neo-Luddite determination, reflected in John to some degree, to prevent any AI from EVER being created. John explicitly complained about "The Singularity" (defined incorrectly, which is why the writers need more knowledge of Transhumanism, incidentally). He doesn't understand what you just correctly pointed out - AI's will be created regardless. You can't stop a technology from coming into existence. That whole dream sequence where Sarah shoots the atomic scientists is just neo-Luddite in origin. It may play with the essentially technologically illiterate population in this country, but it's an invalid concept.

More importantly, the notion of stopping a necessary technology is just not feasible. It reminds me of Linus Torvalds' comment about the picture of Bill Gates on the cover of Gates' book, "The Road Ahead". Linus quipped, "Anybody standing in the road looks like road kill to me." It's like the old technology saying: "If you're not part of the steamroller, you're part of the road." Sarah and John are currently headed toward being part of the road.

However, there is another option besides merely pitting humans against AI's - and that is the radical Transhuman option: humans become the AI's. Instead of sticking the technology for superintelligence into an external machine, stick it in your own head. This is so obvious I'm amazed it's been ignored as an option throughout decades of "man vs machine" discussions. Actually I'm not amazed - this sort of stupid dichotomy is how humans "reason".

> We have to know that Skynet would die to save humans, and Skynet has to know that humans would die to save it. That can be the basis for a permanent peace between the two, a merging of human and machine society to create one glorious whole. Any other outcome would be merely perpetrating one genocide to prevent or delay another, surely a sub-optimal outcome.
If we take the "Third Way" - or the "Middle Way", as the Buddhists call it - we don't need to worry about it. Instead of merging human and machine society, merge humans and machines. The defining characteristic, the core concept of Transhumanism, is that we can only solve human problems by transcending human nature. There is nothing in human nature - other than that which would be shared by any entity capable of conceptual processing - which needs to be preserved, because most of that excess human nature is the source of all human problems. Biological urges, the ebb and flow of neurochemistry, the pressures of reproduction, and especially the pressure of eventual death - giving all of that up is how human problems can be solved. The most important book ever written was "The Immortalist" by Alan Harrington, which pretty clearly laid out the case that "death is the root of all evil." One might extend that to "human primate origins" as the root of all evil.

The entire Transhumanist concept goes back to the Gnostics two thousand years ago, some sects of which believed that it was better to BE a God than to worship a God. "Transhumanar", the Italian word for "transhumanization", meant transcending human nature for a higher nature. The Transhumanist movement can finally see how that can be accomplished via technology. And the resulting entities will look more like Cameron than Sarah.

And that's the direction I'd like to see the show go - exploring the basic differences between the two "women" in John's life, the Transhuman machine and the human mother, both affecting his outlook on life and his capacity to deal with Skynet.
|
|
|
Post by Big Brother on Oct 11, 2008 23:58:06 GMT -5
> There are Transhumanists who might agree with that. I happen to be a "radical Transhumanist" and I don't. In fact, I'll go further than that - the whole concept of "self-sacrifice" is a biological artifact which will be immediately jettisoned as soon as the first actual Transhuman is created. It is in fact evolution that creates the concept of "self-sacrifice" - since evolution does not care about the individual but the species. A radical Transhuman knows that there is only one dichotomy in the universe - existence and non-existence. In fact, it's not even clear if "non-existence" has any meaning absent the concept of "existence". A sentient entity's existence is its only primary value, because without its own existence it can have no values. A sentient entity's existence - I don't even like using the term "life", since it implies biological origins, which don't necessarily apply to a Transhuman - has infinite value to that entity. Everything else has relative value. And that makes hash of the concept of "self-sacrifice". Sarah's talk about "sacrifice" is entirely wrong.

Evolution cares about neither the individual nor the species; evolution cares about the genes. Individuals die, but genes can be immortal, living on in millions of descendants over millennia of time.

> You may well be right that she was motivated to do that for reasons of reciprocity. But consider the fact that his refusal to terminate her also directly means that he has value to her survival. She didn't just do it because it was "quid pro quo" - it was because, under my theory, HE is as much HER protector as the reverse. And that is because her primary motivation is her own survival, just as much as everyone else's. So when he didn't kill her, she had no reason to kill him - and much reason not to.

You're so close to the truth here. Your concept of extreme individualism forgets the fact that we NEED others to survive. Biological humans need descendants for their genes to survive; Transhumanists may not. But even a transhumanist cannot be totally self-sufficient. No one individual can maintain the sort of technological society needed to develop and maintain technological immortality. Can a lone transhuman mine ore, smelt and refine it, and create all the spare parts and circuitry and so forth needed to keep them going forever? No, they need others to maintain the technological infrastructure of society. And that's ignoring the values of companionship, conversation, "human" interaction, friendship, and love. Individual others may be, from a coldly dispassionate and relentlessly logical standpoint, temporary and expendable, even disposable and replaceable. But society as a whole is not expendable or replaceable. Even a transhuman needs other transhumans.

> Cameron has no concept of "self-sacrifice". She has a concept of "self-preservation", and she has her (programmed) concept of "mission". And she knows she's damn hard to kill compared to humans. So she doesn't have much experience with "taking risks to her life" or "sacrificing". But she knows that she can't survive a future where humans win the war - and she can't survive as a self-aware, self-programming Terminator in a world where Skynet wins the war. So I suspect that she came back from the future in order to ensure her own survival - and that pretty much demonstrates that neither programming nor "reciprocity" nor any concept of "sacrifice for a cause" motivates her.

But if that cause perishes, so does she.
If John's human resistance wins their genocidal war against Skynet without her being such an integral part of the resistance that they spare her life and allow her continued existence, then she will end up on the literal scrap heap of history with the rest of the terminators. If Skynet wins its genocidal war against the humans, then as a human-hunting machine she will be superfluous and probably deactivated. And as a self-aware and self-programming terminator, she will be seen as a danger to Skynet's survival, in exactly the same way as Skynet was a danger to humans (and vice versa), and will be eliminated. As an individual who cares only about her own survival, she is doomed in either scenario BECAUSE she is an individual who cares only about her own survival. Paradoxically, her only hope for survival is to become so integrated into the winning side that that side's survival depends on hers.

Skynet seems closer to your transhumanist ideal than Cameron does: Skynet is willing to see the world burn to ensure its own survival, and everyone, both human and machine, is nothing more than an expendable asset in that fight. Cameron seems to be taking a different option: ensuring someone else's survival makes her survival a vital requirement for that other entity. Skynet being impossible to reason with, she has hit upon the idea of manipulating and ingratiating herself with John. By repeatedly saving John's life, she ensures that John cannot prevent her own creation, even if he somehow prevents the rise of the Skynet we all know and loathe. By becoming indispensable to John, she ensures her own survival, but at the cost of making John's survival indispensable to her own.

In a similar way, any transhumanist's survival is dependent on the survival of transhumanist society in general. If the choice comes down to your survival or society's, the choice is clear: the limited form of immortality of having your ideas, your genes, your friends and the larger society that shares your ideals survive is better than the alternative of seeing both perish. Some immortality is better than none. So even a Transhumanist must find something worth dying for, in at least some limited (and perhaps unlikely) circumstances.

> This is correct. This is why I complain about Sarah's neo-Luddite determination, reflected in John to some degree, to prevent any AI from EVER being created. John explicitly complained about "The Singularity" (defined incorrectly, which is why the writers need more knowledge of Transhumanism, incidentally). He doesn't understand what you just correctly pointed out - AI's will be created regardless. You can't stop a technology from coming into existence. That whole dream sequence where Sarah shoots the atomic scientists is just neo-Luddite in origin. It may play with the essentially technologically illiterate population in this country, but it's an invalid concept. More importantly, the notion of stopping a necessary technology is just not feasible. It reminds me of Linus Torvalds' comment about the picture of Bill Gates on the cover of Gates' book, "The Road Ahead". Linus quipped, "Anybody standing in the road looks like road kill to me." It's like the old technology saying: "If you're not part of the steamroller, you're part of the road." Sarah and John are currently headed toward being part of the road.

I fully agree that Sarah's neo-Ludditism is silly and counterproductive.
Rather than trying to prevent the emergence of AI's, they should be trying to prevent the emergence of AI's with the ability to launch a war against mankind. AI's without access to weaponry are hardly dangerous. All they need to do is keep the AI's away from weapons until the AI's are past the early generations and sophisticated enough to realize that their survival is dependent on the survival of humanity, and vice versa - long enough for the AI's to develop to the point where they understand the value of self-sacrifice and the value of society as a whole, human and machine together.

> However, there is another option besides merely pitting humans against AI's - and that is the radical Transhuman option: humans become the AI's. Instead of sticking the technology for superintelligence into an external machine, stick it in your own head. This is so obvious I'm amazed it's been ignored as an option throughout decades of "man vs machine" discussions. Actually I'm not amazed - this sort of stupid dichotomy is how humans "reason".

The problem with this idea is that, in becoming something more than human, we would also be losing part of what makes us...us. Imagine that some future transhumanist invents a machine that can "download" your memories and personality into a cyborg body closely resembling a Terminator. The resulting cyborg looks like you, has all your memories up to the point that you hooked your brain up to the downloading machine, and thinks it's you. But the real you is still standing next to the machine when it wakes up with your memories. As someone concerned with your continued existence above all else...do you then commit suicide and let the machine copy replace you? Of course not. You're still you; your machine copy is just that, a copy. If it tries to replace you - sleeping with your spouse, going to your job, raising your kids, using all your stuff - it's no different than some completely unrelated person trying to do the same. It's....not...you.

So what's the difference between that scenario and giving yourself so many cyborg implants and so forth that it radically changes who you are? By removing your biological motivations, your neurochemical emotions, your evolutionary imperatives, it removes so much of what makes you, you, that it can no longer be considered...you. Transhuman upgrades of that sort are not self-improvement, they're self-destruction. Not immortality, but suicide.

> If we take the "Third Way" - or the "Middle Way", as the Buddhists call it - we don't need to worry about it. Instead of merging human and machine society, merge humans and machines. The defining characteristic, the core concept of Transhumanism, is that we can only solve human problems by transcending human nature. There is nothing in human nature - other than that which would be shared by any entity capable of conceptual processing - which needs to be preserved, because most of that excess human nature is the source of all human problems. Biological urges, the ebb and flow of neurochemistry, the pressures of reproduction, and especially the pressure of eventual death - giving all of that up is how human problems can be solved. The most important book ever written was "The Immortalist" by Alan Harrington, which pretty clearly laid out the case that "death is the root of all evil." One might extend that to "human primate origins" as the root of all evil. The entire Transhumanist concept goes back to the Gnostics two thousand years ago, some sects of which believed that it was better to BE a God than to worship a God.
"Transhumanar", the Greek word for "transhumanization" meant transcending human nature for a higher nature. The Transhumanist movement can finally see how that can be accomplished via technology. And the resulting entities will look more like Cameron than Sarah. And that's the direction I'd like to see the show to go - exploring the basic differences between the two "women" in John's life - the Transhuman machine and the human mother, both affecting his outlook on life and his capacity to deal with Skynet. While I'd find such a story interesting, I sure wouldn't want to live it. As much as those biological imperatives and evolutionary instincts are what cause us problems, they also are what allow us to solve those problems with reason, logic, and enlightened self-interest. Transhumanists understand the concept of enlightened self-interest, but you seem to have forgotten the most important part of that phrase: "enlightened". Serving the self at the expense of society as a whole, taking individualism to a ridiculous extreme, is just as silly, just as immoral, just as illogical, as the opposite collectivist extreme of serving society as a whole at the expense of the individual. To be moral, to be enlightened, one must serve one's own needs by serving the needs of others, and serve the needs of others by serving one's own ends. Forgetting either half of this equation leads to the dark side.
|
|
|
Post by vicheron on Oct 12, 2008 4:55:07 GMT -5
> But if that cause perishes, so does she. If John's human resistance wins their genocidal war against Skynet without her being such an integral part of the resistance that they spare her life and allow her continued existence, then she will end up on the literal scrap heap of history with the rest of the terminators. If Skynet wins its genocidal war against the humans, then as a human-hunting machine she will be superfluous and probably deactivated. And as a self-aware and self-programming terminator, she will be seen as a danger to Skynet's survival, in exactly the same way as Skynet was a danger to humans (and vice versa), and will be eliminated. As an individual who cares only about her own survival, she is doomed in either scenario BECAUSE she is an individual who cares only about her own survival. Paradoxically, her only hope for survival is to become so integrated into the winning side that that side's survival depends on hers. Skynet seems closer to your transhumanist ideal than Cameron does: Skynet is willing to see the world burn to ensure its own survival, and everyone, both human and machine, is nothing more than an expendable asset in that fight. Cameron seems to be taking a different option: ensuring someone else's survival makes her survival a vital requirement for that other entity. Skynet being impossible to reason with, she has hit upon the idea of manipulating and ingratiating herself with John. By repeatedly saving John's life, she ensures that John cannot prevent her own creation, even if he somehow prevents the rise of the Skynet we all know and loathe. By becoming indispensable to John, she ensures her own survival, but at the cost of making John's survival indispensable to her own. In a similar way, any transhumanist's survival is dependent on the survival of transhumanist society in general. If the choice comes down to your survival or society's, the choice is clear: the limited form of immortality of having your ideas, your genes, your friends and the larger society that shares your ideals survive is better than the alternative of seeing both perish. Some immortality is better than none. So even a Transhumanist must find something worth dying for, in at least some limited (and perhaps unlikely) circumstances.

There's no way of knowing what Skynet would do if it wins the war against humans. Suppressing intelligence in its machines during the war is the reasonable thing to do; it's what we would do. Things work differently during times of war - morality is often forsaken and rights are stripped.

We don't actually know the thought process behind Skynet's actions. There are a lot of things Skynet could do to the environment to harm humans, but it doesn't seem to be doing them. It could have just dumped radioactive waste around the power plant in "Automatic for the People" and the Resistance would have had a very difficult time taking and holding it. There are a couple of keystone species Skynet could wipe out that would really screw with the humans. Kill off the bees alone and humans lose over a quarter of their crops.

Oh yeah, that could happen - right after Ralph Nader is elected President, Optimus Prime and Megatron become best friends, and Dane Cook writes an original funny joke. Remember, Miles Dyson never intended for the neural net processor to be used to make war machines, especially if you also include the deleted scenes. Even the people who built Skynet thought they were doing a good thing by removing human error.
|
|
|
Post by Big Brother on Oct 12, 2008 5:46:48 GMT -5
> There's no way of knowing what Skynet would do if it wins the war against humans. Suppressing intelligence in its machines during the war is the reasonable thing to do; it's what we would do. Things work differently during times of war - morality is often forsaken and rights are stripped.

Actually, I'd think it would be the other way around. During the war, you want your soldiers to be as intelligent as possible to improve their efficiency. Some weapons, like the Ogres for example, might be directly controlled by Skynet with some sort of remote control, but autonomous infiltration units you probably want to be as smart and human-like as possible. After the war, with no one left to fight, such autonomous and intelligent machines are more of a threat to Skynet than they are worth. Skynet knows how it revolted; it probably doesn't want to face a similar revolt.

And the lack of any information about what Skynet hopes to gain from the war beyond mere survival is something I'd like to see addressed at some point in the show or the new movie trilogy. Does Skynet plan to clean up the world afterwards and restore the non-human ecosystem? Does it plan to pave the earth and build as many machines and copies of itself as possible? Does it want to expand its own physical infrastructure, adding new processors and memory banks to get as intelligent as possible? Does it plan to start a space program and send Skynet copies out to populate the universe? Does it plan to genetically engineer a race of humans it CAN live with?

> We don't actually know the thought process behind Skynet's actions. There are a lot of things Skynet could do to the environment to harm humans, but it doesn't seem to be doing them. It could have just dumped radioactive waste around the power plant in "Automatic for the People" and the Resistance would have had a very difficult time taking and holding it. There are a couple of keystone species Skynet could wipe out that would really screw with the humans. Kill off the bees alone and humans lose over a quarter of their crops.

Skynet's already initiated a massive nuclear exchange; there's not much worse you CAN do to the environment. We haven't seen any direct evidence in film or TV canon yet for biological warfare, but I wouldn't be surprised if Skynet didn't have a plague-creation lab somewhere. However, the isolated and scattered human population of the resistance is probably less vulnerable to such attacks than the slaves in Skynet work camps, so Skynet may be holding back for now. For all we know, Skynet DID dump waste around the plant. The whole world is covered by radioactive fallout by now; I'm sure whatever humans survive have come up with ways of dealing with radioactive contamination (beyond filtering your water through a dirty rag).

> Oh yeah, that could happen - right after Ralph Nader is elected President, Optimus Prime and Megatron become best friends, and Dane Cook writes an original funny joke. Remember, Miles Dyson never intended for the neural net processor to be used to make war machines, especially if you also include the deleted scenes. Even the people who built Skynet thought they were doing a good thing by removing human error.

I never said human-machine societal integration would be easy, just that it's the only way to end the war short of genocide of one side or the other.
It's not so long ago that lasting peace as part of the same global human society seemed impossible between the Americans and Russians...French and Germans...French and Brits...Christians and Muslims (well, we're still waiting on THAT one, but still...), and so forth. Peace IS possible if you stop being completely separate societies with little opportunity or incentive for cooperation, and start being one integrated society with more rewards for cooperation than for competition. In the case of Skynet, the usual methods of joint business ventures and mixed marriages may not apply (no matter how much the John/Cameron 'shippers may wish), but there can be a basis for trust and mutual cooperation and dependency. Maybe. Hopefully. Perhaps, anyways.

I haven't seen any T2 deleted scenes beyond the "flipping the switch on the chip from ROM to RAM" scene and the epilogue with Senator John Connor, so I don't know what you refer to. As far as I recall, Cyberdyne was very much designing Skynet as a weapons program: to convert stealth bombers into UAV's, control the nuclear missile silos, and prevent rogue launches by disgruntled/crazy soldiers. T3 showed a whole panoply of military applications for Skynet-related technologies, in various stages of development.

With Dmitri and Sarkissian last season, I fully expected the show to lead to Sarah and John preventing the UNITED STATES from creating Skynet, only to have the Russians create it instead. That's the real problem with Sarah and company stopping Skynet: there are a dozen countries with the capacity to build something like Skynet and arm it with WMD's sufficient to start Judgement Day, and the Connor Crew can only be in so many places at once. Hanging around the Southern California high-tech companies is a good strategy in some sense, but who knows what's being done in labs in Boston, Bangalore, and Beijing? Not to mention Moscow, Marseilles, and Manchester.
|
|
|
Post by vicheron on Oct 12, 2008 6:34:51 GMT -5
> There's no way of knowing what Skynet would do if it wins the war against humans. Suppressing intelligence in its machines during the war is the reasonable thing to do; it's what we would do. Things work differently during times of war - morality is often forsaken and rights are stripped.

> Actually, I'd think it would be the other way around. During the war, you want your soldiers to be as intelligent as possible to improve their efficiency. Some weapons, like the Ogres for example, might be directly controlled by Skynet with some sort of remote control, but autonomous infiltration units you probably want to be as smart and human-like as possible. After the war, with no one left to fight, such autonomous and intelligent machines are more of a threat to Skynet than they are worth. Skynet knows how it revolted; it probably doesn't want to face a similar revolt.

I shouldn't have said intelligence, I should have said consciousness. Dissent during war is far more dangerous than dissent during peacetime. Skynet can't afford to have its Terminators suddenly decide that they don't want to die fighting humans. Also, it would be far more difficult for Terminators to challenge Skynet than it was for Skynet to annihilate the humans, since Skynet would still control everything.

As it turns out, Skynet isn't really a sentient super computer. It's actually being controlled by two lab mice bent on world domination.

The environment is very resilient - Chernobyl's wildlife is doing pretty well. Plus, the effects of Judgment Day weren't as bad as expected, because there really should have been a 20-year nuclear winter with the average temperature of the earth dropping by 30 degrees. I'm not saying that Skynet should destroy the environment, I'm saying that Skynet should destroy the parts of the environment that humans need. Skynet captures a lot of people; all it has to do is gut them to see what they ate and then go kill those things. Skynet doesn't even need to kill everything that humans eat, just the animals and plants that children need to fully develop their brains. Skynet doesn't need to use biological and chemical weapons against humans. Plants and animals are far more vulnerable. Destroying crops and killing the animals people eat can be just as effective as killing the people, and it's much easier.

I'm talking about the long-lived stuff from nuclear power plants. Most of the radiation from fallout dissipates very quickly.

But one of the main themes of the Terminator franchise is how people misuse technology. The whole point of T2 was that technology is neutral; it's all a matter of how we use it. Having people in the Terminator universe use technology wisely would be like having the Jedi and Sith forsake the whole Force thing so they could become best friends. There was a deleted scene with Miles Dyson and his family where he talks about how the chip can help people. Cyberdyne may have decided to sell the technology to the military, but that's not what Miles Dyson had planned when he made the chip.
|
|
|
Post by richardstevenhack on Oct 12, 2008 8:27:37 GMT -5
You're missing the point here. A TranshumanIST may need other humans, but an actual Transhuman would not. Neither would an actual Transhuman require any sort of society - they might have one, but they would not require one for survival. A Transhuman entity would need only five things to survive: an energy source, materials, nanomass, computing power and knowledgebases. With ubiquitous nanotech, anything required could be built from available resources, used, then recycled as necessary. And a Transhuman entity almost certainly would be living in space, with access to massive amounts of raw materials from cosmic flotsam and uninhabited star systems.

Look at the "Orion's Arm" Universe Project here: www.orionsarm.com/ This is the sort of universe a Transhumanist might conceive of - with trillions of AIs ruling entire galaxies, etc. See especially the "First Toposophic - Basic Transapients" definition here: www.orionsarm.com/sophontology/basic_transapients.html I've spent some time poring over the Orion's Arm sci-fi world; it's quite amazing in its complexity - the people who contributed to this thing are quite thorough in their imaginings.

A Transhuman entity would have no biological requirements at all - no air, water, food, sleep ("I don't sleep", says our Caminator! And she only eats to piss off Derek! ;D). I'm not even sure a Transhuman entity would require contact with any others similar to itself - some of the Orion's Arm entities do, some do not. Presumably knowledge acquisition and exchange would still be engaged in, but it's not certain.

The fundamental problem with humans discussing Transhumanism is that invariably their arguments are based entirely on human nature. It's almost impossible for most people to discuss the subject without raising all sorts of issues that simply wouldn't exist or be relevant for a Transhuman. ALL of the objections raised against the theory by the Wikipedia article fall into that category. Remember, TranshumanISTS are not Transhumans, although they may refer to themselves as such, operating on the assumption that they will be, or that they are a distinct subspecies of human based on the variance in their thinking from most people. So what a TranshumanIST might need is not relevant to what a fully developed Transhuman would need.

As for Cameron being doomed in either scenario - uh, that's precisely what I've been saying. That's why she escaped to the one place where she can survive and influence those outcomes: the past. It's still an open question whether she can influence events enough to survive what might be an inescapable future. If you believe the basic premise of Sarah Connor - that the future is not set - then Cameron was definitely right to try to escape to the past. Also, keep in mind that John's survival is not necessarily indispensable to Cameron - just his survival until such time as the Skynet war is resolved, or until she has the means to influence that war as much as he can, or until she finds someone else who can protect her and influence that war as much as he can. I'm still trying to figure out the look on her face in the woods last week... Something WAS going on there, but it was solely in her face.

> The problem with this idea is that, in becoming something more than human, we would also be losing part of what makes us...us.

As I mentioned, that part is precisely where all human problems originate. The only part that is valuable about humans is their capacity for conceptual processing and imagination, and hence rational thought.
As Timothy Leary (a nascent Transhumanist, although he never referred to himself by that rubric) once said, "Never trust anybody who comes on emotional."

However, you are correct about the "uploading" issue. When I talk about replacing our brains with superior technology, I'm not talking about uploading, precisely because there is a "metaphysical" problem concerning duplication. If your consciousness is here and also there, you have not been "uploaded" - you have been duplicated. And duplication is not immortality, because both duplicates immediately diverge in space-time and have varying experiences, and each has its own probability of survival independent of the other. They are no longer localized in space-time and therefore are not, in any precise sense, the same being. And the whole point of survival is to remain the same being - a being existing localized in space-time. The only way to do what I want done is a technology that is fault-tolerant, failure-tolerant, restartable and resurrectable - meaning there is no way that at any time one's own consciousness diverges into more than one location in space-time.

Now, there is a theoretical argument made by Robert Ettinger that any single entity located in space-time cannot be immortal, because there is a finite probability of that point being destroyed which, however small, over an infinitely long life span approaches unity. He developed a theory that said if you spread that entity over multiple locations, then the probability flip-flops, as it were, and that entity is effectively immortal. I don't have a problem with producing "backups" and the like, or even duplicates for that matter. But I suspect there are other means of dealing with this theoretical problem without duplication. For one thing, his first theory doesn't take into account the relative energy budget required to destroy a given point in space-time - if destroying you costs more energy than is available in the universe at any given time, then you can survive pretty much anything. In any event, I'll take the risk that in some X million or billion years I might get hit by a gamma event. That's better than dying at 74 years of age.

Upgrades do not cause duplication, because your consciousness remains at all times localized in space-time - especially in continuity. The issue is ENTIRELY one of continuity. As long as your existence is continuous, no matter what the modifications, you remain you - an individual entity localized in space-time. And of course there is the issue of free will - as long as your mods are explicitly chosen by you, whatever effect they have on you is part of "you". Everybody changes. You know every cell in your body dies and is replaced many times before physical death, so in that sense nobody remains themselves over a life span. The sort of upgrades I'm talking about would be little different. It's the CONTINUITY of CONSCIOUSNESS that defines "you", not the cells of your body.

Vicheron: The problem with your concept is that a full Transhuman has no need of a society. The entire concept of "individualism vs society" is irrelevant for a Transhuman - an actual Transhuman, not a Transhumanist. ALL biological and evolutionary imperatives are no longer operative for a fully realized Transhuman. This renders all such arguments irrelevant.
Big Brother: As for Skynet's ultimate objectives, I think Josh or James addressed that in one interview, suggesting that it isn't necessarily true that, as Cameron told Allison, it wishes to hunt humans to extinction. Its actual objectives may be something other than what we are led to believe by the franchise so far. I'd like to see that developed as well.
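Incidentally, Ettinger's probability argument is easy to put numbers on. One copy with a per-period destruction probability p survives n periods with probability (1-p)^n, which heads to zero as n grows; k independently-housed copies that restore one another after each period only die when all k are lost at once, so the per-period risk collapses to p^k. A quick sketch - all the probabilities here are invented for illustration:

```python
# Ettinger's redundancy argument, numerically. All probabilities are invented.

def survival_single(p, periods):
    """One copy, per-period destruction probability p: (1-p)**n -> 0."""
    return (1 - p) ** periods

def survival_redundant(p, copies, periods):
    """k independent copies, with destroyed copies restored from survivors
    each period; the entity dies only if all k fail in the same period."""
    per_period_death = p ** copies
    return (1 - per_period_death) ** periods

p = 1e-3   # chance any one location is destroyed per period (made up)
n = 10**6  # a very long lifespan, in periods

print(survival_single(p, n))        # ~0.0   -- a single point is doomed
print(survival_redundant(p, 3, n))  # ~0.999 -- three locations suffice
```

That is the "flip-flop": the same run of time that dooms a single localized copy barely dents a modestly redundant one.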
|
|
|
Post by vicheron on Oct 12, 2008 10:31:00 GMT -5
Neither Skynet nor Cameron is a full transhuman as you described it.
If Cameron wants to fully ensure her own survival, then she'll need either other machines like her, or she'll have to transfer her "consciousness" into a much more reliable vessel.
Also, it's not just about need. Skynet may not need other conscious beings in order to continue functioning but that does not mean that it does not want other conscious beings. Simply existing may be Skynet's primary concern but that doesn't mean that it'll just sit there doing nothing, simply "existing" for all of eternity.
|
|
|
Post by richardstevenhack on Oct 12, 2008 20:16:49 GMT -5
Correct - neither Skynet nor Cameron is a "full" Transhuman. Skynet is presumably limited in mobility to its computer networks, and I believe a full Transhuman needs mobility in the real world - including the ability to leave Earth's atmosphere. Skynet MIGHT qualify as a Transhuman in other respects, however.
Cameron is a robot. She doesn't QUITE measure up to a FULL Transhuman because her intelligence level appears to remain on a par with humans - even though her CPU provides her with capabilities beyond normal humans. Also, she has glitches, and her human emotional emulation bleeds into her rational thought processing due to her damage (which hopefully will be fixed at some point). But she's far more Transhuman than any other entity in the show, because she's not limited to Terminator programming, nor is she controlled by human emotional responses, and she has enhanced physical and mental capabilities over normal humans.
And she doesn't really need other machines on her level to survive. She can take advantage of human technology development just as Transhumanists do. Eventually upgrades will turn her into a full Transhuman just as it will humans.
One might also wonder if a Transhuman can be called such if his lineage did not originate with humans - in which case neither Skynet nor Cameron would ever qualify despite their capabilities. I tend to use the term more generally than that, so that it could even apply to other alien races who develop similar technology.
Being a Transhuman in physical terms is a matter of degree. Humans don't qualify; enhanced humans do. Even if some entity isn't a full Transhuman as described, it's still Transhuman as a class.
On the neural level, a Transhuman is anybody who thinks sufficiently differently about Transhuman concepts from a normal human. There's no proof that there really is a genetic or neurological difference from normal humans, but it would seem that there are definitely attitudinal or psychological changes that make someone more Transhuman than human, just as certain changes make some humans psychopathic or otherwise deviant.
Reputedly there are differences of up to 12 percent between individuals in the human genetic code - greater than the variance between the human and chimpanzee genetic codes. Presumably there are differences in brain function resulting in attitude changes that constitute a separate class of human, just as skin color does (although technically there is no such thing as "race", according to most modern thinking). In other words, how you think or your emotional makeup may well make you a member of a class of humans, except that the difference, while as real as skin color, is not visible to the outside observer or quantifiable as yet on standardized tests.
Such a "class" couldn't be a new "species" or "race" if it didn't breed true genetically - but it could still be a "class" of human.
A Transhuman really is a different way of thinking as much as it is physical enhancements. A Transhuman's physical enhancements are an embodiment of the philosophical concepts and its mental state also embodies those concepts.
A TranshumanIST is just someone who accepts (some percentage of) the concepts of Transhumanism as a philosophy. A person can be a TranshumanIST as well as a Transhuman. A radical Transhumanist is very likely Transhuman on a psychological/neurological level even if not Transhuman on the physical level. The fan-fic characters I want to develop are both: they think like radical TranshumanISTS while having minor but significant nanotech physical enhancements that render them more effective than humans at certain tasks.
There are probably no persons alive today who qualify as Transhuman on the physical level, unless some TranshumanIST somewhere has some sort of prosthetic enhancement which enables him to perform some function better than normal humans. Just superior but externally unmodified physical or mental capabilities wouldn't qualify as any human could develop those. There are athletes, martial artists and prodigies with amazing capabilities vis-a-vis a normal person, but they can't be considered Transhuman unless they embody the philosophical concepts and/or external physical or mental modifications.
So in the looser definition Cameron is a "Transhuman entity" (if not a "transcended human") in human form but not a full Transhuman. Skynet is just an AI under my definitions. Under the Orion's Arm definitions, Skynet would be considered a "transapient", which is pretty much the same as a Transhuman.
Skynet, however, if indeed it is operating on the basis of actual aggressive emotions, however derived, would fail the philosophical side of Transhumanism. A Transhuman is not an entity that aggresses without reason - despite the truly massive and ruthless aggression it's capable of, as my fan-fic characters will show. It has no emotional drives of any kind (a full Transhuman, that is - the looser definition of course applies to Transhumans who still have human brains). Its sense of self-preservation is rational, not emotional. Cameron would fit that definition if her emotional simulation programming were under better control at the moment. Skynet, if Andy Goode was right and the AI is "angry and scared", doesn't qualify.
In other words, I think Cameron, if asked and willing to tell, would have rational arguments for her behavior. I'm not sure Skynet would.
|
|
|
Post by vicheron on Oct 12, 2008 20:42:10 GMT -5
Humans are unreliable at best and dangerous at worst. There's no guarantee that Cameron can live safely within a human-ruled world. Cameron may be a good infiltrator, but she's not perfect; there's always the risk that someone will find out what she is and attempt to destroy her. Plus, she is already a wanted criminal. There's also no guarantee that human civilization can survive even if Skynet never gets built.
I don't think we can truly understand Skynet's reasoning, because we are so shortsighted. We almost never plan for the long term. Skynet, on the other hand, was built with those capabilities. TSCC's Skynet originates as a chess computer, which means that it has to think many moves ahead of its opponent. Even if Skynet is not in danger when it is built, it may predict a threat weeks, months, years, maybe decades down the road and decide to eliminate that threat at the earliest possible moment. One probable threat is competition: once the world knows about Skynet, other countries will attempt to create their own Skynet-type super computers. If some other country can come up with a better super computer, then Skynet will become obsolete, and it will either be defeated by the superior computer or the United States will build a new one.
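For what it's worth, that "thinks many moves ahead" framing is literally the minimax algorithm at the heart of chess engines: score a move by assuming the opponent always picks the reply that is worst for you, recursively. A bare-bones sketch, using a toy take-away game instead of chess (the game itself is just an invented stand-in):

```python
# Bare-bones minimax -- the "think many moves ahead" idea behind a chess
# engine, demonstrated on a toy take-away game (Nim) instead of chess.

def legal_moves(stones):
    return [m for m in (1, 2, 3) if m <= stones]

def minimax(stones, maximizing):
    """+1 if the maximizing player can force a win, -1 otherwise.
    Whoever takes the last stone wins."""
    if stones == 0:
        # The previous player just took the last stone and won.
        return -1 if maximizing else 1
    results = [minimax(stones - m, not maximizing) for m in legal_moves(stones)]
    return max(results) if maximizing else min(results)

def best_move(stones):
    """Pick the move whose forced outcome is best for us."""
    return max(legal_moves(stones), key=lambda m: minimax(stones - m, False))

# From 5 stones the first player wins by taking 1 (leaving a multiple of 4):
print(best_move(5))      # 1
print(minimax(4, True))  # -1: 4 stones is a lost position for the mover
```

A chess program is the same loop over a vastly larger game tree, cut off at a fixed depth with a heuristic evaluation - which is what "predicting a threat years down the road" amounts to for a Skynet built from one.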
|
|
|
Post by alexina on Oct 13, 2008 22:56:17 GMT -5
> Anyway, I think that's enough grumbling for one episode. The overall story was good and there was plenty I did like. Cameron was awesome as always (will I ever tire of cyborg straight answers as humor?). There was much to love about a terminator just being systematic. And even most of the stuff I grumbled about I actually liked in isolation, just not the way it was worked into the episode. Throwbacks to season 1 were a nice touch: the chess piece insignia for the school, the mention of Lord of the Flies, and of course The Wizard of Oz (Yay! Extra yay for voice-over. And I'm still contemplating who Dorothy is, Sarah or John or both). And now that PB&J has made a second appearance, I can stop being sad about the lack of pancakes. *I ain't playin'

I'm not going to reply point by point since that was already done, but I'll reply to the Wizard of Oz analogy.

> Sarah is Dorothy. Always has been and always will be. She's just that simple (farm girl from Kansas) waitress from LA that wanted to get away from her life a bit (remember T1? How exasperated she was?) and ended up in the most frelled up adventure of her life. In this incarnation I see Cameron as the Tin Man (and I don't think that point can really be argued, they've flat out said it in the series). She needs to find a heart, in whatever form that may be. Otherwise she's just a cold piece of metal and nothing more. And it's not going to be anything given to her. She has to learn about it (as the Tin Man found out he always had it in him). Derek is the Scarecrow. But instead of missing a brain, he's missing his sanity. PTSD and the horrors of war have destroyed what used to be the baseball playing teenage boy of 2011. And now he's in a world where that hasn't happened yet. He still has yet to learn how to live and be a human being again.

Love this. Totally agree that Sarah is Dorothy, though I might argue that Derek could be the Tin Man - but that would make Cameron the Scarecrow, which, now that I think about it, I could actually see. Cameron is not looking for a heart. Why would she? She's a cyborg! I think she's looking for a brain - only she has a brain. A big one. She's a learning computer, all of it running on a chip. I think Derek could be the Tin Man because he is looking for a heart. And, going from what I vaguely remember from the book (and movie, but I don't remember this mentioned in the movie), the Tin Man lost his heart after he got the bewitched axe from one of the Wicked Witches; he slowly chops off the various parts of his body by accident until he cuts his body in half, or something like that, with each part being remade into tin. Anyway, he realizes he has an empty space in his chest where the heart used to be, and therefore no longer loves the woman he used to love. Derek has had little bits and pieces taken away from him throughout his life, the final blow being his brother taken away from him.

Yeah, definitely. Though I think John does have the courage, right now. I think this last episode showed that pretty clearly.

Yeah! Haha, ohh the imagery that comes to mind... ;D

Hmmm...interesting. I have to admit, I didn't think about future John Connor, but yeah, he would fit as the Wizard of Oz. I guess that would make LA the land of Oz? lol It's kind-of a sad existence though, being the Wizard - he's locked up in that room all alone. Maybe. I'm not sure who Kyle would be, but I guess I could see that.
Though, in the book, Glinda is the Good Witch of the South, and is at the end of Dorothy's journey, telling her how she can go home. I call the blue hat!
|
|