I admit, I am mixing intelligence with self-awareness. I would say children aren't fully self-aware until mortality is thoroughly grasped, but certainly they have a degree of intelligence.
I would say that no one is fully self-aware.
Quote:
As far as intelligence is concerned, I agree. But can't any degree of intelligence be better with self-awareness than without? Or is self-awareness simply an illusion occurring within anything sufficiently intelligent? In which case, how do you know when a machine has reached that point? Does it matter? Probably only if we think ethics and morals should be applied to any sufficiently intelligent system, right?
Awareness is certainly a prerequisite for intelligence, and self-awareness is part of general awareness.
I agree that morality should only be applied to sufficiently intelligent systems. Would you consider any animals sufficiently intelligent to be held responsible for their actions?
Quote:
You think a system programmed to learn, and advanced enough to conceive of its own mortality, wouldn't already have to have gained a good amount of intelligence to reach that point at all? It sounds as though you are saying it doesn't take much to understand one's own mortality. To that I would disagree.
No, I don't think I stated anything that would imply that.
Quote:
But I think I jumped into the middle of this without defining a couple of basics. Are we in agreement that there is such a thing as self-awareness, and that while it is related to intelligence it isn't the same thing? Would an independently learning system that is gaining knowledge, and in theory intelligence, come up with a concept of its own mortality on its own, or does it have to be explicitly taught? Does this even mean anything?
Agreed, a lot of animals could be considered intelligent to some extent, but as far as I know only humans and chimps are self-aware. Maybe gorillas too? I'm not sure.
Quote:
I've also always assumed that conceiving of one's own mortality was an important step in the formation of the concept of the "self" in a person's brain. Without it there isn't that final division between the brain and the world around it. Not only that, but I've always looked towards the conception of mortality as the crowning moment in which the brain decides that the world can't possibly exist without it (since it's never known a world in which it wasn't a part of) and often decides that its own self-awareness is permanent.
I never thought the concept of mortality was very important to intelligence or self-awareness. In fact if you head over to the #immortal channel you will find lots of self-aware, intelligent humans that do not think they are mortal. I'm not kidding.
Quote:
Back to the Turing Test - it sounds like the Turing Test is looking to decide on true intelligence and I agree that it fits the bill. Does that mean that self-awareness comes along with that? I understand that during the course of the Turing Test the machine could demonstrate a deeper understanding of the topics being discussed, but how can you convince the user of true self-awareness given that the entire conversation could be brute forced with a sufficiently powerful machine and a large enough database? That wouldn't be self-awareness so much as it would be a mimic of human conversation.
What makes you think intelligence could be brute forced? Consider the possibility that the machine has to be able to learn and play a game that is made up during the test.
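The distinction between the two posts above can be sketched concretely. Below is a minimal, hypothetical illustration (all names and replies are invented, not anyone's actual proposal): a "brute force" conversationalist is just a lookup table over a database of canned exchanges. It can mimic conversation it has stored, but a game invented during the test has no entry in the database, so no amount of storage or speed rescues it.

```python
# Hypothetical sketch: a brute-force "conversationalist" as a lookup table.
# It mimics stored exchanges but cannot learn a rule invented mid-test.

CANNED_REPLIES = {
    "hello": "Hi there! How are you today?",
    "are you self-aware?": "Of course I am. Aren't you?",
}

def lookup_bot(utterance: str) -> str:
    """Answer only from a fixed database; no learning, no memory, no state."""
    return CANNED_REPLIES.get(utterance.lower(), "I don't understand.")

# Stored conversation is mimicked convincingly...
print(lookup_bot("hello"))
# ...but a made-up game defeats the table: the rule did not exist when the
# database was built, so the bot can only fall back to its default reply.
print(lookup_bot("let's play a game: reply with my words reversed"))
```

The point of the sketch is only that passing by lookup and passing by learning are observably different once the judge improvises.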
Quote:
So then does a machine passing the Turing Test mean that debates will be sparked over its "rights" as a self-aware machine? Doesn't the machine first have to provide evidence of its own self-awareness before anyone would be interested in providing it rights of any kind? Is passing the Turing Test the final test that would be required? Would a race of intelligent machines ever deserve any kinds of rights?
Regarding the statement that gorillas may possibly have self-awareness... Many gorillas have been taught sign language so that they can communicate with their handlers, and routinely express their own needs, as well as questioning the needs of other gorillas, or of the gorilla's handler. This requires that the gorilla be able to communicate the distinction between itself and others. Is the ability to distinguish between oneself and others the same as self-awareness?
Next: I would agree that intelligence and self-awareness are not binary properties that are either "on" or "off", but are properties better classified by degrees. This would mean that tests such as the Turing test should be able to determine the degree of intelligence and the degree of self-awareness.

If a test such as the Turing test returns a binary result (measures of intelligence or self-awareness that are either "on" or "off"), I believe this implies that the purpose of the test is not to provide an abstract measurement of those properties, but rather to determine whether or not the subject being tested is sufficiently intelligent or self-aware for some purpose. I believe that purpose is determining whether or not the machine should be extended rights, and this brings up the moral questions regarding the level of intelligence necessary to deserve rights.

Using the Turing test or another test with binary results to determine eligibility for rights means that rights are awarded in a binary fashion: one receives all the rights of a human, or no rights, or perhaps animal rights (O.K., they are awarded in a quantum fashion, in a few distinct increments). I believe that this process is flawed, and that intelligent and self-aware systems should be awarded rights on a scale reflecting the scales of intelligence and self-awareness. While the Turing test could be used in conjunction with other tests to provide a quantum estimation of these properties, its inability to return a truly fuzzy result should make the Turing test obsolete.
Of course, these considerations would be secondary to the functional consideration of the infinite future.
As to whether or not intelligent machines should ever deserve rights, I would say that in the absence of divinity, it is the product of the system (its intelligence) and not the constitution of the system (neurons, wires, or quantum knots) that determines eligibility. So, yes, they should at some point deserve rights.
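The binary-versus-graded contrast argued for above can be put in a small sketch (entirely hypothetical; the function names, threshold, and scoring rule are invented for illustration). A Turing-style test collapses everything into pass/fail, while a "fuzzy" test would return degrees that rights could scale with:

```python
# Hypothetical sketch of binary vs. graded test outcomes. The 0.5 threshold
# and the product-of-degrees rule are invented examples, not a real metric.

def turing_result(score: float, threshold: float = 0.5) -> bool:
    """Binary outcome: the subject either passes or it doesn't."""
    return score >= threshold

def graded_rights(intelligence: float, self_awareness: float) -> float:
    """Award rights on a continuous scale, here the product of both degrees."""
    return intelligence * self_awareness

print(turing_result(0.49))       # False: falls off the cliff, no rights
print(turing_result(0.51))       # True: full rights
print(graded_rights(0.49, 0.8))  # roughly 0.392: partial rights instead
```

The sketch only makes the structural point: a threshold creates a cliff at an arbitrary score, while a graded measure lets rights track the degrees being measured.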
Re:The Turing test
« Reply #62 on: 2003-05-06 05:35:15 »
Alan Turing once asked "can machines think?" and designed a test to find out. If a machine could hold a convincing conversation with a human, it could be said to think. I ask "can machines fuck?" and suggest that if they can do so with a human convincingly, they could be said to think.
N: You recall my interest in programming a sex bot for you. It's interesting because it's kind of a way to live vicariously through it. add telepresence. add neuro-haptic interfaces.
A: That would be fun. It would also break down sexuality boundaries. Picture the scene - a man is fucking his female-designed robot only to find that a man has been *behind* her all along.
N: Yup. hehe. It would be so much fun to pop that on unsuspecting strangers, and see how they react.
A: Candid Camera? Through the eyes of the bot maybe? *L*
Is it the shape of the genitals or the gender identity which counts? In this case it is the latter, as it is the telepresent man who was doing the fucking. The robot was just a tool. The two guys were still mentally fucking, so it was a homosexual act. Would it be a homosexual act if a man plugged another man into a sex-toy & showed him pictures of women (that he, the shower, chose) until he came? If you're having sex with a man who lives as & identifies himself as a woman, are you then still by definition heterosexual? But in this case there is no third-party robot...
N: /me's heads explode. =D
A: *L*
The absurdity of sexuality pigeonholes (excuse the phrase) is revealed. Sexuality dissolves/fragments as third parties become involved. It's like sex with another man and a woman. If the guys are just fucking her & not paying much attention to each other, are they still having sex with each other? How much is the vaginal membrane really separating their erections? They can still feel each other for sure. Or are they having sex with the same woman coincidentally? Now multiply the scene with other people or people telepresent via robots or convincing transvestites.... Evidently, sexuality is built around monogamous & clearly defined genital interaction and gender identification. It is not just what you see that defines who you are having sex with, but what the other person thinks about themselves. That said, I imagine that most people would still claim male/female genital interaction is still heterosexual, even if the woman acts & lives (& fucks) like a man, that she is masculine, that her *gender* is male, she was raised & has always lived that way. But how different is that from the first telepresence example? The only difference is the physical separation of gender identity from genital sex in the case of the telepresent man. In the example of the masculine woman, she is both in one body.
A new Turing Test springs to mind..... You have sex with two bots. One is a telepresent human, another is a telepresent computer. Later, when we have more convincing synthetic bodies, telepresence isn’t necessary. But by then, I expect this test to be moot.
N: lol - good one! I continue to see applications for human-equivalent AI, since telepresence will likely be able to step in before sufficiently sophisticated software. call it advanced puppetry. It might be a tight race, though.
The things that are holding up advancement used to be technical. today, they're economic. tomorrow, they will be political. after that, they'll be social.
A: Imagine what telepresent bodies could do with each other...? A throw away lover could be a big thing in extreme-sex circles. Imagine killing & being killed.
N: hm. telepresent deathmatch quake in meatspace! mmm - probably wouldn't work, what with all the teleportation going on. As realistic as modern 3D shooters want to be, realism makes a horrible game, commercially speaking.
N: Telepresence, of course, would add a largely unanticipated spin on sex bots. If you can hook an artificial intelligence into a robot, you can hook a human intelligence into it, too. this might cushion the shock a bit, as while AI minds might be touchy, modular bodies and telepresence wouldn't be so controversial.