@ spiffor: resurrecting the animal rights thing


  • #31
    I get sick of people elevating humanity so highly. Most of the differences between us and other apes are in quantity, not quality. Probably the only qualitative difference is symbolic thinking (including language). Humans and chimps are so similar behaviorally it is unreal; hell, we even have very similar patterns of violent behavior.



    • #32
      Behavior, except in a few specific instances, is not necessarily connected to sentience at all.



      • #33
        It's really sad: you keep claiming you're a skeptic/relativist, and then you pull out of a magician's hat some arguments about sentience meant to justify a whole array of fuzzy moral theories.
        In Soviet Russia, Fake borises YOU.



        • #34
          Relativism doesn't preclude personal morality.

          If you think my arguments are bad, refute them.



          • #35
            It just means that you have to acknowledge your own morality is BS.
            In Soviet Russia, Fake borises YOU.



            • #36
              No. You just don't understand relativism.



              • #37
                Re: @ spiffor: resurrecting the animal rights thing

                Originally posted by Kuciwalker
                First, I'd like to make a specific distinction between sensation and feeling, as I will use them. Sensation is simply what we call stimulus to some object that acts as if it performed some sort of logical computation on its stimuli to determine its response; thus, a flower senses light but an electron does not sense an electric field.
                What? First of all, "acts as if" is a pretty bad phrase to use in any definition, given that it is a subjective statement made by an observer.


                Fundamentally there is no difference between those two, to me, but it's a useful distinction for practical purposes. Feeling is what results when a conscious being senses things - for instance, your eyes don't merely sense light; you feel (or rather see) it as an actual image, in your mind.


                These are two different things. Your eyes are merely light receptors. They can do nothing but sense light. That is their whole evolutionary purpose. The image formed in the mind is just that, a representation within the mind, which might or might not correctly approximate the supposedly real objects being represented mentally.


                Pain is a sensation. When we sense pain, we also feel it, and since we dislike it strongly we attribute (in general) a negative moral value to it. However, at a basic level pain is merely one stimulus among many, one that happens to be a signal that some part of the body is damaged or in danger of becoming damaged. A robot, which had internal sensors to measure stress on its components, could be said to sense pain when those sensors were triggered, but we would not attribute any negative moral value to it - in fact, I doubt anyone would have a problem if we "tortured" several of these robots to death in order to test that the pain sensors functioned properly and that the robots responded properly to them, because they're not self-aware.


                Pain is a sensation, but it is a specific form of sensation with, as you described, a very peculiar outcome, which is to warn of danger. Inherent in the notion of pain, then, are negative consequences for whatever is feeling pain if no action is taken to remedy the cause. Actually, it would be morally reprehensible to create something that actually feels pain and then cause it extreme pain to experiment. You would in fact be in the grand moral minority if you attempted to do so.
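
                To make the sensation/feeling distinction at issue here concrete, here is a minimal sketch (hypothetical names, Python used only for illustration, not any real robot API): a robot whose "pain" is nothing more than a threshold check on a stress sensor, triggering a protective response with no inner state beyond the reading itself.

                Code:
                # Hypothetical sketch: "pain" as a bare sensor signal, nothing felt.
                from dataclasses import dataclass

                @dataclass
                class StressSensor:
                    reading: float = 0.0           # current mechanical stress on a component
                    damage_threshold: float = 0.8  # above this, the part is at risk

                    def pain_signal(self) -> bool:
                        # "Pain" here is only a boolean computed from a number.
                        return self.reading > self.damage_threshold

                def control_loop(sensor: StressSensor) -> str:
                    # The robot reacts to the signal much as an animal withdraws from pain,
                    # but nothing in this code claims any feeling behind the reaction.
                    if sensor.pain_signal():
                        return "withdraw load / shut down actuator"
                    return "continue task"

                # "Torturing" the robot to test the sensors is just setting a field:
                print(control_loop(StressSensor(reading=0.95)))  # withdraw load / shut down actuator
                print(control_loop(StressSensor(reading=0.10)))  # continue task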

                There is no negative moral value to pain in a nonsentient being - we routinely, constantly genocide millions or billions of beings in a horrific way, by causing them to implode, but no one cares because they're bacteria.


                It is highly unlikely that, lacking a complex nervous system, bacteria feel pain. They may react to positive and negative stimuli, but pain is not just "negative stimuli". It is a creation of the mind, just as an image is: the brain must process the incoming data before putting out the "feeling" of pain. This makes it different from a purely reflex action.


                My criterion for this is that *any* being that communicates the concept of sentience, without having had the concept imparted to it by another sentient being (such as in the case of an intelligent but nonsentient being that has read about consciousness), must be sentient. This is because I do not see any way an intelligent but nonsentient being could come up with the concept of sentience.


                What? Most people could not communicate to you the idea of sentience, period. How does one communicate the notion of sentience anyway? And how would you know the difference? For example, in this case of the non-sentient intelligent being, how would a third-party observer, when hearing the being explain sentience, know whether perhaps, in the past, some other being taught it the notion?


                The first conclusion is from the first and second assumptions: human beings (in general; I will address the particular limits and exceptions later) are sentient. I know I am sentient. In addition, I know that there are other sentient humans, because of all the philosophers in the past who talked about it. Knowing from my second assumption that my sentience and theirs is the result of the construction of our brains, I begin to suspect that human brains are all fundamentally similar, in such a way as to make humans sentient. This is, I believe, reasonable.


                How do you know you are sentient? You did not have the concept of sentience before it was taught to you (someone must have taught you the word, and without the word you could not have had the concept). Why then are you sure you are sentient, and not some intelligent but non-sentient being?


                The second conclusion is from the previous conclusion and all three assumptions: a computer can be sentient. Since a computer of sufficient size and power can simulate the behavior of a physical system containing a human being, it can simulate the behavior of that human being, and since consciousness is due only to the physical constitution of the human brain, it can simulate sentience. But then it would satisfy the criterion in the first assumption. The computer would, in effect, be a human being.


                "Simulate sentience"? So you are saying that merely mimicking a behavior is proof that some machine understand and has internalized that behavior?


                Finally, I conclude that animals, except possibly our close primate relatives, are not sentient, simply because there's no reason to believe that they share the fundamental similarity that human brains have, and because I've never had the concept of consciousness communicated to me in any way by them.


                As Spiffor said, your likely inability to understand what any alien being tells you makes that rather difficult. On top of that, similarity to a human brain in what respect? Do humans have any form of brain matter not found in other beings? Is your argument that anything below a certain number of connections between neurons means you have no form of sentience?


                It may be objected that certain emotional displays of animals, especially mammals, suggest that they are sentient, but I see no necessary connection, either way, between sentience and such displays, especially given that it would be possible to create a relatively simple robot that would be furry and such and make identical displays while being incontrovertibly nonsentient. (This isn't currently possible mostly because we don't have access to quite as good motors and servos as nature has designed over millions of years.)


                Why are you using the word "emotion"? You haven't defined what it means in your system. If you are using the standard definition, it rather undermines your argument, since complex emotions, such as love, are thought to be rather complex things; and if you are saying that animals can love, or show love, then you are placing a dagger into your own argument.


                It may also be objected that certain animals recognize themselves in a mirror, and therefore must be self-aware. If the term self-awareness is used so literally, it is not interchangeable with sentience, since there is no necessary or even plausible connection between image recognition and sentience.


                Actually, self-awareness is crucial even for your stated notion of sentience. A being incapable of telling itself from another would not then really be able to judge the sentience of others, being unable to know which being it is while assuming itself to be sentient.

                For example, a program could be written that identifies an object in a camera view based on images and 3D data provided to the program at the beginning ("look for this"), and then this program could be loaded onto a robot which is also the object the program is set to recognize. However, the same program would recognize something else as self if one simply gave it the data of another robot. In addition, I think it's just generally absurd to claim that any good image-recognition software is by definition sentient.


                The issue is what occurs if that original image changes: will the being be able to process the change? For example, you make a robot with a blue, plastic, easily molded exterior. You show it itself in a regular mirror and program the software to recognize itself. Every time the robot finds itself in its current condition in front of a normal mirror, it will recognize itself. The question is this: turn the robot off, take the cover, make it red, and mold it into a very different shape. Then place this changed robot in front of a mirror and turn it back on. Under your previous example, would the robot be able to recognize this red, different-looking robot as itself?

                You recognize yourself in front of a mirror even after a haircut, and perhaps even with a new scar. It is highly unlikely that a mere robot programmed with a set image of itself would be able to recognize itself if some radical change occurred.
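
                A toy sketch of the kind of recognizer being discussed (purely hypothetical, written in Python for illustration): the program calls "self" whatever stays close to the appearance template stored at setup time, which is exactly why the repainted, reshaped robot would fail the test, and why any other robot matching the template would pass it.

                Code:
                # Hypothetical toy "self-recognition": match against a fixed stored template.
                import math

                def distance(a: list[float], b: list[float]) -> float:
                    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

                class TemplateRecognizer:
                    def __init__(self, self_template: list[float], tolerance: float = 0.2):
                        self.self_template = self_template  # appearance captured when the robot was set up
                        self.tolerance = tolerance

                    def is_self(self, observed: list[float]) -> bool:
                        # "Self" is whatever looks like the stored template - nothing more.
                        return distance(observed, self.self_template) < self.tolerance

                blue_robot = [0.1, 0.2, 0.9]            # stand-in features: blue, original shape
                recognizer = TemplateRecognizer(blue_robot)

                print(recognizer.is_self([0.12, 0.21, 0.88]))  # True: small change, still "self"
                print(recognizer.is_self([0.90, 0.10, 0.10]))  # False: repainted and reshaped, "self" is lost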

                It has been objected in the past that sentience could have evolved gradually, but I don't even understand what the heck it means to be, say, "half sentient", and I don't think sentience is something on a continuum. In response to this, the question has been raised "how did humans suddenly magically get it?" This is why I'm willing to concede that some primate relatives may be sentient; I think sentience may have evolved as an aid to empathy, allowing primates to create larger and stronger social organizations, and ultimately giving people the ability to form arbitrarily large social organizations, even to the concept of the human race. Such organizations are formed from the extension of the concept of self onto ever-larger bodies - my family, my tribe, my city, my nation, my race... and once you consider others as part of yourself, you consider their self-interest as part of your own, which is a very good trait for a society in evolutionary terms.
                What says that the notion of sentience as we have created it has ANY evolutionary meaning? Perhaps it is simply a self-serving definition we have created based on our currently incomplete knowledge, just as people created the notion of the soul. As for extending your acceptance further out from yourself, how does one explain the fact that the most social of beings would, under no one's definition, have any sentience? (Ants, termites, and other colony insects.)
                If you don't like reality, change it! me
                "Oh no! I am bested!" Drake
                "it is dangerous to be right when the government is wrong" Voltaire
                "Patriotism is a pernecious, psychopathic form of idiocy" George Bernard Shaw



                • #38
                  Re: Re: @ spiffor: resurrecting the animal rights thing

                  Originally posted by GePap
                  These are two different things. Your eyes are merely light receptors. They can do nothing but sense light. That is their whole evolutionary purpose. The image formed in the mind is just that, a representation within the mind, which might or might not correctly approximate the supposedly real objects being represented mentally.
                  ...

                  There's a difference between there being some physical representation of information in a nervous system and there being a conscious representation of something in the mind.

                  Pain is a sensation, but it is a specific form of sensation with, as you described, a very peculiar outcome, which is to warn of danger. Inherent in the notion of pain, then, are negative consequences for whatever is feeling pain if no action is taken to remedy the cause. Actually, it would be morally reprehensible to create something that actually feels pain and then cause it extreme pain to experiment. You would in fact be in the grand moral minority if you attempted to do so.


                  So when, say, NASA builds a robot with internal sensors that detect if something is broken, and then tests said sensors, it would be unethical?

                  It is highly unlikely that, lacking a complex nervous system, bacteria feel pain.


                  That's my point.

                  They may react to positive and negative stimuli, but pain is not just "negative stimuli".


                  Yes it is. Recall the difference between feeling and sensation.

                  My criterion for this is that *any* being that communicates the concept of sentience, without having had the concept imparted to it by another sentient being (such as in the case of an intelligent but nonsentient being that has read about consciousness), must be sentient. This is because I do not see any way an intelligent but nonsentient being could come up with the concept of sentience.


                  What? Most people could not communicate to you the idea of sentience, period. How does one communicate the notion of sentience anyway?


                  People have managed. And later I point out why I believe others are sentient even though they specifically have not communicated this idea to me.

                  And how would you know the difference? For example, in this case of the non-sentient intelligent being, how would a third-party observer, when hearing the being explain sentience, know whether perhaps, in the past, some other being taught it the notion?


                  So you're saying that because it's possible my experiment might become contaminated, my argument is flawed?


                  The first conclusion is from the first and second assumptions: human beings (in general; I will address the particular limits and exceptions later) are sentient. I know I am sentient. In addition, I know that there are other sentient humans, because of all the philosophers in the past who talked about it. Knowing from my second assumption that my sentience and theirs is the result of the construction of our brains, I begin to suspect that human brains are all fundamentally similar, in such a way as to make humans sentient. This is, I believe, reasonable.


                  How do you know you are sentient?




                  Part of BEING SELF-AWARE is knowing that you are self-aware. This is elementary.

                  You did not have the concept of sentience before it was taught to you (someone must have taught you the word, and without the word you could not have had the concept).

                  How do you know? Someone came up with it, therefore it MUST be that a concept can be derived without being taught.


                  The second conclusion is from the previous conclusion and all three assumptions: a computer can be sentient. Since a computer of sufficient size and power can simulate the behavior of a physical system containing a human being, it can simulate the behavior of that human being, and since consciousness is due only to the physical constitution of the human brain, it can simulate sentience. But then it would satisfy the criterion in the first assumption. The computer would, in effect, be a human being.


                  "Simulate sentience"? So you are saying that merely mimicking a behavior is proof that some machine understand and has internalized that behavior?


                  Any device that simulates perfectly the behavior of something is indistinguishable from that thing. There is no way for us to tell if we are in the universe as we know it or are merely some arbitrarily precise simulation on a very powerful computer.


                  Finally, I conclude that animals, except possibly our close primate relatives, are not sentient, simply because there's no reason to believe that they share the fundamental similarity that human brains have, and because I've never had the concept of consciousness communicated to me in any way by them.


                  As Spiffor said, your likely inability to understand what any alien being tells you makes that rather difficult.


                  Doesn't matter. The fact that my lack of faith makes it difficult for god to talk to me doesn't mean I should hold my disbelief in god in abeyance.

                  On top of that, similarity to a human brain in what respect? Do humans have any form of brain matter not found in other beings? Is your argument that anything below a certain number of connections between neurons means you have no form of sentience?


                  My argument is that we must have some sort of particular arrangement of neurons.


                  It may be objected that certain emotional displays of animals, especially mammals, suggest that they are sentient, but I see no necessary connection, either way, between sentience and such displays, especially given that it would be possible to create a relatively simple robot that would be furry and such and make identical displays while being incontrovertibly nonsentient. (This isn't currently possible mostly because we don't have access to quite as good motors and servos as nature has designed over millions of years.)


                  Why are you using the word "emotion"? You haven't defined what it means in your system. If you are using the standard definition, it rather undermines your argument, since complex emotions, such as love, are thought to be rather complex things; and if you are saying that animals can love, or show love, then you are placing a dagger into your own argument.


                  No, my point is that it ISN'T a dagger in my argument. There is no necessary connection between sentience and emotion. Emotion is simply a behavioral rule that evolved because it gets the animal to act in a certain evolutionarily good way.


                  It may also be objected that certain animals recognize themselves in a mirror, and therefore must be self-aware. If the term self-awareness is used so literally, it is not interchangeable with sentience, since there is no necessary or even plausible connection between image recognition and sentience.


                  Actually, self-awareness is crucial even for your stated notion of sentience. A being incapable of telling itself from another would not then really be able to judge the sentience of others, being unable to know which being it is while assuming itself to be sentient.


                  So what? You're saying it's necessary. I'm saying it's not sufficient. I don't care if it's necessary.

                  For example, a program could be written that identifies an object in a camera view based on images and 3D data provided to the program at the beginning ("look for this"), and then this program could be loaded onto a robot which is also the object the program is set to recognize. However, the same program would recognize something else as self if one simply gave it the data of another robot. In addition, I think it's just generally absurd to claim that any good image-recognition software is by definition sentient.


                  The issue is what occurs if that original image changes: will the being be able to process the change?


                  Yes.

                  For example, you make a robot with a blue, plastic, easily molded exterior. You show it itself in a regular mirror and program the software to recognize itself. Every time the robot finds itself in its current condition in front of a normal mirror, it will recognize itself. The question is this: turn the robot off, take the cover, make it red, and mold it into a very different shape. Then place this changed robot in front of a mirror and turn it back on. Under your previous example, would the robot be able to recognize this red, different-looking robot as itself?


                  That's my point, it wouldn't.

                  You recognize yourself in front of a mirror even after a haircut, and perhaps even with a new scar. It is highly unlikely that a mere robot programmed with a set image of itself would be able to recognize itself if some radical change occurred.


                  Humans have much more general image-recognition, and your own image generally doesn't change very much, and usually not in such drastic unexpected ways.

                  What says that the notion of sentience as we have created it has ANY evolutionary meaning?
                  Not I.



                  • #39
                    Re: Re: Re: @ spiffor: resurrecting the animal rights thing

                    Originally posted by Kuciwalker
                    There's a difference between there being some physical representation of information in a nervous system and there being a conscious representation of something in the mind.
                    There is NEVER such a thing as a "physical representation" of anything. Electrical impulses do not constitute "physical representations".


                    So when, say, NASA builds a robot with internal sensors that detect if something is broken, and then tests said sensors, it would be unethical?


                    This shows the weakness of your argument. Pain is NOT only a statement that something is wrong with the body. After all, you could have something wrong without any pain: the brain feels no pain whatsoever, so you could have a fatal tumor growing there and never know. That you think pain is merely a sensation telling you something is wrong shows your inability to comprehend the notion of feeling.


                    They may react to positive and negative stimuli, but pain is not just "negative stimuli".


                    Yes it is. Recall the difference between feeling and sensation.


                    No, it's not. I never agreed that your definitions were correct. If anything, the incorrectness of your definitions underpins the problems in your argument.



                    So you're saying that because it's possible my experiment might become contaminated, my argument is flawed?


                    No, I am saying that because you might misinterpret data, either your experiment or the theory underpinning it needs to be rethought.


                    How do you know you are sentient?




                    Part of BEING SELF-AWARE is knowing that you are self-aware. This is elementary.


                    Wait, you say later that self-awareness in and of itself is not enough for sentience...


                    How do you know? Someone came up with it, therefore it MUST be that a concept can be derived without being taught.


                    Correct, but NOT BY YOU. Your argument for why, if some human beings are sentient, all must be sentient is lacking. After all, perhaps the being who came up with sentience had some fundamental difference in brain architecture from others, and all the others would simply fit the intelligent but non-sentient category you say exists.


                    Any device that simulates perfectly the behavior of something is indistinguishable from that thing. There is no way for us to tell if we are in the universe as we know it or are merely some arbitrarily precise simulation on a very powerful computer.


                    So, someone could create a very complex robot whose sole purpose would be to espouse and pontificate about the notion of sentience, which according to you is by itself sufficient to prove sentience....


                    As Spiffor said, your likely inability to understand what any alien being tells you makes that rather difficult.


                    Doesn't matter. The fact that my lack of faith makes it difficult for god to talk to me doesn't mean I should hold my disbelief in god in abeyance.


                    If "god" created you, then why would said "god" make its creations incapable of undertstanding itself? (anymore than someone would create a robot but not give it the ability to understand himself). I am talking about an alien consciousness, which you do not adress with your "god" comment.


                    No, my point is that it ISN'T a dagger in my argument. There is no necessary connection between sentience and emotion. Emotion is simply a behavioral rule that evolved because it gets the animal to act in a certain evolutionarily good way.


                    Then how would you explain the different emotional reactions of human beings to the very same event, if an emotion is some pre-programmed "rule"? While one can posit an evolutionary reason for love, how would one go about explaining its real-world behavior?


                    That's my point, it wouldn't.


                    Yes, which is why they do the test on animals by adding a "deformity" to the animal to see if it still recognizes itself.


                    Humans have much more general image-recognition, and your own image generally doesn't change very much, and usually not in such drastic unexpected ways.


                    Since I doubt you are a zoologist, and I am not one either, neither of us can speculate on the complexity of human image recognition vs. that of other animals. But the fact is that even small changes can throw off recognition. Why is it that a wig and a set of glasses can, for the most part, fool people, however briefly, about someone's identity? Yet if someone stuck a wig on your head while you were asleep, you would know it was you in the mirror rather quickly.
                    If you don't like reality, change it! me
                    "Oh no! I am bested!" Drake
                    "it is dangerous to be right when the government is wrong" Voltaire
                    "Patriotism is a pernecious, psychopathic form of idiocy" George Bernard Shaw



                    • #40
                      Originally posted by Kuciwalker
                      No. You just don't understand relativism.
                      Yeah sure, no statement has a moral value, but mine do.
                      In Soviet Russia, Fake borises YOU.



                      • #41
                        Kuci:

                        Well, if we can't have a precise definition of consciousness, I fear we can't advance in the debate. Actually, I'm not even sure that I perfectly understand your definition of consciousness (which may have to do with the fact that the concept is difficult to grasp).

                        Since I think the debate on whether animals are conscious cannot progress, let's shift the issue: why should consciousness be the criterion that draws the line between having a right to humane treatment and not having it?
                        "I have been reading up on the universe and have come to the conclusion that the universe is a good thing." -- Dissident
                        "I never had the need to have a boner." -- Dissident
                        "I have never cut off my penis when I was upset over a girl." -- Dis



                        • #42
                          Re: Re: Re: Re: @ spiffor: resurrecting the animal rights thing

                          Originally posted by GePap
                          There is NEVER such a thing as a "physical representation" of anything. Electrical impulses do not constitute "physical representations".
                          ... yes they do, just as much as a pit in a CD is a physical representation of a 1 or 0.


                          So when, say, NASA builds a robot with internal sensors that detect if something is broken and then tested said sensors it would be unethical?


                          This shows the weakness of your argument. Pain is NOT only a statement that something is wrong with the body.


                          Yes it is.

                          After all, you could have something wrong without any pain: the brain feels no pain whatsoever, so you could have a fatal tumor growing there and never know.


                          That's a basic logical fallacy... I said pain -> something is wrong. You provide the example of something being wrong without pain. That doesn't do anything to my argument.

                          Actually, of course, I said pain is supposed to be a signal that something is wrong. It can fail to trigger, or be triggered incorrectly.

                          That you think pain is merely a sensation telling you something is wrong shows your inability to comprehend the notion of feeling.


                          Actually, it shows your inability to comprehend my argument. I point out specifically that pain is not just a sensation but also a feeling to conscious beings.

                          No, it's not. I never agreed that your definitions were correct.


                          Then stop arguing with me. I defined my terms for the purpose of the argument. It's meaningless to challenge my definitions, since I use them consistently. If you'd like, do a search-replace on all instances of "feel" and replace it with "oobly-wob".

                          If anything, the incorrectness of your definitions underpins the problems in your argument.

                          No, I am saying that because you might misinterpret data either your experiment or the theory underpinning it need to be rethought.


                          Ideally, I would set up an experiment so I could know conclusively. However, if Vulcans landed on my front yard and told me that they were sentient, I'd be inclined to take their word for it rather than be rigidly scientific.

                          Wait, you say later that self-awareness in and of itself is not enough for sentience...


                          No I don't. I say that self-awareness, when taken in the idiotically literal sense of "awareness of self" (as in, merely having some bit flipped in memory saying "I am myself" or whatever) is not equal to sentience.


                          How do you know? Someone came up with it, therefore it MUST be that a concept can be derived without being taught.


                          Correct, but NOT BY YOU.


                          How do you know? If an idea can be arrived at once independently, it can be arrived at many times independently.

                          Your argument for why, if some human beings are sentient, all must be sentient is lacking. After all, perhaps the being who came up with sentience had some fundamental difference in brain architecture from others, and all the others would simply fit the intelligent but non-sentient category you say exists.


                          Except that I know there are at least two, and very probably many, human beings that are sentient - me, and whoever came up with the idea in the first place. That sentience is such a constant puzzle among people suggests to me that many of them actually are sentient, and therefore sentience is probably universal among humans.


                          Any device that simulates perfectly the behavior of something is indistinguishable from that thing. There is no way for us to tell if we are in the universe as we know it or are merely some arbitrarily precise simulation on a very powerful computer.


                          So, someone could create a very complex robot whose sole purpose would be to espouse and pontificate about the notion of sentience, which according to you is by itself sufficient to prove sentience....


                          Eh, not quite. I know of no sentient being that SOLELY pontificates about sentience. However, someone could create a robot that acts exactly the same as a human, and such a robot would be sentient. This is a direct consequence of my second axiom.


                          As Spiffor said, your likely inability to understand what any alien being tells you makes that rather difficult.


                          Doesn't matter. The fact that my lack of faith makes it difficult for god to talk to me doesn't mean I should hold my disbelief in god in abeyance.


                          If "god" created you, then why would said "god" make its creations incapable of undertstanding itself?


                          Who knows? Why would God conform to any human sense of rational behavior?


                          No, my point is that it ISN'T a dagger in my argument. There is no necessary connection between sentience and emotion. Emotion is simply a behavioral rule that evolved because it gets the animal to act in a certain evolutionarily good way.


                          Then how would you explain the different emotional reactions of human beings to the very same event, if an emotion is some pre-programmed "rule"?


                          That it's a highly complex pre-programmed rule, that varies slightly between people? Placed in the same situation, Joan and Bismark might join an MPP or DOW, but that doesn't mean that their choices aren't simple rule-based ones.

                          While one can posit an evolutionary reason for love, how would one go about explaining its real-world behavior?


                          It's called neurochemistry.


                          That's my point, it wouldn't.


                          Yes, which is why they do the test on animals by adding a "deformity" to the animal to see if it still recognizes itself.


                          I doubt you could create such a different animal as you can such a different robot.


                          Humans have much more general image-recognition, and your own image generally doesn't change very much, and usually not in such drastic unexpected ways.


                          Since I doubt you are a zoologist, and I am not one either, neither of us can speculate on the complexity of human image recognition vs. that of other animals.
                          However, I can speculate on the complexity of human image recognition vs. that of robots, both being a human and having studied AI image recognition...

                          But the fact is that even small changes can throw off recognition. Why is it that a wig and a set of glasses can, for the most part, fool people, however briefly, about someone's identity? Yet if someone stuck a wig on your head while you were asleep, you would know it was you in the mirror rather quickly.
                          Because I notice that when I tell my arms to move, my arms move. A robot could be programmed this way too - but if you rewired it so that it no longer actually was wired up to any arms, it would be confused - just like a person - and this robot would definitely not be self-aware (in fact, it wouldn't really be much more complicated than the basic image-recognition program).
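
                          A hedged sketch of the contingency check described above (hypothetical names, Python, a crude simplification): the program issues motor commands and asks whether the motion it observes tracks them. Cut the wiring and the check fails, yet nothing deeper than a correlation test is involved.

                          Code:
                          # Hypothetical sketch: "that's me" = my motor commands predict the motion I observe.
                          import random

                          def observed_motion(command: float, wired_up: bool) -> float:
                              # If the motors are actually connected, the observed motion follows the command
                              # (plus a little noise); if the robot has been rewired, it does not.
                              if wired_up:
                                  return command + random.uniform(-0.05, 0.05)
                              return random.uniform(-1.0, 1.0)

                          def looks_like_me(wired_up: bool, trials: int = 50) -> bool:
                              hits = 0
                              for _ in range(trials):
                                  cmd = random.uniform(-1.0, 1.0)
                                  if abs(observed_motion(cmd, wired_up) - cmd) < 0.1:
                                      hits += 1
                              return hits / trials > 0.9   # commands reliably predict what is seen

                          print(looks_like_me(wired_up=True))   # usually True
                          print(looks_like_me(wired_up=False))  # usually False - the "confused" case above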

                          Originally posted by Fake Boris
                          Yeah sure, no statement has a moral value, but mine do.


                          No. No statement has some objective moral value. My statements do, however, have a subjective moral value.



                          • #43
                            Originally posted by Spiffor
                            Since I think the debate on whether the animals are conscious cannot progress, let's shifts the issue: Why should consciousness be the criterion that draws the line between having a right to humane treatment, and not having it?
                            Intelligence doesn't work, because there are plenty of stupid humans whom I would not condone killing. Simply being human doesn't work, because it's absurd to base morality on how close a certain long chemical chain is to a particular standard (the human genome). All that's left, as far as I can tell, is consciousness.



                            • #44
                              No. No statement has some objective moral value. My statements do, however, have a subjective moral value.
                              That's it. Why do you care about exposing them then?
                              Last edited by Fake Boris; May 22, 2005, 22:30.
                              In Soviet Russia, Fake borises YOU.



                              • #45
                                Originally posted by Immortal Wombat
                                That's just a much more complex response.
                                That depends on how broadly "response" is defined.

                                I can see a difference between humans and rabbits. Between humans and bonobos, on the other hand, it's a different story.

                                Originally posted by Immortal Wombat
                                Well obviously. The point is that they are artificial.
                                Of course, all rights are artificial.
                                (\__/) 07/07/1937 - Never forget
                                (='.'=) "Claims demand evidence; extraordinary claims demand extraordinary evidence." -- Carl Sagan
                                (")_(") "Starting the fire from within."

