Queen bans fox hunting!


  • Originally posted by General Ludd
    Up until now you've been claiming that foxes were machines, and humans were different.


    NO! I've been saying both are machines, but humans are self-aware machines!



    But, reading your other post, I see you're not saying that. You're now claiming that humans are "machines" which control themselves and have the ability to make conscious choices, while foxes are machines with no control over themselves and are slaves to their environment.


    Foxes do have "control" over themselves in the sense that an electron has "control" over itself. However, it is meaningless to talk about the will (and thus the free will) of a nonsentient being.

    And yet this still does not address any of my scenarios, like the obvious choice a dog has in deciding whether or not to obey its master. Why is the decision a dog makes when presented with multiple courses of action not a conscious choice, but when a human makes the same choice, it is?


    1) because I know directly that I'm conscious.

    2) because I can extrapolate from my consciousness to the consciousness of other humans.

    All of which I said in that post you supposedly read.

    If a dog was not capable of choice, how could it be trained? It should mechanically behave according to its species and, as a result, either all dogs would be untrainable or all dogs could be easily "trained" in exactly the same way by quickly exposing them to a series of "stimuli". Neither is true.


    Uh, you don't understand anything at all, do you? It behaves mechanically, just like humans, according to the particles of which it is made up (and the external influences exerted on it). It's a big computer that determines how it behaves. It is possible to have algorithms that learn and can be trained.
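    To make the "algorithms that learn and can be trained" point concrete, here is a minimal, purely illustrative Python sketch (nothing from the original post; the action names and reward values are invented). Like a dog rewarded for sitting, the agent becomes more likely to repeat whichever response gets reinforced:

```python
import random

# Toy "trainable" agent: it starts with no preference among its possible
# responses, and reinforcement gradually shifts its behaviour, loosely
# mirroring reward-based dog training.
class ToyLearner:
    def __init__(self, actions, learning_rate=0.1):
        self.actions = list(actions)
        self.weights = {a: 0.0 for a in self.actions}  # equal initial preference
        self.learning_rate = learning_rate

    def act(self):
        # Explore occasionally; otherwise pick the currently preferred action.
        if random.random() < 0.1:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.weights[a])

    def reinforce(self, action, reward):
        # Nudge the performed action's weight toward the reward received.
        self.weights[action] += self.learning_rate * (reward - self.weights[action])

if __name__ == "__main__":
    dog = ToyLearner(["bark", "run off", "sit"])
    for _ in range(200):
        action = dog.act()
        # The "trainer" rewards sitting and mildly discourages everything else.
        dog.reinforce(action, reward=1.0 if action == "sit" else -0.1)
    print(dog.weights)  # "sit" ends up with the highest weight
```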

    Unless, of course, every dog's brain is a vastly different machine and operates on different principles (or perhaps laws of physics, as you suggested previously).


    Every dog's brain is practically similar but specifically different, because the genes in dogs produce the same general structure. Because they have slightly different genes and sometimes vastly different experiences (and because of other purely arbitrary differences in starting conditions), the brains are not the same in every detail. Because of this, it will be easier to train some dogs than others. But this should be frigging obvious: there are individual differences between individuals.

    But if you believe this, why would you also believe that "If consciousness is an emergent property of computation, or of matter performing computation, then it seems reasonable to assume that, since I am conscious, those whose brains operate practically similarly to mine (i.e. other humans) are also conscious"?


    Because I don't believe that each is radically different, as I said in the part you just quoted. I believe that all human brains are practically similar, and all dog brains are practically similar, etc.

    And, since I'm thinking about it, why do you find the thought of consciousness so significant? Couldn't it just be dismissed as the mechanical answer to the question of life, being, or self? What's so special about a "machine" being able to answer that yes, it is in fact a machine?


    Huh?

    Consciousness is significant because it is meaningless to talk about the will of an unconscious being.

    Going by your argument (that consciousness is an "emergent property of computation itself"), someone thinking that they are conscious is merely an acknowledgement that they are basically a computer and their thoughts - including this realization - are merely the results of computations and outside of their control.


    No, because they do the computations! But how does that follow anyway, that thinking that you are conscious leads logically to the realization that one behaves mechanically? You think you are conscious, and you haven't seemed to realize that yet.

    And, in turn, this thought that it is outside of their control is again merely another answer, and a further acknowledgement that they are nothing but a computer acting on calculations. Shouldn't it be assumed that any "machine", whether it is human or fox, should be able to come to these same basic, mechanical answers?


    No more than a Gamecube should be able to be an internet server.



    • Originally posted by Dauphin
      I'm still interested to know (no post has responded to my original question) whether you, Kuci, think that during the evolution of man there were (or are) sentient (proto-)humans living alongside non-sentient humans. Assuming you do not retract the quantised concept of either having or not having sentience.
      Yes, I would assume that at one point, a sentient proto-human suddenly was born. However, while sentience is either yes or no, you can still be "more" sentient in the sense that you have greater logical capabilities and are therefore able to derive more logical conclusions from it.



      • Originally posted by Kuciwalker
        I didn't say it made a convincing imitation of life; I said it exhibited pain.


        Unless it can convince me it is feeling pain, I won't believe it is feeling pain.

        And the fact that, so far, we haven't built one doesn't have any effect on a theoretical discussion.


        The question of whether or not animals feel pain is a practical question, not a theoretical one.
        Christianity: The belief that a cosmic Jewish Zombie who was his own father can make you live forever if you symbolically eat his flesh and telepathically tell him you accept him as your master, so he can remove an evil force from your soul that is present in humanity because a rib-woman was convinced by a talking snake to eat from a magical tree...



        • Originally posted by chegitz guevara
          Originally posted by Kuciwalker
          I didn't say it made a convincing imitation of life; I said it exhibited pain.


          Unless it can convince me it is feeling pain, I won't believe it is feeling pain.
          So if I did build a robot and put fur on it, and its motion was articulated enough to look like a fox, and then I had it writhing on the ground and howling, would you say it was in pain?

          And the fact that, so far, we haven't built one doesn't have any effect on a theoretical discussion.


          The question of whether or not animals feel pain is a practical question, not a theoretical one.
          Not when we're talking about consciousness.



          • Originally posted by Kuciwalker

            Foxes do have "control" over themselves in the sense that an electron has "control" over itself. However, it is meaningless to talk about the will (and thus the free will) of a nonsentient being.
            That's kind of the point: You can't discuss the will of a nonsentient being. Therefore, if you can discuss the will of a being, it must be sentient.

            If a dog was not capable of choice, how could it be trained? It should mechanically behave according to its species and, as a result, either all dogs would be untrainable or all dogs could be easily "trained" in exactly the same way by quickly exposing them to a series of "stimuli". Neither is true.


            Uh, you don't understand anything at all, do you? It behaves mechanically, just like humans, according to the particles of which it is made up (and the external influences exerted on it). It's a big computer that determines how it behaves. It is possible to have algorithms that learn and can be trained.

            Unless, of course, every dog's brain is a vastly different machine and operates on different principles (or perhaps laws of physics, as you suggested previously).


            Every dog's brain is practically similar but specifically different, because the genes in dogs produce the same general structure. Because they have slightly different genes and sometimes vastly different experiences (and because of other purely arbitrary differences in starting conditions), the brains are not the same in every detail. Because of this, it will be easier to train some dogs than others. But this should be frigging obvious: there are individual differences between individuals.

            But if you believe this, why would you also believe that "If consciousness is an emergent property of computation, or of matter performing computation, then it seems reasonable to assume that, since I am conscious, those whose brains operate practically similarly to mine (i.e. other humans) are also conscious"?



            Because I don't believe that each is radically different, as I said in the part you just quoted. I believe that all human brains are practically similar, and all dog brains are practically similar, etc.
            It's not a matter of how easy a dog is to train, it is a matter of what methods they respond to, and whether they can even be trained at all. Surely, if the brains are "practically similar" and everything they do is a mechanical reaction to stimuli, then it should be possible to breed dogs in a creche where they all experience more or less the same thing and are exposed to the same things, and be able to train them easily and quickly, with a slim margin of failure related to the minute individual differences of each brain outside of the "same general structure".

            So, again I ask you: if individual brains are different enough that some individuals can be trained to certain tasks through stimuli while others do not react to the stimuli at all, how can you be certain that all humans' brains function in a "practically similar" fashion to yours?

            And, since I'm thinking about it, why do you find the thought of consciousness so significant? Couldn't it just be dismissed as the mechanical answer to the question of life, being, or self? What's so special about a "machine" being able to answer that yes, it is in fact a machine?


            Huh?

            Consciousness is significant because it is meaningless to talk about the will of an unconscious being.
            I meant, why is thinking "I am conscious" so significant a feat?

            Going by your argument (that consciousness is an "emergent property of computation itself"), someone thinking that they are conscious is merely an acknowledgement that they are basically a computer and their thoughts - including this realization - are merely the results of computations and outside of their control.


            No, because they do the computations! But how does that follow anyway, that thinking that you are conscious leads logically to the realization that one behaves mechanically? You think you are conscious, and you haven't seemed to realize that yet.
            You said that consciousness "emerges" from computation. Therefore, realizing that you are conscious is the same as acknowledging that your thoughts are computations, and that that realization in itself was merely the result of a 'mechanical' computation.

            No more than a Gamecube should be able to be an internet server.
            So now you're comparing foxes to gamecubes? Okey dokey.

            You claim that all life forms are machines making calculations, and then claim that self-awareness is not necessarily needed for consciousness. But self-awareness is all that a machine would need to answer the question which results in the answer, "conscious" that of: "what am I?"

            And that question of "what am I?" is in turn a question - a mechanical response - to being able to recognize one's self and one's influence on the surrounding environment and on others, i.e. self-awareness.
            Rethink Refuse Reduce Reuse

            Do It Ourselves



            • It was a long time ago and connected to the original topic...

              If this is simply to be banned in England and Wales, can't the hunters just go up to Scotland to do this?

              Scotland banned hunting with dogs some time ago. The result, however, has been a huge increase in the number of foxes killed. This is because they're now hunted with guns and beaters, which is much more effective than dogs. Less cruel, possibly, but approximately 90% successful according to the article I read, compared to 40-60% with dogs.

              Anyway, back to robots and self-awareness...



              • the queen has the power?

                What bunch of power-mad leftie pervies loves banning things?
                Is the fox to be a protected species? Do socialists have coherent thought?
                As the above post has pointed out, more foxes will be killed as a result of a ban. Even now, far more of them meet violent death by other means.
                I wonder how many of the anti-fox-hunting persons think nothing of killing a fly? How many of them drive motor cars? Everything we do kills other creatures. Do they realise this?



                • Originally posted by General Ludd
                  That's kind of the point: You can't discuss the will of a nonsentient being. Therefore, if you can discuss the will of a being, it must be sentient.


                  Not quite. It's kind of like evolution and design. Even though something that evolved naturally wasn't designed, you can say "X was designed to do this". Similarly, you could say that a fox intends to do X, without necessarily implying will, because it's easier to say. Language is usually imprecise.

                  It's not a matter of how easy a dog is to train, it is a matter of what methods they respond to, and whether they can even be trained at all. Surely, if the brains are "practically similar" and everything they do is a mechanical reaction to stimuli, then it should be possible to breed dogs in a creche where they all experience more or less the same thing and are exposed to the same things, and be able to train them easily and quickly, with a slim margin of failure related to the minute individual differences of each brain outside of the "same general structure".


                  Except that the mechanism derives from the arrangement of subatomic particles, and the interactions between them are chaotic (small changes can produce large deviations), meaning it is impossible to actually expose them to the same environment.
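                  To illustrate the "small changes can produce large deviations" point, here is a standard toy example (the logistic map; purely illustrative, not something from this thread) in Python: two starting values differing by one part in a billion end up on completely different trajectories within a few dozen steps.

```python
# Sensitivity to initial conditions via the logistic map
# x -> r * x * (1 - x), with r = 4.0 (a chaotic regime).
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)  # differs by about one part in 10^9

for step in (0, 10, 30, 50):
    print(step, abs(a[step] - b[step]))
# The gap starts around 1e-9 and grows to order 1 well before step 50,
# so exposing two brains to exactly "the same" environment is unreachable
# in any precise sense.
```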

                  So, again I ask you: if individual brains are different enough that some individuals can be trained to certain tasks through stimuli while others do not react to the stimuli at all, how can you be certain that all humans' brains function in a "practically similar" fashion to yours?


                  See above. Plus the fact that the same general techniques work on people, with varying amounts of success.

                  I meant, why is thinking "I am conscious" so significant a feat?


                  Because that is the minimum requirement for being conscious.

                  You said that consciousness "emerges" from computation. Therefore, realizing that you are conscious is the same as acknowledging that your thoughts are computations, and that that realization in itself was merely the result of a 'mechanical' computation.


                  Huh?

                  Consciousness is an emergent property of computation, much like planets, life, etc. are emergent properties of matter - it doesn't mean that realizing that one is conscious (the same as being conscious) is an acknowledgement that your thoughts are computations. You can be conscious and think whatever the hell you want, except you will constantly perceive your own consciousness.

                  So now you're comparing foxes to gamecubes? Okey dokey.

                  You claim that all life forms are machines making calculations, and then claim that self-awareness is not necessarily needed for consciousness.


                  WHAT THE HELL?!

                  Can you read?

                  Self-awareness is consciousness!

                  But self-awareness is all that a machine would need to answer the question which results in the answer, "conscious" that of: "what am I?"


                  Could you please write sentences that make some sort of sense within English grammar?

                  And that question of "what am I?" is in turn a question - a mechanical response - to being able to recognize one's self and one's influence on the surrounding environment and on others, i.e. self-awareness.
                  Point?



                  • other creatures

                    There seem to be persons who think that only humans are 'self-aware'. These persons are deluded. To think that a fly (for example) is not in this state of self-awareness is to be truly ignorant.
                    Have you ever watched a fly groom itself? I suggest you do.



                    • You said that consciousness "emerges" from computation. Therefore, realizing that you are conscious is the same as acknowledging that your thoughts are computations, and that that realization in itself was merely the result of a 'mechanical' computation.


                      Huh?

                      Consciousness is an emergent property of computation, much like planets, life, etc. are emergent properties of matter - it doesn't mean that realizing that one is conscious (the same as being conscious) is an acknowledgement that your thoughts are computations. You can be conscious and think whatever the hell you want, except you will constantly perceive your own consciousness.
                      As planets or life are an emergent property of matter, they are also made of matter.

                      If consciousness is born of computation, it must also be made of computation. And so in that respect, saying "I am conscious" is in effect nothing more than acknowledging or confirming that your thoughts are calculated responses to stimuli - it's recognizing yourself for what you are. But being aware that your thoughts are computational responses does not mean that you have any control over them. And if it did, would they still be computational responses?




                      So now you're comparing foxes to gamecubes? Okey dokey.

                      You claim that all life forms are machines making calculations, and then claim that self-awareness is not necessarily needed for consciousness.


                      WHAT THE HELL?!

                      Can you read?

                      Self-awareness is consciousness!
                      My bad, I see now that you included it with consciousness instead of separating it like I had thought. I was thinking of when you were talking about the "various degrees" of self-awareness.

                      But if you don't think that other animals (except perhaps primates) are self-aware, then I again go back to my example of training a dog and ask how a dog is capable of leading a blind man down a street safely if it is not first aware of itself, its surroundings, how other beings perceive it, and how it can influence its environment.

                      What else is needed for you to believe that something is self-aware? These abilities put together would indicate that a dog is fully capable of recognizing itself as a unique being separate from other individual beings.
                      Rethink Refuse Reduce Reuse

                      Do It Ourselves



                      • Originally posted by Kuciwalker
                        Newsflash: pain is the feeling - the awareness - of certain nerves firing.

                        What distinguishes the "pain" of a fox from the "pain" of a tree?
                        Trees don't have nerves. Isn't that simple?
                        (\__/) 07/07/1937 - Never forget
                        (='.'=) "Claims demand evidence; extraordinary claims demand extraordinary evidence." -- Carl Sagan
                        (")_(") "Starting the fire from within."



                        • Originally posted by Kuciwalker
                          Brain activity, and the fact that they are human brains. I am sentient, and my brain is practically similar to other humans' brains, therefore they ought to be sentient as well.
                          How do you show that you are sentient?

                          Originally posted by Kuciwalker
                          In addition, sentience does not seem to depend on pure computational power, but actual constitution, so it makes sense that the brains of fetuses, once active, would also be sentient, because they have the same pattern as my brain.
                          That's not true. Or rather, the devil is in the details.

                          No two persons' "patterns" are the same - by "pattern" I am presuming you mean the arrangement of the neurons in your brain and their interconnections.

                          Now, even though the "patterns" of any two random persons are largely the same, it's the minor differences that cause the big differences in mental abilities.
                          (\__/) 07/07/1937 - Never forget
                          (='.'=) "Claims demand evidence; extraordinary claims demand extraordinary evidence." -- Carl Sagan
                          (")_(") "Starting the fire from within."



                          • Originally posted by Sandman
                            Rights are just something humans invented, and we can extend them to some animals some of the time, if we feel like it. No reason is necessary.
                            Exactly.
                            (\__/) 07/07/1937 - Never forget
                            (='.'=) "Claims demand evidence; extraordinary claims demand extraordinary evidence." -- Carl Sagan
                            (")_(") "Starting the fire from within."



                            • Originally posted by Kuciwalker
                              Robots can be trained. You lose.
                              This is most definitely untrue, if "training" is "learning new skills from experience." No robot can do that. In fact, I am not sure if there are robots that can improve known skills from experience.

                              Originally posted by Kuciwalker
                              Moreover, animals are machines as much as robots. They are very, very complicated machines, but they behave mechanically.
                              Humans are also robots. What's your point?

                              Originally posted by Kuciwalker
                              Dogs hear in the same way that your computer microphone hears. They do not hear, in their mind (not their brain, their mind), the sound.
                              This is wrong, as a computer microphone does not hear. It merely converts one form of energy into another.

                              Originally posted by Kuciwalker
                              When you see something, you don't just register a bunch of intensities. There's a part of you, your consciousness, that actually sees the image, this thing that is aware.
                              An animal does not need to be self-aware to be able to interpret signals in its head. A dog actually recognises different things, such as its master, other people, food, cats (nemesis), etc.
                              (\__/) 07/07/1937 - Never forget
                              (='.'=) "Claims demand evidence; extraordinary claims demand extraordinary evidence." -- Carl Sagan
                              (")_(") "Starting the fire from within."



                              • Originally posted by Kuciwalker
                                Except that the mechanism derives from the arrangement of subatomic particles, and the interactions between them are chaotic (small changes can produce large deviations), meaning it is impossible to actually expose them to the same environment.
                                The question is whether these small variations have any meaning at the macroscopic level.

                                Originally posted by Kuciwalker
                                Consciousness is an emergent property of computation [snipped]
                                Not quite. Our brain works sufficiently differently from a computer that it is rather meaningless to apply the term "computation" as we define it. In the same vein, "algorithms" and "protocols" don't apply, either.
                                (\__/) 07/07/1937 - Never forget
                                (='.'=) "Claims demand evidence; extraordinary claims demand extraordinary evidence." -- Carl Sagan
                                (")_(") "Starting the fire from within."

