Not treating them badly is one thing. Saying they should have rights is stupid on a liberal level.
Chimpanzees granted 'legal persons' status to defend their rights in court
-
Originally posted by kentonio: I would certainly expect that our species is more intelligent, but it seems pretty much beyond question at this point that chimps are a highly intelligent species with extremely complex minds and the ability to form complicated social structures. We can even achieve levels of communication with them that we've never managed with other species. I don't think anyone is advocating that we give them driving licenses and the right to vote, but giving them the right not to be treated like an object to be locked up, mistreated or killed on a whim seems like a pretty reasonable thing to expect. They're one of our closest relatives, and if we treat them like ****, what does that say about us?
(Why is it wrong to hurt a chimp, but okay to hurt a dumber and less sophisticated animal? Where is the cognitive threshold for moral worth, above a cow but below a chimp, where it becomes wrong, and why did you put it there?)
-
I am not replying to Sava here, I just want to give a rant that I have been wanting to give for a while.
So-called intelligence tests measure pattern matching skill in certain situations. This is true for all the types of intelligence tests that people discuss (including IQ). Now, I agree that pattern matching is a useful skill, but it isn't equivalent to intelligence. There have always been some types of pattern matching situations that humans are worse at than some particular animal, but that doesn't mean that animal is more intelligent than humans. A chameleon can order its skin to arrange crystals with different indexes of refraction in just the right way to mimic its environment. It would take me years to learn to do that (I think), and that is without considering the timescale on which the chameleon achieves it.
To try to improve things, many humans try to make their pattern matching tests abstract. I agree that being able to think abstractly is a signature of intelligence, so I think that is an improvement, but it obviously doesn't encapsulate intelligence.
What is an obvious (though not the only) signature of intelligence? The ability to learn new things (and skill at doing so). A human toddler is way better than any monkey at this; I would put the monkeys at the level of a human baby (maybe 6 months?) at this skill. Take the learning demonstrated by the bonobos that Proteus referred to. Even at 9 months, the ability to learn to open nuts with a hammer and anvil isn't something I would announce to the world that my daughter had just learned, even if I were announcing something every day. If you go down to every hour, then I might announce it (and I think she would learn how to do that earlier than 9 months).
They are bad at learning things, even if they are infinitely better at it than 'artificial intelligence' (currently).
Where your (Proteus, Sava?, and others who think similarly) thinking goes wrong is that you don't have to put Homo sapiens sapiens through 100k years to learn how to fly a jet or discover physics or create a band or what have you. You just have to take that Homo sapiens sapiens and put him in our environment, and he will learn all of it. That is because he is intelligent and bonobos are not.
We have even had this test play out: take children from some human stone-age tribe, and they can end up flying jets, discovering physics, creating bands, etc.
This is why I disagree with the intelligent people who believe we have anything to fear from AI (for example, Hawking and Musk).
AI has just been developed to be more skilful than humans at pattern matching in certain classes of problems (certain situations). That is similar to the dumbest of animals. No computer has displayed any real ability to learn.
Yes, it makes sense that learning and pattern matching skill are related. But there is something very, very important required in addition to pattern matching before you get to real learning.
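To illustrate the kind of "pattern matching in certain classes of problems" described above, here is a minimal toy sketch (a hypothetical example, not anything from this thread): a nearest-neighbour classifier labels new inputs by matching them against stored examples, but it has no ability to pick up a new kind of task; it only matches patterns within the one problem class it was set up for.

```python
# Toy nearest-neighbour classifier: pure pattern matching.
# It labels a new point by finding the most similar stored example.

def distance(a, b):
    # Squared Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_neighbour_label(examples, point):
    # examples: list of (feature_vector, label) pairs.
    # Returns the label of the closest stored example.
    best_features, best_label = min(
        examples, key=lambda ex: distance(ex[0], point)
    )
    return best_label

# Stored "experience": 2D points labelled by which side they fall on.
examples = [((0.0, 0.0), "left"), ((1.0, 0.0), "right"),
            ((0.1, 0.2), "left"), ((0.9, 0.1), "right")]

print(nearest_neighbour_label(examples, (0.05, 0.1)))  # left
print(nearest_neighbour_label(examples, (0.95, 0.0)))  # right
```

Such a matcher can be made very skilful within its one task, but asking it to do anything else requires a human to rebuild it from scratch, which is the distinction drawn above between pattern matching and real learning.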
JM
I AM.CANADIAN
GENERATION 35: The first time you see this, copy it into your sig on any forum and add 1 to the generation. Social experiment.
-
Originally posted by Jon Miller: So-called intelligence tests measure the pattern matching skill in certain situations. [...]
I drank beer. I like beer. I still like beer. ... Do you like beer, Senator?
- Justice Brett Kavanaugh
-
Originally posted by Jon Miller: ... you don't have to put Homo sapiens sapiens through 100k years to learn how to fly a jet or discover physics or create a band or what have you. You just have to take that Homo sapiens sapiens and have him in our environment and he will learn all of it. [...]
-
Originally posted by Elok: If we don't, it only says that we've won enough to safely wax sentimental about the losers. We did the same thing with the Native Americans.
(Why is it wrong to hurt a chimp, but okay to hurt a dumber and less sophisticated animal? Where is the cognitive threshold for moral worth, above a cow but below a chimp, where it becomes wrong, and why did you put it there?)
-
Originally posted by kentonio: It's not ok to hurt a less sophisticated animal...
Click here if you're having trouble sleeping.
"We confess our little faults to persuade people that we have no large ones." - François de La Rochefoucauld
-
Ah, well. So much for that line of attack.
Click here if you're having trouble sleeping.
"We confess our little faults to persuade people that we have no large ones." - François de La Rochefoucauld
-
The danger with AI is that if we ever do cross that threshold between "not really learning" and "learning" (in the sense that a child can learn) ... it would be able to develop extremely fast, and without upper bounds nearly as restrictive as the ones humans are stuck with.
You know at some point someone is going to say, "hey, what happens if we hook this up to the internet?" or "you know ... we could make a ton of money hooking this up to the internet!" At that point, any system connected to the internet is potentially part of that entity. What the outcome of that would be is hard to say, but I'd guess something along the lines of "A LOT" and "VERY FAST". Dangerous is a fair description.
-
Originally posted by Lorizael: Chimps may be more intelligent than they first appear (especially because ... No chimps ... argue over the internet ...
-
Originally posted by Aeson: The danger with AI is that if we ever do cross that threshold between "not really learning" and "learning" (in the sense that a child can learn) ... [...]
“It is no use trying to 'see through' first principles. If you see through everything, then everything is transparent. But a wholly transparent world is an invisible world. To 'see through' all things is the same as not to see.”
― C.S. Lewis, The Abolition of Man
-
Originally posted by Aeson: The danger with AI is that if we ever do cross that threshold between "not really learning" and "learning" (in the sense that a child can learn) ... [...]