Originally posted by MrFun
Oh wow -- I'm impressed by your higher standards in using ad hominems.
What I meant was I use ad hominems, and then back up my arguments with other facts, rather than just using an ad hominem as my argument. The ad hominems are just me venting my incredible frustration.
Ok -- just this once then, I will reduce your conviction from a felony to a misdemeanor.
A lot of Republicans are not racist, but a lot of racists are Republican.
Originally posted by The Viceroy
Hi Laz,
Certainly, it is not true to say ALL the countryside is for hunting, but I think there is a strong feeling within the countryside that it's just another thing Townies don't understand. It certainly is not supported only by toffs with horses, as I cannot ride a horse, nor could I afford to, yet I do not agree with banning it.
I am an extremely rural person and have lived in the deep countryside for most of my life. However, I'm from the rural working class - from a long line of hedgers-and-ditchers and small-pit colliers. Fox-hunting was very much the preserve of the upper classes, and for the most part it still is.
In the past they were a minority, and though the rural working class is practically extinct now, they're still a minority due to the influx of "townies" moving to the sticks.
My own experience is that people like myself hated the hunting set with a passion - I was certainly no exception. It was mutual too. Class war does play a part in this issue - that much is undeniable, but I don't think it sullies the cause unduly. In any event, the Country Sports lobby's attempt to depict blanket rural support for hunting is an outright lie.
I'm not sure about the experiences in Scotland; my understanding was they continued, but the kill is left for the gun...
That's right. The chases tend to be much shorter.
I still conclude that cash-strapped police will not have the resources to be bothered about it, and the hunts in this area will continue regardless. I wonder how many illegal hunts were stopped in Scotland? Unless monitored all the way, what's to stop the occasional pack of dogs going for the fox? Who would know anyway? It just seems unworkable to me.
If illegal fox-hunts take place, undoubtedly most will get away with it, but that's true of most other crimes too. It's not a valid argument for legalisation, or for failure to police the issue.
However it looks like most of the big hunts (such as the Beaufort) will switch to lure-hunting in order to keep their structure intact, and hold out to see if the laws are changed at a later date.
When they are justified on the same philosophical principle, it is.
A baseless assertion (that animal rights is a 'byproduct' of social liberalism), capped with a false belief (that animal rights is solely derived from the same 'philosophical roots'). Some religions take animal rights very seriously.
The only reason you are calling it a 'byproduct' is because you don't like it, and you can't accept that it has a genuinely useful role to play.
Originally posted by Kuciwalker
Now, because there seems to be some disconnect in communication between me and Spiffor (btw, I take back the "dense" remark, I was just incredibly frustrated) I'm going to go over my meaning and understanding of consciousness/self-awareness/sentience.
Apology accepted. I know the feeling.
Now, first, it should be obvious that by "consciousness" I don't mean "being awake". Also, when I say that someone is conscious of something or aware of something, I do not refer to merely having the information in the brain/whatever computational device the organism/machine uses. It's hard to phrase it properly, because it's such a fundamental concept... it's feeling your sensations... Spiffor, I'd like you to tell me if I'm making any sense here. I really don't think there are words to describe it.
It makes sense to me, but probably not the sense you'd want me to get. For the sake of my own clarity, I'll call this factor "ability to feel".
However, you should strive to make a good definition. In general, concepts that cannot be expressed clearly are cloudy concepts.
As a corollary to this, merely responding to "pain" (really just an electrochemical signal originating from a certain class of nerve endings, in biological organisms) is not proof of self-awareness.
Most definitely.
To explain further, I kinda have to go on a tangent.
[...]
This chain can be carried on to higher and higher levels, and so it is apparent that all systems, even living beings, even human beings, behave mechanically.
I have no sufficient physical knowledge to know whether the world is strictly deterministic or not. However, since it is your premise, I'll follow you.
Now, this leads to the question of free will - if your actions are determined, then you have no control over them and no free will and cannot be blamed for them. Clearly an electron cannot be blamed for anything, so why can a person? The problem with this thinking is that your behavior, while determined, is under your control, because your behavior is determined by yourself.
Not quite. If the world is purely deterministic, so is the entirety of your thought-process. Unless our brains are the only place in the universe where determinism doesn't apply, our behaviour is determined by the interactions between small bits of matter. Our human conceptions call it "ourselves", but that's nothing more than one simplification of reality (among many others which are needed in everyday life).
If some other person were "mind-controlling" you, you would be blameless, but you are not. The reason there is no free will in electrons is because they are not self-aware - and it is meaningless to assign will to something that is not self-aware. Only a consciousness can have a will. We may say a robot, for instance, behaves "as if it had a will of its own", but it doesn't, it just has a chip designed to act in a certain fashion.
But our brain is also designed to act in a certain fashion. You may repeat that "we have a will", and you can even believe that all humans have a will just like you have. But this belief is nothing more (according to your deterministic premise) than the manifestation of complex interactions between atoms.
Where does this consciousness come from? How do we know we have it? Well, I believe (though it is not entirely provable right now) that there is no dualism, no “spiritual” world to complement the “physical” world*. If you took a computer and ran a simulation of the behavior of all the particles in a human being, you would find that it would behave exactly as the actual human being would. This means it would also, if prompted, claim that it was conscious, and without prompting it would have the thoughts, in its mind, of consciousness. You can say that maybe electrons are conscious, but their consciousness has no effect on reality, but it’s clear that consciousness does have an effect on reality, because I know that I am conscious!
But wouldn't your thinking that you are conscious be merely your brain giving a complex answer to a series of stimuli? Just like the fox who pre-emptively avoids being maimed just reacts in a complex fashion to a series of stimuli?
Even if I did not speak a word of it, the evidence would be in the triggering of nerve impulses in my brain representing thoughts about consciousness. From this I conclude that consciousness isn’t just an emergent property of matter, but an emergent property of computation itself - a simulation of a conscious being would also be conscious.
Does computation = brain activity? If so, consciousness would appear only within the most developed 'brains', that is, adult-sized human brains and megacomputers of the future. A fetus or a newborn certainly doesn't have the same computing capacity as an 8-year-old (or older), far from it. We know that the human child sees dramatic increases in his computing ability in his first years, with the appearance of things like self-awareness in the general meaning, conceptual thought, etc. This means the fetus or the newborn is entirely unable to do so, for lack of brain power. Why would consciousness be different?
If consciousness is an emergent property of computation, or of matter performing computation, then it seems reasonable to assume that, since I am conscious, those whose brains operate practically similarly to mine (i.e. other humans) are also conscious.
Correction: other humans with an adult-sized brain.
cognito ergo sum.
That's cogito ergo sum (no 'n' in cogito).
And the cogito ergo sum idea is quite bunk when used alongside a deterministic view of the universe. The fact that we express our thought, that we even express our feelings... how is it different from a fox shrieking as it gets torn in half?
Both manifestations are a deterministic consequence of matter anyway. And you have no proof that even what I'm writing right now is anything more than an automatic response to external stimuli.
If I understand you correctly, your point can be summed up as follows:
- all animals including humans are machines, whose behaviour is determined by physical rules.
- humans, unlike all other animals (except maybe primates), have the ability to genuinely feel what is in other animals just a stimulus-reaction process.
- that's because humans have sufficient brain power to think, and most notably to think about their own existence. Ergo, they are.
- this brain power could be matched by formidable computers, which would also be conscious.
There's far too much noise in here now for me to explain clearly where your argument can be refuted (there wasn't that much noise when I wrote my post)
"I have been reading up on the universe and have come to the conclusion that the universe is a good thing." -- Dissident "I never had the need to have a boner." -- Dissident "I have never cut off my penis when I was upset over a girl." -- Dis
The game is a sort of universe, with its own physical laws, and two "inhabitants" - the players. Obviously, the computer and the person both exist in reality, but that's beside the point. The computer player, for all it cares, exists within the Chess universe, and moving a pawn like a knight would be a direct violation of the physical laws of that universe.
So you are comparing the rules of chess to the physical laws of the universe?
But anyway, that's my whole point - a computer is confined to the limitations of its program. It can't respond in any way, to anything, that it hasn't been built to respond to within its own little "universe".
If you want to argue that a dog is the same as this, only that its "programming" covers the entire extent of the natural laws of the universe and anything that can possibly happen within their context, then fine. There's your difference. Get some perspective.
Uh, no, you're mixing up the computer player and the computer program.
One and the same.
Or are computer players actually residents of the multi-dimensional chess universe who are summoned to our reality by the rift in space-time that a chess program creates?
3) a computer can too do all of that. We may not have built any that are particularly sophisticated in that way, but that's like someone arguing before 1900 that no computer will ever, say, be able to prove a mathematical theorem.
It's generally a bad idea to base your arguments on science-fiction novels.
Theoretically speaking, though, if an android could be built to be genuinely alive and conscious, why shouldn't they be recognized as such - along with foxes, gamecubes, and even humans. But we'll discuss that when or if that happens.
4) Any indictment you make against computers applies equally to dogs and people, because they are computers too! Everything is a computer that determines its behavior! Why are you still talking as if that were in doubt, when it's even true a priori! Aargh, you drive me nuts
Because, despite continuously saying this, you still contradict yourself and maintain that a human is somehow not a computer and that it has free will.
I wouldn't define them as such. You lose.
It doesn't matter how the hell you define the term - you could mean "teddy bear" by it - when you are trying to refute a statement by me. It only matters how I use the term.
That's the thing, isn't it? You "refuted" a statement made by me with your definition of the word.
i.e. I said that I would contest your claim that a psychopath has no empathy. You said I can't possibly do that because the almighty Kucinich proclaims that it is not, in fact, defined as such and so must not be contested!
But it doesn't "see" does it and that's the point - that's the bridge to gap. You've been making distinctions between seeing and "seeing" all along. I'm using your terminology.
Yes, well, you don't have to "see" from someone else's viewpoint to "put yourself in someone else's eyes" (i.e., figure out what their response to a situation would probably be). All you need is general knowledge of the stimuli to which they are exposed, how their minds work, and (as an aid to the previous) how they've responded in the past to similar stimuli.
Exactly what I was saying - they need to know how a mind works. This can't possibly be done without "genuine" perception or self-awareness.
But what is perception except a mechanical calculation of input?
Perception is what a conscious being does with stimuli, while "perception" is merely having the knowledge of the input in memory. When the mechanical calculation of the input is done in a certain way, it results in sentience and thus perception of what was merely "perceived".
And what does a conscious being "do" with stimuli that's so different?
Does it... maybe... make some calculations with it? What's so different about that?
Where does free will come into play if everything is a "determined" calculation?
Free will is the determined calculation of a sentient being, because the being determines itself. A machine also determines itself, but will is meaningless outside the context of sentience.
So where does sentience come from?
If both a machine and a "sentient machine" (i.e. human) determine themselves through calculations, what is the difference between them that makes one sentient and the other not?
And what does sentience even mean in this context, except perhaps an awareness that one's self is a machine and that all of one's thoughts - including one's "will" - are merely calculations in response to stimuli.
EDIT: I have to take a test, so I'll be back later. Maybe tomorrow, actually.
Originally posted by General Ludd
So you are comparing the rules of chess to the physical laws of the universe?
But anyway, that's my whole point - a computer is confined to the limitations of its program. It can't respond in any way, to anything, that it hasn't been built to respond to within its own little "universe".
So is the universe, for that matter. It's limited to its physical laws and its initial composition.
One and the same.
Not at all, except that usually the AI code is part of the same executable, but that's obviously not what you are talking about.
Imagine an FPS that you can play online MP. Then imagine someone writes code for a bot that will connect to the host and play in the MP game. The program run by the host is the "universe", and the humans and the AI bot are the players. Now imagine someone happened to package the code for an AI player along with the code for the "universe", and that instead of an FPS, it's Chess. The AI is still a player inside of the Chess universe.
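To make that packaging point concrete, here is a minimal sketch in Python (the names ChessUniverse, BotPlayer, etc. are hypothetical illustrations, not taken from any actual chess engine or the posts above) of a game "universe" that enforces its own rules and an AI "player" that exists only as a participant inside it, even though both can live in the same executable:

class IllegalMove(Exception):
    """Raised when a move would violate the 'physical laws' of the game universe."""


class ChessUniverse:
    """The host program: it owns the rules and the board state."""

    def __init__(self):
        # Toy starting position: just two pawns, enough to illustrate the point.
        self.board = {"e2": "white_pawn", "e7": "black_pawn"}
        self.history = []

    def legal_moves(self, colour):
        # Toy rule set: each pawn may only advance one square.
        return [("e2", "e3")] if colour == "white" else [("e7", "e6")]

    def apply(self, colour, move):
        # Inside this "universe", a pawn moving like a knight simply cannot happen.
        if move not in self.legal_moves(colour):
            raise IllegalMove(f"{move} violates the rules for {colour}")
        src, dst = move
        self.board[dst] = self.board.pop(src)
        self.history.append((colour, move))


class BotPlayer:
    """An AI player: logically separate from the universe, even if shipped
    in the same executable. It acts only through the universe's legal moves."""

    def __init__(self, colour):
        self.colour = colour

    def choose_move(self, universe):
        # Trivial policy: take the first legal move offered.
        return universe.legal_moves(self.colour)[0]


if __name__ == "__main__":
    universe = ChessUniverse()
    bot = BotPlayer("white")
    universe.apply("white", bot.choose_move(universe))
    print(universe.history)  # [('white', ('e2', 'e3'))]

The point of the sketch is the separation of roles: the universe enforces what can happen, while the bot, exactly like a human player, can only choose among the moves the universe permits.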
It's generally a bad idea to base your arguments on science-fiction novels.
Huh? I'm not doing that at all. Computers actually have proved (and devised!) mathematical theorems.
Theoretically speaking, though, if an android could be built to be genuinely alive and conscious, why shouldn't they be recognized as such - along with foxes, gamecubes, and even humans. But we'll discuss that when or if that happens.
They should. But everything a fox does can be handled by a program that is clearly not sentient.
Because, despite continuously saying this, you still contradict yourself and maintain that a human is somehow not a computer and that it has free will.
No, I say that a human is a computer that has free will. I'm saying that the two are not mutually exclusive.
That's the thing, isn't it? You "refuted" a statement made by me with your definition of the word.
No. I pointed out that empathy was not a necessary condition for sentience, because there are humans without empathy (I used the term "psychopaths") that are obviously self-aware. You claimed that being a psychopath didn't mean that you were lacking empathy. But I was using the word psychopath to mean "someone without empathy".
Exactly what I was saying - they need to know how a mind works. This can't possibly be done without "genuine" perception or self-awareness.
Not unless they need precise or exact knowledge of how their mind works - i.e., the ability to calculate the behavior of every particle in the brain. I said general. Obviously, even humans only deal with general knowledge of how someone else's mind works, because we can't simulate in our own minds another brain even at just the level of neurons, let alone fundamental particles.
And what does a conscious being "do" with stimuli that's so different?
Does it... maybe... make some calculations with it? What's so different about that?
It makes calculations that (among other things) result in perception of the stimuli.
Frankly, I have no clue why sentience evolved - my only guess is that it was an aid to empathy/extending the concept of "self" and thus an aid to the development of ever-larger social groups which ended up overrunning the smaller ones.
So where does sentience come from?
Calculation (and yes, I'll adress the objection to this in the response to the next quote).
If both a machine and a "sentient machine" (i.e. human) determine themselves through calculations, what is the difference between them that makes one sentient and the other not?
I have no clue whatsoever. I find it incredible that through pure calculation sentience can be achieved - for that matter, I find sentience's existence itself incredible. But it obviously does exist.
And what does sentience even mean in this context, except perhaps an awareness that one's self is a machine and that all of one's thoughts - including one's "will" - are merely calculations in response to stimuli.
Sentience does not mean that you are aware that you are determined (plenty of people are not, or dispute it, including you). It arises from determined calculations, but isn't necessarily aware that those calculations are determined. It is aware of the calculations themselves, because they are itself.
That's the thing, isn't it? You "refuted" a statement made by me with your definition of the word.
No. I pointed out that empathy was not a necessary condition for sentience, because there are humans without empathy (I used the term "psychopaths") that are obviously self-aware. You claimed that being a psychopath didn't mean that you were lacking empathy. But I was using the word psychopath to mean "someone without empathy".
Actually, I said that "being a psychopath doesn't necessarily imply a lack of empathy but..." and I then went on to my actual point, which was that the proper question was not whether sentient beings have to be empathic, but the opposite - whether empathic beings have to be sentient.
Whether psychopaths are empathic or not had no bearing on my point; it was just a footnote - a clarification that I didn't agree with you on that point, but...
Sentience does not mean that you are aware that you are determined (plenty of people are not, or dispute it, including you)
Following your argument, that is what sentience must imply, regardless of whether it is true in reality or not, or whether individual people perceive it as such. If sentience is self-awareness, and the self is determined, then being self-aware is being aware that you are determined.
It arises from determined calculations, but isn't necessarily aware that those calculations are determined. It is aware of the calculations themselves, because they are itself.
Or... is sentience the delusion that one's self is not determined, despite the reality that it is? That's an interesting way of defining it. One that I kind of like, actually.
But it doesn't explain why free will would be meaningful in this context.
Following your argument, that is what sentience must imply, regardless of whether it is true in reality or not, or whether individual people perceive it as such. If sentience is self-awareness, and the self is determined, then being self-aware is being aware that you are determined.
No - just because you are self-aware, does that mean you understand completely the working of your mind? No. It means you are aware of yourself, not that you have perfect knowledge of yourself (or at least, perfect knowledge of how you work).
Or... is sentience the delusion that one's self is not determined, despite the reality that it is? That's an interesting way of defining it. One that I kind of like, actually.
But it doesn't explain why free will would be meaningful in this context.
Because only a sentient being can intend to do something (and even a determined being can clearly have intentions and purposes), and intending to do something implies the will to do it.
Now, first, it should be obvious that by "consciousness" I don't mean "being awake". Also, when I say that someone is conscious of something or aware of something, I do not refer to merely having the information in the brain/whatever computational device the organism/machine uses. It's hard to phrase it properly, because it's such a fundamental concept... it's feeling your sensations... Spiffor, I'd like you to tell me if I'm making any sense here. I really don't think there are words to describe it.
It makes sense to me, but probably not the sense you'd want me to get. For the sake of my own clarity, I'll call this factor "ability to feel".
However, you should strive to make a good definition. In general, concepts that cannot be expressed clearly are cloudy concepts.
OK, how about this: when you look at something, you aren't aware of just a bunch of ones and zeroes like a computer camera, or a set of nerve impulses like an animal; you see a picture, in your mind.
Now, this leads to the question of free will - if your actions are determined, then you have no control over them and no free will and cannot be blamed for them. Clearly an electron cannot be blamed for anything, so why can a person? The problem with this thinking is that your behavior, while determined, is under your control, because your behavior is determined by yourself.
Not quite. If the world is purely deterministic, so is the entirety of your thought-process. Unless our brains are the only place in the universe where determinism doesn't apply, our behaviour is determined by the interactions between small bits of matter. Our human conceptions call it "ourselves", but that's nothing more than one simplification of reality (among many others which are needed in everyday life).
Actually, that's my point. Our thoughts are determined. But they are determined by ourselves. No one else comes in and determines what you think, you do. Those small bits of matter are you.
If some other person were "mind-controlling" you, you would be blameless, but you are not. The reason there is no free will in electrons is because they are not self-aware - and it is meaningless to assign will to something that is not self-aware. Only a consciousness can have a will. We may say a robot, for instance, behaves "as if it had a will of its own", but it doesn't, it just has a chip designed to act in a certain fashion.
But our brain is also designed to act in a certain fashion. You may repeat that "we have a will", and you can even believe that all humans have a will just like you have. But this belief is nothing more (according to your deterministic premise) than the manifestation of complex interactions between atoms.
Except those atoms happened to be arranged such that, given appropriate stimuli, my brain can actually reach logical conclusions.
Where does this consciousness come from? How do we know we have it? Well, I believe (though it is not entirely provable right now) that there is no dualism, no “spiritual” world to complement the “physical” world*. If you took a computer and ran a simulation of the behavior of all the particles in a human being, you would find that it would behave exactly as the actual human being would. This means it would also, if prompted, claim that it was conscious, and without prompting it would have the thoughts, in its mind, of consciousness. You can say that maybe electrons are conscious, but their consciousness has no effect on reality, but it’s clear that consciousness does have an effect on reality, because I know that I am conscious!
But wouldn't your thinking that you are conscious be merely your brain giving a complex answer to a series of stimuli? Just like the fox who pre-emptively avoids being maimed just reacts in a complex fashion to a series of stimuli?
I don't just think that I am conscious; I experience my consciousness. Yes, it is the result of my brain making calculations, but that doesn't mean that any instance of a brain making calculations results in consciousness.
Even if I did not speak a word of it, the evidence would be in the triggering of nerve impulses in my brain representing thoughts about consciousness. From this I conclude that consciousness isn’t just an emergent property of matter, but an emergent property of computation itself - a simulation of a conscious being would also be conscious.
Does computation = brain activity? If so, consciousness would appear only within the most developed 'brains', that is, adult-sized human brains and megacomputers of the future. A fetus or a newborn certainly doesn't have the same computing capacity as an 8-year-old (or older), far from it. We know that the human child sees dramatic increases in his computing ability in his first years, with the appearance of things like self-awareness in the general meaning, conceptual thought, etc. This means the fetus or the newborn is entirely unable to do so, for lack of brain power. Why would consciousness be different?
It's not computational power that is required. A litre of water has immense computational power. It just has relatively useless (to us) programming. It is the basic programming of the brain, which is practically similar between all humans, that results in consciousness.
cognito ergo sum.
That's cogito ergo sum (no 'n' in cogito).
And the cogito ergo sum idea is quite bunk when used alongside a deterministic view of the universe. The fact that we express our thought, that we even express our feelings... how is it different from a fox shrieking as it gets torn in half?
Both manifestations are a deterministic consequence of matter anyway. And you have no proof that even what I'm writing right now is anything more than an automatic response to external stimuli.
It is an automatic response to external stimuli. However, it is also the product of a conscious mind. I know that I am conscious, because I experience it directly. Just like I know that I see a computer screen in front of me, even if it's just an illusion or hologram, because it's not a statement about the world around me, but a statement about my immediate sensations.
Following your argument, that is what sentience must imply, regardless of whether it is true in reality or not, or whether individual people percieve it as such. If Sentience is self awareness, and self is determined, then being self-aware is being aware that you are determined.
No - just because you are self-aware, does that mean you understand completely the working of your mind? No. It means you are aware of yourself, not that you have perfect knowledge of yourself (or at least, perfect knowledge of how you work).
I was saying nothing of what a person might be aware of (beyond themselves - whatever that may be); I'm talking about what the real implications of a sentience derived from computation are.
Because only a sentient being can intend to do something (and even a determined being can clearly have intentions and purposes), and intending to do something implies the will to do it.
Not in that context it doesn't.
If sentience is a delusion that you are not a machine (despite the fact that you are), then an intention implies no will, merely a delusion of will - a delusion that the calculations (which are what really make up that "intention") are not calculations at all but are, instead, a choice made by your "spirit", "soul", or "duality" (which does not actually exist).
I really do like this idea, though. All life is mechanical, there is no free will, and sentience is a malfunction.
I was saying nothing of what a person might be aware of (beyond themselves - whatever that may be); I'm talking about what the real implications of a sentience derived from computation are.
You are saying that sentience being derived from computation implies, logically, that self-awareness is being aware that one is determined. However, that is not the case at all (that A implies B) because self-awareness and sentience do not mean total awareness of every aspect of the self.