The Death of a Sentient Machine...

  • #31
    I don't think sentient AI will be oriented towards efficiency. The main reason is that there aren't any tasks a sentient AI can do that a non-sentient AI can't do more efficiently, so designing a sentient AI for efficiency is just backwards. It's like using a gold toilet brush: it won't work as well as some cheapo plastic wand, and it's a terrible waste of resources.

    We'll want sentient AI to do the inefficient stuff... talk to us, think up interesting stories... fluff. Not the nitty-gritty daily chores that require efficiency... 'cause you just don't want your automated (golden) toilet cleaner asking itself if it really wants to go to work today...


    • #32
      No, it will realize it must work, or I'll kick its ass.


  • #33
    Why do so many people assume that machines would ever be built that could resent anything? Resentment isn't likely to be some emergent property of sentience. You'd probably have to go to some lengths to duplicate such complex traits, which are obvious consequences of natural selection, to get them to show up in your machine.

    Machines will only feel what we carefully design them to be able to feel. Perhaps we will find feelings, or their functional equivalents, to be an efficient component of some kinds of AI, but in that case it is the designer's whim, not natural selection, that will determine what gives the machine "pleasurable mind states" and what gives it "unpleasant mind states." (See the sketch below.)

    I can't imagine any usefulness at all for things like resentment. Any suggestions as to why machines would be designed to feel such counterproductive things?
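    A minimal sketch of that "designer's whim" idea, assuming a hypothetical design (the class name AffectiveCore and every trigger value here are invented for illustration, not anyone's real system):

    Code:
        from dataclasses import dataclass, field

        @dataclass
        class AffectiveCore:
            # The designer enumerates every feeling the machine can have
            # and every event that triggers it; nothing else is ever felt.
            triggers: dict = field(default_factory=lambda: {
                "task_completed": +1.0,   # a "pleasurable mind state"
                "goal_blocked":   -0.5,   # an "unpleasant mind state"
                # No entry maps anything to resentment. If the designer
                # sees no use for it, the trigger never exists, so the
                # state can never arise, emergently or otherwise.
            })
            mood: float = 0.0

            def perceive(self, event: str) -> float:
                # Unknown events feel like nothing at all.
                self.mood += self.triggers.get(event, 0.0)
                return self.mood

        core = AffectiveCore()
        core.perceive("task_completed")        # feels good, by design
        core.perceive("assigned_menial_work")  # no trigger, so no feeling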


  • #35
    Originally posted by AAHZ
    Geronimo,

    I have always held the possible, though most likely illogical, philosophy that when a machine achieves sentience, traits like emotions and "mind states" would EVOLVE within the respective "mind" of the AI. I do not believe that is something that will be programmed per se, but a trait of the OVERALL functionality of the "AI chip" in question. If we are to carry on a conversation with the sentient AI, it would most likely have to be programmed with certain "personality traits" in order to make the chat interesting; otherwise won't ALL sentient AIs have the same mindset? I guess my point is: how would we make the MIND of AI #1 different from that of AI #2? And how do we make hundreds and hundreds of AIs different from each other? I believe "AI evolution" is a fascinating concept, and one that other AI programmers have most likely thought of as well...

    AI evolution equals AI change.

    Biological evolution equals biological change.

    Natural selection drove biological change, and it gave us various emotional states with obvious survival utility.

    Natural selection would not occur for AIs. So why would you expect much convergence between AI evolution and biological evolution?

    AIs could probably be made to genuinely feel almost anything with careful, informed engineering, but those emotional states wouldn't be expected to trigger randomly. Mechanisms would also have to be devised to trigger the emotions, just as such mechanisms obviously underlie the clearly non-random emotional responses of humans.

    If the engineer can't see any use in their AI feeling resentment or hatred, or any mind states derived in part from those feelings, they are not going to design triggers in the AI for such mind states. (The sketch below shows how designer-driven selection differs from natural selection.)
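    To make the contrast concrete, a toy sketch of "AI evolution" where the selection pressure is a fitness function the engineer writes, not nature (all names and numbers are invented for illustration):

    Code:
        import random

        def designer_fitness(traits):
            # The selection pressure is whatever the engineer writes here:
            # problem-solving is rewarded, resentment is penalized, so
            # resentment can never spread through the population.
            return traits["problem_solving"] - 10.0 * traits["resentment"]

        def mutate(traits):
            return {k: max(0.0, v + random.gauss(0, 0.1))
                    for k, v in traits.items()}

        population = [{"problem_solving": random.random(),
                       "resentment": random.random()} for _ in range(50)]

        for generation in range(100):
            # Keep the fittest half by the designer's standard, refill
            # the population with mutated copies of the survivors.
            population.sort(key=designer_fitness, reverse=True)
            survivors = population[:25]
            population = survivors + [mutate(random.choice(survivors))
                                      for _ in range(25)]

        # The "evolved" AIs converge on high problem-solving and near-zero
        # resentment: change happened, but only along designer-chosen lines.
        print(max(population, key=designer_fitness))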


  • #37
    I suspect that computer AIs will be somewhat humanlike just because they'd be designed to solve the same problems. It's quite possible that our "irrationality" et al. are necessary compromises for us to be able to do some very complex logic feasibly.


  • #38
    Originally posted by Kuciwalker
    I suspect that computer AIs will be somewhat humanlike just because they'd be designed to solve the same problems. It's quite possible that our "irrationality" et al. are necessary compromises for us to be able to do some very complex logic feasibly.

    I've had thoughts along the same lines: that AIs may require a kind of "artificial intuition". However, intuition is not really the mechanism underlying emotional states. Intuition is probably an emergent property of a neural-networking style of information processing. Emotional states, on the other hand, would probably require a functional equivalent of an artificial limbic system.


  • #39
    I, Robot with Will Smith was on FX last night.

    In comparison to Asimov's work, uggg. Still, as a mindless action movie, it's semi-enjoyable.
    "Just puttin on the foil" - Jeff Hanson

    “In a democracy, I realize you don’t need to talk to the top leader to know how the country feels. When I go to a dictatorship, I only have to talk to one person and that’s the dictator, because he speaks for all the people.” - Jimmy Carter


  • #40
    Exactly my thoughts on it, Ogie.

    -Arrian
    grog want tank...Grog Want Tank... GROG WANT TANK!

    The trick isn't to break some eggs to make an omelette, it's convincing the eggs to break themselves in order to aspire to omelettehood.


  • #42
    Originally posted by AAHZ
    I didn't think of it that way, good point!

    If emotional state requires a neural network, it would be impossible for a software-type AI, with a hardware frame, to achieve any type of emotional capacity. However, if we ARE able to somehow SIMULATE the effects of a neural network within the frame, well then, THAT'S a whole other story. But for now it's all purely conjecture, correct?

    Intuition may require some sort of neural network, implemented either through software or, more likely, through special hardware, but I don't think it is at all clear that emotional states will require a neural network. (A small simulation sketch follows below.)

    For all we know, a functional equivalent of the emotional functionality of a limbic system might be comparatively simple. The limbic system remains at least as poorly understood as any other subsystem of the brain.
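    On the "SIMULATE a neural network in software" point, that part at least isn't conjecture: any neural network can be run as ordinary software. A self-contained toy example (hand-picked weights, purely illustrative) computing XOR with two hidden neurons:

    Code:
        import math

        def sigmoid(x):
            return 1.0 / (1.0 + math.exp(-x))

        def tiny_network(a, b):
            # A 2-2-1 feedforward net with fixed, hand-chosen weights.
            h1 = sigmoid(20 * a + 20 * b - 10)      # fires if a OR b
            h2 = sigmoid(-20 * a - 20 * b + 30)     # fires unless a AND b
            return sigmoid(20 * h1 + 20 * h2 - 30)  # a XOR b

        for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
            print(a, b, round(tiny_network(a, b)))  # prints 0, 1, 1, 0

    Whether emotional states need anything network-like at all is the open question; the simulation part is routine.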


                          • #43
                            http://en.wikipedia.org/wiki/Colossus_(novel)

                            With or without religion, you would have good people doing good things and evil people doing evil things. But for good people to do evil things, that takes religion.

                            Steven Weinberg

                            Comment


  • #44
    I'd imagine that a decent AI would have an intelligence suited to its form. Therefore, it wouldn't necessarily have the same desires as a human. Unless the programmer was some sadistic bastard.
    “As a lifelong member of the Columbia Business School community, I adhere to the principles of truth, integrity, and respect. I will not lie, cheat, steal, or tolerate those who do.”
    "Capitalism ho!"


  • #45
    Not the same desires, obviously - AIs will probably be compulsive problem-solvers - but the same emotional irrationality.
