I have officially changed my position on the liberal arts in universities


  • #31
    Originally posted by Al B. Sure! View Post
    You missed something:

    Of course, this is by its nature a hindsight system, and the idea of happiness having intrinsic value and being something to be aspired to is arbitrary as well. Also problematic is the fact that humans are dealing with uncertain futures and are resource-constrained, which prevents an accurate appraisal of the consequences of any particular ethical action. So Kuci's crap is fundamentally flawed as well.
    Again, this is why liberal arts are useless.
    "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
    Ben Kenobi: "That means I'm doing something right. "



    • #32
      There's just no way you can objectively and categorically make ethical determinations, especially as there are no objective value judgments. Who is to say that Kuci's economic welfare is any more appropriate a goal and basis for morality than Nietzsche's Will to Power?
      "Flutie was better than Kelly, Elway, Esiason and Cunningham." - Ben Kenobi
      "I have nothing against Wilson, but he's nowhere near the same calibre of QB as Flutie. Flutie threw for 5k+ yards in the CFL." -Ben Kenobi



      • #33
        Originally posted by Al B. Sure! View Post
        Asher, any ethical calculus rooted in consequentialism (which has dominated ethical thinking since Jeremy Bentham) would necessarily have to consider the practicalities of a chosen alternative. You seem to have a deontological conception of ethics, which is fine, but you have to understand that any value judgments you make about the correctness of an action are purely subjective and arbitrary. At least a teleological ethical system can 'measure' something, usually happiness, and determine an action to be correct if it generated more 'well-being' than the non-chosen alternatives.

        Of course, this is by its nature a hindsight system, and the idea of happiness having intrinsic value and being something to be aspired to is arbitrary as well. Also problematic is the fact that humans are dealing with uncertain futures and are resource-constrained, which prevents an accurate appraisal of the consequences of any particular ethical action. So Kuci's crap is fundamentally flawed as well.

        Here's a question for you, Asher: Would you say that there are actions that are categorically wrong?


        EDIT: I guess my post isn't really relevant anymore
        Deontology sucks because its conclusions are easily replicated by simpler* theories like rule utilitarianism. This is even more obvious because the most famous rigorous treatment of deontology (Kant) amounted to "CONSEQUENTIALISM IS WRONG but I will apply logic that is essentially identical in form to rule utilitarianism anyway".

        *in the Occam's Razor sense



        • #34
          Originally posted by Al B. Sure! View Post
          You missed something:

          Of course, this is by its nature a hindsight system, and the idea of happiness having intrinsic value and being something to be aspired to is arbitrary as well. Also problematic is the fact that humans are dealing with uncertain futures and are resource-constrained, which prevents an accurate appraisal of the consequences of any particular ethical action. So Kuci's crap is fundamentally flawed as well.
          Concisely: no, it's not arbitrary. If you approach ethics scientifically as the question of finding the smallest set of axioms that best explains our moral sentiments (with the tradeoff between simplicity and accuracy being handled in the usual Kolmogorov sense), then you should conclude that rule consequentialism (and, particularly, maximizing the happiness of a certain group of people) thoroughly explains almost all of our sentiments.
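
          For what it's worth, the "usual Kolmogorov sense" here is essentially a minimum-description-length tradeoff: penalize a candidate set of moral axioms both for its complexity and for the sentiments it fails to explain, then prefer the lowest total. Below is a toy Python sketch of that tradeoff; the rule sets, sentiments, weights, and the axiom-count complexity proxy are all invented for illustration (true Kolmogorov complexity is uncomputable), not anything proposed in the thread.

          # Toy minimum-description-length comparison of candidate moral "axiom sets".
          # All data below is hypothetical; complexity is proxied by counting axioms.

          # Moral sentiments a theory should reproduce, as case -> verdict.
          SENTIMENTS = {
              "lying for personal gain": "wrong",
              "keeping a promise": "right",
              "torturing for fun": "wrong",
              "charity that relieves suffering": "right",
          }

          def rule_consequentialism(case):
              # Single axiom: an act is right iff the rule it follows tends to raise overall happiness.
              return "right" if case in ("keeping a promise", "charity that relieves suffering") else "wrong"

          def ad_hoc_rule_list(case):
              # One special-case axiom per situation: perfectly accurate, but maximally complex.
              return SENTIMENTS[case]

          CANDIDATES = {
              "rule consequentialism": (rule_consequentialism, 1),
              "ad hoc rule list": (ad_hoc_rule_list, len(SENTIMENTS)),
          }

          def mdl_score(theory, n_axioms, complexity_weight=1.0, error_weight=2.0):
              """Lower is better: complexity penalty plus penalty for unexplained sentiments."""
              errors = sum(theory(case) != verdict for case, verdict in SENTIMENTS.items())
              return complexity_weight * n_axioms + error_weight * errors

          for name, (theory, n_axioms) in CANDIDATES.items():
              print(name, mdl_score(theory, n_axioms))

          With these made-up weights the one-axiom theory wins because it explains every sentiment at a fraction of the description length; the interesting arguments are over which sentiments go in the list and how to weigh simplicity against misses.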



          • #35
            Originally posted by Kuciwalker View Post
            Concisely: no, it's not arbitrary. If you approach ethics scientifically as the question of finding the smallest set of axioms that best explains our moral sentiments (with the tradeoff between simplicity and accuracy being handled in the usual Kolmogorov sense), then you should conclude that rule consequentialism (and, particularly, maximizing the happiness of a certain group of people) thoroughly explains almost all of our sentiments.
            Could you try any harder than this? Jesus Christ.

            "usual Kolmogorov sense"? Are you ****ing serious?

            The best way to tell that a person has a weak argument is when they gussy it up in needless complexity.
            "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
            Ben Kenobi: "That means I'm doing something right. "



            • #36
              That's the second-best way, I think. The best is when they're repeatedly unable to answer basic questions about the topic.



              • #37
                And Kolmogorov complexity is a Wikipedia article away, and one you (and probably even AS) should have no trouble grasping immediately.



                • #38
                  Asking for the strict definition of "good" and "evil" is not a basic question; in fact, it's a textbook tactic to deflect the heat in a conversation.

                  It's very difficult to have a debate with you about morality when you continue to not understand the definition of "morality". You even called a textbook definition of the word "factually incorrect". It was absurd.
                  "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
                  Ben Kenobi: "That means I'm doing something right. "



                  • #39
                    KBE
                    12-17-10 Mohamed Bouazizi NEVER FORGET
                    Stadtluft Macht Frei
                    Killing it is the new killing it
                    Ultima Ratio Regum



                    • #40
                      Originally posted by Kuciwalker View Post
                      Concisely: no, it's not arbitary. If you approach ethics scientifically as the question of finding the smallest set of axioms that best explains our moral sentiments (with the tradeoff between simplicity and accuracy being handled in the usual Kolmogorov sense) then you should conclude that rule consequentialism (and, particularly, maximizing the happiness of a certain group of people) thoroughly explains almost all of our sentiments.
                      If there were a way to tell the future, your reasoning would be correct. There is no way, in a teleological ethical system, to determine with certainty that an action is 'right' before it is undertaken. That appraisal can only come after the action has long since been carried out and the consequences tallied. So instead, your ethical system becomes a matter of statistical probabilities. An action is 'right' under the assumption of a particular expected outcome, but if the consequences differ from that outcome, it could become 'wrong'. What is the role of justice in such a system? How is the man who kills to save 2 people treated (net effect positive)? What if he mistakenly thought he could only save those 2 people by killing a person and they all died (net effect negative)?

                      I don't know. It doesn't jibe with me as 'scientific'. Neither does 'the greatest good for the greatest number'. Reminds me too much of Brave New World.
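
                      To make the 'statistical probabilities' worry concrete: an expected-value consequentialist scores the choice on the agent's beliefs at decision time, while the hindsight tally scores the realized outcome, and the two can disagree. A toy calculation of the kill-one-to-save-two cases, with probabilities and utility numbers invented purely for illustration:

                      # Toy expected-utility vs hindsight tally for the "kill one to save two" cases.
                      # Probabilities and utilities (net lives saved) are made up for illustration.

                      def expected_utility(outcomes):
                          """outcomes: list of (probability, net lives saved). Ex ante score of an action."""
                          return sum(p * u for p, u in outcomes)

                      # Agent's beliefs at decision time: killing one probably saves the other two.
                      kill_one = [(0.8, +1),    # 80%: two saved, one killed -> net +1
                                  (0.2, -3)]    # 20%: all three die         -> net -3
                      do_nothing = [(1.0, -2)]  # the two die for certain    -> net -2

                      print("ex ante, kill one:  ", expected_utility(kill_one))    # ~ +0.2
                      print("ex ante, do nothing:", expected_utility(do_nothing))  # -2.0

                      # If the 20% branch actually happens, the realized outcome (-3) is worse than
                      # doing nothing, yet the choice was still the better bet when it was made.
                      print("ex post, realized:  ", -3)

                      Whether justice should track the ex ante score or the ex post tally is exactly the open question being posed here.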
                      "Flutie was better than Kelly, Elway, Esiason and Cunningham." - Ben Kenobi
                      "I have nothing against Wilson, but he's nowhere near the same calibre of QB as Flutie. Flutie threw for 5k+ yards in the CFL." -Ben Kenobi



                      • #41
                        2*xpost

                        If you say that morality is the question of good and evil, and good and evil are... um... yeah, they're good and evil, of course! - then it's not clear how anyone could ever possibly argue a position about them. "I think gay sex is evil." "Well, I don't!"



                        • #42
                          Originally posted by Kuciwalker View Post
                          And Kolmogorov complexity is a Wikipedia article away, and one you (and probably even AS) should have no trouble grasping immediately.
                          I've taken enough math to know what it is, but I happen to be smart enough to know Alby would not have run into it (it's usually an algorithmic complexity concept). You used the term, without reference or explanation, in a discussion with him. The only people who do that kind of **** are people who:
                          1) Are generally completely incapable of empathy (unable to comprehend that others may view or understand things differently)
                          2) Are trying very hard to sound impressive to compensate for a lack of sound reasoning
                          "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
                          Ben Kenobi: "That means I'm doing something right. "



                          • #43
                            Originally posted by Kuciwalker View Post
                            2*xpost

                            If you say that morality is the question of good and evil, and good and evil are... um... yeah, they're good and evil, of course! - then it's not clear how anyone could ever possibly argue a position about them. "I think gay sex is evil." "Well, I don't!"
                            It's almost like you just realized there is no absolute morality.

                            I'm so pleased you finally understand the simple concept I've been arguing all night. Morality is a question that can only be answered by people, and their answers reflect their upbringing, religion, philosophy, and general levels of empathy. That you like to use economics to define morality is your own philosophy -- one I find disturbing.
                            "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
                            Ben Kenobi: "That means I'm doing something right. "



                            • #44
                              Originally posted by Al B. Sure! View Post
                              If there were a way to tell the future, your reasoning would be correct. There is no way, in a teleological ethical system, to determine with certainty that an action is 'right' before it is undertaken.
                              So? I've never, ever suggested that we have to find out all of the numbers to infinite precision, or always get the right answer - but mechanisms that, to the best of our knowledge, will get us into the ballpark of the right answer are probably preferable to arbitrary guesswork.

                              That appraisal can only come after the action has long since been carried out and the consequences tallied. So instead, your ethical system becomes a matter of statistical probabilities. An action is 'right' under the assumption of a particular expected outcome, but if the consequences differ from that outcome, it could become 'wrong'. What is the role of justice in such a system? How is the man who kills to save 2 people treated (net effect positive)? What if he mistakenly thought he could only save those 2 people by killing a person and they all died (net effect negative)?
                              Welcome to the confusion that resulted in the trifurcation of ethics into consequentialism, deontology, and virtue ethics.

                              The answer to "what should I do?" is pretty clearly "the thing that, to the best of your knowledge, will produce the best outcome" etc. Unless our knowledge really really sucks, that rule should pretty consistently perform better than other rules, and we'll just have to live with the fact that sometimes our knowledge is wrong.

                              Of course, sometimes we end up having to adopt rules that produce locally suboptimal outcomes because otherwise we get a globally suboptimal system. e.g. I have to kill those German soldiers (who may be perfectly good people, whatever) so that we can get to Berlin and tear down Nazi Germany. Or I have to throw this guy in prison, even if I don't think he'll commit more crimes, because putting people in prison deters others from committing crimes in the first place. Deontology takes these rules we come up with and ingrain into our social norms and says that they are true in themselves rather than true as necessary results of consequentialism.

                              Virtue ethics comes about when we decide that we want to encourage people to follow these norms, and so say that people who behave morally are "good people" and people who behave immorally are "bad people". Good people are those who do good things and don't do bad things; we should and do want to be good people, so we do good things and don't do bad things. But, again, these are just constructs on top of consequentialism, not basic axioms of morality.
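
                              One way to read the "locally suboptimal but globally necessary" point is as comparing policies rather than single acts: a rule is judged by its total consequences across every case where it applies, not case by case. Here is a toy sketch of the imprisonment/deterrence example from the paragraph above, with every case and utility number invented for illustration only:

                              # Toy act-level vs rule-level scoring of the imprisonment/deterrence example.
                              # Every case and utility number here is hypothetical.

                              # Each case: harm of imprisoning this offender, plus the deterrence benefit
                              # that only materializes if punishment is applied consistently as a rule.
                              CASES = [
                                  {"imprison_cost": -1, "deterrence_if_consistent": +3},
                                  {"imprison_cost": -2, "deterrence_if_consistent": +3},
                                  {"imprison_cost": -2, "deterrence_if_consistent": +3},
                              ]

                              def act_by_act(cases):
                                  """Punish only when the single act looks positive in isolation; in isolation
                                  no case earns deterrence credit, so nobody gets punished here."""
                                  return sum(c["imprison_cost"] for c in cases if c["imprison_cost"] > 0)

                              def follow_the_rule(cases):
                                  """Always punish: accept the local costs, collect the system-wide deterrence."""
                                  return sum(c["imprison_cost"] + c["deterrence_if_consistent"] for c in cases)

                              print("act-by-act total:    ", act_by_act(CASES))       # 0
                              print("rule-following total:", follow_the_rule(CASES))  # -5 + 9 = +4

                              On these made-up numbers, punishing every case loses a little locally but wins overall, which is the consequentialist reading of why the norm gets adopted in the first place.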



                              • #45
                                Originally posted by Asher View Post
                                I've taken enough math to know what it is, but I happen to be smart enough to know Alby would not have run into it (it's usually an algorithmic complexity concept).
                                Kolmogorov did tons of important work in statistics, and Albie was a finance major. I've run into his stuff on actuarial exams. It was a decent bet that he'd run into it.

