
Here's where it started. I was arguing with Aggie about self-interest. I said:
Certainly everyone acts in their self-interest, in the sense that they take the course of action they believe will maximize their happiness. Either that or they're insane (in fact, the standard definition of a rational agent is one that acts to maximize its expected utility).
Aggie replied with his usual bull****:
That's a mindlessly idiotic definition. See... this is why people need philosophy.
And I said:
No, that's the only meaningful definition. See any decent AI textbook.
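If you want it spelled out, here's a quick sketch of what that definition amounts to. The actions, probabilities, and utilities are invented for illustration; they're not from any particular textbook:

```python
# A minimal sketch of the textbook definition: a rational agent picks
# the action with the highest expected utility. All numbers here are
# made up for illustration.

# Each action leads to possible outcomes, as (probability, utility) pairs.
ACTIONS = {
    "take_the_job": [(0.7, 10.0), (0.3, -2.0)],
    "stay_put":     [(1.0, 3.0)],
    "start_a_band": [(0.1, 50.0), (0.9, -5.0)],
}

def expected_utility(outcomes):
    """Probability-weighted sum of utilities for one action."""
    return sum(p * u for p, u in outcomes)

def rational_choice(actions):
    """The rational agent: choose the action maximizing expected utility."""
    return max(actions, key=lambda a: expected_utility(actions[a]))

if __name__ == "__main__":
    for name, outcomes in ACTIONS.items():
        print(f"{name}: EU = {expected_utility(outcomes):.2f}")
    print("rational choice:", rational_choice(ACTIONS))
```

That's all "maximize its utility function" means: rank your options by how well you expect them to turn out, then pick the top one.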
Get it yet?