[ExI] rationality
Aleksei Riikonen
aleksei at iki.fi
Sun Apr 5 10:49:45 UTC 2009
On Sun, Apr 5, 2009 at 12:16 PM, BillK <pharos at gmail.com> wrote:
> On 4/5/09, Lee Corbin wrote:
> <snip>
>> They fear that rational argument may not go their way. They
>> value, in decreasing order,
>>
>> 1. prevailing in an argument, especially anything
>> touching on values
>> 2. prevailing with reason
>> 3. finding and speaking the truth
>>
>> Now none of us can claim that he or she always puts number three
>> first, but if we find that we are engaging in sheer calumny,
>> or merely expressing our feelings and loathings, then you know
>> for sure that we are elevating our desire to prevail over
>> everything else, including both rationality and a desire to
>> get at the truth.
>
> The error in your modest proposal is that rationality isn't the whole picture.
> There are more important things than being rational.
> (Don't tell the Bayesians) ;)
>
> If you are discussing whether 2 + 2 = 4, then fine, be as rational as you like.
>
> But if you are discussing religion or politics (the big no-nos), then
> you have to bring real practical considerations into the discussion.
> Crimes against humanity invalidate the most logical of reasoning.
> Sorry, but it doesn't matter whether you are rational and logically
> correct; you still lose the discussion.
>
> One can think of situations where logical analysis might conclude
> that the enslavement of women, the forced labor of children, or the
> slavery of the unemployed would produce better results in some areas.
> But these arguments lose because they are trumped by 'That's no way
> to treat human beings!'.
>
> Rationality is pretty much useless in matters of human relationships,
> on the small and large scale.
Strange that you speak of rationality as if it required one to always
speak the truth.
If speaking the truth leads to getting burned at the stake, the smart
person shuts up, and I for one don't consider that choice to be
irrational.
According to Wikipedia, "a rational agent is specifically defined as
an agent which always chooses the action which maximises its expected
performance, given all of the knowledge it currently possesses." For
most goal systems, getting burned at the stake doesn't maximize one's
performance.
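
To make that definition concrete, here is a minimal sketch in Python
(the actions, probabilities and utilities are invented for
illustration, not taken from anywhere):

    def expected_performance(outcomes):
        # outcomes: list of (probability, utility) pairs for one action
        return sum(p * u for p, u in outcomes)

    # A would-be heretic's (made-up) beliefs about each action:
    beliefs = {
        "speak the truth": [(0.9, -100.0),   # burned at the stake
                            (0.1, 10.0)],    # hailed as a visionary
        "shut up":         [(1.0, 0.0)],     # live quietly
    }

    # The rational agent chooses the action that maximises expected
    # performance, given the knowledge it currently possesses.
    best = max(beliefs, key=lambda a: expected_performance(beliefs[a]))
    print(best)  # -> shut up

On those numbers, silence wins; speaking up only becomes rational if
the expected payoff of being believed outweighs the stake.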
--
Aleksei Riikonen - http://www.iki.fi/aleksei