[ExI] evolution and crazy thinking
William Flynn Wallace
foozler83 at gmail.com
Wed Jul 18 15:00:47 UTC 2018
I can't see some of these biases as leading to anything but further bias,
and maybe worse at that. Lean too heavily on the self-serving bias along
with the fundamental attribution error and you never understand the other
people in the world.
And it is unclear to me just how one gets out of the original biased
decision/action. Some will stop at the first approximation - I call this
'good enough for who it's for' - and will be consistently wrong. Add the
self-serving bias to this and you get a person who cannot admit he is wrong
- who maybe knows it but has no way to get his ox out of the ditch.
If one uses unbiased methods, one can be wrong many times before being
right, but, like a scientist, one doggedly keeps with the methods and
finally gets to the truth of things.
In short, I don't see making biased decisions as a necessary first step.
bill w
On Tue, Jul 17, 2018 at 7:18 PM, Colin Hales <col.hales at gmail.com> wrote:
> I wrote about this in my book.
>
> When you don't know something, then to understand it you have to make an
> explanatory hypothesis which, at the moment of its creation, is formally
> 'wrong' in the sense that until evidence confirms it, it's not predictive
> yet. Over time your hypothesis acquires a body of evidence and you get to
> be 'right' in the sense of 'predictive'. That is, in order for a human to
> make sense of the world, you have to be able to be 'wrong'. Making wrong
> hypotheses is a double-edged sword.
>
> 1) You get to be right post-hoc.
> but
> 2) On the down side, if you're an idiot who has a broken sense of what
> evidence is (see the 1000 cognitive biases in Wikipedia and in the
> attached 'codex'), then you get stuck with your own, pardon me, bullshit.
> Like religion, for example.
>
> In the brutal 'get it right or die' process, evolution has favoured a
> creature like us that can be very wrong and use that exact ability to get
> at the true nature of things. Later you become 'right'. I used this to
> great effect in a formal scientific account of scientific behaviour.
>
> Job done (I am right) :-) ..... or am I? So far the evidence is consistent
> with my hypothesis. It predicts exactly what your post is about: it
> predicts humans screwing up badly as a primary cognitive necessity to deal
> with the unknown.
>
> cheers
> colin
>
> On Wed, Jul 18, 2018 at 5:51 AM, Spike Jones <spike at rainier66.com> wrote:
>
>>
>> From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf
>> Of William Flynn Wallace
>>
>> >…Evolution did a great job but it has a long way to go. I hope it gets
>> the chance. 'Survival of the fittest' does not seem to describe the
>> current state of world affairs in the evolutionary sense.
>>
>> Are we, in fact, not losing the unfit? bill w
>>
>> This observation about survival of the fittest should have been stated
>> as survival of the best adapted.
>>
>> If we are discussing humans, fitness in the traditional sense is nearly
>> irrelevant. The unfit prosper in the right environment, such as our
>> technically advanced world. People who are dependent on modern medical
>> technology, for instance, are likely to reside near a hospital, which
>> implies a big city, where reproductive opportunities are relatively
>> plentiful. In that sense, the unfit are better adapted to our world than
>> the fittest.
>>
>> spike
>>
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat