[ExI] Augmentations to Science
Ben
bbenzai at yahoo.com
Tue Dec 22 11:20:29 UTC 2015
Will Steinberg <steinberg.will at gmail.com> asked:
"Do any of you have any idea what a *very* rough computerized insight
model might look like?"
You need to define 'insight' first. I suspect different people will have
different definitions.
To me, it's pretty much the same thing as 'gut feeling'. In other words,
a cognitive shortcut where some idea 'feels' right even though you
can't immediately see why it should.
I think this is probably an 'unconscious analogy' type of thing: you've
detected an analogy to something you've experienced in the past, but
aren't consciously aware of what that something is. The process is tied
to an emotional response, which makes the idea jump out more quickly
than the normal micro-evolution process that goes on in the cortex when
we're thinking.
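To make that concrete, here's a minimal Python sketch of the idea.
Everything in it is a hypothetical illustration (the feature vectors,
the InsightModel class, the salience weighting), not anyone's actual
cognitive model:

import math

def cosine(a, b):
    # Similarity between two feature vectors, in [-1, 1].
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class InsightModel:
    def __init__(self, threshold=0.8):
        self.memories = []          # (features, conclusion, salience)
        self.threshold = threshold  # how strong a match must 'feel'

    def experience(self, features, conclusion, salience):
        # Store a past episode; salience in [0, 1] stands in for the
        # emotional tag that makes a memory jump out faster.
        self.memories.append((features, conclusion, salience))

    def insight(self, features):
        # Return (conclusion, felt_strength) for the best analogy, or
        # None if nothing 'feels' right. The matching memory itself is
        # never returned -- that part stays unconscious.
        best = None
        for mem_features, conclusion, salience in self.memories:
            # Emotional salience amplifies similarity, so charged
            # memories win even against slightly closer neutral ones.
            score = cosine(features, mem_features) * (0.5 + 0.5 * salience)
            if best is None or score > best[1]:
                best = (conclusion, score)
        if best and best[1] >= self.threshold:
            return best
        return None

model = InsightModel()
model.experience([1.0, 0.0, 0.9], "this design will fail", salience=0.9)
model.experience([0.1, 1.0, 0.2], "this design is sound", salience=0.3)
print(model.insight([0.9, 0.1, 0.8]))  # strong match: a 'gut feeling'

The point of the sketch is that the caller gets a conclusion and a
felt strength, but never the memory that produced them.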
The problem with insights is they're sometimes wrong.
Any algorithm that captured the process would probably have to build on
other algorithms that model thinking anyway, and those would probably be
fast enough that there'd be no point in having an additional, less
reliable 'insight' system. For a biological organism, the evolutionary
benefits of the extra speed probably compensate for its being wrong a
certain amount of the time. For a non-biological thinking agent, that
won't apply.
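To put rough numbers on that trade-off, here's a toy calculation. All
figures are invented purely for illustration:

def expected_payoff(accuracy, latency, reward=1.0, error_cost=1.0,
                    delay_cost_per_step=0.05):
    # Expected value of an answer with given accuracy and latency:
    # payoff for being right, penalty for being wrong, and a cost
    # for every time step spent deciding.
    gain = accuracy * reward - (1 - accuracy) * error_cost
    return gain - latency * delay_cost_per_step

gut = expected_payoff(accuracy=0.7, latency=1)            # fast, fallible
deliberate = expected_payoff(accuracy=0.99, latency=20)   # slow, reliable
print(f"gut: {gut:.2f}, deliberate: {deliberate:.2f}")

With a heavy per-step delay cost (a predator won't wait), the gut wins;
shrink delay_cost_per_step toward zero, as for a fast machine, and
deliberation wins instead, which is the argument above in miniature.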
Ben Zaiboc