[ExI] The symbol grounding problem in strong AI
Damien Broderick
thespike at satx.rr.com
Sat Jan 9 20:13:39 UTC 2010
On 1/9/2010 1:54 PM, Gordon Swobe wrote:
> How can I say this without assigning some kind of strange non-computable aspect to natural brains?
> The answer is that I say it because I don't believe the brain is actually [a] computer.
Isn't that exactly saying that you assign some kind of non-computable
aspect to natural brains? (No reason why it should be strange, though.)
As I said several days ago, a landslide doesn't seem to me to compute
the trajectories of all its particles--at least not in any sense that
I'm familiar with. We can *model* the process with various degrees of
accuracy using equations, but it looks like a category mistake to
suppose that the nuclear reactions in the sun are *calculating* what
they're doing. I realize that Seth Lloyd and others disagree (or I think
that's what he's saying in PROGRAMMING THE UNIVERSE--that the universe
is *calculating itself*), but the whole idea of calculation seems to me
to imply a compression or reduction: mapping some aspects of one large,
unwieldy system onto another, extremely stripped-down toy system.
That might be wrong, I know. I hope Gordon knows it might be wrong as well.
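To make the modeling point concrete: here is a minimal sketch (my own illustration, not anything from the thread) of what such a "stripped-down toy system" looks like in practice. The real landslide involves countless interacting particles; the model below keeps only one hypothetical point mass, constant gravity, a crude friction coefficient, and Euler integration--a deliberately lossy mapping of the physical system onto a few equations.

```python
import math

def simulate_slide(x0=0.0, v0=0.0, slope_deg=30.0, mu=0.4,
                   g=9.81, dt=0.01, t_max=5.0):
    """Euler-integrate a single point mass sliding down an inclined plane.

    Everything the real landslide does--fracture, turbulence, particle
    collisions--is thrown away; only a net acceleration survives.
    """
    theta = math.radians(slope_deg)
    # Net acceleration along the slope: gravity component minus friction.
    a = g * (math.sin(theta) - mu * math.cos(theta))
    x, v, t = x0, v0, 0.0
    while t < t_max:
        v += a * dt
        x += v * dt
        t += dt
    return x, v

distance, speed = simulate_slide()
print(f"after 5 s: {distance:.1f} m travelled at {speed:.1f} m/s")
```

The asymmetry is the point: the equations predict the slide to some accuracy, but nothing in the hillside is running this loop.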
Damien Broderick