[ExI] The symbol grounding problem in strong AI
Stathis Papaioannou
stathisp at gmail.com
Wed Dec 30 06:13:00 UTC 2009
2009/12/30 Gordon Swobe <gts_2000 at yahoo.com>:
> But seriously, I see that we have two categories of things and objects: 1) the real kind, and 2) the computer-simulated kind.
>
> I make a clear distinction between those two categories of things. Computer simulations of real things do not equal those real things they simulate, and some "simulate" nothing real in the first place.
A simulated thunderstorm won't be wet except in its simulated world. A
simulated mind, however, will be a mind everywhere. That is because
(a) the mind is its own observer, and (b) it takes just as much
intelligence to solve a simulated problem as a real one.
--
Stathis Papaioannou