[ExI] Meaningless Symbols
gts_2000 at yahoo.com
Sun Jan 10 15:44:48 UTC 2010
--- On Sat, 1/9/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
> I specified "use a word appropriately in every context";
> Google can't as yet do that.
If and when Google does do that, it will have weak AI or AGI. One might argue that it already has primitive weak AI.
Strong AI means something beyond that. It means something beyond merely "using words appropriately in every context". It means "actually knowing the meanings of the words used appropriately". It means "having a mind in the sense that humans have minds, complete with mental contents (semantics)".
I can program my computer to answer "Yes" to the question "Do you have something in mind right now?" Will my computer then actually have something in mind when it executes that operation in response to my question? I might find it amusing to imagine so, but I also understand the difference between reality and the things I imagine.
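To be concrete, the program I have in mind is trivial; a minimal sketch (Python, names hypothetical):

```python
# A canned-response program: it emits "Yes" without any
# understanding of the question it "answers".
def respond(question: str) -> str:
    # The lookup is pure symbol manipulation (syntax);
    # nothing here constitutes having something "in mind".
    canned = {"Do you have something in mind right now?": "Yes"}
    return canned.get(question, "I don't understand.")

print(respond("Do you have something in mind right now?"))  # prints: Yes
```

The point is that the output is indistinguishable from a sincere answer, yet there is nothing it is like to be this program.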