[ExI] The symbol grounding problem in strong AI

Aware aware at awareresearch.com
Wed Dec 30 00:03:11 UTC 2009


[Sorry, resending due to a small oversight which I wouldn't want to
derail the message.]

On Tue, Dec 29, 2009 at 3:37 PM, Aware <aware at awareresearch.com> wrote:
> On Tue, Dec 29, 2009 at 3:06 PM, Gordon Swobe <gts_2000 at yahoo.com> wrote:
>>
>> But seriously, I see that we have two categories of things and objects: 1) the real kind, and 2) the computer-simulated kind.
>>
>> I make a clear distinction between those two categories of things. Computer simulations of real things do not equal those real things they simulate, and some "simulate" nothing real in the first place.
>
> To use a thought-experiment familiar to this list, how would you know
> (experientially) whether or not you're being run as a simulation right
> now?  As an embedded observer of whatever local environment of
> interaction you inhabit, you fundamentally LACK THE CONTEXT that would
> make any difference.
>
> It seems that through this protracted thread you consistently and
> simply beg the question, as does Searle.  Functionalism aside (as I've
> said twice now, it's not the issue and needs no defense), you spin
> around the flawed premise that "consciousness" is indisputably
> (according to all the 1st-person evidence you might ever ask for)
> instantiated in, at least, the human brain, but not within the
> workings of any formally described system.
>
> Well, it's not "in" either system (formal or evolved).  As I've said
> before, the semantics/intentionality/meaning is a function of the
> observer, EVEN when that observer happens to be a functional
> expression of the same brain/body as that which it observes.  It
> simply expresses its nature, "meaningful" as a result of
> evolutionary and developmental processes that rejected all manner
> of behavior that was not pragmatic and reinforced behavior
> that was.
>
> Of course it refers to itself as "I".  Of course it perceives its
> experience as true and complete--IT LACKS THE CONTEXT to know
> (experientially) otherwise.  Of course the illusion is convincingly
> seductive; it's the result of thousands of generations of selection
> based on survival and reproduction.
>
> And making it harder for you, even though you're a kind of programmer
> by trade, it's clear you're not comfortable with recursion.
>
> - Jef


