[ExI] How not to make a thought experiment

Christopher Luebcke cluebcke at yahoo.com
Thu Feb 18 21:20:34 UTC 2010


Certainty that the human's report is accurate does not provide a basis for certainty that the robot's report is not. This is crucial to the point: while it may be granted that a functioning organic brain and nervous system is sufficient for consciousness, it has not been shown that it is necessary.


----- Original Message ----
> From: Spencer Campbell <lacertilian at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Sent: Thu, February 18, 2010 12:49:40 PM
> Subject: Re: [ExI] How not to make a thought experiment
> 
> John Clark:
> > I'm also a little curious why Swobe takes at face value a report from a
> > human being that he has subjective experience but if a robot, regardless of
> > how intelligent, reports the same thing Swobe is certain he is lying.
> 
> Because human beings have brains!
> 
> HUH HUH SWOBE WRONG
> 
> No. It's because Gordon is a human, and Gordon can detect his own
> consciousness, and so Gordon assumes that other humans can do the
> same. It's a reasonable assumption, even if it does vaccinate him
> against a whole class of extremely relevant thought.
> 
> If you must keep picking on him, at least try to distinguish between
> the strong parts of his argument and the weak parts. This was a strong
> part. You only made it stronger.



