[ExI] puzzle - animal consciousness

Anders Sandberg anders at aleph.se
Wed May 21 23:01:05 UTC 2014


Mike Dougherty <msd001 at gmail.com>, 20/5/2014 3:09 PM:
So if Bayesian Beagle and Deductive Doberman behave exactly the same way are they interchangeable with respect to this question?  
Note that externally identical behaviour might not be enough to matter; sometimes we care about whether the internals are the same. 
In the case of the BB and the DD I think if they are smart enough they will have to behave similarly. At least the DD will have to be Bayesian because of Cox's theorem, and I think the BB will also have to follow logic (assuming the world does), so they would reach equivalent conclusions despite their different models.
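The collapse of Bayesian reasoning into deduction at the extremes can be sketched numerically (a toy illustration, not a proof of Cox's theorem): when the relevant probabilities are pinned to 0 and 1, the law of total probability reproduces modus ponens, so the BB and the DD must agree on certainties.

```python
# Sketch: Bayesian updating with probabilities pinned to 0 and 1
# collapses to deduction. Modus ponens appears as a degenerate case of
# the law of total probability:
#   P(B) = P(B|A) * P(A) + P(B|not A) * P(not A)

def total_probability(p_b_given_a, p_a, p_b_given_not_a):
    """Marginal probability of B via the law of total probability."""
    return p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Encode "A implies B" as P(B|A) = 1 and the premise "A" as P(A) = 1;
# assume (hypothetically) that not-A never yields B.
p_b = total_probability(p_b_given_a=1.0, p_a=1.0, p_b_given_not_a=0.0)
print(p_b)  # 1.0 -- the deductive conclusion "B" with certainty
```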
There are a lot of systems that are equivalent because they are rich enough. Tonight I came across Post's Functional Completeness Theorem, which is pretty awesome:
https://en.wikipedia.org/wiki/Functional_completeness
http://www.ualberta.ca/~francisp/Phil428/Phil428.11/PostPellMartin.pdf
Given the right, small set of logical connectives, one can build any logical table (or circuit, if you think about digital technology). As the second link shows, some of these sets are pretty odd.
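The classic example of such a set is {NAND} on its own, and its completeness can be checked by brute force (a small sketch, representing each two-input Boolean function as a 4-bit truth table and closing under NAND):

```python
# Sketch: verify that NAND alone is functionally complete for two inputs
# by closing {x, y} under NAND and counting the reachable truth tables.
# A two-input Boolean function is a 4-bit truth table, so there are 16.

def nand(a, b):
    """NAND applied row-wise to two 4-bit truth tables."""
    return ~(a & b) & 0b1111

x, y = 0b1100, 0b1010   # truth tables of the projections x and y
reachable = {x, y}
while True:
    new = {nand(a, b) for a in reachable for b in reachable} - reachable
    if not new:
        break
    reachable |= new

print(len(reachable))   # 16: every two-input Boolean function is built
```

The same closure computation, seeded with a different connective set, would show which sets are complete and which fall short (e.g. {AND, OR} alone never produces negation).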
If I use a logic based on Fredkin gates and you use a logic of NAND, are we thinking in the same way? We would reach the same conclusions, but some are easier in one logic than the other. I suspect that when resource limited (time, memory) one would see some differences in our reasoning in terms of what could be proven. Maybe the key test of thinking is to see how it behaves under various constraints, not what it could in principle think.
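To make the equivalence concrete: the Fredkin (controlled-swap) gate is universal once some inputs are pinned to constants, so a NAND can be assembled from Fredkin gates. A minimal sketch, using the standard constructions (third input pinned to 0 for AND, inputs pinned to (0, 1) for NOT):

```python
# Sketch: building NAND out of Fredkin (controlled-swap) gates.
# fredkin(c, p, q) passes p, q through when c == 0 and swaps them
# when c == 1; it is reversible and conserves the number of 1s.

def fredkin(c, p, q):
    return (c, q, p) if c else (c, p, q)

def fredkin_and(x, y):
    # With the third input pinned to 0, the last output is x AND y.
    return fredkin(x, y, 0)[2]

def fredkin_not(x):
    # With the data inputs pinned to (0, 1), the last output is NOT x.
    return fredkin(x, 0, 1)[2]

def fredkin_nand(x, y):
    return fredkin_not(fredkin_and(x, y))

for x in (0, 1):
    for y in (0, 1):
        assert fredkin_nand(x, y) == 1 - (x & y)
print("Fredkin-built NAND matches ordinary NAND")
```

Both logics compute the same function, but the Fredkin version needs ancilla constants and produces garbage outputs along the way; that overhead is exactly the kind of resource difference that shows up only under constraints.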


Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University