[ExI] ai emotions

Stuart LaForge avant at sollegro.com
Thu Jun 27 15:01:14 UTC 2019


Quoting Brent Allsop:

> I’ve been confronting Naive Realists bleating completely qualia  
> blind (only use one word ‘red’, instead of multiple words like red  
> and redness to talk about different physical properties and  
> qualities.) rhetoric on places like quora and reddit.

Redness is the knowledge of red. Consciousness is like a
physical/mathematical function by which information becomes knowledge,
that is, system-integrated information. Redness is thereby a function
of red. To borrow Tononi's notation, redness = PHI(red). What is the
distinction between consciousness and the ability to actively learn? I
have trouble seeing one. In some respects, the point of consciousness
seems to be to find out what happens next.

> Sometimes it is so frustrating that so many people just can’t think.

It is only frustrating if you think about it. Just kidding, of course,
but if you need to reach people like that, try appealing to their
emotions. It often works better than logic. Logic is overrated anyhow:
if you start with incorrect premises, you reach wrong conclusions.
GIGO applies to people as well as to learning machines.

> After feeling so dirty, and frustrated with so little progress, with  
> so many, it is so nice to be pulled back up in the clouds, trying to  
> keep up with you guys taking me where I’ve never  been before.
> Thanks everyone, for providing such an inspiring forum, for so many  
> continued years, and for restoring my faith in humanity so often.

You have done your part to make the list an interesting forum, Brent,
so from one list member to another, thank you as well.

Here is a paper about consciousness by physicist Max Tegmark that you
might like, entitled "Consciousness as a State of Matter". I notice you
don't have him or that particular camp set up on Canonizer.

https://arxiv.org/abs/1401.1219

He is a little all over the place, but his ideas are interesting and
overlap with some of mine. I still have to see whether I can reconcile
our maths, but that will take time.

Stuart LaForge