[ExI] From Arms Race to Joint Venture

William Flynn Wallace foozler83 at gmail.com
Tue Oct 23 20:57:37 UTC 2018

I think we have devolved to the problem of consciousness.  How could a
computer of any complexity experience fear? It can be told that certain
parts of the limbic system are active, and certain behaviors may follow,
but the human experiences of it cannot, in my opinion, be described even to
a fellow human - only in a global sense.  Explain to someone how a piece of
music affects you.  Or why a certain woman is not pretty at all but is
highly attractive.

Of course, future AIs may have no use at all for emotions, or for anything
similar to what we have.  As Sapolsky has shown, if you cut the prefrontal
cortex off from the limbic system, the decisions that do get made tend to be
awful, and most people dither and cannot make them at all.  So we cannot do
without emotions.

I have no opinion on AI and self-awareness.

  bill w

On Tue, Oct 23, 2018 at 3:22 PM Ben Zaiboc <ben at zaiboc.net> wrote:

> Bill W wrote:
> "Are AIs somehow to be equipped with superhuman emotions?  What would
> that mean?"
> Yes, they are, but not by us. And of course, we have no idea what that
> would mean. This is, at least to me, the entire point of AIs: They will
> be our successors, the next generation of self-aware, intelligent beings.
> I think it's a big mistake to use the concept of current computers and
> current software when thinking about the AI systems of the future.
> That's like thinking of humans purely in terms of molecules. There are
> many layers of complexity, built on one another, between the molecules
> and their actions, and Bob the Quantity Surveyor and his concerns about
> his tax return. Yes, they are connected, and one is actually built from
> the other, but you can't really talk about one in terms of the other.
> Advanced AIs won't simply be programs running on microprocessors, just
> as we aren't simply ribosomes joining nucleotides together. They will
> have many layers of complexity, just as we do.
> So when considering advanced AIs, thinking about our current computers
> and their software is just as relevant as thinking about how ribosomes
> work when you're considering Bob's ability to complete his tax returns.
> Saying "AIs will never have emotions because computers aren't capable of
> feeling emotions" is like saying "Bob will never be able to understand
> his tax forms because ribosomes aren't capable of reading tax forms".
> --
> Ben Zaiboc
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat