[ExI] AI motivation, was malevolent machines

John Clark johnkclark at gmail.com
Mon Apr 14 15:35:52 UTC 2014


On Sun, Apr 13, 2014 at 12:48 PM, Rafal Smigrodzki <rafal.smigrodzki at gmail.com> wrote:

> Self-awareness of the type you mention is a neurological function.


Yes.

> As such, for it to evolve, there must be genes directing biological
> events, and usage of metabolic resources for it to function.
>

Yes.

> But, if self-awareness does not increase fitness, genes for it will not
> be selected, and if it does sometimes appear it will be selected against to
> conserve energy.
>

Self-awareness could still appear if it is a biological spandrel. In fact,
if Darwin was right then the only logical conclusion is that consciousness
is a byproduct of intelligence. Evolution can no more directly detect
consciousness in others than we can, so it couldn't directly select for it;
and yet I know for a fact that Evolution managed to produce consciousness
at least once (in me) and probably many billions of times (I have a hunch
other people are conscious too, at least when they're not sleeping or under
anesthesia or dead or otherwise acting unintelligently). This paradox can
be resolved if we remember that, just like us, Evolution CAN detect
intelligence in others, so it can select for that; we then postulate that
consciousness just comes along for the ride.

I think consciousness must be fundamental, that is to say it sits at the
end of a long string of "what caused that?" questions. If so, then after
saying that consciousness is the way data feels when it is being
processed, there just isn't anything more to say on the subject of how
matter can produce consciousness.

  John K Clark





