[ExI] Is a copy of you really you?

Keith Henson hkeithhenson at gmail.com
Mon Jun 1 22:05:38 UTC 2020


John Clark <johnkclark at gmail.com> wrote:

On Mon, Jun 1, 2020 at 1:25 AM Keith Henson via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
>
> What concerns me is giving a human mind copy vastly more power in an
> artificial brain. Humans seem to have evolved psychological mechanisms that
> detect "looming privation," resource shortages, and for good evolutionary
> reasons respond by attacking neighbors and taking their resources. (After a
> period to build up xenophobic memes.) Consider a raw human mind mapped
> into a powerful AI with substantial enhancement.

> I have doubts about your theory but even if it's true I don't see it or
Darwinian Evolution in general playing a significant part in the future;

I agree.  My work in this area is looking at what stone age evolution
has left us with in terms of psychological traits.  I find it
terrifying that human behavior can switch into modes that give rise to
the many violent episodes in historical times and ongoing ones like
the current rash of destructive riots.

But I am not entirely confident about the end of Darwinian evolution.
I sincerely hope our civilization does not fall, but if it does, it
will not be the first time.  That could put Darwinian selection back
in the driver's seat.

> the evolution of memes will be more important than the evolution of genes
because minds are involved thus it's vastly faster. And the faster minds
become the faster memes will change.

I think most extropy-chat readers have seen the article I wrote
arguing that humans could experience 50 million years of subjective
existence before the end of this century.
https://web.archive.org/web/20121130232045/http://hplusmagazine.com/2012/04/12/transhumanism-and-the-human-expansion-into-space-a-conflict-with-physics/

The original seems to be off the net.

> Rather than worrying about outmoded behavioral programming inherited from
our stone-age past I'm much more concerned with what will happen when we
have full conscious control of our emotional control panel.

Marvin Minsky wrote extensively about the dangers of mind
modification.  We are (perhaps fortunately) a long way from having
such controls.  In the meantime, the stone-age psychological trait of
going to war over resource shortages may keep us from making
progress on emotional controls.

> Suppose there
was some task that you knew you should do but don't want to because you're
naturally lazy and the task is dull, so you just turn one knob on your
emotional control panel and now you're no longer lazy, now you love nothing
better than hard work, and you then turn another knob and now you find your
job of putting thousands of caps on thousands of toothpaste tubes to be
utterly fascinating, fulfilling, and endlessly enjoyable. It seems to me
that having too much self-control would lead to a positive feedback loop,
and those things tend to head for extremes and rarely produce anything
productive, they usually end up producing either no output or an explosive
output. Sometimes literally explosive.

Mind modification of this sort is something to approach with a great
deal of caution.

> if you, yourself, are duplicated by the thousands, millions, or
> billions and there is some limit on the number of poker chips that can be
> made, then a large number of copies of you are going to make you poor in
> terms of chips.

> If there is a consensus among the many copies of me that there are just too
many of us then we could decide to merge back together. The resulting being
would then remember doing different things in different places at exactly
the same time, but that would be OK with me.

I suspect merging may have problems beyond parallel memories.  But
this is rank speculation.

I don't know how much (if any) of humanity will survive the changes we
see coming.

Keith
