[ExI] Observation:

Alan Grimes agrimes at speakeasy.net
Mon Oct 25 00:45:24 UTC 2010


Wonderful post, Chrome://messenger! =P

> Alan, would you care to concisely describe there what you believe about
> this issue, so we can find others that agree with you (and I?), and know
> just how many of us there are?  So we can all work together instead of
> standing alone?

There are dozens of different issues here. I'll list some of them and
then do my best to respond to a few.

* What are my exact beliefs regarding uploading?

* What are my ideas about the best approach to expanding the mind using AI?

* What are my political views on uploading and other transhumanistic topics?

* What do I believe are the common ground technologies for transhumanism?

Regarding uploading, I am most used to explaining things in terms of
flaws I perceive in other people's reasoning. One such flaw I saw
recently was an argument based on theoretical computer science. The
basic idea was: same program; same state; equivalent processor --> same mind.

I think much of theoretical computer science is flawed because it
departs a million miles from an intuitive understanding of the art. It
does this because it has evolved to serve one purpose: easy analysis
using mathematical tools developed in the 1800s. From a scientific
standpoint, it is valid to say that if you successfully copied a
"mem-self" (no mean feat, that!), it would naturally re-create a
psy-self all on its own. Nobody can deny this. But on the other hand, we
are not talking about letters on a Turing tape. That which is being
copied will experience a horrible, excruciating death (as all deaths
are), while something completely different *and no better* will be
created somewhere else. There are twenty different ways to interpret
this. The computronium-lubbers mostly gravitate towards any point of
view which argues in favor of you being the copy.
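Just to make the equivalence claim concrete, here's a toy sketch of it (my own illustration, with arbitrary constants): a deterministic program run twice from the same state produces identical traces, and the argument then leaps from "identical trace" to "identical person."

```python
# Toy sketch of "same program; same state; equivalent processor --> same
# trace". The step function is an arbitrary deterministic update rule,
# chosen purely for illustration.
def step(state: int) -> int:
    """One tick of a toy deterministic 'program'."""
    return (state * 1103515245 + 12345) % (2**31)

def run(state: int, ticks: int) -> list:
    """Run the program for a number of ticks, recording every state."""
    trace = []
    for _ in range(ticks):
        state = step(state)
        trace.append(state)
    return trace

# Two 'processors' started from the same initial state are
# indistinguishable by their traces.
assert run(42, 10) == run(42, 10)
```

That much is uncontroversial; the dispute is over what the indistinguishability of traces is supposed to buy you.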

The only point of view that means anything whatsoever to me is whether
the I that is sitting here in this [cheap Office Depot] arm-chair will
experience something good as a result of it. The answer is an
unequivocal no: my brain will get to experience being discarded in a
dumpster in a biohazard bag. What happens to the copy is entirely
irrelevant to me.

Even when I cover up that problem with a towel, I can't find *ANYTHING*
about being an upload that I find appealing. Furthermore, I have no
difficulty at all identifying technical problems and challenges that one
would face as an upload that are typically dismissed as being irrelevant
or flat out ignored. =\ So yeah, I don't think the uploaders have ANY
idea what they're getting themselves in for.


Politically, it goes against my philosophy to try to prohibit uploading
of any kind. You will not find anyone to whom I've said "no, I will not
allow you to upload". (You are welcome to try).

On the other side of the fence, within the last week, on this very list,
two different people have asserted that the inevitable evolutionary
outcome is that the entire solar system, without exception, will be
reduced to computronium. When called on this, they complained about my
use of the word "reduced".

If it needs to be said: if all the matter in the solar system were
converted to computronium, I wouldn't have anything to build vacuum
tubes with, and that would suck.


Now what I want to try to do in the mental department is to develop a
fairly optimal AI substrate, then mind-meld with it using a neural
interface. Because an AI is not constrained, in any way, to the original
neural template, it should be able to achieve orders of magnitude better
performance, on EVERY metric, than *ANY* possible upload. Thanks to the
folks at Cray Research, you can mail-order a million-core machine (and
then power it with something like 50 kW/chassis)... (folks without
480-volt 3-phase service need not apply! ;) Each chassis literally
weighs a ton too! =P
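The 480-volt requirement is easy to sanity-check: a balanced three-phase load needs roughly P / (sqrt(3) x V x pf) amps per phase. A quick back-of-envelope (the 0.95 power factor is my assumed figure, just for illustration):

```python
import math

def three_phase_current(power_w: float, line_voltage_v: float,
                        power_factor: float) -> float:
    """Per-phase line current (amps) for a balanced three-phase load."""
    return power_w / (math.sqrt(3) * line_voltage_v * power_factor)

# One 50 kW chassis on 480 V three-phase, assuming a ~0.95 power factor:
amps = three_phase_current(50_000, 480, 0.95)
print(f"{amps:.0f} A per phase")  # roughly 63 A per phase
```

Not something you run off a wall outlet, in other words.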

If you get one of those beasties on an OC256 (13.2 gigabits/second
internet, baby!), you could rule the world! =P -- not that I have any desire to
rule the pathetic stinking humans but hey... I would use it for R&D,
basic singularity stuff, etc... One of the nagging worries I keep in the
back of my mind is that if I augment myself enough to be able to design
my next body, I will have evolved past actually wanting it at that
point... =\
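That OC256 figure follows from the standard SONET base rate: OC-n carries n x 51.84 Mbit/s (the OC-1 rate), so OC-256 works out to about 13.27 gigabits per second. Note that's bits, not bytes:

```python
# SONET optical carrier rates: OC-n = n * 51.84 Mbit/s (the OC-1 base rate).
OC1_MBIT_S = 51.84

def oc_rate_gbit_s(n: int) -> float:
    """Line rate of OC-n in gigabits per second."""
    return n * OC1_MBIT_S / 1000.0

print(f"OC-256: {oc_rate_gbit_s(256):.2f} Gbit/s")  # OC-256: 13.27 Gbit/s
```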

With sufficient funding, I could have that off the ground in as few as 5
years...  (Maybe I'm just experiencing a delusional week? =\ )


The common ground in transhumanism seems to be AGI. With AGI there is
the very real possibility of everyone being granted his own genie
machine (for better or worse).

Also, the above vision calls for a "general neural interface", which is
to AGI as conventional neural interfaces are to today's AI. That might
not be everyone's cup of tea but it would go a long way to keeping your
genie's intentions close to your actual intentions.

General purpose nanotech falls into the same category. -- Everyone benefits!

Also there are some emerging technologies in the quantum realm/nuclear
scale that would seem to be of common benefit to all transhumanists.

Powers are not rights.
