[ExI] Rejecting Socrates/was Re: Libertarianism wins again...

Kelly Anderson kellycoinguy at gmail.com
Mon Aug 1 23:35:20 UTC 2011


On Tue, Jul 26, 2011 at 10:53 PM, Brent Allsop
<brent.allsop at canonizer.com> wrote:
>
> Hi Kelly,
>
> Good responses!

Thanks. Sorry for the delayed reply; I was out of town with no
computer over the weekend.

> I have a question for you: do you believe we will ever approach perfect
> justice?  And if so, how much do you think the Gods who inherit an immortal
> heaven will owe to those who created it for them, at the cost of billions of
> years of struggle, for free?

Perfect justice seems out of reach... here's why. It is always easier
to tear down a building than to build one. It is always easier to
commit vandalism than to create a great work of art... Crime will
always be easier than justice for the same reason: entropy is on
the side of the criminal, not the law keeper. So it
would seem that crime will be with us for a very long time, though
the nature of the crimes committed will certainly change a great deal.
Of course, as extropians, we fight against entropy, but I don't think
we'll win this one.

> Out of everything humanity thinks about, what percentage of time do we spend
> on history today, despite how much information we lack, or how hard it is
> to get what little we can get?  I'm sure we spend way more than 1% of all
> our time on history, but let's just stay conservative.  And for a super
> AI approaching infinite abilities, 1% of near-infinite is still approaching
> infinite.  And wouldn't such an AI work toward the ultimate goal of achieving
> a history of knowing what every last human did and thought over their entire
> life?  And would working on and toward all this perfect justice and perfect
> history be nothing more than watching Leave It to Beaver?

I grant that super-intelligences would have no trouble
understanding the working minds of all beings in history (up to the
point in history where the super-intelligences themselves become
more than can be processed). Yet it will be the work of these more
recent minds that is of greatest importance to the
super-intelligences of tomorrow. Just as we today care more about
current events than about history, I believe that will continue to be
the case. For it to be otherwise would be like watching reruns of Leave
It to Beaver. Not that there's anything wrong with spending some of your
time watching Leave It to Beaver; it just doesn't generally contribute
much to your current life beyond mild entertainment. How often do
you watch news from even two weeks ago?

Saying that future super-intelligences will spend all their time
studying us is like saying we should spend all our time studying our
ancestors. And while there is some fun in genealogy, and certainly
some interesting questions to think about in terms of
evolution and history, those backward-looking pursuits become nostalgic
and empty past a certain point. Living in the present, for the future,
is much more enriching than constantly pursuing the past, don't you
think?

> I believe we are not in a simulation for the very same reason I hope there
> is no sentient God of any kind hiding from us.  Creating
> beings like us, and then hiding from them, would be inhumane,
> immoral, and obviously, for so many reasons, unnecessary.

Not at all. Suppose that I created a simulation of myself in a
particular set of circumstances for the sole purpose of having that
copy of my mind write a particular computer program.
I would want that copy of my mind to be fully engaged in that
activity, but also to be having a rich and fulfilling life (which
contributes to good programs) rather than being a slave tied to a
keyboard. Giving that copy of myself the knowledge that it is a copy,
and will "die" in three months when the program is written, would have
a negative impact on my goal of getting the program written, and so I
can see myself hiding such information from myself. I do not see this
as inhumane; the opposite would be... Imagine getting
up in the morning aware that you were a simulation being run
ONLY for the purpose of accomplishing a particular task... That would
SUCK. The task would not get completed, and it would be altogether
horrid.

> For example, we
> can create abstract simulations that behave exactly like us, yet for which
> there is nothing it is like to be such beings.  Such beings, even though
> they will act as if they are in pain as they watch their loved one die,
> will not really phenomenally feel anything like we do...  Sure, it's not a
> great argument, but still, I certainly HOPE we are not in any kind of
> terrible simulation, and that there is some way to accomplish whatever such
> a simulation would accomplish, without us having to suffer through all this
> terrible primitive isolation...

Why do you assume that being in a simulation would be terrible?
I'm not implying that you become God by running a simulation, nor
does the runner of such a simulation put themselves in the position of
God.

> You said you feel sorry for the future members of your religion.

Well, I have no religion myself, but I think Mormonism is uniquely
qualified to survive into the transhumanist future because of its
structure. The continuous-revelation aspect of the religion allows it
to be as malleable as it needs to be in order to survive.

> I don't
> think it will be quite like that.  Imagine that through some freak one-time
> miracle we managed to extract and reproduce a single human being from
> 50,000 years ago, along with all of his lifelong experiences.  Certainly
> such a being would be very popular today; some would even consider that he
> would be worshiped, in a way, and given anything he wants, especially
> compared to the life he lived 50,000 years ago.  I think of it as being
> more like that: more of a respect for what he was and did than any seeking
> of guidance from us, or anything.

I think we would put him in a cage at the zoo. He might be a curiosity
for a few moments, but we couldn't learn much that is relevant to
today from such a creature. He would be missing all the context
between then and now. And once he learned the context, he would just
be one of us, uninteresting, with a day job.

> So, you don't think I should wonder whether those future AGIs will all know
> intimate details, far more than I can remember myself, of just how
> successful my last sexual encounter with my wife was, how it felt for both
> of us, and all the rest of every intimate detail of my entire mortal
> life?  Do you have any interest, at all, in knowing all of that about all
> of your ancestors, especially if it took up less than 1% of everything
> you could do?

Well, to be honest, I don't have much interest in my parents' or
grandparents' sex lives... That's just TMI for me!!! Thanks, but no
thanks. :-) As for us being the "gods" of the future, that would
require a form of ancestor worship that I don't think would be
consistent with ultra-high intelligence. Our views of an intelligent
future are congruent, but our views of what they will do with their
intelligence differ. Hopefully, we'll both live to see who is right.
:-)  Do I still get to be a transhumanist? :-)

-Kelly



