[extropy-chat] In the Long Run, How Much Does Intelligence Dominate Space?

Lee Corbin lcorbin at tsoft.com
Sun Jul 9 20:45:42 UTC 2006


Damien Sullivan writes

> And why would you make a star go bang?  How would you store or use the
> energy?  I guess distilling the energy as (anti)matter streams might
> look attractive, but you're talking about passing the energy of a
> stellar mass through a much smaller distillery in a fraction of a
> second.

My remark is really about the possible motivations of extremely
advanced life. Let's suppose that benefit is directly proportional
to computation per second times the total number of seconds.

Now we humans are accustomed to being able to *think* only so much
per second. Therefore, to maximize our benefit, we need to live as
long as possible.  But if *benefit* is really what you want, then
it doesn't matter what the outside universal clock reads.

Also, one of our prejudices is to live as long as possible in order
to take advantage of opportunities that are not currently available.

But the maximally advanced life that I have been able to envision 
does not have these particular constraints. Therefore, if it's
possible, one might as well do all one's thinking as soon as
possible.  What difference does it make, anyway?

Besides: as I see it, by this time truly intelligent life will
already have succeeded in spreading far away from any particular
star. So one final burst of EM carrying all the information
gleaned from the last few hours of maximum benefit---read "damn near
Omega point"---following the nova can be sent to nearby stars, so
that they may profit in time for their own novae.

Just what, you might ask, would be sent?  Design info and math results,
that's all.  As I've said, I think that that's all there will be in the
end. The design info is how to become more advanced and enjoy life
better (gratification research), and the math is---to pick the most
trivially intriguing problem---the highest Ramsey numbers found so far.

> I think his point was that seems inconsistent with blowing up stars to
> live as quickly as possible.

Oh. Well, I think I've dealt with that above, too.

> > Yes, up to the chance that I can partake, and yes, also out of
> > some residual loyalty to the human race, my family, nation, etc.,
> > and other outmoded loyalties in the age of individualism.
> 
> The age of individualism, hmm.  Individual freedom has gone up in some
> ways, but in others we seem more interdependent than ever before.

Yes, and I wish that we could continue a discussion of the granularity
we should expect of extremely advanced intelligence. Robin and you have
your hunches: that I "don't appreciate the complex possibilities", that
we "don't know how to divide up [a complex future world] into
individuals", and that there is "a strong pressure to get along, but a
single will imposing rules doesn't strike me as the only or even the
best way of describing what goes on"---that is, your talk of
"coordination scale".

And I have my hunches.  And so there we are, until our thoughts
become a little less inchoate.

Lee
