[ExI] Re: More Advanced Extraterrestrials

BillK pharos at gmail.com
Tue Jan 13 19:35:34 UTC 2015


On 13 January 2015 at 10:31, Anders Sandberg  wrote:
> First, our mental architecture is very much based on our particular kind of
> world - we need stimuli to keep going simply because they are always there
> except in sensory deprivation tanks (a rather rare environment). Other minds
> may not have that requirement, and we can envision ways of redesigning our
> own minds.
>
> Second, an unchanging world can be indefinitely interesting - ask any
> mathematician.
>
> Third, the real world of a being is not just the physical world but the
> cultural world created by the being and its cultural peers (whether other
> people or subsystems). Most people today care way more about the entirely
> open-ended worlds of celebrities, fiction and social interaction than the
> physical world. There is more competition and salient stimuli from other
> people than the outside world. A superintelligent civilization will likely
> ramp this up orders of magnitude: just try to imagine the
> competition/art/soap opera plots of superminds. Since much effort will go
> into not being bored, we should expect them to be *very* good.
>
> Fourth, this is not an argument that supercivilizations will close in on
> themselves, since there can be at least some interest in the outside world,
> and diverse starting points and internal structure do produce exploration
> even if the bulk prefers virtual.
>


I find it difficult to envisage a mind that doesn't require
stimulation from outside itself. A mind that doesn't react to outside
stimuli is, by definition, unresponsive.

I agree with your third point. But AIs processing thousands of times
faster than humans will require entertainment and stimuli that also
happen thousands of times faster than we experience them. That implies
that they create virtual worlds that run at the same speed as their
internal clocks.

They wouldn't ignore the outside world, but they would know far more
about it than we do. And it is frozen and unchanging from their POV,
so of little interest. I doubt that your idea of lone explorers
setting off into the void applies to post-sing AIs. They might decide
to move the whole civ to a different star system, but they may not
need to be near a star and may well already be drifting through the
void to ensure they are not disturbed.


> It seems that any attempt to explain Fermi using cultural convergence needs
> an *extremely* strong argument able to handle exceedingly diverse minds and
> individuals. It is not enough that it sounds believable, it needs to work
> even against AGIs given pathological motivations by their creators (or
> chance). That is a tall order.
>

It may be that surviving a Singularity and creating advanced AGIs
leaves only a very narrow range of AGI types that survive.
'Diverse' may not be the appropriate adjective. :)

BillK



More information about the extropy-chat mailing list