[ExI] RES: More Advanced Extraterrestrials
anders at aleph.se
Tue Jan 13 10:31:18 UTC 2015
BillK <pharos at gmail.com>, 13/1/2015 10:42 AM:
On 13 January 2015 at 06:59, Rafal Smigrodzki wrote:
> ### Every single one of them? Untold trillions of superminds, over billions
> of years, always and inevitably getting bored? Not even one of them deciding
> to send out even a single self-replicating probe designed to self-replicate
> for ever, just for the heck of it?
That is pretty unbelievable for us humans who rely on continual
stimulation to keep going. ...
But think about the effects of speeding up your brain processing by
possibly thousands of times.
You are trapped in an unchanging physical world.
First, our mental architecture is very much shaped by our particular kind of world - we need stimuli to keep going simply because stimuli are always there, except in sensory deprivation tanks (a rather rare environment). Other minds may not have that requirement, and we can envision ways of redesigning our own minds to remove it.
Second, an unchanging world can be indefinitely interesting - ask any mathematician.
Third, the real world of a being is not just the physical world but the cultural world created by the being and its cultural peers (whether other people or subsystems). Most people today care far more about the entirely open-ended worlds of celebrities, fiction and social interaction than about the physical world. There is more competition, and there are more salient stimuli, from other people than from the outside world. A superintelligent civilization will likely ramp this up by orders of magnitude: just try to imagine the competitions/art/soap opera plots of superminds. Since much effort will go into not being bored, we should expect them to be *very* good.
Fourth, none of this is an argument that supercivilizations will close in on themselves: there can still be at least some interest in the outside world, and diverse starting points and internal structures will produce some exploration even if the bulk prefers the virtual.
It seems that any attempt to explain the Fermi paradox using cultural convergence needs an *extremely* strong argument, one able to handle exceedingly diverse minds and individuals. It is not enough that it sounds believable; it needs to work even against AGIs given pathological motivations by their creators (or by chance). That is a tall order.
Anders Sandberg, Future of Humanity Institute, Faculty of Philosophy, Oxford University