[ExI] Should we still want biological space colonists?
Jason Resch
jasonresch at gmail.com
Sat Feb 8 14:55:54 UTC 2025
On Sat, Feb 8, 2025, 9:23 AM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Sat, Feb 8, 2025 at 7:55 AM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Fri, Feb 7, 2025, 11:05 AM Adrian Tymes via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Fri, Feb 7, 2025 at 7:06 AM Jason Resch via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> Any civilization that has mastered technology to the point of being
>>>> capable of building artificial bodies and brains will see the engineering
>>>> of customized robotics as far preferable to terraforming planets and will
>>>> see the transport of uploaded minds inhabiting the unlimited space of
>>>> virtual realities as far more efficient than trying to haul fragile,
>>>> radiation-sensitive, spoilage-prone meat bodies to the stars in generation
>>>> ships.
>>>>
>>>
>>> Even if AGI were about to happen, there is a vast gulf between AGI
>>> extended from current AI efforts and mind uploading.
>>>
>>
>> You are correct that there is a vast gulf in technological
>> sophistication, but the exponential speed at which technology advances
>> implies there's only a short gulf in time between those two milestones.
>>
>> In my estimation, there's less than two decades between AGI and mind
>> uploading. And with superintelligent AI, timescales collapse further still.
>>
>
> I emphasize "extended from current AI efforts", as that lends no direct
> support to mind uploading. (Other than "but superintelligence can and will
> figure out anything", which increasingly seems to be running into limits
> the more that premise is examined.) Though, current efforts do not seem to
> lead to AGI in the short term.
>
>> I agree space habitats make more sense than terraforming. But what habitat
>> is better than virtual reality whose only limit is imagination?
>>
>
> One that can't simply be unplugged, deleting everyone within. Also, one
> with any measurable impact on the universe outside the habitat.
>
These ships can be designed to be as resilient as desired, with
self-healing informational redundancy (e.g., erasure codes), armies of
nanobot repair machines, systems running provably correct software,
redundant internal power supplies, and so on.
Looked at objectively, such a system could be far more reliable than
humanity in its current form, where one nuclear war or one engineered
pathogen can unplug us all.
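To make the "self-healing informational redundancy" point concrete, here
is a minimal sketch (in Python, using a single XOR parity shard; a real
design would use something like Reed-Solomon codes, which survive many
simultaneous losses) of how stored data can be rebuilt after part of the
storage is destroyed:

def encode(data: bytes, k: int = 4) -> list:
    """Split data into k equal shards plus one XOR parity shard."""
    shard_len = -(-len(data) // k)           # ceiling division
    data = data.ljust(k * shard_len, b"\0")  # pad to a multiple of k
    shards = [data[i * shard_len:(i + 1) * shard_len] for i in range(k)]
    parity = bytearray(shard_len)
    for shard in shards:
        for i, b in enumerate(shard):
            parity[i] ^= b
    return shards + [bytes(parity)]

def repair(shards: list) -> list:
    """Rebuild one missing shard (marked None) by XORing the survivors."""
    missing = [i for i, s in enumerate(shards) if s is None]
    assert len(missing) <= 1, "one parity shard repairs only one loss"
    if missing:
        shard_len = len(next(s for s in shards if s is not None))
        rebuilt = bytearray(shard_len)
        for s in shards:
            if s is not None:
                for i, b in enumerate(s):
                    rebuilt[i] ^= b
        shards[missing[0]] = bytes(rebuilt)
    return shards

# Lose any one shard and the original data survives.
shards = encode(b"habitat state snapshot")
shards[2] = None  # simulate a failed storage node
restored = b"".join(repair(shards)[:-1]).rstrip(b"\0")
assert restored == b"habitat state snapshot"

Lose any one shard and the data comes back; add more parity shards and
the system tolerates correspondingly more simultaneous failures.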
> It is true that one could live a blissful eternity in a virtual reality
> habitat...and literally nobody else would care.
>
The people having those experiences care.
You could imagine our solar system as a black box, and from the outside
make that same observation: "what does it matter to anyone what goes on
inside this black box?"
But of course this ignores the value and meaning of the trillions of lives
being lived within it.
> You could run a million, a billion, a trillion instances of blissful
> eternities in such a habitat, with not a one communicating outside or
> otherwise doing anything of consequence to anyone outside.
>
Nothing prevents communication between the inside and the outside.
Nothing limits travel either, as you could jump into a robot body at any
time. You could upload and watch YouTube videos, join Zoom calls, and
email with people outside. It's really no different from life in any
other place (be it an apartment, a city, a country, or planet Earth).
> Indeed, some versions of the Heaven tale essentially claim that is what
> Heaven is, with God as the highest-level system administrator. (There are
> similar tales of Hell, save for being far less blissful for the average
> inhabitant.) And yet, even for those who fervently believe this is true,
> in most cases (with notable exceptions for those unable to keep functioning
> well anyway) given the choice of a longer life on Earth or going
> immediately to Heaven, they keep choosing the former. The reason why can
> be debated, but most people appear to prefer to continue to affect the
> universe they were born into.
>
The only actions that matter are those that affect conscious experience. If
everyone is uploaded, and no one is outside, then no actions taken outside
(where there is no consciousness) have any purpose or value.
> Those who would prefer to opt out of the physical universe and live
> entirely in self-contained virtual habitats essentially commit suicide so
> far as the outside universe is concerned.
>
I'm not proposing there be a firewall between the inside and outside
worlds. I am only saying that if we build a space habitat, how much
better a habitat it would be as one of uploaded minds inside virtual
worlds, rather than one of meat bodies stuffed in tight quarters, finding
so little to do and sharing their habitat with so many fewer people (for
a given size of space habitat).
> Which means they have no defenses should those who live outside the
> habitats desire to repurpose a habitat's resources.
>
Think of it more like a submarine. A submarine has inhabitants, but,
being in control of the vessel, the crew can still affect the world. It
can send and receive messages to and from the outside world. It can
defend itself by evading threats or with offensive weapons. If there is
some damage, the crew can send out a repair party to service the outside
or use damage-control teams to make repairs inside. It has its own energy
supply so no one can unplug it, and so on.
This is what I am envisioning for such a space habitat, not some cube of
computronium sitting bare on a desk, whose fate is at the whim and mercy of
those generous enough to keep it repaired and supplied with power.
> Does computronium become tastier with a greater quantity of independent
> minds it used to run?
>
I'm not sure what this question is asking.
>
>> At the physical limits of computational efficiency, the computation
>> required to run 100 billion human minds could fit in a computer with a
>> volume no bigger than a grain of sand.
>>
>> Can we rule out that we already inhabit a universe filled with such "dust
>> ships"?
>>
>
> No. Perhaps call the dust ships "fairies", and the reason why - and their
> potential impact - becomes clearer.
>
Motivated intelligence is the greatest force in the universe. Its form
factor matters very little.
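For a sense of scale on the "grain of sand" figure quoted above, here is
a rough back-of-envelope sketch in Python. Every number in it is an
assumption (estimates of brain-equivalent computation vary by orders of
magnitude), and it addresses only the energy floor set by Landauer's
principle, not the volume question:

import math

k_B = 1.380649e-23                # Boltzmann constant, J/K
T = 300                           # assumed operating temperature, K
landauer = k_B * T * math.log(2)  # ~2.9e-21 J per irreversible bit erased

ops_per_mind = 1e16  # assumed brain-equivalent ops/sec (very uncertain)
minds = 1e11         # 100 billion minds

power = minds * ops_per_mind * landauer
print(f"Power at the Landauer limit: {power:.1e} W")  # ~2.9e+06 W

On those assumptions, the whole population runs on a few megawatts at the
thermodynamic floor, which at least makes the grain-of-sand intuition
less absurd than it first sounds.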
Jason
>