[ExI] Should we still want biological space colonists?
Adrian Tymes
atymes at gmail.com
Sat Feb 8 17:51:59 UTC 2025
On Sat, Feb 8, 2025 at 11:53 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Sat, Feb 8, 2025 at 10:33 AM Adrian Tymes via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Sat, Feb 8, 2025 at 9:57 AM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Sat, Feb 8, 2025, 9:23 AM Adrian Tymes via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> On Sat, Feb 8, 2025 at 7:55 AM Jason Resch via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>
>>>>> I agree space habitats make more sense than terraforming. But what
>>>>> habitat is better than virtual reality whose only limit is imagination?
>>>>>
>>>>
>>>> One that can't simply be unplugged, deleting everyone within. Also,
>>>> one with any measurable impact on the universe outside the habitat.
>>>>
>>>
>>> These ships can be designed as resiliently as desired.
>>>
>>
>> The attackers in this case have all the time they want, and superior
>> resources. They have all the same tools as the defenders, and greater
>> ability to develop more (again: superior resources).
>>
>
> If by attackers, you mean the organic humans outside,
>
The outsiders can eventually upload too, at least some of their number. I
said "all the tools": anything those inside can do, those outside can do.
Those outside can also do many things that those inside can not.
> But moreover, what do the uploaded have that those outside can't have for
> themselves?
>
The computronium is there, and convenient. If the outsiders eventually come to utterly not care - as they likely would in this scenario (they might eventually not be sure, or even aware, that there are minds inside) - then there's no reason for them not to recycle that particular computronium. It's not even necessarily malicious.
>> They especially have the advantage if the defenders abandon all sensing of
>> and interaction with the outside environment:
>>
>
> True, but that doesn't make much sense to leave yourself vulnerable like
> that.
>
Again: that's basically the scenario you're proposing, or at least the
logical end result thereof. Interacting with the outside world comes at
such great personal cost to anyone inside as to reinforce complete
noninteraction.
> And much of defense can be automated (e.g. the CIWS that today's ships
> already use, which automatically detects and fires upon incoming threats).
>
>
True. I am assuming automated but nonsentient defenses and maintenance
(because otherwise that would be putting a mind outside). The attackers
might see the habitat as an abandoned relic; the fact that it has active
machinery means nothing in an age where active machinery has long since been
everywhere. (Today is still not quite that age: any machinery that's still
running was almost certainly built recently enough that someone involved -
in construction and/or maintenance - is still alive.)
>>> Looked at objectively, it could be a far more reliable system than
>>> humanity in its current form, where one nuclear war or one engineered
>>> pathogen, can unplug us all.
>>>
>>
>> Self-sustaining biological human colonies outside of Earth can solve that
>> problem too, and that is the alternative being compared to here.
>>
>
> But what are the advantages?
>
Primarily, that it can be started today, rather than waiting decades or
centuries for mind uploading before we even begin (and then decades more for
confirmation that uploaded minds do not irreversibly degrade at a faster
rate than biological humans do).
>>>> It is true that one could live a blissful eternity in a virtual reality
>>>> habitat...and literally nobody else would care.
>>>>
>>>
>>> The people having those experiences care.
>>>
>>
>> I said "nobody else".
>>
>
> Do you not care for the lives of people in other parts of the world who
> you will never meet or interact with?
>
Prisoner's Dilemma: for any given person on Earth today, there is a nonzero
chance that I will eventually interact with them, or someone who has
meaningfully interacted with them, or someone who has meaningfully
interacted with someone who has meaningfully interacted with them, et
cetera.
If, for example, someone in a position of power - let us call him Rump
- were to declare a genocidal war of extermination against some major
country, the majority of that country's citizens (not to mention other
countries' citizens who cared about said citizens) would not directly
interact with Rump but would contribute resources to an armed minority that
would attempt to inflict consequences on Rump. This applies to anyone on
Earth with enough resources (including positioning, information, and a
bunch of other things as well as personal material) to significantly affect
a large population anywhere on Earth. (Anyone without such resources can
claim to care or not, and the consequence will matter so little that it can
be hard to prove either way.)
That is not the case for a computronium habitat where none inside interact
at all with the outside.
> You could imagine our solar system as a black box, and from the outside
>>> make those same observation: "what does it matter to anyone what goes on
>>> inside this black box?"
>>>
>>
>> It can. Signals escape from the box, so it is not black. Depending on
>> where you place the box's boundaries, machines (the Voyager probes) may
>> have already escaped the box.
>>
>
> The value of the human race is (in my view) much more than the few things
> we've managed to send outside this box.
>
Here you only asked why it might matter. Value is carried over the
communication chains, but mattering at all requires only that those chains
exist - much as, in the seven-layer OSI model of computer networking, the
upper layers can carry meaning only because the lower layers provide a
connection at all.
>> And there is the potential for more to escape later - as opposed to the
>> virtual paradise, that has permanently shunned the outside world.
>>
>
> I don't know why you insist that uploaded minds have to be shut off from
> interaction with the outside world. Consider how many people work entirely
> from home, all their productive work-related activity is then reduced to
> information flows going into and out of their home.
>
They exist at the same speed as the outside parties they deal with. Mind
acceleration - at least at anywhere near the scale we're talking about here -
simply isn't possible for them today.
>>> But of course this ignores the value and meaning of the trillions of lives
>>> being lived within it.
>>>
>>
>> It does, because the value and meaning to anyone outside it is zero.
>>
>
> I would contest this. Do you place no value on life in other galaxies?
>
Define "value" in this context. My answer is either "no" or "not presently
defined".
Am I willing to give materially to support life in other galaxies at this
time? No. (In part because there's no way for said material, or the
results thereof, to reach them before better methods would be developed.)
Do I receive any benefits from life in other galaxies at this time? No.
Am I morally in favor of life in other galaxies at this time? I shrug at
that, having no basis for that sort of value unless and until there is
evidence that such life exists and of what it is like.
I keep saying "at this time", but what will my values be at a given future
time? Ask me then.
> I could tell you now of a habitat that you will never interact with,
>> wherein trillions of people live. Will you give any of your material
>> resources to support it? If not, you are ignoring the value and meaning of
>> those trillions of lives - just like most people ignore it, because the
>> value and meaning is zero.
>>
>
> Do you not care about the future of humanity (which fits all of your above
> stipulations: i.e., trillions of people who you will never interact with)?
>
There is a subset that I will interact with. I do not - and can not - know
at this time precisely what that subset will be, so I can not strictly
carve off the subset I will never interact with. I will generally
prioritize those I am more likely to interact with, especially on shorter
time scales, but this is distinct from the absolute cutoff of the
computronium habitat.
Moreover, no small number of the improvements I wish to make will benefit
many of the humans I will not directly interact with. In some cases this
is unavoidable collateral benefit (which costs me nothing, so I have no
reason to avoid it anyway); in others, they will interact with other people
who will interact with other people, et cetera, which chain eventually
reaches the people I do directly interact with and causes them to aid me
even more. (For instance, if I were to provide cheap electricity for just
about everyone in the world, the company I would run to do this would
probably deliver power to most people through subsidiaries, but would still
wind up getting paid for all of that delivery.)