[extropy-chat] Are ancestor simulations immoral? (An attempted survey)
A B
austriaaugust at yahoo.com
Fri May 26 20:08:04 UTC 2006
Hi Samantha,
Samantha writes:
"The Universe is a damn big place. Short of crashing space-time I
don't see it happening and of course I don't see any way that could
be done."
How about triggering a metastable vacuum decay bubble that expands outward at the speed of light? If the Universe turns out to be finite, the bubble would eventually consume all of it, rendering this Universe lifeless and devoid of matter as well. I'm no expert on the subject, but my impression from reading is that it could *theoretically* be achieved with a sufficiently powerful particle accelerator, although it appears to be extremely unlikely as an accidental occurrence.
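To put a rough number on "eventually", here's a quick back-of-envelope
sketch in Python (the radius figure and the expansion-free assumption
are mine, purely for illustration, not anything from the physics
literature): even ignoring cosmic expansion, a bubble wall moving at c
would need tens of billions of years just to cross the currently
observable universe.

# Back-of-envelope sketch (my own simplifying assumptions): time for a
# vacuum-decay bubble wall expanding at c to cross the currently
# observable universe. Deliberately ignores cosmic expansion, which in
# an accelerating universe would keep sufficiently distant regions
# permanently out of the bubble's reach.

C = 299_792_458.0                      # speed of light, m/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # one Julian year, in seconds
LIGHT_YEAR = C * SECONDS_PER_YEAR      # metres in one light-year

# Comoving radius of the observable universe: roughly 46.5 billion ly.
radius_m = 46.5e9 * LIGHT_YEAR

crossing_time_years = (radius_m / C) / SECONDS_PER_YEAR
print(f"Crossing time at c: ~{crossing_time_years:.3g} years")
# -> ~4.65e+10 years, even in this over-simplified, expansion-free
#    picture. A finite-but-larger Universe would take correspondingly
#    longer.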
Samantha writes:
"I don't think much less than super-totalitarianism by a hopefully
guaranteed benevolent government or later a SAI more powerful (and
kept that way) than anyone and anything else that can come along will
get you full safety. Personally I value freedom far more than that
level of safety. And I am very cynical about the "guaranteed
benevolent" part."
I don't have the answers, but these things need to be deeply considered. I personally also value almost ALL freedoms (save the ones I've mentioned already) more than I value *my* own personal safety, but I can't speak for the six billion other people on this planet.
Here's something that needs to be kept in mind: systems of government are dynamic things (although perhaps slow-moving ones). Who in 16th-century England would have thought that within a few centuries something like democracy would exist anywhere in the world? Something lost (or absent) in the present is not *necessarily* lost (or absent) forever. Not that I favor totalitarianism.
If an existential disaster destroys all life (and matter) in this Universe, then of what value is this idea called "Freedom"?
Best Wishes,
Jeffrey Herrlich
Samantha Atkins <sjatkins at mac.com> wrote:
On May 25, 2006, at 12:17 PM, A B wrote:
> Hi Samantha,
>
> Samantha wrote:
> "It is not a contradiction. Freedom includes the possibility to
> really screw up."
>
> Then do you believe that a post-human should have the right to
> trigger an existential disaster that ends all life within this
> Universe?
The Universe is a damn big place. Short of crashing space-time I
don't see it happening and of course I don't see any way that could
be done. Tell me: if a Being came along so powerful that it *could*
crash space-time, what would be able to monitor and control it that
was itself immune to making the same suicidal error?
Would you want a super-totalitarianism for all posthumans just on the
off chance that one of them might do something really stupid, by
accident or on purpose? Do you want super-totalitarianism, or as
close as we can get to it here and now on Earth, to prevent some evil
genius from, say, cooking up gray goo or a super-plague in the privacy
of his or her basement?
I don't think anything much short of super-totalitarianism, by a
hopefully guaranteed-benevolent government or later by an SAI more
powerful than anyone and anything else that can come along (and kept
that way), will get you full safety. Personally, I value freedom far
more than that level of safety. And I am very cynical about the
"guaranteed benevolent" part.
>
> Samantha:
> "Then you don't play violent video games? At what level of
> complexity of software based characters would you stop playing or
> outlaw the games? The actual questions are much more complex than
> just saying "no suffering allowed" in created realities. As I have
> argued there will be suffering in any reality containing autonomous
> beings. We agree on not inflicting suffering as in torture and so on."
>
> I have indeed played violent video games; however, I was quite
> confident that my computer/game was not conscious and suffering. If
> I ever thought otherwise, I would immediately cease playing it. Any
> software that involves intentionally inflicting pain on conscious
> beings should be outlawed.
Even if the players volunteered to perhaps experience such negative
things?
>
> Samantha:
> "This is a straw man that was not advocated."
>
> I didn't mean to claim that it was advocated, I was making my case.
>
Then perhaps it would be better to address the points of actual
contention instead of going off a bit on things no one really advocates.
>
> Let me ask you a question:
>
> I assume that we agree that a "real" being and a conscious
> "simulated" being are both composed of hardware and software and
> that both exist at the "real" layer of "reality". Why should ending
> the life of a "simulated" being be viewed any differently than a
> "real" being murdering another "real" being? Why should torturing a
> conscious "simulated" being be viewed differently than a "real"
> being torturing another "real" being? The crimes are equivalent.
> The only "factor" that would supposedly separate the status of a
> "real" being from the status of a "simulated" being is that the
> "real" one was born first and therefore supposedly deserves to
> wield ultimate power over the one that was born later. That's
> "messed up"; it's legally allowing murder and torture.
>
At similar levels of complexity, murder is murder. However, creating
a world where murder and other grievous wrongs may occur among the
inhabitants is not itself murder, nor necessarily immoral. That is the
point I have been attempting to get across.
- samantha