[ExI] Giving up autonomy for cryogenic suspension

Mike Dougherty msd001 at gmail.com
Sun Nov 22 17:34:49 UTC 2020


I hope the host platform for reanimated mindfiles has an opt-out... but how
do we deal with the "turn it off and back on again" scenario for
consciousness?

Every reboot re-establishes the boundaries of what you can tolerate and what
you cannot. As long as the existential torment stays below your "nope, I'm
out" threshold, you will endure. How does that compare to your current
biological hosting platform?


I do expect that the "cost" proposed as a barrier to sadism is actually
negligible for far-future potential sadists. I imagine that "we'll make
great pets" was profoundly prophetic. Think Tamagotchi, but instead of a
cheap plastic-encased computing device running primitive code, it might be a
cheap virtual sandbox running primitive mindfiles. Whether it's a toy for
children's amusement or a bit of decor like a piece of artwork or a
houseplant... we probably shouldn't worry about it so much, because there's
really no evidence that we're not already in an existential windowsill
flowerpot. meh.

