[ExI] Machines of Loving Grace
efc at disroot.org
Mon Oct 14 11:08:19 UTC 2024
On Mon, 14 Oct 2024, Ben Zaiboc via extropy-chat wrote:
>
> On 13/10/2024 20:41, Keith Henson wrote:
>
> On Sun, Oct 13, 2024 at 10:18 AM efc--- via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> snip
>
> Why do you think hostility will be a problem when it comes to uploading
> as long as it is a voluntary procedure that will not affect anyone else?
> It depends. If a high fraction of the population uploads, there will
> be a social collapse. At some point, the social system fails and the
> remaining people will move away from ghost towns.
>
>
> One thing nobody seems to consider is the possibility of uploading into a new artificial brain, in control of a synthetic body,
> instead of into a large, shared computing system. That way, there would still be physical people around in the physical world. They'd
> have the advantages of both situations, seeing as their new brains should be easily capable of connecting to computer systems and
> experiencing virtual worlds as well as the real world. Their 'ecological footprint' would probably be smaller than biological humans
> as well.
>
> I could see 'uploading to an android' being a popular option, and a lot less scary for some people than uploading to a server. It
> would also (potentially) solve the tricky problem of who owns and controls the hardware that your mind runs on.
>
> Ben
I considered it in my post. ;) I also recently finished the book Software by
Rudy Rucker, where such themes are discussed.
For me personally, uploading myself into a robot would, at least
initially, be far more interesting and desirable than uploading myself
into some kind of mainframe fantasy land.
Since the mind would be software anyway, I would expect to be able to
"peek into" fantasy land if I so wished (or the reverse).
I could imagine that after a few hundred or a thousand years in the real
world, maybe things would get boring, and the mainframe fantasy land might
be where the story would end.
Hence my previous post about stagnation possibly being a problem in such
scenarios. I think we know too little about our brains at the moment to
address that question fully.