[extropy-chat] Fools building AIs

Eliezer S. Yudkowsky sentience at pobox.com
Fri Oct 6 03:38:13 UTC 2006


Mike Dougherty wrote:
> 
> I do not claim to be a highly skilled rationalist.  So I will only ask a 
> question.
> 
> What if the superior processing power were something similar to the Star 
> Trek Borg?   I know, this is "science [sic] fiction" but the concept is 
> akin to forced upload into a Matrioshka Brain.  Would we collectively be 
> "better off" in terms of escaping the zero-sum life experience?  Each 
> individual in the collective could feasibly believe they were at the top 
> of the hierarchy.   Rather than fighting amongst ourselves over the 
> limited biochemicals contained on this rock called Earth, we could 
> simulate the misery of earthly existence for the nostalgic masochists 
> who refuse to move on.  I know the 'Borg' were depicted as horrible 
> enemies of humanity - but once you get over the 
> serialization-upload-deserialization procedure, "life" as far as the 
> uploaded are concerned could be at least equal, if not infinitely easier.

I don't understand your "what if".  What if what?  What if the above is 
the actual outcome?  (Answer: it's a complex scenario with no specific 
support given, so it's very improbable a priori.)  What about the above 
as an optimal solution from a humane standpoint?  (Answer: it seems 
easier to conceive of third alternatives which are better.)

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


