[ExI] Watson on NOVA
lubkin at unreasonable.com
Tue Feb 15 20:13:18 UTC 2011
Richard Loosemore wrote:
>In fact, the AGI would be designed to feel empathy *with* the human
>species. It would feel itself to be one of us. According to your
>logic, then, it would design its children to do the same. That
>leads to a revised conclusion (if we do nothing more than stick to
>the simple logic here): the AGI and all its descendants will have
>the same, stable, empathic motivations. Nowhere along the line will
>any of them feel inclined to create something dangerous.
I'm as strong a technophilic extropian as any, but I'm leery of
bet-your-species confidence. Yes, pursue AGI, MNT, SETI, genemod. But
take adequate precautions.
I'm still pissed at Sagan for his hubris in sending a message to the
stars without asking the rest of us first, in blithe certainty that
"of course" any recipient would have evolved beyond aggression and xenophobia.