[ExI] Watson on NOVA

Eugen Leitl eugen at leitl.org
Thu Feb 17 07:58:45 UTC 2011


On Thu, Feb 17, 2011 at 12:43:18AM -0700, Kelly Anderson wrote:
> On Tue, Feb 15, 2011 at 9:58 AM, spike <spike66 at att.net> wrote:
> > Ja, but when you say "research" in reference to AI, keep in mind the actual
> > goal isn't the creation of AGI, but rather the creation of AGI that doesn't
> > kill us.
> 
> Why is that the goal? As extropians, isn't the idea to reduce entropy?

Right, that would be a great friendliness metric. 

> Humans may be more prone to entropy than some higher life form. In

Right, let's do away with lower life forms. Minimize entropy. 

> that case, shouldn't we strive to evolve to that higher form and let

Why evolve? Exterminate lower life forms. Minimize entropy. Much more
efficient.

> go of our physical natures? If our cognitive patterns are preserved,

Cognitive patterns irrelevant. Maximize extropy. Exterminate humans.

> and enhanced, we have achieved a level of immortality, and perhaps
> become AGIs ourselves. That MIGHT be a good thing. Then again, it
> might not be a good thing. I just don't see your above statement as
> being self-evident upon further reflection.

Reflection irrelevant. You will be exterminated. 

-- 
Eugen* Leitl leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE


