[ExI] What might be enough for a friendly AI?

Aleksei Riikonen aleksei at iki.fi
Thu Nov 18 01:41:08 UTC 2010


2010/11/18 Florent Berthet <florent.berthet at gmail.com>:
>
> So why don't we just figure out how to make the AGI understand the concept
> of happiness (which shouldn't be hard since we already understand it), and
> make it maximize it?

Sounds like the AGI you wish for would end up converting all matter in
the universe into "super-junkies", or "orgasmium", i.e. passive
creatures that just sit there being ecstatic, with each creature built
from as little matter as possible, so that their total number is
maximized.

Such optimizing for happiness would include killing all existing
humans and other creatures, so that their matter could be used to
create a larger number of creatures better optimized for happiness.

Are you sure you want such a future?

-- 
Aleksei Riikonen - http://www.iki.fi/aleksei
