[ExI] Organizations to "Speed Up" Creation of AGI?
Anders Sandberg
anders at aleph.se
Mon Dec 26 09:56:52 UTC 2011
On 2011-12-26 00:46, Stefano Vaj wrote:
> A disaster for whom? The human beings currently alive? Their
> offspring? Our species? Our country/race/persuasion? Our clade? Our
> personal genes? Our "children of mind"? Our culture? Our personal
> identity?
The AGI threat we at FHI have been concerned with is when we end up with
"children of the mind" that do not have worthwhile lives - either
because they lack some important components or because they are driven
by broken motivations that prevent them from ever reaching any
axiological potential (like the "paperclip maximizer"). Humanity
getting forcibly replaced by brilliant entities might be bad for us and
for diversity, but it is not quite as bad as the other two cases.
> Is "survival" anyway the only metric by which we should take our
> decisions, or are there other competing values and priorities?
>
> And what does survival mean in the first place and how should we weigh
> a 5% chance of lasting for 1000 time units vs a 50% chance of lasting
> for 100 time units?
This depends on your time discounting and theory of value. I would say
these two cases have the same value, since I don't think time
discounting makes sense here and I take the value of time lived to be
just additive. But I think I could get some people in my office coming
down on either side.
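To make the arithmetic concrete, here is a minimal Python sketch of
that additive, undiscounted valuation (the function name and numbers
are purely illustrative):

  def expected_time(p_survive, duration):
      # Expected time units lived, valuing each unit equally
      # and applying no time discounting.
      return p_survive * duration

  print(expected_time(0.05, 1000))  # 5% chance of 1000 units -> 50.0
  print(expected_time(0.50, 100))   # 50% chance of 100 units -> 50.0

Both lotteries come out at 50 expected time units, which is why I call
them equal in value.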
Another fun complication is that if you have a high chance of surviving
the next 100 units of time, then you might be able to improve your
situation so that your survival chances or length of survival increase,
and in that case one should choose the more certain survival. And so on.
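A toy two-stage model of that complication (the improvement numbers are
just assumptions for illustration):

  # If you survive the first 100 units (probability 0.5), assume you
  # then get a 0.5 chance to secure a further 900 units of survival.
  p_first, t_first = 0.5, 100
  p_improve, t_extra = 0.5, 900

  certain_option = p_first * (t_first + p_improve * t_extra)  # 275.0
  long_shot = 0.05 * 1000                                     # 50.0

  print(certain_option, long_shot)

Once the option of improving your situation after the first 100 units
is included, the more certain survival clearly dominates.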
--
Anders Sandberg
Future of Humanity Institute
Oxford University