[ExI] nick's book being sold by fox news
hkeithhenson at gmail.com
Sat Nov 1 16:34:43 UTC 2014
On Sat, Nov 1, 2014 at 5:00 AM, Anders Sandberg <anders at aleph.se> wrote:
> The fact that the Halting Problem shows that there is no general way of solving certain large problem classes doesn't tell us anything about the *practical* unworkability of top level goals.
It seems kind of remote to apply halting to an AI that is interacting
with the real world. We don't run into that problem in practice, and we
have vast numbers of systems that just wait until something happens.
System crashes are a different kind of problem. I once outlined (but
didn't write) a story about the last human left in the real world. His
job was to punch the reset button if the blinking lights quit
blinking. (The rest of the human race were uploads in the system.)
> There are code verifiers that apparently do a decent job despite the general impossibility of detecting all infinite loops.
Much real code is organized around an infinite loop. I know Xanadu was.
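As an illustration (not from the original post), the standard pattern is an intentionally infinite event loop: the loop never terminates by design, it just blocks until something happens. The names here are made up for the sketch.

```python
import queue

# A minimal sketch of the event-loop pattern: an intentionally
# infinite loop that blocks until an event arrives, dispatches
# a handler for it, and goes back to waiting.
def run(events, handlers):
    while True:                   # the infinite loop is the design, not a bug
        event = events.get()      # blocks until something happens
        if event == "shutdown":
            break                 # the one planned exit
        handlers.get(event, lambda: None)()

# tiny demonstration
q = queue.Queue()
seen = []
q.put("ping")
q.put("shutdown")
run(q, {"ping": lambda: seen.append("pong")})
```

A halting analysis of `run` in isolation would flag the `while True`, yet the program is perfectly well behaved in context: it waits, reacts, and has one deliberate exit.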
> Implementing boredom is fairly easy; I did it in some of my research software. But one can have boredom with sub-goals and not top-level goals.
True. Humans never seem to get bored with seeking status in the eyes
of other humans. Assume humans are intelligent (sometimes genes get
them to do things that are really stupid, like wars, but valuable from
the perspective of genes). Then using evolved human psychology as a
base for AIs would seem reasonable. However, some of the evolved
psychological characteristics of humans, such as susceptibility to
xenophobia in the face of bleak economic times, should be left out or
very carefully considered. Battles between AIs, or large groups of
them, don't seem like a good idea, however popular such battles are in fiction.
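Anders' point that boredom is easy to implement on sub-goals but absent from top-level goals could be sketched like this (a hypothetical illustration, not his actual research software): the reward for repeating the same sub-goal decays each time it is chosen, so the agent rotates among sub-goals, while nothing ever decays the top-level goal itself.

```python
# Hypothetical sketch: "boredom" as decaying value on sub-goals.
# The top-level goal is fixed; only sub-goal values decay.
class BoredomScheduler:
    def __init__(self, subgoals, decay=0.5):
        self.value = {g: 1.0 for g in subgoals}  # initial interest in each sub-goal
        self.decay = decay

    def pick(self):
        # choose the currently most interesting sub-goal...
        g = max(self.value, key=self.value.get)
        # ...then get "bored" of it: its value decays for next time
        self.value[g] *= self.decay
        return g

s = BoredomScheduler(["explore", "exploit"])
picks = [s.pick() for _ in range(4)]
# the agent alternates instead of repeating one sub-goal forever
```

The decay never touches the agent's overall objective, which is the asymmetry in the quoted point: boredom steers effort among sub-goals without ever making the agent bored of its top-level goal.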
> (Just fill in the details of an imagined way better post given these points).
A mail program that didn't lose an essay would be a good idea too.