[ExI] nick's book being sold by fox news

Anders Sandberg anders at aleph.se
Mon Oct 27 19:53:14 UTC 2014

Kelly Anderson <kellycoinguy at gmail.com>, 27/10/2014 8:29 PM:
On Mon, Oct 27, 2014 at 1:12 PM, BillK <pharos at gmail.com> wrote:

 It's already happening. We will soon be surrounded by AI in everything we touch.

I can't argue with that, but we have not yet achieved anything that even feels remotely dangerous, except perhaps if it all gets blown away in a solar storm or something like that. That is, for now it is almost all positive (unless you don't like Internet porn, jihadists communicating over Twitter, or something along those lines). But in the future, as AI reaches general intelligence and starts forming its own goals, that's when I get worried. Yes, I might get turned down for cashing a check at Walmart because the AI says the check smells funny, but that's not a super huge negative impact.
What about this model: AI embedded in our infrastructure allows automated, distributed monitoring and planning on behalf of whoever owns the AI systems. As the systems get more powerful, their owners' ability to get what they want increases. At least some of those owners do not want competitors (they might be fine with other AI powers, as long as they can pull the plug on them), so they have an incentive to subvert each other or to develop better AI. This leads to a centralization of power into fewer and fewer agencies, racing to become the sole agency that sets the rules.

Note that these agencies do not have to be evil; the competition/race itself is likely to cause harm. And centralizing the ability to direct where the world is going into small, unaccountable groups whose power derives from technological capability rather than civic legitimacy is not a good thing.
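The feedback loop described above (more power buys better AI, which yields more power) can be illustrated with a toy simulation. This is a hypothetical sketch, not anything from the original post: agency count, growth rates, and the feedback coefficient are all invented assumptions, and concentration is measured with a Herfindahl-style index of power shares.

```python
import random

def simulate(num_agencies=10, rounds=60, seed=0):
    """Toy model: each agency's power compounds at a noisy rate, and an
    agency's growth also scales with its current share of total power
    (better AI -> more power -> better AI), so small head starts
    tend to snowball into concentration. All parameters are illustrative."""
    rng = random.Random(seed)
    power = [1.0] * num_agencies
    for _ in range(rounds):
        total = sum(power)
        # uniform(0, 0.3): random luck/skill; 0.5 * share: the
        # power-begets-power feedback term from the scenario.
        power = [p * (1.0 + rng.uniform(0.0, 0.3) + 0.5 * p / total)
                 for p in power]
    return power

def herfindahl(power):
    """Sum of squared power shares: 1/n when power is evenly spread,
    approaching 1.0 as one agency dominates."""
    total = sum(power)
    return sum((p / total) ** 2 for p in power)

late = simulate(rounds=60)
print("concentration after 60 rounds:", herfindahl(late))
```

Under these assumptions the index tends to drift upward from its even-split floor of 1/n, mirroring the centralization dynamic: no agency needs to be malicious for the race to end with very few hands on the controls.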
Note that this scenario is based on a particular kind of scaling of power. It might be that AI instead diffuses power. That means more and more people can automate whatever processes they want. I think one can easily see risks there too.

Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University