[ExI] SXSW - The SINGULARITY: Humanity's Huge Techno Challenge

Tomaz Kristan protokol2020 at gmail.com
Thu Mar 17 20:30:43 UTC 2011


I like those A-I. But I would add at least J = "imported Singularity from the
aliens". Not a very likely scenario, quite the contrary! But possible
nonetheless.

On Thu, Mar 17, 2011 at 6:45 PM, Anders Sandberg <anders at aleph.se> wrote:

> On 2011-03-17 18:05, Kelly Anderson wrote:
>
>>> 2011/3/13 Samantha Atkins <sjatkins at mac.com>:
>>
>>> On Mar 13, 2011, at 1:12 PM, Natasha Vita-More wrote:
>>>
>>> This tosses in a not so helpful term, Singularity.  There are at least
>>> three major notions of what it means, as you are quite aware.  And it
>>> brings up a lot of unhelpful hopes and fears.
>>>
>>
>> Samantha, humor me, as I'm new here. What are these three major notions?
>> My view of the singularity is that runaway machine intelligence
>> designs next-generation machine intelligence until humanity is left in
>> the dust. Is there something more to it?
>>
>
> I think Samantha referred to 1) accelerating change, 2) a prediction
> horizon and 3) intelligence explosion leading to superintelligence.
>
> But there are at least ~9 notions floating around. See
> http://agi-conf.org/2010/wp-content/uploads/2009/06/agi10singmodels2.pdf
>
>
> --
> Anders Sandberg
> Future of Humanity Institute
> Oxford University
>
>

