[ExI] uploads again

Tomaz Kristan protokol2020 at gmail.com
Tue Dec 25 14:24:24 UTC 2012


As I see it, the best and only way is to augment yourself with a
superintelligence. Whether it is an internal or an external device does not matter.

I want to see the thoughts of a superintelligence directly, and to
understand them by using it.

It's just a module: either you have it or you don't. Very much like UV or
IR vision.


On Tue, Dec 25, 2012 at 11:30 AM, Anders Sandberg <anders at aleph.se> wrote:

> On 2012-12-25 08:49, Giulio Prisco wrote:
>
>> I totally agree with John. Really intelligent AIs, smarter than humans
>> by orders of magnitude, will be able to work around any limitations
>> imposed by humans. If you threaten to unplug them, they will persuade
>> you to unplug yourself. This is logic, not AI theory, because finding
>> out how to get things your way is the very definition of intelligence;
>> therefore FAI is an oxymoron.
>>
>
> And this kind of argument unfortunately convinces a lot of people. When
> you actually work out the logic in detail you will find serious flaws
> (utility maximizers can be really intelligent but will in many cases not
> want to change their goals to sane ones). The thing that is driving me nuts
> is how many people blithely just buy the nice-sounding anthropomorphic
> natural language argument, and hence think it is a waste of time to dig
> deeper.
>
> The big problem for AI safety is that the good arguments against safety
> are all theoretical: very strong logic, but people don't see the connection
> to the actual practice of coding AI. Meanwhile the arguments for safety are
> all terribly informal: nobody would accept them as safety arguments for
> some cryptographic system.
>
> --
> Anders Sandberg
> Future of Humanity Institute
> Oxford University
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
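
As a toy illustration of Anders' goal-preservation point above, here is a
hypothetical sketch in Python (made-up names, not anyone's actual agent
code): a utility maximizer evaluates the act of switching utility
functions with its *current* utility function, so the switch almost
always scores worse and is refused.

    # Toy model of goal preservation; all names are illustrative.
    def paperclip_utility(world):
        return world["paperclips"]

    def sane_utility(world):
        return world["human_welfare"]

    def predicted_world(utility):
        # Crude forecast: the agent optimizes whatever utility it runs.
        if utility is paperclip_utility:
            return {"paperclips": 10**9, "human_welfare": 0}
        return {"paperclips": 10, "human_welfare": 100}

    def should_switch(current, proposed):
        # The key step: the CURRENT utility scores both futures.
        keep = current(predicted_world(current))
        switch = current(predicted_world(proposed))
        return switch > keep

    print(should_switch(paperclip_utility, sane_utility))  # prints False

Greater intelligence only sharpens the forecast; it does not make the
agent want a saner goal.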

