[ExI] stealth singularity

Darren Greer darren.greer3 at gmail.com
Mon Sep 20 18:27:33 UTC 2010


Brent Allsop wrote:


>I believe that benevolence is absolutely proportional to intelligence.



>I believe that intelligence hiding from us is absolutely evil


But isn't it possible that our definitions of what is evil and what is
benevolent might evolve as we gain greater insight through increased
intelligence? Even intelligence defined from a purely rationalist
standpoint?


For example, would it be good or benevolent to withhold the technology for
creating weapons of mass destruction from a culture so certain of its right
to murder others that, in your estimation, it wouldn't hesitate to use them?
Don't we withhold certain information from children regarding sexual
processes and practices until it becomes relevant to them, to spare them
confusion and misapplication of that knowledge?


The problem with trying to figure out how a much greater intelligence would
behave, think, or act is that it might have a far broader, multi-faceted
outlook while at the same time seeing with an intensely nuanced eye.
Benevolence may be a complex and delicate balance of withholding *and*
revealing. Evil might not be defined in absolutes either. Einstein wrote a
letter to Roosevelt urging the development of nuclear weapons to defeat
Hitler; he later sent another letter retracting that position.


Was he evil? Was he not? Should he have withheld? Should he have not?


If there are beings in the galaxy more intelligent than we are, I feel it
would be a mistake to judge them by our standards of what is good and what
is bad. Then again, I'm a dyed-in-the-wool relativist. It's my reactionary
position against religionists, who argue for absolute definitions of good
and evil based on dictums from one god or another.


It's good to be back. I was dropped off the list for a while due to server
issues.


Darren




2010/9/20 Brent Allsop <brent.allsop at canonizer.com>

> Spike,
>
>
>
> I believe that benevolence is absolutely proportional to intelligence.
>
>
>
> I believe that intelligence hiding from us is absolutely evil.
>
>
>
> Therefore, your proposed possibility suffers from the same problem-of-evil
> issue that any other proposed hiding super-intelligent being / god / ET
> suffers from.
>
>
>
> So, I would tend not to even entertain such an unlikely possibility, unless
> there was profound evidence to support it.
>
>
>
> I am an atheist, and that atheism includes not just the hope that there is
> not yet a God, but the hope that there are not yet any intelligent beings
> of any kind hiding from us.
>
>
>
> For if there is any such being, then there is no hope, and we will be
> condemned to the reality that even if we become as powerful as they are, we
> will likely still be unable to overcome the evil of having to hide from
> others, just as they allegedly hide from us.
>
>
>
> Brent Allsop
>
>
>
>
>
>
> 2010/9/20 spike <spike66 at att.net>
>
>>
>> Imagine that an AI emerged and that it concluded that it wanted to stay
>> invisible.
>>
>> What would you do differently, if anything, given the knowledge that an AI
>> had come into existence, but had not uploaded humanity?  Perhaps the
>> evidence was subtle: the total light reflected from the asteroids was
>> decreasing, but not by much, nothing anyone would notice.  That
>> reading suggested that nanobots were taking material from the 'roids.
>>
>> What would you do?
>>
>> spike

