[extropy-chat] 'a process of non-thinking called faith' 2

BillK pharos at gmail.com
Sun Nov 19 16:49:51 UTC 2006


On 11/19/06, Robert Bradbury wrote:
>
> I'm not sure.  One could cite the war in Iraq as an example.  Why  was it
> not instead a war in the Sudan?  [the question is rhetorical... let's not
> fall into a political rehashing pit.]
>

The war in Iraq was partly to stop Saddam's genocide against the Kurds
and his other killing sprees against anyone he thought of as an enemy.
How many wars against genocide do you want at the same time?


> Because almost all people perceive themselves as quite attached to their
> position and because the nano-santas will generally eliminate classical
> "positions" I expect that many people will become lost and that a lot of bad
> judgements will result.  It is somewhat worse if AGIs develop.
>
> As I've stated before I don't really want to live in a world where I know
> that an AGI is running around climbing the curve at the limits imposed by
> the laws of physics.  It forces me into a position of giving up my
> "position" so as to effectively become equivalent to the AGI (where the
> past me is probably becoming a microfraction of myself at an extremely rapid
> rate) or choose to position myself someplace in the middle of the range from
> luddite human to AGI-at-the-limits (a *very* large range).   You generally
> do *not* have that choice today and so you don't have to deal with that
> problem.  If the choice develops rapidly, as might well be the case, one
> will hardly know what choices will be best.  A Las Vegas "all you can eat"
> is not necessarily a good thing.
>


Certainly, if transhumans just think 'faster', then they won't
necessarily think any 'better' or 'differently' than at present. They
will just reach the same poor conclusions a bit quicker.

Some might suggest that extra processing speed would enable
transhumans to take more factors into account in their decision-making
and thus make better decisions. That may be true, but it won't
actually happen unless some form of mental training in how to think
logically is added as well. Much discussion about transhumans seems to
assume that, by some magic, these superhumans will be really nice and
moral and will care for the poor ordinary humans left behind. It seems
just as likely to me that, given a bit more power, they will be every
bit as brutal and nasty as ordinary humans.

Leap twenty years ahead and we might have transhumans that can watch
the ball game, read a comic, listen to rap music, chat to their
friends, trade on ebay, plagiarise an essay for college work, schedule
social activities, etc. all at the same time.
That's progress for you.

BillK
