[extropy-chat] 'a process of non-thinking called faith' 2
Robert Bradbury
robert.bradbury at gmail.com
Sun Nov 19 14:47:04 UTC 2006
On 11/18/06, ben <benboc at lineone.net> wrote:
>
> "Robert Bradbury" <robert.bradbury at gmail.com> said:
>
> > If we cannot make good judgements now, how can we be expected to make
> > them in the future?
>
> I thought this was a rather odd thing for a transhumanist to say.
> Don't you expect your ability to make good judgements to improve in the
> future?
I'm not sure. One could cite the war in Iraq as an example. Why was it
not instead a war in the Sudan? [The question is rhetorical... let's not
fall into a political rehashing pit.]
Because almost all people perceive themselves as quite attached to their
position, and because the nano-santas will generally eliminate classical
"positions", I expect that many people will become lost and that a lot of bad
judgements will result. It is somewhat worse if AGIs develop.
As I've stated before, I don't really want to live in a world where I know
that an AGI is running around climbing the curve at the limits imposed by
the laws of physics. It forces me either to give up my "position" so as to
effectively become equivalent to the AGI (where the past me is probably
becoming a microfraction of myself at an extremely rapid rate), or to
position myself someplace in the middle of the range from luddite human to
AGI-at-the-limits (a *very* large range). You generally do *not* have that
choice today, and so you don't have to deal with that problem. If the
choice develops rapidly, as might well be the case, one will hardly know
which choices will be best. A Las Vegas "all you can eat" buffet is not
necessarily a good thing.
Robert