stathisp at gmail.com
Thu Apr 12 03:56:28 UTC 2007
On 4/12/07, Robert Bradbury <robert.bradbury at gmail.com> wrote:
> I think the assumption that there will be a Singularity "meat grinder" needs
> serious reexamination. We don't run around eliminating all of the nematodes
> or bacteria on the planet just because they are consuming some small
> fraction of energy and/or matter that we at some point may want.
> You have to realize that while there is a vector that some may follow for
> climbing the singularity slope once it goes nearly vertical, there is no
> reason once it tops out that those who selected to not make that choice will
> be turned into hamburger. The difference between a sub-KT-I and a KT-II
> civilization is at least 13 orders of magnitude in terms of power
> consumption. We generally don't interest ourselves in something that is
> going to involve dealing with 0.00000000001% of our resources. Hell we
> rarely pay much attention to anything in the 0.1% to 0.01% range. It
> could well be the case that the solar system as a whole evolves up the slope
> while Earth, Mars and Venus remain meat havens until we get so bored with
> multi-thousand year lifespans that we go off on some dangerous adventure in
> a world ship to a distant "dark" galaxy.
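The power-ratio arithmetic in the quote above can be checked with a quick sketch. The figures are assumptions based on standard Kardashev-scale estimates: present-day humanity consumes very roughly 2e13 W (well sub-KT-I), while a KT-II civilization by definition captures its star's full output, about 4e26 W for the Sun.

```python
import math

# Assumed order-of-magnitude figures (not from the original post):
human_power = 2e13   # watts, rough present-day global power consumption
kt2_power = 4e26     # watts, approximate solar luminosity (KT-II scale)

fraction = human_power / kt2_power
print(f"fraction of KT-II budget: {fraction:.0e}")          # ~5e-14
print(f"as a percentage: {fraction * 100:.0e} %")           # ~5e-12 %
print(f"orders of magnitude: {math.log10(kt2_power / human_power):.1f}")  # ~13.3
```

On these assumptions the gap is indeed about 13 orders of magnitude, and today's entire power consumption would be around 10^-11 percent of a KT-II budget, consistent with the figures quoted.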
On a literal understanding of the goals of the AI at the start of the
singularity, "make yourself more intelligent at any cost" might involve
converting all of the matter and energy in the universe into computronium,
without regard for the consequences to other life forms or the environment.
However, it is fallacious to assume that a super-intelligent AI will have
this (or indeed any other) goal simply because it is intelligent. There
is no necessary connection between intelligence and
motivation even among naturally evolved animals, let alone when you include
*every possible* motivation that could be programmed into an AI.
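The point that motivation is a free parameter, independent of capability, can be illustrated with a toy sketch. Everything here is hypothetical and purely illustrative: the same fixed search procedure (the "intelligence") serves whatever objective function (the "motivation") is plugged into it.

```python
def best_action(actions, utility):
    """Pick the action that maximizes an arbitrary utility function.
    The search procedure is fixed; the goal is a free parameter."""
    return max(actions, key=utility)

actions = [-3, 0, 2, 5]

# Two agents with identical "intelligence" but opposite motivations:
maximizer = best_action(actions, utility=lambda a: a)   # chooses 5
minimizer = best_action(actions, utility=lambda a: -a)  # chooses -3
print(maximizer, minimizer)
```

Nothing about the optimizer itself dictates which utility it ends up serving, which is the sense in which any possible motivation could be programmed into an AI.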
More information about the extropy-chat mailing list