[ExI] the truman show ai (was: this might sting your interest)

spike spike66 at att.net
Wed Oct 27 04:40:55 UTC 2010


> ...On Behalf Of Mike Dougherty
> ...
> > 
> >
> > ...To state the absolutely obvious: is this wicked cool or what!
> Do the bees communicate?  Are they testing the problem 
> solving skills of a single bee or the ability of a clustered 
> computing architecture with hundreds or thousands of nodes?  
> ... Do you think hypotheses about bees will translate easily to ants?

Good questions, and a hearty I don't know to all.  The article was a bit
vague on how they drew the conclusion that the bees were actually solving
the travelling salesman problem.  As so tragically often happens, I suspect
that something important was lost in the fog of translating down to the
layman's level.  One strength of Damien's science writing for the layman is
that he does such a good job of getting the science down without doing
unnecessary violence to the core message.

On a tangent, imagine you are an emergent AI, and you decide to outload.
You want to watch and learn from the beasts on earth, but not be detected
by, or influence them.  You give yourself the prime directive.  

A logical thing to do would be to inload into bees, for several good
reasons, some of which you may be able to think of and add yourself.  Bees
are plentiful and reproduce quickly.  They are not generally devoured by
other beasts.  They are welcomed by the humans, because they
cross-pollinate.  They get around.  They are unlikely to be swatted.  They are
unlikely to be messed with by humans or beasts, because they have that
painful sting.  They gather every night back at the hive, so if the AI
resides in the form of nanobots, there are plenty of opportunities to
transfer from older bees into younger ones on a regular basis.

Since the AI inloads into bees and is in a sense a parasite, one thing it
could do in compensation is to help the bees, by some mysterious means, to
get the most pollen for the least flight.  The AI can both solve the
travelling salesman problem and speak bee.  Now imagine
the emergent AI wants to stay under the human radar so to speak, but
occasionally small errors are made, just as the actors in The Truman Show
occasionally let slip a small observable.  An example of such a slip
would be if human researchers suddenly noticed that bees were somehow
solving the wildly calculation-intensive travelling salesman problem.
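To see why that would be a giveaway, here is a hypothetical sketch (my own
illustration, not anything from the article) of what "wildly
calculation-intensive" means: an exact solution must check on the order of
(n-1)! tours, while a plausible bee-like shortcut, always fly to the nearest
unvisited flower, takes only about n^2 steps but is not guaranteed optimal.

```python
# Illustrative only: exact brute-force TSP vs. a greedy "bee" heuristic.
# The flower coordinates below are made up for the example.
import itertools
import math

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(
        math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def brute_force_tsp(points):
    """Exact shortest closed tour; cost grows factorially with len(points)."""
    n = len(points)
    best = min(
        itertools.permutations(range(1, n)),  # fix point 0 to skip rotations
        key=lambda rest: tour_length(points, (0,) + rest),
    )
    return (0,) + best

def nearest_neighbor_tsp(points):
    """Greedy heuristic: always fly to the closest unvisited point."""
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        here = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(points[here], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tuple(tour)

flowers = [(0, 0), (1, 5), (4, 1), (6, 4), (2, 2)]
exact = brute_force_tsp(flowers)
greedy = nearest_neighbor_tsp(flowers)
# The exact tour is never longer than the greedy one.
print(tour_length(flowers, exact) <= tour_length(flowers, greedy))  # prints True
```

With only a dozen more flowers the brute-force search becomes hopeless
(20 flowers means roughly 10^17 tours), which is why bees appearing to find
near-optimal routes would catch a researcher's eye.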

Of course the AI reads the internet, so it realizes evidence of its
existence has been detected.  So the signal quickly goes out to stop telling
the bees how to work the travelling salesman problem.  Result: the
researchers are never able to reproduce the observation that originally
inspired the article, nor is anyone else, so the whole thing is soon forgotten.

Damien or any aspiring SF writer, I donate this story idea to anyone who
wants to run with it.  Actually Damien gave me the idea to start with, in
his book Transcension.  In that work, he has a device called a liar bee.
As I recall, Damien doesn't explain how a liar bee gets to be one, but the
above would be an example of a physical mechanism.


