[ExI] Slavery in the Future

Gary Miller aiguy at comcast.net
Mon Apr 21 00:59:48 UTC 2008



Stuart LaForge asked:

>> Ok. Let us assume it is possible to program a machine in this fashion,
>> what would be the benefit to us of programming the machines for emotion?
>> What utility is there in burdening our machines with millions of years of
>> our own EP baggage? Is the toast from a toaster that loves me any better
>> than that which comes out of a toaster that is indifferent to me? Think
>> about how annoying garage sales would be with toasters begging us not to
>> sell them.

My Response:

Think about robotic caregivers: robo-nannies, robo-eldercare, and emergency
services like 911 or a poison-control hotline, where human emotions must be
understood and taken into account to calm people down and get them to accept
the help they need.

Robo-sex surrogates for those too busy, too homely, or too socially inept to
succeed at the human courtship dance, providing not only for physical needs
but also helping to develop the self-confidence, empathy, and social skills
necessary to succeed in a real relationship.

Robo-career mentors, and robo-personal trainers perhaps built into the gym
equipment itself, tracking your progress and encouraging you to give it all
you've got, with different personality settings to keep them from becoming
annoying.

And just because we program emotions doesn't mean we have to program the
negative ones like aggression, hatred, jealousy, greed, or lust for power.
People just assume that once we program emotions into an AI, it will get all
our negative ones as well.

That's an incorrect assumption.  Our negative emotions evolved to help us
survive under primitive, hostile conditions.  Our machines will not face that
environment, and I see no reason to burden our creations with the worst of
our biological heritage.

But I am sure the military or some rogue dictator will do so.  I just hope
they don't model it after my ex-wife!
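
To make the point concrete, here is a minimal sketch (in Python) of what a
selective emotional repertoire might look like, assuming a hypothetical
modular design where each emotion is an optional component with a tunable
intensity.  Every name below is illustrative, not an existing API:

    from dataclasses import dataclass, field

    # Hypothetical catalog of emotion modules.  Negative emotions exist in
    # the catalog but are simply never made installable in consumer units.
    POSITIVE = {"empathy", "patience", "encouragement", "warmth"}
    NEGATIVE = {"aggression", "hatred", "jealousy", "greed", "power_lust"}

    @dataclass
    class PersonalityProfile:
        """Tunable settings so the same robot can be upbeat or low-key."""
        name: str
        # Intensity per emotion, from 0.0 (off) to 1.0 (maximum).
        intensities: dict = field(default_factory=dict)

        def set_intensity(self, emotion: str, level: float) -> None:
            if emotion in NEGATIVE:
                # The whole point: negative emotions are not programmable.
                raise ValueError(f"{emotion!r} is not in this repertoire")
            if emotion not in POSITIVE:
                raise ValueError(f"unknown emotion {emotion!r}")
            # Clamp the requested level into the valid range.
            self.intensities[emotion] = max(0.0, min(1.0, level))

    # A gym-trainer robot dialed down so it encourages without nagging.
    trainer = PersonalityProfile(name="low-key trainer")
    trainer.set_intensity("encouragement", 0.4)
    trainer.set_intensity("empathy", 0.8)

The design choice is that exclusion happens at the architecture level rather
than by policy: a consumer robot built this way has no aggression module to
misconfigure in the first place.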
