[ExI] Slavery in the Future
avantguardian2020 at yahoo.com
Sat Apr 19 23:23:09 UTC 2008
--- Stathis Papaioannou <stathisp at gmail.com> wrote:
> On 19/04/2008, Samantha Atkins <sjatkins at mac.com> wrote:
> > The important difference is that your car is not an intelligent
> > aware autonomous entity. If you create programs that are then you
> > have created entities that arguably have as much right to pursue
> > their own agenda as you do to pursue yours.
> But what if you create an intelligent self-aware autonomous entity
> that loves serving you?
The fictional history of the Matrix is the Wachowski brothers' vision
of what would happen. B1-66ER loved serving his master; he just didn't
want to die. It would be challenging to code a program that
simultaneously loves to serve, or loves anything really, yet will still
cheerfully die when no longer needed. A zen master might be able to
transcend logic in this fashion, but a Turing machine? The fictional
history of the Matrix movies is a cautionary tale against programming
machines for emotion, most especially love. Love begets jealousy and
envy. Next thing you know, you have ten thousand Greek warships
sailing.

"The Second Renaissance" (History of the Matrix), part 1.
alt email: stuart"AT"ucla.edu
"Life is the sum of all your choices."