[extropy-chat] Fwd: CNN features amazing user with autism
stathisp at gmail.com
Sun Feb 25 23:11:10 UTC 2007
On 2/26/07, Jef Allbright <jef at jefallbright.net> wrote:
> On 2/25/07, Brian Atkins <brian at posthuman.com> wrote:
> > Stathis Papaioannou wrote:
> > > But if we gain control over our own minds so that we are no longer
> > > slaves to the subgoals and supergoals set by evolution, and the
> > > housekeeping functions are taken care of by a trivial subroutine, then
> > > who is to decide that spending your life blissfully flicking a stream
> > > of water from the tap with your hand because that's what you have
> > > decided to do (not necessarily because that is what you were born or
> > > raised to do) is less worthy than any other activity?
> > >
> > You might enjoy a movie I recently rented: Idiocracy
> Who's to say he wouldn't enjoy any movie as much as any other movie? ;-)
> While it's fashionable and considered by some to be the height of
> morality to argue that all preferences are equally valid, it is
> morally indefensible.
For a start, it's difficult to define morality in any sense come the
singularity. If we live on a distributed computer network as near-gods we
are physically invulnerable and psychologically invulnerable. Physically
invulnerable short of a planet- or solar system- or galaxy-destroying event;
psychologically invulnerable because if we don't like the way we feel, we
can change it. If we suffer it will be because, perversely, we enjoy
suffering. It's not even like someone who is depressed and self-harms, or is
addicted to drugs: they don't really have a choice, but if they could decide
whether or not to be depressed as easily as they could decide between
chocolate or vanilla ice-cream, that would be a different matter.
> It's even more irksome than teleological references to "evolutionary
> goals and subgoals."
Fair enough, evolution doesn't really "want" anything from its creatures.
However, we do have drives, which boil down to optimising the pleasure/pain
equation (broadly construed: the pleasure of sleeping in and not going to
work is outweighed by the pain of explaining my laziness to people and
running out of money, so I decide to go to work), even if these drives do
not end up leading to "adaptive" behaviour. The problem is, although we can
struggle against the drives, which means pushing the pain/pleasure equation
in a certain direction, we can't arbitrarily and without any fuss just
decide to change them. If we understood enough about our minds to transfer
them to computers, and probably well before then, we could do this, and at
that point the human species as we know it would end.