[extropy-chat] limits of computer feeling

Stathis Papaioannou stathisp at gmail.com
Mon Mar 12 09:58:03 UTC 2007

On 3/12/07, Russell Wallace <russell.wallace at gmail.com> wrote:
> On 3/12/07, Robert Picone <rpicone at gmail.com> wrote:
> > Regardless of what condition a human may be in, moments of relative
> > weakness happen, and these are probably more common and more tempting than
> > the moments of relative strength.  I'd say your solution brings about more
> > problems than it solves, even ignoring the results of making minor mistakes.
> >
> >
> If I had source level write access to myself, the first thing I'd do would
> be to put a 7-day lock on it: any change I try to make goes into a pipeline
> with a 7 day delay and the option to cancel at any point during that time.
> If self-modification is ever legalized in any society even slightly similar
> to ours, I imagine there'll be some such safeguards on it.

It's not really that different now, is it? People impulsively make all sorts
of bad decisions. At least with self-modification you will likely choose a
more salubrious goal. How often have you thought, "gee, I wish I were
suicidal/ a smack addict/ a serial killer"? Even the basest types generally
pay lip service to noble supergoals, and if the effort towards achieving
these supergoals can easily be made more rewarding than doing nothing or
doing something destructive, why would anyone choose doing nothing
or doing something destructive? There are no guarantees where free agents
are concerned, but I feel that in general a world where everyone has chosen
what sort of person they are will be a more moral and more productive world
than the present one.

Stathis Papaioannou