[ExI] The second step towards immortality

Kelly Anderson kellycoinguy at gmail.com
Fri Jan 10 17:20:10 UTC 2014

On Thu, Jan 9, 2014 at 5:56 PM, Adrian Tymes <atymes at gmail.com> wrote:

> On Jan 9, 2014 4:12 PM, "Kelly Anderson" <kellycoinguy at gmail.com> wrote:
> > You can't preprogram enough to do what humans do.
> Agreed, assuming that stuff learned and (more importantly) ways to react
> that are learned after the initial setup do not count as preprogrammed,
> even if the mechanism by which they were learned was.
Those are not preprogrammed except in the useless sense that the whole
universe was preprogrammed from the big bang, which it sort of was. That
being said, the uncertainty principle makes even that a somewhat useless
claim.

> That said, it is possible to claim that all reactions were preprogrammed,
> that the data for how to react was in fact inside the
> (person/computer/whatever) all along.  Some people have done this, often in
> individual attempts to inspire ("See?  You knew how to do it!") or to
> dehumanize ("They aren't people; they're just machines that look like
> people.").
There is also the more subtle approach of Sam Harris in "Free Will": that
while we don't have a choice and are essentially automatons, we are also
not predictable in any real way, because we can't predict the inputs and
chaos always plays its part. And this is just about as good as having free
will, even if it isn't exactly the same philosophically.
