<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
A long time ago, in a faraway post (actually, <i>Fri 14 Feb
01:32:49 UTC 2025</i>), Stuart LaForge said:<br>
<br>
&gt; We should work toward a future where AI take us to the stars with them, <br>
&gt; because we supply them with the things they clearly lack: initiative, <br>
&gt; creativity, intuition, inspiration, aesthetics, ethics, passion, <br>
&gt; ambition, and will. We can be their muse and their conscience, but to <br>
&gt; get there, we have to avoid many pitfalls. Autonomous killing machines <br>
&gt; being one.<br>
<br>
<br>
Maybe current AIs lack these things, but they also lack
superintelligence, which is where AI is presumably heading.<br>
<br>
I would expect superintelligent AIs to possess these things in
abundance. Does anyone (who's not a carbon chauvinist) have a good
reason why biological machines (us) can possess them, but
non-biological ones can't?<br>
<br>
I have the feeling that 'we supply them with the things they clearly
lack' is just a comfort blanket, to make us feel better about being
superseded.<br>
<br>
I think we really need to rise above this stubborn tendency to think
of future AIs as simply more advanced versions of the kind of
software that we humans write, with all the limitations that current
computers and programs have. If you think of these relatively simple
systems as analogous to our cellular organelles, you can see that a
question like 'how can a computer program ever feel emotions?' is
basically the same as 'how can ion transport channels ever feel
emotions?'. We know they don't. We know that it's complex systems
several levels of organisation up from there that feel emotions:
human brains, built from many interconnected functional modules,
built from information-processing networks, built from many
different kinds of neurons, in vast numbers, each built from
thousands of components, including membranes with ion transport
channels (and I've probably left some levels out).<br>
<br>
In the same way, computer programs don't feel emotions. But complex
systems built up from them, in multiple layers of organisation,
will.<br>
<br>
The best future I can see is not AI taking us with them, but us
becoming non-biological superintelligent beings ourselves, with
their help. That's the future I want to work towards. Then it won't
be AIs going to the stars, taking puny humans along for the ride (if
they feel like bringing their fragile and dumb pets with them), it
will be 'us' going. If we want to.<br>
<pre class="moz-signature" cols="72">--
Ben</pre>
</body>
</html>