[ExI] AI Motivation revisited

G. Livick glivick at sbcglobal.net
Sun Jul 3 08:53:49 UTC 2011


A couple of days back, Stefano made, in his own way, my usual 
clarification when the discussion of AI starts to drift toward the 
spiritual:

Mmhhh. What defeated Kasparov was a program, not a computer. The
computer merely offered sufficient power to choose moves in the time
required.

If we free Kasparov's opponent from tournament rules regarding time,
and we execute this very program on *anything*, including a Chinese
Room or a Turing machine or a Babbage engine or 30,000 slaves playing
logic circuits on a plain, the end result would not change.


For people who tend to see the computer as something more than what it is (a misunderstanding greatly aided by the widespread and deliberate misuse of terms such as artificial "intelligence"), I point out that the basic element of the digital computer, the bit, is perfectly modeled by an empty beer can.  A beer can is bi-stable, and can represent a 1 or a 0 in binary.  Given enough time and space, and enough beer cans, the most fantastic operation of any computer anywhere could be exactly duplicated with beer cans.  (I know exactly how computers work -- I've built them, and I program them -- which I declare sufficient background for making such a claim.)
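The point about bi-stable elements can be made concrete.  Any two-state object suffices as a bit, and all digital computation reduces to flipping such bits according to fixed rules.  As a minimal sketch (the function names here are my own, purely illustrative), here is a one-bit full adder built from nothing but a NAND operation on 0/1 values -- the same construction would work with cans, slaves, or relays:

```python
def nand(a, b):
    """Universal gate: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def xor(a, b):
    # XOR from four NANDs
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s1 = xor(a, b)
    sum_bit = xor(s1, carry_in)
    carry_out = or_(and_(a, b), and_(s1, carry_in))
    return sum_bit, carry_out

# 1 + 1 with no carry in: sum 0, carry 1 (binary "10")
print(full_adder(1, 1, 0))  # -> (0, 1)
```

Chain enough of these adders and you have arithmetic; chain enough arithmetic and you have the chess program that beat Kasparov.  Nothing in the construction cares what the bi-stable element physically is.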


There's always a bunch of talk about computer "intelligence," with some predicting a time when the machines, given the right software, will become actually intelligent.  For me to accept such a concept, it would have to survive this test: envision a warehouse full of beer cans being manipulated according to the same algorithm, with a result convincing enough to leave me believing the warehouse was my intellectual superior.


The digital computer is a tool, in my mind, as the abacus is a tool, and as the fingers are tools (for basic math).  I'm not going to bother logging onto my computer, or getting out my calculator, when I just want to double-check that 4 + 3 = 7; I'll just use my fingers.  On the other hand, if I want to know what 77! is, I'll crank up my computer.  "The right tool for the job" is what my grandpa probably used to say.  The computer is faster at multiplication than I will ever be, and it's a great tool for stuff like that.  Beer cans would work, too, but that would take too long, and get too many of my friends hung over in preparing them for faster execution speed.  There are a lot of other tools around that we use to make our tasks easier, or to extend our own capabilities.  Pliers, can openers, lawn mowers and TV clickers come to mind.  If we improve each of them radically over a number of years, is the time foreseeable when these other tools will become, in fact, equal or superior to humans?
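The 77! example above is exactly where the tool earns its keep -- the product 1 x 2 x ... x 77 runs to more than a hundred digits, far past fingers or beer cans.  A minimal sketch using Python's standard library:

```python
import math

# The right tool for the job: 77! is a ~114-digit number,
# hopeless by hand but instant for the machine.
result = math.factorial(77)
print(result)
print(len(str(result)), "digits")

# Fingers remain perfectly adequate for the small stuff.
print(4 + 3)  # -> 7
```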

FutureMan






On 7/2/2011 7:42 AM, john clark wrote:
>  On Fri, 7/1/11, Mike Dougherty <msd001 at gmail.com> wrote:
>
>
>     "So what?  If google decided to make AGI the relevant keyword tomorrow
>     your "metric" would be inverted."
>
> If that were to happen and so many people knew what the term meant 
> that Steven Spielberg made a multimillion dollar blockbuster movie 
> called "AGI", then the dilettantes would dream up yet another obscure 
> term that they were certain was unknown by most because, as I've said 
> before, when your ideas are shallow clarity of expression is not your 
> friend.
>
>     You don't like the term "AGI"?
>
>   No.
>
>     Don't use it.
>
> I don't.
>
>     If context requires a disambiguation between narrow AI, wide AI,
>     some other kind of run-of-the-mill AI and the super-special kind
>     of AI [...]
>
> So you want us to believe that if you were to say something like "AI 
> will likely bring on the extinction of biological human beings and 
> lead to computers able to engineer the universe" you worry that people 
> will misunderstand you and think you are referring to "run-of-the-mill 
> AI", so purely in the interests of clarity you will use "AGI" instead, 
> a recently made up term that virtually nobody is familiar with. I flat 
> out don't believe you.
>
>  John K Clark
>
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat


