[extropy-chat] Using Graphics controllers (was: Singularity economic tradeoffs)

Dan Clemmensen dgc at cox.net
Sat Apr 17 23:54:22 UTC 2004


Paul Bridger wrote:

>
> Samantha Atkins wrote:
>
>>>
>>> Computers nowadays are not all-purpose; as such, AI takes special
>>> architectures, very unlike what they teach in CS classes. It will
>>> further take AI codes, which are otherwise useless except for
>>> adaptive robotics and gaming.
>>>
>>
>> Interesting, as the gaming world is quite strong and actually often
>> drives significant consumer hardware improvements. I shudder to
>> contemplate what AI characters designed for the standard violent
>> adventure-type games would become if they ran on sufficient hardware
>> and got to the bootstrap threshold. Military battle-sim entities
>> would yield roughly equivalent nightmares.
>>
> While it is certainly true that most game AI today is of the primitive
> kill-the-human sort, this will change as our ability to create real
> intelligence increases. Right now, game programmers can only build
> bots that run around and shoot things, or stupid but quick tacticians
> (for strategy games). If game programmers had the knowledge to build
> real AI today, they wouldn't use it for those same purposes, if only
> because that would be a waste of precious CPU time. The games of the
> future will use real AI for the things real AI is good for:
> interacting with humans across a much broader spectrum of
> possibilities. Still, I suppose that "broad spectrum" will continue
> to include conflict. :(

Another interesting point: from the point of view of raw computational 
power, the sum of all the 3D graphics cards in the world probably 
exceeds the sum of all CPUs in the world. Sure, the graphics cards are 
special-purpose, but what if you had the resources to carefully craft 
algorithms that are well suited to graphics cards? Basically, this 
requires mapping your problem onto the space that graphics hardware 
addresses easily. The extremely obvious problem is graphics rendering 
(DUH!). This has actually been done: you can use graphics cards to run 
the POV-Ray back end, and when you do, a few (10?) machines with 
graphics cards can render scenes that would otherwise take hundreds of 
high-end CPUs.
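
To make this concrete, here is a minimal sketch of the kind of mapping 
I mean, written against NVIDIA's CUDA toolkit (nvcc); the kernel name 
and sizes are purely illustrative. The problem has to be expressed as 
many small, independent, identical computations, one per GPU thread, 
just as the rendering pipeline does one computation per pixel:

    #include <cstdio>
    #include <cuda_runtime.h>

    /* One thread per element. The problem must decompose into many
     * independent, identical computations, exactly the shape the
     * graphics hardware is built for. */
    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        const int n = 1 << 20;          /* 1M elements (illustrative) */
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float)); /* CPU+GPU visible */
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        /* 256 threads per block is a common default; grid covers n. */
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);    /* expect 5.0 */
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

The point is the shape, not the arithmetic: anything that fits this 
one-thread-per-element mold inherits the card's enormous parallelism, 
and anything that doesn't (heavy branching, shared global state) stays 
stuck on the CPU.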

Essentially all modern motherboard "chipsets" include an embedded 3D 
graphics engine, and in most machines this resource sits idle most of 
the time.
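
As a sketch of how a harvesting program might begin (again assuming 
the CUDA runtime; the printed fields are standard cudaDeviceProp 
members), it could first ask each machine what graphics engine it is 
letting sit idle:

    #include <cstdio>
    #include <cuda_runtime.h>

    int main(void)
    {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            printf("no usable graphics engine found\n");
            return 0;
        }
        for (int d = 0; d < count; ++d) {
            struct cudaDeviceProp p;
            cudaGetDeviceProperties(&p, d);
            printf("device %d: %s, %d multiprocessors\n",
                   d, p.name, p.multiProcessorCount);
        }
        return 0;
    }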


