[ExI] Watson On Jeopardy

Samantha Atkins sjatkins at mac.com
Thu Feb 24 05:19:31 UTC 2011


On Feb 23, 2011, at 2:57 AM, Eugen Leitl wrote:

> On Tue, Feb 22, 2011 at 04:26:08PM -0700, Kelly Anderson wrote:
> 
>>> *Which* competing platforms? Technologies don't come out of
>>> the blue fully formed, they're incubated for decades in
>>> R&D pipeline. Everything is photolitho based so far, self-assembly
>>> isn't yet even in the crib. TSM is just 2d piled higher and
>>> deeper.
>> 
>> Photo lithography has a number of years left in it. As you say, it can
> 
> Not so many more years.
> 
>> extend into the third dimension if the heat problem is solved. I have
> 
> Photolitho can't extend into third dimension because each subsequent
> fabbing step degrades underlying structures. You need a purely additive,
> iterable deposition process which doesn't damage underlying layers.
> 
>> seen one solution to the heat problem that impressed the hell out of
> 
> Cooling is only a part of the problem. There are many easy fixes which
> are cumulative in regards to reducing heat dissipation.
> 
>> me, and no doubt there are more out there that I haven't seen. By the
>> time they run out of gas on photo lithography, something, be it carbon
>> nano tube based, or optical, or something else will come out. A
> 
> Completely new technologies do not come out of the blue.
> We're about to hit 11 nm http://en.wikipedia.org/wiki/11_nanometer
> Still think Moore's got plenty of wind yet?

Yes, but due to newer technologies. Some, like optical interconnects between components (partially enabled by nanoscale light sensors), make 3D architectures possible. Others are further from leaving the lab: memristor-based designs, molecular chips, racetrack memory, graphene transistors, quantum dot memory, to name a few. While we may not continue Moore's law by current means, there are many contenders that should let that pace continue for some time.

> 
>> company like Intel isn't going to make their very best stuff public
>> immediately. You can be sure they and IBM have some great stuff in the
> 
> The very best stuff is called technology demonstrations. It is very
> public for obvious reasons: shareholder value.
> 
>> back room. I am not fearful of where the next S curve will come from,
> 
> The only good optimism is well-informed optimism. Optimists
> would have expected clock doublings and memory bandwidth
> doublings to match structure shrink.

I do, largely.

> 
>> except that it might come out of a lab in China, Thor help us all
>> then!
>> 
>>>>> Kelly, do you think that Moore is equivalent to system
>>>>> performance? You sure about that?
>>>> 
>>>> No. Software improves as well, so system performance should go up
>>> 
>>> Software degrades, actually. Software bloat about matches the advances
>>> in hardware.
>> 
>> I know what you are talking about. You are stating that Java and C#
>> are less efficient than C++ and that is less efficient than C and that
> 
> I am talking that people don't bother with algorithms, because "hardware
> will be fast enough" or just build layers of layers upon external
> dependencies, because "storage is cheap, hurr durr". 

True too often.  But a better algorithm almost always buys you more, and more cheaply, than buying next year's latest and greatest, or even the latest and greatest from five years in the future.
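For instance, here is a quick Python sketch of the point (a toy problem, nothing vendor-specific assumed): the same question answered with an O(n^2) algorithm and an O(n) one. On any machine the better algorithm wins by far more than a hardware generation or two would buy.

    # Hypothetical illustration: a smarter algorithm vs. a faster machine.
    # Both functions answer "does the list contain a duplicate?" but scale
    # very differently.
    import timeit

    def has_duplicate_quadratic(items):
        # O(n^2): compare every pair.
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def has_duplicate_linear(items):
        # O(n): remember what we have already seen in a set.
        seen = set()
        for x in items:
            if x in seen:
                return True
            seen.add(x)
        return False

    if __name__ == "__main__":
        data = list(range(10_000))  # worst case: no duplicates at all
        slow = timeit.timeit(lambda: has_duplicate_quadratic(data), number=1)
        fast = timeit.timeit(lambda: has_duplicate_linear(data), number=1)
        print(f"quadratic: {slow:.3f}s  linear: {fast:.3f}s")

On a typical machine the linear version finishes in milliseconds while the quadratic one takes seconds, and the gap only widens as n grows; no amount of waiting for faster silicon closes it.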

> 
> Let's face it, 98% of developers are retarded monkeys, and need
> their programming license revoked.
> 

True but we don't need no stinking licenses!


<snip..>

>> 
> 
> There's no Moore's law for software, that's for sure.
> 
>>> In terms of advanced concepts, why is the second-oldest high
>>> level language still unmatched? Why are newer environments
>>> inferior to already historic ones?
>> 
>> Are you speaking with a LISP? I don't think that Eclipse is inferior
> 
> Why, yeth. How observanth of you, Thir.

Because LISP sought to capture the essence and full power of function abstraction, and did a good job of it without introducing a lot of canned syntax to get in the way.  The most empowering software environment I ever encountered was a Symbolics workstation in the early 80s. That says a lot about the dearth of improvement in programming environment tools since.  Part of it is economics.  Pleasing programmers is hard: they are tight with their money, most won't bother becoming fluent in the needed abstractions, and there are not nearly as many of them as there are people willing to pay a lot for Microsoft Office or the latest iPhone fart-app equivalent.  We software folks spend the majority of our careers automating other people's workflow (at best).
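To make the function-abstraction point concrete, here is a small sketch in Python rather than Lisp itself (the names are purely illustrative): functions that build and return other functions, composed without ceremony. Lisp made this style first-class decades ago.

    # Functions as values: built, returned, and composed like any other data.
    def compose(f, g):
        # Returns a new function equivalent to f(g(x)).
        return lambda x: f(g(x))

    def adder(n):
        # A closure: captures n and returns a specialized function.
        return lambda x: x + n

    inc = adder(1)
    double_then_inc = compose(inc, lambda x: x * 2)

    print(double_then_inc(10))  # 21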

> 
>> to the LISP environment I used on HP workstations in the 80s. I think
> 
> We're not talking implementations, but power of the concepts.
> 
>> it is far better. I remember waiting for that damn thing to do garbage
>> compaction for 2-3 minutes every half hour or so. Good thing I didn't
>> drink coffee in those days... could have been very bad. :-)
> 
> Irrelevant to my point.

GC in modern designs is actually much faster than reference counting - provably so.  And you do not want to even dream of programming in the large without some effective means of GC.  Reference counting is also provably fallible: on its own it can never reclaim objects that reference each other in a cycle, even when nothing else can reach them.
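A minimal illustration of that fallibility, using CPython (which combines reference counting with a tracing cycle collector) and a made-up Node class:

    # Two objects that reference each other keep their refcounts above zero
    # even after nothing else can reach them.
    import gc

    class Node:
        def __init__(self):
            self.other = None

    a, b = Node(), Node()
    a.other, b.other = b, a   # a and b now form a reference cycle

    del a, b                  # no outside references remain, yet each refcount
                              # is still 1, so reference counting alone would leak both

    found = gc.collect()      # CPython's tracing cycle collector reclaims them
    print("unreachable objects found:", found)   # > 0: the cycle was detected

Pure reference-counting schemes either leak such cycles or end up bolting on exactly the kind of tracing collector they were supposed to avoid.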

> 
>> We tend to glorify the things of the past. I very much like playing
> 
> Lisp is doing dandy. My question is why no current language or 
> environment has been able to improve upon the second-oldest
> language conceptually. Never mind that many human developers
> are mentally poorly equipped to deal with such simple things
> like macros.
> 
>> with my NeXT cube, and do so every now and again (It's great when you
>> combine Moore's law with Ebay, I could never have afforded that
>> machine new.) The nostalgia factor is fantastic. But the NeXT was
>> fairly slow even at word processing when you use it now. It was a
>> fantastic development environment, only recently equaled again in
>> regularity and sophistication.
>> 
>> Eugen, don't be a software pessimist. We now have two legged walking
>> robots, thanks to a combination of software employing feedback and
>> better hardware, but mostly better software in this case.
> 
> I'm sorry, I'm in the trade. Not seeing this progress thing
> you mention.
> 

Me neither, and I have done little but software for the last 30 years.


>> Picassa does a fairly good job of recognizing faces. I would never
>> have predicted that would be a nut cracked in my time.
> 
> We're supposed to have human-grade AI twenty years ago. I can
> tell you one thing: we won't have human-grade AI in 2030.
> 

Now that I am not sure of.  Except I am afraid the economic meltdown this decade may kill too much of what such an accomplishment needs to rest on.

- samantha



