[ExI] Watson On Jeopardy

Eugen Leitl eugen at leitl.org
Wed Feb 23 10:57:45 UTC 2011


On Tue, Feb 22, 2011 at 04:26:08PM -0700, Kelly Anderson wrote:

> > *Which* competing platforms? Technologies don't come out of
> > the blue fully formed, they're incubated for decades in
> > R&D pipeline. Everything is photolitho based so far, self-assembly
> > isn't yet even in the crib. TSM is just 2d piled higher and
> > deeper.
> 
> Photo lithography has a number of years left in it. As you say, it can

Not so many more years.

> extend into the third dimension if the heat problem is solved. I have

Photolitho can't extend into the third dimension, because each subsequent
fabbing step degrades the structures underneath. You need a purely additive,
iterable deposition process which doesn't damage the underlying layers.

> seen one solution to the heat problem that impressed the hell out of

Cooling is only part of the problem. There are many easy fixes which are
cumulative when it comes to reducing heat dissipation.

> me, and no doubt there are more out there that I haven't seen. By the
> time they run out of gas on photo lithography, something, be it carbon
> nano tube based, or optical, or something else will come out. A

Completely new technologies do not come out of the blue.
We're about to hit 11 nm http://en.wikipedia.org/wiki/11_nanometer
Still think Moore's law has plenty of wind left?

> company like Intel isn't going to make their very best stuff public
> immediately. You can be sure they and IBM have some great stuff in the

The very best stuff gets shown off in technology demonstrations. These
are very public, for an obvious reason: shareholder value.

> back room. I am not fearful of where the next S curve will come from,

The only good optimism is well-informed optimism. An optimist would have
expected clock doublings and memory bandwidth doublings to match the
structure shrink. They didn't: clocks plateaued around 3-4 GHz while
feature sizes kept shrinking.
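
A back-of-the-envelope sketch of what that expectation would have
meant (my ballpark node sizes and clock figures, for illustration
only, not numbers from this thread):

    # If clocks had tracked the linear feature-size shrink (naive
    # Dennard-style scaling), here's roughly where we'd be.  All
    # figures below are assumptions for illustration.
    feature_2000_nm = 180    # ~the process node around 2000
    feature_2010_nm = 32     # ~the process node around 2010
    clock_2000_ghz  = 1.0    # ballpark desktop clock in 2000

    shrink         = feature_2000_nm / feature_2010_nm  # ~5.6x linear shrink
    expected_clock = clock_2000_ghz * shrink            # ~5.6 GHz if clocks had kept pace
    actual_clock   = 3.5                                 # ballpark shipping clock in 2010

    print(f"shrink {shrink:.1f}x, expected ~{expected_clock:.1f} GHz, actual ~{actual_clock} GHz")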

> except that it might come out of a lab in China, Thor help us all
> then!
> 
> >> > Kelly, do you think that Moore is equivalent to system
> >> > performance? You sure about that?
> >>
> >> No. Software improves as well, so system performance should go up
> >
> > Software degrades, actually. Software bloat about matches the advances
> > in hardware.
> 
> I know what you are talking about. You are stating that Java and C#
> are less efficient than C++ and that is less efficient than C and that

I am saying that people don't bother with algorithms, because "hardware
will be fast enough", or they just pile layer upon layer of external
dependencies, because "storage is cheap, hurr durr".
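
A toy illustration of the difference (my sketch, not from the thread):
the same duplicate-finding task written on the assumption that hardware
will bail you out, versus written with the obvious data structure.

    def duplicates_naive(items):
        # O(n^2): "hardware will be fast enough"
        dupes = []
        for i, a in enumerate(items):
            for b in items[i + 1:]:
                if a == b and a not in dupes:
                    dupes.append(a)
        return dupes

    def duplicates_with_a_set(items):
        # O(n): one pass, remembering what we've already seen
        seen, dupes = set(), set()
        for a in items:
            if a in seen:
                dupes.add(a)
            seen.add(a)
        return sorted(dupes)

    # For a list of 100,000 items the naive version does roughly
    # 5 billion comparisons; no hardware advance papers over that gap.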

Let's face it, 98% of developers are retarded monkeys, and need
their programming license revoked.

> is less efficient than Assembly. In that sense, you are very right. It
> does take new hardware to run the new software systems. The next step
> will probably be to run everything on whole virtual machines OS and
> all, no doubt, not just virtual CPUs...

Virtualization is only good for increasing hardware utilization,
accelerating deployment, and enhancing security by way of
compartmentalization. It doesn't work like Inception: if your hardware
is already saturated, virtualization overhead will degrade performance
further.
 
> That being said, algorithms continue to improve. The new, slower
> paradigms allow programmers to create software with less concern for
> the underlying hardware. I remember the bad old days of dealing with

Yes, software as an ideal gas: it expands to fill whatever hardware
you give it.

> the segmented Intel architecture, switching memory banks and all that
> crap. I for one am glad to be done with it.

Isn't relevant to my point.
 
> But algorithms do improve. Not as fast as hardware, but it does. For
> example, we now have something like 7 or 8 programs playing chess
> above 2800, and I hear at least one of them runs on a cell phone. In

Current smartphones are the equivalent of desktops from about half a
decade ago. An iPhone does about 20 MFlops, a Tegra 2 perhaps 50-60
(and that's JIT-compiled code). This is roughly Cray 1 level of
performance. Of course there are graphics accelerators in there too,
and ARM cores are chronically anemic in the float department.
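
For calibration, a crude way to ballpark sustained floating-point
throughput on whatever you're sitting at (my sketch, assuming NumPy is
available; figures like 20 MFlops for an iPhone come from Linpack-style
benchmarks, not from this toy):

    import time
    import numpy as np

    # A dense n x n matrix multiply costs about 2 * n^3 floating-point
    # ops, so timing one gives a rough sustained-throughput figure.
    n = 1024
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    t0 = time.perf_counter()
    c = a @ b
    dt = time.perf_counter() - t0

    print("~%.1f GFlops sustained on this matmul" % (2 * n**3 / dt / 1e9))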

> 1997, it was a supercomputer. Now, today's cell phones are dandy, but
> they aren't equivalent to a high end 1997 supercomputer, so something

Fritz has been beating the pants off most humans for a long time
now http://en.wikipedia.org/wiki/Fritz_%28chess%29

Chess is a particularly narrow field; look at Go, where progress has
been far less stellar http://en.wikipedia.org/wiki/Go_%28game%29#Computers_and_Go

> else had to change. The algorithms. They continued to improve so that
> a computer with a small fraction of the power available in 1997 can
> now beat a grand master.

There's no Moore's law for software, that's for sure.
 
> > In terms of advanced concepts, why is the second-oldest high
> > level language still unmatched? Why are newer environments
> > inferior to already historic ones?
> 
> Are you speaking with a LISP? I don't think that Eclipse is inferior

Why, yeth. How observanth of you, Thir.

> to the LISP environment I used on HP workstations in the 80s. I think

We're not talking about implementations, but about the power of the
concepts.

> it is far better. I remember waiting for that damn thing to do garbage
> compaction for 2-3 minutes every half hour or so. Good thing I didn't
> drink coffee in those days... could have been very bad. :-)

Irrelevant to my point.
 
> We tend to glorify the things of the past. I very much like playing

Lisp is doing dandy. My point is that no current language or
environment has managed to improve on the second-oldest high-level
language conceptually. Never mind that many human developers are
mentally poorly equipped to deal with such simple things as macros.
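
For the curious, a rough Python approximation (mine) of the
code-as-data trick that Lisp macros give you natively; what takes a
few lines of quasiquotation in Lisp needs the ast module here:

    import ast

    # Treat source code as a data structure, rewrite it, then run the
    # rewritten program -- the essence of what a macro does.
    tree = ast.parse("total = price * quantity")

    class SwapMulForAdd(ast.NodeTransformer):
        # Rewrite every multiplication into an addition before execution.
        def visit_BinOp(self, node):
            self.generic_visit(node)
            if isinstance(node.op, ast.Mult):
                node.op = ast.Add()
            return node

    new_tree = ast.fix_missing_locations(SwapMulForAdd().visit(tree))
    env = {"price": 3, "quantity": 4}
    exec(compile(new_tree, "<macro-demo>", "exec"), env)
    print(env["total"])  # 7 rather than 12: the program itself was rewritten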

> with my NeXT cube, and do so every now and again (It's great when you
> combine Moore's law with Ebay, I could never have afforded that
> machine new.) The nostalgia factor is fantastic. But the NeXT was
> fairly slow even at word processing when you use it now. It was a
> fantastic development environment, only recently equaled again in
> regularity and sophistication.
> 
> Eugen, don't be a software pessimist. We now have two legged walking
> robots, thanks to a combination of software employing feedback and
> better hardware, but mostly better software in this case.

I'm sorry, I'm in the trade. I'm not seeing this progress you
mention.
 
> Picassa does a fairly good job of recognizing faces. I would never
> have predicted that would be a nut cracked in my time.

We were supposed to have human-grade AI twenty years ago. I can
tell you one thing: we won't have human-grade AI by 2030.

-- 
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE


