[ExI] Watson On Jeopardy
kellycoinguy at gmail.com
Tue Feb 22 23:26:08 UTC 2011
On Fri, Feb 18, 2011 at 6:03 AM, Eugen Leitl <eugen at leitl.org> wrote:
> On Fri, Feb 18, 2011 at 12:16:33AM -0700, Kelly Anderson wrote:
>> Perhaps, perhaps not. But I think ONE out of the several dozen
>> competing paradigms will be ready to pick up more or less where the
>> last one left off.
> *Which* competing platforms? Technologies don't come out of
> the blue fully formed, they're incubated for decades in
> R&D pipeline. Everything is photolitho based so far, self-assembly
> isn't yet even in the crib. TSM is just 2d piled higher and
Photolithography has a number of years left in it. As you say, it can
extend into the third dimension if the heat problem is solved. I have
seen one solution to the heat problem that impressed the hell out of
me, and no doubt there are more out there that I haven't seen. By the
time they run out of gas on photolithography, something, be it carbon
nanotube based, or optical, or something else, will come along. A
company like Intel isn't going to make its very best stuff public
immediately. You can be sure they and IBM have some great stuff in the
back room. I am not fearful of where the next S curve will come from,
except that it might come out of a lab in China, Thor help us all.
>> > Kelly, do you think that Moore is equivalent to system
>> > performance? You sure about that?
>> No. Software improves as well, so system performance should go up
> Software degrades, actually. Software bloat about matches the advances
> in hardware.
I know what you are talking about. You are stating that Java and C#
are less efficient than C++, which is less efficient than C, which is
less efficient than assembly. In that sense, you are quite right: it
does take new hardware to run the new software stacks. The next step
will probably be to run everything on whole virtual machines, OS and
all, no doubt, not just virtual CPUs...
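As a toy illustration of that abstraction tax (my own example, not anything from the thread): here the same sum is computed by an interpreted Python loop and by the C-implemented builtin sum(). Both give the same answer, but the lower-level path is typically several times faster, a small-scale analogue of the assembly-vs-C-vs-Java trade-off.

```python
import timeit

data = list(range(100_000))

def loop_sum(xs):
    # The high-abstraction version: an explicit interpreted loop.
    total = 0
    for x in xs:
        total += x
    return total

# Same computation, two abstraction levels; the builtin runs in C.
assert loop_sum(data) == sum(data) == 4_999_950_000

t_loop = timeit.timeit(lambda: loop_sum(data), number=20)
t_builtin = timeit.timeit(lambda: sum(data), number=20)
print(f"loop: {t_loop:.3f}s  builtin: {t_builtin:.3f}s")
```

The exact speed ratio depends on the machine and interpreter, which is why the sketch asserts only that the answers agree and merely prints the timings.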
That being said, algorithms continue to improve. The new, slower
paradigms allow programmers to create software with less concern for
the underlying hardware. I remember the bad old days of dealing with
the segmented Intel architecture, switching memory banks and all that
crap. I for one am glad to be done with it.
But algorithms do improve. Not as fast as hardware, but they do. For
example, we now have something like 7 or 8 chess programs rated above
2800, and I hear at least one of them runs on a cell phone. In 1997,
it took a supercomputer. Today's cell phones are dandy, but they
aren't equivalent to a high-end 1997 supercomputer, so something else
had to change: the algorithms. They have continued to improve, so that
a computer with a small fraction of the power available in 1997 can
now beat a grandmaster.
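The chess case is easy to make concrete. Here is a minimal sketch (mine, nothing to do with any real engine) of the kind of algorithmic win involved: plain minimax versus alpha-beta pruning on a random game tree. Both compute exactly the same game value, but alpha-beta examines far fewer positions, so the same answer costs much less hardware.

```python
import random

BRANCH, DEPTH = 4, 6

def leaf_value(path):
    # Deterministic pseudo-random score for each leaf, keyed on the move path.
    return random.Random(hash(path)).uniform(-1.0, 1.0)

def minimax(path, depth, maximizing, stats):
    # Exhaustive search: visits every node in the tree.
    stats["nodes"] += 1
    if depth == 0:
        return leaf_value(path)
    values = (minimax(path + (m,), depth - 1, not maximizing, stats)
              for m in range(BRANCH))
    return max(values) if maximizing else min(values)

def alphabeta(path, depth, alpha, beta, maximizing, stats):
    # Same value, but prunes branches the opponent would never allow.
    stats["nodes"] += 1
    if depth == 0:
        return leaf_value(path)
    if maximizing:
        best = float("-inf")
        for m in range(BRANCH):
            best = max(best, alphabeta(path + (m,), depth - 1,
                                       alpha, beta, False, stats))
            alpha = max(alpha, best)
            if alpha >= beta:   # cutoff: this line is already refuted
                break
        return best
    best = float("inf")
    for m in range(BRANCH):
        best = min(best, alphabeta(path + (m,), depth - 1,
                                   alpha, beta, True, stats))
        beta = min(beta, best)
        if alpha >= beta:
            break
    return best

mm, ab = {"nodes": 0}, {"nodes": 0}
v1 = minimax((), DEPTH, True, mm)
v2 = alphabeta((), DEPTH, float("-inf"), float("inf"), True, ab)
assert abs(v1 - v2) < 1e-12       # identical game-theoretic value
assert ab["nodes"] < mm["nodes"]  # far fewer positions examined
print(f"minimax: {mm['nodes']} nodes, alpha-beta: {ab['nodes']} nodes")
```

Real engines pile move ordering, transposition tables, and better evaluation on top of this, which is where the decade of rating gains on shrinking hardware came from.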
> In terms of advanced concepts, why is the second-oldest high
> level language still unmatched? Why are newer environments
> inferior to already historic ones?
Are you speaking with a LISP? I don't think that Eclipse is inferior
to the LISP environment I used on HP workstations in the 80s. I think
it is far better. I remember waiting for that damn thing to do garbage
compaction for 2-3 minutes every half hour or so. Good thing I didn't
drink coffee in those days... could have been very bad. :-)
We tend to glorify the things of the past. I very much like playing
with my NeXT cube, and do so every now and again. (It's great when you
combine Moore's Law with eBay; I could never have afforded that
machine new.) The nostalgia factor is fantastic. But the NeXT is
fairly slow even at word processing when you use it now. It was a
fantastic development environment, only recently equaled again in
regularity and sophistication.
Eugen, don't be a software pessimist. We now have two-legged walking
robots, thanks to a combination of software employing feedback and
better hardware, but mostly better software in this case.
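The feedback point is worth making concrete. A minimal sketch, and purely my illustration rather than any real robot's controller: a linearized inverted pendulum, the classic stand-in for balance, held upright by a proportional-derivative feedback law. All the constants are assumptions picked for the demo.

```python
# Linearized inverted pendulum: theta'' = (g/l)*theta + u, where u is
# the control torque. Open-loop it falls over; the PD feedback law
# u = -Kp*theta - Kd*theta_dot stabilizes it whenever Kp > g/l.
G_OVER_L = 9.8         # g/l for a 1 m pendulum (assumed for the demo)
KP, KD = 20.0, 5.0     # feedback gains (assumed; note KP > G_OVER_L)
DT = 0.001             # Euler integration step, seconds

theta, theta_dot = 0.1, 0.0        # start tilted 0.1 rad from vertical
for _ in range(5000):              # simulate 5 seconds
    u = -KP * theta - KD * theta_dot       # measure error, push back
    theta_ddot = G_OVER_L * theta + u
    theta_dot += theta_ddot * DT
    theta += theta_dot * DT

assert abs(theta) < 1e-3   # feedback has pulled it back upright
print(f"final tilt: {theta:.6f} rad")
```

A walking robot is this idea layered many times over, with sensors feeding estimated state into control loops for every joint, but the measure-error-and-correct core is the same.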
Picasa does a fairly good job of recognizing faces. I would never
have predicted that nut would be cracked in my time.