[ExI] Watson On Jeopardy

Kelly Anderson kellycoinguy at gmail.com
Wed Feb 23 18:27:59 UTC 2011


On Wed, Feb 23, 2011 at 3:57 AM, Eugen Leitl <eugen at leitl.org> wrote:
> On Tue, Feb 22, 2011 at 04:26:08PM -0700, Kelly Anderson wrote:
>
>> > *Which* competing platforms? Technologies don't come out of
>> > the blue fully formed, they're incubated for decades in
>> > R&D pipeline. Everything is photolitho based so far, self-assembly
>> > isn't yet even in the crib. TSM is just 2d piled higher and
>> > deeper.
>>
>> Photo lithography has a number of years left in it. As you say, it can
>
> Not so many more years.

I understand that Intel thinks they can stay on track until 2018
using their current approach. Their current approach now seems to be
mostly just putting more cores on one chip, which requires
intelligent compilers and/or programmers. Hopefully more on the
compiler side, as that leverages better.
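To sketch why the compiler/runtime side matters: the whole burden of multicore is splitting work into independent chunks and recombining the results. A toy illustration in Python (the 4-way split and function names are mine, purely illustrative; note that CPython threads won't actually run CPU-bound chunks in parallel because of the GIL, so this only shows the decomposition a parallelizing compiler would have to automate):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(data):
    # CPU-bound work on one chunk; with real parallelism each chunk
    # would land on its own core
    return sum(x * x for x in data)

data = list(range(1_000_000))
chunks = [data[i::4] for i in range(4)]  # interleaved 4-way split

with ThreadPoolExecutor(max_workers=4) as ex:
    partials = list(ex.map(chunk_sum, chunks))

total = sum(partials)
assert total == sum(x * x for x in data)  # same answer as the serial version
```

The hard part a compiler has to prove is that the chunks really are independent; here that is obvious, in general it isn't.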

>> extend into the third dimension if the heat problem is solved. I have
>
> Photolitho can't extend into third dimension because each subsequent
> fabbing step degrades underlying structures. You need a purely additive,
> iterable deposition process which doesn't damage underlying layers.

What about fabbing slices and "gluing" them together afterwards? Is
anything happening in that direction? (I'm not a fab expert by any
means; I'm just asking.)

>> seen one solution to the heat problem that impressed the hell out of
>
> Cooling is only a part of the problem. There are many easy fixes which
> are cumulative in regards to reducing heat dissipation.

The mechanism I saw was a water-circulation system driven by the heat
generated in the chip itself. It was extremely cool (no pun
intended). Get the water out of the chip and you can cool the water
by conventional means. Their pitch indicated that cooling is one of
the biggest problems with going 3D. There are probably many more.

>> me, and no doubt there are more out there that I haven't seen. By the
>> time they run out of gas on photo lithography, something, be it carbon
>> nano tube based, or optical, or something else will come out. A
>
> Completely new technologies do not come out of the blue.
> We're about to hit 11 nm http://en.wikipedia.org/wiki/11_nanometer
> Still think Moore's got plenty of wind yet?

Not forever, but seemingly for a few more years.

>> company like Intel isn't going to make their very best stuff public
>> immediately. You can be sure they and IBM have some great stuff in the
>
> The very best stuff is called technology demonstrations. It is very
> public for obvious reasons: shareholder value.

What about racetrack memory? The last time I saw anything on it, they
were saying it might be available by 2015.

>> back room. I am not fearful of where the next S curve will come from,
>
> The only good optimism is well-informed optimism. Optimists
> would have expected clock doublings and memory bandwidth
> doublings to match structure shrink.

I am optimistic, and on these issues in particular I'm going off the
stuff Ray put in TSIN. If he's wrong, then I'm wrong. Knowing the cool
stuff they see at MIT all the time, perhaps he is right.

>> except that it might come out of a lab in China, Thor help us all
>> then!
>>
>> >> > Kelly, do you think that Moore is equivalent to system
>> >> > performance? You sure about that?
>> >>
>> >> No. Software improves as well, so system performance should go up
>> >
>> > Software degrades, actually. Software bloat about matches the advances
>> > in hardware.
>>
>> I know what you are talking about. You are stating that Java and C#
>> are less efficient than C++ and that is less efficient than C and that
>
> I am talking that people don't bother with algorithms, because "hardware
> will be fast enough" or just build layers of layers upon external
> dependencies, because "storage is cheap, hurr durr".

A lot of business software is exactly that way because it doesn't have
to be great, just good enough. In fields like computer vision and AI,
where the computational requirements exceed current hardware
performance, care is still taken to optimize.
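To make the "good enough" versus "optimized" distinction concrete, here is a toy sketch of my own (function names are illustrative, not from any real codebase). Both functions are correct; only one survives large inputs, and the business-app author often never notices the difference:

```python
def has_duplicate_naive(items):
    # O(n^2): "good enough" for small business-app inputs
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_fast(items):
    # O(n): the kind of care taken when hardware can't hide the cost
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

sample = list(range(5000)) + [42]  # 42 appears twice
assert has_duplicate_naive(sample) == has_duplicate_fast(sample) == True
```

On a million items the quadratic version is roughly a million times more work, which is exactly the gap "hardware will be fast enough" is supposed to paper over.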

> Let's face it, 98% of developers are retarded monkeys, and need
> their programming license revoked.

Only those whose degrees are in engineering, mathematics, physics,
political science and the like (in my experience). Those with actual
degrees in computer science are pretty decent for the most part. That
training matters IMNSHO.

>> is less efficient than Assembly. In that sense, you are very right. It
>> does take new hardware to run the new software systems. The next step
>> will probably be to run everything on whole virtual machines OS and
>> all, no doubt, not just virtual CPUs...
>
> Virtualization is only good for increasing hardware utilization,
> accelerate deployment and enhance security by way of compartmentalization.
> It doesn't work like Inception. If your hardware is already saturated,
> it will degrade performance due to virtualization overhead.

Yes, that's what I meant. Agreed.

>> But algorithms do improve. Not as fast as hardware, but it does. For
>> example, we now have something like 7 or 8 programs playing chess
>> above 2800, and I hear at least one of them runs on a cell phone. In
>
> Current smartphones are desktop equivalents of about half a decade
> ago. An iPhone has about 20 MFlops, Tegra 2 50-60 (and this is JIT).
> This is roughly Cray 1 level of performance. Of course there's
> graphics accelerators in there, too, and ARM are chronically
> anemic in the float department.
>
>> 1997, it was a supercomputer. Now, today's cell phones are dandy, but
>> they aren't equivalent to a high end 1997 supercomputer, so something
>
> Fritz has been beating the pants of most humans for a long time
> now http://en.wikipedia.org/wiki/Fritz_%28chess%29
>
> Chess is particular narrow field, look at Go where progress
> is far less stellar http://en.wikipedia.org/wiki/Go_%28game%29#Computers_and_Go

I am a go player, so yes I'm familiar with this... :-)

>> else had to change. The algorithms. They continued to improve so that
>> a computer with a small fraction of the power available in 1997 can
>> now beat a grand master.
>
> There's no Moore's law for software, that's for sure.

Nope.
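No Moore's law for software, agreed, but the chess point above is about real algorithmic gains. A minimal sketch of the kind of improvement involved (a tiny hand-built game tree of my own, purely illustrative): alpha-beta pruning returns the same value as plain minimax while visiting fewer nodes, and on deep chess trees that gap is enormous.

```python
import math

# A tiny two-level game tree: internal nodes are lists, leaves are scores.
TREE = [[3, 5, 6], [7, 1, 2], [0, 8, 4]]

def minimax(node, maximizing, counter):
    counter[0] += 1  # count every node visited
    if isinstance(node, int):
        return node
    vals = [minimax(c, not maximizing, counter) for c in node]
    return max(vals) if maximizing else min(vals)

def alphabeta(node, alpha, beta, maximizing, counter):
    counter[0] += 1
    if isinstance(node, int):
        return node
    if maximizing:
        v = -math.inf
        for c in node:
            v = max(v, alphabeta(c, alpha, beta, False, counter))
            alpha = max(alpha, v)
            if alpha >= beta:  # opponent will never allow this branch
                break
        return v
    else:
        v = math.inf
        for c in node:
            v = min(v, alphabeta(c, alpha, beta, True, counter))
            beta = min(beta, v)
            if beta <= alpha:
                break
        return v

n1, n2 = [0], [0]
v1 = minimax(TREE, True, n1)
v2 = alphabeta(TREE, -math.inf, math.inf, True, n2)
assert v1 == v2          # same answer...
assert n2[0] < n1[0]     # ...with fewer nodes examined
```

Stack a few decades of tricks like this (better move ordering, transposition tables, smarter evaluation) and you get a grandmaster-beating program on a phone.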

>> > In terms of advanced concepts, why is the second-oldest high
>> > level language still unmatched? Why are newer environments
>> > inferior to already historic ones?
>>
> I'm sorry, I'm in the trade. Not seeing this progress thing
> you mention.

I do.

-Kelly



More information about the extropy-chat mailing list