[Paleopsych] Information Week: What The Future Holds

Premise Checker checker at panix.com
Wed Oct 27 15:08:20 UTC 2004


What The Future Holds
http://www.informationweek.com/story/showArticle.jhtml?articleID=49901144

    Six computer scientists take a look into the future. What's in store?
    Think speed.

    By Aaron Ricadela
    Oct. 18, 2004

    In the world of nuclear physics, where moving at Internet
    speed still isn't fast enough, scientists are sending data from the
    CERN particle physics lab in Geneva, Switzerland, to the California
    Institute of Technology at the rate of a CD's worth of information
    every second. By 2007, they hope to double that to a gigabyte of data
    every second, fast enough to send a DVD movie in three seconds. Even
    at those speeds, it would still take 40 minutes to transfer a trillion
    bytes of information--the yardstick particle physicists use to measure
    the information their instruments spew. Scientists now collect a few
    terabytes of data a year, but that could increase a thousandfold by
    early in the next decade.

    Contrast that with the technology many of us have at home today. With
    a dial-up Internet connection, it would take about two years to move a
    terabyte of data to your house, says Jim Gray, a distinguished
    engineer at Microsoft Research who's working with CERN and Caltech on
    the high-speed project. "I'm trying to get things that run in hours or
    days or weeks to run in seconds," says Gray, a specialist in making
    huge databases hum, whose résumé stretches back nearly 40 years,
    including work at Bell Labs, Digital Equipment, and IBM. People want
    answers in real time, Gray says. Slowness "makes you much more
    reluctant to ask questions."
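
    As a rough sanity check on those figures, the arithmetic below works
    out transfer times for the rates mentioned above. The sizes and rates
    assumed (a CD as roughly 0.65 gigabytes, a DVD as 4.7 gigabytes,
    dial-up as a bare 56-Kbps modem) are illustrative round numbers, so
    the results only roughly track the article's rounded figures; the
    two-year dial-up estimate, for instance, implies a somewhat faster
    effective connection than a 56-Kbps modem.

    # Back-of-the-envelope transfer times for the rates quoted above.
    # The sizes and rates are illustrative assumptions, not measurements.

    CD_BYTES = 0.65e9        # roughly a CD's worth of data
    DVD_BYTES = 4.7e9        # a single-layer DVD
    TERABYTE = 1e12          # the physicists' trillion-byte yardstick
    DIALUP_BPS = 56_000 / 8  # a 56-Kbps modem, in bytes per second
    YEAR_SECONDS = 365 * 24 * 3600

    def transfer_time(size_bytes, bytes_per_second):
        """Seconds needed to move size_bytes at the given rate."""
        return size_bytes / bytes_per_second

    dvd_s = transfer_time(DVD_BYTES, 1e9)                    # at 1 GB/s
    tb_min = transfer_time(TERABYTE, CD_BYTES) / 60          # at ~CD/s
    tb_years = transfer_time(TERABYTE, DIALUP_BPS) / YEAR_SECONDS

    print(f"DVD at 1 GB/s:       about {dvd_s:.0f} seconds")
    print(f"Terabyte at ~CD/s:   about {tb_min:.0f} minutes")
    print(f"Terabyte on dial-up: about {tb_years:.1f} years")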

    Carbon nanotubes, plastic semiconductors, and more esoteric areas of
    research, such as using the spin of electrons, are being examined as
    successors to silicon, IBM's senior VP and head of research Paul Horn
    says.

    For its 25th anniversary issue, InformationWeek asked six leading
    computer scientists--Gray; IBM senior VP and head of research Paul
    Horn; Hewlett-Packard senior VP of research and HP Labs director Dick
    Lampman; Sun Microsystems executive VP and chief technology officer
    Greg Papadopoulos; Intel senior VP and CTO Pat Gelsinger; and Palo
    Alto Research Center president and director Mark Bernstein--to look
    ahead, to identify the ways the computer industry is likely to change,
    or needs to change, during the next decade. If any single theme
    emerged, it's speed--and the desire for it.

    For these gentlemen and the computer industry in general, slowness
    stands in the way of greater future achievements. Microprocessor
    speeds are flattening after years of phenomenal gains. PCs can't find
    us the information we need fast enough. Supercomputer users thirst for
    faster "time to insight" from their complex machines. Meanwhile, the
    explosion of technology patents confounds companies, making it more
    challenging for them to assemble all the pieces they need to bring
    innovative products to market quickly. And America's universities
    attract fewer students interested in science and technology, even as
    Asian countries such as India shine in this area, a development that
    could erode U.S. competitiveness.

    For an in-depth look at the big changes afoot in computer design, the
    office of the future, the wired home, intellectual property, and
    education and globalization, read on. Just make it fast.

    What's Next For Silicon Chips?
    If there's a metaquestion dogging computer designers, it's how much
    longer the industry will keep churning out silicon-based machines that
    are twice as fast as last year's. The most common guess: About a dozen
    years.

    Chips' clock speeds already are increasing more slowly: 10% to 15% a
    year, versus 35% to 40% per year historically. Yet computer
    performance keeps nearly doubling each year, as Advanced Micro
    Devices, IBM, Intel, and Sun Microsystems make their products more
    specialized and combine more computing functions on a single piece of
    silicon. Even so, designers are running into engineering problems that
    rob performance as electronics shrink into the nano scale.
    "Frequencies will continue to go up, but nowhere near at the rate they
    have in the past," IBM's Horn says. "We're going to see a sea change
    in the way processors are designed."
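
    For a sense of what those growth rates mean over time, the short
    calculation below compounds them over an arbitrary ten-year horizon.
    The horizon is an illustration, not a forecast; the rates are the
    ones quoted above.

    # Compounding the growth rates quoted above over ten years.
    # The ten-year horizon is an arbitrary illustration, not a forecast.

    def compound(annual_rate, years):
        """Overall multiple after `years` of steady annual growth."""
        return (1 + annual_rate) ** years

    YEARS = 10
    print(f"Historical clock speed, 35-40%/yr: "
          f"{compound(0.35, YEARS):.0f}x-{compound(0.40, YEARS):.0f}x")
    print(f"Current clock speed, 10-15%/yr:    "
          f"{compound(0.10, YEARS):.1f}x-{compound(0.15, YEARS):.1f}x")
    print(f"Performance doubling every year:   {2 ** YEARS}x")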

    But what's the limit? The question has many practical implications.
    "We've been on a curve where you lead with your fastest microprocessor
    with the biggest cache," and you charge premium prices for those
    products, Sun's Papadopoulos says. "Now the world's starting to
    unravel."

    The semiconductor industry is building products with electronics just
    65 billionths of a meter wide. The next two generations--45 nanometers
    and 32 nanometers, each about three years apart--look OK, Horn says.
    "It's pretty clear nothing's going to replace silicon in that time,"
    he says. "You go one more cycle out and, I'll tell you, it's getting
    pretty dicey. The problem is, there's no good alternative."
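
    Read as a roadmap, 65, 45, and 32 nanometers at roughly three-year
    intervals amounts to about a 0.7x shrink per generation. The naive
    extrapolation below simply continues that ratio for four generations,
    covering roughly the dozen years of scaling guessed at above; the
    dates and the continuation are assumptions for illustration, not an
    industry forecast.

    # Naive extrapolation of the 65 -> 45 -> 32 nm cadence quoted above.
    node_nm = 65.0      # roughly today's feature size
    shrink = 32 / 45    # ~0.71x per generation, from the figures above
    year = 2004         # assumed starting point (an assumption)

    for _ in range(4):  # about a dozen years of further scaling
        node_nm *= shrink
        year += 3
        print(f"~{year}: ~{node_nm:.0f} nm")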

    Carbon nanotubes, plastic semiconductors, and more esoteric areas of
    research, such as using the spin of electrons or the
    quantum mechanical properties of atoms to perform computations, have
    all been posited as solutions. "All have some potential," Horn says,
    "but there's no clear-cut road map as a replacement for silicon."
    Others are more bullish. Nanotechnology won't knock out today's CMOS
    technology right away, says HP's Lampman, "but in the long term, it
    will be the dominant form of electronics." One big advantage would be
    lower-cost production compared with CMOS: A chip fabrication plant
    with all its equipment costs about $3 billion. PARC is researching
    "organic electronics"--using carbon-based materials instead of silicon
    to compute--in hopes that one day they can be cheaply stamped onto
    flexible rolls using common printing techniques. "The use of organics
    is going to have a radical impact," PARC director Bernstein says.

    As for today's technology, gains in the sophistication of software let
    programmers wring more performance from the specialized silicon chips
    that companies are turning out. "Programmability always rules,"
    Papadopoulos says. "There's far more innovation happening in software
    than in hardware."

    The PC Versus The Personal Network
    On the other end of the computing spectrum is the old,
    not-always-so-reliable PC. While the rarefied end of the
    supercomputing sector heads toward the milestone of a petaflop machine
    capable of a quadrillion calculations a second, other engineers and
    scientists are trying to extract power from huge networks of cheap
    PCs. "My agenda isn't to be the first to a petaflop," Intel CTO
    Gelsinger says. "The agenda is supercomputing for the masses." At
    Microsoft, Gray talks about closing the "guru gap" between what the
    most advanced users can get out of Wintel systems and what everyone
    else can.
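
    For scale, a petaflop is 10^15 floating-point operations per second.
    The sketch below shows how many machines of a given speed would have
    to be aggregated to reach it in the ideal case; the per-machine
    figures are hypothetical round numbers, and real clusters lose a
    great deal to communication overhead.

    # How many machines, ideally, add up to a petaflop?
    # The per-machine speeds are hypothetical round numbers.
    PETAFLOP = 1e15  # a quadrillion floating-point operations per second

    def machines_needed(flops_per_machine):
        """Machines required to reach a petaflop, ignoring all overhead."""
        return int(PETAFLOP / flops_per_machine)

    for label, flops in [("5 GFLOPS desktop", 5e9),
                         ("50 GFLOPS server", 50e9)]:
        print(f"{label}: ~{machines_needed(flops):,} machines")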

    Even Microsoft's and Intel's critics concede the PC isn't likely to
    budge from desktops soon. "The PC's going to be around for a long
    time," Papadopoulos says. Horn calls it a "platform that will be with
    us for the foreseeable future." However, there's a lot that needs to
    be addressed.

    In an era of rapidly multiplying E-documents, the hierarchical file
    system is falling down on the job. Apple Computer and Microsoft are
    putting research and development into new ways of pinpointing digital
    files that don't require wading through directories of folders.
    Microsoft and Intel are rethinking the PC's guts so its electronics
    and software are more aware of who's changing what.
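
    The shift described above is essentially one from walking a directory
    tree to querying an index of file attributes. The sketch below is a
    toy illustration of that idea, assuming a simple keyword-and-extension
    index; it is hypothetical, not how Apple's or Microsoft's actual
    search features work.

    # A toy attribute index: find files by what they are, not where they
    # live. An illustrative sketch, not any vendor's implementation.
    import os
    from collections import defaultdict

    def build_index(root):
        """Index every file under `root` by extension and name keyword."""
        index = defaultdict(set)
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                stem, ext = os.path.splitext(name)
                index[("ext", ext.lower())].add(path)
                for word in stem.lower().replace("-", "_").split("_"):
                    index[("word", word)].add(path)
        return index

    def search(index, ext=None, word=None):
        """Files matching every given attribute, regardless of folder."""
        keys = [("ext", ext)] if ext else []
        keys += [("word", word)] if word else []
        results = None
        for key in keys:
            hits = index.get(key, set())
            results = hits if results is None else results & hits
        return results or set()

    # Example: every budget spreadsheet, wherever it was filed.
    # idx = build_index(os.path.expanduser("~/Documents"))
    # print(search(idx, ext=".xls", word="budget"))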

    Ideas percolating in research labs could change the nature of office
    work, making the PC just one part of a floating "personal network" of
    information. "The PC represents an architectural point that's
    distinctly unnetworked," Papadopoulos says. "The question isn't
    whether I should have Google-like search on my PC." Rather, he says,
    it's how soon users can unhook themselves from their hard drives and
    take advantage of the Internet's ubiquitous reach.

    Bill Gates first called that notion "information at your fingertips"
    in a 1990 speech at the Comdex trade show, and it's an increasingly
    popular one. "The nature of the work we're doing hasn't changed that
    much," PARC's Bernstein says. "We're still pounding our fingers on
    keyboards." PARC is researching computer displays that are large
    surfaces that groups of workers can share to call up new information
    by touching the screen. IP phones also will change social protocols at
    work--instead of picking up a receiver and dialing, we might say,
    "Phone Bill in Redmond."

    The notion of a corporate network could change, too, as information on
    people's PCs and PDAs melds into a work-life blur, Bernstein says. But
    different technical standards for computers, cars, and consumer
    electronics make it too hard to ferry those devices between work and
    home, he adds. PARC software, called Obje, can bridge standards among
    cell phones, laptops, PDAs, printers, set-top boxes, and video
    displays from different companies.
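
    The article doesn't describe Obje's interfaces, but the general
    pattern it points at--putting a common facade in front of devices
    that speak incompatible protocols--can be sketched as below. The
    class and method names are hypothetical illustrations, not PARC's
    actual API.

    # A generic adapter sketch of bridging device standards. The names
    # here are hypothetical; this is not PARC's actual Obje software.
    from abc import ABC, abstractmethod

    class MediaSink(ABC):
        """The one interface senders see, whatever the device speaks."""
        @abstractmethod
        def display(self, content: bytes) -> None: ...

    class VendorTelevision:
        """Stand-in for a device with its own vendor-specific protocol."""
        def render_stream(self, payload: bytes) -> None:
            print(f"TV rendering {len(payload)} bytes")

    class TelevisionAdapter(MediaSink):
        """Wraps the vendor protocol behind the common interface."""
        def __init__(self, tv: VendorTelevision) -> None:
            self.tv = tv
        def display(self, content: bytes) -> None:
            self.tv.render_stream(content)

    def send_everywhere(content: bytes, sinks) -> None:
        """A laptop, phone, or set-top box pushes to any bridged device."""
        for sink in sinks:
            sink.display(content)

    send_everywhere(b"slide deck", [TelevisionAdapter(VendorTelevision())])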

    At HP, engineers are working to bring to market another great hope for
    the office of the future: videoconferencing that works, Lampman says.
    Within a few years, HP plans to release a videoconferencing system
    that it has been developing with DreamWorks SKG, which features
    life-size images of people broadcast in high-definition video and
    multidimensional sound that doesn't ring like speakerphone gibberish
    when two people are talking, he says. DreamWorks' system "is the first
    one I've seen that makes you feel emotionally like you're in one
    room," Lampman says. "We've thrown HP tech teams on it to see how to
    commercialize the system and take some of the cost out."

    The Wired Home May Be More Entertaining Than You Think
    Tech companies throwing R&D dollars at your office agree on where the
    money should go. That's not the case at home. "Where's the interface
    for information? Is it the TV set, the set-top box, the tablet
    computer, or the phone?" Bernstein asks. "The one that will win out is
    the one that's easiest for people to use." A good candidate: TV sets
    with touch screens and speech recognition, areas PARC is researching.
    Everybody agrees that games and other entertainment apps will lead the
    next wave of technology in our homes. They differ in how to get there.

    Intel's goal isn't to be first with a petaflop, Intel senior VP and
    CTO Pat Gelsinger says. Instead, the company's agenda is to deliver
    'supercomputing for the masses.'


    Intel is building concept technology that aims to fuse the functions
    of the PC, digital video recorder, and game machine on a single
    versatile silicon chip, Gelsinger says. "It's very hard to tell where
    consumer electronics stop in the home and PCs start," he says. "We're
    trying to make it very hard to tell the difference." Gelsinger wants
    Intel to compete with PlayStation, Xbox, and TiVo, powering products
    that let consumers play games, record shows, and check their E-mail.
    "I'm going to take x86 [chip] and sell it into [consumer electronic]
    boxes," he says.

    HP is going the specialty route, licensing technology to Swiss
    chipmaker STMicroelectronics N.V. for a processor aimed at DVD players
    and digital TVs. "When you specialize, you can get a huge performance
    boost you can't [get] otherwise," Lampman says. "As the
    consumer-electronics world goes all-digital, getting the performance
    people want takes huge amounts of processing."

    But forget the notion of Internet-ready washing machines,
    refrigerators, and toasters that can chat among themselves in a
    network, IBM's Horn adds. "It's more the high-tech wacko in my
    laboratory who would want to do that than John Q. Public."

    Intellectual Property: Build Or Buy?
    Where's the next great idea in computing going to come from?
    Increasingly, companies are betting it will be outside their walls. As
    the number of technology patents explodes, and information
    technologies find new applications outside the industries for which
    they were invented, it's becoming much harder for any one company to
    control all the pieces it needs to bring great products to market. "As
    a consequence of the breadth of technology, it's very unlikely you'll
    have all the pieces you need to succeed," PARC's Bernstein says.
    "Barriers have dropped tremendously."

    That's the reason Microsoft restructured its licensing business this
    year--to gain greater freedom to license the intellectual property it
    needs to build its products in portfolios, instead of piece by piece.
    And HP has increased the number of patents it has applied for in the
    last five years and become more aggressive in protecting its
    intellectual property, Lampman says. Four years ago, the company made
    decisions similar to the ones Microsoft is making now, he says, and
    this year's revenue from intellectual-property licensing should triple
    compared to last year.

    Fragmentation and faster tech transfer mean hardware, software, and
    services will come to market faster, IBM's Horn says. "Innovation in
    our world is undergoing a fundamental shift." Earlier this month, IBM
    tapped two of its most senior executives, John Kelly and Irving
    Wladawsky-Berger, to head an intellectual-property group. IBM earned
    about $1 billion in profit last year from licensing its intellectual
    property.

    The Education Of The Post-Modern Programmer
    So who's going to build the next wave of great products? Many
    researchers are afraid that a shortage of technical talent will hurt
    U.S. competitiveness. "American education is doing an extremely bad
    job," says Microsoft's Gray, who served on the President's IT Advisory
    Committee during the Clinton administration. Enrollment in college
    science and engineering programs has been dropping since the '80s, and
    participation by women is falling off even more rapidly, to about 15%
    of students. "This is an education catastrophe," he says. The United
    States spends about as much per capita on education as other
    countries, Gray says, but low pay and lack of respect for teachers
    aren't preparing kids to choose technical fields--a decision that's
    often made by the time they're in fourth grade. "Many students aren't
    excited about science and technology. Play it forward, and the
    high-paying jobs are in areas where people have some special
    expertise." Companies are outsourcing tech work to Asia, India, and
    Russia because the workforce isn't just cheaper--it's often more
    talented, Gray says.

    Case in point: HP's lab in Bangalore, India, isn't just a money saver,
    Lampman says. "It's helping HP grow. That's the mission of that lab,"
    he says. That's not to say training more computer scientists at home
    isn't important. "The U.S. has benefited enormously from IT
    investment, in terms of balance of trade and jobs," he says. "I'm
    concerned we've lost that."

    IBM is going one step further, trying to influence university
    curriculum. This fall, the Haas Business School at the University of
    California, Berkeley, is offering the first class in a discipline Horn
    calls "services science." It's being taught by business professor and
    noted R&D expert Hank Chesbrough and IBM researcher Jim Spohrer.
    Boundaries between B-schools and computer-science programs need to
    fall to give students 21st century skills, Horn says. "There wasn't
    even a computer science course until the 1940s," when IBM and Columbia
    University teamed up to teach one, he says. "I'm hopeful this could be
    something like that."

    If the tech industry wants to keep its engines of innovation churning,
    it's going to need more of this sort of fuel.


More information about the paleopsych mailing list