[Paleopsych] NYT: Watching TV Makes You Smarter
Premise Checker
checker at panix.com
Sun Apr 24 20:09:49 UTC 2005
http://www.nytimes.com/2005/04/24/magazine/24TV.html
By STEVEN JOHNSON
The Sleeper Curve
SCIENTIST A: Has he asked for anything special?
SCIENTIST B: Yes, this morning for breakfast . . . he requested
something called ''wheat germ, organic honey and tiger's milk.''
SCIENTIST A: Oh, yes. Those were the charmed substances that some
years ago were felt to contain life-preserving properties.
SCIENTIST B: You mean there was no deep fat? No steak or cream pies
or . . . hot fudge?
SCIENTIST A: Those were thought to be unhealthy.
From Woody Allen's ''Sleeper''
On Jan. 24, the Fox network showed an episode of its hit drama
''24,'' the real-time thriller known for its cliffhanger tension and
often-gruesome violence. Over the preceding weeks, a number of public
controversies had erupted around ''24,'' mostly focused on its
portrait of Muslim terrorists and its penchant for torture scenes. The
episode that was shown on the 24th only fanned the flames higher: in
one scene, a terrorist enlists a hit man to kill his child for not
fully supporting the jihadist cause; in another scene, the secretary
of defense authorizes the torture of his son to uncover evidence of a
terrorist plot.
But the explicit violence and the post-9/11 terrorist anxiety are not
the only elements of ''24'' that would have been unthinkable on
prime-time network television 20 years ago. Alongside the notable
change in content lies an equally notable change in form. During its
44 minutes -- a real-time hour, minus 16 minutes for commercials --
the episode connects the lives of 21 distinct characters, each with a
clearly defined ''story arc,'' as the Hollywood jargon has it: a
defined personality with motivations and obstacles and specific
relationships with other characters. Nine primary narrative threads
wind their way through those 44 minutes, each drawing extensively upon
events and information revealed in earlier episodes. Draw a map of all
those intersecting plots and personalities, and you get a structure that
-- where formal complexity is concerned -- more closely resembles
''Middlemarch'' than a hit TV drama of years past like ''Bonanza.''
For decades, we've worked under the assumption that mass culture
follows a path declining steadily toward lowest-common-denominator
standards, presumably because the ''masses'' want dumb, simple
pleasures and big media companies try to give the masses what they
want. But as that ''24'' episode suggests, the exact opposite is
happening: the culture is getting more cognitively demanding, not
less. To make sense of an episode of ''24,'' you have to integrate far
more information than you would have a few decades ago watching a
comparable show. Beneath the violence and the ethnic stereotypes,
another trend appears: to keep up with entertainment like ''24,'' you
have to pay attention, make inferences, track shifting social
relationships. This is what I call the Sleeper Curve: the most debased
forms of mass diversion -- video games and violent television dramas
and juvenile sitcoms -- turn out to be nutritional after all.
I believe that the Sleeper Curve is the single most important new
force altering the mental development of young people today, and I
believe it is largely a force for good: enhancing our cognitive
faculties, not dumbing them down. And yet you almost never hear this
story in popular accounts of today's media. Instead, you hear dire
tales of addiction, violence, mindless escapism. It's assumed that
shows that promote smoking or gratuitous violence are bad for us,
while those that thunder against teen pregnancy or intolerance have a
positive role in society. Judged by that morality-play standard, the
story of popular culture over the past 50 years -- if not 500 -- is a
story of decline: the morals of the stories have grown darker and more
ambiguous, and the antiheroes have multiplied.
The usual counterargument here is that what media have lost in moral
clarity, they have gained in realism. The real world doesn't come in
nicely packaged public-service announcements, and we're better off
with entertainment like ''The Sopranos'' that reflects our fallen
state with all its ethical ambiguity. I happen to be sympathetic to
that argument, but it's not the one I want to make here. I think there
is another way to assess the social virtue of pop culture, one that
looks at media as a kind of cognitive workout, not as a series of life
lessons. There may indeed be more ''negative messages'' in the
mediasphere today. But that's not the only way to evaluate whether our
television shows or video games are having a positive impact. Just as
important -- if not more important -- is the kind of thinking you have
to do to make sense of a cultural experience. That is where the
Sleeper Curve becomes visible.
Televised Intelligence
Consider the cognitive demands that televised narratives place on
their viewers. With many shows that we associate with ''quality''
entertainment -- ''The Mary Tyler Moore Show,'' ''Murphy Brown,''
''Frasier'' -- the intelligence arrives fully formed in the words and
actions of the characters on-screen. They say witty things to one
another and avoid lapsing into tired sitcom cliches, and we smile
along in our living rooms, enjoying the company of these smart people.
But assuming we're bright enough to understand the sentences they're
saying, there's no intellectual labor involved in enjoying the show as
a viewer. You no more challenge your mind by watching these
intelligent shows than you challenge your body watching ''Monday Night
Football.'' The intellectual work is happening on-screen, not off.
But another kind of televised intelligence is on the rise. Think of
the cognitive benefits conventionally ascribed to reading: attention,
patience, retention, the parsing of narrative threads. Over the last
half-century, programming on TV has increased the demands it places on
precisely these mental faculties. This growing complexity involves
three primary elements: multiple threading, flashing arrows and social
networks.
According to television lore, the age of multiple threads began with
the arrival in 1981 of ''Hill Street Blues,'' the Steven Bochco police
drama invariably praised for its ''gritty realism.'' Watch an episode
of ''Hill Street Blues'' side by side with any major drama from the
preceding decades -- ''Starsky and Hutch,'' for instance, or
''Dragnet'' -- and the structural transformation will jump out at you.
The earlier shows follow one or two lead characters, adhere to a
single dominant plot and reach a decisive conclusion at the end of the
episode. Draw an outline of the narrative threads in almost every
''Dragnet'' episode, and it will be a single line: from the initial
crime scene, through the investigation, to the eventual cracking of
the case. A typical ''Starsky and Hutch'' episode offers only the
slightest variation on this linear formula: the introduction of a
comic subplot that usually appears only at the tail end of the
episode, creating a structure that looks like this graph. The
vertical axis represents the number of individual threads, and the
horizontal axis is time.
A ''Hill Street Blues'' episode complicates the picture in a number of
profound ways. The narrative weaves together a collection of distinct
strands -- sometimes as many as 10, though at least half of the
threads involve only a few quick scenes scattered through the episode.
The number of primary characters -- and not just bit parts -- swells
significantly. And the episode has fuzzy borders: picking up one or
two threads from previous episodes at the outset and leaving one or
two threads open at the end. Charted graphically, an average episode
looks like this.
Critics generally cite ''Hill Street Blues'' as the beginning of
''serious drama'' native to the television medium -- differentiating
the series from the single-episode dramatic programs from the 50's,
which were Broadway plays performed in front of a camera. But the
''Hill Street'' innovations weren't all that original; they'd long
played a defining role in popular television, just not during the
evening hours. The structure of a ''Hill Street'' episode -- and
indeed of all the critically acclaimed dramas that followed, from
''thirtysomething'' to ''Six Feet Under'' -- is the structure of a
soap opera. ''Hill Street Blues'' might have sparked a new golden age
of television drama during its seven-year run, but it did so by using
a few crucial tricks that ''Guiding Light'' and ''General Hospital''
mastered long before.
Bochco's genius with ''Hill Street'' was to marry complex narrative
structure with complex subject matter. ''Dallas'' had already shown
that the extended, interwoven threads of the soap-opera genre could
survive the weeklong interruptions of a prime-time show, but the
actual content of ''Dallas'' was fluff. (The most probing issue it
addressed was the question, now folkloric, of who shot J.R.) ''All in
the Family'' and ''Rhoda'' showed that you could tackle complex social
issues, but they did their tackling in the comfort of the sitcom
living room. ''Hill Street'' had richly drawn characters confronting
difficult social issues and a narrative structure to match.
Since ''Hill Street'' appeared, the multi-threaded drama has become
the most widespread fictional genre on prime time: ''St. Elsewhere,''
''L.A. Law,'' ''thirtysomething,'' ''Twin Peaks,'' ''N.Y.P.D. Blue,''
''E.R.,'' ''The West Wing,'' ''Alias,'' ''Lost.'' (The only prominent
holdouts in drama are shows like ''Law and Order'' that have
essentially updated the venerable ''Dragnet'' format and thus remained
anchored to a single narrative line.) Since the early 80's, however,
there has been a noticeable increase in narrative complexity in these
dramas. The most ambitious show on TV to date, ''The Sopranos,''
routinely follows up to a dozen distinct threads over the course of an
episode, with more than 20 recurring characters. An episode from late
in the first season looks like this.
The total number of active threads equals the multiple plots of ''Hill
Street,'' but here each thread is more substantial. The show doesn't
offer a clear distinction between dominant and minor plots; each story
line carries its weight in the mix. The episode also displays a
chordal mode of storytelling entirely absent from ''Hill Street'': a
single scene in ''The Sopranos'' will often connect to three different
threads at the same time, layering one plot atop another. And every
single thread in this ''Sopranos'' episode builds on events from
previous episodes and continues on through the rest of the season and
beyond.
Put those charts together, and you have a portrait of the Sleeper
Curve rising over the past 30 years of popular television. In a sense,
this is as much a map of cognitive changes in the popular mind as it
is a map of on-screen developments, as if the media titans decided to
condition our brains to follow ever-larger numbers of simultaneous
threads. Before ''Hill Street,'' the conventional wisdom among
television execs was that audiences wouldn't be comfortable following
more than three plots in a single episode, and indeed, the ''Hill
Street'' pilot, which was shown in January 1981, brought complaints
from viewers that the show was too complicated. Fast-forward two
decades, and shows like ''The Sopranos'' engage their audiences with
narratives that make ''Hill Street'' look like ''Three's Company.''
Audiences happily embrace that complexity because they've been trained
by two decades of multi-threaded dramas.
Multi-threading is the most celebrated structural feature of the
modern television drama, and it certainly deserves some of the honor
that has been doled out to it. And yet multi-threading is only part of
the story.
The Case for Confusion
Shortly after the arrival of the first-generation slasher movies --
''Halloween,'' ''Friday the 13th'' -- Paramount released a
mock-slasher flick called ''Student Bodies,'' parodying the genre just
as the ''Scream'' series would do 15 years later. In one scene, the
obligatory nubile teenage baby sitter hears a noise outside a suburban
house; she opens the door to investigate, finds nothing and then goes
back inside. As the door shuts behind her, the camera swoops in on the
doorknob, and we see that she has left the door unlocked. The camera
pulls back and then swoops down again for emphasis. And then a
flashing arrow appears on the screen, with text that helpfully
explains: ''Unlocked!''
That flashing arrow is parody, of course, but it's merely an
exaggerated version of a device popular stories use all the time. When
a sci-fi script inserts into some advanced lab a nonscientist who
keeps asking the science geeks to explain what they're doing with that
particle accelerator, that's a flashing arrow that gives the audience
precisely the information it needs in order to make sense of the
ensuing plot. (''Whatever you do, don't spill water on it, or you'll
set off a massive explosion!'') These hints serve as a kind of
narrative hand-holding. Implicitly, they say to the audience, ''We
realize you have no idea what a particle accelerator is, but here's
the deal: all you need to know is that it's a big fancy thing that
explodes when wet.'' They focus the mind on relevant details: ''Don't
worry about whether the baby sitter is going to break up with her
boyfriend. Worry about that guy lurking in the bushes.'' They reduce
the amount of analytic work you need to do to make sense of a story.
All you have to do is follow the arrows.
By this standard, popular television has never been harder to follow.
If narrative threads have experienced a population explosion over the
past 20 years, flashing arrows have grown correspondingly scarce.
Watching ''Hill Street Blues,'' that pinnacle of early 80's TV drama,
we find an informational wholeness to each scene that differs
markedly from what you see on shows like ''The West Wing'' or ''The
Sopranos'' or ''Alias'' or ''E.R.''
''Hill Street'' has ambiguities about future events: will a convicted
killer be executed? Will Furillo marry Joyce Davenport? Will Renko
find it in himself to bust a favorite singer for cocaine possession?
But the present tense of each scene explains itself to the viewer with
little ambiguity. There's an open question or a mystery driving each
of these stories -- how will it all turn out? -- but there's no
mystery about the immediate activity on the screen. A contemporary
drama like ''The West Wing,'' on the other hand, constantly embeds
mysteries into the present-tense events: you see characters performing
actions or discussing events about which crucial information has been
deliberately withheld. Anyone who has watched more than a handful of
''The West Wing'' episodes closely will know the feeling: scene after
scene refers to some clearly crucial but unexplained piece of
information, and after the sixth reference, you'll find yourself
wishing you could rewind the tape to figure out what they're talking
about, assuming you've missed something. And then you realize that
you're supposed to be confused. The open question posed by these
sequences is not ''How will this turn out in the end?'' The question
is ''What's happening right now?''
The deliberate lack of hand-holding extends down to the microlevel of
dialogue as well. Popular entertainment that addresses technical
issues -- whether they are the intricacies of passing legislation, or
of performing a heart bypass, or of operating a particle accelerator
-- conventionally switches between two modes of information in
dialogue: texture and substance. Texture is all the arcane verbiage
provided to convince the viewer that they're watching Actual Doctors
at Work; substance is the material planted amid the background texture
that the viewer needs to make sense of the plot.
Conventionally, narratives demarcate the line between texture and
substance by inserting cues that flag or translate the important data.
There's an unintentionally comical moment in the 2004 blockbuster
''The Day After Tomorrow'' in which the beleaguered climatologist
(played by Dennis Quaid) announces his theory about the imminent
arrival of a new ice age to a gathering of government officials. In
his speech, he warns that ''we have hit a critical desalinization
point!'' At this moment, the writer-director Roland Emmerich -- a
master of brazen arrow-flashing -- has an official follow with the
obliging remark: ''It would explain what's driving this extreme
weather.'' They might as well have had a flashing ''Unlocked!'' arrow
on the screen.
The dialogue on shows like ''The West Wing'' and ''E.R.,'' on the
other hand, doesn't talk down to its audiences. It rushes by, the
words accelerating in sync with the high-speed tracking shots that
glide through the corridors and operating rooms. The characters talk
faster in these shows, but the truly remarkable thing about the
dialogue is not purely a matter of speed; it's the willingness to
immerse the audience in information that most viewers won't
understand. Here's a typical scene from ''E.R.'':
[WEAVER AND WRIGHT push a gurney containing a 16-year-old girl. Her
parents, JANNA AND FRANK MIKAMI, follow close behind. CARTER AND
LUCY fall in.]
WEAVER: 16-year-old, unconscious, history of biliary atresia.
CARTER: Hepatic coma?
WEAVER: Looks like it.
MR. MIKAMI: She was doing fine until six months ago.
CARTER: What medication is she on?
MRS. MIKAMI: Ampicillin, tobramycin, vitamins A, D and K.
LUCY: Skin's jaundiced.
WEAVER: Same with the sclera. Breath smells sweet.
CARTER: Fetor hepaticus?
WEAVER: Yep.
LUCY: What's that?
WEAVER: Her liver's shut down. Let's dip a urine. [To CARTER] Guys,
it's getting a little crowded in here, why don't you deal with the
parents? Start lactulose, 30 cc's per NG.
CARTER: We're giving medicine to clean her blood.
WEAVER: Blood in the urine, two-plus.
CARTER: The liver failure is causing her blood not to clot.
MRS. MIKAMI: Oh, God. . . .
CARTER: Is she on the transplant list?
MR. MIKAMI: She's been Status 2a for six months, but they haven't
been able to find her a match.
CARTER: Why? What's her blood type?
MR. MIKAMI: AB.
[This hits CARTER like a lightning bolt. LUCY gets it, too. They
share a look.]
There are flashing arrows here, of course -- ''The liver failure is
causing her blood not to clot'' -- but the ratio of medical jargon to
layperson translation is remarkably high. From a purely narrative
point of view, the decisive line arrives at the very end: ''AB.'' The
16-year-old's blood type connects her to an earlier plot line,
involving a cerebral-hemorrhage victim who -- after being dramatically
revived in one of the opening scenes -- ends up brain-dead. Far
earlier, before the liver-failure scene above, Carter briefly
discusses harvesting the hemorrhage victim's organs for transplants,
and another doctor makes a passing reference to his blood type being
the rare AB (thus making him an unlikely donor). The twist here
revolves around a statistically unlikely event happening at the E.R.
-- an otherwise perfect liver donor showing up just in time to donate
his liver to a recipient with the same rare blood type. But the show
reveals this twist with remarkable subtlety. To make sense of that
last ''AB'' line -- and the look of disbelief on Carter's and Lucy's
faces -- you have to recall a passing remark uttered earlier regarding
a character who belongs to a completely different thread. Shows like
''E.R.'' may have more blood and guts than popular TV had a generation
ago, but when it comes to storytelling, they possess a quality that
can only be described as subtlety and discretion.
Even Bad TV Is Better
Skeptics might argue that I have stacked the deck here by focusing on
relatively highbrow titles like ''The Sopranos'' or ''The West Wing,''
when in fact the most significant change in the last five years of
narrative entertainment involves reality TV. Does the contemporary pop
cultural landscape look quite as promising if the representative show
is ''Joe Millionaire'' instead of ''The West Wing''?
I think it does, but to answer that question properly, you have to
avoid the tendency to sentimentalize the past. When people talk about
the golden age of television in the early 70's -- invoking shows like
''The Mary Tyler Moore Show'' and ''All in the Family'' -- they forget
to mention how awful most television programming was during much of
that decade. If you're going to look at pop-culture trends, you have
to compare apples to apples, or in this case, lemons to lemons. The
relevant comparison is not between ''Joe Millionaire'' and ''M*A*S*H'';
it's between ''Joe Millionaire'' and ''The Newlywed Game,'' or between
''Survivor'' and ''The Love Boat.''
What you see when you make these head-to-head comparisons is that a
rising tide of complexity has been lifting programming at the bottom
of the quality spectrum and at the top. ''The Sopranos'' is several
times more demanding of its audiences than ''Hill Street'' was, and
''Joe Millionaire'' has made comparable advances over ''Battle of the
Network Stars.'' This is the ultimate test of the Sleeper Curve
theory: even the junk has improved.
If early television took its cues from the stage, today's reality
programming is reliably structured like a video game: a series of
competitive tests, growing more challenging over time. Many reality
shows borrow a subtler device from gaming culture as well: the rules
aren't fully established at the outset. You learn as you play.
On a show like ''Survivor'' or ''The Apprentice,'' the participants --
and the audience -- know the general objective of the series, but each
episode involves new challenges that haven't been ordained in advance.
The final round of the first season of ''The Apprentice,'' for
instance, threw a monkey wrench into the strategy that governed the
play up to that point, when Trump announced that the two remaining
apprentices would have to assemble and manage a team of subordinates
who had already been fired in earlier episodes of the show. All of a
sudden the overarching objective of the game -- do anything to avoid
being fired -- presented a potential conflict to the remaining two
contenders: the structure of the final round favored the survivor who
had maintained the best relationships with his comrades. Suddenly, it
wasn't enough just to have clawed your way to the top; you had to have
made friends while clawing. The original ''Joe Millionaire'' went so
far as to undermine the most fundamental convention of all -- that the
show's creators don't openly lie to the contestants about the prizes
-- by inducing a construction worker to pose as a man of means while 20
women competed for his attention.
Reality programming borrowed another key ingredient from games: the
intellectual labor of probing the system's rules for weak spots and
opportunities. As each show discloses its conventions, and each
participant reveals his or her personality traits and background, the
intrigue in watching comes from figuring out how the participants
should best navigate the environment that has been created for them.
The pleasure in these shows comes not from watching other people being
humiliated on national television; it comes from depositing other
people in a complex, high-pressure environment where no established
strategies exist and watching them find their bearings. That's why the
water-cooler conversation about these shows invariably zeroes in on
the strategy displayed on the previous night's episode: why did Kwame
pick Omarosa in that final round? What devious strategy is Richard
Hatch concocting now?
When we watch these shows, the part of our brain that monitors the
emotional lives of the people around us -- the part that tracks subtle
shifts in intonation and gesture and facial expression -- scrutinizes
the action on the screen, looking for clues. We trust certain
characters implicitly and vote others off the island in a heartbeat.
Traditional narrative shows also trigger emotional connections to the
characters, but those connections don't have the same participatory
effect, because traditional narratives aren't explicitly about
strategy. The phrase ''Monday-morning quarterbacking'' describes the
engaged feeling that spectators have in relation to games as opposed
to stories. We absorb stories, but we second-guess games. Reality
programming has brought that second-guessing to prime time, only the
game in question revolves around social dexterity rather than the
physical kind.
The Rewards of Smart Culture
The quickest way to appreciate the Sleeper Curve's cognitive training
is to sit down and watch a few hours of hit programming from the late
70's on Nick at Nite or the SOAPnet channel or on DVD. The modern
viewer who watches a show like ''Dallas'' today will be bored by the
content -- not just because the show is less salacious than today's
soap operas (which it is by a small margin) but also because the show
contains far less information in each scene, despite the fact that its
soap-opera structure made it one of the most complicated narratives on
television in its prime. With ''Dallas,'' the modern viewer doesn't
have to think to make sense of what's going on, and not having to
think is boring. Many recent hit shows -- ''24,'' ''Survivor,'' ''The
Sopranos,'' ''Alias,'' ''Lost,'' ''The Simpsons,'' ''E.R.'' -- take
the opposite approach, layering each scene with a thick network of
affiliations. You have to focus to follow the plot, and in focusing
you're exercising the parts of your brain that map social networks,
that fill in missing information, that connect multiple narrative
threads.
Of course, the entertainment industry isn't increasing the cognitive
complexity of its products for charitable reasons. The Sleeper Curve
exists because there's money to be made by making culture smarter. The
economics of television syndication and DVD sales mean that there's a
tremendous financial pressure to make programs that can be watched
multiple times, revealing new nuances and shadings on the third
viewing. Meanwhile, the Web has created a forum for annotation and
commentary that allows more complicated shows to prosper, thanks to
the fan sites where each episode of shows like ''Lost'' or ''Alias''
is dissected with an intensity usually reserved for Talmud scholars.
Finally, interactive games have trained a new generation of media
consumers to probe complex environments and to think on their feet,
and that gamer audience has now come to expect the same challenges
from their television shows. In the end, the Sleeper Curve tells us
something about the human mind. It may be drawn toward the sensational
where content is concerned -- sex does sell, after all. But the mind
also likes to be challenged; there's real pleasure to be found in
solving puzzles, detecting patterns or unpacking a complex narrative
system.
In pointing out some of the ways that popular culture has improved our
minds, I am not arguing that parents should stop paying attention to
the way their children amuse themselves. What I am arguing for is a
change in the criteria we use to determine what really is cognitive
junk food and what is genuinely nourishing. Instead of a show's
violent or tawdry content, instead of wardrobe malfunctions or the
F-word, the true test should be whether a given show engages or
sedates the mind. Is it a single thread strung together with
predictable punch lines every 30 seconds? Or does it map a complex
social network? Is your on-screen character running around shooting
everything in sight, or is she trying to solve problems and manage
resources? If your kids want to watch reality TV, encourage them to
watch ''Survivor'' over ''Fear Factor.'' If they want to watch a
mystery show, encourage ''24'' over ''Law and Order.'' If they want to
play a violent game, encourage Grand Theft Auto over Quake. Indeed, it
might be just as helpful to have a rating system that used mental
labor and not obscenity and violence as its classification scheme for
the world of mass culture.
Kids and grown-ups each can learn from their increasingly shared
obsessions. Too often we imagine the blurring of kid and grown-up
cultures as a series of violations: the 9-year-olds who have to have
nipple brooches explained to them thanks to Janet Jackson; the
middle-aged guy who can't wait to get home to his Xbox. But this
demographic blur has a commendable side that we don't acknowledge
enough. The kids are forced to think like grown-ups: analyzing complex
social networks, managing resources, tracking subtle narrative
intertwinings, recognizing long-term patterns. The grown-ups, in turn,
get to learn from the kids: decoding each new technological wave,
parsing the interfaces and discovering the intellectual rewards of
play. Parents should see this as an opportunity, not a crisis. Smart
culture is no longer something you force your kids to ingest, like
green vegetables. It's something you share.
Steven Johnson is the author, most recently, of ''Mind Wide Open.''
His book ''Everything Bad Is Good for You: How Today's Popular Culture
Is Actually Making Us Smarter,'' from which this article is adapted,
will be published next month.