[Paleopsych] NYT Magazine: In Search of Lost Time
Premise Checker
checker at panix.com
Tue Jan 18 15:38:19 UTC 2005
In Search of Lost Time
New York Times Magazine
December 5, 2004
By CATHRYN JAKOBSON RAMIN
A few months ago, as I trudged down the stairs of my office
building, deep in my thoughts, I noticed a dark-haired
woman waving to me from the window of her car. She looked
vaguely familiar, but I couldn't place her. Like quite a
few others, she had slipped out of my mental Rolodex. In my
brain, the synaptic traces that connected us had frayed.
Yet again, I had misplaced an entire human being.
''So wonderful to see you,'' she said, inquiring by name
after every member of my family, including the two dogs.
Apparently she was not a casual acquaintance. Fending off
panic, I proceeded through a mental list: Work? School?
Synagogue? I couldn't visualize her in these places. I was
about to cut and run with a quick ''nice to see you, too''
when the rear window slid down, revealing a toothy grin.
''We've been to the orthodontist,'' she said. The minute I
saw Sam's freckled face, the mystery was solved. Our sons
were best pals in nursery school and kindergarten. I had
sat in her kitchen, discussing birthday parties. I
remembered her backyard dotted with Little Tikes plastic
play furniture. I knew what she did for work, and the name
of her Portuguese nanny.
''Lisa,'' I said, as if her identity had never eluded me,
''it's terrific to see you.''
Why, as I edge toward the end of my 40's, has so much of
what I know become impossible to access on demand? Where
are the thoughts that spring forth in the shower but
evanesce before they can be recorded, the mental lists that
shed items on the way to the supermarket? The names of
books and movies, actors and authors, le mot juste, the
memory of social plans agreed upon in some calendarless
situation -- what has become of these?
I take some comfort in the fact that I am not alone. In the
space of one week, a psychologist remarked that she had
turned over all social scheduling to her husband, at his
insistence, because the couple had appeared at yet another
dinner party on the wrong night. A woman who publishes a
local magazine noted that she'd just come from the bank,
where she'd spent 10 minutes searching through her purse,
briefcase and pockets for a check that she'd never written.
A freelance illustrator disclosed that he'd gone to work on
Friday, completed a drawing for an editor and mailed it
off, only to return on Monday to re-execute the identical
assignment without any sense of deja vu.
The feelings of embarrassment, frustration and anger that
surround such middle-aged lapses serve to disguise a more
primitive emotion. At the heart of it, there is fear --
cold, implacable anxiety, emerging from the suspicion that
this is just the beginning. Memory, the instrument we
trusted to guide us, has instead betrayed us, making us
deeply uncertain about our cognitive futures. We worry
about decades of dependence, of life with a diminished mind
trapped in a still vigorous body.
To the person who has misplaced his keys three times in two
days or just called a colleague by the wrong name,
forgetfulness in middle age can feel like incipient
Alzheimer's disease. But for most of us, the memory
deficits we encounter in midlife reflect a common pattern
of brain aging and are not thought to be predictive of the
progressive degeneration that leads to dementia.
Neuropsychological tests can help tease out the difference
between normal aging and pathology. Even in the early
stages of Alzheimer's, individuals show a marked inability
to remember a list of several words after a 20-minute delay.
When they are reminded that, for instance, one of the words
is a type of fruit, they still lack the ''aha!'' experience
that allows the average person to say, ''Oh, yes, it was an
orange.''
''We are hyper-alert about Alzheimer's disease,'' said Dr.
Oliver Sacks, the author and neurologist, when I asked him
why we find cognitive lapses so worrisome. ''Even momentary
forgetting, quite benign, can be unduly upsetting, because
there is general alarm around us. But that is only one part
of it: For people who have always been very competent,
forgetting brings a disturbing sense of the loss of control
and mastery.''
So much that is fundamental is bound up in the ease and
accuracy of recollection. Foremost, there is trust: On the
afternoon when you forget that it is your turn to carpool
and leave three kids and a disgruntled coach standing on a
soccer field in the teeming rain, your belief (and theirs)
that you are a reliable person is severely tested. There is
self-knowledge: at the holiday table, when your brother
recounts your role in setting the garage on fire some 30
years earlier and you can't recall the event, your
historical perspective is altered forever. There is
self-esteem: when you open your mouth to ask a question at
a professional meeting, certain that you have the facts at
your fingertips, but the words elude you, you feel witless
and weak.
''In an information age,'' writes Charles Baxter in a
collection of essays called ''The Business of Memory,''
''forgetfulness is a sign of debility and incompetence. It
is taken as weakness, an emblem of losing one's grip. For
anyone who works with quantities of data, a single note of
forgetfulness can sound like a death knell. To remember is
to triumph over loss and death; to forget is to form a
partnership with oblivion.''
Nearly 15 years ago, when I was pregnant with my first son,
I realized that something was happening to my mind. Beyond
the typical things -- forgetting names, directions to
places I'd been before -- I found it harder to comprehend
or retain complex reading material. I could no longer make
rapid connections between ideas, because I'd lost access to
knowledge I'd already absorbed.
''How bothersome the loss is,'' Dr. Sacks said, ''depends
very much on personality. Someone who has prided herself on
control, on having everything in order, may be much less
tolerant than the easygoing person.'' As an indisputably
Type A person, I was deeply perturbed.
Three years ago, a few days before I turned 45, I went to
the movies with my husband. On the short drive home, I
realized that I couldn't remember the title of the film,
which I had liked very much, or the name of the actor who
played the leading role. Was this just the result of
growing older -- the same middle-aged muddle my friends
felt -- or was it something of a different magnitude? The
question, for me, had become urgent. I could give up,
resigning myself to existence in a mental fog. Or I could
subject my brain to the best analysis and treatment that
science could offer.
I went first to see Dr. Gary Small, director of U.C.L.A.'s
Center on Aging. He's the author of ''The Memory Bible,''
as well as the recently published ''The Memory
Prescription.'' After meeting with him, I enrolled in a
research study he was conducting; I'd adhere to a
high-protein diet including fruits and vegetables with
antioxidant properties, omega-3 fatty-acid supplements,
daily multivitamins and capsules of vitamins E and C. The
program also called for exercise, daily mental challenges
and stress-release activities. Dr. Small was investigating
whether following this regimen for two weeks would improve
memory. In order to assess my base-line cognitive
abilities, Dr. Small ordered two imaging studies -- a PET
scan and an M.R.I. of my brain -- as well as a brief
neuropsychological evaluation. The images were heartening
-- apparently, my brain was free of the signs of
Alzheimer's disease or evidence of stroke or tumors. As for
my neuropsych evaluation, Dr. Karen Miller, a
neuropsychologist working on the research study, explained
gently that although some of my scores were below those of
my peers, when averaged, they were consistent with the
impairment that one might expect at my age.
At first I didn't grasp the importance of what Dr. Miller
had said. That five of my scores showed significant
cognitive deficits was in fact the first concrete evidence
that something was awry in my brain. I clung instead to the
notion that what I was experiencing was ''average.''
And what precisely did it mean to have an average amount of
memory impairment? Although we notice it first in middle
age -- sometimes as early as our mid-30's -- memory starts
to decline in our 20's. This has been demonstrated with
mice, rats, primates and humans, all of whom begin to lose
processing speed at about the same relative age. If you're
a middle-aged rat, 15 months old, this means that it takes
you longer to locate an underwater platform in a water
maze. If you're a middle-aged human, it means that when you
hear a list of words, you begin to lose some of your
ability to ''acquire'' them (place them in short-term
memory and parrot them back immediately), ''store'' them
(move them -- after 10 seconds -- from short-term memory to
long-term memory) and ''retrieve'' them (haul them out of
long-term memory). These abilities don't change overnight,
but by the time a person reaches her early 40's, there are
statistically significant differences from the
early-to-mid-20's peak.
One explanation for these changes is put forth by Dr.
George Bartzokis, director of U.C.L.A.'s Memory Disorders
and Alzheimer's Disease Clinic, who has studied the midlife
breakdown of the myelin sheath, a sheet of lipid, or fat, that
wraps around the delicate branches of a neuron and is
critical to brain development. From infancy, cholesterol
levels in the brain slowly increase in order to facilitate
myelin growth. Bartzokis suggests that at some point after
age 30, these cholesterol levels reach a point where they
become high enough to promote the development of a toxic
protein that begins to eat away at myelin and other
membranes, disrupting the smooth flow of neuronal messages.
(It is not clear whether reducing blood cholesterol has an
effect on levels of brain cholesterol, but researchers
suggest that cholesterol-lowering medications are among the
preventative therapies worth investigating.)
''Our hypothesis is that the very process of myelination --
which allows us to become wise human beings -- sets up the
degeneration,'' Dr. Bartzokis says. ''How to prevent that
degeneration is the focus of a great deal of research.''
In some individuals, the escalation of toxic protein may
begin earlier or progress more quickly than in others,
possibly engendering the development of the plaques and
tangles that are the hallmark of Alzheimer's disease.
Scientists believe that there is a relationship between
this toxic protein production and the Apolipoprotein E
gene. All of us carry two copies of this gene. Research
confirms that individuals who carry its e-4 variant (about
20 percent of the population)
are vulnerable to developing Alzheimer's disease at an
earlier age.
There is a blood test to determine whether a person carries
this variant. Several scientists warned me, however, that
the test could not predict whether or not a person would
develop Alzheimer's and noted that it would be difficult to
obtain without a diagnosis of unspecified dementia. My
internist, accustomed at this point to my requests for odd
laboratory tests, simply noted ''memory impairment'' on the
appropriate form and faxed it to Athena Diagnostics in
Massachusetts. If the test was positive, health-insurance
providers would likely consider me a terrible risk. Still,
I wanted to know. Several weeks and half a dozen phone
calls later, I had my answer: I didn't carry the variant.
Generally speaking, middle-aged forgetting follows a
familiar pattern. People's names often go first, because
they are word symbols with no cues attached. Difficulties
with word retrieval tend to follow. Instead of the phrase
you want, you get what James Reason, a psychologist at the
University of Manchester, in Britain, called ''the ugly
sisters'' -- similar-sounding but frustratingly incorrect
combinations of syllables.
Recently acquired ''how to'' memory becomes challenging to
consolidate. You think and think, but you just can't
remember the steps required to back up the new hard drive,
a skill you perfected yesterday. Prospective memory, that
is, remembering to perform an action at some distance in
the future -- to fetch milk from the store on the way home,
for instance -- is vulnerable, particularly in the face of
competing distractions. The cues that are supposed to
remind you that you need milk -- your husband's phone call
a half-hour before, or the Post-it now deep in your handbag
-- fail to alert you, until you pull into your driveway.
''As you age,'' said Dr. Daniel Schacter, a Harvard
psychologist, ''those retrieval cues have to be readily
accessible, unambiguous and informative. The equivalent of
a string around the finger isn't going to do it. You're
going to be asking yourself what that string is for.''
A decline in the availability of working memory, which
allows us to manage several ideas or intentions at the same
time, storing and retrieving them with the fluidity of a
three-card monte player, is perhaps the most odious loss of
all. Multitasking can be frustrating and often
counterproductive. New research, from Dr. David E. Meyer, a
psychologist at the University of Michigan, shows that for
all but the most routine endeavors -- and few cognitive
efforts seem to require such minimal attention -- it is
more time-consuming and wearying for the brain to alternate
among tasks than it would be if the same jobs were done one
at a time.
There are many potential catalysts for forgetfulness; in
fact, the list is so long that it's a wonder we remember
anything at all. Stress, anxiety and depression all inhibit
memory. Hypothyroidism can affect memory and concentration.
Type 2 diabetes and its precursor, insulin resistance, can
also significantly reduce cognitive function. Even
fish-eating can be a hazard. Exposure to neurotoxins, most
commonly the methyl mercury that we consume when we eat
large predatory ocean fish, like swordfish and tuna, can
result in what Dr. Jane Hightower, a San Francisco
internist who wrote the resolution on fish and methyl
mercury toxicity that was adopted by the American Medical
Association, calls ''fish fog.''
None of these factors seemed to account for my own
cognitive troubles, however. I wasn't depressed. My mercury
levels were a little high, but not high enough to cause
fish fog. My thyroid was fine. I felt stressed, certainly,
but for the most part because I was so worried about my
memory. One catalyst, on the other hand, did seem
plausible: lack of sleep. Like many people who had spoken
to me about their memory deficits, I slept poorly -- often
I was up at 3 a.m., when, in the words of the poet Richard
Lang, the bedroom turns into ''a switching yard for the
freight trains of anxiety.'' A modest but constant sleep
shortage can undermine alertness, a University of
Pennsylvania study notes. Those with ''minor'' sleep debts
-- say, sleeping just four to six hours a night -- may
display cognitive declines equal to those of people who have not
slept for up to two full nights.
Good sleep, both REM and non-REM, appears to be
critical to the ability to absorb information. During
non-REM sleep, which comprises about 80 percent of snooze
time, simple spatial tasks and recollections of personal
experiences may be consolidated, according to Dr. Michael
Perlis, who directs the University of Rochester's Sleep and
Neurophysiology Research Laboratory. Tasks involving visual
skills, like facial recognition and memory of events with
strong emotional impact, appear to be fortified during REM,
as are memories of complex actions and procedures.
Dr. Jan Born, at the University of Lubeck, in Germany,
recently demonstrated how our sleeping brains may continue
to focus on problems that baffle us during waking hours.
That's why, in addition to being well rested while you take
in information, it may also be important to have a good
night's sleep afterward, in order to successfully move that
information into long-term memory.
Born's study suggests that creativity or problem-solving
insight may often happen during that portion of non-REM
sleep known as slow-wave sleep -- the deepest type of
sleep, usually occurring during the first third of the
sleep cycle and usually devoid of dreams. But from age 40,
said Dr. Robert Stickgold, a sleep researcher and assistant
professor of psychiatry at Harvard Medical School,
''slow-wave sleep virtually disappears, diminishing from
about 20 percent of the night to near zero. Since slow-wave
sleep helps us consolidate certain types of memories, this
might explain a substantial component of our memory decline
with age.''
Several pharmaceutical companies, Dr. Perlis said,
are pursuing the development of new compounds that may
reverse declines in slow-wave sleep. ''This new class of
drugs may or may not help people fall asleep as quickly, or
stay asleep as long, as traditional sleeping pills,'' he
said. ''But they have the potential to produce
qualitatively better sleep.''
As a research subject in Gary Small's study, I'd been
following his recommended memory regimen. Along with the
diet, I pursued the recommended exercise program, spending
some time each day on an elliptical trainer. Studies
indicate that as we age, our mental abilities are improved
by regular aerobic and strength-training workouts, while
nonaerobic exercise, like stretching and toning, is less
beneficial. I felt more energetic, but frankly, I didn't
feel much sharper.
Frustrated, I went to see Dr. Jonathan Canick, the director
of the neuropsychological assessment service at California
Pacific Medical Center in San Francisco. Canick is a
pragmatic specialist, accustomed to evaluating patients
with serious dementias, head injuries, brain tumors and
strokes, as well as those with more subtle neurocognitive
disorders. ''When a patient ends up in my office,'' he
said, ''it's because the medical professionals are
stumped.''
Dr. Canick suggested that for most middle-aged people, the
real issue was not so much declining memory or retention
but rather the faltering ability to attend to and process
the onslaught of colliding streams of information coming at
us all day long. ''It gets experienced as a memory issue,''
he said, ''but in reality, it could be about attention,
learning or retrieving information.'' We could blame
evolution: our brains, designed to attend to novel stimuli
like a tiny sound downstairs in the middle of the night,
ignore that which seems old and familiar. A great deal of
what we experience every day -- some of it important, some
not -- simply fails to be encoded. As we age, our brains
slip into ''been there, done that'' mode. ''If it blows by
you,'' he said, ''and it doesn't register, you're never
going to be able to retrieve it -- because it doesn't
exist.''
Over the course of two days, Dr. Canick put me through an
exhausting seven hours of neuropsychological tests. I knew
I was struggling. To test facial recognition, I thumbed
through a book of head shots. A minute later, presented
with a book containing the same photos as well as a group
of new ones, I was unable to say whom I had already seen.
In another task, I was asked to connect the dots through an
alternating and ascending lineup of numbers and letters. I
lost the sequence and had to backtrack to rediscover it.
Numbers, letters, words, figures -- they were bewildering.
''Keep going,'' Dr. Canick said. ''Go to the end.'' He
worked my brain like a trainer works an athlete, looking
for weakness.
Halfway into the testing, he told me that there was no
evidence of a dementing, neurodegenerative or progressive
disorder. But the tests I flubbed nevertheless showed
impairments that were disturbing and not considered
''average'' in midlife. He explained that there might be a
reason for these deficits. People with mild traumatic brain
injuries, he said, often demonstrate variable and reduced
ability for attention, processing information, word-finding
or multitasking. Typically, they interpret their experience
of slowed processing and attention deficits as ''memory''
problems.
''For now, it's only a hypothesis,'' he said. ''But your
symptoms and your results show the distinct neurobehavioral
fingerprint of brain damage, the kind that stems from a
series of mild traumatic brain injuries.''
''That's impossible,'' I said. ''I've never even been
knocked unconscious.''
''And that,'' Dr. Canick said, ''reflects a very common
misperception.'' Concussions do not always result in a loss
of consciousness, he explained; one can have a mild
concussion, experienced as ''seeing stars,'' and remain
conscious. In fact, a person doesn't even have to
experience direct impact to her head. Rapid acceleration or
deceleration of the head, which is often accompanied by a
rotation of the brain, can result in concussion. In some
cases the brain bounces off the interior of the skull,
causing dendrites and axons to be stretched and sheared,
damaging the myelin sheath and disrupting communication in
a way that could cause a person eventually to slow
cognitively and physically. Mild traumatic brain injuries
often are undiagnosed, Dr. Canick said. With successive
concussions, the damage compounds rather than simply adding
up. Even if the first injury did little harm, the second
can do disproportionately more, as can every injury that
follows.
A few weeks later, I broached the subject of brain injury
with my brother Peter, expecting him to agree that Dr.
Canick's hypothesis was ridiculous. He did not. ''Don't you
remember,'' Peter asked, ''when we were children, and I hit
you with. . . . '' I never heard the end of the sentence. I
hadn't given it a thought in 30 years, but in less than a
second, I was 9 years old, back in the basement of our
house in Scarsdale. My brother, a whirling towheaded kid
drunk on centrifugal force, spun in circles, an old
broomstick extended horizontally from his hands. I was in
the wrong place. The impact knocked me flat. For the next
three weeks, as my eye sockets and forehead turned every
color in the rainbow, my fourth-grade teacher referred to
me as Miss Technicolor.
There were other head injuries as well: horseback-riding
wrecks and, because I am tall, forehead-smashing
collisions with low-hanging doorways and tree branches. One
by one, these recollections emerged. According to the
National Center for Injury Prevention and Control, at least
1.1 million people each year sustain mild traumatic brain
injuries that result in confusion, disorientation or
impaired consciousness for fewer than 30 minutes. The
number is probably underestimated, given that many people
with mild injuries don't go to a doctor's office or an
emergency room at all.
How could I know that Dr. Canick was right -- that my mild
traumatic head injuries could actually produce long-lasting
neurocognitive deficits? I was reluctant to credit his
diagnosis, suspecting that he might want to be the guy with
the answer, whether or not that answer was correct. I
understood the concept of compounding damage, but why had I
failed to notice any impairment until I reached my
mid-30's?
''You must take into account the concept of neuronal
reserve,'' said Dr. Ronald Ruff, a clinical
neuropsychologist in San Francisco, who concurred with Dr.
Canick's findings. ''By age 25, you have all the neurons
you're going to get,'' he said. ''For most of us, the fact
that we experience continuous slow cell death over the
years doesn't become evident until we reach our 80's. If,
on the other hand, you've had concussions, or abused
substances or alcohol, you'll have diminished your share of
neurons, and the slope of decline will be sharper. In your
20's, this is usually no big deal, but by the time you
reach your mid-30's or 40's, the net availability has
declined so much that, when you're called to rise to the
height of your capacity, you start to notice.''
That made sense, I thought -- but why, in my case, had the
onset of cognitive problems occurred so swiftly? Hormonal
changes during pregnancy can affect memory and cognition,
Dr. Canick said. In addition, ''you had an underlying
vulnerability,'' he explained. ''You toughed it out during
your 20's, because you had the neuronal resources to do it.
After the birth, you faced a new situation -- you were
compelled to divide your attention as you never had before,
and you discovered the deficiency in your brain function.''
Serious long-term effects of mild traumatic brain injury
are often missed because the injured person returns rapidly
to normal life, said Dr. Tracy McIntosh, professor of
neurosurgery at the University of Pennsylvania. ''Several
months down the road -- about two months later in mice --
you'll begin to see subtle cognitive changes,'' he said,
because, perhaps, it takes that long for the injured
neurons to die or the neural pathways to become
dysfunctional. The vast majority of these injuries were
thought to resolve completely within a few months or even
weeks, but brain-injury specialists like Dr. McIntosh now
question that assumption.
I returned to Dr. Canick to talk more about the results of
my test and his theory about brain trauma. ''Your results
range from the 10th percentile to the 98th,'' he said.
''You cannot rely upon your own abilities, because they are
so variable.'' He said it was as if neurologically I were
two different people. ''You don't know which of the two
people is going to be available for any task. And that is
destabilizing, as well as a recipe for anxiety, confusion,
angst and self-doubt.''
Dr. Canick's descriptions felt achingly familiar, an
explanation for a dichotomy I'd felt for years. As a
psychologist, Dr. Canick could not prescribe drugs. But he
told me that several of his brain-injured,
attention-compromised patients had improved with the use of
neurostimulants, either Ritalin or Adderall, the same drugs
that are regularly administered to children with A.D.H.D.
Although these drugs enabled people to focus better and
make more effective use of their brains, he cautioned that
they did not bestow abilities that weren't there in the
first place.
I discovered that Adderall -- an amphetamine and a
controlled substance with a high potential for abuse and
addiction -- was rapidly developing black-market status.
Despite its side effects -- dry mouth, insomnia, lack of
appetite, headache and racing heartbeat -- college students
were using it to improve their focus on exams, some young
professionals were taking it to increase their productivity
at work and, increasingly, middle-aged people like me were
using it to restore their attention and
concentration. If you were willing to visit a psychiatrist
or a sympathetic general practitioner and answer a series
of rather transparent questions that suggested that you
were suffering from adult attention deficit disorder, it
seemed that prescriptions were readily available.
Adderall and Ritalin appear to provide a boost in focus to
virtually anyone who ingests them. Dr. Anthony Rostain,
medical director of the University of Pennsylvania's Adult
A.D.H.D. Treatment and Research Program, suggested that he
wouldn't be surprised if, in the future, hordes of
middle-aged people popped pills for cognitive enhancement.
In fact, he predicted that these stimulants would be
available over the counter. ''Given the performance
orientation we have today,'' he said, ''and the urgent need
to improve productivity, it seems to me that people will
use these drugs in the same way we now use socially
sanctioned stimulants like caffeine.'' Other cognitively
enhancing drugs, he noted, were on the way -- the market
for them was vast, and the pharmaceutical companies had
taken notice.
I wish I could say that the Adderall didn't help. But after
about a week, the gears meshed in my brain. Once again, I
could move sentences around in a manuscript without finding
myself holding a handful of orphaned words. I regained
access to my vocabulary. My errands proceeded in an orderly
fashion.
Dr. Canick asked me to return to his office. He wanted to
test me again now that I was on the drug. On the test of
facial recognition, my scores improved from well below
average -- in the 19th percentile -- to the 93rd. On a test
of mental arithmetic, my performance increased from the
50th percentile to the 91st for people my age. Verbal math
problems, once unfathomable because I could not remember
how many doughnuts Sally, Bill and Jane had each purchased,
became quite easy. Other tests showed more modest
improvements, but the trend was clear.
I didn't kid myself. Drugs aside, the mechanism was still
broken. If Dr. Canick's diagnosis was correct, I was
dealing with a problem that could be patched up but never
fixed. New imaging technology may someday detect
microscopic damage to axons and specific neural pathways,
perhaps answering the question of whether my brain had
indeed suffered an injury. While I knew that my expectations for
myself were high, and that a pathology could be involved, I
also saw that many people in midlife experienced the same
sense of perpetual distraction and preoccupation. What had
brought us to this point? I wondered. Were we trying too
hard to live fast-paced, information-heavy lives, when our
brains were naturally slowing down? Our fleeting attention,
it seemed to me, might be a protective if ill-timed
response -- the brain's way of saying that it had simply
had enough.
After a month of Adderall I could see that there were side
effects I hadn't read about in the drug literature. I
worked like a demon, but I found myself disconnected. At
the computer I was entirely focused, but off duty, certain
pleasures, like wandering around aimlessly in my own mind,
were no longer available to me. I began to take
mini-vacations from Adderall -- a Sunday off, so that I
could recline in a lounge chair and watch my kids perform
cannonball dives. I suspected that I was gunning a
middle-aged engine at speeds better suited to one with
fewer miles on it, and that there would be consequences.
Because I never experienced the feeling of euphoria that
causes some people to desire ever-increasing doses of the
drug, I didn't worry about addiction, but I was concerned
about psychological dependency.
Sometimes I wondered whom I was trying to fool. Was this
cognitive enhancement actually no more than vanity, as
frivolous as a face lift, but more deceptive, because in
the end, you duped only yourself? I could not imagine
tossing the Adderall prescription and returning to the
mental fog. Nevertheless I found myself wondering whether
at some point in the future, such hard-edged, drug-induced
accuracy might start to feel as unseemly to me as a
thigh-high miniskirt, and I'd quit.
Not long ago, I spent some time with Dr. Thomas Crook III,
a clinical psychologist who had devoted his long career --
including 14 years at the National Institute of Mental
Health, where he served six years as chief of the Geriatric
Psychopharmacology Program -- to helping establish
age-associated memory impairment as a clinical condition
that warranted attention and treatment. Years ago, he noted
the insensitivity implicit in telling older patients who
complained about their memories that what they were
experiencing was inconvenient but typical. If they went in
complaining that they could no longer read, he wrote in
1993, ''it would scarcely occur to the clinician to inform
them that their problems are no worse than those of other
persons of the same age and, therefore, that they do not
merit treatment.''
Something he mentioned gave me hope that I would not always
feel so troubled by what had happened to my mind. Although
for many, essential cognitive skills, like the ability to
remember names or recognize faces, decline precipitously as
the decades go by, people's self-reported impressions
reflect a different understanding. ''Asked how they would
describe their memories,'' Dr. Crook said, ''people who are
in their 40's are the most critical. In their 50's, they
feel a little bit better about their capacities, and by the
time they reach their 60's, they're as satisfied as they
were in their early 30's.''
With Adderall, I had a Proustian taste of what I thought
I'd left behind. I was glad to know that, at least while
pharmaceutically enhanced, I still had the chops. Still, I
often thought about what Dr. Crook had said. At what point
might I stop dwelling on what had been lost, I wondered,
and begin to relish what I had gained with age? Perspective
and insight, fused with acceptance, formed the cornerstone
of wisdom. The rest, presumably, I could get from Google.
Cathryn Jakobson Ramin is at work on a book about
midlife memory for HarperCollins.
http://www.nytimes.com/2004/12/05/magazine/05MEMORY.html