By Rachael Moeller Gorman, September/October 2008
One lazy Friday night last winter, my husband and I watched TV on the couch while our infant son slept upstairs. During a commercial, an image flashed on the screen—a New York City train station. I smiled, because the place was so familiar. I had been there many times, traveling there for conferences or for fun. I could picture the outer façade, the stars on the ceiling, the brass clock. Ah, that station. That old place where the trains parked. That—
“What the heck is the name of that train station?” I asked my husband.
He gave me a funny look. “Grand Central?”
“Yes!” What was wrong with me? Even though I’m just 32 years old, this kind of thing had been happening more and more often, and it was getting annoying. The forgetting of words—especially names and places that I obviously knew but couldn’t conjure up—began a few years ago. I’d had a brief return to my old, sharper self during my pregnancy, but soon after my son was born, my brain slowly slid back downhill. Why wasn’t it working like it used to? Would it ever come back? Was there something I could do to drag it back to peak performance? What was it about pregnancy that had made it better?
I’m not the only one trying to figure out how to get smarter. Looking for ways to boost our brain power is big business. In addition to ever-popular alertness boosters like coffee and Red Bull, pills like Focus Factor and Brain Advantage are hot items, with customers shelling out $70 per month or more to stay on top of their mental games. Ginkgo biloba, an herbal supplement billed as a memory enhancer, generates nearly $1 billion in annual sales in the U.S. alone. Some folks go even further: as the prestigious journal Nature recently reported, 20 percent of scientists responding to a survey admitted to taking so-called cognition-enhancing drugs like the stimulants Ritalin (to aid focus) and Provigil (to stay alert without caffeine’s jitteriness), without apology. “It is my duty to use my resources to the greatest benefit of humanity,” said one respondent. One-third said they’d even feel pressure to give their own children these drugs if other kids in their circles were also using them.
Though I wasn’t ready to pump myself with drugs in order to remember a name more quickly, I did want to regain control of my mind, and, if I could, head off cognitive decline. The brief lift of brain fog during my pregnancy—a time of heroically conscientious eating—gave me hope. Could improving my diet help? I began scouring the science to find out. I also wondered whether the American diet I’ve been spoon-fed (and am currently spoon-feeding my son) was to blame for my mental malaise. I wanted to figure out whether a smart menu at each stage of life could fend off dullness and make me—and my family—sharper.
[header = The Baby Brain]
My intellectual journey began in the far reaches of northern Quebec, in a smattering of small villages on the frigid coast of Hudson Bay. No roads connect the villages to each other or to southern Canada, so when Joseph Jacobson, Ph.D., of Michigan’s Wayne State University and his intrepid crew of researchers first arrived 12 years ago, they flew in on small propeller planes from Montreal. Jacobson studies the Inuit, and he does so for just about the same reason cardiovascular disease researchers have been interested in other northern communities for years: their diet. “The Inuit eat a lot of fish,” says Jacobson from his Detroit office. “Arctic char, a type of salmon, is very big in their diet. And it’s all very rich in DHA.”
DHA, or docosahexaenoic acid, a long-chain polyunsaturated fatty acid in the omega-3 family that’s found in fish and their roe (particularly fattier types like salmon and sardines), is the magic nutrient du jour. I have seen literally hundreds of studies investigating its power to prevent cardiovascular disease. Now the focus has turned to the brain: dozens of studies report that mother animals deprived of DHA have offspring with memory, sensory and visual problems, and that supplementing them with DHA improves their performance on learning, memory and problem-solving tasks. This makes intuitive sense: DHA forms the backbone of much of the brain cells’ membranes.
Jacobson wanted to see whether higher DHA levels, both in the womb and after birth, could have the same positive effect on human infants. So working with midwives in the three largest Hudson Bay villages, his team collected umbilical cord blood from 109 newborns. They analyzed the DHA concentration in their cord blood (a good measure of how much DHA the mother consumed during her last few months of pregnancy), and then tested how well the infants performed on tests throughout their first year. He found that at 6 months and 11 months, infants whose cord blood had the highest concentrations of DHA performed better on a number of different tests—such as recognizing faces—than those with lower levels. “The mother’s intake during the third trimester, when the brain’s neurons and synapses are developing at a very rapid rate, is most important. When we focused on that period, we found the most evidence of beneficial effects,” he says.
My mother certainly didn’t eat salmon while pregnant with me, so that could be my problem, but it’s doubtful: my memory problems only emerged recently. Luckily for my son, my OB/GYN is on top of the literature: when she found out that I couldn’t stomach salmon or other fatty fish, she recommended taking a DHA supplement during my third trimester. (Pregnant mothers are advised to get 300 milligrams per day—the equivalent of about three to four 3-ounce servings of salmon a week.)
Apparently, much of the rest of the country isn’t too fond of fatty fish either. “Most populations, and this is particularly true in the U.S. and southern Canada, are not getting nearly the amount of DHA that humans got prehistorically,” says Jacobson, who like many in his field believes that before the agricultural revolution, fish played a much more prominent role in our diets. “In our original environment, we were getting a lot of DHA,” he comments, “then we switched over to a more grain-based diet.”
Compounding the problem, adds Jacobson, is that our diets are rich in another type of fatty acid: arachidonic acid (AA), an omega-6 polyunsaturated fatty acid found in animal fats and formed in the body when we consume linoleic acid from vegetable oils in foods. There’s nothing inherently bad about AA—it’s important for normal growth. But Jacobson and others believe that our prehistoric ancestors evolved to eat a more balanced ratio of omega-6 to omega-3 fatty acids. Today, few people eat enough fish to achieve this balance; the ratio is currently about 10:1 in the U.S. Since AA competes with DHA for space in the membrane and affects other functions in the brain, some experts suggest an abundance of AA is less optimal for cognitive development in babies (and may be associated with early cognitive decline in older adults—more on that later).
Unless they stick to the “eat fish at least twice a week” dietary guideline, it’s hard for most Americans to meet DHA recommendations without supplements. This is why many infant formulas are now fortified with DHA (breast milk can be a great source of DHA, if the mom eats fish or takes supplements herself). Jacobson hints that supplementing formula, however, could be a case of too little too late—in his Inuit study he saw no beneficial effect of DHA-rich breast milk on cognitive performance in infancy, although there could still be some beneficial effects on cognitive function in childhood. In the majority of studies that have demonstrated beneficial effects from DHA-enhanced infant formula, he notes, “most of the effects have been limited to preterm babies”—i.e., those who missed getting their full in utero complement of the nutrient. Does this mean that the typically DHA-poor American diet places infants at risk? Jacobson is quick to assure me there is no reason to assume that it does. “You don’t want to oversell the problem, but our data suggest that greater quantities of prenatal DHA intake could be beneficial.”
[header = Milk vs. Formula?]
DHA is not the only critical substance for developing babies’ brains. Researchers have known for some time that iron is also key, but recently they have been discovering just how long the effects of a deficiency can last. Babies are born with a solid store of iron, but by 5 or 6 months they’ve used much of it up and can’t get enough from breast milk to sustain their ever-growing bodies: they need to take it in from the outside world in food or supplements. Studies show that being deficient at points within the 5- to 12-month age block irrevocably slows academic, social and emotional development. Even if children are fortified with iron soon after the deficiency is detected, they never catch up, and can still show signs of cognitive delay even 10 years later.
Iron not only transports oxygen to the brain in the bloodstream; it also helps myelinate, or insulate, nerve fibers so signals travel faster—and helps create the neurotransmitters that relay signals between neurons. Until the early 1970s, when manufacturers began adding iron to formula, more than 30 percent of infants were iron-deficient; since fortification, that number has plummeted. (At last count, about 7 percent of toddlers were iron-deficient.) But with the rise in breastfeeding, exclusively breast-fed infants are now at risk, especially as they’re transitioning to solid foods. Breast milk is still the best food (bar none) for infants, but physician groups recommend using rice cereal fortified with iron or supplementing with a vitamin drop during and after that critical transition to solid foods around 5 or 6 months.
With irreversible brain delay churning through my gray matter (and after consulting the pediatrician), I drove to the drugstore and bought a multivitamin with iron for my son, who at 7 months was still breastfeeding but beginning to discover the delights of runny rice cereal and mushy peas.
[header = Fuel for School]
As children reach school age, DHA and iron continue to be key to brain development, but for kids sitting in class for seven hours a day, it’s even more important to keep their energy-hungry brains satiated. Reams of studies show that fueling the brain with breakfast is important for thinking, acting and learning; that’s the impetus behind the federal School Breakfast Program, which aims to ensure that every child begins the school day with something in his or her stomach.
Children who are undernourished perform poorly on cognitive tasks. Eating breakfast improves performance on attention and memory games, especially in the undernourished, but it also helps children who get enough food. This may be a simple case of refueling after an overnight fast: the brain needs glucose (its primary fuel source), and eating just about any food, from a candy bar to five-grain muesli, provides it. But new research suggests there’s more to it than that, and not just any breakfast will do.
Margaret Anne Defeyter, Ph.D., a senior lecturer in psychology at Britain’s Northumbria University in Newcastle upon Tyne, studies children and the foods they eat. She has a lot to say about how kids prepare their brains each morning. “I was stunned, absolutely stunned to see what children had for breakfast,” she exclaims. “Children tell me they grab a chocolate biscuit [British for cookie] out of the biscuit barrel on their way to school, or stop at the corner shop and buy a can of cola.”
Defeyter wanted to know if a switch to slower-burning carbohydrates might give kids an advantage on tests of attention and memory. To find out, she gave 64 children either low-glycemic-index All-Bran cereal or high-glycemic-index Coco Pops, and then switched them the next day. Guess which one kept kids’ brains a-chugging most of the morning?
“With the high-GI cereal you get this sudden sugar rush, where you perform very well, but it’s quickly followed by a low,” she says. “Whereas with the low-GI cereal, you get a more sustained level of performance. That’s important for children. You want their concentration and attention maintained throughout the school morning for learning.” Other studies swapping in low-GI oatmeal for a higher-GI cereal have shown a similar effect: the lower the GI of the breakfast, the better kids did on cognitive tasks requiring attention and memory. The few studies looking at the effects of breakfast on adult brains showed similar results: low-glycemic-index meals that released glucose slowly into the bloodstream seemed to be associated with better memory.
I thought back to my own childhood, sometimes starting the day with a bowl of Cookie Crisp or, occasionally, Froot Loops. Perhaps it slogged down my timed second-grade multiplication quizzes. But that still didn’t explain why my brain was on the fritz now. Was something else missing?
[header = Iron-Deficient Maidens]
Back in early 2007, I came across a study that I still think about all the time. Here’s the gist: 113 young women, aged 18 through 35, came into the lab at Pennsylvania State University. They took eight different tests on a computer that assessed attention, memory and learning, and their blood was drawn to compare their level of iron to their results on the computer tests. The findings were dramatic: women who were even mildly iron-deficient—not yet anemic—scored much lower on many of the tasks, and took longer to complete them, than the women whose iron levels were normal. About 10 percent of young women are anemic (because of their monthly loss of iron-rich blood), as are 25 percent of pregnant women. In fact, I’d been told early in my pregnancy that I was slightly anemic, but it never occurred to me that it was much of a problem.
“What that study was able to do for the first time is show that even if you are mildly iron-deficient—you don’t have to be anemic—you have changes in cognitive function,” says John Beard, Ph.D., the iron expert who conducted the study. “It’s a scary thing that people don’t like to hear,” he admits, since a good number of us fall into that slightly iron-deficient gray area.
What really gave me hope was the other half of the study, where Beard put half the women on a slow-release iron supplement containing 60 mg of elemental iron for four months. Unlike the results seen in studies with iron-deficient infants, the women receiving the supplements regained normal cognitive functioning. How? Beard says that since the adult brain is already formed, iron’s primary role is to help feed the brain and build neurotransmitters; some of the brain regions most sensitive to iron deficiency are the prefrontal cortex and the hippocampus, centers of higher intellectual functioning and memory.
In Beard’s study, those women whose blood iron levels improved significantly experienced a five- to seven-fold improvement in cognitive performance. So it is a good bet that part of my problem was lack of iron; my temporary brain boost while I was pregnant could have resulted from the iron-rich prenatal vitamins I took every day. I vowed to do a better job of eating fortified cereals, edamame, clams, white beans, spinach, lentils—and of course meat, which contains the most easily absorbed form of iron. And I started taking a supplement on the side. Better safe than stupid.
Of course, if you’re not iron-deficient, taking more iron isn’t going to do anything to make your brain sharper—and too much iron creates problems of its own, including iron overload (hemochromatosis), which can lead to liver damage, heart failure or diabetes. Since our bodies are unable to get rid of excess iron (except by bleeding), it makes sense to have your blood iron levels checked before you head to the drugstore for a “brain-boosting” iron pill—especially if you’re a woman past menopause or a man.
[header = Heading Off Decline]
I’m doing my best to build a first-class brain for my son and I work hard to keep all of our minds lubed. But nothing, even the most tenderly nurtured neurons, lasts forever. With my current fuzz, I fear eventually losing myself (or my husband) to dementia, and I wanted to know how to sandbag my family against it. So I called David Smith, professor emeritus of pharmacology and head of Oxford University’s 20-year-old Oxford Project to Investigate Memory and Aging (OPTIMA), which studies nongenetic risk factors (a.k.a. environmental factors) that cause Alzheimer’s disease. His words made me happy: “It’s my personal belief that we will be able to prevent a large proportion of Alzheimer’s disease in the world.”
Smith sees parallels between Alzheimer’s disease and heart disease—an illness whose prevalence has decreased around 60 percent over the past 40 years, largely because of preventive measures taken at a societal level, such as reducing smoking, increasing exercise, eating well and taking drugs to reduce cholesterol and blood pressure. “If we can find what the risk factors are for Alzheimer’s disease, we can have a similar success.”
Unfortunately, no one yet knows what all these risk factors are, but observational studies are beginning to yield clues: it seems the same things that are good for the heart may also be good for the brain. The connection makes intuitive sense: Alzheimer’s disease likely results in part from the accumulation of so-called senile plaques—abnormal brain proteins called A-beta that many scientists think trigger inflammation and oxidation, damaging neurons. If a person has atherosclerosis, their vessels are gunky and inefficient, resulting in fewer nutrients and less oxygen supplying the brain and fewer waste products leaving it, thus exacerbating Alzheimer’s disease.
Smith was particularly interested in fish because observational studies have shown a strong link between high fish intake and a reduced risk of full-blown Alzheimer’s disease. He wanted to know if eating fish regularly could also help people improve their brain power. So he called up colleague Helga Refsum, a professor of nutrition at the University of Oslo who leads the Hordaland Health Study—one part of a huge national project that gathers extremely detailed information about people’s lives and charts cardiovascular disease all over Norway. The county of Hordaland is on the sea, and people there eat lots of fish. Smith found that, of 2,031 healthy Hordalanders aged 70 to 74, those who ate more than a two-ounce serving per week of any type of fish (not just the fatty, DHA-packed variety) scored much higher on cognitive tests than those who ate less.
I asked Smith how it could be that all types of seafood are linked to improved cognitive function, since every study I’ve ever seen points to omega-3s like DHA as the key brain-boosting component. “Of course, the fatty acids are a strong candidate,” he says, “But it may be something else. Fish is very rich in niacin; there have been reports that niacin intake is related to better cognition in the elderly. Fish is also a good source of vitamin B12.” Because the aging body is less able to absorb B vitamins, particularly B12, he explains, the elderly often have low levels, which has been associated with poorer cognitive function. “So there are several candidates in fish and we want to tease them out.”
[header = A Golden Opportunity]
Point taken: eat fish for your brain and your heart. But what happens if you’re a vegetarian, have a seafood allergy or can’t afford to eat fish regularly? Or, like me, just don’t like fish? I’ve never quite gotten over my pregnancy-induced aversion to the stuff.
Perhaps I could cover the taste with curry powder and benefit from a seasoning that’s been coming into focus as a potential anti-Alzheimer’s agent, at least in animals: turmeric. Greg Cole at UCLA and his colleagues have reported that curcumin, a phytochemical in turmeric (which gives curry powder its yellow color), not only helps prevent the buildup of toxic A-beta protein in the brain, but also has antioxidant and anti-inflammatory properties. It has been used for thousands of years in traditional Indian Ayurvedic medicine as a treatment for respiratory conditions (asthma, allergy), liver disorders, anorexia and cough, among other things, and throughout Asia it’s used to treat arthritis pain and other inflammatory conditions. Cole is in the middle of a clinical trial on curcumin, but an interesting observational study came out in 2006 from Singapore that found that healthy people aged 60 to 93 who ate curry “occasionally” (once a month) or “often or very often” scored better on cognitive tests than people who rarely ate it. It’s also quite interesting to note that Indian citizens in their seventies (whose diets are rich in curry) are about one-quarter as likely to have Alzheimer’s disease as American septuagenarians.
[header = Ancient Wisdom]
To really uncover the secret of a clear mind late in life, though, I turned to the people who walk the walk. Some of the longest-lived people in the U.S. are from Cache County in the far northeast corner of Utah, where a majority of folks are Mormon and their beliefs shape a lifestyle that’s relatively free of vices like caffeine, tobacco and alcohol. Many are open to taking nutritional supplements and have the support of a close-knit community—all factors that may pave the way to a long and healthy life. Scientists have been carefully following 90 percent of the elderly population here—around 5,000 people—for 13 years, to see which parts of their lifestyle play the largest role in their longevity. The researchers have documented the foods the residents have eaten, the activities they’ve done, the jobs they’ve had. They have drawn blood, tested cognition and revisited the subjects about every three years to see why they live such long, healthy lives.
I asked Peter Zandi, Ph.D., an epidemiologist at the Johns Hopkins Bloomberg School of Public Health and a researcher on the study, what he’s found out about nutritional influences on mind decline, and he put it in a single word: antioxidants. “People who took high-dose supplements of both vitamin E [from 400 to 1,000 IU daily] and vitamin C [500 to 1,000 mg or more] had on the order of 60 to 65 percent reduction of the risk of developing Alzheimer’s disease,” Zandi says. “That’s huge. This led us to the notion that it’s really the synergistic effects of both that may afford protection.” (Currently, the Institute of Medicine’s daily recommendation for vitamin E is 22 IU [15 mg] and 75 to 90 mg for vitamin C.) Zandi thinks this vitamin partnership might work because vitamin E is a potent antioxidant that can slip inside cells and mop up the damaging free radicals, while vitamin C waits patiently outside and replenishes vitamin E when it comes back out so it can continue working.
Anytime the body turns glucose into energy, free radicals are produced and oxidation (or damage) to tissue can occur. “The brain uses more energy than any other organ in the body, [thus] the brain is more susceptible to oxidative damage than any other organ in the body,” explains Fernando Gomez-Pinilla, Ph.D., who recently analyzed 160 studies on food’s effect on the brain. A professor of neurosurgery and physiological science at UCLA, Gomez-Pinilla published his review this past July in Nature Reviews Neuroscience.
But hold off on buying that mega-antioxidant formula just yet: though some other studies have supported this work, not all are positive, and most experts advise avoiding antioxidant supplements until all the answers are in. Large doses of antioxidants can sometimes have a paradoxical, pro-oxidizing effect and cause cellular damage. However, the research is a strong argument for including more vitamin E-rich foods like walnuts, almonds, sunflower seeds and dark greens in your diet, along with plenty of citrus fruits, tomatoes, cantaloupe and other foods abundant in vitamin C.
Of course, one time-proven, antioxidant-rich way of eating doesn’t involve supplements at all: Mediterranean diets, famously protective against heart disease, may have promise in preventing Alzheimer’s disease as well. Recent studies suggest that people who most closely adhere to the dietary patterns long practiced in the countries surrounding the Mediterranean Sea—plenty of fruits, vegetables and whole grains, little meat, occasional fish and liberal olive oil—have significantly lower risk of developing Alzheimer’s disease later in life. Researchers believe that the antioxidants, omega-3 fatty acids and other micronutrients this way of eating offers may work synergistically to reduce the risk.
[header = Minding our Minds]
Unfortunately, the typical American diet is far from the brain-boosting ideal. Most Americans don’t eat fish multiple times a week, get nine servings of fruits and vegetables daily or regularly season their food with curry. Babies don’t always get their iron, kids eat candy for breakfast and processed foods fill our grocery-store shopping carts. “Our diet today is really very, very different from primitive man’s diet,” says David Smith. So different that it’s bad for our brain? “I think it might be,” he replies.
In addition to not eating enough of the good things, we tend to eat too much of the bad stuff: a number of recent studies show that eating too much cholesterol, trans fat and saturated fat increases risk of cognitive decline and Alzheimer’s disease. One just-out report found that when rats were fed a diet high in saturated fat and cholesterol for eight weeks, their performance on a battery of memory tests declined significantly. Another study suggests that eating 80 milligrams more cholesterol per day than you normally do (the amount in a four-ounce piece of steak) seems to make your brain work, temporarily, as if it were three years older. Even worse, disease and lifestyle issues that continue to plague us, such as high blood pressure, lack of physical activity and diabetes, are all pushing us toward cognitive decline.
With the food environment we live in, it’s hard not to eat poorly unless you pay a premium. Rather than subsidizing antioxidant-rich fruits and vegetables, the federal government puts most of its support behind omega-6-rich soybeans as well as corn (thus keeping corn-syrup-laced junk food and sweet cereals cheap). Salmon is typically twice the price of beef. For the school breakfasts that power many kids’ mornings, the federal government’s requirements are broad enough that cheap, sugary cereals or Danish pastries pass muster. (The 2005 Dietary Guidelines declare, “make half your grains whole,” but schools still don’t have to comply. Promisingly, they’re working on the problem, and within two years schools should be more in line with the new guidelines, says Nancy Johner, USDA Under Secretary for Food, Nutrition and Consumer Services.)
Back home in the kitchen, I remember the pink marshmallow Sno Balls and Lucky Charms of my childhood. But my new dietary path already surrounds me on the counter: the baby’s small childproof bottle of multivitamin drops with iron; a cylinder of whole oats for my husband and me; my iron tablets; plenty of vitamin C-rich oranges and vitamin E-packed nuts; salmon (for my husband, and me, if I start liking it again) and lean free-range beef wrapped up in the fridge along with plenty of vegetables and fruits. I have my son’s little mind to think about now, and I’m excited to start. With any luck, I can also head off dementia for my husband and me. Will it work? I’ll tell you in 30 years (if I remember).
Contributing editor Rachael Moeller Gorman won the Bert Greene Food Journalism award for her last EatingWell feature, “Miracle Up North” (June/July 2006).