News Focus | Immunology

Testing the Line Between Too Much and Too Little


Science  02 Nov 2007:
Vol. 318, Issue 5851, pp. 740-741
DOI: 10.1126/science.318.5851.740

Keeping peanuts and other risky foods from toddlers in the hopes of preventing allergy has been common practice for years. But is avoidance actually safer?

Eat or avoid?

Peanut allergies are on the rise, but steering clear of them early in life may or may not help keep them at bay.


In Israel, the joke goes, a child's first three words are Momma, Dadda, and Bamba. The last, a wildly popular kosher snack of peanut-flavored puffs, shows a cartoon baby with a tuft of red hair on its packaging. Many infants eat Bamba when they're as young as 6 months old.

Bamba has become an unlikely poster child in the world of food allergy, where physicians are desperate to ease the soaring burden of hyperactive immune responses. In the past 10 years, peanut allergies—among the most common and lethal—have roughly doubled in both the United Kingdom and the United States, where they affect at least 1% of young children.

A British study 4 years ago reported that peanut-allergic children have a lower quality of life than those with diabetes, suffering anxiety everywhere, from riding the bus to cruising the supermarket. “The kids don't go to camp like they should, they don't go to birthday parties,” says Brian Vickery, a fellow in allergy and clinical immunology at Yale University. They and their parents live with “fear and tremendous anxiety” that unexpected exposure will trigger shock and even death.

For years, physicians in many countries have urged parents to avoid feeding high-risk children peanuts until they are 3 years old, on the rationale that one cannot become allergic to a food without being exposed to it and that the immune and digestive systems of older children will be better able to tolerate the peanut proteins that can elicit reactions. But in what seems a contradiction, the countries that endorse this view—including the United States, the United Kingdom, and Canada—have seen an explosion of peanut allergies. Meanwhile in Israel, with Bamba melting in the mouths of babies too young to walk, the rate of peanut allergy in toddlers is 0.04%.

Is this association accidental, or can it tell us something about how food allergies develop? It's a question asked increasingly by allergists. Early this year, a British team launched the first trial aiming for a direct answer. Their study is recruiting 480 high-risk babies, offering peanuts to half of them and withholding peanuts from the rest. Will one group be more allergic? There's no safe bet, as studies of food, pet, and other allergies span the spectrum, with conflicting results on whether exposure is preventive.

Rethinking avoidance

An intricate set of variables can shape the onset of allergies: genetics, eczema, not being breast-fed, and the route of exposure, whether by the skin or the mouth. Some allergists wonder if food preparation plays a role—in particular, if the boiled peanuts consumed in Southeast Asia and Africa are less allergenic than the dry-roasted form eaten in the West. The surge in food allergies also tracks an increase in asthma and autoimmune disease in Europe and North America, and all are thought to be related to the plunge in infectious disease in early childhood. The “hygiene hypothesis,” as it's known, postulates that exposure to pathogens trains the immune system to regulate itself.

Given this complexity, there's little hope that avoidance will have universal effects. And indeed, the evidence so far is mixed. For some children, living with a cat may make cat allergies less likely, whereas living with more dust mites either helps or hurts as far as allergy prevention is concerned, depending on the study.

When it comes to peanuts, many physicians are torn between how little they know and what they viscerally fear: Data favoring early avoidance are skimpy, but doctors are uncomfortable promoting peanut consumption. “Peanut allergy is so lethal; you don't get a second chance,” says Patrick Holt of the University of Western Australia in Perth, who examines mechanisms underlying allergy. Still, he says, “the story that we thought we were seeing 10 years ago,” with allergy risk tracking a dose curve, “doesn't hold up.”

The thought that higher doses may not be worse has come gradually, as scientists dig deeper into how environmental cues prompt an immune response. That the immune system attacks something like Escherichia coli bacteria, a sickness-inducing pathogen, “makes sense,” says Stephanie Eisenbarth of Yale University School of Medicine, who performed allergy studies in renowned immunologist Kim Bottomly's lab for her Ph.D. thesis. “But why would you want to fight ragweed?” The common plant isn't a health threat. This is the question allergy experts have asked themselves for decades.

In someone vulnerable—because of their genes or for other reasons—initial exposures to normally innocuous proteins stir white blood cells known as T helper 2 (TH2) cells into a tizzy. They alert B cells, which churn out antibodies to the protein. Those antibodies, part of a class called immunoglobulin E (IgE), bind to mast cells and set off a full-blown reaction the next time the offending protein, called an allergen, appears. The mast cells empty inflammatory chemicals such as histamines into the bloodstream, causing the classic symptoms of allergy.


Scientists are testing whether exposure to peanuts, dust mites (above), and grass can help prevent allergies in high-risk toddlers.


In the mid-1990s, Bottomly, now president of Wellesley College in Massachusetts, began reporting a more nuanced picture in cells and in mice: The immune response to a foreign substance shifted as the dose of that substance increased. At low doses, her lab saw a classic TH2 response. High doses set off the flip side of the immune system, activation of TH1, which served as a brake on TH2. In 2002, Eisenbarth and Bottomly revealed in mice that the same dose pattern held with an adjuvant, a substance in the environment that primes the immune system to respond to an allergen. “We still don't know” what's behind this, says Eisenbarth.

Testing whether something similar occurs in people, some allergy-prevention studies uncovered a bell-shaped curve, particularly with exposure to pets: no reaction at the zero-dose end, a strong reaction with low or moderate doses, and little or no reaction again at the high-dose end.

Some foods may fall under this rubric, too. In 2004, pediatric asthma and allergy specialist John Warner of Imperial College London described a cohort of babies at high risk for egg allergy and examined the effect of maternal diet during pregnancy. Infants whose cord blood had the lowest or highest levels of antibodies to egg proteins were least likely to experience allergic hypersensitivity, such as eczema, by the time they were 6 months old. Those with midrange antibody levels were the most hypersensitive. The evidence is suggestive, although it remains unclear whether a pregnant woman's diet can influence allergy in her unborn child.

Still, studies such as this one have researchers questioning “whether we can really prevent exposure” completely, says Dennis Ownby, a pediatric allergist at the Medical College of Georgia in Augusta. At least half the toddlers he sees who have suddenly reacted to a food weren't known to have eaten it before. But because an allergy can't develop without exposure first, they must have encountered proteins somewhere. Foods not known to contain peanuts may be one place: In 2001, the U.S. Food and Drug Administration reported that 25% of foods surveyed that did not list peanuts on their labels tested positive for them.

“The problem is the small amounts may be even more allergenic than the large amounts,” says allergist Hasan Arshad of the University of Southampton, U.K. No one knows whether this is true in humans. But in cells and animals, the allergy antibody, IgE, can be stimulated with small doses of certain allergens given intermittently, whereas a protective antibody, IgG, comes forward with large doses given regularly, says Allan Becker, a pediatric allergist at the University of Manitoba in Winnipeg, Canada. Arshad's own work, however, underscores the tension between avoidance and exposure.

The Isle of Wight experiment

Seventeen years ago, Arshad began following 120 infants on the Isle of Wight, a semirural island several kilometers off the coast of England. The children, all at high risk of allergy and asthma, were put in a limited-exposure group or one with normal exposures. In the low-exposure group, breastfeeding mothers avoided cow's milk, eggs, nuts, and wheat. Exposure to house dust mites was reduced with pesticides and mattress covers.

In their first 8 years, these children were half as likely to suffer allergies as the control group. Limited exposure worked on the Isle of Wight, Arshad theorizes, because several allergies were targeted simultaneously and because the children steered clear of small exposures that may be deleterious, something other avoidance studies may not have accomplished.

Some physicians are looking beyond the level of exposure and considering how food proteins get inside the body. In 2003, pediatric allergist Gideon Lack, now at King's College London, reported in The New England Journal of Medicine that by the time they were 6 months old, 21 of 23 preschoolers with peanut allergy (91%) had had creams containing peanut oil, which are typically used for diaper rash, applied to their skin. In a control group of 140 children, the figure was 59%. Lack believes the route of exposure is key—specifically, that skin exposure in the absence of ingestion, rather than ingestion of small doses, is behind growing rates of peanut allergy. “It would be highly implausible to me that we've evolved as a species so that if we have tiny amounts of food through the gut,” an allergy develops.
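The strength of the association Lack reported can be sketched from the figures given above. As a rough illustration (not a reanalysis of the study, which adjusted for other factors), an odds ratio can be computed from the 21-of-23 exposure rate among allergic children versus 59% of the 140 controls:

```python
# Rough odds-ratio calculation from the figures reported in the article:
# 21 of 23 peanut-allergic children had peanut-oil creams applied to their
# skin, versus 59% of 140 control children. Illustrative arithmetic only.

def odds_ratio(exposed_cases, total_cases, exposed_controls, total_controls):
    """Odds of exposure among cases divided by odds of exposure among controls."""
    case_odds = exposed_cases / (total_cases - exposed_cases)
    control_odds = exposed_controls / (total_controls - exposed_controls)
    return case_odds / control_odds

exposed_controls = round(0.59 * 140)   # 59% of 140 controls, about 83 children
value = odds_ratio(21, 23, exposed_controls, 140)
print(f"Crude odds ratio: {value:.1f}")   # about 7: exposed children were far
                                          # more common among the allergic group
```

A crude ratio of roughly 7 conveys why the skin-exposure finding drew attention, though only the published study's adjusted analysis speaks to causation.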

One effort to settle the question comes in a study Lack launched early this year. Funded by the U.S.-based Immune Tolerance Network (ITN), it is randomizing babies younger than 11 months old to receive peanuts in different forms, such as peanut butter mixed with banana, or no peanuts until they're 3 years old, and tracking who develops allergies. About 200 children have enrolled so far; the study will last 7 years.

Holt, in Australia, is running a similar ITN study of grass, cat, and dust-mite allergy, recruiting 200 high-risk babies and giving half of them daily doses of allergen drops under the tongue. “For people on the outside, this sounds like very adventurous stuff,” says Holt. But “any early fears of ‘Oh my God, you're going to create havoc’” by exposing high-risk babies to allergens haven't yet been borne out. Results are still a few years off.

In both the United States and the United Kingdom, physicians and health officials are reconsidering guidelines that encourage parents to avoid giving high-risk children potentially allergenic foods; they may shift to a more neutral stance. But “I'm still nervous about peanut,” admits one leader in the field, Hugh Sampson of Mount Sinai School of Medicine in New York City. “I don't have proof, I just have this sort of sense that there's something different about it.” Asked by worried parents what he would do were the child his own, “I'll say, ‘I would probably avoid peanut.’” At the same time, Sampson admits to never quite knowing what to advise. “Whatever we're doing is not working,” he says, “because things have only gotten worse.”
