News this Week

Science  25 Mar 2005:
Vol. 307, Issue 5717, pp. 1848
  1. PALEOANTHROPOLOGY

    Discoverers Charge Damage to 'Hobbit' Specimens

    1. Elizabeth Culotta

    Yet another skirmish has erupted in the battle over the bones of the “hobbit,” the diminutive hominid found on the Indonesian island of Flores and announced last year as a new species of human, Homo floresiensis. Late last month the 18,000-year-old bones were returned to their official home, the Center for Archaeology in Jakarta, after being borrowed by Indonesia's most prominent paleoanthropologist, Teuku Jacob of Gadjah Mada University in Yogyakarta (Science, 4 March, p. 1386). Now archaeologist Michael Morwood of the University of New England (UNE) in Armidale, Australia, leader of the team that discovered the bones, charges that the specimens were seriously damaged in transit or while in the Yogyakarta lab. Jacob insists that the bones were intact when they left his lab and that any damage must have occurred when they were no longer under his care.

    Morwood says the left side of the pelvis—which he calls one of the hominid's most distinctive features—was “smashed,” perhaps during transport. He and his UNE colleague, paleoanthropologist Peter Brown, also say that at least one Silastic mold was apparently taken of some of the delicate bones, which were described as the consistency of “wet blotting paper” when found. The molding process caused breakage and loss of anatomic detail in the cranial base of the skull and jawbone, they say. Morwood adds in an e-mail that a second, still-unpublished jawbone “broke in half during the molding process and was badly glued back together, misaligned, and with significant loss of bone.”

    Broken bones.

    The Flores hominid pelvis before transport, and after.

    CREDIT: PETER BROWN/UNIVERSITY OF NEW ENGLAND

    Jacob strongly denies that the bones suffered any damage—“at least not in our lab. We have photographs, taken on the last day, and [the bones are] not damaged,” he told Science. “They used a suitcase [to carry the bones back to Jakarta],” he adds. “I do not use this to transport fossils; we use special bags to carry bones.”

    Jacob, who says his lab is the only one in Indonesia set up for paleoanthropological analysis, says researchers made a mold and one cast of the skeleton, but that it was “impossible” for the procedure to have damaged the bones. He adds that his team reconstructed some of the remains, putting pieces together in order to study them, because this had not yet been done.

    Paleopathologist Maciej Henneberg of the University of Adelaide in Australia says the bones, including the left side of the pelvis, were intact when he viewed them in Yogyakarta on 17 February. He adds that “Professor Jacob's laboratory has decades of experience caring for fossils, and I would be surprised to learn that if they made a mold it would damage the bones,” an opinion echoed by paleoanthropologist Jean-Jacques Hublin of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, who visited Jacob in January.

    Wherever the damage to the pelvis occurred, says Brown, “the most important point is that it was too fragile to move in the first place. [It] should never have left Jakarta.”

  2. ETHICS

    Doctors Pay a High Price for Priority

    1. Martin Enserink

    AMSTERDAM—The drive for priority may have gotten doctors at an academic hospital in the Netherlands in trouble with the law. Several were so intent on publishing the first report on the re-emergence of a rare sexually transmitted disease in 2003, a government agency says, that they did not convey their findings to health authorities while an article was in press, squandering a chance to limit the international spread of the disease. According to a report from the Dutch Health Care Inspectorate last week, the authors violated a law requiring hospitals to report unusual outbreaks immediately.

    The report describes the 2003 discovery of an outbreak of lymphogranuloma venereum (LGV) in gay and bisexual men in the Netherlands, many of them infected with HIV. LGV, which can produce painful ulcers and swelling of the lymph nodes, is caused by certain types of the microbe Chlamydia trachomatis. When treated with the right antibiotics, LGV can be cured completely; it is prevalent in the tropics but almost never seen in the Western world.

    That changed in December 2003, however, when a group led by Martino Neumann at Erasmus Medical Center in Rotterdam published a case study about a single patient with LGV in Sexually Transmitted Infections. Shortly afterward, Dutch public health authorities, who had not heard about the case, issued an international alert. Since then, more than 200 cases of LGV have been found in the Netherlands, Germany, France, the United Kingdom, and the United States.

    Unreported.

    Authorities did not hear about a rare Chlamydia trachomatis outbreak.

    CREDIT: DAVID PHILLIPS/VISUALS UNLIMITED

    The warning could have come 6 to 8 months earlier, according to the Dutch health inspectors. The Erasmus group saw its first patient in January 2003 and the second in April, and then traced more than a dozen others during the course of 2003. Although some members repeatedly proposed reporting the cluster to the Municipal Health Service, the report says, the group failed to do so—out of “fear that others would run with the data and publish about the matter first.” The result, it says, “in all likelihood [was] a much wider spread of the infection” than necessary.

    LGV is not a notifiable disease in the Netherlands, but the inspectors say that the group broke article 7 of the Dutch Infectious Diseases Law, which obliges the heads of certain institutions to report any unusual clusters of possibly infectious diseases. The report puts most of the responsibility on two doctors, identified only as the “department head of the department of dermatology and venereology” and the “medical head of the STD clinic,” although the latter was not a co-author on the paper. A spokesperson says the inspectorate will file a complaint with the Dutch Medical Disciplinary Board citing these two, as well as a former head of the clinic who acted as a consultant.

    In a statement, Erasmus Medical Center said it believes the delay did not endanger public health. The statement welcomed the disciplinary procedure—“even if it is aggravating to the staff members concerned”—because it could bring clarity about reporting requirements under Dutch law.

    Sexually Transmitted Infections, which accepted the LGV manuscript on 21 July 2003, didn't instruct the team not to report the findings to health authorities, says that journal's editor, Helen Ward of Imperial College in London. Indeed, no medical journal would do that, says former New England Journal of Medicine editor Arnold Relman: “Clearly, public health should always come first.”

  3. CAREER TRANSITIONS

    Panel Throws Lifeline to Bio Postdocs

    1. Yudhijit Bhattacharjee

    For years, postdoctoral scholars have complained that they receive too little help in making the crucial transition from trainee to independent investigator. Last week a new report by the National Academies suggested shoring up that support in ways that might benefit the entire biomedical community.

    The report, from a panel chaired by Howard Hughes Medical Institute president Thomas Cech, asks the National Institutes of Health (NIH) to create individual awards and training grants for postdocs that would make them less dependent on their principal investigators. It recommends allowing foreign postdocs to apply for training awards that are currently open only to U.S. citizens and permanent residents. And it suggests limiting the length of postdoctoral training to 5 years, regardless of a postdoc's source of funding.

    The cost of implementing these recommendations could strain an NIH budget that is no longer growing rapidly. But NIH Director Elias Zerhouni seems willing to give them a try. “There's no wrong time to do the right thing,” he says.

    Melting pot.

    A new report says foreign-born postdocs, a rising share of the pool, should also be eligible for the NRSA program.

    SOURCE: THE NATIONAL ACADEMIES AND BLS

    One recommendation that's already under consideration is to create starter grants for investigators who have a research idea but no preliminary findings to include in their proposal (Science, 25 June 2004, p. 1891). Such applications would be reviewed separately from the standard proposals, called R01s, that established investigators submit. Another recommendation, without a price tag, would require senior grant applicants to provide a detailed plan for mentoring their postdocs. That change would force “investigators to think about the careers of young researchers in their laboratory instead of just using them as scientific labor,” says Cech.

    Two of the panel's recommendations—waiving the citizenship requirement for the National Research Service Awards (NRSA) and other postdoctoral training awards, and shifting money from R01s into career development awards—could well face significant opposition. “Making federal support available to those who are not U.S. citizens or permanent residents can be controversial,” the report says about the NRSA program. “But … those who would receive such training awards are likely already supported on research grants and are critical to advances in U.S. biomedical research.” One panelist who requested anonymity noted that a 1998 academies report also called for tapping the R01 pot to fund early-career grants. “It didn't go anywhere,” she says.

    Zerhouni praised other recommendations in the report as being consistent with his belief that NIH should be doing more to nurture the creativity of young scientists. One would expand a small program at several institutes by setting up 200 agencywide career transition awards each worth $500,000. Another would award renewable R01-like grants, with a cap of $100,000 in indirect costs, to university researchers not on the tenure track.

    Offering independent awards to investigators early in their careers, he says, sends the message that NIH wants them to “show us what you can do.” The goal, he adds, is to avoid a situation in which a young scientist regrets not having the chance to demonstrate that “I coulda been a contender.”

  4. SCIENTIFIC MISCONDUCT

    Researcher Faces Prison for Fraud in NIH Grant Applications and Papers

    1. Eli Kintisch

    In the most extensive scientific misconduct case the National Institutes of Health (NIH) has seen in decades, a researcher formerly at the University of Vermont College of Medicine in Burlington has admitted in court documents to falsifying data in 15 federal grant applications and numerous published articles. Eric Poehlman, an expert on menopause, aging, and metabolism, faces up to 5 years in jail and a $250,000 fine and has been barred for life from receiving any U.S. research funding.

    Scientists say the falsified data—including work in 10 papers for which Poehlman has requested retractions or corrections—have had relatively little impact on core assumptions or research directions. But experts say the number and scope of falsifications discovered, along with the stature of the investigator, are quite remarkable. “This is probably one of the biggest misconduct cases ever,” says Fredrick Grinnell, former director of the Program in Ethics in Science at the University of Texas Southwestern Medical Center in Dallas. “Very often [in misconduct cases], it's a young investigator, under pressure, who needs funding. This guy was a very successful scientist.” Neither Poehlman nor his attorney returned calls from Science.

    Retractions.

    Eric Poehlman (shown in 1991 photo) has notified journals about 10 papers that required retractions.

    CREDIT (LEFT): UNIVERSITY OF VERMONT PHOTO

    Poehlman, 49, first came under suspicion in 2000 when Walter DeNino, then a 24-year-old research assistant, found inconsistencies in spreadsheets used in a longitudinal study on aging. The data included energy expenditures and lipid levels for elderly patients. “[V]alues for total cholesterol, insulin, resting metabolic rate, and glucose” were falsified or fabricated, said a statement Poehlman signed last week. In an effort to portray worsening health in the subjects, DeNino tells Science, “Dr. Poehlman would just switch the data points.”

    After DeNino filed a formal complaint, a university investigative panel looked into Poehlman's research and uncovered falsified data in three papers. These included a much-cited 1995 Annals of Internal Medicine study that suggested hormone replacement therapy could prevent declines in energy expenditure and increases in body fat during menopause. In that paper Poehlman presented metabolic data on 35 women taken 6 years apart. Most of the women did not exist, according to the statement Poehlman signed. (In 2003 the paper was retracted.) Poehlman left Vermont in 2001, before the investigation ended, for the University of Montreal. He left there in January and now lives in Montreal.

    A 2-year review by the Office of Research Integrity (ORI) at the Department of Health and Human Services found more falsified data in another dozen federal grant applications, ORI investigators said. Last week the Department of Justice announced that the total was 17, and that NIH and the U.S. Department of Agriculture had given Poehlman $2.9 million in grants based on fraudulent applications. In addition to pleading guilty to making a false statement on a federal grant application, Poehlman agreed to pay $180,000 to settle a civil suit with the government. A plea hearing and sentencing are pending.

    Colleagues say Poehlman's work was extensive but did not affect underlying assumptions about how the body changes during aging. Richard Atkinson, editor of the International Journal of Obesity, said in an e-mail that removing Poehlman's work may reduce the evidence that energy expenditure decreases across time with menopause, but “it does not invalidate the concept.” Judy Salerno, deputy director of the National Institute on Aging in Bethesda, Maryland, says his work “wasn't the final answer.”

    Journal editors say it's hard to guard against such misconduct. A rigorous review process can do only so much, says Harold Sox, who became Annals's editor in 2001: “You just have to trust the authors.”

  5. GENETICS

    Talking About a Revolution: Hidden RNA May Fix Mutant Genes

    1. Elizabeth Pennisi

    When it comes to plants and animals, biologists think of DNA as the sole storehouse of genetic information. But a surprising new study of the mustard plant Arabidopsis thaliana challenges that notion. In the 24 March issue of Nature, Susan Lolle and Robert Pruitt of Purdue University in West Lafayette, Indiana, and their colleagues report that in this weed, gene inheritance can somehow skip generations: Plants sometimes end up with their grandparents' good copy of a gene instead of the mutant ones belonging to their parents. The researchers put forth the radical proposal that plants contain an inheritable cache of RNA that can briefly reverse evolution, undoing mutations and restoring a gene to its former glory.

    “[The paper] suggests the existence of a unique genetic memory system that can be invoked at will,” says Vincent Colot of the Plant Genomics Research Unit at Genopole in Evry, France. If confirmed and extended to animals, the new findings could profoundly affect biomedicine as well as population genetics. For example, geneticists trying to assess disease risk would have to take into consideration the makeup of this RNA memory, notes Emma Whitelaw of the University of Sydney, Australia.

    Pruitt and Lolle first discovered that genes could go back in time about 3 years ago, while studying an A. thaliana gene called HOTHEAD. In plants with both copies of HOTHEAD mutated, the floral parts are all stuck together into a little ball.

    Mutation in reverse.

    RNA may undo a mutation that causes A. thaliana flower parts to fuse (left), such that offspring flower normally (right).

    CREDIT: S. J. LOLLE ET AL., NATURE 434, 505 (2005)

    Typically, when such a mutant plant self-fertilizes, its progeny inherit two copies of the gene responsible for the abnormal trait. Thus, when this Arabidopsis strain reproduced that way, there should have been two mutant HOTHEAD genes passed on, and all the progeny should have had balls instead of flowers. Instead, Lolle and Pruitt found that 1% to 10% of the offspring produced normal flowers, indicating that at least one copy of the mutant gene had reverted to its nonmutated form in those plants. “It's something that Mendelian genetics has not prepared us for,” says Pruitt.

    They tested whether the progenies' wild-type version of HOTHEAD had been derived from mutated ones by fertilizing a wild-type Arabidopsis strain with pollen from the original mutant strain. Most of the time, the offspring had the expected genetic makeup—one mutated HOTHEAD and one wild-type allele—and normal flowering. But 8 out of 164 embryos examined had two wild-type alleles, says Pruitt.

    To ensure that wild-type seeds hadn't inadvertently gotten mixed up in their experiments, they checked the DNA of plant embryos removed directly from the HOTHEAD mutant plant, before any exposure to other plants or seeds. Most of the embryos had two mutant genes, but a few showed signs of a reverted version. They also closely examined the HOTHEAD gene sequence and ruled out that the reversions were the result of random mutations or extra copies of the good genes stowed away in the genome.

    “This is the first time that it is shown that an organism can harbor, in a hidden form, additional sets of genetic information from previous generations and that this information can be copied back onto the DNA at the next generation,” says Colot. RNA “templates” derived from the original gene and stored in the gametes are the best candidates for reverting the mutant gene to its original state, says Pruitt.

    Other labs will undoubtedly rush to test that remarkable suggestion. If true, it would join several other recently discovered functions for RNA that biologists are just now beginning to appreciate. “I am not sure the mechanism will turn out to be the right one,” notes Elliot Meyerowitz, a plant developmental geneticist at the California Institute of Technology in Pasadena, California. “But I can't think of any [explanation] that's much brighter than what they have.”

  6. PALEONTOLOGY

    Tyrannosaurus rex Soft Tissue Raises Tantalizing Prospects

    1. Erik Stokstad

    It's not Jurassic Park-style cloning, but a remarkable find has given paleontologists their most lifelike look yet inside Tyrannosaurus rex—and, just possibly, a pinch of the long-gone beast itself.

    On page 1952, a team led by Mary Schweitzer of North Carolina State University in Raleigh describes dinosaur blood vessels—still flexible and elastic after 68 million years—and apparently intact cells. “If we have tissues that are not fossilized, then we can potentially extract DNA,” says Lawrence Witmer, a paleontologist at Ohio University College of Osteopathic Medicine in Athens. “It's very exciting.” But don't fire up the sequencing machines just yet. Experts, and the team itself, say they won't be convinced that the original material has survived unaltered until further test results come in.

    The skeleton was excavated in 2003 from the Hell Creek Formation of Montana by co-author Jack Horner's crew at the Museum of the Rockies in Bozeman, Montana. Back in the lab, Schweitzer and her technician demineralized the fragments by soaking them in a weak acid. As the fossil dissolved, transparent vessels were left behind. “It was totally shocking,” Schweitzer says. “I didn't believe it until we'd done it 17 times.” Branching vessels also appeared in fragments from a hadrosaur and another Tyrannosaurus skeleton. Many of the vessels contain red and brown structures that resemble cells. And inside these are smaller objects similar in size to the nuclei of the blood cells in modern birds. The team also found osteocytes, cells that deposit bone minerals, preserved with slender filopodia still intact.

    A stretch?

    Dissolved T. rex bone yielded flexible, branching vessels (left), some of which contain cell-like structures (right).

    CREDIT: M. H. SCHWEITZER

    If the cells consist of original material, paleontologists might be able to extract new information about dinosaurs. For instance, they could use the same sort of protein antibody testing that helps biologists determine evolutionary relationships of living organisms. “There's a reasonable chance that there may be intact proteins,” says David Martill of the University of Portsmouth, United Kingdom. Perhaps, he says, even DNA might be extracted.

    Hendrik Poinar of McMaster University in Hamilton, Ontario, cautions that looks can deceive: Nucleated protozoan cells have been found in 225-million-year-old amber, but geochemical tests revealed that the nuclei had been replaced with resin compounds. Even the resilience of the vessels may be deceptive. Flexible fossils of colonial marine organisms called graptolites have been recovered from 440-million-year-old rocks, but the original material—likely collagen—had not survived.

    Schweitzer is seeking funding for sophisticated tests that would use techniques such as mass spectrometry and high-performance liquid chromatography to check for dino tissue. As for DNA, which is less abundant and more fragile than proteins, Poinar says it's theoretically possible that some may have survived, if conditions stayed just right (preferably dry and subzero) for 68 million years. “Wouldn't it be cool?” he muses, but adds “the likelihood is probably next to none.”

  7. ASTRONOMY

    Alien Planets Glimmer in the Heat

    1. Robert Irion

    Exoplanets have finally become real. After a decade of inferring the presence of nearly 150 other worlds from oscillating patterns in starlight, astronomers announced this week that they have measured light from two of them for the first time. “We are moving out of the realm of merely counting planets and knowing their orbital paths,” says planetary scientist Heidi Hammel of the Space Science Institute in Boulder, Colorado. “It's a new ball game now.”

    The research, described 22 March at a NASA briefing in Washington, D.C., concerns two “hot Jupiters” eclipsed by their host stars every few days as seen from Earth. In visible light, the stars blaze 10,000 times brighter than the planetary pinpricks. But in infrared light, that factor dwindles to 400 because the planets reradiate torrents of heat from their scalding orbits—just a 10th of Mercury's distance from our sun. Although that is too close for telescopes to make an image, NASA's Spitzer Space Telescope can pick up those faint extra dollops of warmth.

    Two independent teams used Spitzer in late 2004 to stare at the stars for several hours each, spanning the times when each planet was predicted to pass directly behind its sun. Like clockwork, the total infrared light from the stars dimmed by about 0.25% when the planets disappeared and then edged back up again when they emerged. “It was bang-on what we expected from a hot planet going behind its star,” says astronomer David Charbonneau of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, whose team will report on the planet TrES-1 in the 20 June Astrophysical Journal.
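
    As a rough consistency check (our arithmetic, not a calculation taken from the papers), the quoted infrared contrast ratio predicts the eclipse depth directly: when the planet slips behind its star, the system loses the planet's share of the total light, about one part in 400.

    ```latex
    % Eclipse depth ~ planet-to-star flux ratio at the observed wavelength
    \frac{\Delta F}{F} \approx \frac{F_p}{F_\star} = \frac{1}{400} = 0.25\%
    ```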

    Extra warmth.

    The Spitzer Space Telescope saw tiny heat signatures from two exoplanets as they emerged from behind their parent stars.

    CREDIT: DAVID A. AGUILAR/CFA

    Spitzer detected each planet at just one or two wavelengths. Proposed studies of the subtle signatures with all of the satellite's instruments will produce a full infrared spectrum of the planets' gaseous atmospheres, revealing their temperatures and ingredients such as carbon monoxide, water vapor, and sodium, forecasts astronomer Drake Deming of NASA's Goddard Space Flight Center in Greenbelt, Maryland. “Spitzer will pin this down beautifully,” says Deming, lead author of a paper on the planet HD 209458b in this week's online edition of Nature.

    The early results deepen one mystery about HD 209458b, Deming says. Studies had shown that the planet is unusually “puffy,” with a radius 35% larger than Jupiter's. Theorists predicted that an unseen sister planet must be forcing HD 209458b into an oval orbit, raising tides in the planet's interior and making it expand. However, Spitzer's timing of the eclipse shows a perfectly circular orbit, says Deming—as do new studies of the star's back-and-forth wobbles by a team led by astronomer Gregory Laughlin of the University of California, Santa Cruz. “I'm sure there will be another flurry of theoretical explanations” for the planet's hefty size, Deming says.

    Ongoing searches for other eclipsing planets will lead to a new cottage industry of measuring light from exoworlds, Hammel believes. “These hot Jupiters are just the starting point,” she says. “These are the biggest, brightest, and easiest.”

  8. PALEOCLIMATE

    Ocean Flow Amplified, Not Triggered, Climate Change

    1. Richard A. Kerr

    Figuring out what's going on with this year's weather is hard enough, so pity the poor paleoclimatologists trying to understand how the world drifted into the last ice age 70,000 years ago. For half a century, paleoceanographers have been studying elements or isotopes preserved in deep-sea sediments as markers of the workings of past climate. This “proxy” approach has worked, but only up to a point. Both the climate system and paleoclimate proxies can be unexpectedly subtle and complex.

    On page 1933, a group of geochemists and paleoceanographers advances another proxy: isotopes of the rare-earth element neodymium, which they believe faithfully trace the ups and downs of the heat-carrying Gulf Stream flow. By their reading of neodymium, changes in the speed of the Gulf Stream—a much-discussed mechanism for altering climate—came too late in major climate transitions to have set the climate change in motion. “It's groundbreaking work,” says paleoceanographer Christopher Charles of the Scripps Institution of Oceanography at the University of California, San Diego. “It's going to stimulate quite a bit of work either to try to extend the analysis or shoot it down.”

    Researchers at Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, began pursuing neodymium as a circulation tracer because it seemed to offer a prized trait: immutability. The ratio of neodymium-143 to neodymium-144 in North Atlantic and Pacific waters differs enough, thanks to the range of ratios of surrounding continental rocks, that it can be used to follow the mixing of waters as currents flow from basin to basin.
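
    The underlying arithmetic is standard two-endmember mixing, sketched here in its simplest form (this is the generic textbook relation, not the Lamont group's specific calculation, and it assumes the two water masses carry comparable neodymium concentrations):

    ```latex
    % R = {}^{143}\mathrm{Nd}/{}^{144}\mathrm{Nd}; NA = North Atlantic endmember, Pac = Pacific endmember
    R_{\mathrm{mix}} = f\,R_{\mathrm{NA}} + (1 - f)\,R_{\mathrm{Pac}}
    \quad\Longrightarrow\quad
    f = \frac{R_{\mathrm{mix}} - R_{\mathrm{Pac}}}{R_{\mathrm{NA}} - R_{\mathrm{Pac}}}
    ```

    A ratio measured in a sediment core thus translates into the fraction f of North Atlantic-sourced water bathing the site when the sediment formed; if the endmember concentrations differ, each term must be weighted by its neodymium content.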

    Lagged.

    Glacial cold and ice grew (1, top) thousands of years before ocean circulation slowed (3, bottom).

    CREDIT: ADAPTED FROM A. M. PIOTROWSKI

    Ocean circulation changes should dominate the changes in the neodymium ratio, say Lamont group members Alexander Piotrowski (now a postdoc at the University of Cambridge, U.K.), geochemists Steven Goldstein and Sidney Hemming, and paleoceanographer Richard Fairbanks. For example, plankton can't change the ratio—as it does the isotopic composition of carbon—because neodymium is too massive an element for biology to separate its isotopes. And in fact, the isotope ratio preserved in the microscopic bits of iron-manganese in a classic sediment core from the southeastern South Atlantic matches the story told by previous tracers. During each of four temporary warmings during the last ice age, the ratio swung down and then back up—just as it should have done if the warm, north-flowing Gulf Stream had temporarily sped up, as more North Atlantic water flowed south in the deep arm of the “conveyor belt” flow.

    In the run-up to the ice age, by contrast, the core told a more complicated tale. Starting about 70,000 years ago, bottom waters cooled as glacial ice grew on the polar continents, as indicated by oxygen isotopes of microscopic skeletons of bottom-living organisms. Then, a couple of thousand years later, carbon isotopes shifted as the growing ice and climatic deterioration shrank the mass of plants on land, sending their isotopically light carbon into the sea. Only after another couple of thousand years did the conveyor belt flow slow down, according to neodymium.

    Given that millennia-long lag behind the growing cold and ice, “ocean circulation responded to climate change,” says Goldstein. At least at glacial transitions, the slowing of warm currents could have put the final chill on the ice age, but “it's not the trigger of climate change.” Presumably, the initial cooling was an indirect response to the decline of solar heating over high northern latitudes brought on by the so-called Milankovitch orbital variations: the ever-changing orientation of Earth's orbit and rotation axis. However, changes in ocean circulation may have triggered abrupt climate shifts once the ice age was under way, Goldstein notes.

    Although many paleoceanographers like the idea of ocean circulation as a follower rather than a leader, a single core is not likely to win the day. Neodymium “seems to be working remarkably well,” says paleoceanographer Jerry F. McManus of Woods Hole Oceanographic Institution in Massachusetts. But the history of climate proxies and a few hints in the South Atlantic record tell him that neodymium may not be the perfect ocean circulation proxy. He and others will be looking for weaknesses.

  9. PROTEOMICS

    Protein Chips Map Yeast Kinase Network

    1. Robert F. Service

    Score another victory for high-throughput biology. In one fell swoop, researchers at Yale University in New Haven, Connecticut, have vastly extended decades' worth of research into the molecular communications between proteins that govern the lives of yeast cells. The Yale team, led by molecular biologist Michael Snyder, used glass chips arrayed with thousands of yeast proteins to track down the molecular targets of the organism's protein kinases, enzymes that modify the function of other proteins by tagging them with a phosphate group. About 160 interactions between specific yeast kinases and their targets had previously been identified; the chip study added more than 4000, allowing the Yale researchers to map out a complex signaling network within yeast cells. Snyder presented this large-scale survey of yeast protein phosphorylation last week in Arlington, Virginia, at the first annual symposium of the U.S. Human Proteome Organization.

    “This is extremely important for the signal transduction community,” says Charles Boone, a yeast geneticist at the University of Toronto in Canada. Drugmakers, Boone adds, are likely to pore over the new bounty of yeast results to find similar kinase interactions in human cells that they can affect.

    Setting the stage for the new work, Snyder and his colleagues initially developed protein chips displaying the majority of yeast proteins (Science, 14 September 2001, p. 2101). A company called Invitrogen in Carlsbad, California, now makes these chips commercially, and for the current study it provided ones that harbor 4088 of yeast's 6000 or so proteins.

    En masse.

    Biochips reveal thousands of new interactions between kinases and their molecular targets in yeast cells like these.

    CREDIT: SCIMAT/PHOTO RESEARCHERS INC.

    Snyder's team expressed and purified 87 of yeast's 122 protein kinases. (The remainder are difficult to express.) They then washed each kinase over a different chip along with radiolabeled ATP, the molecule that provides the phosphate group that kinases attach to a targeted protein. An autoradiography machine then imaged sites on the chips where a radiolabeled phosphate group had modified a protein.
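
    In outline, turning those autoradiography images into an interaction list is a thresholding exercise. A minimal sketch of the idea follows; the data structures, control values, and threefold cutoff are all invented for illustration, and the Yale group's actual statistical criteria are not described here.

    ```python
    # Hypothetical sketch: convert per-chip spot intensities into a
    # kinase -> substrate map by thresholding against no-kinase controls.
    from collections import defaultdict

    def build_interaction_map(chip_signals, controls, fold_cutoff=3.0):
        """chip_signals: {kinase: {protein: spot intensity}};
        controls: {protein: background intensity with no kinase added}."""
        interactions = defaultdict(set)
        for kinase, spots in chip_signals.items():
            for protein, intensity in spots.items():
                background = controls.get(protein, 1.0)
                if intensity >= fold_cutoff * background:
                    interactions[kinase].add(protein)
        return interactions

    # Toy example
    signals = {"KIN1": {"P1": 50.0, "P2": 2.0}, "KIN2": {"P1": 1.5, "P2": 40.0}}
    controls = {"P1": 1.0, "P2": 1.0}
    print(build_interaction_map(signals, controls))
    # {'KIN1': {'P1'}, 'KIN2': {'P2'}}
    ```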

    The results revealed 4192 interactions between the yeast kinases and some 1300 different protein targets. Among the surprises, Snyder says, were the targets of a well-studied family of three protein kinases. Biochemists had previously concluded that these three were redundant, meaning that if one or two were absent, the third would take their place and allow the yeast cells to survive. That suggested they phosphorylate the same proteins. But Snyder reported that each kinase had a very different profile of targets.

    In addition to teasing out individual kinase-protein interactions, the Yale researchers also integrated their results with other yeast protein data sets, including one for the transcription factors that turn genes on and off. That allowed them to build a complex map of protein encounters that regulate life inside yeast cells. Among the lessons from the map, Snyder reported, was that eight particular patterns of protein interactions show up over and over. For example, so-called adaptor proteins commonly interact with both a kinase and a protein it modifies. These adaptor proteins, Snyder says, likely help control the rate at which other proteins are activated.
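
    The adaptor pattern Snyder describes can be stated as a simple graph query. Here is one way it might be scanned for, again with invented data structures rather than the study's actual pipeline:

    ```python
    # Hypothetical sketch: find "adaptor" proteins that physically bind
    # both a kinase and one of that kinase's phosphorylation targets.
    def find_adaptor_motifs(phosphorylates, binds):
        """phosphorylates: iterable of (kinase, substrate) pairs;
        binds: set of frozenset({a, b}) physical-interaction pairs."""
        proteins = {p for pair in binds for p in pair}
        motifs = []
        for kinase, substrate in phosphorylates:
            for adaptor in proteins - {kinase, substrate}:
                if (frozenset({adaptor, kinase}) in binds
                        and frozenset({adaptor, substrate}) in binds):
                    motifs.append((adaptor, kinase, substrate))
        return motifs

    # Toy example: A binds both the kinase K and K's substrate S
    binds = {frozenset({"A", "K"}), frozenset({"A", "S"})}
    print(find_adaptor_motifs([("K", "S")], binds))  # [('A', 'K', 'S')]
    ```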

    The research by Snyder's team is “very exciting and enabling work,” says Harvard University yeast geneticist Steve Elledge.

    Researchers at pharmaceutical companies are likely to be among the scientists most interested in following up on this work. Protein kinases already constitute one of the most important classes of drug targets; kinase-targeted drugs such as the cancer therapies Gleevec and Herceptin account for billions of dollars a year in sales. The Yale group's yeast research doesn't reveal the critical kinases at work in humans, but it gives drug companies a leg up on identifying equivalent kinase interactions in humans and possibly clues about how to block them.

  10. MAGNETIC IMAGING

    Atom-Based Detector Puts New Twist on Nuclear Magnetic Resonance

    1. Adrian Cho

    Cheap, portable magnetic resonance imaging (MRI) machines could be on the horizon thanks to an exquisitely sensitive magnetic field detector. In their lab at Princeton University in New Jersey, physicists Igor Savukov and Michael Romalis have used an “atomic magnetometer”—essentially, a vial of gas and a pair of lasers—to detect the wobble of atomic nuclei in a magnetic field. That motion, known as nuclear magnetic resonance, provides the signal tracked in MRI scans.

    The approach might open the way for one-shot MRIs for patients instead of tedious scans, the researchers say. But others say such applications are far from a sure thing.

    An atomic magnetometer-based system “is really very simple and cheap in principle,” says Dmitry Budker, a physicist at the University of California, Berkeley, who is also developing the devices. In contrast to competing techniques, such a system requires neither a powerful, pricey magnet nor cryogenic equipment, Budker says. That means an entire system could conceivably cost “a few thousand dollars compared to a few million” for a conventional scanner, he says. But John Wikswo, a physicist at Vanderbilt University in Nashville, Tennessee, says the idea is still a long way from a practical technology. “I would not get carried away with anything that's not proved in the paper,” Wikswo says.

    See them spin.

    A gas-filled cell and laser beams can track the twirl of atomic nuclei and could eventually lead to cheap, portable MRIs.

    CREDIT: TOM KORNACK/PRINCETON UNIVERSITY

    An MRI machine senses the gyration of atomic nuclei, which can wobble, or precess, in a magnetic field much as spinning tops do under the pull of gravity. By prodding hydrogen nuclei with radio waves, a medical MRI scanner twirls the nuclei and maps the abundance of water molecules in living tissue. Wobbling in concert, the nuclei produce their own oscillating magnetic field, which a conventional MRI machine senses with a coil of wire.
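
    The physics behind every detection scheme here is the textbook Larmor relation, which is not specific to the new work: precession frequency is proportional to field strength.

    ```latex
    % Larmor precession; \gamma_p/2\pi \approx 42.58 MHz/T for the proton
    \omega = \gamma B
    \qquad\Longrightarrow\qquad
    f_{\mathrm{proton}} = \frac{\gamma_p}{2\pi}\,B \approx 42.58\ \mathrm{MHz\,T^{-1}} \times B
    ```

    In a 1.5-tesla clinical magnet, protons precess near 64 MHz; in the microtesla fields where atomic magnetometers operate, the same signal falls to tens of hertz.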

    To produce detectable signals, however, a system with a pickup coil requires powerful, expensive magnets. Nuclei precessing in far weaker fields can be tracked with a loop of superconductor known as a SQUID. But SQUIDs must be kept at temperatures near absolute zero, which requires expensive and bulky cryogenic equipment.

    To detect the wobbling nuclei with similar sensitivity, Savukov and Romalis employed a glass chamber filled with helium and potassium vapor. Because of the arrangement of its electrons, each potassium atom acts like a magnet, and the researchers line the atoms up by shining a strong “pump” laser beam on them. The atoms then wobble when exposed to even a tiny magnetic field, and the researchers detect their precession by shining a second “probe” laser through them.

    The magnetometer recorded the oscillating magnetic field produced by protons precessing in water, the researchers report in a paper to be published in Physical Review Letters. It also measured the wobble of nuclei of xenon atoms mixed with the potassium, a technique that might be used to image the lungs by studying traces of exhaled gas. Potentially, the vapor in the cell can encode a three-dimensional image of an object that can be read out like a snapshot, Romalis says.

    But atomic magnetometers present technological challenges of their own, including susceptibility to extraneous magnetic fields. The gas chamber must be heated to 180°C, and shielding it so that it can be placed next to living tissue may not be easy, Wikswo says. Ronald Walsworth, a physicist at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, agrees that several “big engineering challenges” remain, but he credits Romalis “for doing some of that hard engineering work.” Whether those efforts will pay off remains, quite literally, to be seen.

  11. ECOLOGY

    Savannah River Lab Faces Budget Ax

    1. Eli Kintisch

    A 54-year-old University of Georgia ecology lab funded primarily by the Department of Energy (DOE) is fighting for its life.

    Located on a 780-square-kilometer nuclear industrial site in southwest South Carolina that is off-limits to development, the Savannah River Ecology Laboratory (SREL) is well respected for its expertise in subjects including the movement of pollutants in streams and the effect of radiation on reptiles. Two years ago, however, DOE moved the lab from its office for cleanup projects to one that focuses on the science of remediation and asked it to focus on issues such as the migration of radionuclides to deep aquifers. Now President George W. Bush has proposed eliminating the lab's budget in 2006.

    Last fall an outside review said that SREL was making progress toward its new subterranean emphasis, although it urged DOE to make use of the lab's “unique” capabilities. Independent science advisers to DOE have also repeatedly urged the agency to nurture its ground-level ecology science. Yet despite that advice and the lab's efforts at refocusing its work, the president's budget request put a priority on subsurface science and high-level radioactive waste. Research on radioecology and surface science was to be wound down this year, the budget stipulated, and terminated in the fiscal year that begins 1 October. “We had to get out of one area of research, [and we picked] surficial science,” says DOE's Ari Patrinos, whose office oversees the lab.

    Not a snap.

    SREL is shifting its focus from aboveground ecology, including herpetology, to subsurface science.

    CREDIT: SREL PHOTO

    SREL Director Paul Bertsch thinks that decision is shortsighted. He argues that other DOE cleanup sites in Tennessee and Colorado will require an understanding of the movement of radioisotopes on the surface that SREL scientists already possess. Indeed, a 1994 decision by DOE not to drain a lake at the Savannah site and remove contaminated sediment, he says, was based on SREL research that suggested the habitat could survive with the sediment intact. Experts believe the decision has saved billions of dollars in cleanup costs (Science, 12 March 2004, p. 1615). “For an $8 million organization, we've had a huge impact,” says Bertsch.

    Patrinos says that SREL scientists are being encouraged to seek support from other federal agencies. But he concedes that the lab “will most likely have to shut down” at some point if Congress accepts the 2006 budget proposal. The University of Georgia, Athens, which provides about $1 million a year, will be hard pressed to make up the difference. “We have our own budget problems,” says Gordhan Patel, the university's vice president for research.

    Ecologists say that much will be lost if the lab is closed. “SREL has been without a doubt the most productive and significant organization in herpetological ecology for the last 25 years,” says ecologist David Wake of the University of California, Berkeley, who notes that the lab has taken advantage of the size and undisturbed nature of the site. That advantage could disappear unless lab officials can escape the president's budget ax.

  12. PROPOSITION 71

    Proposed Legislation Threatens to Slow California Stem Cell Rush

    1. Jon Cohen

    Although California voters last November approved a proposition that promises to push the state to the forefront of embryonic stem (ES) cell research, legislation introduced in the state senate last week may significantly constrain the way that the new California Institute for Regenerative Medicine (CIRM) conducts business.

    Proposition 71 created CIRM to award up to $3 billion over the next decade to academic and industry researchers working in the state on stem cell projects that are ineligible for federal funds because of restrictions on human embryonic research. One far-reaching new measure, introduced on 17 March, aims to amend the state constitution to redefine CIRM. It would increase scrutiny of potential conflicts of interest, require more open meetings, and guarantee both that products or treatments derived from this research are affordable to low-income residents and that the state receives increased royalty or licensing fees.

    Zach Hall, CIRM's interim president, takes issue with several concerns raised by the legislators. “It really does seem to be a gap between two cultures,” says Hall, a neuroscientist who once headed the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland. One point of contention: whether the working groups that evaluate grants can hold closed-door meetings. “This is the gold standard of peer review, and scientists in public won't speak openly and frankly,” says Hall. He similarly wonders how the state will determine what is “affordable” and cautions that industry will shy away from collaborations that have such limits.

    California state senators Deborah Ortiz (D) and George Runner (R), who introduced the measure, also co-authored a separate bill that calls for a 3-year moratorium on using state funds to pay for hyperovulation of women and retrieval of multiple eggs, which they contend may cause harm. Many researchers hope to create ES cells through somatic cell nuclear transfer (SCNT), which requires human eggs. SCNT uses a hollowed-out egg to “reprogram” cells to their embryonic state. ES cell lines derived from SCNT may enable scientists to study pathogenesis, test drugs, and even treat people directly.

    Nobel laureate Paul Berg of Stanford University, an influential backer of Proposition 71, is surprised that Ortiz, who pioneered legislation encouraging ES cell research, is pushing for these changes. “Ortiz supported the thing all the way through,” he says. R. Alta Charo, a lawyer and bioethicist at the University of Wisconsin, Madison, says she is “dismayed” by the idea of an egg-donation moratorium, which she asserts violates a woman's right to choose and could effectively halt SCNT research.

    Ortiz insists she is merely fine-tuning Proposition 71. She adds that the egg-donation moratorium does not prevent researchers from using private funds to obtain eggs.

    The egg-donation moratorium requires a majority vote in the legislature. The proposed constitutional amendment, however, would need the support of two-thirds of the legislature, which would then place the issue before the voters in November.

  13. EPIDEMIOLOGY

    Mounting Evidence Indicts Fine-Particle Pollution

    1. Jocelyn Kaiser

    Particle air pollution clearly causes substantial illness and death, but what makes fine particles so toxic—their size, their chemistry, or both?

    Talk about heart-stopping news: Spending time in traffic may triple some people's risk of having a heart attack an hour later. That's what German researchers reported last October in the New England Journal of Medicine (NEJM), based on responses from 691 heart attack survivors about their activities in the days before they fell ill. The study seemed to support the notion that tiny air pollution particles from tailpipes, along with stress, could help trigger a heart attack. Yet in another recent study in which volunteers in upstate New York breathed in lungfuls of these so-called ultrafines, particles less than 0.1 micrometer (μm) in diameter, the effects were minimal. If ultrafines were the main culprit, “you would have expected to see something more,” says Daniel Greenbaum, president of the Health Effects Institute (HEI) in Cambridge, Massachusetts.

    The discordant studies illustrate the dilemma posed by fine particle air pollution. The term refers to particles of dust, soot, and smoke consisting of hundreds of chemicals that are defined by their mass and size—2.5 μm in diameter or less, or about one-30th the width of a human hair. They are known collectively as PM2.5. Hundreds of studies have suggested that breathing fine particles spewed by vehicles, factories, and power plants can trigger heart attacks and worsen respiratory disease in vulnerable people, leading to perhaps 60,000 premature deaths a year in the United States. In response, the U.S. Environmental Protection Agency (EPA) in 1997 added new regulations to existing rules for coarser particles (PM10), issuing the first-ever standards for PM2.5. But the move came only after a bitter fight over whether the science supported the rules and a mandate from Congress for EPA to expand its particle research program.

    Now the issue is getting another look as EPA faces a December 2005 deadline for revisiting its PM2.5 standard. EPA scientists, after reviewing piles of new data implicating PM2.5 in health effects, have proposed tightening the 1997 standard to further reduce ambient concentrations of fine particles. Some scientists and industry groups remain skeptical, noting that researchers still haven't pinned down what makes particles dangerous—whether it's mainly size, and that the tiniest particles are most potent; or chemistry, such as metal content; or some combination of the two. Despite 8 years and some $400 million in research, finding out exactly how fine particles do their dirty work has proved frustratingly elusive, researchers say. “We've gotten glimpses, but we don't yet have enough systematic coverage of the problem,” says epidemiologist Jon Samet of Johns Hopkins University in Baltimore, Maryland.

    At risk.

    Studies with elderly volunteers have shown that slight changes in outdoor particle levels can change heart rate variability.

    CREDIT: EPA

    Unmasking a killer

    Although the evidence against fine particles, initially circumstantial, has grown stronger, gaps still remain. It began with epidemiologic studies finding that when levels of particulate matter (PM) edged up in various cities, hospital visits and deaths from heart and lung disease rose slightly, too. Two landmark studies in the early 1990s that tracked more than half a million individuals in cleaner and dirtier cities for many years suggested that PM was shortening the lives of 60,000 people each year. EPA generally regulates air pollutants by chemistry—ozone, sulfates, and mercury, for example—but the 1970 Clean Air Act also regulates total particles. In 1987, EPA switched from controlling total particles to PM10, a standard based on the total mass of particles (liquid or solid) with a diameter of 10 μm or less. The new observations suggested, however, that the PM10 rule wasn't enough: it was not catching fine particles, which aren't readily expelled by the lungs and can penetrate deep into airways.

    But when EPA proposed the PM2.5 standards in 1996 (along with tighter ozone standards), industry groups and some scientists cried foul, arguing there was no direct evidence that these fine particles were killing people. Congress agreed to the regulations only on the condition that EPA would re-review the science before implementing the rule. Lawmakers also mandated that the National Research Council (NRC) oversee a long-term EPA particle research program funding both in-house scientists and extramural researchers.

    Those and other new studies have firmed up the fine particle-death link. Larger studies and new analyses verified the key epidemiological studies, which held up despite a statistics software problem that lowered the short-term risks slightly. Deaths per day are now estimated to tick upward 0.21% for each 10 μg/m3 increase in PM10 exposure, and long-term risks of dying rise 4% for each 10 μg/m3 rise in annual PM2.5. Similar patterns were reported in Europe: After Dublin banned soft coal in 1990 and levels of black smoke and sulfur dioxide (both contributors to PM) dropped, death rates from heart and lung disease declined as well. Another study found that people living near busy, polluted roads in the Netherlands had twice the risk of dying from a heart attack over an 8-year period as people living in cleaner areas. Although the epidemiologic studies cannot completely disentangle PM2.5 effects from those of other pollutants, such as carbon monoxide, most researchers say the link with PM2.5 is robust. “There's an association with particles that doesn't go away,” says Greenbaum.
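
    To make those percentages concrete, here is a small worked example. The 4%-per-10-μg/m3 slope comes from the studies described above, but the log-linear scaling and the city figures are illustrative assumptions, not reported results.

    ```python
    # Illustrative only: scale the quoted long-term PM2.5 mortality slope
    # to an arbitrary change in annual exposure, assuming the conventional
    # log-linear dose-response form used in PM epidemiology.
    def relative_risk(delta_pm25, pct_per_10=4.0):
        """Relative risk of long-term mortality for a change (ug/m3) in annual PM2.5."""
        return (1 + pct_per_10 / 100) ** (delta_pm25 / 10.0)

    # e.g., a hypothetical city lowering annual PM2.5 from 20 to 15 ug/m3:
    print(f"relative risk: {relative_risk(15 - 20):.3f}")
    # ~0.981, i.e., roughly 2% fewer long-term deaths
    ```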

    Meanwhile, the list of health effects linked to fine particles keeps growing. An American Cancer Society study found that chronic exposure to PM2.5 is on par with secondhand smoke as a cause of lung cancer (Science, 15 March 2002, p. 1994). Particles of various sizes have been tentatively linked to low birth weight, preterm birth, and sudden infant death syndrome. A study last year in NEJM found that children who grow up in parts of southern California with higher PM2.5, nitrogen dioxide, and acid vapor pollution levels have less developed lungs. Earlier this year came a report that the newborn babies of New York City mothers exposed to PM2.5 containing higher levels of polycyclic aromatic hydrocarbons (PAHs), a class of carcinogenic chemicals, had more of the kind of chromosomal damage that can lead to cancer than did the babies of mothers with lower PAH exposures. Another report, published in Science, found that fine particles from traffic can cause DNA mutations in male mice that are passed on to their offspring (Science, 14 May 2004, p. 1008).

    Other studies have tightened the link by showing that PM2.5 can cause heart and lung health effects in lab animals with conditions such as heart disease that make them susceptible, as well as subtle effects in human volunteers. Studies in which heart monitors were attached to elderly people, for example, have found that their heart rhythm becomes less variable when outdoor particle levels rise—which makes the heart more vulnerable to cardiac arrhythmia. Researchers are now searching for the mechanisms behind this phenomenon (see sidebar, p. 1858).

    CREDIT: P. HUEY/SCIENCE; ADAPTED FROM BROOK ET AL./ AHA SCIENTIFIC STATEMENT NO. 71-0289; IMAGES: EOL BERKELEY NATIONAL LABORATORY, CDC, JUPITER IMAGES

    Mass confusion

    But researchers are still grappling with what makes fine particles toxic. PM2.5 consists of hundreds of liquid and solid chemicals, including carbon, nitrates, sulfates, metals, and organic compounds, produced by sources ranging from diesel engines to soil blown from farmers' fields. But efforts to sort out which are the most potent components—or whether it's some combination of size and chemistry—have fallen short.

    One reason is that in their animal studies, EPA and academic scientists have often used high doses of particle mixtures such as metal-laden exhaust from oil-burning power plants. These are convenient, but they differ from what people are exposed to. Researchers have also typically inserted the particles directly into the animals' tracheas, which isn't the same as inhaling them. And academic researchers who got grants from EPA have used different protocols or animal models, which makes it difficult to compare experiments to each other and to EPA's. “Lots of the research was relevant, but it wasn't systematic because of the nature of how we do research,” says Samet, who chaired a final NRC review that last year pointed out this problem.

    So far, the evidence on which components are the most dangerous remains confusing. Researchers have, at least, decided that crustal dust, particles on the large end of PM2.5, seems fairly harmless. Particles of various sizes containing metals such as zinc and copper, on the other hand, can cause lung inflammation and heart damage in lab animals. The metals theory got a boost in 2001 from an unusual study. Researchers dug up air filters from the Utah Valley that had been stored since the mid-1980s, when epidemiologists observed a drop in hospital admissions for respiratory problems that coincided with a 1-year closure of a steel mill. The filters from when the mill was open were richer in metals, and these extracts caused more health effects in lab animals and human volunteers—suggesting that the metals explained the jump in hospital admissions. Still, in general, the amount of metals needed to see toxic effects in lab animals is much higher than the levels in the air people breathe, says EPA toxicologist Daniel Costa.

    Other suspects seem relatively harmless when examined in isolation. Sulfates cause only minimal health effects in animals, and these acids don't seem linked to health effects in short-term epidemiologic studies. The power plant industry—which produces most of the sulfates—has cited these studies as evidence that they're not the problem. Yet sulfates are clearly associated with health effects in some studies following people over many years.

    Others suspect that it's size that determines toxicity, and that ultrafine particles smaller than 0.1 μm in diameter are the culprits. Toxicologists have found that if coarser particles are ground up into ultrafines, they are much more toxic, most likely because the smaller particles have a greater surface area to react with tissues. And ultrafine particles can get into lung tissue and possibly the blood and even the brain. A few epidemiologic studies, such as the one last fall in NEJM on heart attack survivors from epidemiologist Annette Peters's group at the National Research Center for Environment and Health in Neuherberg, Germany, have pointed toward ultrafines, whatever their chemical composition, as the most toxic PM2.5 component. Peters's study didn't find an association with ambient air pollution, only with time spent in cars, buses, trams, or on bicycles or motorcycles; traffic pollution contains more ultrafines than air in general. Yet when Mark Utell and Mark Frampton's team at the University of Rochester in New York had 28 resting or exercising volunteers breathe small amounts of carbon ultrafines, they saw only very slight changes in measures such as heart rhythm and white blood cells—even in asthmatics, whose damaged lungs contained up to six times as many particles as healthy people.
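
    The surface-area argument is simple geometry: dividing a fixed mass of material into smaller spheres multiplies the total surface available to react with tissue.

    ```latex
    % N spheres of radius r and density \rho holding a fixed total mass M:
    N = \frac{M}{\tfrac{4}{3}\pi r^{3}\rho},
    \qquad
    A_{\mathrm{total}} = 4\pi r^{2} N = \frac{3M}{\rho\, r} \;\propto\; \frac{1}{r}
    ```

    Grinding 2.5-μm particles into 0.1-μm ultrafines thus boosts the reactive surface roughly 25-fold for the same mass.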

    Danger zones.

    Risks of premature death from PM2.5 pollution are highest on the West Coast and in the Midwest.

    CREDIT: DONALD MCCUBBIN/ABT ASSOCIATES USING 2002 PM2.5 MONITORING DATA ADJUSTED FOR BACKGROUND LEVELS; RISK CALCULATION BASED ON POPE ET AL., JAMA 287, 1132 (2002)

    The explanation may be that it's not size or chemistry alone. The ultrafines used in the Rochester study were pure carbon black, but ultrafines in the real world are likely coated with metals and organic compounds, Frampton says. (Also, the researchers may need to test people with cardiovascular disease.) Likewise, sulfates may form the core of a particle that also contains nastier compounds such as metals, or they could change the chemistry of metals so they're more soluble in blood. Larger particles may irritate and inflame airways, exacerbating the toxicity of PM constituents such as organics and metals, says Costa. And particles may have different effects in the short term and after years of exposure. “It's far more complex than trying to decide which chemicals are toxic,” says toxicologist Joseph Mauderly of Lovelace Respiratory Research Institute in Albuquerque, New Mexico.

    Newer experiments are seeking to use more realistic mixtures. That became possible only a few years ago, when researchers invented devices that can collect ambient air from outside a lab and concentrate the particles for use in experiments. Others are looking at pollutants from a range of sources. For example, Mauderly's group at Lovelace is conducting animal studies comparing particles from diesel engines, gas engines, wood smoke, cooking, road dust, and coal to pin down which type is most toxic. HEI, meanwhile, is sponsoring epidemiologic and toxicology studies that will take advantage of a new monitoring network at 54 sites, which measures the chemical makeup of particles (such as sulfates, elemental carbon, and trace elements) in finer detail than earlier networks did. And EPA recently launched a $30 million, 10-year study led by University of Washington researchers that tracks correlations between these finer air pollution measurements and the health of 8700 people over age 50.

    Down the road, this new information should help guide regulations. For instance, if carbon particles from wood burning or from diesel engines turned out to be the main problem, EPA could target those sources specifically. Controlling only mass, as EPA does now, might actually be counterproductive. For example, if larger PM2.5 particle levels go down but levels of ultrafines do not, “that could make things even worse,” Frampton says. That's because ultrafines tend to glom onto larger PM2.5 particles, so they don't stay in the air as long when the larger particles are around.

    Time to act

    Those results won't be available for years, however, and EPA is under a court order to decide whether to tighten the current PM2.5 standard by the end of 2005. EPA scientists in January recommended that the agency consider tightening the annual average standard from the current 15 μg/m3 to between 12 and 14 μg/m3, and the daily average from 65 μg/m3 to as low as 25 μg/m3. They also suggested replacing the PM10 standard with a new one covering particles between 2.5 and 10 μm in diameter, to better target that coarser fraction. In April, EPA's clean air advisory board will weigh in.

    PM2.5 levels have already dropped at least 10% since 1999 due to acid rain regulations and new diesel engine standards (see sidebar, p. 1860). They will fall further thanks to additional cuts in sulfates and nitrates from coal-burning power plants through new regulations issued this month and possibly the Administration's proposed Clear Skies program. But a tighter standard could trigger additional controls in areas with the highest particle levels, such as Los Angeles and the Northeast. Environmental and health groups as well as many scientists say that, as with tobacco smoke and lung cancer, policymakers can't wait for all the scientific answers before taking action to prevent deaths from dirty air.

  14. EPIDEMIOLOGY

    How Dirty Air Hurts the Heart

    1. Jocelyn Kaiser

    A decade ago, most cardiologists never suspected that breathing tiny particles of soot and dust could damage their patients' hearts, let alone trigger a heart attack. Today “there's no doubt that air pollution plays a role in cardiovascular disease,” says cardiovascular researcher Robert Brook of the University of Michigan, Ann Arbor.

    Fine particles seem to affect the heart in two ways: by changing the heart's rhythm and by causing systemic inflammation. Many studies—from animal experiments to tests in which retirement home residents wore heart monitors—have shown that breathing particle pollution can slightly quicken the pulse and make the heartbeat less variable. The mechanism isn't yet known, but one possibility is that airway receptors stimulate nerves in the heart. A less variable heart rate, in turn, makes the heart more prone to arrhythmia (irregular heartbeat), which can presage cardiac arrest.

    People don't usually die from arrhythmias unless they are very ill already, Brook notes. But particles also penetrate the lung's alveoli and cause inflammation and oxidative stress. The lung cells then pump proteins called cytokines into the bloodstream. This apparently sparks other immune responses that promote blood clot formation and the constriction of blood vessels. These effects, in turn, may cause deposits of lipids known as plaques to rupture and block blood flow to the heart. “If these things all come together, somebody who's vulnerable might be pushed over the edge” and have a heart attack, says epidemiologist Annette Peters of the National Research Center for Environment and Health in Neuherberg, Germany.

    Hardhearted.

    The aortas of mice prone to atherosclerosis developed more lipid plaques (red) when they breathed concentrated particles for 5 months than did the same strain of mouse breathing clean air.

    CREDIT: L. C. CHEN AND C. NADZIEJKO/NYU SCHOOL OF MEDICINE

    Over the long term, inflammation from breathing particles may also contribute to atherosclerosis, or hardening of the arteries, in the same way that secondhand tobacco smoke is thought to inflict damage. For instance, a report in the 15 April 2005 issue of Inhalation Toxicology found that mice engineered to be prone to atherosclerosis developed lipid plaques covering 57% more area in the aorta when they breathed concentrated ambient particles instead of filtered air for up to 5 months. “This is the first animal study mimicking” long-term exposures of people, says lead author Lung-Chi Chen of New York University.

    Although particle pollution is a minor risk factor for heart disease compared to, say, high cholesterol, the impact is large because so many people are exposed. A recent examination of cause-of-death data from a long-term study tying particle pollution to mortality revealed that few extra deaths are from pulmonary disease; the majority are from cardiovascular disease. Citing the body of evidence, an American Heart Association scientific panel last June labeled fine particles a “serious public health problem” and urged the Environmental Protection Agency to consider “even more stringent standards.”

  15. EPIDEMIOLOGY

    Regulations Spark Technology Competition

    1. Marie Granmar*
    * Marie Granmar is an innovation journalism fellow.

    The clampdown on particle air pollution in the United States (see main text) and similar regulations expected in Europe are forcing diesel vehicle manufacturers and industries to update technologies and look for new ones. “In the next 5 years, the diesel industry will clean itself up as much as the car industry has done in 30 years,” predicts Richard Kassel, director of clean vehicle projects at the Natural Resources Defense Council (NRDC) in New York.

    In the United States, efforts are mainly focused on trucks, buses, and larger diesel engines, which produce a major fraction of fine-particle emissions known as PM2.5. Several Environmental Protection Agency (EPA) diesel regulations issued since 2000 will steeply reduce emissions by 2015. Besides requiring low-sulfur fuels, which reduce sulfates (a PM2.5 component), the rules mandate that the heavy diesel fleet (including buses) be retrofitted with particle filters; EPA estimates costs at $400 to $1000 per vehicle. EPA expects that the majority of new diesel vehicles in 2007 will have particle filters. The devices generally work with a combination of a metal catalyst and a very hot multichannel trap in which soot particles burn off.

    During the past 5 years, the U.S. diesel industry has put almost $5 billion into the development of better technologies. For example, researchers are working to find materials that are more resistant to the high temperatures needed to burn off the particles so the filters will last longer.

    Culprit.

    The heavy diesel fleet in the U.S. is a major source of fine particles.

    CREDIT: JUPITER IMAGES

    Car manufacturers are further ahead in Europe, where diesel cars are more common. The French company Peugeot launched its first diesel car with a catalyst particle filter 5 years ago; over 1 million Peugeots are now equipped with these filters, which burn off 99.9% of the particles. Mercedes-Benz also offers an optional filter in new diesel cars for 580 euros ($800). Some U.S. car manufacturers, such as Ford, are about to follow suit in anticipation of future regulatory requirements.

    Industries that produce PM2.5, such as coal-burning power plants, have the option of using off-the-shelf technologies—usually a combination of electrostatic filters and bag filters to catch the finest particles. These filters are quite expensive—in the range of $1 million to $2 million for a small power plant using low-sulfur fuel. Some U.S. plants are also adopting a newer device called an agglomerator, developed in Australia, that reduces emissions of both PM2.5 particles and mercury, enabling them to satisfy two regulations. The agglomerator uses a so-called bipolar charger to separate the dust and give half of it a positive charge and the other half a negative charge. It then switches the charges and mixes the particles, which causes even the smallest particles to form agglomerates that are then easily captured by an electrostatic filter.

    Utilities and other industries will need to install such technologies to comply with a March 2005 regulation to control nitrogen oxides, sulfur dioxide, and particles by 2010. As these and other new regulations controlling PM2.5 emissions kick in, EPA predicts that PM2.5 levels will fall 10% to 20% over the next decade.

  16. U.S. EDUCATION RESEARCH

    Can Randomized Trials Answer the Question of What Works?

    1. Yudhijit Bhattacharjee

    A $120 million federal initiative to improve secondary math education hopes to draw on an approach some researchers say may not be ready for the classroom

    When Susan Sclafani and her colleagues in Houston, Texas, received a $1.35 million grant from the National Science Foundation (NSF) to work with secondary math and science teachers, nobody asked them to demonstrate whether the training improved student performance. “All we had to do was produce qualitative annual reports documenting what we had done,” she says. Sclafani thought that wasn't nearly enough and that NSF should be more concerned about whether the project helped students learn. Now, a decade later, she's in a position to do a lot more. And that's exactly what worries many education researchers.

    As assistant secretary for vocational and adult education at the Department of Education (ED), Sclafani is championing a $120 million initiative in secondary school mathematics that is built in part on money shifted from the same NSF directorate that funded the Houston grant. The initiative, included in President George W. Bush's 2006 budget request for ED now pending in Congress, will give preference to studies that test the effectiveness of educational interventions in the same way that medical researchers prove the efficacy of a drug. Randomized controlled trials (RCTs) of new approaches to teaching math, Sclafani says, will help school officials know what works, and they can then scale up the most promising new curricula and teaching methods. “Randomized studies are the only way to establish a causal link between educational practice and student performance,” she says.

    But some researchers say that such trials won't tell educators what they need to know. And they believe their discipline is too young to warrant a large investment in experimental studies. “Rushing to do RCTs is wrongheaded and bad science,” says Alan Schoenfeld, a University of California, Berkeley, professor of math education and adviser to both NSF and ED. “There's a whole body of research that must be done before that.”

    Prove it.

    The Department of Education's Susan Sclafani wants to see more experimental evaluations in math and science education.

    CREDIT: U.S. DEPARTMENT OF EDUCATION

    The proposed math initiative at ED would be a competitive grants program to prepare students to take Algebra I, a gateway course for the study of higher mathematics and the sciences. Applicants will be encouraged to use RCTs and quasi-experimental designs to measure whether the reform works, Sclafani says. The initiative comes at the same time the Administration has requested a $107 million cut in NSF's $840 million Education and Human Resources (EHR) directorate. The cuts include a phasing out of NSF's portion of the $240 million Math/Science Partnership program—a joint effort with ED to improve K-12 math and science education by teaming universities with local school districts—and a 43% decrease for the foundation's division that assesses the impact of education reform efforts (Science, 11 February, p. 832). Sclafani says this “reallocation of education dollars” reflects the Administration's eagerness for clear answers on how to improve math and science learning across the country. That's OK with NSF Director Arden Bement, who says ED is in a better position than NSF to implement reforms nationwide.

    Although NSF watchers are unhappy with the proposed cuts to the foundation's education budget, a bigger concern for some education researchers is that ED may be overselling RCTs. It's unrealistic to think that RCTs and quasi-experimental studies will magically produce answers about what works, they say. Before comparing the performance of students in the experimental and control groups (one receives the intervention, the other doesn't), researchers must study the factors affecting any change in curriculum or teaching methods, such as group vs. individualized instruction, or working with students whose native language is not English. Answering such contextual questions, the critics say, is similar to finding out whether a medicine needs to be taken before or after meals.

    “You can design an RCT only after you've done all this work up front and learned what variables really count,” Schoenfeld says. ED's approach, he argues, is likely to drive researchers to skip those necessary steps and plan randomized studies without knowing why an intervention seems to work.
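    To make the design under debate concrete, here is a minimal sketch of a school-level randomized trial in Python; the school names, score distribution, and 3-point effect are all invented for illustration and are not taken from the ED or NSF studies described here.

      import random
      import statistics

      # Illustrative sketch of a two-arm randomized controlled trial at the
      # school level; every name and number below is hypothetical.
      random.seed(42)
      schools = [f"school_{i:02d}" for i in range(20)]
      random.shuffle(schools)            # random assignment is the crucial step

      treatment = schools[:10]           # these schools adopt the new curriculum
      control = schools[10:]             # these keep the existing one

      # Hypothetical end-of-year algebra-readiness scores, with a made-up
      # 3-point boost for treated schools standing in for a real effect.
      score = {s: random.gauss(70, 8) + (3.0 if s in treatment else 0.0)
               for s in schools}

      effect = (statistics.mean(score[s] for s in treatment)
                - statistics.mean(score[s] for s in control))
      print(f"Estimated effect: {effect:+.1f} points")

    The critics' contextual questions, in this framing, concern everything the sketch leaves out: which variables to measure, whether the intervention is delivered consistently across schools, and for which students it works.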

    Department officials insist that the time is ripe and have begun funding a handful of projects drawn from 15 years of work in curriculum development and teacher training, including efforts funded by NSF. One is a study of Cognitive Tutor, a computer-based algebra course for middle school students. Another looks at a new approach to training 6th grade science teachers in Philadelphia. Diana Cordova of the department's Institute of Education Sciences predicts that within 3 years, “they will tell us with reasonable certainty if an intervention can improve student learning.”

    Luck of the draw.

    Elk Ridge School is one of three Philadelphia, Pennsylvania, schools implementing a grade 6 reform curriculum as part of a randomized controlled trial.

    CREDIT: COURTESY OF ELK RIDGE SCHOOL

    Some of the researchers conducting these studies aren't so sure, however. One hurdle is convincing a large enough sample of schools to agree to randomization. “Everybody wants to have the treatment, nobody wants to have the placebo,” says Kenneth Koedinger, a psychologist at Carnegie Mellon University in Pittsburgh, Pennsylvania, who's leading the Cognitive Tutor study. Another problem is inconsistent implementation across the experimental group. Allen Ruby, a researcher at Johns Hopkins University in Baltimore, Maryland, who is conducting the Philadelphia study, says that problems at two of the three schools involved could end up masking evidence of whether the training is working.

    Schoenfeld predicts that these and other problems will confound any analysis. “The likely findings from this study would be something like this: Sometimes it works; sometimes it doesn't; and on average, the net impact is pretty slight compared to a control group,” he says. “What do you learn from such findings? Nothing.” On the other hand, Schoenfeld says, a detailed analysis of how the implementation was done at each school and how teachers and students reacted to it could tell educators the conditions under which it would be most likely to work.

    The still-emerging field of evaluation research needs investments in both qualitative and experimental studies, says Jere Confrey, a professor of mathematics education at Washington University in St. Louis, Missouri, who last year chaired a National Research Council report on the need to strengthen evaluations (Science, 11 June 2004, p. 1583). “You need content analysis to determine if a curriculum is comprehensive. You need a case study, because a randomized trial makes sense only if you know exactly what a program is and are certain that it can be implemented over the duration of the experiment,” she says. Analyses are lacking on hundreds of interventions now in use, she adds.

    Sclafani says she doesn't disagree with the value of contextual studies. But she says that taxpayers deserve more from their considerable investment in school reform. “NSF has supported exploratory work for a long time. There was an opportunity to collect evidence about their effectiveness, but that opportunity has been lost [because NSF didn't insist on experimental evaluations].”

    Judith Ramaley, who recently stepped down as head of NSF's EHR directorate, says she's glad that ED wants to build on NSF's work in fostering innovations in math and science education by testing their performance in the classroom. “The medical model makes sense for them,” says Barbara Olds, who directs NSF's evaluation division within EHR. “We think there are many fundamental questions in education that have not been answered.”

    ED officials are working with states to spread the gospel of experimental evaluations. Under the department's $178 million Math/Science Partnership program—the money from which has flowed directly to the states for the past 2 years—state governments have funded more than three dozen projects with a randomized or quasi-experimental study component. (None has yet yielded results.) And the department plans to do the same thing with the new math initiative.

    “Teachers are telling us: ‘We know what works in reading; tell us what works in math and science,’” says Sclafani. “We hope to be able to tell them that, if you do a, b, and c, you'll be sure to see results.”

  17. ASTRONOMY

    American Astronomers Lobby for the Next Big Thing

    1. Robert Irion

    With a successor to the Hubble Space Telescope seemingly assured, U.S. researchers state their case for a complementary mammoth telescope on the ground

    Astronomers like to view the heavens through as many eyes as possible: some on Earth, some in orbit, some tuned to every reach of the electromagnetic spectrum. Soaring costs for next-generation telescopes, however, are forcing researchers in the United States to make hard choices—or vigorous arguments. The top item on their wish list—the $1.6 billion James Webb Space Telescope (JWST), successor to the Hubble Space Telescope—enjoys steady support from NASA and seems on track for launch in late 2011. Less certain are the prospects for priority number two: a multimirrored behemoth on the ground spanning 20 to 30 meters, aimed at surpassing Hawaii's Keck Telescopes. Astronomers want it up and running by 2015 to make simultaneous observations they consider crucial during JWST's projected 10-year lifetime.

    Fearful that funding shortfalls might narrow that window, U.S. astronomers have sharpened their case for the giant ground-based telescope. On 15 March, the Astronomy and Astrophysics Advisory Committee (AAAC)—established in 2002 at the request of the National Academy of Sciences—issued a report* to Congress urging the National Science Foundation (NSF) to boost its early funding of technology research for the telescope, which to date has relied almost entirely on private grants. What's more, the next-generation ground and space science working groups—usually isolated—are finishing a novel white paper titled, in part, The Power of Two. The document claims that the field's central questions are within reach of JWST and an earthbound partner.

    Both reports raise the specter of foreign competition. “We felt that if we did not have a large enough telescope in place during the JWST era and our European colleagues did, then our competitiveness at the frontier would be diminished significantly,” says Power of Two co-author Stephen Strom of the National Optical Astronomy Observatory (NOAO) in Tucson, Arizona.

    To set the scene, the reports cite major advances from the pairing of Hubble and the 10-meter Keck Telescopes, such as the discovery of “dark energy” by observing distant supernovas from space and their dim host galaxies from the ground. A partnership in the next decade promises more deep insights, says Strom, thanks to facilities that will be an order of magnitude more powerful.

    Competing visions.

    The proposed Giant Magellan Telescope (left) would use seven large mirrors, whereas the Thirty Meter Telescope (right) could have up to 1000 segments.

    CREDITS (RIGHT TO LEFT): GMT/CARNEGIE OBSERVATORIES; AURA (INSET)

    With its 6.5-meter mirror and a cold orbit 1.5 million kilometers from Earth, JWST will peer to the dawn of galaxies and into the hearts of star-forming clouds and planetary nurseries in infrared light. A ground telescope will examine the same objects in near-infrared and optical light, but five times more crisply than JWST, because image sharpness scales with the diameter of the mirror. The vast mirror will also collect far more light, and spectrographic analysis of that light should yield exquisite details about motions and compositions of gas that will elude JWST, says astronomer Alan Dressler of the Carnegie Observatories in Pasadena, California. “JWST's sensitivity will be fantastic, and its images will be untouchable,” he says. But the ground spectra will show how the first galaxies actually assembled their gas. The spectra also may shed light on worlds like our own in the innermost zones of planetary systems around nearby stars, which JWST won't see, Dressler says.
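    That factor of five can be checked against the diffraction limit; the numbers below are a rough, independent calculation, not figures quoted by the astronomers. A telescope of diameter $D$ observing at wavelength $\lambda$ resolves angles down to about

    $$\theta \;\approx\; 1.22\,\frac{\lambda}{D},$$

    so at $\lambda = 1\,\mu\mathrm{m}$ a 30-meter mirror reaches $\theta \approx 0.0084$ arc second versus roughly $0.039$ arc second for JWST's 6.5 meters, an improvement of $30/6.5 \approx 4.6$, assuming adaptive optics fully corrects the atmosphere's blurring.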

    The space telescope's designers won't duplicate those strengths. “We're very strict about that,” says JWST senior project scientist John Mather of NASA's Goddard Space Flight Center in Greenbelt, Maryland. “If anybody's ever going to do it on the ground, we don't do it.”

    How the terrestrial telescope will work its magic is hotly debated. Two teams have devised distinct ideas for a Giant Segmented Mirror Telescope (GSMT), and each is convinced its approach is superior. One consortium of more than a half-dozen institutions, led by the Carnegie Observatories, proposes seven 8.4-meter mirrors in a tight floral pattern. The other group—the California Institute of Technology, the University of California, NOAO, and Canada—wants to scale up the 36-mirror honeycomb of the twin Keck Telescopes to a huge surface made of 800 to 1000 segments. Both designs require rapidly flexible optics and a daunting array of laser beams pointed at the sky to detect and erase the blurring of Earth's atmosphere.

    The technology challenges are steep, with price tags to match: $500 million and $800 million for the respective designs. Both teams expect more than half of the money to come from universities and private U.S. donors, but even that may not be enough. “For the next-generation facilities … it would be very advantageous to have international partnerships,” says astronomer Rolf-Peter Kudritzki of the University of Hawaii, Manoa, chair of the GSMT science working group.

    Ideally, those ties should include European astronomers, the AAAC report urges. Current plans in Europe call for a 60-meter to 100-meter “Overwhelmingly Large Telescope,” which many researchers view as overly ambitious. If U.S. agencies keep channels open with European colleagues, the world community might sustain two 30-meter-class telescopes—one in the north and one in the south—the report states.

    Meanwhile, AAAC urges NSF to up the ante for the technology development needed to explore both U.S.-led designs. A $35 million proposal is pending at NSF, but this year's bleak budget outlook means the two teams probably will get some small fraction of that, according to an astronomer familiar with the deliberations. Even so, NSF is likely to support as much as 50% of one project when resources permit, says Wayne Van Citters, director of the agency's division of astronomical sciences.

    But further slippage of up-front funding might push the completion date for a huge ground telescope uncomfortably deep into JWST's limited life span. It's time to make progress toward the best design, says NASA's Mather: “We just want somebody to build one.”

  18. INFECTIOUS DISEASES

    True Numbers Remain Elusive in Bird Flu Outbreak

    1. Martin Enserink,
    2. Dennis Normile

    Confounding reports about human cases of bird flu have fueled concerns about a pandemic. But the true spread of H5N1 remains unknown

    When 5-year-old Hoang Trong Duong from Vietnam's Quang Binh province was diagnosed with avian influenza last week—about 10 days after his 13-year-old sister died, presumably from the same disease—he became the 70th victim since the H5N1 bird flu strain started its march across Asia.

    That's the official number. But most flu experts believe that H5N1 has infected many more people, and some are increasingly worried that, 17 months into the current outbreak, there still hasn't been a concerted effort to establish the true extent of the spread of the virus—which could spawn a pandemic—among humans. “I'm very worried that information has been slow to emerge,” says flu scientist Nancy Cox of the U.S. Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia.

    There are many reasons for the information gap—from quirky test results to a lack of lab capacity in the affected countries to political sensitivities. A broad testing program, combined with carefully collected epidemiological information, is “absolutely crucial” to answer some basic questions, says virologist Albert Osterhaus of the University of Rotterdam, the Netherlands. Among them: How many people have become infected? What are the ways in which the virus spreads? And is the virus getting better at human-to-human transmission—the first step on the road to a pandemic?

    There's good reason to assume that there are more cases than those found so far, researchers say. For the most part, only seriously ill patients get tested for H5N1, says Cox; milder cases are likely to slip through the cracks. (That may also be why the official death rate stands at a staggering 67%.) Moreover, during H5N1's first outbreak, in 1997 in Hong Kong, several infected children had mild or no illness. Two weeks ago, Vietnam also reported two asymptomatic infections.

    Food for thought.

    Contact with ducks—which can carry H5N1 without symptoms—is a risk factor for human infection.

    CREDIT: ASSOCIATED PRESS

    As an example of what needs to be done, Osterhaus points to a study done after the 1997 outbreak by a team from the Hong Kong Department of Health and CDC, which included Cox. They screened 51 household contacts of H5N1 patients, as well as 26 people who went on a 4-day plane and bus trip with one of the patients and 47 co-workers of a bank employee who became ill; they also collected detailed information about each subject's contacts with the patients. (The results were somewhat reassuring: Only one member of the tour group and six household contacts had antibodies to H5N1.)

    Cox says several recent developments reinforce the need for broader testing. A paper published online by the New England Journal of Medicine on 17 February reports that H5N1 can cause diarrhea and brain inflammation, suggesting that cases may be missed because doctors simply don't think of bird flu. The increasing number of reported family clusters of H5N1 in Vietnam might also signal human-to-human transmission, she says.

    But broad efforts have been stymied. One problem is the technical difficulty of testing for antibodies. Flu researchers traditionally rely on hemagglutinin inhibition tests to detect antibodies in serum samples. But those have proven “simply not sensitive enough” in the case of avian influenza, says Klaus Stöhr, who coordinates WHO's global influenza program. The alternative—a so-called microneutralization assay—is more reliable but also more time-consuming and expensive. And because it uses live H5N1 virus that infects cells, it should only be carried out in a biosafety level 3 laboratory, of which the region has very few.

    Amassing samples and epidemiological questionnaires is labor-intensive. “The Vietnamese are too busy just trying to keep track of the cases of serious illness,” says Peter Cordingly, spokesperson for WHO in Manila. Nonetheless, the Hospital for Tropical Diseases in Ho Chi Minh City now has several hundred samples—from cullers, contacts of patients, health care workers, and controls—that may be shipped to the WHO collaborating center in Hong Kong, says Peter Horby, a WHO official in Hanoi.

    In Thailand, meanwhile, two groups have tested hundreds of samples collected during the outbreak last year. Unfortunately, some of the corresponding epidemiological data weren't properly recorded. The results are still being reviewed, but Scott Dowell of the International Emerging Infections Program in Bangkok, a collaboration of the Thai Ministry of Health and CDC, says that so far, the data don't suggest large numbers of subclinical cases—a result virologist Malik Peiris of the University of Hong Kong characterizes as “surprising.”

    The lack of information has been frustrating for labs with the expertise but no samples to test. CDC, for instance, has trained Vietnamese researchers and has tested some samples from Thai health care workers, but would like to do more, says Cox. Getting renowned international labs involved should also help avoid mistrust and confusion about the results, says Osterhaus. But such collaborations require diplomacy, says Stöhr: “You can't just march into Vietnam with an army of researchers.”

    Meanwhile, fresh bird outbreaks of H5N1 were reported in Indonesia last week, and South Korean newspapers reported that North Korea culled thousands of chickens last month to curb an outbreak near Pyongyang. As a member of the World Organization for Animal Health, North Korea must report outbreaks of avian influenza, which so far it hasn't done. Stöhr notes that an outbreak in the secretive nation would be hard to investigate or bring under control.

  19. The Dynamic Gut

    1. Elizabeth Pennisi

    As snakes, frogs, birds, and other wild creatures attest, digestive systems need flexibility to meet energy demands as well as the challenges of environment, diet, and predators

    You haven't had a meal for so long that your stomach and intestines have atrophied. And when you finally do get something to eat, it's almost as big as you are. Yet you cram it into your mouth and hope that your gut can somehow cope with it. That's what life's like for many snakes: feast or famine.

    Other animals ride similar nutritional roller coasters. Birds travel thousands of kilometers without eating, then gorge themselves when they stop to refuel—often on a completely different type of food than they are accustomed to. Not a morsel of food or liquid passes the lips of a hibernating animal for months, yet its digestive system kicks in as soon as it greets the world again.

    How does the gut accommodate such extremes without shriveling up and dying or being completely overwhelmed by a sudden flood of nutrients? Those are questions that have given biologists a lot of food for thought.

    “Most of us learned in our textbooks about a rather static digestive system,” says William Karasov, a physiological ecologist at the University of Wisconsin, Madison. But “the gut is a very dynamic organ.” It can “adjust to changing demands and energy supply,” adds Matthias Starck, a functional morphologist at the University of Munich, Germany. And those adjustments can be radical in the extreme.

    The guts of a python's survival

    Among those intrigued by the gustatory habits of snakes are Jared Diamond, a physiologist at the University of California, Los Angeles, and Stephen Secor, a physiologist now at the University of Alabama, Tuscaloosa. In 1995, they reported that during the months between meals, the python's stomach and intestine atrophy. Yet once the snake begins to eat, it quickly revs up its digestive function. Secor and Diamond's work suggested that the first food to reach the gut—particularly proteins or their amino acids—stimulates a dramatic expansion of the gut lining. The intestine doubles in size, significantly increasing its absorptive surface area.

    Over the past decade, Secor, Starck, and, more recently, Jean Hervé Lignot of the Louis Pasteur University/CEPE-CNRS in Strasbourg, France, have worked independently and collaboratively to sort out how this gastrointestinal rebirth occurs with every meal. The gut lining consists of “fingers” of cells called villi, and the cells themselves sport projections called microvilli. Secor's initial research indicated that new cells are added to the gut lining when the intestine shifts into high gear. That shift, especially the rapid reactivation of the stomach and the production of stomach acids, is quite energy-intensive, taxing the body's reserves, he says.

    Over the past 5 years, Starck has developed an alternative explanation, one he will detail in an upcoming issue of the Journal of Experimental Biology. He finds that the gut lining grows not because the number of cells increases but because blood pours into shrunken villi once a snake eats, expanding the surface area of the villi and flooding them with materials needed to digest the incoming meal. According to Starck, this proposed mechanism requires less energy than Secor's original explanation; digestion can begin even when fat reserves are relatively low, he notes. And once digestion starts, the ingested food provides the energy needed to complete it.

    Fast response.

    In snakes that eat irregularly, the intestines (lower left) and their fingerlike projections (black and white, left) shrink between meals but quickly expand (right) after eating.

    CREDITS (TOP AND MIDDLE RIGHT): MATTHIAS STARCK; (MIDDLE LEFT AND BOTTOM LEFT AND RIGHT) STEPHEN SECOR

    Starck's new proposal draws on ultrasound measurements of blood flow to and within the gut lining, as well as histological studies that characterize physical changes in the lining. In one series of experiments, Starck and his colleagues fed mice to snakes. The snakes' metabolic rates rose, their small intestines grew, and blood flow to the intestines tripled. This blood flow was responsible for half the intestine's increase in size, according to Starck. Microscopic globs of fat absorbed from the gut further bloat the cells lining the intestine, accounting for the rest of the increase, he says.

    Secor too has now found that cells swell rather than divide, but he thinks blood has little to do with their expansion. Instead, his work indicates that the lining's villi absorb the digestive fluids from the gut itself. Working with Secor, Lignot has used electron microscopy to show that each cell's microvilli also swell significantly, quadrupling in size within 24 hours.

    Even though Secor maintains that increased blood flow is not the secret to the rapid growth of a snake gut, he has shown that it is important to digestion. Animals typically divert blood to the gut during digestion, just as exercise results in increased circulation to muscles. Last year, Secor and his colleagues reported that snakes go to extremes, increasing blood flow to the gut by about 10-fold compared to the 50% increase humans experience during digestion. “[It's] really going through the roof,” Secor notes. “The cardiac output is comparable to somebody going full-out in exercise.” And the python can sustain this additional blood flow for days—plenty of time for its intestinal muscles to work without fatigue at breaking down the skin, bones, and underlying tissue of its prey.

    It takes a lot of heart to pump all that blood. Indeed, the snake's heart actually grows once it eats, increasing 40% in mass in just 2 days, James Hicks, a comparative physiologist at the University of California, Irvine, and his colleagues reported in the 3 March issue of Nature. Athletes' hearts can also expand—but that growth happens over years, says Secor.

    Ready reserves

    Even as they digest their food, snakes stockpile resources for their next great feast, says Lignot. In pythons, the stomach operates for 4 to 6 days, then begins to wind down, while the intestines keep going for at least another week to finish the job. Lignot's latest experiments show that as the intestine takes these extra days to process nutrients, it generates and banks cells for a subsequent supper. “This is the key adaptive factor: a dormant gut with unused cells that can quickly restart functioning when food is available again,” says Lignot.

    On duty.

    While hibernating (right), 13-lined ground squirrels maintain minimal function in their guts but efficient digestive capacity.

    CREDITS: HANNAH CAREY/UNIVERSITY OF WISCONSIN SCHOOL OF VETERINARY MEDICINE

    A similar phenomenon occurs in fasting rats. Even as they are reduced to breaking down their body's proteins—a desperate measure that can cause organs to waste away—their intestine begins to produce new cells, and programmed cell death in the lining stops, Lignot and his colleagues reported last year. “After a prolonged fast, the intestinal lining prepares itself for eventual refeeding,” says Lignot.

    Other animals faced with widely spaced meals have also adapted their guts to keep some cells in their digestive system up and running during the lean times. Perhaps the most extreme examples are creatures that hibernate during the cold winter or their summer counterparts that burrow and become dormant to escape life-threatening heat or dryness, a strategy called estivation. For example, Rebecca Cramp and Craig Franklin of the University of Queensland in Brisbane, Australia, have investigated the guts of green-striped burrowing frogs, an Australian species whose estivation can last more than 10 months. Once it rains hard, the frogs surface—sometimes for just a week—and they must quickly stock up on food to help find mates and build up fat reserves. Cramp and Franklin collected frogs in the wild, then allowed them to estivate in mud trays for up to 9 months. Within 3 months, the animals lost as much as 70% of their gut mass, the researchers found, and an additional 10% had disappeared by 9 months. The microvilli on cells of the frogs' intestinal lining also shrank.

    Always on standby.

    Australia's green-striped burrowing frog keeps a few gut cells up and running in preparation for a feeding frenzy when it surfaces.

    CREDIT: ED MEYER

    Overall, this gastrointestinal atrophy saves energy for the frogs, explains Cramp. Nonetheless, the gut of an estivating creature can still absorb nutrients, even though maintaining that readiness entails “a significant energetic cost” at a time when the frog has little energy to spare, she says. But the cost is a worthwhile investment, she has discovered: Her tests have shown that frogs surfacing from estivation absorb nutrients 40% more efficiently than frogs that haven't burrowed. Newly aroused frogs “can maximize their digestive capability from the outset,” says Cramp.

    Hibernating mammals called 13-lined ground squirrels also keep digestive capacity in reserve. To save energy as they hibernate, the animals' gut lining atrophies, and their intestines thin. Some cells undergo programmed cell death in response to fasting. But tissue damage is minimal because genes that promote survival and curtail cell death become active at the same time, says Hannah Carey, a physiologist at the University of Wisconsin School of Veterinary Medicine in Madison. She has taken a close look at the remaining cells and found that microvilli remain intact on the surviving cells, and their density on each cell sometimes increases. In these microvilli, transporter proteins and digestive enzymes continue to be plentiful. Thus, during brief arousals from their stupor over those quiet months, the animals can absorb nutrients still left in the gut.

    Indeed, when Carey warmed intestinal tissue from hibernating animals so that it could function normally, she observed that, gram for gram, the tissue worked more efficiently than did the intestines of the nonhibernating squirrels. “This is likely a beneficial adaptation,” she suggests: Squirrels emerging from hibernation could make the most of spring's low food supply and still have enough energy for breeding.

    Meals on the fly

    Like hibernation, migration places unusual demands on the gut. Migrating birds, for example, devote extraordinary amounts of energy to flying, and they must be able to digest the different foods they encounter along the way. For many migrating birds, “the way to cope with season-to-season differences in food quality and energy expenditure is to change gut size,” says Theunis Piersma, an evolutionary biologist with the Royal Netherlands Institute for Sea Research on the island of Texel and the University of Groningen.

    Food-flight dilemma.

    Because they have different refueling strategies, red knots (left) have shrunken guts when they migrate, but western sandpipers (inset) do not.

    CREDITS: CHRIS KELLY; (INSET) TOM MIDDLETON/PACIFIC WILDLIFE FOUNDATION

    Piersma studies the red knot, a shorebird that breeds in northeastern Canada and Greenland and migrates thousands of kilometers between its breeding and wintering grounds. He recently showed that its gut can adjust to different diets. Red knots typically eat two types of food: relatively soft fare such as shrimp, crabs, or spiders, and hard fare such as cockles, mussels, and other bivalves. The red knot's gizzard, the muscular extra stomach used to crunch shells, grows quickly when the bird switches from soft food to hard food, Piersma reported last year.

    He and his colleagues first measured the gizzards of red knots fed trout chow for several years. They then began serving small mussels to the birds. Over the next 3 weeks, the gizzards grew by 4.9 grams, a substantial change for birds weighing only about 130 grams; the birds gained just 7.3 grams overall, so gizzard growth accounted for two-thirds of the gain. The dietary changes experienced by migrating birds “have apparently favored the evolution of intestinal plasticity,” says Rick Relyea of the University of Pittsburgh, Pennsylvania.

    Piersma and his colleagues then used ultrasound to see what happens to a red knot's gut during migration. They found that when the birds arrive at a stopover, their guts are much reduced. Less intestinal baggage apparently gives the birds a better shot at making it to their destination. Yet Piersma's group has documented that when red knots make a pit stop along the Wadden Sea, their digestive systems quickly regain their ability to process food and absorb nutrients, albeit temporarily.

    His team's observations show that the speed at which the gut comes back online is critical to the birds' ability to replenish the fat reserves they need to complete their journey. According to some physiologists, that ability should be limited by how fast the birds can catch their prey. But Piersma's studies show that the gut's ability to process food is the limiting step, and that their food choice—and therefore their rate of consumption—is in large part dictated by the size of their gizzards. The faster gizzards can bulk up, the greater the birds' range of food choices. “There are incredibly strong interactions between the type of food and type of digestive machinery,” Piersma points out.

    He and his colleagues have looked at food choices of red knots at the Wadden Sea stopover. The researchers set up patches of food along the mudflats that differed in food quality. One had many good-sized mollusks, which are relatively tough to digest. The other had a sparse smattering of crabs, which are easier for the gizzard to process but gram for gram pack less nutritional value than the mollusks. They watched as radio-tagged birds foraged in these patches and used ultrasound to measure their gizzards.

    Red knots with large gizzards, which could digest the shelled food faster, favored the larger bivalves. Those with small gizzards—the majority of the migrating birds when they first arrived—had to forage almost exclusively in the other patch. As a result, they needed to spend more time—and wasted precious energy—looking for meals. Thus, the gizzard's size seems to drive the food choice and foraging strategy of the migrating red knot, Piersma and his colleagues reported in the January Journal of Animal Ecology, and only if the gizzard grows fast enough will the migrants be able to lay in the stores they need for the final leg of their trip. Says Starck: “[Red knots] that optimize gut size and function arrive with more energy reserves in their breeding grounds and thus [have an] advantage over others.”

    For some other migrating birds, in contrast, a big gut seems to be the secret to success. Consider western sandpipers, which migrate from Central America to Alaska. These birds actually increase the size of their digestive organs for migration, according to research by Tony Williams of Simon Fraser University in Burnaby, British Columbia, his graduate student R. Will Stein, and Christopher Guglielmo, now at the University of Western Ontario, Canada.

    The researchers attribute the difference between red knots and sandpipers to the species' contrasting migration habits. Red knots travel very long distances, fueling up just once during a monthlong stopover. Sandpipers hop from food stop to food stop, so they don't need to be as trim as red knots to get to their destinations.

    These gastrointestinal tales of famished snakes and ravenous frogs may have implications beyond explaining how animals adjust to the nutritional ups and downs of life in the wild. Maybe, researchers speculate, a key to understanding some of the digestive diseases that affect millions of people lies in the intestines of a hibernating squirrel or the stomach of a migrating bird. “Wild animals provide model systems to study regulatory physiology [of the gut] that may be of biomedical importance,” says the University of Wisconsin's Karasov.

  20. What's Eating You?

    1. Elizabeth Pennisi

    Sometimes, it's who wants to eat you that determines how you digest. In general, long guts absorb more food and make digestion more efficient. But having a big belly, so to speak, can slow an organism down—and that's bad news in a world of predators.

    Demonstrating a link between a creature's gut size and its enemies, Rick Relyea of the University of Pittsburgh in Pennsylvania and his graduate student Josh Auld have raised wood frog tadpoles under different conditions. They've set up tanks with all the ingredients of the tadpole's natural pond environment. In some, they put just tadpoles, anywhere from 20 to 160 per tank. In others, the researchers added a predator: immature dragonflies. These insects were kept in underwater cages, so the tadpoles were safe, but the dragonfly smell signaled danger. In the first experiment, which lasted a month, Auld removed 10 tadpoles from each tank several times, measured their sizes, and preserved the specimens. He later dissected the preserved specimens and measured gut lengths.

    Au naturel.

    Tanks that simulate the wood frog tadpole's natural environment enable University of Pittsburgh graduate student Nancy Schoeppner to study how predation and competition affect its gut size.

    CREDIT: RICK RELYEA

    In the September 2004 Ecology Letters, Relyea and Auld reported that the greater the competition for food—such as in tanks with 160 tadpoles—the longer the tadpoles' guts grew. In crowded conditions, if a tadpole is lucky enough to find food, it needs to extract as much energy as possible, notes Relyea. “A great way to increase [digestive] efficiency is to have longer intestines because it forces the food to spend more time traveling through [them],” he explains—the slow journey provides more opportunities for the gut lining to absorb nutrients.

    Big tail.

    When these tadpoles live among predators, they sacrifice gut size for larger tails (right) and faster escapes.

    CREDIT: RICK RELYEA

    In the tanks with the caged dragonflies, however, the fear-provoking chemical cues emitted by the insect stunted gut growth. This makes sense, says Relyea. In the wild, tadpoles use their tails to dart away from dragonfly larvae, and the longer the tail, the better. But growing a long tail puts demands on the tadpole's resources, and gut length is sacrificed.

    The shorter gut may be poorer at processing food, but in dragonfly-infested waters, the tradeoff for a swifter escape is likely worth it. “Animals can be amazingly sophisticated at fine-tuning their gut length to strike an effective balance between the two opposing forces of predation and competition,” Relyea concludes.

  21. A Mouthful of Microbes

    1. Elizabeth Pennisi

    Oral biologists are devouring new data on the composition, activity, and pathogenic potential of microbes in the mouth, the gateway to the gut

    Thanks to dentists, “open wide” has become a command to fear. A century ago, dental practitioners depended on a shot of whiskey and pliers to treat toothaches and sore gums. Today's dentists have laser drills and Novocain, but they haven't completely shed the aura of pain. In the near future, however, the tools of choice may be less scary: a DNA test followed by a chaser of “good” bacteria, for example.

    Over the past 40 years, oral biologists have been taking stock of the vast microbial communities thriving on and around teeth, gums, and the tongue. It's been known for quite some time that bacteria that normally reside in the mouth can escape to other parts of the body and cause problems. There's a well-substantiated link between one oral pathogen and heart problems, for example. And last year, tests in mice lent support to the theory that a common mouth bacterium can slip into the bloodstream of pregnant women and infect their uterus and placenta, eventually causing premature births.

    But poor oral health also causes harm directly. Three out of 10 people over 65 have lost all their teeth. In the United States, half of all adults have either gum disease or tooth decay; Americans spend more than $60 billion a year to treat tooth decay alone. Indeed, cavities are the single most common chronic disease of childhood, with a rate five times greater than that seen for the next most prevalent disease, asthma. Adding insult to injury, about one-third of the general population also suffers from halitosis, better known as bad breath.

    A lot of those problems can be traced to the mouth's microbes, say oral biologists. In healthy mouths, “good” bacteria and other microbes compete with nefarious cousins and keep them in check. But if conditions change, pathogenic microbes can gang up against the beneficial species and gain control of the mouth's surfaces. Bleeding gums, cavities, and bad breath can result.

    Father of oral biology.

    Using the newly invented microscope to explore the human body, Antony van Leeuwenhoek discovered the first microbes in the mouth and recorded the diversity of these organisms.

    CREDIT: CORBIS

    By comparing healthy mouths to unhealthy ones, researchers are now rooting out the problematic microorganisms, which include bacteria, viruses, fungi, and even archaea. In addition, efforts to sequence the genes, if not the whole genomes, of oral microbes are helping identify the villains in the mouth and how they trigger disease. “We've been able to get a better understanding of the variety and scale of the microorganisms that are present,” says Richard Lamont, a microbiologist at the University of Florida, Gainesville.

    The new research clearly shows that the mouth is a complex ecosystem. Some microbial species are pioneers, producing proteins that serve as welcome mats to later colonizers. It's also become evident that in the mouth, microbes gang up to cause troubles. Unlike other infectious diseases, “we are looking at infections that are caused by more than one organism,” says Howard Jenkinson, a molecular microbiologist at the University of Bristol in the United Kingdom.

    No microbe is an island

    The study of oral bacteria goes back more than 300 years. When Dutch glassmaker Antony van Leeuwenhoek began using microscopes to study the human body, the microbial world of the mouth opened up to him. He examined samples of plaque from his own teeth and made some of the first drawings of what we now call bacteria. His notes set the stage for oral microbiology. “He pointed out the nature of the diversity [in the mouth],” says David Relman, a microbiologist at Stanford University in California. Since then, “there's always been a sense that the community is what matters; the community is capable of doing things that individual [species] cannot.”

    Mixed communities.

    The mouth contains a vast number of microbial inhabitants of all different shapes and sizes.

    CREDIT: ZIE SKOBE/FORSYTH INSTITUTE

    Still, it didn't hit home that many oral problems came from microbes until the 1960s, when Sigmund Socransky of the Forsyth Institute in Boston, Massachusetts, discovered bacteria buried in the gums of people with trench mouth. Soon afterward he and Forsyth's Anne Haffajee found that the population of microbes associated with plaque-encrusted teeth changes with the severity of the disease, indicating that more than one microbe is involved.

    It also became clear that bacteria grow in thin layers—called biofilms—on the mouth's surfaces. Studying these biofilms, however, was problematic. Researchers would try to use a cheek swab or tooth scraping to grow oral microorganisms in the lab, but only about half the microbial species took hold, leaving a knowledge gap.

    In the early 1990s, Socransky and colleagues developed a method, called checkerboard DNA-DNA hybridization, that greatly sped up the characterization of the bacterial community in an individual's mouth. Instead of being limited to analyzing 100 samples per year, they could look for 40 bacteria in 28 samples simultaneously. “We can [now] do tens of thousands,” says Haffajee. Since then, oral biologists have added another tool to their repertoire; they've been identifying oral bacteria that have yet to be cultured through differences in the gene for 16S ribosomal RNA.

    Genes, genes, genes.

    Seven sequenced genomes, with more on the way, are revealing the secrets of the mouth's microbes.

    CREDIT: ADAPTED FROM NATIONAL INSTITUTE OF DENTAL AND CRANIOFACIAL RESEARCH

    Using this latter approach, Bruce Paster, a molecular microbiologist at the Forsyth Institute, and others have built extensive databases of 16S gene sequences collected from different parts of the mouth. Each sequence represents a different microorganism. “Collectively speaking, there are over 700 species in the human oral cavity, of which over half cannot yet be grown,” says Paster. He's now putting together a microarray with all the 16S genetic sequences on it. Researchers could then use it to check DNA from clinical samples: “You can quickly assess the microbial profiles of [each] mouth,” says Paster.

    Microbial whodunit

    To a microbe, the mouth is a planet full of different surfaces and environmental conditions. Not surprisingly then, oral microbes have their favorite homes—teeth or the tongue, for example. In each case, says Paul Lepp, a microbiologist at Minot State University in North Dakota, “if you can figure out which organisms should be present during health, that may lead us down the road to find some way of trying to encourage those to survive.”

    Like the microbes they study, oral biologists tend to specialize in particular areas of the mouth and the diseases arising there. Some researchers analyze microbial invasions of teeth. That's the site of perhaps the most studied mouth microbe, Streptococcus mutans. This bacterium colonizes the teeth of almost every child and feeds off sugar in the mouth, generating lactic acid that destroys tooth enamel and leads to lesions known as caries or cavities.

    Relman and Lepp, for their part, are most interested in the microbes that reside in the deepest part of the narrow crevice between the gums and the base of the tooth. In a healthy mouth, there's a tight fit between the tooth and the gum collar. But if disease sets in, the crevice widens and deepens, gums swell and bleed, and teeth loosen.

    In 2004, as part of their surveys of this crevice, Relman, Lepp, and their colleagues discovered that some of these more troublesome microbes weren't even bacteria. They were archaea, members of a group of microbes best known for thriving in extreme conditions. The researchers initially found these archaea in a third of 58 people with diseased crevices but not in individuals with healthy mouths. The more severe the disease, the larger the archaea population—and that population shrank with treatments that lessened gum disease. Oral “disease involves all sorts of organisms that we've never [thought of],” says Lepp.

    Fast work.

    Porphyromonas gingivalis relies on other microbes to attach to teeth. In minutes, thousands of the bacteria (red, yellow) invade gum cells (green) and head to each cell's nucleus.

    CREDIT: C. M. BELTON, K. T. IZUTSU, P. C. GOODWIN, Y. PARK, R. J. LAMONT, CELL MICROBIOL. 1, 215 (1999)

    Paster's chosen territory is the tongue. He's pinpointed 92 groups of microorganisms living there and is homing in on species that may create bad breath. In early 2003, Paster and his colleagues compared data from six people with bad breath with data from five people without it. There were key differences in the microbial milieu of the two groups' mouths. Three bacterial species thrived in the healthy mouths but not in the six mouths with bad breath. Paster's team also identified a half-dozen microorganisms on foul-smelling tongues that were not found on the sweet-smelling ones. In an experiment whose data have yet to be published, they have confirmed that the tongue's microbial makeup is important in halitosis. The investigators tracked the microbial communities on the tongues of 33 people with halitosis before and after their bad breath was eliminated through daily scraping of the tongue and use of mouthwash and toothpaste containing zinc. As halitosis lessened, the tongue's microbial communities changed, says Paster.

    Sometimes a microbe in one oral setting helps other microbes establish themselves elsewhere. Take the gram-positive bacterium Streptococcus gordonii. These microbes settle on the saliva film coating teeth early in life by recognizing certain saliva proteins, which they proceed to break down and use as nutrients. In turn, surface proteins of the streptococci attract other bacteria, such as the pathogen Porphyromonas gingivalis. It attaches to the streptococci to get a toehold at the tooth-gum interface. But in a matter of minutes, it moves on to invade the gum's epithelial cells.

    To help understand how P. gingivalis colonizes the gums, Lamont has recently chronicled the pathogen's changing repertoire of proteins. Using a series of analytical techniques, including tandem mass spectrometry, capillary high performance liquid chromatography, and two-dimensional gel electrophoresis, he and his colleagues documented that the bacterium made about 220 new proteins when it encountered gum epithelial cells. Many of them are surface proteins that likely establish a nutrient pipeline from the host or protect the bacteria from immune responses, Lamont and his colleagues reported in the January issue of Proteomics.
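
    The arithmetic behind a count of newly made proteins is, at heart, a set comparison: proteins identified when the bacterium is exposed to epithelial cells, minus those already seen without them. The sketch below uses invented protein identifiers and stands in for, rather than reproduces, the analysis in the Proteomics paper.

```python
# Sketch of counting newly expressed proteins: subtract the baseline set of
# identified proteins from the set identified after contact with gum
# epithelial cells. All protein IDs are invented placeholders.

baseline = {"prot_001", "prot_002", "prot_003", "prot_004"}
with_epithelial_cells = {"prot_001", "prot_002", "prot_003",
                         "prot_004", "prot_057", "prot_112"}

newly_expressed = with_epithelial_cells - baseline
print(f"{len(newly_expressed)} proteins seen only after contact:",
      sorted(newly_expressed))
```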

    New tools of the trade.

    Oral researchers are getting a clearer view of the mouth's microbial communities using enamel chips mounted on teeth. With microarrays (inset), they are learning which proteins are important.

    CREDITS: P. KOLENBRANDER/NIDCR; (INSET) Y. ZHANG, T. WANG, W. CHEN, O. YILMAZ, Y. PARK, I. Y. JUNG, M. HACKETT, R. J. LAMONT, PROTEOMICS 5, 198 (2005)

    Wash that mouth out

    Although the genomes of dozens of microbes are now in the public databases, the sequencing of oral bacteria is just ramping up. Thus far, researchers have sequenced seven species that settle in the mouth, and a half-dozen more are in the pipeline, says Dennis Mangan, a microbiologist at the National Institute of Dental and Craniofacial Research (NIDCR) in Bethesda, Maryland, which supports the sequencing projects. And in late December 2004, NIDCR started a 3-year study with Relman and The Institute for Genomic Research in Rockville, Maryland, to catalog all the microbial genes in the mouth. For this project, Gary Armitage of the University of California, San Francisco, will collect DNA samples from seven sites in the mouth—the teeth, gums, tongue, palate, cheek, and so on—from a mix of people in different states of oral health. He and his colleagues estimate that the project will identify about 40,000 genes by its conclusion. “We will have a good handle on those organisms that are associated with health and those organisms that are associated with disease, and what genes are prevalent in disease,” says Lepp.

    With that knowledge, researchers may be able to pinpoint weaknesses in pathogenic oral microbes or learn how to boost the mouth's response to microbial attacks. That might ultimately lead to mouthwashes that inhibit just bad mouth microbes instead of good and bad alike, or to drugs that disable the surface proteins microbes use to infect the mouth.

    Armed with the growing understanding of the mouth's microbial ecosystem, one company, Oragenics Inc. in Alachua, Florida, has already taken steps to pit bacteria against bacteria in the battle for healthy teeth. The approach, called replacement therapy, would use genetically modified S. mutans bacteria to supplant the normal cavity-causing ones; the altered germs don't make lactic acid. The company's microbe-laden mouthwash has been controversial, however, and the Food and Drug Administration put the first clinical trials on hold over concerns that, should something go wrong, the researchers could not guarantee they would be able to rid the mouth of the modified microbes. But in December 2004, Oragenics got approval to continue the trial with denture wearers, whose dentures—and bacteria—could be removed if a problem occurs.

    It may be years before Oragenics's cavity-fighting bacterium makes it into dental practice. And it's not clear how popular this particular approach will be. But Mangan is convinced that all the work characterizing the mouth's microbial communities will eventually have a big payoff. “[It's] going to point to ways to control the numbers and the species of bacteria there, particularly the dangerous ones,” he says. That could make cavities, gum disease, and bad breath a thing of the past—and bring a healthy smile to the mouths of many people.
