News this Week

Science  01 Nov 2002:
Vol. 298, Issue 5595, pp. 938

    Conflict Brewing Over Herbicide's Link to Frog Deformities

    Rebecca Renner*
    *Rebecca Renner is a writer in Williamsport, Pennsylvania.

    A witches' brew of controversy is bubbling up over the potential link between atrazine, one of the most widely used herbicides in the United States, and the decline of amphibians. The latest additions to the brew are new findings from developmental endocrinologist Tyrone Hayes's group at the University of California, Berkeley, suggesting that exposure to very low levels of atrazine in the wild is turning male frogs into hermaphrodites. But new experimental results in another frog species, to be presented by experimental toxicologist James Carr of Texas Tech University in Lubbock and other researchers at a meeting later this month, cast doubt on such low-dose effects. At stake could be continued regulatory approval for atrazine.

    Earlier this year Hayes set the kettle boiling when he reported in the Proceedings of the National Academy of Sciences that in the lab, male tadpoles exposed to low levels of atrazine developed into hermaphrodites or had other reproductive-organ deformities, apparently due to disruptions in their endocrine system (Science, 19 April, p. 447). The study used African clawed frogs (Xenopus laevis), known as the “lab rat” of amphibian toxicology studies. Now he's extended the finding to a native species, the leopard frog (Rana pipiens), in both lab and wild populations. Counterintuitively, the lowest doses of atrazine appear to be the bitterest pill for frogs. But other teams, including Carr's, say that they have been unable to replicate Hayes's original Xenopus findings.

    Atrazine is used throughout the world in countries that are major corn growers, according to Timothy Pastoor, head of global risk assessment for Syngenta, a major producer of atrazine. Several countries in Europe have banned the compound but not directly due to health concerns. The countries have a policy of banning any questionable pesticide that occurs in drinking water at levels higher than 0.1 parts per billion (ppb).

    The U.S. Environmental Protection Agency (EPA) is currently reevaluating the risk that atrazine could affect human or ecological health at environmental levels. The possible link between atrazine and reproductive-organ abnormalities in frogs is so important to this reevaluation that the agency is delaying the process to convene a specially selected panel of scientists to evaluate the available data, according to an EPA spokesperson. Amphibian populations have been declining precipitously in the past decade or more, ecologists have reported, and Hayes and others have proposed that widespread use of atrazine could contribute to the problem. Syngenta, meanwhile, has been sponsoring an independent, multimillion-dollar research program into the potential effects of atrazine in the environment on animals' endocrine systems. The Carr study is funded through this program.

    Digging up trouble.

    Tyrone Hayes, here building frog traps in Nebraska, finds that atrazine makes male frogs hermaphroditic.


    Hayes says that his new study shows the real-world dangers of atrazine: “This isn't just about an amphibian ‘lab rat’ anymore.” The paper, published in this week's issue of Nature, “is very provocative and we are taking it very seriously, but we need to see more data,” says a senior EPA scientist who requested anonymity. Many experts who are familiar with the studies would only speak anonymously, citing concern that their comments might bar them from consideration for EPA's review panel.

    Hayes's team exposed Rana tadpoles in the lab to one of two doses of atrazine. The testes of some exposed males contained oocytes—eggs that should only be found in female frogs—or were underdeveloped. At the lower dose, 0.1 ppb, 29% of the males had testes containing oocytes; at 25 ppb, just 8% of the frogs were abnormal. (Drinking water in the United States is allowed to have up to 3 ppb atrazine.)

    Hayes's team suggests that atrazine disrupts male frogs' hormones by inducing production of aromatase, an enzyme that converts androgens into estrogens. Endocrine disrupters often have more marked effects at lower doses, according to zoologist Louis Guillette of the University of Florida, Gainesville.

    The team also examined 800 newly metamorphosed Rana from eight sites across the western and midwestern United States. They found hermaphrodite males at the seven sites with water-borne atrazine contamination higher than 0.2 ppb. The highest proportion of abnormalities, 92%, was recorded from a relatively pristine site in Wyoming. The one site with no measurable amounts of atrazine hosted healthy frogs.

    But toxicologist Keith Solomon of the University of Guelph, Ontario, Canada, one of the members of the Syngenta-sponsored team, argues that the results show no meaningful link between atrazine exposure and abnormalities. Low levels of atrazine don't induce aromatase in frogs, he claims. Without the aromatase link, “there should be a correlation between dose and response. But the highest proportion of abnormalities were found at a site with atrazine levels close to background. On this basis how can the cause be atrazine?” As for the lab-based findings, “if this effect is robust, [our] new laboratory study on Xenopus should produce clear results—it doesn't,” he says.

    Carr's team, like that of Hayes, exposed Xenopus tadpoles to atrazine until they completed metamorphosis. They found an effect, “but the results were only statistically significant at the highest dose, nominally 25 ppb, but actually 19 to 21 ppb based on water measurements,” he says. A third, unpublished, Syngenta-sponsored study conducted in zoologist John Giesy's lab at Michigan State University, East Lansing, found a similar proportion of abnormalities in control and atrazine-treated frogs, says Carr. In addition, “field studies of Xenopus in South Africa—the frog's native habitat—show no response,” says Solomon. Results of the Carr study and the South African field studies are to be presented at the Society of Environmental Toxicology and Chemistry annual meeting in Salt Lake City, Utah, 16 to 20 November.

    The EPA atrazine docket, a publicly available record of comments about atrazine reregistration, reveals that the Hayes and Carr teams have been swapping detailed and pointed critiques about each other's studies. But an amphibian toxicologist not involved in either effort suggests that the complexity of long-term amphibian studies might account for some of the discrepancies. “These three studies are from good labs,” he says, adding, “there's a lot that we don't understand about simple things.” The researcher says that it's not clear if any of the seemingly minor differences between the Hayes and Carr protocols matter, such as the different strains of frogs, densities of tadpoles, or materials used to build the tanks. Such is the witches' brew of ingredients that the EPA panel will ponder.


    Scientists Recommend Ban on North Sea Cod

    David Malakoff and Richard Stone

    CAMBRIDGE, U.K.—In what could prove to be a serious blow to Europe's ailing cod-fishing industry, fisheries scientists last week advised the European Union (E.U.) to ban cod fishing in the North Sea and several other historic regional trawling grounds. But economic pressures might lead politicians to tone down the advice.

    The recommendation is based on surveys suggesting that major cod stocks in the northeast Atlantic are at historic lows. European fishing quotas for 2003 are due next month, and scientists say that a ban is the only way to protect future stocks. “Populations will collapse if there are not drastic reductions in fishing,” says Robin Cook, director of the Fisheries Research Services Marine Laboratory in Aberdeen, United Kingdom.

    But some researchers wonder whether politics will trump science, noting that a ban could cost 20,000 jobs in the United Kingdom alone. “I would be delighted but shocked if [the European Commission] agrees to all the needed restrictions,” says Andrew Rosenberg, a former chief fisheries regulator in the United States and now a dean at the University of New Hampshire, Durham. Other researchers predict that even quick action might not restore healthy stocks soon.

    Smaller schools.

    The number of spawning cod in the North Sea has sunk to record lows, imperiling the fishery.


    Europe's cod drama reprises one that gripped North America in the last decade, when groundfish stocks collapsed in the western North Atlantic and subsequent fishing bans devastated U.S. and Canadian fleets. European fishers are now following the same chart, says the Copenhagen-based International Council for the Exploration of the Sea (ICES), which advises the European Commission on fisheries. According to scientific surveys and catch statistics, the North Sea's cod spawning schools have dropped to just 15% of what they were in the early 1970s. “The stock is half of the absolute minimum” needed to sustain healthy populations, says Hans Lassen, ICES's fisheries adviser.

    Even with a ban, hard-hit stocks might not bounce back quickly. Some Canadian populations haven't recovered even after a decade of restrictions, notes fisheries scientist Jeffrey Hutchings of Dalhousie University in Halifax, Nova Scotia, Canada. U.S. stocks have fared better, as the fisheries were closed before populations plummeted to unsustainable levels and the fish mature faster than their northern cousins.

    The European stocks appear to share some of that robustness, says Hutchings, although dwindling numbers “raise major worries.” The U.K.'s Cook agrees: “We're right on the end of the graph,” he says, predicting that it will take at least 4 years to rebuild fishable cod populations.

    A cod ban could also harm other fleets. That's because ICES has recommended banning vessels that target other species—such as haddock or shrimp—but also net cod as so-called bycatch. “You cannot look at the cod in isolation,” says Lassen.

    Rosenberg is skeptical that politicians will crack down on bycatch. The enormous economic implications of that move, he notes, have led policy-makers to ignore ICES recommendations in the past, “making the hole they are in now even deeper.” Still, he hopes that E.U. officials will create some “closed areas that are big enough to provide the fish with a substantial refuge.”

    Industry is pleading for less drastic steps. George MacRae of the Scottish white fish producers association told The Guardian last week that a sweeping ban “is the doomsday scenario.” Policy-makers now face the slippery task of balancing the future of the fish against that of the fishers.


    Iron Deficiency Reveals Nearly Pristine Star

    Robert Irion

    Astronomers have found an ancient star that preserves a chemical record of the infant cosmos. The little star, just now facing the end of its long life, suggests that the first stars in the universe might not all have been the colossi that models predict. “It's astounding that we can glimpse such an early stage of the universe through the composition of this star,” says astronomer Catherine Pilachowski of Indiana University, Bloomington.

    Stars are relentless element factories, transforming hydrogen and helium (products of the big bang) into heavier elements such as carbon, oxygen, silicon, and iron. Massive elements also form in the fires of supernova explosions, which spray the rich mixtures into space. Generations of stars have seeded our Milky Way galaxy in this way, altering its primordial composition into a potpourri more conducive to rocky planets and computer chips.

    Ancestral stars might persist, burning slowly on the Milky Way's sparse outskirts where new stars no longer arise. Astronomers have scoured space for those objects for more than 2 decades. Previously, the most primitive star found in such searches contained about one 10-thousandth as much iron as the sun. Some researchers speculated that they would never come closer to the so-called Population III—the first stars, born with no heavy elements (Science, 4 January, p. 66).

    However, an ambitious survey of more remote parts of the galaxy has uncovered a star 20 times as anemic. Astronomer Norbert Christlieb of the University of Hamburg, Germany, and his colleagues scrutinized the star in December 2001 with one of the four 8-meter telescopes in the European Southern Observatory's Very Large Telescope array in Paranal, Chile. Analysis of the light from the star, called HE0107-5240, shows that its atmosphere is a strikingly unspoiled broth of hydrogen and helium with the barest dash of heavy elements: just one iron atom for every 7 billion atoms of hydrogen. The team's results appear in the 31 October issue of Nature.
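    The quoted abundance can be translated into the bracket notation astronomers use for metallicity. A minimal sketch, assuming a conventional solar iron-to-hydrogen number ratio of about 10^-4.5 (a reference value not given in the article):

```python
import math

# Standard bracket notation: [Fe/H] = log10(n_Fe/n_H)_star - log10(n_Fe/n_H)_sun.
# Assumption: solar log10(n_Fe/n_H) ~ -4.5; the article gives only the stellar ratios.
SOLAR_LOG_FE_H = -4.5

def fe_h(iron_per_hydrogen):
    """Metallicity [Fe/H] for a given iron-to-hydrogen number ratio."""
    return math.log10(iron_per_hydrogen) - SOLAR_LOG_FE_H

# HE0107-5240: one iron atom for every 7 billion hydrogen atoms
he0107 = fe_h(1 / 7e9)
# Previous record holder: about one 10-thousandth of the solar iron abundance
previous = fe_h(1e-4 * 10 ** SOLAR_LOG_FE_H)

print(round(he0107, 1))                  # -5.3
print(round(10 ** (previous - he0107)))  # 22, in line with "a star 20 times as anemic"
```

    The factor-of-20 comparison follows directly: the previous champion sits near [Fe/H] = -4, and the new star more than a full decade of abundance below it.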

    HE0107-5240 might record an imprint of the first supernovas, says co-author Timothy Beers, an astronomer at Michigan State University in East Lansing. For instance, dollops of nickel are evident in the previous most iron-poor star, but HE0107-5240 is nearly nickel-free (see figure). That absence might reflect a basic difference in how the earliest supernovas forged elements, because even a single modern supernova would have supplied enough nickel to pollute the star. “This may be the first example of a true second-generation star,” Beers says. “It's our best look at the starting recipe that led to the rest of the periodic table [of the elements].”

    Clean slate.

    Ultraviolet spectral lines of iron and nickel reveal that a newly found ancient star (third from top) contains the lowest proportion of heavy elements yet seen.


    The star is now a red giant: the bloated end stage of a star that has fused most of its hydrogen fuel. However, Beers notes, it lived for at least 12 billion years as a small star just 80% as massive as our sun. Current models maintain that primitive gas clouds with almost no heavy elements could not have formed tiny stars, because hydrogen alone can't cool clouds to the frigid temperatures needed for small clumps of gas to collapse. Rather, theories hold, the first stars were enormous—perhaps 100 to 1000 times larger than our sun. HE0107-5240 suggests that little stars were in the initial mix as well or were born soon thereafter. Some tiny stars might have formed as companions to gigantic ones and survive as relics to this day, Pilachowski notes.

    The Hamburg survey might reveal more primitive stars to help fill in the tale. Christlieb's team has analyzed just one-quarter of its most promising candidates so far.


    HapMap Launched With Pledges of $100 Million

    Jennifer Couzin

    A consortium of six nations is diving into a massive new genomics project it hopes will pinpoint the genes behind common diseases. After months of passing the hat among countries and private companies, the U.S. National Institutes of Health (NIH) announced earlier this week that it's garnered the $100 million the 3-year effort to construct a so-called haplotype map is likely to cost. But even as the project was announced with considerable fanfare, many details remained sketchy.

    The idea for the HapMap, as it's informally known, arose soon after scientists discovered that the human genome has a surprisingly structured architecture. Thousands of DNA bases, and the patterns of single-base variations among them, fall into roughly the same order in many people. A popular theory is that slight tweaks in those DNA blocks, or haplotypes, could mean the difference between health and ailments ranging from cancer to diabetes. Researchers plan to examine 200 to 400 genetic samples from four populations in Africa, Asia, and the United States. (Previous studies have shown that haplotype patterns differ in part based on migratory histories.)

    Enthusiastic about the HapMap's potential to provide medical answers that the full human genome sequence has yet to offer, NIH paved the way, planning a $40 million commitment early this year. Since then, the Canadian government kicked in a little under $10 million and, more recently, the Wellcome Trust Sanger Institute in Hinxton, U.K., about $25 million. Japan, China, and the SNP Consortium, a public-private group seeking single-base differences among genomes, are also adding to the pot.

    Work is expected to begin as soon as participants at genome centers in the United States and abroad agree on some ground rules for the project, perhaps the most unwieldy collaboration since the sequencing of the human genome. They have yet to determine, for instance, how data collection will be standardized. Also uncertain is precisely how the map will be structured and how the work will be divvied up.

    “We've learned how to find good ways to work together,” says David Bentley, head of human genetics at the Sanger Institute. But he notes that unlike the 3 billion bases biologists knew they'd uncover in the genome project, here no one knows quite what to expect.


    More Questions About Hormone Replacement

    Jennifer Couzin and Martin Enserink

    Three months after a review panel abruptly stopped a 16,600-woman study of hormone replacement therapy (HRT), a stunned medical community is trying to resolve questions raised by the trial. Last week, several hundred experts and observers gathered at the National Institutes of Health (NIH) in Bethesda, Maryland, to weigh the implications. Most agreed that hormone therapy should not be used to prevent disease. But HRT might still have valid, short-term uses in treating the symptoms of menopause. The risks are not clear, however, nor will they be easy to study, for many acknowledge that large-scale hormone trials might no longer be feasible or ethical.

    That point was underscored when the U.K.'s Medical Research Council announced in London at the same time that it was abandoning a similarly ambitious hormone study. The British trial, Women's International Study of Long Duration Oestrogen After Menopause (WISDOM), had planned to enroll up to 22,000 women. It was already struggling to recruit volunteers when the U.S. study of Prempro, a drug combining estrogen and progestin, was halted in July. An interim analysis of the U.S. research, part of NIH's Women's Health Initiative (WHI), had shown that the hormones increased the risk of heart disease, breast cancer, and stroke more than they reduced chances of osteoporosis, bone fractures, and colorectal cancer (Science, 19 July, p. 325).

    WISDOM's leaders, fighting to keep their trial alive, argued that the benefits might still outweigh the risks for many women. But the Medical Research Council overruled them. Results from the $32 million study, not expected until 2016, were unlikely to differ enough from those of WHI to alter clinical practice, says Oxford University's Ray Fitzpatrick, chair of an international panel that recommended terminating the study.

    WHI's outcome, meanwhile, has sown confusion among women and their doctors. NIH organized the workshop in an attempt to clear it up. The befuddlement was due in part to the fact that most women take hormones to counter symptoms of menopause such as hot flashes, which the trial was not designed to evaluate. It examined other health endpoints among women whose average age was 63. Many doctors questioned whether the WHI results applied to women typical of those in their waiting rooms—in their early 50s and just entering menopause. Could the risk of disease attributed to hormone use be lower in a younger cohort?

    Shutting down the trial raised broad questions like these, said Deborah Grady of the University of California, San Francisco: “The dilemma now is [how do we decide] who's at too much risk to take hormone replacement therapy?” WHI investigators are poring over 5 years of data to try to identify risk factors. Grady and others cautioned against making assumptions that are not backed up by WHI's data.

    The study was halted when 38 per 10,000 women receiving Prempro for a year were diagnosed with invasive breast cancer, compared to 30 in the placebo group. Although this 26% increase is substantial, the risk for an individual woman remains small.
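    The relationship between those two figures is simple rate arithmetic; a sketch (the trial's published 26% came from a time-to-event hazard ratio, so recomputing from the crude rates lands a point higher):

```python
# Crude rates as quoted above: invasive breast cancers per 10,000 women
# per year of use, Prempro arm vs. placebo arm.
treated_rate, placebo_rate, denom = 38, 30, 10_000

relative_increase = (treated_rate - placebo_rate) / placebo_rate  # ~0.27
absolute_excess = (treated_rate - placebo_rate) / denom           # 0.0008

print(f"{relative_increase:.0%}")                     # 27%
print(f"{absolute_excess:.2%} extra per woman-year")  # 0.08% extra per woman-year
```

    The two figures illustrate the article's point: a roughly one-quarter relative increase corresponds to only 8 extra cases per 10,000 women per year in absolute terms.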

    In the future, researchers should “focus on 50 to 59 [year-olds],” was the message from the audience, says Marian Limacher, a WHI investigator at the University of Florida College of Medicine in Gainesville. But she thinks it would be next to impossible to run such a trial: “Who's going to be willing to stay on long-term hormones now?” she asks. Not many, if the aborted WISDOM trial is any indication. Although closely watched trials of HRT to prevent Alzheimer's disease will continue, others—including one on lupus patients—have been abandoned, according to NIH officials.

    One of the most vexing questions is whether the risks linked to Prempro use apply to the four other combination HRT products on the market. The National Heart, Lung, and Blood Institute (NHLBI), which funded the WHI study, hopes to find out, although NHLBI's Jacques Rossouw agrees that “women might be a little leery” about enrolling in another hormone trial. Although the Food and Drug Administration is considering relabeling combination hormones to reflect the risks, Janet Woodcock, director of the agency's Center for Drug Evaluation and Research, says differences among product recipes make it “not possible to extrapolate” from Prempro to other medications.

    While efforts to sift the results continue, investigators are watching for the next step by Wyeth, Prempro's manufacturer. In July, Wyeth requested access to the study data; NIH agreed to hand the information over. “Once it's released we can't control what they do with it,” explained Limacher, who's unhappy that Wyeth will access the data before investigators publish all the findings. But Wyeth vice president Ginger Constantine argued, as Prempro sales plummeted, that “nobody needs science more than us.”


    Going Head-to-Head Over Boas's Data

    Constance Holden

    Studying skull dimensions is commonplace in forensics and paleoanthropology. But two new papers offering diametrically opposed analyses of a classic study by Franz Boas suggest that the technique is still controversial for many anthropologists entwined in the ongoing debate over the relation among genes, environment, and race.

    Boas, the father of American anthropology, published a study in 1912 challenging the prevailing belief that ironclad genetic rules govern cranial shapes. He took measurements from 13,000 European immigrants and their offspring living in New York, comprising seven ethnic groups, the largest being Hebrews (Jews from Eastern Europe), Bohemians, central Italians, and Sicilians. He compared parent-offspring resemblance in immigrants whose children were born in the United States with those whose children were born in Europe to see whether living in the New World had an effect on skull shape (see graphic).

    Using the cephalic index—the ratio of head breadth to head length—Boas found what he saw as a small but significant trend: The U.S.-born children in the four largest groups were more different from their parents than were the foreign-born. Jews, who had “very round head[s],” became more “long-headed,” he reported, while long-headed Italians became more short-headed—“so that both approach a uniform type in this country.” The study is often cited as evidence that humans can't be pigeonholed in racial categories because their morphology is too malleable.
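    Boas's measure is a single ratio; a sketch with made-up measurements (the cutoff values are conventional anthropometric ones, not figures from the 1912 paper):

```python
def cephalic_index(breadth_mm, length_mm):
    """Boas's measure: head breadth as a percentage of head length."""
    return 100 * breadth_mm / length_mm

# Hypothetical skulls. Conventionally, an index above ~80 counts as
# "round-headed" (brachycephalic), below ~75 as "long-headed" (dolichocephalic).
round_headed = cephalic_index(152, 182)  # ~83.5
long_headed = cephalic_index(140, 190)   # ~73.7
```

    The debate below is over how far, and how meaningfully, this one number drifts between parents and their U.S.-born children.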

    Rudimentary as his statistical methods may have been, “in general, we conclude that Boas got it right,” say Clarence C. Gravlee of the University of Michigan, Ann Arbor, and colleagues in a paper posted online months ahead of its publication in the American Anthropologist. The difference in the two groups of offspring, the authors state, is small but “highly significant.”

    Taking their measure.

    Two teams of researchers have reanalyzed data (top) from a classic study by Franz Boas of European immigrants in America—and come to contrasting conclusions.


    Wrong, say Corey Sparks of Pennsylvania State University, University Park, and his adviser, Richard Jantz of the University of Tennessee, Knoxville. The divergence in the U.S.-born offspring is “negligible” and the influence of the environment “insignificant,” they say in the 7 October Proceedings of the National Academy of Sciences. “Uncritical acceptance of [Boas's] findings has resulted in 90 years of misunderstanding about the magnitude of [cranial] plasticity.”

    Sparks doesn't disagree that Boas found a difference in cranial shape between foreign- and domestic-born children. And Gravlee does not quibble with Sparks about the high heritability—and, hence, stability—of the trait. But the two sides disagree on whether the differences, although statistically significant, are also scientifically meaningful.

    Sparks says that the differences pale when compared with the much greater variation seen among ethnic groups. “About 99% of the variation [among all the groups studied] is due to ethnic variation and 1% to immigration,” Jantz explains. “Boas was right in identifying a small immigration effect,” and that has been confirmed in many subsequent studies, he says. “The real value of Boas's work, as reinterpreted by us, is how small that environmental response is.” Henry Harpending of the University of Utah, Salt Lake City, supports Sparks's analysis, arguing that “with samples this large, almost anything can become statistically significant even if it is not worth any attention.”
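    The 99%-to-1% split Jantz describes is a variance partition. A minimal sketch on invented cephalic-index values (two well-separated ethnic groups, each nudged about half a point by U.S. birth), not Boas's actual data:

```python
from statistics import mean

def eta_squared(groups):
    """Fraction of total variance explained by a grouping:
    between-group sum of squares over total sum of squares."""
    values = [v for g in groups for v in g]
    grand = mean(values)
    ss_total = sum((v - grand) ** 2 for v in values)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    return ss_between / ss_total

# Invented cephalic indices: ethnic groups A (~78) and B (~85), each
# shifted ~0.5 points between European-born and U.S.-born offspring.
a_eu, a_us = [77.8, 78.2], [78.3, 78.7]
b_eu, b_us = [84.8, 85.2], [85.3, 85.7]

by_ethnicity = eta_squared([a_eu + a_us, b_eu + b_us])   # ~0.99
by_birthplace = eta_squared([a_eu + b_eu, a_us + b_us])  # ~0.005
```

    Grouping the same numbers two different ways shows how a real but small immigration effect can account for well under 1% of the total variation.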

    Gravlee, however, insists that the numbers confirm Boas's “overarching conclusion,” namely, that “the cephalic index is sensitive to environmental influences and therefore does not serve as a valid marker of racial phylogeny.”

    The practical impact of the two papers is not clear. Sparks thinks that his analysis will help those who want to use cranial data to study population history, because the Boas study “has been a burr in our bed for 90 years.” Indeed, Jantz was a plaintiff in the long-running suit by scientists seeking permission to study Kennewick Man, a 9000-year-old skeleton found in 1996 in Washington state.

    Anthropologist Alan Goodman of Hampshire College in Amherst, Massachusetts, agrees with Gravlee that it's risky to rely on cranial data to identify the origins of long-gone populations. “The evidence does suggest that crania do change,” he says. “If you want to apply [craniometrics] to Kennewick Man and you know there's instability over a 10-year period, what can you expect over a 9000-year period?” Sparks responds that the instability is largely owing to genetic changes, not plasticity, and that a common ancestor can still be inferred by comparing an ancient skull with a modern one that resembles it.

    Heavy media coverage of the Sparks paper prompted the American Anthropological Association to post the Gravlee paper, scheduled for March 2003, on its Web site. The two authors will go head-to-head again in June when the American Anthropologist revisits the issue.


    Placentas May Nourish Complexity Studies

    Virginia Morell

    It's one of the oldest riddles in evolutionary biology: How does natural selection gradually create an eye, or any complex organ for that matter? The puzzle troubled Charles Darwin, who nevertheless gamely nailed together a ladder of how it might have happened—from photoreceptor cells to highly refined orbits—by drawing examples from living organisms such as mollusks and arthropods. But holes in this progression have persistently bothered evolutionary biologists and left openings that creationists have been only too happy to exploit. Now a team of researchers presents a model system for studying the evolution of complex organs—in this case, the placenta—that Darwin could only dream about.

    “Darwin had to use organisms from different classes,” explains David Reznick, an evolutionary biologist at the University of California (UC), Riverside, “because there isn't a living group of related organisms that have all the steps for making an eye.” And there's no way to put Darwin's solution on firm scientific footing through experiments, Reznick notes, because the organisms in his model are so distantly related to one another.

    On page 1018, Reznick and his colleagues propose that guppylike fish in the genus Poeciliopsis can solve such problems for the placenta and, by extension, other complex organs. Placentas have evolved independently three times in closely related Poeciliopsis species, they report. Other species in the genus lack placentas, and some have partial maternal provisioning by means of tissues that might be precursors of placentas. Thus the fish present the full trajectory of steps involved in the evolution of this organ, Reznick says, allowing researchers to “see what's been added, or what has changed, and eventually identify the genes associated with the evolution of each trait.” Adds Stephen Stearns, an evolutionary biologist at Yale University, “Since these placentas have evolved multiple times, we now have a promising model that can be explored and manipulated in the lab—something we've needed for a long time.”

    Placentas serve as a decent stand-in for eyes and other complex organs such as the heart or kidney whose histories evolutionary biologists have never been able to trace, Reznick and colleagues argue. By definition, complex organs are composites of independently derived features; for instance, the human eye focuses light and also perceives color. In the case of the placenta, the organ provides nutrients for the fetus while simultaneously managing waste products and regulating gas exchanges. Evidence of the intermediate steps for acquiring such organs is missing from the fossil record, enabling creationists to claim they were “created” de novo.

    Recurring theme.

    Placentas (left) that provision embryos (right) have evolved three times in Poeciliopsis fish.


    Reznick began to suspect that the poeciliid fish and their placentas could serve as a model for addressing thorny evolutionary questions 15 years ago, while writing a review of live-bearing fish. Earlier this year, one of his co-authors, Mariana Mateos of the Monterey Bay Aquarium Research Institute in Moss Landing, California, developed a phylogenetic tree for Poeciliopsis. By combining the tree with Reznick's earlier research and phylogenetic analysis by UC Riverside's Mark Springer, the team now demonstrates that there were three independent origins for placentas in six species of Poeciliopsis.

    The team also estimates the amount of time required for the separation of the poeciliid species. They based their clock on the rate of mutations in the fishes' mitochondrial DNA and incorporated dates of geological events that probably led the species to diverge. The shortest time interval between a poeciliid species with a placenta and its last common ancestor without one was 750,000 years—a period in keeping with the 400,000 years other researchers have calculated for the evolution of the eye. Despite this relatively short period, “it's not a problem for evolution to create this kind of complexity,” says Stearns.
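    The dating logic rests on a standard molecular clock. A sketch with illustrative numbers (the team's actual calibration used dated geological events; the divergence and rate below are assumptions, not figures from the Reznick paper):

```python
def divergence_time_myr(pairwise_divergence, rate_per_lineage_per_myr):
    """Molecular clock: mutations accumulate on both diverging lineages,
    so pairwise divergence d = 2 * mu * t, giving t = d / (2 * mu)."""
    return pairwise_divergence / (2 * rate_per_lineage_per_myr)

# e.g. 1.5% mtDNA sequence divergence at an assumed rate of 1% per
# lineage per million years
t = divergence_time_myr(0.015, 0.01)  # ~0.75 Myr, the 750,000-year scale cited above
```

    The factor of two matters: both descendant lineages accumulate mutations independently after the split, so halving the raw divergence-over-rate figure recovers the time since the common ancestor.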

    Other researchers, such as Günter Wagner, an evolutionary biologist who is also at Yale, caution that Reznick has yet to demonstrate convincingly that the poeciliids' placenta is a complex organ. But even with this caveat, Wagner concedes that the poeciliid model offers evolutionary biologists a rare opportunity: “We should welcome any model, and especially one like this that has several related species with all the variations in the evolution of this trait.”

    Reznick admits that the poeciliid placenta might not be as sophisticated as the mammalian placenta. But like the evolution of the eye, the evolution of the mammalian placenta is lost in history. “We can't ask how this kind of adaptation evolved with mammals because it only happened once over 100 million years ago,” he says. The answer might come instead from small, guppylike fish.


    Is Sugary Toxin the Smoking Gun?

    1. Jocelyn Kaiser

    A team of researchers claims to have found more support for the controversial assertion that a toxic microbe called Pfiesteria is responsible for massive fish die-offs along the U.S. East Coast. But the new studies, which include the first rough sketch of the toxin, have failed to convince skeptics.

    For 10 years, aquatic ecologist JoAnn Burkholder of North Carolina State University in Raleigh has argued that a potent neurotoxin from the dinoflagellate Pfiesteria has killed more than a billion fish in East Coast estuaries and sickened lab workers and fishers. However, the toxin has not been identified. This past summer, doubts escalated when researchers at the Virginia Institute of Marine Science (VIMS) in Gloucester Point and other universities reported in two top journals that they could not find a toxin, and that Pfiesteria can kill larval fish by feeding on them (Science, 11 October, p. 346).

    Toxic or just hungry?

    Scientists disagree on how deadly a sugarlike molecule reportedly made by the Pfiesteria microbe (above) is to fish.


    Last week, at the 10th International Conference on Harmful Algae in St. Petersburg, Florida, Burkholder said that her critics had not established the right conditions for making Pfiesteria produce toxin. Her lab coaxed the strain of Pfiesteria shumwayae used in the VIMS experiments to kill juvenile tilapia in less than 4 hours, which meets her criteria for toxicity. Collaborating chemist Peter Mueller of the National Oceanic and Atmospheric Administration in Charleston, South Carolina, described a toxic chemical isolated from Burkholder's fish-killing Pfiesteria strains. Burkholder says the NOAA lab also detected this chemical in water and cells from the VIMS strain. It appears to be a glycoside, a molecule that's half sugar, half some other chemical group that hasn't been identified.

    Other algal toxin researchers remain skeptical. Wayne Carmichael of Wright State University in Dayton, Ohio, says that, although he knows of one other algal toxin that's a glycoside, this kind is unlikely to cause the neurotoxic effects reported in fish and humans. “It would not explain the range” of observations, he says. VIMS fish pathologist Wolfgang Vogelbein points out that nobody has yet shown that this purified toxin produces the lesions he sees on fish physically attacked by Pfiesteria.

    Burkholder's critics want the chance to test her toxic strains. Burkholder, who has long been criticized for not sharing her strains, says that “there were discussions” at the meeting of organizing blind testing of her cultures by other labs, but it's “still in the planning stages.” The key issue, she says, is for other scientists to follow her protocols.


    NATO Ordered to Cut Science Program

    1. Richard Stone

    CAMBRIDGE, U.K.—The idea was to celebrate science at the North Atlantic Treaty Organization (NATO). But 2 days before last week's first-ever “Grand Gathering” in Brussels of researchers and others connected with NATO's science program, the alliance's political overseers slashed the program's $24 million budget by 13%. The fete quickly turned into a self-examination of a program that has struggled to find a suitable mission to replace its former role in helping Western nations stand up to Soviet hegemony. It also spawned a behind-the-scenes effort to reverse the cuts.

    The science program supports research grants, fellowships, and workshops for scientists from NATO's 19 member countries and 34 nations in Eastern Europe, Central Asia, and North Africa. Its budget—a slice of NATO's civilian budget, which itself is only 14% of the alliance's roughly $850 million war chest—“is peanuts,” admits Jean Fournet, NATO's assistant secretary general for scientific affairs. But after a decade of budgetary stagnation, “this is the first year we've had a substantial cut,” says University of Oslo mathematician Jens Erik Fenstad, a 10-year veteran of the science committee that helps set program policy. “NATO has to decide whether it wants a science program or not,” adds committee member Charles Buys, a medical geneticist at the University of Groningen, the Netherlands.

    Science has never been a high priority for NATO's military masters. The program survived a temporary Canadian withdrawal in 1997, thanks in part to a report by a blue-ribbon panel that urged NATO to expand its scientific efforts in Eastern Europe (Science, 31 October 1997, p. 795). But the program is under renewed scrutiny along with NATO itself, which next month prepares to welcome up to nine new members.

    In recent years, the science program has won praise for funding security projects in such flash points as the Caucasus and Central Asia and for its innovative “Virtual Silk Highway” Internet project that links scientists from Vancouver to Vladivostok. The program responded to the 11 September terror attacks by bolstering its portfolio of nonclassified research and workshops on nonproliferation and fighting terrorism. And its robust ties with Russian scientists have aided that country's integration into the NATO family.

    Propped up.

    A NATO science project on seismic risk is studying the aftermath of a 1988 quake that devastated Gyumri, Armenia's second largest city.


    But just as the science program appeared to be adapting to the changing geopolitical landscape, the overlords of the alliance's civilian affairs—NATO ambassadors from each member country—delivered a harsh setback by lopping off a big chunk of its budget. Although their 22 October deliberations were secret, a few of the smaller member nations have been demanding cuts in NATO's civil budget, says Thordur Jonsson, Iceland's representative to NATO's science committee. The science program, the largest civil line item apart from salaries, proved a tempting target.

    When word filtered out the next day, representatives on the science committee from 18 of the 19 member states immediately signed a statement denouncing the cuts. The lone holdout was the U.S. representative, physicist Vic Teplitz of Southern Methodist University in Dallas, Texas. “I didn't particularly want to sign it,” says Teplitz, adding that he favors “a more thoughtful reaction.” The letter was expected to go this week to NATO's secretary general, George Robertson, who according to committee members can weigh in before the decision is finalized.

    Anticipating bad news, Fournet's team had already decided to revamp the popular fellowships program, which places a few hundred scientists a year from Eastern Europe and other disadvantaged regions in Western labs. It hopes to save money by awarding salaries and equipment grants to about 1000 scientific émigrés a year who are willing to return East for at least 3 years. “We'll do more with less” by taking advantage of the large differential in salaries, says Fournet. But Jonsson and other committee members worry that the fellowship program might have to be scrapped altogether. Buys, the Dutch representative, warns that “further cuts will be disastrous.”

    Science committee members hope to convince Robertson of the value of science in strengthening the alliance. “If only they would forgo buying one F-16, you could use the money to transform NATO science,” notes one member. At $25 million, such a financial transfer would more than double the science budget. On the other hand, a continued decline in the science program might leave it too poor to buy even spare parts for the fighter plane.
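The committee member's aside holds up arithmetically; a quick sketch using the dollar figures quoted above:

```python
budget = 24.0                    # science program budget, in $ millions
after_cut = budget * (1 - 0.13)  # ~$20.9 million left after the 13% cut
f16_price = 25.0                 # one F-16, in $ millions, as quoted

# Redirecting the price of a single fighter "more than doubles" the budget:
assert after_cut + f16_price > 2 * after_cut
```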


    Venter's Next Goal: 1000 Human Genomes

    1. Rebecca Spieler Trager*
    1. Rebecca Spieler Trager is an editor for The Blue Sheet in Chevy Chase, Maryland.

    Fundraising campaigns often repay donors with mugs, buttons, or books as a token of thanks, but DNA sequencer J. Craig Venter is offering something more personal. People who donate $500,000 to his recently formed J. Craig Venter Science Foundation can have their genome analyzed and get the results on a disk.

    Venter, who left the position of CEO at Celera Genomics in Rockville, Maryland, in January, is making this offer as he drums up support for several research projects that his nonprofit foundation will oversee. They include a scheme to develop hydrogen-producing organisms, a genetics policy shop, and a lab for high-speed DNA sequencing. The lab is a high priority, and Venter says he hopes to get it launched by January 2003; within 2 years, he expects it to sequence the genomes of 1000 individuals, including those of interested sponsors. It will also test new technologies, including an efficient DNA sequencing system patented last month by U.S. Genomics of Woburn, Massachusetts (Science, 25 October, p. 735).

    Venter is steering $50 million of the funds he controls into the sequencing facility, much of it originating from the endowment of The Institute for Genomic Research (TIGR), also in Rockville. He founded TIGR in 1992; it is now run by his wife, biologist Claire Fraser, and it receives 95% of its funds from grants and contracts. Venter expects to find strong private support for human sequencing based on initial reactions, but he hasn't received any pledges as yet.

    Donors to the project will have a chance to learn about their own DNA and at the same time contribute their genetic information to a pooled database for use in medical research, according to Venter. The data will be placed in the public domain, possibly in the National Institutes of Health's GenBank, he says. These human genomes will not be as complete as those produced last year by Celera and the public Human Genome Project led by the National Human Genome Research Institute, as the plan this time is to sequence only the “essential” gene-coding regions. Venter adds that he intends to publish his findings in a scientific journal.

    Premium offer.

    J. Craig Venter is proposing to analyze the genomes of interested major donors to his foundation.


    Venter does not plan to collect medical data on donors, but he hopes to team up eventually with a health center that will be able to interpret the results and possibly even offer clients diagnostic information. He does not yet have such a medical partner. Venter believes that his new human DNA database will be more valuable than earlier ones containing “homogeneous genome sequence,” because it will include many more individual genomes, making it easier to “identify associations between traits and genetics.” Because this research involves human subjects, the project will follow “standard procedures,” according to Venter, including securing informed consent from participants and approval by an institutional review board.

    Despite such assurances, Arthur Caplan, director of the University of Pennsylvania Center for Bioethics in Philadelphia, is concerned that volunteers who offer to donate DNA be told that they are unlikely to receive much benefit from participating. Caplan says that any genetic risk profile emerging from this effort is likely to be “loose, weak, and unreliable,” because the field is so young. According to Caplan, proper informed consent should communicate “how poor the information is likely to be.” This would not be a strong selling point for the foundation's fundraising efforts.

    Richard Gibbs, director of Baylor College of Medicine's Human Genome Sequencing Center in Houston, Texas, also thinks that the plan to sequence the genomes of 1000 individuals is “flawed in some of its details,” although he praises Venter for “pushing the envelope.” The computer models used to identify genes in DNA data have not been fully validated, Gibbs says, suggesting that ramping up to do human genomes at high speed might yield unreliable results. He also is concerned that the road map for dealing with ethical issues is not clear, either, as there is no federal legislation in place to protect against genetic discrimination. In addition, he wonders whether Venter will find many people who are willing to participate in the project and can afford a $500,000 donation, the projected cost of a genome analysis.

    Venter has faced skepticism before. He remains confident that his collection of 1000 human genomes will become a powerful tool for identifying the causes of disease, and that public fears about misuse of such data can be overcome. Venter explains that he donated his own DNA to the Celera genome sequencing project—and announced this fact—because he wanted to lead by example.

    His new lab, in addition to probing human genomes, will test new high-throughput sequencing technologies available from a variety of companies such as U.S. Genomics and Solexa, a U.K.-based biotechnology company. The 3700-square-meter facility will also house biological energy research activities already under way at the Institute for Biological Energy Alternatives (IBEA)—established by Venter earlier this year—in addition to new microbial genome sequencing initiatives and environmental projects. “Researchers at TIGR and IBEA are simultaneously looking for new organisms and analyzing known organisms that metabolize carbon or create hydrogen,” according to a TIGR statement.

    TIGR will not be involved in the 1000-human-genomes sequencing effort, but Fraser hopes to use some of the new capacity for the 40 sequencing projects TIGR has under way simultaneously, including genomes of Plasmodium vivax, a human malaria parasite, and other organisms. With TIGR's current backlog, Fraser says, “we could immediately make use of a 50% increase in sequencing capacity.” The total capacity of the new sequencing lab will represent a fivefold increase over TIGR's current capability.

    As Venter sees it, the critical challenge for scientists in his field is to connect with clinical medicine. “The most important thing now,” he says, “is to make the human genome relevant to the general population.”


    Tapping DNA for Structures Produces a Trickle

    1. Robert F. Service

    Research teams around the world are trying to speed up protein structure determination—and running into many barriers

    BERLIN, GERMANY—The two dozen labs that signed up for a venture called “structural genomics” several years ago had hoped to be pumping out a stream of results right about now. Their goal, set in 2000, was to follow the lead of the Human Genome Project, ramp up quickly, and have each lab solve hundreds of new protein structures per year. It was a bold idea, but no one knew whether it would be possible to automate the research to this degree. So when research teams met to compare notes here last month,* they were disappointed to learn that everyone was having plumbing troubles. Their pipelines have sprung leaks, and instead of delivering a flood of results, so far they're delivering just a trickle.

    Consider the trials of Ian Wilson, a structural biologist at the Scripps Research Institute in La Jolla, California. Wilson heads one of nine U.S. pilot projects that are trying to dramatically speed up the three-dimensional atomic mapping of proteins. Wilson's team began 2 years ago by automating the numerous steps involved in crystallizing proteins and collecting the x-ray data needed to solve their structures. According to data presented at last month's meeting, Wilson's group has encountered difficulties at each stage of the process. Although the group started with more than 1870 protein targets, they've generated only 23 completed protein structures.

    Wilson's group is not alone. Other groups presented similar results here in what was the first opportunity for the 30 or so publicly funded structural genomics projects around the world to compare their data in a public forum. So far, these projects have targeted more than 18,000 proteins but solved the structures of only about 200. Says Naomi Chayen, a protein crystallization expert at Imperial College, London: “The number of structures is disappointingly low.”

    Although the numbers suggest that the latest field to embrace industrial-scale biology is floundering, the mood at the conference was surprisingly upbeat. “We had to start somewhere,” Wilson says. And the high rate of attrition, he says, “is just what we expected.” Structural biologists working with one protein at a time have faced these grim attrition rates for years, Wilson adds. Compared to the traditional output rate, the new programs are doing very well.

    Still, most experts acknowledge that the challenges are formidable. “We knew this would be hard, and it is,” says John Norvell, who coordinates the nine structural genomics pilot projects funded by the National Institute of General Medical Sciences (NIGMS) in Bethesda, Maryland. It is still an open question whether this research can be carried out in production-style facilities, but nobody is yet losing faith.

    Proteins on parade.

    An international collaboration drew on proteins from a variety of organisms to produce these structures.


    Picking up the pace

    The stampede into structural genomics began 5 years ago when the Japanese government announced plans to build the RIKEN Genomic Sciences Center in Yokohama; it was designed among other things to turn out protein structures at high speed. The U.S. National Institutes of Health (NIH) followed suit in September 2000, funding seven “structural genomics” pilot projects around the country through NIGMS. Last year, NIGMS added two more centers. Similar pilot projects got started in France, Germany, Canada, the United Kingdom, and South Korea. More recently, Switzerland, China, and Finland have jumped in.

    All of this adds up to a major new source of support for structural biology. Japan alone has promised to spend $100 million a year for the next 5 years in support of the RIKEN effort and eight new centers. NIGMS is chipping in another $50 million a year for its centers, and the European Community has added about another $4 million a year on top of the tens of millions spent by member countries. Over the next 5 years, governments will spend roughly one-quarter the cost of the entire Human Genome Project, all just to see whether large-scale protein mapping is feasible. But then again, the genome project itself was far from a sure bet in its first few years.
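The "one-quarter of the Human Genome Project" comparison can be checked against the figures above. A sketch; the roughly $3 billion total for the genome project is the commonly cited figure, not one given in this article, and member-state spending beyond the itemized commitments is excluded:

```python
# Annual commitments, in $ millions, as quoted in the article
japan, nigms, ec_top_up = 100, 50, 4
five_year_total = (japan + nigms + ec_top_up) * 5   # $770 million over 5 years

hgp_cost = 3_000  # Human Genome Project total, ~$3 billion (assumed, widely cited)
fraction = five_year_total / hgp_cost               # ~0.26, i.e. roughly one-quarter
```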

    The cost is no surprise. Protein mappers have long known that speeding up structure analysis would be difficult. “This is much harder to do than sequencing DNA,” says Thomas Terwilliger, an x-ray crystallographer at the Los Alamos National Laboratory (LANL) in New Mexico. Like genes, proteins consist of a linear chain of building blocks. But each of these protein chains—composed of amino acids—folds into a complex 3D web, and the shape determines the chemical function. Because researchers can't reliably predict protein structure from the amino acid sequence, they must physically map each one. They do this by bombarding crystals of a protein with powerful x-ray beams or by bouncing radio-frequency signals off a solution of proteins in a nuclear magnetic resonance (NMR) machine and using the returned signals to work out where the atoms sit.

    Powerful incentives are motivating this work. For basic researchers, mapping a protein can be the key to determining exactly how it functions. By scanning large numbers of structures, they hope to gain insights into how families of similar proteins evolved. Drug designers use protein maps to help tailor pharmaceuticals to block or enhance a protein's chemical activity.

    But the new strategy is a departure from tradition. The ability to sequence full genomes “has turned structural biology on its head,” says Chris Sander, a computational biologist at the Memorial Sloan-Kettering Cancer Center in New York City. Instead of picking targets because they are biologically interesting, structural genomics researchers can now scan genome databases for stretches of DNA encoding genes of completely unknown function, hunt down their proteins, study the results, and perhaps discover entirely new realms of biology in the process.

    Leaky pipes

    Getting this new method to work is like mastering a frustrating computer game: Every time you reach the next level, a new challenge awaits you. Research teams must first engineer Escherichia coli and other organisms to produce the proteins they want to characterize. Then they have to purify them and see if they are soluble in water, an essential step for creating the crystals used in x-ray studies and the liquid solutions used in NMR research. X-ray teams then try to coax the proteins to form well-ordered crystals, which they take to a synchrotron to collect the x-ray data. NMR teams, meanwhile, must ensure that their proteins are stable for the extended period of time required for NMR scans.

    The attrition is severe at each step. Take the numbers cited by the Northeast Structural Genomics Consortium (NESGC), a collaboration among eight institutions led by NMR expert Gaetano Montelione of Rutgers University in Piscataway, New Jersey. At the Berlin meeting, NESGC members reported that to date they have pursued 5187 DNA targets, cloned 1675 of them, and expressed 1295 as proteins. But only 773 were soluble. Consortium members purified 719 of their proteins, but they crystallized only 94. So far they have determined 50 structures, 22 using x-ray analysis and 28 using NMR. All the pilot projects report similar statistics. “Right now everything is a bottleneck,” says Sander.
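The stage-by-stage losses reported by the consortium can be tallied directly; a quick sketch of the numbers above:

```python
# NESGC pipeline counts as reported at the Berlin meeting
pipeline = [
    ("DNA targets", 5187),
    ("cloned", 1675),
    ("expressed as protein", 1295),
    ("soluble", 773),
    ("purified", 719),
    ("crystallized", 94),   # x-ray track only; the NMR track skips this step
]
structures = 50             # 22 solved by x-ray analysis, 28 by NMR

# Yield of each stage relative to the stage before it
for (prev_name, prev_count), (name, count) in zip(pipeline, pipeline[1:]):
    print(f"{name}: {count} ({100 * count / prev_count:.0f}% of {prev_name})")

overall = 100 * structures / pipeline[0][1]
print(f"structures solved: {structures} ({overall:.2f}% of all targets)")
```

The tally makes Sander's point concrete: no single stage dominates; even the best step (purification) loses material, and the worst (crystallization) passes barely one target in eight.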

    One of the biggest problems is a basic one: coaxing E. coli and other organisms to express the right proteins and getting them in a soluble form. Researchers are successful about half the time when they try to express bacterial proteins in bacterial vectors. They typically have a lower success rate (20% to 30%) getting bacteria to copy eukaryotic proteins, such as those from yeast or humans, in part because they often require “chaperone” molecules and other factors to encourage proper folding. Some groups are trying to express eukaryotic proteins in yeast and other eukaryotic organisms, but these organisms are widely viewed as finicky and tricky to handle. The result, says Montelione, is that “most of the structures we're seeing right now are bacterial proteins.”

    Crystal medium.

    Using oil to contain crystals as they grow may remove a production barrier.


    Researchers continue to have trouble getting proteins to form the crystals needed for x-ray studies, which produce the lion's share of 3D structures. “We take a big hit in crystallization and optimization of crystals,” says Montelione. “What we are seeing is that high-throughput is not enough,” Chayen says. “What we need is higher output.”

    Still, Montelione and most others at the meeting say that they expect output will improve rapidly as groups gain experience and bring new technology on line. The current numbers, NIGMS's Norvell says, “are only the initial look” and reflect the fact that most groups have only recently started to produce any structures at all. Just because proteins haven't yielded structures on the first try, “that doesn't mean they're out of the pipeline,” says Norvell. Individual groups will likely take more than one crack at solving the difficult ones. William Studier, a biologist at Brookhaven National Laboratory in Upton, New York, agrees: “The phase at which the results are really going to come is still a year or two away.”

    Plugging holes

    New robots and clever biological tricks might solve some of the problems researchers are now facing, according to scientists at the meeting. Chayen, for example, described a new scheme for growing protein crystals in oil that yields faster and higher quality results, increasing the odds of obtaining precise atomic maps.

    At RIKEN, biochemist Shigeyuki Yokoyama is working with another promising technique, known as cell-free protein synthesis, that breaks open cells to isolate and capture the protein-producing ribosomes, eliminating most of the rest of the cellular machinery. Intact cells often cannot handle the large quantities of protein that they're asked to produce for structural studies; by stripping away most of the cell, Yokoyama has solved this toxicity problem. His method also allows researchers to fiddle with the protein production mixture—adding chemical cofactors and other enzymes that help proteins fold properly—to boost the likelihood of producing working proteins. “It's very promising,” says Montelione. “For the future, all groups will have to work on that technology.”

    Another key step in preparing material for imaging is to obtain properly folded proteins; Geoffrey Waldo of LANL reported a new way to do this. The technique links each gene for the protein of interest to the gene for a protein-based light emitter, called green fluorescent protein (GFP). The two proteins are linked in such a way that the proper folding of the first causes GFP to fold into a light-emitting shape. As a result, by simply shining light on a panel of growing cells, researchers can tell which proteins are likely expressed and folded correctly. Researchers can also use the approach to screen thousands of mutant versions of proteins to see if tinkering with wild-type proteins creates novel folding patterns that make it easier to conduct structural studies. “This could be a very attractive method to a lot of people here,” says Udo Heinemann, who heads the Protein Structure Factory, Germany's primary structural genomics project, located in Berlin.

    Engineering teams, meanwhile, are setting up new high-speed robots and computer software designed to produce, purify, and crystallize proteins. They also will be used to scan for the best quality crystals, collect x-ray and NMR data at high speeds, and turn those data into final 3D structures. Whether or not they achieve the production targets, says Montelione, “the new technology development will be very important and valuable for all of structural biology.”

    Questions of scale

    Whether these promises will translate into a flood of new structures remains a big unknown. Structural genomics “hasn't proven itself yet,” Sander says. And a funding crunch might be on the horizon.

    When NIH launched its protein structure effort in September 2000, it originally set a goal of producing structures for 10,000 unique proteins in 10 years. That was a tall order considering that only some 2000 such independent proteins had been mapped in the past 4 decades. The agency funded nine centers through 2005 as pilot projects to test out new high-throughput technologies. But NIGMS will soon confront some tough decisions, Norvell says, noting that “by the end of 5 years, we certainly won't be where we need to be.” The common view is that 5 years from now, each center will likely be able to produce 100 to 200 protein structures a year, or about 1500 in total. As a result, the expectation all along, says Scripps's Wilson, has been that NIH would select a subset of centers to scale up. Norvell adds that over the next year an NIH advisory committee will begin trying to sort out the best way to proceed after 2005. The answer, he says, likely won't come until early 2004, about the same time the current pilot efforts should be hitting their stride.
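Norvell's worry is easy to see in the arithmetic; a sketch using the projections quoted above:

```python
goal = 10_000            # unique structures in 10 years, the target set in 2000
centers = 9
low, high = 100, 200     # projected annual output per center, 5 years out

annual = (centers * low, centers * high)   # 900 to 1800 structures per year
# the article's "about 1500 in total" per year sits inside that range
assert annual[0] <= 1500 <= annual[1]

# even the optimistic rate would need several more years of full-speed output
years_at_best = goal / annual[1]           # ~5.6 years at 1800 structures/year
```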

    Similar projects around the globe also face uncertainty. Science budgets are particularly tight in France and Germany at the moment. For many of these programs, it seems, the question is whether the payoff will come in time to convince funding agencies to stick with structural genomics' hefty price tag.

    * International Conference on Structural Genomics, 10–13 October.


    Big Biology Without the Big Commotion

    1. Robert F. Service

    Like its forebear, the Human Genome Project, structural genomics is an exercise in industrial-scale biology. But in at least one respect the two projects seem different: To date, structural genomics has not experienced the noisy head-to-head competition between academic groups and companies that roiled the genome project. If anything, relations are downright cozy—to the extent that two structural genomics companies are even members of U.S. public projects and also plan to deposit some of their private results in a public database.

    Companies were quick to gear up their own high-speed efforts to map protein structures. Private start-ups raised approximately $500 million, an amount that approaches what governments around the world have promised through 2005. But the public and private efforts are largely pursuing separate goals. Whereas academic groups are looking to catalog the diversity of proteins in nature, companies are focusing on targets for new drugs. Says Stephen Burley, chief scientist of Structural GenomiX in San Diego, California: “One is going for breadth, the other is going for depth.”

    At a recent meeting in Berlin (see main text), Burley and Eric Adam of Syrrx, another San Diego-based structural genomics company, revealed a sampling of the companies' efforts and early results, showing off their formidable analytical power. Syrrx, for example, turned out 56 structures of unique proteins in the last 8 months and has delivered a total of 80 structures since the company formed in 1999. Structural GenomiX has already banked more than 100 structures. Taken together, that's about the same number produced by the nearly 30 publicly financed programs worldwide and roughly 10 times the number produced by a typical major pharmaceutical firm in a year. Both Burley and Adam say their companies have yet to hit full speed.

    Good bet.

    Studying new protein families may lead to major drug finds, Ian Wilson says.


    Aside from their focus on putative drug targets, the primary difference in the company efforts is scale. According to Adam, who has recently left Syrrx, the company has already burned through about $70 million on robotics and other automation technology. The company still has $50 million in the bank, he says, meaning that its financial backing is roughly half what the U.S. National Institutes of Health plans to spend on its nine structural genomics centers combined over 5 years. It's still early days, but many academics are impressed by the fleet of automation tools that kind of money can buy. “I'm stunned by the technology development,” says Wim Hol, a structural biologist at the University of Washington, Seattle.

    Adam says that the structures Syrrx has produced so far are either protein kinases or proteases, popular drug targets. Some appear to play key roles in diabetes and breast, colon, prostate, and skin cancers, according to Adam, who adds that the company is working to develop drugs that inhibit them and then team up with major pharmaceutical companies to push the compounds through clinical trials. The goal, he says, is to move compounds into the clinic by 2004. Structural GenomiX has similar goals, Burley says, and it is also pursuing novel kinases that appear to be involved in cancer.

    Generating lots of protein structures doesn't guarantee a company a new blockbuster drug, says Ian Wilson, a structural biologist at the Scripps Research Institute in La Jolla, California. However, he adds, the strategy is “a reasonable bet.” Companies want to find new proteins because they hope this will give them a head start on designing new classes of drugs, which is often the key to turning out a blockbuster. It's a possibility that entices academics and investors alike.

  JAPAN

    Postdocs Get Primer on How to Survive Abroad

    1. Dennis Normile,
    2. Andrew Lawler

    The U.S. approach to intellectual property rights is foreign to many Japanese scientists. But ignoring it could mean jail

    KYOTO, JAPAN—Takuhiro Hoshino has just entered a master's degree program in medicine at Kyoto University. But he's already mapping out plans to become a postdoctoral researcher in the United States. His to-do list includes spending long hours studying the basic regeneration processes of the human immune system and reading Stephen King novels to hone his English-language skills. Hoshino is also boning up on U.S. intellectual property rules—the latest survival skill for Asian scientists hoping to avoid the legal traps that have ensnared some of their colleagues.

    Last month Hoshino was one of about 50 participants in a roundtable discussion entitled “Working in the U.S.: Advice for Young Scientists” at the annual meeting here of the Japanese Biochemical Society. The subtext: “We want to explain how to go abroad without being arrested,” quipped Ken-ichi Arai, dean of the University of Tokyo's Institute of Medical Science and a co-organizer of the event, which featured advice on a variety of topics from 10 Japanese scientists who have spent many years in North American laboratories.

    Arai might have been exaggerating, but the discussion is an outgrowth of recent incidents of alleged industrial espionage involving Japanese researchers. In May 2001, the U.S. Justice Department charged two Japanese-born scientists with conspiring to “benefit a foreign government” by stealing trade secrets in the form of cell lines and DNA samples from a laboratory at the Cleveland Clinic Foundation in Ohio, where one had worked from 1997 to 1999 (Science, 18 May 2001, p. 1274). U.S. prosecutors are still seeking the extradition of Takashi Okamoto from Japan (Science, 10 May, p. 1003).

    That case, which was splashed across the Japanese media, was followed barely a year later by the arrest of a research couple formerly at Harvard University—one a Chinese native, the other from Japan—for allegedly conspiring to steal Harvard-owned trade secrets and for shipping university property across state lines. The case is slowly making its way through the U.S. legal system (Science, 28 June, p. 2310).

    Words of advice.

    Ken-ichi Arai (left) and Yoko Fujita-Yamaguchi use the Kyoto meeting to reinforce suggestions from the Japanese government.


    The two incidents reflect the chasm between Japanese and U.S. practices regarding the handling of academic research materials and data, says Yoko Fujita-Yamaguchi, a biochemist at Tokai University near Tokyo and a co-organizer of the roundtable. U.S. laws that foster the commercialization of federally funded research have prompted academic labs to become increasingly concerned about protecting intellectual property rights, says Fujita-Yamaguchi, who returned to Japan 2 years ago after more than 20 years in the United States, primarily as a principal investigator at the Beckman Research Institute of the City of Hope in Duarte, California.

    “You must always remember that, as a postdoc [in the United States], your research results belong to your boss,” she bluntly told the audience. “Without your boss's acknowledgment, you must not take samples or data away from the lab.” In contrast, she says, Japanese researchers and institutions are only now developing an interest in patenting research results, and materials and information are still typically passed around without any written agreements.

    The Japanese government is beginning to recognize the problem this cultural difference can pose for expatriate researchers. In July, the Ministry of Education, Culture, Sports, Science, and Technology distributed a “checklist of things to be aware of” to universities and research institutes, asking them to forward it to any researchers headed overseas (see table). There's a special concern about contracts, says Toichi Sakata, deputy director-general of the ministry's Research Promotion Bureau.


    Contracts in Japan are simple documents, he says, and are often modified verbally to suit changing or unforeseen circumstances. In contrast, U.S. contracts are detailed and are interpreted strictly. “America is a contractual society,” he says. “In Japan, what's important is to maintain harmonious relations between parties.” That difference, he adds, might not be clear to a freshly arrived Japanese postdoc who's asked to sign a complex agreement that “is probably baffling to a nonnative speaker of English.”

    U.S. universities probably don't do enough to help foreign postdocs understand the complex issues of intellectual property, conflict of interest, data management, and authorship, agrees Penny Rosser, who directs the international office at the Massachusetts Institute of Technology in Cambridge. Postdocs are “bombarded with documents” upon their arrival and might not know where to turn, she says. That's also the case at Stanford University in California, says Leland Madden, assistant director of its international center. “I assume most [foreign postdocs] sign these documents without giving them a lot of scrutiny,” he says.

    Some U.S. universities are trying to do more, taking steps that will also benefit domestic postdocs. Last year, for example, the University of North Carolina, Chapel Hill, created an office of postdoc services, with legal counsel available to discuss intellectual property issues. “We're responding to a need to provide better advice,” says Sharon Milgram, a cell biologist and faculty adviser to the new office. “Until a postdoc is faced with an issue involving intellectual property or conflict of interest, they don't treat it as that important,” she says. “And that's true for faculty, too.”


    Crime and (Puny) Punishment

    1. Gretchen Vogel

    A seizure of highly enriched uranium on the Bulgaria-Turkey border shows that heightened vigilance and high-tech forensics are not sufficient to deter would-be nuclear smugglers

    KARLSRUHE, GERMANY—In May 1999, a nervous-looking man caught the attention of Bulgarian border guards as he attempted to enter the country from Turkey. A search of his car turned up a certificate “for the purchase of uranium 235” written in Cyrillic and a lead container labeled “uranium 235.” Inside was a glass ampoule filled with several grams of fine black powder that Bulgarian scientists later confirmed to be highly enriched uranium (HEU). According to press reports, Uskan Hanifi, a Turkish citizen, told police that he had bought the uranium in Moldova. He had been trying to sell it in Turkey, he explained, but having failed, he was attempting to return to Moldova.

    The small amount of uranium seized at the border is far short of the quantity terrorists would need to fashion a crude nuclear bomb—that would require at least 10 kilograms of HEU, experts say. But the incident underscores the vulnerability of poorly secured research reactors, a likely source of the seized uranium. Just such a concern prompted U.S. and Yugoslav authorities to whisk 48 kilograms of unused HEU fuel from a research reactor in Belgrade last summer (Science, 30 August, p. 1456). But the Bulgarian case, detailed at a conference* here last week sponsored by the International Atomic Energy Agency (IAEA), also reveals an Achilles' heel of scaled-up efforts around the globe to prevent terrorists from getting their hands on materials that could be used to create a nuclear device or a radiological, or “dirty,” bomb: In many countries where the potential for smuggling is greatest, authorities lack the legal tools to give convicted smugglers much more than a slap on the wrist. The smuggler in this case was given only a fine.

    The case also illustrates how the latest nuclear forensic techniques are being brought to bear in tracing smuggled materials. A year after the uranium-filled vial was seized in Bulgaria, the U.S. Department of State arranged for it to be sent to Lawrence Livermore National Laboratory in California with the hope that detailed analyses could offer clues to where the sample originated. Working with several other Department of Energy labs, a team led by Livermore scientist Sidney Niemeyer uncovered a wealth of telling characteristics. Analysis by x-ray diffraction and electron microscopy revealed that the powder was particularly fine-grained uranium oxide, team member Nathan Wimer of Livermore told the meeting. The grains were strikingly uniform in size and shape, suggesting that the powder was milled in a sophisticated lab. Chemical and isotopic analyses revealed that the uranium was 73% U-235 and 12% U-236, consistent with material that had been recycled from very highly enriched nuclear reactor fuel.

    Further work allowed the researchers to zero in on the process used to produce the uranium and when it was last handled. Chemical separations, thermal ionization mass spectrometry, and radiation spectrometry allowed the scientists to precisely measure the ratios of seven isotope pairs produced as the uranium decays. These ratios suggest that the material went through so-called Purex reprocessing—a chemical treatment that separates spent fuel into waste products and reusable uranium and plutonium—around the end of October 1993. The mass spectrometry also revealed that the uranium powder was loaded with impurities such as sulfur, chlorine, iron, and bromine.
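    The dating rests on simple daughter-ingrowth arithmetic: reprocessing strips the decay daughters out of the uranium, and they then re-accumulate at a rate set by the parent's decay constant, so a measured daughter-to-parent ratio translates directly into the time elapsed since the last chemical separation. The sketch below applies this to a single pair, Th-230 growing in from U-234; the choice of pair and the measured ratio are illustrative only, not values from the Livermore analysis.

```python
import math

# Half-life of U-234 in years (a physical constant)
T_HALF_U234 = 245_500
LAMBDA_U234 = math.log(2) / T_HALF_U234  # decay constant, per year

def ingrowth_age_years(th230_per_u234):
    """Years since Th-230 was last chemically stripped from the uranium.

    For ages of a few years -- tiny compared with both half-lives --
    decay of the daughter itself is negligible, so the atom ratio
    D/P is approximately lambda_parent * t.
    """
    return th230_per_u234 / LAMBDA_U234

# Hypothetical measured atom ratio (NOT a value from the article)
ratio = 1.8e-5
print(round(ingrowth_age_years(ratio), 1))
```

    In practice the Livermore team combined seven such isotope pairs, which both tightens the date and cross-checks that no daughter was only partially removed during reprocessing.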

    Nuclear sleuthing.

    Livermore scientists used the IsoProbe, a state-of-the-art mass spectrometer, to precisely measure radioactive isotopes and impurities in a uranium sample seized by Bulgarian border guards. An electron microscope revealed unusually fine grains of uranium oxide, like this sample (bottom).


    Taken together, the powder's characteristics would allow scientists to match it to another sample from the same source, Wimer says. “This is quite unusual material and quite indicative” of certain types of reactor processes, he says. Theoretical models of processes that could have produced the powder's particular ratio of radioactive isotopes suggest that it is derived from fuel that was originally 90% uranium-235, Niemeyer says. The scientists concluded that the sample is consistent with material from a research reactor, most likely in the former Soviet Union, although they were unable to pinpoint the exact source.

    More conventional forensic techniques corroborated the notion that the sample emanated from Eastern Europe. The ampoule—a type sometimes used to archive nuclear samples—was lined with an unusual paraffin wax tinged yellow by barium chromate, a colorant rarely used in Western countries but common in Brazil, China, India, and Eastern Europe. The label on the lead container and a piece of paper wrapped around the ampoule were both derived from a mixture of hardwood and softwood trees commonly found in Eastern Europe. And the isotopic signature of the container suggested that the lead came from a mine in Asia or Eastern Europe.

    But even before the Livermore team brought its analytical firepower to bear on the sample from Bulgaria, the smuggler was long gone. He had been convicted of trafficking in controlled nuclear materials a few months after his arrest at the border. At his sentencing, however, the judge asked how much the seized uranium was worth; Bulgarian scientists replied that a lab might sell a sample of that size for legitimate purposes for $3000 to $4000. Accustomed to locking away drug smugglers who traffic in much more valuable commodities, the judge fined the man several thousand dollars and let him go, according to Alexander Strezov of the Institute for Nuclear Research and Nuclear Energy in Sofia, Bulgaria.

    Bulgarian and U.S. officials are still hoping to discover exactly where the sample came from and whether a larger cache exists that the smuggler and his associates were hoping to sell on the black market. But that will require more political cooperation. Although scientists at the original reactor could certainly identify the sample, there is not enough publicly available information to make a conclusive match. “Unless the responsible country is forthcoming, there is not going to be a resolution” to the question of the sample's origin, Wimer says.

    In the absence of such cooperation, several meeting participants suggested that the development of a database of known nuclear and other radioactive sources, perhaps coordinated by IAEA, could help trace seized materials. Although secrecy could thwart the development of a comprehensive database, says Lothar Koch of the European Commission's Institute for Transuranium Elements in Karlsruhe, IAEA or another organization could at minimum seek to convince countries to identify matches if presented with details of a suspicious sample.

    Stronger links between the scientific community and law enforcement are another vital line of defense against nuclear trafficking. In another case described at the meeting, a bus at the Presevo border crossing between Macedonia and Yugoslavia triggered a recently installed radiation detector. A search revealed a suspicious container with Chinese lettering. Later analysis revealed that it contained highly radioactive cobalt-60. The border guards evacuated the bus, but then they allowed everyone to go—missing the chance to determine who might have been exposed to potentially dangerous levels of radiation from the cobalt-60, not to mention allowing the smuggler to escape.

    “The scientific problems are important,” Strezov said at the meeting's closing session, “but more important are law enforcement personnel. They are on the front line.” Well-trained police and laws with teeth are just as important as high-tech analyses for preventing the stuff of nuclear nightmares.

    • * Advances in Destructive and Non-Destructive Analysis for Environmental Monitoring and Nuclear Forensics, Karlsruhe, Germany, 21–23 October.


    Evo-Devo Enthusiasts Get Down to Details

    1. Elizabeth Pennisi

    Researchers seek out variation among individuals to help them understand development's role in evolution

    Some researchers are turning Theodosius Dobzhansky's famous dictum, “nothing in biology makes sense except in the light of evolution,” on its ear. Evolution, it turns out, makes no sense except in light of biology—developmental biology, to be precise. Ever since Darwin formalized the idea that species change through time in response to their environments, researchers have been debating how this happens. Does evolution proceed in leaps, possibly through sudden, major genetic changes? Or do new organisms arise slowly, through the gradual accumulation of more subtle genetic perturbations?

    Today many researchers from a field that melds evolutionary and developmental biology—evo-devo—are turning their attention away from dramatic evolutionary events and toward seemingly mundane ones. They hope their work will eventually help explain how subtle genetic changes can sometimes make evolution appear to skip ahead, possibly even reconciling the positions of those who champion large-scale changes with the positions of those who pay heed to more minor variations. Their studies of butterfly eyespots, nematode sex determination, and cavefish eyes, for example, are yielding insights into how the same mechanisms might underlie both types of evolution.

    Evo-devo work hasn't always had such a mechanistic bent. When developmental biologists began delving into evolution more than a decade ago, they tended to focus on the big picture: so-called macroevolution. The early emphasis was to survey a broad range of organisms, chasing down developmental genes common to them all. That such genes existed was a startling revelation, suggesting that organisms' body plans were more highly conserved across species than people suspected.

    For a while, researchers were taken with trying to figure out how such similar genes could underpin the development of wildly different creatures. But that approach has proven limited. “You can collect lists of conserved genes, but once you get those lists, it's very hard to get at the mechanisms [of evolution],” explains William Jeffery, an evolutionary developmental biologist at the University of Maryland, College Park. “Macroevolution is really at a dead end.” The lists gave no insight into how, in the end, organisms with the same genes came to be so different. And given the evolutionary distance between, say, a fruit fly and a shark, “there isn't really an experimental manipulation to let you get at what the genes are actually doing,” says Rudolf Raff, an evolutionary developmental biologist at Indiana University, Bloomington (IUB).

    The solution, say Jeffery and others, is to focus on genetically based developmental differences between closely related species, or even among individuals of the same species. This is the stuff of microevolutionists, who care most about how individuals vary naturally within a population and how environmental forces affect this variation.

    In adopting a microevolutionary approach, these evo-devo researchers are placing themselves smack in the middle of the ongoing debate about how evolution proceeds. The fundamental question, Maryland's Eric Haag points out, is whether the mutations that result in real novelty are the same mutations that happen day to day or are the ones that occur only rarely, on a geological time scale.

    Taking this new approach will not be easy for biologists coming from the development side of evo-devo. To those who study how single cells grow into full-fledged organisms, variety within a species is more nuisance than spice of life. They traditionally study organisms with very consistent developmental trajectories to make sense of the process. But because variation is the stuff of evolution, “what developmental biologists consider noise, the [microevolutionists] consider gold,” says Raff.

    Now Raff and others with developmental backgrounds are beginning to pan for that gold, too. “We think a case can be made that this is the only way that we are going to be able to unravel the actual mechanisms by which developmental pathways diverge,” says IUB's Michael Lynch. Sometimes their prospecting yields small differences within the same developmental gene in different individuals. More often variation is turning out to be caused by differences in the way those genes are regulated.

    These efforts might help reconcile the microevolutionary and macroevolutionary mindsets: Small variations in genes involved in development might be springboards to both macroevolutionary and microevolutionary changes. Rare, major genetic events might sometimes occur, but they aren't necessary; minor genetic changes can elicit speciation events that are decidedly less glamorous but in many ways as dramatic as those favored by macroevolutionists.

    Genetically wide-eyed

    Antonia Monteiro has begun to document how minor genetic changes in butterflies can cause evolution to speed ahead. An evolutionary developmental biologist at the University at Buffalo, New York, she has been breeding butterflies to promote “evolution” in eyespots, dark patches on the wings that distract predators. In one experiment, she began with 800 Bicyclus anynana and from their progeny bred the 40 males and 100 females with the biggest spots.

    Spotting genetic diversity.

    Butterfly eyespots have the normally hidden potential to shrink or expand in just a few generations.


    The study simulated a situation in which large eyespots provide an advantage and therefore are favored by natural selection. She reversed the process in other experiments, picking out those with the smallest eyespots. Her goal was to see how much variation was already built into the butterflies' genetic repertoires.

    In September at the annual Integrative Graduate Education and Research Traineeship Symposium in Bloomington, Indiana, Monteiro reported that she saw a dramatic shift in the range of eyespot sizes in just six generations. “[We] started changing what the population as a whole looks like,” she reported. Some individuals even evolved eyespot patterns not seen in any members of earlier generations: For example, in selecting for ever smaller eyespots, her colleagues came up with butterflies with no eyespots at all. In nature, “if there was [similar] selection, one species can change into another in a very short amount of time,” she concludes. The accumulation of minor, hidden variations enables relatively large evolutionary changes, she says.
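    The generation-by-generation arithmetic behind such a shift is captured by the breeder's equation, R = h²S: the response R of the population mean equals the trait's heritability h² times the selection differential S (how far the chosen parents sit above or below the old mean). The numbers below are hypothetical, not Monteiro's data.

```python
def response_to_selection(h2, S):
    """Breeder's equation: per-generation response R = h^2 * S."""
    return h2 * S

def cumulative_shift(h2, S, generations):
    """Total shift in the population mean, assuming h2 and S stay constant."""
    return sum(response_to_selection(h2, S) for _ in range(generations))

# Hypothetical values: heritability 0.5, selected parents averaging
# 0.2 mm above the population mean eyespot diameter, six generations.
print(round(cumulative_shift(0.5, 0.2, 6), 3))  # 0.6 mm shift in the mean
```

    In a real experiment both h² and S drift as standing variation is used up, which is why selection eventually plateaus unless hidden variation, like that behind Monteiro's spotless butterflies, keeps surfacing.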

    To home in on the cause of eyespot shrinkage, Monteiro began experimenting with embryos, carrying out some “very nice manipulations,” says Jeffery. In this way she has been able to look at the basis of the variation in eyespot size. For example, she transplanted a small piece of tissue from one pupa's wing into a hole cut into the pupal wing of a butterfly destined to have a different-sized spot. The experiments showed that cells called the central signaling cells proved important: “If we put these cells into a [small-eyespot] line, they produce a very large spot,” she says.

    Like microevolutionists, Monteiro is hot on the trail of the genes behind these cells' powers, working from the few already implicated in eyespot development to the full genetic complement. She also plans to follow these genes throughout development and track down those that interact with them to help determine an eyespot's appearance. She wants to find which pathway within the genetic hierarchy—wherein one gene turns on a second gene, and so on—is more likely to vary and therefore make possible the evolution of a trait. “Is [the source of variation] in a gene high up in a developmental cascade or in a lower downstream target?” she asks. She still has a long way to go, but with these plans, “her work hits the strict definition of microevolution of development right on the head,” Haag notes.

    Worm by worm

    Scott Baird, an evolutionary developmental geneticist at Wright State University in Dayton, Ohio, has also joined the growing group of investigators studying microevolution. Instead of looking at a single component of development, such as eyespot size, he studies the destinies of certain larval cells, charting where they go and how they divide. His subjects are relatives of Caenorhabditis elegans, the nematode whose development has been tracked cell by cell and whose genome is now sequenced.

    C. elegans is a developmental biologist's dream come true. Its development is very consistent: Each embryonic cell has a specific destiny and gives rise to the same numbers and kinds of cells—the same cell lineage—in each individual. As a result, “until recently, nematodes were famously thought to be morphologically invariant,” explains Armand Leroi, an evolutionary biologist at Imperial College in London.

    But the more Leroi, Baird, and others study worm species that are closely related to C. elegans, the less consistency they see. In 1999, Leroi's examination of 13 of these relatives showed that their cell lineages were quite variable. Then Marie-Anne Felix, a developmental biologist at the Jacques Monod Institute in Paris, looked at the development of the vulva, a female sexual organ. She found that even there, where consistency would seem to be paramount to ensure proper mating, cell lineages that build the vulva varied considerably even within a species. As might be expected, such variation is more pronounced still between species.

    Felix has since tracked down several genes that underlie this variation. “Variation is there,” Baird says, “not necessarily hidden, but underutilized.” The variation doesn't seem to interfere with the worms' development and doesn't seem to lead to speciation, at least not at this point in time. But that might not always be the case, says Jeffery: “Variations in a reproductive organ could in principle be a cause of reproductive isolation and subsequent speciation.”

    Baird unearthed variation of a different sort in his studies of sex determination in worms. He hybridized two C. elegans cousins, breeding different strains of the hermaphroditic C. briggsae with different strains of C. remanei, which has male and female members. To his surprise, the reproductive system in the offspring varied depending on the strains used. One mating might yield males and hermaphrodites; another, all hermaphrodites.

    It seems that the sequences of the genes involved vary slightly, Baird has determined. That difference doesn't seem to matter when it comes to intraspecies matings. But it can cause havoc during hybridization. Baird observed that slight incompatibilities between the two species' genomes disrupted the normal determination of the sex of the offspring. “We are currently trying to map the genes responsible for that variation, and then [we] want to look at [base changes] to try to see what differences are affecting the interactions,” Baird explains. Studying hybrids, he is uncovering variation in the sex-determining pathways that might otherwise go undetected. That hidden reservoir of individual differences might allow the species to adjust to environmental changes, he speculates.

    Eyes at a price

    Even as Baird and others track down the genes that make nematodes vary, Maryland's Jeffery has a gene in hand from his microevolutionary studies of a cavefish species found in Mexico. He has been looking at two populations of Astyanax mexicanus. One group lives underground and lacks functional eyes; the other lives at the surface and sees quite well. In exploring the genetic and developmental basis of this difference, he found a tradeoff: The blind cavefish had bigger jaws and more teeth than the surface ones. These traits, it turns out, are tied to the set of genes that also determine the development of eyes.

    Sensory seesaw.

    In cavefish, the regulation of one gene tips development to favor either eyes or bigger jaws and teeth.


    This led Jeffery to think that eyes disappeared only because the bigger teeth and jaws proved so advantageous in this new environment. When he began looking for how this evolution occurred, he discovered that it didn't take major genetic changes to tip development in favor of one phenotype or the other. Instead, he and his colleagues found that a slight alteration in where a gene called sonic hedgehog was active in the developing head caused eyes to form or not form (Science, 23 June 2000, p. 2119). “A fairly small change was able to give a fairly large phenotypic result,” he points out.

    These efforts exemplify the power of studying evolution on an ever finer scale. Evolutionary researchers such as Lynch hope their developmental colleagues will be inspired to go a step further in incorporating microevolutionary ideas into their thinking. “Often what is compared is just the end [physical and physiological appearance] rather than the actual developmental pathway that led to its production,” he laments. He would like to see a more sophisticated approach in which researchers figure out the interplay between genetics and development, keeping in mind that changing either one too much or too fast will lead to organisms incapable of procreating. He also points out that factors such as the size of the population in which the variation develops and the number of genes that influence a changing trait need to be considered. Nonetheless, recent efforts signal that “people are starting to get on the same page about what needs to be done,” he says. That should help make sense of the interplay between the micro and macro sides of evolution.


    A Bonanza of Bones

    1. Erik Stokstad

    NORMAN, OKLAHOMA—Paleontologists came sweepin' down the plain to the 62nd annual meeting of the Society for Vertebrate Paleontology. From 9 to 12 October, some 1000 attendees heard about new ideas and specimens that spanned the taxonomic gamut.

    Marsupial Shoulder Restraint

    Placental mammals have evolved shoulder girdles capable of such diverse activities as powered flight, deep-water diving, and playing racquetball. Their marsupial cousins, on the other hand, never came up with forelimbs with these kinds of exotic shapes. Why? The standard answer is that the evolution of their shoulders has been hamstrung by a unique demand of marsupial reproduction: After birth, marsupials must make a life-or-death crawl to a teat in their mother's pouch, where they continue to develop.

    Now Karen Sears, a graduate student at the University of Chicago, has tested this long-standing hypothesis for the first time and confirmed it. The results are “insightful and so extremely important for understanding the pattern of marsupial evolution,” says Farish Jenkins, a vertebrate paleontologist at Harvard University.

    In preparation for their crawl, fetal marsupials develop the bones of the shoulders and forelimbs much faster than the rest of their skeleton. They even temporarily fuse the shoulder blade and collarbone to get more power for the climb. Because important muscles attach to the shoulder blade, or scapula, its shape could have a large influence on how the shoulder girdle and forelimb develop. To check whether adults really are limited in their anatomy, Sears first measured the shape of these bones in 97 families of placental mammals and 21 families of marsupials. She included 15 taxa of extinct marsupials, such as Diprotodon, a giant wombat relative from Australia, to get as full a range as possible. A battery of statistical tests showed that marsupials indeed had significantly less variety in the shape of their shoulder bones.
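    One simple way to make “less variety in shape” quantitative, in the spirit of Sears's comparison, is morphological disparity: describe each scapula by a few shape measurements and compare the average variance of those measurements across groups. The measurements and values below are invented for illustration, not Sears's data.

```python
from statistics import variance

def disparity(shapes):
    """Morphological disparity: mean per-measurement sample variance
    across a set of specimens (each a tuple of shape measurements)."""
    per_dimension = zip(*shapes)
    variances = [variance(dim) for dim in per_dimension]
    return sum(variances) / len(variances)

# Invented (length/width ratio, relative spine position) pairs.
placental_scapulae = [(1.2, 0.8), (2.0, 0.5), (0.9, 1.3), (1.6, 0.4), (0.7, 1.1)]
marsupial_scapulae = [(1.1, 0.9), (1.2, 0.8), (1.0, 1.0), (1.15, 0.85), (1.05, 0.95)]

print(disparity(placental_scapulae) > disparity(marsupial_scapulae))  # True
```

    A formal analysis (Sears ran a battery of tests) would attach a significance level to the difference, for instance by bootstrapping the disparity gap between the two groups.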

    All thumbs.

    Marsupial diversity is limited because newborns like this wallaby must develop large arms to crawl to a teat.


    Two lines of evidence suggest that this is due to constraints imposed by the crawl. First, marsupial species that crawl to the teat tend to develop their scapula in much the same way, whereas many placentals showed greater variation during their development. (Interestingly, bandicoots differed slightly from other marsupials. These small, pointy-headed marsupials fall onto the teat, rather than crawl to it, which might allow more flexibility in how their scapula develops, Sears says.) Second, the shape of the marsupial pelvis—which is not needed during the crawl and doesn't develop until afterward—has more variation than the scapula. “It's really a very clear difference,” says Marcelo Sanchez-Villagra of the University of Tübingen, Germany.

    Now that the role of the shoulder blade has been pinned down, Sanchez-Villagra says it would be interesting to look at other parts of the anatomy that are used in the neonatal crawl, such as the hands, to see if their morphological diversity is also constrained. But Jay Lillegraven of the University of Wyoming, Laramie, hopes that Sears will next compare morphological features of the brain, which he thinks will turn out to be the most important factor that helped placentals diversify.

    Dinosaur ‘Mummy’ Unveiled

    A duck-billed dinosaur from Montana is wowing paleontologists with its remarkably well-preserved state. The 77-million-year-old hadrosaur, described at the meeting, retains traces of 80% of its skin, as well as other tissues and its last meal.

    The 6.7-meter-long member of the hadrosaur family, called Brachylophosaurus canadensis, was excavated in the summer of 2000 by Nate Murphy, an amateur paleontologist at the Phillips County Museum in Malta, Montana, with help from Timescale Adventures of Bynum, Montana, a nonprofit group that runs fieldwork tours. The team found the fully articulated skeleton—with 90% of its bones—and removed it in a single 6.5-ton block. “When we get a specimen with so much soft-tissue preservation, it's a real opportunity,” Murphy says.

    Highlights of the find start with skin impressions covering most of the body. The creature's throat seems to be intact, as does what appears to be a shoulder muscle. And exposed in the chest and pelvic areas are fossilized plant remains. Dennis Braman, a palynologist at the Royal Tyrrell Museum in Drumheller, Canada, identified more than 40 kinds of plants, including freshwater algae, ferns, liverwort, and angiosperms, among the contents of the digestive system.

    Dinosaur fossils with preserved soft tissue are extremely rare, and this one might contain more information than two famous mummies discovered early in the 20th century by famed fossil hunter Charles Sternberg. “It's a beautiful specimen,” says Lawrence Witmer of Ohio University in Athens. “There's a lot you could learn from it.”

    Weighing Dinos by Bones Alone

    Almost anyone who has stared at the hulking skeleton of a sauropod or woolly mammoth has wondered how much the beast weighed. Paleontologists are even more curious, because body mass influences heart rate, temperature, population size, and many other important aspects of ancient life. At the meeting, Ryosuke Motani of the University of Oregon, Eugene, presented a new method for estimating the body mass of extinct animals from the dimensions of their limb bones. “It clarifies, simplifies, and makes things much more specific,” says Matt Carrano of Stony Brook University in New York.

    There are a variety of ways to get a rough handle on how much an extinct animal weighed. Some researchers have made plastic models, dunked them in water, and measured the displacement. Others use computer reconstructions to estimate volume. Either way, these methods require some speculation about the animal's shape. Alternatively, one can try to infer mass from the skeleton itself, say, from the shape of the bones or the length of the body. In 1973, biologist Thomas McMahon of Harvard University proposed that body mass could be calculated solely from the dimensions of limb bones (Science, 23 March 1973, p. 1201), because all limbs must bear weight without buckling. But when researchers tested the relation in living animals, it didn't hold up.

    Looking back, Motani realized the reason for the failure: McMahon's equations took into account only the weight of the limb itself, whereas in fact, limbs bear the weight of the entire body. After rewriting the equations, Motani checked them with limb measurements from 94 mammals of known body mass belonging to 12 orders and ranging in size from shrews to an elephant. The limb dimensions correlated extremely well with body mass. “That means limb bones may be used to infer body mass,” Motani says.
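The report doesn't give Motani's rewritten equations, but the general idea of calibrating a power-law (allometric) relation between limb-bone dimensions and body mass, then using it to predict mass for a new skeleton, can be sketched with a simple log-log regression. Everything below is illustrative: the data values are invented, and the function names are not from Motani's work.

```python
import math

# Hypothetical calibration data: (limb-bone circumference in mm, body mass in kg)
# for living mammals of known mass. These numbers are invented for illustration.
samples = [
    (10, 0.05),     # shrew-sized
    (40, 2.0),
    (120, 60.0),
    (300, 900.0),
    (600, 5000.0),  # elephant-sized
]

def fit_power_law(data):
    """Least-squares fit of log(mass) = a + b*log(circumference),
    i.e. mass = exp(a) * circumference**b, a standard allometric model."""
    xs = [math.log(c) for c, _ in data]
    ys = [math.log(m) for _, m in data]
    n = len(data)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return math.exp(a), b      # (coefficient, exponent)

def predict_mass(coeff, exponent, circumference):
    """Apply the fitted relation to a fossil limb bone of known circumference."""
    return coeff * circumference ** exponent

coeff, exponent = fit_power_law(samples)
print(f"mass ≈ {coeff:.3g} * circumference^{exponent:.2f}")
```

With real data, the quality of such a fit across many species and body sizes is what justifies using limb bones alone to estimate the mass of extinct animals; Motani's contribution, per the report, was correcting the load term so the relation actually holds across living mammals.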

    Applying his equations to long-extinct animals, Motani tried to estimate the mass of the colossal dinosaur Brachiosaurus. This sauropod is unusual because of its long upper arm. Motani's method yielded an upper limit of 36 metric tons. That's about 30% less than usual back-of-the-envelope estimates, which typically rely just on bone circumference. One reason for the difference is that the new equations take into account the fact that longer bones break more easily than shorter bones of the same circumference.

    The method will work for any animal that held its limbs erect rather than sprawled out to the sides like a lizard's. “I think it's really good,” says Don Henderson of the University of Calgary, Canada, who constructs computer models of dinosaur locomotion. He says the method could be a useful independent check of other methods. And although no one approach is perfect, Motani has come up with a definite improvement. “It gives a truer picture of the real situation,” Henderson says.
