News this Week

Science  13 Jun 2003:
Vol. 300, Issue 5626, pp. 1634

    Environmental Report Paralyzes Italian Neutrino Lab

    Alexander Hellemans* and Charles Seife
    *Alexander Hellemans is a writer in Naples.

    NAPLES—Research at the world's largest underground facility for the study of subatomic particles from space, Italy's Gran Sasso National Laboratory, has ground to a near-halt following a court-ordered report that said the lab's drainage system could contaminate local water supplies. A local court has ordered one experimental area closed indefinitely, and the government agency that runs Gran Sasso has halted most experiments throughout the lab. But physicists there say that fears are exaggerated and that the shutdown could set their research back by months or more. “This is a disaster for us,” says Gianpaolo Bellini of the University of Milan, a spokesperson for a solar-neutrino detector under construction at the lab called Borexino.

    The laboratory, a 6000-square-meter complex shielded from cosmic radiation by 1400 meters of rock, was built in the mid-1980s alongside a 10-kilometer-long highway tunnel under Gran Sasso Mountain, between Rome and the Adriatic Sea. It cools its equipment with water from the same aquifer that supplies drinking water to two nearby towns, L'Aquila and Teramo. But despite its sensitive position, for years environmental tensions stayed below a simmer.

    Then, last August, two researchers working on the Borexino experiment spilled pseudocumene (1,2,4-trimethylbenzene), a liquid chemical that scintillates when a neutrino strikes. The Borexino detector uses 300 tons of the fluid to spot neutrinos coming from reactions of beryllium-7 in the sun. “They quickly realized that pseudocumene was being drained from the vessel. They kept the incident to 50 liters and reported it immediately to the lab authorities and local authorities,” says Frank Calaprice, the principal investigator of Princeton University's contribution to the Borexino project. The volatile chemical quickly evaporated, but some of it drained into a nearby creek, where picnickers smelled it and it killed a fish, Calaprice says.

    High and dry?

    Gran Sasso National Laboratory put experiments on hold after a chemical spill raised fears that the lab might be leak prone.


    Environmentalists seized on the incident and took the lab to court. A tribunal in Teramo ordered an independent engineer to prepare a report on environmental safety conditions at the lab. Meanwhile, Italy's high-energy physics research institute, the National Institute of Nuclear Physics (INFN), ordered the lab to halt any work involving pseudocumene in Hall C, the area where the spill took place.

    The 10,000-page report was released on 29 May. It acknowledged that no traces of trimethylbenzene had been found in local drinking water but concluded that chemicals spilled in the future could find their way into the local drinking water and river. In response, the Teramo tribunal immediately sealed off Hall C indefinitely, stopping all activities except safety checks, and INFN banned the handling of fluids throughout the lab.

    Some experiments—such as Icarus, a detector for muons produced by cosmic ray interactions in the atmosphere—do not require the handling of liquids and are still running, reports Gran Sasso laboratory director Alessandro Bettini. But others, such as Borexino, are in limbo. “I'm worried it could take a long time, but I have no idea,” says John Wilkerson, a physicist at the University of Washington, Seattle, who sits on Gran Sasso's review board. “I hope that the government and INFN will resolve the issue quickly, or there will be damage to the [Borexino] project and damage to the lab.” In the meantime, he says, “most of the experiments are certainly on hold.” Adding to the uncertainty, Bettini is stepping down this month. Wilkerson thinks major decisions about making the lab fully operational again will wait until Bettini's successor, Eugenio Coccia of the University of Rome, comes on board.

    “Borexino is the best hope, in the near future, of seeing beryllium-7 neutrinos,” says Wick Haxton, a physicist at the University of Washington, Seattle. “This is a very bad time for them.”

    GNO, a gallium-based solar-neutrino detector that has been operating since 1998, is also shut down while the ban on liquid operations is in place; it uses a tank of gallium chloride to detect neutrinos, Calaprice says. It's not clear whether the ban applies to the several experiments that are cryogenically cooled by liquid nitrogen, he adds.

    Bettini says he hopes to persuade the local judiciary to reopen Hall C. INFN is prepared to appeal to Italy's Supreme Court of Cassation in Rome, he says. A recent meeting with government officials left him optimistic that the lab's piping system that evacuates excess water will be inspected during the next few months, he says. Bettini also says the lab has received approval to build an extra safety tunnel that would allow better access to the laboratory as well as safer evacuation of water and its treatment in external basins in case of accidental contamination.

    To Calaprice, the concern over leak plugging is only half the story. “I'm concerned in the long term, one, how long will it take to fix the technical problem?” he says. “Two, how long will it take to regain the trust of the locals and for the local authority to stop putting seals here and there?” Until Gran Sasso gets on better terms with its neighbors, the headaches will almost certainly continue.


    Academia Gets No Help From U.S. in Patent Case

    David Malakoff

    There's no end in sight to a bitter legal battle over the use of patented tools in academic research. The U.S. government's top lawyer has recommended that the Supreme Court reject pleas from Duke University and other institutions to review a lower court ruling that could end a 190-year-old practice of allowing academic scientists free use of such technologies in basic research.

    The suggestion from Solicitor General Theodore Olson, which is likely to be adopted, would send the battle back to a lower court—and possibly into Congress. That would leave universities with the unappealing prospect of having to seek licenses for all of the patented tools their researchers use—or risk potentially costly lawsuits. “This is going to wreak a fair amount of turmoil,” predicts David Korn of the Association of American Medical Colleges (AAMC), one of several groups allied with Duke.

    The justices asked for the government's views after Duke appealed a specialized patent court's ruling* in a long-running dispute with a former faculty member (Science, 3 January, p. 26). Physicist John Madey, the inventor of the free-electron laser, claims that the Durham, North Carolina, school improperly removed him in 1998 as head of a research center and has since infringed on his patents by using his lasers and other equipment without permission. Madey, who now works at the University of Hawaii, Manoa, also wants Duke to ship the devices to his new lab.

    In 1999 a lower court sided with Duke, ruling that the university wasn't infringing because its scientists were involved in noncommercial studies. That so-called research exemption is rooted in an 1813 case that suggested researchers could freely tinker with patented devices if the work was “for amusement, to satisfy idle curiosity, or for strictly philosophical inquiry.” But last October a federal appeals court sided with Madey, calling Duke a businesslike entity that profited from the use of the lasers. The research “unmistakably further[ed Duke's] legitimate business objectives, including educating and enlightening students and faculty” and helped it “lure lucrative research grants,” wrote U.S. Circuit Court of Appeals Judge Arthur Gajarsa. He also noted that Duke and other universities increasingly act like commercial enterprises, profiting from their own patents and routinely suing infringers.

    Outside opinion.

    Solicitor General Olson says academia's worries about patent ruling are overblown.


    The finding stunned many university administrators, who predict that it will slow academic research and increase costs. In its January appeal to the Supreme Court, Duke argued that the decision “turned long-settled law on its head … and effectively eliminated the experimental use exception for research institutions.”

    The solicitor general, however, suggests that Congress, not the courts, should resolve the debate. In a 19-page brief filed on 30 May, government lawyers argue that academia's worries are overblown. They note that industrial research has “proceeded at a rapid and increasing pace” without any exemption and that Gajarsa's ruling “is generally in line with” past rulings that have narrowed the exemption. “If problems materialize” for universities, they conclude, “Congress may be the proper forum [in which to] … devise a comprehensive solution.”

    Duke is preparing a response to the brief. Legal experts expect the justices to follow the solicitor general's advice and send the case back to a North Carolina district court, where the university will get another chance to argue that it is involved in “philosophical inquiry.” But university groups are skeptical that the lower court will settle the broader patent issue. And they doubt that Congress will step into the fray anytime soon. Several recent legislative proposals to enact a broad research exemption based on European models have sunk with little trace because of the paucity of evidence that inventors have demanded payments from academics for their use of patented inventions in basic research.

    That situation could soon change, AAMC's Korn and others predict. Emboldened by last fall's ruling, inventors have already begun asking researchers for hefty royalties to use an array of basic tools—from genetic probes to catalytic molecules—once considered free for the picking. Universities worry that resisting such requests could, as with Duke, land them in court.

    *Madey v. Duke University, No. 01-1567, Federal Circuit Court of Appeals, 3 October 2002.


    Review Gives Embattled MRC a Boost

    Daniel Bachtold

    CAMBRIDGE, U.K.—The British government has leapt to the defense of its beleaguered Medical Research Council (MRC), the U.K.'s main public funder of biomedical research. In a report last week, the Department of Trade and Industry (DTI) rebutted allegations of financial mismanagement at MRC. But the report also says that the council unwisely kept researchers in the dark about its precarious funding situation.

    DTI's forgiving tone contrasts with the harsh words from the U.K. House of Commons' science and technology committee, which claimed in March that the council had left scientists in the lurch after it “gambled” on hefty budget increases that never materialized (Science, 28 March, p. 1958). After awarding a large number of new grants in 1999–2000, a fiscal boom year, the council was forced to slash funding for new grants over the past 3 years from $342 million to $98 million to meet its commitments. MRC, which spends about $700 million annually on research, was also rebuked for failing to adequately consult with the research community before going ahead with Biobank, a $70 million project aimed at linking the genetic makeup of Britons with lifestyle factors.

    DTI concluded that MRC's decision to boost the number of grants in 1999–2000 was made using the “best information available at the time.” DTI also defended MRC's handling of Biobank, saying the council and its co-funders relied on international experts to avoid potential conflicts of interest among domestic scientists. Furthermore, DTI contends, the community's anger over the recent funding crunch is not the council's fault: “There are more high-quality research opportunities than there are funds … and disappointment in some sections of the community is inevitable.”

    MRC chief executive George Radda says the council learned a lesson from the precipitous dip in funding for new grants and is “refining [its] financial planning processes.” In future years, MRC intends to hold back up to 10% of its budget to ensure that it can fund a steady number of new awards even in tough financial times.

    Even so, legislators aren't backing down from their earlier criticism. Biologist Ian Gibson, chair of the House of Commons' science committee, thinks his panel has “stimulated a debate and has won widespread support.” His goal, he says, is not “winning the argument” but “making sure that medical research in Britain stays at the top.” That will require an institutional transformation, says Gibson, who's counting on Colin Blakemore “to change the culture of the MRC” once he takes over as chief executive on 1 October.


    U.K. Probes Public Opposition to GM Crops

    Quinn Eastman*
    *Quinn Eastman is an intern in the Cambridge, U.K., office of Science.

    CAMBRIDGE, U.K.—This month, Britons are getting their chance to say whether they want genetically modified (GM) products in their food. The U.K. government is sponsoring a series of six public debates around the country to give interested parties and members of the public a chance to voice their opinions on GM crops. Malcolm Grant, head of the debates' organizing committee and pro-vice chancellor of Cambridge University, calls them a “unique experiment to find out what ordinary people really think.”

    The debates, which are scheduled to end this week, are part of a coordinated series of events to help the government decide later this year whether to allow planting of GM crops in the United Kingdom. Like other European Union (E.U.) countries, Britain has maintained a moratorium on approving new GM crops for commercial planting since 1998. But pressure for an official E.U. policy is mounting from the U.S. government, agbiotech companies, farmers, consumers, and environmental pressure groups—although they mostly want different things.

    Farm minister Margaret Beckett has promised that the government will listen to the debates and provide a detailed written response. At the same time, the prime minister's strategy unit is preparing an economic analysis of various options, including attempting to remain GM-free; it will present its report this summer. And David King, the government's chief scientific adviser, is leading a review of agricultural biotechnology that will include the results of the government-sponsored Farm Scale Evaluations (FSEs): the world's biggest ever experiment into the safety and environmental impact of using herbicides with GM crops, due to finish in July.

    Direct action.

    Protesters tear up genetically modified rape plants in government-sponsored trials last year; 80 were arrested.


    The British government hopes that these evaluations and the FSEs will show that it is safe to go ahead with GM crops. But even if they do, judging by the first of the debates, held in Birmingham last week, the government will have a tough time persuading the public.

    Participants mostly scoffed at claims that GM crops could reduce pesticide use and increase yields, benefiting farmers in developing countries. Discussion focused mainly on safety and risks to the environment. Sue Mayer, head of GeneWatch UK, a genetic technology watchdog group, asked about the possible allergenic effects of transgenes being used in plants. Organic farmers expressed concern about contamination: They wanted to be able to assure their customers that their products were GM-free. Patrick Noble, an organic farmer, agreed on the need for caution and more scientific information. “All I see is the development of technology, as opposed to real science,” he said. Such attitudes are not surprising: Polls in the United Kingdom indicate that opponents of GM crops outnumber supporters four to one.

    At least some supporters of GM crops came away disappointed. “I wish that there had been more scientific information available,” complained farmer Cecil Thomas. Thomas is used to hearing strong opinions about GM crops. Two years ago, he agreed to plant a strain of herbicide-resistant corn on his farm near Coventry, as part of the FSEs. But local opposition and objections from farmers associated with the nearby Ryton Organic Gardens, who were concerned about contamination and cross-pollination, prevented him from joining the trials. At the Birmingham debate, Thomas faced off against some Welsh organic farmers and members of Friends of the Earth, an environmental advocacy group. “Conventional farmers like me need to prevent the discussion from being overwhelmed by the anti-GM brigade,” he says.

    The British government is not alone in grappling with these issues. The U.S. government has been urging the E.U. to lift the GM moratorium in the interests of free trade, and last month it filed a complaint with the World Trade Organization in an effort to force the E.U. to do so. Although no government has officially acted, there are signs that the anti-GM consensus may be breaking down. Spanish farmers already grow corn engineered to produce toxins against corn borer pests, and the Italian government is lobbying for the E.U. to restart trade in GM agricultural products.

    But even if the moratorium is abandoned, the European Parliament last week approved a protocol allowing individual countries to refuse import of GM agricultural products if they believe safety evidence is lacking, and in July it will vote on a bill establishing strict labeling and traceability requirements. It promises to be a long, hot summer in the Euro GM wars.


    Retrenchment at the Max Planck Society

    Gretchen Vogel

    HAMBURG—The Max Planck Society is being forced to shrink its operations after more than a decade of expansion. In response to a budget freeze this year, the premier German research organization will close 12 of its 270 departments by 2007 and has scrapped plans for nine others. One complete center, the Institute for Experimental Endocrinology in Hannover, will close; most of its researchers will move to the Institute for Biophysical Chemistry in Göttingen. In addition, research budgets in existing departments are being cut by as much as 15%.

    The “consolidation plan,” announced here at the organization's yearly meeting last week, is a major turnaround for the society, which has opened 18 new institutes in the past 12 years, mostly in the former East Germany. But as construction in the east wound down and the German economy slowed, the customary 5% yearly budget increases dwindled to 3.5% annually between 2000 and 2002. Late last year, the federal government announced a flat budget for 2003 for Max Planck and several other research organizations.

    “It is a significant step backward for the society,” says Christiane Nüsslein-Volhard of the Institute for Developmental Biology in Tübingen. “If we don't watch very carefully,” she says, the cuts could have serious long-term effects. The society thrives on its ability to open new institutes and departments in response to scientific developments, she says, but at the moment, “if we came up with a new idea, it would not go forward.” Developmental geneticist Herbert Jäckle, one of the society's vice presidents, agrees: “The potential to hire young, innovative people and to seed a little bit into new fields becomes very limited. Our strength is that in principle we can close down a place and open something new. Right now that opportunity is diminished.”

    Victim of consolidation.

    Budget cuts will shut down the Max Planck Institute for Experimental Endocrinology in Hannover.


    The decisions about which departments to close were made more on practical than on scientific grounds, Nüsslein-Volhard says. The society has little room to maneuver, because the scientific members who head departments are guaranteed a job until retirement. Of the roughly 40 directors who plan to retire in the next 4 years, 12 will not be replaced. The Max Planck Institutes for Physics and for Quantum Optics in Munich will each lose a department, as will two in Heidelberg, the Institutes for Medical Research and for Nuclear Physics. The Institutes for Solid State Research and for Metals Research in Stuttgart will each lose a department. Five more departments will be closed at the Institutes for Neurological Research in Cologne, Radiation Chemistry in Mülheim an der Ruhr, Experimental Medicine in Göttingen, Radio Astronomy in Bonn, and Psycholinguistics in Nijmegen, the Netherlands. The Institute for Limnology's river research station in Schlitz, central Germany, will be closed when its current director retires in 2007.

    The nine new departments whose plans are being shelved include a new position planned at the Institute for Infection Biology in Berlin, which will be postponed until at least 2008, says Jäckle. At the Research Center for Ornithology in Radolfzell, he says, only two of the three planned director positions will be filled.

    Even if the budget picture brightens, the consolidation may continue, Jäckle warns. Because of Germany's employment laws, salary costs—and yearly increases—are fixed, and they take up about half of the society's $1.5 billion budget. New departments typically cost as much as a third more than their predecessors, he says, so the society may be able to recruit only two new directors for every three retirees.


    U.S. Monkeypox Outbreak Traced to Wisconsin Pet Dealer

    Martin Enserink

    The field of infectious diseases can't stay off the front pages these days. No sooner do reports of SARS subside than a new surprise comes along: Monkeypox, a cousin of smallpox that normally dwells in the rainforests of Western and Central Africa, has popped up in towns in three states in the U.S. Midwest, infecting pet prairie dogs and their owners. It is the first time that a monkeypox outbreak has been seen in the Western Hemisphere.

    As Science went to press, four people had been diagnosed with the disease, and another 33 possible cases were being investigated. Although monkeypox historically has killed 1% to 10% of its human victims, none of these patients has died. The epidemic, coinciding with the start of the U.S. West Nile season, had public health officials and virologists reeling, and, once again, changing tack. “SARS was the virus du jour,” says poxvirus expert Peter Jahrling of the U.S. Army Medical Research Institute of Infectious Diseases in Fort Detrick, Maryland. “Now, it's taking a back seat.”

    The virus in this outbreak was first isolated and recognized as an orthopoxvirus last week by Kurt Reed, a pathologist at the Marshfield Clinic in Marshfield, Wisconsin; subsequently, samples were rushed to the Centers for Disease Control and Prevention (CDC) in Atlanta, where genetic tests nailed it as the monkeypox virus on 7 June.

    Index case.

    After being bitten by a sick prairie dog, a child in Wisconsin developed monkeypox.


    In a report posted on its Web site, CDC says that the first patients got sick in early May, which led some experts to wonder why, in an era of smallpox jitters, it took more than a month to recognize a disease with symptoms similar to those of smallpox. But Wisconsin state epidemiologist Jeffrey Davis says the first patient—a young girl—didn't become ill until 15 May; besides, he says, she developed a rash on her finger after a bite from her prairie dog, so plague and tularemia, both carried by the critters, were more likely candidates. “The Marshfield did everything extraordinarily rapidly,” Davis says.

    Monkeypox derives its name from a series of outbreaks among lab monkeys in the 1950s and 1960s, but the virus is known to infect other primates, several rodent species, and rabbits as well. Its natural reservoir is still unknown, but squirrels are prime suspects, says Joel Breman, a former poxvirus research director at the World Health Organization who is now at the Fogarty International Center in Bethesda, Maryland.

    Almost all of the suspected monkeypox patients in Wisconsin, Illinois, and Indiana appear to have contracted the disease from prairie dogs that they had recently purchased as pets. An investigation has linked all the sick animals back to a distributor in Milwaukee, Wisconsin, who sold animals to two pet shops and at a “pet swap meet” in northern Wisconsin. The distributor also had owned a sick Gambian giant rat, which may have been the source of the virus, CDC says.

    Humans have become much more susceptible to orthopoxviruses since smallpox vaccination was halted worldwide in the 1980s. Monkeypox has been a particular concern; some scientists worry that the virus may fill the niche vacated by smallpox if it evolves and becomes more readily transmissible among humans. That fear was fueled by a 1996–97 outbreak in the Democratic Republic of the Congo in which human-to-human transmission appeared to be very prevalent (Science, 18 July 1997, p. 312). But there's no evidence yet that monkeypox is replacing smallpox, Breman says, and there are no known cases of human-to-human transmission in the current outbreak.

    It would be bad enough, however, if the virus became established among U.S. pets or wild animals, creating yet another permanent human health hazard. That's why authorities are now feverishly trying to trace every infected prairie dog, says Davis, hoping none of them will be released into the wild. To gauge the potential for spread in pets, Jahrling is planning to study hamsters' and gerbils' susceptibility to the virus. Says Reed: “We're still hoping this is a one-shot deal for the history books.”


    Oldest Members of Homo sapiens Discovered in Africa

    Ann Gibbons

    For more than a century, scientists have wondered when and where modern humans arose—and what they looked like. Genetic evidence has pointed firmly to an origin in Africa within the past 200,000 years, but there have been few fossils from the right time and place to back that up.

    Now three partial skulls from Ethiopia are putting a face on the earliest modern humans. These ancestors were African, with big brains, robust features, and a taste for hippopotamus and buffalo meat. Dated to 154,000 to 160,000 years ago, the skulls are the first strong fossil evidence that modern humans originated in Africa. “This is the oldest clear example of an early modern human we have found,” says paleoanthropologist Chris Stringer of the Natural History Museum in London.

    The skulls belonged to two men and a child, and they are being introduced on the cover of this week's issue of Nature as the immediate ancestors of modern humans. A team of American and Ethiopian researchers says that the face of the most complete skull already has a “modern gestalt.” “It is so similar to ours that there is little doubt it is the face of a direct ancestor,” says co-author and paleoanthropologist Tim White of the University of California, Berkeley. The skulls also have cut marks, suggesting that they were defleshed and handled after death, perhaps in rituals for the dead.

    Until now, the oldest undisputed modern human remains were skulls dated to 90,000 to 120,000 years ago. They were found not in Africa but in the caves of Skhul and Qafzeh in Israel. The oldest anatomically modern Africans are skull fragments from South Africa dated to 100,000 years ago. Another half-dozen African fossils thought to be 130,000 to 300,000 years old are poorly dated or fragmentary, and they are not enough to prove the leading model that modern humans arose in Africa, says Stringer.

    The best evidence for this Out of Africa model has come instead from DNA. Geneticists have consistently traced the oldest types of modern human DNA to ancestors who lived in Africa in the past 200,000 years. Most recently, population geneticist Sarah Tishkoff of the University of Maryland, College Park, reported at a meeting in April that the oldest versions of maternally inherited DNA arose 170,000 years ago and are found in the Sandawe people of Tanzania and the !Kung San of the Kalahari desert, who both may have roots in northeastern Africa.

    Now, for the first time, fossils fit the genetic data. The new skulls were discovered on a quick stop in November 1997, when White spotted a butchered fossil hippopotamus skull on the ground near the village of Herto, about 230 kilometers south of Addis Ababa. When the team returned to explore, White sent graduate student David DeGusta and Turkish paleontologist Cesur Pehlevan to survey while he set up a tarp for shade. Both found skulls before lunch. A week later, Berhane Asfaw of the Rift Valley Research Service in Addis Ababa found a child's cranium, shattered into 200 pieces.

    Modern look.

    This skull of an adult male from Ethiopia is about 160,000 years old, but it already looks like a modern human's.


    It took the team 3 years to clean, prepare, and reassemble the fossils. Then White and Asfaw compared the skulls with 6000 others from around the world. They concluded that the most complete adult skull was clearly a Homo sapiens, with a pentagonal-shaped vault, wide upper face, and moderately domed forehead. It also had divided brow ridges and a flat midface like modern humans. And the skull's vault was huge: at 1450 cubic centimeters, slightly above the modern human average. But a few primitive features, such as a flexed bone at the rear of the braincase and protruding brows, link it with more ancient African fossils. The team concluded it was a near-modern human and named it H. sapiens idaltu, using the word for “elder” in the local Afar language.

    The site also yielded a surprising mix of stone tool technologies. The team found crude stone hand axes as well as stone flakes produced by a more efficient and sophisticated toolmaking technique. Another puzzle is that the child's skull, in addition to cut marks, had polish on its side and back. “This is some kind of mortuary practice extending well beyond the death of the individual,” says White, who adds that the marks most resemble those seen on skulls handled in rituals in New Guinea.

    The team next faced the problem of dating the skulls, which was difficult because two were found on the surface and all were beyond the range of carbon dating. However, the most complete skull was found embedded in ancient cemented sands. Geochronologist Paul Renne of the Berkeley Geochronology Center used isotopes of argon to date volcanic material in the same layer as the fossil to 160,000 years ago. But the ash in the layer above the fossil, which would give an upper bound on its age, was contaminated with crystals from older eruptions. So other team geologists traced the volcanic layer to a site dated reliably to 154,000 years ago.

    Although a few colleagues are grumbling about whether the subspecies designation is needed, no one disputes that the new fossils are early H. sapiens. “This is a great discovery because there is no doubt these fossils are the forerunners of the early modern people at Skhul and Qafzeh,” says Harvard University archaeologist Ofer Bar-Yosef, co-discoverer of the Qafzeh fossils.

    Researchers also agree that the new fossils do not resemble Neandertals, whose lineage evolved in Europe from 400,000 years ago to about 30,000 years ago. That's important because a minority view of modern human origins holds that living people inherited their DNA primarily from modern humans coming out of Africa but also from Neandertals they met in Europe and other archaic humans in Asia. “It does confirm that Neandertals were not part of the direct ancestry of early modern humans,” says Erik Trinkaus of Washington University in St. Louis, Missouri. He adds that it is still possible that these modern Africans interbred later with Neandertals or other archaic people.

    The new skulls come from a site where the Middle Awash team has found a remarkable sequence of early human fossils, dating to 1 million years, 600,000 years, and now 160,000 years ago. The sequence shows that species of Homo were living in this corner of Africa off and on for a million years, and that they were evolving more modern features over time. “Now we have a great sequence of fossils showing our species evolved in Africa, not all over the globe,” says White.


    Another Middle East Showdown

    Richard Stone

    Next week, the IAEA board will discuss concerns over Iran's nuclear program, including a confidential report claiming it has violated a nonproliferation agreement; pressure is mounting for sweeping inspections

    VIENNA—Last February, atomic inspectors saw with their own eyes what intelligence sources had been rumbling about for months: that Iran appears to have achieved a major milestone on the road to nuclear statehood. Near the town of Natanz, 250 kilometers south of Tehran, a delegation from the United Nations' nuclear watchdog, the International Atomic Energy Agency (IAEA), took stock of a uranium-enrichment facility that is under construction. The technological leap evident that day was row upon row of gleaming new gas centrifuges for separating uranium hexafluoride by isotope: a high-tech way to cull uranium-235 from its abundant but less fission-prone cousin, uranium-238. Only a handful of countries are capable of pulling off this atomic alchemy.

    It's unclear how Iran acquired the know-how to manufacture the high-precision machines. A more urgent question, however, is what it will do with them. Iran insists its aim is to produce low-enriched uranium fuel for a series of power plants it hopes to build with Russian help. The United States and like-minded countries argue that Iran intends to produce highly enriched uranium—containing 20% or more of uranium-235—the raw stuff of nuclear bombs. The truth could have momentous consequences for stability in the Middle East.

    As a result, pressure is mounting on Iran to give IAEA carte blanche to fully inspect its nuclear program. At a press conference on 3 June, following the G8 summit in Evian, France, President Vladimir Putin said that Russia—which had aided Iran and long discounted it as a proliferation threat—“will insist that all Iranian programs in the nuclear sphere are overseen by [IAEA].” U.K. Prime Minister Tony Blair, in remarks to Britain's Parliament the same day, revealed that Putin had assured other G8 leaders in private that Russia would suspend cooperation on the centerpiece of Iran's nuclear power program, a $1 billion light-water reactor facility in Bushehr, if Iran fails to give IAEA far-reaching authority to monitor nuclear activities throughout its territory.

    The diplomatic maneuverings intensified later in the week, when a confidential report from IAEA declared that Iran had violated an international agreement by importing and processing uranium. Iran's top atomic official has denied the allegations and said his country would issue a detailed rebuttal at a meeting of IAEA's governing board next week.


    A representative of the National Council of Resistance of Iran points to a map of the once-secret enrichment facility near Natanz a day before an IAEA visit.


    Western officials are hoping that Putin will lean on his powerful atomic ministry to postpone shipments of low-enriched uranium to the Bushehr reactor, due to come online in 2005, until Iran signs a new protocol to the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) giving IAEA sweeping powers to conduct inspections of a broad range of facilities on short notice. As Science went to press, the Iranian government was resisting the demand, and there were conflicting signs about whether Russia's atomic establishment would fall in line.

    Nuclear loopholes

    When the NPT was signed in 1968, the five nuclear powers—China, France, Russia, the United Kingdom, and the United States—negotiated a free pass that shielded their weapons programs from IAEA inspection but placed under safeguards civilian facilities that use or store nuclear materials. Nonweapons states that signed the treaty entered an agreement to place all their nuclear facilities and materials under IAEA watch.

    For years the safeguards agreement was seen as sufficient to monitor treaty compliance. It compels countries to list facilities that could be used to produce nuclear weapons and open them up to IAEA inspection, and it gives IAEA authority to monitor stocks of uranium and plutonium. But the agency's “conclusions are limited to what a country declares,” says Jill Cooley, director of concepts and planning in IAEA's Department of Safeguards at the agency's headquarters in Vienna. IAEA has no authority to inspect facilities that are not on the list.

    Iraq showed how easily an NPT country could exploit this loophole by remaining silent; its clandestine nuclear weapons R&D efforts came to light after the 1991 Gulf War. “Iraq was a wake-up call,” says nuclear chemist David Donohue, head of the Clean Laboratory Unit at the agency's Safeguards Analytical Laboratory in Seibersdorf, Austria. Now, some argue that the same subterfuge is being used in Iran.

    Late last week, members of IAEA's governing board—representatives of 35 of the agency's member states, including Iran—received their copies of a confidential report on Iran's nuclear program prepared by IAEA experts. A senior IAEA official says the document lays out what is known of Iran's nuclear program but leaves it to the board members to conclude whether any “anomalous activities”—agency jargon for potential weapons development—can be interpreted as a nascent weapons program.

    The report does assert that Iran is in violation of its safeguards agreement. One diplomat who has seen the document confirms that it accuses Iran of failing to account for nuclear material, report its processing and use, and declare facilities where the material is stored and processed. The official, from a country with close ties to Iran, says that the most serious allegations involve what he calls a “trivial” amount of uranium—tens of grams—that Iran has not accounted for and is said to have processed into a metallic or gaseous form. He rejected claims in the media that uranium had been enriched.

    As Science went to press, an IAEA team was in Tehran to attempt to resolve the discrepancies. It is examining Iranian documentation and will take environmental samples for its Seibersdorf laboratory to test for enrichment, according to a source close to the mission. Much is at stake: If Iran fails to clear up the concerns, the IAEA board could declare it out of compliance with the NPT, which could trigger sanctions.

    Sanctions could slow or even imperil Iran's already delayed 1000-megawatt Bushehr reactor project, which is under IAEA safeguards. Iranian officials have said they hope to build a second such reactor and eventually generate 20% of their electricity supply at nuclear plants. Russia's Ministry for Atomic Energy (Minatom) and Iran are wrapping up negotiations on an agreement to return the reactor's spent fuel rods to Russia, which would prevent Iran from reprocessing them to obtain plutonium.

    Under scrutiny.

    The Bushehr complex is already under safeguards, but pressure is on Iran to sign an additional protocol that would greatly expand the scope of IAEA inspections.


    More worrisome is a facility that is not part of the deal with Russia: the Natanz enrichment plant. “Bushehr was never the real problem, as is now revealed,” says Kenneth Luongo, executive director of the nonprofit Russian-American Nuclear Security Advisory Council in Washington, D.C., and Moscow.

    The Natanz facility came to light last August only after the National Council of Resistance of Iran smuggled information out of the country. The facility's two main centrifuge “cascade” halls are being built underground, satellite images confirm; some observers speculate that this design is meant to protect the facility from the kind of aerial bombardment that Israel used to take out Iraq's Osirak nuclear facility in 1981.

    After months of delays, Iranian officials allowed an IAEA group led by Director-General Mohamed ElBaradei to visit the site on 21 February. The team counted roughly 160 assembled centrifuges and saw components in the vast hall for hundreds more. “ElBaradei was really stunned when he saw what was in that place and how advanced it was,” says one observer. A U.S. government official told Science that the machines are thought to be based on a Chinese design adapted for Pakistan's nuclear weapons program.

    Iran may not have contravened the NPT by keeping Natanz under wraps. The safeguards agreement, in force in Iran since 1974, mandates only that countries divulge design information on such a facility 180 days before it receives nuclear material. During ElBaradei's visit, Iran agreed to turn over design data much earlier, becoming the last NPT party to sign such an agreement. It has since provided preliminary design information on Natanz to IAEA, which is now drawing up a safeguards plan for the site.

    Sweeping new powers

    One consequence of the Iraq “wake-up call” was an effort to strengthen IAEA's ability to sniff out clandestine attempts to enrich uranium or extract plutonium from irradiated fuel rods. The result: The IAEA board approved a new protocol to the NPT in May 1997. The protocol gives IAEA the right to demand access to any facility linked to a nuclear program—not just those that a country has declared—and take samples for analysis. It requires only 24 hours' notice—or less in some instances. States that cooperate also agree to report to IAEA information on all R&D activities that have a connection to nuclear energy. (Theoretical and basic research is exempt.)

    The catch, of course, is that IAEA has these powers only in countries that have agreed to the protocol. Although 73 countries have signed on, only 35 have ratified it to date—and Iran is not one of them. Even if Iran accepts a protocol, some fear that it could suddenly abandon the treaty, as North Korea has done, and reprocess spent fuel to extract plutonium or enrich uranium to weapons grade. That the NPT allows the production and stockpiling of fissile material at all is “a gaping hole” in the treaty, contends Luongo.

    Nevertheless, if Iran were to sign the new protocol, IAEA would be able to monitor events at Natanz more closely. IAEA would also have the authority to monitor a heavy-water production plant at Arak and investigate recent allegations by the Iranian resistance movement that there are two more enrichment labs near Tehran.

    The next few weeks, as nuclear experts digest the IAEA report, are likely to be filled with intense behind-the-scenes diplomacy. At a press conference on 2 June, Hamidreza Assefi, a spokesperson for Iran's foreign ministry, declared that “nuclear weapons have no place in Iran's agenda.” But he suggested that Iran would consider signing the additional protocol only if the United States lifted sanctions targeting its nuclear program: “We will not sign any other international treaty as long as the West does not respect its obligations outlined by the NPT and does not help us with nuclear technology as the NPT obliges them to.”

    A lot could rest on whether Putin follows through on his pledge to G8 leaders to use the Bushehr agreement as leverage to persuade Iran to open up its nuclear program. But late last week Minatom chief Alexander Rumyantsev was sticking to his guns: At a press conference on 4 June, he stated that uranium deliveries to Bushehr would begin by next year and would not be linked to Iran signing the additional protocol. The IAEA board next week will stiffen either Putin's resolve—or Rumyantsev's.


    State-of-the-Art Nuclear Sleuths

    Richard Stone

    SEIBERSDORF, AUSTRIA—There's nothing high-tech about taking a sample from a suspicious nuclear facility or its surroundings: An inspector simply wipes an object of interest—a piece of lab equipment, say, or a patch of ground—with a plain cotton swipe that resembles a cocktail napkin. The science gets interesting after the samples arrive at a gated compound near this village a half-hour's drive south of Vienna. The swipes come in, coded for confidentiality, along with matched controls taken from the inspector's hands before he or she arrives at the site, to rule out erroneous measurements due to vagabonding, or lingering contamination from an inspector's home institution or past inspections.

    Once in the safeguards lab, the samples go through a battery of analytical chemistry tests. First, samples are screened for overall radioactivity using gamma ray spectrometry, and selected swipes are measured with an x-ray fluorescence spectrometer designed by the International Atomic Energy Agency (IAEA) lab. When irradiated with x-rays, uranium and other elements fluoresce; within a few hours the instrument generates a contour map of uranium deposition on the swipe. (Plutonium concentrations are usually below the detection limit for this technique.)

    Samples are then prepped for the heavy artillery: instruments that can assess the mix of radioisotopes in a sample and thus reveal whether enrichment or reprocessing has occurred. The swipes are dipped in a solvent such as heptane and blasted with ultrasound to dislodge micrometer-sized particles of uranium or plutonium. For nearly every sample sent to the safeguards lab, one replicate swipe is dispatched for fission track analysis. This is the best technique for deciphering a sample's contents.

    Only three facilities in the world—France's Commissariat à l'Energie Atomique, the U.K.'s Atomic Weapons Establishment site in Aldermaston, and the U.S. Air Force Technical Applications Center in Florida—are able to do fission track analysis to IAEA specifications. The hot particles in suspension are fixed in a plastic called Lexan and blasted with neutrons in a reactor. The neutrons trigger the particles to undergo fission; the daughter particles leave damage trails in the plastic that experts painstakingly examine under a microscope to determine which isotopes are present. It's this human touch that makes fission track the premier analytical tool. “It's more sensitive to the highest enrichments,” such as uranium-233 and uranium-235, says nuclear chemist David Donohue, the lab's chief. For that reason, he says, “at the moment, every sample goes to fission track.” But the process takes up to 3 months and costs $10,000 per sample.

    Fishing for fissile.

    In the above images of the same sample from an IAEA inspection, SIMS singles out particles of weapons-usable uranium-235 (left; yellow is highest intensity) interspersed with uranium-238 (right). IAEA's homemade fluorescence spectrometer gives a quick picture of the overall radioactivity of a swipe.


    For a quicker, albeit cruder, answer, the safeguards lab runs the other half of each sample through either scanning electron microscopy with electron-excited x-ray fluorescence spectrometry (SEM/XRF) or secondary ion mass spectrometry (SIMS). SEM/XRF is used to image hot particles, information that can hint at a particle's origins. Usually more informative, however, is SIMS, in which the particle suspension, dried on a graphite planchet, is bombarded with high-energy oxygen ions. The oxygen reacts with the radioisotopes, producing an array of secondary ions that a computer measures to calculate the relative abundance of fissile isotopes. “No human is involved in the automated measurement process,” says Donohue. Samples can be turned around within 2 weeks, at $2000 a pop.

    What analysts look for is the pattern of radioisotopes. The ratio of uranium-238 to uranium-234, -235, and -236 can be a smoking gun because enrichment shifts the balance toward the latter three isotopes, which are rare in nature. “If a process is going on,” says Donohue, “they can't hide it.”
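
    To make that arithmetic concrete, here is a minimal sketch in Python of how such a comparison might work. It is illustrative only, not IAEA software; the sample reading and flagging threshold are invented. The natural-abundance figures are standard reference values (natural uranium is about 99.27% uranium-238 and 0.72% uranium-235), so any marked shift in the ratio stands out:

        # Illustrative sketch: flag possible enrichment from measured
        # isotope abundances (atom %). Natural-abundance figures are
        # standard reference values; the sample and threshold are made up.

        NATURAL = {"U-234": 0.0055, "U-235": 0.720, "U-238": 99.2745}

        def u235_to_u238(abundances):
            """U-235/U-238 atom ratio of a sample."""
            return abundances["U-235"] / abundances["U-238"]

        def looks_enriched(abundances, factor=1.5):
            # Flag samples whose ratio exceeds the natural ratio by more
            # than `factor` times (an arbitrary illustrative threshold).
            return u235_to_u238(abundances) > factor * u235_to_u238(NATURAL)

        # Hypothetical SIMS reading from a hot particle: 4.5% U-235.
        sample = {"U-234": 0.03, "U-235": 4.5, "U-238": 95.47}
        print(looks_enriched(sample))  # True: balance shifted toward U-235

    In practice, analysts weigh all the ratios together; uranium-236 in particular barely exists in nature, so its presence betrays irradiated material.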

    Another technique that soon may come into play is accelerator mass spectroscopy, which could be useful for detecting traces of iodine-129, a radioisotope released when plutonium is extracted from irradiated fuel rods. Recent field trials suggest that this approach “is a good method to find reprocessing,” Donohue says.

    Also on the horizon is what IAEA calls wide-area environmental sampling, mainly to collect airborne particles. A network of experimental stations that monitors compliance with the Comprehensive Test Ban Treaty (CTBT) uses filters to trap airborne radionuclides, but IAEA can't get access to those filters, says Donohue, because CTBT officials “would not want to portray what they do as providing safeguards assurances.” Donohue hopes IAEA can set up its own network of air filters to flag windswept uranium and plutonium as well as volatile radioisotopes such as tritium, iodine-129, krypton-85, and xenon-135.

    IAEA ran a short-lived field trial in which two air filters were set up near Baghdad until late 1998, when IAEA inspectors were pulled out of the country. The results were equivocal. “The filters got clogged up with dust,” Donohue says. The bottom line, he says, is that “nobody has decided that it's worth it yet” to use air filters.

    One obvious question is whether such filters could shed light on activities in countries such as North Korea that refuse to permit safeguards inspections. It would be unprecedented for IAEA to engage in nuclear eavesdropping, as the agency's mandate is to work with cooperating signatories to the Treaty on the Non-Proliferation of Nuclear Weapons. And even if IAEA's governing body were to approve such surveillance and allow the agency to put air-filter stations near the border of North Korea, says Donohue, it would be difficult to interpret the results. Japan's nuclear power industry (along with those of France and the United Kingdom) releases enough krypton-85, for instance, during its plutonium operations to swamp any signal from North Korea. “To look for a small release against the background would be hard,” Donohue says. “There would be a substantial signal-to-noise problem.”


    Can Great Apes Be Saved From Ebola?

    Gretchen Vogel

    Desperate times have some researchers proposing desperate measures, but others argue that the plans would be a possibly dangerous misuse of resources

    Ongoing Ebola outbreaks in central Africa are taking a gruesome toll on both humans and great apes. Conservationists, primatologists, and disease experts agree on that much, but in an increasingly heated debate, they are arguing over whether they can or should do anything to limit the spread of the disease.

    The outbreaks, which have so far killed more than 150 people and thousands of apes, are spreading ominously toward Congo's Odzala National Park, which shelters one of the world's largest populations of gorillas and chimpanzees. Some researchers argue that drastic measures should be taken to protect the region's great apes. Proposals include transporting hundreds of apes to safe areas and clearing rivers of debris to divide infected from uninfected populations. Anything that slows the current outbreak would be worthwhile, says ecologist Peter Walsh of Princeton University. “We need to knock this thing down right now and give ourselves time for developing things like vaccines” that could confer longer lasting protection, he argues.

    But others say such plans are logistical nightmares that might have little or no effect on the spread of the virus. “We may just be stuck at the scene of an accident and there's nothing we can do but watch,” says Les Real, a disease ecologist at Emory University in Atlanta.

    Key to settling the debate, Real says, is understanding more about how the virus spreads. Some think the geographic pattern of outbreaks suggests that apes are catching the disease primarily from other apes. Others argue that an unidentified reservoir is an important source of new infections. If the virus is spreading from one ape to another, then attempts to keep infected and uninfected animals apart might make a difference. But if bats or rodents are carrying the virus over long distances, then such measures would be in vain.

    Last fall, as word of the epidemic's scale began to emerge, primatologist Christophe Boesch of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, set off an ongoing e-mail discussion among primatologists and conservation organizations working in the region, urging the community to discuss possible interventions. “I don't know the answer,” he says. “I just thought this should be discussed very seriously by people in the field so we can see what is to be gained. There are tens of thousands of great ape and hundreds of human lives at stake” in this remote region.

    Dire straits.

    Ebola outbreaks are moving toward Congo's Odzala National Park, home to one of the world's largest gorilla populations.


    Walsh, who recently published an analysis in Nature of the effects of Ebola and hunting on ape populations (Science, 11 April, p. 232), argues for immediate action. He believes the sporadic human outbreaks—thought to arise mainly from people butchering infected apes—over the last decade in Gabon and Congo are all part of one large epidemic spreading primarily from ape to ape. If that is the case, he says, erecting barriers between infected and uninfected populations could help protect apes that haven't been exposed. Like an army trying to slow an invader's advance, he says, workers could clear rivers of natural bridges that allow infected gorillas and chimps to cross into unexposed areas. “I would at least like to have people out there trying to see if it would work” on an experimental scale, he says, and he has spoken with several potential donors willing to finance such a project.

    But clearing rivers on any large scale “would be a Herculean task,” says Conrad Aveling of the European Union conservation organization ECOFAC in Libreville, Gabon. “On a purely practical and logistical level, it's impossible. … We clear 100 kilometers of river for tourists in Odzala National Park, and that is a never-ending job.” Constructing an effective barrier would mean clearing thousands of kilometers of rivers, Aveling says. Another proposal—transporting animals to uninfected areas—would be an unprecedented undertaking that could end up killing more apes than it saves, he says. Even if donors are willing to support such efforts, “we could perhaps better use those dollars to make sure the gorillas survive after the passage of Ebola.”

    And if, as other researchers suspect, the virus spreads through other animals, the work could be in vain. “We know there have been human outbreaks in areas without great apes, so clearly there is something else at play,” says William Karesh of the Wildlife Conservation Society in New York City.

    Those on both sides of the debate agree that there is no time to waste in tracking down the answers. This week, Karesh, Eric Leroy of the International Center for Medical Research in Franceville, Gabon, and their colleagues arrived in the Lossi gorilla sanctuary in Congo, site of a recent outbreak. They will spend a month tracking and anesthetizing gorillas to collect blood samples that will reveal how many of the animals have been exposed to Ebola.

    In the meantime, there is one surefire way to help protect the besieged populations, Karesh says: fight poaching through increased patrols and education programs. “One of the things we can do right now is reduce the hunting pressure. Let's take what we do know—that people can get this disease from eating infected primates—and use that to do something we know will protect great apes.”


    Spinning Junk Into Gold

    Ingrid Wickelgren

    Scientists are finding surprising uses for DNA that interrupts genes. Some stretches encode enzymes that ferry genes into precise locations; other “junk” seamlessly edits proteins

    Genomes are typically littered with “junk”: stretches of DNA with no obvious function that are scattered among genes. But scientists are now finding that some of this junk, at least that from lower organisms, can be astoundingly useful to people, if apparently not to the organisms that carry it.

    This surprisingly handy DNA is located within genes and not in the no man's land between one gene and the next. It comes in two types. So-called introns are clipped out of a gene's RNA before a protein is made. By contrast, the less well known inteins are translated into protein but then immediately removed.

    In simple creatures such as yeast, algae, and bacteria, some introns aren't just tossed out like so much cellular garbage. Instead, the introns serve as templates for making other proteins. Among these are spectacular enzymes that inject new stretches of DNA into precisely defined spots in a genome.

    Scientists are tailoring these enzymes to shoot genes into new locations. The technique may yield edible vaccines, hardier cheese cultures, and better-controlled gene therapy for diseases. “I see enormous possibilities for the use of introns and their enzymes in biotechnology and medicine,” says Marlene Belfort, a geneticist at the Wadsworth Center of the New York Department of Health in Albany.

    Scientists are mining inteins, which occur primarily in yeast, algae, viruses, bacteria, and archaea, for a different skill: the ability to seamlessly extract themselves from a protein and tie the loose ends back together. In the past few years, researchers have parlayed this talent into methods for purifying proteins that otherwise can't be made in bacteria. Inteins have also endowed crops with new genetic traits that are unlikely to cross over to nearby plants. “Not only have inteins fundamentally changed how we view gene expression, they've also been harnessed as workhorses that have revolutionized protein chemistry,” says molecular biologist Francine Perler of New England Biolabs in Beverly, Massachusetts, who discovered much of the basic biology of inteins. “For something most people have never heard of,” adds chemical engineer David Wood of Princeton University, “this is a hot area.”

    Say cheese.

    A modified intron from the bacterium Lactococcus lactis can disrupt existing genes in other cells or introduce new genes.


    Cut and paste

    Cell biologists identified introns in 1977, and in the mid-1980s a team discovered a yeast intron with an odd power. It codes for an enzyme later dubbed a homing endonuclease. During mating, the intron and its enzyme are transferred from one yeast cell into a second. There the enzyme makes a single clip in the recipient cell's DNA at precisely the spot the intron occupies in the first cell's genome. The recipient cell then patches the break by pairing its chromosome with the homologous chromosome from the first yeast cell, thereby copying the intron sequence into its own DNA. Thus, the intron invades the new genome like a harmless parasite.

    Hundreds of similar enzymes spun by these so-called group I introns have since been found in yeast, algae, viruses, and the mitochondrial and chloroplast genomes of higher plants. All of them are very precise cutting tools. Restriction enzymes commonly used in the lab to cut DNA home in on strings of just six base pairs, sequences expected to occur about once every 4000 base pairs. But the intron-encoded enzymes target sequences from 15 to 40 base pairs long. Researchers estimate that most of them clip spots that would occur only once in a billion base pairs.
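    A quick back-of-the-envelope check (our arithmetic, not the researchers') shows why site length matters: a specific recognition site of k bases is expected about once every 4^k base pairs in random DNA.

    # Expected spacing of a specific recognition site in random DNA:
    # a k-base site occurs about once every 4**k base pairs.

    def expected_spacing(site_length: int) -> int:
        return 4 ** site_length

    print(expected_spacing(6))    # 4096: a 6-bp restriction site, ~once per 4000 bp
    print(expected_spacing(15))   # ~1.07 billion: a 15-bp homing-endonuclease site
    print(expected_spacing(20))   # ~1.1 trillion: a 20-bp site, far rarer than
                                  # the ~3.2 billion bp of the human genome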

    Such an enzyme would thus cut very few sites—perhaps just one—in the human genome. Consequently, in the past few years, scientists have been trying to engineer homing endonucleases for made-to-order sequences so that they might, for example, insert therapeutic genes into a chosen location. This could provide better control of the inserted gene's expression and perhaps circumvent some hazards of today's random insertion technology, which is thought to have caused leukemia-like disease in some patients when the therapeutic gene apparently activated a gene associated with leukemia (Science, 17 January, p. 320).

    Two research teams have now taken the first step toward creating custom cellular scissors. Barry Stoddard and his team at the Fred Hutchinson Cancer Research Center in Seattle reported in the October 2002 issue of Molecular Cell that they could craft a new homing endonuclease by combining halves of two natural homing endonucleases. The researchers used a computer model to determine how to reconfigure key amino acids in half of the endonuclease I-DmoI, hailing from a heat-loving single-celled archaeon, and half of a similar enzyme called I-CreI from an alga so that they fit together like pieces of a jigsaw puzzle.

    In test tube experiments, the hybrid enzyme cleaved DNA at the predicted spot in a 22-base-pair sequence that consisted of half of the target sequence for I-DmoI and half of the target for I-CreI. “We show that it is possible to treat the individual enzyme subunits as modular, like pieces of Lego,” Stoddard says. A group at the Paris-based biotech firm Cellectis has also shown, in work to appear in Nucleic Acids Research, that a similar artificial enzyme works not only in solution but also in yeast and mammalian cells.

    Not for naught.

    Despite being clipped out of a gene's RNA, some introns go on to build proteins of their own.


    Now both the Seattle and Paris groups are trying to tailor homing endonucleases to cleave DNA targets of their choosing. Both are using a combination of computer-aided protein design and mutagenesis to generate millions of variants of these enzymes that they can screen for the ability to cleave a given DNA target. “Everybody's waiting to see if it's possible to do this,” Stoddard says.

    Meanwhile, other researchers are focusing on so-called group II introns, which may be easier to direct to new DNA targets. These introns are also extremely precise, recognizing sequences of 30 to 35 base pairs—in this case, using both the intron-encoded protein, which among other duties cuts the DNA, and the intron's RNA. The RNA and protein work together to insert the intron at a site in the genome, but the RNA has the primary role in specifying that site—and RNA is much easier to modify than a protein.

    Indeed, Alan Lambowitz of the University of Texas, Austin, and his team have shown that they can retarget an intron by altering its RNA without tinkering with the protein. They randomly mutated RNA for an intron from Lactococcus lactis, a bacterium that produces lactic acid and is used in making cheese. The researchers selected those mutants that integrated into two genes involved in HIV infection that had been inserted into Escherichia coli bacteria (Science, 21 July 2000, p. 374).

    Since then, the Lambowitz team has created hundreds of introns that insert into different sites, enabling the researchers to deduce more refined rules about how best to alter the intron's RNA sequence to retarget it to a particular gene. Lambowitz and his team have recently codified their rules into a computer program that tailors the L. lactis intron to any gene of interest, selecting the best flanking and insertion sites and specifying an RNA sequence that binds optimally to the target.

    In unpublished work, the Lambowitz team used its computer program to figure out how to knock out 23 of 24 so-called DEAD-box protein genes in E. coli by inserting the intron into the gene sequences. From 1% to 80% of the bacterial colonies exposed to the introns sported the desired knockouts—a big improvement over the 0.1% insertion frequencies the team achieved 2 years ago using older techniques.

    A St. Louis-based firm, InGex, recently began selling Lambowitz's gene-insertion technology, dubbed the Targetron, which provides a plasmid toting the intron as well as Web access to the intron-designing computer program. This system can be used to disrupt bacterial genes with the intron. Alternatively, genes can be added to a bacterium by attaching them to the intron. Lambowitz is now trying to make his technology work in higher organisms. The trick, he says, is to get the intron's protein-RNA complexes into cell nuclei in sufficient concentrations for insertion to occur. “If we could do what we do in bacteria in animal cells, it would be extremely powerful,” Lambowitz says.

    The bacterial work is already moving toward commercial application. David Mills of the University of California (UC), Davis, and his colleagues used an intron technology based on Lambowitz's to safely add viral resistance to cheese-making lactic acid bacteria. Traditionally, genetic engineers insert an antibiotic-resistance gene as a marker along with the desired gene and then expose the bacteria to an antibiotic. By process of elimination, this identifies the bacteria that successfully received the new genes. But because the L. lactis intron invades genomes at high frequency, Mills found an engineered colony by random genetic screening. “We don't want to put antibiotic-resistance genes in anything going into food,” Mills says.

    Using this technique, the researchers placed a gene for resistance to a bacterial virus inside the L. lactis intron and then inserted the construct into a laboratory strain of L. lactis. The intron and gene integrated into the organism's DNA, making the strain more resistant to the virus, they reported in Applied and Environmental Microbiology in February. Mills's team didn't make cheese, but the technology could be used to provide viral resistance to any of hundreds of cheesemaking strains that are currently vulnerable to the virus.

    Mills and other UC Davis scientists are now working with the California Dairy Research Foundation and RZ Syntopical, a biotechnology firm in Sacramento, California, to apply the same technology to make an edible vaccine for respiratory syncytial virus (RSV). No vaccine currently exists for this virus, a serious childhood respiratory pathogen that can also be deadly to adults with impaired immune systems.

    Ejection seat.

    Proteins quickly extract inteins embedded within them.


    The plan is to use the L. lactis intron to insert an RSV antigen into a strain of the edible lactic acid bacterium, which could then be put into a milk-based formula. In the gut, these bacteria would then produce the antigen and stimulate the immune system. Mills says the intron vaccine technology—still in its earliest stages—will more easily pass Food and Drug Administration scrutiny than other technologies do because no part of the delivery vector is foreign to the food itself. “It's a very clean system,” Mills says.

    Sticking together, splitting up

    Introns are not the only genetic junk that has been put to extraordinary uses. Inteins, pieces of peptide removed from proteins, have become an increasingly versatile tool for molecular biologists, chemists, and drug developers. The first intein was discovered in 1990 by three independent labs. All identified a yeast gene that was much larger than needed to encode the protein it produced, but the additional DNA did not appear to encode an intron.

    The researchers thought that the additional material must be spliced out at the protein level. But because they could not find the larger precursor protein, hardly anyone believed them. Finally Perler and her colleagues at New England Biolabs reported in Cell in 1993 that they could insert an intein between two other proteins, purify the precursor at low temperature, and cause it to splice at a higher temperature. This nailed the case. Researchers went on to find inteins in more than 50 creatures, and some 400 papers about inteins now appear in a database called InBase.

    Inteins break free when the section of protein on the carboxyl end of the intein—the so-called C-extein—grabs onto its N-extein counterpart and tears it off the intervening intein, attaching it to itself. The intein then cuts itself loose from the newly connected exteins. The reaction is spontaneous, occurring as soon as a protein folds—timing that explains the earlier difficulty in finding the natural protein precursors.

    Perler, her former postdoc Ming Xu, and their colleagues quickly spun this mechanism into a cheaper way of purifying proteins. Ordinarily, a protein is purified by genetically fusing it to a tag that will selectively stick the protein to a coating on a purification column. But freeing the protein from its tag then requires an expensive protease enzyme. To get around this, the New England Biolabs team inserted an intein between the protein and its tag. The researchers then used a chemical trick to coax the intein to cut itself away from the protein at just one end, releasing the pure protein.

    A team led by Wadsworth's Belfort, including her husband Georges Belfort of Rensselaer Polytechnic Institute in Troy, New York, and Princeton's Wood, has since developed a single-step method for mass-producing purified proteins using inteins. And at the other end of the size scale, the Belfort group has developed a miniaturized, so far unpublished version of its system that isolates small amounts of protein. The technique should speed up proteomics studies, such as determining the functions of genes or identifying new protein targets for medications.

    Inteins can get around other laboratory obstacles as well. The Belfort team, including Wadsworth's Victoria Derbyshire and Wei Wu, has developed a technique for producing hard-to-make proteins such as blood-clotting factors or malarial proteins needed for vaccine development. Many labs use bacteria to mass-produce proteins, but these particular proteins often kill the bacteria that express them.

    For 14 years, the Wadsworth team had been unable to produce the endonuclease I-TevI in bacteria because it is toxic to the organisms. But it got around this problem last year by inserting an intein gene into the I-TevI gene, disabling the resulting protein so that E. coli could produce it. Once the hybrid protein was purified, a drop in pH caused the intein to splice out, restoring I-TevI to its native form.

    Inteins can split up genes for a different application: making safer transgenic plants. Because the chloroplasts of most crops are almost always maternally transmitted, says New England Biolabs' Sriharsa Pradhan, their DNA isn't carried by pollen, which can spread nucleus-based genes far and wide. Thus, putting one part of a new gene into a plant's chloroplast protects against transfer of the full gene to other plants.

    Pradhan's team has now used this tactic to confer herbicide resistance on tobacco plants. The researchers attached one of two parts of an algal intein to each half of a herbicide-resistant gene, they report in the 15 April Proceedings of the National Academy of Sciences. One fragment went into the chloroplast genome, the other into the nucleus. The intein-containing fragments reassembled in the chloroplast and spliced to yield an intact protein that protected the plant.

    Inteins can also link proteins to each other or to small molecules by attaching a kind of molecular Velcro to the protein when it is purified. Starting in 1998, chemist Tom Muir of Rockefeller University in New York City showed that purifying a protein using a variant of a New England Biolabs intein-based system leaves the protein with a sticky end composed of a thiol ester. This enables the protein to be easily joined to other molecules such as a sugar, a lipid, or another protein.

    Biologists have used this approach to attach phosphate groups to certain sites on proteins. Phosphorylated proteins are often the functional versions of the proteins in cells, but they can be extremely difficult to make in the lab in pure form, even though they make up an estimated one-third of natural human proteins. “The beauty of this chemistry is that it's so simple,” says Muir. “Anybody can do this.”

    Now Muir and postdoctoral fellow Henning Mootz have developed a novel way of using inteins to rapidly switch proteins on or off inside a cell and thereby gain clues to their functions. The technique works within minutes, providing much finer temporal control than is possible by controlling gene expression, which can take hours to produce an effect. In this system, an intein splices out of a protein—and thereby activates or inactivates it—in response to the addition of the immunosuppressant rapamycin.

    The original concept was to rapidly reconstitute two parts of a single protein, but the first test of the idea, reported last year, showed the linkage of two unrelated proteins. Muir and Mootz attached the gene for each of the test proteins to half of a yeast intein gene plus a gene for either of the human cellular signaling proteins FKBP or FRB, both of which are affected by rapamycin. The researchers mixed the tripartite proteins together in a test tube and added rapamycin, which binds simultaneously to FKBP and FRB. This brought the two protein constructs together and reconstituted the intein, which then spontaneously spliced, removing itself and the FKBP and FRB hooks and linking the two test proteins.

    This scheme could be used to quickly activate a single protein within a cell by bringing two halves of it together with rapamycin, Muir says. Alternatively, one might inactivate a protein by attaching it to an inhibitory peptide. In unpublished work, Muir, Mootz, and their colleagues have shown that their splicing reaction works in cultured mammalian cells.

    The applications of this technique are only beginning to be explored. “The idea of bringing two polypeptides together … is a basic tool that can have a number of different applications,” Muir says. “We haven't thought of them all yet.” Indeed, researchers are just beginning to uncover all the treasures buried in inteins and their intron cousins—which are turning out to be anything but junk.


    Cracking the Khipu Code

    1. Charles C. Mann

    Researchers take a fresh look at Incan knotted strings and suggest that they may have been a written language, one that used a binary code to store information

    In the late 16th century, Spanish travelers in central Peru ran into an old Indian man, probably a former official of the Incan empire, which Francisco Pizarro had conquered in 1532. The Spaniards saw the Indian try to hide something he was carrying, according to the account of one traveler, Diego Avalos y Figueroa, so they searched him and found several bunches of the cryptic knotted strings known as khipu. Many khipu simply recorded columns of numbers for accounting or census purposes, but the conquistadors believed that some contained historical narratives, religious myths, even poems. In this case, the Indian claimed that his khipu recorded everything the conquerors had done in the area, “both the good and evil.” The leader of the Spanish party, Avalos y Figueroa reported, immediately “took and burned these accounts and punished the Indian” for having them.

    But although the Spanish considered khipu dangerous, idolatrous objects and destroyed as many as they could, scholars have long dismissed the notion that khipu (or quipu, as the term is often spelled) were written documents. Instead, the strings were viewed as mnemonic devices—personalized memorization aids with no conventionalized signs—or, at most, as textile abacuses. The latter view gained support in 1923 when science historian L. Leland Locke proved that the 100 or so khipu at the American Museum of Natural History in New York City were used to store the results of calculations.

    For these reasons the Inca have often been described as the only major Bronze Age civilization without a written language. In recent years, however, researchers have increasingly come to doubt this conclusion. Many now think that although khipu probably began as accounting tools, they had evolved into a writing system—a kind of three-dimensional binary code, unlike any other on Earth—by the time the Spanish arrived. “Most serious scholars of khipu today believe that they were more than mnemonic devices, and probably much more,” says Galen Brokaw, an expert in ancient Andean texts at the State University of New York, Buffalo.

    Knotty problem.

    Scholars have decoded mathematical khipu, but the meaning of other sets of strings, perhaps recording narrative, remains a mystery.


    Yet the quest to understand khipu faces a serious obstacle: No one can read them. “Not a single narrative khipu has been convincingly deciphered,” laments Harvard University anthropologist Gary Urton, who calls the situation “more than frustrating.” And so Urton, spurred by new insights gained from textile experts, is now preparing the most sustained, intensive attack on the khipu code ever mounted. In a book to be released next month, Signs of the Inka Khipu (University of Texas Press), he has for the first time systematically broken down khipu into their constituent elements. He is using that breakdown to create a khipu database to help identify patterns in the arrangement of knots. Just as Maya studies exploded in the 1970s after researchers deciphered Maya hieroglyphs, Urton says, breaking the khipu code could be “an enormous potential source of insight” into the lives and minds of the still-mysterious Inca, who in the 16th century ruled the largest empire on Earth.

    Binary code?

    All known writing systems used for ordinary communication employ instruments to paint or inscribe on flat surfaces. Khipu, by contrast, are three-dimensional arrays of knots. They consist of a primary cord, usually 0.5 to 0.7 centimeters in diameter, to which are tied thinner “pendant” strings—typically more than 100 and on occasion as many as 1500. The pendant strings, which sometimes have subsidiary strings attached, bear clusters of knots. The result, as George Gheverghese Joseph, a mathematics historian at the University of Manchester, U.K., has put it, “resembles a mop that has seen better days.”

    According to colonial accounts, Incan “knot-keepers”—elite bureaucrats called khipukamayuq—parsed the knots both by inspecting them visually and by running their fingers along them Braille-style, sometimes accompanying this by manipulating stones. For example, to assemble a history of the Inca, in 1542 colonial governor Cristóbal Vaca de Castro apparently summoned khipukamayuq to “read” the strings. Spanish scribes recorded their testimony but did not preserve the khipu; indeed, they may have destroyed them.

    Locke showed that the numerical khipu were hierarchical, decimal arrays, with the knots used to record 1's on the lowest level of each string. Other knots were tied on successively higher levels in a decimal “place value” system to represent 10s, 100s, 1000s, and so on. “The mystery has been dispelled,” exulted archaeologist Charles W. Mead after Locke's discovery. “We now know the quipu for just what it was in prehistoric times … simply an instrument for recording numbers.”
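    Locke's rules amount to ordinary place-value arithmetic. A minimal sketch of how a numerical pendant string would be read under his scheme (our illustration; the knot counts are invented):

    # Reading a numerical khipu pendant string under Locke's decimal scheme:
    # the knot cluster at the lowest level records 1s, the next level up
    # records 10s, then 100s, and so on.

    def read_pendant(knot_clusters):
        # knot_clusters[0] = units level, knot_clusters[1] = 10s level, ...
        return sum(count * 10 ** level for level, count in enumerate(knot_clusters))

    print(read_pendant([4, 0, 3]))   # 4 ones, no 10s, 3 hundreds -> 304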

    But Locke's rules did not decode all of the estimated 600 khipu that survived the Spanish. Nor did they detail what objects were being accounted for in these records. According to Cornell University archaeologist Robert Ascher, about 20% of khipu are “clearly nonnumerical.” In 1981, Ascher and his mathematician wife, Marcia, published a book that reignited the field by intimating that these “anomalous” khipu may have been an early form of writing.

    The Aschers focused mainly on khipu knots. But in 1997, William J. Conklin, a research associate at the Textile Museum in Washington, D.C., suggested that knots were only part of the khipu system. “When I started looking at khipu,” says Conklin, perhaps the first textile specialist to investigate them, “I saw this complex spinning and plying and color-coding, in which every thread was made in a complex way. I realized that 90% of the information was put into the string before the knot was made.”

    Taking off from this insight, Urton proposes that khipu makers made use of the nature of spinning and weaving by assigning values to a series of binary choices (see diagram), including the type of material (cotton or wool), the spin and ply direction of the string (which he describes as “S” or “Z” after the “slant” of the threads), the direction (recto or verso) of the knot attaching the pendant string to the primary, and the direction of slant of the main axis of each knot itself (S or Z). As a result, he says, each knot is a “seven-bit binary array,” although the term is inexact because khipu had at least 24 possible string colors. Each array encoded one of 2⁶ × 24 potential “information units”—a total of 1536, somewhat more than the estimated 1000 to 1500 Sumerian cuneiform signs and more than twice the approximately 600 to 800 Egyptian and Maya hieroglyphic symbols. In Urton's view, the khipu not only were a form of writing, but “like the coding systems used in present-day computer language, [they were] structured primarily as a binary code.”
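    The arithmetic behind Urton's count, restated (the grouping of the six two-way choices is simplified here):

    # Urton's "seven-bit" knot signature: six two-way choices about fiber and
    # knot construction, plus one color choice with at least 24 values.
    binary_choices = 2 ** 6         # material, spin/ply, attachment, knot slant, ...
    colors = 24
    print(binary_choices * colors)  # 1536 potential information units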

    If Urton is right, khipu were unique. They were the world's sole intrinsically three-dimensional “written” documents (Braille is a translation of writing on paper) and the only ones to use a binary system for ordinary communication. In addition, they may have been among the few examples of “semasiographic” writing: texts that, like mathematical or dance notation but unlike written English, Chinese, and Maya, are not representations of spoken language. “A system of symbols does not have to replicate speech to communicate narrative,” explains Catherine Julien, a historian of Andean cultures at Western Michigan University in Kalamazoo.

    Knotted string communication, however anomalous to Euro-American eyes, has deep roots in Andean culture. Khipu were but one aspect of what Heather Lechtman, an archaeologist at the Massachusetts Institute of Technology's Center for Materials Research in Archaeology and Ethnology, describes as “a technological environment in which people solved basic engineering problems through the manipulation of fibers.” In Andean cultures, Lechtman says, textiles—ranging from elaborately patterned bags and tunics to missile-hurling slings and suspension bridges—were “how people both communicated messages of all sorts and created tools.” Similarly, Urton explains, binary oppositions were a hallmark of the region's peoples, who lived in societies “typified to an extraordinary degree by dual organization,” from the division of town populations into “upper” and “lower” moieties to the arrangement of poetry into dyadic units. In this environment, he says, “khipu would be familiar.”

    Talking knots.

    Each knot in a khipu has its own binary signature, based on a series of choices about the kind of thread and knots used. These signatures may have encoded information and allowed Incans to “read” khipu narratives, as seen in this 16th century drawing.


    But this grander view of khipu as written narrative also has its critics. “Due to cultural evolutionary theory, people have decided that cultures are not really any good unless they have writing,” says Patricia J. Lyon of the Institute of Andean Studies in Berkeley, California. “People feel this great need to pump up the Inca by indicating that the khipu were writing.” Agreeing with the 17th century Jesuit chronicler Bernabé Cobo, Lyon believes that khipu “were mnemonic devices, no matter what you dream up.”

    Even some of Urton's supporters are cautious about his interpretation. Conklin, for instance, agrees that the khipu were charged with meaning, but he worries that the analogy to computer language may not fit. “The Andean concept of duality is different than ours,” he says. Whereas each 1 or 0 in a binary display is completely independent, the Andean dualities “are like the ebb and flow of a tide: opposing, interacting aspects of a single phenomenon.” In his view, understanding khipu will require finding “a way other than our independent zero and one to express Andean dualism.” Still, he says, Urton's work “is the first attempt to push khipu forward since Leland Locke.”

    Seeking a Rosetta stone

    One way to settle the debate decisively would be to find a written translation of a khipu to another language—an Incan Rosetta stone. In 1996, Clara Miccinelli, an amateur historian from the Neapolitan nobility, caused a stir by announcing that she had unearthed just such a find in her family archives: an explicit translation into Spanish of a khipu that encodes a song in Quechua, the Incan language, which is still spoken today. But because the same collection of documents also contains sensational claims about the Spanish conquest, many scholars have questioned their authenticity. Miccinelli has thus far refused to let researchers around the world freely examine the documents, although she did allow an Australian lab to use a mass spectrometer to test the khipu that accompany them. The results, published in 2000, date the khipu to between the 11th and 13th century. According to Laura Laurencich Minelli, an Andeanist at the University of Bologna working with the Miccinelli documents, the early age could be explained by the Andean tradition of weaving important khipu with old thread “charged with the strength of the ancestors.”

    Because they cannot examine the documents, most researchers are “strategically ignoring” them for now, says Brokaw, and are tackling khipu using less controversial means. Urton and mathematician and database manager Carrie Brezine intend to have their khipu database, which is funded by the U.S. National Science Foundation, running this fall and will eventually put it online. Their database, a successor to one set up by the Aschers at Cornell, will let scholars search for patterns across most of the 600 surviving khipu.

    At the same time, Urton and other khipu hunters are searching for their own Rosetta stone: a colonial translation of a known khipu. For example, some Spanish documents from Peruvian Amazonia are thought to be transcriptions of khipu, 32 of which were recently found in the area. No definitive match has yet been made between a document and the newly discovered khipu, but Urton has uncovered some suggestive clues. He is now searching archives in Peru and Spain for more documents—a quest, according to Western Michigan's Julien, that “has a chance of bearing fruit.” The 40-plus Incan provinces had similar, overlapping records, she notes. “Information from one province could easily be found in another form in another [province].” If Urton or some other scholar can find a match, she says, “we may be able to hear the Incans for the first time in their own voice.”

  13. Modernizing the Tree of Life

    1. Elizabeth Pennisi

    A new generation of systematists seeks to transform its field with the tactics of big science

    Bar codes have revolutionized the retail industry, allowing scanners to instantly identify and price everything from beans to beach balls—and allowing retailers to skip the labor-intensive step of having a person actually examine the product. Biologist Paul Hebert of the University of Guelph, Ontario, hopes to adapt that concept to an even more ambitious task: species identification.

    Just as cashiers no longer need to know which brand of beans is on sale or whether they are looking at white, brown, or confectioner's sugar, Hebert envisions handheld scanners that automatically read a DNA “bar code”—which conveys differences in the sequence of a single gene—to identify species. By relying on a mitochondrial gene called cytochrome oxidase I to create such bar codes, he claims, “we can make a very efficient engine that would be able to take us through animal life,” from barnacles to butterflies.

    Identification made easy.

    These skipper butterflies look similar, but a DNA “bar code” may be able to separate them into two species.


    To experts who have struggled for decades to sort out the identities of closely related organisms, this is a wild claim indeed. “I sputtered about [bar codes] when I saw the first suggestion,” recalls invertebrate systematist Frederick Schram of the Institute for Biodiversity and Ecosystem Dynamics at the University of Amsterdam. Adds Hebert, “We were being lampooned in taxonomic corners around the world.”

    Many researchers doubt that a single gene can resolve all animals into individual species. And many systematists have an almost visceral distaste for the notion of bar-coding animals. But a few, including Schram, are beginning to think Hebert may be on the right track. “As a way of cataloging biodiversity, why not?” says Schram. Even the skeptics are beginning to recognize the need for this sort of broad-brush approach to identifying and classifying great swaths of biological diversity, given the sheer magnitude of the task ahead. Although biologists have cataloged 1.7 million species, they know they have just begun: Estimates of the total number of species on the planet range from 4 million to 100 million.

    Naming them all is merely the first step. Researchers need to know where each creature fits into the grand scheme of evolution, from the first single-celled microbe to complicated plants and animals. This scheme is often described as the tree of life, a metaphor proposed 150 years ago by German biologist Ernst Haeckel. Every life form, from krill to whales, seaweed to sequoias, protists to pachyderms, fits somewhere along this tree's twigs and branches. But where?

    To find out, more and more biologists are proposing bold new methods that are transforming the practice of taxonomy and systematics. Many of those who find Hebert's bar codes farfetched are impressed with an even more visionary scheme to analyze several genes at once and so allow amateur biologists to both identify an organism and see its evolutionary relationships with the touch of one button. Scores of other researchers are banding together in large collaborative projects and building “supertrees” with data from many studies. Some hope to see their field take on the trappings of a high-tech, big-science endeavor, one in which they tackle the whole tree of life—or at least large chunks of it—at once, in much the same way geneticists now tackle whole genomes. Plant systematists have already embraced such tactics, with notable results (see sidebar on p. 1696).

    Not long ago, such ambitions were rare in a field accustomed to much more narrow objectives. But big-science projects are becoming the norm: In 2002, the National Science Foundation (NSF), the chief funder of systematics in the United States, gave out more than $15 million for a half-dozen multi-investigator projects, each tackling a different group, from bacteria to plants to birds. “Over the last 10 to 15 years, researchers in the systematics community have been working on the small branches on the tree of life,” explains NSF plant systematist James Rodman, who runs this program. “We thought it was time to take a more coordinated approach.”

    Overall, says Michael Donoghue, a botanist at Yale University in New Haven, Connecticut, “we have made good progress in outlining the major groups.” It's well accepted, for example, that placental mammals share a common ancestor and that birds are more closely related to lizards than to fish. “But there are 1.7 million species [identified], and we are not anywhere close to putting them all on the tree of life,” Donoghue says. Researchers estimate that only about 80,000 species now have a place on their trees.

    Large-scale endeavors are increasingly necessary, and not just because we're curious to know who else inhabits our planet. The urgent call for systematic information is now coming from several fields at once. “There are a lot of practical questions” that rely on taxonomy, notes Fred Grassle, a marine ecologist at Rutgers University in New Brunswick, New Jersey. Conservationists need to know, for example, whether the Florida scrub jay is really a species—and therefore subject to protection—or just a subspecies. Likewise genomicists trying to interpret the burgeoning array of DNA sequence data are demanding to know where those sequences fit in the evolutionary scheme. The evolutionary distance between, say, zebrafish and nematodes affects what biologists look for as they compare genomes, and it helps track the evolution of particular genes. Even biomedical researchers are embracing taxonomy to help them understand how pathogens become more virulent over time or how new diseases emerge.

    Thus taxonomists are finding their science in greater demand than ever before and in use in far corners of biology. “You pick up any biological journal—it doesn't matter what field it is—and it will have phylogenetic data,” says David Hillis, an evolutionary biologist at the University of Texas (UT), Austin.

    Tree plus tree.

    Supertrees combine two or more trees into a single big one by making use of species overlaps.

    CREDIT: M. S. SPRINGER AND W. W. DE JONG, SCIENCE 291, 1709 (2001)

    From morphology to molecules

    When Carolus Linnaeus set out to catalog all organisms, he had little idea what he and his future colleagues were in for. Even so, he came up with a system for naming species and set up a classification system that has survived for 250 years and in some quarters is still going strong.

    Until recently, it was enough for budding taxonomists to pick a particular group—dung beetles, canines, oaks—and spend their lives gathering and examining specimens. Researchers pored over organisms' structures, from obvious traits such as the number of legs to subtleties such as the relative heights of spines in a fish's anal fin. Their quest: to find an effective way to compare creatures within a group, first to see how they differed and later to determine how they were related to each other.

    Such taxonomic work has advanced over the centuries, but it moves too slowly for some. Many organisms are scantily described and poorly illustrated in the literature, so systematists must hunt down original specimens in museums in order to compare them to new specimens, complains Scott Miller, an entomologist at the Smithsonian Institution's National Museum of Natural History (NMNH) in Washington, D.C. If a species is new, researchers spend months or even years painstakingly describing and illustrating it. This workload means that each of the world's 6000 to 15,000 traditional taxonomists is lucky to describe 250 organisms in a career. At that pace, a complete tree of life is centuries away, says Terry L. Erwin, another NMNH entomologist.

    When molecular techniques spread through biology, they promised to circumvent these problems. Taxonomists found they could cover more ground by looking at differences in DNA and ordering species along the tree accordingly. But their results often clashed with traditional morphological classifications. To make matters worse, many early DNA studies were flawed because they were based on an unreliable stretch of DNA or inappropriate organisms, says Andrew Smith, a systematist at the Natural History Museum in London.

    Fortunately, better statistical methods, a better understanding of DNA changes, and cheaper, more accurate sequencing have boosted the credibility of molecular studies. Researchers now use more DNA and go beyond simple differences in bases, assessing small deletions and insertions and other features of the genome's landscape.

    All the same, “molecular workers are realizing that their data sets are not ‘silver bullets’ for revealing the history of life,” says David Lindberg, a systematist at the University of California (UC), Berkeley. Meanwhile, “morphological workers are realizing that their long-held ideas need to be reexamined. The field is ripe for revision and conversion,” he says. Morphologists are invigorating their work with techniques such as electron microscopy and incorporating a broad range of traits, including behavioral differences. This, plus the molecular challenge, has helped many morphologists accept new ways of looking at organisms, notes Michael Lee, a reptile expert at the University of Queensland, Australia.

    As a result, the two sides are finding more common ground. For example, in 1999, Tim Littlewood of the Natural History Museum in London analyzed the DNA of a marine worm called the acoel, which morphologists had lumped with the parasitic flatworms in part because both have a simple body plan. Littlewood instead proposed that although flatworm morphology evolved from a more complex body plan (as is common in parasites), the acoel is primitive: Its body has always been simple. That makes this humble worm a relative of all bilaterally symmetrical organisms, from millipedes to humans (Science, 19 March 1999, p. 1823). At first morphologists protested, but now, several molecular studies later, “the story is effectively resolved,” Littlewood says. These days, he says, “when incongruities [between molecular and morphological data] exist, we choose to see this as an area requiring additional data, not a barrier to resolving the tree of life.”

    Many controversies remain, from the relationship among bacteria, archaea, and eukaryotes to whether the springtail is a primitive insect. These battles get the headlines and the journal pages. But in most cases, the different data point to the same tree, says Lee: “It should be stressed that [these data] agree more often than they disagree.”

    Banding together

    Molecular and morphological types may have achieved détente, but they still have only a rough outline of the evolutionary branching pattern of many groups. For many systematists, that's not enough. To fill in the tree of life, says NSF's Rodman, researchers need to figure out how to work together.

    Too often, two systematists will take on the same organisms while other taxa go unstudied. And even those working on similar or related groups often lack the benefit of one another's data, so no one else can take full advantage of their labors, says amphibian expert David Cannatella of UT Austin. That's why NSF is funding large projects, and why people such as Cannatella are lobbying for joint tree-of-life efforts. “We'd like to get people to actively collaborate and share samples and tissue resources” to build an amphibian tree, he says.

    Such collaboration is the first step toward synthesizing systematic information. But it demands a change in the field's culture. “Systematists have a tradition of working alone,” says Cannatella. “But I think getting together and pooling what we have is the way to go.” Given the biodiversity crisis and the increased use of systematics data by other types of biologists, he and others would like to see their field take a lesson from the recently completed Human Genome Project. More than a dozen laboratories throughout the world banded together, sharing technological innovations and freely contributing human and other sequence data to a centralized public archive called GenBank. Systematists need such a central archive, says Yale's Donoghue. “What I think is critical is databasing everything in such a way that it's accessible,” he says. Then, notes Charles O'Kelly, a systematic biologist at the Bigelow Laboratory for Ocean Sciences in West Boothbay Harbor, Maine, “one can ask about studies at any level. … It's a way that we can make the [research] go much faster.”

    There are already some attempts at coordinated data sharing. For several years now, plant and animal biologists have been pooling molecular data and submitting both the DNA results and phylogenetic trees to a Web site called TreeBASE; more and more journals are requiring authors to deposit their phylogenetic data there. Maureen O'Leary of the State University of New York, Stony Brook, and colleagues are planning MorphoBank, a GenBank-like repository that stores digital images.

    These kinds of archives, if they take off, will allow researchers to put together all the data at hand. The challenge would then be to analyze the giant data sets that result, notes Hillis. Mathematically minded biologists are already busy making computer programs to meet the demand.

    One approach is to build a supertree, in which a computer program compiles smaller trees of life into a gigantic one, merging branches of the trees without reanalyzing the original data. “There have been any number of areas in biology, especially evolutionary biology, that have really wanted large phylogenies at the species level,” says John Gittleman, an evolutionary biologist at the University of Virginia, Charlottesville. “What we need is the complete tree.”

    Supertrees fulfill that need, and they are gaining momentum as more phylogenies become available online. Gittleman estimates that a few years ago, only five researchers were braving supertree analysis; now there are at least 100. “If we really want to get a big picture of the tree of life—of say a million species—then we really need a method of piecing the snippets together,” says Donoghue. A supertree analysis looks among smaller trees—which might use either morphological or molecular data—for a few species in common, then it uses the overlap to come up with a bigger branching pattern. Supertrees “provide another line of insight into what can be very intimidating mountains of data,” says the University of Amsterdam's Schram.
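    One widely used way to do the piecing together is to recode the source trees as a single matrix, a method known as matrix representation with parsimony. The toy sketch below (our illustration, not the procedure of any particular study cited here) turns every clade of every source tree into a binary character and scores unsampled species as unknown; a parsimony search over the combined matrix would then yield the supertree.

    # Toy matrix-representation step of supertree building: each clade of each
    # source tree becomes one binary character (1 = inside the clade,
    # 0 = sampled but outside it, ? = not sampled by that tree).

    def clades_of(tree, found):
        # Recursively collect the leaf set below each internal node.
        if not isinstance(tree, tuple):              # a leaf name
            return frozenset([tree])
        leaves = frozenset().union(*(clades_of(c, found) for c in tree))
        found.append(leaves)
        return leaves

    def mrp_matrix(source_trees):
        characters = []                              # (clade, taxa sampled by its tree)
        for tree in source_trees:
            found = []
            sampled = clades_of(tree, found)
            characters.extend((clade, sampled) for clade in found)
        all_taxa = sorted(set().union(*(s for _, s in characters)))
        matrix = {taxon: "" for taxon in all_taxa}
        for clade, sampled in characters:
            for taxon in all_taxa:
                if taxon not in sampled:
                    matrix[taxon] += "?"             # missing from this source tree
                else:
                    matrix[taxon] += "1" if taxon in clade else "0"
        return matrix

    # Two small trees overlapping in "fox" and "weasel":
    t1 = (("fox", "weasel"), "bear")
    t2 = (("weasel", "otter"), "fox")
    for taxon, row in mrp_matrix([t1, t2]).items():
        print(taxon, row)    # e.g., bear 01??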

    Supertrees have yet to incorporate as many species as Donoghue envisions, but even when small, these trees can highlight which groups are well studied and which have conflicting phylogenies. For example, one supertree of mammals, which synthesized data from 315 papers and 400 phylogenetic trees, supported most of the existing mammalian classifications but questioned the relationship between megabats and microbats and also the ancestry of insectivores (Science, 2 March 2001, p. 1786).

    A supertree's overview may also help make sense of a particular family tree. For example, for 50 years, biologists have assumed that foxes and weasels—diminutive carnivores—split into an unusually large number of species because their small size lets them exploit more parts of their environments, says Gittleman. But a supertree analysis “found very little support for this hypothesis,” he says. The most diverse groups—those with the densest “foliage”—did not have the smallest mammals. Instead, he theorizes that faster rates of reproduction account for the species-richness of these animals, an idea that has yet to be tested.

    But this method, too, has its detractors. “Supertrees don't have much value,” says Ward Wheeler, an invertebrate systematist at the American Museum of Natural History in New York City. He sees them as summaries of summaries, two steps removed from real data.

    That's why some researchers are using another meta-analysis approach that digs deeper than the supertree. Called the supermatrix, this approach merges the original data behind a host of smaller trees. A supermatrix may list 100 or more species on one axis and upward of 10,000 traits on the other; the information is usually gathered from previous studies and includes both morphological and molecular traits. A computer then looks for patterns in all these data that reveal the relationships among species. “It's a way of incorporating all the information into one picture of evolution and diversification,” says Wheeler.
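    Assembling a supermatrix amounts to concatenating the character blocks from separate studies and marking characters a study never scored as unknown. A minimal sketch (our illustration; the species and characters are invented):

    # Minimal supermatrix assembly: concatenate character blocks from separate
    # studies, filling "?" where a study did not score a given species.

    def supermatrix(blocks):
        # blocks: list of dicts mapping species -> character string
        species = sorted(set().union(*blocks))
        combined = {}
        for sp in species:
            parts = []
            for block in blocks:
                width = len(next(iter(block.values())))   # characters in this block
                parts.append(block.get(sp, "?" * width))
            combined[sp] = "".join(parts)
        return combined

    morph = {"fox": "0110", "weasel": "0111", "bear": "1000"}  # morphological traits
    mol   = {"fox": "ACGT", "otter": "ACGA"}                   # a DNA fragment
    for sp, row in supermatrix([morph, mol]).items():
        print(sp, row)   # e.g., bear 1000????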

    Brave new taxonomy

    All this work on very large data sets is producing larger and larger trees. But even these big trees are not enough to satisfy a few researchers, who have their eyes fixed on the grand goal of an all-encompassing tree of life. Traditional systematists start with a small group of organisms and work to fill in the neighboring branches and twigs of the tree. But an increasing number of researchers envision methods that tackle organisms from all or most of the tree of life's branches at once—and that don't even require a systematist's expertise to apply.

    Hebert's species identification plan is one example. He envisions a handheld device that requires just a small sample, such as an insect leg, to read the sequence of at least 645 bases of the mitochondrial gene cytochrome oxidase I, which is involved in energy production. According to Hebert, fewer than a dozen base substitutions in this gene can distinguish one species from another, yet there's enough variation in the sequence to allow for the discrimination of hundreds of millions of species. He also contends that the gene's sequence doesn't appear to vary among individuals of the same species, most likely because the gene plays some essential but species-specific role.

    If those claims pan out, the gene could be a powerful tool for classification. The bar-code device could compare sample DNA to a database of sequences from thousands of other species. If there was no match, then the sample would become a new addition to the database, thereby reducing the number of unknown organisms by one.
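    In outline, that lookup step is nearest-match classification against a reference library. The sketch below is our illustration of the idea, not Hebert's software; the sequences, names, and threshold are invented.

    # Sketch of a bar-code lookup: compare a sample's cytochrome oxidase I
    # fragment against a reference library and report the closest species,
    # or flag the sample as new if nothing lies within a small threshold.

    def distance(a, b):
        # Count differing bases between two aligned, equal-length sequences.
        return sum(x != y for x, y in zip(a, b))

    def identify(sample, library, max_diff=12):
        best = min(library, key=lambda sp: distance(sample, library[sp]))
        if distance(sample, library[best]) <= max_diff:
            return best
        return None   # no match: a candidate addition to the library

    library = {"moth A": "ACGTACGT", "moth B": "ACGTTTGT"}  # toy 8-base "bar codes"
    print(identify("ACGTACGA", library, max_diff=2))        # -> moth A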


    David Hillis has incorporated 3000 species into this circular tree of life, best viewed when enlarged to a 1.5-meter diameter.


    No such device exists yet, but Hebert claims that results to date are promising. In a report published online early this year by the Proceedings of the Royal Society of London B, Hebert and his team determined the distinguishing cytochrome oxidase I sequence for seven phyla, eight insect orders, and 200 species of moths. They then used these sequences to classify DNA from 300 organisms from all walks of nature. It worked, says Hebert: The bar code correctly identified each of 150 moths of various species, and almost all the other test organisms wound up in the proper group. Working with Daniel Janzen at the University of Pennsylvania in Philadelphia and John Burns of NMNH, Hebert is now tackling a group of butterflies called skippers, some of which look enough alike to be one species. Burns suspects that they represent about a dozen species, distinguished by slight color differences and feeding preferences. In a pilot project, the bar codes differentiated the various skippers while placing individuals of the same species together.

    Hebert has managed to get a few other systematists excited about the potential of his scheme. “There's enormous value in gathering a library of these sequences,” says Janzen, who is trying out bar codes with Hebert's help. Rutgers's Grassle, too, has embarked on bar coding, as part of the Census of Marine Life project. “If we can do this on a very large scale, it may accelerate the study of [neglected] species,” he says.

    Hidden diversity.

    Researchers have only just begun to catalog the many kinds of microbes on the tree of life.


    Hebert is calling for taxonomists to use the cytochrome oxidase I gene to develop bar codes for museum specimens throughout the world. NMNH is considering asking Congress for $15 million in the 2005 budget for a 5-year bar-coding project. “We're very encouraged by the resolution [of the technique],” says museum director Cristián Samper. “We feel it's an important tool.”

    But there are many, many skeptics, including UC Berkeley's Brent Mishler, who think it's highly unlikely that the cytochrome oxidase I gene can discriminate among all species. Mishler calls the approach “basically terrible and arbitrary,” and he worries that spurious counts of biodiversity will result.

    Hebert predicts that a global bar-code project would require $1 billion over the next 20 years. So far, NSF officials are lukewarm at best. “It's not research,” says NSF's Rodman, who notes that bar codes are still unproven.

    As Hebert tries to drum up support for his vision, other researchers are also dreaming of new devices to automate taxonomic tasks. UT Austin's Hillis envisions a handheld species analyzer that would sequence parts of several genes, not just one, and offer an instant phylogenetic analysis as well as a species designation. He plans to use some genes that mutate relatively quickly, which help identify species, and some that evolve slowly, which reveal ancient mutations and evolutionary history. “[Together] they allow you to place an organism within the larger context of the tree of life,” says Hillis. “Even if you don't know about something, the [method] instantaneously classifies it.”

    Someday, both Hebert and Hillis hope to have devices that fit in the palm of the hand and require little expertise to operate. “People could go out anywhere and identify any organism,” says Hillis. Such scanners may sound like science fiction. In fact, both Hebert and Hillis liken their imagined devices to tricorders, the whirring handheld scanners that instantly classified alien life forms on the Star Trek series. Better yet, says Hillis, call them biocorders.

    Very few of their colleagues expect these 21st century tricorders to show up on the lab bench anytime soon. “I am a little doubtful whether it will come down to something that simple,” says NMNH's Burns.

    Yet whether or not Hebert and Hillis's dreams ever become reality, the existence of such radical schemes signals a field on the edge of transformation. There's a new sense that knowledge of the complete tree of life, once considered an all-but-unreachable goal, may indeed be within biologists' grasp. Three years ago, NSF hosted information-gathering workshops to decide how to accelerate systematic work. Rodman was surprised to find that the several dozen participants were willing to commit to a deadline. They predicted that with enough resources, there could be a rough draft of a tree of life in the next 10 to 20 years, although getting to the species level would take longer. “I think we can have a tree everyone can live with, but that there will be fine-tuning for decades and occasionally some startling shifts,” Schram predicts. That fine-tuning and those shifts are part of the process, adds Littlewood. Otherwise, “we wouldn't be engaged in the science.”

  14. Drafting a Tree

    1. Elizabeth Pennisi

    Systematists often say the tree of life is in good shape. But ask them to illustrate this notion with a single diagram, and most throw up their hands in frustration. All the same, a few intrepid souls have made the attempt. For example, Joel Cracraft, an ornithologist at the American Museum of Natural History in New York City, and botanist Michael Donoghue of Yale University in New Haven, Connecticut, put together a tree based on molecular, morphological, and fossil data provided by speakers at a Tree of Life conference a year ago.

    Science used their tree as the starting point for the diagram below. Where possible, we used the common names of familiar organisms within a group, followed by the scientific name of the group as a whole. Circles pinpoint a few branch points, and dashed lines indicate uncertain groupings. The tree represents the 80,000 living organisms now classified, but many groups, including many bacteria, have been left off because of lack of space. Thus it gives only a sketchy impression of microbial diversity.

    To help identify unknown or controversial areas, Science polled about a dozen additional systematists; the shaded yellow sections reflect their comments. The shading, intended to convey the mottled state of our knowledge, indicates a general sense of ongoing research and should not be used to infer particular phylogenetic problems.

    Given the diversity of views, we realize that specialists may take issue with parts of the tree. The point is to show that although the tree is well along, it is very much a work in progress. Even so, it makes clear the complexity of life on Earth.


  15. Plants Find Their Places on the Tree of Life

    1. Elizabeth Pennisi

    Researchers trying to piece together the tree of animal life are hacking through dense foliage, barely able to see the top branches, never mind the distant twigs (see main text). But their colleagues studying plants have many of their phylogenetic trees neatly pruned and manicured. Whereas the animal and microbial types are wrestling with new techniques and beginning to talk about collaborations, botanists have already embraced the culture and methods of big science.

    Over the past decade, 200 plant taxonomists from a dozen countries have been analyzing and refiguring the evolutionary history of their favorite flora in an effort called Deep Green. For other systematists, the endeavor has become one to aspire to (Science, 13 August 1999, p. 990). “The plant people have made major advances as far as I can tell” and are moving faster than animal-centric researchers, says Frederick Schram, a barnacle expert at the University of Amsterdam.

    By coming together, Deep Green researchers were able to identify poorly studied groups and holes in the data. They then parceled out the work to fill those holes. Although it sounds simple, Deep Green depended on the vision of several systematists who rallied their colleagues, says James Rodman, a plant systematist at the U.S. National Science Foundation (NSF). Others say the plant people are succeeding because their field and their trees—filled with a mere 300,000 species—are smaller. As a result, it seems “that the plant people have a good handle on all sorts of data, almost to the point of being truly comprehensive,” says John Gittleman, an evolutionary biologist at the University of Virginia, Charlottesville.

    For the next step, the Deep Green researchers have been busy figuring out the best way to combine their data into that one tree—“our most accurate representation of the history of green plants,” says Charles O'Kelly, a systematic biologist at the Bigelow Laboratory for Ocean Sciences in West Boothbay Harbor, Maine. The work of wrestling their very large data sets into a single tree is part of a 5-year, $2.7 million NSF grant involving six U.S. teams. At the same time, there's an increased push to put all these data into public databases.

    Thus for plant taxonomists the next 5 years promise to be a data gold rush. Several other projects, independently funded but interconnected, are delving into less well-covered aspects of the field. One, called Deep Gene, will help plant experts make use of plant genomics information and vice versa. “Deep Time” researchers will blend plant fossil finds with modern botany. And “Deepest Green” delves into the base of the plant tree to sort out the relationships among the green algae.

    Green and growing.

    On the Deep Green Web site, this tree of plants gets more detailed with a click of the mouse.

    Such a multipronged, coordinated assault makes some animal systematists green with envy. “I think that the animal [researchers] would profit by a similar approach rather than the winner-take-all rodeo that seems to prevail right now in zoology,” says Schram. If his colleagues' competitive spirits could be converted to cooperation, he says, “in a few years, we could achieve wonders.”
