News this Week

Science  21 Mar 2008:
Vol. 319, Issue 5870, pp. 1598



    Proposal to 'Wikify' GenBank Meets Stiff Resistance

    1. Elizabeth Pennisi

    When Thomas Bruns turns to GenBank, the U.S. public archive of sequence data, to identify a fungus based on its DNA sequence, he does so with some trepidation. As many as 20% of his queries return incorrect or incomplete information, says Bruns, a mycologist at the University of California, Berkeley. In a letter on page 1616, Bruns, Martin Bidartondo of Imperial College London, and 250 colleagues who work on fungi urge GenBank to allow researchers who discover inaccuracies in the database to append corrections. GenBank, however, says such a fix would cause more problems than it solves.

    The letter comes from a relatively small research community concerned primarily with making sure that the species from which a sequence came is correctly identified. But “the problem extends far beyond fungi, to much bigger—and [more] recognizable—creatures,” says James Hanken, director of the Museum of Comparative Zoology at Harvard University. Other sorts of errors—such as inaccurate information on a gene's structure or on what its proteins do—also plague the database.

    Incorrect data are more than just an inconvenience. Analyses of new data depend in a large part on comparisons with what's already in GenBank—be it right or wrong. Computers predict gene function, for example, based in part on similarities with known genes. And Bruns and others ferret out species' identities—often of organisms otherwise indistinguishable—by looking for matches to named GenBank entries. Under the current setup, “error propagation is all too likely,” says Thomas Kuyper, a mycologist at Wageningen University in the Netherlands.

    Tangled mess.

    The fungal threads (white fluff) on these pine roots require GenBank comparisons to identify.


    What the mycologists are asking for is a scheme like those used in herbaria and museums, where specimens often have multiple annotations: listing original and new entries side by side. It would be a community operation, like Wikipedia, in which the users themselves update and add information, but not anonymously.

    Up and up.

    Critics fear that GenBank's rapid growth is leading to error propagation.


    GenBank's managers are dead set against letting users into GenBank's files, however. They say there already are procedures to deal with errors in the database, and researchers themselves have created secondary databases that improve on what GenBank has to offer. “That we would wholesale start changing people's records goes against our idea of an archive,” says David Lipman, director of the National Center for Biotechnology Information (NCBI), GenBank's home in Bethesda, Maryland. “It would be chaos.”

    The standoff over the quality of GenBank's data is in part a product of the database's success—and the issues are only going to get more intense. Researchers have been contributing genes, gene fragments, even whole genomes to GenBank since 1982, making it an incredibly valuable resource for many thousands of investigators worldwide. Today, GenBank provides 194.4 publicly accessible gigabases, a number that will double in 18 months, thanks in part to cheaper, faster sequencing technologies and a rise in “environmental” sequencing: mass sequencing of all the DNA in soil, skin, or other samples.
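As an illustrative back-of-the-envelope projection (assuming steady exponential growth at the 18-month doubling time cited above; these are not official NCBI figures), the database's growth compounds quickly:

```python
# Illustrative projection of GenBank's size, assuming steady exponential
# growth with the 18-month doubling time cited in the article.
# A sketch only, not an official NCBI projection.

def projected_gigabases(current_gb, months, doubling_months=18):
    """Size after `months` months, growing with a fixed doubling time."""
    return current_gb * 2 ** (months / doubling_months)

start = 194.4  # publicly accessible gigabases in early 2008
for months in (0, 18, 36, 54):
    print(f"after {months:2d} months: ~{projected_gigabases(start, months):.0f} Gb")
```

At that pace the archive would pass 1.5 terabases within about four and a half years, which is why curation by a handful of staff scientists cannot keep up.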

    From early on, researchers recognized that errors would be inevitable (Science, 15 October 1999, p. 447), and although GenBank runs some quality-control checks on incoming sequences, it cannot catch many mistakes. GenBank has just one mycologist on staff, for example, but 150,000 fungal sequences were deposited this past year. “That's not something that a single person can curate,” says Lipman.

    GenBank's creators consider the database a “library” of sequence records that, like books or journal articles, belong to the authors and therefore can be changed only by the submitters of that data. A note indicates when a record has been updated and points to the archived original. Stephen O'Brien, who does comparative genomics at the National Cancer Institute in Frederick, Maryland, argues that author privilege is necessary. “One of the reasons GenBank is so doggone useful and comprehensive is that nobody edits or micromanages it except the authors,” he says. “This makes for downstream errors but almost universal buy-in.”

    Lipman says authors do take the time to make corrections. GenBank gets about 30 such messages a day, he points out. But others disagree, citing case after case in which problems were not fixed. Often the submitters have moved on to other projects and never get around to making the changes, says Steven Salzberg, a bioinformaticist at the University of Maryland, College Park. And, he adds, the big sequencing centers—which churn out genome after genome with preliminary annotation—are the worst offenders: “They won't let anybody touch their GenBank sequences, and they won't change it, for whatever reason.”

    Lipman points out that other researchers improve on GenBank's data in a variety of ways. NCBI, for example, curates genes, along with other interesting DNA and RNA sequences, and puts them in a database called RefSeq that is updated as new information about these sequences comes along. And researchers focused on particular groups of organisms have set up their own secondary databases, such as FlyBase for the fruit fly genomes and TAIR for Arabidopsis, that offer cleaned-up GenBank data, along with other genomic information and tools for analyzing them. And, Lipman notes, NCBI even offers a way for researchers to do third-party annotation. But it's not the third-party annotation scheme the mycologists want.

    For starters, GenBank has set a high bar for accepting changes: Entries must be backed by a publication. Annotations concerning a gene's function, for example, require published experimental data about that gene's protein or a related one. This discourages legitimate improvements, says Carol Bult, a geneticist at the Jackson Laboratory in Bar Harbor, Maine, because often a proposed correction doesn't justify an entire publication. Furthermore, an indication that additional annotation exists is deeply buried in the original sequence record.

    Although he's adamant that NCBI is not going to “wikify” GenBank, Lipman says he's eager to work with mycologists to come up with a solution, possibly through RefSeq. Salzberg thinks NCBI will eventually come up with a way to maintain GenBank as an archive while allowing greater community involvement in annotation. “I think it will be solved eventually,” he says. “But it's not clear how it will be solved.”


    Millennium Ancestor Gets Its Walking Papers

    1. Ann Gibbons

    Ever since its discovery in 2000, the 6-million-year-old fossil known as the Millennium Ancestor has been in a sort of scientific purgatory, with researchers disagreeing about its identity as one of the earliest ancestors of humans or other apes. Now, an independent team's analysis of this primate's thighbones on page 1662 concludes that its species, Orrorin tugenensis, was indeed an early ancestor of humans. But it challenges a controversial proposal that Orrorin gave rise to our genus, Homo, directly.

    Walking the walk.

    A new study says the primate Orrorin really did walk upright in Kenya 6 million years ago.


    The new study confirms that Orrorin walked upright—a defining characteristic of being a hominin, the primate group that includes humans and our ancestors but not other apes. “The data provide really strong confirming evidence that it was bipedal about 6 million years ago, which reinforces its status as a hominin,” says author Brian Richmond, a paleoanthropologist at George Washington University in Washington, D.C.

    Richmond got permission to measure O. tugenensis in 2003, 3 years after the fossils were discovered in the Tugen Hills of Kenya by Martin Pickford of the Collège de France and Brigitte Senut of the Muséum National d'Histoire Naturelle in Paris and their co-workers. The pair proposed that it was a hominid based on features in the teeth and the upper thighbone, or femur. But that bone is incomplete, and many researchers had reservations about Pickford and Senut's analysis (Science, 24 September 2004, p. 1885) and about controversy surrounding the pair's permits to work in the Tugen Hills (Science, 13 April 2001, p. 198).

    Richmond took eight measurements of the femur, which has been stored in a bank vault in Nairobi, using calipers while a burly bodyguard watched. He plugged these measurements into standard statistical analyses that calculated the size and shape of the bone and compared them with those from about 300 thighbones from great apes and fossil and modern humans.

    The analysis suggests that Orrorin is most closely related to australopithecines, a diverse group of hominids that arose about 4 million years ago in Africa. That's in contrast to Pickford and Senut's proposal that it was a direct ancestor of our genus, which would have pushed australopithecines off the line to modern humans. “Frankly, I was surprised to see how similar it was to australopithecines, since it was twice as old,” says Richmond. The new analysis “goes a long way toward resolving the mystery of Orrorin,” says paleoanthropologist Henry McHenry of the University of California, Davis. “Few of us agreed that Orrorin gave rise to Homo [directly]. … This study helps lay that hypothesis to rest.”

    Richmond's analysis shows that the femur was adapted for upright walking, and he proposes that this set of adaptations persisted from Orrorin's time with only minor changes through all the australopithecines, until early Homo evolved a new hip and thigh configuration. “The overall mechanics of walking appear to be pretty darn similar from 6 million years to 2 million years ago,” says William Jungers of Stony Brook University in New York, Richmond's former thesis adviser.

    Pickford and Senut say they are glad to see confirmation of their proposal that Orrorin walked upright. But they still argue that other features not included in Richmond's analysis, such as the tilt of the bony head of the femur and a bump called the lesser trochanter, link it more closely to Homo than australopithecines.

    Many researchers say the new analysis is the “most convincing evidence” so far that Orrorin walked upright, but they are more skeptical that early hominins had a single type of upright walking. “The situation was much more complex,” says anatomist Christopher Ruff of Johns Hopkins University in Baltimore, Maryland. To resolve this debate, says anatomist Owen Lovejoy of Kent State University in Ohio, researchers should also look at the pelvis, back, foot, and ankle of other early hominins, still under analysis.


    Pfizer Denied Access to Journals' Files

    1. Jocelyn Kaiser

    A federal judge in Chicago last week denied a company's efforts to obtain confidential peer-review documents about arthritis drugs it manufactured. The company, Pfizer, sued for files from three major medical journals. It lost against two in Illinois and is waiting for a decision in Massachusetts on the third.


    JAMA Editor-in-Chief Catherine DeAngelis convinced a U.S. judge that peer review must remain confidential.


    Pfizer's actions stem from a lawsuit in which the company was sued by patients who took the drugs Bextra and Celebrex, which have been linked to serious side effects. In January, Pfizer filed a motion in Massachusetts to force the New England Journal of Medicine (NEJM) to comply with subpoenas for peer-review documents from 11 studies the journal had published on the drugs. Pfizer also sued in Illinois to get peer reviews from the Journal of the American Medical Association (JAMA) and the Archives of Internal Medicine, which together had also published 11 studies on the drugs. Pfizer said data from accepted and rejected studies could be useful for its defense.

    Attorneys for the three journals argued that releasing confidential reviews would compromise the anonymity of peer review. [The outgoing editor-in-chief of Science, Donald Kennedy, filed an affidavit supporting NEJM's position (Science, 22 February, p. 1009)]. In an affidavit, JAMA Editor-in-Chief Catherine DeAngelis argued that if the courts routinely allowed such subpoenas, it could result in a “severe decline” in the number of peer reviewers and affect the journals' ability to “properly discharge their mission to advance the betterment of public health.”

    The U.S. court in Chicago agreed with DeAngelis. “Although her statements are quite dramatic, it is not unreasonable to believe that compelling production of peer review documents would compromise the process,” wrote Judge Arlander Keys. The court also found that Pfizer had not adequately explained how unpublished information could help it defend itself. Keys's conclusion: “Whatever probative value the subpoenaed documents and information may have is outweighed by the burden and harm” to the journals.

    “We're delighted,” says DeAngelis. “If you interfere with the process and the confidentiality, you might as well pack it up and go home.”

    A handful of such cases have come up before, such as a 1994 subpoena of NEJM seeking peer-review comments as part of breast-implant litigation. Journals have usually prevailed, but the judge in each case must weigh the arguments anew, notes Debra Parrish, an attorney in Pittsburgh, Pennsylvania, who specializes in science law. The JAMA decision “is important,” she says.

    The NEJM case appears to be winding down as well: At a hearing last week, Pfizer narrowed its request to the peer-review comments returned to authors, according to NEJM's Boston attorney, Paul Shaw. He expects a decision within days.


    EPA Adjusts a Smog Standard to White House Preference

    1. Erik Stokstad

    EPA tightened its health standard but less than its science advisers urged.


    In December 2005, Stephen Johnson dunked himself in hot water. Johnson, the administrator of the U.S. Environmental Protection Agency (EPA), decided to discard advice from a scientific advisory committee when he set a major air-quality standard for soot. Scientists and environmental groups were outraged (Science, 6 January 2006, p. 27). Last week, Johnson did it again with ozone, the main component of smog. And this time, the hand of the White House was plain to see. The Administration is “flouting the law” by not protecting public health adequately, says epidemiologist Lynn Goldman of Johns Hopkins University Bloomberg School of Public Health in Baltimore, Maryland, who was assistant administrator for the EPA's Office of Prevention, Pesticides, and Toxic Substances during the Clinton Administration. “It's tragic.”

    The Clean Air Act requires EPA to review the standards for six major pollutants, including soot, also known as particulate matter, and ozone, every 5 years. The agency last did this for ozone in 1997, so the American Lung Association (ALA) and other groups sued and won a deadline of 12 March for the agency to issue a new standard. These standards influence the regulation of power plants, vehicles, and other sources of the chemicals that react with sunlight to become ozone.

    The lobbying leading up to the decision was heavy. Industry groups told Johnson to leave the primary ozone standard, which is designed to protect public health with a margin of safety for sensitive groups such as asthmatics and children, at 80 parts per billion (ppb). ALA and other groups, including EPA's Children's Health Protection Advisory Committee, pushed for a standard of 60 ppb. And EPA's Clean Air Scientific Advisory Committee (CASAC) unanimously recommended that the standard not exceed 70 ppb, citing “overwhelming scientific evidence.”

    On 12 March, Johnson announced that he had “carefully considered” the scientific advice but was tightening the primary standard to just 75 ppb. “I followed my obligations; [I] adhered to the law, [and] I adhered to the science,” he said during a telephone press conference. After comparing models of exposure and analysis of uncertainty by EPA staff scientists, Johnson notes in the final rule that “there is not an appreciable difference, from a public health perspective,” between 70 and 75 ppb. Public health advocates disagree, citing an analysis in which EPA concluded that 70 ppb would mean 780 fewer deaths a year, 280 fewer heart attacks, and 720 fewer visits to emergency rooms for asthma attacks.

    Johnson veered from his scientific advisers again when he tightened the secondary standard. This standard is intended to protect “human welfare,” a broadly defined phrase in the Clean Air Act that includes effects on soil, vegetation, visibility, and property. The secondary and primary standard for ozone have been measured in the same way—a daily 8-hour average—and set at the same levels, but CASAC recommended that EPA change the method of measurement to better protect trees, crops, and other vegetation from the cumulative damage of exposure to ozone throughout the growing season. In EPA's preliminary rule, released for public comment last year, the agency agreed, although it preferred a standard of 21 parts-per-million hours (hourly concentrations summed over 3 consecutive months) rather than the 15 ppm-hours that CASAC recommended.
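A cumulative standard of this kind adds up hourly ozone concentrations across the growing season rather than averaging them. The sketch below uses a simple threshold sum to show how hourly readings become "ppm-hours"; the index CASAC actually favored (W126) weights each hour sigmoidally, and the season record here is hypothetical:

```python
# Sketch of a cumulative seasonal ozone index in "ppm-hours":
# hourly concentrations summed over consecutive months. The index
# CASAC recommended (W126) weights each hour sigmoidally; this
# simpler threshold sum just illustrates the ppm-hours idea.

def cumulative_ppm_hours(hourly_ppm, cutoff=0.06):
    """Sum hourly ozone readings (in ppm) at or above `cutoff`."""
    return sum(c for c in hourly_ppm if c >= cutoff)

# Hypothetical 3-month record: 90 days with 8 daylight hours at 0.07 ppm.
season = [0.07] * (90 * 8)
print(f"{cumulative_ppm_hours(season):.1f} ppm-hours")
```

Because damage to vegetation accumulates over months of exposure, such a summed index can flag a growing season as harmful even when no single 8-hour average exceeds the primary standard.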

    Less than a week before the final rule was due, EPA received a memo from the White House, which is now part of the public record of the regulation. Susan Dudley, who heads regulatory affairs at the White House's Office of Management and Budget, objected to changing the method of measurement for the secondary standard. She argued that EPA had focused exclusively on vegetation and ignored other impacts, such as those on “economic values.” EPA apparently interpreted this as a request to consider the economic costs of changing the standard, which by law it cannot do. In a memo the next day, EPA Deputy Administrator Marcus Peacock defended the agency's position.


    Critics say EPA's standard won't adequately protect plants, such as this maple, from ozone.


    Shortly before the press conference announcing the final ruling, however, EPA received another memo from Dudley saying that President George W. Bush had sided with her. This memo had not yet been placed in the public docket by press time, but Science obtained a copy from a public advocacy group in Washington, D.C. EPA officials postponed the press conference for 5 hours while they rewrote the rule, keeping the measurement method and level of the secondary standard equivalent to the new primary standard. CASAC member Richard Poirot of the Vermont Department of Environmental Conservation's Air Pollution Control Division in Waterbury suggests that “the White House was concerned about the dangerous precedent of having any environmentally focused secondary standard at all.”

    During the press conference, Johnson also announced that he plans to ask Congress to amend the Clean Air Act. Among the changes he sketched briefly, Johnson would like EPA to be able to consider costs and feasibility of implementation when it sets air standards. In a statement, ALA called those ideas “completely unacceptable.” It's unlikely that the Democrat-controlled Congress would make such changes. Meanwhile, Henry Waxman (D-CA), chair of the House Oversight and Government Reform Committee, plans to hold a hearing on the ozone standard on 10 April. Expect more hot water.


    Wisconsin Stem Cell Patents Upheld

    1. Constance Holden

    Scientists are still grumbling about the Wisconsin Alumni Research Foundation's grip on stem cell patents—a hold strengthened by rulings this and last month affirming WARF's patents on primate and human embryonic stem (ES) cells. But there is a widespread feeling that challenges to WARF's patents and continuing public pressure have had a desirable effect. “I think [WARF has] been moving toward what I would consider to be a more reasonable policy” with regard to giving scientists access to the cells, says bioethicist LeRoy Walters of Georgetown University in Washington, D.C.

    WARF, affiliated with the University of Wisconsin, Madison, holds three patents arising from work done in the 1990s by Wisconsin researcher James Thomson. On 25 February, the U.S. Patent and Trademark Office (PTO) upheld a 2006 patent that describes a method for cultivating pluripotent cells. Then on 10 March, it upheld patents, granted in 1998 and 2001, on nonhuman primate and human ES cells, which apply to all such cells regardless of how they are derived.

    Two citizens' groups first challenged WARF's patents in October 2006, claiming that the method for deriving the cells was “obvious” and could have been successfully applied by anyone equipped with the necessary resources. Last April, PTO agreed to reexamine them (Science, 13 April 2007, p. 182). In its final decision, PTO rejected the challengers' arguments, saying that “[I]n view of the unpredictability” associated with both the isolation and long-term sustainability of primate ES cells, “the present claims are not obvious. …”

    WARF Managing Director Carl Gulbrandsen, who had predicted a “tough fight,” proclaimed WARF to be “heartened that … the patent office reached the correct conclusion.”

    During the course of the patent examination, WARF eased up on proprietary claims: It no longer demands licensing fees from companies that do university-based research with its cells; and it narrowed its claims to apply only to ES cells derived from fertilized embryos and not pluripotent cells from other sources, such as clones or the newly developed induced pluripotent stem (iPS) cells. These actions mean that despite the PTO ruling, “we think we've already won a major victory with these patent challenges,” says John Simpson of the Foundation for Taxpayer and Consumer Rights in Santa Monica, California, the “requestor” in the dispute.

    Nonetheless, the groups plan to appeal the decision on the one patent that can be appealed under PTO rules. “Frankly, I don't trust them to behave well unless we keep up the pressure,” says stem cell researcher Jeanne Loring of the Scripps Research Institute in San Diego, California, who supports the patent challenge.

    Harvard University stem cell researcher Chad Cowan agrees with others that the case has “caused WARF to finally wake up to the fact that they needed to be a lot more engaging with academic scientists.” But Cowan still thinks the patents are a drag on the field. The recent early successes with iPS cells, which can be grown without the use of eggs or embryos (Science, 1 February, p. 560), will only intensify the interest in ES research, he says. ES cells are still needed to validate iPS cells, and even if iPS cells prove viable substitutes for ES cells in research, some scientists believe they will never be suitable for cell therapy.

    Alan Trounson, president of the California Institute for Regenerative Medicine in San Francisco, says his biggest concern is down the road, because “the patents could delay developments” of therapies with ES cells. He says it would be bad for everyone if a biotech company got a monopoly on certain therapies.

    Meanwhile, as iPS patent applications flood into PTO, future patent issues will no doubt become even more complicated.


    Showdown Looms Over a Biological Treasure Trove

    1. Richard Stone*
    1. With reporting by Hao Xin.

    BEIJING—Can rubber plantations and tropical rainforest coexist? How about hydropower dams and a mountain-ringed refuge for golden monkeys, gibbons, and half of the rhododendron species on Earth? In impoverished Yunnan Province in southwestern China, a confrontation is brewing between economic growth and habitat preservation—and authorities are sending mixed signals about their intentions.

    Awaiting its fate.

    The Nu River, which wends through diverse ecosystems in southern China, is being considered for a massive dam-building program.


    Conservation got a boost at a conference in Kunming last month and on the sidelines of a major political gathering in Beijing last week, when Yunnan Governor Qin Guangrong unveiled a $986 million, 3-year initiative to protect biodiversity in the province's northwest. But also last month, according to news accounts, work quietly commenced on a controversial series of hydroelectric dams on the Nu River.

    The ecological situation may be even more precarious in southern Yunnan's Xishuangbanna region. There, two-thirds of a unique rainforest has been lost over the past 30 years, largely to rubber plantations, two new studies report. Yet last week, Xishuangbanna's top official vowed to expand his region's rubber industry.

    To ecologists, the northwest initiative may be the bright spot in an otherwise grim picture: It would protect biodiversity in an 80,000-square-kilometer area fed by three rivers—Nu (Salween), Lancang (Mekong), and Jinsha—that wend through deep gorges, creating a patchwork of ecosystems. The Three Parallel Rivers area amounts to 1% of China's territory but has a third of the country's native species, including three kinds of gibbons found nowhere else in the world. Although the region boasts three national nature reserves, logging on unprotected land is rampant. “The destruction of forest even on the edges of the nature reserves has been going on for a long time,” says Sun Wei-Bang, executive director of Kunming Botanical Garden.

    Although details are sketchy, the biodiversity initiative plans to expand nature reserves in northwestern Yunnan, reforest degraded land, and fund research on energy and environmental protection. It will also provide unspecified compensation to villagers and businesses affected by natural-resource extraction. The two main industries in northwestern Yunnan are mining and hydropower.

    “We must strike a balance between environmental protection and the need for development,” Qin told China Daily last week. “Restricting development is not a solution.”

    Some observers speculate that the Yunnan initiative is designed to appease critics of the Nu River hydropower project. In 2003, a consortium led by China Huadian Corp., a holding company that manages several regional utilities, proposed erecting 13 dams on the Nu with a combined capacity of 20,000 megawatts; the current plan is not publicly available. A chief argument of critics is that as reservoirs behind the dams fill up, flooding and landslides would imperil habitats. Tens of thousands of people would be relocated.

    Four years ago, in a decree that delighted conservationists, Premier Wen Jiabao suspended the dam project pending an environmental review. The review was carried out by a research arm of China Guodian Corp., another power holding company. But the report is classified as a state secret. It's unclear whether the central government has given a green light to recent resettlement and earthmoving at the site for the planned Liuku dam, described in local news accounts last month.

    In southern Yunnan, meanwhile, a unique tropical seasonal rainforest is under siege. The Xishuangbanna region—three counties that border Myanmar and Laos—is a few degrees cooler and has less rainfall, on average, than Southeast Asia. A dense fog during the dry season keeps vulnerable dipterocarps and other trees on life support. Tropical seasonal rainforest covered 10.9% of Xishuangbanna in 1976. By 2003, according to satellite imagery, coverage had eroded to 3.6%, representing a loss of nearly 140,000 hectares, ecologist Ma Youxin of Xishuangbanna Tropical Botanical Garden (XTBG) in Kunming and colleagues report in the 20 February issue of Forest Ecology and Management. XTBG ecologist Zhu Hua, in an article last week in the debut issue of the online journal Tropical Conservation Science, lays the blame squarely on the rubber industry. It's not just a matter of plantations razing acreage: Rubber trees are also invading intact forest. “In Xishuangbanna, the area suitable for rubber to grow is exactly the area that's suitable for tropical forest,” says Zhu.

    The only way to save the rainforest is to limit new rubber plantations to land now used for other crops, says Ma. “We've recommended to the local government not to allow farmers to convert rainforest to plantations,” he says. That may be a hard sell. Last week, the Communist Party secretary of Xishuangbanna, Jiang Pusheng, told Yunnan Info Daily that rubber plantations on his turf covered 23.9 million hectares by the end of 2007. “Xishuangbanna will continue to spare no effort to develop its rubber industry,” he said. Nevertheless, local authorities have asked XTBG to draft a plan that would help rein in rubber-plantation expansion, Ma says.

    Four decades ago, the central government began pushing rubber as a way to help pull Yunnan's ethnic melting pot out of poverty. That strategy has largely backfired, as many plantations are now run by people from outside Yunnan, says Zhu: “Most of the money just leaves the province.” An alternative cash crop touted by scientists—teak and other expensive hardwoods—has flopped. “The teak just gets cut down and replaced by rubber,” Zhu says.

    For a region that prides itself on its biological riches, the losses are mounting.


    Expert Panel Lays Out the Path to Algebra--and Why It Matters

    1. Jeffrey Mervis

    The voyage spanned 2 years, 12 public meetings, and 14,000 e-mails. But Larry Faulkner, a chemist and former president of the University of Texas, Austin, has successfully steered the National Mathematics Advisory Panel through some of the roughest waters in U.S. education. The result, out last week, is a 120-page report on the importance of preparing students for algebra, normally taught in the eighth and ninth grades, and its role as a gateway course for later success in high school, college, and the workplace (Science, 7 December 2007, p. 1534).

    The report urges educators to keep it simple: Define a few key topics and teach them until students master them. Along the way, it says, students should memorize basic arithmetic facts and spend more time on fractions and their meaning. How teachers achieve those goals is up to them, Faulkner says, advice that allowed the panel to avoid taking sides in a debilitating, 2-decade-long debate about the appropriate balance between drilling students on the material and making sure they understand what they are doing.

    The 19-member panel was supposed to rely on sound science in its advice to U.S. Secretary of Education Margaret Spellings, but only a relative handful of the 16,000 studies it examined turned out to be useful. The vast majority, says Faulkner, were of insufficient quality, too narrow in scope, or lacked conclusive findings. The literature is especially thin on how to train teachers and how good teachers help students learn.

    Spellings has promised to hold a national summit this year on implementing the panel's 45 recommendations. But the primacy of local control over education could make the federal government more of a cheerleader than a participant.

    Faulkner, whose day job is president of the Houston Endowment, a Texas philanthropy, spoke with Science on 13 March, the day the report was released.


    Q: The report notes that U.S. elementary students do okay on international math tests and that the falloff begins at the end of middle school and accelerates into high school. So why focus on K-8 math?

    L.F.: You can also argue that the falloff reflects the inability of students to handle algebra. If you look at success rates in algebra or proficiency in algebraic concepts, there's ample evidence that students are not succeeding, and our charge is to increase the likelihood that they will succeed.

    Q: Why do so many students have trouble with fractions?

    L.F.: Fractions have been downplayed. There's been a tendency in recent decades to regard fractions to be operationally less important than numbers because you can express everything in decimals or in spreadsheets. But it's important to have an instinctual sense of what a third of a pie is, or what 20% of something is, to understand the ratio of numbers involved and what happens as you manipulate it.

    Q: How could schools lose sight of that?

    L.F.: Well, they did.

    Q: Was the panel disappointed by the overall quality of the existing research?

    L.F.: I think quality is an issue, but that's not all there is. Some of what we examined was topically irrelevant, or the studies were not very generalizable. Some high-quality studies were so narrowly defined that they don't tell you much about what goes on in the classroom.

    It may have to do with what the researchers could do with the money available. So we want to be careful about throwing rocks at people. … We go to great lengths to point out that we think the nation requires a balanced program that includes what I would call smaller scale, pilot-oriented research as well as larger scale investigations that are more analogous to clinical trials in medicine. We found a serious lack of studies with adequate scale and design for us to reach conclusions about their applicability for implementation.

    Q: Should the government be spending more money on this research?

    L.F.: Education research covers a lot of territory, so we don't really know. … When I briefed the science adviser, Jack Marburger, yesterday, I said maybe his office should be thinking about it. He just nodded. We think this is an item that deserves the attention of the federal government. It probably means bigger grants. If you want to get the value, you probably need to pay for it.

    Q: Were you surprised by the dearth of good data on professional development programs?

    L.F.: There's tremendous variation in in-service programs. And the evidence is that many are not very effective. … I think districts should be very careful. Large amounts of money are being spent in this area, and serious questions should be raised.

    Q: What's the panel's view on calculators?

    L.F.: We feel strongly that they should not get in the way of acquiring automaticity [memorization of basic facts]. But the larger issue is the effectiveness of pedagogical software. At this stage, there's no evidence of substantial benefit or damage, but we wouldn't rule out products that could show a benefit. If a product could be demonstrated to be effective on a sizable scale under various conditions, the panel would be interested.

    Q: What message should the next president take from this report?

    L.F.: The most important thing is that success in math is not just about a school subject. It's about the real opportunities it creates for people and for the well-being and safety of society. It's important that we succeed to a better level than we do now.


    Driven to Extinction

    1. Dennis Normile

    Rinderpest, an animal disease that devastated cattle and other animals--and their human keepers--across Eurasia and Africa for millennia, may join smallpox as the only viral diseases to have been eradicated.


    Almost normal.

    With rinderpest gone, African herders can spray cattle for less lethal parasites and diseases.


    For just the second time in history, scientists are on the cusp of declaring that a devastating infectious viral disease has been wiped off the face of the planet. The first was smallpox, officially vanquished in 1979. This time it is rinderpest, an animal disease that has plagued cattle and related animals—and their human keepers—for millennia.

    “There is growing confidence that rinderpest has been eradicated,” declares Peter Roeder, a veterinarian associated with a global rinderpest-eradication program from its inception in 1993 until he retired last year. A disease once endemic throughout Eurasia and Africa has almost certainly been eradicated save for the Somali pastoral ecosystem that straddles the borders of Kenya, Somalia, and Ethiopia. And the latest field-surveillance results, now being reviewed by experts, suggest the virus is gone from there as well.

    If the absence of rinderpest can be confirmed, “it would be a remarkable achievement for the veterinary profession, probably the most significant achievement in its history,” says Roeder. “Rinderpest is such a nasty disease, its eradication would be a huge step forward, particularly for the underdeveloped world,” seconds Tim Leyland, a vet now with the United Kingdom's Department for International Development.

    But nobody is yet declaring victory. An earlier rinderpest-eradication effort was ended prematurely, and the disease came roaring back. This time, Roeder would like to see governments take the extra steps necessary to ensure that the disease is gone for good.

    A natural calamity

    Rinderpest can be devastating. The rinderpest paramyxovirus is one of a family of single-stranded RNA viruses. (The family includes the viruses causing canine distemper and human measles.) It attacks the lymph nodes and the epithelium of the alimentary, respiratory, and urogenital tracts and spreads through infected droplets in the breath and excretions of a sick animal.

    Symptoms start with a fever often missed by herders and vets. Days later, animals stop feeding but become thirsty and restless and labor to breathe. Oozing sores appear in the mouths, nasal passages, and urogenital tracts. Diarrhea sets in, followed by dehydration and wasting and then death 6 to 12 days after symptoms appear. Like human measles, it is a disease primarily of the young; animals that survive an infection are immune for life.

    Rinderpest epidemics have washed across Eurasia since antiquity, typically killing 30% of affected herds. Its catastrophic effect on naive herds and wildlife was horrifically illustrated when cattle shipped from India to feed an Italian army carried the virus to the Horn of Africa in 1889. By 1897, the virus had reached Cape Town, South Africa, killing about 90% of the cattle as well as large proportions of domestic sheep and goats in sub-Saharan Africa. Domesticated oxen died, leaving farmers unable to plow fields.

    The virus also decimated wild populations of buffalo, giraffe, and wildebeest. With herding, farming, and hunting all but gone, mass starvation set in. An estimated one-third of the population of Ethiopia and two-thirds of the Maasai people of Tanzania died of starvation. The rinderpest epizootic also altered the continent's ecological balance by reducing the number of grazing animals, which had kept grasslands from turning into the thickets that provide breeding grounds for the tsetse fly. Human sleeping sickness mortality surged. The rinderpest epizootic was “the greatest natural calamity ever to befall the African continent, a calamity which has no natural parallel elsewhere,” author John Reader wrote in his 1999 book Africa: A Biography of the Continent.

    The virus never became established in the Americas. In the 1920s, Europe managed to eradicate rinderpest by controlling animal movements and slaughtering infected animals. But periodic outbreaks continued to afflict much of Asia and sub-Saharan Africa. Wars, which create demand for livestock brought in from afar to feed troops, and droughts, which bring herders to scarce sources of water, often triggered rinderpest outbreaks. After rinderpest wreaked havoc with food production in Asia and Africa following World War II, the disease became a priority of the fledgling United Nations' agricultural efforts and then the U.N. Food and Agriculture Organization (FAO), which was created in 1945. A vaccine grown in goats became available in the 1950s. And in the 1960s, a British virologist named Walter Plowright developed a live attenuated vaccine that became widely used in rinderpest-eradication efforts.

    Try, try again

    There have been almost as many rinderpest-eradication campaigns as rinderpest pandemics. One effort came close to freeing Africa of the virus. With international funding, the Organization of African Unity (OAU) launched Joint Project 15 (JP15) in 1962. Operating in 22 countries, the strategy was to vaccinate all cattle each year for three successive years and thereafter all calves annually. At first, JP15 seemed like a dramatic success. By the mid-1970s, rinderpest had disappeared from many countries, and outbreaks were local and sporadic. But success led to complacency. Many countries terminated or scaled back expensive vaccination and surveillance programs, and in the late 1970s, the disease resurged, fanning out from two lingering foci of infection: an area on the Mali-Mauritania border in west Africa and from southern Sudan in the east.


    In Asia as well, inadequate vaccination and poor surveillance allowed the virus to spread from several pockets of persistence. By the early 1980s, rinderpest had recolonized a swath of Asia stretching from Turkey to Bangladesh, virtually all of the Middle East, and Africa from Senegal to Somalia and Egypt to Tanzania. Losses from the second African rinderpest pandemic rivaled those of the first a century before. One study estimated 100 million cattle died; the economic loss in Nigeria alone ran to nearly $2 billion. “JP15 showed what you could do with mass vaccination campaigns, but without an endgame, pockets were left and it slipped back” into broad circulation, says William Taylor, a retired veterinary virologist who battled rinderpest in several countries in Africa and Asia. “There was a huge lesson in that,” he adds.

    OAU decided to try again with the Pan-African Rinderpest Campaign, which started in 1987 in 34 African nations with the goal of ridding the continent of the virus. Similar regional campaigns were launched in south and west Asia. Yoshihiro Ozawa, then chief of veterinary services for FAO, says that at about that time, it became clear that cattle exported from India and Pakistan were continually reseeding the rinderpest virus on the Arabian Peninsula, indicating “that a global campaign should be organized.” To stitch together regional and national efforts and share expertise, FAO initiated the Global Rinderpest Eradication Programme (GREP) in 1993. Roeder, then FAO's chief of veterinary services, helped establish the program and was named secretary of GREP in 2000. The program set the ambitious goal of eradicating rinderpest by 2004; this would be followed by a period of intensive surveillance to confirm that the virus was gone by 2010.

    Once again, the program started with mass vaccinations. But to finish the job, “we knew we needed to understand the epidemiological situation,” Roeder says. Traditional epidemiological investigations were augmented by new molecular and serological tools unavailable for previous campaigns. Molecular analyses revealed that there were three lineages of rinderpest—two in Africa and one in Asia—and enabled scientists to track outbreak viruses to their source reservoir. Tests to detect antibodies to the virus helped monitor the effectiveness of vaccination campaigns and—more important for the later stages of the program—look for evidence of the virus when vaccination was stopped.

    Remaining reservoirs

    Evidence soon pointed to a limited number of endemic rinderpest reservoirs, primarily in the herds of often-isolated communities in Ethiopia, Sudan, Yemen, Pakistan, and India. From time to time, the virus would spread from these reservoirs into normally free areas. Roeder notes that the virus has a complicating quirk: It can persist in a mild form, producing symptoms missed even by experienced vets, but it can quickly turn virulent when introduced into a new herd. As mass vaccination campaigns drove the virus back to these reservoirs, GREP decided to intensify vaccination in those foci of infection while stopping it in peripheral areas. With proper surveillance, unvaccinated animals could be the sentinels warning that the virus was expanding its range again.

    The last drop.

    Tom Olaka, a community animal health worker from Uganda's Karamajong community, was among scores who drove rinderpest toward extinction.


    Unfortunately, many of those reservoirs were hard to reach. Leyland, who was working for the United Nations Children's Fund (UNICEF) in southern Sudan in the early 1990s, explains that traditional vaccination campaigns relied on town-based vets or internationally staffed mobile veterinary clinics to make periodic forays into the countryside. “U.N. agencies put fortunes into buying Land Rovers and Land Cruisers to have mobile clinics touring around,” he says. But this didn't work where governmental vet services had broken down or in areas beyond the end of the road. In addition, herders often didn't want to bring their animals to vaccination posts “because it would put them at risk of being raided,” says Christine Jost, a vet from Tufts University's Cummings School of Veterinary Medicine in North Grafton, Massachusetts, who worked with a rinderpest-eradication program in Uganda.

    As an alternative, Leyland says the program adapted community-based animal health approaches previously tried in Afghanistan and other places where traditional vet services were weak or had been disrupted. This included training government vets in afflicted countries in techniques to get reliable epidemiological information directly from livestock farmers, says Manzoor Hussain, a veterinary virologist who worked for the Pakistani government and is now a consultant to FAO. At first, both government vets and farmers were dubious, says Hussain, but it turned out to be enormously useful in tracking the disease.

    GREP also enlisted the help of other international and government agencies. UNICEF's involvement in rinderpest eradication in Sudan, for instance, was born of necessity. That country's long-running civil war had displaced thousands of families. UNICEF wanted to vaccinate Sudanese children against childhood diseases, but many families refused. At the time, Leyland recalls, herders were losing 80% of their calves to rinderpest. Without cattle, there was no milk for the children, and the herders saw no point in vaccinating starving children. “They said, ‘Bring us cattle vaccine first and then we'll let you vaccinate our children,’” Leyland says. Because international agencies would not send their workers into conflict zones, UNICEF workers asked each community to select a trustworthy representative for training in vaccination and other animal health basics.

    The program then got a boost from a new vaccine. The effective workhorse vaccine of the 1960s and '70s had one drawback for use in remote areas of Africa: It had to be refrigerated up to the point of use. In 1990, Jeffrey Mariner, a vet at Tufts University School of Veterinary Medicine, developed an improved freeze-drying process that produced a live attenuated vaccine that retained efficacy at 30°C for at least a month. “This simplified the logistics and opened up options” for vaccination, says Mariner, now at the International Livestock Research Institute in Nairobi.

    By 1992, several labs in Africa were producing the new vaccine. It proved a perfect fit for the community animal health system, which started vaccinating cattle in conflict-ridden south Sudan in 1993. When international organizations pulled their staffs out of the path of advancing troops, community animal health workers continued to operate. Leyland says some would walk 2 or 3 days to reach an operating supply post in a safe area and then walk back, carrying several thousand doses of vaccine in gunnysacks periodically soaked so evaporation would keep the vaccine cool. It was a dangerous job. Several workers were killed when caught crossing the territory of a rival tribe, Leyland says. Still, “it was a hugely successful program” that contributed to eventually squeezing rinderpest out of Sudan, he says.

    The endgame

    As rinderpest became increasingly scarce, eradication experts progressively scaled back vaccination. “Stopping vaccination was the hardest thing we achieved,” says Taylor, then an adviser to the government of India. Livestock farmers and local vets didn't want to run the risk of losing animals again. But Taylor and his colleagues believed vaccination had to cease to determine whether the virus was truly gone, as antibody tests can't distinguish a vaccinated animal from one that has survived infection. There was also a risk that the attenuated virus used in the vaccine might regain its virulence, sparking a new rinderpest outbreak. This happened at least once, in the Amur Region of Russian Siberia in 1998. “This outbreak was thousands of kilometers from the nearest known source of rinderpest virus,” Roeder says. Investigators traced the outbreak to a vaccine-derived virus used in a buffer zone along the borders with China and Mongolia.

    Sri Lanka and Iran reported their last outbreaks in 1994, India in 1995, Iraq in 1996, Saudi Arabia and Yemen in 1997, and Pakistan in 2000. In Africa, Uganda has apparently been free of rinderpest since 1994, Ethiopia and Djibouti since 1995, Tanzania since 1998, and Sudan since at least 2001. The virus was last detected in 2001 in wild buffaloes in Meru National Park in Kenya, which lies on the edge of the Somali ecosystem, the putative last remaining reservoir. For several years, studies in the region have detected antibodies to the rinderpest virus in cattle, but Roeder suspects that this comes from sampling older cattle still carrying antibodies from long-ago vaccinations. The most recent serosurveillance results from late 2007 “strongly support the view that rinderpest is no longer there,” Roeder says. At this point, says Roeder, no one is vaccinating against rinderpest, and no one is even making a rinderpest vaccine, although several labs are keeping stockpiles, and production could be rapidly resumed if necessary.

    One down, others to go?

    With the end of rinderpest in sight, experts are pondering the lessons for other animal health issues. Taylor says that in many ways, rinderpest was “a doable target” because the virus was well-understood, good vaccines existed, and herders, animal health experts, and donors alike agreed on the benefits of a concerted eradication effort.

    Tool of the trade.

    For many community animal health workers, such as this unidentified Sudanese man, an AK-47 was an essential piece of personal protection equipment.


    Other animal diseases that conceivably could be eradicated include foot-and-mouth disease and peste des petits ruminants, a highly virulent disease related to rinderpest that is increasingly affecting sheep and goats in Africa and Asia. Some experts, such as Leyland, note that rather than another eradication campaign, money might be better spent upgrading basic veterinary services in developing countries. “There's a huge amount to do just on ordinary diseases,” he says.

    Before the animal health community moves on, Roeder says that even though he is retired, he is determined to push countries to confirm that the rinderpest virus is gone. For rinderpest-free accreditation by the Paris-based World Organisation for Animal Health, countries must have stopped vaccinating for at least 2 years and during that time had no outbreaks or evidence of infection, as documented by an adequate surveillance program. Some 30 countries with histories of rinderpest outbreaks scattered across central Asia, the Middle East, and Africa still need to be accredited, he says. If the disease does emerge from some obscure pocket of infection, it would be devastating to the now-naive herds of Africa and Asia.

    It's impossible to tease out exactly what has been spent in the fight against rinderpest, but Roeder estimates that since 1986, international donors and participating countries spent approximately $610 million on animal health in Africa and Asia, primarily targeting rinderpest but covering other diseases and infrastructure. One FAO estimate puts the benefits of rinderpest eradication at $1 billion annually in Africa alone. The additional $10 million or $12 million needed for the remaining countries to complete the accreditation process “would be a small price to pay for finalizing the eradication of this devastating disease,” he says. With just a final push, rinderpest could officially join smallpox as a disease of the past.


    Protein Structure Initiative: Phase 3 or Phase Out

    1. Robert F. Service

    The production-line approach to finding protein structures is rapidly filling up databases. But is it the data researchers want, and is it worth the cost?



    The catalytic part of a human phosphatase enzyme.


    In the early 1990s when structural biologist Andrzej Joachimiak was working in the labs of Paul Sigler and Arthur Horwich at Yale University, he and six colleagues worked together for more than 2 years to solve the x-ray crystal structure of a protein known as GroEL. To obtain such structures, researchers must arrange copies of a protein into the regular pattern of a crystal and then ricochet beams of x-rays off it to map out the position of each of the protein's atoms. At the time, the structure was hailed for offering a host of insights into how GroEL carries out its role as a “chaperone” helping other proteins fold into their proper three-dimensional shapes. But GroEL's large size made it a bear to solve.


    Andrzej Joachimiak and his colleagues at the Midwest Center for Structural Genomics are pushing the pace for solving protein structures.


    Today, as head of the Midwest Center for Structural Genomics, a consortium of investigators at eight institutions in the United States and Canada, Joachimiak and his colleagues churn out some 180 such structures a year, an average of one every 2 days. Not all are as difficult as the GroEL structure, but Joachimiak estimates that recent technological advances would allow them to solve something as complex as GroEL within about 2 months. That, Joachimiak says, “is a true revolution.”

    But some in the field say the revolution has gone far enough. Joachimiak's center is one of four high-throughput structural biology centers participating in the Protein Structure Initiative (PSI), a big-science project funded by the U.S. National Institute of General Medical Sciences (NIGMS), part of the National Institutes of Health (NIH) in Bethesda, Maryland. PSI is doing for protein structures what the Human Genome Project did for sequencing: turning it into a mass-production exercise. Already, PSI's four main centers and six smaller ones have turned out nearly 3000 protein structures and over the past 7 years have contributed about 40% of all the novel structures deposited in the Protein Data Bank (PDB), a global repository for protein structures.

    But with PSI now halfway through its second 5-year phase, critics say the cost of the program is too high. This year, NIGMS will spend approximately $80 million on PSI. By the end of phase 2 in July 2010, the total tab will be more than three-quarters of a billion dollars. At a time when NIH funding is flat, many critics argue that the money is better spent on traditional small-scale structural biology projects, ones geared toward solving particular questions about the detailed working of proteins highly relevant to biology and medicine. In December, that message was underscored by an external review committee of prominent biologists charged with assessing PSI. Among the report's conclusions: “The large PSI structure-determination centers are not cost-effective in terms of benefit to biomedical research.” Structural biologist Gregory Petsko of Brandeis University in Waltham, Massachusetts, echoed the sentiment in an editorial last year in Genome Biology, in which he labeled PSI “an idea whose time has gone.”

    PSI proponents have plenty of counterarguments, and the debate shows no signs of waning. “It's a real hot point in the community,” says Janet Smith, a structural biologist at the University of Michigan, Ann Arbor, who led the recent PSI review panel. “It's a fairly contentious topic, and opinion tends to run high,” she adds. In the midst of this debate, NIGMS officials will have to decide soon on PSI's fate. The current round of PSI funding is scheduled to run through July 2010. If agency officials want to continue uninterrupted funding for the project, they must send out a request for proposals sometime next year, according to NIGMS Director Jeremy Berg. That means they will likely need to decide by the end of this year whether PSI has a future. At this point, Berg says, funding for PSI 3 “is not a given.”

    A family affair

    Whereas genomics can reveal the sequence of amino acids in a protein, structural biology tells us how that sequence folds up into a particular shape, which is key to a protein's function. These structures have long been seen as a treasure trove of information about life's molecular machines. By revealing structures through x-ray crystallography and nuclear magnetic resonance spectroscopy, structural biologists glean insights into how these machines operate. In some cases, those insights can reveal the likely function of an unknown protein, lead to a deep understanding of how misshaped proteins cause disease, and potentially reveal a path to new drug treatments. For example, resolving the structure of the HIV-1 protease led to the creation of the first protease inhibitors used to fight AIDS.

    Structural biologists have traditionally taken a hypothesis-driven approach to their science, asking questions about proteins known to be of interest. PSI, by contrast, chose a novel and somewhat controversial strategy: a “discovery-based” approach primarily targeted at proteins from different structural classes, or “families,” throughout the protein landscape. Members of each family fold up into similar shapes, often adopting similar functions, such as proteases, kinases, and phosphatases. One major goal of PSI has been to obtain structures of representatives of as many of these families as possible, in particular the large families that have the most members. Proponents argue that each structure could be the key to many more: information on how the sequences fold into proteins should enable computational biologists to create “homology models,” detailed simulations of closely related family members for which no physical structure exists, and thereby glean insights into their function (see sidebar, p. 1612).

    Success was far from certain. When the project started in July 2000, perhaps the biggest question was whether PSI centers would be able to automate all of the many steps involved in mapping proteins. Unlike genomics, which relies on speeding up one technology—reading the sequence of DNA's nucleotide bases—PSI leaders had to speed up numerous technologies including cloning genes into microbes, expressing and purifying proteins, coaxing them to form crystals, testing their quality, collecting x-ray data, and solving the structure. “Early on, we didn't know whether we were going to be able to build these pipelines,” says Ian Wilson, a structural biologist at the Scripps Research Institute in San Diego, California, who heads one of PSI's four large centers, the Joint Center for Structural Genomics in San Diego.

    Building a bigger pipeline.

    PSI groups created a series of new technologies to speed up the many steps involved in determining a protein's structure, such as robots to purify and crystallize proteins.


    But the recent review panel concluded that PSI's technology development had been “highly successful,” with advances dramatically speeding all phases of structure determination. In many cases, the panel's report concludes, PSI has fostered technology that can be adopted by more traditional structural biology efforts. The centers not only developed new technology, they've applied it effectively, too: The large centers now crank out an average of 135 protein structures each per year.

    PSI proponents argue that this production-line approach has dropped the cost of solving structures from about $250,000 apiece in 2000 to about $66,000 today. But PSI's success is not just about the bottom line, they argue. It's also revealing a diversity in protein structures never seen before. A 2006 analysis in Science by Steven Brenner and John-Marc Chandonia of Lawrence Berkeley National Laboratory (20 January 2006, p. 347) in California found that PSI centers account for about half of the novel structures submitted to PDB. These are structures whose amino acid sequences overlap with those of any other protein by less than 30%. Another study in the Proceedings of the National Academy of Sciences last year by Michael Levitt at Stanford University in Palo Alto, California, showed this trend continuing, and that PSI centers reversed an earlier steady decline among structural biologists in the number of novel structures being added to PDB. Wilson and several colleagues argued in an editorial in the January issue of Structure that the novelty of the PSI structures is a great benefit to the community because it provides data complementary to traditional structural biology rather than simply answering the same sets of questions.

    But researchers are still divided over just how useful all this new information is. “I had reservations from the outset,” says Petsko, who says he objected because protein structures are only useful when they can answer specific biochemical questions about the detailed workings of a protein. The recent PSI assessment report echoes this criticism, calling PSI's strategy of focusing primarily on novelty “seriously flawed.” One problem, Smith and her co-authors argue, is that the number of new protein families identified by gene-sequencing efforts worldwide continues to grow more rapidly than the number of protein structures being produced. A team of researchers reported last March in PLoS Biology, for example, that a random sequencing of DNA from the world's oceans showed that more than half of all the protein families they found had never been seen before, suggesting that researchers are nowhere near completing their survey of the diversity of protein families. That makes the challenge of obtaining representative structures from each family “an open-ended problem,” say the authors of the assessment report.

    What is more, the assessment panel concluded that although having a protein structure can help computer modelers make models of other members of that protein family, those models almost always have a low resolution and lack detail of the precise location of all the protein's different amino acid residues. Such detail is key to nailing down the exact biochemical workings of a protein and often its specific function. “The ability to model structures, particularly complex ones, is very far from being able to connect most PSI structures to function,” the report states. Even if an accurate model can be made, using that to discern a protein's function is not a straightforward task. A structure, Smith says, “is a little bit of data” that can be used to discern a protein's function. “But it's not as much as folks had hoped it would be.”

    On top of these problems, critics say PSI's data are not getting picked up by the broader community of biologists. In part, they argue that's because only a relatively small fraction of this broader community knows how to use this type of structural information. The bottom line, Smith says, is that “the number of structures provided [by PSI] is not providing a boon to biology.” By contrast, she adds, when the Human Genome Project began to release its data, it was instantly seized upon: “There was no need to ask, ‘Was this worthwhile?’”

    Function follows form

    PSI leaders counter that although the number of protein families is indeed growing rapidly, most of the newly discovered families have only a few members; the majority of proteins fall into a small number of large families, which are the focus of PSI's targeting. Gaetano Montelione, a structural biologist at Rutgers University in Piscataway, New Jersey, argues that this focus is increasing the impact of PSI structures, because each solved structure carries more leverage (the ability to serve as a template for modeling a greater number of related structures) than one solved along traditional lines. Even with the limitations of current computer models, “the large information leverage provided by determining the first structural representatives from very large sequence families is tremendously enabling to biomedical research,” Montelione writes in a February 2008 response to the PSI assessment report. Although homology models may not always reveal a protein's function, Montelione and others argue, in many cases they offer important clues to guide future biochemical experiments designed to nail down that function.

    Many researchers also dispute the claim that many PSI structures lack biological relevance. A fraction of PSI targets are chosen for biological interest. And the relevance of any structure, like the value of any basic research, takes time to emerge, they argue. “The benefits we will see 2 to 3 years from now will be very great,” Wilson says, and will include a growing understanding of how protein families evolved and the evolutionary connections between different families. As for the dissemination of PSI data, PSI leaders say that a new knowledge base that came online earlier this month should improve matters dramatically.

    David Baker, a computational biologist at the University of Washington, Seattle, adds that the large number of PSI structures is also making possible an emerging approach to designing new therapeutics. Baker's group uses the full gamut of PDB structures to help design better protein-based inhibitors of toxins, as well as vaccines for diseases such as HIV. When the group designs a protein, they start with the shape of the target they are trying to block. They then conduct a computer scan through all the known protein structures in PDB—including PSI structures—looking for as many proteins shaped to fit into that target as possible. Finally, they set their computational program loose to refine those matches and design a novel protein for an optimal fit. The more close matches they have, the more accurate and effective the designed protein tends to be. As such, Baker says, the value of the database will only grow. “These structures are really going to help protein design,” Baker says. “I don't think that was anticipated originally.”
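The scan-then-refine loop described above can be caricatured in a few lines of code. Everything here is a hypothetical sketch: the function names, the toy point-cloud representation of surfaces, and the distance-based score are our illustrative assumptions, not the actual machinery of Baker's software.

```python
# Hypothetical sketch of scanning a structure library for shape
# complementarity to a target pocket, then keeping the best matches
# for downstream refinement. Purely illustrative.

def complementarity(candidate_surface, target_pocket):
    """Toy score: count candidate surface points lying within
    1.0 angstrom of some pocket point."""
    score = 0
    for x1, y1, z1 in candidate_surface:
        for x2, y2, z2 in target_pocket:
            if ((x1 - x2)**2 + (y1 - y2)**2 + (z1 - z2)**2) ** 0.5 <= 1.0:
                score += 1
                break  # each surface point counts at most once
    return score

def scan_structure_library(library, target_pocket, top_n=5):
    """Rank every known structure by rough shape fit to the target;
    the top matches would then go to a refinement stage."""
    ranked = sorted(library.items(),
                    key=lambda kv: complementarity(kv[1], target_pocket),
                    reverse=True)
    return ranked[:top_n]
```

The point of the sketch is the shape of the pipeline, not the scoring: the more structures in the library, the more near-fits survive the scan, which is why a growing PDB helps.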

    The messenger.

    Janet Smith led a recent panel of biologists that criticized PSI for having a limited impact on the broader community of biologists.


    A question of value

    Smith and others say they readily agree that PSI is producing good science, but they question whether it's worth the cost. “It's how do you get the most bang for your buck,” says Philip Cole, a pharmacologist who specializes in signal transduction at the Johns Hopkins School of Medicine in Baltimore, Maryland.

    Still, Cole and others worry that even if PSI isn't funded for a third phase, there's no guarantee that money saved will flow to traditional structural biology groups. That's not how science funding works. “If PSI were to be discontinued, the money would go back to the general pool within NIGMS,” Berg says. Structural biology funding, he adds, accounts for about 10% of the NIGMS budget, with traditional single-investigator grants taking up about 6.3%. So doing away with PSI would likely increase the share of funding for individual structural biology grants from about 6.3% to perhaps 6.7%, Berg says. What is more, structural biologists currently working on PSI would then be competing for those funds. So the net result could wind up being “a pretty big negative” for the community, Berg says.
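Berg's arithmetic can be made explicit. The article gives only the endpoints (6.3% rising to perhaps 6.7%); the assumption that freed PSI funds would be recaptured roughly in proportion to structural biology's current ~10% share of the general pool is ours.

```python
# Back-of-the-envelope version of Berg's budget arithmetic.
# The proportional-recapture assumption is illustrative, not from the article.

structural_total = 10.0    # % of NIGMS budget spent on structural biology
single_investigator = 6.3  # % going to traditional single-investigator grants
psi_share = structural_total - single_investigator  # ~3.7 points tied up in PSI

# If PSI money returns to the general pool and structural biology grants
# win back only their current ~10% share of it:
recaptured = psi_share * (structural_total / 100.0)  # ~0.37 points
new_share = single_investigator + recaptured         # ~6.7 points
```

Under this assumption the individual-grant share lands near 6.7%, matching Berg's figure, and it makes his point concrete: most of the freed 3.7 points would leak out of structural biology entirely.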

    Off the charts.

    PSI centers are determining structures at a pace never seen before. But critics doubt that the impact has kept pace.


    So what's next? Berg says NIGMS is currently evaluating all of its large-scale projects to decide which ones to continue. The PSI assessment panel argued against continuing the project in its present form. “Future effort might be focused on smaller projects with much higher experimental coupling to biological function and improving computational methods of analyzing and predicting protein structure,” the report concluded. In his response to the report, Montelione agreed that connecting more directly with the priorities of biologists “needs to be a priority” in designing PSI 3.

    Others agree that perhaps the best solution is to focus more tightly on protein targets with known biological relevance, such as multiprotein complexes, proteins that are embedded in cell membranes, and proteins from disease-causing microbes. “This can evolve,” says Joel Sussman, a structural biologist at the Weizmann Institute of Science in Rehovot, Israel. “Now you can use these enormous platforms [built in the PSI centers] to tackle biological problems.” Whether the broader biological community can agree on such a compromise will largely depend on whether NIGMS sees budget increases anytime soon. Says Wilson: “When the money gets tight, the knives come out.”


    Researchers Hone Their Homology Tools

    1. Robert F. Service

    The Protein Structure Initiative (PSI) is churning out new protein structures at a pace never seen before. But even the hundreds of structures the initiative unveils each year don't make much of a dent in the millions of proteins and multiprotein complexes thought to be out there. One hope for PSI, however, is that the proteins it has solved will give researchers insights into the structures and functions of some of those whose shapes are unknown. For such work, computational biologists employ “homology models,” which use solved structures as templates for computer models of the three-dimensional (3D) shapes of closely related proteins.
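In outline, homology modeling transfers coordinates from a solved template structure onto an aligned target sequence. A minimal sketch of that core idea follows; all names are hypothetical, and real packages then rebuild loops and side chains and refine the result, which is where the hard work lies.

```python
# Minimal illustration of the template-copying idea behind homology
# modeling. Inputs are two gapped, equal-length aligned sequences and
# the template's per-residue coordinates.

def homology_model(aligned_target, aligned_template, template_coords):
    """Walk a pairwise alignment; wherever both sequences have a
    residue, the target inherits the template's coordinates. Target
    residues opposite a gap get None (to be built later)."""
    model, i = [], 0
    for t_res, tpl_res in zip(aligned_target, aligned_template):
        xyz = template_coords[i] if tpl_res != '-' else None
        if tpl_res != '-':
            i += 1          # advance through the template's coordinates
        if t_res != '-':
            model.append(xyz)
    return model
```

The sketch also shows why template coverage matters: every target region aligned to a gap comes out as None, and model quality falls off quickly as the target and template diverge.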

    Model behavior.

    Computational biologists use known protein structures to help them model the shape of closely related proteins.


    How well do homology models work? Not well enough, according to members of a review panel that issued a mixed report card on PSI in December. Although such models often get the general shape of related proteins correct, they typically lack the atomic-scale resolution needed to gain specific insights into how a protein does its job—or even what job it does. “The large numbers of new structures determined by the PSI effort have not led to significant improvements in the accuracy of homology modeling that would allow modeling of more biologically relevant proteins, complexes or conformational states,” the report concluded.

    But computer modelers say that conclusion misses the mark on several counts. First off, they point out, improving the accuracy of homology models was never a stated goal of PSI. “This was a complete red herring,” says John Moult, a computational biologist at the University of Maryland Biotechnology Institute in Rockville. Moult says the initial intention was simply to allow computational biologists to apply existing modeling methods to a larger number of target proteins. And that, he says, has undoubtedly occurred. PSI now contributes about 40% of the “novel” protein structures (those significantly different from any solved previously) submitted to the global Protein Data Bank. And according to one recent estimate, those structures allow for the creation of more than 40,000 homology models that could not otherwise be made.

    That said, Moult and others argue that PSI is actually now beginning to contribute to the improvement of homology models themselves. In its second phase, PSI has supported two small centers geared toward improving computer models and has also supported individual computer-modeling groups. That bioinformatics support was perhaps “a little slow” in coming, says Andrej Sali, a computational biologist at the University of California, San Francisco. But he and others argue that this support, together with the increased number of structures, has helped spur advances in the basic algorithms to improve the accuracy of models.

    Whether due to PSI or not, Moult and others say there's plenty of evidence that homology models are improving. For starters, they point to a biennial competition among computational biologists to predict the structures of a series of proteins. The Critical Assessment of Structure Prediction (CASP), which began in 1994, will hold its eighth competition later this year. The first “was embarrassing,” says Moult, who heads the CASP competitions. Few of the early models came close to the actual structures of their target proteins, which were solved simultaneously by x-ray crystallography for comparison. But by 2002, 60% of the models got close enough to the final structures to add useful information. By 2006, that number had climbed to 80%. “I don't want to say modeling was improving only because of the PSI,” Moult says. But the added structures in the database, he argues, are making a “very significant contribution.” Adds David Baker, a computational biologist at the University of Washington, Seattle: “Homology modeling is definitely getting better.”
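CASP judges entries with scores along the lines of GDT_TS, which averages, over several distance cutoffs, the fraction of residues the model places within that cutoff of the experimental structure. The simplified version below assumes the two structures are already superposed (the real GDT calculation searches over superpositions to maximize the score).

```python
# Simplified GDT_TS-style comparison of a predicted model against the
# experimentally solved structure, assuming pre-superposed coordinates.

def gdt_ts(model_coords, true_coords, cutoffs=(1.0, 2.0, 4.0, 8.0)):
    """Average, over the distance cutoffs, of the fraction of residues
    whose model position lies within that cutoff of the true position;
    reported on a 0-100 scale, higher is better."""
    n = len(true_coords)
    fractions = []
    for cut in cutoffs:
        within = sum(
            1 for (x1, y1, z1), (x2, y2, z2) in zip(model_coords, true_coords)
            if ((x1 - x2)**2 + (y1 - y2)**2 + (z1 - z2)**2) ** 0.5 <= cut
        )
        fractions.append(within / n)
    return 100.0 * sum(fractions) / len(cutoffs)
```

Mixing tight and loose cutoffs is the key design choice: a model that gets the overall fold right but misplaces details scores partial credit rather than zero, which is what lets CASP track the kind of gradual improvement Moult describes.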

    In another key advance, improved computer models are making it easier for x-ray crystallographers to solve their structures. Experimentalists solve these structures by firing powerful beams of x-rays at protein crystals and tracking how those x-rays ricochet off their targets. These data give them much of what they need to nail down the position of all the atoms in the protein. But for a complete 3D picture, researchers typically compare the original data with another set taken from a closely related protein. Combining the two data sets is usually enough to finish the job. Not all proteins have close relatives that have been solved. But in a Nature paper last November, Baker's team showed that it was possible to use newer high-resolution homology models as the close relative to help researchers solve the x-ray structures. “It's not an established method yet,” Baker says. However, he argues, it shows the synergy that can occur between high-quality experimental data and computational models.