News this Week

Science  18 Sep 1998:
Vol. 281, Issue 5384, p. 47


    NIH to Produce a 'Working Draft' of the Genome by 2001

    1. Eliot Marshall

    Many scientists were skeptical last May when DNA sequencer J. Craig Venter and his private backer—the Perkin-Elmer Corp. of Norwalk, Connecticut—said they were going to decode the entire human genome in just 3 years. At the time, the government-funded Human Genome Project wasn't due to deliver the goods until 2005. To some academics and government genome sequencers, Venter's pace seemed too fast to be credible. Then, in August, Incyte Pharmaceuticals Inc. of Palo Alto, California, joined the race. It said it was going after the entire human genome too, aiming to get just the genes in 2 years. Now, faced with growing private competition, the skeptics of rapid sequencing have become believers. In a radical change of plan, the chiefs of the U.S. genome project announced this week that they intend to match the private sector's pace and deliver comparable results just as fast.

    The U.S. National Human Genome Research Institute (NHGRI) unveiled a 5-year plan this week that promises to produce a “working draft” of the human genome—including highly accurate sequences of most of the protein-coding regions—by 2001. The plan also promises to yield a polished, gold-standard version of the entire genome by 2003, 2 years ahead of the old schedule. If successful, this scheme will not only speed up the pace at government-funded labs but also, according to some of NHGRI's advisers, release data so rapidly that companies such as Perkin-Elmer and Incyte may not be able to get exclusive rights to all the DNA they hoped to patent.

    Francis Collins, director of NHGRI, and Ari Patrinos, head of the Department of Energy (DOE) genome program, presented this aggressive new strategy to a meeting of NHGRI's advisory council on 14 September. The council voted its approval, with minor revisions. The plan has already been reviewed and endorsed by DOE.

    Collins claims the companies didn't prompt this new vigor but acknowledges that “all of this stir in the private sector has opened people's eyes to the possibilities” of rapid sequencing. “This is not a reaction,” he insists: “It is action.” NHGRI, he says, had been considering taking this step before Venter made his announcement in May.

    In a friendly gesture, Collins called Venter last week to invite his newly formed sequencing firm, Celera Genomics Corp. of Gaithersburg, Maryland, to collaborate on the new plan of action. Venter says he's already sharing data with DOE scientists and is “absolutely delighted” with the invitation. He is banking on a risky “whole-genome shotgun” approach to sequencing and says it would be “nice” to have help in identifying where certain DNA clones fit on the genome. The publicly funded groups will be using a more cautious approach that involves mapping DNA clones to the genome before sequencing. Venter says that he would be happy to include NHGRI-funded labs as co-authors on his papers.

    With or without contributions from the private sector, Collins says, it will be “a stretch” to meet the targets in this “ambitious” new government plan. But the agenda is creating “a sense of excitement” among DNA sequencers, he adds. The strategy puts an emphasis on speed and “getting the good stuff,” as Collins notes—a shift from the early approach of determining every DNA base to the greatest degree of accuracy. For example, Collins estimates that the current rate of sequence production at NHGRI-funded labs will double or triple, based on improving skill and technology. And, as an immediate goal, NHGRI-funded centers are being asked to focus on “gene-rich regions of the genome.” The strategic plan calls for the creation of a peer-review panel to prioritize the hottest regions for sequencing and assign them to labs that bid for them.

    By 2001, according to Collins, this process should yield final DNA sequence for about one-third of the entire human genome. And if the bidding system for hot areas works well, this batch of data should include complete versions of a majority of human genes. These completed segments, according to the plan, should be contiguous over an undefined “long range” and finished to a high degree of accuracy, with an error rate of no more than 1 base per 10,000.

    The genome will be sequenced less than completely in this initial push, producing a “working draft” that “could cover at least 90% of the genome” with an overall error rate of one per 100 bases. After 2001, the plan says, researchers will close the gaps and polish the data, producing a final draft in 2003 with a one in 10,000 error rate. But even this version, a footnote advises, may not include all the bases from areas that are hard to decipher or clone.

    Some members of the genome community had been wary of the plan, fearful of compromising earlier and more demanding quality standards. Indeed, NHGRI council member Leroy Hood, molecular biologist at the University of Washington, Seattle, says he's been pushing for faster sequencing for “several years,” but that cautious members of the community have resisted—until now. The new scheme represents the “strong consensus” of NHGRI center directors who met to discuss it on 3 September, Hood says, but it did not get unanimous support. Robert Waterston, director of the sequencing center at Washington University in St. Louis, says that between May and August, there was “considerable debate among the centers,” but that now “the mood is strongly convergent.” Richard Gibbs, director of the sequencing center at Baylor College of Medicine in Houston, says he thinks there are “almost no reservations” about the plan today—mainly because people have become more skilled in fixing flaws in the data and more adept in using sequencing technology.

    What will it cost the government to double or triple the DNA sequence output of the Human Genome Project? Collins insists that the new plan is “not dependent” on any big increase in funding. It can be supported, he claims, within the typical budget increases NHGRI has received recently—about 10% a year. NHGRI's budget this year is $220 million. But he adds with a twinkle: “Additional resources could be used very effectively.”


    Traces of Ancient Mariners Found in Peru

    1. Heather Pringle*
    1. Heather Pringle is a writer in Vancouver, British Columbia.

    Most immigrants to the Americas have arrived by sea, but the very first Americans simply walked in. Or so goes archaeologists' traditional view, which holds that the first inhabitants were the big game hunters called Clovis people, whose ancestors crossed the Bering land bridge and swept southward through the Americas perhaps 11,200 years ago. But dates as early as 12,500 years ago at a site in Chile have raised questions about this model, and many researchers have speculated about a shadowy alternative: that the first Americans set the pattern for later immigrants by arriving by boat, leaving few traces of their journey.

    Now on page 1830 of this issue, two independent research teams report finding the first hard evidence, albeit indirect, for the maritime settlement theory. The discoveries, which reveal an ancient maritime culture in South America about 11,000 years ago, are “about the best kind of evidence that you're going to find that people familiar with the ocean were migrating down through the Americas,” says geologist David Keefer of the U.S. Geological Survey in Menlo Park, California, lead author of one study.

    As long ago as the mid-1970s, archaeologist Knut Fladmark of Simon Fraser University in Vancouver proposed that coastal peoples from Asia settled the Americas by paddling southward down the Pacific Coast with simple watercraft and a hefty dose of maritime savvy. Fladmark also noted that the theory would be hard to verify, because most of the clues left along the coast by these putative coastal explorers would now be underwater, drowned some 10,000 years ago by sea levels rising after the last ice age.

    Along the southern coast of Peru, however, the sea floor slopes steeply away from the coast. As a result, “very little land horizontally was lost to rising sea level,” says archaeologist Daniel Sandweiss of the University of Maine, Orono. “This is one of the reasons I was looking for sites in this region.” There, one U.S.-Peruvian team led by Sandweiss and another led by Keefer found two ancient campsites of a maritime culture. Radiocarbon tests on charcoal indicate that Quebrada Jaguay, Sandweiss's site, is 11,100 years old, while Keefer's, Quebrada Tacahuay, dates to 10,700 years, making these cultures among the most ancient in South America. A few Andean sites are between 11,000 and 11,500 years old, and the famous Monte Verde site in central Chile has been put at 12,500 years old, although some researchers still have reservations about this date (Science, 28 February 1997, p. 1256).

    Seafarer's home?

    Ancient Americans may have traveled by water to the arid coastal site of Quebrada Jaguay.


    Bones and other refuse found at the new sites show, says Keefer, that the inhabitants “were familiar with and were using the sea.” At Quebrada Tacahuay, people concentrated on fishing for anchovies and hunting seabirds, particularly cormorants. “What we're seeing is really an economic specialization,” says faunal analyst Susan deFrance of the University of Florida, Gainesville, a co-author of the paper on this site. “Clearly they focused on this small group of birds,” she says, systematically butchering them to remove breast meat. So intently did Quebrada Tacahuay's inhabitants focus on the ocean that 99.8% of the bones at the site belong to marine creatures.

    At Quebrada Jaguay, the inhabitants earned their living by gathering clams and capturing small schooling fish, chiefly species in the drum family. Moreover, at both sites the teams found remains of small, calorie-rich fish, indicating an early net fishery—a very specialized maritime occupation, notes Keefer. But the coastal dwellers at Quebrada Jaguay also had intimate connections to the Andes. Studies of trace elements in the obsidian they sometimes used for tools show that the stone came from highland sources 130 kilometers to the east, indicating that these people either traveled to the highlands themselves or traded with people who did, says Sandweiss.

    Both the early dates and the maritime lifestyle make it unlikely that these people were the descendants of land-lubbing Clovis people, says Anna Roosevelt, an archaeologist at the University of Illinois, Chicago. After reaching South America, the Clovis people were thought to have headed first for the Andean highlands, where the temperate, open habitat supported big game. “They weren't supposed to reach the coast … until later,” says Roosevelt.

    Coastal finds.

    Campsites of an ancient maritime culture hug the Peruvian coast.


    What's more, she and others have found equally old Paleoindian sites in South American rainforests, where the inhabitants adopted a plant-collecting, foraging, and fishing lifestyle, again very different from that of the Clovis people (Science, 19 April 1996, pp. 346 and 373). Thus the ancient maritime sites “suggest that Clovis is just one of several regional early Paleoindian occupations. There's no apparent ancestral relationship between Clovis and these people in South America,” says Roosevelt.

    But if Clovis isn't the mother of these maritime cultures, who is—and how did the ancestral stock get there? The obvious answer is by sea, says Keefer, although such a claim is far from proven yet. In Keefer's view, the net fishery and reliance on ocean food sources indicate a sophisticated and ancient knowledge of the ocean. That means that “the most logical scenario would be for them to migrate down the coast,” he says. The extreme aridity of the Peruvian coast—one of the driest places on Earth, both then and now—argues for water travel, says Sandweiss. “If you had watercraft, then you could carry water and you could move more quickly” than traveling overland, he says.

    Even researchers who have invested their careers searching for the earliest South Americans in highland sites are now giving the maritime idea serious consideration. “I've long pushed the idea of people moving down the flanks of the mountain zone all the way from the Isthmus of Panama down the Andes,” says archaeologist Tom Lynch, director of the Brazos Valley Museum of Natural History in Bryan, Texas. “But it may be that people actually came along the coastal fringe.”


    Strict Rules Rile Indian Scientists

    1. Pallava Bagla*
    1. Pallava Bagla is a correspondent in New Delhi.

    New Delhi—Proposed rules to create a government-run system to regulate research using animals have triggered a fierce debate in India. Drafted by a committee chaired by social justice minister Maneka Gandhi, an outspoken animal-rights activist, they are set to go into effect on 8 October. But research groups are trying to block them, arguing that they are extreme and threaten valuable research.

    Issued last week, the proposed rules would require all labs doing animal experimentation to register and obtain prior written approval from the government. They would effectively ban animal testing and other contract work for foreign institutions and companies by prohibiting any research done on behalf of unregistered institutions. Registered labs would also need to provide quarterly updates on their activities and could neither transfer nor acquire animals without permission. Gandhi describes the proposed rules as “conforming to well-established norms adhered to in the West.”

    The new system would be run by the Committee for the Purpose of Control and Supervision of Experiments on Animals (CPCSEA), which Gandhi chairs. It would supplant voluntary guidelines issued in 1992 by the Indian National Science Academy. Last week the committee released a survey done by a private advocacy group that found that many of the country's leading research labs haven't been following even those guidelines. “Why should my animals be subjected to cruel tests for the sake of Western companies?” Gandhi said in an interview with Science. “I am very happy that there will be more paperwork [for the scientists].… They are used to doing whatever they feel like. Now they will have to fall in line.”

    Perhaps, but not quietly. Last week officials from the National Academy of Sciences urged Prime Minister Atal Bihari Vajpayee to prevent the implementation of the proposed rules. A dozen heads of biomedical labs and secretaries of government scientific departments also met last week and called for more discussion. “They are fraught with serious consequences to the progress of biomedical research in India toward new vaccines and new drugs,” says Vulimiri Ramalingaswami, a pathologist and former chief of the Indian Council of Medical Research (ICMR). International research would also be jeopardized, says Chhitar Mal Gupta, director of the Central Drug Research Institute in Lucknow, which has a long-running project with the U.S. Walter Reed Army Institute of Research testing new therapeutics against malaria. “This collaboration will fall apart and drug development will be shattered,” he says.

    Each year Indian scientists at 5000 laboratories use more than 5 million animals—ranging from frogs and rats to monkeys and buffaloes—at a cost exceeding $10 million. India has traditionally been a major source for experimental rhesus and bonnet monkeys caught in the wild, and several international pharmaceutical companies set up animal-testing facilities in the country after a 1978 ban on exports. Experts say that such tests cost one-tenth as much as they would in the West.

    Institutions are supposed to follow the academy's voluntary guidelines, which lay down broad policies on housing and feeding of animals and proper experimental procedures. They specify that all institutions should have animal ethics committees that must include at least one outside scientist and one member of the public. Whereas ICMR Director-General Nirmal Kumar Ganguly says that “all Indian institutes are conforming to the [academy] guidelines,” a new survey carried out at the behest of the CPCSEA found otherwise.

    The survey released last week found that of 30 labs sampled (including many national research institutes, pharmaceutical companies, and at least one veterinary college), only half had any form of animal ethics committee, and only two had any outside members. In fact, Science has learned that ICMR's premier institution, the National Institute of Communicable Diseases in New Delhi, does not have an animal ethics committee and that the National Institute of Immunology in New Delhi formed its panel only this summer. “If they [research labs] don't even have a semblance of an animal ethics committee, how can you expect them to self-regulate?” asks primatologist Iqbal Malik, who conducted the survey as head of Vatavaran, a New Delhi-based advocacy group.

    Although many Indian scientists agree that there is room for improvement, they say the proposed guidelines will merely add to an already heavy administrative burden. “There is an urgent need to have a pragmatic animal-testing policy [because] we may not have done well in the past,” says Raghunath Anant Mashelkar, director-general of the Council of Scientific and Industrial Research. “But overregulation does not help anybody.”

    Some researchers admit, however, that the stick of government regulation may work better than the carrot of voluntary compliance. “Who bothers to implement guidelines given out by an academic body?” says entomologist Vinod Prakash Sharma, director of the Malaria Research Center in New Delhi. “Only guidelines given out by the government have any hope of ever being followed.”


    Graduate Admissions Down for Minorities

    1. Marcia Barinaga

    When California voters approved an anti-affirmative action referendum in 1996, and a federal appeals court that same year banned affirmative action at universities in Texas, Louisiana, and Mississippi, educators feared that minority university admissions would suffer in those states. A report* released last week by the American Association for the Advancement of Science (AAAS, publisher of Science) shows the situation to be even worse than many expected: Minority enrollments in graduate science and engineering programs dropped precipitously in 1997, not just in Texas and California but across the country. The report's authors attribute the fall to the uncertainty the laws and legal challenges have bred about what forms of affirmative action are legally allowable.

    Coincidentally, that gloomy news came out within a day of the publication of The Shape of the River, a new book by William Bowen, president of the Mellon Foundation, and former Harvard University President Derek Bok, documenting the achievements of past affirmative action programs. Published by Princeton University Press, it concludes that such policies at top undergraduate colleges and universities have largely been successful in giving black undergraduates a boost toward financial success, professional and graduate study, and leadership positions.

    “The society is fortunate that these two reports appear at the same time,” says Luther Williams, assistant director for Education and Human Resources at the National Science Foundation (NSF). The detailed data gathered in both reports, he says, will “provide a factual basis” for new plans to recoup losses and increase minority enrollment in science and engineering programs. Federal support for such a plan was already evident last week: At a White House ceremony honoring mentors for minorities in math and science, President Clinton instructed the National Science and Technology Council (NSTC), an interagency committee that reports to the White House, to “develop recommendations within 180 days on how to achieve greater diversity throughout our scientific and technical workforce.”


    The number of black and Hispanic students enrolling in science and engineering programs dropped in 1997.


    The authors of the AAAS report reached their conclusion by analyzing admissions data for the past 4 years from science and engineering graduate programs at 93 major research universities. They found little change in black graduate admissions and a slight increase in Hispanic admissions from 1994 to 1996. But in 1997, black admissions declined 20% and those of Hispanics dropped 16%. Report co-author Shirley Malcom, director of Education and Human Resources Programs at AAAS, attributes the plunge to a lack of clear direction from the federal government. “Administrators are feeling really uncertain because they don't know what is allowed and what is not,” says Malcom. The report says that uncertainty often translates into “lukewarm attention to minority recruitment and retention” and heavier reliance on GRE scores for admission, which hurts underrepresented minorities.

    The president's new initiative should help in that regard, says Arthur Bienenstock, associate director for science at the White House Office of Science and Technology Policy: “That directive will necessarily lead the NSTC to provide clarification in this issue of what can and cannot be done in the targeted admission of minorities.” NSF, which is mandated by a 1980 law to work to boost the participation of women and minorities in science and engineering, also plans to help clarify the situation by revamping its minority programs (Science, 28 August, p. 1268). Williams says he doesn't know yet what the changes will be but will look to the book and the AAAS report for guidance as to what approaches will work best.

    As those efforts seek to reclaim lost ground, Bowen and Bok's book documents how successful race-sensitive admissions can be. The authors studied admissions and student performance data from 28 highly selective undergraduate institutions, as well as survey responses from 31,000 students who entered those institutions in 1976 and 1989. Their findings challenge several commonly held negative beliefs about affirmative action.

    For example, their data refute the idea that affirmative action promotes minority students beyond their ability to succeed. Bowen and Bok compared black students with identical SAT scores at different institutions. They found that the students graduated at a higher rate from the more selective colleges, where the average SAT score was up to 200 points higher than theirs, than from less selective colleges where the average SAT score was more like their own.

    The success of those black students at the top schools wasn't achieved by avoiding tough majors. At the schools surveyed, the same percentage of black students as whites (20%) majored in science and engineering. “That is so different from the myths one hears, that [blacks] are all majoring in African-American studies,” says Bowen. Moreover, 40% of the black students completed a professional or doctoral degree, compared to 37% of their white counterparts.

    Bowen says he hopes the book will help warm the current chilly climate toward affirmative action and encourage policy-makers to endorse race-sensitive admissions. But given the growing list of legal strictures, NSF and NSTC must walk a fine line in pursuit of their goal.

    • *From “Losing Ground: Science and Engineering Graduate Education of Black and Hispanic Americans.” Ordering information available at


    A Record Grant for College Programs

    1. Jennifer Couzin

    The Howard Hughes Medical Institute (HHMI), best known for picking elite researchers and providing them with generous funding, announced this week that it is making a huge investment in the next generation of potential Hughes investigators. It will provide the largest grant in U.S. history to support undergraduate education in biology: $91.1 million to 58 universities.

    The initiative will serve “to train the next generation” of biologists, says Joseph Perpich, HHMI's vice president for grants and special programs. “But it's also to provide very strong biology education to anyone who wants it.” The new grants, which range from $1.2 million to $2.2 million over 4 years, continue an undergraduate science program Hughes launched 10 years ago. All but four of the recipients have received grants in the past.

    Existing HHMI-funded programs range from a Biology Scholars Program at the University of California, Berkeley, that reaches out to women and minorities underrepresented in the life sciences, to an effort at the University of Arizona, Tucson, that matches undergraduates with faculty members conducting lab research. That program has expanded from 19 participating faculty members in 1988 to more than 230 today. HHMI support for undergraduate science “has really helped change the value system at research universities,” says Sam Ward, a professor of molecular and cellular biology at the University of Arizona and program manager for the HHMI grant.

    One of the rookies in this year's program, Clemson University in Clemson, South Carolina, plans to spend its $1.6 million grant on a combined effort among the biology, education, and earth science departments in training middle and high school teachers in hands-on biology methods. The University of Arizona, one of two schools receiving the maximum grant of $2.2 million, will also use some of its money to support teacher training. It plans a sabbatical program in which high school science teachers will spend a year on campus studying science.

    HHMI announced its new round of awards just a week after the National Research Council (NRC) released a report that pointed to a glut of life sciences Ph.D.s flooding the academic job market (Science, 11 September, p. 1584). Perpich says this new series of grants is not aimed at pushing more biologists into that pipeline. The intent, he says, is to produce graduates better educated in the life sciences, regardless of what career path they choose after college. Shirley Tilghman, a biologist at Princeton University who chaired the panel that wrote the NRC report, agrees: “I'm 100% behind these undergraduate science grants.” Tilghman, an elite Hughes investigator herself, says the Hughes grants “stimulate faculty [members by] giving them the resources” and the freedom to implement innovative teaching methods.


    China Sets Rules for Foreign Collaboration

    1. Li Hui*
    1. Li Hui writes for China Features in Beijing.

    Beijing—China is about to issue new rules governing the export of human genetic materials that will provide a legal framework for foreign collaborations in biomedical research. The rules strengthen the rights of patients involved in international studies and establish a formula for sharing any commercial proceeds among the collaborators. Although scientists who have read the rules generally applaud them, some worry that the additional bureaucratic procedures—including the collection of fees by local authorities at the start of a project—could raise the cost and extend the duration of many projects.

    The regulations, drafted by the Ministry of Health and the former State Commission of Science and Technology (now a ministry), will tighten controls on work being done in China by outside researchers and pharmaceutical companies. Press reports of such activities, including one in Science (19 July 1996, p. 315), triggered concern that foreigners were plundering China's genetic resources. As a result, all such collaborations ground to a halt last year while the government drafted the new rules (Science, 17 October 1997, p. 376). The State Council approved the regulations on 10 June, but health ministry officials say the country's preoccupation with the disastrous flooding this summer has delayed a formal announcement.

    One of the few scientists who has been given a copy of the regulations, Jiang Feng of Shanghai Medical University, says their existence is “good news. … At least we know where we stand and what to do next.” Jiang is working with Thomas London and others at the Fox Chase Cancer Center in Philadelphia on a study of the molecular epidemiology of liver cancer in men from the eastern Jiangsu city of Haimen. The study has been suspended for more than a year, and Jiang estimates that the delay has prevented early diagnosis of the disease in at least 150 patients. A colleague currently in the United States, Shen Fumin, is going over the new rules with his U.S. partners to make sure that the 5-year project, funded by the National Institutes of Health, is in compliance.

    The regulations will be implemented by an office, jointly staffed by both ministries, that is being set up to oversee all international collaborations involving human genetic materials—blood, tissue, organs, and so on—and to approve exports of such material. Any materials lacking the required approval will be confiscated by Chinese customs. “We will absolutely stop those projects that do not conform to the regulations,” says the health ministry's Yu Xiucheng, who helped draft the rules and is scheduled to head the new office. However, he says the office will be more lenient toward projects that will not take blood samples and other sensitive material out of China.

    Yu says the office plans to review applications in batches, every 3 months, using an expert panel of scientists from around the country. Before seeking approval from his office, however, potential collaborators must first submit an application and draft contract to local departments where the Chinese partner is located and include written approval from donors of any genetic materials and their relatives. “This is to show respect to local authority with the immediate access to the sources,” Yu says. Local authorities would collect an as-yet-unspecified fee, he adds.

    But some scientists are worried about possible delays, and they question the imposition of a fee. “Once every 3 months is too slow. It should take 4 weeks at most,” says Jiang. Harvard epidemiologist Xu Xiping also worries that the government will not provide enough staff to handle the workload and that some investigators will abandon their projects out of frustration. Other scientists say that the fee gives the impression that the government is trying to make money from research and selling the opportunity to do science to the highest bidder.

    The regulations also address the issue of ownership, both of the material itself and any commercial value it may have. Patent rights and any profits will be shared in proportion to the contributions of each party. Yu says the requirement for informed consent reflects the government's concern for human rights and brings China in line with Western practices. He adds that there are also plans to review the rule in 3 to 5 years.

    Yang Huanming, director of the newly established Human Genome Center within the Institute of Genetics of the Chinese Academy of Sciences, predicts that the new rules “will do more good than harm.” He notes that “most of these projects have already sought permission from the relevant authorities” but that “the new regulations provide uniform principles.” Wu Ming, a leading geneticist in Shanghai, praises the new provisions to protect patient rights. “After all, human beings are not animals, and they deserve due respect,” he says. The new regulations, he adds, should send an important signal to the global research community about working in China: “Those who used to do whatever they liked will now have something in their way.”


    Impact of Primate Losses Estimated

    1. Jocelyn Kaiser

    Like doctors battling a deadly disease, conservationists go about their work knowing that many species will die out despite their best efforts. A new analysis of looming primate extinctions now adds to the gloom: It suggests that the impact of extinctions in certain regions could be more damaging than one might expect from numbers alone, and that conservationists should pay more attention to the ecological value of species.

    A team at the State University of New York, Stony Brook, combined data on the endangered status of primates with information on what those primates do in an ecosystem—disperse seeds, pollinate plants, or serve as prey for other animals, for example. In the current issue of the Proceedings of the National Academy of Sciences, they predict that in some parts of the world entire guilds of primates that perform specific and critical ecological roles will be lost, leading to deep impacts on ecosystems. “If we eliminate some of these species, there's going to be a whole hunk of ecosystem health that will be gone forever,” says primatologist Patricia Wright, who did the work with her husband, evolutionary biologist Jukka Jernvall, at the University of Helsinki in Finland.

    Ecologists have long noted that the loss of species that do a specific ecological job can have ripple effects across an ecosystem. Jernvall and Wright sought to quantify such impacts for primates, well-studied mammals that play key roles in many ecosystems.

    The duo made their predictions using two potential waves of extinction, first removing all the endangered species, then all those now listed as threatened. Next, they examined 17 variables such as diet, habitat, tooth type, and body size and used them to characterize species' ecological roles—as predators or seed dispersers, for instance. Finally, they mapped out how the ecological diversity of the primates in a particular region would change as species die out.
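    The method can be pictured with a toy calculation. The species, traits, and status labels below are invented for illustration; the idea is simply to count how many distinct ecological roles (guilds) remain once endangered and threatened species are removed from a regional pool:

```python
# Toy sketch of the analysis; species, traits, and statuses are invented.
# Each species is characterized by a tuple of ecological traits; a
# "guild" is a distinct trait combination. We count guilds before and
# after removing endangered and threatened species.

species = {
    "lemur_a":  ("endangered", ("fruit", "canopy", "medium")),
    "lemur_b":  ("endangered", ("fruit", "canopy", "large")),
    "lemur_c":  ("threatened", ("leaves", "canopy", "small")),
    "monkey_a": ("endangered", ("fruit", "canopy", "small")),
    "monkey_b": ("secure",     ("insects", "understory", "small")),
}

def guilds(pool):
    """Distinct ecological roles represented in a species pool."""
    return {traits for _, traits in pool.values()}

def survivors(pool, lost_statuses):
    return {k: v for k, v in pool.items() if v[0] not in lost_statuses}

before = guilds(species)
after = guilds(survivors(species, {"endangered", "threatened"}))
print(len(before), len(after))          # 5 guilds shrink to 1
print("roles lost:", before - after)
```

    When the doomed species span many guilds, as in this toy pool, whole roles vanish; when they cluster in a few guilds, the ecological damage is closer to proportional to the species count.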

    The results varied dramatically by region. In South America, the set of doomed primates spans ecological niches, so the impact should be proportional to the number of species lost. But the ecological impacts will be worse in Asia, Africa, and especially Madagascar, where entire guilds of primates with similar specializations will be lost in a single clump. For example, in Madagascar the potential losses include a group of fruit-eating lemurs that disperse seeds, and in Africa they include the great apes, which also disperse seeds and eat massive amounts of foliage. “After that, no primate is doing that job in the forest,” Jernvall says. Such losses might hasten the extinction of trees dependent on the seed-dispersers and so affect organisms dependent on the trees, says Wright.

    But some primatologists say the results may not mean much for conservation. “It's an interesting exercise, but it doesn't get us that far in practical terms,” says Ian Tattersall, a primatologist at the American Museum of Natural History in New York City, who notes that knowing the ecological effects of extinctions doesn't help much in staving them off.

    Still, ecologist Stuart Pimm of the University of Tennessee, Knoxville, thinks this kind of study is quite valuable, because it helps “bridge the gap” between studies of extinction and of ecosystem productivity. “Most of what we do in terms of documenting species loss tends to look at the species as completely independent of each other,” says Pimm. “In fact, the better analysis would be that you're tinkering with a complex piece of machinery.”


    Fighting Corruption in the Quantum World

    1. Andrew Watson*
    1. Andrew Watson is a writer in Norwich, U.K.

    If there is one sure thing in the computer industry, it is that sooner or later, engineers will not be able to squeeze any more circuits onto chips. But an enthusiastic group of researchers is speculating about a whole new realm of miniaturization: devices so small that they operate according to the unfamiliar quantum laws of the atomic world. Quantum computers could remain a dream unless physicists can find a way around the vexing tendency of quantum information to leak away and degrade. But now a team of Los Alamos theorists and East Coast experimenters has shown that quantum computers could identify errors and fix them.

    “What we have done is demonstrated in an experiment for the first time that we can make quantum information more robust, that we can protect it against corruption,” says Raymond Laflamme, a member of the team at Los Alamos National Laboratory in New Mexico. According to David Deutsch of the Centre for Quantum Computation at the University of Oxford in England, “it's an important step toward the goal of building a useful quantum computer.”

    Current “classical” computers process information, or bits, as digital 0's and 1's. In quantum computers the element of information, the qubit, is a blend of both a 0 and a 1, their relationship expressed by the qubit's “phase.” This mingling allows an array of qubits to carry a whole swath of numbers simultaneously, even though actually reading the array will yield just one value as the quantum states “collapse.” By working on entire sets of numbers all at once, a quantum computer can in principle solve certain types of problems incredibly efficiently. Factoring big numbers, for example—a taxing task for today's computers—would be a cinch for quantum computers and would render obsolete today's most secure encryption systems, which are based on the difficulty of this task.
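    The blend-and-collapse behavior can be sketched numerically. This toy single-qubit simulator (not from the article; a standard textbook picture) stores the two amplitudes, applies a Hadamard gate to turn a definite 0 into an equal mixture, and collapses to a single classical bit when read:

```python
import math
import random

# A single qubit as two complex amplitudes (a|0> + b|1>).
# A Hadamard gate puts a definite 0 into an equal blend of 0 and 1;
# "reading" the qubit collapses it to one classical value.

def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state, rng=random.random):
    a, b = state
    p0 = abs(a) ** 2               # probability of reading 0
    return 0 if rng() < p0 else 1  # collapse: only one value comes out

state = hadamard((1.0, 0.0))       # start in |0>, blend it
print([round(abs(x) ** 2, 3) for x in state])  # → [0.5, 0.5]
```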

    Unfortunately, a passing atom can interact with a qubit, causing some of its information to leak away and introducing errors. Skeptics say that the fragility of quantum information threatens the whole idea of a practical quantum computer. Because there is no way to avoid the errors, the next best thing is to correct them. This is not easy for quantum information, because reading it out to check for errors or correct them instantly collapses the qubit array, spoiling its number-juggling capacity. “The whole trick of the quantum error correcting code was to find a way to know what the error was without knowing what the message is,” says Laflamme.
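    The classical ancestor of this trick is the repetition code: spread one bit across three, then measure only the parities of pairs. The parities locate a single flipped bit without ever revealing the message bit itself, which is the spirit of what the quantum code must do. A minimal sketch:

```python
def encode(bit):
    return [bit, bit, bit]  # spread one bit of information across three

def syndrome(codeword):
    # Parities of neighboring pairs reveal *where* an error sits,
    # without revealing *what* the message bit is.
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword):
    s = syndrome(codeword)
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)  # locate the error
    if flip is not None:
        codeword[flip] ^= 1
    return codeword

word = encode(1)
word[2] ^= 1                 # a stray interaction flips one bit
print(correct(word))         # → [1, 1, 1]
```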

    In 1996 Peter Shor of AT&T Bell Labs and, independently, Andrew Steane at Oxford devised a theoretical scheme for doing so. The basic idea is to spread the information of one qubit into a family of linked qubits so that, should any be corrupted, the information can still be recovered from its partners. Now Laflamme and his Los Alamos colleagues have teamed up with a group of Massachusetts-based specialists in nuclear magnetic resonance (NMR) to demonstrate the scheme with atomic nuclei that encode qubits in their magnetic orientations. A nucleus can behave like a small magnet and point either up or down relative to a strong magnetic field. Thus a molecule could be used as an array of qubits, with the nuclear orientations encoding 0's (up) and 1's (down). To control such an array, the researchers used NMR to manipulate the orientation of nuclear magnets by tweaking the nuclei with radio-frequency waves.

    In the 7 September Physical Review Letters, the team describes tests on two molecules: alanine, an amino acid, and trichloroethylene. Both provide a suitable set of three neighboring nuclear magnets: a single information qubit plus two control qubits to provide error correction. The researchers first used a radio-frequency pulse to twist the linked nuclear magnets into a particular position, then left them to the mercy of their surroundings. Errors caused the three magnets to drift out of alignment before a further radio-frequency pulse reversed the initial twisting. Because the three magnets are linked magnetically, enough information was contained in the misalignment of the two control qubits to allow the team to figure out the error on the information qubit without having to measure it directly. The experimenters then showed that they could correct the error with another pulse. “We've demonstrated in the smallest and simplest code that we had enough control to do the right operation and preserve the quantum information,” says Laflamme.

    Although enthusiastic about this demonstration of principle, other researchers emphasize that this is just the first step toward full quantum error correction. “It's a long way from three qubits to a quantum computer powerful enough to solve significant problems,” says Shor. A future “significant step” would be to demonstrate error correction in a five-qubit system, enough both to guard the phase information and to correct another type of error, one that flips the 0's and 1's, explains Shor. “We hope to do this in the next few months,” says Laflamme.


    NRC Seeks Boost for Base, Special Projects

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    Ottawa—Canada's oldest and most revered scientific institution, reeling from 3 years of budget cuts, is pleading with the government for more money to shore up its scientific base and to launch projects in five areas. The National Research Council (NRC) makes its case in a still-secret report, obtained by Science, that has been presented to government officials in a series of briefings. Insiders say that the NRC stands a good chance of regaining much of its core funding and winning approval for at least some of the special initiatives.

    Established in 1916, the NRC holds a premier place in Canada's science establishment as a supporter of basic and applied research in industrial sectors or technologies seen as critical to the nation's economic development. But a $58 million cut in its budget, now at $242 million, has forced NRC's own labs to seek out contract services and short-term R&D projects. “As NRC's capacity to maintain projects and facilities at the leading edge diminishes, key staff and top scientists may be lost,” officials write in the report, dated 31 August.

    To recoup its losses, the NRC is asking for increases of $16 million in federal funds in each of the next three fiscal years, beginning on 1 April 1999. It is also requesting $165 million for new “strategic initiatives” in five areas: aerospace, genomics, optoelectronics, fuel cells, and information networking. “NRC has been successful in the past because it has been willing to invest for the long term,” says president Arthur Carty. “Our strength is medium- to long-term R&D in strategic areas that are absolutely crucial to Canada.” Government officials declined to comment on the details of the plan.

    The fate of the initiatives, say observers, depends on the NRC's ability to make them stand out in a crowd. The proposal to build an optoelectronics prototyping foundry for small- and medium-sized businesses within the information and telecommunications sectors, for example, is being touted as “unique in the world.” But the plan to develop a national genomics program conflicts with a similar initiative from the Medical Research Council (MRC) that has been slow to get off the mark (Science, 3 July, p. 20), and with a separate proposal from the Canadian Institutes of Health Research (Science, 8 May, p. 821) for a network of centers that could include work in genomics. In fact, the MRC and NRC recently struck a preliminary agreement to develop a joint genomics initiative that would stand a better chance of winning support.

    One point in the NRC's favor is the promise of matching funding from industrial and university partners for each of the five proposed initiatives. That commitment is particularly strong for the aerospace proposal, which includes building a center in Ottawa for developing more energy-efficient gas turbines and an advanced aerospace manufacturing technology facility in Montreal. Some industrial partners have long been urging the government to do more. Last month, in announcing a planned 25% cut over the next 18 months in its R&D workforce, Pratt & Whitney Canada said the government's “commitment to support future R&D is insufficient to allow us to stay fully competitive in the global aerospace market.”


    Fast Chemistry Snares Stray Plutonium Isotope

    1. Robert F. Service

    Since it was first used to produce nuclear weapons in 1945, plutonium has inspired its share of fear. But the element has inspired mysteries as well, notably the case of its missing isotope. Over the past 50 years, researchers have isolated a total of 17 plutonium isotopes, all with different numbers of neutrons in their nuclei. But one predicted isotope—plutonium-231—remained at large. Now at last the search for plutonium-231 is over. At the American Chemical Society meeting in Boston late last month, a team from the University of California, Berkeley, and the Lawrence Berkeley National Lab (LBNL) reported using some fleet-footed chemistry to pin it down.

    “People have been saying all along that it should be there. But it wasn't easy to find,” says Alice Mignerey, a nuclear chemist at the University of Maryland, College Park. The challenge came in spotting plutonium-231's characteristic pattern of radioactive decay amid those of other nuclides. “It's really quite a coup these days to measure anything new,” says Mignerey. The new isotope isn't likely to find much practical use: It has a half-life of just 8.5 minutes. Still, other nuclear chemists are hailing the discovery for filling in a long-sought piece of the nuclide table and confirming models of nuclide stability.

    Isotope unmasked. Nuclide table showing the possible decay routes of the “missing” isotope plutonium-231.


    Making the isotope was the easy part for the Berkeley team, led by postdoc Carola Laue and chemistry professor Darleane Hoffman. The group used LBNL's cyclotron to bombard a stack of uranium targets with helium-3 ions. As the helium ions—each containing two protons and a neutron—collide with the targets, some or all of their protons and neutrons fuse with the uranium nuclei to produce new nuclides, in this case plutonium, neptunium, uranium, and thorium.

    Isotope hunters track their targets by looking for the characteristic chain of decays as an unstable nucleus splits apart or spits out various particles, yielding daughter nuclides that decay further until they reach a stable nuclide. Plutonium-231 is hard to identify, however, because its decay chain includes uranium and neptunium isotopes that can be produced in the same cyclotron reaction, mimicking the plutonium-231 signal. Hence the researchers had to do some rapid chemistry to sift the plutonium isotopes from the other elements in time to watch for the plutonium-231's signature decay chain.

    To do so, the researchers gathered the nuclides blasted out of the target into a gas stream flowing into a thin capillary tube. They had spiked the gas with ultrafine potassium chloride particles that bound to the radioactive elements. At the end of the capillary tube the particles, now laced with nuclides, were deposited on a collection plate. To extract the plutonium isotopes, Laue dissolved the potassium chloride in nitric acid, which then passed through a tiny separation column. The column contained an ammonium-based resin, which binds to heavy elements with four positive charges, snagging the plutonium and thorium isotopes. After washing everything else out of the column, Laue flushed out the thorium with hydrochloric acid, then added hydrogen iodide to free the plutonium so it too could be washed out. A quick dry-out left a residue of pure plutonium.

    The final challenge came in picking out plutonium-231's decay signal. Calculations suggested that plutonium-231 would either emit an alpha particle to create uranium-227 or snag an electron, converting a proton to a neutron and creating neptunium-231. Plutonium-232, which the cyclotron reaction also produced, emits an alpha to create uranium-228. The decay chains of all three of these daughter nuclides are well known. And because the researchers had previously removed any uranium or neptunium, they could now be sure that if their sensitive detector registered uranium-227 or neptunium-231, these chains originated from plutonium-231. Once they had placed their pure plutonium sample in the detector, “we just watched the [uranium-227 and neptunium-231] signatures grow back in,” says Hoffman, confirming the presence of plutonium-231.

    Laue and Hoffman note that several other isotopes, such as americium-231, remain to be found. Flushed with solving one mystery, the Berkeley detectives are now off to tackle another.


    Assembling the World's Biggest Library on Your Desktop

    1. Joseph Alper*
    1. Joseph Alper is a writer in Louisville, CO.

    The “universal library,” an amalgamation of all recorded human knowledge, searchable from your personal computer, sounds like a fantasy. But the elements are now under development.

    Ask Raj Reddy or Michael Shamos what the library of the future might look like, and they tend to get carried away. Imagine, they say, sitting at your computer and having access via a lightning-fast Internet connection to the entire corpus of recorded human knowledge and creation. Want to see the inside of Saint Peter's? Some keystrokes, a few mouse clicks, and you are off on your own walking tour. A painting high on a wall captures your interest—click on it and you can find out who its creator was and where you might see other examples of that artist's work, work of similar style, or perhaps a history of Italian baroque art. Or say you have observed what you believe is an unusual interaction between a species and its environment. A quick online query provides examples of similar interactions along with maps showing where those other habitats exist, videos of those species, sound clips of their vocalizations, and digests of research germane to your particular pairing—all in Spanish, your native language.

    As Reddy and Shamos, who are computer scientists at Carnegie Mellon University in Pittsburgh, see it, you will access this universal library from your desktop via a simple interface akin to today's search engines such as HotBot or Yahoo! Behind it, however, will lie multiple search tools that convert your request for information into formats and languages that can query thousands of individual digital libraries housing text, two-dimensional and three-dimensional images, music, maps, and other types of data. “We're talking about a means of bringing together materials from many libraries that have little in common, in a way that makes the process transparent to the user,” says Reddy. “Are we close to having such a library today? Not even remotely so. Can we get there? Yes, I believe that over the next 20 to 50 years, the truly universal library will exist and, importantly, that it will be worth the effort and money that we will spend to make it a reality.”

    Reddy, who coined the term “universal library,” is working with colleagues at Carnegie Mellon on methods of indexing and searching video clips, as well as creating searchable transcripts and abstracts of video images. He is also acting as chief negotiator, getting collection holders as large as entire governments to support this endeavor. And thanks in part to Reddy's evangelism, enthusiasm for the universal library has spread beyond Carnegie Mellon to computer science laboratories worldwide. In the United States, researchers are building on projects funded over the past 4 years by the federal Digital Libraries Initiative (DLI-1), which developed schemes to collect, store, and organize information in digital forms and make it available over communication networks. Now, with DLI-1 winding down, the challenge is to meld those individual libraries into a seamless whole—a feat that will require breakthroughs in information processing and retrieval, in addition to teraflops and gigabucks. U.S. funding agencies are set to kick in $50 million over 5 years for DLI-2, but industry will have to spend hundreds of millions more to make the universal library a reality.

    First, however, there are some tough nuts to crack. At the top of almost everyone's list is developing new, fast methods for finding information in what will be a widely distributed library. “It's amazing that we find anything using today's browsers and search engines,” says Susan Dumais, a senior researcher in the Decision Theory and Adaptive Systems group at Microsoft Research in Redmond, Washington. Storing and analyzing images, music, and other nontext information is a second challenge. Developing abstracting programs that can rapidly summarize search results or display results in some manageable form, so that users will not be swamped with too much information, is a third.

    “We have enormous problems to solve before we can efficiently search all the available text documents, let alone nontext information such as mathematical formulas or music or art, in a comprehensive and useful manner,” says Paul Kantor, director of Rutgers University's Distributive Laboratory for Digital Libraries. “But I don't doubt that those problems are solvable.” Adds Gloriana St. Clair, head librarian at Carnegie Mellon, “This effort now has enough momentum to make the universal digital library a reality.”

    Building blocks

    Some of the building blocks for the universal library are already being laid down: Hundreds, probably thousands, of digital libraries are quietly amassing huge collections of nearly every type of data imaginable and making them accessible via the Web. Some, such as Project Gutenberg, resemble traditional libraries, updated for the digital world. As of May, Gutenberg had 1306 out-of-copyright classic texts available at a mouse click for downloading in ASCII format. A step up in complexity are searchable libraries such as JSTOR, which now contains current and back issues of 83 academic journals. Then there are what could be called virtual digital libraries, such as the Electric Library, a commercial gateway through which subscribers can search hundreds of other private and public-access libraries, including newspaper and magazine archives and transcripts of government hearings.

    Still more complex are libraries of materials that are not as easily searched as straight text files. The Alexandria Digital Library at the University of California, Santa Barbara, which is getting ready to open its doors to the public, is a collection of maps searchable by location. Still to come are libraries containing digital representations of three-dimensional objects. Takeo Kanade at Carnegie Mellon, for example, has constructed a geodesic imaging dome fitted with cameras at each of the dome's 51 corners that produces a three-dimensional, manipulable image. At Stanford University, Marc Levoy uses a boom-mounted laser that travels around an object to create images with millimeter resolution, fine enough to see an object's imperfections. The Vatican will use this system to image its enormous collection of art in Rome. And the National Ethnological Museum in Osaka, Japan, uses five video recorders and a laser dimension-measuring system to digitize every object in the museum's collection for eventual online display.

    But this proliferation of libraries is far from the ideal that Reddy and others espouse of a universally accessible information source. “To begin with, we don't even know of all the digital libraries that exist and what's in them, so just as a start we need a worldwide registry of all the libraries and their content. This isn't that hard to do, we just have to set up the right mechanisms,” says Shamos, operations director of the Carnegie Mellon project. GenBank, a library of genetic sequence data administered by the National Institutes of Health, might be one model. Journals require investigators to register sequence data with GenBank before publishing papers on the work. Similarly, digital libraries might have to register their presence and content before being granted a uniform resource locator (URL) Web address.

    Unearthing buried text

    A far bigger challenge is devising ways to extract information from all these libraries. “We're getting very good at building data tombs, huge repositories in which information becomes buried forever. What we need now are ways of getting that information out of these repositories,” says Usama Fayyad, a senior researcher in the Decision Theory and Adaptive Systems group at Microsoft Research.

    Today's commercial search engines do a mediocre job at best at finding text-based items that satisfy a particular query, and they are not designed to find image- or sound-based information. The problem, say researchers, is that the information on the Web is so heterogeneous in structure as to be almost unsearchable using any one approach. “Key-word matching, the heart of your basic text search engine, is fine if you are searching a small database,” says Bruce Schatz, director of the digital library project at the University of Illinois, Urbana. “But that only works if all your documents are in one language, if everyone consistently uses the same words in the same way, and if the person conducting the search already knows all the terms germane to a particular topic.”
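    The limitation Schatz describes is easy to reproduce. A bare-bones key-word engine is just an inverted index of exact words, so a query phrased in one discipline's vocabulary never reaches a document phrased in another's (the documents below are invented):

```python
# A minimal key-word search engine: an inverted index mapping each
# word to the documents containing it. It finds exact matches only,
# so related work described in a different vocabulary is invisible.

docs = {
    1: "gene expression in drosophila wing development",
    2: "cell lineage in the nematode caenorhabditis elegans",
}

index = {}
for doc_id, text in docs.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

def search(query):
    """Return documents containing every word of the query."""
    hits = [index.get(w, set()) for w in query.split()]
    return set.intersection(*hits) if hits else set()

print(search("drosophila development"))  # → {1}
print(search("development"))             # finds doc 1, misses doc 2
```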

    Steve Cousins, a search expert at Xerox's Palo Alto Research Center in California, took on the challenge while he was a graduate student at Stanford. His system, the Digital Library Integrated Task Environment, provides a simple user interface hiding a translation system that can reformulate search requests into many different forms capable of interacting with different data structures at a contextual level rather than simply matching words. Now he is developing what he calls wrappers, programs that would sit over a digital library and translate its particular data structure into a form understandable by the particular search engine querying that library.
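    The wrapper idea is essentially the adapter pattern. In this sketch (all class and field names invented), a wrapper translates a generic field:value query into one library's idiosyncratic native call:

```python
# Sketch of a "wrapper" over one library's native interface; the
# CardCatalog class and its lookup signature are invented examples.

class CardCatalog:
    """A library with its own idiosyncratic lookup call."""
    def lookup(self, author=None, title=None):
        records = [("Reddy", "The Universal Library")]
        return [r for r in records
                if (author is None or r[0] == author)
                and (title is None or r[1] == title)]

class Wrapper:
    """Translates a generic field:value query into the native call."""
    def __init__(self, library):
        self.library = library

    def search(self, query):
        fields = dict(part.split(":", 1) for part in query.split())
        return self.library.lookup(author=fields.get("author"),
                                   title=fields.get("title"))

print(Wrapper(CardCatalog()).search("author:Reddy"))
```

    A search engine that speaks only the generic query form can then reach any library that has such a wrapper, which is the transparency Cousins is after.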

    Another approach to gleaning useful results from many different kinds of libraries is to analyze context as well as search for key words. At Microsoft Research, Susan Dumais's language processing group is using statistical models to analyze how often words occur together in documents as a means of drawing inferences about what a document says, rather than just the words it contains. Schatz's team has taken a similar approach to create searchable concepts within the 10 million items in the National Library of Medicine's MEDLINE database of biomedical literature (see sidebar on p. 1785).

    Making search engines more discriminating won't solve another problem that plagues Web searches today: the overwhelming number of potentially useful Web pages they often return, with little indication of the content of the pages or how they may be related. To ease this problem, Marti Hearst of the University of California, Berkeley, has developed Cha-Cha, a program that determines the home page for each item retrieved, records the shortest path to get from that home page to the retrieved page, and then groups together information that shares pathways. The result is an organizational map of the relationship between different pieces of information. Although this still requires manually browsing one page in each group of pages, it allows the user to narrow a search quickly to a smaller subset of information. “This seems simple, but it turned out to be a surprisingly effective way to organize what was otherwise perceived as a disconnected jumble of pages,” she explains. Cha-Cha is now being tested on the 300,000 Web pages in Berkeley's system.
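    Cha-Cha's organizing step can be approximated in a few lines. The real system records the shortest link path from a home page; this sketch (with invented URLs) uses the first path segment as a stand-in for the nearest home page and groups hits that share it:

```python
from urllib.parse import urlparse

# Rough sketch of Cha-Cha's organizing idea: group search hits under
# the home page they descend from, so a flat result list becomes a
# small map of related pages. URLs are invented.

hits = [
    "http://www.berkeley.edu/physics/courses/quantum.html",
    "http://www.berkeley.edu/physics/faculty.html",
    "http://www.berkeley.edu/library/hours.html",
]

groups = {}
for url in hits:
    parts = urlparse(url)
    # first path segment as a stand-in for the nearest home page
    section = parts.path.strip("/").split("/")[0]
    home = f"{parts.scheme}://{parts.netloc}/{section}"
    groups.setdefault(home, []).append(url)

for home, pages in sorted(groups.items()):
    print(home, len(pages))
```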

    In another effort to make sense of search results, a software development team at the University of Illinois, led by Schatz, Bill Pottenger, and Kevin Powell, has created cMap, which displays search results on a colored three-dimensional surface. Broad concepts sit on the tops of “hills” that the user can easily spot; diving down the face of the hills leads to more focused topics. Neighboring hills contain related concepts to which the user can easily jump. “It's a way of visually organizing data that reveals more about the relationships between different sets of information,” says Schatz.

    Others are working on systems that rely on an iterative, give-and-take approach with the user. Stanford's SenseMaker, designed by Michelle Q. Wang Baldonado, now at Xerox, and her Ph.D. adviser Terry Winograd, allows the user to explore a subset of an initial search result, observes the user's reactions to assess his or her preferences, and refines the search criteria to suit them.

    Getting the picture

    Searching images, including maps, artwork, photographs, and video, is a more challenging task. “For all the problems associated with searching text, it is still an inherently searchable medium,” says Shamos. “But with images, we don't have that built-in verbal content handle to grab onto.” The Getty Information Institute in Los Angeles is leading an effort to add that verbal content by developing standard reference terms for cataloguing digital images.

    Others are working on methods to search images based on characteristics of an image itself. Shih-Fu Chang, an expert in content-based image processing at Columbia University, and his colleagues have developed VisualSEEk, a collection of three search engines that query online art museum collections based on subject matter, physical properties, or similarity to a random sample of object types. For example, the SaFe search engine looks for objects that match color, texture, and composition elements specified by the user.

    Ultimately, the universal search engine will probably comprise several task-specific information seekers—one for images, another for video, a third for music, perhaps several for text—that will find all relevant knowledge to satisfy a particular query. But such a search engine will need multiple user interfaces. “There are many different cognitive search styles among users,” says Carnegie Mellon's St. Clair. One user, for example, might feel frustrated having to weed out extraneous information from a search, while others may want information peripheral to their main question.

    Besides these research problems, a number of practical issues are looming, most of them as yet with no solution at hand. Foremost is language—the Web is very English-centric today, but a universal library implies access for everyone, including those who speak Chinese, Swahili, and Urdu. This will require some type of translation system that ideally would provide literal translation on the fly and, at a minimum, translate concept and content information.

    Copyright and reimbursement issues could also restrict the availability of collections through a universal library. And then there is the problem of bandwidth—moving all this data around the world in a reasonable amount of time. The Alexandria Map Library, for example, ends up shipping its digitized maps to users on tapes via Federal Express because sending a gigabyte file via the Internet is not practical today.

    But Stephen M. Griffin, who heads the National Science Foundation's digital library efforts, is not worried. “This infrastructure issue is going to take care of itself because the commercial demand is there. The scientific and social issues that we have yet to solve are the real bottlenecks in creating a universal library.”


    Taming MEDLINE With Concept Spaces

    1. Joseph Alper*
    1. Joseph Alper is a writer in Louisville, CO.

    At 10 million entries and growing, the National Library of Medicine's MEDLINE is one of the world's largest public databases and a well-used resource for the biomedical research community. But this online collection of papers and abstracts is also unwieldy. “If you work hard at it and you have a lot of time, you can usually, but not always, find the information that you are looking for,” says Richard Berlin, a surgeon and medical director for Health Alliance, a regional health maintenance organization in Champaign, Illinois. “If you need an answer right away, forget it.”

    MEDLINE's problem is that in spite of its size and the many disciplinary boundaries that its entries cross, it is constructed along much the same lines as most other databases. Human indexers assign an average of 14 key words, known as Medical Subject Headings, or MeSH terms, to each bibliographic entry, of which an average of four so-called central MeSH terms provide the most specific categorization. Users query the database and retrieve all abstracts whose MeSH terms match those in the query. Where MEDLINE fails, though, is in searching for information that crosses even minor disciplinary boundaries—developmental genetics research on two different species, for example. “If you are a biologist working on Drosophila genes and you are trying to find out what's been done in the Caenorhabditis elegans literature, MEDLINE is not going to give you many good answers because the terminology that the two disciplines use, and the MeSH terms, are [different],” says information scientist Bruce Schatz of the University of Illinois, Urbana. Now, with the help of one of the largest computations ever, Schatz has come up with a scheme for bridging these vocabulary differences.

    Two years ago, as part of the Digital Libraries Initiative, Schatz and colleague Hsinchun Chen of the University of Arizona, Tucson, used semantic indexing techniques to more effectively classify a collection of 1 million abstracts in the engineering literature (Science, 7 June 1996, p. 1419). These techniques involved using key words to partition the large database into smaller collections that correspond to community repositories for specialized disciplines. Each smaller collection presumably shares a common set of jargon. Computations using statistical methods create sets of related entries, called concept spaces, listing all the terms that occur with one another within the collections. Terms that appear together in one concept space—“highway” and “pavement,” for example—are likely to be linked to the same underlying concept. The result is a thesaurus of alternative keywords for searching the entire database. Building this semantic index was a computationally intensive task, one that required 10 days of processing time on the world's fastest supercomputers at the National Center for Supercomputing Applications (NCSA) on the Illinois campus.
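    The core idea (counting which terms co-occur within a collection and using the strongest co-occurrences as a thesaurus for query expansion) can be sketched in miniature. The sketch below is a toy illustration under invented data; the actual system's term extraction and statistical weighting were far more elaborate than raw pair counts.

```python
from collections import defaultdict
from itertools import combinations

def build_concept_space(abstracts):
    """Count how often pairs of terms co-occur within an abstract.

    Returns a mapping term -> list of co-occurring terms, ordered by
    co-occurrence count (a crude stand-in for the statistical
    weighting the real system used).
    """
    cooccur = defaultdict(lambda: defaultdict(int))
    for terms in abstracts:
        for a, b in combinations(sorted(set(terms)), 2):
            cooccur[a][b] += 1
            cooccur[b][a] += 1
    return {t: sorted(nbrs, key=nbrs.get, reverse=True)
            for t, nbrs in cooccur.items()}

def expand_query(query_terms, concept_space, k=2):
    """Augment a query with the top-k related terms for each query term."""
    expanded = set(query_terms)
    for t in query_terms:
        expanded.update(concept_space.get(t, [])[:k])
    return expanded

# Toy repository: each "abstract" is reduced to its list of key terms.
repo = [
    ["asphalt", "highway", "pavement"],
    ["highway", "pavement", "traffic"],
    ["bridge", "load", "pavement"],
]
space = build_concept_space(repo)
print(expand_query(["highway"], space))  # "pavement" joins the query
```

A query for “highway” now also retrieves abstracts indexed only under “pavement,” which is exactly the cross-vocabulary bridging the thesaurus is meant to provide.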

    Now, Schatz has repeated this feat on the MEDLINE database, creating the first semantic index for a complete scientific discipline. The 10 million MEDLINE entries were partitioned into 10,000 community repositories, using the central MeSH terms that classify each entry. Each repository thus represents a specialized scientific subdiscipline, such as colon cancer or C. elegans genetics. NCSA's largest current supercomputer, the 128-node Silicon Graphics Origin 2000, then produced the semantic indexes, taking roughly 8 days for testing and 2 days to process the entire database. The resulting concept spaces are now searchable as part of the Illinois Digital Library test-bed.
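    The partitioning step itself is conceptually simple: entries are binned by their central MeSH terms, so an entry with several central terms lands in several repositories. A minimal sketch, with invented field names (MEDLINE's actual record format differs):

```python
from collections import defaultdict

def partition_by_mesh(entries):
    """Group bibliographic entries into community repositories keyed
    by their central MeSH terms. An entry carrying several central
    terms is placed in each corresponding repository.
    """
    repos = defaultdict(list)
    for entry in entries:
        for term in entry["central_mesh"]:
            repos[term].append(entry["id"])
    return repos

# Hypothetical records with invented identifiers and terms.
entries = [
    {"id": 1, "central_mesh": ["Colonic Neoplasms"]},
    {"id": 2, "central_mesh": ["Colonic Neoplasms",
                               "Caenorhabditis elegans"]},
]
print(partition_by_mesh(entries))
```

Each resulting repository is then small enough for the co-occurrence computation to run over it independently, which is what makes the whole job parallelizable across supercomputer nodes.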

    Early demonstrations have gone over well with the physicians who have had access. Querying is still done using word matching, but the searchers now rely on the semantic index rather than on MeSH terms. “It's wonderful,” says Jonathan Silverstein, a surgeon and member of the informatics faculty at the University of Illinois in Chicago. “I'm now getting far more useful information out of MEDLINE, and I'm getting it in a time frame that I was actually able to get information I needed while the patient was in my office.”

    For Schatz, this success is merely a precursor to what he hopes will be a semantic index of the entire Net. “I see this as a model for the day 10 years from now when we have a billion small community libraries distributed around the world and we will need a fast, searchable index of that body of knowledge,” says Schatz.


    A Closer Look at SNPs Suggests Difficulties

    1. Elizabeth Pennisi

    Using the wildly popular genome markers called SNPs to track genes may be less straightforward than researchers expected

    Skokloster, Sweden—During the past year, single-nucleotide polymorphisms, commonly referred to as SNPs (pronounced snips), have taken the genomics community by storm. SNPs are single-base variations in the genetic code that occur about once every 1000 bases along the 3-billion-base human genome. Many researchers think that knowing the locations of these closely spaced DNA landmarks will ease both the sequencing of the human genome and the discovery of genes involved in such major human diseases as asthma, diabetes, atherosclerosis, schizophrenia, and cancer. But earlier this month at the first international meeting devoted to SNPs,* enthusiasts heard sobering news.

    Although no one doubts that SNPs will ultimately prove to have some value in tracking disease genes and understanding human genetic diversity, new results presented at the meeting suggest that the task could prove more difficult than many had initially thought. In some cases, SNPs might fail to pick up disease genes, or researchers will need many more SNPs in and around a suspected disease gene than first anticipated to make their case. Other work suggests researchers will also need more information about the history of the people being studied, such as their migration patterns, to make sense of their SNP data.

    By the end of the 3-day conference, even the organizers went home shaking their heads. “There are so many problems and unanswered questions,” complained Anthony Brookes, a co-organizer and geneticist from the University of Uppsala in Sweden. “At the moment, we're finding our way in the dark.”

    Part of SNPs' appeal is that the wealth of genome data being generated by the Human Genome Project, together with a range of faster, cheaper ways to find SNPs (Science, 15 May, p. 1077; 17 July, p. 363), is causing these markers to pile up quickly in both public and private databases. They are much more plentiful than other markers, such as microsatellites, used as genetic landmarks for tracking genes. And they have the added advantage of existing within genes as well as near them, possibly making them useful in identifying the specific variant of the gene that causes disease.

    Indeed, most previous gene hunts required studying large, multigenerational families. But in 1996, epidemiologists Neil Risch at Stanford University in California and Kathleen Merikangas at Yale University in New Haven, Connecticut, suggested that SNPs might even be used to track down genes in unrelated people, particularly when the gene merely increases the risk for a disease. This would involve looking for differences in the patterns of SNPs between healthy and unhealthy people (Science, 13 September 1996, p. 1516). Prospects such as those led prominent geneticists, such as Francis Collins, director of the National Human Genome Research Institute, and Aravinda Chakravarti of Case Western Reserve University in Cleveland, Ohio, to propose that researchers find enough SNPs to perform such association studies (Science, 28 November 1997, p. 1580).

    But as two groups reported at the Skokloster meeting, using SNPs to track genes may be less straightforward than thought. Both groups had problems in trying to use patterns of DNA variation to link test genes to diseases with which they were already known to be associated—heart disease in one case and sickle cell anemia in the other.

    Working with Charles Sing from the University of Michigan, Ann Arbor, and his colleagues, population geneticist Andrew Clark from Pennsylvania State University in University Park focused on heart disease risk, first examining the role of the lipoprotein lipase (LPL) gene. Previous studies had shown that this gene, when mutated, causes high blood lipid concentrations and an increased incidence of heart disease in some families. Clark and his colleagues decided to use SNPs to find out which, if any, LPL gene variants might be increasing the risk for heart disease in the general population.

    To do this, Sing's team first sequenced a 9700-base pair region of DNA containing the LPL gene in samples obtained from 24 people from each of three populations: one in Finland, the second in Rochester, Minnesota, and the third in Jackson, Mississippi. The researchers found that the region contained 88 SNPs, seven of which were in the protein-coding regions of the gene.

    Clark and colleagues wanted to use the SNPs in epidemiological studies aimed at understanding the complex chain of genetic and environmental factors that affect heart disease risk. To do this, they first tried to construct a tree representing the historical sequence of mutations that gave rise to the SNPs. The idea was to group different variants of the LPL gene according to their ancestral relationships and then compare disease risk among the different lineages. But it immediately became clear that this would be difficult if not impossible; parts of the gene had been shuffled by recombination, the DNA exchanges that occur between the maternal and paternal copies of a gene during sperm and egg formation.

    “There has been almost as much recombination as mutation,” Clark reported, and that, he adds, “is going to make SNP mapping and association tests much more difficult.” Recombination can break down the correlation between the SNPs and variants that inflate disease risk, making it much harder to identify the association. Everybody hopes that only a few regions of the human genome will exhibit this high level of recombination, but where it does occur researchers will need many more SNPs to increase the odds of finding some that correlate with the pertinent mutations.

    Rosalind Harding of the John Radcliffe Hospital in Oxford, United Kingdom, shares Clark's concerns about the utility of SNPs, particularly if researchers try to depend on SNPs alone to identify disease genes. A single-base change in the β-globin gene has long been known to cause sickle cell anemia, and she tried to see if SNPs would reveal the mutant gene. By analyzing DNA samples from 500 people randomly selected from around the world, Harding and her colleagues found that the β-globin gene has dozens of SNPs located in and around its coding sequences.

    One of these SNPs turned out to be the sickle cell mutation itself. But when Harding looked at the frequencies of individual SNPs in the 500 samples, and also at inherited SNP patterns called haplotypes, searching for some sign that a particular SNP or haplotype was different, she found nothing that pointed to the sickle cell mutation. With SNP data alone, Harding concluded, “there will be nowhere near enough information to find something unusual and say ‘there's a disease gene.’”

    She predicts that, in addition to relying on SNPs, researchers will need to know about the patterns of disease and the history of the people being studied. “There has been this naïve idea that once you've gotten to the gene, you'll be able to decide which is the [pertinent] mutation,” she adds. “But this is going to be very hard.” Others concur. “You can't have just SNPs on their own,” says Nigel Spurr, a geneticist with SmithKline Beecham in Harlow, United Kingdom. “You must have [other information and technology] to go with it.”

    For statistical geneticist Joseph Terwilliger of Columbia University in New York City, Harding's and Clark's experiences with SNPs are indicative of the underappreciated complexity of the genome and of the pitfalls of thinking SNPs will easily lead geneticists to elusive disease genes. “Risch and Merikangas have been taken out of context” by overly enthusiastic promoters of SNPs' potential, he argues.

    Terwilliger notes that although Risch and Merikangas found association studies practical for identifying disease genes in which one mutation accounts for most of the increased risk, that situation may be uncommon. In a survey of all the new disease genes reported in the American Journal of Human Genetics during the past year and a half, Terwilliger found that about 90% of those genes had more than 10 pertinent mutations that predispose an individual to disease. With so many different mutations involved, none is likely to stand out in a SNP analysis. And that's the easy case, involving diseases caused by mutations in a single gene. The situation will be worse for cancer and the many other diseases in which multiple genes contribute to increased risk. “It's not just the underestimated complexity of the genome as much as it is the underestimated complexity of the etiology of a complex disease,” he adds.

    Researchers at Skokloster agreed that it will be difficult to gauge the usefulness of SNPs until they know more about how genomes vary between and within the world's ethnic groups. Because the most universal SNPs will be among the oldest, they are likely to exist in people both with and without disease. This means that there may be no distinctive pattern of SNPs specifically associated with a key variant of a gene. It could be easy to miss an important association or to make an association with the wrong gene variant. “If we don't think carefully before we do these experiments, we'll wind up with a lot of false signals,” Uppsala's Brookes says.

    Others at the meeting pointed out that association studies require that researchers look at much larger numbers of people than typical family studies, to sift out the false signals. “It's not enough to have 70 controls and 50 patients,” says Gert-Jan Van Ommen, head of the Human Genome Organization and a geneticist at Sylvius Laboratories in Leiden, the Netherlands. “You're talking about requiring populations of several thousand.” SNP analysis won't begin to be useful without new, high-speed technology for analyzing the thousands of DNA samples required, says Spurr.

    Even with these caveats, however, the researchers expect to see SNP research proceed. “We know they will be successful in certain situations,” comments Case Western's Chakravarti. “We just don't know how successful they will be.”

    Already, association studies have linked a few gene variants to diseases. The link between the ApoE4 gene and an increased risk for Alzheimer's disease in Caucasians is one often-cited example. And geneticist Daniel Cohen, head of Genset in Evry, France, says that his company has worked out many of the issues raised by the conference participants, in part by developing new methods—which he would not describe in detail—for analyzing the data and discerning real associations. In October, for example, he plans to announce the SNP-based discovery of two genes involved in prostate cancer. “I am absolutely confident of this strategy,” says Cohen. “It works.”

    Although others may not share Cohen's confidence, they want SNPs to be put to work. “Provided they are not being regarded as the panacea for complex disease findings, there is value in producing SNPs,” says Van Ommen. “[SNPs] are going to make a big difference.”

    • *The 1st International Meeting on Single-Nucleotide Polymorphism and Complex Genome Analysis was held in Skokloster, Sweden, 29 August to 1 September.


    More SNPs on the Way

    1. Ken Garber*
    1. Ken Garber is a science and health writer in Ann Arbor, Michigan.

    Late last year, the National Cancer Institute (NCI) launched a project to find genome markers called single-nucleotide polymorphisms, or SNPs, to use in tracking down the hundreds of genes thought to affect cancer risk. NCI has already put about $1 million into the project, called the Genetic Annotation Initiative (GAI), which began generating SNPs in the spring. Researchers running the initiative are hoping that their approach will avoid many of the problems in using SNPs discussed at a recent conference in Skokloster, Sweden (see main text).

    NCI is taking what NCI geneticist Ken Buetow, who oversees the GAI project, calls a “gene-based” approach. Instead of creating a genomewide map of anonymous SNPs, Buetow says, NCI will look for SNPs in the coding regions, and in the sequences at both ends, of several thousand genes suspected of contributing to cancer susceptibility or resistance. Besides the 100-plus known cancer-promoting oncogenes and the three dozen or so tumor suppressor genes, the pool will include DNA repair genes, genes that drive the cell division cycle, and genes involved in drug metabolism, immune responses, embryonic development, and cell migration and metastasis. Genes from the NCI's huge Cancer Genome Anatomy Project, which aims for a complete genetic profile of cancer cells (Science, 16 May 1997, p. 1023), will also be included as they're identified.

    Buetow expects the average gene to yield three to five SNPs, a marker density that makes it much more likely that at least one will be close enough to any cancer mutation to be inherited with it as a block—a phenomenon called linkage disequilibrium. That doesn't ensure researchers won't miss the mutation when screening cancer patients—one of the researchers describing SNP problems at Skokloster had just such an experience with the sickle cell gene—but it should help. “We are less dependent on linkage disequilibrium relationships existing over long distances,” says Buetow. “We're going to be right inside the genes.” The data generated by GAI will also help determine how common the problems reported at the meeting are.
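    The reasoning behind marker density can be illustrated with a back-of-the-envelope calculation. All the numbers here (gene length, linkage-disequilibrium distance, the uniform-placement assumption) are invented for illustration and are not NCI's figures; real recombination is anything but uniform.

```python
def prob_marker_nearby(gene_len, n_snps, ld_dist):
    """Chance that at least one of n_snps markers, scattered
    independently and uniformly along a gene of gene_len bases,
    falls within ld_dist bases of a given causal mutation.
    A gross simplification, but it shows why density helps.
    """
    window = min(1.0, 2 * ld_dist / gene_len)
    return 1 - (1 - window) ** n_snps

# Hypothetical numbers: a 10,000-base gene region, with markers
# informative only within 1,000 bases of the causal site.
for k in (1, 3, 5):
    print(k, round(prob_marker_nearby(10_000, k, 1_000), 2))
```

Under these toy assumptions, going from one SNP per gene to five raises the chance of having a usable nearby marker from about 20% to about 67%, which is the intuition behind Buetow's three-to-five-SNPs-per-gene target.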

    Once identified, the SNPs will be posted on a database of the National Center for Biotechnology Information, where researchers can access them and design and conduct “association studies” to see if the SNP patterns of cancer patients are different from those of controls. The hoped-for result: hundreds of new cancer genes. Cancer researchers welcome the new initiative. “Given the present technology, it seems to be the obvious next step,” says Sofia Merajver, a breast cancer researcher at the University of Michigan, Ann Arbor.

    Several issues are still up in the air, however. NCI hasn't decided which populations to screen for SNPs. Right now it's using the DNA of four people from the largely Caucasian families collected at the Centre d'Étude du Polymorphisme Humain in France. The GAI wants more diversity, but no one agrees on what that means. “There is concern about stigmatization of populations and concern about what is a representative population,” says Buetow. “There are going to be dramatic differences [in SNP frequency] based on geography.”

    Also under debate is the question of how deep to dig for cancer SNPs. Some would be satisfied with the common ones, in which case screening as few as eight individuals should yield the vast majority. But others argue that the newer, rarer SNPs are also needed, because they're more often in linkage disequilibrium with cancer mutations and thus more likely to show up in cancer association studies.

    But the biggest question mark is what technology will be used to discover SNPs and then to detect or “score” them in cancer patients. “Not only has this not been done on a mass scale, but new technologies are being developed so fast, it's hard to know what to do,” says the NCI's Mike Dean. To begin, Dean is using a high-performance liquid chromatography mutation-detection method developed by Peter Oefner of Stanford University. Buetow is doing conventional gel-based sequencing, which would be tedious and expensive for large-scale studies.

    One technology now in high demand is the DNA “chip,” which can quickly identify SNPs across long stretches of DNA. Affymetrix, a Santa Clara, California, biotech company, has developed such chips, which researchers at the Whitehead Institute for Biomedical Research at the Massachusetts Institute of Technology and Affymetrix are using to do SNP prospecting (Science, 15 May, p. 1077). The National Institutes of Health is now negotiating with Affymetrix for a license, and both parties are optimistic. “We would be very happy to collaborate with the NIH in the area of SNP discovery,” says Robert Lipshutz, Affymetrix's vice president of corporate development.

    Whatever the outcome, Buetow is optimistic about finding methods that will make all kinds of cancer gene discovery projects easy. “We hope to push the technology to enable investigators to do any kind of study they want to do,” he says.


    Software Helps Australia Manage Forest Debate

    1. Elizabeth Finkel*
    1. Elizabeth Finkel writes from Melbourne.

    A computer program to promote biodiversity gives loggers and conservationists a chance to end their fierce fighting over forest reserves

    Melbourne, Australia—The forests of New South Wales (NSW) have seen many bitter battles in the last 20 years between logging interests eager to feed an insatiable Japanese appetite for wood pulp and conservationists trying to preserve the country's dwindling arboreal heritage. Those battles have taken a heavy toll on the participants. Just ask Col Dorber, the executive director of NSW Forest Products Association. In 1995, Dorber suffered a stress-induced heart attack after being roundly condemned by government and industry officials and vilified in the media for publicly defending a logger caught punching a “greenie.”

    Now back on the job, Dorber sees his remarks as an unfortunate reflection of the historic enmity between the two camps. That's why he's so encouraged by an experiment drawing attention from ecologists and resource managers around the world that attempts to inject science into forest management and that respects the interests of all parties. “Since 1995, we've been through a culture change,” he says. “Prior to that, we [industry and conservationists] wouldn't speak to each other. But now we've learnt to respect each other. It's a fantastic process.”

    That process is a joint initiative by the federal and state governments to negotiate long-term agreements for forest reserves that allow continued logging while maximizing biodiversity. At the core of the negotiations is a computer program, called C-Plan, that gives adversaries a chance to trade in their swords for software. Like some ecological card game, the software puts a biodiversity value on each parcel of land and presents stakeholders with various packages that meet the conservation target. C-Plan was developed by NSW National Parks and Wildlife Service conservation planners Bob Pressey, Simon Ferrier, and colleagues, and programmer Mathew Watts at the University of New England in Armidale, NSW. So far it has been used in two major sets of negotiations; a third exercise, involving a large swathe of old-growth forest, has just begun.

    “It's setting the gold standard in the field,” says ecologist Reed Noss, co-executive director of the Conservation Biology Institute in Corvallis, Oregon, and president of the international Society for Conservation Biology. Indeed, the World Bank is using C-Plan for an assessment in Guyana, and Pressey is currently in South Africa to help plan new reserves in the southwest portion of the country. Officials at the U.S. Fish and Wildlife Service (FWS) are also thinking of applying it to reserve designs now under way in Indiana and Illinois as part of a national assessment of biodiversity. “We think C-Plan might give us a repeatable and scientifically defensible tool,” says FWS biologist Forest Clark, who leads the Indiana team.

    Pressey has pushed to get science into the process of reserve selection since 1986, when he attempted the first systematic assessment of the state's western region. Although a large number of hectares had been reserved, he concluded that the attempt to preserve biodiversity had failed and that most reserves were in areas left untouched simply because they were too rugged for logging, pasture, or mining. Pressey, Ferrier, and Watts developed C-Plan to improve the process of designating reserves. The program puts an “irreplaceability” value on land based on its contribution to biodiversity. For example, an area of pasture that connects two sullied remnants of rare “swamp heath” forest may be deemed more valuable than a stand of pristine mountain forest.

    Carving out a plan.

    Comprehensive assessments for two forest areas followed an initial analysis in 1996 of eastern New South Wales.


    The program does not set conservation goals. Those are determined in advance by a panel of experts, chosen by the various stakeholders, in accord with a 1992 national policy that recommends reservation of forest types at 15% of 1750 levels. Instead, C-Plan is applied to decide which bits of forest are most valuable as reserves. Unlike most reserve-selection programs, which apply a “pass or fail” test to select the so-called “minimum set” that will achieve the conservation target, C-Plan keeps every land unit—and its irreplaceability value—on the table so negotiators can mix and match options.
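    The flavor of per-site irreplaceability scoring can be caricatured as follows. The site names, feature areas, and the scoring rule are all invented for illustration; the measure C-Plan actually computes is more sophisticated than this.

```python
def irreplaceability(sites, targets):
    """Score each site by how hard its features are to obtain elsewhere.

    sites:   mapping site -> {feature: hectares held}
    targets: mapping feature -> hectares required in the reserve system
    A site scores 1.0 if losing it makes some target unreachable;
    lower scores mean its features are widely available elsewhere.
    (A toy measure, not the actual C-Plan statistic.)
    """
    totals = {f: sum(s.get(f, 0) for s in sites.values())
              for f in targets}
    scores = {}
    for name, held in sites.items():
        score = 0.0
        for f, required in targets.items():
            if held.get(f, 0):
                elsewhere = totals[f] - held[f]
                if elsewhere < required:
                    score = 1.0  # the target fails without this site
                else:
                    score = max(score, held[f] / totals[f])
        scores[name] = score
    return scores

# Invented example: "swamp heath" survives only in two small remnants,
# while pristine wet forest is abundant.
sites = {
    "pasture_link": {"swamp_heath": 40},
    "remnant_a":    {"swamp_heath": 30},
    "mountain_a":   {"wet_forest": 500},
    "mountain_b":   {"wet_forest": 400},
}
targets = {"swamp_heath": 60, "wet_forest": 300}
print(irreplaceability(sites, targets))
```

In this toy run the humble pasture link scores 1.0 (the swamp-heath target cannot be met without it) while each pristine mountain stand scores well below 1.0, mirroring the counterintuitive valuations the article describes. Keeping these scores attached to every land unit, rather than emitting a single minimum set, is what lets negotiators trade parcels while watching the consequences.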

    C-Plan was designed with conflict resolution in mind. “Negotiators sit around the computer screen together, request modifications, and see the outcomes,” explains Tom Barrett of NSW's National Parks and Wildlife Service, who has worked extensively with the program. “The idea is to make the whole process as scientific and transparent as possible.” C-Plan grew up alongside another program, Bio-Rap (rapid assessment of biodiversity), developed by a team at the Australian National University (ANU) in Canberra, which suggests a pattern of land use that will minimize costs while maximizing biodiversity for a given conservation target.

    For years, however, C-Plan was an idea looking for an application. Its big opportunity arrived in 1995 with the election of a Labour government, which had placed forestry reform at the top of its list of election promises. The first region to be targeted for assessment was public land along the state's eastern seaboard. More than half of this region is covered by forests, including the wild remnants of Australia's Gondwana heritage, which contain one of the largest concentrations of species in the country. But to the logging industry the area also represents a lucrative production forest, with 2.4 million hectares on state-owned land and more on private land.

    The region is infamous for confrontations such as the one that enveloped Dorber. Although the Labour government had offered funds to help restructure the industry in return for a reduction in logging, the devil lay in the details of deciding which areas should be reserved and which sacrificed. Even industry officials knew that timber production had to be curtailed. “We had a timber demand that was unsustainable,” says Dorber.

    State officials decided to use the homegrown C-Plan for a 1996 assessment that spanned four intense weeks of negotiations. One of the outcomes was the creation of nine new nature reserves covering 250,000 hectares, nearly 20% of which was on land rated by C-Plan as highly irreplaceable. Pressey says the result is a victory for rational reserve selection given competing interests. “It was driven by explicit conservation targets,” he says, “and people decided on areas that would contribute to those targets while trying to maintain timber supply.”

    The deferred areas were to be revisited in a second round of negotiations. The first of these involved 120,000 hectares in the Eden region—a forested paradise in southeastern NSW and the center of a large woodchipping operation by Harris-Daishowa. Eden supplies more than half the state's pulp logs. Four options were developed, all of which appear to meet 40% to 50% of the conservation targets. Yet within that apparent convergence, conservationists and industry took polarized approaches. For example, conservation groups pushed for the minimum logging quota set by the NSW government, while industry opted for the government's minimum number of reserve hectares. Numbers aside, the options also differed greatly in how the suggested reserves should be structured.

    As a result, neither side is happy with the outcome. Many stakeholders said that the Eden exercise, unlike the first assessment, was neither a genuine negotiation nor an open process. The federal and state governments, representing different parties, failed to agree on the conservation targets proposed by outside experts. And many observers feel that state officials rushed the process in hopes of achieving results before the next election in early 1999. There was also the problem of limited data. Uncertainty about the distribution of some animals meant that decisions about land use were made in the dark, says Henry Nix, director of ANU's Center for Resource and Environmental Studies. “[The lack of knowledge] is a disaster,” he warns. “We could be giving away the crown jewels.”

    The next test for C-Plan is the 10 million hectares in the Northeast region, much of it majestic stands of untouched old-growth forests, where the data sets are far more solid. It is also an important center of biodiversity that, together with bordering Queensland, boasts Australia's only representation of wet subtropics ecology. These are the best studied of NSW's forests, with well-established preservation targets for some 200 forest ecosystems, 800 endangered plant species, and 140 animal species.

    The negotiations, which began this month, will be a critical test of C-Plan. But scientists are hoping that, with so much at stake and so many stakeholders, science will not again play second fiddle to politics. “A little bit of science is all we can hope for,” says Andrew Beattie, director of the Key Center for Biodiversity at Macquarie University in Sydney. “At least it's better than the days of the old boys in the back room.” And it's certainly better than loggers and environmentalists punching each other out.


    From Supermarket Boss to Science Minister

    1. Nigel Williams

    This summer saw some radical changes to Britain's administration of science; the new minister, Lord Sainsbury, talks about his role

    London—Some 3 decades ago, former British Prime Minister Harold Wilson offered one of the more memorable election promises in the country's history when he pledged to forge Britain's future in “the white heat of technological revolution.” His successor, Tony Blair, seems to have taken a leaf from Wilson's book. In early July, Blair announced a $1.75 billion boost in research funding. Then in a Cabinet reshuffle later in the month he appointed a wealthy business executive and science enthusiast, David Sainsbury, as Britain's new science minister. At the same time he strengthened the position of his science adviser, population biologist Robert May, and appointed his close colleague, Peter Mandelson, as Sainsbury's boss at the Department of Trade and Industry (DTI). Shortly afterward, he appointed an industrial physicist, who has long advocated boosting basic research, as the new director-general of Britain's six research councils.

    All this is being seen by researchers as a welcome sign that, after many lean years under the former Conservative government, the present administration is finally paying attention to their needs. One of Mandelson's first acts as DTI chief was to address scientists at Imperial College, London, on the government's commitment to a strong science base. John Mulvey, spokesperson for the lobby group Save British Science, says the move was “very encouraging.”

    Sainsbury, former head of a large supermarket chain, is one of a clutch of Labour supporters from business and the arts appointed by Blair to the House of Lords to reduce its Conservative bias. He told Science this month that he “was absolutely delighted” with the job. “It brings together my three main interests: business, politics, and science,” he says. His appointment has been welcomed by researchers. “He has demonstrated a deep interest in science and has used his own influence and resources to support a number of areas,” says Mulvey.

    Sainsbury is responsible for the Office of Science and Technology (OST), which disburses the bulk of Britain's basic research funding through the research councils, represents Britain's space interests, and pays dues to international bodies such as the European Space Agency. In January his team will be joined by John Taylor, director of Hewlett-Packard's European research center in Bristol, who takes over as head of the research councils. Taylor's support for a strong science base, which he has voiced to many parliamentary inquiries, has won plaudits from researchers, says Mulvey. Meanwhile, May, who heads the OST, gets at least a symbolic boost with the addition of a permanent desk in the Cabinet Office, so that he can work more closely with the Prime Minister's policy-making staff.

    Sainsbury brings one particularly relevant skill to his new post: He has run his own research foundation, a family charity called the Gatsby Charitable Foundation (see sidebar). In his new job, he will be working closely with a much larger private philanthropy, the Wellcome Trust, which is kicking in $640 million to the government's efforts to help upgrade university equipment and buildings. “The new partnership is tremendous, and I'm very much looking forward to working with the Trust,” he says.

    After the lean Conservative years, Sainsbury says one of his first priorities is to ensure the new money is used to repair the impoverished research infrastructure in universities. “Researchers have been living on their seed corn in recent years,” he says. “I hope the additional funds will go a long way to improving buildings and equipment” (Science, 21 August, p. 1141). He is also concerned about the number of researchers stuck in a series of short-term contracts and believes the problem needs to be tackled, in part by strengthening career guidance at the doctoral and postdoctoral level. “I think it is important to address these people issues which are joint problems for the research councils and the universities.”

    With his years of business experience, Sainsbury is also eager to tackle the perceived inability of British businesses to exploit research. “We are good at transfer of some elite science to the pharmaceutical, aerospace, and biotechnology industries, but we need to make certain that as many other industries as possible create competitive advantage by making better use of our science and engineering base,” he says. And his consumer retail background makes him acutely alert to the needs of the public. He supports a new panel, created by his predecessor John Battle, to coordinate public consultation on developments in the biosciences, and sees it as a model for involving the public in science policy-making. “Public confidence in some areas of policy has become quite low,” he says.

    But the government's changes in the administration of science have not satisfied everyone. Oxford University neurobiologist Colin Blakemore, president of the British Association for the Advancement of Science, told the group's annual science festival last week that he thought the OST is inappropriately located within the DTI. “It makes sense to place science in its own department with a minister at Cabinet level,” he says. Sainsbury rejects such a move: He told a press conference at the meeting that merely making the OST independent would make little difference to the administration of science, while taking research away from other departments into a single ministry would “create a worse set of problems than the one you are trying to cure.”

    Sainsbury: Science Philanthropist

    1. Nigel Williams

    Britain's new science minister, David Sainsbury, is no stranger to science or the funding of science. Thirty years ago, he set up the Gatsby Charitable Foundation, which currently awards $32 million annually to projects in seven main fields, including basic plant science and, more recently, cognitive neuroscience.

    The foundation has provided $35 million over the past decade—one of its biggest projects—to build and support the Sainsbury Laboratory for molecular plant pathology. The lab, which forms part of the John Innes Centre (JIC), a government-funded plant and microbial science lab in Norwich, has recently undergone international peer review and won funding for a further 5 years. Gatsby's philosophy of funding projects generously has attracted top-quality scientists to the lab, says JIC director Richard Flavell: “The value of the lab to the center is enormous.”

    As part of a new Gatsby initiative, Geoff Hinton, a British computational neuroscientist currently working in Canada, has been lured back home to set up a new group at University College London with a $16 million award from the foundation. “The funding is amazingly generous and will allow researchers to concentrate on their work and not have to worry about writing grant applications,” says Hinton.

    Sainsbury, who planned to study history at the University of Cambridge but switched to a joint degree with psychology in the late 1950s, says he first became interested in science because of the enthusiastic accounts from science student friends about the discoveries flowing from Watson and Crick's elucidation of the structure of DNA. “It was a tremendously exciting time to be at Cambridge,” he says. He named the foundation after one of his favorite books, F. Scott Fitzgerald's The Great Gatsby, to distance its charitable work from the Sainsbury's supermarket business.


    A Sweet Way to Keep Proteins Safe

    1. Carol Potera*
    1. Carol Potera is a freelance writer in Great Falls, Montana.

    A strategy used by plants and insects to wrap proteins in a protective sugar cocoon is poised to move from lab bench to marketplace

    To Carl Leopold, the secret to life is rock candy. In 1987 the plant physiologist at the Boyce Thompson Institute in Ithaca, New York, set out to probe how corn seeds can survive dry storage for as much as half a century, then, after being sprinkled with water, revive suddenly and send up shoots. Leopold found that a dried seed's delicate enzymes and membranes are protected by a kind of glass armor—hardened sugars that melt in water, releasing the seed's protein machinery to crank up the process of germination.

    Now, drug developers are wielding these same glassy shields to protect fragile protein drugs from the harsh world until they are absorbed by the bloodstream. Inhale Therapeutic Systems, a biotech company in San Carlos, California, has licensed the approach and begun clinical tests of an inhalable insulin encapsulated in sugar. The company is now exploring the same strategy for other therapeutic proteins, from calcitonin for osteoporosis to α-1-antitrypsin for emphysema. And other researchers hope to develop sugar-encased drugs and nutritional supplements for countries that have scant means for refrigerating and transporting fragile proteins. Leopold says he's amazed at how this basic plant survival mechanism, called vitrification, is blossoming into a practical tool. Adds John Baust, director of the Center for Cryobiological Research at the State University of New York, Binghamton, “We're going to see more applications of vitrification emerge in the coming years.”

    When Leopold struck out on his sugar-coated path, he says, his team members “certainly weren't thinking of drug delivery systems at all.” After finding that corn's glass is made from sucrose and raffinose in a 3:1 ratio, Leopold's group set out to reproduce the sugar shield in the test tube. The researchers dissolved the sugars and mixed them with the enzymes isocitrate dehydrogenase, glucose-6-phosphate dehydrogenase, or luciferase, all of which are easily damaged when dried. They used a vacuum pump to dry the solutions and put the residue on a shelf. When liberated from the sugars several weeks later, the enzymes worked fine. The vitrified sugars, the researchers found, protect proteins from denaturation—much as insects stuck in tree sap can be preserved for millions of years when the sap hardens to amber.

    Leopold wasn't the only researcher to discover this life-preserving mechanism. In the late 1980s, Felix Franks, a physical chemist and water expert at the University of Cambridge in England, found that certain insects also tolerate freezing by forming glassy sugars. When temperatures drop below about −20 degrees Celsius, insects such as Eurosta solidaginis and Epiblema scudderiana—bugs that trigger gall formation in plants—convert their reserves of glycogen, a glucose polymer, into an array of simple sugars and sugar-based alcohols. The sugars then vitrify, preventing bodily fluids from forming damaging ice crystals. In spring, the glass melts, vital proteins regain activity, and the insects revive.

    Unknown to each other, Leopold and Franks applied for U.S. patents on their glass-stabilization methods in the early 1990s. “It's fascinating that two scientists came to fairly simultaneous inventions from totally different backgrounds,” says Stephen Hurst, Inhale Therapeutic's vice president of licensing and intellectual property.

    Inhale is now developing the idea into an inhalable insulin, a product that has eluded scientists for decades. Researchers coat insulin with sugars, then shoot the mixture through nozzles that disperse it into particles, less than 3 micrometers in diameter, that dry immediately. In the lungs, the tiny spheres dissolve and are absorbed. In a clinical trial completed last June, 60 diabetics on inhaled insulin for 3 months maintained blood sugar levels as stable as those of diabetics on injected insulin. Inhale plans to launch a larger trial in November before seeking federal approval to market the insulin.

    The approach does have drawbacks: For instance, the inhalable insulin is effective for only a few hours at a time, so most patients would need a nightly injection to control insulin levels while they sleep. Still, many needle-shy patients “would be better off if they could get more insulin,” says diabetes specialist Carl Grunfeld of the University of California, San Francisco. They might benefit from the inhalable version, he adds. Inhale's other inhalable drugs are at earlier stages in the pipeline.

    Animal-feed supplements could also benefit from sugar-coating. Leopold and Cornell University animal nutritionist Xingen Lei are trying to improve commercial corn and soybean pellets by spiking them with a sugar-encapsulated enzyme called phytase. The enzyme should enable livestock to exploit more of the phosphorus in the pellets because it breaks down phytic acid, which is attached to much of the phosphorus and limits the element's absorption.

    Such work could have “tremendous application” to people too, Lei says, because phytic acid contributes to malnutrition by locking up calcium, zinc, and iron in grains that are staples in many developing countries. Says Cornell nutritionist Gerry Combs, who studies mineral deficiencies in Bangladesh, “It would be a wonderful thing if we had an economical, heat-stable phytase supplement made available to all people in the developing world.” A little sugar, it seems, is the best way to make the nutrients go down.
