News this Week

Science  13 Sep 2002:
Vol. 297, Issue 5588, pp. 1784


    Pioneering Stem Cell Bank Will Soon Be Open for Deposits

    1. Gretchen Vogel

    The world's first embryonic stem cell bank should be up and running within a year.

    The U.K. Medical Research Council (MRC) announced on 9 September that it has awarded a contract to establish the new bank, which will collect and distribute stem cell lines derived from human embryos and fetal and adult tissue.

    Scientists hope they can learn to use stem cells to treat a range of maladies such as diabetes and Parkinson's disease. If that dream is realized, the bank might someday hold thousands of human embryonic stem (hES) cell lines, says George Radda, executive director of MRC. The idea is to collect enough cell lines to match the immune system of any possible patient, he says. By some estimates that could require as many as 4000 cell lines. The project will begin on a much smaller scale, though. The National Institute for Biological Standards and Control (NIBSC), which will run the bank, will receive £2.6 million ($4 million) over 3 years, three-quarters from MRC and one-quarter from the Biotechnology and Biological Sciences Research Council.

    The bank could help the United Kingdom establish a lead in stem cell research. Federal funds cannot be used for embryo research in the United States, and the European Union is still debating its policy (see sidebar), but scientists in the United Kingdom can use government money to derive new hES cell lines. (hES cells, which can in theory become any type of cell in the body, are controversial because they are derived from week-old human embryos.) “We in Britain now have a world-leading advantage and need to make sure we can exploit that,” says Radda.

    Clarifying plans.

    NIBSC won a $4 million contract to run the bank.


    U.K. researchers must obtain a license from the national Human Fertilisation and Embryology Authority to derive new cell lines, and one of the conditions of the license is that any resulting cell lines must be deposited in the stem cell bank. In addition, the bank “will actively recruit” the holders of existing hES cell lines around the world to deposit their cells, Radda says, and it will work out intellectual-property agreements with cell donors on a case-by-case basis.

    The bank will make cells available to researchers worldwide, although all donors and recipients will have to abide by ethical conditions yet to be set by the bank. Academic researchers will pay a modest fee for the cells, Radda says, but the bank hopes to support itself in part through higher fees from commercial researchers. An independent steering committee—to be announced in the coming weeks—will draw up the bank's detailed rules of operation.

    NIBSC, located in Potters Bar, just outside London, oversees standards and safety for a range of biological products including vaccines, blood products, and hormones. Stephen Inglis, the institute's director, says the bank will not conduct research on the cell lines, but it might help establish standards for culturing and characterizing various types of stem cells, something the field still lacks. It would also oversee safety issues for any future clinical trials with the cells.

    The bank “is an extremely important move” for the stem cell field, says Roger Pedersen of the University of Cambridge. “One of the most difficult points for anyone starting out in the field is getting hold of well-characterized cells,” he says. The stem cell lines that Pedersen's lab derived when he was a researcher at the University of California, San Francisco (UCSF), might not end up in the bank, however. UCSF holds the rights to the cells, and university spokesperson Jennifer O'Brien says it's unclear whether the university would relinquish control of the cells to another body. Indeed, it's an open question how many holders of existing hES cell lines will be willing to relinquish even partial control of one of biomedicine's hottest commodities.


    Framework and Stem Cells: The Fight Goes On

    1. Gretchen Vogel

    Disagreements over stem cell research have again thrown the European Union's 6th Framework program into turmoil. The 5-year, $17.4 billion research program is due to be officially launched in early November, but its final approval has been delayed by drawn-out wrangling over what kinds of embryo research the program can support. Now the latest deal has been derailed.

    Research on human embryos and the embryonic stem cells derived from them is only a tiny slice of the program's total budget, but it is by far the most controversial area. Some E.U. states have their own strict prohibitions on embryo research and have clashed with those in favor of allowing research to go forward. A compromise proposed in late July would have prohibited funding for research that would harm human embryos—such as the derivation of new embryonic stem cell lines—until the end of 2003. E.U. funds could have continued to support research on already-existing cell lines while a committee drew up new rules for embryo research for the remaining 4 years of the 6th Framework program.

    The E.U. Council of Ministers was set to give its final approval to the plan on 6 September, but several members of the European Parliament protested that the moratorium had not been authorized by Parliament and was therefore illegal. If the moratorium went forward, they threatened, Parliament could block the program's entire budget. The threat sent negotiators back to work. A new proposal from representatives of Parliament and the Council of Ministers is expected before the end of the month.


    World Summit Adopts Voluntary Action Plan

    1. Jocelyn Kaiser

    Cynics might say that the summit in Johannesburg demonstrated only that rhetoric is a sustainable resource. But several scientists put a more positive spin on the 10-day gathering of global leaders in the South African city that ended last week.

    “There's something to build upon, but it's more like a statement of intentions,” says economist Jeffrey Sachs, head of Columbia University's Earth Institute. Like others, Sachs was disappointed by the dearth of concrete plans that emerged, including the lack of support for his own suggestion to triple the budget of the world's global network of agricultural research. “But there's at least a fighting chance of making this a real plan of action,” he says.

    Nobody expected great things from the meeting, seen as a follow-on to the 1992 Earth Summit in Rio. Officially called the United Nations World Summit on Sustainable Development, the meeting did produce a 65-page Plan of Implementation in which more than 100 governments agreed to work together to protect the environment and reduce poverty. Delegates “made halting progress on some new sustainable issues, which is great,” says Brooks Yeager, vice president of the World Wide Fund for Nature (WWF). But a U.S.-led campaign against specific timetables and goals undermined efforts to go further, says Sachs, who serves as an adviser to U.N. Secretary-General Kofi Annan.

    At the meeting, delegates agreed to restore fisheries to maximum sustainable yields by 2015 and establish a network of marine protected areas by 2012. The action plan also calls for slowing biodiversity loss by 2010 and slashing by half the number of people without access to sanitation by 2015. But proposed language was often softened during the course of the meeting; for example, a suggested 10% boost by 2010 in the use of renewable energy became “substantially increase.” In addition, the plan is not binding and, unlike the 1997 Kyoto treaty to curb production of greenhouse gases, does not include plans for individual countries.

    Still kicking?

    Activists declared the sustainability summit dead, but others—including scientists—say some progress was made.


    Science didn't make it into the political summation of the meeting, but it is discussed in the action plan, which encourages research collaborations between developed and developing countries. And scientists were featured prominently at the meeting. The Paris-based International Council for Science (ICSU), together with other international science and engineering groups, submitted reports and sent delegates, unlike at Rio. The groups also joined with the South African science ministry in a concurrent science forum that “people said was one of the most useful sessions,” Sachs says. According to ICSU executive director Thomas Rosswall, “We are extremely pleased with the high profile of science and technology during the summit itself and at many side events.”

    If the summit lacked substance, it at least served as the backdrop for various initiatives that Yeager predicts “will have lasting impact.” The United States committed at least $36 million over 3 years to protect the Congo rainforest, including new protected areas and training of park managers. “I'm very excited by that,” says ecologist Stuart Pimm of Duke University in Durham, North Carolina. “It's one of the most important places on the planet for biodiversity.” Brazil, the United States, WWF, and other donors also announced a contribution of $81 million toward an ambitious 10-year plan by the Brazilian government to triple the country's strictly protected areas.

    Canada and Russia used the summit to declare their intention to ratify the Kyoto climate treaty. Those parliamentary votes would allow it to enter into force without the United States, the world's biggest polluter. Countries also agreed to boost funding for the Global Environment Facility to $2.9 billion (Science, 31 May, p. 1596), and a campaign was launched to save crop seed banks (Science, 6 September, p. 1625).

    The U.N. says that “partnerships” are essential to fulfill the summit's objectives, and Rosswall says that scientific societies now hope to flesh out their own action plans and find ways to fund them. “To go from words to action, that's the challenge,” Rosswall says.


    A Little Respect for the Asteroid Threat

    1. Richard A. Kerr

    ARLINGTON, VIRGINIA—Asteroids fall to Earth. They always have and always will, unless humankind finds a way to intervene. If one were to strike tomorrow, it could rain death and destruction on a scale that would threaten civilization's very existence.

    At a NASA-sponsored workshop* held here last week, researchers heard mixed tidings about the asteroid threat. The good news is that the search for civilization-ending asteroids seems to have passed the halfway point and is on track to reach NASA's goal of detecting 90% of them before the end of the decade. On the other hand, astronomers haven't gotten far finding the tens of thousands of smaller bodies that could still wreak havoc across a megalopolis. And if an asteroid of whatever size were detected on a collision course with the home planet, no one would know what to do about it.

    “In some sense, we have the future of the world in our hands,” said astronomer David Morrison of NASA Ames Research Center in Mountain View, California. In principle, the asteroid hazard, unlike any other natural hazard, can be totally predictable, he notes. But scientists and the public have yet to decide how hard astronomers should look for smaller killer asteroids and how much planetary scientists need to learn about the enemy before confronting it.

    By the end of the decade, astronomers should have a good idea whether there's a sizable asteroid out there with our name on it. Recent estimates of the number of asteroids that can cross Earth's orbit and can therefore hit us—based on measures such as the rate at which such near-Earth asteroids (NEAs) are being discovered—have ranged from 700 to 1200 NEAs 1 kilometer in diameter and larger. Those are the ones thought capable of disrupting the environment badly enough to deal civilization a death blow. At the meeting, astronomer Alan Harris of NASA's Jet Propulsion Laboratory in Pasadena, California, reported that the current discovery rate of about nine NEAs per month now supports a range of 1000 to 1200 NEAs that size.

    Predictably safe passage.

    The asteroid Toutatis will pass Earth in 2004.


    At the behest of Congress, NASA began an NEA search in 1998 with the goal of finding 90% of NEAs 1 kilometer and larger by 2008. So far, 635 of them have been discovered and tracked. Only one looks to have any chance of ever hitting Earth (Science, 5 April, p. 27), and that's a slim 1-in-300 chance at most in 2880. “It looks like we're going to be real close to making” the 2008 goal, said Harris, “if not making it.”

    Many researchers, however, think more needs to be done. Monster 1-kilometer asteroids jolt Earth on average only every few hundred thousand years, but a still-formidable 300-meter body strikes every 60,000 years or so, they point out. With improvements in telescopic imaging technology, surveying such 200- to 300-meter “subkilometer” objects might soon be practicable. If such an impactor hit within hundreds of kilometers of the U.S. Atlantic coast, it could send a 100-meter tsunami into Boston, New York City, and Charleston, planetary scientist Erik Asphaug of the University of California, Santa Cruz, reminded the meeting attendees.

    NASA has just begun looking at how seriously subkilometer asteroids threaten us and what could be done to find the dangerous ones, NASA Solar System Exploration Division Director Colleen Hartman told those attending the meeting. A subkilometer survey would cost considerably more than the $4 million per year NASA is spending on the current 10-year search. In the past 2 years, the National Research Council has twice recommended that NASA and the National Science Foundation (NSF) jointly fund a survey facility such as the ground-based Large-Aperture Synoptic Survey Telescope (LSST) currently under study by NSF (Science, 19 July, p. 317). With something like a $95 million start-up cost, LSST could find 90% of 300-meter NEAs in 10 years if it did no other scientific work, Harris says.

    But even if found, dangerous NEAs present an as-yet-insurmountable problem. Any number of ways of nudging an asteroid off its collision course have been offered, among them blowing it out of the way with a nuclear explosion, attaching a rocket engine of some sort, creating a jet of vaporizing rock by focusing sunlight with a giant solar mirror, and scooping rock off the asteroid and hurling it away. But every method depends to varying degrees on the nature of the particular asteroid. NEAs range from solid chunks of rock or iron-nickel at the small end (less than a few hundred meters) to “rubble piles” of shattered rock covered by a loose layer of pulverized rock. Physicist Keith Holsapple of the University of Washington, Seattle, warned listeners that “whacking” a porous, debris-covered rubble pile out of the way with a nuclear blast would be “like trying to punch a very large marshmallow”—bad news if many near-Earth asteroids fit that description.

    To understand NEAs well enough to deflect them effectively, space agencies would need to send interplanetary missions for radar and seismic probing, said astronomer Michael Belton of Belton Space Exploration Initiatives LLC in Tucson, Arizona. Belton estimates that such studies would probably take $1.5 billion and 25 years, not to mention another $3 billion or so to fashion practical deflection methods for every sort of beast in the asteroid zoo. But given the odds for the next impact, noted planetary scientist Daniel Durda of the Southwest Research Institute in Boulder, Colorado, “Captain Kirk is probably going to be out there before we have to do mitigation” of the asteroid hazard.


    Bid to Save Kamchatka's Wildlife

    1. Paul Webster*
    1. Paul Webster is a writer in Moscow.

    PETROPAVLOVSK-KAMCHATSKIY, RUSSIA—For decades, the Soviet military cloaked the far-eastern region of Kamchatka from the outside world because of the 1500-kilometer-long peninsula's proximity to Alaska and Japan. That isolation and its unique climate preserved Kamchatka as a haven for thousands of rare animals and plants, from eight unique species of lichen to a subspecies of brown bear. The end of the Cold War saw farmers, poachers, and timber and mining companies rush in to exploit the relatively untouched land, but now the besieged peninsula has found a new protector. A group of international and Russian agencies this week announced a 7-year project to bolster biodiversity conservation and research in four protected areas, with funding of almost $13 million. “We've been waiting a decade for this,” says Olga Selivanova, a marine biologist at the Kamchatka Institute of Ecology and Natural Resources here in the region's capital city.

    Soviet taxonomists won international recognition in the 1970s and '80s for their work on Kamchatkan species. Some 10% of the peninsula's 1168 plant species are found only here. The peninsula is home to the world's greatest diversity of salmon, trout, and char, as well as an estimated 15,000 Kamchatka brown bears, the second largest subspecies in the world. And Kamchatka is the breeding grounds for the world's largest eagle, the Steller's sea eagle. But with 59 Kamchatkan animal species on Russia's endangered species list, some experts contend that time is running out to protect what they see as a northern Galápagos.

    Big bird.

    Steller's sea eagle, the world's largest eagle, makes its home in Kamchatka.


    Although the Russian government has approved modest expansions of some of the peninsula's parks, setting aside an additional 16,000 hectares over the past decade, cash-hungry bureaucrats and businesses threaten many biodiversity preserves, says Paul Grigoriev, a conservation biologist working for the United Nations Development Programme.

    The Kamchatka project—the richest conservation effort yet in the Russian Federation—suggests that “the situation is improving,” says Grigoriev, who's leading the effort. The project, backed by the Global Environment Facility, the Canadian International Development Agency, and the United Nations, will seek to bolster the legal status of protected sites and improve monitoring and management of almost 30,000 square kilometers, covering habitats ranging from temperate deciduous forest to arctic tundra and volcanic wastelands. The project will also fund scientific research and ensure that local people benefit by promoting tourism and integrating aboriginal hunting and fishing into site management.

    The scientific program, with a budget of $550,000 over 7 years, will tackle the classification of existing specimens collected over the years. Selivanova, who crafted the project's scientific program, says that some 45,000 samples collected during Soviet times remain unclassified. She and others also hope to get back out into the field, as collecting trips were mostly abandoned after the Soviet collapse. “We want to get back to doing it now,” she says.


    IOM Panel Weighs In on Diet and Health

    1. Jennifer Couzin

    Consumers trying to make sense of divergent diet-book claims aren't likely to find easy answers in a new, 1000-page tome on diet and health issued by the Institute of Medicine (IOM). But they will find a review of the risks and benefits of consuming the disputed “macronutrients”: carbohydrates, fats, and proteins. Although the report makes specific recommendations, it also laments gaps and contradictions in nutrition research, suggesting that even the experts are struggling to sort out the information.

    After nearly 3 years of spirited debate, the 21 scientists on the IOM panel agreed on a bottom line: 20% to 35% of one's calories should come from fat, 45% to 65% from carbohydrates, and 10% to 35% from protein. A similar panel in 1989 suggested hard numbers within these ranges: no more than 30% from fat, no less than 50% from carbohydrates, and the rest from protein.

    The flexible 2002 standards might reflect a newfound humility. “Twenty-five years ago, guidelines were presented with absolute certainty, [for example] ‘Thou shalt not eat eggs,’” says Walter Willett, an epidemiologist at Harvard's School of Public Health in Boston, who was not on the panel. “I think [this report] is a healthy acknowledgment that we don't know absolute truths.”

    The panel set out to determine the impact of macronutrients on chronic diseases such as diabetes. The assignment proved enormously complex. Fat, for example, is an umbrella that covers the omega-3 fatty acids and monounsaturated fat, which are considered healthy; the trans-fatty acids, which are considered unhealthy; and the saturated fats, about which there is no consensus.

    The report also comes during a raging public debate on diet. To remain neutral, the panel members “put on blinders” to the policy implications of their work, according to panel chair Joanne Lupton, professor of nutrition at Texas A&M University in College Station. Popular diets, such as the heavily criticized Atkins diet, advocate nearly eliminating carbohydrates and relying on fat and protein. At the same time, the evidence favoring a low-fat diet has been questioned (Science, 30 March 2001, p. 2536).


    The public has been offered a bewildering array of recommendations on healthy diets.


    IOM panelists tried to limit the scope of their review by focusing on diets for healthy individuals, not those seeking to lose weight. But the duel over fat and carbohydrates edged its way into discussions anyway, as the panelists examined scientific studies dating back to the 1930s. “It's a very, very difficult decision as to whether high carb … and lower fat is better,” says Sheila Innis, a panel member and expert in pediatric nutrition at the University of British Columbia in Vancouver. On the one hand, Innis notes, “there are populations that do very well with high-fat diets,” such as the Greeks. Their so-called Mediterranean diet, though, is composed largely of the healthy fats found in fish and olive oil—not the kind consumed by most Canadians and Americans, the report's intended audience. In the end, says Innis, the panel leaned toward carbohydrates because, in the context of a North American diet, they were deemed safer.

    Although the IOM report aims to stay out of the big battle, it is one of the first government-funded efforts to parse out the underlying science. And it stakes out some clear positions. For example, it suggests roughly doubling average fiber intake, to 38 grams per day for men and 25 grams for women.

    Certain recommendations—such as urging every adult to get an hour's daily exercise, twice the amount recommended in the past—seem to ignore the real-life lifestyles of North Americans. “I couldn't possibly do an hour of exercise a day,” says Marion Nestle, chair of the department of nutrition and food studies at New York University. Nestle, who was not on the panel, complains that the report is too complex for “an already confused public.” What causes obesity “isn't rocket science … eating too much [does].”

    Indeed, panel member Ronald Krauss, who studies diet and heart disease at Lawrence Berkeley National Laboratory in California, agrees that the report might not respond to the question people generally ask: “How much of this should I eat?” Unfortunately, science isn't able to deliver such detailed diet advice quite yet.


    Biology Departments Urged to Bone Up

    1. Erik Stokstad

    In 1998, Sheldon Wettack, a dean at Harvey Mudd College in Claremont, California, decided that undergraduates needed a better appreciation of the connections between biology and the physical sciences. He and a few faculty members devised a program, called the Interdisciplinary Laboratory, that parallels intro chemistry and physics classes and includes exercises exploring, for example, how thermodynamics affects animal design. Wettack hoped that the new team-taught lab would strengthen the biology curriculum and maybe even attract majors from the more quantitative sciences into biology.

    This kind of approach is exactly what's needed to train the next generation of biomedical researchers, says a new report by the National Research Council (NRC).* “Biological research is already highly interdisciplinary, but undergraduate education is not,” says panel chair Lubert Stryer of Stanford University in Palo Alto, California. “And the gap is increasing.”

    The NRC panel, funded by the National Institutes of Health and the Howard Hughes Medical Institute (HHMI), found that undergraduate biology education also needs a more rigorous curriculum. Many of the recommended changes are longtime favorites of science education reformers (Science, 31 August 2001, p. 1607), including thought-provoking lab exercises and independent research projects. To improve quantitative skills, faculty members should include more concepts from math and physical sciences in biology classes. Ideally, the report says, the entire curriculum would be revamped to add more heft.

    Now Hear This.

    New report emphasizes hands-on activities over lectures for undergraduates.


    But these changes face many obstacles, including the expense of developing new course materials and the conservative influence of the Medical College Admission Test (MCAT), a national qualifying exam for would-be U.S. medical students. “It is time that the curriculum started driving the MCAT, not the other way around,” says David Hillis of the University of Texas, Austin. When it comes to curricula, there is also a massive amount of inertia in higher education, says Peter Bruns, vice president for grants and special programs at HHMI. “People say the only institution more conservative is the [Catholic] Church.”

    Even when reform is on the agenda, it's hard for departments to agree on how to carry it out. “If you add something, you have to take something away,” says Hillis, and no one wants his or her subject trimmed. Personal foibles can play a role, too: A professor who agrees that number-crunching skills would be useful might still be loath to admit ignorance. “Most faculty have trouble saying, ‘I don't know much about this topic, but you should,’” says chemist Ronald Breslow of Columbia University in New York City.

    One way to achieve change is by sweetening the pot. Toward that end, next week HHMI will award $1 million over 4 years to each of 20 senior faculty members who have proposed ways to improve undergraduate biology education at their institutions. The idea is to provide role models as well as the necessary resources. That approach makes good sense to Stryer, who says that energetic leadership is a key ingredient in making the panel's recommendations a reality.

    • *BIO2010: Undergraduate Education to Prepare Biomedical Research Scientists (National Academy Press, 2002).


    The Genome Chose Its Alphabet With Care

    1. David Bradley*
    1. David Bradley is a writer in Cambridge, U.K.

    Of all the nucleotide bases available, why did nature pick the four we know as A, T, G, and C for the genomic alphabet? Researchers have long put it down to the composition of the primordial soup in which the first life arose. But Dónall Mac Dónaill of Trinity College Dublin says the answer is much more interesting. He believes that the choice of A, T, G, and C incorporates a tactic for minimizing the occurrence of errors in the pairing of bases, in the same way that error-coding systems are incorporated into ISBNs on books, credit card numbers, bank accounts, and airline tickets. “The answer may lie partly in the error-coding aspects of information transfer,” he says.

    There are 16 possible nucleotide bases that could pair up to make DNA, and researchers have created strands of synthetic DNA using all the combinations. Informatics might be the key to why nature ignored all but four of these possibilities, Mac Dónaill suspected, and he built on the structural work of biologist Eörs Szathmáry of the Collegium Budapest in Hungary to test his hunch.

    In the error-coding theory first developed in 1950 by Bell Telephone Laboratories researcher Richard Hamming, a so-called parity bit is added to the end of digital numbers to make the digits add up to an even number. For example, when transmitting the number 100110, you would add an extra 1 onto the end (100110,1), and the number 100001 would have a zero added (100001,0). The most likely transmission error is a single digit changed from 1 to 0 or vice versa. Such a change would cause the sum of the digits to be odd, and the recipient of that number can assume that it was incorrectly transmitted.
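    The parity scheme described above can be sketched in a few lines of code (Python is used here purely for illustration; the helper names are our own):

    ```python
    def add_parity(bits):
        # Append a parity bit chosen so the digits sum to an even number.
        return bits + [sum(bits) % 2]

    def passes_check(received):
        # A received word is accepted only if its digit sum is even.
        return sum(received) % 2 == 0

    # The article's example: 100110 gains a trailing 1, 100001 gains a 0.
    sent = add_parity([1, 0, 0, 1, 1, 0])    # -> [1, 0, 0, 1, 1, 0, 1]
    assert passes_check(sent)

    corrupted = sent[:]
    corrupted[2] ^= 1                        # flip a single digit in transit
    assert not passes_check(corrupted)       # odd sum flags the error
    ```

    Note that a single flipped digit is always caught, but two flips cancel out, which is why this simplest parity code detects only the most likely single-digit errors.
    
    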

    Binary bases.

    Representing nucleotides as binary numbers reveals how they were chosen to avoid errors.

    Mac Dónaill asserts, in a forthcoming issue of Chemical Communications, that a similar process was at work in the choice of bases in the genetic alphabet. First he represented each nucleotide as a four-digit binary number. The first three digits represent the three bonding sites that each nucleotide presents to its partner. Each site is either a hydrogen donor or acceptor; a nucleotide offering donor-acceptor-acceptor sites would be represented as 100 and would bond only with an acceptor-donor-donor nucleotide, or 011. The fourth digit is 1 if the nucleotide is a single-ringed pyrimidine type and 0 if it is a double-ringed purine type. Nucleotides readily bond with members of the other type.

    Mac Dónaill noticed that the final digit acted as a parity bit: The four digits of A, T, G, and C all add up to an even number. Nature restricted its choice to nucleotides of even parity, says Mac Dónaill, because “alphabets composed of nucleotides of mixed parity would have catastrophic error rates.” For example, nucleotide C (100,1) binds naturally to nucleotide G (011,0), but it might accidentally bind to the odd parity nucleotide X (010,0), because there is just one mismatch. Such a bond would be weak compared to C-G but not impossible. However, C is highly unlikely to bond to any other even-parity nucleotides, such as the idealized amino-adenine (101,0), because there are two mismatches. So, nature has avoided such mistakes by banishing all odd-parity nucleotides from the DNA alphabet.
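    The mismatch arithmetic above can be reproduced in a short sketch (again in Python for illustration; the four-bit encodings for C, G, the odd-parity X, and amino-adenine are the ones given in the article):

    ```python
    # Four-bit codewords per the scheme described above: bits 1-3 encode
    # donor (1) / acceptor (0) at the three bonding sites; bit 4 is 1 for a
    # single-ringed pyrimidine, 0 for a double-ringed purine.
    BASES = {
        "C":  (1, 0, 0, 1),   # natural nucleotide, even parity
        "G":  (0, 1, 1, 0),   # natural nucleotide, even parity
        "X":  (0, 1, 0, 0),   # hypothetical odd-parity nucleotide
        "aA": (1, 0, 1, 0),   # idealized amino-adenine, even parity
    }

    def parity_even(bits):
        # Even digit sum = the codeword passes the parity check.
        return sum(bits) % 2 == 0

    def mismatches(a, b):
        # Count positions where b fails to complement a. A perfect partner
        # differs at every bit (donor meets acceptor, purine meets
        # pyrimidine), so it scores 0.
        return sum(x == y for x, y in zip(a, b))

    print(mismatches(BASES["C"], BASES["G"]))   # 0: a perfect C-G pairing
    print(mismatches(BASES["C"], BASES["X"]))   # 1: weak but possible mispairing
    print(mismatches(BASES["C"], BASES["aA"]))  # 2: effectively ruled out
    ```

    Restricting the alphabet to even-parity codewords guarantees that any two non-complementary natural bases differ from a perfect match in at least two positions, which is exactly why the dangerous one-mismatch pairings involve only the banished odd-parity nucleotides.
    
    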

    Computational chemist Graham Richards of Oxford University thinks the finding is important: “It is a novel idea which should provoke others to explore aspects of informatics in the genetic code,” he says, adding: “Instinctively, one feels that the DNA code should have evolved systems to minimize errors. Mac Dónaill's work shows how this could have been achieved.” Larry Liebovitch of Florida Atlantic University, Boca Raton, agrees. “Mac Dónaill's clever analysis shows how well different nucleotides could serve as matches in DNA and how much different pairs differ from each other,” he says. “This analysis gives us a reason to believe that the A-T and G-C choice forms the best pairs that are the most different from each other, so that their ubiquitous use in living things represents an efficient and successful choice rather than an accident of evolution.”


    Report Urges Leeway for Developing World

    1. Jennifer Couzin,
    2. Pallava Bagla

    An independent commission appointed by the British government is advocating weaker intellectual property (IP) laws in developing countries in hopes of fostering innovation. The report, released this week, also criticizes the World Trade Organization's agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS), which in 1995 delineated minimum global standards that nations should achieve.

    “We've recommended that patents aren't necessarily a good idea for many developing countries,” says Charles Clift, a London economist and head of the secretariat that managed the six-member commission. Commissioners believe that developing countries can ill afford to pay fees on patented inventions, both domestic and foreign, that would help them expand their technological base. Devinder Sharma, an analyst at the Forum for Biotechnology and Food Security in New Delhi, India, says he's pleased that “for the first time, a high-powered body is saying, ‘Go slow on patents.’”

    The U.K. government launched the commission after a review of the problems associated with globalization. Its conclusions reinforce arguments made for years by nongovernmental organizations. But some say a position that favors narrower patents is misguided. “A policy saying you should not grant patents on things that are really, truly inventive makes no sense for developing countries,” says Jeffrey Kushan, a patent attorney at Sidley Austin Brown & Wood LLP in Washington, D.C. “At the practical level, a lot of these issues don't have any data.”

    Patent potpourri.

    More lenient patent laws allow Indian biotech companies to produce generic AIDS drugs.


    The commissioners acknowledge that a dearth of IP regulations in the least developed countries forced them to guess the impact of U.S. or European-type IP laws on a country's technological development. After consulting with institutions in nine developed and developing countries, however, they concluded that many developing nations ought to avoid issuing patents that, for example, allow the kind of broad protection available in the United States because it would stifle additional innovation. The commissioners also encouraged developing nations to adopt broad exemptions for educational and research use of patented materials. The group agreed, too, that a single IP system cannot serve the diversity of developing countries. Commission member Raghunath Anant Mashelkar, a polymer engineer and director-general of India's Council of Scientific and Industrial Research, says the report's message to authorities is “Don't force-feed stringent IP [rights] laws to poor countries that do not have the inherent capacity to implement them.”

    The commissioners were also concerned about the potential high cost of licensing a patent. For example, developing countries are generally not permitted to sell cheap generics of drugs still under patent protection (AIDS drugs being a notable exception). Stanford University law professor John Barton, who chaired the commission, suggests that such fees might prevent widespread use of an invention.

    Kushan dismisses that argument, saying that “if a company wants to make money in a market, it's going to adjust its license fees.” He also says that raising IP standards has historically promoted international competition and investment, citing reforms in Brazil in the late 1990s that preceded a $2 billion infusion from U.S. drug companies. Lila Feisee, IP director at BIO, an industry group in Washington, D.C., agrees. “Obviously, every country's got its own special criteria,” she says. Still, “we push very hard to try and harmonize the patent laws of different countries.”

    The report will be presented next week to officials from the World Trade Organization and the World Intellectual Property Organization. The commissioners hope that their ideas will be incorporated into TRIPS, which continues to undergo revisions, or other international agreements.


    Ubiquitin Lives Up to Its Name

    1. Jean Marx

    The cleanup protein has been known to be present in all higher organisms, but now researchers are discovering that it has a hand in everything from directing protein traffic to regulating gene activity

    A small protein called ubiquitin is turning out to be the Clark Kent of cell biology. Like Superman's alter ego, ubiquitin has long been regarded as worthy but somewhat dull, a player in the cast of characters that carry out housekeeping functions for the cell. But recent findings are beginning to reveal it as a kind of superhero, performing feats that few suspected.

    Early work showed that ubiquitin, which was discovered in the mid-1970s, is part of the cell's janitorial services. It binds to other proteins, tagging them for destruction by a large multiprotein complex called the proteasome. This kiss of death eliminates damaged proteins, an essential job but perhaps not one to catch the eye of Lois Lane. But ubiquitin-mediated protein disposal soon turned out to have a more glamorous role: helping regulate such key cellular processes as the cell division cycle. Now researchers are finding that ubiquitin's functions go far beyond even these crucial activities.

    Recent work, much of which was on display at a meeting* last month, shows that ubiquitin plays traffic controller as well as janitor. Ubiquitin tagging directs the movement of important proteins in the cell, determining, for example, whether they end up on the cell membrane or in an internal vacuole, where they are destroyed without the proteasome's help. “The whole aspect of ubiquitin-mediated [protein] trafficking inside the cell is brand-new,” says cell biologist Annette Boman of the University of Minnesota, Duluth. “It was really a surprise to me and to pretty much everyone else in the field as well.”

    Other work indicates that ubiquitin and related proteins play direct roles in controlling the machinery that brings about gene expression. The multipurpose molecule also helps regulate the many signaling pathways that control the cell's responses to environmental and other changes. Indeed, the meeting co-organizers, Cecile Pickart of Johns Hopkins University in Baltimore and Linda Hicke of Northwestern University in Evanston, Illinois, say that ubiquitin's actions in the cell might be as pervasive—and as important—as those of the well-known regulator phosphate, which controls the activities of thousands of proteins.

    The new appreciation of ubiquitin's multiple roles has medical implications. It turns out, for example, that ubiquitin helps turn off the cell's responses to growth factors; without this safeguard, the uncontrolled cell growth of cancer might result. And researchers are also finding that certain viruses that bud from the cell surface, including Ebola and HIV, do so by commandeering the same ubiquitin-dependent transport machinery used for protein trafficking in the cell.

    Entry and sorting

    One of the most advanced lines of work on ubiquitin's newfound powers deals with the protein's role in directing protein movements. Some puzzling observations in the mid-1990s provided the first clues that ubiquitin might somehow be involved in bringing membrane-bound proteins into the cell.

    At the time, cell biologists suspected that some membrane proteins are ubiquitinated and degraded in the standard fashion by the proteasome. But they found that ubiquitin is also added to proteins, including growth factor receptors, that meet a different fate. Cells turn these receptor responses down or off by bringing the receptor into the cell in tiny membranous sacs called endosomes, which form when the external membrane bulges into the cell and buds off. Once inside, the endosome cargo is either directed to vacuoles called lysosomes for degradation or recycled back to the cell membrane.

    Traffic signal.

    A ubiquitin tag (yellow ovals) tells proteins, whether on the outer membrane or newly synthesized and in the Golgi apparatus, to move into the endosome and the multivesicular body (MVB).


    The finding raised suspicions that the ubiquitin tag might be the signal for internalizing the receptors, but as Pickart recalls, “one thing [we thought] we knew for sure is that ubiquitin had nothing to do with the lysosome.” Direct proof that it does came a few years later, providing what Pickart calls “a satisfying reverse of course.”

    One series of key experiments came from Hicke, then working with Howard Riezman of the University of Basel, Switzerland, on a yeast cell receptor called Ste2p, which responds to one of the factors that controls yeast mating. The researchers showed that Ste2p becomes ubiquitinated when it binds the mating factor and that as a result, it is taken into the cell and degraded by the lysosome. Replacing the amino acid where ubiquitin latches onto Ste2p with a different one eliminated both the ubiquitination and internalization of the receptor. Studies with other mutants confirmed that the resulting degradation takes place in the lysosome.

    Subsequent work also provided an explanation for how the cell might determine which ubiquitinated proteins are to be degraded by the lysosome and which by the proteasome. Proteins headed for the proteasome are tagged with a string of at least four ubiquitins, whereas Ste2p was marked with only one. Since then, numerous researchers have shown that ubiquitin tags membrane proteins for internalization in endosomes in both yeast and more advanced organisms, including mammals.
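The sorting rule described above, a chain of at least four ubiquitins versus a single ubiquitin tag, can be sketched as a toy classifier. This is purely an illustration of the article's rule, not a real bioinformatics routine.

```python
def degradation_route(chain_length):
    """Toy model of the ubiquitin sorting rule: a chain of four or
    more ubiquitins marks a protein for the proteasome, while a
    single ubiquitin tag marks a membrane protein for endosomal
    uptake and lysosomal degradation."""
    if chain_length >= 4:
        return "proteasome"
    if chain_length == 1:
        return "lysosome"
    return "unclear"  # intermediate chain lengths aren't covered here
```
Under this rule a polyubiquitinated protein (`degradation_route(4)`) goes to the proteasome, while a monoubiquitinated receptor like Ste2p (`degradation_route(1)`) is routed to the lysosome.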

    More recent work, reported within the past year or two, has uncovered a broader role for ubiquitin in directing protein traffic within the cell. Not only does it mark membrane proteins for internalization, it also helps determine whether newly synthesized proteins get to the membrane in the first place. Proteins destined for the cell surface are separated from others at the end of a series of membranous compartments collectively called the Golgi apparatus. Ubiquitin has now been implicated in this decision. For example, at the meeting, Chris Kaiser of the Massachusetts Institute of Technology described evidence for the idea, obtained in studies on a yeast protein called Gap1 that carries amino acids across the yeast cell membrane.

    Gap1 is made all the time in yeast cells, but it is normally transported to the cell surface only when yeast are growing on a poor nitrogen source. If there's ample nitrogen, Gap1 moves into the endosome and from there possibly into the lysosome for degradation. Kaiser and his colleagues found that mutations that increase ubiquitin addition to Gap1 cause it to move into the endosome even when yeast is growing on a poor nitrogen source. Conversely, mutations that decrease Gap1 ubiquitination result in its being transported to the cell membrane when yeast cells have an ample nitrogen supply.

    “Ubiquitin tagging can be used as a sorting signal,” Kaiser says. He adds that a system in which cells continuously synthesize Gap1, and then use ubiquitin to determine its fate, might enable yeast to respond more rapidly to changes in nitrogen availability than would be possible if they had to fire up Gap1 synthesis from scratch.

    Ubiquitin tagging also helps sort proteins at a later stage in the protein transport pathway, determining which go to the lysosome for degradation and which stay in the endosomal membrane for possible recycling. Earlier studies had revealed that this sorting occurs when the endosome membrane buds inward, forming smaller vesicles inside the larger one. This so-called late endosome or multivesicular body (MVB) then fuses with the lysosome and dumps the small vesicles into the lysosomal interior where they can be degraded.

    Breaking away.

    Newly formed HIV particles bud from the membrane of a normal cell (top), but in the absence of a ubiquitin-recognizing protein called TSG101 (bottom), they can't finish the job and remain stuck to the cell or each other.


    About a year ago, two teams, one including David Katzman, Markus Babst, and Scott Emr of the University of California, San Diego (UCSD), and the other including Fulvio Reggiori and Hugh Pelham of the U.K. Medical Research Council's Laboratory of Molecular Biology in Cambridge, reported evidence that ubiquitinated proteins end up in the vesicles inside the MVB. For example, they found that mutations that prevent ubiquitin addition to the proteins cause them to be missorted, ending up in the endosomal membrane rather than the lysosome. Ubiquitin tagging is “a beautiful mechanism for separating those [proteins] that recycle and those that don't,” Emr says.

    In addition, the Emr team has identified a multiprotein complex, called ESCRT-I (for endosomal sorting complex required for transport), that apparently recognizes ubiquitinated proteins in the endosome and somehow shepherds them into the MVB interior vesicles. The ESCRT-I component that achieves this recognition is a protein called Vps23 in yeast and TSG101 in mammals. In a particularly interesting twist on the protein-trafficking theme, researchers have recently tied TSG101 to certain key steps in viral infectivity.

    Appropriated by viruses

    Virologists have known for years that viruses often subvert immune attack on the cells they infect by removing major histocompatibility complex (MHC) proteins from the cell surface. MHC proteins evoke immune responses by displaying fragments of foreign antigens, including those from viruses; thus their loss results in a weakened immune attack. In work described at the meeting and also in the 15 May issue of EMBO Journal, Paul Lehner of the Cambridge Institute for Medical Research in Cambridge, U.K., and his colleagues showed that the virus responsible for Kaposi's sarcoma down-regulates MHC proteins in two ways.

    They found that a viral protein called KK3 promotes the addition of ubiquitin to MHC class I proteins on the cell membrane, fostering their movement into the cell interior in endosomes. Lehner's team has also shown that the ubiquitin-recognizing protein TSG101 is necessary for the eventual degradation of the MHC proteins in lysosomes, an indication that the ubiquitin tag is needed for this step as well.

    Researchers including Wes Sundquist of the University of Utah in Salt Lake City, Paul Bieniasz of the Aaron Diamond AIDS Research Center and Rockefeller University in New York City, and Carol Carter of the State University of New York, Stony Brook, have shown that ubiquitin and TSG101 play a different, but related, role in the life cycle of RNA viruses, including HIV and Ebola, that exit infected cells by budding from the cell membrane.

    When this budding is about to occur, the RNA-containing core of the virus makes its way to the outer cell membrane, where viral envelope proteins have already been incorporated. The membrane then bulges outward until the virus is released, ensconced in its envelope. Sundquist and the other researchers have found that completion of this budding requires TSG101. It apparently recognizes the viral envelope protein gag, which is ubiquitinated and also carries a particular four-amino acid sequence needed for the interaction. TSG101 presumably then draws other proteins needed for budding to the cell membrane. As UCSD's Katzman noted at the meeting, “the virus is usurping the [MVB] budding system to its own ends.”

    The findings also raise the possibility of targeting TSG101 or other components of the budding machinery with antiviral drugs. “You can envision that inhibitors would give fairly broad antiviral activity,” Sundquist says. He cautions, however, that at this early stage of the work, “we have no idea how toxic such an inhibitor will be.”

    Central command

    Protein trafficking takes place in the cytoplasm, but ubiquitin's range has recently been extended to the nucleus. Several teams have linked the protein to various components of the machinery that carries out the first step in gene expression: copying the DNA's code into messenger RNA. Previous work had shown that ubiquitin can control gene activity indirectly by tagging for destruction various proteins involved in gene expression. But new findings suggest that it also has a direct role in determining whether genes are turned on or off.

    Clues that this might be happening date back 25 years to when ubiquitin was discovered. One of the first proteins found to be modified by ubiquitin was a histone: a member of a family of proteins that packages DNA into chromatin. At the time, that was puzzling, says David Allis, a transcription researcher at the University of Virginia, Charlottesville. Histones were thought to be “rock-stable” and therefore not susceptible to ubiquitin-mediated degradation. “For 25 years, it remained unclear what [ubiquitin's] role was in chromatin. We were really scratching our heads,” Allis remarks.

    It now looks as if ubiquitination could contribute to the chromatin remodeling that helps regulate gene activity. In early 2000, for example, in what Allis describes as “beautiful” work, Mary Ann Osley, now at the University of New Mexico Health Sciences Center in Albuquerque, and her colleagues showed that histone H2B in yeast is ubiquitinated by an enzyme called Rad6 or Ubc2 (Science, 21 January 2000, p. 501).


    Ubiquitin can attach to its various substrate proteins either singly or in chains, and the mode of attachment in turn might determine what effect the ubiquitination has. (K29, K48, and K63 refer to the particular lysine amino acid used to link the ubiquitins to each other.)


    More recent evidence from the Osley group indicates that this modification aids the uncoiling of the chromatin necessary before a gene can be transcribed. A mutation that prevents ubiquitin addition to the histone partially inhibits transcription of two genes called Suc2 and Gal1. The effect was much magnified if the researchers also prevented histone acetylation, a chemical modification known to facilitate gene transcription—an indication that the two modifications work together.

    In other circumstances, ubiquitin addition by Rad6 might be involved in gene silencing instead of gene activation. Addition of a methyl group to histone H3 leads to the inactivation of certain genes, and earlier this year, Zu-Wen Sun, a postdoc in the Allis lab, and independently Ali Shilatifard of St. Louis University School of Medicine and his colleagues showed that ubiquitin has to be added to histone H2B by Rad6 before the H3 histone can acquire its methyl group. The findings “give the chromatin field a whole new modification to worry about,” says Allis.

    Meanwhile, researchers studying transcription factors—proteins that interact with DNA to alter gene expression patterns—are taking a new look at a modification that seemed to be well understood. Ubiquitination is an established way to tell the cell to eliminate short-lived, or unstable, transcription factors. But it might do much more than that.

    A year or two ago, William Tansey and his colleagues at Cold Spring Harbor Laboratory (CSHL) in New York state picked up on an odd coincidence: The region in the transcription factors that serves as a signal for ubiquitin addition overlaps with a region required for activating gene transcription. That overlap, Tansey says, is “found in just about every unstable transcription factor.”

    Last year, the CSHL group provided evidence that this overlap has functional significance. The researchers showed that for at least one transcription factor, addition of ubiquitin to the site is needed for it to turn on gene expression. Ultimately, the same ubiquitin might serve as a signal for degradation. Cell biologist Joan Conaway of the Stowers Institute for Medical Research in Kansas City, Missouri, describes these findings as “most intriguing, but not yet well understood.”

    For instance, researchers don't yet know what enzyme puts ubiquitin on transcription factors, although Joan Conaway and Ronald Conaway of Stowers and their colleagues have a possible clue. Many proteins must cooperate to bring about gene expression, and the researchers found that a multiprotein coactivator of transcription called Mediator contains a component that might be involved in recruiting ubiquitin-adding enzymes to the transcription machinery.

    Also unclear is exactly how ubiquitin addition promotes transcription factor activity, but it might help recruit some of the other proteins needed for gene activity to the target genes. For example, Stephen Johnston, Thomas Kodadek, and their colleagues at the University of Texas Southwestern Medical Center in Dallas have found that a portion of the proteasome itself is involved in transcription. Still, all of this remains to be established. “We have a lot of possibilities for what [ubiquitination] might be doing [in transcription], and we're in the early days of trying to figure this out,” Joan Conaway says.

    In addition to ubiquitin's roles in protein trafficking and gene transcription, the protein and its relatives are turning up as possible regulators of many of the cell's signaling pathways. For example, at the meeting, Ajay Chitnis of the National Institute of Child Health and Human Development in Bethesda, Maryland, reported evidence implicating ubiquitination in regulation of a major developmental pathway called Notch. And Rick Firtel of UCSD described a role for a ubiquitin relative called SUMO in controlling the response of the slime mold Dictyostelium to chemical signals. Echoing a refrain sounded by many others, Firtel said, “We didn't start out studying ubiquitin but then found ourselves right in the middle of the research.”

    Indeed, Northwestern's Hicke says that ubiquitin regulation might be even more widespread and versatile than regulation by phosphate. As proteins, she notes, ubiquitin molecules can be joined to other proteins in a variety of ways, either individually, as apparently happens when proteins are marked for uptake by endosomes, or in chains, as occurs in tagging for destruction by the proteasome. In addition, some half-dozen related proteins, such as SUMO, are turning up as regulators of cell activities. “The variety of [possible] modifications is virtually endless,” Hicke says. Whether or not ubiquitin turns out to be superpowerful everywhere it raises its head, it certainly seems poised to keep cell biologists in thrall.

    • *The meeting, “Nontraditional Functions of Ubiquitin and Ubiquitin-like Proteins,” was sponsored by the American Society for Cell Biology and held from 11 to 14 August in Colorado Springs.


    Chemists Search for Solutions

    1. Robert F. Service

    BOSTON, MASSACHUSETTS—From 18 to 22 August, nearly 15,000 chemists gathered here to discuss work on problems ranging from health to energy sources. Among the highlights: tailored immunosuppressants, a new understanding of the molecular miscues that trigger diabetes, and a novel way of storing hydrogen gas for fuel-cell vehicles.

    Sugar Chain Promotes Tolerance

    Each year nearly 6000 patients in the United States alone die while waiting for a lifesaving organ donation that never comes. Using organs from animals such as pigs might help relieve the shortage. Unfortunately, the surfaces of pig cells express a sugary coating not found in humans, which prompts the human immune system to mount a violent attack that kills the transplanted organ within hours. Researchers have made progress in engineering pig tissues that don't express the sugar. But at the meeting, chemist Laura Kiessling of the University of Wisconsin, Madison, offered hope for a simpler route: teaching the human immune system to tolerate the alien sugar.

    Kiessling reported that she and her colleagues and students synthesized polymers of varying lengths and tested them on animals and cell cultures. Long-chain polymers, they found, provoked strong immune reactions, but short ones—including those made by linking molecules of the death-baiting sugar—induced tolerance.

    “It's beautiful work,” says Peng G. Wang, a chemist at Wayne State University in Detroit, Michigan, who specializes in making similar complex sugar chains, called carbohydrates. The new work, Wang says, helps explain the fundamental process by which immune cells decide either to mount an immune reaction or to promote tolerance. It should also provide researchers with a systematic way of triggering different types of immune responses—a tool that could prove useful in coming up with drugs to promote tolerance. Such drugs will still likely be years in the making, Wang predicts, “but this is an important first step in that direction.”

    Using carbohydrate chains to control the immune system is not a new approach. In recent years Samuel Danishefsky of Memorial Sloan-Kettering Cancer Center in New York City and others have made important progress in showing that sugar chains can serve as vaccines by coaxing the immune system to generate antibodies against the sugars—which also abound on the surfaces of cancer cells. Danishefsky's group and others have shown that long chains of the sugars bind to numerous receptors on antibody-generating cells called B cells. If enough of the B-cell receptors are activated, they set off a signaling cascade inside the cell that spurs it to mass-produce antibodies to the sugars, which are then released into the body.

    B cells, however, don't launch an immune attack on everything they encounter. In particular, when they run up against smaller amounts of an antigen, they typically ignore it. So Kiessling and her team reasoned that if they made carbohydrate compounds that triggered fewer B-cell receptors, they might generate immune tolerance instead of an antibody attack.

    To find out, Kiessling's students first conducted a test tube test to see if they could induce tolerance to an antigen called dinitrophenyl. They synthesized a series of polymer chains that displayed roughly 25, 50, 100, and 200 dinitrophenyl groups. They then fed the polymers to separate dishes of B cells sporting dinitrophenyl receptors. As expected, the long polymers containing 100 and 200 copies of the dinitrophenyl prompted a strong immune reaction in the B cells, whereas the shorter chains did not.

    Encouraged, the researchers turned to galactosyl-α1-3-galactose, the sugar that prompts human immune cells to reject pig organs. Galα1-3Gal, as it is called, is a small sugar, not a polymer. The cells of pig tissue are spangled with copies of the molecule, attached to much larger proteins. In the body of a transplant patient, the sugars bind to clusters of receptors on each B cell, sounding the call to arms that dooms the transplanted cells—just as long dinitrophenyl polymers do. Without this clustering, however, the sugar molecules wouldn't set off the alarm.

    To make Galα1-3Gal induce a mild clustering effect, Kiessling's team strung together sugar molecules to build two different polymers, containing 25 and 50 sugar groups. They hoped that the polymers, like the short dinitrophenyl chains, would bind to fewer receptors and thus promote tolerance without setting off the cellular tripwire.

    Teaming up with researchers at BioTransplant, a biotech company in Boston, Kiessling's group tested the sugar compounds in mice genetically engineered not to express Galα1-3Gal. Because the mice's immune systems weren't trained to recognize the sugar, they should have launched powerful immune responses when they encountered it. But when the mice were injected with the 25- and 50-sugar polymers, their B cells stopped producing antibodies, presumably because they had become tolerant of the sugar.

    The result shows that “we can control whether compounds produce an immunity response or a tolerance response,” Kiessling says. And that might lead to a new generation of tailored immunosuppressants designed not to wipe out the entire immune system but rather to lower the body's defenses to just the antigens of choice.

    A Master Key to Diabetes?

    If Stuart Schreiber is right, researchers might need to rethink their understanding of the molecular misfires that trigger adult-onset diabetes. Schreiber, a chemical biologist at Harvard University across the Charles River from the meeting, described a raft of recent published and unpublished experiments that suggests that diabetes might result from a single nutrient-sensing pathway gone awry rather than from a combination of separate molecular missteps as is commonly thought today. What's more, his team discovered a small, druglike molecule that in test tube studies seems to correct the error and return the cells to normal.

    John Schwab, a chemist in the division of pharmacology, physiology, and biological chemistry at the National Institute of General Medical Sciences in Bethesda, Maryland, calls the new studies “incredibly interesting.” Schwab says the work could lead to a new molecular understanding of how diabetes occurs and might eventually pave the way to novel treatments.

    Schreiber's group has long been picking up hints that a problem with the way cells sense glucose and other nutrients could play a role in diabetes. In the late 1980s, he and his team started studying the molecular workings of an immunosuppressant drug called rapamycin. They discovered that in mammalian cells rapamycin binds to a protein known as FKBP12 and that the rapamycin-FKBP12 complex, in turn, binds to another protein called FRAP. Later, they found that this molecular trio tricks yeast cells into behaving as if they are starving. It turns off dozens of genes involved in glucose metabolism and cell growth and division, and it turns on a suite of genes that allow the cells to burn proline, a metabolic pathway that is less efficient but valuable when food is scarce.

    Such insensitivity to glucose is a hallmark of diabetes. But Schreiber's team didn't know whether the condition in yeast had anything to do with nutrient sensing in humans, who, like other mammals, have a far more complex energy-sensing system than yeast does. The system comes into play when molecular receptors on islet cells in the pancreas detect glucose, a signal that triggers them to release the hormone insulin. The insulin then binds to cells throughout the body and spurs them to metabolize glucose.

    In type II (adult-onset) diabetes, however, it is thought that islet cells lose their sensitivity to glucose and that insulin-responsive cells lose their sensitivity to insulin. The prevailing wisdom is that both pathways must go awry for a person to develop diabetes. But Schreiber and colleagues suspected that the story might be simpler than that. Could the FKBP12-rapamycin-FRAP complex, they wondered, override the entire nutrient-metabolism apparatus in otherwise normal mammalian cells, just as it does in yeast? If so, a single molecular monkey wrench might knock out both pathways at once.

    The researchers already knew that the complex hit the insulin pathway. Earlier studies had shown that normally insulin-responsive cells treated with rapamycin seemed oblivious to insulin even when they were swimming in it. To find out about the glucose pathway, Schreiber and colleagues bathed rat islet cells in either a high-glucose or a low-glucose nutrient solution. They then added rapamycin to some of the cells and analyzed the effects with gene chips, which tracked the activity level of 8800 different genes in the cells. The glucose-rich cells treated with rapamycin showed essentially the same genetic profile as glucose-poor cells grown without rapamycin. The similarity suggests that rapamycin was somehow interfering with cells' ability to process glucose, thus coaxing mammalian cells into behaving as if they were starving—just as it did in yeast cells. In his talk, Schreiber also noted that drug company data show that as many as 20% of patients who take rapamycin as a drug also develop diabetes.

    Still, few diabetics are ever exposed to rapamycin. So how do they develop the disease? Schreiber suspects that either genetic or environmental factors interfere with the same nutrient-sensing pathway that rapamycin strikes.

    If so, there might be hope for reversing the damage. In a final set of experiments Schreiber described at the meeting, he and his team created a library of small, druglike molecules and screened for a compound that would block rapamycin's cell-starvation activity. They found one, dubbed SMIR-4, that blocked the cells' starvation response and returned their gene expression patterns to normal when added to yeast cells growing in a glucose-rich, rapamycin-spiked medium. Just how SMIR-4 manages the trick is still unclear, Schreiber says. But down the road, either this compound or others like it will likely be tested in animals to see if they can reverse the nutrient-sensing deficiencies that appear to play a role in diabetes. If successful, the compounds could then become candidates for treating diabetes, a disease for which patients at best can only try to minimize their symptoms.

    Conducting Plastics Pack the Hydrogen

    Hydrogen gas is a tantalizing fuel source, because it generates only water when burned. But the so-called fuel of the future has its drawbacks. Among other things, the gas is so lightweight that it's tough to store enough of it in a vehicle's gas tank. To solve that problem, chemists have sought solid materials that trap hydrogen much as a sponge soaks up water. At the meeting, Sung June Cho, a chemist at the Korea Institute of Energy Research in Taejon, reported a potential breakthrough in the search: cheap polymers that can store about twice as much of the gas as another leading material. If the polymers can release that hydrogen on demand—a feat not yet demonstrated—they could lead to plastic gas tanks that carry cars hundreds of kilometers between fill-ups.

    Researchers say they've heard such promises before. In 1999, a team of physicists from Singapore reported that nanotubes spiked with metals could absorb up to 20% of their own weight in hydrogen. That result has never been reproduced. Still, other teams have shown that the tiny tubes can hold about 4%, close to the magic figure of 6.5% needed for a viable hydrogen-storage material.

    Gas guzzler.

    Hydrogen “sponge” might put H-powered vehicles on the road at last.


    Cho and colleagues suspected that the storage capabilities of nanotubes result in part from their ability to conduct electrical charges, which might help hydrogen molecules adhere. But nanotubes can cost hundreds of dollars a gram, making them impractical for real-world use. So Cho and his colleagues decided to see whether cheaper conducting plastics are equally good at capturing hydrogen.

    They got a pleasant surprise. When the researchers made films of polyaniline and polypyrrole—two common conducting plastics—and added pressurized hydrogen, they found that the polymers could hold up to 6% of their weight in hydrogen at room temperature. When they then treated the films with hydrochloric acid, which perforated the film, storage capacity jumped up to 8%.

    The result “kind of surprised a lot of people,” says Kurt Rothenberger, a chemist at the National Energy Technology Laboratory in Pittsburgh. “This is something that other groups will be very interested in,” he says. But Rothenberger and others at the meeting stressed that a practical hydrogen-storage material would have to give up the gas when it is needed. Cho says his team is now testing the plastics for that ability.


    Engineered Fish: Friend or Foe of the Environment?

    1. Erik Stokstad
    1. Animal Biotechnology: Science-Based Concerns.

    With the world's fish consumption rising, transgenic fish might alleviate pressure on wild stocks. But researchers worry that genetically engineered fish, if they escaped, could wreak ecological havoc

    In fine weather, the floating pens full of salmon bob gently off the coast of Maine. They're designed to take the brunt of a storm, but a fierce nor'easter that hit Machias Bay in December 2000 was too much for them. The pens' steel rims gave way, and some 100,000 fish escaped. Fearing that the runaways would interfere with the few remaining wild salmon, conservationists demanded a moratorium on new fish farms in the state. Feral fish have also plagued North America's west coast, where an estimated half-million salmon escaped from 1987 to 1997; since then, farm-raised salmon have been found spawning in British Columbia.

    Farmed salmon are big, hungry, and aggressive. When they were added to an experimental stream in a 2000 study, the number of native cousins dropped by a third. And imported alien species on the lam from farms have been known to overrun native species (Science, 23 November 2001, p. 1655). Researchers are paying renewed attention to such escapes as they debate the future of a new type of farmed fish: transgenic species. They have the potential to be even more of a menace than existing farmed fish. No genetically engineered fish are commercially farmed yet, but a modified Atlantic salmon is under review by the U.S. Food and Drug Administration (FDA), and others are nearing commercialization elsewhere. Says Eric Hallerman of Virginia Polytechnic Institute and State University in Blacksburg: “It's my sense that transgenic fish will go forward in some places in the world whether Greenpeace likes it or not.”

    Outfitted with an extra gene for growth hormone, the engineered salmon grow up to six times as fast as other domesticated salmon. That growth rate, and a 20% gain in efficiency at converting food to flesh, could offer fish farmers bigger profits. Broadly applied, such technology might provide better yields for developing countries, reduce pollution from fish farms, and ease pressure on wild fisheries. Yet a question lingers in the minds of many scientists: What would happen if the fish escaped?

    The answer is unclear. A study* released by the National Research Council (NRC) last month cited environmental impact—in particular, by highly mobile creatures such as fish—as the greatest risk associated with animal biotechnology (Science, 23 August, p. 1257). Bulked-up transgenic fish might outcompete wild fish for food. If outfitted with genes for cold tolerance, they could spread to more extreme latitudes like an invasive species. And if they interbred with wild fish, any number of scenarios could result, including local extinction of natives.

    Yearning for freedom.

    Fish raised in pens have escaped to join their wild kin.


    To get a handle on those risks, scientists have been testing transgenic fish in the lab. Some have used hormone patches to simulate the effects of transgenes and then studied how well the fish survive in the wild. Others have taken a more theoretical approach by modeling what might happen to population genetics if transgenic fish escaped and interbred. Commercial developers insist that the risk is negligible, as transgenic fish currently in development will be sterilized. But most scientists say they simply don't yet understand enough about the potential ecological impacts to approve of raising even sterile transgenic fish in the seas, as some producers propose. “It's a realm of unknown risks,” says William Muir, a geneticist at Purdue University in West Lafayette, Indiana.

    The contenders

    One thing is certain: Demand for seafood is rising, and it might double by 2040, according to the Food and Agriculture Organization of the United Nations. Many wild fisheries are already depleted, so much of the supply will likely come from aquaculture, says James Carlberg of Kent SeaTech Corp. in San Diego, California. But intensive aquaculture has its troubles. For instance, when fish are crowded together, disease becomes more common and can spread to wild fish.

    Transgenics offer potential solutions, and researchers around the world have been after them for more than 2 decades (Science, 2 August 1991, p. 512). Many kinds of fish have been engineered, some for disease resistance but most for faster growth. In Cuba, for example, molecular biologist Mario Pablo Estrada García of the Center for Genetic Engineering and Biotechnology in Havana and colleagues have added a viral promoter to tilapia, a freshwater fish from Africa, to increase expression of a native gene that codes for a growth hormone. In the lab, the fish grow twice as fast as domesticated tilapia. The group is a few years away from commercializing the fish, Estrada García says. Another group, led by Norman Maclean of the University of Southampton, U.K., has tested a different type of growth-enhanced tilapia in field trials in Hungary. On average, the transgenic tilapia were three times heavier than nontransgenics at harvest. And Zhu Zuoyan of the Chinese Academy of Sciences' Institute of Hydrobiology in Wuhan has been working on fast-growing transgenic Yellow River carp engineered to carry a growth hormone gene from grass carp. Field trials demonstrated that the first-generation offspring grow 42% faster than nontransgenic carp.

    The most famous transgenic fish is a growth-enhanced Atlantic salmon made by Aqua Bounty Farms Inc. of Waltham, Massachusetts. By adding a growth hormone gene from Chinook salmon along with a promoter sequence, the company has created a line of Atlantic salmon that can produce growth hormone all year long rather than just during spring and summer. The modified fish put on weight up to six times as fast as traditional hatchery salmon. Although the fish don't end up larger than normal farmed Atlantic salmon, they reach market size up to a year sooner. Before the company can sell eggs or young fish to farmers, FDA must give approval. Aqua Bounty has begun to submit safety data for human consumption, says the company's Joseph McGonigle. FDA hasn't published any guidelines for genetically modified fish and says it will use the recent NRC report on animal biotechnology to develop its regulations.

    Predicting trouble

    Aqua Bounty fish aren't likely to endanger human health: The transgenic growth hormone is identical to that already consumed in Chinook, and growth hormones are destroyed by cooking and digestion anyway, McGonigle says. But the environmental safety of transgenic fish raises more questions. In trying to answer them, many researchers have examined transgenic fish in the lab for traits that might influence their activity in the wild. Some of the results suggest that transgenic fish would outcompete natives. For instance, a study by molecular biologist Robert Devlin of the governmental organization Fisheries and Oceans Canada in West Vancouver, British Columbia, revealed that transgenic salmon eat almost three times as much food as nontransgenic peers of the same size.

    Still, researchers say it's a stretch to conclude that transgenic fish would doom wild populations, given the narrow range of conditions in the lab. “It's virtually impossible to scale up from these experiments to the real world,” says Mart Gross, a conservation biologist at the University of Toronto.

    To get a more realistic view, Jörgen Johnsson of Göteborg University, Sweden, and colleagues have studied brown trout in experimental streams. Because genetically engineered fish may not be released in the wild, they use fish with implants that slowly release growth hormone. Over 1 year, growth-enhanced trout grew 20% faster than wild trout and survived just as well, Johnsson's group reported last year in Functional Ecology. That result ruled out an earlier suggestion that growth-enhanced fish might be more susceptible to predators or might starve from growing out of season, and it adds to fears that transgenics might outcompete wild fish, Johnsson says.

    Other problems could arise if transgenics interbred with wild relatives. A model of population genetics developed by Muir and Richard Howard of Purdue calculates the likelihood that a transgenic fish would spread its genes. The program crunches just a few key measures of an animal's fitness: the age at which it reproduces, the likelihood that it will survive that long, its fertility or fecundity, its ability to attract mates, and how many times it might breed.

    Growth spurt.

    Engineered fish (largest individual, bottom) outpace nontransgenic brethren.


    The mix can generate surprising results. In a 1999 paper in the Proceedings of the National Academy of Sciences, Muir and Howard reported that even if a transgene decreased one component of a fish's fitness, it could still harm a native population. For instance, a transgene that makes young fish less likely to survive but boosts their ability to mate if they do would still spread through a population. As transgenic fish mated with wild ones, both populations would shrink, generation by generation, until they disappeared.
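    The dynamic Muir and Howard describe can be caricatured in a few lines of code. The sketch below is purely illustrative: the haploid, discrete-generation structure and the parameter values are assumptions made for this example, not the model published in the 1999 PNAS paper. It shows how a transgene that cuts juvenile survival to 70% of wild type, but doubles the mating success of survivors, can still sweep through a population while driving its numbers down.

```python
# Illustrative caricature (not the published model) of the Muir-Howard
# "Trojan gene" effect: lower juvenile viability, higher mating success.

def trojan_gene(generations=40, pop=10_000.0, freq=0.01,
                viability=0.7, mating_adv=2.0):
    """Haploid, discrete-generation sketch with invented parameters.

    viability  -- transgenic juvenile survival relative to wild type
    mating_adv -- relative mating success of surviving transgenics
    Wild-type fitness is scaled to exactly replace itself, so any drop
    in mean viability shrinks the total population.
    """
    history = [(pop, freq)]
    for _ in range(generations):
        # Mean juvenile survival falls as the transgene spreads.
        mean_viability = freq * viability + (1.0 - freq)
        # Surviving transgenics win a disproportionate share of matings.
        t_odds = freq * viability * mating_adv
        w_odds = 1.0 - freq
        pop *= mean_viability              # population shrinks each generation
        freq = t_odds / (t_odds + w_odds)  # yet the transgene keeps spreading
        history.append((pop, freq))
    return history

final_pop, final_freq = trojan_gene()[-1]
```

Under these assumed numbers the transgene's mating advantage outweighs its survival cost (0.7 × 2.0 > 1), so its frequency rises every generation while the population declines toward extinction, which is the counterintuitive outcome the Purdue model highlighted.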

    But others caution against making decisions based on models at this time. “The models may be past our ability to plug in reliable data, so maybe we shouldn't be using them for management,” Devlin says. Virginia Tech's Hallerman points out that there aren't enough data yet on Atlantic salmon or other species of interest to make it meaningful to run Muir's model. Complicating matters is the fact that the effect of a transgene can depend heavily on a fish's genetic background. This variability was demonstrated by an experiment by Devlin's group, published in Nature last year. When rainbow trout were given a gene for growth hormone, wild fish put on much more weight than did domesticated ones. Among coho salmon, in contrast, the domesticated fish bulked up more than wild ones. Because different populations of a species can vary substantially in their response to certain gene products, Devlin says, a given genetic enhancement might not pose the same risk for all members of a wild stock. Muir notes that there isn't an alternative to the model at the moment: “It represents our best guess, and perhaps that is all we can do.”

    The safest approach to transgenic aquaculture, of course, would be to raise the fish in tanks on land. But the added cost—an increase of about 40%—would make it difficult for producers to compete with conventional marine salmon farmed off, say, Chile or Tasmania. So in its proposal to FDA, Aqua Bounty suggests a compromise: raising fertile broodstock in secure facilities on land and sending only sterilized fish to sea pens for rearing.

    Opponents say the plan is still too risky. Aqua Bounty sterilizes eggs in pressure chambers, a process that adds an extra set of chromosomes, and says it achieves 100% effectiveness. But that's in lab conditions with only thousands of eggs, treated by experienced operators. Muir and others suspect that at the industrial scale, some eggs would be bound to become fertile females. Practically speaking, “there isn't anything such as 100% sterility,” says Sue Scott of the Atlantic Salmon Federation in St. Andrews, New Brunswick, Canada. What's necessary, argues Anne Kapuscinski of the University of Minnesota, Twin Cities, is to have multiple barriers, such as sterility plus confinement on land.

    “We've spent a decade developing and testing transgenic fish, and we're still not confident in our risk assessment,” says Devlin, who has no plans to commercialize his lab's fish. Other researchers say the technology has a lot to offer: “I view it as a hopeful process for biodiversity, because it will ease harvesting pressure on wild populations,” says Gross. The enhanced productivity of transgenic fish might make their advent—in at least some corners of the world—as likely as the rise of biotech plants.


    Radioecology's Coming of Age—or Its Last Gasp?

    1. Richard Stone

    A group of rebel scientists contends that efforts to protect humans from radioactive contamination have short-changed the environment. But some experts are balking at their calls for a radical new approach

    MONTE CARLO, MONACO—The devastating atomic bombs dropped on Japan at the end of World War II created a new branch of science almost overnight for charting radiation's effects on the human body. At about the same time, scientists also began probing how radioactive contamination filters up the food chain to people. Compared to studies of human health, however, the field of radioecology has struggled to make an impact. Its second-class status is even enshrined in a decade-old statement from the influential International Commission on Radiological Protection (ICRP): “The standards of environmental control needed to protect man to the degree currently thought desirable will ensure that other species are not put at risk.” Now, some radioecologists are beating the drums for a better deal for the rest of the living world.

    The ICRP stance, says radioecologist Jan Pentreath of the University of Reading, U.K., “sounds like a religious statement”: There's no explicit scientific evidence to back it up, he says. In most environments, including obvious places such as the ocean floors, “humans are likely to be the least exposed” to radioactivity, says Pentreath. Adds Per Strand, the newly elected president of the International Union of Radioecology (IUR), “It's very strange that we don't protect the environment from radioactivity like we do for other contaminants.”

    This heretical new view had its coming-out party here last week at a gathering of 300 radioecologists from around the world. “This is the first time we've brought this issue to a broader audience,” says Strand, who chaired the conference and is one of the movement's leaders. Their campaign might soon score its first victory. Following months of often fractious debate, ICRP last month put out an updated statement for public comment, this time declaring that nonhuman species should be protected in their own right from the harmful effects of radionuclides spawned by nuclear power generation, atomic weapons production, and other human activities. The rebel radioecologists see this as just the first step. They also want to put the field on a more systematic and quantitative footing, perhaps laying the groundwork for tighter regulations on radionuclide release in areas such as coastal waters. “There is a general feeling that the current system of radiation protection we have is inadequate,” says Carl-Magnus Larsson of the Swedish Radiation Protection Authority.

    Some experts, however, think this expansionist push is an attempt to prop up a discipline that has little new to say. “Responsible scientists should not try to manufacture a crisis before there is scientific evidence that such a crisis exists,” argues physical chemist Pier Roberto Danesi, who recently retired from the directorship of the International Atomic Energy Agency's (IAEA's) laboratory in Siebersdorf, Austria. “Lack of knowledge about the effects of radioactivity on many species doesn't necessarily mean there's a problem.” Indeed, many presentations at the meeting here highlighted the environment's resilience to radionuclides. Others say that focusing too much effort on marginal impacts on some species could sap efforts to tackle more important issues, such as disposing of the vast accumulations of radioactive waste. “To me that's the biggest problem to solve,” says radioecologist Graeme (George) Shaw of Imperial College in London.

    A dying breed?

    This is not the first time that radioecology has reached a crossroads. Following the boom years of the 1950s and 1960s, during which it traced the effects of fallout from nuclear weapons tests, the discipline suffered a slow decline. By the mid-1980s, when it was clear that global fallout posed minimal risk to ecosystems, radioecology was searching for an identity.

    Then on 26 April 1986, reactor number 4 at the Chornobyl Nuclear Power Plant exploded. The blast and subsequent fires released a radioactive plume that settled over large swaths of Europe, triggering scores of cases of childhood thyroid cancer, despoiling cropland, and gouging a deep psychological wound that has not yet fully healed. The Chornobyl disaster breathed life into radioecology. “It really renewed the field,” says Shaw, an associate editor of the Journal of Environmental Radioactivity.

    Radioecologists have been at the forefront of studies to determine the contamination's consequences, particularly from the most prevalent radioisotope left on the land, cesium-137. But experts disagree about whether many of the effects they're seeing in wildlife around Chornobyl—for example, higher levels of genetic variation in yellow-necked mice and other denizens of radiocesium-ridden land—pose a threat to the animal populations or whether the animals are taking the radiation in stride.

    The difficulty in deriving meaning from such data has contributed to radioecology's identity crisis. “We've lost our sense of direction in the last few years,” says Shaw. “Chornobyl and radiocesium have been our main focus for too long.” There are plenty of exceptions: Highlights from the Monaco meeting included presentations on the risks posed by depleted uranium munitions used in the Balkans and the Gulf region (see sidebar) and studies showing that no radioactivity was released from the Kursk, the Russian nuclear submarine, either as a result of the explosion that sank it or the effort to raise the vessel from the seabed.

    But although Chornobyl may have been the lifeblood of the field in recent years, it may also pose its greatest threat. The accident crippled the world's nuclear industry, and some prominent radioecologists have argued that if nuclear energy disappears from the world's energy programs, “there will no longer be a justifiable role for radioecologists,” Shaw says.

    A raison d'être?

    Amid the angst, a few individuals have begun to sketch out a new direction for the field. One of the pioneers is Pentreath, who in 1999 proposed a framework for charting the effects of radioactivity on sentinel species that could serve as “reference flora and fauna” for whole ecosystems and encourage a more systematic approach to estimating radiation exposure and dosimetry across species. “We need a common lexicon,” Pentreath says, that would replace today's piecemeal approach.

    IUR adopted Pentreath's ideas as a proposed strategy for the field in 2000, and the approach is gaining favor among funding agencies. A project sponsored in part by the European Union called Environmental Protection for Ionising Contaminants in the Arctic has put together a list of reference organisms, ranging from lichens and soil invertebrates to mammals, including lemmings, voles, and reindeer. A second European project called FASSET is attempting to build a framework for assessing radiation effects—in particular, death, illness, reproductive impairment, and cellular damage—across species. “It is a dramatic change that has occurred over the last few years,” says Larsson, “and a welcome change.”

    More contentious, however, are the potential implications of the approach for regulations. Such work could, for example, influence regulations on radionuclide releases into the environment. Although dumping radioactive waste on the high seas is now prohibited, most countries allow waste to be discharged from pipelines into coastal waters. “We have a rather weak international regime for the control of discharges, although there are moves to strengthen it in some regions of the world,” notes IAEA's Gordon Linsley.

    Critics point out that much of the driving force for the new approach to radioecology is coming from scientists in Scandinavia, which has long tended to take an aggressive stance on nuclear issues. “The IUR was heavily criticized; they called us Greens,” recalls Strand, who rejects that characterization. “The IUR simply said, ‘Let us be open-minded and look [at] how we can assess the consequences of radioactivity.’”

    In the meantime, Strand was invited to join a task group crafting the ICRP's new statement reflecting that view. ICRP will take comments through the end of the year and expects to finalize the statement in April 2003. “It really has to be sold around to the radiation-protection community,” says Strand. For some observers, it will be a tough sell.


    New Findings Allay Concerns Over Depleted Uranium

    1. Richard Stone

    When several NATO peacekeepers in Kosovo contracted leukemia after their tour of duty, some people pointed the finger at depleted uranium (DU). Because uranium is roughly 70% denser than lead, it makes an effective armor-piercing weapon. NATO aircraft had fired several tons of ballpoint pen-sized DU projectiles at Serb military targets in Kosovo in 1999; much of the ordnance would have fractured and disintegrated on impact, dispersing uranium particles into the air and soil. Fears were heightened after DU penetrators collected in Kosovo were found to contain traces of plutonium and highly radioactive uranium-236, indicating that at least some of the uranium had been irradiated and reprocessed and thus would be more radioactive than typical DU. To assess the danger, if any, to soldiers and local people, the United Nations Environment Programme (UNEP) dispatched teams of researchers to Kosovo in November 2000.

    At a radioecology conference last week in Monaco (see main text), one of those teams presented results that should calm the nerves of peacekeepers and Kosovars. The team, led by physical chemist Pier Roberto Danesi, former director of the International Atomic Energy Agency's (IAEA's) laboratory in Siebersdorf, Austria, confirmed that some patches of soil from known impact sites in Kosovo are tainted with DU. But the amounts, the team maintains, are so tiny that the radioactivity poses virtually no cancer risk. Moreover, Danesi's group found no evidence of elevated plutonium levels in the soil. Their findings jibe with those of other bodies, including the U.K.'s Royal Society and the European Union, that have surveyed the DU literature. “There is a consensus now that DU does not represent a health threat,” says Danesi. The latest findings, asserts radiochemist Corrado Testa of the University of Urbino in Italy, “confirm that there is no risk from DU.”

    Bringing in the big guns.

    Soldiers measure radiation levels on a Yugoslav army tank in western Kosovo in January 2001.


    Depleted uranium is what's left of natural uranium after the fissile isotope U-235 is extracted for nuclear weapons or fuel. According to NATO, its aircraft shelled 112 locations in Kosovo in 1999 with 30,000 rounds of DU munitions totaling about 9 tons. Newspaper reports linking the munitions to cancer cases, particularly leukemia, soon followed.

    Danesi's group collected 16 soil samples near DU penetrator holes and underneath penetrators found on the soil surface at five sites. Minefields prevented the team from visiting other areas hit by DU penetrators. Back at the IAEA lab, the researchers threw everything at the samples: instruments ranging from a secondary ion mass spectrometer to a scanning electron microscope equipped with an energy dispersive x-ray fluorescence detector. They found that in the most contaminated places, a few milligrams of soil could contain hundreds of thousands of DU particles—but still not a high enough concentration to elevate cancer risk, Danesi says.

    Plutonium levels in the Kosovo soil—about 1 becquerel per kilogram—accorded with global levels of fallout from atmospheric nuclear tests. For comparison, soil levels in the Alps, near Salzburg, are nine times as high, thanks to Chornobyl. “As far as the plutonium is concerned, you could feed this soil to someone and he'd be fine,” Danesi says. His team will elaborate on its findings in companion articles in the December issue of the Journal of Environmental Radioactivity. Other field investigations in Kosovo have yielded even more comforting results. “We found it very difficult to distinguish between DU and natural uranium,” says Testa, whose lab performed analyses for UNEP.

    The findings might be reassuring, but the DU issue will not be laid to rest. UNEP is organizing a sampling mission in Sarajevo next month, where 3 tons of DU was dropped during the Balkans war, and Iraqi officials have called for investigations into DU on their territory. Nevertheless, maintains Testa, “for me this is a false problem. We could be spending money on more urgent problems”—toxic solvents, heavy metals, and organic pollutants, to name a few, he says.