News this Week

Science 01 Apr 2005:
Vol. 308, Issue 5718, pp. 30
  1. U.K. BIOETHICS

    Divided Committee Urges Less Restriction on Embryo Research

    1. Mason Inman

    CAMBRIDGE, U.K.—The United Kingdom has some of the least restrictive rules in Europe governing research on human embryos. But in a wide-ranging and controversial report* issued last week, the House of Commons Science and Technology Committee argues that they should be relaxed even further: The report says the government should consider lifting the current absolute ban on research involving genetic modification of human embryos and the creation of chimeric human-animal embryos, and that it should even reopen debate on human reproductive cloning.

    Some of the recommendations go against mainstream public opinion and venture into territory where many scientists are reluctant to go. And the committee is itself bitterly divided over the report's approach and conclusions: Five of its 11 members signed a statement disavowing the report, saying that the majority adopted “an extreme libertarian approach,” producing a report that is “unbalanced, light on ethics, goes too far in the direction of deregulation, and is dismissive of public opinion and much of the evidence.”

    The report is part of a reevaluation of the country's regulation of medical and scientific use of human embryos. Reproductive research in the United Kingdom is regulated by the 1990 Human Fertilisation and Embryology Act, drawn up before mammals had been cloned or human embryonic stem cell lines created. The U.K. Department of Health requested the report from the parliamentary committee, led by biologist Ian Gibson, now a Labour member of parliament. The report “asks politicians and the public to justify any extra regulation or any legislative prohibitions in arguments of principle with potential harms to be based on evidence rather than myth or prejudice,” Gibson said in a statement. Parliament would eventually consider any changes to the law.

    Division of opinion.

    Ian Gibson says rules should be based on “evidence rather than myth or prejudice.” (Right) 4-day-old blastocyst.


    Overall, the report argues that “alleged harm to society or to patients need[s] to be demonstrated” before research on reproductive technologies and their clinical use is “unduly impeded” by regulations. The panel offers more than 100 recommendations on specific issues. For example, it says that parents should be allowed to select embryos before implantation solely on the basis of sex. This flies in the face of British public opinion; 85% of the respondents in a 2002 poll said they were against sex selection for nonmedical reasons.

    The report says there is no justification “at present” for changing the rule that research on embryos cannot be conducted beyond 14 days after fertilization. But it goes further, arguing that genetic modification of human embryos should be permitted during that 14-day period for research purposes—and perhaps sometime in the future for reproductive uses “under tightly controlled circumstances if and when the technology is further advanced.” It also suggests that the government should consider relaxing the ban on the creation of hybrid or chimeric embryos if they are destroyed after 14 days. About such mixtures of human and animal cells or embryos, the report notes, “it could be argued they are less human, and therefore pose fewer ethical problems for research, than fully human embryos.”

    As for reproductive cloning, the report points out that it is not now safe and that ethics prohibits performing human experiments to work out the bugs. But the government needs to separate issues of feasibility from safety and ethical concerns and come up with principled arguments to maintain a total prohibition on reproductive cloning, the report says: “Without such arguments, an indefinite absolute ban could not be considered rational.” One problem the report points to with an absolute ban is a gray area between reproductive and therapeutic cloning, such as the use of cloning techniques to create artificial gametes as an infertility treatment.

    “I hope the report will encourage research,” says geneticist Robin Lovell-Badge of the U.K.'s National Institute for Medical Research in London. He says the recommendation that research be permitted on human-animal chimeras is logical. “What is the difference between conducting experiments with human embryos up to 14 days and human-animal chimeras up to the same age?”

    However, Stephen Minger, a stem cell researcher at King's College London, says “the views that are expressed [in the report] are very much different from those of researchers in stem cell work and reproductive medicine.” About the recommendation on reproductive cloning, he says, “I'm a bit surprised that they say that's something we should consider. We already decided reproductive cloning should be banned.” He adds: “I don't think it fosters public support to issue a report with so much dissension in it.”

  2. SELECT AGENTS

    Researchers Relieved by Final Biosecurity Rules

    1. Jocelyn Kaiser

    In a move that many university researchers welcome, the government has slightly relaxed new regulations aimed at beefing up security at biodefense research labs. The final rules on “possession, use, and transfer of select agents and toxins” do not dictate exactly what procedures labs should use but instead allow for flexibility—an approach scientific groups had recommended. At least one critic says the rules are a step backward, however.

    The new rules on handling select agents—viruses, bacteria, and toxins that could be used to harm people, crops, or livestock—were required by a bioterror law passed in response to the 2001 anthrax letter attacks. Interim rules issued in December 2002 by the U.S. Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, and the U.S. Department of Agriculture (USDA) require registration with the government and background checks for anyone handling select agents, and they call for security procedures such as keeping agents in locked containers. The rules also require prior approval from the Department of Health and Human Services (HHS) for genetic engineering experiments that can make an agent more toxic or resistant to drugs. Violations can result in fines and criminal penalties.

    When the interim regulations were first issued, scientists and universities protested that they were confusing, expensive—up to $730,000 in start-up costs per lab—and could delay or impede research (Science, 21 February 2003, p. 1175).

    Handle with care.

    Final rule allows flexibility for labs dealing with agents such as anthrax.


    The final rules, published in the 18 March Federal Register, should be more workable, says Emmett Barkley, who heads lab safety for the Howard Hughes Medical Institute in Chevy Chase, Maryland. One key change is that institutions will be allowed to tailor biosecurity plans to their own situations. The rules also now emphasize limiting “access” to select agents rather than securing a physical “area” or lab. This means biodefense researchers can share lab space with other researchers, and only those working with select agents have to undergo background checks. CDC now estimates that the total annual cost per lab will be $15,300 to $170,000.

    Although the flexibility is good news, says Janet Shoemaker, public affairs director for the American Society for Microbiology (ASM) in Washington, D.C., universities will need more guidance. Depending on the agent, she says, “do you need guards 24 hours a day? Or are access cards and a locked freezer enough?” But helping institutions get up to speed shouldn't be too arduous, she notes; only 417 labs are now registered to handle select agents, fewer than half the number predicted when the interim rule came out, and just 105 of those are at academic institutions. Biodefense labs will receive further guidance from CDC and USDA this spring, and more this summer in the latest version of the CDC/National Institutes of Health's biosafety manual.

    Biodefense critic Richard Ebright, a microbiologist at Rutgers University in Piscataway, New Jersey, asserts that this relaxation of the rules increases the risk of accidents. In particular, he says, the focus on “access” rather than an entire lab “is especially egregious.”

    Government officials left some issues unresolved—for instance, CDC declined to modify the select agent list, as ASM and others have requested. The government is also “studying” whether other experiments, such as engineering a vaccine-resistant virus or aerosolizing an agent, should require special approval from HHS.

  3. INFECTIOUS DISEASES

    A Puzzling Outbreak of Marburg Disease

    1. Martin Enserink

    As many as 120 people may already have died in northern Angola from what could become the largest recorded outbreak of Marburg virus, a rare cousin of the Ebola virus that also causes hemorrhagic fever. Early this week, experts were rushing to the scene armed with diagnostic equipment, hoping to stanch the epidemic and learn more about the mysterious disease, which has surfaced just six times over the past 4 decades.

    To scientists, both the outbreak's location and its manifestation are unusual. Marburg was known to occur only in Eastern and Central Africa, and “based on geography, you'd think this had to be Ebola,” says Thomas Geisbert of the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) in Fort Detrick, Maryland. According to the World Health Organization (WHO), about 75% of cases have occurred in young children, also strange for a hemorrhagic fever virus, says Thomas Ksiazek of the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia, whose lab first identified Marburg almost 2 weeks ago in samples shipped from Angola. Initial sequencing, however, does not suggest it's an unusual strain, Ksiazek adds.

    Marburg—which can cause fever, pains, diarrhea, coughing, nausea, and hemorrhaging—was first discovered in 1967, when a shipment of monkeys from Uganda caused simultaneous outbreaks in the German towns of Marburg and Frankfurt and in Belgrade, then the capital of Yugoslavia, sickening 31 and killing seven. Three small outbreaks are known to have occurred in Africa in the 1970s and 1980s involving six people, three of whom died. The largest outbreak so far occurred between 1998 and 2000 in the northeastern part of the Democratic Republic of the Congo, with 149 known cases and 123 deaths.

    Casualty.

    Italian pediatrician Maria Bonino (center), who worked in Uige province, has died, presumably from infection with the Marburg virus.


    Although a few cases have been identified in the Angolan capital Luanda, the current outbreak is concentrated in the northern province of Uige, according to WHO, which has a team on the ground to help local authorities identify patients and raise awareness of the disease. Overcoming logistical hurdles in a poor, war-ravaged country like Angola can be a challenge, but stopping the outbreak shouldn't be “particularly problematic,” Ksiazek says. Marburg is not highly contagious; infection requires close contact with a patient. Tracing and strictly isolating patients will usually bring the virus under control. But contact with patients can be risky: Several health care workers are reported to have died, including Maria Bonino, a pediatrician who worked in the area for CUAMM Medici con l'Africa, an Italian charity.

    To help with diagnosis, virologist Heinz Feldmann and lab technician Allen Grolla of Canada's National Microbiology Laboratory left for Angola this weekend, carrying with them a mobile lab—small enough to fit in five or six suitcases—to test samples locally. Although stamping out the disease comes first, the team hopes to do some research as well, says Feldmann's colleague Steven Jones—for instance, by trying to find out which immune response protects some people from the disease.

    The presumed animal reservoir for Marburg is still unknown, as is the reservoir for Ebola. Investigating the first cases in the current outbreak, which occurred as early as November, may give researchers clues, says virologist Hans Dieter Klenk of the University of Marburg.

    Besides offering supportive care, there's not much health workers can do for patients or those at risk of contracting the virus. There are currently no drugs or vaccines for Marburg. Feldmann and Jones have developed a live vaccine against Ebola that has shown promise in monkeys (Science, 14 November 2003, p. 1141), and recently the team reported that a similar vaccine for Marburg protected monkeys as well. USAMRIID researchers are trying several other vaccine and drug strategies, but all are years away from use in the field.

  4. SPACE EXPLORATION

    Japan Weighs Moon and Beyond

    1. Dennis Normile

    TOKYO—Japan's space agency has drawn up a new vision for exploration that includes a crewed space program and a scientific base on the moon. But a new report by an outside panel of experts suggests that the agency, which is trying to erase the stain of several costly failures, already is trying to do more than its sagging budget can handle.

    Next week, the Japan Aerospace Exploration Agency (JAXA) will lay out a 20-year vision for the agency that would move Japan into the elite circle of nations sending humans to the moon and beyond. The vision is more of a wish list than a plan, however, and would require vast new spending. And although details are sketchy ahead of the 4 April release, space scientists are already concerned that science might take a back seat in such a massive undertaking. “We are not yet confident about whether JAXA top management is really prepared to give proper attention to space science,” says Takeo Kosugi, an astrophysicist at the Institute for Space and Astronautical Science (ISAS), a formerly independent body that is now a division within JAXA. “We will need to be continuously battling for a long time.”

    Hands-on.

    Dan Goldin presents report on improving mission success rates to JAXA's Keiji Tachikawa.


    Hiroshi Matsumoto, a space radio scientist at Kyoto University who served on the committee drafting the vision, says that crewed missions are seen as a way to rekindle public support for space activities. “Otherwise, ordinary people are just not too excited about space,” he says. But that blasé public attitude stems in part from “the poor reliability of Japan's spacecraft,” says Hiroyuki Yoshikawa, an engineer and former president of the University of Tokyo whose research has focused on manufacturing reliability.

    Space advocates see the successful February launch of a weather satellite aboard the error-plagued H-2A rocket as a shot in the arm for JAXA, which was created in 2003 by combining ISAS with two agencies focused on the commercial development of space and on aeronautics. In February 2000, an ISAS M-V rocket failed to place in orbit the Astro-E satellite, intended to study high-energy x-ray sources such as black holes. In December 2003, ISAS's Nozomi probe to Mars missed its target because of thruster problems.

    The agency's other components have suffered similar setbacks. Over the past decade, three of the country's 13 heavy rocket launches have failed, and two Earth observation satellites lost power due to problems with their solar panels. Government spending on space has dropped by 20% since a 1999 peak that included the completion of commitments to the international space station.

    Back in business.

    A successful launch of the H-2A rocket in February has improved morale at JAXA.


    The failures led to soul-searching at JAXA, which last summer assembled a seven-member Advisory Commission for Mission Success “to see if there were structural or systemic issues within the organization that were contributing to the problems,” says Yasushi Horikawa, a JAXA associate executive director. The commission was headed by Daniel Goldin, former head of NASA, and included the former heads of both France's and Germany's national space agencies.

    The group's report, submitted last week, listed 21 steps that, “if properly implemented, could allow JAXA to significantly improve” its performance. The report recommends that the agency “resolve the discrepancy between JAXA's broad mandate of mission activities … and its budget.” It also urges better integration of the previously separate parts into “one JAXA” with a unified vision and strategic plan.

    Kosugi agrees that closer cooperation among the different arms of JAXA could lead to greater sharing of proven technologies for things such as satellite power systems. But Kosugi adds that the merger is already having a negative effect on basic science. A planned launch this winter of a replacement for the Astro-E satellite lost in 2000 was pushed back to this summer, he notes, to make room for the H-2A launch.

    The commission also wondered whether Japan's space activities are too extensive for its $1.7 billion budget, which is dwarfed by the $5.4 billion that Europe collectively spends and NASA's $16 billion pot. Despite that disparity, it notes, JAXA has a portfolio that matches those of its rivals in conducting basic science and earth observation missions, launching communications and weather satellites, developing new rockets, and providing components for the space station. Trying to do so much with limited resources, it suggests, may have contributed to the previous failures.

  5. FRENCH SCIENCE

    Politician Sails Into a Storm at Oceans Agency

    1. Barbara Casassus*
    1. Barbara Casassus is a writer in Paris.

    PARIS—Scientists at the French oceanographic research agency Ifremer are furious that a nonscientist with strong political connections has been parachuted in as their director. The new arrival, Jean-Yves Perrot, began work last week, replacing Jean-François Minster, a respected physicist and former official at the main French research agency, CNRS. Minster had sought a second 5-year term as chief of the semipublic Ifremer but was turned down in order to clear the way for Perrot, research union officials say.

    Perrot, an adviser to former Finance Minister Hervé Gaymard, found himself out of a job when the minister resigned in February over a housing scandal. Gaymard had rented a 600-square-meter apartment at taxpayers' expense and was accused of misusing public funds. He denied the allegations, but his contradictory statements led to embarrassing press reports, ending in his resignation after 3 months in office.

    Mayor of the smart Paris suburb Marly-le-Roi, Perrot is a member of the right-wing UMP party and sits on the Ile-de-France regional council. He has no intention of giving up politics and “is under no obligation to abandon these posts,” says an Ifremer official, who declined to be identified. “It is the first time that a politician with no background in science or technology and with important elected posts has been appointed head of a research institute” in France, according to a written statement from the union CFDT.

    Biochemist Anne-Marie Alayse, a leader of the CGT union at Ifremer, complains that Perrot “knows nothing about research but will have to take important scientific decisions.” Ifremer's $194 million budget supports studies of ocean ecosystems with the goal of aiding the maritime economy. Although Alayse acknowledges that Minster also encountered resistance when he pushed through an unpopular reform, he was unlike Perrot in that he arrived “with ideas in the pipeline,” she says.

    A graduate of the elite École Nationale d'Administration, Perrot has taught at leading political science schools and held high-level posts in the transport ministry since the mid-1990s. He was chief aide to Gaymard when Gaymard was agriculture, food, fisheries, and rural affairs minister starting in 2002 and became Gaymard's special adviser when the latter was promoted to finance last November.

    Perrot dismisses the scientists' criticisms of his appointment. “I am a civil servant and a man of reflection, not a politician, and am not the first nonscientist to be appointed as head of a research agency in France,” he tells Science. “I will consult all the experts necessary before I take any scientific decisions and will maintain the Ifremer objectives that have already been endorsed,” he adds. He insists that political duties will not make him a part-time director. “My job as chief aide to a minister was at least as time-consuming as this one, and my political posts never prevented me from doing what I had to do.”

  6. ITALIAN RESEARCH

    Universities and Institutes Face Industrial Revolution

    1. Susan Biggin*
    1. Susan Biggin is a writer in Trieste, Italy.

    ROME, ITALY—A revolution is sweeping over Italy's universities and scientific community as Prime Minister Silvio Berlusconi's conservative government tries to align publicly funded research more closely with the needs of industry. While a university reform bill that aims to overhaul the structure and recruitment procedures for academic staff lumbers through Parliament, a stopgap government decree addressing some of the bill's most urgent measures passed into law this week. University rectors have declared much of it “totally unacceptable.” Scientists are also uneasy about a national research plan for 2005–07, which was unveiled last month. Although it injects an additional $2.3 billion into science, it would require many scientists to build closer ties to industry. Berlusconi has called on companies “to make more effort” to similarly boost their R&D spending.

    Italy's Ministry for Education, Universities, and Research, headed by Letizia Moratti, has been pushing this pro-business agenda since Berlusconi took power in June 2001. Her aim is to raise private investment and boost high-tech industry. This ties in with the European Union's goal, set out at a meeting in Lisbon in 2000, for member states to spend 3% of their gross domestic product on research by 2010.

    Italy currently spends about 1.2%—more than half of which now comes from public coffers. Berlusconi argues that attracting more industry investment will require reform of the universities and government research institutes. The decree that became law this week fixes an annual 31 March deadline for universities to submit their rolling 3-year programs and staffing requirements and, more controversially, it approves salary hikes for 22,000 entry-level researchers after just 1 year instead of the current three, although it provides no extra cash to pay for the raises. The decree also diverts 7% of university funding from public to private universities.

    Leading the charge.

    Letizia Moratti is trying to align publicly funded research with industry's needs.


    These measures, plus others in the bill, angered the College of Rectors. It sent a strongly worded letter to Moratti calling for a 10% increase in funding, as well as scrapping the researcher post in favor of a new position which leads more naturally to grades of associate and full professor. Moratti agreed to reconsider and sent the bill back to Parliament for reworking.

    Government research centers are facing similar upheavals: The Institute for the Physics of Matter has been folded into the National Research Council (CNR), which itself has been restructured into strategic programs designed to appeal to industry (Science, 11 March, p. 1543). This policy is defended by CNR chief Fabio Pistella, who says “state funding is no longer sufficient.”

    Last month's national research plan continues this trend. The plan is built around 10 programs and creates 11 new technology “districts,” each specializing in a particular field, such as telecommunications in Piemonte and nanotechnology in Veneto. The plan has divided scientists. Many are angered by the creation of new institutes—such as one in Lucca for innovation and marketing—when existing centers remain underfunded. But Umberto Veronesi, science director of the European Institute of Oncology in Milan, argues that a new biomedical center in the city—modeled on the National Institutes of Health's complex in Bethesda, Maryland—will help improve people's access to the latest treatments.

    Aldo Schiavone, director of the Italian Institute for Human Sciences and a specialist in higher education policy, says the university reform bill provides a unique opportunity for a once-and-for-all overhaul of the academic career structure. The College of Rectors, he says, should seize the chance to modernize and not use it just to wring more resources from government. Moratti insists that achieving the Lisbon goals means the universities must look beyond their own interests to the needs of the private sector: “We need a vision that relates to the country as a whole.”

  7. NEUROSCIENCE

    Economic Game Shows How the Brain Builds Trust

    1. Greg Miller

    As any economist will tell you, people don't always behave rationally when it comes to money. For instance, we sometimes trust complete strangers with our hard-earned dough. This suggests to many that a tendency to trust is hard-wired into the human brain.

    Until now, little was known about the neural circuitry underlying the capacity to trust. But on page 78, neuroscientists and economists from Texas and California report an intriguing insight: Activity in a brain region called the caudate nucleus reflects one person's intention to trust another with a sum of money. Their results also suggest that trust isn't purely noble—it may stem from a cold calculation of expected rewards.

    “I think it's a very important paper. It's going to change the way we think of social interactions,” says Paul Zak, who directs the Center for Neuroeconomic Studies at Claremont Graduate University in California. “It's an exceedingly well done and rigorous study,” agrees Paul Glimcher, a neuroscientist at New York University.

    The research exemplifies the fledgling field of neuroeconomics, which combines the brain imaging tools of neuroscience with the exchange games economists have invented to probe how people behave during financial transactions. It's also one of the first studies in which the brains of two people were scanned simultaneously during a social interaction. Two volunteers played a trust game from inside functional magnetic resonance imaging scanners, one at the California Institute of Technology in Pasadena and the other at Baylor College of Medicine in Houston, Texas.

    Tête-à-tête.

    Brain scans of the investor (left) and trustee in an economic exchange game shed light on the neural basis of trust.


    In each of 10 rounds, one player, the designated “investor,” received $20. The investor then had the option of sending some, all, or none of the $20 to the other player, the “trustee.” According to the rules of the game, which were known to both sides, any money the trustee received tripled. The trustee then had the option of returning a portion of the new sum to the investor. The players' only knowledge of each other came from numbers flashed on a monitor that indicated the amount of money changing hands in each round, as well as each player's total for the game.
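
    The arithmetic of the exchange is simple to encode. Below is a minimal sketch of a single round's bookkeeping: the $20 endowment and the tripling rule come from the article, but the code and its names are illustrative, not from the study itself.

    ```python
    # One round of the trust game: the investor sends some of a $20
    # endowment, the amount triples en route, and the trustee returns
    # whatever portion he or she chooses. Names here are illustrative.

    def play_round(invested: float, returned: float, endowment: float = 20.0):
        """Return (investor_total, trustee_total) for a single round."""
        assert 0 <= invested <= endowment
        received = 3 * invested            # any money sent is tripled
        assert 0 <= returned <= received
        investor_total = endowment - invested + returned
        trustee_total = received - returned
        return investor_total, trustee_total

    # Example: the investor risks $15; the trustee receives $45 and returns $20.
    print(play_round(15, 20))  # -> (25.0, 25)
    ```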

    The extent to which a player trusted another with his or her money depended on the recent history of the exchange. If an investor increased the contribution to a trustee immediately following a round in which the trustee had reduced payback, the trustee generally rewarded this benevolent reciprocity with a greater return in the next round. But if an investor demonstrated malevolent reciprocity by repaying generosity with stinginess, the trustee usually returned less the next time around.

    Examining the trustees' brain scans, the researchers found that activity in the caudate nucleus was greatest when the investor showed benevolent reciprocity and most subdued when the investor showed malevolent reciprocity. Moreover, caudate activity rose and fell with changes in the amount of money trustees returned to their investors on the subsequent round. The team concludes that activity in a trustee's caudate nucleus reflects both the fairness of the investor's decisions and the trustee's intention to repay those decisions with trust (or not).

    The caudate nucleus's “intention to trust” signal appeared about 14 seconds sooner in later rounds of the game, an indicator that the trustee is building an opinion of the investor's trustworthiness, says Read Montague, who led the Baylor team.

    The caudate nucleus is well connected to the brain's reward pathways, and previous work has shown that it revs up when subjects expect a reward such as juice or money. Montague and colleagues speculate that trust, admirable trait that it is, boils down to predicting rewards—in this case, the “social juice” of the investor's reciprocity. Trust has been an element of human social interactions for many thousands of years, says Ernst Fehr, a neuroeconomist at the University of Zurich in Switzerland, so it makes sense that it would tap into ancient neural systems like the reward pathways.

  8. MATHEMATICS

    'Cranky' Proof Reveals Hidden Regularities

    1. Dana Mackenzie*
    1. Dana Mackenzie is a freelance writer in Santa Cruz, California.

    Mathematicians crave patterns, and nowhere do they find richer pickings than in the theory of numbers. Five years ago, a breakthrough in a long-standing problem connected with one of the simplest functions of number theory yielded an unexpected bonanza of new patterns. Now, a new proof suggests that that was just the beginning. “It's almost certain that there will be more where this came from,” says number theorist George Andrews of Pennsylvania State University, University Park, whose work helped pave the way for the new result.

    The proof involves the partition function, which counts the number of ways a positive integer can be written as a sum of positive integers, without regard to order. For instance, the number 4 can be partitioned in five different ways: 1 + 1 + 1 + 1, 2 + 1 + 1, 2 + 2, 3 + 1, or simply 4 itself. In other words, the fourth “partition number” is 5. Similarly, the fifth partition number is 7. Partitions crop up throughout number theory and have proved handy for balancing energy budgets in particle physics.
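
    For readers who want to experiment, here is a minimal sketch of the partition function in Python, using the classic coin-change-style recurrence (allow parts of size 1, then 2, and so on); the code and its names are illustrative, not drawn from the papers discussed here.

    ```python
    # Count partitions with a dynamic program: p[total] accumulates the
    # number of partitions of `total` using parts no larger than `part`.

    def partition_numbers(n_max: int) -> list[int]:
        """Return [p(0), p(1), ..., p(n_max)]."""
        p = [1] + [0] * n_max             # one partition of 0: the empty sum
        for part in range(1, n_max + 1):
            for total in range(part, n_max + 1):
                p[total] += p[total - part]
        return p

    print(partition_numbers(5))  # -> [1, 1, 2, 3, 5, 7]; p(4) = 5, p(5) = 7
    ```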

    In the late 1910s, Indian mathematical genius Srinivasa Ramanujan noticed that not only the fourth partition number but every fifth partition number after it is also divisible by 5. What's more, every seventh partition number (beginning with 7) is divisible by 7, and every eleventh partition number (beginning with 11) is divisible by 11. There, mysteriously, the pattern stops.
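
    The three patterns are easy to confirm numerically, reusing the partition_numbers sketch above (the indexing follows the article: the divisible-by-5 chain starts at p(4), the by-7 chain at p(5), and the by-11 chain at p(6)).

    ```python
    # Verify Ramanujan's congruences for all partition numbers up to p(499):
    # p(5k + 4) = 0 (mod 5), p(7k + 5) = 0 (mod 7), p(11k + 6) = 0 (mod 11).

    p = partition_numbers(500)
    assert all(p[4 + 5 * k] % 5 == 0 for k in range(100))
    assert all(p[5 + 7 * k] % 7 == 0 for k in range(71))
    assert all(p[6 + 11 * k] % 11 == 0 for k in range(45))
    print("All three of Ramanujan's patterns hold up to p(499)")
    ```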

    Freeman Dyson, now a professor emeritus at the Institute for Advanced Study in Princeton, New Jersey, read about the pattern as a schoolboy in the 1940s and discovered an explanation for Ramanujan's first two observations. “It was my first real piece of research,” he says.

    Sorting it out.

    “Rank” and “crank” functions divide partitions (above, of 9) into classes.


    Dyson defined what he called the “rank” of a partition: the largest number in a partition minus the number of terms in the partition. By noting that the partitions of 4, 9, 14, and so on could be sorted into five equal-sized bins according to rank, Dyson explained why the number of partitions in each case is divisible by five. Similar reasoning showed why the number of partitions of 5, 12, 19, and so on must be divisible by 7. Unfortunately, Dyson's rank formula didn't work for Ramanujan's third pattern, the one dealing with divisibility by 11. He conjectured that some other binning procedure, which he called a “crank,” would explain that pattern, but he never found it. Andrews and another number theorist, Frank Garvan of the University of Florida, Gainesville, finally discovered the elusive crank in 1988.
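
    Dyson's binning can be checked by brute force. The sketch below (again illustrative code, not Dyson's) enumerates the 30 partitions of 9, the case shown in the figure above, and sorts them into five bins by rank modulo 5; each bin ends up holding exactly 6 partitions.

    ```python
    # Enumerate partitions of 9 and bin them by Dyson's rank
    # (largest part minus number of parts), taken modulo 5.
    from collections import Counter

    def partitions(n: int, max_part: int | None = None):
        """Yield each partition of n as a non-increasing tuple of parts."""
        if max_part is None:
            max_part = n
        if n == 0:
            yield ()
            return
        for first in range(min(n, max_part), 0, -1):
            for rest in partitions(n - first, first):
                yield (first,) + rest

    bins = Counter((max(ptn) - len(ptn)) % 5 for ptn in partitions(9))
    print(sorted(bins.items()))  # -> [(0, 6), (1, 6), (2, 6), (3, 6), (4, 6)]
    ```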

    Meanwhile, mathematicians had started turning up a few other patterns like Ramanujan's. In 2000, Scott Ahlgren of the University of Illinois, Urbana-Champaign, and Ken Ono of the University of Wisconsin, Madison, hit the jackpot. Using methods similar to those that Andrew Wiles of Princeton University had used in 1994 to prove Fermat's Last Theorem, they showed that divisibility patterns in the partition function exist not just for 5, 7, and 11, but for every prime number greater than 3. For example, every 157,525,693rd partition number, beginning with the 111,247th, is divisible by 13. “Instead of two or three galaxies, they showed that there is a sky full of galaxies,” Andrews says.
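
    The first instance of that 13-pattern is within reach of a direct computation. The sketch below (illustrative; it takes a minute or so in pure Python) builds partition numbers modulo 13 with Euler's pentagonal-number recurrence and encodes the article's claim as an assertion.

    ```python
    # Compute p(0..n_max) modulo m via Euler's pentagonal-number recurrence:
    # p(n) = sum over k >= 1 of (-1)^(k+1) * [p(n - k(3k-1)/2) + p(n - k(3k+1)/2)].

    def partition_numbers_mod(n_max: int, m: int) -> list[int]:
        p = [1] + [0] * n_max
        for n in range(1, n_max + 1):
            total, k = 0, 1
            while k * (3 * k - 1) // 2 <= n:
                sign = 1 if k % 2 else -1
                total += sign * p[n - k * (3 * k - 1) // 2]
                if k * (3 * k + 1) // 2 <= n:
                    total += sign * p[n - k * (3 * k + 1) // 2]
                k += 1
            p[n] = total % m
        return p

    p_mod13 = partition_numbers_mod(111_247, 13)
    assert p_mod13[111_247] == 0   # the 111,247th partition number: divisible by 13
    print("p(111247) is divisible by 13")
    ```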

    One of Ono's graduate students, Karl Mahlburg, set out to find a “cranky” proof of the newly discovered patterns, although Ono tried to warn him away from what he considered a hopeless task. “I admit to a certain ignorance,” Mahlburg says. “There were a lot of things that could have gone wrong.” The effort paid off. In a paper submitted to Annals of Mathematics, Mahlburg shows that for prime numbers p bigger than 11, the crank does not divide the partitions into equal-sized bins—but it does group them in multiples of p. Thus, in a slightly different way, the crank accounts for Ono's congruences after all.

    Next, Mahlburg says he plans to use a computer to find new divisibility patterns in the crank function itself. Andrews says he is eager to see what happens when Ono and his students try the Wiles-like technique out on other functions of interest in number theory.

  9. ACADEMIC JOBS

    Tenured UCLA Professor Under Fire

    1. Jennifer Couzin

    The University of California, Los Angeles, is taking steps that could lead to the firing of tenured biomathematics professor Sally Blower, accusing her of threatening students and harassing faculty, according to documents supplied to Science by Blower and her husband, UCLA geneticist Nelson Freimer. Blower, who left the University of California, San Francisco, 5 years ago with Freimer after accusing the university of gender discrimination (Science, 7 April 2000, p. 26), says the charges are false. UCLA administrators declined to comment, citing confidentiality rules.

    According to the documents, UCLA Vice Chancellor Donna Vredevoe last week referred five charges against Blower to the school's Committee on Privilege and Tenure, which will help decide Blower's fate. The charges include “failure … to hold examinations as scheduled,” “use of the position or powers of a faculty member to coerce the judgment or conscience of a student,” and “verbal abuse, false statements, disparagement, and harassment of faculty.”

    The charges are only the latest storm surrounding Blower. On 12 November 2004, the dean of the UCLA School of Medicine, Gerald Levey, served Blower notice that she was barred from entering the biomathematics administrative offices pending resolution of charges filed in June. The November letter accused Blower of causing hives and increased blood pressure in two department administrators because she allegedly refused to leave their offices until security was summoned. Blower denies intimidating the administrators, noting that the incidents left her “almost in tears.”

    Imminent departure?

    UCLA's Sally Blower and her husband Nelson Freimer say the school is trying to push her out.


    Several individuals familiar with the case say it appears to have spiraled out of control. “Honestly, I can't figure out why there's such a commotion,” says gastroenterologist Peter Anton, who directs UCLA's Center for HIV and Digestive Diseases and has collaborated with Blower on HIV models for several years. “Certain personalities don't click well—but those usually seem to be resolvable without polarization, without … having to isolate and discredit a faculty member.” Anton, who filed a letter in support of Blower, adds that he hasn't “seen any evidence of egregious behavior” on her part.

    Blower is particularly incensed by the charge that she threatened students. In one case, she says, she is accused of threatening to withdraw as thesis adviser to Emily Kajita, then her only graduate student. Blower says she did e-mail Kajita saying the department was impeding her ability to advise students, but the two chose to proceed. Kajita praises Blower as “the best adviser you could ever have,” adding that Blower paid nearly $4000 to cover her living expenses when Kajita was struggling to find graduate school funding.

    Blower traces the charge of failing to hold exams to a September 2002 qualifying exam she postponed to attend the funeral of a close family friend in San Francisco.

    Blower does admit to sending “rude e-mails” to members of her department but says she wouldn't have done so if they had responded to her requests for financial information and for room scheduling so that she could hold classes.

    Blower joined the biomathematics department in 2000 after she and her husband struck a deal with UCLA. The university was aggressively recruiting Freimer, who said he would come only if his wife were also offered a tenured position. Blower says she was made to feel “invisible” by a department she had hoped to shake up, for example, by boosting the profile of its graduate program.

    “There is a huge other side to this story; unfortunately, I can't divulge any of that,” says David Meyer, senior associate dean for graduate studies at UCLA's medical school and one of those who filed a complaint against Blower.

    Now, Freimer says that “without a doubt” he will leave UCLA if Blower is terminated. Even if she's found not guilty, he says, “my feelings about the place are so negative” that he might depart anyway.

  10. PHYSICS

    High-Energy Physics: Exit America?

    1. Charles Seife

    Budget cuts and cancellations threaten to end U.S. exploration of the particle frontier

    Monday, 7 February, was a grim day for the Fermi National Accelerator Laboratory (Fermilab). “You wake up, you go to a presentation, and you find out you're dead,” says Fermilab physicist Joel Butler. Butler is co-spokesperson of an experiment known as BTeV—a multimillion-dollar project that would allow scientists to study the properties of the bottom quark. But that Monday, when the new Secretary of Energy Samuel Bodman took to the podium to announce the department's budget request for 2006, BTeV scientists were horrified to discover that their project had been canceled.

    The decision—which is unlikely to be reversed by a Congress that doesn't have extra money to spend—sent ripples throughout the high-energy physics community. BTeV was the only planned project to study the physics of heavy fundamental particles at Fermilab, which is rapidly becoming the last of what was once a handful of U.S. labs devoted to the study of high-energy physics. Even under the most sanguine projections, the chances are good that no traditional accelerator experiments will be running on U.S. soil after 2010. And if a new linear collider that the Department of Energy (DOE) is gambling heavily on never materializes, the Nobel-filled record of U.S. achievements in high-energy physics could be consigned to history.

    “The U.S. program is very weak looking to the future,” says Michael Witherell, the outgoing director of Fermilab in Batavia, Illinois. “It's something we have to think very hard about: Is the U.S. getting out of that game?”

    Heavy reality

    Just as microbiology has its microscopes, high-energy physics has its accelerators. And the bigger the machine, the better physicists can see into the subatomic world.

    Particle accelerators are machines that turn energy into matter. Using powerful magnetic fields, they force subatomic particles such as electrons or protons to move faster and faster until they approach the speed of light. When those particles smash into a target, they dump that energy in a sudden flash—and, in that instant, particles leap into existence out of the vacuum, born of the pure energy of the collision. As those particles interact and decay, they leave behind a shower of debris. Physicists root through that debris to figure out what, precisely, took place; the curling and branching trails of particles skittering away from the collision reveal the nature of the exotica that were brought to life for a fraction of a second.

    Swan song?

    Fermilab's Tevatron, due to shut down around 2010, could be the last large particle accelerator in the United States.


    But the exotica you can create are limited by the amount of energy your accelerator can dump into a small space. (In fact, high-energy physicists describe the mass of particles with units of energy: MeV, millions of electron volts.) Broadly speaking, the more powerful your machine, the heavier and more exotic the particles you create and the deeper you look into the laws that govern matter and the forces of nature.
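
    As a back-of-the-envelope illustration of that convention (my own, not from the article), converting a particle's mass to an energy via E = mc² recovers the 938 MeV quoted below for the antiproton, whose mass equals the proton's.

    ```python
    # Mass-energy bookkeeping: proton mass in kilograms -> rest energy in MeV.
    proton_mass_kg = 1.67262192e-27      # CODATA proton mass
    c_m_per_s = 2.99792458e8             # speed of light in vacuum
    joules_per_MeV = 1.602176634e-13

    rest_energy_MeV = proton_mass_kg * c_m_per_s**2 / joules_per_MeV
    print(f"{rest_energy_MeV:.0f} MeV")  # -> 938 MeV
    ```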

    In the mid-1950s, the building-sized Bevatron accelerator at Lawrence Berkeley National Laboratory in California led to the discovery of the antiproton (938 MeV). By the 1970s and 1980s, accelerators no longer fit within a single building. Such an enormous accelerator at CERN, a high-energy physics laboratory created outside Geneva in the 1950s to pool Europe's scientific resources, enabled scientists to spot the W and Z particles, carriers for the weak force that weigh in at about 80,000 MeV and 90,000 MeV respectively. In 1995, Fermilab's Tevatron, roughly 1000 times more powerful than the Bevatron, discovered the top quark (174,000 MeV). And the biggest accelerator of all—the 87-kilometer proton-proton smasher called the Superconducting Super Collider—was killed off in 1993 while still under construction in Texas.

    Although these projects were the flagship “discovery” experiments of particle physics, there were others that didn't rely on brute force. By looking at how particles (such as B mesons) interact at slightly lower energies, scientists can infer properties of higher-mass particles—even if they have yet to be discovered. These two types of projects and other high-energy experiments have led to a very effective description of the fundamental components of matter and the forces that affect them: the Standard Model.

    But the Standard Model is incomplete, and high-energy physicists believe that they are on the edge of two major discoveries that will bust it wide open. The Large Hadron Collider (LHC) at CERN, the biggest accelerator in history, will come on line in 2007 or so. For a variety of reasons, physicists believe that it may well spot a particle known as the Higgs boson, a particle that will expand the Standard Model to explain why particles have mass. Many scientists also believe that the LHC will spot a “supersymmetric” particle, the first of a whole class of new fundamental particles that lie beyond the Standard Model—and that may be responsible for most of the matter in the universe. When it does, physicists hope to use the next great accelerator under consideration, the International Linear Collider (ILC), to zero in on those fresh discoveries and give theorists the ability to extend the Standard Model to a truly all-inclusive theory of matter (Science, 21 February 2003, p. 1171).

    Yet even as high-energy physicists anticipate great discoveries in the next decade, the high-energy physics budget in the United States is dropping. This year, DOE, which funds the vast majority of high-energy physics in the United States, requested $716 million for high-energy physics, a decline of about 3% from the previous year. The cut follows years of stagnant budgets that have left labs with barely enough funding to run existing experiments, much less start new projects. “We're running at a lower level than we'd like,” says Ray Orbach, head of DOE's Office of Science. “How can we run the current facilities, support current people, and at the same time have a future? That's the central question.”

    That future looks bleak. “There is no new money,” says Robin Staffin, head of DOE's office of high-energy physics, who says that there is insufficient funding to start new projects. “Any new initiatives will have to come from redirection.”

    Next big thing.

    The Large Hadron Collider in Geneva should ensure European dominance of high-energy physics.


    A new direction

    That redirection caught BTeV scientists off guard. “I was surprised by this,” says physicist Sheldon Stone of Syracuse University in New York, co-spokesperson for BTeV. “There was no advance information to either us or to the Fermilab group.” Even Fermilab Director Witherell says that he didn't know about BTeV's cancellation until the day of the speech. “The first I found out was when I downloaded the budget from the Web site [that morning],” he says. “It was a shock: There was nothing to replace it, and $20 million was cut from the budget.”

    “It's looking very difficult in the U.S.,” says Roger Forty, deputy spokesperson of LHCb, a similar B-physics experiment that will start up at CERN when the LHC turns on. “From a global perspective, it's a pity.”

    According to DOE officials, BTeV had to die. “We did not see in our budget how to accommodate this,” says Staffin. Orbach agrees that DOE had no alternative but says the decision saddened him: “It's a downer.”

    The decision to cancel BTeV weakens the program at Fermilab, which will soon be the last remaining high-energy physics laboratory in the United States. Brookhaven National Laboratory in Upton, New York, has shifted its focus to nuclear physics, although its plans to host a heavy-particle project known as RSVP recently took a hit after an apparent jump in its projected cost (Science, 18 February, p. 1022). The Stanford Linear Accelerator Center (SLAC), which is currently using a collider to produce B mesons, will shut down its B factory in 2008 or so to focus on generating x-ray beams for studies of chemical bonding and other high-speed phenomena on the molecular scale. “Fermilab is the future of high-energy physics in the United States,” says Orbach.

    Yet Fermilab's future direction is uncertain. The Tevatron will also likely be shutting down in 2010 or so. Unless a new project comes along, after that date the laboratory will not be using its equipment to study quarks at all. And until the ILC turns on—if it ever does—Fermilab and the United States will be out of the traditional high-energy physics game. “This is the first time in my memory that there is nothing in line, no major items of equipment” being requested, says Witherell.

    Given those realities, Orbach's pledge that DOE “will continue to support Fermilab” is less than reassuring to high-energy physicists. The support DOE has in mind would require moving away from Fermilab's traditional strengths—the physics of quarks and other heavy fundamental particles—toward areas such as neutrino physics.

    In 2002, Fermilab scientists began running MiniBooNE, smashing protons from a Fermilab accelerator ring into a target. The resulting neutrinos are steered to a nearby detector in hopes of learning some of the fundamental properties of those nearly massless particles. Last month Fermilab launched NuMI/MINOS, in which a similar setup sends neutrinos to a detector in Minnesota (Science, 11 March, p. 1543).

    Wild card.

    An American site for the International Linear Collider would give the U.S. a stake in future experiments—if the ambitious project ever gets built.

    Although the scientific community is excited about the new experiments—and two or three potential follow-ons to NuMI/MINOS—the shift from studying quark flavors and heavy particles to neutrinos has been an uncomfortable one for many Fermilab physicists. Even the neutrino scientists at Fermilab have their qualms. “This lab used to do so many kinds of physics,” says Deborah Harris, a physicist working on the NuMI/MINOS experiment. “It's strange to put all your eggs into one basket. It's a very good basket and an important basket, but it seems strange to focus so narrowly.”

    Fermilab has not yet abandoned studying quarks and other heavy particles, and the Tevatron will likely run until the end of the decade. “We're going flat out,” says Orbach about what he calls a “very vibrant” research program at the Illinois lab. “We're going to leave the Tevatron a burning hulk when we finish with it.” Scientists at Fermilab and elsewhere in the United States are also collaborating on planned LHC experiments that will provide fresh opportunities for studying heavy particles. Furthermore, observing cosmic rays and other high-energy phenomena in the heavens might give scientists an indirect way of understanding fundamental particles, as would mineshaft experiments to find exotic dark matter.

    Nevertheless, these experiments are not nearly as far-reaching (or expensive) as the accelerator-based experiments in which the United States excelled for so many decades. Combined with the new emphasis on neutrinos, DOE is steering Fermilab—and the high-energy physics community in the United States—away from its traditional strengths in accelerator physics and high-energy experimentation. “The thing I most worry about is that we're allowing an important line of our physics to atrophy because we can't afford it,” says Witherell. “It's an area that the U.S. has always been a leader in. That's a problem.”

    To remain active, Fermilab's Butler says, “a large number of U.S. physicists at the Tevatron are already planning to work at the LHC; they have exit strategies.” But Butler isn't happy about the new venue. “This field is being outsourced,” he says.

    The one big hope for U.S. accelerator physics is the ILC. “We're going to go for the linear collider,” says Orbach. If based in the United States, the collider would not only give high-energy physicists a machine to explore Higgs and supersymmetry physics, it would also prevent a hemorrhage of heavy-particle physicists overseas. That's an appealing prospect to federal politicians. “We want the best minds in the world coming here and not going elsewhere. That's all to our benefit,” says Speaker of the House Dennis Hastert (R-IL), who attended the recent start-up of the NuMI/MINOS experiment at Fermilab. But the leader of the majority party in the U.S. House of Representatives isn't ready to make a firm commitment. “If [the ILC] fits within certain parameters, we'd like to keep it in the U.S.,” Hastert says.

    Dark forecast.

    Outgoing Fermilab director Witherell wonders whether the U.S.'s “very weak” future program marks the end of the line.


    The biggest of those parameters is the cost, estimated by DOE at $12 billion, of which the host country would presumably pick up half. “Now we have a unified program,” says Orbach. “The problem is that it's too expensive.” DOE might be able to handle a project of half that size, Orbach says, but the probability of joining a $12 billion project is slim.

    The odds are better than that, says physicist Barry Barish of the California Institute of Technology in Pasadena, who heads the ILC design group. He doesn't accept DOE's projected price tag. “There's no way you can get me to talk about cost” until the design group completes some preliminary studies, he says. “But I don't buy $12 billion.”

    For Butler and other physicists, the projected completion date for the ILC in the middle of the next decade is another huge obstacle. “The schedules put up are frankly incredible and don't do justice to the effort,” Butler says. But a timetable that puts the ILC at the end of the next decade or beyond would leave an entire generation of physics students without access to an accelerator in the United States.

    All or nothing

    In an attempt to make the best of a bad situation, high-energy physicists are reprioritizing their projects—again. The High Energy Physics Advisory Panel (HEPAP), which counsels DOE and the National Science Foundation, will apparently be resurrecting its Particle Physics Project Prioritization Panel—a subcommittee that disbanded at the end of 2003 after presenting its recommendations (which included strong support for BTeV). The National Research Council is busy preparing a report, Elementary Particle Physics in the 21st Century, that will do a similar job from a broader perspective (as many of the committee members are from outside particle physics). The goal is an attractive, unified program that lawmakers will be able to fund.

    But these efforts may be moot if the budget situation doesn't improve. “What use is a 20-year outlook if you can't build anything?” Orbach rhetorically asked HEPAP physicists, who were likely wondering the same thing. Having priorities is not equivalent to having a budget, he adds, although such a list may improve DOE's chances of getting some of its projects funded.

    Some high-energy physicists fear that their problems run far deeper, however. Perhaps, they speculate, the United States no longer considers high-energy physics very important, and comments by DOE officials provide little comfort. “Where society goes, the budget also goes,” says Staffin. A major discovery—like the Higgs or supersymmetric particles—could win back public support, they say. But will U.S. labs be ready to respond?

    Most labs, like SLAC and Brookhaven, will have stopped research in the field. Fermilab will be concentrating on a neutrino physics program and some smaller projects, with all its study of heavy fundamental particles taking place overseas. And if Congress doesn't make room in DOE's budget for the International Linear Collider, then there may be no active U.S. high-energy physics program to take advantage of a breakthrough should it come. “It's a gamble,” Orbach admits about the road DOE is taking in high-energy physics. “We're going for broke.”

  11. ECOLOGY

    Taking the Pulse of Earth's Life-Support Systems

    1. Erik Stokstad

    A massive effort to document the state of ecosystems—and their ability to provide food, comfort, and other services—lays out some grand challenges, but no easy answers

    The plan was nothing if not ambitious: assess the state of ecosystems across the entire planet, from peat bogs to coral reefs. Rather than solely chart pristine habitats and count species, as many surveys have done, the $20 million Millennium Ecosystem Assessment (MA) put people and their needs front and center. At its core was the question: How well can ecosystems continue to provide the so-called services that people depend on but so often take for granted? These include not just the food and timber already traded on international markets but also assets that are harder to measure in dollar values, such as flood protection and resistance to new infectious diseases.

    In another novel approach, the assessment simultaneously examined this issue across a huge range of scales from urban parks to global nutrient cycles. The goal in each case, says director Walter Reid, was to offer policymakers a range of options that might help ecosystems recover or improve their role in providing for human well-being. The participants tried to make clear the tradeoffs involved in managing land; some methods of boosting crop yield, for example, exact a long-term price of degraded soils and incur downstream consequences such as fisheries stunted by fertilizer runoff. What's good for people in one region may cause harm in another place or time.

    The project didn't aim to generate new data but to analyze and synthesize published, peer-reviewed research; even that was easier said than done. Some 1360 scientists from 95 countries spent countless hours poring over the literature, hashing out disagreements at meetings, and writing the first volumes of reports that will ultimately number some 3000 pages. In the process, they had to nail down nebulous terms such as biodiversity and cultural services and deal with the headaches of data gaps. “Doing a global assessment of ecosystem services poses huge problems; it really pushes the envelope,” says an apparently indefatigable Reid.

    Fragile.

    Drylands are some of the most delicate ecosystems in the world and face increasing demands, which could threaten efforts to fight poverty.

    CREDIT: WOLFGANG KAEHLER/CORBIS

    There's broad agreement that the envelope has been filled with a valuable status report, the first overview of which was released this week. “It is a magnificent achievement,” comments Stuart Pimm, an ecologist at Duke University in Durham, North Carolina, who reviewed parts of the effort. “There will be a ‘Wow!’ when people see this.” But will it actually influence policy? Most say it's much too early to tell. Participants argue that the bottom-line focus boosts the odds for action. “This will make it much more relevant for policymakers,” says Peter Bridgewater, secretary general of the Ramsar Convention on Wetlands, who was on the MA's board. “It's not just scientific dancing on the head of the pin.”

    Widespread degradation

    The MA's first report,* a summary of the major technical reports to follow, identifies three main problems with how humans are managing ecosystems. Topping the list is widespread abuse and overexploitation of resources. Although some ecosystems have yielded more and more goods—principally fish, livestock, and crops—their integrity, and the productivity of many more, have been compromised. Of the 24 kinds of services described by the MA, 60% are being degraded, the report found. “We're undermining our ecological capital all around the world,” says Robert Watson, chief scientist of the World Bank and a lead author.

    Second, and a bit of a surprise to some, the degradation is probably hiking the risk of sudden, drastic changes in ecosystems, such as the collapse of fisheries or the emergence of new diseases from fragmented forests (see box, below). “When we started, that was not something that scientists might have listed as a top conclusion,” Reid says.


    Finally, the pressure on ecosystems is disproportionately harming the poor. That's particularly true in the drylands of sub-Saharan Africa, central Asia, and Latin America, where a third of the world's population tries to make do with 8% of its fresh water. The report argues that healthy ecosystems are key to alleviating poverty and meeting other objectives in the U.N.'s Millennium Development Goals. “The rationale is that only by protecting and restoring the ecological function will you be able to adequately address hunger and poverty,” says Jane Lubchenco of Oregon State University, Corvallis, who co-chaired a report to business and industry.

    Another troubling trend is the intense disturbance of global nutrient cycles. Excessive use of nitrogen fertilizers has inadvertently led to the eutrophication of estuaries, algal blooms, massive fish kills, and contamination of groundwater. “The impacts are very large, with direct and significant consequences for human well-being,” Reid says.

    The report documents numerous other examples of how humans create problems when they try to wring more out of an ecosystem. Farming shrimp degrades water quality; cutting down forests to make charcoal worsens flooding downstream; rice paddies, cows, and slash-and-burn agriculture all pump carbon into the atmosphere, changing climate in ways that could ultimately hurt farmers. “The key point is that not everything is win-win,” Watson says. The harm can stretch down a river or across the ocean, as when dust from degraded African soils worsens air quality in North America. And it can pass from generation to generation, as when actions today—depleting soil, for instance—compromise the ability of ecosystems to deliver goods and services in the future.

    The severity of the problems becomes more tangible when costs are tallied in dollars. The MA cites a litany of examples including $2 billion spent to help communities recover from the crash of the Newfoundland cod fishery in the early 1990s and more than $70 billion worth of damage in 2003 from floods, fires, drought, and other disasters. Valuing the benefits of ecosystems, on the other hand, is still in its infancy. The MA didn't try to make such estimates itself but notes studies such as a 1998 estimate of $346 million in benefits from protecting water quality in the Catawba River, North Carolina.

    Nor does the MA recommend particular solutions to broad environmental problems. Instead it lays out four visions of the future (see sidebar) that might result from various kinds of policies, such as changing the nature of agricultural subsidies to promote land conservation or investing heavily in green technologies. Effective action will be required, the report urges, and soon: Even though world population is expected to stabilize by 2050, the MA predicts that the challenges will be heightened by climate change and ever-more-voracious demand for resources.

    Service.

    Walter Reid spent 6 years leading a dozen staff and 1300 volunteers.

    CREDIT: MILLENNIUM ECOSYSTEM ASSESSMENT

    Measuring the intangible

    The idea for the global assessment was hatched in 1998 at the World Resources Institute (WRI) in Washington, D.C., where Reid and others were interested in taking a look at the state of the world's ecosystems at the turn of the millennium. Lingering in their minds was the fate of a previous overview, called the Global Biodiversity Assessment (GBA), that had gone essentially unnoticed after its release in 1995 (Science, 8 September 2000, p. 1676).

    They quickly decided that the way to make a bigger impact would be to focus on goods and services. “You can talk about how ecosystems operate to policymakers, and their eyes glaze over. But if you mention services, they perk up,” says Hal Mooney of Stanford University, a lead MA participant.

    Watson, a veteran of many assessments—including the GBA—was deeply involved with the much more successful Intergovernmental Panel on Climate Change (IPCC) and decided to take a page from its design, which included winning broad support for the project from the get-go. Watson and Reid approached their target audiences, such as the Convention on Biological Diversity, and involved them on the board and in the design of the project. Various “stakeholders” chimed in on what would be of value to them.

    Another key decision was to combine the big picture with many local assessments, ranging in scale from Peruvian villages to all of southern Africa. The reason for including targeted studies, says Reid, is that some ecosystem problems and services, such as the cultural benefits of green space, are local, and decision-makers need specific information. For example, the Southern Africa assessment, the first of 17 such studies to be published, found that the region as a whole has an adequate supply of fuelwood but that several areas face severe shortages. At the same time, decision-makers need to know what's going on globally, because decisions at a larger scale (such as government intervention in the international timber market) affect local communities.

    Over the course of a year, a committee of about 90 scientists hashed out a conceptual framework: a box-and-arrow diagram that laid out the components of ecosystems and how to assess the services they provide. Definitions were ironed out for terms such as biodiversity (“the variability among living organisms from all sources … and the ecological complexes of which they are a part”) and human well-being (“a combination of physical and social factors including shelter, health, freedom, and the ability to provide for children”). “Getting everyone to agree on definitions gets rid of the tower of Babel phenomenon,” says Shahid Naeem of Columbia University in New York, enabling ecologists to collaborate with anthropologists and economists, for instance.

    In an attempt to bring clarity to the nascent field of ecosystem services, the committee divided benefits into four categories: provisioning, regulating, cultural, and supporting (see box, below). Mooney says this methodological framework, laid out in a 2003 book, has been adopted widely, for example, by an international biodiversity consortium called Diversitas in Paris, France. This standardization should facilitate comparisons of ecosystem services between countries, notes Les Firbank of the Centre for Ecology and Hydrology in Lancaster, U.K., who was not involved.
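    Because the classification is the framework's backbone, it is easy to state concretely. The sketch below renders it as a simple lookup table; the four category names follow the MA framework, but the example services listed are drawn loosely from this article rather than from the MA reports themselves.

    ```python
    # Illustrative sketch of the MA's four-way classification of ecosystem
    # services. Category names follow the MA framework; the example services
    # are drawn loosely from this article, not from the MA reports.

    SERVICE_CATEGORIES = {
        "provisioning": ["food", "timber", "fresh water", "fuelwood"],
        "regulating": ["flood protection", "climate regulation",
                       "resistance to new infectious diseases"],
        "cultural": ["ecotourism", "benefits of urban green space"],
        "supporting": ["nutrient cycling", "soil formation"],
    }

    def category_of(service):
        """Return the MA category for a named service, or None if unlisted."""
        for category, services in SERVICE_CATEGORIES.items():
            if service in services:
                return category
        return None

    print(category_of("flood protection"))  # -> regulating
    ```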


    When the actual assessment got under way in April 2001, participants soon found themselves struggling with a critical lack of data. For example, remote-sensing data cover most of the globe, but an accurate description of land cover could be extracted only for regions where scientists had verified the satellite data on the ground. The whole exercise “gives us an understanding of how much data we don't have—and the really poor quality of data that we do have,” says MA board member Jeffrey Tschirley of the U.N. Food and Agriculture Organization (FAO) in Rome.

    Then came the equally tricky task of trying to link the state of ecosystems to human well-being in a rigorous way. “That's a surprising and unfortunate gap,” Reid says. Most of the evidence is anecdotal, such as studies of the cost of replacing lost natural pollinators or pest-eating bats. And although it's well established how to measure the costs and benefits of marketed ecosystem services, such as timber, much less is known about how to do that for other benefits, such as the clean water provided by healthy watersheds.

    Technical reports from the four working groups will be published this year, as will syntheses aimed at the health and business sectors and the four sponsoring conventions. All will be available on the Web site http://www.maweb.org/.

    Going, going.

    Major ecological communities, or biomes, will be converted to farms and other uses over the next 50 years.

    SOURCE: MILLENNIUM ECOSYSTEM ASSESSMENT

    On the agenda

    Although the MA reports don't contain any breaking news for ecologists, participants say the MA's main accomplishment is a consensus document that emphasizes that human well-being depends on healthy ecosystems. They hope that, as happened with the IPCC, this consensus will help push the issue higher up the priority lists of nations.

    Beyond the rhetorical value, the collected data will likely be useful for national and local decision-makers, such as those calculating conservation subsidies to farmers or devising certification schemes for sustainable forestry. But that won't happen on its own, so key MA participants are trying to ensure that the report doesn't gather dust. “The outreach task for this is just huge, and we're just at the beginning,” says Stephen Carpenter of the University of Wisconsin, Madison, who co-chaired the scenarios report. Robert Scholes of the Council for Scientific and Industrial Research in South Africa says he's been briefing policymakers ever since the southern African assessment was published 6 months ago, targeting key ministers and giving two to three talks a month. He expects to keep up the pace for at least a year more.

    The convention leaders say that the MA will help them carry out their own work. In the Ramsar convention, for example, Bridgewater explains that the MA has better defined key terms in the wetlands treaty. Member nations had agreed to foster “wise use” of their wetlands and ensure that “ecological character” does not change; the framework has helped make that more explicit by focusing on long-term delivery of ecosystem services.

    Although scientists and policymakers have lauded the MA, some are frustrated at the limits of its conclusions. The MA finds that ecosystems are being degraded, for example, but it couldn't say anything specific about the pace of that degradation or what levels of use are sustainable. That uncertainty can make it tougher to implement policies that cause economic losses in the short term. The threat of various catastrophic ecosystem failures might make policymakers think twice, but no one knows how close society is to the brink. The report is “pretty anecdotal when it comes to thresholds,” says Richard Norgaard, an environmental economist at the University of California, Berkeley. That's not the MA's fault but more a reflection of the immature state of the science, he adds.

    What's needed to help design effective policy is more quantification and dollar values of ecosystem services, says FAO's Tschirley. “That's where there is a payoff; that's when it gets value for development and when things can be planned.”

    The MA was planned to provide a baseline for future assessments but not to organize them. Reid says that the working assumption is that if governments and the private sector find that round one was useful, they'll pony up for more. “If this really succeeds,” Reid says, “then people will want to follow the model.” Many scientists say that follow-ups will be key, as it took several rounds for the IPCC to work out its kinks. Follow-up assessments would also be a way to confirm trends in ecosystem services.

    Regardless, a key legacy is the effect on ecologists and other environmental scientists. “It's been a really big force for shifting mindsets in the science community,” says ecologist Gretchen Daily of Stanford University. “There's a whole community of people who are now abuzz with this effort.”

    • *“Millennium Ecosystem Assessment Synthesis Report,” http://www.maweb.org/

    • Ecosystems and Human Well-Being: A Framework for Assessment (Island Press).

  12. ECOLOGY

    Choose Your Own World

    1. Erik Stokstad

    What will ecosystems be like in 2050? That depends on decisions made today, so to give policymakers an idea of the consequences, a Millennium Ecosystem Assessment (MA) working group envisioned what life would be like under four types of broad policies. For each of these scenarios, they used standard computer models that predict changes in variables such as gross domestic product, water use, and food production. Where no models were available, they relied on expert opinion to assess the role of technological innovation, human health, and other factors.
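    In spirit, the quantitative side of the exercise amounts to running the same indicators forward under different policy assumptions. The sketch below is a deliberately toy version with invented growth rates and a rough starting figure; the MA working group used established global models, not anything this simple.

    ```python
    # Toy sketch of a scenario projection, in the spirit of the MA exercise.
    # Growth rates are invented for illustration; the starting figure is only
    # a rough order of magnitude for global water withdrawals around 2000.

    def project(start_value, annual_growth, years):
        """Compound a single indicator forward at a fixed annual growth rate."""
        return start_value * (1 + annual_growth) ** years

    scenarios = {
        "Global orchestration": 0.012,  # faster growth: trade-driven demand
        "TechnoGarden": 0.004,          # slower growth: efficient green technology
    }

    START_WITHDRAWALS = 3600  # km^3/year, illustrative baseline for 2000
    for name, rate in scenarios.items():
        value = project(START_WITHDRAWALS, rate, 50)
        print(f"{name}: ~{value:.0f} km^3/year by 2050")
    ```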

    Some observers were frustrated at the lack of quantification and reliance on expert opinion. But most say the scenarios are a useful thought exercise. “It's a way of motivating thinking and stimulating the imagination, even at national levels, about the choices involved in ecosystem changes,” says Jeffrey Tschirley of the U.N. Food and Agriculture Organization in Rome. “Based on decisions you make, you can create a different world.”

    CREDITS: JUPITER IMAGES

    “Order from strength.” The world, perhaps not quite as grim as Blade Runner's, has fragmented into regional markets and alliances. Nations are obsessed with security issues, and the tragedy of the commons deepens. Every category of ecosystem services takes a nosedive, and the developing world bears the brunt of the damage.

    “Global orchestration.” Free trade and a good heart reign in this scenario. There's little focus on preventing environmental problems but a strong emphasis on fighting poverty. The result is a huge boost in food and other provisioning services from developing countries. The costs are borne by regulating services, such as climate regulation, and by a loss of cultural services, such as ecotourism.

    “TechnoGarden.” Al Gore would love this world—globally connected with green technology abounding and a focus on preventing ecosystem problems. Food and other provisions rise, although they are not maximized. Climate change, floods, and disease are less of a worry. The downside is that biodiversity continues to decline.

    “Adapting mosaic.” The emphasis here is on low-tech, local solutions. Regional politicians and institutions focus on watershed-scale ecosystems to maximize benefits and prevent problems. If it catches on widely, it pays off: Every type of ecosystem service improves in both developing and industrialized nations.

  13. AMERICAN CHEMICAL SOCIETY MEETING

    Unnatural Amino Acid Could Prove Boon for Protein Therapeutics

    1. Robert F. Service

    SAN DIEGO, CALIFORNIA—In this biotech hotbed, biological chemistry was front and center at the 229th national meeting of the American Chemical Society from 12–16 March.

    Protein-based therapeutics are a bright spot for drug companies in troubled times. Their annual market is expected to surpass $50 billion by 2010. But proteins can suffer from problems not found with conventional small-molecule drugs. Some trigger immune reactions, and proteases and other compounds inside the body can quickly chop them up and clear them out. Cloaking protein drugs with a polymer called polyethylene glycol (PEG) can help hide them from the immune system. But it also makes some protein drugs less reliable because proteins sporting different numbers of PEGs may behave differently inside the body.

    At the meeting, researchers from a California biotech company showed that they could precisely control the number of PEGs on each molecule, and thus the behavior of protein drugs, by inserting an unnatural amino acid into proteins. “It's excellent work,” says Ryan Mehl, a chemist at Franklin & Marshall College in Lancaster, Pennsylvania, who also specializes in adding unnatural amino acids to proteins. Although such altered proteins are still far from the clinic, Mehl and others say that the new work may eventually enable researchers to tailor the medicinal properties of proteins much as they do with small-molecule drugs today.

    The new technique builds on more than a decade of work by Peter Schultz, a chemist at the Scripps Research Institute in La Jolla, California, and his colleagues. Schultz's team over the years has created a scheme for altering the basic chemistry of proteins (Science, 14 July 2000, p. 232). Virtually all organisms construct their proteins from 20 amino acids, although scores more exist. Those 20 amino acids are encoded by the four nucleotide bases that make up DNA. The bases form three-letter “codons”—64 possible combinations—each of which signals which amino acid should be inserted in a growing protein chain. Schultz's team pioneered an approach that hijacks one of those codons and makes it insert an unnatural amino acid instead. So far Schultz's lab has doctored proteins from Escherichia coli and other organisms to include some 50 different unnatural amino acids, which provide chemical handles the researchers can use to alter the proteins' chemistry.
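    To make the bookkeeping concrete, here is a minimal, purely illustrative sketch of translation with one codon hijacked in this way. In Schultz's published work the repurposed codon is the amber stop signal, UAG; the toy codon table below is truncated, and the sketch captures only the logic of the reassignment, not the engineered tRNA and synthetase machinery that makes it work in living cells.

    ```python
    # Minimal sketch of codon translation with one codon reassigned to an
    # unnatural amino acid. Illustrative only: in cells the trick is an
    # engineered tRNA/synthetase pair, not an edited lookup table.

    CODON_TABLE = {
        "AUG": "Met", "UUC": "Phe", "AAA": "Lys", "GGC": "Gly",
        "UAG": "STOP",  # the amber stop codon in the standard genetic code
        # ... the full table has 64 entries (4 bases in 3 positions)
    }

    # Hijack amber: a suppressor tRNA charged with p-acetylphenylalanine would
    # insert the unnatural residue wherever UAG appears in the message.
    CODON_TABLE["UAG"] = "pAcF"

    def translate(mrna):
        """Read an mRNA string three bases at a time and return the peptide."""
        peptide = []
        for i in range(0, len(mrna) - 2, 3):
            residue = CODON_TABLE.get(mrna[i:i + 3], "???")
            if residue == "STOP":
                break
            peptide.append(residue)
        return "-".join(peptide)

    print(translate("AUGUUCUAGAAA"))  # Met-Phe-pAcF-Lys
    ```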

    Hookup.

    Unnatural amino acid (red) added to human growth hormone (gold) helps researchers attach protective polymers (green).

    CREDIT: ANNA-MARIA HAYS/AMBRX

    In 2003, Schultz helped set up a new San Diego biotech company called Ambrx to take advantage of the new technology. At the meeting, Ho Sung Cho, Ambrx's director of molecular technology and process development, reported that he and his colleagues have made rapid strides in making precise modifications to human growth hormone (hGH), a protein used widely to promote growth in undersized children. The current PEGylated version of hGH, Cho and others note, is made by linking copies of PEG to some of the 11 lysine residues on the protein. Researchers have found that hGH with about four PEGs per molecule strikes the best balance of safety and effectiveness. In practice, however, drugmakers end up with a mishmash of hGH molecules with different numbers of PEGs attached at different sites.

    To get around this variability, Cho's team made 20 different versions of hGH, each of which had an unnatural amino acid called p-acetylphenylalanine inserted at a different site. That gave the analogs a unique chemical handle called a keto group, which standard amino acids lack. The researchers then linked a single PEG to each keto group and tested the compounds in cell cultures containing mouse and rat cells. Some of the variations destroyed hGH's effectiveness, but many did not. When the group tested six promising analogs in mice, all worked and lasted longer in the animals than the commercial version of the drug did. A single injection of the best variant, for example, showed the same efficacy after 1 week as daily injections of the commercial hGH. If the result holds up in people, it could not only reduce the number of injections needed for hGH patients but also give drug companies a new way to tailor-make protein-based drugs.

  14. AMERICAN CHEMICAL SOCIETY MEETING

    Nanofibers Seed Blood Vessels

    1. Robert F. Service

    SAN DIEGO, CALIFORNIA—In this biotech hotbed, biological chemistry was front and center at the 229th national meeting of the American Chemical Society from 12–16 March.

    Researchers have made heady progress in regenerating tissues such as cartilage and skin, which either don't require an extensive blood supply or are thin enough to tap into nearby blood vessels. They've had far more trouble regenerating thick tissues such as heart muscle that require a blood supply throughout. But nanotechnology may soon provide some help.

    At the ACS meeting, chemist Sam Stupp of Northwestern University in Evanston, Illinois, reported that his team has developed a novel variety of self-assembling nanofibers that strongly promote the growth of new blood vessels both in cell cultures and preliminary animal tests. “It's preliminary, but I thought it was the most interesting talk I heard at the meeting,” says Harvard University chemist George Whitesides. “It could be the beginning of something genuinely important.”

    Stupp and his Northwestern colleagues have been perfecting their self-assembling nanofibers for years. A year and a half ago, the group reported making nanofibers that promote the regrowth of nerve cells in rats (Science, 3 October 2003, p. 47). Before that, the team had used their nanofibers to promote the growth of hydroxyapatite crystals that form a primary component of bone (Science, 23 November 2001, p. 1635).

    In each case, the group starts by crafting two-part molecules called peptide amphiphiles (PAs) that contain oily hydrocarbon chains linked to water-friendly peptide groups. In water the hydrocarbons naturally clump together, but negative charges on the peptides repel one another and keep the molecules apart. By sprinkling positive ions into the solution, however, the researchers can counter the peptides' mutual repulsion, allowing the oily hydrocarbon tails to pack together into nanofibers with the peptides facing outward.
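    Why does a sprinkle of ions tip the balance? In water, the electrostatic repulsion between the charged peptide head groups acts only over the Debye screening length, which shrinks as ionic strength rises. That is textbook colloid physics rather than anything specific to Stupp's system; the sketch below uses the standard approximation for a monovalent salt in water at 25°C, with illustrative concentrations.

    ```python
    # Sketch of Debye screening, the textbook effect by which added ions damp
    # electrostatic repulsion between charged groups in water. For a monovalent
    # salt at 25 C, the screening length is ~0.304 nm / sqrt(I), with I in mol/L.
    # The concentrations below are illustrative, not Stupp's reported values.

    import math

    def debye_length_nm(ionic_strength_molar):
        """Approximate Debye screening length in water at 25 C, in nanometers."""
        return 0.304 / math.sqrt(ionic_strength_molar)

    for conc in (0.001, 0.01, 0.1):
        print(f"{conc} M salt -> screening length ~{debye_length_nm(conc):.1f} nm")
    # As salt rises from 1 mM to 100 mM, screening drops from ~10 nm to ~1 nm,
    # short enough for the oily tails' attraction to win and fibers to assemble.
    ```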

    Networking.

    Nanofibers injected into tissue.

    ILLUSTRATION: MARK SENIW

    In the new study, Stupp—working with graduate student Kanya Rajangam and John Lomasney, a pathologist at Northwestern University School of Medicine in Chicago—searched for peptides that promote the growth of blood vessels, a process called angiogenesis. Other researchers had discovered that the crucial biochemical players in the process include heparin, a sulfated polysaccharide. Heparin binds to two protein growth factors, fibroblast growth factor (FGF) and vascular endothelial growth factor (VEGF), which work together to promote angiogenesis. Several groups had examined peptides that bind heparin, and each seemed to contain a short sequence of eight critical amino acids. So the Northwestern researchers synthesized PAs carrying this eight-amino-acid motif. When they added the PAs to a solution containing heparin and the growth factors, they found that, as expected, the PAs assembled into nanofibers that bound heparin, which in turn bound FGF and VEGF.

    In cell cultures containing endothelial cells from cow lungs, Stupp's team quickly saw the formation of tubular structures resembling capillaries. When they injected PA-containing liquid into a damaged region of rat corneal tissue, they saw a rapid formation of blood vessels. By contrast, few blood vessels formed in control animals injected with a collagen gel containing heparin, FGF, and VEGF. Even more enticing, when they injected liquid containing the PAs into the damaged heart tissue of rats, they noticed a “dramatic” improvement in the animals' electrocardiograms after 7 to 10 days, Stupp says.

    Stupp suspects that the nanofibers work so well because they possess a very high density of sites that can bind heparin and the other compounds, and they may also slowly release these proteins to stimulate nearby cells as they break down. Whatever the explanation, if the success continues, angiogenesis-promoting nanofibers may one day help restore healthy function in patients who have had heart attacks and even create a new blood supply for a wide range of engineered tissues.

  15. AMERICAN CHEMICAL SOCIETY MEETING

    Fast, Sensitive Scan Targets Anthrax

    1. Robert F. Service

    SAN DIEGO, CALIFORNIA—In this biotech hotbed, biological chemistry was front and center at the 229th national meeting of the American Chemical Society from 12–16 March.

    In mid-March, automated anthrax sensors triggered shutdowns at three separate military mail-sorting facilities in the Washington, D.C., area. All reopened within a couple of days, after further tests declared them safe. But the conflicting field and laboratory results underscored a common frustration for biowarfare officials: Field tests are fast but typically have either low sensitivity for finding anthrax or a high rate of false alarms, whereas lab tests are more reliable but can take days to yield results. Now there's new hope for a field test that's both fast and accurate.

    At the meeting, chemist Richard Van Duyne of Northwestern University in Evanston, Illinois, reported that he and his students have created a portable laser-based anthrax detector that can turn out highly reliable data within minutes. Van Duyne says the instrument, about the size of a pack of cigarettes, can detect just 2550 anthrax spores, well below the level believed to trigger an infection in humans. And he says its sensitivity could well improve another 10-fold. “They have a phenomenal detector,” says Chad Mirkin, a chemist and anthrax-detection expert at Northwestern who is not affiliated with Van Duyne's work.

    Spotting tiny amounts of anthrax and other potential biowarfare agents has long been problematic. But in the wake of five anthrax-related deaths in 2001, federal agencies have poured money into research aimed at spotting the pathogen. Nine different technologies are now being tested, but all come with tradeoffs. For example, fluorescence detection—in which a laser shining light on a suspect sample causes different compounds to fluoresce at different wavelengths—is fast, portable, and sensitive, but it has the potential to give false positive readings.

    Postal strike.

    Contaminated mail and false alarms both have spurred demand for better field tests for anthrax.

    CREDIT: AP PHOTO/DANIEL HULSHIZER

    Van Duyne has long specialized in another laser-based detection technology called surface-enhanced Raman spectroscopy (SERS). Instead of tracking fluorescence, SERS tracks the shift in wavelength that occurs when photons hit the sample and lose some of their energy to the vibrations of its molecules before being scattered back. Other groups have previously used SERS to detect anthrax, Van Duyne notes, but those tests never approached the ability to detect a mere 4000 or so spores, the number thought to be capable of starting an infection.

    SERS works when laser light shines on a specially roughened surface of either silver or gold. The light intensifies the electric field surrounding tiny bumps on the surface. Any molecules sitting atop the surface experience the heightened electric field, which makes them scatter more light. By comparing the intensity and wavelengths of the scattered light to known reference materials, researchers can determine the chemical identity of compounds in a sample.
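    The arithmetic behind a Raman measurement is simple: the shift is the difference between the reciprocal wavelengths of the incident and scattered light, conventionally quoted in wavenumbers (cm⁻¹). The sketch below uses illustrative wavelengths, not the actual laser settings of Van Duyne's instrument.

    ```python
    # Minimal sketch of the Raman-shift arithmetic. Wavelengths are illustrative;
    # they are not the actual settings of Van Duyne's detector.

    def raman_shift_cm1(incident_nm, scattered_nm):
        """Raman shift in wavenumbers (cm^-1) from wavelengths given in nm."""
        return 1e7 * (1.0 / incident_nm - 1.0 / scattered_nm)

    # A 785 nm laser line scattered out at 853 nm corresponds to a shift of
    # about 1016 cm^-1, in the general region of the band used to flag
    # calcium dipicolinate.
    print(f"{raman_shift_cm1(785.0, 853.0):.0f} cm^-1")
    ```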

    To detect anthrax, Van Duyne and colleagues look for the spectral signature of calcium dipicolinate, a compound that makes up part of a protective layer around anthrax spores. In tests, they dunk an anthrax sample in nitric acid for 10 minutes and spritz some of the solution onto the metal plate of the detector. In an earlier version of their sensor, the researchers roughened the surface by layering a silver film over plastic spheres of assorted sizes in the several-hundred-nanometer range. Because the wavelength of the laser light is optimal only for spheres of one particular size, however, the SERS enhancement was modest.

    For their current study, Van Duyne and his graduate students—Xiaoyu Zhang, Matthew Young, and Olga Lyandres—carefully controlled the size of their spheres. Starting with a copper surface, they topped it with 300-nanometer plastic spheres and laid down a thin silver film on top, leaving behind a surface resembling the corrugations in a human brain. When the researchers sprayed on the anthrax extract and blasted the surface with laser light, the result was a 200-fold increase in detection sensitivity in a test that took less than a minute.

    The group's work also appears in the 30 March issue of the Journal of the American Chemical Society. The researchers are planning field tests with emergency officials in Evanston and elsewhere.
