News this Week

Science  20 Feb 2004:
Vol. 303, Issue 5661, pp. 1116


    Colwell Steps Down, Bement Named Acting Director

    Jeffrey Mervis

    The National Science Foundation (NSF) isn't used to the political limelight. But the $5.5 billion workhorse of basic academic research took center stage last week when its director, Rita Colwell, announced at a congressional hearing that she was leaving 6 months before the end of her 6-year term. Within minutes of her declaration, the White House heightened the drama by asking the director of another federal science agency to do double duty while it looks for a new NSF director.

    Colwell, a 69-year-old microbiologist who spent 26 years at the University of Maryland, was appointed to head NSF by President Bill Clinton in 1998. Last week, she said that she will be returning to academia—at both the University of Maryland and Johns Hopkins University—as well as heading up a new life sciences division of the U.S. subsidiary of Canon, the Japanese camera and copying giant. “I've enjoyed it immensely,” Colwell says about her 5 1/2 year tenure at NSF. “It's the best science agency in the world. But this is an exciting opportunity.”

    The vacancy at NSF will be filled, effective 21 February, by Arden Bement, director of the National Institute of Standards and Technology (NIST). The 71-year-old materials engineer has headed NIST for the past 2 years, following a distinguished career in academia, government, and private industry. Although Bement will continue some duties at NIST, he plans to work “full-time” as acting NSF director. He adds that he's taking the job “with mixed feelings” (see sidebar) and only because “the White House has assured me that it is to be a short-term appointment.” Although Bement says the search for a permanent replacement “is already under way,” most observers put long odds on a successor being nominated and confirmed before the November elections.

    Science policy leaders in Washington were surprised by the elevation of an outsider, rather than NSF Deputy Director Joseph Bordogna, to the top post. It is traditional for the deputy, also a presidential appointee, to fill any gaps in leadership. Bordogna, an electrical engineer, came to NSF in 1991 and was appointed deputy director in 1999. “I don't understand it,” says former NSF Director Erich Bloch, who is the only NSF director in the past 40 years to complete his term. “It's never happened before.”

    Bowing out.

    Rita Colwell says she's leaving “the best science agency in the world.”


    The White House declined to comment on the NSF changes, issuing a one-sentence statement of its plans to name Bement. Bordogna says only that “I hope to continue to serve the foundation as deputy NSF director.”

    Although Colwell's departure had been rumored for weeks, she denied up to the end that she was about to leave (Science, 30 January, p. 606). Last week, however, after her appearance before the House Science Committee, which was reviewing President George W. Bush's 2005 budget request for science, she told reporters she had decided to join Canon last September and that her new employer had shown “extraordinary patience” in allowing her to remain until the release of the president's 2005 budget.

    Whatever the official explanation, many Washington observers suspected that Colwell gave her swan song during a budget hearing as a way to highlight her unhappiness with the president's NSF requests in the past few years. A nonbinding 2002 law sets NSF's spending on a 5-year doubling path, and earlier this month the National Science Board, which oversees the foundation, argued for a quadrupling over the same time period. The doubling seems completely out of reach now: Any attempt by Congress to beef up the president's proposed 3% growth rate for NSF in 2005, as it did with his 3% request in 2004, would have to buck pressure to curb domestic spending and reduce a looming half-trillion-dollar deficit.

    Securing a healthy budget is at the top of the list of challenges facing the next NSF director, says Warren Washington, chair of the science board. And it may become even more difficult now. Bement is capable of “holding down the fort,” says one congressional observer, but if NSF is to grow, it will need a leader with strong political skills and a strong vision of where the agency should be headed. “The director needs to have a good answer when the appropriators ask what the public will get for its money,” says Christopher Hill of George Mason University in Fairfax, Virginia. The next NSF director also should bring “strong intellectual leadership” to the job, says one current science board member. “But I don't know if that's on the agenda” for this Administration, the board member lamented.

    The leadership gap could create an opening for the science board to exercise more authority. Last year Congress gave it greater independence after a bruising battle with the director over the board's status (Science, 27 September 2002, p. 2187). But to wield that authority, the board needs to eschew national policy issues and concentrate on NSF, warns former presidential science adviser John Gibbons. “The board needs to narrow its focus if it wants to have any impact,” he says.


    NIST Struggles to Escape the Knife

    David Malakoff

    Even as Arden Bement heads to the National Science Foundation (NSF) as acting director, he's vowing to help shepherd his current agency through one of its worst budget crises in years. Officials at the National Institute of Standards and Technology (NIST) say they will have to lay off up to 100 scientists and technicians this year if they can't find creative ways to make up for a $20 million cut in the agency's 2004 research budget.

    NIST, which employs more than 3000 technical staffers mostly located on two campuses, in Gaithersburg, Maryland, and Boulder, Colorado, is no stranger to budgetary buffeting. Congress and the White House have often fiddled with its research accounts, which pay for everything from new fire-safety and timekeeping standards to materials science and even the creation of new kinds of matter (Science, 6 February, p. 741). A spending bill approved last month, however, left the agency reeling: Congress cut NIST's $352 million laboratories budget by about 6%, to $332 million. Nearly half the reduction—$9 million—came in the agency's general “research support” account, which dropped 15%, to $53.2 million. Other losers included NIST's external grants program and its internal math and materials science programs. And there's no money to begin a long-awaited $2.3 million effort to develop technology standards for new electronic voting machines.

    Heavy load.

    NIST Director Arden Bement says he'll be “full-time” at NSF.


    Agency administrators are still looking for ways to soften the blow. But Congress left little room to maneuver. “Everyone is focused on figuring out how we can operate under that number,” says a senior aide in the Department of Commerce's Technology Administration, which oversees NIST. “We're looking for savings everywhere,” and layoffs and early retirements “are realistically something we have to look at,” the aide says. That message has hurt morale among some NIST employees, particularly on the agency's Colorado campus. “We're on watch for the Grim Reaper,” says one technician, who worries that her position is in jeopardy.

    Bement, meanwhile, says he'll do what he can even while absent from NIST. “While I will have a full plate of responsibilities at NSF, I can and will continue to keep track of the ‘big picture’ developments at—and affecting—NIST,” he wrote in an 11 February memo to agency staff. And he notes that there is some good news: In part to make up for the 2004 cuts, the Bush Administration has proposed giving NIST's research account a hefty, 22% increase in 2005. Nervous NIST scientists are hoping Congress will go along.


    MRC Gets Flexible and Clinical in Funding Overhaul

    Fiona Proffitt

    OXFORD, U.K.—After months of soul-searching, the United Kingdom's main biomedical funding agency, the Medical Research Council (MRC), last week announced sweeping changes designed to placate Britain's frustrated biomedical research community. The changes include more flexibility in how long projects can run, and they come with a welcome sweetener: a 30% boost in the money available for research grants.

    MRC Chief Executive Colin Blakemore launched the review in the wake of an excoriating report last March from the U.K. Parliament's Science and Technology Committee. The report took the $550 million agency to task over everything from capricious funding decisions to inadequate consultations with the research community over plans for Biobank, a $70 million data repository on the genetics and lifestyle of the British population (Science, 28 March 2003, p. 1958). (Blakemore replaced former MRC chief George Radda last October.) “I'm pleased that [MRC has] recognized the need for change,” says committee chair Ian Gibson. “They're now beginning to respond to what MRC grant holders have been afraid to tell them.”

    New broom.

    MRC Chief Executive Colin Blakemore took over last October.


    Particularly unpopular is a program concocted 6 years ago to spur collaborative research. The Cooperative Group Grant scheme, funded to the tune of $44 million a year, alienated many scientists because its byzantine application procedure tended to exclude less-experienced researchers. The grants also spurred what Blakemore calls “synthetic collaborations” thrown together simply to tap that resource. The program “stultified good research by forcing people to make their research fit” the MRC scheme, says neuropharmacologist David Smith, deputy head of medical sciences at the University of Oxford. As part of a wider streamlining effort, the program will become an add-on to more traditional research grants.

    With flexibility as its new watchword, MRC is revamping its research grants program to allow for projects ranging from two to five or more years in duration. “This is an experiment … to escape the almost universally adopted view among funders that research comes in a package of either 3 or 5 years,” Blakemore says. Now, he says, researchers will simply ask for how much they need and for how long. There will be more to spend, too: MRC also announced a $57 million infusion of extra cash for research grants in 2004 and 2005, boosting this portion of its budget to $228 million. After enduring a funding shortfall in recent years, Blakemore says that MRC “anticipates there is a considerable pent-up demand out there.”

    MRC is also giving a long-neglected stepchild—clinical research—more attention in its funding schemes. “This is excellent news. There is an urgent need for more support for clinical research,” says Smith, who adds that MRC is taking a “quite visionary approach” to funding. The changes are set to take effect on 1 April.


    Big, Hot Molecules Bridge the Gap Between Normal and Surreal

    Charles Seife

    The quantum world and the classical world are getting harder and harder to distinguish. In a paper published this week, Austrian physicists show that there's a smooth transition between the outlandish behavior typical of the very small and the ordinary behavior of larger objects such as cells and baseballs and people.

    “I love it. It's superb and great,” says Wojciech Zurek, a physicist at Los Alamos National Laboratory in New Mexico. The excitement stems from a long-standing puzzle: why a small quantum-mechanical object such as an atom or a photon can do several seemingly contradictory things at the same time (take both a left path and a right path at a beam splitter, be polarized both horizontally and vertically, and be spin up and spin down), whereas familiar macroscopic objects are always one thing or the other (alive or dead, on the left side of a barrier or the right, prone or supine).

    In this week's issue of Nature, Anton Zeilinger and his colleagues at the University of Innsbruck describe a clever experiment that used medium-sized objects—cagelike carbon molecules known as buckminsterfullerenes—to probe the boundary where quantum behavior ends and classical behavior takes over. “We saw [the quantum behavior] disappear,” says Markus Arndt, an Innsbruck physicist and co-author of the paper.

    The experiment supports the theory that quantum-mechanical objects behave like classical ones once they interact with their environment—when they're measured or prodded or buffeted by air molecules, or when some other process extracts information about them. This extraction destroys the superposition of a quantum object such as an atom (say, its ability to be in two places at once), leaving behind a classical-ish atom that is either to the left or the right of the instrument. If this interpretation is correct, then the reason a small, cool object stays in superposition longer than a warm, large one is that less stuff is bumping into it, its molecules are dancing around less frenetically, and it's emitting less radiation—which carries information—into the environment.

    Feeling the heat.

    Laser-warmed C70 molecules lose their quantum characteristics as their energy level rises.

    SOURCE: L. HACKERMÜLLER ET AL., NATURE 427, 711 (2004)

    A number of groups have been trying to put this theory to the test, and in the past few years, Zeilinger's has been among the leaders. He and his colleagues work with an interferometer, a device that splits and recombines beams of particles. By persuading ever-larger molecules to take two paths at the same time, the researchers have shown that the boundary of the quantum realm doesn't lie between small atoms and relatively large molecules (Science, 25 May 2001, p. 1471). In their new experiment, they show how the boundary depends on temperature.

    The researchers took C70 molecules (70-carbon buckminsterfullerenes, enormous by quantum standards), heated them with a laser beam, and shot them through a series of gratings that essentially forced each molecule to choose among several paths. If a molecule takes several paths simultaneously, quantum-style, it interferes with itself to create a fringed pattern in a detector at the far end of the device. When molecules behave like classical objects, each chooses a single path, and the interference patterns disappear.

    By tuning the power of the laser, the team could make the C70 molecules as hot as 3000 kelvin or as “cool” as a mere 1000 kelvin. Sure enough, the hot molecules behaved like classical objects, the cool molecules behaved like quantum objects, and there was a smooth but rapid change from quantum to classical as the temperature increased, just as theorists expected (see figure). Arndt strongly suspects that the transition from quantum to classical happens when a warm C70 molecule radiates, emitting a photon and releasing information into the environment.

    The Innsbruck group hopes to push the boundaries between quantum and classical still further, but Arndt's dream of getting very large objects to act like quantum creatures is still far away. “There are many very difficult technical challenges of getting a virus beam,” he says.


    Imaging Studies Show How Brain Thinks About Pain

    Constance Holden

    When you see someone getting hurt, you flinch. And so does your brain. Indeed, when we empathize with another person's pain, we use many of the same brain areas that are activated by our own experience of pain, a new brain-imaging study on page 1157 has shown.

    Researcher Tania Singer of the Institute of Neurology at University College London, U.K., and her team set up an experiment using 16 couples who were romantically involved and presumed to be acutely sensitive to each other's pain. Keeping both partners in the same room, they put the female in a magnetic resonance imaging machine and watched her brain while a 1-second electric shock was delivered to the back of either her hand or her partner's. She could not see his face but could see from an indicator which one of them was going to be zapped and whether it would be a weak shock or a sharp, stinging one.

    When the woman received sharp shocks, well-known pain regions in the limbic system were activated, including the anterior cingulate cortex, the insula (which is involved in relaying information from the cortex), the thalamus, and the somatosensory cortices, which relay the physical nature and location of the pain. Many of the same regions were activated in subjects when their partners got the painful shock. Empathy alone, however, failed to activate some regions, such as the somatosensory cortices. The fact that the same affective brain areas respond to both experienced and imagined pain, claims Singer, is the root of empathy.

    Neuropsychiatrist Helen Mayberg of Emory University in Atlanta, Georgia, calls the study “brilliant.” Using a “very fundamental system like pain,” she says, the researchers have captured both sensory and emotional aspects of the experience and provided new insights on how they interact.

    Picture of pain.

    Empathy for pain mirrors the suffering—but not the physical pain—in the same brain regions.


    Singer's study is part of a growing body of research exploring mind states—including empathy, imitation, and “theory of mind”—that have in common the creation of an interior representation of what another individual is experiencing. This type of representation has been shown at a cellular level by the discovery, in monkeys, of “mirror neurons”: brain cells that fire both when a monkey observes another individual grasping something and when it does the grasping itself.

    Singer says the same sets of neurons that are activated by empathy are also set in motion by the anticipation of pain. That fits with another piece of the pain puzzle, presented by a second imaging study in this issue (see p. 1162), showing that anticipation of pain relief is closely tied to the placebo response.

    In this study, headed by Tor Wager of the University of Michigan, Ann Arbor, subjects were given an inert salve that they were told was being tested as an analgesic cream. They were then given a shock or painful heat stimulus on the wrist. Those who showed increased activity in the prefrontal cortex prior to the stimulus also showed the biggest reduction of activity in pain-sensitive brain regions and reported the greatest pain reduction—suggesting that anticipation of pain relief is intimately tied with actual pain reduction. Co-author Richard Davidson of the University of Wisconsin, Madison, says this indicates that cognitive control may be crucial for downregulating pain circuitry. Presumably, he says, more prefrontal activity reflects “the active maintenance of a [mind]set” associated with pain relief.

    Mayberg, who has done brain-imaging studies on placebo effects with depressed patients, says the study supports the notion that it may be possible to predict response to medication by looking at the “expectation component” in patients' brain scans.


    Nipah Virus (or a Cousin) Strikes Again

    Martin Enserink

    An enigmatic, highly lethal group of viruses has struck again. More than 40 people in central Bangladesh appear to have fallen ill with encephalitis, and 14 have died. Tests point to the Nipah virus, which debuted during a devastating outbreak in Malaysia in 1999. Dozens more cases are under investigation, according to the World Health Organization. The disease has occurred in several clusters, and many of the patients are children, says senior scientist Robert Breiman of the Centre for Health and Population Research in Dhaka.

    The Nipah virus and an Australian cousin, Hendra, both naturally infect Pteropus fruit bats. Using horses as an intermediate host, Hendra first jumped to humans in 1994, killing two. Nipah made its way to humans in Malaysia after causing a massive outbreak in pigs, killing 105 of its 276 victims (Science, 16 April 1999, p. 407). Grouped into a new genus—the Henipaviruses—within the Paramyxovirus family, the duo has attracted close attention for its high mortality and ability to jump species barriers.

    Bangladesh had similar, smaller outbreaks in 2001 and 2003. Because researchers at the U.S. Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, could detect antibodies against Nipah antigens in patients but could not isolate the virus, they dubbed it “Nipah-like.” This time, the virus has been isolated, Breiman says, and CDC studies should soon make clear whether it's Nipah or a close relative.

    Epidemiologically, “it's a very different disease than in Malaysia; that's what makes it so fascinating,” Breiman says. Most Malaysian victims were pig farmers; in Bangladesh, there has been no pig outbreak, and many of the patients were young boys. Tests in Bangladesh fruit bats have shown that they, too, carry a Nipah-like virus; whether there is an intermediate host is under intense investigation. It's also possible that the victims were exposed directly to infectious bat droppings, Breiman says.

    There's no cure for Nipah, but a vaccine is in development. In the January issue of the Journal of Virology, French and Malaysian researchers reported that vaccinia viruses, engineered to express one of two Nipah surface glycoproteins, protected golden hamsters from a lethal challenge of Nipah. Because antibodies against Nipah and Hendra cross-react, Pasteur Institute senior virologist Vincent Deubel says he's “quite confident” that the vaccine would also protect against the Bangladesh virus.


    Science in Seattle

    SEATTLE, WASHINGTON—A highly caffeinated crowd of about 11,000 attended this year's meeting here of the AAAS (Science's publisher) from 12 to 16 February. Among other highlights, Seoul National University researchers Woo Suk Hwang and Shin Yong Moon described their extraction of an embryonic stem cell line from a cloned human blastocyst (Science, 13 February, p. 937) to a packed ballroom. Plenary lecturer David King, chief science adviser to the U.K. government, delivered an urgent warning about global warming (Science, 9 January, p. 176) and reiterated the U.K.'s commitment to reducing carbon dioxide emissions by 60% in the next 50 years (Science, 28 February 2003, p. 1291). Reception halls buzzed with idle speculation about Rita Colwell's motivation for stepping down early as head of the U.S. National Science Foundation (see p. 1116). And three cloned mules frolicked in the exhibition hall (below). The 6-, 7-, and 9-month-olds are doing just fine, thank you. All were cloned from the same prizewinning mule (Science, 30 May 2003, p. 1354), but using different host eggs and with different horses serving as surrogate mothers. Two are expectedly ornery, but a third is fairly mellow for a mule.


    RNA Rules Metabolite Production

    Jennifer Couzin

    SEATTLE, WASHINGTON—Small RNA molecules have electrified scientists in recent years with their newly discovered roles in controlling gene expression. The surprises are apparently far from over: Another kind of RNA can detect levels of small molecules that help a cell run smoothly, and it can switch genes on or off depending on the cell's needs.

    Molecular biologist Ronald Breaker of Yale University and his colleagues unearthed these multitalented RNA molecules—a class called riboswitches—after wading through decades of scientific literature and puzzling over a handful of unsolved mysteries. Small molecules called metabolites mediate a cell's survival, and metabolite production is fine-tuned by genes that indirectly sense metabolite levels. It had long been assumed that specific proteins bind to a metabolite and trigger expression or repression of genes. But Breaker, as well as researchers at other universities, uncovered seven cases in bacteria, some as old as 30 years, in which frustrated scientists searched in vain for that key protein. There was a reason they couldn't find it, he says: The mystery protein was actually messenger RNA (mRNA). It normally carts the information from DNA to a cell's ribosome, where it's translated into a protein.

    Riboswitches—so named because they are composed of RNA—are portions of specific mRNAs that bind to a metabolite. That changes the shape of the mRNA and switches a gene off, or occasionally on. “It was right in front of you in the literature for 20 or 30 years,” says Thomas Tuschl, an RNA researcher at Rockefeller University in New York City. But until Breaker, no one had made the link.

    At the AAAS meeting, Breaker reported on his eighth bacterial riboswitch. The switch mediates levels of glucosamine, a key sugar that helps bacteria build their cell walls. Unlike the previous seven (those unsolved mysteries from the past), the new riboswitch is also a ribozyme, a scissorlike molecule that can cut up RNA. It uses this ability to control gene expression. When glucosamine reaches high levels in the cell, the metabolite binds to mRNA and induces the mRNA to cut itself. That prompts a plunge in gene expression, because DNA's message to churn out glucosamine can no longer be transcribed. Mysteriously, the mRNA gets sliced at a site that doesn't code for protein but that still manages to disrupt gene expression. “We don't know why” that happens, says Breaker. “It's a little eerie,” notes Sean Eddy, a computational biologist at Washington University in St. Louis, Missouri.

    Still, it's clear that “bacteria are loaded” with riboswitches, says Breaker. He's confirmed two more that aren't yet published and has 10 other candidates. The switches have also been found in fungi and plants, and Breaker is planning to start hunting for them soon in animals.


    Batteries Powered to Order

    Robert F. Service

    SEATTLE, WASHINGTON—As our collection of newfangled electronic gadgets grows, the variety of batteries that power these contraptions keeps expanding as well. But a new generation of smart batteries could put a stop to that. Prototypes have the potential to power a multitude of devices with different energy needs.

    Marc Madou, a microengineering expert at the University of California, Irvine, and his colleagues are targeting their efforts at microbatteries, small power cells used to juice devices such as pacemakers, hearing aids, smart cards, and remote sensors. Like all batteries, microbatteries work by shuttling electron-toting compounds from a negatively charged electrode to one that is positively charged, where the electrons are siphoned off. Unlike large batteries, microbattery electrodes are typically made out of thin carbon films stacked on top of one another. The electrodes' large surface area relative to their volume allows electron carriers such as lithium to ferry charges out quickly, providing a quick burst of power. Unfortunately, their low overall volume means they can't store much of a charge.

    One strategy for boosting storage capacity has been to try to create a forestlike array of carbon pillars, which are then wired together in long alternating rows to serve as negative and positive electrodes. For the scheme to pay off, individual pillars must be as tall and skinny as possible, which raises their mass while still keeping their surface area large. Micromechanical engineers have invented slick ways to make such three-dimensional structures in silicon, but they've had less luck with carbon, a standard electrode component.

    Pillar power.

    After a series of heat treatments, these 400-micrometer-tall pillars can serve as carbon electrodes for long-life microbatteries.


    Madou and postdoc Chunlei Wang broke this logjam by starting with a computer-chip-patterning technology known as photolithography. They shined ultraviolet (UV) light through a stencil pocked with an array of holes. The light hit an underlying organic film called a photoresist and caused the material to polymerize. Because the photoresist was largely transparent, beams of UV photons penetrated deep within the film, forming tall polymer pillars. The researchers chemically washed away the unpolymerized material to expose the pillars. A series of heat treatments then burned away much of the polymer, leaving only carbon behind. Finally, the researchers wired up the pillars into alternating rows of anodes and cathodes, submerged the forest in a lithium-containing electrolyte, and, as they reported at the meeting, showed that it provided 78% more power than a microbattery made with thin-film carbon electrodes.

    The pillar-based batteries may do more than store additional charge. By controlling which groups of electrodes are active, it's possible to deliver power at a wide range of voltages and currents—a micro equivalent of a battery that can serve as a 9-volt or a AAA. The electrodes' large surface-to-volume ratio may also allow them to be recharged quickly, Madou adds, although his team has yet to measure this effect.

    Kevin Drost, a mechanical engineer at Oregon State University in Corvallis, says the new work is impressive. In addition to making batteries, he says, the 3D patterning technique is likely to be useful for making a wide variety of novel carbon-based microstructures.


    Science at the Extremes?

    Jeffrey Mervis

    SEATTLE, WASHINGTON—Scientists—and the scientific press—are accessories to the crime of letting extreme voices dominate the public debate over science-based policies on everything from climate change to evolution. So says Rita Colwell, the outgoing director of the National Science Foundation.

    Colwell's remarks, at a forum on school science, were a response to a question about whether the Bush Administration is “hostile to science.” The fault lies not with her political bosses, she said, but rather with the way the issues are portrayed in the media. “You're doing tabloid journalism,” she said, pointing at an audience that presumably included science writers covering the meeting.

    She was no less hard on her professional colleagues. “We should be bringing science writers into our labs and showing them the latest instruments and facilities,” she said. “Instead, we are allowing science to be presented in sound bites, from dangerous people.”

    “It's not the Administration that's the issue,” she concluded. “It's like [the comic strip character] Pogo once said: ‘We have met the enemy, and he is us.’”


    Oceans Policy Is In for an Overhaul

    Jay Withgott*
    *Jay Withgott writes from Portland, Oregon.

    SEATTLE, WASHINGTON—A report due out next month will urge a major overhaul in the way the United States studies, manages, and protects the oceans. Scientists hope that the federally mandated Commission on Ocean Policy, whose conclusions will parallel those of last year's privately funded Pew Oceans Commission, will help the country reverse a long-running pattern of overharvested fish populations, damaged critical habitat, and degraded marine ecosystems.

    The 16-member commission, mandated by the Oceans Act of 2000 and appointed by President George W. Bush, will culminate a 3-year effort next month with a draft report to state governors and the public. Commission members presented preliminary outlines of their upcoming recommendations at the meeting.

    The report cites the current chaotic state of ocean policy, which is distributed among 20 laws, a dozen federal agencies, and 60 congressional committees. To bring order, it will recommend establishing an interagency National Oceans Council, reporting to the president. The commission will also encourage the creation of local, state, and regional councils that would develop their own management plans. “We've got to engage people at the local level in wrestling with their own problems,” says commissioner and former Environmental Protection Agency Administrator William Ruckelshaus. “Government's role should be to empower them to do that and then get out of the way.”

    The report will urge doubling the funding for oceans research, which currently takes up less than 4% of the federal research budget. Other recommendations would improve data management and public education and would introduce a regulatory regime for emerging issues such as bioprospecting, aquaculture, offshore wind and wave energy facilities, and ocean research observatories, according to Commissioner Andrew Rosenberg of the University of New Hampshire, Durham.

    No one expects smooth sailing for reforming a system fraught with political, economic, and scientific tensions. But the broad agreement between the two commissions bodes well, say scientists. Pew Commissioner Jane Lubchenco of Oregon State University in Corvallis said that, together, the reports will help “the country articulate a clear vision for what it wants [and how] to manage oceans as a public trust, for the common good, and in perpetuity.”


    As the West Goes Dry

    1. Robert F. Service

    In a region already prone to water shortages, researchers now forecast that rising temperatures threaten the American West's hidden reservoir: mountain snow

    MOUNT BACHELOR, OREGON—Under the dome of a concrete-gray sky, Stan Fox assembles four pieces of aluminum tubing into a 3-meter-long hollow pipe. After standing it on end, he plunges it through more than 2 meters of snow at Dutchman Flat, an alpine meadow perched on the shoulder of this 3000-meter mountain. Fox, who heads the Oregon snow-survey program for the U.S. Department of Agriculture's Natural Resources Conservation Service (NRCS), removes the tube and reads the snowpack depth, a measurement that has been tracked at nearby sites monthly since the 1930s. Today the snow is 250 centimeters deep, and by comparing the weights of the tube both filled and empty, Fox and a colleague determine that the snow contains about 30% liquid water. If all the snow were instantly liquefied, the water would be nearly 1 meter deep. Not too bad. In a region prone to spikes in precipitation, Dutchman Flat is more than 15% above its 30-year average. “The snow in these mountains is a virtual reservoir,” Fox says. As the snow melts in the spring and summer, it will slowly release that water, filling streams and reservoirs, which provide lifeblood to the region during the normally bone-dry summer months.
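    Fox's back-of-the-envelope conversion is easy to reproduce. The sketch below uses the rounded figures quoted in this story; the function names are ours, not the NRCS's:

```python
def snow_water_equivalent_cm(depth_cm, water_fraction):
    """Liquid-water depth (cm) if the snowpack melted instantly.

    water_fraction is the snowpack's bulk density relative to liquid
    water, the roughly 30% figure Fox derives by weighing the sampling
    tube full and empty.
    """
    return depth_cm * water_fraction

def percent_of_average(swe_cm, avg_30yr_cm):
    """This year's snowpack as a percentage of a 30-year average."""
    return 100.0 * swe_cm / avg_30yr_cm

# The rounded numbers quoted in the story: 250 cm of snow, ~30% water
swe = snow_water_equivalent_cm(250.0, 0.30)   # 75.0 cm of liquid water
```

    With these rounded inputs the answer is about three-quarters of a meter; the "nearly 1 meter" in the story reflects the unrounded field measurements.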

    But indications are that this age-old cycle is beginning to change. New assessments of decades' worth of snowpack measurements show that snowpack levels have dropped considerably throughout the American West in response to a 0.8°C warming since the 1950s. Even more sobering, new studies reveal that if even the most moderate regional warming predictions for the next 50 years come true, western snowpacks will shrink by up to 60% in some regions, such as the Cascade Mountains of Oregon and Washington. That in turn is expected to reduce summertime stream flows by 20% to 50%. “Snow is our water storage in the West,” says Philip Mote, a climatologist at the University of Washington (UW), Seattle, who leads a team that has produced much of the new work. “When you remove that much storage, there is simply no way to make up for it.”

    The impacts could be profound. In the parched summer months, less water will likely be available for everything from agriculture and hydropower production to sustaining fish habitats. Combined with rising temperatures, the dwindling summertime water could also spell a sharp increase in catastrophic fires in forests throughout the West. With much of the current precipitation headed downstream earlier in the winter and spring, the change is also likely to exacerbate the risk of floods.

    For resource managers already struggling to apportion limited water supplies throughout the West, the predictions are grave. “If that's true, it would have a huge impact,” says Christopher Furey, a policy analyst with the Bonneville Power Administration in Portland, Oregon, which markets electricity from over a dozen power-generating dams in the Columbia River Basin that provide power to millions of people. In a region where farmers, fishers, recreationalists, and municipalities already compete for water, climate change may be setting the stage for an entirely new round of conflicts. “We think of the water wars in the past,” says Fox, referring to the epic battles over rerouting western waters in the early 20th century. “In the future they will probably be more peaceful but much more prevalent.”


    Too wet, too soon

    The root of the problem is easy to state: The semiarid West has too little water, spread too unevenly throughout the year. Most of Montana sees less than 46 centimeters of precipitation a year. Even rainy Portland receives only about one-tenth of its annual 91 centimeters of precipitation during the summer. For most of California the fraction is even smaller. Philadelphia, by contrast, typically receives 102 centimeters of annual precipitation, 30% of which comes in the summer.

    Thanks to massive dam-building in the first half of the 20th century, more than 60 million people—roughly one-fifth of the U.S. population—now live in the Pacific and Intermountain West. Those tens of millions of people are dependent not just on water, but on snow. Snowmelt makes up 75% of all water in streams throughout the West. If that snow falls as rain or melts too early, there will be little water left in the virtual reservoir come late summer and fall. Unfortunately, that is just what appears to be happening.

    Back down the mountain in a conference room at a small ski resort outside Bend, Fox and a collection of about 50 water experts from the Northwest settle in to listen to Mote describe some of his group's latest data on western snowpacks. Perhaps fittingly, outside the temperature has warmed up on this mid-January day to about 5°C. Icicles encircling the roof drip steadily.

    Mote describes work published last year in Geophysical Research Letters, in which he took a detailed look at the trend in snowpack accumulations throughout the Pacific Northwest over the last half of the 20th century. Mote reviewed federal records of snow water equivalents (SWE)—the amount of water in a given depth of snow—on 1 April, typically the peak of the season's snowpack. Of the 230 sites where SWEs were measured back to the 1950s, Mote found that nearly all showed negative trends, even as precipitation increased in most places. The hardest hit: areas in the Cascade Mountains in Oregon and Washington, which saw declines of as much as 60% in total snow accumulation. The most likely explanation, Mote says, was the region's temperature rise. When he plotted the snowpack declines against the elevation of the snow-tracking sites, he found that the biggest decreases occurred at the lowest elevations, suggesting that the moderate warming throughout the region was raising the freezing level.
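    Mote's elevation argument is essentially a regression: plot each station's snowpack trend against its elevation and examine the slope. A toy version of that analysis, run on synthetic station data of our own invention rather than the federal SWE records, might look like this:

```python
import random

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Synthetic stations: lower sites lose more snow, as in Mote's data.
random.seed(1)
elev_m = [random.uniform(500, 2500) for _ in range(200)]
# Hypothetical trend: about -40% at 500 m grading toward 0% at 2500 m.
trend_pct = [-50 + 0.02 * e + random.gauss(0, 5) for e in elev_m]

slope, intercept = linear_fit(elev_m, trend_pct)
# A positive slope means losses shrink with altitude, i.e. the biggest
# declines sit at the lowest (warmest) stations.
```

    The sign of the slope, not its exact value, carries Mote's point: if warming is raising the freezing level, the low-elevation stations should fall fastest, and they do.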

    That's just the beginning. In work presented last month at the American Meteorological Society meeting in Seattle, Washington, Mote teamed up with UW Seattle colleague Alan Hamlet and University of Colorado, Boulder, hydroclimatologist Martyn Clark to expand his initial analysis to look at historical snowpack levels throughout the West (see right-hand figure). The news was better, but not much. Snowpacks decreased at 85% of the nearly 600 snow-measurement sites throughout the West. The biggest decreases hit the Northwest, where the mountains are smaller and the temperatures warmer, thanks to their proximity to the Pacific Ocean. Declines in the northern Rockies were mostly in the range of 15% to 30%. In these inland areas, Clark points out, winter temperatures are typically far lower than in the Pacific Northwest, so a rise of a few degrees still does not push the mercury above freezing. “In the interior regions all the winter precipitation falls as snow,” Clark says. And some regions in the Southwest even witnessed large SWE increases, thanks primarily to a rise in precipitation.

    In retreat.

    A modest temperature rise since the 1950s has reduced spring snowpacks throughout the West (bottom) and shifted the peak snowmelt earlier in the year (top).


    Other clues also suggest that the West's snowpack is changing. The biggest: Snow is melting earlier in the spring. “There has been a fairly broad tendency in snowmelt basins to exhibit advances in runoff timing,” says Daniel Cayan, a climate researcher at the Scripps Institution of Oceanography in La Jolla, California. Last month, Cayan, postdoctoral assistant Iris Stewart, and Michael Dettinger, a hydroclimatologist with the U.S. Geological Survey (USGS) in San Diego, reported in Climatic Change that the peak of the annual spring runoff in streams throughout California's Sierra Nevada now comes as much as 3 weeks earlier than it did in 1948 (see lower figure). Again, the effect was most pronounced in streams adjacent to lower elevation snow that is more sensitive to temperature increases. “This is very consistent with the evidence Phil [Mote] and company have seen with the snowpack,” Cayan says. In a paper now under review at the Journal of Climate, Clark and colleagues at the University of Colorado recently found much the same shift for streams in the Northwest. “There is definitely something happening,” Clark says.

    That evidence is further bolstered, Cayan points out, by records that track the first springtime blooms of flowers such as honeysuckle and lilac, which show a similar 1- to 2-week advance. “This is a totally independent measure and one that is quite strongly related to temperatures in the springtime,” Cayan says.

    Not everyone is yet ready to believe that these trends will continue. George Taylor, the state of Oregon climatologist and a climate researcher at Oregon State University in Corvallis, for example, argues that broad trends in temperature and snow accumulation over the past century are most likely due to natural multidecade swings as the climate oscillates between periods of relative warm and cold temperatures. “There was significant warming in the 1920s, '30s, and '40s, cooling in the '50s and '60s, and warming again from the 1970s through '90s,” Taylor says. “In my opinion, the effects of human-induced global warming are small compared to the multidecadal cycles.”

    Greg Johnson, a climatologist with NRCS in Portland, also points out that Mote typically starts his analysis of snowpack trends at the beginning of the 1950s, which saw some of the largest snow accumulations over the past century. “If you use those numbers, you will show large decreases,” he says. Decadal swings in climate caused by El Niño and the Pacific Decadal Oscillation, he adds, further muddy the numbers.

    “I'm not saying it's a nonissue, just that we need to keep watching it closely,” Johnson says. “The point is, if you look at the historical record, we've seen some warming and drops in low-elevation snowpack. The question is what can we tie it to. But from a planning standpoint, I think people have to be concerned about this.”

    Mote agrees that the trend data may be skewed to some degree by the high-snow years of the early 1950s. However, he says, before the 1950s there were so few snow measurement sites that earlier data are suspect. Furthermore, he says, the snow loss is still best explained by the region's modest warming. “The thing that really stands out is that the largest losses are at the lowest elevations, which can only be explained by warming,” Mote says. As for whether this warming is best explained by the decade-long climate swings, Mote defers to the latest work by the Intergovernmental Panel on Climate Change (IPCC), the global body of hundreds of scientists that has assembled the “standard model” of climate change. Although IPCC's latest report does show that both natural and human-induced factors explain portions of the last century's global temperature record, climate models that take both into account do the best job at reproducing the complete temperature record.

    Dry times ahead?

    No matter what the historical picture, Mote, Cayan, and others argue that the picture for western snowpacks looks far more bleak when the anticipated future warming is taken into account. Here, too, several teams have been working to understand how events are likely to unfold. All agree there is considerable uncertainty. Precipitation trends, for example, “are all over the map” in different climate models, because precipitation can vary drastically over a short distance, Mote says. However, Mote, Cayan, and others agree that climate models generally do a far better job of estimating temperature, because temperature differences drive winds that tend to reduce those differences. Regional climate models suggest that over the next 100 years, western temperatures are likely to rise between 2° and 7°C, depending on—among other factors—the rate of increase of greenhouse gases in the atmosphere. And unlike the precipitation forecasts, the models all show an increase in temperature.

    Modelers then feed these temperature data and other variables into another set of computer programs called hydrology models that compute the effects of changing climate on snowpack and stream runoff. And these hydrology models consistently show that even low-end temperature changes produce big effects. As part of a study described in last month's issue of Climatic Change, for example, UW Seattle hydrologist Dennis Lettenmaier and colleagues used a global climate model to compute how the western snowpack would respond to modest temperature increases. They found that a temperature rise of 1.5°C by 2050 resulted in a loss of nearly 60% of the 1 April snowpack in the Oregon and Washington Cascades, and a 3°C rise by 2090 reduced those snowpacks by 72% (see figure). “That's the best-case scenario,” Mote says. “By the 2090s with a warm scenario, you would have essentially no snow left in Oregon by April 1st.” When the Pacific Northwest is taken as a whole, the picture is only a bit better, showing a 35% loss in 1 April snowpack by the 2050s and 47% loss by the 2090s.

    Virtually gone.

    Computer models suggest that even moderate warming will drastically reduce the spring (peak) snowpack in the Oregon and Washington Cascades.


    In a Geophysical Research Letters paper last year, Cayan and former postdoc Noah Knowles—now with USGS in Menlo Park, California—computed a similar analysis for the watersheds that make up the western drainage of California's Sierra Nevada Mountains. They found that a predicted temperature rise of about 2.1°C over the next century would reduce the Sierra snowpack by one-third by 2060, primarily at mid to low elevations, and would halve it by 2090. A separate analysis by L. Ruby Leung and colleagues at the Pacific Northwest National Laboratory in Richland, Washington, together with researchers from the National Center for Atmospheric Research in Boulder, Colorado, and Scripps reached similar conclusions when they looked at the effect of climate throughout the West. The one notable difference: In the Rockies, the colder wintertime temperatures are expected to limit the losses to 30%. Without putting too much faith in the exact amount of losses, Mote says, “it's nearly inescapable that we're going to continue losing snowpack.”

    “Enormous impacts”

    “It doesn't mean we've lost water,” Cayan hastens to point out. “It means the water is coming off earlier.” Rather than sticking around as snow into the late spring and summer, western snowpacks will wash down mountainsides in the winter and spring. Simply stated, the upshot is wetter winters and drier summers.

    In the Sierras, for example, Knowles and Cayan's models predict that the portion of water that flows through the watershed's rivers from April through July each year will decline from 36% today to 26% by 2030. “This represents over 3 km3 [3 billion cubic meters] of runoff shifting from post-April 1 to pre-April 1 flows,” the authors write. That figure nearly doubles by 2090. Other studies show that parts of the Columbia River Basin are likely to fare worse, whereas the Colorado River watershed, with smaller anticipated declines in snowpack and generally colder temperatures, is likely to emerge comparatively unscathed. Overall, however, a steady temperature climb will likely affect tens of millions of people. “There are enormous impacts from this potential change,” Cayan says. “Water management in the West has been to use the snowpack as a natural reservoir. This reservoir is really important. It's water that will come later when a lot of the water demand is heaviest.” Without that water “people will need to make some difficult choices,” adds Todd Reeve, who directs watershed restoration programs for the Bonneville Environmental Foundation in Portland.

    That's particularly true in the Pacific Northwest and California. Reservoirs in the Columbia River Basin capture only about 30% of the region's annual runoff, whereas California's reservoirs hold slightly more. The typical pattern is to fill these reservoirs with late spring runoff and use that water throughout the summer and fall for irrigation and then in the early winter for power generation. An earlier snowmelt means that the water must be spread over a longer dry season when irrigation, recreation, and municipal demand peaks. “You're losing natural storage and taxing built storage. Something has to give,” Lettenmaier says. (Here too, Lettenmaier says, the Colorado River Basin is unique, because reservoirs there can store four times the region's annual precipitation.)

    With less summertime water, one of the hardest-hit areas is likely to be agriculture. Today, farmers in California use about 75% of the state's water. Earlier this month, agricultural economists Wolfram Schlenker of the University of California, San Diego, and W. Michael Hanemann and Anthony Fisher of UC Berkeley presented a preliminary study of the likely impacts of climate change on California agriculture at the American Economic Association meeting in San Diego. Using a range of hypothetical climate and stream-flow scenarios in line with published modeling results, the researchers forecast that snowpack losses could lower farmland values by more than 15%. If that pattern holds for the state's 3.84 million hectares of irrigated farmland, the loss to the state's agriculture economy would be measured in the billions of dollars.

    What is more, Fisher says, because access to irrigation water in California depends on the historical system of first-come, first-served water rights, those losses will likely be absorbed primarily by the farmers lowest on the water-rights totem pole, driving many out of business. That same pattern is likely to hold true in the Northwest, particularly in the dry lands east of the Cascades. “It's not going to be feasible to have the irrigated acreage we have now,” Mote says—fighting words in a region long wedded to an agricultural way of life.

    Forests are also likely to suffer, according to Anthony Westerling, a climate researcher at Scripps. Westerling recently fed data from Cayan and Knowles's climate and hydrology models of the Sierras into a model of his own that attempts to forecast changes in wildfires. Westerling says his preliminary results show that fire danger will soar. “The mean area burned more than doubled by 2090” relative to the present, Westerling says.

    Although less easily quantified, low summertime stream flows are also expected to exacerbate problems with declining fish runs, crimp water supplies for recreation and cities, and increase the likelihood of winter and springtime flooding throughout the Northwest and California. But not all the impacts are sure to be bad. Last year, John Fazio, a river flow analyst with the Northwest Power and Conservation Council in Portland, plugged some of the UW group's hydrology forecasts into his Columbia River flow models and found that a warmer Northwest may actually benefit Northwest electricity consumers. Warmer winters, Fazio says, will likely lower the need for electricity during the region's peak demand period, and an expected small increase in wintertime precipitation could churn generators to the tune of an extra 1900 megawatts of power—nearly enough to power two cities the size of Seattle. Of course, if precipitation swings toward the dry side, it could wind up costing ratepayers hundreds of millions of dollars, he says.

    Dangerous consequences.

    Over the next century, larger winter and spring runoffs from melting snow are expected to increase flooding and catastrophic wildfires.


    No matter how the climate evolves, water managers will face uncomfortable tradeoffs between providing water for agriculture, hydropower, and recreation, and keeping it in streams to support fish runs. In their current Climatic Change paper, for example, Lettenmaier and colleagues show that to keep summertime flow levels in the Columbia River high enough to support endangered-fish recovery plans, water managers will likely have to sacrifice 10% to 20% of the river's wintertime hydropower generating capacity, because it will force water managers to draw down their reservoirs in the summer. “Even with these reductions in power, late-summer minimum flows would still be lower than at present,” the authors write.

    More big dams?

    In a region prone to water shortages, talk of such tradeoffs doesn't go down easy. “We already have a problem with shortages,” says Maury Roos, chief hydrologist for the state of California. And coming up with the water to deal with population growth throughout the region is already an acute problem, he adds. “This will certainly make the problem worse.”

    In hopes of heading off some of those problems, Roos and other water officials are beginning to incorporate climate change into their regional water plans. California's latest draft water plan, for example, discusses climate change, although it doesn't yet recommend changing California's infrastructure. Portland, Seattle, and other cities have begun studying the issue in detail to see whether they need to change their water-management plans.

    Initial rumblings are also being heard among advocates for building new dams throughout the West. That comes as something of a surprise to many, because during the Clinton Administration, then-Secretary of the Interior Bruce Babbitt claimed that the era of big dam building was over, owing to dams' adverse impacts on fish and wildlife. Already, for example, California is considering building several new dams as part of a joint state and federal effort to provide water for threatened ecosystems while keeping water available for farmers. Washington, too, is flirting with building a dam in the eastern part of the state, at a cost of more than $1 billion, to provide irrigation water for farmers near Yakima. And Idaho water managers say that climate change may force them to build new reservoirs to prevent winter floods along the Boise River, where one-third of the state's inhabitants currently live.

    But due to their high dollar and environmental costs, many water experts doubt whether such projects will go forward. “Dams are tough fights and so expensive,” says Hal Anderson, planning chief for the Idaho Department of Water Resources. And even if built, they will only soften the blow. With the amount of spring snow expected to be lost due to climate change, “there is no way we're going to build that many dams to capture it all,” Mote says.

    Other strategies may help. Most water officials agree that there is much that can be done to conserve water, particularly by lining irrigation canals and making other improvements to irrigation. In addition, a handful of new programs have sprung up recently to buy or lease water rights from farmers and then keep the water in-stream during the low-flow months to improve habitat for fish. Last year, for example, one umbrella effort called the Columbia Basin Water Transactions Program sponsored 32 such deals to keep 28.4 million cubic meters of water in tributaries where it's needed most. That amount of water pales in comparison to what stands to be lost. But for now, water planners still have some time to act before climate change alters the American West in a way humans have never witnessed.


    The Evolution of the Golden Rule

    1. Gretchen Vogel

    Humans and other primates have a keen sense of fairness and a tendency to cooperate, even when it does them no discernible good

    Despite what you might think from the piles of dishes accumulating in the laboratory sink, humans seem to have an innate tendency to cooperate with one another, even when it goes against their rational self-interest. Some theorize that the ability to cooperate is one of the main reasons humans have managed to survive in almost every ecosystem on Earth, but it poses puzzles for evolutionary biologists: Are cooperative urges ingrained in our genes? Or are we taught by our culture to play well with others? Are those who break the rules deviant, or are those who follow them driven by outdated urges that don't make sense today?

    Recent studies of how humans and other primates cooperate have stoked these old debates with new data. Evolutionary theorists are attempting to piece together the forces that might have shaped cooperation. Neuroscientists are getting into the mix as well, identifying circuits in the brain that respond to cooperators and cheaters.

    At first glance, cooperation seems to be an evolutionary anomaly. In the hardscrabble competition for food, territory, and mates, why would one individual go out of its way to help another? Nevertheless, the animal world has plentiful examples of cooperation that seem to be hard-wired: bees that collect pollen for the whole hive, mole rats that build elaborate tunnels used by other group members, and meerkats that risk their lives to guard a common nest.

    Biologists have successfully explained why such behaviors can be beneficial in the game of passing genes to the next generation. In the 1960s, the late evolutionary biologist William Hamilton developed a theory of kin selection that showed how helping relatives can increase the chances that one's own genes will be passed on through them. The theory elegantly explains the behavior of cooperative insects, for example.

    In the 1970s, evolutionary biologist Robert Trivers, now at Rutgers University in New Brunswick, New Jersey, developed a theory that explained why unrelated individuals might also be inclined to help one another. According to his theory of reciprocal altruism, helping a nonrelative increases one's own fitness as long as the recipient can be reasonably expected to return the favor. An innate willingness to help those who would help you back, he argued, would have been beneficial in early human societies, when people lived in relatively small groups and could keep track of who cooperated with whom.

    Fair trade.

    Capuchin monkeys refuse to cooperate when they see a comrade receive a better reward for the same task.


    Evidence for the benefits of reciprocity is starting to show up in studies of nonhuman primates. For example, primatologist Joan Silk of the University of California, Los Angeles (UCLA), and her colleagues reported last fall (Science, 14 November 2003, p. 1231) that social female baboons have a greater chance than less social females of having an infant survive to its first birthday. Somehow, grooming and staying in close contact with other group members seems to give female baboons a reproductive advantage. It's possible that the social butterflies are simply healthier overall and therefore can afford to be more generous, but Silk and her colleagues argue that the benefits of social support go deep in the primate lineage.

    Not just tit for tat

    It's notoriously difficult to isolate human motivations for various actions. But economists, increasingly in cooperation with evolutionary biologists, have devised laboratory-based games that can measure purely altruistic tendencies. For example, in an experiment sometimes called the trust game, one player, the truster, is given a sum of money, say, $10. He or she decides how much money to share with an anonymous partner, the trustee. The trustee then receives double the sum donated by the first player and can decide how much to give back. The trustee has no logical reason to give any money back; he or she will never see the partner again. Nevertheless, in multiple studies, more than half of the trustees return some of the money, and they tend to transfer more when they are entrusted with more.
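    The payoff bookkeeping of the trust game described above is simple to write down. In this minimal sketch the doubling rule is the one in the text, while the specific dollar amounts are merely illustrative:

```python
def trust_game(endowment, sent, return_fraction):
    """Payoffs in one round of the trust game as described in the text.

    The truster keeps whatever they don't send; the trustee receives
    double the amount sent and returns some fraction of that doubled
    sum. A rationally self-interested trustee sets return_fraction to
    0, yet most experimental trustees return something.
    """
    received = 2 * sent
    returned = return_fraction * received
    truster_payoff = endowment - sent + returned
    trustee_payoff = received - returned
    return truster_payoff, trustee_payoff

# A purely selfish trustee leaves the truster worse off for trusting:
trust_game(10, 5, 0.0)   # (5.0, 10.0)
# Returning half makes the truster whole while the trustee still gains:
trust_game(10, 5, 0.5)   # (10.0, 5.0)
```

    The puzzle the text describes is visible in the arithmetic: nothing in the payoff structure rewards the trustee for returning money, so any positive return_fraction is altruism, not strategy.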

    Such cooperation may pay off eventually. In an extension of Trivers's theory of reciprocal altruism, mathematicians Martin Nowak, now at Harvard University, and Karl Sigmund of the University of Vienna developed a theory called indirect reciprocity: People are willing to help someone who won't pay them back as long as other people see the charitable act. The generous person, in this case, builds a reputation for cooperation, and others who observe this behavior are more likely to cooperate with him or her. Indeed, evolutionary biologist Manfred Milinski of the Max Planck Institute for Limnology in Plön, Germany, and his colleagues have shown that reputation is a key motivator in cooperation. When players carry their “cooperation history” with them through repeated rounds of games (instead of remaining completely anonymous), cooperation increases dramatically—from about 50% to more than 80% in some cases. And those who have a history of cooperation are much more likely to receive cooperation from future partners. “Reputation is something that is very important to people,” Milinski says.

    In another potential evolutionary payoff for public generosity, a willingness to cooperate might also serve as an indirect sign of mating fitness, according to some theorists. People who are already well off (and therefore valuable potential mates) can afford to be generous. So an open display of benevolence might increase one's chances of reproduction. German speakers even have a saying for the practice, Milinski notes: “Tue Gutes und rede darüber: Do good and talk about it.”

    Enforcing fairness

    Some researchers argue that indirect reciprocity can't explain the large-scale cooperation humans excel at, the behavior that allows groups to construct transportation networks or send robots to Mars. To keep long-term behavior on track, it seems, people have devised a series of rewards and punishments. Although some rules are unique to a given culture, in recent years, several evolutionary biologists have argued that a predisposition for such systems might be deeply ingrained in our brains and might be an evolutionary underpinning of morality.

    Researchers call this tendency strong reciprocity: the willingness of people to punish cheaters and reward those who cooperate, even at substantial cost and with no foreseeable reward to themselves. The pattern shows up consistently in laboratory games: Even when players know their identity will be kept secret and are clearly told they will never encounter their partner again, up to 50% still cooperate. And when players are given a chance to spend some of their earnings to reward those who cooperate or punish those who cheat, they do so readily, although they have no chance to benefit from any behavioral changes that may result.

    Twisted instincts.

    A strong human drive to punish perceived wrongs may be part of what motivates suicide attacks.


    Such results have led anthropologist Robert Boyd of UCLA and Ernst Fehr of the University of Zurich and others to theorize that both genetic and cultural evolution have reinforced the human tendency to cooperate. “Our picture is that you have a system of conformism and moral punishment: If you deviate from the local norms, there are sanctions imposed,” Boyd says. In each society, he adds, different systems of rules developed. Those that were particularly successful spread to neighboring communities, “which led to the spread of larger-scale moral systems and to larger-scale cooperation. This created a world in which ordinary natural selection favored people that were pro-social, because those that didn't got in trouble” and had less success in passing on their genes.

    Indeed, in mathematical simulations, Fehr and his colleagues showed that in groups in which some members are willing to punish miscreants, willingness to play by the rules is an advantage. Deviants—those who try to break the rules by jumping the queue or sneaking on an honor-system subway without a ticket—can face reprimand or retaliation from others who have patiently waited in line. In the simulations, a few willing punishers can maintain stable levels of cooperation. Without punishments, the defectors quickly take over.
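The logic behind those simulations can be sketched as a toy public-goods payoff calculation. This is an illustrative sketch only: the function, payoff structure, and every number below are our assumptions, not Fehr's actual model.

```python
def payoffs(n_coop, n_punish, n_defect, r=1.6, fine=1.0, cost=0.2):
    """Per-player payoffs for cooperators, punishers, and defectors.

    Each cooperator (and each punisher) contributes 1 to a common pot,
    which is multiplied by r and split evenly among all n players.
    Each punisher additionally pays `cost` per defector to impose
    `fine` on every defector.  All parameter values are illustrative.
    """
    n = n_coop + n_punish + n_defect
    share = r * (n_coop + n_punish) / n    # everyone's cut of the pot
    coop = share - 1.0                     # gave up their contribution
    punish = coop - cost * n_defect        # also pays to punish
    defect = share - fine * n_punish       # keeps contribution, eats fines
    return coop, punish, defect

# With punishers in the group, defection is the worst strategy...
c, p, d = payoffs(n_coop=10, n_punish=5, n_defect=5)
assert d < p < c
# ...but without them, defectors out-earn cooperators and take over.
c2, _, d2 = payoffs(n_coop=15, n_punish=0, n_defect=5)
assert d2 > c2
```

The two assertions capture the article's point: a modest number of willing punishers flips the payoff ranking so that defecting no longer pays.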

    Such a theory can perhaps help explain the results of one wide-ranging experiment using the ultimatum game. In it, one player is given a sum of money and can decide how much to give a partner. The second player then decides whether to accept or reject the offer. If he or she accepts, both keep their share of the money. If the second player rejects the offer, neither player gets anything. In a wide range of industrial societies, the most common offer hovers around 50%. Offers of 20% are rejected at least half the time even when fairly large sums of money are involved.
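The rules of the ultimatum game are simple enough to state as a payoff function. A minimal sketch, assuming a pot of 100 units; the function name and amounts are ours, not from the study.

```python
def ultimatum_payoffs(pot, offer, accepted):
    """Payoffs (proposer, responder) for one round of the ultimatum game.

    The proposer keeps pot - offer and the responder receives offer,
    but only if the responder accepts; a rejection leaves both with nothing.
    """
    if not 0 <= offer <= pot:
        raise ValueError("offer must be between 0 and the pot")
    if accepted:
        return pot - offer, offer
    return 0, 0

# A fair 50% split, accepted: both players keep half.
assert ultimatum_payoffs(100, 50, True) == (50, 50)
# A low 20% offer, rejected: neither player gets anything.
assert ultimatum_payoffs(100, 20, False) == (0, 0)
```

The second case is the puzzle: rejecting costs the responder 20 units purely to deny the proposer 80, which is irrational by narrow self-interest yet happens at least half the time.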

    A consortium of researchers played the ultimatum game in 15 societies from around the world, from Peru to Papua New Guinea. The results varied widely. Members of the Machiguenga people of Peru and the Tsimané of Bolivia tended to make and happily accept offers of less than 20% of the pie. Among the Gnau culture in Papua New Guinea, however, the first player sometimes made overly generous offers of up to 70% of the total sum. But such offers were just as likely to be rejected as unfair. “In that culture, social status depends on how much you give,” explains Herbert Gintis, an economist at the Santa Fe Institute in New Mexico and one of the study coordinators. Those who made excessive offers were perceived as putting on airs, he says, and were rejected. What all the groups have in common, says Gintis, is strong reciprocity. “What is considered to be fair varies from society to society, but the fact that people punish those who violate norms does not vary.”

    Enforcing societal norms might also explain an initially puzzling result from economist Simon Gächter of the University of St. Gallen, Switzerland, and Benedikt Herrmann of the University of Göttingen, Germany. The team played cooperation games with subjects from Germany, Switzerland, Belarus, and Russia. All the players had an opportunity at the end of the game to punish other players for a fee. Somewhat surprisingly, Gächter told an audience at a conference in Göttingen* in December, the Russian and Belarussian players punished not only defectors but also strong cooperators. Any deviation from the norm, it seems, can be singled out for punishment.

    Some researchers are not convinced that the theory of strong reciprocity—that an innate sense of fairness leads people to reward do-gooders and punish cheaters—is necessary. Dominic Johnson of Stanford University and his colleagues have argued that the recent results should be understood in the light of environmental and cultural conditions predominant during human evolution—conditions very different from today's. Even when people are told that their identity will remain secret and they will never encounter their partner again, they may be guided by tendencies that evolved when people lived in small groups and rarely encountered strangers. Thus, Johnson argues, older theories of reciprocity, such as Trivers's reciprocal altruism, can account for the results. For his part, Trivers says the data are intriguing but don't require new theories: “It's absurd—and I use that word advisedly—to imagine that we've evolved to respond to the specific situations that these economists put us in, with complete anonymity and no chance to interact with partners a second time.”

    Cooperative urges.

    Standing in an orderly queue requires cooperation with strangers one may never see again.


    Even though lab experiments may be artificial, they appear to elicit deeply conflicting responses in the brain. Last year, neuroscientist Alan Sanfey of Princeton University and his colleagues reported using functional magnetic resonance imaging (fMRI) to observe the brains of people playing the ultimatum game (Science, 13 June 2003, p. 1755). Unfair offers of 20% of the pie prompted a reaction from the bilateral anterior insula, a part of the brain activated during unpleasant experiences such as pain, hunger, or thirst and emotions like anger and disgust. Those with stronger activation of this area were more likely to reject the unfair offer. A second area, the dorsolateral prefrontal cortex, was also activated. It is linked to rational, problem-solving processes. The authors theorize that the dilemma faced by someone presented with an unfair offer prompts conflict between the two brain areas: Should I accept the smaller sum, and at least end up with something, or should I refuse and punish the selfish partner? Subjects with greater activation of the prefrontal cortex were more likely to accept even unfair offers.

    In this week's issue of Neuron, Tania Singer and her colleagues at the Wellcome Department of Imaging Neuroscience in London describe fMRI studies of people playing a cooperation game called the sequential prisoner's dilemma. After the game, seeing faces of cooperative partners triggered increased activity in brain areas linked to social cognition and reward. Fehr and his colleagues have also measured the activation of brain regions when game participants have the chance to punish those who don't cooperate. In preliminary results, the opportunity to punish seems to activate similar kinds of reward-related pleasure circuits as does eating sweets. “We call it the ‘sweet taste of revenge,’” he says.

    On a more troubling note, the desire to punish unfairness could help drive irrational acts such as suicide bombings, says Gintis. Some people are willing to pay an extremely high price as long as they can inflict serious suffering on those they consider an enemy. “If such behavior is present in a controlled, cool lab situation, it is even more likely that it is an important force in the real world,” Gächter says.

    A sensitivity to “fairness” may have emerged early in the primate lineage. In a paper published in August 2003 in Nature, Sarah Brosnan and Frans de Waal of Emory University in Atlanta, Georgia, and their colleagues showed that capuchin monkeys have a keen sense of fairness. Brosnan gave a monkey a pebble, which it was supposed to return to her in exchange for a slice of cucumber. Most monkeys were perfectly content to trade the pebble for a cucumber, but when they could see a neighboring monkey perform the same task and receive a grape—a much more valued treat—in return, monkeys refused the cucumber even though the alternative was no reward at all. In recent work, chimpanzees have shown similar reactions, Brosnan said at the Göttingen meeting. “A test for fairness is probably hardwired,” Gächter says.

    With so many cooperative tendencies built into human brains, whether by genes or culture or both, why isn't there more harmony in the world? Unfortunately, notes Boyd, one of humans' most successful cooperative endeavors is making war. “All that increased cooperation has done is change the scale on which conflict takes place,” he says. “I would like to think there's a happy story of peace and understanding. But you can't be a 21st century human and not see that the trend is in the other direction.”

    • * “Cooperation in Primates and Humans, Mechanisms and Evolution,” Deutsches Primatenzentrum, 9–12 December 2003.


    Brazil Institute Charts a New Hemisphere for Neuroscience

    1. Marcia L. Triunfol,
    2. Jeffrey Mervis
    1. Marcia L. Triunfol is on the staff of Science's Next Wave.

    Brazilian-born neuroscientist Miguel Nicolelis has an ambitious plan to create a novel research institute in a poor region of his native country

    Duke University neuroscientist Miguel Nicolelis's social conscience was pricked whenever he used the plush athletic complex at the University of São Paulo as a medical student in the 1980s. The sports facilities were off-limits to neighborhood children, who had to make the best of their congested, urban setting. The disparity so disturbed Nicolelis and his friends that they created an after-hours sports school, giving local children a chance to use the university facilities—and to learn the value of teamwork, fair play, and working toward a goal.

    Two decades later, the school still exists, and some of those who played soccer and capoeira, a traditional Brazilian martial art that mixes dance and fighting, have gone on to attend university and launch successful careers. Now Nicolelis wants to apply his professional expertise and social conscience to a much bigger challenge: Brazil's underdeveloped northern region. His dream is to create a world-class neuroscience research institute on the outskirts of Natal, the state capital of Rio Grande do Norte and its largest city, and with it a mental health clinic, an elite community school, and a science museum.

    The Brazilian government has donated a scenic 100-hectare plot of land in the hills overlooking the city, which hugs the Atlantic Ocean. It's also contributed most of the $1.5 million that Nicolelis has raised toward a target of $30 million by 2006 for the entire complex. Next month more than 300 neuroscientists from around the world will converge on Natal to talk about their latest research—and how they can help the International Institute for Neuroscience of Natal (IINN) become a reality.

    Science with a social bent.

    Miguel Nicolelis plans to establish a clinic and a school along with a neuroscience research center.


    “Yes, we want the institute to be a center of excellence for Brazilian neuroscience,” says the 42-year-old Nicolelis, who came to the United States 15 years ago as a postdoc. He has since earned international recognition for translating electrical signals in monkey brains into commands that drive robot arms—work that he and his colleagues hope will lead to clinical trials later this year of prostheses for paralyzed people. “But the key to the whole concept is the social mission attached to the institute,” he says. “That's what appeals to the Brazilian government. A group of us had been talking about doing something like this for a long time. And with the election of the new president [Luiz Inácio Lula da Silva in October 2002] and his promise to enhance Brazilian science, we saw an opportunity to move ahead.”

    IINN was officially created last spring by Nicolelis and two other expatriate neuroscientists: Duke's Sidarta Ribeiro, his postdoc, and Claudio Mello of Oregon Health and Science University in Portland. Nicolelis is chair of a star-studded advisory group that will oversee the institute's scientific activities. Next month's conference will also mark the first meeting of the institute's board of trustees, a group of Brazilian political and scientific leaders, leavened with distinguished foreigners, who will set overall policy and raise money for the project.

    “Miguel is a life force, an outstanding researcher with a strong entrepreneurial streak who's very committed to the cause,” says Peter Lange, Duke's provost and an IINN trustee. Duke has put up $50,000, in part to support a laboratory at nearby Federal University of Rio Grande do Norte through which the institute will eventually offer doctoral degrees. The Pentagon's Defense Advanced Research Projects Agency, which has funded Nicolelis's work at Duke, has chipped in $40,000 for the conference.

    But the institute still faces a long, hard road. One major obstacle, say Brazilian scientists with experience on other international projects, is the fickle nature of Brazilian funding sources. Ivan Izquierdo, a neuroscientist at the Federal University of Rio Grande do Sul in Porto Alegre, notes that he's still receiving money from a 3-year federal grant awarded in 1997 because it took several years for the funds to begin flowing. “When it comes to science, Brazilian funding agencies have a long tradition of noncontinuity,” he says.

    Even after the tap is turned on, however, the country's stifling import regulations can soak up those resources. “To give you an idea of how bureaucratic the process is, an electrophoresis apparatus that I ordered as an undergraduate was held up by customs until the end of my Ph.D.,” says Stevens Kastrup Rehen, a Pew Fellow at the Scripps Research Institute in La Jolla, California, and an associate professor at the Federal University of Rio de Janeiro.

    Nicolelis and his colleagues have heard the horror stories. But they insist that times have changed. Last month the institute set up shop in a nondescript office building in Natal. It will serve as temporary quarters until a move to its permanent home in nearby Macaiba, an impoverished farming town that Nicolelis hopes will be transformed into a showcase community.

    The first example of that transformation will be a health clinic. Among its services will be the sort of blood tests on newborns that are routine in major cities but unknown in the rural areas outside Natal, he says. Next year should also see the debut of a school that provides free, topflight education for neighborhood children and their parents, many of whom are illiterate. “We'll hire teachers from all over and pay salaries comparable to the best private schools,” Nicolelis promises. “I'm already getting queries from scientists who want to move to the university so that their kids can go to the school.”

    Field of dreams.

    This site overlooking Natal, which was donated by the Brazilian government, will be home to the International Institute for Neuroscience of Natal.


    Building the research institute itself, he predicts, “will be our hardest job.” Current plans call for 25 laboratories, 15 for staff scientists and 10 for collaborative work with researchers from around the world. Nicolelis hopes that the institute will serve as the vehicle “for a generation of Brazilian scientists to come back home.” At the same time, he expects that about one-fourth of the professional slots will be filled by non-Brazilians; the director will be chosen through a worldwide search. And although federal and state governments may provide much of the money, Nicolelis says, they won't be calling the shots: “I've made it very clear that we are going to run ourselves.”

    But not with Nicolelis at the wheel. Although he plans to spend considerable time in Natal helping the institute get off the ground, he has no interest in becoming its director. In fact, once IINN is both financially and scientifically secure, Nicolelis would like to clone the concept in other underdeveloped parts of the country. “We have a list of 10 topics,” he says, including genetics, agriculture and food production, nanotechnology, and environmental science.

    His hope of spreading Brazil's scientific resources outside the twin hubs of São Paulo and Rio de Janeiro, he says, meshes with President Lula's promise to foster economic development in the hinterlands. But the name chosen for the institutes' umbrella organization—the Alberto Santos Dumont Foundation—underscores the project's central challenge. Santos Dumont, an aviation pioneer and national hero, “is by far Brazil's most successful scientist,” Nicolelis asserts. But outshone by the Wright brothers and unsung outside his own country, he is also a reminder of how far Nicolelis and his colleagues must go to make a lasting mark on global science.

    E.T. SEARCH

    No Din of Alien Chatter in Our Neighborhood

    1. Richard A. Kerr

    Early-generation searches for extraterrestrial intelligence are coming up empty-handed, but the SETI community is carrying on

    It was a good try, the best of its kind ever, but the Phoenix Project came up empty. “We found nothing,” says radio astronomer Frank Drake, founder of the modern search for extraterrestrial intelligence (SETI). After listening for radio broadcasts from the vicinity of more than 700 nearby, sunlike stars using some of the largest, most sensitive radio telescopes in the world, “we have no success to report,” says Drake, who is senior scientist at the SETI Institute in Mountain View, California, the operator of Project Phoenix. The failure of Phoenix and other powerful searches of the past decade shows that “this idea there's a galactic club that we would join as soon as we started … doesn't look like it's panning out,” says physicist James Trefil of George Mason University in Fairfax, Virginia.

    No one in the SETI community is discouraged, however. They didn't get lucky this time, but then, they had no idea what Phoenix's chances were anyway. “There's got to be life in the galaxy,” says physicist and SETI searcher Paul Horowitz of Harvard University. “If they're attempting to contact other civilizations, we'll succeed some day.”

    SETI has come a long way from its humble beginnings. Drake kicked off modern SETI in 1960 with his Project Ozma; he pointed a 26-meter radio telescope dish at two nearby stars for a few days each, scanning across 0.4 megahertz (MHz) of the microwave spectrum one channel at a time. He heard nothing. NASA eventually began a SETI program, only to have a Congress leery of looking for “little green men” kill it off in 1993. Project Phoenix immediately rose from the ashes of the NASA program. Privately funded at about $4 million a year, it involved larger, more sensitive antennas, including the world's largest, the 305-meter dish at Arecibo, Puerto Rico. Project researchers will end their final observing run there on 5 March. They will have searched about 710 star systems within 150 light-years of Earth. Thanks to faster but cheaper computing, they could search 70 million channels simultaneously across an 1800-MHz range of frequencies just above those of FM radio.

    All things considered, the Phoenix search has been 100 trillion times more effective than Ozma, says SETI Institute senior astronomer Seth Shostak. “We could have heard a transmitter aimed our way from 100 light-years,” he says, assuming the antenna was the size of Arecibo's and was broadcasting with a very modest power of 10,000 watts or more. The negative result from Phoenix “does imply there are not large numbers of civilizations transmitting at many frequencies,” says Drake, or at least not lately.

    An increasingly discriminating ear.

    Ever-cheaper fast computers will greatly accelerate the Allen Telescope Array's scanning for signals from any communicative extraterrestrials.


    Phoenix has been the premier SETI search targeting individual stars, but there's more than one way to scan the skies. Phoenix searchers were assuming that broadcasting civilizations are abundant in the Milky Way, abundant enough that scanning something like one or two thoughtfully selected stars out of each 100 million might succeed. An alternative to targeted searching pursued by several privately funded groups independently of the SETI Institute is to scan the skies broadly and hope the signal is a strong one.

    Under a half-dozen incarnations of SERENDIP (Search for Extraterrestrial Radio Emissions from Nearby Developed Intelligent Populations), led by astronomer Daniel Werthimer of the University of California, Berkeley, receivers have ridden piggyback on several antennas, including Arecibo. They siphon off a bit of whatever signals radio astronomers happen to pick up as they sweep the sky during their own studies. SERENDIP has scanned billions of radio sources in the Milky Way.

    The META (Megachannel Extraterrestrial Assay) and BETA (Billion-channel Extraterrestrial Assay) searches conducted by Harvard's Horowitz were all-sky like SERENDIP, albeit with a much less sensitive, dedicated antenna. But BETA, like Phoenix, could separate true E.T. signals from terrestrial interference in real time, which SERENDIP cannot. That has left SERENDIP researchers with hundreds of possible E.T. signals that were not there when they checked again months or years later. In the past few years, Drake, Werthimer, Horowitz, and others have begun real-time searches at optical wavelengths, on the assumption that E.T. broadcasters might be using a different part of the electromagnetic spectrum. Thousands of stars have been targeted so far for nanosecond flashes of an alien laser.

    None of these searches has detected a bona fide signal from an E.T. intelligence. “Maybe what's been done is establish that the sky is not littered with signals that are easy to detect,” says Horowitz. Barring a lucky break, that's about all anyone was expecting from a still-nascent endeavor in which researchers are “looking for an uncertain manifestation of a hypothetical presence,” as Shostak has put it.

    Uncertain or not, they'll keep on looking. SERENDIP and the optical searches continue, and the SETI Institute's $35 million successor to Phoenix, the Allen Telescope Array, is already taking shape near northern California's Mount Lassen. Initially funded by an $11.5 million gift from technology investor and Microsoft co-founder Paul Allen, the first three 6-meter antennas are already operating. A set of 32 identical antennas will begin searching this year, and—given future funding—a field of 350 antennas will blossom. With a total collecting area one-quarter that of Arecibo, the full array would sacrifice sensitivity for accelerated search speed. That's because multiple antennas can computationally generate multiple “virtual telescopes” for simultaneous searching.

    Given the continued reduction in computation costs, “you can expect the speed of the reconnaissance will double every 18 months,” says Shostak. “Within the next 2 decades, we'll check out not a few thousand, but a few million stars.” If the galaxy is populated by only 10,000 advanced civilizations, he says, “success is a few decades down the road.”
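Shostak's projection is easy to check with rough arithmetic. A sketch under his stated assumption that search speed doubles every 18 months; the starting figure of a few thousand stars is our illustrative reading of "a few thousand."

```python
# Rough check of the doubling rule over two decades
# (all figures illustrative, not Shostak's exact numbers).
years = 20
doublings = years / 1.5             # one doubling every 18 months
speedup = 2 ** doublings            # roughly a 10,000-fold gain
stars_surveyed_so_far = 3_000       # "a few thousand" stars to date
projected = stars_surveyed_so_far * speedup
print(f"speedup: ~{speedup:,.0f}x")
print(f"stars reachable: ~{projected / 1e6:.0f} million")
```

Compounding 13-odd doublings yields a four-order-of-magnitude speedup, which comfortably turns a survey of thousands of stars into one of millions.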