# News this Week

Science  04 Dec 2009:
Vol. 326, Issue 5958, pp. 1328
1. Agricultural Research

# International Centers and Donors Warily Eye Sweeping Changes

1. Dennis Normile

The Consultative Group on International Agricultural Research (CGIAR) is facing what could be the biggest shakeup in its 38-year history as members meeting in Washington, D.C., on 7 and 8 December vote on far-reaching reforms. The delegates—representing donors, countries, international foundations, and development organizations—will likely vote to convert CGIAR from a voluntary association into a legal entity with power over a trust fund so it can enforce systemwide priorities. But the success of the venture hinges on deciding how to set and evaluate the research agenda without simply adding a layer of bureaucracy to the system.

“There is still a lot of work to do on these changes,” says Elizabeth Woods, an agricultural economist with the Queensland government in Brisbane, Australia, who chairs the board of the International Rice Research Institute (IRRI) in Los Baños, Philippines.

Dealing with the CGIAR centers “was getting more and more complicated” for donors, says Jonathan Wadsworth, an adviser at the United Kingdom's Department for International Development, one of the major supporters. Although a CGIAR science council set priorities, centers could ignore them, Wadsworth says. So donors increasingly funded specific projects. Such tied funding grew from about 30% of total CGIAR funding a decade ago to about 70% now. And no one is happy. Wadsworth says donors evaluate programs and sometimes meddle in center management, tasks for which they are ill-suited. Center scientists have been spending more time accounting for hundreds of small projects. “We are torn in 64 different directions by different donors and their agendas,” says Carlos Seré, director of the International Livestock Research Institute in Nairobi. And with core funding shrinking, center directors find it hard to plan for the long term. “It is extraordinarily difficult to accumulate the amounts needed for infrastructure investment,” says Woods.

Several formal studies led to an action plan, says CGIAR Director Ren Wang. If approved, CGIAR will become a legal entity with two parts: a consortium representing the centers and a fund to bring the donors together. The intent “is to agree on roles for the funders and roles for the doers,” says Shey Tata, CGIAR's lead financial officer. The consortium will be governed by a board, likely comprising scientists and development experts, and the fund will be run by donors. The consortium and the fund will together decide on a so-called strategy and results framework, which will set research objectives through a number of megaprograms expected to involve multiple centers. Two of the seven megaprograms are “genomics and global food crop improvements” and “agriculture, nutrition, and health.”

The restructuring could create efficiencies. “Instead of 15 centers negotiating with 65 donors, it will boil down to much higher-level but reduced interactions between one big consortium and one pooled source of funds,” Wadsworth says. Wang says donors have indicated that they will rely on standardized program evaluations by the consortium, reducing reporting requirements for numerous small research projects. Sharing procurement, human resources, and other services could save up to $130 million a year, Wang says. Center directors and board members generally support reform. “Anything that makes [CGIAR] nimbler, more efficient, and able to respond with the best science to the serious food security problems facing the world would be welcome,” says Robert Zeigler, director general of IRRI. And donors are embracing the idea. The United Kingdom intends to at least double contributions to CGIAR to £40 million annually by 2014, depending on the progress of reforms, Wadsworth says. Wang says Australia, the Netherlands, the United States, and Switzerland have all pledged to increase funding significantly. He says the target is an annual income of $1 billion by 2013, roughly double the 2008 figure.

The sticking point is the strategy and results framework and its megaprograms. “The process went ahead too hastily without [sufficient] consultation with researchers,” says Ryotaro Suzuki, director of international research at Japan's Ministry of Agriculture. Wang agrees that the megaprograms are more thematic than specific. At the same time, the framework envisions precisely measuring results from each project, such as percentage increases in productivity and the number of people lifted out of poverty. “We can't be accountable for things beyond our control,” says Seré. He is also concerned that the programs focus on cereals and slight the vegetable crops and livestock that generate income for smallholders and food for local communities. These areas “are much less addressed by public investment,” he says. Woods worries that the cost of another administrative layer, at least during a multiyear transition period, could swallow a lot of the rising contributions.

But the upside could be greater impacts, says William Dar, director general of the International Crops Research Institute for the Semi-Arid Tropics in Patancheru, India. If, after 5 years, there is more long-term support for agricultural research with the majority in unrestricted core funding, “I will give a very positive verdict on this reform process,” he says.

2. Science and Society

# Stolen E-mails Turn Up Heat on Climate Change Rhetoric

1. Eli Kintisch

The theft and unauthorized release last month of 1000 private e-mail messages from the servers of the Climatic Research Unit (CRU) at the University of East Anglia in the United Kingdom has provided a glimpse into the fractious world of climate science. The public airing of frank conversations among powerful scientists about sensitive topics such as possible holes in their data and the use of contrarian papers in major reports comes at a pivotal time for climate science, just days before a meeting of world leaders in Copenhagen.

The messages—whether hacked or released by a disgruntled insider—have raised thorny questions about the proper behavior of researchers who feel under siege for their science. How willing should they be to share their raw data with their staunchest critics? “It's very difficult to admit that your data are not as strong as you wish it were, especially if you know that will be used against you,” says Nicholas Steneck, an expert on research integrity at the University of Michigan, Ann Arbor. And yet the “circle the wagons” mentality conveyed in numerous messages could inflict lasting “damage to the public credibility of climate research,” warns climate scientist Judith Curry of the Georgia Institute of Technology in Atlanta.

But openness just leads to twisted interpretations, says NASA climate researcher Gavin Schmidt. “You can't have a spelling mistake in a paper without it being evidence on the floor of the Senate that the system is corrupt,” says Schmidt.

Four e-mail exchanges have received most of the media attention. The first regards a research finding considered by most scientists as a canonical fact: that the globe warmed by roughly 0.7°C in the 20th century. That fact derives in large part from global temperature data recorded by stations on land and sea, as analyzed independently by groups at East Anglia, NASA, and the U.S. National Oceanic and Atmospheric Administration.
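The arithmetic behind such a trend estimate can be illustrated with a toy calculation. This is a minimal sketch using synthetic numbers, not the CRU, NASA, or NOAA pipelines (which also handle area weighting, station homogenization, and coverage gaps): a global-mean anomaly series is built, and the century-scale warming is read off as the linear trend through it.

```python
import numpy as np

# Toy illustration with SYNTHETIC data (not real station records):
# fabricate a global-mean temperature anomaly series carrying a
# 0.7 degC/century trend plus year-to-year "weather" noise, then
# recover the trend with an ordinary least-squares fit.
rng = np.random.default_rng(1)
years = np.arange(1901, 2001)

anomalies = 0.007 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

slope_per_year = np.polyfit(years, anomalies, 1)[0]
print(round(slope_per_year * 100, 2))  # close to 0.7 degC per century
```

With a century of data, the noise averages out and the fitted slope lands near the built-in 0.7°C-per-century value, which is why three independent groups analyzing overlapping station networks converge on essentially the same number.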

Referring to requests for climate data from critics, CRU Director Phil Jones wrote in 2005 that “I think I'll delete the file rather than send to anyone.” In May 2009, Jones told Michael Mann of Pennsylvania State University, University Park, to “delete any emails” to a colleague about their work on the Intergovernmental Panel on Climate Change (IPCC) report and to ask a third colleague to do the same. (Mann says he conveyed the message but deleted no messages himself.) Through a spokesperson, Jones declined an interview request. But in a statement he said that “no record” has been deleted amid a bombardment of “Freedom of Information requests.” CRU acknowledged in August that it deleted old data on digital tapes to make space for a move.

A second message relates to a chapter in the 2007 IPCC report that Jones edited. In 2004, he suggested that two recent papers on temperature trends didn't deserve to be published in a peer-reviewed journal. “I can't see either of these papers being in the next IPCC report,” he wrote Mann. “Kevin [Trenberth] and I will keep them out somehow - even if we have to redefine what the peer-review literature is.” But Trenberth, of the National Center for Atmospheric Research in Boulder, Colorado, says the papers were indeed considered. Thomas Karl, director of the National Climatic Data Center in Asheville, North Carolina, an official reviewer for the chapter, says the IPCC's peer-review procedures “were sacrosanct.” Both papers wound up being cited.

A third message is viewed by critics as an acknowledgement that global warming has ceased. “The fact is that we can't account for the lack of warming at the moment and it is a travesty that we can't,” wrote Trenberth in October. Contrarians have noted the lack of record new highs in global temperature since 1998 (Science, 2 October, p. 28). But Trenberth was actually bemoaning something else. “The observing system we have is inadequate for tracking energy flow through the climate system,” he observed, affecting the forecasting of year-to-year climate changes.

A fourth message, about assembling a diagram for a 1999 World Meteorological Organization report, has been misinterpreted, says Trenberth (see graphic). Scientists believe proxy data such as tree rings are valuable for reconstructing past climates, but certain tree-ring data became unreliable midway through the century. So scientists used proxy data for all but the final 40 years of the millennium before switching to instrumental data in 1961. “Reasonable people,” writes Stephen McIntyre, a retired industry consultant and prominent blogger, might conclude that the decision not to show the divergence of the two data sets was “simply a trick” to avoid giving fuel to skeptics.

Whatever their meaning, the messages have emboldened opponents. Some are calling for congressional hearings and, possibly, lawsuits. Penn State says that it is “looking into” the matter, and the University of East Anglia has announced an investigation into the theft and contents of the e-mails.

Scientists know they will need every bit of credibility to defend their findings from future attacks. But Curry suggests that it would be better to bring the skeptics into the fold than to keep them out. That way, she says, the critics will “quickly run out of steam and become irrelevant.”

3. Mantle Dynamics

# Sea-Floor Study Gives Plumes From the Deep Mantle a Boost

1. Richard A. Kerr

Earth's interior is like a pot of boiling water—very viscous, very slowly churning water. The great debate about how Earth's interior operates to shed internal heat and shape the surface began with disagreements over whether two layers in the pot—the upper and lower mantle—always remain separate, like oil and water.

That debate ended when seismologists imaged Earth's cold, brittle surface scum, the tectonic plates, and saw some of them diving all the way into the lower mantle. For the past decade, geoscientists have been focusing on the opposite question: whether plumes of hot, buoyant rock from the lower mantle are rising to the surface to fuel volcanic hot spots.

On page 1388, eight researchers weigh in with the most detailed seismic imaging yet beneath the world's most iconic hot spot, the island of Hawaii. “I do think it's a strong case” for a deep plume, says lead author Cecily Wolfe of the University of Hawaii, Manoa. Most in the often-contentious field of seismic imaging don't go quite so far. But the quality of the data and the apparent Hawaiian plume's resemblance to theorists' expectations have won some cautious support for the work. “They're doing their best. It looks promising,” says seismologist Barbara Romanowicz of the University of California, Berkeley. But the question of deep plumes “is still a little open,” she says.

To gather information about the mantle beneath Hawaii, researchers had to cast a wide net. Seismologists image mantle features by compiling records of seismic waves that have passed from an earthquake source through the feature of interest and on to a seismometer. Warmer-than-average rock slows a wave down; colder rock speeds it up. Seismic tomographers imaging the mantle combine wave travel times in much the way radiologists combine x-rays to create computed tomography scans of the human body. In one tomographic study, seismologists imaging the whole mantle reported seeing a couple of dozen deep plumes scattered around the globe (Science, 22 September 2006, p. 1726).

Geoscientists have long recognized that a single hot spot of persistent volcanic activity created the Hawaiian island chain as the Pacific plate moved over the hot spot. But tomographers trying to see how deep-seated the hot spot's source is face a special challenge: Because of the remoteness of large earthquakes around the Pacific's Ring of Fire, seismometers on the Hawaiian Islands receive few seismic waves that would have passed through any deep plume. So researchers in the Hawaiian Plume-Lithosphere Undersea Melt Experiment (PLUME) stepped away from their subject. Deploying 10 conventional seismometers on the Hawaiian Islands and ocean-bottom seismometers at another 73 sites in waters as much as 5500 meters deep, the team created a seismic “eye” centered on the island of Hawaii and 1000 kilometers across. The network could pick up both seismic shear waves (S waves) that had passed through the upper mantle beneath Hawaii and—vital to imaging any deep plume—SKS waves that had passed upward from Earth's core.

The PLUME images, the authors write, “suggest that the Hawaiian hotspot is the result of an upwelling, high-temperature plume from the lower mantle.” They show a hotter-than-average column of rock extending downward at least 1500 kilometers, topped by a “pancake” of hot rock where a plume would spread outward after hitting the cold, rigid tectonic plate. There's also a parabola-shaped feature of high-wave-speed material where computer models of plume behavior show a curtain of cold, descending rock. The plume's inferred temperature of 300°C above its surroundings at 900 kilometers' depth fits expectations. And, perhaps most telling, the apparent plume tilts downward toward the southeast—the way computer models show the churning mantle “blowing” a plume, like smoke rising from a chimney.

Reaction to the PLUME imaging is varied. “The tomography is pretty good,” says marine geophysicist and regional tomographer Donald Forsyth of Brown University. “It's not an absolute slam dunk, [but] I'm fairly convinced there's an anomaly [in seismic velocities] going down on the order of 1500 kilometers, though it's hard to say if it's continuous.”

Tomographer Jeannot Trampert of the University of Utrecht, the Netherlands, is more skeptical. The signature of a deep plume “is so weak it's hard to say” if it's real, he says. He suspects that the apparent plume may be just an echo of some nearby deep-mantle feature that lies just outside the tunnel view of PLUME.

The obvious candidate for a plume imposter is the edge of the nearby Pacific “large low-shear-wave-velocity province,” more familiarly called a superpile. To test that possibility, Wolfe and colleagues reconstructed their deep-plume signal under the assumption that it was created entirely by the superpile in the nearby lowermost mantle. If the superpile were entirely responsible, the resulting image should resemble the edge of a superpile. “To me, it doesn't look like the proposed superpile,” says Wolfe. “It looks like a doughnut.”

The superpile could still be the source, the researchers write, but a deep plume “remains a more straightforward solution.” Trampert remains unconvinced. “They do not address it in a satisfactory way,” he says. Everyone does agree that the PLUME observations remain to be fully mined for information on any plume, and that global data might be profitably merged with the regional data. But just where or when the slam-dunk evidence might emerge is anyone's guess.

4. Science Policy

# European Union Selects Unknown For Top Science Post

1. Martin Enserink

The two women tapped to head the European Union's efforts on science and climate over the next 5 years have a lot in common. Both were elected to parliament in their mid-20s—one in Denmark and the other in Ireland—but left politics later on. Both wrote for national newspapers and had stints in television broadcasting. Both are described as strong-willed and smart.

The difference is that one is virtually unknown to scientists and science policymakers, and the other is almost an international celebrity. Danish energy and climate minister Connie Hedegaard, nominated last week to become the first European commissioner for climate action, was picked as one of the world's 100 most influential people by Time magazine in April and this month will host the Copenhagen climate talks. In contrast, Máire Geoghegan-Quinn, the proposed new commissioner for research and innovation, has spent the past 9 years examining the E.U.'s finances as the Irish representative of the less-than-glamorous Court of Auditors in Luxembourg.

The nominations, announced on 27 November by European Commission President José Manuel Barroso, are the outcome of delicate backroom talks in which E.U. member states jockey for posts in Brussels. The entire slate of 27 proposed commissioners—one from each country—is subject to hearings and a vote by the European Parliament, scheduled for 26 January.

Before being nominated as the boss of E.U. science, Geoghegan-Quinn, 59, held various posts in the Irish government, including minister of state for European affairs between 1987 and 1991. She left politics in 1997 and joined the Court of Auditors 3 years later. Several European science leaders Science contacted said they could not comment on her nomination simply because they had never heard of her.

Frank Gannon, director-general of the Science Foundation Ireland and a former head of the European Molecular Biology Organization, does know, and admires, Geoghegan-Quinn—he once lived across the road from her in Galway. She's an “intelligent and straightforward person,” he says. “I think she will bring a lot of qualities to the job.” Gannon points out that Janez Potočnik, the Slovenian economist who currently holds the post, was new to science as well in 2004 and “was an excellent commissioner.”

Despite her lack of science policy experience, Geoghegan-Quinn may have a head start on important decisions regarding Framework Programme 8 (FP8), the next of Europe's gargantuan research funding programs, which is slated to start in 2014. The Court of Auditors, Geoghegan-Quinn's former outpost, said in a highly critical report in October that FP6, which ran from 2002 through 2006, failed to meet some of its key objectives; for instance, large international networks funded to foster innovation and collaboration often fell apart after funding dried up, the report said. (A spokesperson for the court says Geoghegan-Quinn had no personal involvement in the report.) Geoghegan-Quinn would also help decide whether to increase the budget of the European Research Council (ERC), the new funding agency through which some 15% of FP7's €50 billion is spent. The ERC rewards individual investigators, rather than networks, and uses excellence as a criterion instead of political and economic considerations.

Potočnik will stay on as a commissioner but move to the environment post, now arguably diluted by the creation of a separate post for climate. Hedegaard, 49, who will fill that post, has earned the respect of climate advocates for her efforts to make Denmark's economy greener and for her “great personal commitment” to the Copenhagen summit, says Joris den Blanken, Greenpeace's E.U. climate and energy policy director at its Brussels office.

5. ScienceInsider

# From the Science Policy Blog

President Barack Obama will attend the Copenhagen climate meeting and probably announce a U.S. commitment, contingent on congressional agreement, to a 17% cut in greenhouse gas emissions relative to 2005 by 2020.

The Presidential Commission for the Study of Bioethical Issues will be chaired by Amy Gutmann, a political scientist and the president of the University of Pennsylvania. Bioethicists expect the new commission to be more policy-oriented and pragmatic than its predecessor, which focused on philosophical and moral issues in biomedical research.

In one of the first signs that HIV prevention efforts have begun to make a dent on a global scale, new infections appear to have dropped by 17% over the past 8 years, according to a new report by the Joint United Nations Programme on HIV/AIDS and the World Health Organization.

India and the United States signed a deal on 24 November that includes a full suite of technical cooperation agreements, including shared work on food, wind power, extreme weather, and nuclear energy.

The neurologist and biomechanics expert in charge of the National Football League's committee on mild traumatic brain injury resigned last week. The league appears to be changing its attitude toward growing evidence that head injuries suffered on the field can lead to personality changes, dementia, and other problems later in life.

The world's largest atom smasher, the Large Hadron Collider, has set a new record for accelerating subatomic particles to high energy. On 30 November, protons whizzed around the 27-kilometer-long accelerator at an energy of 1.18 tera–electron volts—20% higher than the previous record.

For more science policy news, visit blogs.sciencemag.org/scienceinsider.

6. STEM Education

# Web Site Matches U.S. Scientists With Teachers Looking for Help

1. Jeffrey Mervis

Kate Lievens and Jack Hidary live in very different worlds. But the elementary school teacher and the neuroscientist-turned–serial entrepreneur have something in common: a new, interactive Web site designed to match scientists and classroom teachers from across the United States in projects aimed at improving learning.

The site (nationallabday.org) is one element in a White House initiative to encourage public-private partnerships in STEM (science, technology, engineering, and mathematics) education. Hidary has agreed to run the site, and Lievens is one of the first teachers to participate.

The initiative, dubbed Educate to Innovate, doesn't involve any new federal dollars. But it got a boost last week from President Barack Obama, who praised the private sector's promised investment of $260 million in a variety of projects, some new but many with a long track record, ranging from after-school robotics competitions to educational video games, and from science-themed television shows to better professional development for teachers. “The success we seek [in improving STEM education] is not going to be attained by government alone,” the president told scientists, educators, business leaders, and philanthropists at a 23 November rally in a federal office building next door to the White House. “[I] encourage folks to think of new and creative ways of engaging young people in science and engineering.”

Hidary says the idea for the Web site was hatched less than 3 months ago in a meeting with officials from the White House Office of Science and Technology Policy (OSTP). “A number of us were involved in TechNet Day,” says Hidary, referring to an effort to promote the role of information technology in society. His background seemed perfect for launching what Hidary describes as “eHarmony for science”: He began his career as a neuroimaging fellow at the National Institutes of Health in the early 1990s before making it big in financial information services. In 1995, he started his first Internet company, EarthWeb/Dice, and a decade later he sold a second company, Vista Research, to McGraw-Hill and turned to community philanthropy. “He is a passionate and extremely hard-working advocate,” notes Rick Weiss, senior policy analyst and director of strategic communications at OSTP. Within weeks, Hidary had won promises from the American Chemical Society, the National Science Teachers Association (NSTA), and other professional groups to enlist their members. He's raised more than $1 million from various organizations and has borrowed Jan Cuny, a program officer from the National Science Foundation's computing directorate, to manage the project's Washington, D.C., office.

National Lab Day (NLD) is a misnomer for the project, admits Hidary. “It's actually a year-long series of activities,” he explains. “We're not interested in another boutique program. We want something that will really galvanize people on a national scale.” Obama said he expects the partnerships formed through the Web site to “reach 10 million young people with hands-on learning” by next spring, when organizers hope the president will keynote a second event to celebrate its success.

Lievens, a veteran teacher at Earl Hanson Elementary School in Rock Island, Illinois, signed up the first day the site went live. A former physical education teacher who became a reading specialist, Lievens took an environmental sciences course this summer at a nearby college that rekindled her latent interest in science. She joined NSTA and began thinking about how to work the Mississippi River, only a few blocks from the school, into her science classes.

What about having her fifth- and sixth-graders test the quality of the water from that mighty river, which they drink every day? “The kids really get fired up when they can connect what they are learning to their everyday lives,” says Lievens. She also thought it would be fun. “Anytime you can make it a little messy, they're more likely to remember the lesson.”

Lievens imagined her students taking water samples and examining them under a microscope, visiting the local water-treatment plant, and maybe even learning a little hydrology and environmental chemistry. But with no formal training in science, she knew that she'd need assistance. Last week, it arrived in the form of an NSTA e-mail alert describing the new Web site. “I'm eager to see who responds,” she says.

Rebecca Smith knows more than a little about what it takes to pull off a successful STEM partnership program. A biochemist at the University of California, San Francisco, she's co-director of the Science and Health Education Partnership (SEP), which since 1987 has been matching area scientists with San Francisco schoolteachers (biochemistry.ucsf.edu/programs/sep). This year they expect to deploy 80 such teams, representing more than 200 scientists.

Smith, whose program can tap an embarrassment of riches from academia and industry, applauds NLD for trying to reach areas that lack such a large talent pool. She also offers Hidary some pointers: Take the time to make good matches, don't expect too much, monitor the partnerships closely, and stick with it. But the most important ingredient, she says, may be mutual respect and trust.

Hidary acknowledges the good work of SEP and many other programs and hopes that NLD can build on their successes. “We already know that project-based, hands-on learning works,” he says. “The challenge now is to scale up. And the only way to do that is through massive partnerships.”

7. India

# Stem Cell Center to Rise in Biology Hub

1. N. N. Sachitanand*

BANGALORE—India's fledgling stem cell R&D effort is set to receive a major boost. Construction began here last month on the Institute for Stem Cell Biology and Regenerative Medicine (inStem), a $50 million center to be built alongside an existing biology powerhouse—the National Centre for Biological Sciences (NCBS)—and a planned $12 million technology center that will seek to commercialize the biocluster's findings. “We hope that this intertwined environment can be transformative,” says NCBS Director K. VijayRaghavan, who will serve as inStem's first director.

Initial plans include launching an international collaboration using stem cells to probe the molecular mechanisms of cardiovascular diseases. inStem will also link up with the Centre for Stem Cell Research at Christian Medical College in Vellore, which specializes in translational and clinical research. “inStem should help both in human resources and capacity building on one hand and accelerate progress in therapeutic possibilities on the other,” says D. Balasubramaniam, who led a government task force that recommended setting up inStem.

inStem's team includes two deans—Jyotsna Dhawan of NCBS and S. Ramaswamy of the University of Iowa, Iowa City, who is returning to his home country after 18 years in the United States. “Bioscience research in India is in an exponential growth phase,” says Ramaswamy. “The excitement of being able to … shape this growth was irresistible.” inStem expects to ramp up to 40 researchers after its new facility, shared with NCBS, opens in June 2011.

* N. N. Sachitanand is a writer in Bangalore.

8. ScienceNOW.org

# From Science's Online Daily News Site

**Coral Reefs Act Like Sunscreen** Living on a coral reef is a bit like living in a tanning bed. As the sun's rays shine through the water and reflect off the reef, they strike corals, their symbiotic photosynthetic algae, and other inhabitants from above and below. So what keeps these creatures from being fried? A new study suggests that coral acts as a sunscreen, absorbing UV light and limiting the harm it inflicts on the reef's denizens.

**Milky Way Grew by Swallowing Other Galaxies** The motto “E Pluribus Unum” (“out of many, one”) could be applied to the Milky Way. Astronomers have obtained new evidence that our home galaxy contains pieces of many former galaxies. The findings strengthen the idea that large galaxies don't emerge whole from single, gigantic clouds of dust and gas. Rather, they grow by swallowing their neighbors.

**Americans' Eating Habits Grow More Wasteful** After their biggest meal of the year, Americans might reflect on the fate of those moldering Thanksgiving leftovers. Nearly 40% of the food supply in the United States goes to waste, according to a new study, and the problem has been getting worse.

**Why Suffocating Is Scary** Breathe too much carbon dioxide (CO2), and you'll suffocate. That's why people begin to panic if they breathe air enriched with the gas. One reason this happens, according to a new study in mice, is that breathing CO2 triggers chemical sensors in a crucial part of the brain's fear circuitry. The findings could point the way to new treatments for anxiety disorders.

**Titan Lakes Migrate South for the Winter** Imagine if all of the water in the Great Lakes evaporated, moved to the Southern Hemisphere, and rained down to form new lakes in Argentina. Then thousands of years later, the process repeated and the water returned north. That's what researchers say could be happening on Titan, Saturn's largest moon. Understanding the process could shed light on how long-term climate cycles operate on other worlds.

9. Origins

# On the Origin of Tomorrow

1. Carl Zimmer*

What is the future of evolution? In the final essay in Science's series in honor of the Year of Darwin, Carl Zimmer explores the subject of human-driven evolution.

In the final words of the final sentence of On the Origin of Species, Charles Darwin gave a nod to the future. “There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.”

Darwin recognized that as long as the ingredients for the evolutionary process still exist, life has the potential to change. He didn't believe it was possible to forecast evolution's course, but he did expect humans would have a big effect. In his day, they had already demonstrated their power with the triumphs of domestication, such as breeding dogs from wolves. Darwin recognized that we humans can also wipe out entire species. He knew the dodo's fate, and in 1874 he signed a petition to save the last surviving Aldabra giant tortoises on the Seychelles Islands in the Indian Ocean.

Darwin also expected that our own species would change. As Western powers colonized other parts of the world, he predicted that some populations would become extinct. But Darwin also felt a cautious optimism. “Looking to future generations,” he wrote in The Descent of Man, “there is no cause to fear that the social instincts will grow weaker, and we may expect that virtuous habits will grow stronger.” And unlike other species, humans could bring about this change consciously, through cultural evolution.

As the world celebrates the 150th anniversary of the publication of On the Origin of Species this year, scientists continue to think deeply about what comes next. But the complexity of evolution still makes forecasting hard. “As Yogi Berra once said, ‘Prediction is very difficult. Especially about the future,’” says Stephen Stearns, an evolutionary biologist at Yale University.

Yet evolutionary biologists also feel a new sense of urgency about understanding what lies ahead. Since Darwin's day, humans have gained an unprecedented influence over our own evolution. At the same time, our actions, be it causing climate change, modifying the genomes of other organisms, or introducing invasive species, are creating new sources of natural selection on the flora and fauna around us. “The decisions we and our children make are going to have much more influence over the shape of evolution in the foreseeable future than physical events,” says Andrew Knoll, a paleontologist at Harvard University.

## Shaping our genome

If there's one thing that's certain, it's that humans, like other living things, will continue to evolve. “Evolution is unstoppable,” says Lawrence Moran of the University of Toronto in Canada. But that doesn't mean that humans are marching on a path toward becoming giant-brained, telepathic creatures out of Star Trek. All it means is that the human genome will continue to change from generation to generation.

A background mutation rate guarantees this process. Each baby's DNA carries about 130 new mutations. Most of them have no effect on our well-being. People can pass these neutral mutations down to their offspring without harm, and over time, a small fraction of them will end up spreading across entire populations, or even the entire species, thanks to random luck.
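The “random luck” here is genetic drift, which a toy simulation can make concrete. The sketch below is illustrative only (the `neutral_fate` helper, the population size, and the trial count are assumptions, not anything from the article): it shows that a single new neutral mutation is usually lost, but occasionally drifts all the way to fixation, at a rate near the textbook expectation of 1/(2N).

```python
import random

# Toy Wright-Fisher model: track one new neutral mutation among the
# 2N gene copies of a diploid population until it is lost or fixed.
def neutral_fate(pop_size=50, seed=None):
    """Return True if a single new neutral mutation eventually fixes."""
    rng = random.Random(seed)
    total = 2 * pop_size   # gene copies in the population
    count = 1              # one new mutant copy to start
    while 0 < count < total:
        freq = count / total
        # Each copy in the next generation is drawn independently
        # from the current allele frequency (pure drift, no selection).
        count = sum(rng.random() < freq for _ in range(total))
    return count == total

trials = 2000
fixed = sum(neutral_fate(pop_size=50, seed=i) for i in range(trials))
# Theory: a neutral mutation fixes with probability 1/(2N) = 1/100.
print(fixed / trials)  # roughly 0.01
```

Most runs end quickly in loss; the rare fixations are the “small fraction” that spread through the whole population by chance.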

Natural selection can cause mutations that help individuals survive and reproduce to spread much faster than neutral ones. Exactly which mutations natural selection will favor, however, depends on the environment in which we live. And over the past 10,000 years, we humans have dramatically changed that environment. We have fostered new diseases to which humans have adapted, for example. But in other cases, civilization has shielded us from the environment, weakening the power of natural selection.

One of the best-known examples of human-driven evolution involves malaria. Early farmers cleared forests and created fields where malaria-carrying mosquitoes could lay eggs in pools of water. As malaria spread, natural selection favored humans with defenses against the disease. One such defense comes from a variant of a hemoglobin gene that makes it hard for the parasites to reproduce in blood cells. One copy of the variant reduces your chance of contracting malaria; two copies cause sickle cell anemia.

On the other hand, civilization has also blunted some of natural selection's power over humans, particularly in the 150 years since Darwin published On the Origin of Species. Back then, for example, some children had the misfortune to be born with defective copies of a gene for an enzyme that breaks down an amino acid, phenylalanine, in food. This disorder, known as phenylketonuria, generally led to severe brain damage. Few people with severe phenylketonuria were able to pass on their genes. But today, now that scientists know what causes the disease, people with phenylketonuria can enjoy fairly normal lives simply by being careful about the foods they eat, and they pass their genes on to their children. Other medical advances, from eyeglasses to antibiotics, may also allow some potentially detrimental genes to become more common than in the past.

Yet medical advances and other changes to human life have not stopped natural selection, nor will they in the future. HIV, for example, first evolved into a human pathogen in the early 1900s and today takes a devastating toll in many parts of the world. Genes that provide some resistance to the virus may be favored by natural selection in places where HIV is particularly common.

Even in affluent parts of the world like the United States, natural selection has not stopped. Subtle differences in people's health influence how many children they have and thus can gradually change entire populations.

In a report published online 26 October in the Proceedings of the National Academy of Sciences, Stearns and his colleagues documented natural selection in 2238 U.S. women. The women were subjects in the Framingham Heart Study, which has tracked the health of thousands of people in Framingham, Massachusetts, since 1948. The scientists searched for traits that were correlated with having a higher number of children. Then they checked to see whether those traits tended to be passed down from mother to child—in other words, whether they were genetically based.

The scientists discovered that a handful of traits are indeed being favored by natural selection. Women with a genetic tendency for low cholesterol, for example, had more children on average than women with high cholesterol. A greater body weight was also linked with greater reproductive success, as was shorter height, lower blood pressure, an older age at menopause, and having one's first child at an earlier age.

Stearns and his colleagues now know which traits are selected in the women of Framingham, but they have yet to determine exactly what advantage each trait confers—a situation that evolutionary biologists often face when documenting natural selection. Nevertheless, based on the strength of the natural selection they have measured, the scientists predict that after 10 generations, the women of Framingham will, on average, have their first child a few months earlier in life than women today, have 3.6% lower cholesterol, and be 1.3% shorter.
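Projections like this typically rest on the breeder's equation, R = h²S: the per-generation response to selection equals the trait's heritability times the selection differential. A minimal sketch of how such a response compounds over 10 generations, using hypothetical numbers (the starting cholesterol value, heritability, and selection differential below are illustrative assumptions, not figures from the Framingham analysis):

```python
# Breeder's equation: R = h^2 * S
# (response per generation = heritability x selection differential)

def project_trait(value, h2, S, generations):
    """Project a trait mean forward, shifting it by h2*S each generation."""
    for _ in range(generations):
        value += h2 * S
    return value

# Hypothetical example: mean cholesterol 200 mg/dL, heritability 0.3,
# selection differential of -2.4 mg/dL per generation.
final = project_trait(200.0, h2=0.3, S=-2.4, generations=10)
print(round(final, 1))                        # 192.8
print(round(100 * (200 - final) / 200, 1))    # 3.6 (% decline)
```

With these made-up inputs the cumulative change after 10 generations works out to a 3.6% decline, which is the same order of effect the Framingham projection describes.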

Of course, even this prediction is subject to change. Women with higher cholesterol may eventually be able to enjoy higher fertility rates thanks to cholesterol-lowering drugs, says Stearns, wiping out the differences in reproductive rates. “Selection is always operating,” says Stearns, “but the traits on which it operates shift with ecology and culture.”

Along with natural selection, it's also conceivable that one day genetic engineering will change human DNA directly. In September, scientists at the Oregon National Primate Research Center reported that they could replace the DNA in the mitochondria of a monkey embryo with mitochondrial DNA from another monkey. In July, scientists at the Center for Regenerative Medicine in Barcelona, Spain, reported that they had repaired human stem cells carrying genes for an inherited blood disorder. Both studies hint that eventually scientists will be able to alter the genes of future generations.

But even if a child were born with engineered genes in our lifetime, that milestone wouldn't mean much for the evolution of our species. Those engineered genes would be swamped by the billions of mutations that emerge naturally in the babies born every year. Yet although engineered genes aren't likely to provide enough reproductive advantage to spread on their own, they may still become common. John Hawks, an anthropologist at the University of Wisconsin, Madison, speculates that if genetic engineering becomes cheap enough and provides an attractive trait—such as staying thin—economics could spread a gene even if natural selection can't. “I think people would buy it,” says Hawks.

## Human-powered evolution

Genetically engineered humans may still be science fiction, but genetically engineered animals, plants, and microbes are all here already. In 2008, farmers planted 125 million hectares of genetically modified crops. Many of these crops carry genes from other species. Corn, cotton, and other plants have been engineered to carry a bacterial gene, for example, that lets them make a protein that kills insects. With big countries such as China and India dramatically ramping up their use of genetically modified crops, this evolutionary trend will likely continue.

In the near future, scientists may start to engineer life in a more profound way, manufacturing new species from scratch. The idea would be to design a microbe on a computer, combining genes with different functions into genetic networks. Scientists could then synthesize the new genome from raw DNA and insert it into an empty microbial cell that would come to life. J. Craig Venter and his colleagues at the J. Craig Venter Institute in Rockville, Maryland, have taken a series of key steps toward that goal, such as performing a “genome transplant” on a microbe.

If Venter succeeds, his artificial microbe would be a triumph of human ingenuity, but it would probably be a minor blip on the biosphere's radar. Synthetic biologists want to make microbes to serve our own ends, such as making fuel and medicines. Burdened with genes for these functions, the microbes will likely be ill equipped to compete in the wild against species that have adapted for millions of years. For the foreseeable future, synthetic microbes will probably survive only in the refuge of a laboratory or a fermentation tank. “I will venture a prediction,” says Adam Wilkins, a biologist at the University of Cambridge in the United Kingdom. “This kind of biotech engineering might succeed in creating some rather weird and wonderful organisms. But the net effect on evolution will be nil—that is, outside the laboratory.”

But humans, Wilkins is quick to point out, don't need synthetic biology to have a big effect on the evolution of life. Chainsaws, fishing lines, and smokestacks do just fine. Many fisheries, for example, have established rules for keeping fish only above a certain size. As a result, natural selection has favored fish that become sexually mature at smaller sizes. On land, hunters have had a similar effect by going after big game. Bighorn sheep, for example, now grow horns 25% smaller than they did 30 years ago.

Humans have also triggered bursts of evolutionary change by introducing species to new habitats. In Australia, for example, cane toads brought in from South America in 1935 have become a continent-wide pest. They're devouring some small native species, and their poisonous skin is killing off some of their predators. Scientists have discovered that the toads are evolving in their new home: Toads at the leading edge of the invasion are growing longer legs and moving faster than their ancestors, speeding up the invasion. The native species are responding as well. Australian snakes are evolving resistance to the cane toad poison.

Stephen Palumbi, a biologist at Stanford University in Palo Alto, California, expects that human-induced natural selection will become much stronger in the future. “In the last century, we were having a big impact, but it wasn't everywhere,” says Palumbi. “But global climate change is an ‘everywhere’ impact, and that's different.”

Plants and animals are already responding to the warming climate by shifting their ranges to find the most comfortable temperatures. But moving won't be a solution for many species, which will face barriers such as deserts or cities. They will have to adapt to survive—a process scientists have already detected in some species, such as red squirrels in Canada, which have evolved to breed earlier in the spring.

Extra carbon dioxide is creating a second worldwide evolutionary pressure as it dissolves into the ocean. There it is turning into carbonic acid and lowering the pH. Continued acidification will make it more difficult for corals and other marine animals to build skeletons and shells from calcium carbonate. Organisms will need to adapt to survive in these new conditions.
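The carbonate chemistry behind this can be summarized as follows (a standard textbook summary, not spelled out in the article):

```latex
% CO2 dissolving in seawater forms carbonic acid, which dissociates:
\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}
% The released protons consume carbonate ions:
\mathrm{H^+ + CO_3^{2-} \rightleftharpoons HCO_3^-}
% leaving less carbonate available for shell-building:
\mathrm{Ca^{2+} + CO_3^{2-} \rightleftharpoons CaCO_3}
```

The net effect is that as dissolved CO2 rises, the carbonate ions that corals and shell-building animals need become scarcer.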

“We know that things can evolve quickly, but can they evolve fast enough?” asks Palumbi. He and many other scientists suspect that for many species the answer is no. Unless we can ease up on the biosphere, they warn that the biggest feature of evolution in the near future will be extinctions.

Knoll points out some disturbing parallels between today's crisis and a pulse of mass extinctions that occurred 252 million years ago, wiping out an estimated 96% of species in the oceans and 70% of species on land. A rapid increase in carbon dioxide in the atmosphere led, among other things, to ocean acidification. For animals that depended on calcium carbonate, “you had about a 90% chance of going extinct,” says Knoll. “Corals, sponges, brachiopods, they all kicked the can.”

Knoll doesn't expect human-driven mass extinctions to be as bad as that ancient one. But they could still be unimaginably huge. “If we lose half the species on the planet, our grandchildren are not going to see them restored,” says Knoll. “It will take millions of years.”

A drop in biodiversity may bring with it a collapse of many ecosystems. Coupled with a rapid increase in global temperatures, ocean acidification, and other changes, we may be pushing the environment into a state we've never experienced as a civilization. Such a stress could put our species under intense natural selection as well.

## Taking the long view

One way or another, life will survive this current crisis. But where is life headed in the very distant future? To find out, planetary scientist King-Fai Li of the California Institute of Technology in Pasadena and his colleagues built a model of Earth and the sun and watched it evolve for billions of years. In their simulation, the sun gets brighter, as it has since it first formed. The extra energy speeds up the rate at which carbon dioxide is drawn out of Earth's atmosphere, cooling it off. But after about 2 billion years, this cooling mechanism breaks down, and Earth heats up, ending up like its lifeless neighbor, Venus.

But Li's model does not include a clever species like our own, which can use its brain to influence the planet. Would it be possible to extend the life span of Earth's biosphere? “I am not going to rule out any talented civilizations that will be able to do that,” says Li.

• * Carl Zimmer is the author of The Tangled Bank: An Introduction to Evolution (Roberts and Co., 2009).

10. Neurodegeneration

# Could They All Be Prion Diseases?

1. Greg Miller

Recent studies have renewed interest in the idea that many neurodegenerative diseases may involve prionlike mechanisms.

The idea that proteins can be agents of disease was once heretical, but two Nobel Prizes later all but the most die-hard skeptics have been convinced that misfolded proteins called prions are the cause of several neurodegenerative disorders in humans and other animals. In disorders such as scrapie, mad cow disease, and Creutzfeldt-Jakob disease, misfolded molecules of a naturally occurring protein act like bad role models, encouraging normally folded proteins to misfold and clump together. As aggregates of misfolded proteins spread through the brain, nerve cells stop working properly and eventually die.

A recent flurry of papers has revived interest in the idea that such mechanisms may play a role in an even wider range of neurodegenerative disorders, including two of the most dreaded scourges of old age: Alzheimer's and Parkinson's diseases. Such diseases almost certainly aren't contagious like true prion diseases are, at least in ordinary circumstances, but they may propagate through the nervous system in much the same way. The idea is actually decades old and seems to have originated with Daniel Carleton Gajdusek, who won a share of the 1976 Nobel Prize in physiology or medicine for his work on kuru, a prion disease he claimed was transmitted by ritualistic cannibalism among the Fore people of New Guinea. But until very recently, there was little experimental evidence for prionlike mechanisms in other neurodegenerative disorders, says Lary Walker, a neuroscientist at Emory University in Atlanta. “It's an old idea with new legs,” Walker said in his introduction to a recent online seminar on this topic hosted by the Alzheimer Research Forum (Alzforum).

Evidence from recent animal studies suggests that many of the misfolded proteins thought to play a central role in a wide range of neurodegenerative disorders can, like prions, “seed” the misfolding and aggregation of their normally folded kin. In some cases, these pathological protein clusters appear to propagate from cell to cell. Such a mechanism could help explain several puzzles—such as why some neurodegenerative disorders tend to spread from one part of the nervous system to another in a characteristic pattern, and why some researchers have found pathological protein deposits in fetal stem cells transplanted into the brains of Parkinson's patients (Science, 11 April 2008, p. 167).

“Twenty, 30 years ago, when people were proposing these links, we didn't know that networks degenerate [in characteristic patterns], and we didn't have fetal transplants,” says Marc Diamond, a neurologist at Washington University School of Medicine in St. Louis, Missouri. The prion concept helps integrate much of what's known about neurodegenerative diseases, Diamond says. “The reason it's catching on is that it makes a lot of sense.” Like a growing number of researchers, Diamond thinks the prion concept may not only help researchers gain a better understanding of neurodegenerative diseases but also point to treatment strategies they might not have considered otherwise.

## Killer proteins

The high prevalence of kuru in the Fore people is one of the great medical mystery stories of all time. The disease spread in a manner that suggested infection, yet it caused no fever or other inflammatory response. Gajdusek won the Nobel for his work suggesting that kuru was transmitted by cannibalism practiced as part of funeral rites among the Fore. But the infectious agent remained a puzzle. In laboratory experiments with infected brain tissue, the infectious agent survived heat, chemicals, and ultraviolet light that destroy the infectivity of viruses and bacteria.

In the early 1980s, Stanley Prusiner of the University of California (UC), San Francisco, proposed that proteins could be the infectious agent. It was a radical notion: All infectious agents known at the time contained DNA or RNA, the genetic blueprints for replication. But Prusiner proposed that infectious proteins, or prions, spread disease not by replicating themselves but by encouraging other proteins to undergo a conformational change. He won the 1997 Nobel Prize (some thought prematurely) for work supporting the prion hypothesis (Science, 10 October 1997, p. 214).

Prusiner's theory explained the kuru puzzle, but both Gajdusek and Prusiner were interested in applying the idea to a variety of other disorders. After all, autopsy studies commonly found suspicious clumps of protein in the brains of people who died of Alzheimer's, Parkinson's, and other neurodegenerative diseases. As early as the 1960s, Gajdusek tried injecting extracts of brain tissue from Alzheimer's patients into monkeys and chimps. But these efforts, and later attempts by other researchers, yielded inconsistent results.

Disease can develop decades after exposure to prions in humans, and researchers had to wait years to see whether experiments in primates had any effect, says Walker. Enter the transgenic mouse: In a study published in 2000 in The Journal of Neuroscience, Walker and colleagues injected extracts from the brains of Alzheimer's patients into genetically engineered mice susceptible to the disease (normal mice are not susceptible). They injected one side of the brain in each animal. Within a few months, the mice developed widespread plaques made up of β-amyloid peptide, a hallmark of Alzheimer's disease, on the injected side of the brain. That indicated that something in the brain extracts could seed plaque formation, although whether the seed was β-amyloid peptide itself remained unclear.

More recent work led by Walker and Mathias Jucker at the University of Tübingen in Germany bolsters the case that β-amyloid is the culprit. In one experiment, the researchers found that brain extracts treated with antibodies to remove β-amyloid did not seed aggregation of β-amyloid when injected into mice (Science, 22 September 2006, p. 1781). And in the 4 August issue of the Proceedings of the National Academy of Sciences (PNAS), they reported that stainless steel wires coated with brain extract and then heated to kill microbes still caused β-amyloid deposits to form when implanted into the brains of mice. After 6 months, deposits had spread to neighboring brain regions. To Walker and others, such findings suggest that β-amyloid can induce deposits to form and spread through the brain—much as prions do. Walker says his group is working to create synthetic β-amyloid for a more definitive experiment: If a synthetic peptide can seed plaques, that should rule out the possibility that a microbe or some other factor in the brain extracts is to blame.

Other researchers have been finding similar hints of prionlike behavior in other proteins associated with neurodegenerative disorders. Diamond and colleagues have found that aggregates of misfolded tau, a protein that forms pathological tangles in the brains of people with Alzheimer's disease and frontotemporal dementia, can be taken up by cultured mouse cells. Then, once inside the cells, the misfolded tau appears to encourage normally folded tau to misfold and aggregate, they reported 8 May in The Journal of Biological Chemistry. In July, European researchers reported similar findings in vivo: Injecting brain extracts containing misfolded tau into the brains of mice triggered tau misfolding and aggregation that spread from the injection site to nearby brain regions, they reported in Nature Cell Biology.

Another suspect protein, α-synuclein, the main component of the “Lewy bodies” found in the brains of people with Parkinson's disease and certain types of dementia, also appears to propagate from cell to cell. In the 4 August issue of PNAS, researchers led by Eliezer Masliah of UC San Diego and Seung-Jae Lee of Konkuk University in Seoul reported that rogue aggregates of α-synuclein can pass from cell to cell and spur the formation of Lewy body–like aggregates in cultured human neurons. Experiments with cultured rat and mouse cells, reported in the same paper, suggested that α-synuclein triggers cell death in neurons and neural stem cells. “Cells that take it up form new aggregates, and they get sick and eventually die,” Masliah says.

If α-synuclein spreads from neuron to neuron in the intact human brain, that might explain findings from two research groups that reported last year that fetal cells transplanted into the brains of Parkinson's patients contained deposits of α-synuclein—something that's unheard of in such young cells, the oldest of which had survived for 16 years before the patient died. (A third team found no pathology in transplanted cells.) The findings surprised many researchers who had assumed that deposits build up inside cells over many decades and don't jump from cell to cell. Cell-to-cell transmission of α-synuclein wouldn't necessarily doom stem cell therapies for Parkinson's disease, but it may present yet another obstacle, Masliah says. “We'd like to engineer those fetal cells to be resistant to the aggregates,” he says. One possibility, he suggests, would be to engineer them to overexpress enzymes that can break down aggregates.

## Is it contagious?

The list goes on. Misfolded huntingtin protein, the culprit in Huntington's disease, can find its way from the extracellular fluid to the inside of cultured cells and trigger aggregation, according to a report by Stanford University cell biologist Ron Kopito and colleagues in the February issue of Nature Cell Biology. And at the Alzforum seminar, Neil Cashman of the University of British Columbia, Vancouver, in Canada described unpublished findings from his group that hint at prionlike behavior in SOD1, a protein thought to be central to neurodegeneration in amyotrophic lateral sclerosis. “We're getting a lot of hints from a lot of diseases,” Kopito says. “Together, it adds up to an emerging picture that deserves some pretty close attention.”

These recent studies “expand the prion concept to other proteins … [and] show that under certain conditions the process of protein aggregation can be transmissible” from cell to cell, says Claudio Soto, a molecular biologist who studies neurodegenerative disease at the University of Texas Medical School at Houston. “What remains to be seen is whether or not this occurs in real life,” Soto says.

So far there's virtually no evidence that proteins other than prions can transmit disease from one individual to another, notes Adriano Aguzzi, a prion researcher at the University Hospital of Zurich in Switzerland. One exception, Aguzzi says, may be amyloid A amyloidosis, a protein misfolding disorder that affects the spleen, liver, and other organs. Japanese researchers reported in PNAS in 2008 that misfolded amyloid A can be transmitted from one captive cheetah to another via feces. (The disease is a major cause of illness and death in these endangered cats.) A 2007 paper in PNAS suggested that foie gras prepared from duck or goose liver can transmit amyloidosis when fed to mice.

Most researchers say it's unlikely that diseases like Alzheimer's and Parkinson's are contagious in the usual sense of the word. “I think what's special about prion diseases is that prions are indestructible,” says Walker. “There's practically nothing you can do to get rid of them within the realm of what we consider normal sterilization.” Most protein aggregates are more fragile, which may limit their ability to jump from one person (or animal) to another. All the same, Walker says the issue merits closer study. His experiments with the stainless steel wires, he notes, suggest at least a theoretical possibility that surgical instruments could transmit the disease.

## Window of opportunity

Even if most neurodegenerative diseases don't spread from individual to individual like true prion diseases do, the possibility that they may spread from cell to cell in an analogous way opens up new options for treating them, say some researchers. If aggregates of tau jump from cell to cell to spread disease instead of building up slowly inside cells, for example, that presents an opportunity to cut them off with antibodies or other molecules that can't get inside cells, says Diamond. His group has been designing antibodies that specifically target misfolded forms of tau. Cashman's group has been taking a similar approach for SOD1. Both presented promising preliminary results from animal experiments at the Alzforum seminar.

Another approach is to use small molecules designed to latch on to specific parts of a protein and prevent it from misfolding, says Jeffrey Kelly, a biochemist at The Scripps Research Institute in San Diego, California. In July, FoldRx Pharmaceuticals, a company Kelly cofounded, announced encouraging results from a phase II/III clinical trial of a compound that prevents protein misfolding and aggregation in people with a rare but fatal disease called transthyretin amyloid polyneuropathy. The disease affects the peripheral nerves, causing loss of function in the hands and feet, before spreading to the autonomic nervous system, which regulates digestion and other essential functions. Untreated, the disease causes drastic weight loss, but patients who took the drug for 18 months reversed course and started gaining weight, Kelly says. That suggests that the drug slows the disease's impact on the autonomic nervous system, Kelly says. “We're pretty excited about this, and I think it will energize efforts on other amyloid diseases that focus on preventing this process.”

11. Neurodegeneration

# Acting Like a Prion Isn't Always Bad

1. Greg Miller

If misfolded proteins are so dangerous, why hasn't evolution selected against them? One possibility is that it hasn't had to: Many of the diseases caused by protein misfolding strike late in life, after the reproductive years are over. It's also possible that this type of protein folding isn't always bad.

In mad cow disease and related disorders, misfolded proteins called prions cause normal proteins to misfold and clump together, spreading havoc through the nervous system. Virtually any protein can misbehave like this when the conditions are right, and some researchers now suspect that prionlike mechanisms of protein misfolding and propagation may underlie a wide range of neurodegenerative disorders (see main text). But if this type of behavior in proteins is so dangerous, why hasn't evolution selected against it? One possibility is that it hasn't had to: Many of these diseases strike late in life, after the reproductive years are over.

It's also possible that this type of protein folding isn't always bad, says Adriano Aguzzi, a prion researcher at the University Hospital of Zurich in Switzerland. “Having a protein that can exist in an on-and-off state where the on state is infectious is a wonderful way of transmitting information,” Aguzzi says. “Nature would be very stupid if it didn't utilize this system in order to solve specific problems during evolution.”

Indeed, recent studies suggest that proteins that behave like prions play important roles in a wide variety of normal biological functions in organisms ranging from bacteria to humans. In some bacteria, prionlike proteins create a fibrous matrix that helps cells adhere to surfaces and stick together to form colonies. In some fungi that live in wet environments, such proteins form a film that reduces the surface tension at the water's surface, enabling spores or fruiting bodies to form. In insects they help strengthen eggshells and may lend strength to spiders' silk.

In 2006, researchers led by biochemist Jeffrey Kelly of The Scripps Research Institute in San Diego, California, reported in PLoS Biology that in bovine cells a protein called PMel17 forms self-propagating aggregates that play a role in the synthesis of melanin, a pigment in the skin and eyes that protects against ultraviolet rays. More recently, researchers led by structural biologist Roland Riek of the Swiss Federal Institute of Technology in Zurich reported that several peptide and protein hormones pack into prionlike aggregates inside secretory granules in endocrine cells from several species, including humans (Science, 17 July, p. 328). These aggregates have several advantages as a storage system: They are stable, densely packed, and exclude other proteins, helping to keep the granules' contents pure.

Even in the nervous system, prionlike proteins may have beneficial roles. In 2003, neuroscientist Eric Kandel of Columbia University and colleagues reported in Cell that a protein with prionlike properties plays a role in long-term memory in the sea slug Aplysia californica. At October's meeting of the Society for Neuroscience in Chicago, Illinois, Kandel described new work extending these findings to mice and bolstering his group's argument that self-propagating aggregates of these proteins may be involved in “tagging” specific synapses for strengthening when a long-term memory is created.

“Prions and prionlike phenomena are much more common than we realized,” Aguzzi says. “These things that are cropping up now are the tip of the iceberg.”

12. Fishery Management

# Can Science Keep Alaska's Bering Sea Pollock Fishery Healthy?

1. Virginia Morell

The pollock fishery in the chill waters of the eastern Bering Sea is said to be the best managed in the world. But a surprising decline in numbers has scientists worried.

Every January along the continental shelf in the eastern Bering Sea, a great mass of spawning, olive-green fish surges through the nutrient-rich waters. These are walleye pollock (Theragra chalcogramma), social fish that tip the scales at 700 grams when mature. The pollock spawn in waters north of the Aleutian Islands, where the bounty serves as food for marine mammals, seabirds, fish—and humans: The eastern Bering Sea pollock fishery is the largest and most lucrative in North America. Each year it brings in $1 billion and supplies millions of meals in the form of fish sticks, fast-food fish fillets, imitation crabmeat, and roe.

The fishery is remarkable not only for its size but also, to date, for its sustainability: It's certified as sustainable by the London-based Marine Stewardship Council, and catch limits are recommended by scientists who judge the state of the fishery with surveys and state-of-the-art models; there's even a major ecosystem study funded by the U.S. National Science Foundation (NSF).

This year's data are sparking concern, however. Previous predictions of a sizable uptick in pollock numbers weren't borne out by recent surveys. Instead, there are dramatically fewer pollock than scientists had estimated just a year ago. The stock is at its lowest level since 1980, and Greenpeace has put the fishery on its red list of unsustainable harvests. Last week in Seattle, Washington, responding to the lower numbers, scientists advising the North Pacific Fishery Management Council (NPFMC) recommended that the catch, rather than being raised from this year's low as expected, stay low again in 2010. Given the uncertainty, some argue that the harvest should be cut even further. The council itself will vote on the recommendation next week. “This time last year we said the stock was going up,” says marine biologist Lowell Fritz of the Alaska Fisheries Science Center (AFSC) in Seattle, who argued for a lower quota. “But it didn't. And that is cause for concern.”

Fritz and others say that the revised estimates and dwindling numbers raise questions about how well-managed the fishery really is, and whether the researchers' reams of data and calculations can produce what all parties want: a long-term sustainable fishery in a healthy ecosystem. “It's tricky,” says Douglas DeMaster, science and research director of AFSC, who's based in Juneau. “How far can we knock down a single species before impacting the ecosystem? We don't know yet.” But even though the fishery is at a low point, “it is not overfished,” he says. “And we're working hard to make sure it never is.”

## The big haul

Pollock are found across the North Pacific from Puget Sound to the Sea of Japan. But they are especially abundant in the waters of the Bering Sea's continental shelf. Once regarded as commercially worthless, pollock gained value after Japanese trawlers developed a process for reducing its white meat into a protein paste called surimi. And after the North Atlantic cod fishery collapsed in the 1990s, pollock fishing surged. The Bering Sea pollock fishery is now the world's largest single-species fishery, averaging more than 1 million metric tons annually. But pollock are not immune to overfishing: Other formerly abundant pollock fisheries in the region, including an exceptionally rich one called the Donut Hole (see graph, p. 1341), were heavily fished in the 1980s and 1990s and have never recovered.

In the eastern Bering Sea, fishing boats hauled up an average of 1.33 million metric tons of pollock each year between 2001 and 2007. In 2007, however, surveys showed that pollock numbers were down. So scientists recommended—and fishers abided by—a reduced quota of 1 million metric tons in 2008. NPFMC reduced the 2009 catch by another 18% to 815,000 metric tons. But the scientists' models predicted better news ahead, and last fall they estimated that in 2010 the stock could sustain a catch close to previous levels.
To assess the health of the pollock population, AFSC researchers gather data about the sex, size, weight, and condition of the fish from scientist-observers aboard the fishing vessels and take abundance data from annual bottom-trawl and acoustic midwater-trawl surveys. The scientists track the fish in age classes. Pollock reproduce prolifically, live nearly 11 years, and are mostly fished beginning at age 4, as many fishing vessels selectively target areas preferred by older, larger, and more valuable fish. So when the next season opens in January 2010, fish that hatched in 2006 will be included in the catch. And it is this 2006 class that has surprised and disappointed scientists.

As 1-year-olds and 2-year-olds, this age group appeared particularly plentiful. “It looked like an above-average class,” says James Ianelli, an AFSC fisheries biologist in Seattle who heads the modeling team and is lead author of the pollock assessment report released on 17 November. But this year's bottom-trawl survey found fewer fish, and the midwater acoustic trawl was even worse, down 30% from last year's estimate. “There were fewer 3-year-olds than our model predicted,” says Ianelli, and older fish were largely absent.

Why were 3-year-old pollock relatively scarce? “It could be they had poor survival rates as 1-year-olds, or maybe they are staying more toward the bottom. We've also had 4 years of record-cold bottom temperatures,” which could affect the fish's distribution, says Ianelli.

Whatever the reason, the discrepancy between the previous and the most recent surveys, coupled with the continuing decline, has set off alarm bells. “It surprised people because we [scientists] almost always get the trend right,” says Fritz. “But we really missed this one; we were off by 30%. And that makes me think we're in new territory.”

In November 2008, scientists had estimated that the population could sustain a catch of 1.23 million metric tons in 2010. But last week the 15 members of the NPFMC Bering Sea Groundfish Plan Team, after much debate, voted instead for a catch of 813,000 metric tons. The vote was split, with seven scientists recommending a further cut to 738,000 metric tons.

## Driving the system

Despite the lower number, Ianelli and DeMaster say the fishery remains healthy. There's enough of a buffer built into the model to ensure that the spawning stock never drops below 20% of its estimated unfished numbers, says DeMaster. “It's a conservative approach,” he says, meant to ensure that enough young fish will be produced every year to replace those caught. This year's stock is at 27%.

To better understand the pollock's cycles, and what factors in the ecosystem affect them, scientists are busy incorporating more data about the ecosystem—on ocean temperature, zooplankton production, pollock predators, and climate—into their models. “We're in the third year of our Bering Sea Project,” says marine biologist Mike Sigler at AFSC in Juneau, referring to a 6-year, $52 million NSF and North Pacific Research Board–funded study of the eastern Bering Sea's ecosystem. “It's already helping us understand these changes in pollock.”
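The 20% floor DeMaster describes is the kind of threshold that fisheries scientists build into harvest control rules, where the allowed fishing rate shrinks as spawning biomass approaches the floor. The sketch below is a deliberately simplified, hypothetical rule, not NPFMC's actual tiered formula; the function name and all parameter values are illustrative assumptions.

```python
def harvest_rate(spawn_frac, floor=0.20, target=0.40, f_max=0.3):
    """Hypothetical sliding-scale harvest control rule (illustrative only).

    spawn_frac: current spawning biomass as a fraction of the estimated
    unfished level. At or below the floor (20% here), fishing stops;
    between the floor and a target level, the allowed fishing rate ramps
    up linearly; at or above the target, the full rate f_max applies.
    """
    if spawn_frac <= floor:
        return 0.0
    if spawn_frac >= target:
        return f_max
    return f_max * (spawn_frac - floor) / (target - floor)

# At this year's reported 27% of unfished spawning stock, a rule like
# this would permit only a substantially reduced fishing rate.
print(harvest_rate(0.27))
```

The design choice such rules encode is exactly the trade-off the article describes: catches fall automatically as the stock declines, so the quota reductions of recent years are the rule working, not a sign that it has failed.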

Researchers have found that many poor age-class years seem to be tied to less sea ice and warmer ocean waters. “That's the pattern for the [pollock] classes from 2001 to 2005,” says Sigler. However, the drop in the 2006 class remains puzzling, because there was plenty of sea ice and colder water that year. Sigler hopes to have an explanation by 2012, when the project ends.

Regardless of the cause, the spawning mass has declined, and AFSC's harvest rules have in turn limited the catch, says fisheries scientist Steven Martell of the University of British Columbia, Vancouver, in Canada. “There's been a 45% reduction in catches over the last 4 years,” he says. But if those reductions aren't sufficient and the stock doesn't recover as projected in the next few years, he warns that the fishery “will certainly be in trouble” and could be closed.

Others think that the catch should be reduced even further now. “The pollock fishery is the most valuable fishery in the U.S.,” says Jeremy Jackson, a marine ecologist at the Scripps Institution of Oceanography in San Diego, California. “More is known about it than any other fishery in the world. Yet despite all the wonderful data and fancy models, they've failed to protect the pollock or the Bering Sea ecosystem. We need to call ‘Time out!’”

Jackson says that although including ecosystem effects in pollock management is worthwhile, the researchers also need to look at “the effects of pollock fishing on the ecosystem.” He notes that pollock is a staple for northern fur seals and endangered Steller sea lions, both of which are struggling (Science, 4 April 2008, p. 44).

Critics say worries about the fishery stem from a fundamental issue: It's managed primarily to get the maximum sustainable yield from a single species. “It can't make a legitimate claim to [being] an ecosystem-based fishery, as long as it maintains this single-species focus,” says Timothy Ragen, director of the Marine Mammal Commission in Bethesda, Maryland.

Jackson and others say that when faced with questionable data, as scientists were this year, it would be better to reduce the quota even further. Fritz says he tried to persuade the Plan Team to do that but failed. “Can we recognize the danger signals and react appropriately and in time, if the fishery is really in trouble?” he asks.

But the fishery is required by law to also consider the socioeconomic effects of its decisions. The pollock fishery is one of Alaska's largest employers, and former Alaska Senator Ted Stevens once brought the entire U.S. government to a halt to protest (and eventually overturn) restrictions upon it. “There's a lot of policy in this process,” says DeMaster. “It's not entirely science.”

Still, many say that the pollock fishery continues to be one of the best-managed in the world, largely because the Fishery Council, unlike its counterparts for some other big fisheries, follows scientists' guidance. “The North Pacific Fishery Council relies the most on science,” says fisheries biologist Daniel Pauly of the University of British Columbia, Vancouver. That's “in stark contrast” to other councils or governing bodies such as the European Commission, he says, where fisheries sometimes ignore scientific advice and adopt high quotas, “and the stocks [such as bluefin tuna] suffer accordingly.”

So when fishing-fleet representatives and others gather in Anchorage next week at the Fishery Council's meeting to set next year's quotas, the scientists' recommendations likely will be adopted. “We know the scientists are concerned,” says Donna Parker of the Seattle-based fishing firm Arctic Storm Management Group. “They treat our fishery like a cultivated field, and we expect they will manage it well into the future.”

13. Particle Physics

# Seeking a Shortcut to the High-Energy Frontier

1. Adrian Cho

An accelerator that smashes exotic particles called muons promises more bang from a smaller accelerator—if physicists can actually build it.

BATAVIA, ILLINOIS—When you fall behind, you need a comeback plan, and physicists here at the Fermi National Accelerator Laboratory (Fermilab) think they have a dandy. They're losing their title as keepers of the world's highest-energy particle smasher, but they have an idea for a wild new one that might vault them back to the energy frontier. They're hoping the U.S. Department of Energy (DOE) will give them enough money to find out whether their idea is a dream machine—or a technological nightmare.

For 24 years, Fermilab's Tevatron collider has held the energy record for particle collisions, firing protons into antiprotons at a maximum of 2 tera–electron volts (TeV). But researchers at the European particle physics lab, CERN, near Geneva, Switzerland, are finally revving up the 27-kilometer-long, $5.5 billion Large Hadron Collider (LHC), which aims to blast protons into other protons at 14 TeV.

With the Tevatron facing obsolescence, Fermilab physicists hope to build a beast called a muon collider. The new machine, the topic of a workshop here last month, would smash muons, which are heavier cousins of electrons, into antimuons. In principle, it could reach energies as high as rivals already in the planning stages—the 30-kilometer-long straight-shot International Linear Collider (ILC) that would fire electrons into positrons and a higher-energy electron-positron collider called the Compact Linear Collider (CLIC) being developed at CERN. But a muon collider would be much smaller. As cost scales with size, it could also be cheaper than the other machines.

That's if a muon collider can be built. Unlike electrons or protons, muons are radioactive. So the facility would have to generate the particles, accelerate them, and smash them together in the fraction of a second before they decay. Physicists would also have to protect their equipment from the intense radiation emanating from the muons and limit the amount flowing out of the lab.

Interest in the exotic machine has grown because the more conventional plans hatched by U.S. physicists have stalled. Three years ago, they were hoping to host the ILC, with construction to start as early as 2012. But DOE officials blanched when researchers estimated that it would cost at least $7 billion (Science, 9 February 2007, p. 746). DOE officials now think the price of the ILC could top $20 billion, including inflation and contingency, and have said the project cannot be realized until the mid-2020s.
That delay has created an opening for supporters of the muon collider, and they seem to have DOE's ear. At an advisory panel meeting in October, William Brinkman, head of DOE's Office of Science, told Science: “I'd like to see Fermilab do something with a muon accelerator. That would be something novel, rather than spending time beating our brains out building the next biggest accelerator.”

But is a muon collider an impossible machine? “The problems range from hard to very hard to ultrahard,” says Fermilab Director Pier Oddone. Researchers have requested $16 million per year over 5 years just to determine if a muon collider can be built, he says. Even that may be asking too much, says Daniel Schulte, an accelerator physicist at CERN. “That's—oh God, how do I put this?—an ambitious goal.”

## The best of both worlds

Particle colliders generally come in two types: those that smash protons or antiprotons and those that smash electrons and positrons. A muon collider might combine the advantages of proton and electron machines.

For revving particle beams to the highest energies, nothing beats a proton accelerator. At 14 TeV, the LHC will blast out massive new particles or even open new dimensions of space, or so researchers hope. However, as with any proton collider, the LHC will reveal those things in a messy way.

Protons contain other particles called quarks and gluons. So when one proton tears through another, debris flies every which way. “It's like two cans of Campbell's soup,” says Vladimir Shiltsev, an accelerator physicist at Fermilab. “You collide them and soup splashes everywhere. But God knows what actually happened.” Within the mess, typically only one gluon scores a direct hit on another, so only 1/6 to 1/10 of the protons' energy goes to make new particles.

In contrast, electrons and positrons have no internal parts. So they make clean collisions in which all the energy can go into making new particles. That's why physicists say that the logical successor to the LHC is an electron-positron collider. The ILC would generate collisions at 0.5 TeV, which might be enough to map the terrain the LHC will open. If not, the 40-kilometer-long CLIC would make 3-TeV collisions by using a lower-energy beam to drive a higher-energy one.

There is a catch, however. Because every action has a reaction, charged particles radiate when their paths are forced to curve. Responding readily to a sideways shove, lightweight electrons and positrons give off copious x-rays that sap their energy and prevent them from reaching TeV-level energies in a circular accelerator, or “synchrotron.” So both the ILC and CLIC would use two huge linear accelerators facing each other. That's an inefficient arrangement, as the electrons and positrons collide only once instead of repeatedly, as happens in a circular machine.

Muons share the better features of protons and electrons. Like protons, they are heavy, weighing 207 times as much as electrons. So they radiate little energy as their paths bend and could reach high energy in a relatively small synchrotron. Like electrons, however, muons have no parts, so their clean collisions permit all the energy to go into new particles. “Basically, we can do very efficient acceleration and reach a higher energy with a much smaller machine than any electron collider or proton collider,” Shiltsev says. A muon collider would fit comfortably on Fermilab's 2750-hectare campus, he says.
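The payoff of that 207-fold mass advantage is enormous, because synchrotron radiation is extraordinarily sensitive to particle mass. A back-of-the-envelope check (using the standard scaling of radiated power with the fourth power of the Lorentz factor at fixed beam energy and bending radius, a textbook result rather than a figure from the article):

```python
# For a fixed beam energy E and bending radius, synchrotron radiation
# power scales as (E/m)^4, so it falls as the fourth power of the
# particle mass. Mass ratio below is the standard value (~the "207
# times" quoted in the text).
m_ratio = 206.77  # muon mass / electron mass
suppression = m_ratio ** 4  # factor by which muon radiation is reduced

print(f"muons radiate ~{suppression:.1e} times less than electrons")
```

The result, roughly a factor of two billion, is why a TeV-scale muon synchrotron can be compact while an electron machine of the same energy must be a straight line.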

The machine might also fit into Fermilab's other plans. Sometime in the next decade, lab officials hope to build a proton source, known as Project X, to generate neutrinos and study their interactions and pursue other subjects on the “intensity frontier” (Science, 31 August 2007, p. 1155). A muon collider would also need a proton source like Project X.

## Challenges galore

A muon collider poses daunting challenges, however. Physicists must generate the muons and antimuons by blasting protons into a metal target. They must gather the particles into bunches and “cool” them so that they nestle together. The bunches of muons and bunches of antimuons must then pass through a series of accelerators and into a final synchrotron, where they would circulate in opposite directions and collide.

All of this must happen before the muons decay. If a muon zings along at 1.5 TeV, the time dilation of special relativity stretches its lifetime to 30 milliseconds—up from 2 microseconds when it's still. That's time enough for 500 circuits in the final ring, says Michael Zisman of Lawrence Berkeley National Laboratory in California, if everything goes smoothly.
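Zisman's numbers follow directly from relativistic time dilation: the lab-frame lifetime is the rest-frame lifetime multiplied by the Lorentz factor γ = E/m. A quick sketch of the arithmetic (the muon mass and rest lifetime are standard textbook values, not figures from the article):

```python
# Time dilation for a 1.5-TeV muon: lab-frame lifetime = gamma * tau.
MUON_MASS_GEV = 0.10566     # muon rest mass, GeV
MUON_LIFETIME_S = 2.197e-6  # muon lifetime at rest, ~2 microseconds

energy_gev = 1500.0                   # 1.5 TeV beam energy
gamma = energy_gev / MUON_MASS_GEV    # Lorentz factor, roughly 14,000
dilated = gamma * MUON_LIFETIME_S     # lifetime as seen in the lab

print(f"gamma = {gamma:.0f}, lab-frame lifetime = {dilated * 1e3:.1f} ms")
```

The result comes out at roughly 30 milliseconds, matching the figure quoted above.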

Accelerator physicists' biggest challenge will be cooling the muons. Cooling reduces the relative motion of particles within a bunch, and because electrons radiate so easily, physicists can cool them by simply sending them around a synchrotron for a while. As the electrons in a bunch radiate, the bunch contracts like a balloon shoved into a refrigerator. Muons radiate very little, so that approach won't work for them.

Instead, physicists are considering a scheme called ionization cooling. In it, the muons would run alternately through liquid hydrogen, to slow their motion in all directions, and through chambers filled with radio waves, to speed them up in just one direction. An incredibly high magnetic field—50 tesla, or 1 million times Earth's field—is needed to keep the muons from flying away. “The techniques we need are beyond state-of-the-art,” says Fermilab's Stephen Geer. Researchers in the United Kingdom are currently trying to demonstrate the technique in small-scale bench tests.

Radiation will also cause headaches for accelerator designers. Muons decay into electrons and neutrinos, and the electrons would convey enough power to overheat the superconducting magnets that would guide the muons around the ring. So researchers are considering novel designs with slots to let the radiation out.

The radiation would also generate particle “backgrounds” that might overwhelm the desired signals from the collisions. Simulations suggest that 100,000 neutrons and 10 million photons would flood each square centimeter of a particle detector during each bunch crossing, Fermilab's Marcel Demarteau said at the workshop. All that extraneous particle pollution might render the muon collider irrelevant as a follow-up to the LHC, says Barry Barish, a physicist at the California Institute of Technology in Pasadena, who leads the ILC design team: “The whole point of going to electron-positron collisions is to get an environment that's clean enough to do all the things you can't do at the LHC,” he says.

Perhaps most problematically, the muon collider will send radiation beyond the lab's boundary. Neutrinos hardly interact with matter, so those from the muon decays would pierce Earth in all horizontal directions and emerge from its curved surface tens of kilometers away. There they would be numerous enough to knock loose other particles and create interacting radiation. Scientists say they can keep such radiation at safe and legal levels by building the collider 200 meters underground and limiting the muons in the beam. Convincing the public, however, may be more difficult.

## Which collider, when?

Such obstacles led physicists to write off a muon collider in the 1990s. But recent advances have prompted a reconsideration. For example, physicists once lacked a suitable source of muons, but researchers have recently demonstrated a mercury target that can provide enough of them, says Fermilab's Nikolai Mokhov.

All agree that before any decision can be made on the next collider, the LHC has to reveal at what energy new particles will emerge, which will take 3 or 4 years. If the answer is in the neighborhood of 0.5 TeV, then the ILC would be the way to go, says Fermilab's Oddone. If the range is higher, then the options would be a muon collider or CLIC. “If we need a higher-energy machine, we want to have put the muon collider into contention” by that time, Oddone says.

But even that goal strikes some as optimistic. Jean-Pierre Delahaye, an accelerator physicist at CERN, says researchers plan to publish a report next year that shows that CLIC is feasible. That is 24 years after they came up with the idea. “I would bet a bottle of champagne—a case—that [a feasibility study] will take at least as long for a muon collider,” he says. If so, 5 years won't be nearly long enough to tell whether the idea of a muon collider is too good to be true.