News this Week

Science  15 Oct 2010:
Vol. 330, Issue 6002, pp. 302
  1. Gulf Oil Spill

    Government Chided for Poor Planning and Communication

    1. Richard A. Kerr and
    2. Erik Stokstad

    It wasn't long after the drilling platform Deepwater Horizon sank into the Gulf of Mexico that two contentious scientific issues rose to the surface. Researchers in and out of the government sparred over exactly how much oil was gushing out of the damaged well. And some scientists raised concerns about the wisdom of injecting large volumes of dispersants near the ocean floor to break up the oil gushing from the wellhead, given the potential toxicity of the chemicals and the unprecedented use of dispersants at that depth. Last week, a presidential commission investigating the disaster released two preliminary reports from its staff that fault the government's handling of both issues.

    Up and up.

    Government estimates of the flow from the damaged Gulf of Mexico well grew until early August, when direct measurements put the flow near 60,000 barrels per day.

    CREDIT: (GRAPH, SOURCE) DEEPWATER HORIZON INCIDENT JOINT INFORMATION CENTER, REUTERS

    Perhaps most damaging to the government's credibility was its initial, low-ball estimate of how much oil per day was gushing from the well. The commission's report on the fate of the oil reveals for the first time the story behind this controversial first estimate—and the public confusion it sowed.

    On 28 April, a week after the Deepwater Horizon exploded, government officials released an estimated flow rate of 5000 barrels a day from the Macondo well. Despite immediate skepticism and a flurry of conflicting estimates from outside experts that pegged the flow rate at roughly 10 times as high—estimates that turned out to be in the ballpark of the actual figure—the government stood by its figure for a month.

    According to the report, the first estimate came from a National Oceanic and Atmospheric Administration (NOAA) scientist who apparently had no particular expertise in the video technique involved. This scientist had cautioned in his e-mail to authorities that his was a “very rough estimate.” But neither his warning about the uncertainties in his figure nor any aspects of his methodology made it into the public release. The report found this to be “an overly casual approach.”

    A month after the spill began, the government set up the interagency Flow Rate Technical Group composed of about 20 members from government, academia, and independent organizations to apply a variety of measurement methodologies. Still, the report laments, the sketchy technical information released by the group discouraged helpful input from the broader scientific community. And the group's official estimates, though growing, continued to come in low, although some group members argued that the flow rate was much higher.

    How fast?

    Scientists struggled to gauge the leak's flow without direct measurements.

    CREDIT: REUTERS/BP/LANDOV

    The commission staff says it has no evidence so far that the low estimates slowed the response, but the less-than-transparent estimation process, according to the report, generated “significant controversy, which undermined public confidence in the federal government's response to the spill.”

    The government could have enhanced confidence, says the report, by being more forthcoming about how bad the spill might have been. From the outset, responders stated that they were scaling their efforts to the “worst-case” spill scenario, not anyone's estimate of the flow rate. But authorities never revealed their worst-case flow rate or how they arrived at it. In late April or early May, NOAA sought to make public some of its worst-case spill models, according to the report, but the White House's Office of Management and Budget (OMB) denied the request. (In a recent statement, OMB claims it delayed release of NOAA's worst-case scenario only for technical reasons, but by the time the report came out, the true flow was known.) The “lack of information may have contributed to public skepticism about whether the government appreciated the size of the Deepwater Horizon spill and was truly bringing all of its resources to bear,” the report states.

    The government also bungled the rollout of a report about the fate of the oil, according to the commission staff. Although the “oil budget” report was technically sound, it was doomed to misinterpretation because of poor presentation, the staff report says. Appearing on news shows on 4 August, the day of the report's release, Carol Browner, the director of the White House Office of Energy and Climate Change Policy, remarked that “the vast majority of the oil is gone” even though the oil budget report in no way supported her take. For months to come, the media would use her comments to characterize the oil budget report as overly optimistic.

    A second report focuses on the government decision to use unprecedented volumes of dispersants on the surface and at depth. Neither the Environmental Protection Agency (EPA) nor NOAA had planned for large-scale use of dispersants in a deepwater accident. As a result, according to the report, a lack of studies on dispersant toxicity meant that the Coast Guard's Thad Allen, EPA's Lisa Jackson, and NOAA's Jane Lubchenco were “seriously handicapped” when deciding about the use of dispersants. So far, they seem to have made the right call, the staff paper concludes, because dispersants helped protect surface-dwelling wildlife, the wetlands, and cleanup workers.

    But the biological impact of the dispersant-oil mixture in the water column remains unknown. “This event clearly shows that ecosystem-based information on dispersants and oil is really inadequate,” says Robert Diaz of the Virginia Institute of Marine Science in Gloucester Point, who is not on the panel. “The feds should have seen this [deep-water spill] coming and adjusted their planning,” he adds. The report calls for more extensive testing of existing dispersants and development of less-toxic alternatives. EPA and NOAA declined to comment on the dispersant report.

    The commission continues to investigate the response to the spill, especially whether the low flow estimates slowed efforts to cap the well. The five-member commission will issue a final report in January.

    Meanwhile, says science policy analyst Roger Pielke Jr. of the University of Colorado, Boulder, the government should learn from this crisis “to deal openly and honestly with uncertainties. … The public is smarter than politicians give them credit for. They can handle uncertainties.”

  2. Economics Nobel

    Three Laureates Explained Why Unemployment Is Inevitable

    1. Adrian Cho

    High unemployment is now plaguing many nations' economies, but even in the best of times, about one out of every 25 workers will be out of a job. This year's winners of the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel laid out the theory that explains why full employment is impossible.

    Peter Diamond, 70, of the Massachusetts Institute of Technology (MIT) in Cambridge; Dale Mortensen, 71, of Northwestern University in Evanston, Illinois; and Christopher Pissarides, 62, of the London School of Economics will split the $1.5 million prize. Starting in the 1970s, they developed, mostly independently, the theory of markets that suffer “search friction,” or costs for consumers and suppliers of a good to find one another. Dubbed the “DMP theory” in the economists' honor, it has become the bedrock for the study of labor markets and explains why people are sure to be out of work even when the number of vacant jobs equals the number of job seekers, so-called equilibrium unemployment.

    “It's a wonderful choice,” says Michael Elsby, an economist at the University of Michigan, Ann Arbor, of the trio's honor. “If you're going to be thinking about unemployment, you're going to start by playing around with the DMP model.”

    Classical economics predicts that unemployment should vanish whenever the number of available jobs equals or exceeds the number of workers. In reality, that never happens. For example, even during the economic boom a decade ago, unemployment in the United States dipped only as low as 3.8%.

    PETER DIAMOND (LEFT); DALE MORTENSEN (CENTER); CHRISTOPHER PISSARIDES (RIGHT)

    CREDITS (LEFT TO RIGHT): MIT; LARS KRUSE/AU-FOTO; LSE; © THE NOBEL FOUNDATION

    Diamond, Mortensen, and Pissarides found that they could explain that fact. They began with a dynamical model that considers the flow of workers both into and out of jobs. They imposed costs for an unemployed worker to find a job and for an employer to find a suitable employee, which affected the flows. The economists then showed mathematically that a new equilibrium would arise with at least some unemployment, the amount of which depended on the costs.
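
    The flavor of that result can be seen even in a stripped-down flow model. Below is a minimal sketch that simply assumes fixed monthly separation and job-finding rates (the numbers are hypothetical; the full DMP framework instead derives the finding rate from a matching function and firms' decisions to post vacancies):

```python
# Minimal flow model of equilibrium unemployment. Illustrative sketch only:
# the DMP model derives the job-finding rate endogenously from search costs
# and a matching function; here both rates are simply assumed.

def steady_state_unemployment(separation_rate: float, finding_rate: float) -> float:
    """Unemployment rate at which flows into and out of joblessness balance:
    s * (1 - u) = f * u  =>  u = s / (s + f)."""
    return separation_rate / (separation_rate + finding_rate)

def simulate(u0: float, s: float, f: float, months: int) -> float:
    """Iterate the monthly flow equation u_next = u + s*(1 - u) - f*u."""
    u = u0
    for _ in range(months):
        u = u + s * (1.0 - u) - f * u
    return u

if __name__ == "__main__":
    s, f = 0.02, 0.45  # hypothetical monthly separation and job-finding rates
    print(steady_state_unemployment(s, f))         # ~0.043, about 1 worker in 25
    print(simulate(u0=0.10, s=s, f=f, months=60))  # converges to the same value
```

    Because search frictions keep the finding rate finite, the steady-state rate stays strictly positive even when vacancies and job seekers are equal in number—the equilibrium unemployment the theory describes.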

    In the 1960s, much of economic theory strived to prove that the results from idealized classical economics still held sway as economists made their models more realistic, Diamond said at an MIT press conference after the prize was announced. He preferred to let the improved models lead where they may. “It seemed to me that a better approach was to think about real dynamics and see where they go,” he says. “Maybe they go to the [classical] equilibrium solution, and maybe they don't.”

    Although the DMP theory started out as a mathematical abstraction, it has become a tool for applied economists and policymakers. That's because it provides a framework for studying in detail the effectiveness of a specific intervention in reducing unemployment or ameliorating its effects. “There's a whole literature out there on unemployment insurance that uses the model,” says Stephen Woodbury, an economist at Michigan State University in East Lansing.

    Given the unemployment crunch afflicting many countries, Pissarides says he favors political action to get people back to work. “What we should really be doing is to ensure that they do not stay unemployed too long,” he said at another press conference. “Give them direct work experience—not necessarily advanced training—after a few months so they don't lose touch with the labor market.”

  3. Immunology

    Painful Failure of Promising Genital Herpes Vaccine

    1. Jon Cohen

    A vaccine designed to ward off genital herpes has failed in a large clinical trial, abruptly ending the product's seemingly promising future. After 8 years of study in more than 8000 women in the United States and Canada, there was not even a hint of a positive result against the sexually transmitted disease caused by herpes simplex virus-2 (HSV-2). “It's dead negative,” says Lawrence Corey, a herpes and vaccine specialist at the Fred Hutchinson Cancer Research Center in Seattle, Washington.

    Corey stresses that a vaccine against HSV-2 could have a major public health impact. HSV-2 spreads easily through sexual contact; according to a 2003 estimate from the World Health Organization, more than 300 million women and 200 million men were infected with the virus. The infection often lasts for life and sporadically causes symptoms, including painful blisters that can burst and form ulcers. Pregnant women can spread the virus to their newborns, which sometimes kills them. A person infected with HSV-2 is also much more susceptible to HIV, and a co-infected person may transmit the AIDS virus more readily. “Its biological component to the HIV epidemic has been vastly underestimated,” says Corey.

    CREDIT: LAWRENCE COREY/FHCRC

    Despite its toll, HSV-2 has generally been seen as a “trivial” disease, says Corey, and vaccines have received scant attention. “It's been the Rodney Dangerfield of the sexually transmitted infection field,” he says.

    A few large pharmaceutical and biotech companies have shown HSV-2 the respect it deserves, investing heavily in vaccine research over the past few decades. But none of the efforts panned out, leading many to pin high hopes on the vaccine used in the current study. Made by GlaxoSmithKline (GSK) in Rixensart, Belgium, the vaccine contains a protein from HSV-2 mixed with a novel adjuvant, or immune system stimulator, triggering antibodies that researchers hoped could prevent infections. “A lot of the field has been waiting for this result,” says Robert Belshe of the St. Louis University School of Medicine in Missouri, who led the study. “It's a great disappointment.”

    Before the trial, researchers knew the vaccine had serious limitations, but it still offered a toehold for this struggling field. In two earlier studies, the vaccine failed to prevent genital herpes in male and female participants. But when researchers drilled down into the data from those trials, published together in the 21 November 2002 issue of The New England Journal of Medicine, they discovered the vaccine may have worked in a subset of women. Specifically, they focused on women who at the trial's start were not infected with HSV-1, a cousin of HSV-2 that causes cold sores in mouths and on lips. Infection with HSV-1 may offer some immune protection against HSV-2. This, in turn, might have masked the impact of the HSV-2 vaccine. The researchers came to this conclusion after noting that in HSV-1–negative women, the vaccine seemed to work more than 70% of the time.

    That led the National Institute of Allergy and Infectious Diseases (NIAID) to invest $27.6 million in the massive follow-up study in women who were negative for both HSVs. Researchers recognized that if the vaccine worked, it would have a narrow market: It's designed only for women, and HSV-1 is so widespread in developing countries that the vaccine would be ineffective in much of the world. Still, they hoped to build on the success—a hope that has evaporated with the new negative data. “The inconsistency between the trials is really quite disconcerting,” says Corey.

    When NIAID and GSK revealed on 30 September the latest vaccine trial's findings, researchers were equally dispirited and confused. “We're really puzzled by the results,” says herpes specialist Lawrence Stanberry of Columbia University Medical Center in New York City. Stanberry, who headed the earlier studies of the vaccine and was on the scientific advisory board for the subsequent trial, notes that the follow-up did not recruit the same population of women. In the first trials, women were selected because they were in “discordant” relationships, meaning they had regular sexual partners who were infected with HSV-2. The new study enrolled women who simply perceived themselves as being at risk of infection with the virus. “Discordant couples may by their nature be different biologically or behaviorally than people in a new relationship,” says Stanberry. Belshe agrees. “With long-standing relationships, we think there's chronic exposure to herpes antigen, so there might be some immunologic explanation for the different results.”

    As investigators sort through the wreckage for clues, many worry that the new results will further dampen the little interest industry has shown in an HSV-2 vaccine. GSK has said it will no longer pursue development of this particular vaccine, and no other big pharma company currently has an HSV-2 vaccine in clinical trials. David Knipe, a virologist at Harvard Medical School in Boston who is doing preclinical studies of an HSV-2 vaccine with Sanofi Pasteur that relies on a weakened form of the whole virus, urges NIAID to ramp up its involvement. “As the companies become more cautious about getting into this risky field, the investment is going to have to come from government and nongovernment people.”

    Corey says the findings underscore how little is known about HSV-2 immunology. He notes that the data make clear that antibodies, which prevent infection, alone are not the answer. Next-generation vaccines, he contends, should also dispatch killer cells to clear cells that HSV-2 manages to infect, as well as less specific innate immune responses. “These results reset the targets,” he says. “We've underestimated the kinds of immune responses required to restrain this virus.”

  4. Climate Change

    Climate Talks Still at Impasse, China Buffs Its Green Reputation

    1. Richard Stone

    TIANJIN, CHINA—Delegates to a United Nations meeting here last week made scant headway on a global strategy for reining in greenhouse gas emissions. But amid the pessimism and recriminations, one nation won praise from observers for its efforts to boost energy efficiency and invest in green technologies: the host, China.

    Carbon futures.

    Three scenarios for China's CO2 emissions, as envisioned in a new NRDC report.

    CREDIT: ADAPTED FROM D. COHEN-TANUGI, PUTTING IT INTO PERSPECTIVE: CHINA'S CARBON INTENSITY TARGET, NRDC (OCTOBER 2010)

    Negotiators from 177 countries came to this port city near Beijing with low expectations for progress on a deal that could slow global warming, and thick smog that blanketed Tianjin further dampened spirits. Last week's U.N. Framework Convention on Climate Change meeting was a preparatory session for a summit next month in Cancun, Mexico, where countries will resume the Sisyphean task of crafting a successor to the Kyoto Protocol, in which 39 industrialized nations and the European Union committed to reducing greenhouse gas emissions several percent from 1990 levels by 2012. According to the 1997 accord's principle of “common but differentiated responsibilities,” developing nations, including China, have pledged to take voluntary steps to rein in carbon emissions.

    Talks on a post-Kyoto agreement have floundered, most notably at a summit in Copenhagen last year (Science, 1 January, p. 19). The United States, which endorsed but did not ratify the Kyoto Protocol, and some industrialized nations have balked at the economic cost of deep, legally binding cuts in carbon emissions deemed necessary to avoid catastrophic climate change. Developing nations, the group to which China maintains it still belongs despite its growing economic strength, have vowed to become more energy efficient but as a group have refused to accept binding emissions reductions.

    With any progress toward a new accord elusive, the two biggest greenhouse gas emitters—China and the United States—fired broadsides at each other. The lead U.S. negotiator in Tianjin, Jonathan Pershing, criticized China and other major developing nations for refusing to implement a stringent program of monitoring, reporting, and verifying their carbon emissions. “These elements are at the heart of the deal, and the lack of progress on them gives us concern,” Pershing told reporters. China, meanwhile, scolded the United States for using the Asian nation as a scapegoat for its own foot-dragging on addressing climate change. The United States “has no measures or actions to show for itself,” said Su Wei, director of the powerful National Development and Reform Commission's climate change department, who reiterated China's view that the United States must take “historical responsibility” for rising atmospheric CO2 levels. In the toxic atmosphere, delegates made only modest progress on issues such as a plan to pay nations to preserve forests.

    One silver lining in the smog was China's ramped-up efforts to save energy. The world's biggest greenhouse gas emitter has shuttered thousands of inefficient coal-burning power units in recent months in a frantic bid to reduce its energy intensity—energy consumption per unit of GDP—20% from 2005 levels by the end of this year (Science, 4 June, p. 1216). At the Copenhagen summit last December, China upped the ante, vowing to reduce carbon intensity—CO2 emissions per unit of GDP—by 40% to 45% from 2005 levels by 2020. “China has the will and the way to achieve that,” says Barbara Finamore, China program director at the Natural Resources Defense Council (NRDC), a U.S.-based nonprofit. China can meet or exceed its carbon-intensity target if, among other things, it eases its reliance on coal from nearly 70% of its energy mix to 62% by 2020, NRDC's David Cohen-Tanugi concluded in an analysis released at the meeting.
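
    Because an intensity target is a ratio of emissions to GDP, meeting it does not by itself cap absolute emissions. A back-of-the-envelope sketch (the 8% GDP growth rate below is a hypothetical placeholder, not an NRDC figure) shows why emissions can keep climbing even while the pledge is honored:

```python
# Why an intensity target is not an absolute emissions cap.
# Illustrative arithmetic only; the 8% GDP growth rate is a hypothetical
# placeholder, not a figure from the NRDC analysis cited above.

def emissions_index(intensity_cut: float, annual_gdp_growth: float, years: int) -> float:
    """Emissions relative to the base year, using emissions = intensity * GDP."""
    intensity = 1.0 - intensity_cut            # a 40% cut leaves 60% of base intensity
    gdp = (1.0 + annual_gdp_growth) ** years   # compound GDP growth
    return intensity * gdp

if __name__ == "__main__":
    for cut in (0.40, 0.45):                   # the 2020 pledge range
        print(cut, round(emissions_index(cut, annual_gdp_growth=0.08, years=15), 2))
    # With GDP compounding at 8% a year over 2005-2020, a 40% intensity cut
    # still leaves emissions roughly 1.9 times the 2005 level, and a 45% cut
    # roughly 1.7 times -- consistent with forecasts that emissions will not
    # peak for another decade or more.
```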

    Scaling up.

    Solar and wind power are making huge strides in China.

    CREDIT: XINHUA/LANDOV

    A key prong of China's strategy is renewable energy. It aims to raise the non–fossil-fuel share of its primary energy from the current 9% to 15% by 2020. “China is doubling its wind capacity every year” and next year is expected to surpass Germany and the United States as the biggest wind-energy producer, says renewable energy analyst Joanna Lewis of Georgetown University in Washington, D.C. It has also become the world's largest producer of solar cells. And China announced last July that it would spend $738 billion over the next decade on alternative energy.

    Another element of the nation's energy strategy is to overhaul the electrical grid. Around 80% of China's coal lies in the west and north, and 80% of hydropower generation is in the west, while 75% of energy demand is to the east. To link areas of energy production and demand, China plans to build thousands of kilometers of ultrahigh-voltage transmission lines over the next decade. And starting this year, grid operators must purchase all available renewable energy in their area.

    Forecasts indicate that China's greenhouse emissions won't peak for another 10 to 20 years at the earliest. But energy savings from reduced transmission losses, increased renewable energy and electric vehicles, and more-efficient coal burning should allow it to avoid a whopping 1.649 billion tons of CO2 emissions a year by 2020, according to the State Grid Corporation of China.

    All China's initiatives are undergirded by massive investments in clean energy technology, from advanced solar cells to electric vehicles. The International Energy Agency in Paris forecasts a $13 trillion clean energy market over the next 2 decades. According to the Pew Charitable Trusts, in 2009 China invested $34.6 billion in clean energy—an amount second only to the European Union ($41.1 billion) and nearly twice that of the United States ($18.6 billion).

    Even though last week's meeting offered little hope for a post-Kyoto compromise, it was clear which greenhouse gas superemitter had made gains in revamping its image. “In the race for a clean energy future, China is already off the starting blocks,” says Jake Schmidt, NRDC's international climate policy director. “The U.S. is still taking off its sweatsuit.”

  5. Undergraduate Science

    Better Intro Courses Seen as Key to Reducing Attrition of STEM Majors

    1. Jeffrey Mervis

    Ten years ago, analytical chemist William LaCourse took a hard look at the status of teaching within his department at the University of Maryland, Baltimore County (UMBC). He didn't like what he saw. “We were losing ground,” says LaCourse. Students were doing worse on tests, and more were failing or dropping courses. Attendance was spotty, and only the best students were showing up for extra help. The number of chemistry majors was also declining.

    A big part of the problem, LaCourse and others decided, was Chemistry 101. It's the gateway course for prospective chemistry majors and a requirement for those in many other fields. “The student newspaper called it a weed-out course,” he recalls. “And I thought, ‘That doesn't make any sense. Rather than always looking for new customers, why can't we do a better job with the ones we already have?’”

    A new report from the National Academies offers advice to universities trying to answer that question. It says that improving introductory courses is one of many steps needed to increase the number of students obtaining degrees in the fields of science, technology, engineering, and mathematics (STEM) and, in particular, the percentage of minorities in the scientific and engineering workforce.

    “We are missing what could be called the low-hanging fruit,” says Freeman Hrabowski, chair of the National Research Council (NRC) panel that wrote Expanding Underrepresented Minority Participation: America's Science and Technology Talent at the Crossroads. “These are students who have gotten into college and are majoring in math and science and who want to be in those disciplines. But more than half of them are not completing degrees in those fields.”

    Getting it done.

    Participatory introductory science courses like this one at the University of Maryland, Baltimore County, hope to boost the low percentage of black and Latino students who achieve their goal of earning a STEM degree.

    CREDITS: TIM FORD/UMBC; (INSET, SOURCE) UNIVERSITY OF CALIFORNIA LOS ANGELES, HIGHER EDUCATION RESEARCH INSTITUTE

    Only about 20% of underrepresented minorities who aspire to a STEM degree actually earn one within 5 years, according to a longitudinal study cited in the NRC report by researchers at the University of California, Los Angeles (see graphic). And it's not just minorities who are falling out of the science pipeline. Only 33% of whites and 42% of Asian-Americans complete their STEM degrees in 5 years, the UCLA study finds. “It's really an American issue,” says Hrabowski, a mathematician and longtime UMBC president. “It's simply unacceptable for such a large majority of students not to achieve their goal [of a STEM degree]. We must find ways for larger numbers of American students to excel in science.”

    The new report says that retaining STEM majors will require better academic, social, and financial support for students. Some of those steps are relatively inexpensive, says Hrabowski, although the report does recommend that the government launch a scholarship program for needy minority students that could eventually cost $600 million a year.

    Under Hrabowski, UMBC has implemented enough of those ideas to become one of the nation's top feeder schools for minorities going on to receive a Ph.D. degree. (Some 22% of UMBC students are underrepresented minorities, and they comprise 20% of all STEM majors.) One especially promising intervention, notes the report, is the university's Meyerhoff Scholars Program, which offers a summer bridge program, scholarships, tutoring and networking, research experiences, and study abroad.

    UMBC has also overhauled its introductory courses. The chemistry department, which LaCourse now chairs, has converted Chem 101 and Chem 102 into “Discovery Learning” courses. Students who had once sat passively during a weekly 2-hour recitation section while a graduate student solved problems on the whiteboard are now part of four-person teams responsible for finding the right answer. Teaching assistants act as “Sherpas,” says LaCourse, “guiding students up the mountain.” Laptops and cell phones have been banned from the sections so that students can focus on the assignment. Attendance is mandatory, and unexcused absences result in lower grades. There are still lectures, but they are delivered by someone “who lives and breathes Chem 101,” says LaCourse.

    The changes have produced immediate—and dramatic—results. Pass rates for the introductory courses shot up the first year from 70% to 85%—even though the department also raised the minimum score—and have inched up from there. Attendance has improved, and fewer students drop the course. The number of chemistry majors has nearly doubled since 2003, and the outflow from chemistry to other majors has stopped, reversing a chronic leakage of up to a dozen students a year. As a bonus, a once-moribund student chapter of the American Chemical Society is now thriving.

    LaCourse freely acknowledges that UMBC's approach draws heavily on a national movement to replace the didactic style of undergraduate teaching with more active, hands-on learning (Science, 31 July 2009, p. 527). And although the results are impressive, LaCourse says the transformation is not complete. The person with the primary responsibility for teaching the two intro courses, for example, is not a tenured faculty member. “I'd like that situation to change,” he says frankly. “And I think it's only a matter of time before [teaching] becomes a legitimate pathway. But I'm not that powerful.”

  6. ScienceNOW.org

    From Science's Online Daily News Site

    CREDIT: ALPSDAKE/WIKIMEDIA

    Biggest Genome Ever Now that's a genome. A rare Japanese flower named Paris japonica sports an astonishing 149 billion base pairs, making its genome 50 times the size of a human genome—and the largest one ever found. Until now, the biggest genome belonged to the marbled lungfish, whose 130 billion base pairs weighed in at an impressive 132.83 picograms. (A picogram is one-trillionth of a gram.) The genome of the new record-holder, revealed in a paper in the Botanical Journal of the Linnean Society, would be taller than the Big Ben clock tower if stretched out end to end. The researchers warn, however, that big genomes tend to be a liability: Plants with lots of DNA have more trouble tolerating pollution and extreme climatic conditions—and they grow more slowly than plants with less DNA because it takes so long to replicate their genomes.

    CREDIT: NASA/JPL/UNIVERSITY OF ARIZONA/DLR

    Building Blocks of Life in Titan's Atmosphere? It's unlikely that the process produced Titanians, but experiments simulating the chemistry of the dense air on Saturn's biggest moon have yielded some of the basic building blocks of life. Last week at the American Astronomical Society's Division for Planetary Sciences meeting in Pasadena, California, researchers described how they used radio-frequency radiation—a more convenient substitute for ultraviolet sunlight—to turn methane, nitrogen, and carbon monoxide (the main constituents of Titan's atmosphere) into glycine and alanine, the two smallest amino acids. The experiments also produced cytosine, adenine, thymine, and guanine, the four most basic components of DNA. And they created uracil, a precursor of RNA. The researchers said that because they achieved the reactions without the presence of liquid water, it's possible life could have sprung forth on Earth not in the seas, as commonly assumed, but perhaps in the planet's early atmosphere—a considerably thinner version of the fog enveloping Titan today.

    How Volcanoes Feed Plankton Want to spur plankton growth in the ocean? Hire a volcano.

    In August 2008, scientists on a research cruise in the northeastern Pacific Ocean were shocked to witness a sudden, huge spike in the area's plankton population. Within just a few days, the chlorophyll concentration in the water had increased by 150%.

    To find out why, the team analyzed air and water chemistry and found that dissolved iron, a nutrient that plankton need to thrive, must have increased dramatically. The source, they deduced, was the 7 to 8 August eruption of the Kasatochi volcano some 2000 kilometers away, chemical oceanographer Roberta Hamme of the University of Victoria in Canada and colleagues report in Geophysical Research Letters.

    Despite the size of the bloom, however, the plankton's uptake of carbon was relatively modest. Hamme says the team's preliminary analysis suggests that trying to stimulate plankton growth by adding iron to the water—a suggested countermeasure to global warming—would have a minuscule effect on marine CO2 absorption.

    CREDIT: SIMONE/FLICKR/WIKIMEDIA

    Who's Your Mommy? Breeders know that sleek, speedy, and spirited thoroughbred horses arose from three Arabian stallions brought to England more than 3 centuries ago. But who were the mares that birthed these noble steeds? A new genetic analysis suggests that thoroughbred fore-mothers hailed from Ireland and Britain.

    Mim Bower, an archaeogeneticist at the University of Cambridge in the United Kingdom, and colleagues analyzed mitochondrial DNA, which is passed down only by the mother, of about 300 thoroughbreds and nearly 2000 other horses of different breeds from across Europe and Asia. Most of the sequences came from a genetic database, but the researchers also contacted horse people, asking for samples of hair pulled from manes. The team then sequenced short stretches of DNA extracted from the tissue attached to the hair.

    The thoroughbreds' mitochondrial DNA sequences were closest to those of native Irish and British breeds, like the Connemara (pictured). There was a hint of other ancestries—including Arabian—but thoroughbred moms most likely hailed from the British Isles, the researchers report in Biology Letters. And that makes sense—in the early days, breeders thought the important parent was the stallion, says Bower; any old mare would do as a mother.

    Read the full postings, comments, and more at http://news.sciencemag.org/sciencenow.

  7. Computational Biology

    Custom-Built Supercomputer Brings Protein Folding Into View

    1. Robert F. Service

    Scientists have long been frustrated in their efforts to use computers to simulate the atomic detail of how proteins fold into their three-dimensional structures. The computing demands for simulating all the motions of a protein's atoms and the surrounding water are so high that scientists have had difficulty tracking the myriad atomic wiggles and gyrations for long enough to see the complete folding process. But now help is on the way. On page 341, computational biologists led by computer scientist and former hedge fund manager David Shaw report that they ran a specially built supercomputer for about 3 weeks to simulate a relatively small protein going through 15 rounds of folding and unfolding over 200 microseconds. They also tracked the folding gyrations of a similarly sized protein for more than a millisecond.

    “This is a landmark paper,” says David Baker, a protein-folding expert at the University of Washington, Seattle. Klaus Schulten, a molecular simulations expert at the University of Illinois, Urbana-Champaign, who has been Shaw's friendly rival, agrees, calling the paper “very important.”

    Baker notes that the simulations from Shaw and his colleagues revealed that the folding protein followed more or less the same general pattern of movements each time it folded rather than each folding having a distinct progression. That was something of a surprise because it wasn't clear from previous modeling and experimental work that this would be the case. Others note that simulating individual proteins for long periods isn't the only way to investigate protein folding: networks of computers can also cobble together large numbers of shorter simulations to explore some key events. Still, the new work sets the stage for extended simulations of dozens, if not hundreds, of other proteins that are less well understood, which could reveal whether all proteins follow a similar set of rules as they fold. “Now one can approach these questions in a quantitative manner,” Baker says.

    Custom job.

    By speeding calculations, this Anton supercomputer can run all-atom simulations (inset) 100 times longer than can general purpose supercomputers.

    CREDITS: COURTESY OF MATTHEW MONTEITH; (INSET) COURTESY OF D. E. SHAW RESEARCH

    That's been Shaw's dream since he left the world of high finance 9 years ago to start an outfit called D. E. Shaw Research in New York City. Shaw, who now also has an affiliation with Columbia University, originally trained as a computer scientist and specialized in designing parallel supercomputers. After a brief stint on the faculty of Columbia in the 1980s, he moved to Wall Street, where he designed powerful algorithms for stock trading. He later moved on to run his own hedge fund, winding up on the Forbes list of 400 richest Americans in the process. (He also recently served as the treasurer of AAAS, Science's publisher.) But Shaw missed the intellectual challenge of science. “I found myself at night solving math problems for fun,” he says.

    After conversations with friends in computational biology, Shaw chose the challenge of simulating the motion of proteins for his reentry into science. Drawing on his early career, he decided to design and build a customized supercomputer to push the boundaries of the field. Two years ago Shaw revealed the result, Anton, a supercomputer containing 512 specially designed computer chips hard-wired to speed the relatively simple calculations involved in determining how neighboring atoms in a protein interact. By speeding computations, Shaw says, Anton has run all-atom simulations 100 times longer than general purpose supercomputers can. In the current study, for example, Shaw and his colleagues were able to track 13,564 atoms, comprising a relatively small protein and surrounding water molecules, long enough to see the protein fold and unfold repeatedly.

    Like all supercomputers, the Anton used in the current study still has its limits and can't run such lengthy simulations of very large proteins. But Shaw says he and his colleagues are already making progress on that. In addition to building 11 supercomputers incorporating 512 custom-designed computer chip cores, or nodes—Shaw donated one to the National Resource for Biomedical Supercomputing in Pittsburgh, Pennsylvania—Shaw's team has built a 1024-node machine and one with 2048 nodes. The larger machines, he notes, are more efficient at tracking the motions of larger proteins. Moreover, Shaw says his team is already building successors to Anton, using the next generation of chip technology to burn through calculations significantly faster. And Shaw says he's happy to be back in the thick of a knotty intellectual challenge: “I love this. It's just the most fun I've ever had. It's very satisfying.”

  8. Chemistry

    Carbon-Linking Catalysts Get Nobel Nod

    1. Robert F. Service*

    Credit the matchmaker for this one. This year's Nobel Prize in chemistry went to three chemists—Ei-ichi Negishi, Akira Suzuki, and Richard Heck—for discovering catalysts used to tie the knot between carbon atoms on separate molecules. The ability to tailor such molecular unions has spawned whole sectors of advanced technology, making possible the synthesis of everything from anticancer drugs and agricultural pesticides to advanced displays and electronic chips in computers.

    RICHARD HECK (LEFT); EI-ICHI NEGISHI (CENTER); AKIRA SUZUKI (RIGHT)

    CREDITS (FROM LEFT): © THE NOBEL FOUNDATION; D. PERRY/UNIVERSITY OF DELAWARE; PURDUE UNIVERSITY; HOKKAIDO UNIVERSITY

    At the heart of these applications are organic molecules made from chains and rings of carbon atoms. Carbon is the key to organic chemistry—and life—thanks to its ability to link with its neighbors to form molecular chains and rings of an enormous variety of shapes, much as lumber can be nailed together to build houses of almost any design. To build synthetic molecules, chemists continually look for new ways to join together carbon atoms on separate molecules. The trouble is that in most organic molecules, the carbon atoms are happy with where they are and thus unlikely to react with their neighbors. In the early 1900s, German chemists came up with ways to link metal atoms to carbons, making them more reactive and willing to bond with neighbors. But the reactions weren't specific. Instead of producing just the desired molecule, the more-reactive carbons would bond willy-nilly with any carbon around, producing all sorts of junk that had to be tossed out.

    In the late 1960s, Heck, then with chemical manufacturing company Hercules Corp. in Wilmington, Delaware, and later at the University of Delaware, Newark, found that he could tailor just which carbons he wanted to link together. In one early example, Heck first linked a bromine atom to one of the six carbon atoms in a molecule of benzene. This slightly modified the electronic structure of the carbon and tagged it as the one that would react. He then added small, two-carbon molecules called olefins to the solution, as well as palladium. The palladium temporarily binds with both the carbon on the bromine-tagged benzene as well as one from the olefin, bringing them close enough to pair up. When they do so, they form styrene, the building block of polystyrene plastics. The reacting molecules kick the bromine out into solution and send the palladium on its way to orchestrate another hookup. In the late 1970s, Japanese-born Negishi, who spent the bulk of his career at Purdue University in West Lafayette, Indiana, and Suzuki, of Hokkaido University in Sapporo, Japan, modified the approach, adding different tagging atoms as well as metals to tailor the reaction to make other organic compounds.
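
    Written out schematically, the net transformation in Heck's styrene example above is the coupling below; the palladium passes through several intermediate complexes along the way, and in practice a base mops up the liberated HBr:

```latex
% Net Heck coupling in the styrene example described above (schematic).
\[
\mathrm{C_6H_5Br} \;+\; \mathrm{CH_2{=}CH_2}
\;\xrightarrow{\text{Pd catalyst}}\;
\mathrm{C_6H_5CH{=}CH_2} \;+\; \mathrm{HBr}
\]
```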

    Today, the three approaches are collectively known in chemistry parlance as “palladium-catalyzed cross-coupling reactions,” and they continue to grow more popular. “Of all methodologies developed over the past 50 years, it is safe to say that palladium-catalyzed cross-coupling methodologies have had the biggest impact on how organic compounds are made,” says Eric Jacobsen, an organic chemist at Harvard University. “Cross-coupling methods are now used in all facets of organic synthesis, but nowhere more so than in the pharmaceutical industry, where they are used on a daily basis by nearly every practicing medicinal chemist.”

    As a result, Jacobsen and other chemists say they were not surprised by the award. “It was just a matter of time for this chemistry to be recognized,” says Joseph Francisco, a chemist at Purdue University and the president of the American Chemical Society. Jacobsen says the Nobel Committee could have also chosen any of a few other cross-coupling pioneers, such as Barry Trost of Stanford University. But Nobel rules limit the committee to picking no more than three recipients. “I think they got it right,” says Jeremy Berg, who heads the National Institute of General Medical Sciences in Bethesda, Maryland.

    For Negishi in particular, the prize is a dream come true. After immigrating to the United States from Japan, Negishi says he had the opportunity to interact with several Nobel Laureates while studying at the University of Pennsylvania. “I began dreaming about this prize half a century ago,” Negishi says. At a press conference televised in Japan, Suzuki said he hopes his work will have a similar effect on the next generation. “Japan has no natural resources. Knowledge is all we've got,” Suzuki says.

    * With reporting by Dennis Normile in Tokyo.

  9. ScienceInsider

    From the Science Policy Blog

    A National Academies' report on how U.S. universities have managed intellectual property in the wake of the 1980 Bayh-Dole Act has concluded that things are pretty much hunky-dory but that schools may be trying too hard to cash in on discoveries. Universities instead should aim to disseminate technology for the public good, which may mean passing up a more lucrative licensing deal.

    The U.S. Food and Drug Administration is pressing for a $25 million funding boost for research that can help it evaluate new treatments better and faster. Commissioner Margaret Hamburg says such “regulatory science” would allow the agency to help turn the nation's sizable investment in basic biomedical research “into vital products for those who need them.”

    Israel's minister of education, Gideon Sa'ar, has fired his chief scientist for comments that questioned the tenets of evolution and global warming. Gavriel Avital's trial appointment last December had been controversial from the start.

    The National Ignition Facility, the highest energy laser in the world, has fired its first shot in what officials at Lawrence Livermore National Laboratory hope will be a successful campaign to achieve ignition—a self-sustaining fusion burn that produces more energy than was pumped in to make it happen.

    The National Institutes of Health has launched a $60 million program that will allow a few talented young scientists to become independent investigators shortly after earning their Ph.D.—provided they can get jobs with institutions willing to nominate them for the award.

    The European Union has unveiled a new plan to foster innovation. Officials hope its emphasis on making it easier for companies to actually use the fruits of science will bridge a valley of death that slows commercialization.

    For more science policy news, visit http://news.sciencemag.org/scienceinsider.

  10. Can the Census Go Digital?

    1. Sam Kean
    CREDIT: (COLLAGE AND GRAPHICS) N. KEVITIYAGALA/SCIENCE; ISTOCKPHOTO.COM

    After Kenneth Prewitt was dismissed this summer from jury duty, the former director of the U.S. Census Bureau assumed that he wouldn't have to perform this particular civic responsibility again for another 4 years. But the jury foreman had a surprise: New York uses various government databases to form its pool of potential jurors, and Prewitt's name was listed four separate times. For legal reasons, those public agencies can share only small amounts of personal data, making it impossible to determine how many Kenneth Prewitts actually lived in the city.

    For Prewitt, a professor of public affairs and vice-president for Global Centers at Columbia University, the ambiguity meant that he could be called again for jury duty any day. But it also highlights a bigger problem for the agency he once headed as it struggles to meet its constitutional mandate to conduct a decennial nose count of the nation. The 2010 census that's winding down will cost U.S. taxpayers $13 billion, a figure that has roughly doubled with each of the past two censuses. That staggering sum—it's the most expensive census in the world—has prompted policymakers to ask if there are cheaper and better ways for the Census Bureau to do its job. For instance, why not tap into the vast amount of digital data on U.S. residents already being collected by various state and federal agencies and sitting in computers?

    Cold hard data.

    Director Robert Groves greets local officials in Noorvik, Alaska, to kick off the 2010 census.

    CREDIT: CAROLYN KASTER/AP PHOTO

    But Prewitt's experience shows that using digital data is not as straightforward as it would appear. It also suggests that the Census Bureau must be very careful if it decides to rely on it in the future. For starters, a census can't err on the side of counting someone four times. Data from government agencies also contain more mistakes about individual characteristics—age, race, sex, and so on—than a census can tolerate. In addition, few databases come close to delivering the universal coverage the decennial census demands.
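
    Prewitt's quadruple listing illustrates the core difficulty: when agencies can legally share only a handful of fields, records for the same person cannot be collapsed with confidence, and records for different people with the same name risk being wrongly merged. A toy sketch of keying on the shared fields (the records below are invented for illustration, not real agency data) shows both failure modes:

```python
# Toy illustration of why deduplicating administrative records is hard when
# agencies may share only limited fields (the records below are invented).
from collections import defaultdict

records = [
    # (source, name, year_of_birth, borough) -- the only fields shared here
    ("jury_roll",  "Kenneth Prewitt", 1936, "Manhattan"),
    ("tax_roll",   "Kenneth Prewitt", 1936, "Manhattan"),
    ("voter_roll", "Kenneth Prewitt", None, "Manhattan"),   # missing birth year
    ("dmv",        "Kenneth Prewitt", 1971, "Brooklyn"),    # a different person?
]

def dedupe(records):
    """Group records by the shared identifying fields."""
    groups = defaultdict(list)
    for source, name, yob, borough in records:
        groups[(name, yob)].append(source)
    return groups

for key, sources in dedupe(records).items():
    print(key, "->", sources)
# ('Kenneth Prewitt', 1936) -> ['jury_roll', 'tax_roll']   # merged correctly
# ('Kenneth Prewitt', None) -> ['voter_roll']              # same person? unknowable
# ('Kenneth Prewitt', 1971) -> ['dmv']                     # or a genuine second person
# With so few shared fields, a census cannot tell whether this is one resident,
# two, or four -- the overcounting risk described above.
```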

    What's more, there are no obvious technical fixes to these problems. Part of the reason is the dearth of research on the topic. Few demographers have access to these databases to examine these questions. And when they do, the research is very expensive because they must sort through enormous volumes of complicated data.

    U.S. demographers aren't alone in feeling pressure to come up with a better census. In July, the newly elected coalition government in the United Kingdom announced that it might scrap the 2021 census to save money. (The upcoming 2011 census will cost £480 million.) A few weeks before, Canadian Prime Minister Stephen Harper decreed that the long form to be used in the 2011 census would be voluntary rather than mandatory. Harper never consulted his statistics bureau, Statistics Canada, before making the announcement, which he said was being done in response to privacy concerns. Canada's chief statistician resigned in protest, noting that a voluntary questionnaire would generate results incompatible with those of previous censuses even if the response rate is the same, because the volunteers won't be representative of the entire country. “Censuses around the world are having a hell of a time figuring out a design that will guarantee high levels of data quality but keep costs low,” Prewitt observes.

    U.S. Census Director Robert Groves has ordered the bureau to determine how a 2010 digital census might compare to the real one. Although results won't be available for at least 2 years, Groves already knows one thing: “I could not guarantee that, if we wanted a 2011 census and we assembled the information from administrative records, that we could do as good a job.”

    Digital divide

    The future of the census isn't simply a concern for government bureaucrats. Its fate affects researchers across many fields. Census results are used by public health officials to study the quality of medical care, by economists to study the economy, and by epidemiologists to determine how common diseases are. As Nancy Krieger, a social epidemiologist at Harvard School of Public Health in Boston who uses census data in her research, says, even “to determine if heart disease mortality is going up or down, or breast cancer rates are going up or down, without census [population] data you have a problem.”

    While it's not clear that the United States can run a wholly digital census, here's what one might look like. A few months beforehand, the Census Bureau would send workers out with hand-held electronics to update its master address file of all domiciles in the country. These devices would have built-in GPS maps to avoid the need for expensive printed ones, and their ability to record information digitally right away would help avoid problems that arise when blackened bubbles don't (for whatever capricious reason) upload into computers correctly.

    The census itself would likely follow one of two tracks. Under one scheme, people would be directed to Web sites to enter details like name, age, and race. That online process would be far more efficient than having the bureau scan mailed-in questionnaires. Respondents could instantly request a form in another language, too, a feature likely to improve the response rate for recent immigrants, a difficult group to tally. Electronic programs would also prompt people to change nonsensical answers—can someone born in 2003 really be married?—and correct any arithmetic mistakes.
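
    Consistency prompts of that kind are straightforward to express as edit rules. The checks below are illustrative examples of the idea, not the Census Bureau's actual edit specifications:

```python
# Illustrative online-form edit checks of the sort described above; these rules
# are hypothetical examples, not the Census Bureau's actual edit specifications.
from datetime import date

CENSUS_DAY = date(2010, 4, 1)

def validate_person(person: dict) -> list[str]:
    """Return plain-language prompts for answers that look inconsistent."""
    problems = []
    age = CENSUS_DAY.year - person["birth_year"]
    if not 0 <= age <= 120:
        problems.append("Please check the year of birth.")
    if person.get("marital_status") == "married" and age < 15:
        problems.append("Someone born in %d is listed as married. Please confirm."
                        % person["birth_year"])
    return problems

def validate_household(household: dict) -> list[str]:
    problems = []
    if household["reported_count"] != len(household["people"]):
        problems.append("The number of people listed does not match the count given.")
    for person in household["people"]:
        problems.extend(validate_person(person))
    return problems

if __name__ == "__main__":
    household = {
        "reported_count": 2,
        "people": [
            {"birth_year": 1965, "marital_status": "married"},
            {"birth_year": 2003, "marital_status": "married"},   # flagged
        ],
    }
    for msg in validate_household(household):
        print(msg)
```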

    At this point, the census would have gathered information on most people. Instead of sending workers out to knock on doors to fill in the gaps, the census would mine government databases to fill in gaps and check a respondent's answer if it didn't make sense. This would be especially helpful in places like Alaska, where such follow-up activities can be extremely expensive because of the need to fly around the state. The bureau might still deploy workers with hand-held devices, but far fewer than before. And the hand-helds could be updated in real time, for example, to cancel a visit to someone whose form arrived after the tardy list was compiled.

    As a second option, the bureau could decide to use administrative records as the census itself. Alan Zaslavsky, a health statistician at Harvard Medical School in Boston who has advised the Census Bureau, estimates that digital records could gather up to 90% of the data needed without asking anyone any questions. In this case, the Internet form and personal follow-ups would only clear up records with flaws or ambiguities. (The bureau already uses administrative data for between-census updates, but these updates use official census data as a baseline and merely adjust figures up or down.)

    However, each step in a digital census would have to overcome serious technical, statistical, or political obstacles. For instance, the planned transition to hand-held electronics for the 2010 U.S. census turned into a costly mess. Forced to buy equipment years in advance because of the extensive preparation time needed, the bureau was stuck with somewhat outdated technology that was slow to upload data and froze frequently. The problems were exacerbated when officials tried to add new features to the devices, like one critical “dashboard” interface, after conducting a full-scale dress rehearsal in 2007.

    The census finally abandoned the devices for most 2010 fieldwork and reverted to pencils and paper. As for how much the botched effort cost, the bureau says the transition back to paper and other increased costs required an extra $1.4 billion.

    Prewitt also questions how much money would actually be saved by converting to online response forms. For example, to tie someone to an address—so that Jane Q. Public in Hawaii can't get online and claim she lives in Maine—the Census Bureau would have to mail postcards with unique ID numbers to all addresses in the U.S. anyway.

    There's a more fundamental problem facing the census that online forms probably wouldn't address: decreasing response rates. As with most censuses, there are two parts to the U.S. tally. The short form, which goes out every 10 years to every household, asks about age, race, and other basic demographic characteristics. A longer version, which seeks details about housing, transportation, and other issues, is sent to a much smaller population. In the United States, the American Community Survey is conducted monthly and reaches 3 million households a year (Science, 9 April 2010, p. 158).

    Living proof.

    Census workers began using GPS devices in March 2009 to update the bureau's address list.

    CREDIT: U.S. CENSUS BUREAU, PUBLIC INFORMATION OFFICE

    There's only so much the Census Bureau can do to encourage participation. Despite spending $370 million to publicize this year's census—and despite the fact that failure to participate is illegal and subject to a fine of up to $100—only 72% of the 2010 census forms were returned. The rate dipped below 20% in a few rural U.S. counties. Zaslavsky suspects that people are so bombarded with information nowadays that it is simply easier than in the past to blow off the census, no matter the medium. Paradoxically, people seem generally less guarded about providing information than ever before (see sidebar, p. 311).

    Using administrative records would sidestep the low response rates, to be sure. But whacking this mole only causes others to pop up. The Internal Revenue Service (IRS), which has the most promising database because of its wide coverage, often takes a year to process data. That's a nonstarter for the census, which must report to the president by December of each census year. (Using 2019 IRS data for the 2020 census is not an option, either, as it would miss anyone who moved in the intervening year.)

    Demographers worry even more about the accuracy of administrative data. The IRS might not know if it had someone's address or age wrong, for instance. Gathering accurate racial data would be even trickier, says John Czajka, a statistician at the think tank Mathematica Policy Research Institute, who has done research on incorporating government data into the census. Federal law requires the census to collect race statistics to monitor (non)compliance with the Voting Rights Act and Civil Rights Act. But few other agencies do. The other major one that does, Czajka notes, is the Social Security Administration, which began collecting it in the pre–Civil Rights era, when racial classifications were cruder.

    Another problem is that the census would need to supplement federal data with information collected by local agencies, like those that issue food stamps or birth certificates. These systems are generally not comparable across all 50 states, Czajka says. For instance, he has found strong evidence in his research that local offices in different areas that perform the same function will tabulate race in different ways. So there is often no consistent methodology for gathering such data, making it useless for a census.

    Even if data from other agencies are trustworthy, demographers worry that it may be patchy. The IRS, for example, would have little data about people who pay no taxes (e.g., many students and the very poor), and agencies for education or housing would have similar blind spots.

    Those blind spots come on top of the existing shortcomings of most censuses. Many undercount immigrants and minorities, especially the urban poor. (Demographers learned this by borrowing a trick from ecologists, called capture-tag-recapture, in which they interview people in urban neighborhoods, then return to see how many of the same people they meet again.) But demographers have studied these shortcomings for decades and can at least correct the biases. With administrative data, no one yet knows how to quantify the biases.
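
    The ecologists' trick rests on a simple estimator, the standard Lincoln–Petersen formula, stated here for reference (census dual-system estimates elaborate on the same idea):

```latex
% Lincoln-Petersen (capture-recapture) estimate of a neighborhood's true count N,
% where n1 people are found in the first sweep, n2 in the second, and m in both.
\[
\hat{N} \;=\; \frac{n_1 \, n_2}{m}
\]
% Comparing the estimate with the official census count yields the undercount,
% assuming that being counted once does not change the chance of being counted again.
```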

    The nation's denominator

    Some European countries, including Sweden, the Netherlands, and Denmark, have met these challenges by forming a strong central registry that collates data from other agencies and keeps a master file on each person. Demographers salivate over these registries—not only because they make population counts much easier, but because they offer enhanced research opportunities.

    “I'm jealous as can be of what the Nordic countries do with their data,” says Michael Marmot, an epidemiologist at University College London who has worked on population data with the World Health Organization. He'd like to see the United Kingdom build a registry and then compare it to the results of a traditional 2021 census. If the registry looks accurate, he says he would feel comfortable letting the census die.

    Setting up such a registry would be politically unpopular in many Western nations, however. Requiring residents to notify the government every time they move or change jobs “might be regarded as extreme,” says Ivan Fellegi, the longtime head of Statistics Canada who retired in 2008. In fact, those privacy concerns led the U.K. government recently to scrap its plans for a national ID system, a necessary precursor to a registry. Echoing those concerns, in August the Republican National Committee passed a resolution comparing the census to “a scam artist … asking very personal questions and using fear of penalties to manipulate the respondent to answer.”

    While political leaders debate the future of the census, scientists who rely on its data sit and wait, hoping that budget cuts or changes in methodology won't unravel their research. Even simple measures of public health become fraught to calculate when census error bars expand. Says Krieger, “Whenever you have a rate of disease or mortality, it's always a numerator divided by a denominator. And in the U.S. the denominator for mortality rates and many disease rates comes from the census.”
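    To illustrate Krieger's point with invented numbers, the short Python sketch below shows how an error in the census denominator flows directly into a reported mortality rate; the figures are hypothetical and chosen only to make the arithmetic visible.

        # How a census undercount distorts a mortality rate (hypothetical figures).
        deaths = 900                  # numerator: deaths recorded in a county
        true_population = 120_000     # the people who actually live there
        counted_population = 108_000  # a census count that misses 10% of residents

        true_rate = deaths / true_population * 100_000          # 750 per 100,000
        reported_rate = deaths / counted_population * 100_000   # ~833 per 100,000

        print(f"True rate:     {true_rate:.0f} per 100,000")
        print(f"Reported rate: {reported_rate:.0f} per 100,000")
        # A 10% undercount in the denominator inflates the reported rate by about 11%.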

  11. Soap or Census?

    1. Sam Kean

    While response rates to population surveys like the census have been sledding downhill for decades, the average person seems less guarded than ever about airing personal details. As Alan Zaslavsky, a health statistician at Harvard Medical School in Boston, puts it, “People provide vast amounts of information about themselves online. They'll fill out a form for a coupon for a $2 box of soap and [provide] about as much as the Census Bureau asks for.”

    CREDIT: ISTOCKPHOTO.COM

    Is Zaslavsky just engaging in hyperbole, or will people really out themselves for nominal discounts on lavatory cleansers? They will, it seems.

    The census short form asks for a person's name, date of birth, gender, housing status (own or rent), phone number, address, and race—plus details about those who live with him or her. An online form from Procter & Gamble promising $1 off Dawn dish soap asks for pretty much the same things: name, date of birth, gender, phone number, and address. Scoring an Oil of Olay coupon requires dishing up those details, too.

    Neither of those soaps asks for racial data. But an online survey for Tide laundry soap—after pages of questions interrogating your laundry habits and your exact level of awareness of Tide products—does indeed ask for race. And really, there's no reason to pick on soaps. Getting a deal on toiletries or anything else online probably requires trading demographic data. (And at least the census doesn't require e-mail addresses on top of everything else, with invitations to join special census discount clubs.)

    This survey of consumer data-gathering was of course unscientific. But in a way, that's the point: Even a cursory look around cyberspace suggests that Zaslavsky is more right than wrong. When it comes to soap or the census, the people choose soap.

  12. Zoonoses

    In China's Backcountry, Tracking Lethal Bird Flu

    1. Li Jiao*

    QINGHAI LAKE, CHINA—The lake glitters like a sapphire under a blue sky as birds circle near the shore. On the rocky beach, two researchers are tying a GPS transmitter to the back of a small gray duck. They will track its migration by satellite, part of a series of investigations that began after highly pathogenic avian influenza (H5N1 subtype) first swept the region in 2005.

    The studies aim to pinpoint the viral reservoir and the role that wild birds play in transmission. “The lake has attracted the whole world's researchers to keep a close eye on it,” says He Yubang, vice director of the Administration of the Qinghai Lake Chinese National Nature Reserve. No reservoir has yet been found, but transmission routes have come into clearer focus.

    The emergence of H5N1 was a disaster for wildlife and humans alike. Since 2003, H5N1 has killed 300 people, including 18 so far this year, according to the World Health Organization. More than 250 million infected domestic poultry have been culled, and thousands of wild birds have been felled. In 2005 alone, more than 6000 wild birds at Qinghai Lake died, “the single largest H5N1 wild bird mortality event that has ever occurred,” says Scott Newman, an animal health officer for the UN Food and Agriculture Organization (FAO) in Rome.

    H5N1 was first isolated in 1996 from a domestic goose in China's Guangdong Province. The next year, the virus spread to people in Hong Kong. After lying low, H5N1 flared in 2004 in several Asian nations. It kills about 60% of those infected but does not spread easily from person to person. The virus has been held in check by poultry vaccination and better husbandry, but 16 countries, including China and Romania, have reported H5N1 outbreaks in poultry so far this year. A constant worry is that the virus will mutate into a form more transmissible among humans.

    Because Qinghai Lake sits within the eastern portion of the Central Asian Flyway—which reaches from India and Bangladesh to Russia—some experts suspect it is a focal point of viral transmission. Others question whether wild birds play a major role in H5N1 dispersal, suggesting that the virus spreads primarily among poultry (Science, 21 October 2005, p. 426). To date, all human cases but one have been associated with exposure to poultry or found on farms. Researchers now believe that wild waterfowl on the eastern portion of the Central Asian Flyway help spread H5N1 into Mongolia each spring as they move across the Qinghai-Tibetan plateau to the north and east, says Newman. The role of wild waterfowl on the other major flyway is less certain.

    Poultry production is on the rise in Asia, as are farming, trade, and the mixing of wild and domestic birds. “All of them are increasing the opportunities for viral transmission and persistence,” says Xiao Xiangming, a landscape ecologist and remote sensing expert at the University of Oklahoma, Norman.

    Mixing bowl.

    Scientists track birds entering and leaving Qinghai Lake with GPS transmitters.

    CREDITS (TOP TO BOTTOM): HOU YUANSHENG; YANG TAO/COMPUTER NETWORK INFORMATION CENTER

    Every summer, more than 100,000 migratory birds descend on Qinghai Lake, China's largest inland body of salt water. Half the birds that died here in 2005 were bar-headed geese (Anser indicus), says Lei Fu-Min, an ornithologist at the Institute of Zoology of the Chinese Academy of Sciences. Yan Baoping, chief engineer at the Computer Network Information Center in Beijing, led an academy team that set up a monitoring network after the die-off. The next year international scientists joined the effort. To date, the team led by FAO and the U.S. Geological Survey has tracked more than 525 waterfowl from 24 species in 11 countries.

    In the past 5 years, the involvement of wild birds has become clearer, Lei says. “The H5N1 strains from wild birds that subsequently arrived in Asia and Eastern Europe were most like the H5N1 strains of Qinghai Lake,” far from large poultry farms, he says. GPS data on migration paths are now being used for the first time to explore the relationships between different groups of birds and their interactions with domestic fowl, says Diann Prosser, a biologist at USGS's Patuxent Wildlife Research Center in Beltsville, Maryland. This year, she says, researchers learned that the majority of bar-headed geese tagged at Qinghai spend their winters in the Lhasa region of Tibet, south of the lake. These wintering grounds have domestic poultry and captive bar-headed goose farms—and H5N1 outbreaks have been reported there, suggesting a path for the virus to move from captive to wild birds.

    Southeast of Lhasa, the ruddy shelduck may help explain the virus's spread, says John Takekawa, an ecologist at USGS's Western Ecological Research Center. In autumn and winter the ducks gather at Poyang Lake in the lower reaches of the Yangtze River within the East Asia Flyway (Science, 23 October 2009, p. 508). Based on genomic analysis, Qinghai strains can be traced to one early strain from Poyang, says Lei. But recent work suggests that the viral reservoir may lie farther to the north, in Siberia—an area shared by both major Asian flyways—or that another as-yet-unstudied migratory bird may be carrying the virus from lake to lake. Since 2006, Xiao has led an international team to develop an early-warning system for H5N1 in Asia, focusing on agricultural and ecological risk factors.

    Researchers need a better understanding of wild bird distribution, habitat use, and daily movements, Newman says. And the human role—including population growth and urbanization—must be better accounted for, says Takekawa. Why some people exposed to the virus become infected and others do not “is still an unsolved question,” says Shu Yuelong, director of the National Influenza Center of the Chinese Center for Disease Control and Prevention. China has launched a nationwide monitoring network to check poultry markets for H5N1. That's a good start, but what's needed is a global network, says Shu. It must get started now, he says, “without delay.”

    • * Li Jiao is a writer in Beijing.

  13. Nanotechnology

    Nanoparticle Trojan Horses Gallop From the Lab Into the Clinic

    1. Robert F. Service

    On target.

    Red blood cells (yellow) ferry nanoparticles (red) containing a chemotherapy drug to tumor tissue (green) in a mouse. Nanoparticles shield normal cells from chemotherapy toxins and deliver higher doses to tumors.

    CREDIT: COURTESY OF CERULEAN PHARMA INC.

    In the early 1990s, Mark Davis's career was thriving. As a chemical engineer at the California Institute of Technology (Caltech) in Pasadena, Davis pioneered work on catalysts called zeolites. Then in 1995, his wife, Mary, was diagnosed with breast cancer and his research interests took a sharp turn.

    After Mary's mastectomy, her oncologist recommended chemotherapy with a medicine nicknamed the Red Death because its toxic side effects are so debilitating. The surgery and medication worked: Mary's cancer is in remission. During a treatment, she made an offhand comment to Mark that there had to be a better way to design chemotherapy drugs so others wouldn't have to endure what she was going through. He took the comment to heart and in 1996 turned part of his lab over to engineering nanoparticles that ferry toxins into tumor cells before releasing their cargo. Now, 14 years later, one of Davis's compounds has been picked up by a Cambridge, Massachusetts–based company called Cerulean Pharma, which is in the middle of a midstage clinical trial to measure its safety and establish doses for combating various cancers.

    Davis's novel nanoparticle-based medicine is not the only one under development. After many years of studies with cell cultures and animals, nearly a dozen nanoparticle-based drugs are in clinical trials, most of which aim at treating or diagnosing cancer. Many other compounds are progressing through preclinical studies and are nearing human trials. “There is a continuous pipeline” with numerous nanomedicine compounds at each stage of development, says Piotr Grodzinski, who directs the National Cancer Institute's Alliance for Nanotechnology in Cancer in Bethesda, Maryland. Grodzinski, Davis, and others underscore that it will require several more years of testing to determine whether the compounds are safe and effective. However, Davis says, “I'm very optimistic. I think the potential is very high to have some good results.”

    That would be welcome news in the fight against cancer. Despite a decades-long “war” on the disease, the number of people diagnosed remains stubbornly high. In the United States alone, more than 1.3 million people this year will be diagnosed with cancer, and more than 550,000 will die from it. Overall, the rate of death among those who contract cancer has barely changed since 1950. There has been progress, Grodzinski acknowledges. Researchers know far more about the myriad different tumor types and about molecular hallmarks of some forms of the disease, and a few novel treatments have earned widespread attention. Still, today most cancers are treated with the same blunt instruments of surgery, radiation, and harsh chemotherapy that oncologists have wielded for decades. And of the chemotherapies available to patients, many are as toxic to normal cells as they are to cancer cells.

    Nanomedicines have the potential to change that, Grodzinski says, because unlike traditional medicines they can be engineered to optimize several different functions. To treat cancer, a medicine must not only kill tumor cells but also be soluble in water in order to travel through the bloodstream; it must evade immune cell sentries and avoid being cleared out by the liver or kidneys; and it must find its targets. Traditional medicines have to build all these functions into single molecules. Nanomedicines, by contrast, can divide them among different components. Particle surfaces can be tailored for solubility, friendliness to immune cells, and target-seeking ability, while the particles' cargoes can be tailored to kill tumor cells.

    That was the hope, anyway, more than a decade ago when Davis and other researchers first looked into nanoparticle-based medicines. The field received widespread hype early on, and a handful of compounds made it all the way to market. In 2005, for example, the U.S. Food and Drug Administration approved Abraxane for treating metastatic breast cancer. The drug is simply a conventional anticancer compound called paclitaxel—better known by its trademarked name, Taxol—linked to a common blood protein called albumin. The albumin shields the paclitaxel, increasing its solubility and circulation time and giving it a greater chance of winding up in tumor cells. In addition to proving effective in fighting metastatic breast cancer, Abraxane is now in a phase III clinical trial for treating advanced lung cancer and in phase II trials against pancreatic cancer and melanoma. Several other compounds, packaged in lipid vessels called liposomes or combined with biofriendly polymers, have also made it to market. Those successes have the field booming. Grodzinski says more than 50 companies are developing nanoparticle-based medicines as diagnostics and treatments for cancer alone; 34 of them formed in the past 4 years.

    Most early successes have been very simple drug carriers; many next-generation nanoparticles are more complex. In December, for example, Bind Biosciences in Cambridge, Massachusetts, expects to launch a phase I clinical trial of nanoparticle carriers made from a trio of biodegradable polymers abbreviated PLA, PLGA, and PEG. PLA and PLGA are the polymers currently used to make biodegradable sutures; PEG helps shield the particles from being recognized and cleared by immune cells. The combination was originally developed by Robert Langer, a chemical and biomedical engineer at the Massachusetts Institute of Technology, and colleagues. In recent work, Langer's team incorporated the anticancer compound docetaxel into the PLGA polymer matrix and added a targeting molecule that seeks out prostate-specific membrane antigen, a protein expressed on the surface of prostate cancer cells and other types of solid tumor cells. According to Bind's CEO Scott Minick, animal trials showed that the combination of the targeting compound and slow release of the docetaxel by degrading nanoparticles increases the tumor cell concentration of the anticancer drug 20-fold over docetaxel packaged in conventional liposomes. Moreover, Langer notes that the byproducts of the polymer are lactic acid and glycolic acid, naturally occurring substances safe to the body.

    On trial.

    Several first-generation nanomedicines have already made it to market. Now, more than 50 companies are working to bring second-generation nanomedicines to market. A dozen such nanoparticles (NPs) are in clinical trials, most for treating, imaging, and diagnosing cancer.

    Other groups are working on variations on the strategy. Davis's and Cerulean's particles, for example, are engineered to degrade over time while leaving their building blocks intact. The shell of the particles, Davis explains, is made from sugars called cyclodextrins coated with PEG. These sugars contain hydroxyl groups that bind readily with water, making them—and the particles—highly soluble. But once they are inside tumor cells, the acidic environment there breaks the cyclodextrin particles and PEG apart, releasing an anticancer compound called camptothecin. The remaining fragments of the cyclodextrin are small enough to be cleared by the cells and the kidney. In August, at the American Chemical Society meeting in Boston, Cerulean researchers reported that initial results from a phase I trial showed that patients tolerated the compound well, and in several patients with advanced, progressive cancer, the disease stabilized for more than 6 months. Those results are encouraging, says Cerulean's senior vice president for research and business operations, Alexandra Glucksmann, because previous trials showed that giving patients camptothecin alone was too toxic. “This gives us the opportunity to rescue drugs that have failed before,” Glucksmann says.

    Nanoparticles are also being harnessed for less-traditional therapies. Numerous teams are using them to package tiny snippets of specific RNA molecules, in the hope that they can enter tumor cells and kill them by binding to the cells' own RNA molecules required for building essential proteins. This strategy, known as antisense, became a white-hot field in the early 2000s, when numerous teams developed antisense RNAs to block proteins critical to a variety of diseases. Numerous clinical trials using this strategy to kill cancer cells failed, however, primarily because researchers injected antisense RNA directly into patients' bloodstreams, where it was quickly chopped up by enzymes and cleared. “For RNA, nanoparticles are enabling, because delivery is such a key issue,” Langer says.

    An early clinical trial underscores this hope. In the 15 April issue of Nature, Davis and researchers at Calando Pharmaceuticals in Pasadena, California, and several other institutions reported the first results from an initial human clinical trial with nanoparticles packed with RNA designed to target melanoma tumor cells and interfere with critical protein production. The RNA-packed nanoparticles readily penetrated tumor cells, where they blocked the messenger RNA of a gene called RRM2 that cancer cells need in order to multiply. The trial wasn't intended to gauge the particles' efficacy, but Davis says the early results look promising.

    A very different approach to making nanoparticles may also soon revolutionize the way common vaccines are made and delivered. The work builds on progress by Joseph DeSimone and colleagues at the University of North Carolina, Chapel Hill, in using computer chip manufacturing techniques to make nanoparticle medicines. DeSimone's group came up with a sort of nano–cookie cutter approach to mold virtually any organic compound into nanoparticles of whatever size, shape, and stiffness they want. Along the way they found that making such changes yielded big results. Stiff nanoparticles injected into animals, for example, are cleared within as little as 2 hours. But soft, flexible ones circulate for 93 hours. Similarly, cylindrical particles have a knack for getting inside cells far more readily than spheres do. In animal studies, DeSimone says, as many as 15% of the particles they inject can find their way inside tumor cells, compared with about 5% for conventional spherical liposomes.

    DeSimone recently launched a company called Liquidia to commercialize the technology. Liquidia is working to deliver particles packed with anticancer drugs and RNA. But in an initial clinical trial, likely to begin later this year, the company intends to deliver particles shaped like pathogenic bacteria to carry influenza proteins already used in vaccines. Animals injected with pathogen-shaped particles produce antibody titers as much as 10 times as high as those in animals dosed with conventional vaccines, DeSimone says. Working with flu proteins that are already part of conventional vaccines could also help Liquidia get its initial vaccines to market more quickly. “We think it's just a beachhead” and that many other products will soon follow, DeSimone says.
