News this Week

Science  17 Dec 2010:
Vol. 330, Issue 6011, pp. 1596
  1. U.S. Science Policy

    NSF Won't Build Underground Lab; Scientists Hope That DOE Will

    1. Adrian Cho*

    Plans to convert an abandoned gold mine in South Dakota into the world's largest underground lab may have to be scaled back and could fall apart entirely after the National Science Foundation's (NSF's) oversight board rejected the current proposal.

    On 30 November, the National Science Board (NSB), which sets policy for NSF, turned down the agency's own request for additional money to support development through 2011 of the $875 million Deep Underground Science and Engineering Laboratory (DUSEL) at the Homestake mine in Lead, South Dakota. That puts the onus on the Department of Energy (DOE), NSF's partner in the project, to pay for whatever it wants to build.

    Supporters are putting the best light on the board's decision. “This is not the end of the project,” says Kevin Lesko, a physicist at the University of California, Berkeley, who leads the design team. “We are continuing conversations with the NSB, NSF, and DOE to resolve their issues and to maintain the project's momentum.” But Edward Seidel, who leads NSF's mathematical and physical sciences directorate, says that the 10-year campaign to build the underground facility has entered uncharted waters: “One option that's no longer open is that the NSF would steward the facility.” In other words, NSF may support experiments to be done at DUSEL, but it won't build the lab itself.

    Starting over?

    The science board's decision shreds the current management plan for DUSEL.

    SOURCE: DUSEL

    Researchers in Italy and Japan have enormous underground labs, and earlier this week China opened the world's deepest lab, 2500 meters down in a tunnel through Jinping Mountain in Sichuan Province (Science, 5 June 2009, p. 1246). DUSEL was supposed to be the biggest lab and, with a chamber at 2250 meters, among the deepest (Science, 12 November, p. 904). The great depth would shelter experiments from cosmic rays, and the huge capacity would allow for a suite of physics experiments whose results could be revolutionary, such as searches for particles of the mysterious dark matter whose gravity binds the galaxies and for a kind of radioactivity that would blur the distinction between matter and antimatter. The lab would also give geophysicists and microbiologists rare access to Earth's innards.

    DUSEL began its rocky journey in 2001, when researchers tried and failed to win a congressional earmark to prevent the then-functioning Homestake mine from filling with water after it was shut down. NSF then held a site competition and in July 2007 chose Homestake, by then under water, based on a conceptual design. In September 2009, NSB approved $29 million to support a detailed preliminary design to be finished this year.

    But that money didn't cover what's needed to continue operations, a key consideration because the pumps must keep running to keep the partially drained mine from filling up again. In March, South Dakota appropriated $5.4 million to keep things going through May 2011. NSF and the research team requested $19 million to tide them over until NSB could evaluate the design late next year, with the expectation that the board would provide $10 million more in the spring.

    Instead, the NSB panel decided that the current plan was “unacceptable.” Members objected to the two agencies' plan to share responsibilities and costs, says Mark Abbott, chair of the NSB committee that took action last month. Under that “stewardship model,” NSF would have borne the $480 million expense of developing DUSEL's infrastructure, while DOE would have taken the lead on the lab's biggest experiment, a particle detector called the Long-Baseline Neutrino Experiment (LBNE) weighing up to 200,000 metric tons and costing at least $660 million. That experiment would snare neutrinos fired through Earth from DOE's Fermi National Accelerator Laboratory in Batavia, Illinois, and would be Fermilab's flagship project in coming decades.

    The NSB panel also questioned whether the massive infrastructure project fits the mission of NSF, which ordinarily builds scientific instruments such as telescopes. “We're a science agency, not a mission agency, or a facilities agency, or a big infrastructure agency,” says Abbott, a biological oceanographer at Oregon State University, Corvallis. In contrast, DOE's Office of Science supports a network of national labs and user facilities specifically chosen to further DOE's mission.

    NSB's decision affects the particle physics programs at both NSF and DOE. LBNE designers aren't wedded to a site, says Robert Tschirhart, a physicist at Fermilab, and an alternative could be the Sanford Underground Laboratory, the portion of Homestake that South Dakota has opened with $34.2 million of its own money and $70 million from philanthropist T. Denny Sanford in preparation for DUSEL approval.

    The good news for DUSEL developers is that DOE seems willing to try to salvage the project. “We have not made any decisions yet, but we are trying not to lose all the effort that has been put into the project so far,” says William Brinkman, head of DOE's Office of Science. In the meantime, there is only enough money to keep the water pumps running for a few more months.

    • * With reporting by Jeffrey Mervis.

  2. Climate Negotiations

    Cancún Delegates See the Trees Through a Forest of Hot Air

    1. Eli Kintisch,
    2. Antonio Regalado

    A new deal inked last week at the United Nations climate talks in Mexico paves the way for rewarding nations for maintaining their forests. A central facet of the agreement, which puts the protection of trees at the vanguard of climate negotiations, is a call for developing countries to begin calculating national totals for emissions from forest loss and to adopt systems for monitoring deforestation.

    The deal, which rests upon years of research and political maneuvering, opens the door for future trading of forest credits. Paying poor nations to preserve trees is considered a key component of any long-term strategy to reduce emissions. It also enshrines the scientific finding that carbon in forests can and should be measured at national scales. “Once you have the science demonstrating how it can work, it makes it possible politically,” says Paulo Moutinho, executive director of Brazil's Instituto de Pesquisa Ambiental da Amazônia.

    Deforestation, whether trees are felled or burned, accounts for as much as a quarter of greenhouse gas emissions from human activity worldwide. The agreement on reducing emissions from deforestation and forest degradation, known as REDD, sets the stage for an arrangement in which countries would be compensated for slowing deforestation. Nations that keep trees standing might then sell credits to developed countries, or even individual polluters. The agreement does not, however, spell out a compensation mechanism. “How this stuff will be funded has been kicked down the road,” says Steve Schwartzman of the Environmental Defense Fund (EDF) in Washington, D.C.

    REDD was a high point among a series of modest achievements forged in the 2-week meeting in Cancún that ended on 11 December. Another agreement is a $100-billion-per-year green fund for developing nations to adapt to climate change and promote clean energy, although as with REDD, no one knows how that money will be raised. Negotiators also made progress on rules for sharing technology and monitoring emissions reductions. Mexican President Felipe Calderón said the meeting removed the “inertia of mistrust” that has existed, particularly between poor and rich countries.

    Lumbering giant.

    Agreement highlights the role of forests in reducing carbon emissions.

    CREDIT: GALEN ROWELL/CORBIS

    Climate negotiations nearly broke down a decade ago over the REDD question. Negotiators initially wanted to allot credits to small-acreage conservation projects that could be carefully monitored. But protecting trees in one place might simply cause them to be cut down elsewhere.

    Forest scientists and policy experts have since crafted the national-scale strategy at the heart of the REDD deal. One key step was a proposal in a 2005 paper by Brazilian and U.S. researchers, including Schwartzman, for “compensated reduction,” a process in which countries could be rewarded for stemming deforestation at a national level.

    In a series of meetings since then, remote-sensing experts determined that tools to calculate national totals do exist. Those workshops “showed the U.N. parties that you could actually measure this stuff,” says Annie Petsonk, a forest expert with EDF.

    The ability to monitor its forests has been a boon to Brazil, where, until a few years ago, deforestation was a source of shame and raw data were confidential. “Five or 6 years ago, Brazil's government didn't want to discuss deforestation, so they would say we didn't know how much carbon there was in the forest,” says Moutinho. “Now it's hard to find anyone who questions the measurement technology.” After building up the largest remote-sensing agency of any tropical nation, Brazilian officials arrived at Cancún touting the lowest rate of national deforestation in more than 2 decades.

    Researchers have also made progress in understanding the most effective ways to preserve forests. One powerful tool, they say, is demarcating territory for indigenous groups, which has taken place across millions of hectares throughout Latin America in the past few years. “Before, that was just an argument. Now there is economic and remote-sensing data that backs up” that approach, says David Kaimowitz of the Ford Foundation in Mexico City.

    Some researchers have also argued that REDD may be the cheapest way to reduce carbon emissions. That “created [the] political will,” says Kaimowitz. Scientists are already plotting their next steps, including how to merge remote-sensing data with on-the-ground monitoring. The biggest near-term challenge, scientists say, will be helping developing countries put in place the “robust and transparent” national forest-monitoring systems stipulated under the agreement. The Cancún deal requires countries to propose rules “for measuring, reporting and verifying anthropogenic forest-related emissions” before next year's meeting in Durban, South Africa.

    Some solutions are already coming into view. At Cancún, the charity arm of the search company Google unveiled Google Earth Engine, which will allow users to study deforestation using its computing power. And a group at the Carnegie Institution for Science in Stanford, California, has been training hundreds of experts from several countries to analyze deforestation data.

    Although REDD now has a general framework, it will take years of negotiations to fill in the details. One especially sticky question is setting the baseline year against which to measure progress. Still, remote-sensing and forestry experts came away from Cancún revitalized. “It makes me feel that the last 5 years of not sleeping has been worth it,” says Carnegie tropical ecologist Greg Asner. “We'll keep being sleepless, but we're going to be celebrating this.”

  3. High Technology

    Haunted by 'Specter of Unavailability,' Experts Huddle Over Critical Materials

    1. Dennis Normile

    Going critical.

    A worker pours the rare earth metal lanthanum into molds at a workshop in China.

    CREDIT: DAVID GRAY/REUTERS

    When shipments of rare earth metals from China to Japan temporarily stopped in the wake of a territorial spat this autumn, high-tech companies around the world got an uncomfortable reminder that China has a stranglehold on supplies of the coveted commodities. The episode may have attracted more attention than China bargained for. The U.S. Department of Energy (DOE) and counterparts in Japan and Europe have held workshops in recent weeks bringing materials scientists and policymakers together to brainstorm on how to ensure supplies of rare earths and other strategic minerals and to stimulate research on alternatives.

    The meetings, at Lawrence Livermore National Laboratory in California in November and at the Massachusetts Institute of Technology in Cambridge, Massachusetts, earlier this month, are expected to help DOE chart a strategy for critical materials, participants say. The quickest fix for restricted supply would be boosting production outside of China. But no one expects breakthroughs on supply or alternative materials overnight.

    Rare earths—elements 57 through 71 on the periodic table plus scandium and yttrium—are key components in products such as solar panels and hybrid cars. Around 95% of the current supply comes from China, which is holding back an increasingly large share of production for its own use (Science, 11 September 2009, p. 1336). The recent controversy over shipments to Japan has observers worried about China's willingness to squeeze supplies for political purposes.

    “The specter of mineral unavailability” due to geopolitical risks or market imbalances haunts other elements as well, says Roderick Eggert, a mineral resources economist at the Colorado School of Mines in Golden. Much of the world's lithium, a key component in batteries for consumer electronics and electric vehicles, is found in Bolivia. Congo and Zambia are the main sources of cobalt, used in high-strength alloys and as an industrial catalyst. Indium and tellurium, ingredients of photovoltaic cells and flat panel displays, are byproducts of, respectively, zinc and copper processing. Because the availability of indium and tellurium depends on demand for zinc and copper, supplies of the two won't necessarily increase as prices rise, Eggert says.

    Even before China flexed its muscles, experts in other countries were fretting over critical materials. A few years ago, Toru Nakayama, who manages materials research programs for the New Energy and Industrial Technology Development Organization (NEDO) in Kawasaki, Japan, drafted a rare metal substitute development project while at the Ministry of Economy, Trade and Industry. Since 2007, NEDO has organized consortia of researchers from companies, universities, and institutes to work on reducing or replacing indium, dysprosium, platinum group metals, and four other elements in high-tech products. NEDO has provided $82 million over the past 4 years for these efforts. As that effort was getting started, the U.S. National Research Council warned in 2008 that a looming shortage of critical minerals could hinder adoption of emerging clean energy technologies. And last June, the European Commission released a report citing concerns over access to 14 critical raw materials.

    On the rare earth supply issue, workshop participants agreed that immediate relief would come from “getting rare earth mines outside of China up to speed as fast as you can,” says Karl Gschneidner Jr., a metallurgist at DOE's Ames Laboratory and Iowa State University. Two rare earth mines—one in the United States and another in Australia—could come on stream in 2011, says Eggert. In the meantime, he says, manufacturers will be working to make more efficient use of the scarce minerals. Several NEDO projects aim to optimize manufacturing processes or tweak materials to get the same performance out of less mineral.

    The toughest challenge may be to replace rare elements (Science, 26 March, p. 1597). Some materials have been in use for several decades, and “nobody has found substitutes yet,” says Gschneidner. For example, he says, phosphors in lighting and optics applications have unique properties difficult to find in other materials.

    The workshops showcased some progress on replacements. One NEDO-funded group has produced a trial liquid crystal display that uses zinc oxide instead of indium tin oxide for some electrodes, reducing indium use by 45%. Dysprosium, used in the magnets in motors powering electric and hybrid vehicles, is another target. Several groups have developed dysprosium-free magnets, but they weaken at car motor operating temperatures, says Kunihiro Ozaki, a materials scientist at Japan's National Institute of Advanced Industrial Science and Technology. By tweaking grain properties and strictly controlling manufacturing, researchers hope to strengthen these magnets' thermal resistance—but that could take 5 years or more, Ozaki says.

    Workshop participants came away with a clearer picture of what colleagues are doing in other countries. “It was very valuable to understand the extent of the program that's ongoing in Japan,” says John Hryn, a materials scientist at Argonne National Laboratory in Illinois. The first cooperative effort is likely to be between the geological surveys of the United States and Japan to assess rare mineral deposits around the world, Eggert says. Beyond that, Hryn says there are a number of ongoing projects at Argonne focused on magnets, phosphors, and catalysts that use rare earths, creating “opportunities for future collaborations.” These will take time to coalesce and bear fruit. In the meantime, China is sitting pretty with most of the rare earth cards.

  4. Demographic Research

    Asia's Looming Social Challenge: Coping With the Elder Boom

    1. Richard Stone

    BEIJING—Asia is graying fast. The portion of the population age 65 or older will more than triple in China, India, and Indonesia—and more than double in Japan—between 2000 and 2050, according to United Nations projections. That's a triumph of civil society and medical science, but it also poses stiff challenges to Asian nations trying to create or strengthen social safety nets. “Responding to these challenges will be one of the most difficult tasks facing governments in the first half of this century,” warns a report* released here last week by five science academies.

    CREDITS (FROM TOP): WORLD POPULATION PROSPECTS: THE 2008 REVISION, REPRINTED WITH PERMISSION FROM THE NATIONAL ACADEMY OF SCIENCES, (DATA) UNITED NATIONS; PREPARING FOR THE CHALLENGES OF POPULATION AGING IN ASIA/J. P. SMITH

    Long-term population studies under way or about to begin will chart demographic, economic, and health transitions—and give governments worldwide clues about how to cope with the aging boom. One hallmark of the studies is that all will put data in the public domain—not a common practice in Asia. “This is a scientific revolution,” says James P. Smith, a senior economist at RAND Corp. in Santa Monica, California. “There are no secrets anymore. This is our version of WikiLeaks.”

    Asia's demographic shift is happening at a torrid clip. In the United States, it will take approximately 70 years for the percentage of the population age 65 and older to rise from 7% to 14%. Such a doubling is expected to occur in only about 25 years in China, India, and Indonesia, according to the report. “The growth in the number of the elderly will speed up very fast, and we are not prepared,” says Mayling Oey-Gardiner, an economics professor at the University of Indonesia in Jakarta. “There are no social safety nets in Indonesia.”

    That's also true in China, where several factors—including its 30-year-long one-child policy and a tide of young migrant workers seeking their fortunes in cities—will increasingly leave elderly people to fend for themselves, especially in the countryside. In a revealing survey in 72 villages in Shandong Province, Wang Guangzhou and his colleagues at the Chinese Academy of Social Sciences' Institute of Population and Labor Economics here found that more than 80% of the elderly were in debt. Most could not afford medical care and lived in dilapidated housing.

    A new generation of population studies (see table) modeled after the 20-year-old U.S. Health and Retirement Study will document, among other things, to what extent children will support elderly parents, how much money older Asians have put away for retirement or receive from pensions or governments, and how medical systems will manage. “We will learn what the myths are and what the facts are—what's happening in people's lives,” says Smith.

    It will be up to policymakers to implement the lessons. And the longer Asian governments wait to craft policies to care for graying populations, the more constrained their choices will be, says the report. Asia may not want to look to the United States for answers, says Robert Hauser of the National Academies. “We don't have a good model,” he says. “We have a terrible situation with respect to [overreliance on] institutional care of the elderly.” The population studies may not offer ready solutions, but they will give a vivid picture of the transformation of Asian societies.

    • * Preparing for the Challenges of Population Aging in Asia, the Chinese Academy of Social Sciences, the Indian National Science Academy, the Indonesian Academy of Sciences, the National Research Council of the U.S. National Academies, and the Science Council of Japan.

  5. ScienceInsider

    From the Science Policy Blog

    The FBI has belatedly provided an expert panel with new information that will delay a long-awaited report on the scientific merits of the government's investigation into the deadly 2001 anthrax mailings. Five weeks after getting a draft report, FBI officials provided new material and asked to appear before the committee.

    Political leaders in three western states are seeking help from Congress to remove the northern Rocky Mountain gray wolf from protection under the Endangered Species Act. “There do need to be discussions,” says one California wildlife biologist. “But this needs to be done with the best science possible, not with legislation.”

    Eight bioethicists at the University of Minnesota are charging that their own institution has committed an “alarming series of ethical violations” in a clinical trial during which a young man committed suicide in 2004. They say that his death wasn't adequately investigated.

    The Irish government has increased its funding for research in 2011 by 12% despite being forced to make €6 billion in cuts following its recent bailout. But an emphasis on commercial research has some scientists concerned that funding for basic science will suffer.

    European nuclear physicists released a long-range plan that sets as their top priorities completion of the Facility for Antiproton and Ion Research in Darmstadt, Germany, and of SPIRAL2, a complementary machine at France's GANIL lab in Caen.

    A new test for tuberculosis that is cheaper, faster, and more accurate than standard tests has received an important thumbs-up from the World Health Organization.

    For more science policy news, visit http://news.sciencemag.org/scienceinsider.

  6. Digital Data

    Google Opens Books to New Cultural Studies

    1. John Bohannon

    In March 2007, a young man with dark, curly hair and a Brooklyn accent knocked on the door of Peter Norvig, the head of research at Google in Mountain View, California. It was Erez Lieberman Aiden, a mathematician doing a Ph.D. in genomics at Harvard University, and he wanted some data. Specifically, Lieberman Aiden wanted access to Google Books, the company's ambitious—and controversial—project to digitally scan every page of every book ever published.

    By analyzing the growth, change, and decline of published words over the centuries, the mathematician argued, it should be possible to rigorously study the evolution of culture on a grand scale. “I didn't think the idea was crazy,” recalls Norvig. “We were doing the scanning anyway, so we would have the data.”

    The first explorations of the Google Books data are now on display in a study published online this week by Science (www.sciencemag.org/content/early/2010/12/15/science.1199644.abstract). The researchers have revealed 500,000 English words missed by all dictionaries, tracked the rise and fall of ideologies and famous people, and, perhaps most provocatively, identified possible cases of political suppression unknown to historians. “The ambition is enormous,” says Nicholas Dames, a literary scholar at Columbia University.

    CREDITS: J. B. MICHEL ET AL.; WORDLE.COM

    The project almost didn't get off the ground because of the legal uncertainty surrounding Google Books. Most of its content is protected by copyright, and the entire project is currently under attack by a class action lawsuit from book publishers and authors. Norvig admits he had concerns about the legality of sharing the digital books, which cannot be distributed without compensating the authors. But Lieberman Aiden had an idea. By converting the text of the scanned books into a single, massive “n-gram” database—a map of the context and frequency of words across history—scholars could do quantitative research on the tomes without actually reading them. That was enough to persuade Norvig.
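
    To make the n-gram idea concrete, here is a minimal sketch of a year-by-year n-gram tally of the kind such a database rests on, assuming plain-text books tagged with a publication year; the tokenization, function names, and data layout are illustrative only and are not Google's actual pipeline.

```python
from collections import Counter, defaultdict

def ngrams(tokens, n):
    """Yield successive n-word tuples from a list of tokens."""
    for i in range(len(tokens) - n + 1):
        yield tuple(tokens[i:i + n])

def build_ngram_table(books, n=1):
    """books: iterable of (year, text) pairs.
    Returns {year: Counter mapping each n-gram to its occurrence count}."""
    table = defaultdict(Counter)
    for year, text in books:
        tokens = text.lower().split()  # crude tokenization, for illustration only
        table[year].update(ngrams(tokens, n))
    return table

def frequency_trace(table, phrase):
    """Relative frequency of a phrase in each year (count / total n-grams that year)."""
    target = tuple(phrase.lower().split())
    trace = {}
    for year, counts in sorted(table.items()):
        total = sum(counts.values()) or 1
        trace[year] = counts[target] / total
    return trace

# Example with two miniature "books":
table = build_ngram_table([(1900, "darwin and freud and darwin"),
                           (1950, "freud freud darwin")], n=1)
print(frequency_trace(table, "darwin"))  # {1900: 0.4, 1950: 0.333...}
```

    A per-year relative frequency computed this way is the raw material behind the fame and censorship traces described in the following paragraphs.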

    Lieberman Aiden teamed up with fellow Harvard Ph.D. student Jean-Baptiste Michel. The pair were already exploring ways to study written language with mathematical techniques borrowed from evolutionary biology. Their 2007 study of the evolution of English verbs, for example, made the cover of Nature. But they had never contended with the amount of data that Google Books offered. It currently includes 2 trillion words from 15 million books, about 12% of all the books published in any language since the Gutenberg Bible in 1450. By comparison, the human genome is a mere 3-billion-letter poem.

    Michel took on the task of creating the software tools to explore the data. For the analysis, they pulled in a dozen more researchers, including Harvard linguist Steven Pinker. The first surprise, says Pinker, is that books contain “a huge amount of lexical dark matter.” Even after excluding proper nouns, more than 50% of the words in the n-gram database do not appear in any published dictionary. Widely used words such as “deletable” and obscure ones like “slenthem” (a type of musical instrument) slipped below the radar of standard references. By the research team's estimate, the size of the English language has nearly doubled over the past century, to more than 1 million words. And vocabulary seems to be growing faster now than ever before.

    It was also possible to measure the cultural influence of individual people across the centuries. For example, notes Pinker, tracking the ebb and flow of “Sigmund Freud” and “Charles Darwin” reveals an ongoing intellectual shift: Freud has been losing ground, and Darwin finally overtook him in 2005.

    Analysis of the n-gram database can also reveal patterns that have escaped the attention of historians. Aviva Presser Aiden led an analysis of the names of people that appear in German books in the first half of the 20th century. (She is a medical student at Harvard and the wife of Erez Lieberman Aiden.) A large number of artists and academics of this era are known to have been censored during the Nazi period, for being either Jewish or “degenerate,” such as the painter Pablo Picasso. Indeed, the n-gram trace of their names in the German corpus plummets during that period, while it remains steady in the English corpus.

    Once the researchers had identified this signature of political suppression, they analyzed the “fame trace” of all people mentioned in German books across the same period, ranking them with a “suppression index.” They sent a sample of those names to a historian in Israel for validation. Over 80% of the people identified by the suppression index are known to have been censored—for example, because their names were on blacklists—proving that the technique works. But more intriguing, there is now a list of people who may have been victims of suppression unknown to history.
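
    The team's exact formula is not given here, but one plausible reading of a suppression index — an assumption made purely for illustration — is the ratio of a name's average frequency during the Nazi years to its average frequency in the surrounding decades, computed separately in the German and English corpora:

```python
def mean_frequency(trace, start, end):
    """Average relative frequency over the inclusive year range [start, end]."""
    values = [f for year, f in trace.items() if start <= year <= end]
    return sum(values) / len(values) if values else 0.0

def suppression_index(trace, era=(1933, 1945),
                      baseline=((1920, 1932), (1946, 1958))):
    """Ratio of a name's frequency during the era to its frequency around it.
    Values well below 1 suggest the name was being suppressed."""
    era_mean = mean_frequency(trace, *era)
    base_mean = sum(mean_frequency(trace, *rng) for rng in baseline) / len(baseline)
    return era_mean / base_mean if base_mean > 0 else float("nan")

def censorship_candidates(german_traces, english_traces, threshold=0.5):
    """Flag names that drop sharply in the German corpus but not in English."""
    flagged = []
    for name, german_trace in german_traces.items():
        g = suppression_index(german_trace)
        e = suppression_index(english_traces.get(name, {}))
        if g < threshold <= e:
            flagged.append((name, g, e))
    return flagged
```

    Under this reading, a name whose German-corpus index plunges while its English-corpus index holds steady is exactly the kind of candidate the validation against known blacklists was meant to test.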

    “This is a wake-up call to the humanities that there is a new style of research that can complement the traditional styles,” says Jon Orwant, a computer scientist and director of digital humanities initiatives at Google. In a nod to data-intensive genomics, Michel and Lieberman Aiden call this nascent field “culturomics.” Humanities scholars are reacting with a mix of excitement and frustration. If the available tools can be expanded beyond word frequency, “it could become extremely useful,” says Geoffrey Nunberg, a linguist at the University of California, Berkeley. “But calling it ‘culturomics’ is arrogant.” Nunberg dismisses most of the study's analyses as “almost embarrassingly crude.”

    Although he applauds the current study, Dames has a score of other analyses he would like to perform on the Google Books corpus that are not yet possible with the n-gram database. For example, a search of the words in the vicinity of “God” could reveal “semantic shifts” over history, Dames says. But the current database only reveals the five-word neighborhood around any given term.

    Orwant says that both the available data and analytical tools will expand: “We're going to make this as open-source as possible.” With the study's publication, Google is releasing the n-gram database for public use. The current version is available at www.culturomics.org.

  7. U.S. Science Policy

    Leaving Congress, Physicist Bill Foster Calls for Reinforcements

    1. Eli Kintisch

    A month after losing his seat in the U.S. House of Representatives, physicist-turned-lawmaker Bill Foster has a parting message for his fellow scientists: Replace me in Congress. And he's eager to help them do it.

    Foster, 55, is a Democrat from an Illinois district that includes the Fermi National Accelerator Laboratory, where he spent 22 years as an experimental physicist. In 2006, he decided to “try to do something” about what he saw as “poor decisions” by Washington politicians. In March 2008, he was the surprise winner in a special election held to replace former Speaker Dennis Hastert, a Republican who retired a year after the Democrats won control of the House in November 2006.

    Foster's allies portrayed him as an antipolitician with business savvy—he started and ran a successful company making lighting equipment. But after winning a full term in November 2008, his support for the president's $800 billion stimulus package and health care reform bill put him at odds with his traditionally conservative district. On 2 November, he lost 51% to 45% to Republican Randy Hultgren, a state legislator and lawyer who lambasted “Bill's boondoggle.” The high unemployment rate doomed him, Foster says: “People were looking for someone to blame.”

    Despite his defeat, Foster thinks his scientific training helped him as a legislator. He pushed his staff to delve into the details of policy issues, polling data, and even get-out-the-vote efforts on Election Day, recalls Thomas Bowen, a former aide. “As chief of staff, I was always forced to ‘show my work,’” says Bowen. During the yearlong debate over health care, Foster says, he would call experts to chase down numbers “that seemed fishy.”

    “An engineer or scientist can cut through the sound-bite level of debate that is common in politics,” says Foster. That skill was especially useful in scrutinizing the complex financial products, many designed by scientists, whose failure contributed to the meltdown on Wall Street. “It takes a physicist to unwind what physicists did to structured finance,” he says.

    Using his perch on the House Committee on Financial Services, Foster persuaded colleagues to include six amendments in the Dodd-Frank financial reform bill passed earlier this year, including studies on derivatives and steps to compel banks to act less recklessly. Even so, his colleagues declined to adopt one idea that grew out of his years as an experimental physicist: a rule that complex derivatives—often spelled out in reams of legalese—be expressed as algorithms so that their possible impacts could be modeled on computers. The approach, he says, would mirror “very sophisticated software tools that get shared across experiments.”
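
    To picture what expressing a derivative as an algorithm might mean in practice, here is a toy sketch — not drawn from the amendment or from any real contract — in which a contract's payoff is an executable function whose behavior can be checked automatically under many simulated market scenarios:

```python
import random

def barrier_call_payoff(price_path, strike=100.0, barrier=80.0):
    """Toy contract: pays max(final price - strike, 0), but is knocked out
    (pays nothing) if the price ever falls below the barrier."""
    if min(price_path) < barrier:
        return 0.0
    return max(price_path[-1] - strike, 0.0)

def random_walk_paths(start=100.0, daily_vol=0.02, steps=250, n_paths=10000, seed=1):
    """Very crude random-walk price scenarios, for illustration only."""
    rng = random.Random(seed)
    for _ in range(n_paths):
        price, path = start, []
        for _ in range(steps):
            price *= 1.0 + rng.gauss(0.0, daily_vol)
            path.append(price)
        yield path

# Because the payoff is executable, its behavior under thousands of scenarios
# can be modeled directly, which is the point of Foster's proposal.
payoffs = [barrier_call_payoff(path) for path in random_walk_paths()]
print(sum(payoffs) / len(payoffs))  # average payoff under the toy model
```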

    The need for more legislators with scientific backgrounds—“More than 80% of the people running China are engineers, and you see it in their [economic] behavior,” Foster says—may even serve as the basis for the next stage of his career. “One of the things I've been contemplating [launching] is called Albert's list,” he says. (The name comes from arguably the greatest physicist of all time.) It would be modeled on EMILY's List, the successful pro-choice political action committee formed in 1985, and it would go where previous efforts to get scientists politically involved have feared to tread.

    Scientists and Engineers for America “never actively recruited and developed candidates the way EMILY's List does,” Foster says about SEA, which has conducted workshops and offered fellowships since being formed in 2004. “EMILY's List very consciously identifies promising candidates, trains them, and gives them [help in] public fundraising.”

    Training camp.

    Bill Foster wants to start an organization to help scientists run for office.

    CREDIT: LLOYD DEGRANE

    One issue that must be resolved before any organization is created, Foster says, is whether to make the group partisan. “[Another is] whether you concentrate on only federal candidates or work on a state-by-state basis, or whether you recruit people to go into the federal bureaucracy.” Then there's the question of applying a scientific litmus test. “You could imagine insisting, or not, that they have a science-based attitude about climate change, or, more controversially, about evolution,” he says.

    What type of person is Foster hoping to attract? Potential recruits would have to decide “what fraction of your life you should spend serving your fellow man,” he says. And he thinks the best candidates would “have a sincere desire to serve and not to become famous. Voters detect that very rapidly.”

    As for his own political aspirations, there's talk that a Democratic-controlled state legislature in Illinois could create a congressional district next year that would be more compatible with Foster's liberal views. “It's way too early to think about that,” he says. But Bowen thinks it could happen. “This may not be the last we hear from Bill Foster,” he says.

    If Albert's List becomes a reality, Foster could return to a Congress with more members who are scientists—and one that is more capable of dealing with an increasingly complex world.

  8. Cancer Screening

    Genetic Analysis Points the Way to Individualized PSA Tests

    1. Jennifer Couzin-Frankel

    In cancer research, one never-ending quest is the hunt for biomarkers, proteins in the blood that reveal tumors deep in the body. The goal is to find a reliable predictor that enables early diagnosis and better treatment of disease. Most biomarkers identified so far have been disappointing, but millions of men are tested for one of them: PSA, or prostate-specific antigen, which cells in the prostate gland churn out at higher levels when cancer is present.

    But some men with high PSA levels don't have the disease, leading to unnecessary biopsies, and some with low PSA levels do. PSA testing is routine in some places, yet large clinical trials in the United States and Europe are questioning how big an impact it has on reducing cancer deaths. This week, a group in Iceland suggests a new way to assess PSA results: Acknowledge that under normal circumstances some men produce more (and some produce less) PSA, and individualize healthy and high levels for each man tested by focusing on genetic variants that affect PSA levels.

    “Let's say that a PSA level of 4 is considered worrisome,” says William Catalona of Northwestern University Feinberg School of Medicine in Chicago, Illinois. He developed the original PSA test and is a co-author of the new work, published in Science Translational Medicine. “You might be able to take something like a tongue depressor and scrape a man's cheek,” and with DNA testing “you might find out this man is a high PSA secretor” normally, reducing concern.

    Adjustable baseline.

    A man's genes affect normal PSA levels. The border marking the danger zone for cancer risk (dotted line) may shift, depending on an individual's DNA.

    The researchers on the PSA study, led by geneticists Julius Gudmundsson and Kari Stefansson of deCODE Genetics in Reykjavik, began by scanning the genomes of about 16,000 men to identify places in the genome that modify the PSA level in blood. They found six, including three that were previously identified. Then the group compared more than 5000 men with prostate cancer and 41,000 without. They assessed whether variants at the six sites modified PSA and prostate cancer risk. Although it's difficult to disentangle genetic effects on PSA levels from those on prostate cancer risk, the researchers concluded that two of the variants influenced only PSA levels and two others had only a marginal effect on cancer, making all four useful for testing.

    PSA levels for the men in the study varied widely. The key was whether genetic analysis would shift the men into different categories—for example, whether a man whose PSA level might merit a prostate biopsy would escape biopsy once his genes were taken into account. The numbers here were modest: About 6% had their PSA measurement reclassified as either safe or unsafe based on their genetics. “Basically what we're talking about is changing the parameters of detection of disease,” says Richard Hayes, an epidemiologist at New York University who has studied some of these gene variants. In theory, the approach could be applied to biomarkers for other cancers, too, making them more precise and more clinically useful.
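
    The study's actual adjustment procedure is not spelled out here, but the idea of a genetically individualized cutoff can be sketched as follows: estimate a man's expected baseline PSA from his variants, then scale the standard biopsy threshold accordingly. The variant effects and numbers below are invented for illustration, not taken from the deCODE study.

```python
# Hypothetical multiplicative effects on baseline PSA of carrying one copy of
# each PSA-raising (or PSA-lowering) allele. Invented numbers, not deCODE's.
VARIANT_EFFECTS = {"rsA": 1.10, "rsB": 0.93, "rsC": 1.05, "rsD": 0.97}

STANDARD_CUTOFF = 4.0  # ng/mL, the conventional "worrisome" level mentioned above

def baseline_factor(genotype):
    """genotype: dict mapping variant -> number of effect alleles carried (0, 1, 2)."""
    factor = 1.0
    for variant, copies in genotype.items():
        factor *= VARIANT_EFFECTS.get(variant, 1.0) ** copies
    return factor

def personalized_cutoff(genotype):
    """Scale the one-size-fits-all cutoff by the genetically expected baseline."""
    return STANDARD_CUTOFF * baseline_factor(genotype)

def reclassified(measured_psa, genotype):
    """True if the genetic adjustment moves a man across the biopsy threshold."""
    return (measured_psa >= STANDARD_CUTOFF) != (measured_psa >= personalized_cutoff(genotype))

# A measured PSA of 4.3 in a man predicted to run ~16% high falls below his
# personal cutoff of ~4.6, so a result that looked "worrisome" no longer does.
print(reclassified(4.3, {"rsA": 1, "rsC": 1}))  # True
```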

    The study looked back in time, so the genes didn't influence health care decisions. And Hayes and others agree that although the concept of using genetics to make PSA tests more meaningful holds promise, these specific variants aren't ready for prime time. The finding “is a good start, but it's not sufficient to introduce this extensive genetic screening to everybody,” says Fritz Schröder, a urologist at Erasmus Medical Center in Rotterdam, the Netherlands.

    A valid gene-based PSA analysis could help, Stefansson claims, by, for example, enabling some men with high PSA results to avoid a biopsy. But genetic analysis cannot yet be used to tackle a different puzzle: Many men with high PSA results do have cancer but an “indolent” form that grows so slowly that it's unlikely to kill them if left untreated.

    There is currently no way to distinguish men at high risk of dying from their prostate cancer from those whose risk is minimal. The deCODE study didn't examine whether adding genetics to the mix could help reduce mortality. The deCODE group and others now need to examine “how many people they can help,” says Stacy Loeb, chief resident in urology at Johns Hopkins University in Baltimore, Maryland, who has studied PSA genetics.

    Two large U.S. and European clinical trials underscore this need. Both trials randomly assigned tens of thousands of men to get regular PSA testing, or not, and then followed them to see who developed prostate cancer and who died of it. Last year in The New England Journal of Medicine, the U.S. study reported no difference in mortality between the group that had regular PSA screening and the one that did not. The European team, led by Schröder, found that those who got PSA screening were 20% more likely to survive prostate cancer. Schröder attributes the contrasting results to different study designs; both groups are continuing to follow the men.

    Meanwhile, PSA testing is in flux. Earlier this month, a cancer screening committee in the United Kingdom advised against routine PSA testing, citing concerns about overdiagnosis of slow-growing cancers. In other European countries, the popularity of PSA testing varies, says Schröder. It remains routine in the United States.

    “I'm reluctant to make any predictions” about how PSA genetics might help guide early detection of prostate cancer, says Stephen Chanock, chief of the laboratory of translational genomics at the National Cancer Institute in Bethesda, Maryland, who works in this area. While he commends the deCODE group for pushing the ball forward, he says, “one really would want to see further confirmation and precision” in the number of men affected by these gene variants and the effect each variant has on PSA.

  9. The Top 10 ScienceNOWs of 2010

    Here's a look back at some of our favorite and most popular stories of the year. As in previous years, the 2010 batch is an eclectic mix. And it contains something special: our most popular story of all time.

    The Secret of Turtle Island

    CREDIT: F. MAFFUCCI

    In the Mediterranean Sea off the coast of Libya, there's an area local fishermen call “Turtle Island.” It's real enough, but you'd be foolish to try to set foot on it. That's because it's composed of groups of loggerhead turtles. Biologists think they now know why the reptiles cluster like this.

    Superaccurate Clocks Confirm Your Hair is Aging Faster Than Your Toenails

    According to Einstein's theory of relativity, a clock on the floor ought to run very slightly slower than an identical one on top of a step stool because the lower clock nestles deeper into Earth's gravitational field. Now, physicists have demonstrated this effect using two superaccurate clocks and hoisting one several centimeters above the other. It's the first time scientists have used clocks to show that time flies faster for your nose than for your navel.
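
    As a rough order-of-magnitude check (our arithmetic, not the experimenters', and taking the step-stool height to be about 0.3 meters), the weak-field time-dilation formula gives a fractional rate shift of only a few parts in 10^17:

    \[ \frac{\Delta\nu}{\nu} \approx \frac{g\,\Delta h}{c^{2}} = \frac{(9.8\ \mathrm{m\,s^{-2}})(0.3\ \mathrm{m})}{(3.0\times10^{8}\ \mathrm{m\,s^{-1}})^{2}} \approx 3\times10^{-17}, \]

    a difference so small that only today's best optical clocks can resolve it.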

    The Shocking Truth About Running Shoes

    CREDIT: D. LIEBERMAN/HARVARD UNIVERSITY

    Haile Gebrselassie, the world's fastest marathoner, once said of his early career, “When I wore shoes, it was difficult.” A new study reveals why: Humans run differently in bare feet. Researchers have discovered that sneakers and other sports shoes alter our natural gait, which normally protects us from the impact of running. The finding offers new insight on how early humans ran and raises concerns that sports shoes may promote more injuries than they prevent.

    How To Train Your Robot (To Lie)


    Just what the world needs: lying robots. Well, at least these guys don't look dangerous. Researchers have programmed small machines to trick each other—the first time scientists have deliberately implanted deceit.

    The Spiky Penis Gets The Girl

    CREDIT: M. POLAK AND N. KAVAL

    When it came to insect penises, Charles Darwin had it right. The famed naturalist suspected that insect genitalia, which are frequently festooned with bizarre combinations of hooks, spines, and knobs, essentially functioned like peacock tails. That is, they helped males beat out their rivals for females. Now, researchers have confirmed this hypothesis by zapping fly penises with a laser.

    Tiny 'Flying Saucers' Could Save Earth From Global Warming

    CREDIT: WIKIMEDIA COMMONS

    Geoengineering—the concept of tinkering with the atmosphere to curb climate change—has seen some wacky ideas. But this may be the most bizarre of them all. Using a trick of sunlight itself, a researcher says tiny metallic disks could be levitated to the stratosphere, where they would shade Earth's surface and counteract the effects of global warming.

    Is Your Dog Pessimistic?

    CREDIT: PHOTOS.COM

    Why do dogs yowl when their owners leave home? According to this study, it's because they're down on life. The finding could help owners find ways to improve their dogs' mood.

    Oil Drop Navigates Complex Maze

    Here's an ego crusher for mice. Scientists have found a way to make simple droplets of oil navigate complex labyrinths with the same skill as laboratory rodents. The advance could help researchers devise better ways to solve other mazelike problems, from rooting out cancer in the body to mapping paths through traffic jams.

    These Dance Moves Are Irresistible

    Hey, guys, want to impress ladies on the dance floor? Keep your head and torso moving and don't flail your arms and legs. This useful advice comes courtesy of a new study, which finds that women are more attracted to computer avatars that rock these moves.

    … And The Top ScienceNOW of 2010—and The Most Popular of All Time—Is …

    CREDIT: WIKIPEDIA

    Here's a hint (see last image, to the right):

    Go to http://scim.ag/top10-2010 to find out the answer.

    Daily postings, comments, and more are available at http://news.sciencemag.org/sciencenow.

  10. Breakthrough of the Year

    The First Quantum Machine

    1. Adrian Cho

    A humanmade object that moves in ways that can be described only by quantum mechanics might lead to tests of our notion of reality.

    It may not prove as handy as the Model T, but, conceptually, a tiny machine unveiled this year blows the doors off Henry Ford's famous car or any other previous machine. Until now, all machines have moved according to the not-surprising laws of classical mechanics, which govern the motion of everyday objects. In contrast, the new gizmo jiggles in ways explicable only by the weird rules of quantum mechanics, which ordinarily govern molecules, atoms, and subatomic particles. The proto-quantum machine opens the way to myriad experimental devices and perhaps tests of our sense of reality. That potential and the ingenuity of the experiment make it the Breakthrough of the Year.

    Springboard.

    Scientists achieved the simplest quantum states of motion with this vibrating device, which is as long as a hair is wide.

    CREDIT: AARON D. O'CONNELL AND ANDREW N. CLELAND/UNIVERSITY OF CALIFORNIA, SANTA BARBARA

    Thanks to quantum mechanics, the realm of the extremely small looks nothing like our everyday world. Quantum theory dictates that a very tiny thing can absorb energy only in discrete amounts, can never sit perfectly still, and can literally be in two places at once. Scientists have observed such quantum effects and weirder ones in countless experiments with atoms, molecules, subatomic particles, light, electric currents, and even liquid helium. But nobody had seen such effects in the motion of a humanmade object.

    Not that physicists weren't trying. Researchers have fashioned tiny beams of semiconductor nanometers wide and micrometers long. Such a beam, or “oscillator,” will vibrate at a set frequency like a guitar string, and according to quantum theory, it can absorb or emit energy only in dollops or quanta whose size is proportional to the beam's frequency. To see such effects, physicists first have to suck out every possible quantum and leave a beam in its least-energetic “ground state.” Even then, a beam can't stand perfectly still, as quantum uncertainty requires it to hold an irretrievable half-quantum of energy and dance with unquenchable “zero-point motion.”

    To reach the ground state, physicists had to cool their beams to nearly absolute zero. They also had to make the quanta as large as possible by making a beam stiffer to increase its frequency. But that also reduces the amplitude of the motion, making it harder to detect. So several teams are using laser light or microwaves to cool a beam and detect its motion, getting them within a few quanta of the promised land.

    Breakthrough of the Year 2010

    A video introduction to the top scientific achievement of 2010: putting a humanmade object into its quantum ground state.


    A team of American physicists found a quicker route, as they reported in March. Instead of a beam, they fashioned a tiny diving board of aluminum nitride plated with aluminum that vibrated by getting thinner and thicker. As the doohickey hummed away at a very high frequency—a whopping 6 billion cycles per second—the “piezoelectric” material in it produced a warbling electric field that was easy to detect. Most important, through that field, the physicists managed to “couple” the mechanical device to an electronic one called a “phase qubit,” a ring of superconductor that itself has one low-energy and one high-energy quantum state.

    Manipulating the qubit with microwaves, the researchers could use it to feed energy quanta into the oscillator or pull them out of it, as one might use an ATM to deposit a $20 bill into a bank account or withdraw one. First they showed that when they cooled the oscillator to a few hundredths of a degree above absolute zero they could get no quanta out of it. That meant it had to be in the cashed-out ground state, jiggling with only zero-point motion. The researchers then put the oscillator in a state with exactly one more quantum of energy. They even coaxed it into both states at once, so that it was literally moving two different amounts simultaneously.
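
    A back-of-the-envelope estimate (our numbers, not the team's) shows why a 6-gigahertz oscillator cooled to a few hundredths of a kelvin ends up in its ground state: the energy quantum corresponds to a temperature of roughly 0.3 kelvin, so the average number of thermal quanta left in the device is essentially zero:

    \[ \frac{hf}{k_{B}} = \frac{(6.63\times10^{-34}\ \mathrm{J\,s})(6\times10^{9}\ \mathrm{Hz})}{1.38\times10^{-23}\ \mathrm{J\,K^{-1}}} \approx 0.29\ \mathrm{K}, \qquad \bar{n} = \frac{1}{e^{hf/k_{B}T} - 1} \approx 10^{-5}\ \text{at}\ T \approx 0.025\ \mathrm{K}. \]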

    The ingenuity in this scheme lay in the design of the oscillator and the use of a qubit to control it. In fact, in 2009, the team used a phase qubit to feed quanta into a long strip of superconducting metal that would ring with microwaves much as an organ pipe rings with sounds. Once they worked the kinks out, they replaced the microwave cavity with their clever mechanical oscillator, a move that had other physicists slapping their foreheads for not having seen it coming.

    What use is it? In basic research, simple quantum machines might make ultrasensitive force detectors or serve to generate quantum states of light. Most grandly, they might help test the bounds of quantum theory and our sense of reality. Why can't a car or a person be in two slightly different places at once? Does some principle forbid it? One way to find out would be to try to put ever larger things in such states.

    Such a test of quantum mechanics is a long way off. Still, other groups are already working toward quantum control of the motion of human-scale objects. In fact, after it is upgraded over the next 5 years, the Laser Interferometer Gravitational-Wave Observatory, which has sites in Livingston, Louisiana, and Hanford, Washington, will feature pairs of 40-kilogram mirrors laser-cooled to their ground states of motion, which may give physicists an opportunity to attempt an experiment on a very large scale.

    But first, physicists had to achieve a quantum state of motion for a mechanical object. And in 2010 they did just that.

  11. Breakthrough of the Year

    The Runners-Up

    This year's runners-up for Breakthrough of the Year include a synthetic genome, the Neandertal genome, next-generation genomics, souped-up cellular reprogramming, exome sequencing, quantum simulators, molecular dynamics simulations, knockout rats, and HIV prophylaxis.

    Build Your Own Genome

    A technical tour de force grabbed headlines around the world for synthetic biology this year. In what was hailed as a defining moment for biology and for biotechnology, researchers at the J. Craig Venter Institute (JCVI) in Rockville, Maryland, and San Diego, California, built a synthetic genome and inserted it into a bacterium in place of the organism's original DNA. The new genome caused the bacterium to produce a new set of proteins.

    The synthetic genome was an almost identical copy of a natural genome, but ultimately, researchers envision synthetic genomes custom-designed to produce biofuels, pharmaceuticals, or other useful chemicals. Also this year, researchers at Harvard University improved their high-throughput method of modifying existing genomes for such purposes, and other synthetic biologists showed that RNA-based “switches” can get cells to behave differently in response to certain signals.

    Life recreated.

    Scanning electron microscope image of bacteria with synthetic genomes.

    CREDIT: TOM DEERINCK AND MARK ELLISMAN/THE NATIONAL CENTER FOR MICROSCOPY AND IMAGING RESEARCH, THE UNIVERSITY OF CALIFORNIA, SAN DIEGO

    J. Craig Venter and his team built its $40 million genome from smaller pieces of store-bought DNA. First they stitched the synthetic DNA together in stages in yeast; then they transplanted it into a bacterium, where it replaced the native genome.

    Although not truly “artificial life,” as some media declared, this success prompted a congressional hearing and a review by a presidential commission on the ethics of synthetic biology.

    It's far from the only synthetic biology game in town, however. In 2009, Harvard's George Church introduced a technique called multiplex genome engineering, which adds multiple strands of DNA to bacteria every couple of hours, rapidly generating genetically engineered organisms with extensively revamped genomes. This year, his team came up with a cheaper way to produce the DNA strands used to modify the genome, in hopes of making this approach cost-effective for industrial use.

    Teams led by Caltech's Niles Pierce, Stanford University's Christina Smolke, and Boston University's James Collins have come up with ways to change a cell's behavior by modifying its regulatory pathways. In some cases, they add specially designed RNA molecules that can sense molecules in the cell associated with, say, cancer or inflammation. Once that happens, they cause the cell to produce a protein that may sensitize the cell to drugs or cause it to undergo programmed cell death. Another team made a riboswitch that caused bacteria to seek out and destroy the herbicide atrazine. Such devices are much closer than synthetic and modified genomes to having practical applications.

    Reading the Neandertal Genome

    In the DNA.

    Some living humans may have Neandertal ancestors.

    CREDIT: JOE MCNALLY

    Thirteen years ago, when researchers sequenced just a few snippets of mitochondrial DNA from a Neandertal, the breakthrough made headlines worldwide. This year, researchers published a draft of the Neandertal nuclear genome—and their first analysis of what these 3 billion bases of DNA reveal about the evolution of these extinct humans and us.

    Using new methods to sequence degraded fragments of ancient DNA (see “Insights of the Decade,” p. 1616), researchers spliced together a composite sequence from three female Neandertals who lived in Croatia 38,000 to 44,000 years ago, to reconstruct about two-thirds of the entire Neandertal genome. For the first time, scientists could compare in detail the genomes of Neandertals and of modern humans.

    Reading this sequence, the researchers concluded that modern Europeans and Asians—but not Africans—have inherited between 1% and 4% of their genes from Neandertals. Apparently, Neandertals interbred with modern humans after they left Africa at least 80,000 years ago but before they spread into Europe and Asia. If correct, this stunning discovery challenges a model that says that as modern humans swept out of Africa, they completely replaced archaic humans such as Neandertals without interbreeding.

    The Neandertal genome also gives researchers a powerful new tool to fish for genes that have evolved recently in humans, since they split from Neandertals. The catalog includes 78 differences in genes that encode proteins that are important for wound healing, the beating of sperm flagella, and gene transcription. Several encode proteins expressed in the skin, sweat glands, and inner sheaths of hair roots, as well as skin pigmentation—all differences that reflect adaptations to new climates and environments as modern humans spread around the globe.

    The researchers have also identified 15 regions of interest that differ between humans and Neandertals, including genes that are important in cognitive and skeletal development. When mutated in humans, some of these genes contribute to diseases such as schizophrenia, Down syndrome, and autism, or to skeletal abnormalities such as misshapen clavicles and a bell-shaped rib cage.

    As researchers close in on the few genes that separate us from Neandertals, they are also trying to decipher how differences in the genetic code alter proteins produced in the lab. This year, scientists inserted 11 pairs of single peptides into eukaryotic cells to test for differences in gene expression. With luck, they may pinpoint some of the genes that equipped us to survive while Neandertals went extinct.

    Next-Generation Genomics

    CREDIT: STEVEN EVANS/WIKIPEDIA

    Genomics researchers savored the fruits of massively parallel sequencing in 2010. Cheaper, faster “next generation” machines have taken hold over the past 5 years; this year they yielded important results from several large projects.

    One ambitious effort, the 1000 Genomes Project, seeks to find all single-base differences—or single-nucleotide polymorphisms (SNPs)—present in at least 1% of humans. It completed three pilot studies this year, which together identified 15 million SNPs—including 8.5 million novel ones. The information will help scientists track down mutations that cause diseases.
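
    The "1%" in that definition refers to a variant's frequency among the chromosomes sampled, its minor-allele frequency. As a toy illustration only (the genotype encoding, numbers, and cutoff below are invented for this sketch and are not project data), here is the bookkeeping behind calling a site a common SNP:

```python
# Toy calculation of minor-allele frequency (MAF) at one genomic position.
# Genotypes are counts of the alternate allele per person (0, 1, or 2);
# in this sketch a site counts as a common SNP if its MAF is at least 1%.

genotypes = [0, 0, 1, 0, 2, 0, 0, 1, 0, 0]    # 10 people, 20 chromosomes

alt_count = sum(genotypes)                     # alternate alleles observed
total_chromosomes = 2 * len(genotypes)
alt_freq = alt_count / total_chromosomes
maf = min(alt_freq, 1 - alt_freq)              # frequency of the rarer allele

print(f"MAF = {maf:.2f}, common SNP: {maf >= 0.01}")   # MAF = 0.20, common SNP: True
```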

    Researchers also finished cataloging all the functional elements in the genomes of the fruit fly Drosophila melanogaster and the nematode Caenorhabditis elegans; the results are expected to be published by year's end. In human DNA, the complete genome sequences of two Africans from hunter-gatherer tribes, the oldest known lineages of modern humans, confirmed the extensive genetic diversity within those groups. Researchers also produced a draft of the Neandertal genome (see p. 1605) and deciphered the genome from 4000-year-old hair preserved in Greenland's permafrost.

    The cornucopia of results also included surveys of all the transcribed DNA—the so-called transcriptome—and of protein-DNA interactions, as well as assessments of gene expression and the identification of rare disease genes.

    Souped-Up Cellular Reprogramming

    Changing a cell's fate by adding extra copies of a few genes has become routine in labs around the world. The technique, known as cellular reprogramming, allows scientists to turn back a cell's developmental clock, making adult cells behave like embryonic stem cells (see “Insights of the Decade,” p. 1612). The resulting induced pluripotent stem cells (iPSCs) are helping scientists to study a variety of diseases and may someday help to treat patients by supplying them with genetically matched replacement cells.

    This year, scientists found a way to make reprogramming even easier using synthetic RNA molecules. The synthetic RNAs are designed to elude the cell's antiviral defenses, which usually attack foreign RNA. The technique is twice as fast and 100 times as efficient as standard techniques. And because the RNA quickly breaks down, the reprogrammed cells are genetically identical to the source cells, making them potentially safer for use in therapies.

    Early evidence suggests that the RNA approach reprograms the cell more thoroughly than other methods do, yielding a closer match to embryonic stem cells. The method can also prompt cells to become nonembryonic cell types. By inserting into a cell synthetic RNA that encodes a key muscle gene, for example, the researchers could turn both fibroblasts and iPSCs into muscle cells.

    Homing In on Errant Genes

    CREDIT: MURAT GUNEL

    Scientists who study rare genetic disorders hit on a powerful strategy for finding the culprit DNA this year. Using cheap sequencing techniques and a shortcut—sequencing just the 1% of the genome that tells cells how to build proteins—they cracked several diseases that had eluded researchers until now.

    The old way to track down the cause of Mendelian disorders, or diseases caused by a mutation in a single gene, was to study DNA inheritance patterns in families. That approach doesn't work when few relatives with the disease can be found or when a mutation isn't inherited but instead crops up spontaneously.

    In late 2009, geneticists began sequencing just the exons, or protein-coding DNA, of patients with Mendelian disorders. (A few teams sequenced the patients' entire genome.) This "exome" sequencing yielded a long list of mutations that the scientists then winnowed, for example, by ignoring those that don't change protein structure or that many people carry. The end result: the faulty DNA underlying at least a dozen mystery diseases—including genes that lead to severe brain malformations, very low cholesterol levels, and facial abnormalities reminiscent of the stage makeup worn by Japanese Kabuki performers.
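
    The winnowing step is, at heart, simple filtering. The sketch below is a hypothetical illustration of that logic rather than any team's actual pipeline; the variant records, field names, and the frequency cutoff are invented for the example:

```python
# Hypothetical sketch of the exome "winnowing" idea described above:
# keep only rare, protein-altering variants. The records and the 1%
# frequency cutoff are illustrative, not data from a real study.

variants = [
    # (gene, consequence, population_frequency)
    ("GENE_A", "nonsense",   0.0001),
    ("GENE_B", "synonymous", 0.0300),   # doesn't change the protein -> drop
    ("GENE_C", "missense",   0.1200),   # carried by too many people -> drop
    ("GENE_D", "missense",   0.0002),
]

PROTEIN_ALTERING = {"missense", "nonsense", "frameshift", "splice_site"}
MAX_FREQUENCY = 0.01    # ignore variants carried by more than 1% of people

candidates = [
    (gene, csq, freq)
    for gene, csq, freq in variants
    if csq in PROTEIN_ALTERING and freq <= MAX_FREQUENCY
]

print(candidates)   # [('GENE_A', 'nonsense', 0.0001), ('GENE_D', 'missense', 0.0002)]
```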

    Finding the gene behind a rare disease can lead to better diagnosis and treatments and to new insights into human biology. Scientists hope to use exome sequencing to tick off the causes of more than half of some 7000 known or suspected Mendelian diseases that still don't have a genetic explanation.

    Quantum Simulators Pass First Key Test

    Like a student who sneaks a calculator into a test, physicists have found a quick way to solve tough mathematical problems. This year, they showed that quantum simulators—typically, simulated crystals in which spots of laser light play the role of the crystal's ions, and atoms trapped in the spots of light play the role of electrons—can quickly solve problems in condensed-matter physics.

    Physicists usually invent theoretical models to explain experiments. They might approximate a magnetic crystal as a three-dimensional array of points with electrons on the points interacting through their magnetic fields. Theorists can jot down a mathematical function called a Hamiltonian encoding such an idealization. But “solving” a Hamiltonian to reveal how a system behaves—for example, under what conditions the electrons align to magnetize the crystal—can be daunting.
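
    As a concrete, textbook-style example of such an idealization (not necessarily the model any of the groups below simulated), a magnetic crystal can be encoded in a transverse-field Ising Hamiltonian:

    $$ H \;=\; -J \sum_{\langle i,j \rangle} \sigma^{z}_{i}\,\sigma^{z}_{j} \;-\; h \sum_{i} \sigma^{x}_{i} $$

    Here the first sum runs over neighboring lattice sites, J sets the strength of the spin-spin coupling, and h is an applied transverse field; "solving" the model means working out, for given J and h, whether the spins line up and magnetize the crystal. Tailoring a simulator to such a Hamiltonian means arranging the laser spots and the trapped atoms' interactions so that they obey the same equation.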

    CREDIT: W. S. BAKR ET AL., SCIENCE 329, 547 (30 JULY 2010)

    However, physicists can tailor a quantum simulator to a particular Hamiltonian and let the experiment solve the theoretical problem. Five groups reproduced the results for four previously solved Hamiltonians. Three even mapped “phase diagrams” akin to the one that shows the temperatures and pressures at which water becomes a gas, liquid, or solid.

    Physicists hope quantum simulators will crack Hamiltonians that have not been solved—such as one for high-temperature superconductors. But first they had to show that the things could reproduce known results. Check.

    Molecular Dynamics Simulations

    CREDIT: © 2009 MATTHEW MONTEITH

    Sometimes brute force is the way to go, particularly when using computers to simulate the gyrations proteins make as they fold. Such simulations are a combinatorial nightmare. Each pair of neighboring amino acids in a protein chain can bind to one another at two different angles, each of which can have three conformations. So a simple protein with 100 amino acids can fold in 3^198 different ways. Getting at the atomic detail is even scarier. Proteins sort through all these possibilities in milliseconds or less. Computers take far longer.
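
    Spelled out, the arithmetic behind that count (a back-of-the-envelope estimate in the spirit of Levinthal's classic argument, using the numbers quoted above) runs:

    $$ (100 - 1)\ \text{bonds} \times 2\ \text{angles per bond} = 198\ \text{angles}, \qquad 3^{198} \approx 3 \times 10^{94}\ \text{possible folds}. $$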

    Protein-folding experts have long turned to supercomputers for help. But even these behemoths struggle to track the motions long enough to simulate the complete folding process. Two years ago, researchers in the United States unveiled a new supercomputer hardwired with 512 computer chips tailor-made to speed calculations of how neighboring atoms in a protein and the surrounding water interact, giving simulations another burst of speed. As a result, the group reported this year that it has been able to track the motion of atoms in a small protein 100 times longer than previous efforts could—long enough to see the protein wind its way through 15 cycles of folding and unfolding. The group is now turning to machines with 1024 and 2048 chips to improve simulations of larger proteins.

    Rats Redux

    CREDIT: ISTOCKPHOTO

    Today, most lab cages house mice, but the tenant of choice used to be rats. The reason: Rats are more like us. The human heart, for example, beats about 70 times a minute; a rat's heart, 300 times; a mouse's, 700. Electrical signal patterns in rat and human hearts are also similar. Rats, being more intelligent than mice, might also be better models of human neural diseases such as Alzheimer's and Parkinson's. And rats are bigger and easier to handle for lab work.

    Then, in 1989, researchers learned to delete specific genes to make “knockout mice.” The technique they used, called homologous recombination of embryonic stem cells, didn't work in rats. So mice became the preferred experimental animal in various studies, from developmental biology to drug development.

    That too may pass. In 2009, researchers adapted to rats a method, previously used in fruit flies and zebrafish, that uses enzymes called zinc finger nucleases to knock out genes. In August, another group announced a tweak that produced “knockout rats” by the same genetic trick used for knockout mice. Also this year, several groups reported advances in using transposons, DNA sequences that jump from one location to another within a genome, to generate rats with genetic mutations—animals useful for developmental biology and disease research. As a result of such techniques, knockout and genetically modified rats may soon displace their smaller cousins in lab cages around the world.

    HIV Prophylaxis

    CREDIT: JON COHEN

    From the start of the AIDS epidemic through 2009, only five of 37 large-scale studies that attempted to prevent HIV yielded convincing, positive results. Then, this past July and November, two trials of different, novel HIV-prevention strategies unequivocally reported success. AIDS researchers all but danced with joy.

    The first result stole the show at the jam-packed XVIII International AIDS Conference held in Vienna, Austria. A vaginal gel that contains the anti-HIV drug tenofovir reduced HIV infections in high-risk women by 39% over a 30-month period. Nearly 900 South African women participated in the study, half receiving the microbicide and the others an inert gel. Among "high adherers," women who used the microbicide exactly as instructed, its efficacy reached 54%.

    Last month, the first-ever study of oral pre-exposure prophylaxis made headlines with results even more encouraging. The subjects, 2499 men and transgender women who have sex with men, were recruited from six countries. Half were asked to take Truvada, a combination of tenofovir and emtricitabine, each day. After an average of 1.2 years, the treated group had 43.8% fewer infections than the group that took a placebo. Again, better adherence equaled better efficacy: In a small substudy, efficacy increased to 92% in participants who had measurable levels of Truvada in their blood.

    Neither approach is a magic bullet, AIDS researchers say. But in combination with other measures, they could usher in a new era of HIV prevention.

  12. Breakthrough of the Year

    Diving Into the Oil Spill

    1. Erik Stokstad

    Through improved communication, researchers ultimately helped devise ways to deal with the leaking oil well in the Gulf of Mexico this year and assessed the scale of the spill.

    CREDIT: AP

    During the massive, 3-month oil spill in the Gulf of Mexico this year, scientists were quick to offer help. It wasn't always easy. Many researchers in the region at first felt unwanted; there were rough patches in coordinating academic and government scientists; and clashes over when to release data sometimes sent the public mixed messages. But through improved communication, researchers ultimately helped devise ways to deal with the leaking well and assessed the scale of the spill.

    One early source of confusion was the conflicting estimates of the amount of oil flowing from the well. At first, BP estimated 1000 barrels per day. Academic scientists accused the company of low-balling, and a National Oceanic and Atmospheric Administration (NOAA) scientist studying a video figured the rate was roughly five times higher. The resulting scientific debate and public confusion over the flow rate—which will be crucial for determining BP's fine—might have been lessened if the various groups had shared their methods and data. Eventually, an ad hoc task force of government scientists estimated the flow at 35,000 to 60,000 barrels per day, a figure made more rigorous by sensors added to the wellhead.

    As the oil continued to gush, academic scientists hopped on boats and rearranged their research plans to investigate the extent of the spill and the damage to wetlands and wildlife. But lack of coordination bred tensions, and researchers clashed over when to release preliminary data and how much to speculate on their meaning. The highest-profile example was the surprise discovery on 15 May of what appeared to be oil plumes. When NOAA Administrator Jane Lubchenco tried to sound a note of caution about the preliminary results, researchers and media blasted her for squelching the discovery (Science, 2 July, p. 22). In June, NOAA confirmed the plumes.

    One of the most controversial decisions was to use dispersants to keep oil from reaching wetlands and islands. A group of experts convened to advise the government approved the unprecedented use of dispersants at depth, which appears to have worked.

    Overall, the spill appears to have been less of an environmental disaster than first feared, at least for the wetlands and barrier islands. But it will take years before the toll is fully known, especially for deep-sea organisms. A $500 million research fund, set up by BP, will help scientists gauge the long-term impact.

  13. Breakthrough of the Year

    Scorecard

    Science's editors foresaw this year's advances in research with induced pluripotent stem cells and exome sequencing, as well as Obama's new direction for the U.S. space program, but other predictions were a mixed bag.

    CREDIT: © 2009, JAY MATTERNES

    Rating Last Year's Areas to Watch

    iPS Cells

    Last year, we predicted that the ability to reprogram adult skin cells into induced pluripotent stem cells (iPSCs) would usher in a new wave of research based on patient-specific cell lines. In 2010, researchers made new cell lines from hundreds of patients with dozens of disorders. Patient-derived iPSCs, which can be coaxed to become various mature cell types, have helped scientists better understand Rett syndrome and genetic heart defects, among other conditions. But finding new treatments for more common conditions, say, Parkinson's disease or diabetes, is still a work in progress.

    Cosmic Eye

    The Alpha Magnetic Spectrometer (AMS) particle detector didn't make it into space aboard NASA's space shuttle as planned, but it had an interesting year nevertheless. Destined for the international space station, AMS received a last-minute design change in April that delayed its launch by 4 months until November. Then, in July, NASA delayed its shuttle schedule for other reasons, pushing the launch into 2011. AMS is now scheduled to go up on 1 April. No fooling.

    Exome Studies

    This new way of gene hunting—sequencing just protein-coding DNA—revealed the errant DNA underlying a dozen or so rare diseases (see Runner-Up, p. 1606). But exome sequencing hasn't yet found the elusive mutations that geneticists hope will explain much of the missing heritability for common disorders, such as Alzheimer's disease, diabetes, and heart disease. These studies require thousands of patients' exomes, and at year's end, researchers were still collecting data and analyzing their results.

    Biochemistry Beats Cancer?

    Safety has been the stumbling block for treatments that tamper with cancer metabolism. But this year, a preliminary trial gave the compound dichloroacetate a passing grade. Researchers also exposed new metabolic eccentricities of cancer cells that might inspire fresh avenues of attack.

    Human Space Flight

    As forecast, President Obama set a new direction for the U.S. space program. But the plan drew fire from Congress, which eventually accepted it only in part. Obama wanted to scrap the Bush Administration's $3.5-billion-a-year Constellation program and invest $3.3 billion over the next 3 years in the development of commercial spacecraft. In an authorization measure, Congress allowed NASA to spend $1.3 billion on commercial spacecraft development and required that the agency start building a heavy-lift rocket next year, instead of after 2015, as Obama had proposed. Congress also gave the space shuttle one more flight next year, instead of retiring it in 2010.

  14. Breakthrough of the Year

    Areas to Watch

    In 2011, Science's editors will be watching the Large Hadron Collider, adaptation genes, laser fusion, broadly neutralizing antibodies, electric cars, and malaria shots.

     

    The Large Hadron Collider

    The first really interesting results will come out of the LHC this year, and dollars to doughnuts they'll have nothing to do with the search for the Higgs boson or supersymmetric particles with the collider's two big detectors, ATLAS and CMS. Rather, look for results from a smaller detector called LHCb, which will study familiar particles called B mesons in great detail with an eye to probing a slight asymmetry between matter and antimatter called CP violation. CP violation in ordinary B mesons seems to conform to physicists' standard theory. This year, however, researchers at Fermi National Accelerator Laboratory in Batavia, Illinois, reported hints that there could be extra CP violation in a subspecies called Bs mesons—a possible sign of new particles on the horizon. Collecting data at a massive rate, LHCb should test those claims in short order.

    Adaptation Genes

    Ecologists and evolutionary biologists are busy harnessing the faster, cheaper sequencing technologies to learn which genes help organisms, from bacteria to butterflies, thrive in the natural world. Until now, such genome scans were very difficult to do in organisms that weren't also already well studied in the lab. But new techniques, such as one called RAD tag sequencing, should lead to the discovery of many more genes contributing to adaptation in the coming year.

    Laser Fusion

    CREDIT: LAWRENCE LIVERMORE NATIONAL LABORATORY

    The National Ignition Facility (NIF) at Lawrence Livermore National Laboratory in California will make a run in 2011 at a long-sought goal of energy research: an ignited fusion burn. Shots from NIF's 192 laser beams will pump energy into a peppercorn-sized target containing deuterium and tritium. The ensuing implosion will compress and heat the nuclei until they begin to fuse, releasing energy. If that self-heating causes a sustained fusion burn with energy gain, NIF will have finally shown that generating power from fusion is at least possible.
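
    In rough terms (this is the textbook definition of energy gain, not a figure quoted by NIF), the goal is for each shot to release more fusion energy than the lasers deliver to the target:

    $$ G \;=\; \frac{E_{\text{fusion}}}{E_{\text{laser}}} \;>\; 1 $$

    Ignition, loosely speaking, is the point at which heating by the fusion products themselves sustains the burn and pushes G past 1.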

    Hammering Viruses

    Most antibodies are quite specific, derailing only viruses that look alike. But broadly neutralizing antibodies (bNAbs) are immune system generalists, capable of disabling a wide range of viral variants. Recent progress in identifying bNAbs effective against HIV and in developing a flu vaccination strategy in animals that elicits these generalists has brought us closer to harnessing these molecules. In 2011, researchers hope to come up with the viral pieces—the main vaccine ingredients—that trigger the immune system to make bNAbs.

    Electric Cars

    Toyota's Prius and other gas-electric hybrids have been on the market for years. But in 2011, plug-in hybrid electric cars, which charge their batteries from a wall socket, are set to go mainstream. These cars mark a sharp shift in the demands placed on battery technology, which must now take consumers wherever they want to go. For now, battery-powered cars cost more than their gasoline-powered alternatives. Will consumers give up long-distance driving range and the convenience of a quick fill-up to cut carbon emissions? Doesn't seem like a sure bet.

    Malaria Shots

    CREDIT: JAMES D. GATHANY/WIKIPEDIA

    Scientists can't wait for the results of the first phase III trial of a malaria vaccine. Up to 16,000 children in seven African countries are taking part in the study; the results for almost 9000 of them—the cohort aged 5 to 17 months—will be announced late in 2011. The vaccine, called RTS,S, offered roughly 50% protection in phase II studies. That's not spectacular, but given the field's frustrating history, expect elation—and a speedy application for regulatory approval—if the phase III study does equally well.

  15. Breakthrough of the Year

    Whiplash for Stem Cell Researchers

    1. Jocelyn Kaiser

    Last summer, a federal court banned taxpayer-funded research on human embryonic stem cells. Although a higher court soon allowed the research to continue temporarily, the ongoing lawsuit has cast a shadow over the area that could linger well into 2011.

    The field of stem cell research in the United States was shaken to its core last summer when a federal court banned taxpayer-funded research on human embryonic stem cells (hESCs). Although a higher court soon allowed the research to continue temporarily, the ongoing lawsuit, Sherley v. Sebelius, has cast a shadow over the area that could linger well into 2011.

    Stem cell scientists began the year on a high note, after the Obama Administration lifted Bush-era limits on which cell lines could be used in studies funded by the National Institutes of Health (NIH). Government lawyers and NIH weren't terribly worried about a lawsuit, filed last year by two scientists, Christian groups, and an embryo adoption agency acting on behalf of embryos, that challenged the agency's July 2009 embryonic stem cell guidelines. And Chief Judge Royce Lamberth of the U.S. District Court in Washington, D.C., soon threw the suit out because the plaintiffs had not demonstrated they were harmed by the NIH policy.

    The suit got a second wind last summer when an appeals court reversed Lamberth's ruling about the legal standing of the two scientists, James Sherley and Theresa Deisher, who claimed that the NIH guidelines made it harder for them to win grants to study adult stem cells. Lamberth then considered Sherley and Deisher's argument that the guidelines violated the Dickey-Wicker Amendment, which bans the use of federal funds for research that destroys embryos (a step necessary to derive hESCs). On 23 August, Lamberth ruled that NIH-funded hESC research likely violated the amendment and issued a preliminary injunction blocking it.

    Brakeman.

    Judge Royce Lamberth's ruling threw hESC research into turmoil.

    CREDIT: AP

    NIH Director Francis Collins said he was “stunned”; NIH suspended hESC grant payments and peer review and stopped research on the NIH campus. Lamberth's decision, Collins said, threw “sand into that engine of discovery.” Researchers got a reprieve on 9 September, when the U.S. Court of Appeals for the District of Columbia Circuit lifted the ban while it considered the government's appeal of Lamberth's preliminary injunction. A three-judge panel that heard oral arguments on 6 December is expected to rule by mid-January—and no one is making bets on the outcome. Meanwhile, Lamberth could issue a permanent ban any day if he decides federally funded hESC research violates Dickey-Wicker. Any decision will likely be appealed, ultimately to the Supreme Court.

    Since the September stay, it's officially been business as usual at NIH. But researchers say damage has already been done. The uncertainty about whether hESC research will be halted again—and the fading hope that Congress will pass a law allowing hESC research—has many scientists rethinking their work. U.S. postdocs are thinking twice about venturing into the field, and foreign graduate students are hesitating to take postdoc positions in U.S. labs. The legal tangle is yet another low point in the roller-coaster history of hESC research in the United States.

  16. Breakthrough of the Year

    The Year in News

    1. Jeffrey Mervis

    2010 was a busy year for science and science policy. Science lists some developments that tested the limits of our knowledge and influenced thinking about the global research enterprise.

    2010 was a busy year for science and science policy. Here are some developments that tested the limits of our knowledge and influenced thinking about the global research enterprise.

  17. Introduction

    Stepping Away From the Trees For a Look at the Forest

    1. The News Staff

    Science's news staff takes a break from reporting to review some big ideas of the past 10 years and the technologies that made them possible.

    Ten years ago, Karl Deisseroth was stuck. A psychiatrist and neuroscientist, he wanted to learn how different brain circuits affect behavior—and what went awry in the brains of his patients with schizophrenia and depression. But the tools of his trade were too crude: Electrodes inserted into the brain would stimulate too many cells in their vicinity. So in 2004, Deisseroth and his students invented a new tool. They inserted a gene for a light-activated algal protein into mouse brains, where it entered nerve cells. By stimulating those cells with a laser, the researchers could control the activity of specific nerve circuits with millisecond precision and study the effects.

    The technique, called optogenetics, took the field by storm. Today, thousands of scientists in hundreds of labs are using optogenetics to probe how brains work. They have examined which cells in the brain's reward pathway get hijacked by cocaine, and how deep brain stimulation relieves the symptoms of Parkinson's disease. The list of questions that can now be addressed is endless.

    Deisseroth's story repeats itself everywhere in science: An ingenious new tool triggers a cascade of new insights. In this special section, Science's news reporters and editors mark the end of the current decade by stepping back from weekly reporting to take a broader look at 10 insights that have changed science since the dawn of the new millennium. This survey covers only a small fraction of the decade's scientific advances, of course; many others could have filled these pages.

    First, however, a shout-out to some of the tools that made those insights possible. In the past 10 years, new ways of gathering, analyzing, storing, and disseminating information have transformed science. Researchers generate more observations, more models, and more automated experimentation than ever before, creating a data-saturated world. The Internet has changed how science is communicated and given nonscientists new opportunities to take part in research. Whole new fields, such as network science, are arising, and science itself is becoming more of a network—more collaborative, more multidisciplinary—as researchers recognize that it takes many minds and varied expertise to tackle complex questions about life, land, and the universe.

    Seeing and sensing

    Many of the decade's most useful new tools, like optogenetics, were advances in sensing and imaging. Cryoelectron tomography brought into focus the cell's components, allowing scientists to get atomic-level detail of whole-cell organization. The technique builds up a three-dimensional image from a series of two-dimensional slices of a flash-frozen cell, a computer-intensive process.

    In 2002, intravital scanning two-photon confocal microscopy revolutionized immunology when it was applied to lymph nodes, showing immune cells in intact tissue in real time. The work opened the door to understanding the interactions among immune cells and the body and changed how researchers thought about immune responses.

    Developmental biology has also made huge leaps since 2000 thanks to new microscopy techniques. Now researchers can keep samples alive and have longer-lasting fluorescent tags to track specific cells. They can follow whole-organ and whole-animal development in movies that follow cells as they divide and move around. Other microscopy techniques have been able to sidestep a fundamental limit of optics to look at proteins and the fine structure of cells smaller than the diffraction limit—half the wavelength of the light being used.

    Not only can we see better, but we can also send our eyes to hard-to-reach places. On Mars, for example, the Spirit and Opportunity rovers marked a big step up from earlier spacecraft: able to roam for kilometers instead of meters and carrying more capable instruments to analyze the chemical and physical properties of rock. Their observations rewrote the history of water on Mars. Closer to home, torpedolike robots with no connection to the surface searched for oil in the Gulf of Mexico and explored the water under glacial ice shelves in Antarctica. Remotely operated planes, some the size of model airplanes and some the size of military Predators and Hawks, routinely monitor sunlight passing through clouds and fly over hurricanes. Thousands of oceangoing floats send back data on water properties and, therefore, currents. These mobile sensing devices, along with stationary counterparts on land and in the sea, will soon monitor the state of the planet around the clock, making ecology and environmental science almost as data-rich as astronomy and particle physics are.

    Emphasizing connections.

    Thanks to computers, the Internet, and new ways of looking at links among everything from genes to people, network science is flourishing.

    CREDIT: D. COX AND R. PATTERSON/NCSA, UIUC

    The indispensable machine

    Key to handling such unprecedented torrents of data, of course, have been ever more powerful and more affordable computers. No field has benefited more than genomics. A decade ago, sequencing a human genome took years, hundreds of people, hundreds of machines, and endless hours of sample preparation to generate the pieces of DNA to be deciphered, one at a time. Some researchers favored a shotgun approach—cutting up whole genomes, then sequencing and assembling them all at once—but available computers weren't up to the task. Proponents had to build a massive supercomputer and laboriously program it for the job. Now whole-genome shotgun is de rigueur, and efficient software abounds. In the past 5 years, new “next-generation” sequencing technologies have streamlined the work even more. Today, a single machine can decipher three human genomes in little more than a week.
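
    For readers curious what "assembling them all at once" involves, here is a deliberately tiny sketch of the underlying idea, written for this article; it is a naive greedy overlap merger, whereas real assemblers use far more sophisticated algorithms, data structures, and error handling:

```python
# Toy illustration of the whole-genome shotgun idea: many short "reads"
# are merged by finding overlaps between their ends until longer
# contiguous stretches emerge. Not a real assembler.

def overlap(a, b, min_len=3):
    """Length of the longest suffix of `a` that matches a prefix of `b`."""
    best = 0
    for k in range(min_len, min(len(a), len(b)) + 1):
        if a.endswith(b[:k]):
            best = k
    return best

def greedy_assemble(reads):
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = list(reads)
    while len(reads) > 1:
        best = (0, 0, 1)                      # (overlap length, index i, index j)
        for i in range(len(reads)):
            for j in range(len(reads)):
                if i != j:
                    olen = overlap(reads[i], reads[j])
                    if olen > best[0]:
                        best = (olen, i, j)
        olen, i, j = best
        if olen == 0:                         # nothing overlaps; just concatenate
            return "".join(reads)
        merged = reads[i] + reads[j][olen:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)] + [merged]
    return reads[0]

# Three overlapping reads reassemble into one sequence.
print(greedy_assemble(["ATGGCGT", "GCGTACC", "TACCTTA"]))   # ATGGCGTACCTTA
```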

    Sequencing is getting so cheap that researchers are using it to study gene expression and protein-DNA interactions on an unprecedented scale. Geneticists now depend on sequencing to track down the genetic causes of rare diseases. In 2008, 10 countries set up the International Cancer Genome Consortium, which aims to catalog mutations and other DNA and epigenetic changes for about 50 types of cancer by sequencing part or all of 25,000 genomes.

    Computers aren't limited to piecing genomes back together. With their help, genomics researchers predict gene locations and compare, say, chimp and human genomes to identify sequences of evolutionary importance—deriving new insights about how genomes work. The Web has served as a vital link between researchers and publicly accessible databases of genome information.

    In biochemistry, computer technology has led to huge strides in understanding proteins. To help with the calculations needed to model the jumps and squiggles proteins make as they fold into their “final” shape, scientists beefed up their hardware with graphics processing units used to render 3D images in video games.

    Computational chemistry also got a boost from field programmable gate arrays: chips that allow users to essentially design their own hardware to streamline their simulations. And one research team built a supercomputer with customized integrated circuits that dramatically speed protein-folding calculations, allowing simulations on the time scale of milliseconds.

    In some cases, computing power is transforming the basic way researchers work together. In astronomy, for example, the Sloan Digital Sky Survey is cataloging everything that can be seen in a fifth of the sky using a 2.5-meter telescope at the Apache Point Observatory in New Mexico. Researchers will tap into the masses of data over the Web. Similar “data utilities” are business as usual in particle physics and seismology, but they are a far cry from the way small teams of observers have divided up telescope time in the past. Europe's CERN laboratory has gone even further. To handle the petabytes of data its Large Hadron Collider (LHC) will generate every year, CERN has set up a computing “grid” system—a virtual organization that pools and shares the computer processing power of each member institution. The grid also makes it possible for thousands of scientists to access LHC data and work together in unprecedented ways.

    Other organizations are harnessing the power of networking through crowd-sourcing, in which large numbers of researchers (even nonscientists) can contribute to solving problems, setting policy, or forecasting the future. An Internet company called InnoCentive, for example, posts problems online and offers rewards of up to $1 million for their solution. It boasts that 200,000 people have weighed in on more than 1000 problems in fields as wide-ranging as drug synthesis and brick manufacture and have solved two-thirds of them.

    Networking programs also enable volunteers around the world to donate idle time on home computers for protein-folding calculations, to search for comets in images from the SOHO satellite, and to classify galaxy types in data from the Sloan Digital Sky Survey.

    To get a handle on all this interconnectedness and grasp the ways information travels through complex systems, theorists have spawned a new field called network science. The field took off about 10 years ago, after physicists developed mathematical models to explain some of the network phenomena sociologists had observed. Now, thanks to technologies that make possible measurements on thousands of genes or proteins at once and to computers that can track and analyze the movements, voting habits, or shopping preferences of millions of people, the network approach is bursting into full flower. A host of new insights are bound to lie just ahead.

    But that is a tale for another decade.

  18. Shining a Light on the Genome's 'Dark Matter'

    1. Elizabeth Pennisi

    Since the publication of the human genome sequence in 2001, scientists have found that the so-called junk DNA that lies between genes actually carries out many important functions.

    CREDIT: Y. HAMMOND/SCIENCE

    It used to seem so straightforward. DNA told the body how to build proteins. The instructions came in chapters called genes. Strands of DNA's chemical cousin RNA served as molecular messengers, carrying orders to the cells' protein factories and translating them into action. Between the genes lay long stretches of “junk DNA,” incoherent, useless, and inert.

    That was then. In fact, gene regulation has turned out to be a surprisingly complex process governed by various types of regulatory DNA, which may lie deep in the wilderness of supposed “junk.” Far from being humble messengers, RNAs of all shapes and sizes are actually powerful players in how genomes operate. Finally, there's been increasing recognition of the widespread role of chemical alterations called epigenetic factors that can influence the genome across generations without changing the DNA sequence itself.

    The scope of this “dark genome” became apparent in 2001, when the human genome was first published. Scientists expected to find as many as 100,000 genes packed into the 3 billion bases of human DNA; they were startled to learn that there were fewer than 35,000. (The current count is 21,000.) Protein-coding regions accounted for just 1.5% of the genome. Could the rest of our DNA really just be junk?

    The deciphering of the mouse genome in 2002 showed that there must be more to the story. Mice and people turned out to share not only many genes but also vast stretches of noncoding DNA. To have been “conserved” throughout the 75 million years since the mouse and human lineages diverged, those regions were likely to be crucial to the organisms' survival.

    Edward Rubin and Len Pennacchio of the Joint Genome Institute in Walnut Creek, California, and colleagues figured out that some of this conserved DNA helps regulate genes, sometimes from afar, by testing it for function in transgenic mouse embryos. Studies by the group and others suggested that noncoding regions were littered with much more regulatory DNA than expected.

    Further evidence that noncoding DNA is vital has come from studies of genetic risk factors for disease. In large-scale searches for single-base differences between diseased and healthy individuals, about 40% of the disease-related differences show up outside of genes.

    Genetic dark matter also loomed large when scientists surveyed exactly which DNA was being transcribed, or decoded, into RNA. Scientists thought that most RNA in a cell was messenger RNA generated by protein-coding genes, RNA in ribosomes, or a sprinkling of other RNA elsewhere. But surveys by Thomas Gingeras, now at Cold Spring Harbor Laboratory in New York, and Michael Snyder, now at Stanford University in Palo Alto, California, found a lot more RNA than expected, as did an analysis of mouse RNA by Yoshihide Hayashizaki of the RIKEN Omics Science Center in Japan and colleagues. Other researchers were skeptical, but confirmation soon came from Ewan Birney of the European Bioinformatics Institute and the Encyclopedia of DNA Elements project, which aims to determine the function of every base in the genome. The 2007 pilot results were eye-opening: Chromosomes harbored many previously unsuspected sites where various proteins bound—possible hotbeds of gene regulation or epigenetic effects. Strikingly, about 80% of the cell's DNA showed signs of being transcribed into RNA. What the RNA was doing was unclear.

    Other studies revealed that RNA plays a major role in gene regulation and other cellular functions. The story started to unfold in the late 1990s, when plant researchers and nematode biologists learned to use small RNA molecules to shut down genes. Called RNA interference (RNAi), the technique has become a standard way to control gene activity in a variety of species, earning a Nobel Prize in 2006.

    To understand RNAi and RNA in general, researchers began isolating and studying RNA molecules just 21 to 30 bases long. It turned out that such “small RNAs” can interfere with messenger RNA, destabilizing it. Four papers in 2002 showed that small RNAs also affect chromatin, the complex of proteins and DNA that makes up chromosomes, in ways that might further control gene activity. In one study, yeast missing certain small RNAs failed to divide properly. Other studies have linked these tiny pieces of RNA to cancer and to development.

    The surprises didn't stop at small RNAs. In 2007, a group led by Howard Chang of Stanford and John Rinn, now at Beth Israel Deaconess Medical Center in Boston, pinned down a gene-regulating function for so-called large intervening noncoding RNAs, or lincRNAs. Rinn and colleagues later determined that the genome contained about 1600 of these lincRNAs. They and other researchers think this type of RNA will prove as important as protein-coding genes in cell function.

    Many mysteries about the genome's dark matter are still under investigation. Even so, the overall picture is clear: 10 years ago, genes had the spotlight all to themselves. Now they have to share it with a large, and growing, ensemble.

  19. A Recipe for the Cosmos

    1. Adrian Cho

    In the past decade, cosmologists have deduced a very precise recipe for the content of the universe, as well as instructions for putting it together, transforming cosmology from a largely qualitative endeavor to a precision science with a standard theory.

    First light.

    The afterglow of the big bang as charted by the WMAP spacecraft (inset).

    CREDITS: NASA/WMAP SCIENCE TEAM

    For eons, humans have peered into the starry sky and wondered, "Where did it all come from?" Cosmologists still don't have a solid answer, but in the past decade they have deduced a very precise recipe for the content of the universe, as well as instructions for putting it together. The advance has transformed cosmology from a largely qualitative endeavor to a precision science with a standard theory that provides little wiggle room for other ideas, even as it leaves obvious questions unanswered. Like it or not, cosmologists are now more or less stuck with the theory.

    If you are cooking along at home, you'll need three ingredients: ordinary matter such as that in stars and planets, mysterious “dark matter” whose gravity binds the galaxies, and even weirder “dark energy” that's stretching space and accelerating the expansion of the universe. According to the latest numbers, the proportions are 4.56% ordinary matter, 22.7% dark matter, and 72.8% dark energy.
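
    A quick check on those numbers, written in the standard convention in which each ingredient is quoted as a fraction of the so-called critical density (the Ω notation below is that convention, not a label used in the article), shows the three components accounting for essentially everything:

    $$ \Omega_{\text{ordinary}} + \Omega_{\text{dark matter}} + \Omega_{\text{dark energy}} \approx 0.0456 + 0.227 + 0.728 \approx 1.00 $$

    A total of about 1 is another way of saying the universe is flat, a point the microwave background measurements described below make directly.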

    Here's how they went together: The universe emerged, superhot and ultradense, in the instant of the big bang. For a tiny fraction of a second it expanded at faster-than-light speed. Known as “inflation,” that growth spurt stretched tiny quantum fluctuations in the density of the newborn universe to colossal scales. The gravity of these “overdense” regions pulled in dark matter, which drew along with it ordinary matter, initially a soup of subatomic particles. Slowly, a vast “cosmic web” of filaments and clumps of dark matter formed, and within them, the galaxies.

    Before the turn of the millennium, cosmologists had narrowed the list of cosmic ingredients. Many had suspected the presence of dark matter for decades. Dark energy made a splashy entrance in 1998, when astronomers used stellar explosions called type Ia supernovae to trace the rate of expansion of the universe and found that it was speeding up. The idea of inflation emerged in the 1980s to solve various conceptual puzzles. But only recently have observations enabled scientists to weave these threads into a rigorous theory.

    Much of the progress has come through studies of the afterglow of the big bang, radiation known as the cosmic microwave background (CMB). In 2000, an experiment known as Balloon Observations of Millimetric Extragalactic Radiation and Geophysics measured the CMB in detail in patches of the sky; a year later, so did the ground-based Degree Angular Scale Interferometer at the South Pole. Then in 2003, NASA's space-based Wilkinson Microwave Anisotropy Probe (WMAP) mapped the CMB across the sky, producing an exquisite baby picture of the cosmos.

    It's the myriad dimples in that picture that count. The temperature of the CMB varies by about one part in 100,000 from point to point on the sky, with the hotter spots corresponding to the denser regions in the primordial universe. By measuring the distribution of the spots' sizes and fitting it with their theoretical model, scientists can probe the interplay and amounts of ordinary and dark matter in the early universe. They can also measure the geometry of space. That allows them to deduce the total density of energy and matter in the universe and infer the amount of dark energy.

    CMB studies show that the universe is flat. That means that if you draw parallel lines in space, then they will remain the same distance apart along their infinite lengths instead of converging or diverging. Such flatness is a central prediction of inflation, so the measurements bolster that wild idea. So does a particular randomness seen in the hot and cold spots. The WMAP team released its latest results in January, 9 months before the satellite shut down.

    The most jaw-dropping thing about the CMB results is that the cosmologists' model, which has only a half-dozen adjustable parameters, fits the data at all. This isn't some marginal approximation; the theory fits the data like the casing on a sausage. That makes it hard to argue that dark matter and dark energy are not real and to explain them away by, for example, modifying Einstein's theory of gravity. Other cosmological data, such as measurements of the distribution of the galaxies, also jibe nicely with the model.

    All of this leaves cosmologists in a peculiar situation: They have the precise proportions of the ingredients of the universe, but they still do not know what two of the three ingredients—dark matter and dark energy— really are. It's as if they've been handed a brownie recipe that calls for a “sweet granular substance” and a “white powder that mixes with water to make a paste,” instead of specifying sugar and flour.

    Cosmologists may have a better idea of what they're talking about by the end of the next decade. A particle theory called “supersymmetry” predicts the existence of weakly interacting massive particles (WIMPs), which could be the particles of dark matter. Researchers are optimistic that they may soon detect WIMPs bumping into sensitive detectors deep underground, emerging from high-energy particle collisions at an atom smasher, or producing telltale gamma rays in space. Numerous astronomical surveys will probe space-stretching dark energy, although figuring out what it is could take much longer than another decade.

  20. Tiny Time Machines Revisit Ancient Life

    1. Ann Gibbons

    Scientists have been giving us new views of the prehistoric world in the past decade that hinge on the realization that "biomolecules" such as ancient DNA and collagen can survive for tens of thousands of years and give important information about long-dead plants, animals, and humans.

    For hundreds of years, humans studying ancient life had to rely on stones and bones for hard evidence about what extinct creatures looked like. In the past decade, powerful new x-ray scans and three-dimensional computer models have transformed the analysis of bones, teeth, and shells. But something even more revolutionary has been afoot: a new kind of analysis capable of revealing anatomical adaptations that skeletal evidence can't provide, such as the color of a dinosaur's feathers or how woolly mammoths withstood the cold.

    Mammoth undertaking.

    Scientists sequenced the genome of this extinct mammal.

    CREDIT: DAVIDE BONADONNA

    The new views of the prehistoric world hinge on the realization that “biomolecules” such as ancient DNA and collagen can survive for tens of thousands of years and give important information about long-dead plants, animals, and humans. In the past 10 years, this field has surged forward beyond expectation. In 1997, the sequencing of the first snippet of DNA from a Neandertal by a team at the Max Planck Institute of Evolutionary Anthropology in Leipzig, Germany, was hailed as a major milestone. But this year, the same lab published the nuclear genome of Neandertals, representing 10 million times as much DNA (see “Breakthrough of the Year,” p. 1605). Over the past decade, these molecules have painted the past in living color: Ancient genes have shown that some Neandertals had red hair and pale skin, and ancient pigment-bearing organelles called melanosomes have revealed the chestnut color of a Sinosauropteryx dinosaur's downy tail.

    Ancient molecules can also expose relationships among long-dead species; for example, showing that the amino acid sequence of collagen from a dinosaur more closely resembled that of living birds than that of reptiles. Ancient DNA from woolly mammoths revealed that their blood contained special cold-adapted hemoglobin, which researchers reproduced in bacteria this year. DNA has even shown that a few Neandertals interbred with our ancestors.

    Ancient-DNA research began in the mid-1980s. But its early promise faded amid a series of spectacular but unfounded claims, such as supposed 80-million-year-old dinosaur DNA that was actually from a human. Ancient DNA was so often contaminated with DNA from bacteria or scientists themselves that no one trusted (or published) the findings. Funding dried up, and only a few labs survived, mostly in Europe.

    But over the past 10 years, new tools developed for exploring the human genome gave ancient DNA research a lifesaving infusion. For example, new high-throughput sequencers turned out to work best with small fragments of DNA—just what degraded ancient samples yield. The technology enabled scientists to sequence every bit of extracted DNA relatively cheaply. So they were able to develop methods to recognize and sort short bits of ancient DNA from longer chunks of contaminant DNA and also to fish out gene regions of interest.
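
    A heavily simplified sketch of that sorting idea follows. It is illustrative only: real ancient-DNA pipelines also align reads to reference genomes and look for the characteristic chemical damage that accumulates at fragment ends, and the 100-base cutoff here is an invented example, not a value from any published method:

```python
# Toy illustration of separating short, degraded "ancient-looking" DNA
# fragments from longer contaminant sequences by length alone. Real
# pipelines combine this with alignment and damage-pattern checks.

READ_LENGTH_CUTOFF = 100    # illustrative threshold, in bases

def partition_reads(reads):
    """Split reads into (putatively ancient, putative contaminant) by length."""
    short, long_ = [], []
    for read in reads:
        (short if len(read) <= READ_LENGTH_CUTOFF else long_).append(read)
    return short, long_

reads = ["ACGT" * 10,    # 40 bases  -> plausible ancient fragment
         "ACGT" * 60,    # 240 bases -> likely modern contaminant
         "ACGT" * 15]    # 60 bases  -> plausible ancient fragment
ancient_like, contaminant_like = partition_reads(reads)
print(len(ancient_like), len(contaminant_like))    # prints: 2 1
```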

    As sequencing power leapt forward, researchers began making discoveries at a record-breaking clip. In 2005, two teams worked together to sequence 27,000 bases of ancient cave bear DNA. Six months later, another team generated a remarkable 28 million bases of DNA from a mammoth bone and showed that mammoths split from African elephants about 6 million years ago. In 2008, that same team bought mammoth hair on eBay and sequenced the entire mammoth genome—the first genome of an extinct animal. That same year, the first complete mitochondrial genome of a Neandertal was published; five others soon followed. In 2010, that work culminated in the publication of the Neandertal genome.

    Code-buster.

    A researcher extracts DNA from ancient bone.

    CREDIT: © FRANK VINKEN

    Also this year, the Max Planck team achieved a feat that many researchers a decade ago would have dismissed as impossible: They identified a previously unknown species of human by its DNA alone. They sequenced the mitochondrial genome from a single finger bone found in a Siberian cave. The bone belonged to neither a Neandertal nor a modern human; it apparently came from a new species that lived in Central Asia just 40,000 years ago.

    Meanwhile, researchers are also zeroing in on other ancient molecules, such as RNA and collagen. Researchers at North Carolina State University in Raleigh made the spectacular claim that they had isolated collagen from a 68-million-year-old Tyrannosaurus rex and from an 80-million-year-old hadrosaur. Some scientists suspect that the collagen comes from bacteria rather than dinosaurs, but others are investigating collagen as a way to bar-code scraps of unidentifiable bone. A few are even looking at RNA—which is far more fragile than DNA—in ancient seeds to learn about gene expression in early crops.

    Unlocking the secret lives of long-dead organisms may have practical implications in today's world, too. Some researchers hope to use ancient DNA to introduce genetic diversity back into threatened populations, for example, those of polar bears. If they succeed, molecules of long-dead organisms may one day help save living ones.

  21. A Roller-Coaster Plunge Into Martian Water—and Life?

    1. Richard A. Kerr

    The past decade's half-dozen martian missions have made it clear that early in Mars history, liquid water on or just inside the planet did indeed persist long enough to alter rock and, possibly, sustain the origin of life.

    It's been a heck of a ride. From civilization-sustaining canals early in the past century to Mariner 9's barren, lunarlike terrain of the 1970s to shallow, salty seas on early Mars in this decade, the search for water on the Red Planet—and the life that liquid water would permit—has had its ups and downs.

    The ballyhooed shallow seas evaporated from scientists' view of early Mars within a few years, but even so, the past decade's half-dozen martian missions have finally delivered. It is now clear that early in Mars history, liquid water on or just inside the planet did indeed persist long enough to alter rock and, possibly, sustain the origin of life. There is now even enough moisture to encourage those who seek living, breathing alien microbes—organisms that, if they exist, could hold the key to explaining how life on our own planet got its start.

    Pointers.

    Clay's spectral blues and greens (above, left) will guide the Curiosity rover (below).

    CREDIT: NASA/JPL/JHUAPL (INSET); NASA/JPL/UNIVERSITY OF ARIZONA

    At the turn of the millennium, experts knew early Mars hadn't been bone dry. News headlines of the day spoke of “A Dripping Wet Early Mars” and suggested that “Mars Is Getting Wetter and Wetter” (to cite two examples from Science). Cameras on spacecraft following Mariner 9 had revealed landscapes sculpted by catastrophic floods. Layered sediments evoked long-past crater lakes, and there were even hints of ancient shorelines that had encompassed a great northern ocean. But as far as anyone knew, all of those reservoirs could have been too ephemeral to give life a start.

    A soaking splash of martian water came in early 2004 when the Opportunity rover discovered signs of long-lived seas, or at least broad lakes, on early Mars. Opportunity was investigating a splotch of spectral color, seen from orbit, that connoted the water-related iron mineral hematite. From close up, the landing site appeared to be the salty remains of a shallow sea or lake. Science proclaimed the find—along with the discovery of cryptic water-rotted rocks on the other side of the planet by the Spirit rover—the Breakthrough of the Year for 2004.

    CREDIT: NASA/JPL

    The excitement was short-lived. Opportunity never found much more evidence that surface waters had rippled the sediments. Instead of a sea, rare puddles of salty, acidic groundwater may have oozed up between arid dunes of salt sand. Such intensely briny, corrosive water would have been inimical to Earth-like life.

    Not to worry. Spectrometers flying on Mars Reconnaissance Orbiter and Mars Express the past half-dozen years are far more capable of spying out water-altered minerals, especially clays. Astrobiologists love clays. Their formation requires prolonged contact of water and rock under mild, life-favoring conditions, and they are great for preserving the remnant organic matter of past life. Researchers have now turned up water-generated clays in so many places on Mars that the scientists and engineers contemplating where to send the next rover—the high-powered Curiosity, a.k.a. Mars Science Laboratory—had scores of choices.

    Orbiting instruments have also turned up ice buried within centimeters of the soil surface and in now-stagnant, debris-covered glaciers hundreds of meters thick. Ice near the martian surface can't melt today, but because Mars wobbles wildly on its axis, its climate has swung through warmer spells lasting hundreds of thousands of years many times in the past.

    Early in the decade, planetary scientists started finding signs that some of this ice may have been liquid within geologically recent times. Sharper imaging revealed gullies on crater walls that look as if they were cut by water gushing down the slope, perhaps from melting snow. In October, team members from the Phoenix lander mission reported evidence that liquid water—perhaps little more than a damp film—had percolated through the upper few centimeters of far-north soil. And Spirit team members earlier this year suggested a similar seeping near the equator where the rover has become stuck.

    As a result of such discoveries, the search for life on Mars now means more than merely groping in the dirt for long-decayed molecular remains. For the first time, many scientists think they might realistically find actual microorganisms, either living or recently deceased. To biologists, that prospect is incalculably exciting. If such micro-martians turn up, will they be products of a separate origin, with their own distinctive biochemistry and genetic code? Or will they be related to life on Earth—perhaps precursors such as a hypothesized “RNA world”—carried from one planet to another by spaceborne debris?

    It's possible. Meteorites from Mars, knocked off the surface by asteroid impacts, have been found on Earth. Sometimes those rocks might make the trip quickly enough for bacterial hitchhikers to survive the crossing. Mars was wet enough early enough and long enough; if it turns out to harbor life as we know it, then life on Earth could well have started there.

  22. Cells Rewrite Their Own Destiny

    1. Gretchen Vogel

    By prompting a cell to overexpress a few genes, researchers have discovered in the past decade how to turn a skin or blood cell into a pluripotent cell: one that has regained the potential to become any number of cells in the body.

    Reprogrammed.

    Adding extra copies of a few genes turns the developmental clock backward, producing induced pluripotent stem cells.

    CREDIT: COURTESY OF SHINYA YAMANAKA, KYOTO UNIVERSITY

    A classic metaphor in biology pictures an embryonic cell at the top of a hill. As the embryo develops, the cell rolls downhill into a series of branching valleys. Once a cell enters, say, the valley that leads to becoming a skin cell, it cannot suddenly change course and become a neuron. When developmental biologist Conrad Waddington came up with the image in the 1950s, the message was clear: Development is a one-way trip.

    Not so. In the past decade, scientists have figured out how to push differentiated cells back up the hill and, perhaps even more surprising, directly from one valley to the next. By prompting a cell to overexpress a few genes, researchers can turn a skin or blood cell into a pluripotent cell: one that has regained the potential to become any number of cells in the body. Other genes can prompt skin cells to turn directly into neurons or blood cells. Scientists are already using the technique to make cell lines from patients with hard-to-study diseases, and ultimately they hope to grow genetically matched replacement cells and tissues—perhaps even entire organs.

    The foundation for this groundbreaking insight was built over several decades of research. John Gurdon laid the cornerstone in the 1960s, when he cloned frogs from adult cells by transferring the cell's nucleus into an enucleated egg. For the first time, a scientist had coaxed an adult cell's genetic material into starting over to make an entire new individual. In 1996, Dolly the sheep extended this “nuclear transfer” to mammalian cells. That work raised hopes that researchers could figure out how oocytes can reset the cellular clock of introduced adult DNA, allowing development to begin afresh in defiance of Waddington's metaphor.

    In 2006, however, Shinya Yamanaka stunned the world when he showed that simply by adding extra copies of four genes to adult mouse cells, he could prompt them to become pluripotent—no help from oocytes necessary. He called the resulting cells “induced pluripotent stem” cells (iPSCs).

    A year later, two groups—one led by Yamanaka, the other by James Thomson—independently created iPSCs from human skin cells. The groups used slightly different combinations of genes, demonstrating that there are alternative ways to make the process work. Soon thereafter, several groups showed that it was possible to reprogram one adult cell type directly into another, turning fibroblasts into neurons or into blood cells, among other results.

    The breakthrough offered a way around some of the sticky ethical and political issues that have dogged research with human embryonic stem cells, which are taken from early embryos. Suddenly, scientists had a source of human pluripotent cells free of special rules and regulations.

    Early reprogramming techniques did have several drawbacks, however. First, they permanently inserted the extra genes into the reprogrammed cell's genome. Although the genes seemed to turn back off once the cells were pluripotent, it wasn't clear how they might influence the cells' later behavior. Second, at least one of the genes Yamanaka used was known to trigger cancer, and indeed it soon became clear that mice grown from iPSCs frequently developed tumors. Finally, the process was inefficient, reprogramming only about one in 5000 of the treated cells.

    Many of those problems have now been addressed. Some labs have reprogrammed cells using viruses that don't insert themselves into the genome. Others have used small rings of DNA called episomes that don't replicate when the cell divides. Other researchers have found small molecules that can substitute for some of the genetic factors, and they have found ways to insert reprogramming proteins directly into a cell. A few months ago, one group described how to use modified RNA to reprogram cells faster and more efficiently than the original technique.

    At the same time, scientists have been eagerly reprogramming cells from hundreds of patients and healthy controls, in an effort to uncover the origins of various diseases and perhaps find new treatments. Labs are using such cells to study amyotrophic lateral sclerosis, Parkinson's disease, Huntington's disease, and even autism. Pharmaceutical companies are using heart cells grown from iPSCs to test drugs for cardiac side effects, a common reason that promising drugs fail.

    Researchers are also working to understand exactly how reprogramming works. The evidence so far suggests that a bit of luck is involved: A cell has to receive the right dose of each factor at the right time. That could explain why the process is so inefficient: Only a small fraction of cells happens to receive the correct dose. Nuclear transfer is much more efficient, and some researchers are still working to piece together how the oocyte works its magic, in the hope that it might provide clues to make reprogramming work better.

    Reprogramming has reshaped the developmental biology landscape, both for cells and for researchers. Scientists hope that in the coming decades it will reshape medicine as well.

  23. Body's Hardworking Microbes Get Some Overdue Respect

    1. Elizabeth Pennisi

    This past decade has seen a shift in how we see the microbes and viruses in and on our bodies, most of which are commensal and just call the human body home; collectively, they have come to be called the human microbiome.

    Humans have been doing battle with bacteria since the 1800s, using antibiotics, vaccines, and good hygiene to thwart disease, with mixed success. But in 2000, Nobel laureate Joshua Lederberg called for an end to the “We good; they evil” thinking that has fueled our war against microbes. “We should think of each host and its parasites as a superorganism with the respective genomes yoked into a chimera of sorts,” he wrote in Science.

    His comments were prescient. This past decade has seen a shift in how we see the microbes and viruses in and on our bodies. There is increasing acceptance that they are us, and for good reason. Nine in 10 of the cells in the body are microbial. In the gut alone, as many as 1000 species bring to the body 100 times as many genes as our own DNA carries. A few microbes make us sick, but most are commensal and just call the human body home. Collectively, they are known as the human microbiome. Likewise, some viruses take up residence in the body, creating a virome whose influence on health and disease is just beginning to be studied.

    Their genes and ours make up a metagenome that keeps the body functioning. This past decade we've begun to see how microbial genes affect how much energy we absorb from our foods and how microbes and viruses help to prime the immune system. Viewing the human and its microbial and viral components as intimately intertwined has broad implications. As one immunologist put it, such a shift “is not dissimilar philosophically from the recognition that the Earth is not the center of the solar system.”

    Invisible partners.

    The roster of bacteria varies among body sites.

    CREDIT: ADAPTED FROM L. DETHLEFSEN ET AL., NATURE 449 (18 OCTOBER 2007)

    This appreciation has dawned gradually, as part of a growing recognition of the key role microbes play in the world. Microbiologists sequencing DNA from soil, seawater, and other environments have discovered vast numbers of previously undetected species. Other genomics research has brought to light incredible intimacies between microbes and their hosts—such as a bacterium called Buchnera and the aphids inside which it lives. A study in 2000 found that each organism has what the other lacks, creating a metabolic interdependency.

    One of the first inklings that microbiologists were missing out on the body's microbial world came in 1999, when David Relman of Stanford University in Palo Alto, California, and colleagues found that previous studies of bacteria cultured from human gums had seriously undercounted the diversity there. Turning to samples taken from the gut and from stools, the researchers identified 395 types of bacteria, two-thirds of them new to science.

    In 2006, Steven Gill of the University at Buffalo in New York and colleagues did a metagenomics study of the gut, analyzing all the genes they could find in the 78 million bases sequenced. They found metabolic genes that complemented the human genome, including ones that break down dietary fiber, amino acids, or drugs, and others that produce methane or vitamins. This and a more comprehensive survey in 2010 by Jun Wang of BGI-Shenzhen in China and colleagues provided support for the concept of the microbe-human superorganism, with a vast genetic repertoire. Now, large-scale studies have surveyed the microflora in the gut, skin, mouth, nose, and female urogenital tract. The Human Microbiome Project has sequenced 500 relevant microbial genomes out of a planned 3000.

    Some of these microbes may play important roles in metabolic processes. In 2004, a team led by Jeffrey Gordon of Washington University School of Medicine in St. Louis, Missouri, found that germ-free mice gained weight after they were supplied with gut bacteria—evidence that these bacteria helped the body harvest more energy from digested foods. Later studies showed that both obese mice and obese people harbored fewer Bacteroidetes bacteria than their normal-weight counterparts.

    The microbiome is also proving critical in many aspects of health. The immune system needs it to develop properly. What's more, to protect themselves inside the body, commensal bacteria can interact with immune cell receptors or even induce the production of certain immune system cells. One abundant gut bacterium, Faecalibacterium prausnitzii, proved to have anti-inflammatory properties, and its abundance seems to help protect against the recurrence of Crohn's disease. Likewise, Sarkis Mazmanian of the California Institute of Technology in Pasadena showed that the human symbiont Bacteroides fragilis kept mice from getting colitis. And inserting bacteria isolated from healthy guts restored the microbial communities, curing chronic diarrhea in a patient infected with Clostridium difficile.

    Herbert Virgin of Washington University School of Medicine finds a similar role for the virome. In mice, his team found that dormant herpesviruses revved up the immune system just enough to make the mice less susceptible to certain bacterial infections.

    The ideas of a microbiome and a virome didn't even exist a decade ago. But now researchers have reason to hope they may one day manipulate the body's viral and microbial inhabitants to improve health and fight sickness.

  24. Alien Planets Hit the Commodities Market

    1. Yudhijit Bhattacharjee

    Data on the 500-and-counting planets discovered outside of our solar system in the past decade are revolutionizing researchers' understanding of how planetary systems form and evolve.

    Plural worlds.

    The Kepler space telescope (below) has already spotted hundreds of candidate planets around other stars.

    CREDITS: ESO; NASA (INSET)

    There are countless suns and countless earths all rotating around their suns in exactly the same way as the seven planets of our system. We see only the suns because they are the largest bodies and are luminous, but their planets remain invisible to us because they are smaller and non-luminous.
    —Giordano Bruno, 1584

    For holding firm to this idea of plural worlds, Giordano Bruno spent 7 years in a dungeon; then, on 17 February 1600, he was led to a public square in Rome and burned at the stake. If Bruno had had the power to summon the future, his best shot at survival might have been to show his inquisitors the Web page of the Extrasolar Planets Encyclopedia, circa 2010. Evidence from the year 2000, when the planets in the encyclopedia numbered a mere 26, might not have done the trick. But the latest tally, 505 and counting, surely would have stayed their torches.

    In the past decade, astronomers have discovered so many planets outside of the solar system that only the weirdest of them now make the mainstream news—such as WASP-17, a giant planet discovered in August 2009, which orbits “backward,” or counter to the spin of its parent star. A software application for iPhones and iPads keeps track of exoplanet discoveries; the score crossed 500 as this article was being written. Hundreds more may soon follow as astronomers pursue some 700 candidates that NASA's Kepler space telescope detected in the first few months after its launch in March 2009.

    Although most of the planets discovered so far are gas giants, an analysis of the Kepler data has convinced researchers that smaller Earth-like planets abound in the universe and that improved detection capabilities in the coming years will turn up scores of them just in our galactic backyard. This insight has opened up the possibility of detecting life elsewhere in the universe within the lifetimes of young astronomers entering the field, if not sooner. Meanwhile, the sizes and orbits of planets already discovered are revolutionizing researchers' understanding of how planetary systems form and evolve.

    The discovery of exoplanets began as a trickle in the previous decade, starting with the detection of “51 Pegasi b” in 1995 by a Swiss team led by Michel Mayor, followed the next year by the discoveries of five planets by U.S. astronomers Geoffrey Marcy, Paul Butler, and their colleagues. By 2001, several other teams had joined the quest, and the pace of discovery quickened.

    The oldest and most popular technique for finding planets has been Doppler spectroscopy: measuring the blueward or redward shift in the light of a star as it wobbles under the gravitational tug of an orbiting planet. In 1999, astronomers also began detecting exoplanets by the transit technique, watching for a star to dim slightly as its planet travels across its face. Transits have yielded the discovery or confirmation of more than 100 planets to date.
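    How big are those two signals? As a rough sketch, using textbook values rather than anything reported in this article, the short Python calculation below estimates the stellar wobble and the transit dimming for a Jupiter-like planet and for a close-in “hot Jupiter” resembling 51 Pegasi b; it helps explain why close-in giants were the first planets found.

        # Back-of-the-envelope estimates for the two detection methods described above.
        # All numbers are illustrative textbook values, not data from this article.
        import math

        G, M_sun, M_jup = 6.674e-11, 1.989e30, 1.898e27   # SI units
        R_sun, R_jup = 6.957e8, 7.149e7                    # meters

        def rv_semi_amplitude(m_planet, m_star, period_s):
            """Peak line-of-sight wobble speed of the star (m/s), circular edge-on orbit."""
            return ((2 * math.pi * G / period_s) ** (1 / 3)
                    * m_planet / (m_star + m_planet) ** (2 / 3))

        def transit_depth(r_planet, r_star):
            """Fractional dimming as the planet crosses the star's face."""
            return (r_planet / r_star) ** 2

        # Jupiter orbiting the Sun: a ~12 m/s wobble over 12 years and a ~1% transit dip.
        print(rv_semi_amplitude(M_jup, M_sun, 11.86 * 365.25 * 86400))   # ~12 m/s
        print(transit_depth(R_jup, R_sun))                                # ~0.011

        # A hot Jupiter resembling 51 Pegasi b (4.2-day orbit, ~0.47 Jupiter masses):
        # a far larger ~56 m/s wobble, well within reach of 1990s spectrographs.
        print(rv_semi_amplitude(0.47 * M_jup, 1.05 * M_sun, 4.23 * 86400))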

    Since 2001, planet-hunters have added two more techniques to their toolbox. One is microlensing, in which a star briefly brightens as the gravity of another star in the foreground bends its light; changes in the brightening can reveal a planet orbiting the foreground star. Researchers led by Ian Bond of the Royal Observatory, Edinburgh, in the United Kingdom announced the first discovery of a planet through microlensing in 2004; the technique has led to 10 more finds since.

    In 2008, astronomers published the first direct images of exoplanets: tiny pinpricks of light close to a nearby star. With advances in adaptive optics, the technology that corrects for the blurring effect of the atmosphere on ground-based telescopes, and the development of better coronagraphs—devices that help block out the direct light from a star—astronomers hope to image many more planets directly.

    The diversity of planetary systems discovered to date has forced astronomers to revise their theories of how these systems arise and develop. The discovery of hot Jupiters orbiting very close to their parent star suggests that gas giants—thought to form far out from the star—can migrate inward over time. And the discovery of planets dancing around their stars in tilted or even retrograde orbits suggests that planets can be wrenched from their original birthplaces into odd orbits that astronomers could not have predicted.

    Astronomers expect Kepler to find several Earth-like planets in the next few years. Already, researchers are planning new ground- and space-based instruments to take spectra of the atmospheres of some of those habitable planets. Those atmospheres may bear signatures of life, such as oxygen, which researchers believe can be produced only by biological processes. If and when that happens, it would be the ultimate vindication of Bruno's fatal vision of a cosmos teeming with worlds.

  25. Inflammation Bares a Dark Side

    1. Jennifer Couzin-Frankel

    Over the past decade, it has become widely accepted that inflammation is a driving force behind chronic diseases that will kill nearly all of us: cancer, diabetes and obesity, Alzheimer's disease, and atherosclerosis.

    Burrowing in.

    Macrophages were an early clue that chronic disease and inflammation are tightly linked.

    CREDIT: DAVID M. PHILLIPS/PHOTO RESEARCHERS INC.

    Not long ago, inflammation had a clear role: It was a sidekick to the body's healers, briefly setting in as immune cells rebuilt tissue damaged by trauma or infection. Today, that's an afterthought. Inflammation has hit the big time. Over the past decade, it has become widely accepted that inflammation is a driving force behind chronic diseases that will kill nearly all of us. Cancer. Diabetes and obesity. Alzheimer's disease. Atherosclerosis. Here, inflammation wears a grim mask, shedding its redeeming features and making sick people sicker.

    When a kitchen knife slips while you're chopping vegetables, the body reacts swiftly. White blood cells swoop in and sterilize the injury, and the tissue-repair effort begins. This inflammatory response does have its downsides, causing swelling, redness, and pain. (Indeed, “inflammation” derives from the Latin verb inflammare, which means to set on fire.) But there's no question that acute inflammation is a net positive, a response to trauma that evolved millennia ago to keep us alive and healthy.

    A darker story began to emerge in the 1990s. Researchers peering at apparently unrelated diseases noticed that immune cells congregate at disease sites. Atherosclerosis, in which fatty plaques build up in the arteries, was among the first to make the list. In the 1980s, the late Russell Ross of the University of Washington, Seattle, saw macrophages in atherosclerotic tissue; these white blood cells are a hallmark of inflammation. Slowly, as more people parsed arterial tissue, more came to agree that an inflammatory response was under way. There were T cells. There was interferon-γ, which the immune system produces as part of its inflammatory efforts. Also in the mix were gene variants identified by the Icelandic company deCODE that predispose people to heart attacks by fueling inflammation in plaques. And then this April, researchers used a new microscopic technique to describe, in Nature, tiny crystals of cholesterol in arteries that induce inflammation at the earliest stages of disease in mice.

    Other conditions unrolled parallel story lines. In 1993, a group at Harvard University found that fat tissue in obese mice was churning out a classic inflammatory protein. Ten years later, back-to-back papers showed a correlation between macrophage infiltration of fat tissue in rodents and people and how obese they were. Newcomers to the inflammatory story include neurodegenerative diseases such as Alzheimer's and Parkinson's. Here, it's murkier whether inflammation is perpetuating disease or just along for the ride.

    In most chronic illnesses for which inflammation has been fingered, it appears to drive ill health but not initiate it. In cancer, for example, papers published over the past decade suggest that tumors and inflammation dance together toward disaster: Tumors distort healthy tissue, setting off tissue repair, which in turn promotes cell proliferation and blood vessel growth, helping cancers expand. And although it's genetic mutations in tumor cells that initiate cancer, there's evidence that inflammation in surrounding tissue helps coax those cells along.

    In cancer, inflammation shows up at least partly for the same reasons it normally does: tissue injury. Elsewhere, its appearance is more mysterious. In neurodegenerative conditions, for example, there's some tissue damage from loss of neurons, which could prod inflammation—but there's evidence, too, that inflammation is helping kill neurons. Inflammation also seems to promote two components of type 2 diabetes: insulin resistance and the death of pancreatic beta cells that produce insulin.

    When it comes to obesity, it's unclear why inflammation permeates fat tissue. But theories are percolating. One cites a misguided immune response: Fat cells in obese individuals are not metabolically normal, and the immune system perceives them as needing help and sends macrophages to the rescue, even though they only do harm.

    The surest way to prove that inflammation is driving any disease is by blocking it and testing whether that helps, and experiments are under way. In 2007, Marc Donath of University Hospital of Zurich in Switzerland and his colleagues described results from a clinical trial of type 2 diabetes that had once been dismissed as crazy. Seventy patients received either a placebo or anakinra, a drug used occasionally to treat rheumatoid arthritis that blocks interleukin-1. IL-1 is a proinflammatory cytokine, a protein that promotes inflammation; it's been found in beta cells from people with type 2 diabetes. In Donath's small study, published in The New England Journal of Medicine, the drug helped control the disease. Anakinra is not a good option for long-term diabetes treatment, so several companies are racing to develop alternatives.

    Taming inflammation in chronic diseases is a new frontier, its success still uncertain. But after inflammation's role in disease eluded them for so long, researchers are chasing lead after lead, trying to stay a step ahead and discern when its fires need putting out.

  26. Strange New Tricks With Light

    1. Robert F. Service,
    2. Adrian Cho

    In the past decade, physicists and engineers pioneered new ways to guide and manipulate light, creating lenses that defy the fundamental limit on the resolution of an ordinary lens and even constructing "cloaks" that make an object invisible—sort of.

    Now you see it.

    Metamaterials make it possible to steer light around an object, creating an artificial blind spot (below). A disk version works with microwaves (left).

    CREDIT: DAVID SCHURIG

    Three centuries after Isaac Newton published his Opticks, that age-old science got really weird. In the past decade, physicists and engineers pioneered new ways to guide and manipulate light, creating lenses that defy the fundamental limit on the resolution of an ordinary lens and even constructing “cloaks” that make an object invisible—sort of.

    The feats sprang from a roughly 50–50 mixture of a new technology and one oh-so-clever idea. The technology was “metamaterials”: assemblages of little rods, rings, and wires that act like materials with bizarre optical properties. The idea was transformation optics, a scheme that tells scientists how to tailor the properties of a metamaterial to achieve an effect like cloaking.

    Metamaterials work by steering light and other electromagnetic waves. When light waves enter normal materials such as glass, the material alters the electric and magnetic fields in the light, slowing the waves to a particular new speed. The deceleration gives the new medium a different refractive index. This effect explains why a straw in a glass of water appears to bend at the surface of the water. In natural materials, light waves entering at an angle always bend to plunge more steeply into the material—the hallmark of a refractive index greater than 1.
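    In the usual textbook form (the article itself stops short of equations), that bending is captured by Snell's law,

        \[ n_1 \sin\theta_1 \;=\; n_2 \sin\theta_2 , \]

    where n_1 and n_2 are the refractive indices on either side of the surface. The larger n_2 is, the more steeply the transmitted ray plunges toward the normal; flip the sign of n_2 and the outgoing angle flips too, which is the possibility taken up next.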

    Back in 1968, Victor Veselago, a Russian physicist, reasoned that a material might be engineered to create a negative index of refraction and bend light waves more radically. If water had a negative index of refraction, then a straw placed in a glass would appear to bend back under itself. Such a material, Veselago determined, could form a superlens: a flat sheet capable of focusing light even better than curved lenses do. The notion sat dormant for decades. Then, in the late 1990s, John Pendry, a physicist at Imperial College London, and colleagues determined that the long, thin shape of carbon nanotubes helped them absorb radio waves.

    Pendry started pondering how other artificial materials might affect electromagnetic waves. Copper wires and slitted rings will ring, or “resonate,” with electric and magnetic fields of specific frequencies, and Pendry realized that by playing with those resonances he could tune the electric and magnetic properties of a metamaterial independently to achieve a negative refractive index for microwaves. He also reasoned that it should be possible to build Veselago's superlens to see objects smaller than one-half the wavelength of the light it transmits. No conventional lens can beat that “diffraction limit.” But metamaterials, Pendry realized, might do it by amplifying the faint, rapidly decaying ripples, called evanescent waves, that carry an object's finest details but never reach an ordinary lens.
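    That resolution floor, the Abbe limit, can be written in standard textbook form (again, not the article's own notation) as

        \[ d \;\approx\; \frac{\lambda}{2\,n\sin\theta} \;\gtrsim\; \frac{\lambda}{2} , \]

    where λ is the wavelength, n the refractive index of the medium, and θ the half-angle of light the lens collects: an ordinary lens working in air cannot resolve features much smaller than half a wavelength, because the evanescent waves carrying that finer detail die away before they reach it.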

    CREDIT: J.B. PENDRY ET AL., SCIENCE 312 (23 JUNE 2006)

    Pendry's ideas touched off a torrent of experiments. In 2001, Pendry teamed up with physicist David Smith, then at the University of California (UC), San Diego, and now at Duke University in Durham, North Carolina, to build Pendry's negative-refractive-index material for microwaves—although it took them a few years to convince the scientific community that it worked. Just a few years later, other teams reported initial success in making a superlens and related devices called hyperlenses.

    Then things got really strange. In May 2006, Pendry and colleagues and, independently, Ulf Leonhardt, a theorist at the University of St. Andrews in the United Kingdom, reported that metamaterials could be used to render an object invisible by steering light around it the way water streams around a boulder. Just 5 months later, Smith, his postdoc David Schurig (now at North Carolina State University in Raleigh), and other colleagues unveiled such a ring-shaped cloaking device.

    The device wasn't perfect; it worked only for microwaves of a specific frequency. The real advance lay in the concept behind it. In his theory of general relativity, Einstein realized that space and time can stretch and warp in ways that change the trajectory of light. So Leonhardt and Pendry imagined bending space to steer light around a circular region, making anything inside that hole invisible.

    The theorists realized that they could mimic this extreme stretching of space by filling an unwarped region of space with a metamaterial whose electric and magnetic properties vary in a specific way. They laid out the mathematical theory for making that transformation. In principle, that theory enables experimenters to bend light pretty much any way they want, provided they can sculpt a metamaterial accordingly.
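    A minimal sketch of that recipe, in the notation standard in the transformation-optics literature (the article prints no equations): if Λ is the Jacobian matrix of the coordinate transformation describing the desired warping of space, the metamaterial that mimics it must have permittivity and permeability tensors

        \[ \varepsilon' \;=\; \frac{\Lambda\,\varepsilon\,\Lambda^{\mathsf T}}{\det\Lambda}, \qquad \mu' \;=\; \frac{\Lambda\,\mu\,\Lambda^{\mathsf T}}{\det\Lambda} . \]

    Specify how you want light's paths to bend, and the prescription tells you, point by point, the electric and magnetic response the metamaterial must supply.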

    The wonders keep coming. In 2008, researchers built a microwave cloak that works over a range of wavelengths. That same year, UC Berkeley physicist Xiang Zhang and colleagues created the first three-dimensional metamaterials that refract light backward at optical wavelengths. Related cloaks have been made to work with infrared rays and visible light.

    Where all of this will lead is hard to say. Much to the disappointment of Harry Potter fans, cloaking a human-sized object at visible wavelengths seems like a long shot. But metamaterials for microwaves could have myriad practical uses. And the idea of transformation optics is so beautiful that it would seem a profligate waste of inspiration if it didn't lead to something useful.

  27. Climatologists Feel the Heat As Science Meets Politics

    1. Richard A. Kerr,
    2. Eli Kintisch

    In the past few years, climate scientists finally agreed that the world is indeed warming, humans are behind it, and natural processes are unlikely to rein it in—just as they had suspected.

    Both sides now.

    Global warming's impacts—both drying and wetting—have come sooner than expected.

    CREDITS (FROM LEFT TO RIGHT): © JP LAFFONT/SYGMA/CORBIS; KONRAD STEFFEN, UNIVERSITY OF COLORADO/CIRES

    Most insights come as a surprise: a burst of understanding, an elegant solution to a problem. This decade's main insight in climate science was a different breed. For 40 years, researchers had wrestled with three big questions: Is the world warming? If so, are humans behind the warming? And are natural processes likely to rein it in? In the past few years, climate scientists finally agreed on solid answers: yes, yes, and no—just as they had suspected.

    There were surprises, and they were bad ones. The effects of rising greenhouse gases on oceans and polar ice were swifter than models had predicted. Yet, faced with the obvious remedy—cutting carbon emissions—the world balked. In the United States, even as the science grew stronger, a political backlash forced climate scientists to defend their credibility and motives.

    The sudden reversal blindsided global-warming researchers. They had been issuing assessments of the state of greenhouse-warming science under the aegis of one organization or another since 1979; in 1990, the new United Nations Intergovernmental Panel on Climate Change (IPCC) took the lead. IPCC's second assessment, released in 1995, asserted mildly that “the balance of evidence suggests” that humans were influencing global climate. But by 2007, IPCC had reached a solid scientific consensus: Warming was “unequivocal,” it was “very likely” due mostly to human beings, and natural processes were “very unlikely” to blunt its strength. The breadth and depth of the IPCC process seemed to drown out the small but well-publicized chorus of climate contrarians.

    Developments around the globe amplified the message. In the 1980s and '90s, most researchers thought the projected impacts of rising greenhouse gases wouldn't hit hard until well into the 21st century. But by the mid-2000s, summertime Arctic sea ice was obviously disappearing, ice shelves were falling apart, and Greenland and West Antarctic glaciers were rushing to the sea. Hurricane Katrina inundated New Orleans just as scientists were debating how the greenhouse could intensify and multiply hurricanes. Even ocean acidification was an observational fact by decade's end. In April 2006, a cover story in Time magazine treated global warming as a given and warned starkly: “Be Worried. Be Very Worried.”

    But powerful nations were acting anything but. As a presidential candidate in 2000, George W. Bush had pledged to regulate CO2; as president, he swiftly reneged and pulled the United States out of the Kyoto Protocol, an emissions-limiting treaty negotiated 3 years earlier and since ratified by 187 other countries. There followed years of efforts by the Bush Administration to alter a handful of climate science reports to downplay the possible effects of climate change, while lawmakers in Washington and negotiators overseas repeatedly failed to pass comprehensive U.S. or international regulations. Europe had some initial success with its cap-and-trade system, but even the World Wildlife Fund says there is “no indication that the scheme is as yet influencing longer-term investment decisions.”

    A new Administration in Washington brought a change in tone but not in course. During the 2008 presidential campaign, Barack Obama pledged to cut U.S. emissions 80% by 2050 relative to 1990; after the election, the U.S. House of Representatives passed a bill that would have done basically that. But the bill died in the Senate this year, after President Obama failed to secure a binding treaty on emissions at key negotiations in Copenhagen in December 2009.

    In November 2009, just before the Copenhagen talks, the release of e-mail correspondence among scientists, taken from the servers of the University of East Anglia in the United Kingdom, gave climate science a jolt of bad publicity. Five panels of experts later absolved the scientists of scientific malfeasance. Even so, the event may have profoundly damaged public views of climate science, with political repercussions yet to unfold. Last month's U.S. congressional elections may hint at things to come: Most Republicans who won election to the House and nearly all Republican Senate candidates have questioned the fundamental science behind climate change, and a few of them denounce the entire field as a conspiracy. “The war on climate science and scientists that's going on now makes the Bush Administration look moderate,” says Rick Piltz, a White House climate official from 1995 to 2005 who now heads the watchdog group Climate Science Watch in Washington, D.C.

    There are hints of movement. The U.S. Environmental Protection Agency is girding for battle to cut emissions of big power plants, and China, Indonesia, Brazil, and India have recently made their first-ever commitments to tackle emissions. But “climate hawks” have lost time and momentum, and many experts now think that adapting to a warming planet, not mitigating emissions, will dominate policy discussions in the decade ahead.
