# News this Week

Science  12 Nov 2010:
Vol. 330, Issue 6006, pp. 896
1. U.S. Elections

# Researchers Anxious and on the Defensive After Republican Gains

1. Jeffrey Mervis

More scrutiny. Less money.

U.S. scientists trying to assess the impact of last week's midterm elections on the U.S. research enterprise have begun preparing for the former in hopes of staving off the latter. At the same time, community leaders acknowledge that additional scrutiny of their work and its value to the nation is merited and that scientists need to do a better job of interacting with Congress and the public if they hope to fend off assaults on research and training budgets.

Many researchers fear the worst after a Republican resurgence at the polls produced a 25-plus-seat majority in the House of Representatives and loosened the Democrats' grip on the Senate. The 2 November vote ended a 4-year streak of district, state, and national successes by Democrats that paved the way for unprecedented increases in federal research funding. One notable achievement was the 2007 America COMPETES Act that authorized a 7-year doubling of spending on the physical sciences at three key agencies, a promise that subsequent congresses have tried to keep. Another landmark was the $20 billion for basic research across several agencies that was part of the $787 billion stimulus package enacted shortly after President Barack Obama took office in 2009 (Science, 27 November 2009, p. 1176).

The 112th Congress that will convene in January could be headed down another path. Budget hawks are preparing to reduce overall federal spending, newly elected members are questioning the need to take action against rising levels of greenhouse gases, and advocates for smaller government are eying pieces of the Department of Education, the Department of Energy, and even the National Science Foundation (NSF).

Still, science lobbyists point to some bright spots amid those dark clouds. They emphasize that previous Republican Administrations and Congresses have been strong supporters of basic research, which has remained one of the few bipartisan issues in an increasingly divided political culture. They note that Obama has already said he will consider alternatives to the much maligned cap-and-trade system to lower carbon emissions (see p. 897). And they suggest that dismantling government is easier said than done. The 1994 Republican electoral wave that was buoyed by similar rhetoric failed to crest, they recall; indeed, the federal government actually grew significantly during the subsequent Administration of George W. Bush. In addition, they note, any House-passed measures are likely to face tough resistance in the Senate.

But few deny that the election has altered the landscape for science. “In terms of appropriations, it'll be harder for everybody, including academic research,” says Peter McPherson, president of the Association of Public and Land-grant Universities. “I agree that we need to reduce the deficit [$1.3 trillion this year]. But it would not be good for the country if Congress didn't continue to provide strong support for university research.”

The new Congress will convene without several staunch supporters of science. In the Senate, Arlen Specter (D–PA), a fierce and effective advocate for biomedical research, is departing after 30 years, following a loss in the Democratic primary earlier this year. Two huge losses to the House Science and Technology Committee are Representative Bart Gordon (D–TN), its current chair and the driving force behind an attempted reauthorization this year of the COMPETES Act, and Representative Vern Ehlers (R–MI), one of three physicists in the current Congress. Both are retiring, Gordon after 26 years and Ehlers after 17. The science committee also loses veteran representatives Brian Baird (D–WA) and Bob Inglis (R–SC). On appropriations, the subcommittee that funds NSF, NASA, and the Commerce Department's science agencies loses Representative Alan Mollohan (D–WV), its current chair. Representative Bill Foster (D–IL), the second member of the unofficial physics caucus, lost his bid last week for a second full term. [The lone remaining physicist in the House is Representative Rush Holt (D–NJ), who narrowly won reelection to a seventh term.]

“We're back to where we were in 2004,” observes Michael Lubell, a lobbyist for the American Physical Society, about the ideological makeup of the new Congress. And that's not a good place to be, he says. “There will be a lot of new members who don't have much interest in the so-called elites, including scientists. The public doesn't dislike science, but it doesn't like scientists, especially if they say that they are from Stanford or Harvard and that they know what's best.”

Ehlers, a college physics professor turned lawmaker and a champion of both basic research and science education, agrees with Lubell that scientists need to change their approach to public discourse. “A frequent mistake that intellectuals make is to think, ‘Since I'm smart, I know everything, and that if you don't listen to me you're an idiot,’” says Ehlers. “That doesn't work. You have to have respect for your member of Congress and say, ‘What can I tell you that will help you understand the gravity of the situation?’”

He also thinks that scientists need to adapt to what he says is a major change in the political landscape. “There is a new era here, and they should be getting to know the Republicans,” says Ehlers. “Scientists are making a big mistake if they think that they can hunker down and just wait for Democrats to reclaim the House. Most university faculty tend to be liberal and identify with Democrats. So they need to become more open-minded and stop ridiculing Republicans and start trying to work with them. Otherwise, they won't be very effective.”

Lubell suspects that many scientists will have a hard time making those adjustments. “I spoke at a national lab before the election and asked them what plans they had made for reductions in federal support and for working with people who know nothing about science,” he says. “They were shocked. They didn't believe me.” And even a fresh approach may not be good enough, he warns. “I think that there are Republicans who will be receptive if you make the case that science isn't something that the private sector will fund,” says Lubell. “But even then, it may come down to trying to argue against cutting science, or that Congress should use a duller ax.”

2. U.S. Elections

# Election Means Change in Climate for Efforts to Curb Emissions

1. Eli Kintisch

Joe Manchin, the Democratic governor from coal-rich West Virginia, won a narrow election for a U.S. Senate seat last week, helping Democrats retain control of that body. But a memorable TV ad showing him shooting a copy of a cap-and-trade bill with a rifle sent a clear message: A core feature of the Obama Administration's approach to curbing greenhouse gas emissions is dead.

With “No on cap-and-tax” among their key talking points, Republicans who triumphed in Congress and in dozens of state capitols have vowed to block the president's plan to limit greenhouse gas emissions and reject state laws with a similar goal. And many Democrats, like Manchin, are likely to join them. Although a California state referendum that would have effectively repealed a statewide cap-and-trade law was defeated, the Administration is already bowing to the inevitable. “Cap and trade was just one way of skinning the cat,” a chastened President Barack Obama said in a press conference the day after the election. “It was a means, not an end. And I'm going to be looking for other means to address this problem.”

With those words, Obama signaled that he is eager to find a bipartisan approach to curbing emissions. But aggressive questioning of the fundamental tenets of climate science by many right-leaning politicians may make such an approach elusive. Several postelection analyses suggested that most Democrats who lost races for the House of Representatives would have done so whether or not they voted against the cap-and-trade bill that passed the House before stalling in the Senate. But the Republican electoral wave was accompanied by an increasing willingness of candidates to question whether anthropogenic greenhouse gas emissions are warming the planet.

Morgan Griffith, who defeated veteran Representative Rick Boucher (D–VA), said human-made warming was a “scientific theory … that many scientists do not even believe is happening.” Dozens of House candidates—and every Republican Senate candidate—made similar public statements. “It's not only anti-science and anti–fact-based policymaking, but it's just political demagoguery,” says Representative Alan Mollohan (D–WV), a senior House member who lost his seat in a Democratic primary.

Still, policymakers are looking for areas of compromise. White House spokesperson Robert Gibbs said one possible way to make progress on emissions might be to pass a national “renewable portfolio standard.” Enacted in two dozen states, such a system requires utilities to obtain a significant portion of their power from nonemitting sources. Congress so far has balked at such legislation, although Senator Jeff Bingaman (D–NM), chair of the Senate Energy and Natural Resources Committee, hopes to succeed in the lame-duck session. Economist Richard Schmalensee of the Massachusetts Institute of Technology in Cambridge, however, warns that such a standard could deliver more costly emissions reductions than a cap-and-trade scheme. He says it would limit utilities' options by forcing them to use solar or wind power, for example, instead of making their coal plants more efficient.

A new report by authors who span the political spectrum calls for rapidly doubling federal funding for the Department of Energy's Office of Science and its Energy Frontier Research Centers while quintupling the budget of the Advanced Research Projects Agency–Energy. Even so, co-author Mark Muro of the Brookings Institution admits that a big boost in federal spending on energy research—now roughly $4 billion annually—would be a “real challenge” for Congress, given the newly powerful Tea Party movement, for whom limiting spending is paramount.

Even with cap and trade dead, there will be plenty of new opportunities for rancor. The Environmental Protection Agency (EPA) is currently setting up rules to limit carbon emissions from power plants, and Obama has underscored that he is committed to that effort. But House Republicans—and many in the Senate, which Democrats still control—have vowed to risk an Obama veto by withholding funding for such efforts. “If [Obama] wants a fight on the [EPA's] budget, we'll need to take it to him,” says Griffith.

The lawmakers expected to chair key House committees—such as Representative Darrell Issa (R–CA), in line to run the Oversight and Government Reform panel—have made it clear that they intend to investigate not only EPA Administrator Lisa Jackson but also the “Climategate” scandal of last year. “If the raw data's in doubt, then the idea that we have settled science doesn't exist. I want settled science,” Issa told ClimateWire.

Representative Bob Inglis (R–SC), who lost earlier this year in a Republican primary, was one of two lawmakers who attributed his defeat to his vote in favor of cap and trade. (Boucher was the other one.) “It is a challenge,” says the six-term legislator. “We've got to figure out a way to get the triple play of this American century, which is improving the national security of the United States, creating jobs, and cleaning up the air.”

3. # Ralph Hall Bids to Lead Science Panel

1. Jeffrey Mervis

In an election that saw Republicans gain control of the House of Representatives, it's perhaps fitting that the presumptive new chair of the House science committee, Representative Ralph Hall (R–TX), is a former Democrat. But the committee's traditional bipartisan culture could take a hit if Hall's first public comment on how he might run the panel is any guide.

Hall, who represents a rural district east of Dallas, came to Washington as a conservative Democrat in 1981 and switched parties in 2004, 3 years before his new party lost its House majority. A lawyer and former businessman, Hall easily won reelection last week to a 16th 2-year term. At 87, he's the oldest member of Congress but has never chaired a committee.

It may be weeks before House Republicans choose committee chairs for the 112th Congress. And Representative James Sensenbrenner (R–WI), who is in line to head the special committee on climate change that the Democrats created after taking over in 2007, could use his seniority to stake a claim on the science post if Republican leaders eliminate the climate panel.

Although Hall voted with a majority of his Republican colleagues 94% of the time in the current Congress, he'll need to convince the new leadership that he's an able and willing soldier. Perhaps with that in mind, Hall issued a statement the day after the election pledging “strong oversight over the [Obama] administration in key areas including climate change, scientific integrity, energy research and development, cybersecurity, and science education.” He also mentioned his concerns about “the unprecedented growth in the federal government” in recent years and his conviction that “federal investment in R&D must empower the free market, not interfere with it.”

Then he went a step further. He promised that the committee “will be a place … where all Republicans can play a role in crafting good science policy.” An aide says that “Mr. Hall was not trying to exclude Democrats. … He certainly did not intend to infer that Democrats are not welcome to also play a role in crafting science policy.”

In an era of increasing partisanship, however, Democrats will be looking closely to see which interpretation holds sway if and when Hall takes the gavel in January.

4. U.S. Elections

# Retiring Legislators Warn of Pitfalls Facing Science in New Congress

### Audio: Post-Election Roundtable

Two days after last week's midterm election, Science talked with four veteran legislators who have been staunch defenders of science and science education. Three are retiring next month: representatives Brian Baird (D–WA) and Bob Inglis (R–SC) of the House science committee and appropriations subcommittee chair Alan Mollohan (D–WV). A fourth, Representative Sherwood Boehlert (R–NY), a former chair of the House science committee, left in 2007. Here are excerpts from that discussion, available online, with Deputy News Editor Jeffrey Mervis.

## On the prospects for a spending freeze

Sherwood Boehlert: One of the most important things for scientists to do is to change the vocabulary. No longer should we be talking about investing in science or increasing R&D funding or STEM education because it's important for science. We should make this a national security issue. When a lot of the conversation is about the next Congress cutting or freezing all non–national security spending, we ought to take [science] funding and put it under the national security umbrella. Because it is a question of national security: lessening dependence on foreign oil, competitiveness, providing opportunities for our young people, creating jobs. …

Alan Mollohan: There has to be a national consensus that increased funding for science is akin to supporting economic growth and development generally. … We ought to be working against having a continuing resolution this year, because a CR would cost science funding in our bill [Commerce, Justice, Science] about $500 million. That would mean cuts, because we have increases in this bill over last year that would not be implemented.

## On so-called negative earmarks

Brian Baird: There's a lot of talk about how [the next Congress] is going to eliminate earmarks. But my hunch is that the practice of negative earmarking [eliminating specific projects or programs] will continue and expand and that science, particularly the social sciences, may suffer from this. People will grandstand and take positions about how they're saving the public money … by taking money from an NSF- or NIH-approved grant and applying it to some “mom and apple pie” issue. And science needs to be mindful and watchful of some members, especially the newer ones, who may try to earn some cheap political points by substituting themselves for the peer-review process.

## On explaining the value of research

Bob Inglis: It's very important for scientists to explain the implications of their research, to make it more real to the people who are paying the bills. … By selling the sizzle, scientists can help rescue us from a retreat from science.

## On the need for compromise

S.B.: When I announced my decision not to run in 2006, The New York Times referred to me as an endangered species because of my moderation. Now it appears that I'm almost extinct. The [Democratic] Blue Dogs have suffered, too. There has to be a role for the center in American politics. … Let's hope that after all is said and done, people will come to their senses. I'm not overly optimistic, but I'm keeping my fingers crossed.

## On the 17 November climate hearing

On 17 November, the science committee will hold what may be the last Democratic-led hearing on climate change, billed as a “rational discussion” of the issue. Two members explained why.

B.B.: We're going to start with the basics: why the ocean is acidifying, why we think the climate will change. I don't think we've made the case for the basic physics and chemistry of these two phenomena and presented the evidence that it is happening. …

B.I.: I would encourage the scientists to come to these hearings with great glee, to show the data that they have discovered, and to present it in a winning way. … If they come with an attitude of defensiveness, then it will support those who say, “This is a bunch of hooey, and we don't need to spend any money on it.” They should welcome the inquiry and say, “We're happy you asked. Now we're going to tell you what the data are.”

## On the role of U.S. businesses

B.B.: It's true that America COMPETES has been endorsed by the Chamber of Commerce and the National Association of Manufacturers, et cetera. But if you look at where their campaign contributions went, they quite literally do not put their money where their mouth is. They're happy to endorse legislation, but they don't fight vigorously to make sure [pro-science] candidates get in, nor do they insist that if a candidate is antithetical to investments in research and education, that candidate is antithetical to the interests of business and to economic and national security.

## On the learning curve for new members

A.M.: The reality is that most new members will be preoccupied with getting reelected and won't have the kind of responsibilities that would prompt them to get deeply involved [in the substance of a particular committee] until their second or third or fourth term. If they're on a nonscience committee, they won't become versed in the pros and cons of any particular [science] issue for a long time.

5. ScienceInsider

# From the Science Policy Blog

Physicist and Representative Bill Foster (D–IL) lost his seat in last week's election as the Republican wave hit his conservative-leaning district, despite considerable financial support from colleagues at his district's Fermi National Accelerator Laboratory, where he worked for 22 years. “We [always knew] it would be a tough seat for a Democrat to hold,” said Fermi physicist James Volk. “We're not going to elect him single-handedly out of Fermi.” Meanwhile, voter initiatives on defining “personhood” at birth and establishing the right to hunt failed at the voting booth.

An arbitrator has ruled that Florida State University's decision to lay off 21 tenured faculty members last year was the result of an “arbitrary, capricious and unreasonable” process. The university promptly reinstated all 21 tenured faculty members axed in the cost-cutting move.

Officials with the ITER fusion reactor in Cadarache, France, are debating several controversial ways to save money. Among the proposed steps is to eliminate magnetic coils for controlling disturbances in the gas at the heart of the reactor or to skip tests on the reactor's giant containment magnets.

A federal judge for the District of Columbia has rejected a legal argument used by the Bush Administration in 2008 to argue that polar bears are threatened, not endangered. That moves the animals one step closer to being listed as endangered under U.S. law.

Popular genetic testing services may sometimes be wrong about a customer's cancer risk, according to a new study. When test results for a small group of patients were compared with their risk calculated from family history, the results differed about half the time.

For more science policy news, visit http://news.sciencemag.org/scienceinsider.

6. Paleoanthropology

# Neandertal Brain Growth Shows A Head Start for Moderns

1. Ann Gibbons

With brains as big as ours, Neandertals were no dumb brutes.
But they may not have used their brains the same way we do. In the crucial first year of life, their brains developed dramatically differently from the way ours do, according to a report in this week's issue of Current Biology. “Although they have the same brain size as us, Neandertals missed something that [modern] humans got in the first year of life,” says study co-author Jean-Jacques Hublin of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Like chimpanzees, Neandertals apparently skip a phase during which infant Homo sapiens expand certain regions of the brain, when the dome of the H. sapiens skull rounds into its distinctive globular shape, says Hublin. This bulging in the parietal and temporal regions (at the top of the brain and above the ear) and in the cerebellum (at the base of the brain) suggests that humans are building brains that are fundamentally different from those of Neandertals, says Hublin. Paleoanthropologist Steven Leigh of the University of Illinois, Urbana-Champaign, who was not involved in the work, says, “Quite importantly, it moves us beyond examining simply the size of brains in these taxa and begins providing meaningful information on shape change.”

Researchers have long recognized that adult Neandertal brains were more elongated and less globular than those of modern humans. But when do such changes appear in the life span? To find out, Hublin, Philipp Gunz, and Simon Neubauer of Max Planck first compared CT scans of the brains of 58 humans and 60 chimps, varying in age from birth to adulthood, including seven newborns of each species. Using three-dimensional imaging and several hundred landmarks on the braincases to match them accurately despite differences in size, the team showed that humans—but not chimps—preferentially expand their parietal lobes and cerebellums and widen their temporal lobes in the first year of life, according to a report in November's Journal of Human Evolution. This results in the characteristic rounded dome of our skulls. Computer modeling showed how those early changes set human brains on a developmental trajectory that is different from that of chimps.

Then the team used the same imaging methods to study nine fossil Neandertals, including a newborn, a year-old baby, and three children. Because the brain does not fossilize, they studied endocasts, imprints of the brain left in the skull. They found that at birth, both Neandertal and modern human infants had elongated braincases that were similar in shape, although Neandertal faces were already larger. By age 1 or so, modern humans had grown globular brains, but Neandertal babies did not show the preferential bulging in the parietal and cerebellar regions, even though the brain grew overall. “They finally demonstrate that this stage [bulging] is absent in Neandertals,” says paleoneurologist Emiliano Bruner of the National Research Centre on Human Evolution in Burgos, Spain.

In living people, those areas of the brain are linked with key functions such as the ability to integrate sensory information and to form abstract representations of the environment. The differences suggest that Neandertals didn't behave just like we do, says Hublin. However, because there are so few fossils of Neandertal infants, the new analysis depends on endocasts of only one incomplete skull of a Neandertal newborn and of an older baby, points out anthropologist Marcia Ponce de León of the University of Zurich in Switzerland. “They do reveal a pattern,” says Leigh, but “this pattern needs statistical support before we can be confident about its validity. That may not be possible with such small samples.”

Hublin and Gunz acknowledge the sample-size problem but say that they also used computer growth simulations that did not rely on juvenile fossils. When they modeled the development of modern human babies without a globular growth stage, the babies' brains grew to look like those of Neandertal adults. The team recently confirmed its findings in a second, more complete Neandertal newborn. Now, they are investigating how brain development may be linked to genes that differ in Neandertals and modern humans, working with the Max Planck researchers who sequenced the Neandertal genome. “Very small differences in this crucial time in development can have serious consequences,” says Gunz.

7. Cancer Screening

# The Promise and Pitfalls of a Cancer Breakthrough

1. Eliot Marshall

Cancer research got some good news last week: A landmark clinical trial reported that screening for small tumors with advanced x-ray imaging led to a significant drop in lung cancer deaths (20% fewer) among smokers and ex-smokers, compared with screening with standard chest x-rays. Such positive results are unheard of, particularly for lung cancer, which kills 157,000 people in the United States each year. Until now, no randomized study has shown that screening for lung cancer can save lives.

But the news was double-edged. Experts quickly warned that the U.S. health care system isn't ready for what may follow. The new evidence, they said, is likely to bring a surge in demand for lung screening, which most insurers and Medicare do not pay for. In addition, perhaps a quarter of the people who decide to get screened will be burdened with false alarms and expensive follow-up tests. Even the National Cancer Institute (NCI), which sponsored this $250 million study of a 3D imaging method known as low-dose helical computed tomography (CT), was guarded in its comments. NCI Director Harold Varmus said in a teleconference on 4 November that he saw “a potential for saving many lives,” presumably through early detection and treatment.
The study, called the National Lung Screening Trial (NLST), enrolled 53,454 smokers and ex-smokers between ages 55 and 74; 354 died of lung cancer in the CT screening arm compared with 442 in the chest x-ray group. But Varmus warned that the results apply only to heavy smokers and that the data must still be scrubbed, a process that will take months, before recommendations can be made.
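As a rough sanity check (ours, not the article's), the quoted 20% figure is simply the relative reduction in lung-cancer deaths between the two arms, which can be recomputed from the reported counts under the assumption that the randomized arms were of similar size:

```python
# Lung-cancer deaths reported in the National Lung Screening Trial (NLST),
# as quoted above. Arm sizes are assumed roughly equal, since the trial
# randomized its 53,454 participants between the two groups.
ct_deaths = 354    # low-dose helical CT arm
xray_deaths = 442  # standard chest x-ray arm

# Relative reduction in lung-cancer mortality
relative_reduction = 1 - ct_deaths / xray_deaths
print(f"Relative reduction: {relative_reduction:.0%}")  # → Relative reduction: 20%
```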

NCI Deputy Director Douglas Lowy, also speaking at the teleconference, ticked off some “disadvantages” of CT screening. One is cost. The price of a scan, estimated at about $300 to $500 per screening, is the least of it. Big expenses ensue, Lowy said, from the high ratio of people who get positive test results but do not have lung cancer. Even if you focus strictly on those with the highest risk—this trial screened smokers and ex-smokers who had used a pack of cigarettes a day for 30 years—“20% to 50%” of the CT scans “will show abnormalities” according to recent studies, said Lowy. According to NCI, about 96% to 98% are false positives.

In NLST, about 25% of those screened with CT got a positive result requiring follow-up. Some researchers have seen higher rates. Radiologist Stephen Swensen of the Mayo Clinic in Rochester, Minnesota, says that a nonrandomized study he led in 2005 gave positive results for 69% of the screens. One difference between the Mayo and NLST studies, Swensen says, is that Mayo tracked nodules as small as 1 to 3 millimeters whereas NLST, which began in 2002, cut off positive findings below 4 mm.

One negative consequence of CT screening, Lowy said at the teleconference, is that it triggers follow-up scans, each of which increases radiation exposure. Even low-dose CT scans deliver a “significantly greater” exposure than conventional chest x-rays, said Lowy, noting that, “It remains to be determined how, or if, the radiation doses from screening … may have increased the risks for cancer during the remaining lifetime” of those screened. Clinical follow-up may also include biopsy and surgery, Lowy said, “potentially risky procedures that can cause a host of complications.”

G. Scott Gazelle, a radiologist and director of the Institute for Technology Assessment at Massachusetts General Hospital in Boston, has been analyzing the likely impacts of lung cancer screening for a decade. He agrees that people are going to demand it—and that “there are going to be a huge number of false positives.” He was not surprised at NLST's finding of a lifesaving benefit of 20%. His group's prediction of mortality reduction through CT scans, based on “micromodeling” of actual cancers and data from previous studies, was 18% to 25%, right on target. But Gazelle says this analysis, now under review, still suggests that a national program of CT screening for lung cancer “would not be cost effective.” Indeed, the costs seem likely to be three to four times those of breast cancer screening, with similar benefits.

Advocates of screening, in contrast, see the NLST results as vindicating a campaign to put advanced computer technology to work on lung cancer. The detailed images of early tumors in CT scans are “exquisite,” says James Mulshine, vice president for research at Rush University Medical Center in Chicago, Illinois, and an adviser to the pro-screening advocacy group, the Lung Cancer Alliance in Washington, D.C. He thinks it should be straightforward to reduce the number of biopsies and surgeries resulting from false positives by monitoring small tumors for a time before intervening. There are 45 million smokers in the United States who might benefit from CT screening, says Mulshine. He asks: Do we provide it, or “Do we tell them, ‘Tough luck’?”

8. ScienceNOW.org

# From Science's Online Daily News Site

## Cats' Tongues Employ Tricky Physics

Cats like to do things their own way—even, it seems, when it comes to drinking.

Dogs and other animals lap up water by curling their tongues into a cuplike shape, but high-speed cameras reveal that cats rest the tips of their tongues on the liquid's surface without penetrating it. The water adheres to the cat's tongue and is pulled upward in a column as the cat draws its tongue into its mouth—a complex maneuver that pits gravity against inertia. The technique maximizes a feline's gulp while keeping its whiskers clean, researchers report online this week in Science.

## New Malaria Drug Could Save Tens of Thousands

It's not often that spontaneous applause erupts halfway through a scientific talk. But when malaria researcher Arjen Dondorp came to the crucial slide of his presentation last week at the American Society of Tropical Medicine and Hygiene's annual meeting, the audience couldn't contain itself. Dondorp's study showed that compared with an older drug called quinine, a new one called artesunate reduces the risk of death from severe malaria in African children by 23%—a finding that could save tens or even hundreds of thousands of lives annually.

## Comet Revealed as a Giant Dog Bone

Last Thursday, the EPOXI spacecraft screamed by the icy nucleus of comet Hartley 2—the fifth comet ever imaged close up. Photos beamed back reveal that the comet looks like a dog bone. Presumably, comet Hartley 2 was once two separate bodies that came together. Or it could have always been a single body now stretched out by its own spinning. Today, the sun's warmth is driving off primordial gases and dust, especially in bright jets seen streaming off the far end. Surprisingly, frozen carbon dioxide—dry ice—rather than water ice seems to be contributing most of the gas that's driving the dust.

9. Remote Sensing

# Earth-Observation Summit Endorses Global Data Sharing

1. Richard Stone

BEIJING—Last August, heavy monsoon rains submerged nearly one-fifth of Pakistan, inflicting $43 billion worth of damage. The floodwaters destroyed homes and businesses, washed away bridges and roads, ruined crops, and claimed about 1800 lives. As bad as it was, the toll could have grown in the weeks that followed if not for a novel Earth-observation system featured at a meeting here last week.

In July, before the deluge, the International Centre for Integrated Mountain Development in Kathmandu—along with NASA and the U.S. Agency for International Development—had booted up SERVIR-Himalaya, a Web-based monitoring system that pulls together satellite imagery, forecast models, and ground observations. It “showed the progression of the floods in [near] real time,” says Sherburne Abbott, associate director for environment at the White House Office of Science and Technology Policy. As the disaster unfolded, analyses revealed that flooding had knocked nearly 200 tuberculosis clinics out of commission. Forewarned, aid agencies scrambled to steer patients to functioning health centers. “They knew they were going to have a real problem,” Abbott says.

SERVIR is one new instrument in a veritable orchestra of Earth-observation systems intended to make reams of data available and relevant to decision-makers. At the summit last week of the Group on Earth Observations (GEO)—the organization attempting to get this ensemble performing in synchrony—initiatives were unveiled to monitor land-cover changes and forest carbon stocks. And GEO delegates embraced plans to funnel data from platforms tracking everything from biodiversity to earthquake risks into a free and open database. “What's happening is groundbreaking,” says David Hayes, deputy secretary of the U.S. Department of the Interior. “This data is incredibly valuable.
If you share it, your incremental contribution can yield a super benefit.”

Established in 2005, GEO is an effort by 85 countries, the European Commission, and 58 international organizations to meld disparate remote-sensing tools and ground-based databases—300 databases and counting—into a seamless Global Earth Observation System of Systems (GEOSS), which is expected to come fully online in 2015. When GEO was conceived, “we understood that if you want to manage planetary problems, you have to have planetary information—which didn't exist at that stage,” says Bob Scholes, a biodiversity expert at the Council for Scientific and Industrial Research in Pretoria.

GEO's progress has been remarkably swift, Scholes adds, and the project has overcome the view that data should be hoarded, not shared. “When an earlier generation of scientists collected data on the public purse, they considered it their data. The norm now is that data will quickly enter the public domain,” he says. To reinforce such good behavior, “persistent identifier” tags are being developed that will note which scientists or teams contributed data to GEOSS.

The U.S. Office of Management and Budget (OMB) is spurring agencies to release data via www.data.gov. “OMB is looking to measure our department's productivity in part by how much we're adding to the public's access to data,” says Hayes. NASA and the U.S. Geological Survey 2 years ago began allowing free access to their 4-decade Landsat archive, including images with a resolution of 30 meters that enable tracking of land-cover changes wrought by human activity. And riding new open-data legislation in the European Union, the European Space Agency plans to allow free access to data streams from its soon-to-be-launched Sentinel satellites, says Manuela Soares, director for environmental research at the European Commission's Research Directorate.
“There's been delivery of data on a massive scale,” says Gary Richards of Australia's Department of Climate Change and Energy Efficiency in Canberra. Ground-truthing such data is a key element of SilvaCarbon, a U.S.-led scientific network announced here to help GEO improve access to Earth-observation data on forests. SilvaCarbon is expected to develop technologies to implement one of the few bright spots in international climate negotiations: REDD+, a program to reduce emissions from deforestation and enhance forest carbon stocks. Together with GEO's Global Forest Observation Initiative, SilvaCarbon “shows that we are ready to take the next big step to a robust and transparent global monitoring system for forest carbon,” says Richards. A second new effort, the Global Land-Cover Data Initiative, aims over the next 2 years to compile and publicly share a current snapshot of Earth's land-cover conditions. Landsat data provide 80% coverage; GEO partners will fork over the rest.

As GEOSS is woven from disparate data sets, there have been a few glitches in integrating the information. “We can't get all data into the free and open database at this point,” says Abbott. And some resistance remains. “We still get pushback,” says Scholes. “Some countries worry about how data release will affect national security.” Nations fret, for instance, that satellite data they have no control over could reveal information such as flow rates of transboundary rivers. Of course, all agree that some sensitive data, such as the precise location of the last few individuals of an endangered species, should not enter the public domain. “But these instances are now perceived as the exceptions to the rule,” Scholes says. And that, he says, testifies to the profound cultural change on data sharing that GEO is helping drive.

10. Human Genetics

# Affordable 'Exomes' Fill Gaps in a Catalog of Rare Diseases

1. Jocelyn Kaiser

The couple had four healthy babies, but tragedy struck as the children became teenagers: Three began to lose their sight. Now in their 30s, two of the sisters and their only brother have lost most peripheral vision; one is legally blind from retinitis pigmentosa, the disease all three inherited. Although doctors tested them for more than 50 genetic mutations linked to the disease, they came up empty.

Then late last year, geneticists at the University of Miami in Florida tried a new test, one that searched for mutations by sequencing the siblings' entire protein-coding DNA. By July, the family had an answer: The three affected siblings had inherited two copies of a defect in a gene called DHDDS. It codes for an enzyme that may be involved in tacking sugar groups to the light-sensing protein rhodopsin; when researchers blocked DHDDS in zebrafish, the fish became partially blind. The family members, who for years worked to raise money to study inherited blindness, “were overwhelmed when they learned that we found the gene,” says geneticist Stephen Züchner of the University of Miami. He presented the study this week at the American Society for Human Genetics (ASHG) meeting in Washington, D.C.

The study adds to a flurry of reports this year on new genes for Mendelian disorders: rare diseases caused by a defect in a single gene. The work takes advantage of cheap, next-generation DNA-sequencing technologies that make it possible, for a few thousand dollars, to sequence the 1% of the genome that codes for proteins, known as the exome. This so-called exome sequencing has “reinvigorated the field of Mendelian disorders,” says Jay Shendure of the University of Washington, Seattle, who gave an overview at the ASHG meeting. Researchers studying rare diseases are experiencing a “euphoria,” says Eric Green, director of the National Human Genome Research Institute (NHGRI) in Bethesda, Maryland.
For decades, geneticists have had only one way to find the genes underlying Mendelian disorders: studying families. By analyzing inheritance patterns of genetic markers, they could pinpoint the disease gene. But this so-called linkage approach doesn't work when few affected family members can be found or the disease is extremely rare, often because the mutation was not inherited but occurred spontaneously. As a result, of the nearly 7000 known or suspected Mendelian disorders identified based on clinical features, fewer than half have been linked to a gene (see table).

That is changing thanks to exome sequencing. Researchers use an off-the-shelf kit to separate out the protein-coding DNA in a sample, then feed the DNA into the new sequencing machines. This yields a list of perhaps 20,000 mutations. The researchers then filter out changes that don't alter amino acids in proteins, along with mutations commonly found in other people.

In August 2009, Shendure and colleagues reported as a proof of principle that by sequencing the exomes of four unrelated patients with the same disease, Freeman-Sheldon syndrome, they found the known underlying gene. A string of reports of new Mendelian disease mutations has followed, some in known genes but also 10 or so novel disease genes, says Shendure. These include, for example, the first gene for Kabuki syndrome, an extremely rare disorder involving deformities and retardation. (In a couple of cases, the researchers used whole-genome sequencing, which costs eight to 10 times as much as exome sequencing, but the genes could have been found with just the exome, Shendure notes.) “It's solving things that we've been looking at for a long time,” says molecular geneticist Joris Veltman of Radboud University Nijmegen Medical Centre in the Netherlands, whose group has found several novel genes—and “you don't have to be a big genome institute.”

Some mutations point toward treatments.
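The filtering step described above can be sketched in a few lines of code. This is a minimal illustration only: the record fields, variant IDs, and the 1% population-frequency cutoff are assumptions for the example, not the actual software or thresholds used by any of the groups mentioned.

```python
# Sketch of exome variant filtering: keep variants that alter a protein
# and are rare in the general population. Field names, IDs, and the
# frequency cutoff are illustrative assumptions.

def filter_candidates(variants, population_freq, max_freq=0.01):
    """Return variants that change an amino acid and are rare."""
    candidates = []
    for v in variants:
        # Discard synonymous changes: they don't alter amino acids.
        if v["effect"] == "synonymous":
            continue
        # Discard variants commonly found in other people.
        if population_freq.get(v["id"], 0.0) > max_freq:
            continue
        candidates.append(v)
    return candidates

# A raw exome yields thousands of variants; three stand in for them here.
variants = [
    {"id": "var1", "gene": "DHDDS", "effect": "missense"},
    {"id": "var2", "gene": "GENE_A", "effect": "synonymous"},
    {"id": "var3", "gene": "GENE_B", "effect": "missense"},
]
population_freq = {"var1": 0.0001, "var2": 0.20, "var3": 0.15}

print([v["gene"] for v in filter_candidates(variants, population_freq)])
# prints ['DHDDS']
```

In practice the same winnowing is done across all 20,000 or so variants per exome, and shared hits among unrelated patients (or affected siblings) point to the disease gene.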
Yale University geneticist Richard Lifton's team last year reported that a Turkish infant with failure to thrive who had been diagnosed with an inherited kidney disorder instead had a known defect in a gene coding for an intestinal chloride transporter, leading to chronic diarrhea; the boy was put on oral rehydration therapy. Züchner's group is exploring whether drops of the chemical made by the DHDDS enzyme can restore some sight in the blind siblings. Other discoveries reveal new insights about human biology. For example, as reported online in Nature in August, Lifton's group found a gene called WDR62, which when mutated severely alters how the brain develops.

Exome sequencing isn't perfect, however. The kits used in most published work capture only about 75% of all 20,000 genes, although newer versions are more comprehensive. The approach also misses structural changes and noncoding DNA that can influence gene regulation. The exome could soon be eclipsed by whole-genome sequencing, which is getting ever cheaper, geneticists say. It's an open question whether exome sequencing can ferret out rare mutations that underlie complex, common diseases. Answers may come soon: Tens of thousands of exomes from people with common diseases are now pouring out of sequencing machines, Shendure notes.

But the technique's success for Mendelian disorders is undisputed. One sign is that NHGRI plans to fund a center next year that would find the genes underlying 40 to 50 Mendelian diseases a year and coordinate samples for studying others. Medical geneticist David Rosenblatt of McGill University in Montreal, Canada, spoke out passionately at the ASHG meeting in support of such a systematic effort to find the genes behind Mendelian diseases. “We should put the pins in the map … and in the next 2 years find them all,” Rosenblatt said.

11.

# Scientific Gold Mine or Dicey Money Pit?

1. Adrian Cho

Scientists say a proposed underground laboratory in South Dakota could be a world beater. But they must persuade the National Science Foundation to pay for the massive project.

Miners once hauled gold out of the Homestake mine in the Black Hills of South Dakota. Now particle physicists hope to find scientific treasures there. They want to convert the mine into the Deep Underground Science and Engineering Laboratory (DUSEL), the largest underground lab in the world. In it they would seek the elusive “dark matter” whose gravity binds the galaxies, a type of radioactivity that would blur the line between matter and antimatter, and protons falling apart as predicted by some particle theories.

Advocates say the $875 million project is too good an opportunity to pass up. “We're investing in a suite of experiments, and three, four, or five of them could be discovery experiments that will change the textbooks,” says Kevin Lesko, a physicist at Lawrence Berkeley National Laboratory in California, who leads the DUSEL design team.

But DUSEL is not a typical project for the U.S. National Science Foundation (NSF), which historically builds scientific instruments such as telescopes. Instead, DUSEL is mostly an infrastructure project to provide lab space for a host of experiments in a variety of disciplines. Moreover, the biggest experiment in it would be a gargantuan particle detector funded primarily by the Department of Energy (DOE), not NSF. Operations such as pumping water out of the mine also cost $1 million a month.

The project must win approval from the National Science Board (NSB), which sets policy for NSF, and observers say that board members will want good answers to three important questions before they sign off on the project. How would DUSEL stack up against other underground labs around the world? How will NSF and DOE coordinate efforts to ensure the project stays on track? And will DUSEL yield enough science to justify the investment?
“The NSF science projects have to be enough in the forefront to make it a good sell,” says Barry Barish, a physicist at the California Institute of Technology in Pasadena and a consultant to the board. “The more it looks like it's just a big hole in the ground for a DOE experiment, the less sellable it is.”

The project is entering a crucial period. The preliminary design should be completed by year's end. Meanwhile, the National Academies' National Research Council has begun a study of the three main questions and should provide input to NSF by the spring. Based on the preliminary design, NSB could vote on the project as early as August. “The question is, how much of this list of issues will be settled by next summer?” Barish says.

## The lowdown on labs low down

When it comes to underground labs, “the deeper the better, the bigger the better, there is no doubt,” says Franz von Feilitzsch, a physicist at the Technical University of Munich in Germany. Deep underground, scientists can escape the hail of ordinary particles from space called cosmic rays to search for more exotic things. Big labs provide room for the huge detectors needed to spot very rare phenomena.

At a lab like DUSEL, four physics studies would anchor the research program. One would search for the dark matter that cosmologists say makes up 85% of all matter. Physicists are already trying to glimpse dark-matter particles smacking into nuclei in detectors weighing tens of kilograms in various underground labs around the world. They're spurred by a concept called “supersymmetry,” which predicts that every known particle has a massive “superpartner.” The lightest superpartner would be a prime candidate for dark matter. Fully testing the idea may require a detector weighing a metric ton.

Physicists would also search for a type of radioactivity called “neutrinoless double beta decay” that would blur the distinction between matter and antimatter.
Many a nucleus can change identity through beta decay, when a neutron in it turns into a proton and spits out an electron and an elusive antineutrino. Others change by absorbing a neutrino instead of emitting an antineutrino. In neutrinoless double beta decay, two neutrons in a nucleus such as germanium-76 or tellurium-130 would change into protons while ejecting only two electrons. That can happen only if the neutrino is its own antiparticle—so that the antineutrino emitted by one neutron can be absorbed by the other neutron as if it were a neutrino.

The biggest experiment in DUSEL would be a detector weighing between 100,000 and 200,000 tons that would snare neutrinos fired through Earth from a distant particle accelerator. Using such setups, physicists in the United States, Europe, and Japan are already studying how the three types of neutrinos—electron, muon, and tau—change into one another as they whiz along. The Long-Baseline Neutrino Experiment (LBNE) would go further and look for an asymmetry between the behavior of neutrinos and antineutrinos, called charge-parity (CP) violation, that might explain why the universe contains so much more matter than antimatter. LBNE would cost between $660 million and $940 million.

In addition to detecting neutrinos from the sun or from supernovae, such a huge detector could also search for proton decay, the fourth key experiment. Theorists have developed “grand unified theories” that roll together the three forces that dominate particle interactions: electromagnetism, the strong force that binds the nucleus, and the weak force that causes beta decay. Those theories predict that the otherwise eternal proton should decay, albeit on a time scale so immensely long that physicists would have to study hundreds of thousands of tons of matter in a detector to spot a few decays.

DUSEL would also house experiments in microbiology, geoscience, and engineering.
For example, Derek Elsworth, a geophysicist at Pennsylvania State University, University Park, has proposed heating a 10-meter-wide cubic volume of rock to see how heat affects stresses, chemistry, and fluid flow in it. The data could aid in, among other things, designing geothermal reservoirs. “You have these processes going on 2 or 3 kilometers down in geothermal structures, but you can never see them,” Elsworth says. With DUSEL, “you're right there.”

## Leapfrogging the competitors

All of this would occur at Homestake, the deepest mine in North America. Starting in 2014, workers would carve out two labs with a total volume of 72,000 cubic meters at a depth of 1480 meters, and a third smaller lab 2255 meters down. On the upper level they would dig at least one 50-meter-tall, 260,000-cubic-meter cavity for the LBNE detector to field neutrinos beamed from Fermi National Accelerator Laboratory (Fermilab) 1300 kilometers away in Batavia, Illinois. Construction would take about 5 years, but some experiments could move in before work is completed.

A dozen mines and tunnels around the world already house underground labs (see map). None has enjoyed more success than the Kamioka Observatory in central Japan. In 1998, physicists there used the 22,500-ton Super-Kamiokande detector to prove that muon neutrinos generated in the air change type in flight. (Fewer survived the long trip through Earth than the short trip from above, proving that some changed en route.) This year they began shooting neutrinos at the detector from a lab 295 kilometers away in Tokai to measure a key parameter describing such “neutrino oscillations,” a number called θ13.

Italy's Gran Sasso National Laboratory is currently the largest underground lab.
Lying next to a highway tunnel under the Apennine Mountains in central Italy and boasting a volume of 180,000 cubic meters, it houses, among others, two detectors to field neutrinos from the European particle physics laboratory, CERN, near Geneva, Switzerland. SNOLab in Sudbury, Canada, is the deepest big lab, with a depth of 2073 meters. Physicists there showed in 2001 that electron neutrinos from the sun also change type in transit.

But even though they have large labs, physicists in Europe and Asia say DUSEL would propel the United States to the lead in underground science. “Japan is doing really well, but I am worried that the U.S. will pull ahead of us,” says Masayuki Nakahata of the University of Tokyo.

In fact, overseas researchers are planning bigger labs that resemble DUSEL. But they have not yet found sites for them. In Europe, physicists hope to build a lab with a neutrino detector weighing as much as 440,000 tons to receive a beam from CERN. They are studying seven possible sites, ranging from Canfranc, Spain, to Pyhäsalmi, Finland, says Andre Rubbia of the Swiss Federal Institute of Technology Zurich. In Japan, physicists want to build a huge lab either 20 kilometers south of Kamioka or on the island of Okinoshima, 658 kilometers west of Tokai. The site decision may have to wait 2 or 3 years for measurements of θ13, says Takashi Kobayashi of Japan's accelerator lab KEK in Tsukuba. That's because CP violation can exist among neutrinos only if θ13 is not zero.

Experiments in existing labs could discover dark matter or neutrinoless double beta decay before those planned for DUSEL. But that would only spur more experiments requiring more lab space, physicists say. Some say that relatively shallow Kamioka and Gran Sasso may soon be inadequate for newer experiments requiring even lower background radiation. Munich's von Feilitzsch works on a detector called Borexino in Gran Sasso that detects low-energy neutrinos from the sun.
“For this physics, Gran Sasso is not deep enough,” he says. “It's done its job.”

## Worries closer to home

Although DUSEL may have non-U.S. researchers looking over their shoulders, U.S. scientists are afraid that cost considerations may force designers to pare down their plans in ways that would dull the lab's competitive edge. In particular, they worry that the lab's lower 2250-meter level, which is currently flooded, will be smaller than they want and may not be available right away. A 2007 “conceptual design report” envisioned three halls at that lower level with a total volume of 4500 cubic meters; current plans call for a single hall of 1700 cubic meters.

Those changes could cause some scientists to make do with existing facilities. “It's possible that by the time DUSEL is built the people who need easy access will have gone to Gran Sasso and the people who need to go deep will have gone to SNOLab and there will be no customers,” says Giorgio Gratta, a physicist at Stanford University in Palo Alto, California, who is working on neutrinoless double beta decay. Others say that the changes simply reflect a determination to build only what's necessary. Design-team leader Lesko notes that when the conceptual design report was written in 2007, designers had received no specific proposals for experiments to go into DUSEL. Now, they have more than a dozen to which they're tailoring their plans.

At the same time, some scientists fret over the balance between spending on infrastructure and funding for experiments. When Homestake was selected in 2007 over seven other potential sites, NSF officials estimated it would cost $500 million, with half of the money going to key experiments other than LBNE. That pot of money would make DUSEL the obvious place to propose a dark-matter experiment, says Blas Cabrera, a physicist at Stanford.

However, the balance between infrastructure and science has shifted, so that grant money now accounts for $175 million of the $875 million total. In part, retrofitting the old mine has proved more expensive than originally estimated. Also, Lesko says, the original cost breakdown did not include the $123 million NSF would now contribute to LBNE, to which DOE gave the initial nod earlier this year. Nevertheless, NSF officials say that anything less than a broad science program and four key experiments would be hard to sell to the science board. “If it turned out that only one of these experiments could be done within the cost envelope, then obviously [getting approval] would be difficult,” says Edward Seidel, who leads NSF's mathematical and physical sciences directorate.

Observers say concern over the balance of science and infrastructure may have led to a glitch in the president's 2011 budget request for NSF that threatened DUSEL. NSF initially asked for $38 million for DUSEL design work, but the White House Office of Management and Budget (OMB) cut that amount in half. NSF and OMB officials now seem to have agreed to support the project until NSB can evaluate it, although parties to the negotiations declined to discuss the details.

The 2011 budget is still pending before Congress, but Lesko says the design team has submitted a proposal to NSF to fund the “scope of work” to be completed in 2011. NSB consultant Barish says he's “absolutely sure” the board will approve the additional money.

Some researchers also worry about the partnership between NSF and DOE. DOE's interest in DUSEL should bolster the project, especially as LBNE anchors DOE's plans for Fermilab's future. But the two agencies have a mixed record on joint projects. One glaring failure, scientists say, was a pair of NSF experiments known as Rare Symmetry Violating Processes (RSVP) that was approved in 2000 to run at DOE's Brookhaven National Laboratory in Upton, New York. Uncertainties over whether DOE would run an accelerator long enough to complete the experiments or make NSF pay for more time led NSF to cancel the project in 2005. Researchers say the agencies didn't discuss the problem until after RSVP was approved.

To avoid such snags with DUSEL, officials from DOE and NSF are hammering out an arrangement in which one agency would take the lead for an experiment but the other would also contribute. For example, DOE would lead on neutrinoless double beta decay, and NSF would spearhead dark-matter searches. The approach would give both agencies a stake in every experiment. “It's like any marriage,” says Fermilab's Robert Tschirhart. “If you have a common goal, you'll work it out.”

Of course, most marriages begin with a wedding. And DUSEL scientists are hoping that it won't be too long until the science board responds to their proposal with “I do.”

12. Archaeology

# Collapse? What Collapse? Societal Change Revisited

1. Andrew Lawler

Old notions about how societies fail are at odds with new data painting a more nuanced, complicated—and possibly hopeful—view of human adaptation to change.

CAMBRIDGE, UNITED KINGDOM—At midnight on 24 August, 410 C.E., slaves quietly opened Rome's Salaria gate. The waiting Visigoths poured through the narrow passage, trumpets blaring and torches held high. The first sack of Rome in 8 centuries has often been cited as the moment when one of the world's largest, wealthiest, and most sophisticated empires died a violent death. For researchers struggling to understand how societies collapse, Rome's fall has served as a model and a touchstone.

And yet an eclectic group of scholars who met recently at the University of Cambridge* argues that true social collapse is actually rare. They say that new data demonstrate that classic examples of massive collapse such as the disintegration of Egypt's Old Kingdom, the end of the Classic Maya period, and the vanishing of pre-Columbian societies of the U.S. Southwest were neither sudden nor disastrous for all segments of their populations. “Collapses are perhaps more apparent than real,” says Cambridge archaeologist Colin Renfrew.

Rome, for example, didn't fall in a day, as Edward Gibbon recognized in the 18th century in The History of the Decline and Fall of the Roman Empire. More recent work underscores the fact that the sack of Rome was just one step in a long and complex spiral of decline that affected peoples of the empire differently. For example, archaeologists in northeastern England have recently uncovered post-Roman villages that clung to the empire's ways while their neighbors swiftly abandoned the Latin language, Roman-style kitchenware, and construction practices. “There's a bewildering diversity that is only magnified as the system falls apart,” says Cambridge archaeologist Martin Millett.

This emphasis on decline and transformation rather than abrupt fall represents something of a backlash against a recent spate of claims that environmental disasters, both natural and humanmade, are the true culprits behind many ancient societal collapses. Yale University archaeologist Harvey Weiss, for example, fingered a regional drought as the reason behind the collapse of Mesopotamia's Akkadian empire in a 1993 Science paper. And in his 2005 book Collapse: How Societies Choose to Fail or Succeed, geographer Jared Diamond of the University of California, Los Angeles—who was invited to but did not attend the meeting—cites several examples of poor decision-making in fragile ecosystems that led to disaster.

Renfrew and others don't deny that disasters happen. But they say that a closer look demonstrates that complex societies are remarkably insulated from single-point failures, such as a devastating drought or disease, and show a marked resilience in coping with a host of challenges. Whereas previous studies of ancient societies typically relied on texts and ceramics, researchers can now draw on climate, linguistics, bone, and pollen data, along with better dating techniques. The result, they say, is a more nuanced understanding of the complicated and often slow-moving processes behind massive societal change. “The rarity of collapse due to the resistance of populations to environmental changes or disease is considerable,” says Cambridge historian John Hatcher, who studies the Black Death; that plague ravaged medieval Europe and Asia yet did not overturn the existing social order.

## The change remains the same

Societal collapse is a slippery concept that defies a strict definition. Renfrew contends that it involves the loss of central administration, disappearance of an elite, decline in settlements, and a loss of social and political complexity. Collapse implies an abrupt end rather than a long, slow devolution.

That description would seem to fit the demise of Egypt's Old Kingdom around 2200 B.C.E., after a 1000-year reign of pharaohs. As far back as the 1800s, researchers found texts and archaeological evidence pointing to a nightmarish era of civil war, drought, famine, and anarchy. This collapse, which brought down the all-powerful kings who built the pyramids, long appeared to be a relatively sudden event that ushered in a century dubbed the First Intermediate Period. Earlier work suggested that a massive drought, the same one that may have laid the Akkadian Empire low, struck at this time and dropped the level of the Nile.

But the latest climate data from the northern Ethiopian highlands—a key source of the Nile—do not support a severe drought, says Richard Bates of the University of St. Andrews in the United Kingdom. “Likely, climate change was not as much of an impact as perhaps first thought,” he says.

An increasing number of Egyptologists also now posit a more complicated and drawn-out decline—and one that ultimately had limited impact on the population. Miroslav Barta of Charles University in Prague notes that by the 25th century B.C.E., important changes in Egyptian society were already afoot. Smaller pyramids were built, nepotism within the royal families diminished, royal princesses married nonroyals, and the move from a centralized, pharaonic kingdom to a more regionalized structure was well under way. “The idea that this was sudden is nonsense,” he says.

The changes were accelerated rather than caused by the drought, says Barta, citing details such as species of beetles in a 2300 B.C.E. tomb that thrive only in desert conditions. “There's a sense that the climate change was more gradual,” agrees Mark Lehner, an Egyptologist based in Cambridge, Massachusetts, who works at Giza and sees drying as early as the 24th century B.C.E. “It is already happening earlier in the Old Kingdom.”

For Barta, the 22nd century B.C.E. shift away from a single leader lacked the disruptive effect imagined by 19th century C.E. archaeologists and their 21st century descendants who are focused on a short and brutal drought. “There was no collapse,” he insists. While the unified state disappeared and large monuments weren't built, copper continued to be imported from abroad and the concept of maat or kingship continued to be used at a more local level. “The peasants may never have noticed the change,” he adds.

Not all scholars agree. Some hold fast to the idea of a more rapid climate-based change. “It's hard to eat when there is no food,” says Weiss. But John Baines of the University of Oxford says Barta's view of a more gradual transition is “more or less a consensus these days.” He adds that the changes that took place before, during, and after the Old Kingdom's demise “were about redistribution of power and wealth more than about collapse.”

## Kingdoms in the forest

Like the end of Egypt's Old Kingdom, the close of the Classic Maya period around 900 C.E. has long been a poster child of collapse. Huge cities in the northern highlands were abandoned, monumental architecture ceased, and royal inscriptions halted. Foreign invasion, epidemics, social revolt, and the collapse of trade have been identified as key factors. Richardson Gill, a Texas businessman and archaeologist, argued a decade ago that the worst drought in 7000 years afflicted the Yucatán Peninsula between 800 C.E. and 1000 C.E., an idea with widespread support. Some researchers now favor a more nuanced version, in which environmental, political, and social changes combined to ravage the society.

But Elizabeth Graham, an archaeologist at University College London who works in the lowlands of Belize, says “there's not a blip” in the occupation of the Maya areas she has dug along the coast, which lie about 300 kilometers from major inland centers to the north. Graham is convinced that additional settled areas existed during and after the end of the Classic period but that archaeologists haven't found them yet, due to the uncertainties of preservation in the humid tropics and the difficulties of spotting low mounds in tropical forests.

Coastal sites like Lamanai and Tipu were admittedly smaller than the great inland cities, but Graham says there is no sign of crisis there at the end of the Classic period. Skeletons show no increase in dietary stress, populations seem constant, terraces and check dams are maintained, and sophisticated pottery continues to be crafted. The drying of the climate doesn't appear to have triggered any societal rupture there. Such new conclusions are “staggeringly important,” says Norman Yoffee, an archaeologist at the University of Michigan, Ann Arbor, and co-editor of a 2010 book called Questioning Collapse that challenged many of Diamond's ideas.

Not everyone accepts that most of the Maya thrived at the end of the Classic period, however. “It depends where you are standing,” says Stephen Houston, an archaeologist at Brown University. He digs inland, in areas like northern Guatemala, where he says “there absolutely is a collapse.” But he adds that “the image of people dying in the streets is a caricature of what was taking place; these cities just become not very attractive places to live,” in large part because of the loss of an elite. “People voted with their feet.” Houston suggests that the drought took place over a long period of time and that different parts of the Yucatán were affected differently. Even Diamond acknowledges in his writings that the Maya collapse was most intense in the highlands, with their more fragile soils. Houston says that archaeologists must look more closely at how regional events may affect local areas in radically different ways.

## Rising from the ashes?

Ancient ruins and canals prompted the founders of Arizona's future capital to name their city after the mythological sacred bird that is reborn in its own ashes every 500 years. Five centuries prior, the Hohokam had lived in the Phoenix basin, creating a complex society from 750 C.E. to 1450 C.E. with vast irrigation systems, ball courts, plazas, platform mounds, and polychrome pottery. Then the population vanished, the canals were forgotten, and even outlying areas were abandoned. The abandonment appears total.

Archaeologists have long blamed a sudden onslaught of flooding that destroyed the canals and suggested that field salinization and overpopulation contributed; some see European diseases, arriving after 1500 C.E., as the ultimate culprit. But archaeologist Randall McGuire of Binghamton University in New York state argues that the data don't support any of these theories.

He says that the lack of remains after 1450 C.E. makes the disease idea untenable and that there is no evidence for the destruction of the life-giving canals. Drawing on data from the Center for Desert Archaeology in Tucson, Arizona, he instead links the Hohokam's disappearance with broader changes across the Southwest between 1250 C.E. and 1450 C.E., when the population shrank by as much as 75%. “This is not a catastrophic event but a slow process over 150 years or more,” he says. “Were [the Hohokam] even aware that this was a ‘collapse’?”

The data show clusters of populations gradually vanishing or migrating during the 2-century period; one cluster in northern New Mexico, by contrast, gradually increases. McGuire suggests that Southwest pueblo structures based on rival clans that kept each other in check survived, while the Hohokam, who increasingly favored a more rigid hierarchical system, eventually failed. So although a drying climate no doubt played a role in the dissolution of societies and the migration of peoples, McGuire believes that a complex combination of religious movements and elite interactions was also an important factor, and that it played out over a much longer period than previously imagined.

Paul Fish, an archaeologist at Arizona State University, Tempe, says that McGuire “is certainly correct in that no simple environmental, economic, or social explanation is satisfactory.” But he notes that the final years of the Hohokam remain the least understood phase. “I don't believe we have the dating evidence to know how rapid the decline actually was,” he adds.

Such appeals to “complex factors” have their critics. Weiss, for example, insists that “mushy 1960s multicausality collapses” in the face of hard “21st century paleoclimate data” that definitively pinpoint serious droughts. But other researchers argue that scientists have been too quick to overlook sociological explanations and turned to environmental change “as the snappy explanation” for collapse, as Poul Holm, a historian at Trinity College Dublin, puts it.

Holm decries what he sees as an industry of apocalypse that pervades religion, academia, and even Hollywood, with its blockbusters like 2012. He argues that societies under stress have actually shown surprising resilience in overcoming crises. An old way of life may quietly continue in a revamped but world-changing form, such as the way some Imperial Roman traditions survive today in the Roman Catholic Church.

Unlike the dangers faced by many past societies, however, today's big threats—global climate change, war, peak oil, economic dislocation—are nearly all due to human choice rather than natural causes. Like Diamond, Holm sees awareness of our predicament as the key to not repeating past mistakes. “At the end of the day, trying to understand how humans cope with change is about how we think,” he says. Immediate threats to individual as well as societal existence may be what humans require to change outdated thinking. We may, after all, need those barbarians at the gate.

• * Crisis, what Crisis? Collapses and Dark Ages in Comparative Perspective, The McDonald Institute for Archaeological Research, University of Cambridge, 24–26 September.

13. Air Pollution

# Taking the Sting Out of Acid Rain

1. David Malakoff*

As a landmark U.S. law turns 20, its acid rain control program is widely judged a success, but the cap-and-trade market it created is in turmoil.

The water in Brooktrout Lake is getting a bit murky these days. Two decades ago, visitors could see up to 16 meters down, clear to the bottom of this remote lake nestled high in the Adirondack Mountains of New York state. Today, visibility is often just 5 meters.

The cloudy water might be cause for concern in many places. But in the Adirondacks, it has sparked a cautious celebration. “The reduced visibility means there is more algae and plankton in the water—and that means life in Brooktrout Lake is starting to recover from acid rain,” says biologist Clifford Siegfried of the New York State Museum in Albany, who has been studying the lake for more than 25 years. “It is a really encouraging sign.” So encouraging, in fact, that some 35 years after the lake became too acidic to support fish, biologists have been able to restock it with—what else?—colorful, darting brook trout.

That feel-good fish story is making Brooktrout Lake something of an icon for a landmark air pollution control law that celebrates its 20th anniversary next week. On 15 November 1990, after a bruising, 13-year-long legislative battle, President George H. W. Bush signed a sweeping set of changes to the U.S. Clean Air Act designed to take the sting out of acid rain. Most notably, the amendments called for using a controversial new “cap-and-trade” system to slash sulfur emissions from coal-fired power plants. The acidic plumes were transforming Brooktrout Lake—and hundreds of other lakes, ponds, and streams in the eastern United States—into eerily clear, seemingly sterilized ecosystems. Skeptics from both industry and environmental groups attacked the emissions trading scheme, predicting it would be too expensive, too ineffective—or both.

How times have changed. The 1990 Clean Air Act Amendments are now widely hailed as a “win-win-win” for the environment, consumers, and industry. Cap-and-trade has helped cut sulfur pollution faster and at lower cost than many predicted and has become a widely emulated model for dealing with other pollutants. The amount of acidic sulfur returning to Earth, meanwhile, has dropped by roughly 50% across the eastern United States. Many fishless lakes and streams, once written off as dead, are showing signs of renewed life.

Even as many prepare to toast the success of the 1990 law, however, the U.S. acid rain program is facing turmoil and uncertainty. Its once high-flying emissions-trading market has collapsed in the wake of a complex court ruling, legislative gridlock—and its own success. Industry analysts say the collapse could create a perverse incentive for utilities to ease up, at least temporarily, on cutting emissions while the government works on new rules. Researchers, meanwhile, warn that the problem isn't going away. Acid deposition continues to threaten many sensitive ecosystems, and analysts say deeper emissions cuts are needed to prevent future pollution from undoing the gains of the past 20 years. “We're at a point where we need to reevaluate how far we've come and where we need to go,” says microbial ecologist Sandra Nierzwicki-Bauer of the Darrin Fresh Water Institute in Bolton Landing, New York, part of the Rensselaer Polytechnic Institute (RPI) in Troy, New York.

## Ghost lakes

Today's turning point has its origins in the plight of places like Brooktrout Lake. Once a flourishing fishery, the lake yielded its last documented catch in 1975. By 1984, when Siegfried first visited, Brooktrout was one of more than 350 Adirondack lakes that had become highly acidified (a pH of about 5.5 or less) and no longer sustained fish. Soon, major media were featuring dramatic stories on these “ghost lakes,” with Sports Illustrated declaring acid rain “a chemical leprosy … eating away at the face of the U.S.” In a slew of technical papers, researchers pointed the finger at what they said was one obvious cause: a fleet of power plants in the Upper Midwest that burned massive piles of high-sulfur coal and then sent sulfur dioxide emissions wafting east on prevailing winds. “What goes up must come down,” Gene Likens, a prominent ecologist at the Cary Institute of Ecosystem Studies in Millbrook, New York, told one Washington audience at the time. Likens was on the team that first documented acid rain in the United States in the 1960s.

Neither the science nor the aphorisms, however, persuaded President Ronald Reagan and congressional leaders to act. By one scholar's count, they killed at least 50 legislative proposals to cut sulfur emissions between the late 1970s and 1988. George H. W. Bush's campaign pledge to be “the environmental president,” however, opened the door to regulation. The final 1990 deal called for using flexible, free-market mechanisms to cut sulfur emissions by nearly 50% from 1980 levels by 2010. Under the plan, the government set a declining annual “cap” on emissions that included specific goals for several hundred power plants. Companies could choose the best—and cheapest—way to meet those targets, including installing new smokestack “scrubbers,” shifting to lower-sulfur coal, or buying “allowances” to pollute from other companies that had met their targets.
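The compliance choice the amendments created can be illustrated with a toy calculation. This is a minimal sketch, not the program's actual accounting; the prices and function names are hypothetical. Each plant simply compares the cost of abating a ton of sulfur itself against the market price of an allowance and picks the cheaper route.

```python
# Toy model of a utility's compliance decision under the 1990 amendments.
# All numbers are hypothetical, chosen only to illustrate the trade-off.

def cheapest_compliance(excess_tons, abatement_cost_per_ton, allowance_price):
    """Return the lower-cost way to cover emissions above a plant's cap:
    cut the sulfur directly (e.g., scrubbers or low-sulfur coal) or buy
    allowances from companies that have already met their targets."""
    abate = excess_tons * abatement_cost_per_ton
    buy = excess_tons * allowance_price
    if abate <= buy:
        return ("cut emissions", abate)
    return ("buy allowances", buy)

# A plant 1,000 tons over its cap, abatement at $200/ton vs. allowances at $150/ton:
print(cheapest_compliance(1000, 200, 150))  # ('buy allowances', 150000)
```

Because every plant makes this comparison independently, the cheapest reductions in the whole system happen first, which is why overall compliance costs came in far below the early estimates.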

Despite some early bumps, the system flourished. Millions of allowances traded hands, and compliance costs, which some estimated might be as high as $25 billion annually, proved to be just a few billion dollars a year. By 2009, sulfur emissions had dropped to 5.7 million tons, well below the mandated cap of 8.95 million tons and down 67% from 1980 levels. The deposition of sulfate across the eastern United States fell by about 50%, and lakes and rivers showed an increasing ability to neutralize, or “buffer,” acidic precipitation.

## Assessing improvements

The 1990 law also established a research program to document how life in Adirondack lakes and rivers was responding to the chemical shift. “It was pretty novel at the time to build science right into the legislation,” says RPI limnologist Chuck Boylen. Starting in 1994, Boylen helped guide the Adirondack Effects Assessment Program (AEAP), a multi-institution effort funded by the U.S. Environmental Protection Agency (EPA) and New York state. For 13 years, it sent scientists out to collect data from about 30 lakes. Although federal funding for AEAP has dried up, the state recently funded more monitoring of 17 lakes, and researchers are still producing papers. They reveal the often complex—and sometimes surprising—ways that acid rain has reshuffled aquatic food webs in sensitive waters.

One trend is crystal clear, a team led by Nierzwicki-Bauer reported this past July in Environmental Science & Technology: More acid meant less biodiversity. The researchers came up with a grim rule of thumb: For every one-unit drop in pH (from 6 to 5, for instance, which represents a 10-fold increase in acidity), there were 2.5 fewer genera of bacteria, 1.43 fewer bacterial classes, and 3.97 fewer species of phytoplankton. A one-unit drop in pH also meant nearly two fewer crustacean species and about four fewer species of aquatic plants, rotifers, and fish.
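The arithmetic behind that rule of thumb is worth spelling out: pH is a logarithmic scale, so each one-unit drop multiplies hydrogen-ion concentration tenfold, while the reported biodiversity losses scale roughly linearly with the size of the drop. A minimal sketch of both relationships (function names are illustrative; the per-unit loss rates are the AEAP figures quoted above):

```python
# pH is the negative base-10 logarithm of hydrogen-ion concentration,
# so acidity grows tenfold for every one-unit drop in pH.
def acidity_ratio(ph_from, ph_to):
    """How many times more acidic water at ph_to is than at ph_from."""
    return 10 ** (ph_from - ph_to)

# Average taxa lost per one-unit pH drop, as reported by the AEAP team.
LOSSES_PER_PH_UNIT = {
    "bacterial genera": 2.5,
    "bacterial classes": 1.43,
    "phytoplankton species": 3.97,
}

def expected_losses(ph_drop):
    """Linear rule of thumb: losses scale with the size of the pH drop."""
    return {group: rate * ph_drop for group, rate in LOSSES_PER_PH_UNIT.items()}

print(acidity_ratio(6, 5))   # a pH 6 -> 5 drop: 10-fold more acidic
print(expected_losses(1))    # taxa lost for a one-unit drop
```

The contrast is the point: a seemingly small slide in pH hides an exponential change in water chemistry, even though the measured ecological losses tracked it roughly linearly over the range the team studied.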
“Lots of studies had examined acid rain's impact at a chemical level,” says Nierzwicki-Bauer. “We tried to quantify how it changes the biota.” One thing researchers learned is that acidity isn't necessarily destiny for a lake bacterium. A study of 18 lakes showed that, although acidification favored species able to survive in low-pH waters, these “acidophiles” didn't always take over the base of the food pyramid. Instead, apparently more generalist species were often more numerous, perhaps because factors such as water depth also play an important role in shaping bacterial communities. That AEAP study appeared in the March 2008 issue of Applied and Environmental Microbiology.

## Celebration and concern

Overall, long-term monitoring has shown that many acidified Adirondack lakes are “now on the trajectory to recovery,” says Nierzwicki-Bauer, with Brooktrout Lake a prime example. Its pH, perhaps once as low as 4.6, has risen to nearly 6. By 2005, “you could really see things were starting to happen biologically,” says Siegfried. “Zooplankton was increasing, larger invertebrates were coming back.”

In late 2005, biologist James Sutherland of the New York Department of Environmental Conservation decided the lake was healthy enough to reintroduce trout—not for anglers, but for the scientists. “The hypothesis that's being tested is that fish are essential to recovery” of the ecosystem, says Boylen. “It's one big experiment.” Since then, state and academic researchers have regularly taken the 10-kilometer hike, or short helicopter ride, from the nearest road to the lake to check on how the ecosystem is changing. “I'm surprised at just how well those fish are doing,” Siegfried says. “It's really a cause for celebration.” But “it's not clear that all the lakes are going to recover like Brooktrout Lake,” he adds.
At least 10% of the region's lakes do not have improving pH trends, according to EPA data, and about 40% aren't showing an increased capacity to resist acidification. In part, that's because acid deposition continues to leach buffering minerals from surrounding soils and affect vegetation. And water-quality improvements have slowed in the past decade, according to state studies, partly due to the complex role played by other pollutants, such as nitrogen oxides. The bottom line, concluded one recent state study, is that “unless air pollution emissions are further reduced, this recovery will likely be followed by a lengthy period of renewed acidification of lakes that were recently recovering.”

Ironically, efforts to achieve those deeper cuts may end up gutting the cap-and-trade system that has become the hallmark of the 1990 law. In 2005, the second Bush Administration approved a plan known as the Clean Air Interstate Rule (CAIR) for a new round of emissions reductions. One result was that the cost of sulfur allowances more than doubled to nearly $1600 a ton, as utilities braced for expensive new controls. But now spot prices have plunged to just $5 a ton, in large part because of a 2008 court ruling that invalidated CAIR and sent it back to EPA and Congress for retooling. Last July, EPA responded with a new plan that relies less on cap-and-trade. The chaos has “essentially killed an acid rain market that was working just fine, thanks,” one anonymous emissions trader noted on an online discussion board. Others have speculated that the crash could lead some utilities to turn off expensive sulfur scrubbing equipment and buy cheap allowances instead, until the new rules take effect in 2012.

So far, experts say that isn't happening. And whatever happens, Adirondack researchers hope policymakers will pay attention to what they've learned about acid rain from places like Brooktrout Lake. “Good policy has to be driven by good science,” says Nierzwicki-Bauer. And in this case, the science offers cause for both hope and concern.

• * David Malakoff is a writer in Alexandria, Virginia.