# News this Week

Science  25 Jun 2010:
Vol. 328, Issue 5986, pp. 1618
1. Gulf Oil Disaster

As disastrous as the oil spill has been, the Deepwater Horizon tragedy is just the latest affliction for wetlands in the Gulf of Mexico. For decades this productive coastline has been sliced apart by navigation channels and chewed through by invasive rodents. Worse, engineering of the Mississippi River has starved the delta of the sediment needed to keep it above sea level. All told, nearly 6000 square kilometers of wetlands have disappeared underwater in the past 100 years. Even though about $1.2 billion has been committed over the past 2 decades by the state of Louisiana and the federal government, efforts to restore the ecosystem are dwarfed by the scope of the problem.

But now advocates have hopes for new momentum. In a primetime speech about the oil spill, President Barack Obama last week called for long-term restoration and put the secretary of the Navy, Ray Mabus, who was governor of Mississippi from 1988 to 1992, in charge of developing a plan. “It is the first time that any president has clearly articulated a national interest in restoring this remarkable but collapsing deltaic ecosystem,” says Jim Tripp of the Environmental Defense Fund, an advocacy group in Washington, D.C. The speech “indicates a real sense of national priority and urgency.”

Scientists say the plan should focus on diverting large amounts of sediment-laden water from the Mississippi River to build land. Because sediment supply is limited and sea level is rising, it's crucial to site the diversions strategically for maximum land-building. “The big hurdles are going to be the social, political, and economic problems involved in making a big diversion,” says Harry Roberts, a coastal geologist at Louisiana State University (LSU), Baton Rouge.

To date, most restoration efforts have been relatively small, such as rebuilding specific wetlands with dredged sediment. In 2000, the state of Louisiana released a plan that laid out ambitious, but relatively vague, goals—and a price tag of $14 billion.
What eventually crystallized, after the Bush Administration scaled the plan back, was the $1.9 billion Louisiana Coastal Area study (Science, 25 November 2005, p. 1264). The Army Corps of Engineers has been authorized to build 15 restoration projects, including three sediment diversions, but none of the major projects has yet received funding to start construction.

Two prototype diversions are already in operation. These concrete gateways, which channel river water so that it deposits sediment into wetlands, haven't built much land. That's mostly because they are operated at low levels; they divert only about 1% to 2% of the river's flow.

Last year, Wonsuck Kim, now at the University of Texas, Austin, and others calculated that larger diversions would create a significant amount of land. By placing the structures midway between New Orleans and the mouth of the river, and diverting 45% of the spring flood stage, the sediment would build some 2740 square kilometers of land over 100 years, they reported 20 October in Eos. Such gains would “represent dramatic improvements over the 'do nothing' situation,” they concluded.

Even large diversions won't be enough to rebuild all the lost land, however, according to a recent study by Roberts of LSU and Michael Blum, now at ExxonMobil. Given that sea level is rising three times faster than it was 1000 years ago, and that as little as half as much sediment now reaches the Gulf of Mexico—the rest is trapped by dams in the Mississippi River watershed—there will be a large net loss of land by 2100, they reported in the June 2009 issue of Nature Geoscience. Although not all experts think the future is that grim, the study suggests that sediment diversions will be more about retrenching than recouping land. “We need to be very smart about where we put the sediment and diversions,” Roberts says.
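For scale, the figures in the passage above can be set side by side; a minimal arithmetic check, using only numbers reported in the text:

```python
# Quick scale check using figures from the article:
# ~6000 km^2 of wetlands lost over the past 100 years, versus ~2740 km^2
# that the large mid-river diversion modeled by Kim and colleagues could
# build over the next 100 years.

LOST_KM2 = 6000    # wetlands lost over the past century
BUILT_KM2 = 2740   # land a 45%-of-flood-stage diversion could build in a century
YEARS = 100

loss_rate = LOST_KM2 / YEARS    # historical loss, km^2 per year
build_rate = BUILT_KM2 / YEARS  # projected building, km^2 per year

print(f"Historical loss: {loss_rate:.0f} km^2/yr; projected building: {build_rate:.1f} km^2/yr")
print(f"The diversion would offset roughly {BUILT_KM2 / LOST_KM2:.0%} of a century of past loss")
```

The comparison makes the "retrenching rather than recouping" point concrete: even the ambitious diversion scenario builds land at under half the historical rate of loss.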
Roberts and other scientists recommend locating diversions as close to land as possible, where there is less erosion and the delta isn't subsiding as quickly as it is farther offshore; in these places, more land can be built with less sediment. But residents of Plaquemines Parish, along the southernmost extent of the river, would like to see diversions farther seaward (see map), where they would protect towns and other infrastructure. Louisiana's 2007 comprehensive master plan for coastal restoration did not decide locations for diversions but presented two “conceptual scenarios.”

One reason the master plan isn't resolved yet is that even the concept of diversions can alarm some stakeholders. By introducing more fresh water into wetlands, for example, diversions can harm oyster beds that have sprung up as salt water intruded into the sinking wetlands. The shipping industry worries that the structures could accidentally create shoals where ships need to anchor. Clinton Willson, a coastal and water resources engineer at LSU, Baton Rouge, and other experts are optimistic that the negative impacts of diversions on shipping could be minimized.

In March, the White House Council on Environmental Quality (CEQ) released a road map to help federal agencies and the state draft a more specific vision for coastal restoration, due by this fall. This road map will inform Mabus's plan, as will input from residents, officials, scientists, and conservationists, says CEQ spokesperson Christine Glunz.

Funding remains a major issue. The price tag for restoring as much of the coast as possible to long-term health will likely run in the tens of billions of dollars, says Denise Reed of the University of New Orleans. It's possible that payments made by BP for damages caused by the spill could cover some of the restoration, but ultimately, a new funding mechanism will be needed, says Tripp, perhaps a surcharge on oil from the gulf.
Certainly, the current trend of federal appropriations to the corps won't suffice; Tripp predicts it would take 200 years to get the job done. And inaction is not an option: “The status quo is ongoing collapse.”

2. Paleoanthropology

# Lucy's 'Big Brother' Reveals New Facets of Her Species

1. Ann Gibbons

First came Lucy. Then came Lucy's baby, an infant of her species. Now comes Lucy's “big brother”: the partial skeleton of a large male of Australopithecus afarensis, unveiled this week in the Proceedings of the National Academy of Sciences. The roughly 40% complete skeleton has been nicknamed Kadanuumuu, which means “big man” in the Afar language of the Afar Depression of Ethiopia, where it was found. “It was huge—a big man, with long legs,” says lead author Yohannes Haile-Selassie, a paleoanthropologist at the Cleveland Museum of Natural History in Ohio.

Dated to 3.6 million years ago, the new skeleton is almost half a million years older than Lucy and the second oldest skeleton found of a possible human ancestor. It had long legs and a torso and a pelvis more like those of a modern human than an African ape, showing that fully upright walking was in place at this early date, Haile-Selassie says. Although headless, the skeleton also preserves parts not found before in Lucy's species. “It is important because it provides the ribs and scapula,” or shoulder blade, says paleoanthropologist Carol Ward of the University of Missouri, Columbia.

In 2005, a sharp-eyed member of Haile-Selassie's team, Alemayehu Asfaw, spotted a fragment of lower arm bone on the ground at Woranso-Mille, about 48 kilometers north of Lucy's grave at Hadar (Science, 11 March 2005, p. 1545). Over the next 4 years, the team unearthed the shoulder blade, collarbone, ribs, and neck vertebra, the first time those bones were found together in an A. afarensis adult. The team also found a pelvis, an arm, and leg bones.
Although they never found the skull or teeth, which are typically used to assign species, the skeleton's age and similarity to Lucy suggest that it belongs to her species, says co-author Owen Lovejoy of Kent State University in Ohio. The robust male stood between 1.5 and 1.7 meters tall, about 30% larger than Lucy. Isolated bones of other individuals suggest that some males were even larger, so the new skeleton doesn't settle a long-standing debate over just how much sexual dimorphism there was in A. afarensis, Lovejoy says. The shoulder blade looks more like that of a gorilla and a modern human than that of a chimpanzee. The curvature of the second rib suggests a wide rib cage at the top and a barrel shape overall, similar to that of modern humans and distinct from the more funnel-shaped rib cage of a chimpanzee, the authors say.

This skeleton also gives a leg up to researchers who had proposed that Lucy's legs were proportionately longer compared with her arms than a chimpanzee's, says paleoanthropologist Terry Harrison of New York University in New York City. He agrees with the authors that the new skeleton is “not apelike in limb proportions.” Lovejoy argues that these nonchimpanzee-like features support a hypothesis he has championed: that the last common ancestor shared between hominins and other great apes didn't look much like chimpanzees, as many had once thought (Science, 2 October 2009, p. 36). Ward agrees that the shoulder “provides further evidence that in the ways that A. afarensis is not like a human, it is not always like a chimpanzee.”

But paleoanthropologist William Jungers of Stony Brook University in New York state is skeptical. He points out that the ribs are damaged and that the limb proportions depend primarily on just one complete leg bone; thus, the skeleton can't say much new about leg length. Expect more data—and more debate—from this newest member of Lucy's family.

3. Newsmaker Interview

# Amid War, Appraising the Mineral Wealth of Afghanistan

1. Yudhijit Bhattacharjee

When U.S. geologists visited the Afghanistan Geological Survey in Kabul in 2004 to launch an assessment of the country's mineral resources, they walked into a shell of an office building that had been bombed during the conflict of the previous decade. Since then, U.S. and Afghan geologists have been conducting field surveys and analyses, building on Russian maps and reports prepared during the Soviet occupation of the 1980s. The U.S. Geological Survey (USGS) issued a preliminary assessment in 2007, but the findings made headlines only last week when The New York Times reported that officials had “discovered” that Afghanistan harbored mineral wealth estimated at more than $1 trillion. That includes vast deposits of copper and iron ore, cobalt, molybdenum, and gold, which Pentagon officials say could transform Afghanistan's economy.

Two geologists who have helped manage the survey are Jack Medlin, a regional specialist in USGS international programs, and Said Mirzad, who directed the Afghanistan Geological Survey until 1979 and currently serves as Afghanistan Program co-coordinator at USGS. Last week, they spoke to Science about the past, present, and future of mineral exploration in the war-torn country. Their remarks have been edited for length and clarity.

Q:How did USGS get involved in Afghanistan?

J.M.:Our involvement goes back to 1952, when we began work with the Afghans on water resources. In the early '60s, we were responsible for the installation of a seismic station in Kabul. But all that ended in 1973.

In October 2001 [after the U.S.-led invasion], we came up with a plan to help in the reconstruction of the country. It called for an oil and gas assessment, a water-resource assessment, a mineral assessment, and an earthquake-hazard assessment.

Q:Surely there was previous work done on estimating the country's mineral wealth?

S.M.:Absolutely. In my time [1960s and 1970s], we had 22 groups of geologists roaming around Afghanistan doing geological surveys. We established the geological map of the country.

J.M.:The Russians did a tremendous amount of exploration work in the 1980s. They produced a number of mineral-exploration reports. Copies were produced in English and Dari; there was also a Russian copy that went back to Russia that most people think was different from the one that was left behind with the Afghans.

Q:How did you go about conducting the mineral-resource assessment?

J.M.:In 2004, we went to visit with our Afghan counterparts in the Afghan Geological Survey. The building had been destroyed by the mujahedeen—the plumbing had been stripped out, the electrical wiring was gone.

We asked to see the library and discovered all these reports that were stacked in these rooms. The Afghan geologists said they had taken the reports home to preserve them and store them; now they were bringing them back, which we were delighted to see.

Q:Said, some of these geologists must have been your former colleagues, right?

S.M.:Yes. There were nearly 100 Afghan geologists there. I asked them, “What are you doing here?” They said, “We arrive here at 10:30 every morning, stay till 2; then we sell cigarettes and fruit at street corners.” The director was driving a taxi after 4. They were reporting to work because the Taliban had told them that they couldn't go anywhere. They told me that they had given up hope that they would do geology ever again.

Q:What did you do after finding the old reports? Did you do any fieldwork?

J.M.:We started by organizing and compiling the data, working with 30 Afghan geologists and engineers. We selected certain areas for field study, and the Afghan geologists went to those sites to collect rocks; at some sites, they dug trenches several hundred feet long to get fresh rock. There were landmines in some areas that had been placed around coal mines and mineral locations.

Q:What other surveying did you do?

J.M.:In 2006, we carried out an airborne geophysics survey covering about 70% of the country using an Orion P-3 plane owned by the U.S. Naval Research Lab. It collected gravity and magnetic data, which give you insight into the three-dimensional shape of a potential ore body.

In 2007, we carried out a hyperspectral-imaging survey with an instrument mounted on the belly of a former British bomber. Hyperspectral imaging [imaging across the electromagnetic spectrum] allows you to map different rocks and minerals [each of which reflects sunlight uniquely] from 50,000 feet above the ground. It allowed us to confirm a number of sites with potential deposits.

Q:How did you arrive at the assessment?

J.M.:We integrated the geological, geochemical, and geophysical information with mining history and ran it through a Monte Carlo simulation. We're looking at the known resources, and based upon the geology, we use probability, statistics, and expert judgment to say what there may be.
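The probabilistic step J.M. describes can be sketched in miniature. This is a hedged illustration of a Monte Carlo resource assessment, not the USGS method itself: the distributions, weights, and tonnage figures below are invented for the example, standing in for the expert judgment and geology that feed the real assessment.

```python
import random

# Illustrative Monte Carlo sketch of a probabilistic resource assessment:
# expert judgment supplies a distribution for the number of undiscovered
# deposits and for the tonnage of each; repeated sampling yields a range
# of possible totals rather than a single number. All figures are made up.

def simulate_undiscovered_tonnage(n_trials=10_000, seed=42):
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        # Hypothetical expert judgment: 0-5 undiscovered deposits,
        # weighted toward fewer.
        n_deposits = rng.choices([0, 1, 2, 3, 4, 5],
                                 weights=[10, 30, 30, 15, 10, 5])[0]
        # Deposit tonnages are commonly modeled as lognormal
        # (many small deposits, a few giants).
        total = sum(rng.lognormvariate(mu=2.0, sigma=1.0)
                    for _ in range(n_deposits))
        totals.append(total)
    return totals

tonnages = sorted(simulate_undiscovered_tonnage())
n = len(tonnages)
# Assessments of this kind are usually reported as percentiles:
# a 90% chance of at least the low figure, 10% chance of the high one.
p90, p50, p10 = tonnages[n // 10], tonnages[n // 2], tonnages[9 * n // 10]
print(f"P90 {p90:.1f}  P50 {p50:.1f}  P10 {p10:.1f} (illustrative units)")
```

The point of the exercise is the one J.M. makes: the output is not a discovery but a probability-weighted statement of "what there may be," conditioned on the geology and the expert inputs.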

Q:Can some of this wealth be tapped in the short term?

J.M.:There is a lot of low-hanging fruit. For example, you don't need a lot of mining equipment to mine gold. Another example is brick clays, which can be used for making bricks or ceramics. It's not high-tech, but it can help create jobs.

S.M.:The Afghan government should not touch the mining business. We have to give enough information to potential investors. You will see the full impact when you have 20 investors.

I am very optimistic. If Afghanistan has a few years of calm, allowing the development of its mineral resources, it could become one of the richest countries in the area within a decade.

4. Physics

# Invisibility Cloaks for Visible Light Must Remain Tiny, Theorists Predict

Three years ago, physicists unveiled the first invisibility cloak, a ring-shaped device that ferried microwaves around an object and made it undetectable to microwave receivers (Science, 20 October 2006, p. 403). Since then, scientists have pushed to make a cloak for shorter-wavelength visible light. But those efforts will never lead to something you can hide in, one team of researchers predicts. The researchers say a cloak for visible light would be so small it could hide only objects almost too tiny to be seen. “It would be about the size of a dot on a piece of paper,” says Steven Johnson, an applied mathematician at the Massachusetts Institute of Technology in Cambridge. But the limit may not apply to some types of cloaks, other experts counter.

Scientists have made great strides in cloaking. In May 2006, theorist John Pendry of Imperial College London and colleagues proposed a cloak that consisted of a ring of “metamaterial”: a material whose electromagnetic properties are patterned on scales shorter than the electromagnetic waves it channels. Five months later, David Smith and colleagues at Duke University in Durham, North Carolina, produced one for microwaves. By April 2009, two other groups had fashioned tiny cloaks for infrared light.

Along the way, there have been plenty of doubts. For example, Pendry's original cloak worked at only one wavelength because it relied on a metamaterial that “resonates” with light, much as an organ pipe rings with sound of a fixed frequency. Some argued that no cloak could work at a range of frequencies. But in 2008, Pendry and a colleague devised a “carpet cloak,” which produces a hidden hump on a reflective surface instead of a hidden hole in space and works over a range of wavelengths. The infrared devices reported last year were “broadband” carpet cloaks.

Even so, a broadband cloak cannot be much bigger than the wavelengths at which it works, Johnson and colleagues argue. In a paper in press at Physical Review Letters, they consider a simple scenario in which a pulse of light with a range of wavelengths descends on a flat object covered by a cloaking layer. If the object were not there, the light pulse would take more time to reach the surface and bounce back. So to hide the object, the cloak must delay the light pulse. And for the cloak to do that correctly over the entire wavelength range, its thickness must increase in proportion to the height of the hidden object, Johnson argues.
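A rough, back-of-the-envelope version of that delay argument can be written down. The symbols and the group-index bound here are illustrative assumptions, not taken from the paper: suppose the hidden hump has height $h$ and the cloak is a layer of thickness $d$ whose group index cannot exceed some value $n_g$ across the whole operating band.

```latex
% Illustrative sketch, not the paper's derivation.
% Extra round-trip delay the cloak must supply to mimic the absent hump:
\tau \approx \frac{2h}{c}
% A layer of thickness d with broadband group index n_g delays a pulse by roughly
\tau \approx \frac{(n_g - 1)\,d}{c}
% Equating the two gives a thickness that grows linearly with the object:
d \gtrsim \frac{2h}{n_g - 1}
```

Because absorption and scattering accumulate along the path through the cloak, the light loss grows with $d$, and hence with the size of the hidden object, which is the crux of the limit described in the next paragraph's loss estimates.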

The thicker the cloaking layer, however, the longer the light pulse will remain in the material and the more light the cloak will absorb or scatter. If the cloak is too thick, that light loss becomes noticeable. Johnson and colleagues estimate that researchers might someday beat down the losses enough to cloak a meter-sized object at microwave wavelengths. At optical wavelengths, the losses are orders of magnitude too high to conceal such a large object, they say. A cloak for infrared or visible light cannot be more than a few micrometers across, they conclude.

Not everyone is convinced. Johnson's argument applies only to resonant systems, Pendry contends; it does not prove you cannot make a large nonresonant cloak. “It's not Moses descending from the mountain and saying you can't do it,” Pendry says. “It's a rider saying that there may be some complications.” Johnson says the result is general.

Cloaking is only one application for the concept of “transformation optics” that Pendry has pioneered, and others could prove more important. Still, it would be disappointing if all you could hide in your personal invisibility cloak were an eyelash.

5. ScienceInsider

# From the Science Policy Blog

The U.S. Supreme Court says that the government shouldn't have banned the planting of genetically modified alfalfa pending completion of an environmental review.

The American Heart Association has banned drug company scientists from making scientific presentations at its upcoming meeting in what some researchers say is an overreaction to concerns about conflicts of interest in research.

Partners in a worldwide effort to harness nuclear fusion as a source of power have imposed a 31 July deadline on themselves for approving the design, schedule, and cost of the ITER fusion reactor. It's the latest attempt to manage the mounting price tag of the project.

A $1 billion tax-credit program to jumpstart the languishing U.S. biotech industry is now open for business. For many companies, the Therapeutic Discovery Project Program will actually mean a cash grant to help develop a drug or therapeutic.

Two Japanese foundations have honored four scientists for their research achievements. The Inamori Foundation has awarded its 2010 Kyoto Prize to Shinya Yamanaka and László Lovász, and the Asahi Glass Foundation named James Hansen and Robert Watson the 2010 winners of its Blue Planet Prize.

An internal investigation has concluded that Peter Duesberg, a molecular virologist at the University of California, Berkeley, was within his rights when he wrote, in a paper since retracted, that there is no evidence of a deadly AIDS epidemic in South Africa.

Blood banks are being advised to tell patients with chronic fatigue syndrome not to donate because they may pass on a virus suspected of causing the elusive syndrome.

See the full postings and more at news.sciencemag.org/scienceinsider.

6. Mantle Dynamics

# Another Quarry Sighted in the Great Mantle Plume Hunt?

1. Richard A. Kerr

Mantle plumes—tall columns of hot rock rising to feed volcanic hot spots like Hawaii—are either fundamental components of Earth's heat-shedding machinery that reshape the surface and perhaps trigger mass extinctions, or they are figments of some geophysicists' imaginations. Every report that someone has caught sight of a plume in seismic images of the mantle has been greeted by roughly equal portions of support and derision (Science, 4 December 2009, p. 1330). The latest claim is faring a bit better. The report of the detection of the most realistic-looking plume yet “is a very, very nice piece of work,” says seismologist Jeannot Trampert of the University of Utrecht in the Netherlands. “This is a new way forward,” but—there's still no escaping the buts—“it's a tough problem. There are still a lot of assumptions.”

The appeal of the new work is that it is rooted directly in data. Previous plume claims involved so-called tomographic imaging. A seismic wave passing through a hot plume would be slowed so that it arrived at a distant seismometer later than a seismic wave that missed the plume. Tomographers would combine seismic travel times to create images of Earth's interior in much the way radiologists combine x-ray absorptions to create computed tomography images of human interiors. But just where the putative plume images came from, critics contended, was not clear. A signal created by an unrelated slow spot elsewhere in the mantle could have smeared into an adjacent area and could be taken for a plume, they said.

The new work, by seismologists Daoyuan Sun, Donald Helmberger, and Michael Gurnis of the California Institute of Technology (Caltech) in Pasadena, also uses travel times but includes the opticslike behavior of seismic waves as well. As the Caltech group discusses in its 4 May paper in Geophysical Research Letters, seismic waves—like light waves—are diffracted when they graze any sharp change in seismic properties. The effect showed up in the complex shapes of seismic waves that rose along a narrow, sharp-edged zone of lower seismic velocity in the lower mantle just off the southern tip of Africa. By modeling exactly what sort of mantle feature could create the observed complex shapes of waves, the Caltech group homed in on the most realistic-looking plume yet reported. As geophysical theory calls for, the Caltech group's African feature is conical, sharp-edged, and about 150 kilometers across. All previously reported plume images were fuzzy and several times wider. “There is definitely something there, something small-scale,” says seismologist Christine Thomas of Münster University in Germany.
“It's a direct observation; you can see [sharp edges] with your eye” in unprocessed data, unlike tomographic images that require massive processing. “Whether it's a plume,” she says, “I don't know.”

The picture is still incomplete. Scientists can't tell whether the small, sharp-edged feature extends into the upper mantle because no waves passed that way. They can't gauge its length without better knowledge of the difference between seismic velocities in its interior and in the surrounding mantle. And seismologist Edward Garnero of Arizona State University in Tempe even asks, “How do you know [the plume] is not somewhere else along the path followed by the seismic wave?” Helmberger says that's unlikely but can't rule it out. Everyone agrees on the need for more data from seismic waves grazing the suspected plume. “The problem is there aren't that many earthquakes,” Thomas says. All in all, Garnero says, “the question may be extremely difficult to answer.”

7. Climate Change

# Critics Are Far Less Prominent Than Supporters

1. Eli Kintisch

A new analysis of 1372 climate scientists who have participated in major climate science reviews or taken public positions on their main conclusions confirms what many researchers have said for years: Those who believe in anthropogenic climate change rank, on average, much higher in the scientific pecking order than do those who take issue with the idea. The co-authors examined lists of scientists who have signed statements in support or opposition to the main findings over the years of the Intergovernmental Panel on Climate Change, namely, that the planet is warming and humans are largely responsible. They categorized the scientists as either “convinced” or “unconvinced” and then analyzed how many papers involving climate they had published. “Unconvinced” scientists made up only 2% of the top 50 researchers ranked by number of climate publications and 3% of the top 100.
Among scientists with 20 or more papers on climate, the so-called convinced group had an average of 172 citations for their top paper, compared with 105 for the unconvinced. But the paper, published this week in the Proceedings of the National Academy of Sciences, faces several criticisms. The first is that the grouping of researchers into “unconvinced” and “convinced” fails to capture the nuances of scientific views on the subject. That makes the paper a “pathological politicization of climate science,” says Roger Pielke Jr. of the University of Colorado, Boulder. Pielke also objects to applying the “unconvinced” label to anyone who signed a paper opposing immediate cuts in greenhouse gas emissions. “So you are a 'climate skeptic' if you have a certain view on climate policy?” he asks. “Bizarre.”

Critics say the results reflect the cliquishness or biases inherent in peer-reviewed science. “We are being 'black-listed,' as best I can tell, by our colleagues,” says John Christy of the University of Alabama, Huntsville, who was in the “unconvinced” group. Co-author Jim Prall, a computer support professional at the University of Toronto in Canada, says, “It would be helpful to have lukewarm [as] a third category.” But he defends the peer-review system, noting that journal editors, although not perfect, “know the field better than anyone else.”

8. ScienceNOW.org

# From Science's Online Daily News Site

## Scientists Score in Ranking Soccer Stars

If you're new to World Cup soccer madness and wondering which team is tops, science has an answer. Luís Amaral, a complex-systems engineer at Northwestern University in Evanston, Illinois, and an avid soccer fan, wanted to measure team and player performance in a way that takes into account the complex interactions within the team and each player's contribution.
Applying the kinds of mathematical techniques used to map Facebook friends and other networks, Amaral and colleagues created software that can trace the ball's flow from player to player. The program assigns points for precise passing and for passes that ultimately lead to a shot at the goal, then calculates a skill index for each team and player. When the researchers analyzed data from the 2008 UEFA European Football Championship, the indices closely matched the tournament's outcome and the overall consensus of sports reporters, coaches, and other experts who weighed in on the performances, Amaral and colleagues reported in PLoS ONE.

## Still No 'Mammoth-Killer'

Proponents of the idea that an exploding comet wiped out mammoths, giant sloths, and other megafauna 12,900 years ago have pointed to unusual organic debris in the soil from this period—debris, they say, that could have formed only in extreme wildfires raging across North America. But in a new study, a team argues that this debris is just fungal remains and bug poop. Using four kinds of microscopy, Andrew Scott of Royal Holloway, University of London in Egham, U.K., found a good match between the odd, honeycombed debris, known as carbonaceous spherules, and so-called fungal sclerotia, balls that fungi form to hibernate through times of environmental stress. When the sclerotia are charred at relatively low temperatures, the resemblance increases. Some of the more elongate particles are “certainly fecal pellets, probably from termites,” says Scott. What's more, the 12,900-year-old spherules were heated in low-intensity natural wildfires, if that, the team reported in Geophysical Research Letters. “There's certainly no evidence they're related to intense fire from a comet impact,” says Scott.
But comet proponents say Scott and others have yet to explain an unusual feature in some carbonaceous spherules: nanodiamonds, nanometer-size bits of diamond that they say could have formed only in the extreme conditions of an impact.

## A World Without Flowers

A world without flowering plants wouldn't just be drab, it would be hotter and drier, particularly in parts of the tropics, a new study concludes. Paleontologist C. Kevin Boyce and climate modeler Jung-Eun Lee of the University of Chicago in Illinois rejiggered climate models, cutting out the moisture released to the atmosphere through the leaves of flowering plants, or angiosperms—the group that includes elms, oaks, tulips, and roses. The effects were complex, but the biggest impact occurred in tropical areas of South America, the pair reported in the Proceedings of the Royal Society B. Average annual rainfall declined by 300 millimeters. In the eastern Amazon basin, the length of the wet season decreased by nearly 3 months. The extent of the wettest rainforests, which receive more than 100 millimeters of rain per month, shrank by 80%. Biodiversity would also suffer, as less precipitation usually translates into fewer species of animals and plants.

## Chimpanzees Kill for Land

After at least 5 million years of separate evolutionary history, chimpanzees and humans still have one thing in common: Males of both species kill each other over territory, according to a study in Current Biology. From 1999 to 2008, a team led by primatologist John Mitani of the University of Michigan, Ann Arbor, and David Watts of Yale University followed male chimpanzees in the Ngogo group in Kibale National Park in Uganda whenever the chimpanzees would stop their usual noisy social behavior to form patrols of several males, about once every 2 weeks or so. In 18 cases, the researchers observed the Ngogo chimpanzees ambush a solitary male—often an infant—and kill it by pummeling it and biting it, says Mitani.
In three other cases, the primatologists did not witness the attacks but found the beaten, torn bodies of chimpanzees after they heard an attack, or they observed the Ngogo males eating a freshly killed infant chimpanzee. In most cases, the Ngogo males took over new territory after a kill. Experts say these land grabs may give the chimpanzees access to more fruit trees or to females that live there.

Read the full postings, comments, and more at news.sciencemag.org/sciencenow.

9. Energy

# Natural Gas From Shale Bursts Onto the Scene

1. Richard A. Kerr

New technologies have sparked a rush of drilling in the United States, but environmental concerns and economic unknowns could still keep shale gas from becoming a bridge to clean energy.

Engineering ingenuity is unlocking a vast storehouse of natural gas buried beneath American soil from Texas to New England. Drillers are turning their instruments from the vertical to horizontal and then blasting the rock that tightly holds the gas with high-pressure chemical brews. This “fracing” (pronounced and sometimes spelled “fracking”) is finally making gas trapped in shale a profitable resource. That change, in turn, has driven up declining U.S. gas production, rescuing the American natural gas industry from seemingly inevitable depletion.

The sudden great promise of clean, homegrown shale gas has all kinds of people excited. National-security types see it as a replacement for foreign oil and gas, environmentalists as a replacement for dirty coal and even oil. And because it yields only 45% of the carbon dioxide emissions of coal, advocates of taming global warming see it as a temporary crutch while carbon-free energy sources are developed and deployed.
Everyone seems to agree with a March study by IHS Cambridge Energy Research Associates in Massachusetts that concluded that shale gas “provides the potential to transform North America's energy landscape.” The problem is that word “potential.” Every link in the chain between the newly abundant domestic energy source and its transformative impact is still shrouded in uncertainty. How much gas is there? What will it cost to extract? What government policies will be needed to direct the natural gas revolution toward reducing greenhouse gas emissions? No one's sure. And a rising tide of NUMBY—not under my backyard—sentiment greeting shale gas in the Northeast could hobble the revolution (see sidebar, p. 1625).

## How to unleash the gas

The newly applied technology of shale gas extraction is letting drillers go straight to the source. Conventional deposits of oil and gas are actually the final resting places of far-traveled hydrocarbons that were generated in deeper “source beds” of organic-rich rock. Drill into a conventional reservoir, and out flow oil and gas. By contrast, shale gas—a so-called unconventional resource—never left its birthplace. It's still in the source bed whose organic matter gave rise to the gas. Because the pores in the fine-grained shale are not well connected, the rock is too impermeable to let the gas go. Drill into it—as drillers occasionally did—and you get barely a fizzle.

But all that shale gas began to look more and more tantalizing as conventional U.S. natural gas resources dwindled. After drillers had sunk millions of wells into North America's conventional oil and gas reservoirs, U.S. natural gas production—which had soared for three-quarters of a century—peaked in the early 1970s and promptly went into decline. The easy gas was quickly becoming a thing of the past.
Conventional gas production will never recover, but the two basic tools drillers needed to unleash unconventional shale gas were already on hand, waiting to be combined and refined. From the offshore oil and gas industry, they borrowed horizontal drilling. The ability to drill straight down and then bend the hole made it possible to drain much more of a reservoir from a single offshore drilling platform. Onshore, horizontal drilling out to 2.5 kilometers from a drill site can multiply the length of a single well within a gas-bearing shale layer by five or 10 times.

The other tool was hydraulic fracturing, or fracing. Drillers pressurize a horizontal section of a well by rapidly pumping in 3 million to 4 million gallons of water (plus a bit of fine sand and chemicals) to pressures of up to 7000 kilopascals. The extreme pressure creates a football-shaped cloud of fractured shale 300 meters long, the fractures remaining propped open by sand grains. Repeat up to 30 times in one well and drill tens of wells from a single site, and you could free up enough gas to make a tidy profit.

## Hitting the mother lode

When the price of natural gas began to climb along with that of oil early in the decade, newly equipped shale gas drillers went to work. In 2000, shale gas was 1% of the U.S. gas supply; now it is 20%. Production from the Barnett Shale under Fort Worth, Texas, increased 3000% from 1998 to 2007. And unconventional gas—from shale, low-permeability sandstone, and coal beds—rose to more than 50% of U.S. total production. By 2009, total production was back up almost to the 1970 peak, thanks largely to shale gas. With gas gushing from the Barnett Shale and increasingly from other shale basins (see map), those sizing up America's natural gas resources took notice.
Last June, the Potential Gas Committee, a nonprofit organization of experts, announced that shale gas was largely responsible for boosting PGC's estimate of the country's total available future supply of gas by 35% over its previous estimate just 2 years earlier. The natural gas resource had reached the highest level in the committee's 44-year history. That would be a 100-year supply at the current rate of consumption, industry ads touted.

The changes in the natural gas industry could be big. A June 2009 study by Navigant Consulting Inc. in Houston, Texas, found that gas production companies believed that they could be producing 300 billion cubic meters of shale gas a year—equal to half of today's total U.S. gas production—in little more than a decade. And thanks in large part to shale gas, “natural gas is more than a bridge fuel; [it] is part of the long-term energy solution,” James Mulva, chair and CEO of ConocoPhillips, said at a major oil and gas gathering last March in Houston.

## Enter uncertainty

Despite all the glowing testimonials, every shale gas analysis has its caveats. The Navigant estimate of future production, for example, “represents what producers say they could do if everything works,” says Richard Smead, a Navigant analyst, but “there are so many issues.” For one, experts question how much gas is actually in the ground and how much of that can be extracted. “The [shale gas] resource is quite large,” says analyst Richard Nehring of Nehring Associates in Colorado Springs, Colorado, “but how large is it?” Resource estimates are usually reported as an average, he notes, but that conceals the range of uncertainty. PGC's 100-year supply “could be 50 years to 125 years of supply at present rates,” he says. And production will peak and decline rather than hold steady for 50 years or 125 years and then suddenly disappear. Nehring recently estimated that U.S.
and Canadian gas production will likely peak sometime between the 2020s and the 2040s, depending in large part on how much shale gas there is.

Narrowing the resource uncertainties will take a while. “At this stage of the game, we have very little experience with shale gas,” notes Nehring. “Predictions of well performance over 15 to 20 years are based on 6 to 24 months of experience.” Yet flow from a new well can decline by 60% to 80% during the first year of production alone. “We don't know if they'll keep diving or level off,” says geologist Richard Pollastro of the U.S. Geological Survey in Denver.

Future shale gas production will also depend on the profits to be made from extracting the gas. Production took off as the U.S. price of gas soared toward $14 per million British thermal units (MBtu). Since that peak, the price has fallen to about $4.50 per MBtu. Some shale gas producers might be making money from some wells at the current price, Navigant's Smead says, but “it's not sustainable; it's not good for growth.” On the debit side of the ledger, the cost of producing shale gas could soar once drillers have depleted the obvious “sweet spots” where geology has made the gas particularly abundant and relatively easy to extract. For example, the Marcellus Shale now under development runs 1000 kilometers from eastern Kentucky halfway across New York state, but the difficulty and expense of getting gas out varies from location to location. Without further technological change, production could become considerably more costly in the future.

How much shale gas actually gets used in coming decades also depends on how it fits into the U.S. energy economy. To see how gas might perform as a “bridge” to a low-carbon energy future, resource economists Stephen Brown, Alan Krupnick, and Margaret Walls of Resources for the Future (RFF) in Washington, D.C., ran a range of scenarios in the National Energy Modeling System, which was developed by the U.S.
Department of Energy and modified by RFF. The energy model predicted prices and consumption across U.S. energy markets under various assumptions about the size of U.S. shale gas resources and the nature of government energy policy, among other factors. The researchers used it to explore scenarios in which the shale gas resource was either moderately large or very large and in which the country either had or lacked a low-carbon cap-and-trade policy with carbon dioxide emission targets similar to those in legislation passed in the U.S. House of Representatives.

Simply having a shale gas bonanza, the modeling suggested, doesn't solve all energy problems. In the absence of a low-carbon policy, more abundant shale gas “just gives you cheap energy,” says Brown, and everyone “just consumes more”—although gas burns more cleanly than conventional fossil fuels do and is a domestic fuel rather than a foreign one. Some coal is displaced from the energy mix, but so are zero-emission nuclear energy and renewables. As a result, both energy consumption and carbon dioxide emissions increase slightly.

With a low-carbon policy in place, however, things are different. In that case, the model showed, abundant shale gas fills the need for a low-carbon fuel, somewhat decreasing the cost of meeting emission goals in the law. Even if shale gas is less abundant, the low-carbon policy still helps nuclear energy and renewables to replace coal. Shale gas “is a bridge if we want it to be,” Brown concludes. But given all the uncertainties, he says, putting a price on emitted carbon—using cap-and-trade or a carbon tax—is preferable to letting the government promote one technological solution over another. The government could easily guess wrong, he notes.

Ingenuity and perseverance have unleashed a natural gas revolution in America and someday perhaps worldwide. But the revolution's first test is already in the offing.
Financial arrangements that have encouraged continued shale gas drilling despite low prices will begin to expire in the next year or so. Drillers are said to be slowing the frenetic pace of sinking wells that's required just to maintain production. And a blowout that struck a well in the Marcellus Shale earlier this month serves as a reminder that environmental concerns could still sway a jittery public to favor leaving the gas inside the rock.

10. Energy

# Not Under My Backyard, Thank You

1. Richard A. Kerr

Now that drillers have figured out how to extract gas trapped in shale rock, a new and perhaps more hazardous style of drilling is coming to parts of the country unfamiliar with the ways of the oil and gas industry.

A 3 June blowout 145 kilometers northeast of Pittsburgh, Pennsylvania, paled beside the catastrophe in the Gulf of Mexico. But it was the most dramatic harbinger yet of a revolution in the natural gas industry that will bring the threat of environmental disaster to millions of people's backyards, both figuratively and literally. Now that drillers have figured out how to extract gas trapped in shale rock (see main text, p. 1624), a new and perhaps more hazardous style of drilling is coming to parts of the country unfamiliar with the ways of the oil and gas industry.

To those outside the industry, the scariest part of the new drilling might be the way drillers unleash the gas. They drill straight down, turn the drill to go horizontally, and then pump in chemically treated water to pressurize a horizontal section of the well until the surrounding rock fractures and begins to give up its gas. Many local residents and environmentalists have worried that this “fracing” (pronounced and sometimes written “fracking”) might propagate fractures upward into overlying aquifers so that escaping gas, chemicals, and brine will contaminate groundwater. There is certainly some sort of problem.
In January 2009, gas somehow got into a residential water well near Dimock, Pennsylvania, and exploded. There's general agreement that nearby shale drilling had something to do with it, but it's not clear how. Driving fractures from a shale gas well up into an aquifer is “pretty inconceivable,” geologist Ian Duncan of the University of Texas (UT), Austin, said at a seminar in Washington, D.C., in April. In fact, “we don't think [the Dimock gas] is coming from the Marcellus,” says hydrogeologist Daniel Soeder of the Department of Energy's National Energy Technology Laboratory in Morgantown, West Virginia. “There have been isolated cases of gas occurring in water wells,” he says, but isotopic analysis has shown that the gas is the shallow crustal equivalent of marsh gas rather than deep shale gas. Drilling and fracing may have triggered the release of this biological gas, he says, but there are many other concerns about shale gas drilling.

The biggest concern for many is the water that pressurizes the well during fracing. The process takes 12 million to 16 million liters of water per well. Supplying such amounts can be a challenge in dry regions, but the larger problem is often what to do with the water after fracing, when most of it comes back up the well. To improve performance, drillers may add as many as a dozen chemicals, up to about 2% by volume, such as biocides to keep down corrosive bacteria. The fracing fluid also picks up naturally occurring chemicals, primarily salts from deep brines. Drillers either inject the wastewater into deep wells, treat and release it, or recycle it, but spills do occur. The Dimock area had a series of them in 2009, for example. And the 3 June blowout in Pennsylvania spewed drilling fluid, brine, and gas into a forest for 16 hours (without igniting).

Shale gas extraction can have other drawbacks. A drill site covering 2 hectares turns into a heavy industrial zone, even if it's in the suburban Fort Worth part of the Barnett Shale in Texas.
More than 100 big water-tanker trucks might have to come and go for each of 20 or more wells at a site. Although drillers can drain more than 500 hectares from a single site, “it still makes quite a mess,” says analyst Richard Smead of Navigant Consulting Inc. in Houston, Texas.

In response to the upsurge in shale gas drilling, the Pennsylvania Legislature is moving to strengthen pertinent regulations. New York state has greatly tightened licensing requirements on drilling in about 10% of the Marcellus in the state, effectively ruling out drilling in a prime source of New York City drinking water. And in March, the U.S. Environmental Protection Agency launched a multimillion-dollar study of potential ill effects of fracing on water quality and public health.

“It's all manageable,” says oil and gas analyst Michelle Foss of UT Austin, assuming everyone does the right thing. “Industry needs to get its act together and try to minimize things that would not be good,” she says. Companies may have cut corners in their haste to drill. And “the states need to get their houses in order” too, she says. Better late than never.

11. Psychology

# A WEIRD View of Human Nature Skews Psychologists' Studies

1. Dan Jones*

Relying on undergraduates from developed nations as research subjects creates a false picture of human behavior, some psychologists argue.

Suppose you're a psychologist at a research university, trying to figure out what drives human behavior. You have devised simple, clever experiments in which people play economic games or perceive visual illusions, and you would like large sample sizes. How will you find subjects? For generations of psychologists, the answer has been straightforward: Use the pool of thousands of undergraduates at your university. But although undergrads from wealthy nations are numerous and willing subjects, psychologists are beginning to realize that they have a drawback: They are WEIRDos.
That is, they are people from Western, educated, industrialized, rich, and democratic cultures. In a provocative review paper published online in Behavioral and Brain Sciences (BBS) last week, anthropologist Joseph Henrich and psychologists Steven Heine and Ara Norenzayan of the University of British Columbia in Canada argue that WEIRDos aren't representative of humans as a whole and that psychologists routinely use them to make broad, and quite likely false, claims about what drives human behavior. “A lot of psychologists assume that one group of humans is as good as the next for their experiments, and that results from these studies apply more broadly. We show that this assumption is wrong,” says Heine. “WEIRD subjects are some of the most psychologically unusual people on the planet.”

There's little doubt that psychologists have relied on WEIRDos. In a 2008 paper in American Psychologist, Jeffrey Arnett of Clark University in Worcester, Massachusetts, analyzed all empirical papers published in six top-tier psychology journals between 2003 and 2007 and found that the United States alone provided 68% of study subjects, with a further 27% coming from the United Kingdom, Canada, Australia, New Zealand, or Europe. Psychology undergraduates were the sole subjects in 67% of U.S. studies and 80% of studies in other countries. Overall, 96% of subjects were WEIRDos.

This would be fine if WEIRDos were representative of people from other cultures, but they are not, Henrich, Heine, and Norenzayan argue in the BBS paper. Although cultural variation is sometimes assumed to be superficial, Heine says that cultures differ in fundamental aspects such as reasoning styles, conceptions of the self, the importance of choice, notions of fairness, and even visual perception. For example, in the Müller-Lyer illusion (see figure), most people in industrialized societies think line A is shorter than line B, though the lines are equally long.
But in small-scale traditional societies, the illusion is much less powerful or even absent.

The reliance on WEIRD data has led to a biased picture of human psychology, says Heine. Social psychologists, for example, talk of the “fundamental attribution error,” or the tendency to explain people's behavior in terms of internal personality traits rather than external, situational factors (attributing an instance of angry behavior to an angry temperament, for example). Yet outside WEIRD societies, this error looks a lot less fundamental, says Henrich, as people pay more attention to the context in which behavior occurs, so someone's anger might be construed as simply reflecting an irritating day.

Textbooks also frequently describe people as valuing a wide range of options when making choices, being analytical in their reasoning, being motivated to maintain a highly positive self-image, and having a tendency to rate their capabilities as above average. Again, the review article contends, this picture breaks down for people from non-WEIRD societies: These groups tend to place less importance on choice, be more holistic in their reasoning, and be less concerned with seeing themselves as above average. And although WEIRDos stand apart from the rest of the world in these and other respects, Americans stand even further away, with U.S. undergraduates further away still—“an outlier in an outlier population,” as the BBS authors put it. “We will never figure out human nature by studying American undergrads,” says Henrich.

Other researchers welcome this central message but caution that the differences observed in cross-cultural studies may themselves be problematic. “Not only do psychologists use WEIRD people, they also use weird, highly artificial experiments,” says Nicolas Baumard, an anthropologist at the University of Oxford in the United Kingdom.
So the cultural variation those experiments detect may simply reflect the way experiments are construed by various groups rather than deep differences. Heine counters that many cross-cultural findings have been replicated with a range of methods, suggesting that the differences are robust.

Henrich, Heine, and Norenzayan recommend that psychologists explicitly discuss whether their findings can be generalized and make data on subjects available so population effects can be more easily detected. Researchers should also try to build links to diverse subject pools, perhaps drawing on contacts made by economists and public-health researchers in non-WEIRD societies. The Internet provides another way of reaching out, though it risks biasing research away from WEIRD people toward wired people.

The accumulated data on WEIRDos may still prove to have enduring value, argues cultural psychologist Paul Rozin of the University of Pennsylvania, as the world becomes more globalized. “The U.S. is in the vanguard of the global world and may provide a glimpse into the future,” he says. For now, however, psychologists should remember that WEIRDos remain weird.

• Dan Jones is a freelance writer in Brighton, U.K.

12. Biodiversity

# Pushing DAISY

1. Sarah Reed

In his spare time, an entrepreneurial engineer-cum-entomologist is teaming up with a museum to develop an automated insect-identification computer program.

Ever since he was a small child, Mark O'Neill has been captivated by the small things in life: insects. He remembers vividly the impressive Brown Hawker dragonflies that inhabited the wet meadow across the road from his childhood home in Lincolnshire, England, and the devil's coach horse beetles and puss moth caterpillars that used to abound in the garden. “When I was about 5, I got the idea that I could use large bumblebees to power paper aircraft,” recalls O'Neill.
“The idea failed miserably, but I became adept at harnessing bumblebees using cotton thread.” That unusual skill continues to pay dividends as he now develops transponders for tracking bees at the technology company he founded, Tumbling Dice, based in Newcastle-upon-Tyne in the United Kingdom.

Although O'Neill isn't a formally trained entomologist—his background is in physics and engineering—his fascination with insects has also led him to develop an image-based insect-identification computer program, the Digital Automated Identification System (DAISY), in his spare time. The program compares digital photographs of insects' morphological features with a database of shapes and markings gathered from taxonomic records, in much the same way detectives use computer databases to match crime-scene fingerprints or a suspect's face caught on security cameras.

The idea is simple, but once DAISY is fully loaded with enough insect records, it should save taxonomists hours of painstaking research. In doing so, DAISY may help combat the problems associated with a growing shortage of trained taxonomists. It also promises to bring greater rigor to biodiversity studies, experts say, as traditional taxonomy methods rely on written descriptions in field notes to distinguish between species, using words that are sometimes not expressive enough for the task. “Taxonomists are forced to use subjective designations about how something looks, which is why DAISY's approach is so valuable,” says paleontologist Norman MacLeod of the Natural History Museum (NHM) in London.

O'Neill has been developing DAISY for the past 18 years, struggling to find funding and collaborators—mainly, he believes, because of his nontraditional background and position outside the academic world. But the insect-identification system has begun to prove itself in pilot trials around the world.
MacLeod has been impressed enough by the results—and with O'Neill on both a personal and a professional level—that he has persuaded NHM to back DAISY with significant funding in the coming years. O'Neill “is one of the most interdisciplinary scientists that I have ever worked with,” MacLeod says. “Mark lives and breathes entomology—growing caterpillars at home—but he is also an advanced IT specialist. He is decidedly unique.”

## DAISY takes root

O'Neill wasn't actually around when the initial seed for DAISY was planted. That came about as a result of a conversation between the late entomologist Ian Gauld, formerly at NHM, and ecologist Kevin Gaston of the University of Sheffield in the United Kingdom. While stranded overnight at a Costa Rican airport in 1988, the pair pondered how a technological society might solve one aspect of the “taxonomic bottleneck”: A lot of biodiversity work isn't carried out simply because the people and resources required to identify known species are not available. They decided that species identification needed to be automated, and so the concept of DAISY was born.

A few years later, O'Neill came onboard to work alongside Gauld on the project, bringing his technology expertise to the table. O'Neill had acquired his information-technology and computer-programming skills while studying physics and engineering; he ultimately completed a Ph.D. in engineering in 1992 at University College London. O'Neill says he chose to study those subjects, despite his lifelong passion for entomology, because they were academically his strongest areas. “My physics and engineering background has helped me to contribute more to entomology than would have been the case had I trained as a classical biologist,” he says.

More than 2 decades have now passed since that late-night conversation in the departures lounge in Costa Rica, and DAISY is still in the backwaters.
Gauld, who had been awarded the original funding to work on DAISY, was slowed by illness for many years, leaving the project without a prominent entomologist to champion it. (He died in early 2009.) And O'Neill says he's also partly to blame for the slow progress. “I'm afraid that I am a bit of a ‘backroom boy,’ meaning that I have invested a lot more time in developing DAISY as opposed to advocating it,” he says.

But O'Neill now has a fully operational, UNIX-based version of the software. It has been used for projects at several universities in the United Kingdom and overseas, including the University of Oxford, Newcastle University, and the University of Costa Rica. Wherever O'Neill has taken the project, he seems to have left a lasting impression. “Mark is one of the brightest and most enthusiastic people I have ever met,” says entomologist Paul Hanson of the University of Costa Rica in San José, who provided specimens for O'Neill to test DAISY on during a field trip to Costa Rica.

During that test, and many others, DAISY performed extremely well, says O'Neill. In the Costa Rican test, DAISY identified more than 40 species of hawkmoths (Xylophanes) with an accuracy of 95% (rising to 98% if the program was allowed to reject tricky specimens). In another study, in which DAISY was asked to identify 60 species of biting midges (Ceratopogonidae), the program had a 90% success rate, far better than a group of undergraduate zoologists at Sheffield, who identified only 20%.

These high success rates result from the sophisticated self-learning algorithm at the heart of DAISY, called a Self-Organizing Map, which, once trained to recognize an insect from a set of images from taxonomic records, can do so again automatically. Also, unlike fingerprint recognition, which matches only between seven and 10 spots of an image, DAISY's algorithm uses every bit of each image presented to it in order to make an ID.
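The Self-Organizing Map idea can be sketched in miniature: a small grid of prototype vectors is trained on labeled feature vectors, and a new specimen is identified by finding its best-matching unit on the grid. This is an illustrative sketch only, not DAISY's actual implementation; the feature vectors, species labels, grid size, and learning schedule below are all made up for demonstration.

```python
import math
import random

random.seed(0)

DIM = 4    # length of each toy feature vector (stand-in for image features)
GRID = 3   # 3x3 map of prototype units

# Toy "training set": labeled feature vectors (invented data).
training = [
    ([0.9, 0.8, 0.1, 0.1], "species A"),
    ([0.8, 0.9, 0.2, 0.1], "species A"),
    ([0.1, 0.2, 0.9, 0.8], "species B"),
    ([0.2, 0.1, 0.8, 0.9], "species B"),
]

# Initialize the map's prototype vectors randomly.
units = [[[random.random() for _ in range(DIM)]
          for _ in range(GRID)] for _ in range(GRID)]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_matching_unit(vec):
    # The unit whose prototype vector is closest to the input.
    return min(((r, c) for r in range(GRID) for c in range(GRID)),
               key=lambda rc: dist(units[rc[0]][rc[1]], vec))

# Train: pull each winning unit and its grid neighbors toward the sample.
for epoch in range(60):
    lr = 0.5 * (1 - epoch / 60)  # decaying learning rate
    for vec, _ in training:
        br, bc = best_matching_unit(vec)
        for r in range(GRID):
            for c in range(GRID):
                if (r, c) == (br, bc):
                    influence = 1.0
                elif abs(r - br) + abs(c - bc) == 1:
                    influence = 0.5  # immediate grid neighbors
                else:
                    influence = 0.0
                w = units[r][c]
                for i in range(DIM):
                    w[i] += lr * influence * (vec[i] - w[i])

def identify(vec):
    # Classify by the label of the training sample nearest the winning unit.
    br, bc = best_matching_unit(vec)
    return min(training, key=lambda s: dist(s[0], units[br][bc]))[1]

print(identify([0.85, 0.85, 0.15, 0.1]))  # a vector near the first cluster
```

Once trained, identification is just a nearest-prototype lookup, which is why a system like this can label new specimens automatically after a one-time training pass.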
## Taking a gamble

O'Neill has repeatedly failed to secure grants for DAISY from research councils or other science agencies, even after the encouraging test results of the past few years. As a result, from February 2000, when the initial Darwin Initiative grant awarded to Gauld ended, until the recently announced NHM commitment, DAISY was completely unfunded. “I think in the U.K., and possibly in the U.S. as well, research grants are often awarded to well-known scientists and institutions without assessing the potential of the field as a whole,” O'Neill says. In order to maintain excellence, he contends, it is sometimes necessary to gamble on relative unknowns and people whose ideas are unorthodox.

It seems NHM is willing to take that gamble; the museum recently announced that it will back DAISY for the next 3 to 5 years. The numbers are still being crunched, but the financial support is expected to be on the order of hundreds of thousands of dollars. And NHM's involvement in DAISY doesn't end there: It will feed its world-renowned taxonomic records into the system. Currently, about 7000 of the museum's 70 million records have been uploaded to DAISY. “We are in the process of formulating a strategy for ramping up the amount of resources we put into the digitization of the NHM collections,” says MacLeod. By 2012, through his own personal effort, MacLeod expects to double, or maybe even triple, the number of images currently uploaded to DAISY. “If I'm able to attract others to work with me, then that figure will be even larger,” he says.

For comparison, another insect-identification technique, DNA bar-coding, which uses a short DNA sequence from a standard position in the mitochondrial genome to automatically identify species, currently has bar codes representing 100,000 insect species at its disposal.
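The core of identification by DNA bar-coding is a nearest-neighbor match of a query sequence against a reference library. The toy sketch below illustrates that matching step only; the sequences are invented stand-ins (real COI barcodes run about 650 base pairs), and real pipelines align sequences and apply distance thresholds before accepting a match.

```python
# Toy reference "barcode library": species names paired with invented,
# equal-length stand-ins for mitochondrial COI sequences.
library = {
    "Xylophanes anubus":      "ACCTGGATTAGCTGGAAC",
    "Xylophanes crotonis":    "ACCTGGTTTAGCAGGTAC",
    "Culicoides impunctatus": "TGGAGCTTCAGTTGATTT",
}

def mismatches(a, b):
    # Hamming distance between equal-length sequences.
    return sum(x != y for x, y in zip(a, b))

def identify(query):
    # The reference barcode with the fewest mismatches wins.
    return min(library, key=lambda sp: mismatches(library[sp], query))

print(identify("ACCTGGATTAGCTGGTAC"))  # one base off the first reference
```

The contrast with DAISY is visible even at this scale: the barcode approach needs only a sequence, not a morphological training set, but both methods can identify only species already present in their reference libraries.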
When the infrastructure is in place for DAISY, it will complement DNA bar-coding and help to address the immense task facing biologists of understanding the diversity of life on the planet. “Our current knowledge of species diversity is shockingly inadequate for the critical tasks that confront biodiversity science—the exploration of patterns in nature, the management of natural resources, and the conservation of life,” says biologist Paul Hebert of the University of Guelph in Canada, who came up with the idea of DNA bar-coding.

Hebert agrees that DAISY can also address that problem, though he notes that it has limitations. “I think that DAISY will have an important role to play in the identification of taxonomic groups,” he says. “Its most serious constraint relates to its reliance on current taxonomic knowledge to provide the training sets required for species identification; DAISY is designed to deliver identifications, not to probe taxonomic terra incognita, like DNA bar-coding promises.”

Hanson acknowledges that DNA bar-coding is powerful but feels that DAISY has been unfairly overlooked as a result. “It is distressing to what extent the latest fashions can dominate funding in science,” he says. He can see the advantages of both techniques. DNA bar-coding is more widely applicable than DAISY, he says, as it can be used on specimens with a simple morphology, such as nematodes or microbes, or in the event that you have only a small piece of an insect. DAISY, on the other hand, is a more straightforward, faster, and cheaper technique (although DNA bar-coding costs have decreased and are now estimated at about $1 per specimen). However, DAISY's unique selling point is its potential to be used by the general public to aid biodiversity studies.

Indeed, O'Neill has grand plans to make DAISY readily available for “citizen science.” He is currently adding a Web-based platform for DAISY, which will make it possible for members of the public to identify insects using an Internet-enabled mobile phone. He's also experimenting with future commercial applications of DAISY, such as identifying the model numbers of consumer products and distinguishing among species of protists called foraminifera, which could be used to find new deposits of oil.

Motivated by the NHM funding, O'Neill's progress has increased tremendously, and he expects to have a pilot version of the Web-based DAISY, hosted on the museum's Web site, running by the end of the summer. He hopes that it will be used by professional and hobby entomologists, as well as by anyone with a passing curiosity about what type of butterfly they're looking at on a country walk. “This will be DAISY's crowning glory,” says O'Neill, “giving the public a powerful tool to understand the world around them.”

13. Paleoclimate

# Could East Antarctica Be Headed for Big Melt?

1. Douglas Fox*

New research suggests that the world's largest ice sheet may be more vulnerable than once thought to rising CO2 levels and temperatures.

The Orangeburg Scarp, a band of hard, crusty sediment teeming with tiny plankton fossils, runs from Florida to Virginia under tobacco fields, parking lots, shopping centers, and Interstate 95, the major highway along the U.S. East Coast. It marks an ancient shoreline where waves eroded bedrock 3 million years ago. That period, the middle Pliocene, saw carbon dioxide (CO2) levels and temperatures that many scientists say could recur by 2100. The question is: Could those conditions also result in Pliocene-epoch sea levels within the next 10 to 20 centuries, sea levels that may have been as much as 35 meters higher than they are today? The answer, say climate scientists, may lie 17,000 kilometers away in East Antarctica.

The East Antarctic Ice Sheet is the world's largest, a formation up to 4 kilometers thick and 11 million km2 in area that covers three-quarters of the southernmost continent. Its glaciers were thought to sit mostly above sea level, protecting them from the type of ocean-induced losses that are affecting the West Antarctic Ice Sheet. But studies of ancient sea levels that focus on the Orangeburg Scarp and other sites challenge that long-held assumption. Not everybody believes the records from Orangeburg. But combined with several other new lines of evidence, they support the idea that parts of East Antarctica could indeed be more prone to melting than expected.

“That's pretty incredible when you think about it,” says Maureen Raymo, a marine geologist at Boston University who studies Pliocene records. “It implies that the East Antarctic Ice Sheet is not quite as stable as we think it is.”

Three studies, using different remote-sensing methods, show that East Antarctica has already begun to lose ice. A survey of laser altimetry data from the ICESat satellite, published in Nature in October 2009, found ice thinning in several spots along the East Antarctic coast at annual rates of nearly 2 meters. Another study, published in Nature Geoscience in November 2009, used the gravity-sensing GRACE satellites and found two areas along the East Antarctic coast each losing about 13 km3 of ice per year. A 2008 study in Nature Geoscience that compared ice flux off the edges of the continent with new accumulation of snow in the interior found a loss of about 10 km3 of ice per year at two areas.

These amounts pale in comparison to the 150 km3 or so of ice vanishing each year from West Antarctica. But all three studies point to a handful of hot spots—including, most strongly, Totten Glacier, which is losing up to 1.9 meters of thickness per year. If East Antarctica were to start losing weight, Totten is exactly where it should happen, researchers say, because of its distinctive subglacial landscape. Seeing Totten at risk marks a sea change in the way that people think of East Antarctica, whose topography is the least-known landscape on Earth.
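To put these volumes in perspective, a back-of-envelope calculation converts an annual ice loss into millimeters of global sea-level rise. The sketch below is not from the article; it assumes standard reference values for ice density and ocean surface area and the simple "bathtub" picture of meltwater spreading evenly over the ocean.

```python
# Back-of-envelope conversion of ice loss to global sea-level rise.
# Constants are standard approximations, not figures from the article.
ICE_DENSITY = 917.0        # kg/m^3, glacial ice
WATER_DENSITY = 1000.0     # kg/m^3, meltwater
OCEAN_AREA = 3.61e14       # m^2, global ocean surface area

def ice_loss_to_sea_level_mm(ice_km3_per_year):
    """Convert an annual ice-volume loss (km^3/yr) to mm/yr of sea-level
    rise, assuming meltwater spreads evenly over the ocean (bathtub model)."""
    mass_kg = ice_km3_per_year * 1e9 * ICE_DENSITY   # km^3 -> m^3 -> kg
    water_m3 = mass_kg / WATER_DENSITY               # meltwater volume
    return water_m3 / OCEAN_AREA * 1000.0            # meters -> millimeters

# West Antarctica's ~150 km^3/yr works out to roughly 0.4 mm/yr of rise;
# an East Antarctic hot spot losing ~13 km^3/yr contributes only ~0.03 mm/yr.
print(round(ice_loss_to_sea_level_mm(150), 2))   # ~0.38
print(round(ice_loss_to_sea_level_mm(13), 3))    # ~0.033
```

The small per-year numbers are the point: the concern about East Antarctica is not its current contribution but the much larger volume of ice that deep marine basins could eventually release.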

Donald Blankenship, a glaciologist at the University of Texas, Austin, has spent the past 2 decades compiling what data there are. He now oversees parts of a multinational survey, called ICECAP, which is using ice-penetrating radar and other sensors flown on aircraft to map the subglacial topography. “Most of these basins weren't particularly well surveyed; some weren't surveyed at all,” Blankenship says.

The surveys now in progress, he says, show these basins to be “good and deep.” And that's bad news. One survey, published last year in Tectonophysics by Fausto Ferraccioli of the British Antarctic Survey, finds the bed of East Antarctica's Wilkes Basin far lower than thought, with two long troughs plunging as far as 1400 meters below sea level. A soon-to-be-published interpolation of older radar data, using undulations in the ice surface to refine the bed topography, suggests that the much larger Aurora Basin, which includes Totten Glacier, is connected to the ocean by troughs that sit 500 to 1000 meters below sea level. These troughs deepen as they head inland, to as low as 2000 meters below sea level in one spot.

The new data are worrisome, says Jason Roberts of the Australian Antarctic Division (AAD) in Hobart, Tasmania. “That's the classic scenario where if you do get melting, you get a deepening grounding line of the ice sheet, which makes it more susceptible to further retreat,” he says. “It's much more potentially vulnerable to climate change than we would have thought.”

Events in West Antarctica provide a glimpse into what happens when even small amounts of deep, warm water reach glaciers that sit in deepening beds. Pine Island, Smith, Thwaites, and a handful of other glaciers along the coast of West Antarctica's Amundsen Sea are collectively hemorrhaging 100 km3 of ice annually.

It's too early to know what the ice loss in East Antarctica really means, says Isabella Velicogna, a remote-sensing specialist at NASA's Jet Propulsion Laboratory in Pasadena, California. “What is important is to see what's generating the mass loss,” she says. Reductions in snowfall, for example, might reflect short-term weather cycles that could reverse at any time. But thinning caused by accelerating glaciers—as seen in West Antarctica—would warrant concern.

Finding out whether Totten is accelerating will be difficult, says Neal Young, an AAD glaciologist who has studied the region for 40 years. Totten sits beyond helicopter range from Australia's Casey Station, and Young says “huge crevasses that would swallow up big buildings” make it extremely tricky to land field parties by airplane.

Young was part of an Australian team that rode Caterpillar tractors 200 kilometers to Totten in 1987—the first of only two human visits to the glacier's lower reaches—and measured Totten's velocity in nine places. Heavy snowfall and rapidly shifting surface features have prevented satellites from remeasuring velocity precisely enough to confirm whether Totten has sped up. And thick sea ice prevents icebreakers from getting near enough to look for warm ocean currents that could erode the glacier underneath.

Even so, the patterns of thinning reported last year suggest that it may have accelerated near the ice fronts of Totten and nearby Vanderford Glacier, causing them to stretch. Both glaciers have thinned most strongly at their ocean margins, where ice is sliding fastest. Thinning is substantially less in adjacent, slow-moving areas. “The answer is in the ocean and the dynamics of the glacier,” Young says. “I think the floating ice tongue on Totten has thinned, leading to [increases in speed] that are propagating inland.”

All of this suggests that CO2 levels expected by 2030 and temperatures predicted by midrange models for 2100 could eventually—over a millennium or longer—raise global sea level by 25 to 30 meters. That's well above the 10 to 15 meters predicted from the melting of Greenland and West Antarctica, and a magnitude that would displace hundreds of millions more people.
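The same bathtub arithmetic, run in reverse, shows what a 25- to 30-meter rise implies about how much ice must melt. Again, the constants below are standard approximations, not values from the article.

```python
# Rough check: how much ice must melt to raise the oceans by a given amount?
# Constants are standard approximations, not figures from the article.
OCEAN_AREA = 3.61e14    # m^2, global ocean surface area
ICE_DENSITY = 917.0     # kg/m^3, glacial ice
WATER_DENSITY = 1000.0  # kg/m^3, meltwater

def ice_volume_for_rise_km3(rise_m):
    """Ice volume (km^3) whose meltwater would raise a bathtub ocean by rise_m."""
    water_m3 = rise_m * OCEAN_AREA                 # meltwater volume needed
    ice_m3 = water_m3 * WATER_DENSITY / ICE_DENSITY  # denser water -> more ice
    return ice_m3 / 1e9                            # m^3 -> km^3

# A 25-meter rise corresponds to roughly 10 million km^3 of ice, far beyond
# what Greenland and West Antarctica hold, so most of it would have to
# come from East Antarctica.
print(f"{ice_volume_for_rise_km3(25):.2e}")   # ~9.8e+06 km^3
```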

Plenty of uncertainties remain, though. The best ice-sheet models don't predict East Antarctica losing anywhere near that much ice. But refined topographic maps expected from ICECAP in the next 18 months could change this. “If there are really deep, [narrow] basins around the edges of the continent that the model doesn't know about, then we could be underestimating the potential for retreat,” says Robert DeConto, an ice-sheet modeler at the University of Massachusetts, Amherst, who published a model in March 2009 in Nature that successfully reproduces 5 million years of ice fluctuations in West Antarctica.

The biggest question is whether Pliocene sea levels really ever reached the heights indicated by studies of the Orangeburg Scarp. “[Pliocene sea level] is a poorly constrained number,” Raymo says. “The error bars could be anywhere from 10 to 40 meters above present levels.”

Timothy Naish of Victoria University of Wellington, New Zealand, hopes that a technique called backstripping will reduce these errors. Rock cores drilled from ancient coastlines show a sequence of erosion surfaces where rising seas periodically chewed away at sediments. Stacked atop one another in time, such a sequence of water lines allows researchers to better correct for tectonic rise or fall, which can skew sea-level estimates.

Preliminary results of cores drilled from the Chesapeake Bay in the United States and Wanganui Basin in New Zealand suggest lower sea levels than Orangeburg would imply. “Twenty-five meters is an absolute upper number,” says Naish, who works with Kenneth Miller of Rutgers University in Piscataway, New Jersey. “We think the number is nearer to 15 to 20 meters.” That estimate still implies some melting in East Antarctica, but less than the Orangeburg record would suggest.

Another confounding factor is that ice-sheet melting produces uneven rises in sea level. As an ice sheet shrinks, its gravitational pull on nearby ocean water weakens, and the meltwater is redistributed to reflect the resulting changes in Earth's gravity field. “Sea level is a complex stew,” says geophysicist Jerry Mitrovica of Harvard University, who is collaborating with DeConto and Raymo. “The bathtub model is only going to get you partway there.” Just 5000 years ago, as Ice Age glaciers were still melting, for example, sea levels in New Jersey were 9 meters lower than at present—whereas those in Argentina were 5 meters higher.

Differences this large could distort interpretation of the world's handful of Pliocene sea-level records. Raymo and collaborators hope that Mitrovica and DeConto can reconcile ice-sheet masses with sea levels around the world by plugging more numbers into their models. Paul Hearty, a collaborator at the University of North Carolina, Wilmington, is looking for sea-level records in Australia and the Azores. Raymo is constructing a Web-based wiki that she hopes will attract sea-level records from around the globe.

Others are directly investigating the history of East Antarctica's low-lying basins for indications of what happened to their ice during the Pliocene. Early this year, a team with the Ocean Drilling Program extracted seven cores from the sea floor in front of Wilkes Basin. Investigators are picking those layers apart this summer in search of clues about how well the ice in Wilkes Basin held together during the Pliocene.

Together, these efforts will begin to address the bigger question of what the loss of ice in East Antarctica, if it's occurring, actually means for the world's inhabitants. “Nobody expected it,” says Velicogna. “And if we just think that nothing is going to happen there, we're making a mistake.”

• * Douglas Fox is a freelance writer based in San Francisco, California.