News this Week

Science  18 Nov 2011:
Vol. 334, Issue 6058, pp. 880


  1. Around the World

    1 - Baikonur Cosmodrome, Kazakhstan
    Phobos-Grunt Still Earth-Bound
    2 - Menlo Park, California
    Geron Bails Out of Stem Cells
    3 - Berlin
    Germany to Look for New Nuclear Waste Site
    4 - Ankara
    Turkish Academy Members Resign in Protest
    5 - Washington, DC
    First 2012 Spending Bill Backs Science Programs

    Baikonur Cosmodrome, Kazakhstan

    Phobos-Grunt Still Earth-Bound

    In limbo.

    Phobos-Grunt, prelaunch.


    Russia's first solar system exploration mission since 1996 remains stranded in Earth orbit this week after the engines that should propel it to Mars failed to fire on 9 November. After a perfect launch, the Phobos-Grunt probe—intended to return samples from Mars's moon, Phobos—has remained stubbornly silent.

    Vladimir Popovkin, head of the Russian Federal Space Agency, told reporters Monday that the chances of salvaging the mission were low. Ground controllers don't know what went wrong with the craft because they haven't reestablished contact with the onboard flight control computer. But Popovkin said the craft is maintaining its orientation relative to the sun, so its batteries should remain charged. He added that attempts to communicate with the spacecraft would continue for some weeks. “I still have some hope. We still have a few days for reprogramming before the end of the Mars accessibility window for 2011,” says Lev Zelenyi, director of the Institute of Space Research (IKI) in Moscow, which developed the mission.

    Attention is now turning to the possibility of an uncontrolled reentry between the end of November and mid-January. Phobos-Grunt carries 10 tons of toxic fuel, but this is expected to burn up on reentry. Further updates will be posted on ScienceInsider.

    Menlo Park, California

    Geron Bails Out of Stem Cells

    Geron, the company that helped pioneer human embryonic stem (hES) cell research, said this week that it is stopping its first-in-the-world clinical trial and pulling out of further stem cell work. The Menlo Park, California-based company will instead concentrate on its anti-cancer therapies. Geron helped to fund the work of James Thomson at the University of Wisconsin, who in 1998 was the first to isolate hES cells (Science, 6 November 1998, p. 1145). The trial, launched last year, was designed to treat eight patients with spinal cord injury using neuronal cells derived from hES cells. Four people have been treated so far, and CEO John Scarlett, who joined Geron 2 months ago, says there have been no adverse effects. The company will continue following the patients, but it will not enroll any new participants.

    Scarlett said the decision enables Geron to continue operating without raising new money for the next year and a half, when it expects results from a half-dozen phase II trials of two cancer drugs. As part of the stem cell downsizing, the company will cut 66 full-time positions, 38% of its workforce.


    Berlin

    Germany to Look for New Nuclear Waste Site


    For more than 3 decades, Germany has been storing its nuclear waste at a site called Gorleben in Lower Saxony, near the former border between East and West Germany. Germany has spent an estimated €1.6 billion building and testing the Gorleben facility, but the choice has long been controversial. At a meeting 11 November between the federal environment minister and officials from the 16 German Länder (states), leaders agreed to work toward drafting a law to govern a new search. A task force will begin work this month and should have a draft law ready by next summer.

    The search will consider sites all over the country, including Gorleben, said environment minister Norbert Röttgen at a press conference. There will be “no taboos.” Leaders in the southern states of Baden-Württemberg and Bavaria, which host a majority of the country's nuclear power plants, had long argued against a new site selection process, in part because granite and clay deposits there are likely to be on the list of possible candidates. But officials in both states now say they are open to a new search.


    Ankara

    Turkish Academy Members Resign in Protest

    Members of the Turkish Academy of Sciences (TÜBA) are making good on their threat to resign in protest of what they see as government intrusion on the autonomy of the organization. At least 60 members—more than one-third of the total—have said they will cancel their membership.

    The government's initial plan, announced 27 August, would have expanded TÜBA's membership from its current 140 members to 300: 100 appointed by Turkish Prime Minister Recep Erdoğan, 100 appointed by the government-run Council of Higher Education, and the rest elected by sitting members. That announcement garnered letters of opposition from numerous scientific academies and societies.

    On 4 November, the government made what appeared to be a concession, stating that it would appoint a small committee to select the 100 academy members. But this is just “a cosmetic change,” says Erol Gelenbe, a computer scientist at Imperial College London and one of the resigning TÜBA members. Those who have resigned plan to form their own organization, called Science Academy Society, which will retain TÜBA's infrastructure and will be open to all its members.

    Washington, DC

    First 2012 Spending Bill Backs Science Programs

    The first 2012 budget bill contains surprisingly good news for the U.S. scientific community. At press time, Congress was expected to approve legislation that gives the National Science Foundation (NSF) a 2.5% increase, preserves funding for NASA's James Webb Space Telescope, and provides enough money for the National Oceanic and Atmospheric Administration to continue its polar-orbiting environmental satellites program.

    The spending bill is linked to an extension of current spending levels for all federal agencies, a necessary step to avoid a government shutdown. But for the science agencies included in this $182 billion slice of the overall U.S. budget, the bipartisan support for basic research was very gratifying. NSF's increase, for example, comes after the GOP-led House of Representatives approved a flat budget and the Democratic Senate applied a 2.5% cut. But rather than splitting the difference, the conferees from each body added $173 million to NSF's pot. The rising cost of the $8.7 billion Webb telescope will come out of other parts of NASA's $17.8 billion budget (some $650 million less than this year's), including some science programs. And essentially all of NOAA's $306 million increase was given to the Joint Polar Satellite System, a two-satellite system scheduled to have its first launch in 2016.

  2. Random Sample

    They Said It

    “Even though Rick Perry's life was not being threatened, his brain was responding as if there was a lion in the audience about to pounce on him.”

    —David Diamond, a behavioral neuroscientist at the University of South Florida in Tampa, explaining to The Washington Post how the Republican presidential candidate struggled during a debate on 9 November to remember the name of a department he wanted to eliminate: the Department of Energy.

    Airlifting Rhinos to a New Home


    It was a daring but photogenic plan: to help save the species, 19 southern black rhinos were wafted by helicopter out of their habitat in South Africa's Eastern Cape last week. The airlift was part of the World Wildlife Fund's (WWF's) Black Rhino Range Expansion Project, which works with landowners to add habitat—particularly important for black rhinos, which are less social than white rhinos and require more space.

    Many rhino species are in trouble: The western black rhinoceros was declared officially extinct in the wild by the International Union for Conservation of Nature on 10 November. But the southern black rhinoceros is still hanging on, with 4880 animals in the wild, including 1915 in South Africa.

    Since 2003, the WWF project has transported nearly 120 animals. Because these 19 rhinos lived in a region without roads, helicopters airlifted the sedated animals out by the ankles on a 10-minute trip across the treetops to waiting trucks, which then carried them 1500 kilometers to their new home in Limpopo Province.

    By the Numbers

    $850,000 — Estimated price of a meteorite, found by a Missouri farmer in 2006, now identified as a rare pallasite.

    90 million — Number of children who contract seasonal influenza annually around the world, according to a study published online 10 November in The Lancet.

    First Paper Submitted From Space

    Floating text.

    The submitted manuscript aboard the ISS (top). Cosmonaut Sergey Volkov (bottom, at right).


    It started as a joke. Michael Schreiber, editor-in-chief of EPL, wanted to dispel the notion that his journal was only for European science. During a video interview last May to commemorate EPL's 25th anniversary, Schreiber noted that the journal takes papers from India, China, and Brazil. “If a scientist would submit an excellent paper from the ISS, the International Space Station, I'm sure we would accept it too,” he added.

    “It was said, more or less, in fun,” Schreiber says now. But in the back of his mind, he adds, he was thinking about a talk he'd just heard by plasma physicist Hubertus Thomas at the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, who presented data collected from the ISS. Schreiber wrote to Thomas, sending him a link to the video interview and posing a question: Would Thomas and his team of German and Russian scientists consider submitting their research to EPL—but from the ISS? After obtaining permission from the Russian government, the team agreed.

    On 27 October, Russian cosmonaut Sergey Alexandrovich Volkov submitted a manuscript (published online 11 November) about complex plasmas in microgravity conditions from the ISS; his address on the paper is listed as “International Space Station (present address).” That Volkov, who performed many of the plasma experiments, happened to still be on the ISS was a lucky break for the journal, Schreiber says: The cosmonaut was originally scheduled to return to Earth 2 months ago but troubles with the Soyuz rocket delayed the launch that would have brought him home. Having submitted the first paper from space, Volkov will soon be on his way home: A Soyuz rocket bound for the space station launched successfully 14 November.

  3. Newsmakers

    Disgraced Dutch Psychologist Returns Doctoral Degree


    After a devastating report accusing him of fraud in dozens of papers, Dutch social psychologist Diederik Stapel has given up his doctor's title. The University of Amsterdam, where Stapel worked from 1993 until 1999, issued a short statement 10 November saying that he had voluntarily relinquished the degree and returned his diploma.

    The panel investigating Stapel's misconduct had concluded that it was impossible to determine whether his 1997 thesis was based on fraud, in part because the data had been destroyed. But it recommended that the university investigate whether Stapel could be stripped of his title, “on the grounds of exceptional academically unworthy conduct.”

    DOE Science Boss Steps Down


    Steven Koonin is leaving his job as undersecretary for science in the Department of Energy (DOE) after what others say was an unhappy stint in a poorly defined position. Koonin was nominally responsible for scientific activities throughout DOE, but in practice had no control over budgets—not even for DOE's $4.8 billion Office of Science. “Here was a guy who had no budget authority, and that's a tough position,” says Michael Lubell, a lobbyist with the American Physical Society in Washington, D.C.

    Koonin acknowledges the issue but says that he has been effective since taking the post in May 2009. “In terms of what actually gets done, I think I influenced things quite a bit,” he says. For example, Koonin helped build bridges between researchers doing large-scale computer simulations in DOE's nuclear weapons programs and scientists in other parts of DOE who could turn such simulations to other problems. He also headed DOE's first Quadrennial Technology Review, released 27 September, which aims to explain and sharpen DOE's mission.

    Koonin's next stop is the Institute for Defense Analyses' Science and Technology Policy Institute in Washington, D.C. However, he hopes to have a post at a university lined up by the next academic year.

  4. Ecology

    Will Busting Dams Boost Salmon?

    1. Robert F. Service

    As dams fall on the Elwha River in Washington state and other rivers around the country, scientists are relishing rare opportunities to watch natural ecosystems restore themselves.

    No more.

    The Elwha Dam (above), built in 1913, is now largely gone. All that remains is the original power house and much of the sediment that was lodged behind the dam.


    ELWHA RIVER, WASHINGTON—“It's like an alien abduction,” says George Pess, a fisheries biologist with the National Oceanic and Atmospheric Administration (NOAA) Fisheries in Seattle, Washington, as he walks past a pair of colleagues preparing to snatch their quarry. One of those colleagues, Martin Liermann, wades calf deep in a side channel of the Elwha River in Washington's Olympic Peninsula with a giant battery pack strapped to his back. Holding a 2-meter-long yellow pole connected to the pack, he dunks the business end of it in the water and pulls a red trigger. An electric jolt stuns a handful of steelhead fingerlings, a jolt that Liermann avoids thanks to the rubber chest waders and neoprene boots he's wearing.

    Glines Canyon Dam on its way down between September and November 2011. Credit: National Park Service

    Liermann's partner Holly Coe scoops up a few fingerlings and plunks them into a white 5-gallon bucket. When they've gathered a couple of dozen or so, they shuffle over to what looks like a makeshift surgical unit strewn amid the rocks on the river's edge. “Have you puked these guys over here?” Liermann asks Sarah Morley, a fisheries biologist with NOAA Fisheries. “Just one,” Morley replies, as she plucks a fish out of the bucket, quickly measures and weighs it, and carefully wedges a syringe needle in the fish's mouth and pushes the plunger. Water shoots into the fish's stomach and forces the contents out onto a fine mesh sieve. “He had a full belly,” she says as she spots a small slug, a few mayflies, a couple of leeches, and some green bits she can't identify. Morley carefully spoons the contents into a sample jar for later lab analysis and returns the fish to the bucket, after which it will be returned to the Elwha, hungry but otherwise unharmed.

    There's good reason to be careful with the steelhead. They're one of three fish species in the Elwha listed as either threatened or endangered under the Endangered Species Act. (Chinook salmon and bull trout are the others.) Morley and Liermann's work on a bright day here in September is one of nearly a dozen related research projects designed to document the current state of the river. Just downstream, Pess and three colleagues sort through a small wedge of the streambed, measuring and weighing rocks and other pieces of sediment. Upstream, John McMillan, a NOAA Fisheries contractor in a black dry suit, blue mask, and snorkel, sloshes his way down the river, occasionally plunging in to count the number of adult salmon he sees.

    Morley's team also surveys aquatic invertebrates and algae to get a full sense of what types of food the salmon have available to them. Others collect tissue samples from the fish for genetic studies; track numbers of bears, otters, and other mammals in the area; sample isotopes of nitrogen and phosphorus nutrients in the water to determine the percentage that comes from the ocean, brought in by spawning salmon; and even cut out the ear bones of dead salmon that have already spawned to look for telltale signs of whether they came from hatcheries or were wild stock.

    This intense interest came to a head as engineers were set to begin tearing out two massive dams that have blocked the free flow of water and salmon on the river for nearly 100 years. The dams—Glines Canyon and the Elwha—are some 64 and 33 meters high, respectively, the largest dams ever removed in the United States. It will take as long as 3 years to tear them out completely. But bulldozers and excavators have already made quick progress. Water now flows around the carved-out right side of the lower Elwha Dam and over the top of four notches hacked into the face of Glines Canyon Dam. Today, the river is close to flowing freely for the first time in 98 years.

    The Elwha isn't alone. After a century of furious dam construction in the United States, the era of dam building has ebbed. Over the past 3 decades, increasing concerns about safety and the high cost of upkeep have pushed the number of removals up from an average of about 10 per year to six times that number today, with a total of nearly 1000 dams removed, according to numbers kept by American Rivers, a river advocacy group based in Washington, D.C. For much of this time, the focus has been on tearing out small, obsolete structures in the Midwest and East. But that pattern has begun to shift, says Martin Doyle, a river restoration expert at Duke University in Durham, North Carolina. “Bigger dams are starting to come out, out west,” Doyle says, in part as an effort to rebuild endangered salmon and steelhead runs.

    Even longtime proponents of dam building, such as officials at the U.S. Bureau of Reclamation, now support removals on a case-by-case basis. At a recent ceremony commemorating the demolition of the Elwha dams, Bureau Commissioner Michael Connor was enthusiastic. “This is not only an historic moment, but it's going to lead to historic moments elsewhere across the country,” Connor was quoted as saying in the Los Angeles Times.

    All these dam removals have created a unique opportunity for scientists to study how quickly rivers revert to their old ways once obstructions come down. “We're stoked,” says Mike McHenry, a habitat biologist with the Lower Elwha Klallam Tribe in Port Angeles, Washington, who has worked on river-restoration projects on the Elwha since 1989. “The hope here is we'll learn something we can apply in other places.”

    Tipping the scales

    There are plenty of possibilities. According to the U.S. Army Corps of Engineers, the United States has more than 80,000 dams 3 meters tall or taller, most of them privately operated. River blockages of all sizes likely top 2 million, according to another survey. Many of these aging structures have fallen into disrepair, are too expensive to fix, or have been made obsolete as nearby mills have closed and large coal and natural-gas power plants have come online. “The need for these dams just doesn't exist anymore,” Doyle says. “If you have a dam that's no longer generating revenue, you are basically sitting on a liability.”

    Privately held dams are licensed by the Federal Energy Regulatory Commission (FERC). As licenses run out, owners can apply for 35- to 50-year extensions. In most cases, however, the requirements for relicensing dams are now far more stringent than when the dams went in, meaning that complying with environmental and safety standards can cost more money than the dams generate. So it's often cheaper to tear out the dams than to bring them up to current standards. In recent years, Pennsylvania and Wisconsin have led the way in dam removals, each state tearing out hundreds.

    Balancing the usefulness of dams against their environmental costs has become particularly pressing in the Northwest, where communities are struggling to safeguard salmon populations that have been declining for decades. In the Columbia River Basin, which at 673,000 square kilometers stretches across most of the Northwestern U.S. states and into British Columbia, dams have blocked an estimated 55% of historic salmon spawning habitat, says Michelle McClure of NOAA Fisheries. Those dams, together with loss of habitat from development, as well as competition from hatchery-bred salmon, have dropped wild salmon runs to roughly 10% of their historic numbers; 13 salmon and steelhead runs in the basin are now listed as either endangered or threatened.

    To counter that decline, fisheries managers now mandate that most dams provide fish passage by building a parallel “ladder” or waterway. For dams that historically had no such passage, that is typically an expensive proposition; in many cases it has now tipped the scales in favor of dam removal. In 2008 and 2009, for example, that equation led Portland General Electric to rip out two power-producing dams on the Sandy River in suburban Portland. Dams have also come out along the Rogue River in southern Oregon, and the Wind and White Salmon rivers in southwest Washington.

    Unique opportunity

    Much the same calculation was at work on the Elwha. The Elwha Dam was installed in 1913, and Glines Canyon Dam 14 years later, to provide electricity for sawmills in nearby Port Angeles. By the 1990s, most of those mills had closed. The dams still provided power—but a total of only 19 megawatts, far below the 500 to 1000 megawatts of a typical gas- or coal-fired plant. When it came time for FERC relicensing, it was clear the costs of repairing the dams and adding fish passages didn't pencil out. Still, the company that owned the dams, James River Corp., didn't have the money to remove them.

    In 1992, the U.S. Congress stepped in and authorized money to tear out the dams. But congressional opponents, including former Washington Senator Slade Gorton, blocked dam-removal funding proposals for more than a decade. Among Gorton's concerns was that if dam removal succeeded on the Elwha, there would be a clamor to remove dams throughout the West—particularly those on the lower Snake River, which have long drawn the ire of salmon advocates but have been a boon to farmers looking to barge their goods to coastal ports. “There was concern that when one dam falls, they are all going to fall,” says Patrick Crain, a fisheries biologist with the National Park Service.

    Ultimately, however, the Elwha was just too good a restoration opportunity to pass up. The Elwha dams sit close to the river's mouth where it flows out into the Strait of Juan de Fuca, near where Puget Sound meets the Pacific Ocean. Upstream, 83% of the 112-kilometer-long river lies within the protected wilderness of Olympic National Park. Before the dams were built, an estimated 390,000 salmon from 10 different species returned to the river to spawn each year—so many that early white settlers along the river complained that the slapping of fish was so loud during spawning periods that they couldn't sleep. Now that the dams are coming out, the Elwha has become not just the nation's second largest ecological restoration site after the Florida Everglades, but a rare opportunity to study how natural processes return to normal and how ecosystems respond. “It's really a unique opportunity, because the habitat is in such good shape,” Pess says.

    That opportunity sent Pess and colleagues from several federal, state, and tribal organizations fishing for money to track the Elwha before and after the dams came down. Despite the removal project's $325 million price tag, little of it was dedicated to monitoring efforts. In the end, that was something of a benefit, Pess and others say, because it forced researchers from numerous organizations to build a broad network of collaborators to track the recovery. “We have very little information about how the ecosystem works. That's why so much effort is going into this project,” says Peter Kiffney, a biologist with NOAA Fisheries in Seattle.

    Near death.

    A female Chinook salmon is scarred and battered from her final journey up the Elwha.


    As they began to collect data on the Elwha a few years ago, it became clear that the river was dramatically altered from its free-flowing past. Not only are the fish counts drastically reduced, so too are the numbers of eagles, bears, and other animals that used to depend on the robust fish runs for survival. Over the decades, fine gravel in which salmon prefer to lay their eggs has largely washed out because it wasn't being replaced, leaving mostly rocks softball-sized and larger. The dams even blocked plant seeds from washing downstream, fragmenting some plant communities in the watershed.

    So how long will it take for the river to recover? “It will be a great experiment,” Pess says. Initially, the changes may not bode well for the salmon. The biggest concern, Pess says, is that 18 million cubic meters of silt and other sediment is trapped behind the dams, enough to fill 200,000 dump trucks; it will move downstream. Engineers are taking the dams down slowly to ensure that it doesn't all wash out at once. Some of the sediment will actually be trucked out. Botanists will then race to plant native species in the banks of remaining sediments, in an effort to stabilize them—a particular worry during spring high water flows.


    Dam removals continue to rise (above), particularly in the East (top). But teardowns are increasing in the West in hopes of saving salmon runs.


    High sediment levels can not only harm fish directly but can also suffocate eggs waiting to hatch. Such concerns prompted recovery officials to approve the construction of a new hatchery for steelhead and chum, coho, and pink salmon in the lower Elwha. But that hatchery itself has become highly controversial, as some researchers worry that hatchery-raised salmon will out-compete natives and further imperil wild fish runs. “I think the hatcheries [on the Elwha] are unneeded,” says Jack Stanford, an ecosystem scientist with the University of Montana's Flathead Lake Biological Station in Polson. Stanford notes that every year, highly productive salmon streams such as the Copper River in Alaska carry sediment loads equivalent to what the Elwha will now face after the dams come out. Stanford has plenty of allies. On dedication day for the demolition of the Elwha and Glines Canyon dams, opponents filed a notice of intent to file a lawsuit to block the new hatchery.

    Even with sediment problems and hatchery competition, giving fish access to new river territory can produce a quick reaction. “The response of fish can be pretty immediate,” Pess says. “Salmon are very opportunistic creatures. If given the opportunity to utilize a habitat, they will use it.”

    Pess points to one such example on the Fraser River in British Columbia, Canada. A landslide in 1913, triggered by railroad builders trying to blast a route for a track through Fraser Canyon, cut off 80% of 1400 kilometers of prime salmon-rearing habitat. About 1 million fish a year still returned to the lower river. Some 30 years later, fish ladders were built to reopen access to the upper river. In the 40 years after they went in, salmon populations in the river doubled. At the American Fisheries Society meeting in September in Seattle, Rory Saunders of NOAA Fisheries reported that after two dams were recently removed from the Sedgeunkedunk Stream in Maine, Atlantic salmon, sea lamprey, and alewife all showed improved runs within 2 to 3 years. “This was much faster than any of us expected,” Saunders says.

    Still, Kiffney cautions that although salmon and other fish can return quickly, the recovery of the broader ecosystem can be far slower. Since fish ladders were installed in 2003, Kiffney has tracked the rise in salmon spawning up the Cedar River near Seattle. “After the fish ladders went in, whoosh, salmon numbers started climbing,” Kiffney says. But the physical mass of all the salmon in the runs is still only 0.006 kilograms of fish per square meter of river—too low to make a broader impact. “In experiments we've done, we see ecosystem changes at 0.6 kg/m2. So we need a 100-fold increase,” Kiffney says.

    Some estimates suggest such a return to historical salmon runs could be in store for the Elwha, says William Robert Irvin, president of American Rivers. But Pess and others are more cautious. Self-sustaining populations of different salmon species are likely to take 10 to 30 years to become established, Pess says. “There will be recovery. But it won't be the same as it was historically.”

    As the fish runs begin to improve, the first to spot the changes could well be Kent Mayer, a fish biologist with the Washington Department of Fish and Wildlife. Last year, Mayer built a fish weir—essentially a temporary trap for large adult fish—2 kilometers below the Elwha Dam. The weir, which spans the full width of the river, is the largest such structure in the United States outside of Alaska. It enables Mayer and his colleagues to not only count fish moving upstream and downstream but also document their size and weight and take tiny tissue samples for genetic profiling before releasing them. Last month, on the day Pess, Morley, and their colleagues were weighing rocks and puking fingerlings, Mayer caught and released the first bright red sockeye salmon known to have spawned in the Elwha in modern memory—hopefully a harbinger that the long-extinct salmon run could soon return to the river. “There are days when I love my job,” Mayer says standing waist deep in the fish trap. “This is one.”

  5. Ecology

    Out of the Frying Pan?

    1. Robert F. Service

    A new climate-modeling effort called the Riverscape Analysis Project finds that climate change will affect rivers throughout the Pacific Rim.

    Despite the continuing rise in dam removals, not all long-term trends are good news for salmon. Climate change and ocean acidification, according to recent studies, threaten to wipe out possible gains and could push many salmon runs to extinction by the end of this century.

    A new climate-modeling effort called the Riverscape Analysis Project finds that climate change will affect rivers throughout the Pacific Rim. The effort, led by Jack Stanford and John Kimball at the University of Montana's Flathead Lake Biological Station in Polson, starts with a standard hydrologic model for how temperature and precipitation changes affect river temperatures and stream flows. The researchers incorporated information from satellite and space shuttle images on the physical characteristics of 1500 rivers, as well as the extent of existing salmon runs, and examined how expected air temperature changes given by scenarios from the Intergovernmental Panel on Climate Change would likely affect river temperatures, flows, and fish populations.

    Stanford says that when he and his colleagues input past climate data, their models could explain 70% of the observed variation in river flows and temperatures. Not perfect, but by a modeler's standards, “that's pretty good,” Stanford says. When they looked forward to the end of this century, they found that the model flagged dozens of rivers at serious risk of losing their salmon. The most vulnerable tend to be in California and the interior regions of the Columbia Basin in Oregon, Washington, and Idaho.

    The results are preliminary, Stanford emphasizes, as he and his team continue to refine their models. But the general picture is clear, Stanford says: “Salmon will experience a warming [of water temperatures] across their entire range. In some places, it will get so warm that salmon will be eliminated.”

    Good news, bad news.

    Dam removals are likely to increase salmon populations, but climate change and ocean acidification could do the opposite.


    David Purkey, a hydrologist at Stockholm Environment Institute (SEI) in Davis, California, calls the assessment “important” but notes that individual rivers will be affected not only by temperatures and stream flows but also by how humans manage water flows through dams and other diversions. Purkey, along with colleagues at SEI, the University of California, Davis, and the National Center for Atmospheric Research in Boulder, Colorado, recently ran a similar climate-change analysis on the Butte Creek Watershed in California. Their study took into account not only hydrology, CO2 emissions scenarios, and six different climate models but also two different scenarios describing how dam operators managed water flows throughout the year. In a business-as-usual scenario in which operators didn't change water flows during summer temperature spikes, spring Chinook salmon runs became extinct by about 2070. But when water was allowed to flow through the dams in the hot summer months when the fish are most vulnerable, the additional cold water in the rivers allowed at least small runs to survive the century in some, although not all, scenarios.


    Highly acidic ocean waters (red) periodically well up to shallow depths near the west coast of North America. As CO2 concentrations in seawater increase, ocean acidification is expected to spread.


    One potential bit of good news for the salmon, Stanford says, is that rivers along the North Slope of Alaska and other arctic regions that typically freeze for much of the year are likely to become new free-flowing spawning habitats as temperatures warm. The North Pacific, however, would provide a far smaller feeding ground for salmon than the region between the northwestern United States and Japan where the fish now spend the bulk of their lives.

    Ocean acidity at high latitudes may pose an even bigger threat. Fossil fuel burning emits carbon dioxide into the atmosphere; the gas is absorbed by the oceans and quickly converted to carbonic acid. This reaction releases hydrogen ions into the water, reducing its pH; in other words, making the water more acidic. Since preindustrial times, the overall pH of the oceans has declined by 0.1 pH unit. That may not sound like much, but the pH scale, like the Richter scale for earthquakes, is logarithmic: Over this period, seawater acidity has increased by 30%.

    According to emission scenarios from the Intergovernmental Panel on Climate Change, pH values will drop an additional 0.2 to 0.3 units by 2100, potentially doubling current acidity levels. Added hydrogen ions reduce seawater concentrations of carbonate ions, which oysters and other organisms use to build their shells. Lab and field studies suggest the drop in carbonate ions is already affecting oyster larvae and other organisms. The trend is particularly bad news for juvenile salmon: One of their primary food sources in the Pacific is pea-sized snails called pteropods.
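The arithmetic behind these percentages follows directly from the logarithmic definition of pH. A minimal sketch of the calculation (the function name is ours, for illustration only):

```python
# pH is the negative base-10 logarithm of hydrogen-ion concentration,
# so a pH drop of d units multiplies [H+] by 10**d.
def acidity_factor(ph_drop):
    """Multiplicative increase in hydrogen-ion concentration for a given pH drop."""
    return 10 ** ph_drop

# The ~0.1-unit drop since preindustrial times:
# 10**0.1 ≈ 1.26, i.e., about a 26% increase in acidity; the widely
# quoted ~30% figure corresponds to a slightly larger drop of ~0.11 units.
print(round((acidity_factor(0.1) - 1) * 100))

# A further 0.3-unit drop by 2100: 10**0.3 ≈ 2.0, i.e., roughly
# double today's acidity, consistent with the projection above.
print(round(acidity_factor(0.3), 2))
```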

    Even without added CO2 from human civilization, pteropods and other shelled critters in some coastal regions periodically experience high acidity levels. It happens when populations of algae, plankton, and other ocean organisms bloom and die, falling to the ocean floor. There they are gobbled up by bacteria that churn out carbon dioxide, making deep waters highly acidic. When surface winds push the top layer of water out to the open ocean, the deep corrosive waters from underneath surge upward, explains Richard Feely, a chemical oceanographer with the National Oceanic and Atmospheric Administration's Pacific Marine Environmental Laboratory in Seattle.

    At the American Fisheries Society meeting in Seattle in September, Feely reported that he and colleagues had just completed a cruise throughout the North Pacific and down the West Coast of the United States. According to the team's latest surveys, ocean acidification from human causes already accounts for between 29% and 49% of the corrosive waters present along the West Coast of the United States through the North Pacific. The human contribution to this high acidity could rise to as much as 80% by 2050, Feely estimates. In a more-acidic ocean, pteropods and other organisms probably won't be able to build their shells. So the ability of salmon to thrive in the ocean could hinge on their ability to find food sources different from those they've relied on over the course of their evolution.

  6. Evolutionary Biology

    Evolutionary Time Travel

    1. Elizabeth Pennisi

    With clever and challenging lab experiments, researchers are forcing species to become multicellular, develop new energy sources, and start having sex.

    Bigger, better.

    An experiment selecting for bigger yeast yielded tubes with progressively more settled yeast (above, left to right) and produced multicellular organisms in which dead cells (red) help the reproducing yeast fragment.


    In December 2009, evolutionary biologist Michael Travisano was debating with his future postdoc William Ratcliff what to do next in their lab at the University of Minnesota (UMN), Twin Cities. They had just seen a talk on slime molds that delved into what it meant to be a multicellular organism. Under certain conditions, some of these single-cell amoebas can coalesce into masses of millions of cells that act in a coordinated fashion, as if a whole organism.

    Inspired by this, Travisano and Ratcliff began musing about how one of evolution's apparently major leaps up the ladder, the jump to multicellularity, takes place. A typical evolutionary biologist might tackle this challenge by comparing fossils or genomes of related unicellular and multicellular species, but the duo had a more daring idea. They decided to try to force the evolution of yeast, normally a single-celled creature, into a multicellular one. “I wouldn't have wagered a large sum of money that this would have worked,” Ratcliff recalls. “But if it [did] work, it would be the coolest thing we could think of.”

    Researchers have long deliberately bred animals, plants, and microbial species for specific purposes—leaner meat, drought-resistant plants, chemical-producing bacteria, and so forth—but what Ratcliff and Travisano wanted to do was probe how evolution itself happens, by forcing it to occur, under controlled conditions, as scientists watch. Despite Ratcliff's reservations, they succeeded in producing multicellularity, at least in a limited form, in just 60 days.

    This type of research, known as experimental evolution, has existed almost since Darwin put forth his theories. The approach has risen in popularity over the past few decades, in large part thanks to the pioneering work of Richard Lenski. An evolutionary biologist at Michigan State University in East Lansing, Lenski has for more than two decades conducted an ongoing study in which 12 populations of the bacterium Escherichia coli live, and evolve, in flasks with limited supplies of glucose for energy. In an extraordinarily long-term effort, Lenski and his lab members have followed more than 50,000 generations of E. coli and in so doing gleaned insights into the pace and reproducibility of microbial evolution (Science, 25 June 1999, p. 2108). Lenski's work “really highlighted the power” of experimental evolution to other biologists, says Nick Colegrave, an evolutionary biologist at the University of Edinburgh in the United Kingdom.

    At this summer's 13th Congress of the European Society for Evolutionary Biology in Tübingen, Germany, it was clear that the field of experimental evolution has itself evolved. Biologists today conduct controlled evolution studies with everything from viruses to fish. And as the multicellularity experiment conducted by Travisano and Ratcliff indicates, many are trying to address complex questions such as how evolution fashions major changes in a creature's lifestyle. One team at the Germany meeting reported tackling how sex evolves, for example, whereas another examined how an alga deals with losing access to light, its main source of energy. “Evolution is expanding from a strictly comparative and observational science to an experimental one,” says Graham Bell, an evolutionary biologist at McGill University in Montreal, Canada.

    Even the field's pioneer admired the ambitious work presented in Tübingen. “As the field grows, people are thinking about more and more specific hypotheses and complex scenarios,” Lenski says.

    Dark science

    For Bell, the complex scenario was to try to get a plant to grow in the dark. The cells of plants, which survive by photosynthesis, are structured around harnessing light and converting it to chemical energy. What if there were no light? The plant might have to shift from depending on its photosynthetic machinery to another source of energy, perhaps the mitochondria that power most nonphotosynthetic eukaryotes. A few parasitic plants and at least one protist have made such a switch, and Bell wanted to see if he could drive that change in the lab. It “constitutes a new way for life” for a plant, he explains.

    Bell couldn't test a typical plant—none grows and reproduces fast enough to make such an experiment in evolution feasible. So he turned to a photosynthetic microbe belonging to the plant kingdom, the single-celled green alga called Chlamydomonas, often studied as a model system in cell and molecular biology. Bell knew this organism already had some ability to feed off acetate in a pinch and wondered if it could build on that to thrive in the dark. After all, he points out, “you can't evolve something from nothing.”

    “It was a big challenge,” Lenski says. But “experimental evolution offers a way to see the relevant processes in action.”

    Bell set up 2880 cultures of Chlamydomonas on acetate-laden media and left them in a corner of his lab in constant darkness. Every other month he transferred 5% of the algae-media mix in each culture into a new dish of media. Unless the alga was increasing its population 20-fold monthly, a sign of strong growth, it would eventually be diluted out of existence by the periodic transfers. About 90% of the cultures stopped growing within a year, but a few hundred algal lines kept pace, he reported at the meeting. After about 12 months, he started transferring these algae each month, then every other week, and finally weekly, such that only the fastest growers would survive. “You have to be fairly patient,” Bell says.
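The 20-fold threshold in this serial-transfer design follows from simple dilution arithmetic. A minimal sketch with illustrative numbers (the variable names are ours, not from Bell's study):

```python
# Transferring 5% of a culture each interval imposes a 20-fold dilution,
# so only lineages growing at least 20-fold per interval can persist.
transfer_fraction = 0.05
required_growth = 1 / transfer_fraction
print(required_growth)  # 20.0

# A lineage growing only 10-fold per interval is halved each cycle
# (10 * 0.05 = 0.5) and dwindles toward extinction:
population = 1.0
for _ in range(5):
    population *= 10 * transfer_fraction  # grow 10-fold, then keep 5%
print(population)  # 0.03125
```

This is why the regime selects so sharply: any growth rate below the dilution factor compounds into exponential decline, no matter how healthy the culture looks at any single transfer.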

    Five years into the experiment, 241 lines of algal “survivors” are thriving in darkness that would be lethal to their ancestors. The lines vary markedly in appearance. Some form circular clusters; others are ragged. Some are green, whereas others are yellow or white. Most can still grow in the light, but a few can't and rely solely on acetate, Bell said: “You have a whole range of solutions to growing in the dark.”

    Dark days.

    Very few algae survived growing in the dark (right), but those that did evolved a variety of colors and shapes (above).


    Bell and his colleagues are now looking at whether these transformed algae can mate with their original lines, and whether traits beyond morphology and metabolism have changed over time. He has a list of genes involved with acetate processing that he will check for enabling mutations. At some point, Bell says he might put the acetate users back into the light to see if they can come to depend on photosynthesis again. The genes needed for photosynthesis have likely degraded, and he wants to know whether the algae can fix the broken genes or whether other genes will be brought into play. With this work, we “gain insights that would otherwise be impossible,” Lenski says.

    Showing sex is good

    Aneil Agrawal, an evolutionary biologist at the University of Toronto in Canada, and his postdoctoral fellow Lutz Becks are also exploring how a species can go back and forth between certain lifestyles. They're addressing the puzzle of sex. The evolutionary quandary is that requiring two individuals, typically a male and a female, to generate offspring makes reproduction much less efficient than asexual cloning, wherein each individual reproduces. Thus one would expect that even if sexual reproduction evolves, it shouldn't persist should asexual individuals subsequently arise, unless the sexual mixing of genomes provided some significant advantage. Yet in most species, sex predominates.

    By the 1990s, researchers had put forth more than 20 theories to explain this puzzle. Some experimental work in yeast suggested sex was advantageous in changing environments. And a study in viruses, which swap DNA in a way similar to sex, indicated that sex thrives because it weeds harmful genetic mutations out of a population (Science, 28 November 1997, p. 1562). But for the most part, sex's purported evolutionary advantages have gone untested.

    Three years ago, Agrawal and Becks decided to look more deeply at the long-and short-term benefits of sex through a series of lab studies, some involving experimental evolution. Researchers had proposed that recombining genes through sex could lead in the short run to fitter offspring. In the long run, the many possible genetic combinations produced by sex mean that there will likely be more genetic variation in the population and greater ability to adapt rapidly—an advantage that could favor the maintenance of sex.

    The duo's studies, which center on rotifers, microscopic animals found in lakes and ponds, are showing that it is easy for sex to develop in certain populations but difficult for it to persist. So-called bdelloid rotifers are famously asexual, but other rotifer species, including Brachionus calyciflorus, the one Agrawal and Becks study, sometimes resort to sexual reproduction but only in crowded conditions. (The buildup of a rotifer-secreted chemical induces this sexual behavior in crowds.)

    Becks and Agrawal began by testing descendants of wild-caught B. calyciflorus rotifers for reproductive fitness, counting the number of eggs offspring produced in conditions that favored either sexual or asexual reproduction. The asexual populations (in uncrowded conditions) produced more than twice as many offspring as the crowded sexual populations, confirming a large fitness cost for sex, they reported in the March Journal of Evolutionary Biology. The sexual populations were also not more variable in their fitness, suggesting there was no long-term potential benefit to sex per se. “It showed the big problem we have explaining sex,” Becks says.

    Next, Becks and Agrawal used experimental evolution methods, tweaking the rotifers' diet to look at the effect of environment on the balance between sexual and asexual reproduction. They fed one batch of 10,000 rotifers nitrogen-rich algae, gave a similarly sized batch less-nutritious algae, and regularly exposed a third batch, broken into subgroups of about 5000 rotifers, to both. Each week, Becks transferred 1% or 10% of each subgroup from one kind of algal food source to the other, simulating migration between two environments. At the beginning of the experiment and after 6 and 12 weeks—45 and 90 generations—he tested the rotifers' propensity for sex by exposing a subset from each batch to the sex-stimulating chemical and looking at the eggs produced. (Asexually produced eggs appear solid under the microscope, whereas sexually produced ones seem partially void.) In this way, Becks determined what proportion of the rotifers were able to switch to sexual reproduction.

    Rotifers maintained with a consistent food source—whether of high or low quality—produced half as many sexual eggs as rotifers regularly switching between the two kinds of food, Becks and Agrawal reported online 13 October 2010 in Nature. And when they followed the rotifers a month longer, they found propensity for sex increased in the rotifers with a mix of foods but declined in the batches where rotifers experienced a constant food environment. In a constant environment, sexual reproduction eventually disappeared altogether, Becks later found.

    Becks and Agrawal wanted to see whether the pattern observed in the simulated migration—more sex in a more diverse environment—held true if the rotifers simply were confronted with a completely novel environment. Over the course of a week, Becks gradually added low-quality food to batches of B. calyciflorus rotifers that were used to feeding on high-quality food, transforming their diet. In the first few days of the experiment, as he was changing the food out, the rotifer population started to crash, reaching a low in 12 days. At that time, sexual reproduction began to increase and continued to do so for about 3 weeks, Becks reported at the evolution meeting. Then the trend reversed: The population kept growing but became increasingly dominated by asexually reproducing rotifers, a trend that continued through the final 10th week of the experiment. “We saw this increase [in sex], and then it went down again,” he said.

    They repeated the experiment for a period of a month and got the same results. In contrast, in control batches kept on a consistent diet, sexual reproduction waned from week one. According to Becks, this experiment and the one with the mix of two food environments suggest that new challenges favor sex but only until a way of coping with that challenge has developed. “In the end, sex was only beneficial during the time that they adapted to their environments,” concluded Becks, who is now at the Max Planck Institute for Evolutionary Biology in Plön, Germany. Once the right adaptations had evolved, sex was no longer favored.

    “What's remarkable is that they've developed a rapidly evolving empirical system that allows them to watch the evolution of sex in real time,” says Sarah Otto, an evolutionary biologist at the University of British Columbia, Vancouver, in Canada. “By tracking changes in the frequency of sex, the rotifer system promises to allow us to tease apart the mechanisms that promote the loss or maintenance of sex.” Lenski adds that “the work allowed them to challenge and support a classic model for the evolution of sex.”

    Yeast beasts

    Just as the evolution of sex represented a major transition for life, so did the leap to multicellularity. When Travisano and Ratcliff began to consider what experimental conditions would encourage the yeast Saccharomyces cerevisiae to go multicellular, they focused on size. One rather obvious hallmark of multicellular creatures is that they are bigger than unicellular organisms. On its own, largeness could offer many evolutionary advantages: a greater ability to access nutrients or avoid being eaten by small predators, for example.

    Still, finding the right selective force to make yeast go big was a challenge. Working with their UMN colleagues R. Ford Denison and Mark Borrello, Travisano and Ratcliff first tried exposing cultures of yeast to detergent, thinking that some might form a multicellular entity in which the outer cells would shield the inner ones from the detergent's destructive power. But nothing survived.

    Next, they turned to gravity as the selective force, letting tubes of yeast in solution, after being shaken to distribute the microbes evenly, sit undisturbed for 45 minutes. The researchers then transferred the bottom 1% of the tube's contents to new cultures. Bigger yeast, which would be more likely to settle out, should have a better chance of surviving these transfers. “It turns out that 45 minutes is pretty lenient,” Ratcliff says. Almost all the yeast settled out, so there was little selection for larger size.

    Sex mania.

    In typically asexual rotifers, the ratio of sexually derived eggs (darker) increases in novel environments but decreases after conditions stabilize.


    To speed up the process, they instead gently spun the tubes for 10 seconds in a centrifuge before transferring the bottom 1%. Two weeks later, ever-bigger pellets of yeast were settling to the bottom of the tubes in two of the 10 cultures, Travisano reported at the meeting. Spherical clusters of cells loosely resembling snowflakes eventually dominated all 10 setups. Tests showed that these weren't simply individual cells that managed to stick together, as happens when yeast in brewing beer aggregate. Instead, these clusters arose because dividing yeast cells had lost the ability to separate completely. The researchers ultimately discovered that the yeast in these snowflakes make much less of the enzyme that enables normal separation.

    The snowflakes also began reproducing like a multicellular organism. Individual cells in the snowflakes would divide but not detach, enlarging the snowflake. And once the snowflake reached a certain size, it would fragment, releasing a daughter snowflake. At first the fragments were of about equal size, but as more and more generations went by, a pattern developed: Snowflakes broke into a smaller part and a much larger one. That's presumably “so it could produce more offspring by allocating fewer resources for each one,” Ratcliff said.

    The newly multicellular yeast also evolved a division of labor that facilitated its uneven fragmentation. At first all the cells in the snowflake divided. But after hundreds of generations, some cells in each snowflake stopped dividing and eventually died. These cells in effect were sacrificed for the benefit of the multicellular organism, becoming sites where the snowflakes fragmented, Ratcliff reported. “It's the beginning of specialization,” Lenski says.

    Some scientists question whether the new complex yeast are true multicellular organisms, but others nonetheless praise the experiments. “The origin of multicellularity with the subsequent evolution of specialized cell lineages represents one of the most important transitions in the history of life,” David Reznick, an evolutionary biologist at the University of California, Riverside, says. “I would have never dreamed that it could be possible to study it from an experimental perspective.”

    By doing so, Ratcliff says, he's gained a better understanding of this transition. “The constraints on the evolution of multicellularity may be lower than we thought,” Ratcliff concludes. As a result, he now believes multicellularity arose more often than researchers have realized—most current estimates suggest it has emerged about 20 times in various lineages—but then subsequently faded away.

    The attempt to transform yeast into a multicellular species also impressed Lenski. “That's some of the neatest work going on right now,” he says. “It makes you think about what is a major transition” in evolution of life.

    Indeed, Colegrave says, “experimental evolution is an approach which can in principle be used to address any of the big questions in evolution.”

  7. Natural Resources

    Dreams of a Lithium Empire

    1. Jean Friedman-Rudovsky*

    Bolivia is betting that a former nuclear engineer, Guillaume Roelants, will develop a new extraction process for the world's largest lithium reserve.

    Liquid wealth.

    Bolivia's lithium riches lie in the Salar de Uyuni, a vast salt reservoir south of La Paz.


    SALAR DE UYUNI, BOLIVIA—Once a giant prehistoric lake, the empty expanse of Bolivia's majestic Salar de Uyuni seems frozen in time. But beneath the surface of this 10,000-square-kilometer salt flat lies a dynamic zone of unknown depth composed partly of mineral-laden brine. The geology has fascinated Guillaume Roelants, a Belgian nuclear engineer, for 30 years. Now it has the world's attention, too: The salty liquid holds the planet's largest reserve of lithium, the key ingredient for lithium-ion batteries. Bolivia wants to extract lithium to power the world's future electric vehicles, and Roelants leads this nation's quest.

    “This is not the first time Bolivia has a resource the rest of the world covets,” explains Roelants, 59, a nationalized Bolivian who's lived here since 1981. Spanish colonizers drained the country's vast silver wealth; Bolivia's tin enriched Europe's coffers in the 19th century. Recently, the Andean nation's natural gas reserves, the second largest in South America, have been developed largely for export. Today, Mitsubishi, LG, Lithium Corporation of America, and others have tried to make inroads here. But Evo Morales—the first indigenous president—has decided Bolivia will develop the industry on its own: state control, with Roelants at the scientific helm.

    Critics, from Japan to the United States, say Bolivia is squandering its good fortune: The country ought to bring in foreign experts and capital to master Uyuni's complicated brine and environment. Roelants shrugs, black leather Harley-Davidson jacket dangling on a finger over his shoulder, concluding: “We will prove them wrong.”


    The elevated evaporation ponds rise from the vast expanse like a mirage. At first, they look like toys, staged on an endless white carpet. A salar this big eliminates depth perception. Only at a few hundred meters do the ski-masked workers seem life-size.

    Roelants is standing in faded black jeans and a pilling checkered sweater. He's frustrated because it's 1 p.m. on Saturday and he's only just arrived. The father of three spends every weekend on the flat, normally arriving on Saturdays by 6:30 a.m. after a 10-hour overnight journey by bus on unpaved roads from La Paz. Delays on his first attempt flying into the newly completed Uyuni Airport have cost him half his working day. “I won't be doing that anymore,” he says, conceding that his budget can't accommodate the $175 weekly airfare anyway.

    Despite his annoyance, Roelants is upbeat as division heads of the Bolivian State Mining Corp. approach with worksite progress reports. Approximately 10 solar evaporation ponds covering 12 hectares lie in front of him, arranged in stages. Mineral-laden brine is pumped in at the front end; as it moves downstream, solar evaporation causes minerals to crystallize in fractions, isolating the unwanted ions in the brine—including sulfates, magnesium, and potassium. The final ponds will soon yield lithium chloride, to be converted into marketable lithium carbonate, the white powder base material for lithium-ion batteries.

    Wind blows Roelants's thinning white hair vertical as he explains that this is a personal dream reborn. In 1989, he persuaded the Belgian government to finance a lithium-extraction pilot project, “at that time, for use in ceramics and aluminum production,” Roelants recalls. But the Bolivian government wasn't interested, so Roelants moved on: He established Bolivia's borax mining industry and spent more than a decade working near the Salar de Uyuni as an adviser to local communities on agricultural production.

    After the 2005 presidential election, Roelants says, “representatives from these communities asked me to accompany them to a meeting with President Morales regarding regional economic projects.” The president listened to Roelants's ideas, and after reviewing a feasibility study in 2008, committed $5.7 million in state funds for a lithium carbonate pilot plant. Roelants was appointed as head. “I stayed in Bolivia to be challenged by working in an impoverished country, with the people, rather than for multinational companies,” he says, “so this was a great opportunity.”

    Indeed, the possibilities are grand. The Salar de Uyuni is a bowl-shaped depression of alternating brine and clay layers. The deepest well, at 220 meters, does not touch rock, so Uyuni's total yield is impossible to determine. The basin is considered to hold at least 100 million tons of lithium, and the southeastern portion boasts some of the world's highest concentrations, reaching 4000 ppm of lithium. The U.S. Geological Survey estimates that Bolivia's reserves are more than half of the world's total.

    But Uyuni presents special challenges. Salt flats are like snowflakes: No two brines are alike, and evaporative conditions vary. So though the majority of the world's lithium carbonate is produced via solar evaporation from salt flat brine (it's also possible to mine lithium from rock), Roelants's research team had to develop a unique process. “There is no blueprint for this work,” says Ihor Kunasz, former chief geologist for U.S.-based Foote Mineral Co. and one of the world's experts in brines. Each pond must maintain specific mineral concentration levels while maximizing evaporation rates.

    The world's largest producer of lithium carbonate, SQM in Santiago, says it considered Uyuni but decided instead on what's now its flagship operation in Chile's Atacama salt flat.

    “Uyuni is problematic because of its brine's magnesium levels and the evaporative conditions,” says Eduardo Morales, general manager of the Sociedad Chilena de Litio. Magnesium, he explains, is found in almost all salt flats and is difficult to separate because of its chemical similarity to lithium. The higher the magnesium-to-lithium ratio, the more complicated and costly lithium carbonate production will be. Uyuni's ratio, Roelants admits, is 20:1; Atacama's is 6:1. Evaporation rates at Uyuni, which is relatively cool at 3600 meters above sea level, are good but not great. Rain slows the process, too. “These projects are extremely complex especially with the challenges presented in Uyuni,” says geologist R. Keith Evans, who has worked in the industry for 30 years and is arguably the world's best known consultant on lithium. “Bolivia should be looking for every expertise out there, but instead they decided to go it alone.”

    Roelants may be the nation's lithium chief, but he has no formal training with the metal. Nor do the 20 meteorologists, chemists, and geologists who form Roelants's R&D team. They are, though, all Bolivian and all graduates of Bolivian public universities.

    The government stresses that it does not shun foreigners, but encourages outside involvement through a scientific advisory committee that offers Bolivia voluntary assistance. It comprises national and international scientists, lithium experts, and automobile industry representatives. “I have felt very welcomed,” says Waldo Rojas, a Chilean hydraulic engineer with 20 years' experience, who's been helping here with pond design. But Evans says that without industry big names on the committee, outside involvement means little. Brine expert Kunasz agrees: “Bolivia made a politically driven decision to exclude many in the industry, and it hurts them scientifically.”

    State administration means technical limitations, too. Uyuni's critical meteorological conditions are tracked on a clunky Dell laptop that sits in a wooden shack next to the ponds. The project's main laboratory is housed in a decrepit building on the slopes of La Paz. “We have had to make do with what we have,” says Jose Bustillos, the Uyuni team's lead electrochemistry researcher. But, even so, after researchers scanned methods used at other salt flats, he says, they developed an extraction process tailored to Uyuni's conditions that solved the magnesium problem in 2009. “It was a feeling like none other,” Bustillos says, glowing at the memory of holding the Uyuni flat's first grams of lithium carbonate. The team will not reveal details until international patents are secured.

    By the bootstraps.

    Guillaume Roelants, a Belgian-born engineer, was chosen by Bolivia's president, Evo Morales, to create and lead an autonomous national lithium-extraction industry.


    There have been delays. Uyuni should already be producing 12,000 tons of potash per year, a byproduct to be sold to Brazil as fertilizer. Lithium carbonate production was set to be 500 tons per year by 2010. Instead, the potash plant is still being completed, and the final evaporation pond that should be yielding lithium chloride is empty.

    Building for the future

    As the setting sun casts its rosy hue over the flat, Roelants arrives at the newly constructed pilot plant offices. Two professors from La Paz's metallurgy institute are visiting to learn about Uyuni's process, a regular occurrence for Roelants.

    Gustavo Calderón Valle, who recently stepped down as head of doctoral programs at Universidad Mayor de San Andrés, the country's largest university, says Bolivia is being swept by “a scientific revolution,” noting that the number of doctoral programs has grown from fewer than a dozen in 2005 to more than 200 today. “Our national lithium project is helping to encourage this trend,” he adds, and the young are interested.

    In the sparkling new “international level” laboratory accompanying the pilot plant, chemist Jon Alvarez, 30, looks over the printout of the 240 pond samples that the team of seven analyzes daily. He is typical of the R&D team, whose average age is between 30 and 35. “This is a long-term project, and we are preparing to carry it into the future,” Alvarez says, adding that he would have had no interest in working for a foreign lithium company.

    Alvarez is right about the time range. The global lithium carbonate market is currently saturated, and the Atacama alone could supply the world's demand for 10 years. But if in 25 or 50 years, electric vehicles do outnumber gasoline-powered transport, the world will most likely need Bolivia's lithium. Also, the country's goals go beyond selling lithium carbonate. The government wants to develop the downstream chemicals to manufacture the lithium-ion batteries themselves. “This is about transforming the country from exporter of raw materials into one that industrializes its natural resources,” Roelants says.

    It's another, still-far-off collective dream for the scientist, his colleagues, and an impoverished nation.

    * Jean Friedman-Rudovsky is a print and radio journalist based in La Paz, Bolivia.

  8. Saving for a Rainy Day

    1. Edwin Cartlidge

    Materials for thermal storage may make cheap solar energy available around the clock—even when the sun doesn't shine.

    Canned heat.

    The Andasol power station in Spain uses tanks of molten salt to store solar energy so that it can continue generating electricity when the sun isn't shining.


    The Andasol complex at the foot of the Sierra Nevada mountains in Granada, southern Spain, is one of the world's largest solar power stations. Its 600,000 parabolic mirrors, lined up in hundreds of rows over an area of several square kilometers, concentrate the sun's rays to provide heat that creates steam for electricity generation. The plant produces some 150 megawatts, enough to meet the needs of about half a million people. What sets Andasol apart, however, is not how much energy it delivers—it's how much it holds back. The plant is designed to store part of the solar energy it collects so that it can produce electricity after the sun sets or disappears behind a cloud.

    The fickleness of sunlight, like the unsteadiness of wind, poses a major obstacle for renewable energy. Grid operators can take up the slack with fossil fuel or nuclear plants, but this need to compensate limits the contribution of wind and solar plants— particularly those that use photovoltaic panels, which convert sunlight directly into electricity. In principle, batteries could store such renewable energy, but they remain very expensive.

    Instead, Andasol hoards its raw product: heat. Its “batteries”—three pairs of innocuous-looking metal tanks containing molten salt—hold enough energy to generate electricity for about 7.5 hours, allowing the plant to provide almost round-the-clock electricity during the summer. Indeed, Spain's national grid operator has classified Andasol as a “predictable” source of energy, helping to raise the share of locally generated electricity that renewable sources provide.

    Experts say storage systems could help concentrating solar power (CSP) clear another major hurdle: cost. Its price per kWh is currently about $0.17—slightly more expensive than that of photovoltaics ($0.16), and nearly three times the cost of natural gas (about $0.06). Increasing output by being able to generate electricity after sunset will reap economies of scale, says Yogi Goswami, a chemical engineer at the University of South Florida in Tampa. But capital costs must also be reduced. Cheaper, simpler mirrors will be essential, says Fabrizio Fabrizi of Italy's national agency for energy research, ENEA, and improved storage technology also has an important role to play. “There is a need to force down the cost of investment,” Fabrizi says, estimating that improvements in storage “could reduce that cost by up to 25%.”

    Oil, salt, and steam

    The standard storage material for plants like Andasol, a mixture of 60% sodium nitrate and 40% potassium nitrate, already does its job very well. It is stable at temperatures up to 600°C. Its high specific heat capacity and high density enable it to store a lot of energy in very little space. Its low viscosity when molten makes it easy to pump through pipes. And its ingredients are cheap and abundant. The challenge is to make better use of its virtues.

    Andasol produces electricity in two stages. First, the parabolic mirrors concentrate the sun's energy along the length of a pipe fixed just above their surface. Then synthetic oil flowing through the pipe heats up and travels to a heat exchanger, where it generates steam that turns a turbine.

    En route, some of the oil takes a detour through a separate heat exchanger. Molten salt being pumped from a “cold” tank at a temperature of about 290°C takes heat from the oil, flows into a “hot” tank at about 390°C, and sits until it is needed. Later, the salt is pumped back to the cold tank; as it passes back through the heat exchanger, it reheats oil returning from the steam generator, giving it enough thermal energy for another round. In this “two-tank indirect storage” system, the oil serves as a heat transfer fluid (HTF). The salt provides the heat that keeps the electricity flowing, but it's the HTF that raises steam.
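    The two-tank scheme is plain sensible-heat storage: the energy banked scales as mass × specific heat × the temperature swing between the tanks. The sketch below is a back-of-the-envelope estimate only; the salt inventory, specific heat, and turbine efficiency are assumed values for illustration, not Andasol plant data.

```python
# Back-of-the-envelope sensible-heat estimate for a two-tank
# molten-salt store. All numbers are illustrative assumptions.

CP_SALT = 1.5e3     # J/(kg*K), approximate specific heat of nitrate salt
T_COLD = 290.0      # deg C, "cold" tank (from the text)
T_HOT = 390.0       # deg C, "hot" tank (from the text)
SALT_MASS = 28.5e6  # kg, assumed salt inventory (~28,500 metric tons)

def stored_thermal_energy_mwh(mass_kg, cp, t_hot, t_cold):
    """Sensible heat E = m * cp * (T_hot - T_cold), in thermal MWh."""
    joules = mass_kg * cp * (t_hot - t_cold)
    return joules / 3.6e9  # 1 MWh = 3.6e9 J

thermal_mwh = stored_thermal_energy_mwh(SALT_MASS, CP_SALT, T_HOT, T_COLD)

# At an assumed ~35% steam-cycle efficiency, a 50-MW turbine could run:
electric_mwh = thermal_mwh * 0.35
hours_at_50mw = electric_mwh / 50.0  # same order as the ~7.5 hours quoted
```

    The point of the exercise is that the capacity is set almost entirely by the salt mass and the 100°C swing between the tanks, which is why raising the hot-tank temperature is so attractive.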

    The obvious way to improve on this approach is to cut out the middleman. Instead of using one material to absorb the sun's heat and a second material to store it, why not use one material to do both? Such a “direct storage” approach would eliminate one heat exchanger. With the right material—one that remained stable at higher temperatures—it would also raise the temperature of the hot tank, making electricity production more efficient, and would increase the amount of heat stored in a given volume.

    Direct storage has already supplied electricity to the grid in California. The Solar Energy Generating Systems (SEGS) I power plant in the Mojave Desert—one of nine sister plants that together make up the world's largest operating solar power station—used hot mineral oil to meet demand on winter evenings starting in 1985, but in 1999 the storage system was damaged by a fire and was not replaced. Installing storage systems in SEGS II through IX would have been prohibitively expensive because the HTF used in those plants, a synthetic oil called Therminol, would need special pressurized tanks to keep it liquid. (Most of the plants use gas boilers as backup when the sun doesn't shine.)

    Most current research projects on direct storage rely on molten salt—the same mixture used at Andasol. One is the Italian energy agency ENEA's Archimede project, which switched on in July 2010. Archimede is a 5-megawatt pilot plant incorporated into a combined-cycle gas power station close to the Sicilian city of Syracuse. The plant's hot tank is maintained at 550°C—160° higher than Andasol's. As a result, Fabrizi says, more heat is lost in transit from the mirrors to the steam generator, but higher generating efficiencies and savings on storage considerably outweigh the loss. “Using the same molten salt as Andasol, we can store the same amount of energy but using about 40% less salt,” he points out. “That's a very dramatic reduction of cost.”

    One significant challenge for Archimede arises from the salt's melting point, 240°C. If the molten salt cools to that temperature, it will freeze solid and block the pipes—a problem that could be extremely costly and time-consuming to resolve. To keep the pipes hot, Fabrizi says, the plant continuously circulates the salt through them and turns on electric heaters if necessary. If salts with lower melting points were available, Fabrizi notes, Archimede's operators could use less energy to keep the pipes hot and—if circulation were to stop for some reason—would have more time to intervene before the fluid froze.

    Researchers at Sandia National Laboratories in Albuquerque, New Mexico, are working on the melting-point problem. David Gill, a mechanical engineer at Sandia, says that he and colleagues have found several mixtures that freeze below 100°C, but reaching such low temperatures cheaply is a challenge. Two of the salt mixtures—a four-component salt that freezes below 80°C, and a five-component salt that freezes closer to 70°C—should be “economically feasible,” Gill says, although the cost is hard to gauge because both contain the economically volatile element lithium.

    Towers and blocks

    Gill and other researchers hope that new designs will keep direct-storage CSP plants from freezing up. Plants like the 20-megawatt Gemasolar plant in Seville, Spain, which opened in June, attack the problem by doing away with parabolic mirrors entirely. Instead, hundreds or thousands of small reflectors known as heliostats direct the sun's radiation to the top of a tall tower and onto a single receiving module through which the HTF flows. Such “power tower” designs achieve high operating temperatures and thus very high efficiencies. Like Archimede, Gemasolar uses molten salt as both HTF and storage medium. But the much shorter length of its tubing minimizes both heat losses and the chance that the salt will freeze.

    Light work.

    Power towers may prove cheaper than parabolic mirrors.


    Gill says the combination of high efficiency and low losses is attracting increasing investment to power towers in the United States. “Parabolic troughs have a long track record and so are generally seen as a less risky investment,” he says. “But putting molten salt in is seen by some investors as putting the risk back in.”

    Another approach to thermal storage scraps flowing molten salt in favor of solid materials that just sit still. In a “passive” storage system developed by researchers at the German Aerospace Center (DLR) in Stuttgart, HTF from the mirror array passes through pipes embedded in concrete or castable ceramic materials. The scientists have found that the ceramics offer superior heat capacities and thermal conductivities but are too expensive and impractical. “On the one side you want good thermophysical properties, and on the other side the material has to be durable, workable, and cheap,” DLR's Doerte Laing says. “Concrete represents a good compromise between the two.”

    Since 2008, the German group has been making detailed thermal tests of a 20-cubic-meter concrete block at a test facility at the University of Stuttgart. The tests have shown good heat transfer between the concrete and embedded pipes while at the same time avoiding undue stress inside the material as a result of thermal expansion. The researchers say a storage system with the same energy capacity as the molten-salt tanks at the Andasol plant would require some 250 concrete blocks, each 200 cubic meters in volume and weighing 400 tons, spread out over an area equal to about five football pitches.

    Let it freeze?

    In a still more radical departure, some researchers hope to achieve much higher storage capacities by harnessing the heat associated with a material's change of phase. Instead of raising the temperature of already molten salt, their scheme uses solar energy to change solid salt to a liquid. Hot HTF passes through solid salt, melting it; later, cool HTF absorbs energy, refreezing the storage material into a solid. Because the latent heat associated with a material's change of phase is much greater than the “sensible” heat required to raise its temperature, much less storage material would be needed than in conventional molten-salt storage.
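    The appeal of latent heat can be seen with handbook-style numbers for sodium nitrate; the figures below are approximate values assumed for illustration, not data from the projects described here.

```python
# Latent vs. sensible heat for sodium nitrate (approximate values,
# assumed for illustration).

CP = 1.5e3             # J/(kg*K), specific heat of the molten salt
LATENT_FUSION = 178e3  # J/kg, latent heat of fusion of NaNO3
DELTA_T = 20.0         # K, usable swing in a near-constant-temperature duty

sensible_per_kg = CP * DELTA_T   # 30 kJ stored per kg by heating alone
latent_per_kg = LATENT_FUSION    # 178 kJ absorbed/released per kg at melting

ratio = latent_per_kg / sensible_per_kg  # melting stores roughly 6x more per kg
```

    For an application that must absorb and release heat near one fixed temperature, the phase change delivers several times more energy per kilogram than the small sensible-heat swing available.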

    Again, nitrate salts are likely to be the storage medium of choice. But because they are relatively poor conductors of heat, several research groups are working on designs that cause the salts to absorb or lose heat more effectively.

    To increase the surface area of heat transfer, Goswami and colleagues at the University of South Florida seal the salt inside small capsules with diameters of a few centimeters and let the HTF flow around them. The team is currently working on optimizing the size of the capsules and developing an industrial-scale method for carrying out the encapsulation process. Laing and co-workers, meanwhile, route the HTF through the salt via pipes outfitted with aluminum fins to speed heat transfer.

    In tests carried out with the utility company Endesa at the Litoral power plant in southern Spain, the German group has shown that this phase-change-based technology could provide storage for so-called direct steam generation. This technology, which uses the water from a solar power plant as the HTF, does away with the expensive oils and the steam generator that normally feeds the turbine. It also reaches higher temperatures than oil-based HTFs do—up to 550°C with superheated steam.

    The method is potentially cheap and efficient, but it requires that heat be transferred to and from the storage medium at a near constant temperature, which would require an enormous volume of salt in the case of sensible heat storage. The phase-change approach would be far better suited, Laing says. She adds that her group has shown that the approach is technically feasible and is working to reduce costs for industrial-scale application.

    Other researchers are working on approaches that include adding nanoparticles to a molten salt or an ionic liquid to increase the material's specific heat capacity, and studying ways to store both hot and cold salt in the same tank.

    Whichever storage technology—or combination of technologies—eventually pans out, it will be critical to the future of CSP. The current global capacity of solar thermal power plants is minuscule: just over 1 gigawatt, about the power output of one large fossil-fuel or nuclear power station. Another 15 gigawatts are currently in development or under construction in the United States, Spain, North Africa, China, India, and elsewhere, according to the International Energy Agency. If CSP is to become a major player in the future, the ability to store some of the sun's energy and then use it to generate electricity when needed will be essential.

  9. Turning Over a New Leaf

    1. Robert F. Service

    Artificial-photosynthesis researchers dream of using sunlight's energy to generate chemical fuels. Despite progress, the approach must become more efficient and cheaper to make an impact on where the world gets its fuel.


    The next time you groan when it's time to mow your lawn, first take a second to marvel at a blade of grass. Plants are so commonplace that it's easy to take their wizardry for granted. When they absorb sunlight, they immediately squirrel away almost all of that energy by using it to knit together a chemical fuel they use later to grow and multiply. It sounds so simple. Yet it's anything but. Modern society runs on fossil fuels precisely because researchers have never managed to duplicate the chemical mastery of a fescue. Now, with the side effects of our massive-scale use of fossil fuels piling up (climate change, acidified oceans, oil spills, and so on), researchers around the globe are struggling to play catch-up with biology in hopes of harnessing the sun's energy to synthesize gasoline or other fuels that are the bedrock of modern society.

    Humans, of course, already have ways to capture solar energy. Today's photovoltaic solar cells typically trap 10% to 20% of the energy in sunlight and convert it to electricity, and PV prices continue to drop. But because electricity is difficult to store on a large scale, the effort to store sunlight's energy in chemical fuels has become one of the grand challenges of the 21st century. “You're talking about turning the energy world on its head. Today we turn hydrocarbon fuels into electricity. But in the future, we need to find a way to turn electricity [from sunlight] into fuels,” says Daniel DuBois, a chemist at the Pacific Northwest National Laboratory in Richland, Washington.

    The problem is daunting. Energy production is the world's largest enterprise. Today the world consumes power at an average rate of 17.75 trillion watts, or 17.75 terawatts, 85% of which starts out as fossil fuels: coal, oil, and natural gas. Thanks to rising populations and incomes, by 2050 the world's demand for power is expected to at least double. To keep fossil fuels from stepping in to fill that need, with potentially devastating side effects, any new solar fuels technology will have to provide power just as cheaply, and it must have the potential to work on an equally massive scale.

    Enter artificial photosynthesis. Researchers around the globe are working to combine materials that capture sunlight with catalysts that can harness solar energy to synthesize fuels. This dream has been pursued for decades. But recent strides are adding new zip to the field. “In the last 5 to 10 years, there has been amazing progress,” DuBois says. Anthony Rappé, a chemist at Colorado State University, Fort Collins, agrees. However, he adds, “the bottom line is we're not there yet.”

    The splits.

    An artificial leaf harnesses energy in sunlight to split water into oxygen and hydrogen.


    Molecular shuffle

    To get there, most artificial photosynthesis researchers look to natural photosynthesis for inspiration. During photosynthesis, plants absorb sunlight, water, and CO2. Then they use two protein complexes—called photosystems I and II—to split water and synthesize fuel. First, in photosystem II, energy in sunlight splits two water molecules into four hydrogen ions (H+), four electrons, and a molecule of oxygen (O2). The O2 wafts away as waste; the protons and electrons are sent to photosystem I and used to reduce the coenzyme NADP+ to NADPH, which in turn is used to help synthesize sugars—a key series of metabolic steps.

    Of course, artificial photosynthesis researchers aim to make fuel not for plants but for planes, trains, and automobiles. So after splitting water into H+, electrons, and oxygen molecules, most make very different use of those ingredients. Some researchers are working to combine the protons and electrons with carbon dioxide (CO2) to make methane gas and other hydrocarbon fuels (see sidebar, p. 927). But most are working on what they believe is a simpler approach: combining the pieces they get from splitting pairs of water molecules into molecules of O2 and hydrogen gas (H2). That H2 can then either be burned in an engine or run through a fuel cell, where the water-splitting reaction runs in reverse: combining two H2s with O2 from the air to generate water and electricity.
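    The energy bookkeeping behind water splitting follows from standard electrochemistry: the reaction needs a minimum cell potential of 1.23 volts, and each H2 molecule carries two electrons, so the energy banked per mole of hydrogen falls out directly. (The constants below are standard textbook values, not figures from this article.)

```python
# Minimum energy stored per mole of H2 by water splitting,
# computed from standard electrochemical constants.

FARADAY = 96485.0  # C per mole of electrons
E_CELL = 1.23      # V, standard potential for splitting water
N_ELECTRONS = 2    # electrons transferred per H2 molecule

# E = n * F * V; divide by 1000 for kJ/mol. This is ~237 kJ/mol,
# the standard Gibbs energy of forming H2 and O2 from liquid water.
energy_per_mol_h2 = N_ELECTRONS * FARADAY * E_CELL / 1000.0
```

    Running the fuel cell in reverse releases, at most, that same energy per mole of H2 consumed, which is why the catalysts' extra voltages discussed below translate directly into wasted energy.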

    Although plants split water with seeming ease, it's not a simple task, and it requires electrons to perform an intricate quantum-mechanical dance. Quantum mechanics dictates that electrons can exist only at discrete energy levels—or “bands.” In semiconductors, for example, electrons can sit in either a lower energy state known as the valence band, where they are closely bound to the atom on which they sit, or a more freewheeling energized state in the conduction band. Molecules like chlorophyll in plants act like tiny semiconductors. When they absorb sunlight, they kick an electron from the valence to the conduction band, leaving behind a positively charged electron vacancy called a hole.

    Box step.

    Natural photosynthesis depends on a molecular cube (left) made from manganese atoms (purple), oxygens (red), and a calcium atom (yellow). The catalyst splits water molecules (blue), generating molecular oxygen. Synthetic versions (center and right) have similar cube-shaped cores.


    The holes are shuttled over to a compound called the oxygen-evolving complex, which grabs two oxygen atoms, holds them close together, and rips out an electron from each to fill the holes. The electron-deficient oxygens regain their stability by combining to form O2. In an artificial system, the electrons and protons liberated by water splitting then must migrate to a second catalyst, which combines them into two molecules of H2.

    A successful artificial photosynthesis system must therefore meet several demands. It must absorb photons, use the energy to create energized electrons and holes, and steer those charges to two different catalysts to generate H2 and O2. It also has to be fast, cheap, and rugged. “This is a much more stringent set of requirements than [those for] photovoltaics,” says John Turner, a water-splitting expert at the National Renewable Energy Laboratory (NREL) in Golden, Colorado.

    In 1972, Japanese researchers took on the challenge by using particles of titanium dioxide to split water. The method was impractical for commercial use because TiO2, which absorbs only ultraviolet light, could make no use of 95% of the solar spectrum. But the demonstration inspired numerous other water-splitting systems. One setup uses molecular dyes made with ruthenium and other rare metals to absorb a variety of wavelengths of light and pass the charges to metal catalysts. Another, developed by Turner's NREL team, absorbed light with semiconductor wafers made from gallium arsenide (GaAs) and gallium indium phosphide (GaInP). A platinum electrode served as the catalyst to split water and generate O2, while the semiconductor acted as the electrode to produce H2.

    Unfortunately, these systems, too, had drawbacks. The metals in the best light-absorbing molecular dyes are too rare to be viable as a large-scale technology. To get enough ruthenium to power the world with water splitting, “we would need to harvest 1% of the Earth's total continental crust to a depth of 1 kilometer,” Rappé says. Scale-up is problematic with the semiconductor system as well. Although Turner's devices convert 12% of sunlight to hydrogen, the materials would cost as much as $50,000 per square meter, according to an estimate by Harry Gray, a chemist at the California Institute of Technology (Caltech) in Pasadena. To be viable on a large scale, “we need to build something this good for $100 per square meter,” Gray says.

    Wanted: the perfect catalyst

    So more recently, much of the work in the water-splitting field has begun to shift to trying to make light collectors and catalysts from abundant and cheap materials. A prime example is the quest for H2-forming catalysts. Natural photosynthesis carries out the reaction using enzymes called hydrogenases, which are built from the abundant elements iron and nickel. The enzymes have evolved until they can knit roughly 9000 pairs of hydrogen atoms into molecular H2 every second. Many early water-splitting systems performed the same reaction even faster using pure platinum as the catalyst. But platinum is too rare and expensive to be broadly useful.

    In recent years, researchers have synthesized numerous compounds aimed at mimicking the core complex of hydrogenases. All work more slowly (if at all), however, largely because they lack the parts of the natural protein around the core that optimize the core's activity. In 2008, Thomas Rauchfuss, a chemist at the University of Illinois, Urbana-Champaign, and colleagues devised catalysts with molecular arms that act like a bucket brigade to ferry protons to the catalytic core. In the 12 August 2011 issue of Science (p. 863), DuBois and his colleagues described how they had refined this strategy further by creating a nickel-based catalyst that stitches 106,000 H2 molecules together every second.

    The new H2 makers still aren't ideal. They work at high speed only when researchers apply an electrical voltage of more than 1 volt to their system, a sizable energetic penalty. So DuBois's team is now working to tweak the catalysts to work at a lower added voltage. In a paper published online in Science on 29 September, Dan Nocera, a chemist at the Massachusetts Institute of Technology in Cambridge, reported that he and his colleagues had come up with another H2 catalyst that works with an extra voltage of only 35 thousandths of a volt (millivolts). It, too, is made from relatively cheap metals: molybdenum, nickel, and zinc. But Nocera's catalyst is slower than DuBois's, so the race is on to marry the best attributes of each.

    Balancing speed and extra energy input has been an even tougher problem with the catalysts needed for the other reaction in water splitting, which grabs oxygen atoms from two water molecules and links them together as O2. In 2008, Nocera and his team made headlines when they unveiled a cobalt-phosphate (Co-Pi) catalyst that works at 300 millivolts of applied potential over the minimum 1.23 volts required to link two oxygen atoms. The group followed that up with a nickel-borate compound that does much the same thing. And in the 29 September online paper, the researchers described a triple-layer silicon wafer lined with their Co-Pi catalyst on one face and with their H2 catalyst on the other. The silicon absorbed sunlight and passed charges to the two catalysts, which then split water. “I love the triple junction. It's pretty sexy,” says Felix Castellano, a chemist at Bowling Green State University in Ohio.

    Turner cautions that the overall efficiency of the device—it converts just 5% of the energy in sunlight to hydrogen—is still too low, and the extra voltage input required is still too high, to be commercially useful. Nocera counters that this initial system was built using amorphous silicon wafers as the sunlight absorbers. Such wafers are only 8% efficient in converting light to electrical charges. An artificial leaf based on crystalline silicon solar cells, which are 20% efficient, could convert sunlight to chemical energy with an efficiency of 12%, he says. But Nocera's team has yet to demonstrate such a device.

    Other related catalysts are also entering the picture. Charles Dismukes, a chemist at Rutgers University in Piscataway, New Jersey, and colleagues reported last year that they had made a series of O2-forming catalysts using lithium, manganese, and oxygen. And earlier this year, Dismukes's team reported in the Journal of the American Chemical Society that they had created another oxygen-forming complex with cobalt and oxygen. What's unique about all these new oxygen formers is that they share almost an identical cubic molecular structure, which is also at the heart of the natural O2-forming complex in photosystem II. “There is only one blueprint from biology that can be copied,” Dismukes says.

    Many other advances are also making their way out of the lab. Castellano and colleagues have recently created a family of cheap polymers capable of absorbing the energy from low-energy green photons and reemitting it as smaller numbers of higher-energy blue photons. They are now working on using this upconversion process to make use of more of the solar spectrum to split water. Researchers led by Steve Cronin of the University of Southern California in Los Angeles are adding metal nanoparticles to conventional solar absorbers as another way to convert low-energy photons to electrical charges that can then be harnessed to improve the efficiency of water-splitting setups. And Gray's group at Caltech has teamed up with students at 17 other universities to create a “solar army” that has already made progress in finding new water-splitting catalysts.

    These and other advances will need to continue if artificial photosynthesis ever hopes to contend with fossil fuels. With today's low natural gas prices, companies can use a mature technology called steam reforming to convert natural gas to hydrogen at a cost of about $1 to $1.50 per kilogram of H2 generated, which contains about the same amount of energy as a gallon of gasoline. Yet a recent analysis by Turner and his colleagues showed that, even if researchers could create an artificial photosynthesis system that cost $200 per square meter for the equipment and was 25% efficient at converting sunlight to H2, the H2 would still cost $2.55 per kilogram. That's not saying artificial photosynthesis isn't worth pursuing—only that fossil fuels are the leading energy source for a reason, and they won't be easy to dethrone.
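    The shape of such an estimate can be sketched as a toy levelized-cost calculation. The insolation, hydrogen heating value, and annual charge rate below are our own illustrative assumptions; they are not taken from the Turner analysis and are not expected to reproduce its exact figure.

```python
# Toy levelized cost of solar hydrogen. All assumptions are
# illustrative, not values from the NREL analysis.

CAPITAL_PER_M2 = 200.0  # $/m2 equipment cost (scenario in the text)
EFFICIENCY = 0.25       # sunlight-to-H2 conversion (scenario in the text)
INSOLATION = 2000.0     # kWh/m2/year, assumed sunny-site value
H2_ENERGY_KWH = 39.4    # kWh/kg, higher heating value of hydrogen
CHARGE_RATE = 0.15      # assumed annual rate covering financing and upkeep

kg_per_m2_year = INSOLATION * EFFICIENCY / H2_ENERGY_KWH  # ~13 kg/m2/yr
annual_cost_per_m2 = CAPITAL_PER_M2 * CHARGE_RATE         # $30/m2/yr
cost_per_kg = annual_cost_per_m2 / kg_per_m2_year         # a few $/kg
```

    Even under these generous assumptions, the result lands in the same few-dollars-per-kilogram range, well above the $1 to $1.50 that steam reforming of natural gas achieves today.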

  10. Sunlight in Your Tank—Right Away

    1. Robert F. Service

    Some researchers are looking to use artificial photosynthesis to generate hydrocarbon fuels like those we already burn. Their goal is essentially to run combustion in reverse.

    Using sunlight to split water and generate hydrogen doesn't make the most useful chemical fuel. To use hydrogen on a large scale, societies would have to develop a new infrastructure to store, transport, and distribute the energy carrier. With that limitation in mind, some researchers are looking to use artificial photosynthesis to generate hydrocarbon fuels like those we already burn.

    Their goal is essentially to run combustion in reverse, starting with carbon dioxide (CO2) and water and using the energy in sunlight to knit the chemical bonds needed to make hydrocarbons, such as gaseous methane and liquid methanol. “That's a technology that's going to come,” says Harry Gray, a chemist at the California Institute of Technology in Pasadena. “But it is hard.”

    The difficulty is that CO2 is a very stable molecule. In converting CO2 to hydrocarbons, the first step is to strip off one of the oxygen atoms, leaving behind a molecule of carbon monoxide (CO), a more reactive combination of carbon and oxygen. CO can then be combined with molecular hydrogen and converted into liquid hydrocarbons using an industrial process known as Fischer-Tropsch synthesis.

    That first step of converting CO2 to CO is the energy hog. A minimum of 1.33 electron volts (eV) of energy must be applied to carry out the reaction. Over the past few decades, researchers have developed numerous catalysts that carry out the process. But virtually all of them require adding a lot of extra energy, typically another 1.5 eV. As a result, it would take far more energy to synthesize a hydrocarbon fuel than the fuel's molecules could store in their chemical bonds.
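    A straightforward reading of those numbers shows the size of the penalty. The arithmetic below is illustrative only, treating the quoted per-electron energies as the whole story.

```python
# Fraction of input energy that ends up stored when converting
# CO2 to CO, using the energies quoted in the text.

E_MIN = 1.33   # eV, thermodynamic minimum for CO2 -> CO
E_EXTRA = 1.5  # eV, typical added energy with conventional catalysts

# With conventional catalysts, less than half the input is stored:
fraction_conventional = E_MIN / (E_MIN + E_EXTRA)    # ~0.47

# If the added energy were cut by 90%, as in the ionic-liquid work
# described below, storage efficiency would approach 90%:
fraction_improved = E_MIN / (E_MIN + 0.1 * E_EXTRA)  # ~0.90
```

    In other words, more than half the applied energy is wasted with conventional catalysts before the resulting CO is even turned into a fuel.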

    On 29 September, however, researchers led by Richard Masel of Dioxide Materials in Champaign, Illinois, and Paul Kenis of the University of Illinois, Urbana-Champaign, reported online in Science that they've come up with a less energy-intensive way to convert CO2 to CO. By adding a type of solvent called an ionic liquid to the CO2 in their setup, they reduced the added energy needed for splitting CO2 by 90%. Ionic liquids are liquid salts that are adept at stabilizing negatively charged compounds. Adding a negative charge is the first step required to convert CO2 to CO; the Illinois researchers suspect the increased stability reduces the voltage needed to do the job.

    The Illinois catalysts are slow, and so far the researchers have not powered them with electrical charges from a solar cell. But other labs are taking an approach that looks more like full-fledged artificial photosynthesis. At Lawrence Berkeley National Laboratory in California, for example, chemist Heinz Frei and his colleagues reported in 2005 that for the first time they had used energy from visible light to convert CO2 to CO using a porous catalyst made from silica and impregnated with zinc and copper. Frei's team has used related catalysts to split water to generate molecular hydrogen. Now the group is working to put the two pieces together to combine light-generated CO and H2 to make methanol, one of the simplest hydrocarbons.

    It's not ExxonMobil yet. But with further developments, the technology could lead to fuels made basically from air, water, and sunlight.