News this Week

Science  20 Jun 2008:
Vol. 320, Issue 5883, pp. 200
  1. SCIENCE EDUCATION

    Louisiana Opens School Door for Opponents of Evolution

    1. Fayana Richards

    Louisiana school teachers have been given license to supplement the existing science curricula with material that they feel “promotes critical thinking skills.” The seemingly innocuous language, in a bill passed overwhelmingly by the state legislature and expected to become law as early as next week, marks the latest attack in the United States on the teaching of evolution and mainstream scientific thought on global warming and other topics.

    “The only thing this bill does is give a green light for the school board to protect teachers who want to use creationist supplementary materials,” says Barbara Forrest, a philosopher at Southeastern Louisiana University in Hammond who has been fighting the legislation.

    Under the banner of “academic freedom,” opponents of evolution have made some headway in Florida and have attracted support in Michigan and South Carolina (Science, 9 May, p. 731). But their greatest success has come in Louisiana, where state legislators have invited educators to hold “an open and objective discussion of scientific theories being studied, including but not limited to evolution, the origins of life, global warming, and human cloning.”

    The approach appeals to Louisiana's Republican Governor Bobby Jindal, who is expected to sign the bill. “Some want only to teach intelligent design. Some only want to teach evolution. I think both views are wrong,” he told a television interviewer last weekend. “As a parent, I want [children] to be presented with the best thinking. I don't want any facts or theories or explanations to be withheld from them because of political correctness. The way we are going to have smart and intelligent kids is exposing them to the very best science.”

    Political science?

    This 2007 book takes a view in sync with supporters of the Louisiana legislation.

    CREDIT: DISCOVERY INSTITUTE

    Science educators say the new wording is intended simply to circumvent rulings by U.S. courts that creationism and intelligent design are unconstitutional religious intrusions into a public school science curriculum. It's also unnecessary, adds Brenda Nixon of Louisiana State University in Baton Rouge, who codirects a statewide effort to improve science and math education and also works with the Louisiana Science Teachers Association, because teachers already explore these topics in class. Teachers are required to follow the Louisiana Comprehensive Curriculum, which encourages them to keep up to date and allows them to incorporate outside materials as long as the content is consistent with the state framework. “We have had overwhelming support from our science teacher members, who don't want to see this approved,” Nixon says about the association's 1600 members.

    The bill requires the Louisiana board of education to implement the language in time for the 2008–09 academic year. But Forrest and others worry that it will be very difficult for any government body to make sure that the supplementary materials meet agreed-upon standards.

  2. BIOBANKS

    Canada Launches Massive Study of Adult Cancer Precursors

    1. Paul Webster*
    1. Paul Webster is a freelance writer in Toronto, Canada.

    TORONTO, CANADA—Canada has joined the global stampede of countries gathering biological data over decades on a large population cohort in hopes of better understanding the genetic, social, and environmental factors that affect human health.

    The Canadian Partnership for Tomorrow Project, launched last week, will follow 300,000 adults over the age of 35 for 30 years, gathering saliva, blood, urine, fecal, and toenail samples as well as answers to questions about the health effects of influences including diet, physical fitness, and environmental conditions. The goal is “a comprehensive data set for research into the causes of cancer,” says Heather Bryant, vice president of cancer control for the Canadian Partnership Against Cancer in Toronto, a federally funded organization helping to lead the study. But she says the project will also “provide a platform for numerous other research topics.”

    The project builds on a cancer-risk study in Alberta that examined the interaction of lifestyle, behavioral, environmental, and genetic factors. Five provincial public health agencies have kicked in an initial $82 million to recruit participants in what is expected to be a $3.5-million-a-year effort. Researchers have already obtained funding to probe the effects of vitamin D in northern climes, measure compliance with public health recommendations for physical activity, and chart the effects of dietary supplements as varied as alcohol, vitamins, and traditional native diets, notes Phillip Branton, head of the Canadian Institutes of Health Research's Institute of Cancer Research, who will oversee research.

    The Canadian study is intended to dovetail with the efforts of more than a dozen biobank studies around the world, says Branton. “One of the biggest questions to be tackled is, ‘Who are the people first at risk for cancer as diets and lifestyles rapidly change in different societies?’” Epidemiologist Michael Thun, who is recruiting 500,000 participants for a biobank to be managed by the American Cancer Society in Atlanta, Georgia, says the Canadian study will add “useful further capacity. The more communication there is among early stage cohorts, the more that can be gained.”

    The long view.

    Canadian cancer scientists Jeffrey Lozon (left) and Phil Branton flank study participant Mary O'Neill.

    CREDIT: COURTESY OF THE CANADIAN PARTNERSHIP AGAINST CANCER

    Co-principal investigator Louise Fortier, who directs an international biobank consortium centered at Montreal's CARTaGENE biobank with information on 20,000 Quebec participants, predicts that “environmental measures are likely to become an important and novel focus” as the new study progresses. “We will have samples as well as really good information on the subjects' homes and environments,” she explains.

    Thun says the decentralization of private and public health records in the United States makes it difficult to collect and manage such data from larger populations. He hopes that the Canadian study can take advantage of centralized public health systems in each province that are capable of collecting and managing a wide array of data from large populations.

    The Canadian study is enrolling adults from five provinces in eastern, central, and western Canada. Instead of canvassing for volunteers, researchers will seek a cross section of “ordinary Canadians,” says Bryant, perhaps by calling a randomized list of telephone numbers. She believes that such a pool will be of greater value to other researchers. “If you don't build the platform,” she says, “you can't ask the questions.”

  3. BIODEFENSE

    Senate Bill Would Alter Biosafety, Select Agent Rules

    1. Jocelyn Kaiser

    As U.S. biodefense research has expanded since 2001, so has scientists' frustration with the red tape involved in studying potential bioweapons. Last week, a bipartisan pair of U.S. senators introduced a bill that would address some of these problems as well as safety concerns at the nation's biodefense labs. Some researchers hope the legislation will trigger a broader debate on finding better ways for science and security to coexist.

    The Select Agent Program and Biosafety Improvement Act of 2008 would reauthorize an arrangement under which 325 research organizations and nearly 10,000 individuals have been approved by the Centers for Disease Control and Prevention since 2002 to work with anthrax, botulinum toxin, and other so-called select agents. The bill (S. 3127), introduced by Senators Richard Burr (R-NC) and Edward Kennedy (D-MA), calls for “minimum standards” for biosafety and biosecurity training; a voluntary, anonymous accident-reporting system; and inclusion of newly created organisms. It would also have the National Academies study whether the select agent program has hindered research, including international collaborations.

    Microbiologists say strict rules for shipping samples have stymied investigations of outbreaks abroad, and a requirement that collaborators abroad follow U.S. rules has made some joint research projects impossible. “The Select Agent Program is an important part of ensuring the nation's safety and security,” Burr said in a press release, “and I look forward to working with my colleagues to reauthorize and improve the program.”

    Stanford University microbiologist David Relman, a member of the National Science Advisory Board for Biosecurity (NSABB), says he hopes the provision to update the existing list of select agents will “open up a larger discussion about how we prioritize concerns.” He worries that a definition based on nomenclature is not specific enough and may be hindering research. The bill also asks the U.S. Attorney General to clarify language adopted in 2004 that would ban work on poxviruses genetically similar to smallpox but fairly benign (Science, 11 March 2005, p. 1540). NSABB, which offers advice on the oversight of research that could be potentially useful to terrorists, advised that the language should be repealed.

    The senators also want to address biosafety concerns—including the fear that many accidents aren't reported (Science, 12 October 2007, p. 182). The bill calls for a system, similar to what's used by the aviation industry, that would allow researchers to learn from one another's mistakes.

    “It's very exciting. It has a lot of things that I completely agree with,” says Gigi Kwik Gronvall of the University of Pittsburgh Center for Biosecurity in Baltimore, Maryland, who's also encouraged that the bill asks for an assessment of whether the many new labs are needed. But Janet Shoemaker, public affairs director of the American Society for Microbiology, says the bill, although worthy, “needs further refinement.” She suggests deferring action on any reporting system until after an interagency task force examining biosafety submits its report later this year.

    With little time left on the legislative calendar and Kennedy recovering from brain surgery, prospects for the bill appear dim this year. But Senate staffers hope that its introduction will stimulate interest in the House and lay the groundwork for passage in the next Congress.

  4. RENEWABLE ENERGY

    U.K. Ponders World's Biggest Tidal Power Scheme

    1. Daniel Clery

    Harnessing nature's energy to produce up to 5% of the United Kingdom's electricity without any carbon emissions sounds too good to be true. It is, according to a report last week from 10 environmental groups opposing plans to build the world's largest tidal power scheme.

    High water.

    The River Rance barrage (above) in France has been churning out electricity for more than 40 years. The proposed Severn barrage (left) would be much bigger.

    CREDIT: ATTAR MAHER/CORBIS SYGMA

    Britain is under pressure to combat climate change with more renewable energy. According to the European Union's (E.U.'s) common energy policy, 15% of the U.K.'s total energy consumption should come from renewables by 2020. Wind turbines and other renewables now provide less than 5% of U.K. electricity. As a result, the government is reviving mothballed plans for a dam, or barrage, across the Severn estuary, which separates southwest England from south Wales.

    But wildlife and environmental groups, including the Royal Society for the Protection of Birds (RSPB), the Worldwide Fund for Nature, and The National Trust, which argue that it will damage a unique ecosystem, now also assert that it will cost too much. “The report shows that this exorbitantly expensive and massively damaging proposal cannot be justified on economic grounds—there are simply too many cheaper options for clean energy generation,” says RSPB Chief Graham Wynne.

    Positioned across an estuary or inlet, a tidal barrage is essentially the same as a hydroelectric dam, but the rise and fall of the tides drives water through its turbines. The first such barrage began operating on France's River Rance in 1966. Because of high construction costs and fears of ecological damage, there have been only two smaller imitators, in Canada and Russia.

    The River Severn has the second highest tidal range in the world—15 meters between high and low tide. The first of many plans for a tidal power scheme there dates from 1925, but none has left the drawing board. The $29 billion scheme now being considered by the U.K. government is an order of magnitude larger than that on the Rance. The barrage would stretch 16 kilometers from Weston-super-Mare in Somerset to Cardiff in south Wales and would generate 17 terawatt-hours of energy per year, equivalent to the output of two 1-gigawatt power stations. A tidal barrage has lower operating costs than a nuclear station and would last up to three times longer, as long as 120 years.
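
    The “two 1-gigawatt power stations” comparison is essentially an average-power calculation. A quick back-of-envelope check (an illustrative sketch, not a figure taken from the barrage proposal) shows the numbers roughly line up, assuming the two stations run flat out all year:

    ```python
    # Back-of-envelope check of the quoted Severn barrage output (illustrative only).
    HOURS_PER_YEAR = 8760  # 365 days x 24 hours

    barrage_output_twh = 17.0  # quoted annual generation, terawatt-hours

    # Average power needed to deliver that much energy over a year:
    average_power_gw = barrage_output_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then / hours
    print(f"average power: {average_power_gw:.2f} GW")  # roughly 1.9 GW

    # Energy from two 1-GW stations running continuously for a year:
    two_stations_twh = 2 * 1.0 * HOURS_PER_YEAR / 1000  # GW x hours -> GWh -> TWh
    print(f"two 1-GW stations: {two_stations_twh:.1f} TWh/year")  # roughly 17.5 TWh
    ```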

    The Severn barrage would have locks to accommodate ships and perhaps a road or rail link along its top. Proponents say that the water behind it would be safe for shipping and watersports and would reduce the threat of floods.

    Then there are the drawbacks. Apart from cost, the barrage will irrevocably change the ecosystem of the enclosed estuary. The groups that sponsored last week's report say that it would threaten 35,000 hectares of protected wetlands, home to 68,000 birds in winter and more in summer. The barrage will also disrupt the migration of salmon, shads, lampreys, and sea trout to their spawning grounds. “The estuary is truly exceptional for its ecological value,” says Wynne.

    In 2007, the government-funded Sustainable Development Commission (SDC) issued a report supporting a Severn barrage, as long as it does not contravene E.U. environmental directives. The directives allow for schemes that alter habitats if there is overwhelming public benefit—such as combating climate change—and if compensatory habitats are provided, either by restoring damaged habitats or creating new habitats somewhere else. SDC also recommended that the project should be government-funded and owned to avoid higher commercial interest rates. In January, the U.K. government launched a 2-year feasibility study into the barrage.

    Last week's report, drawn up by the consultancy group Frontier Economics, argues that there is no compelling reason for the government to bankroll a project that the private sector could do equally well. Doing so, it adds, may actually contravene U.K. treasury rules. Public money would be better spent on other types of renewable project, it concludes.

    Researchers are divided over both the economic and the environmental arguments. “I'd rather see more distributed, smaller [schemes] built sooner,” says ecologist Peter Randerson of Cardiff University in the U.K., who believes a barrage would take 20 years to build. Hydraulic engineer Richard Burrows of the University of Liverpool in the U.K. notes that E.U. targets have Britain getting 60% of its energy from renewables by 2050 and that small-scale schemes will never reach such a target. “You have to capture a larger part of the tidal power out there,” he says.

    The barrage's environmental impact is also debatable. “There will be environmental modification but not necessarily degradation,” says Burrows. “You could argue that there will be a richer ecological state inside the impounded reservoir.” Oceanographer Robert Kirby, who has studied the estuary for 40 years, predicts that the barrage will be good for the estuary, slowing the fast tides that stir up sediments and block sunlight from the water.

    Randerson says that this is a “tantalizing argument” but that as yet there have been “no serious studies” of the idea. In any case, he expects the decision to be made on political rather than scientific grounds. “It's very attractive for politicians to have a big, megabucks, grandiose scheme to hang their credentials on,” he says. “It's inevitable for all the wrong reasons.”

  5. U.S. ENVIRONMENT

    Heinz Center Wants Feds to Build Ecosystem Indicator Partnership

    1. Erik Stokstad

    WASHINGTON, D.C.—The nonpartisan Heinz Center this week issued a comprehensive update on the health of U.S. ecosystems—along with a plea for the U.S. government to coordinate and fund future assessments.

    The 368-page report, titled The State of the Nation's Ecosystems 2008, summarizes 108 environmental indicators, some new and many improved, on the state of farmland, forests, and four other major ecosystems. “We've proved that it is possible to have credible improvements and refinements” of the indicators, says Robin O'Malley, who heads the center's Environmental Reporting Program. And now the center is ready to turn over the reins to the U.S. government. A companion policy document, released along with the updated report here on 17 June, lays out the center's vision for a congressionally chartered future.

    New trend?

    Nitrate runoff is most severe in the Mississippi River watershed (map, above), but the amount flowing from the Mississippi River into the Gulf of Mexico may have begun a downward trend since the 2002 report.

    DATA SOURCE: U.S. GEOLOGICAL SURVEY'S NASQAN, NAWQA, AND FEDERAL-STATE COOPERATIVE PROGRAM

    Following the suggestion of the Clinton Administration's Office of Science and Technology Policy, the Heinz Center took on the challenge of designing a U.S. system of environmental indicators in 1997. It convened more than 150 representatives from environmental groups, industry, academia, and government agencies, who eventually agreed on 103 indicators. But 44% of the indicators contained in the 2002 report were essentially left blank because of insufficient data (Science, 27 September 2002, p. 2191).

    The updated report contains new data for 68 indicators; 41 of those now have multiyear trends, a 32% increase from the first iteration. But significant holes remain. For 40 indicators, as opposed to 45 in the earlier report, no adequate data exist.

    The group also redesigned or refined 56 indicators and added six more in areas that had been underemphasized. One new national indicator is change to stream flows, which O'Malley says is already revealing the impact of climate change. Carbon storage is another new indicator. Although nationwide estimates aren't yet possible for most ecosystems, the report finds that carbon storage in agricultural soils increased by 11 million tons a year from 1995 to 2005, perhaps due to no-till management that lessens soil erosion. Tracking such changes could help policymakers decide which practices are worth encouraging to reduce atmospheric carbon dioxide levels, O'Malley notes.

    The Heinz Center, which has spent $9.2 million on the project, hopes the U.S. Congress will create a new public-private partnership to take over the process and create a system of national indicators. Meanwhile, the White House this week announced a federal pilot project to create national indicators of water quantity and quality. William Clark of Harvard University, who chaired the design committee of the Heinz report, says the key steps are to sell the next Administration on the concept and persuade Congress to fund it.

  6. AUSTRALIA

    Science Minister Drives Push to Strengthen Innovation

    1. Jeffrey Mervis

    Kim Carr readily admits that he's not a scientist and that he has never run a company. But as Australia's minister of innovation, industry, science, and research for the Labor government that took office in November, Carr is convinced that he's the right man to foster collaboration between university researchers and industry. This is one of the key planks in Prime Minister Kevin Rudd's attempt to put the country's science and technology base to better use in growing the economy. “I don't have to be an expert in scientific research; I have to be an expert in public policy,” explains Carr, a 52-year-old senator from Victoria with strong ties to the Socialist wing of his party.

    Carr was a history teacher at a technical school—“the only science I took was political science,” he cracks—and a union strategist before entering politics in 1993. His portfolio includes Australia's research councils and the Commonwealth Scientific and Industrial Research Organisation (CSIRO), the country's largest scientific and industrial research agency, giving him a foot in both industry and higher education camps. He's been busy on both fronts.

    On 6 June, he and Education Minister Julie Gillard jointly laid out the rules for grants from the $11 billion Education Investment Fund to help rebuild the country's academic-research infrastructure. The program was created in the waning days of the previous government, and Rudd has expanded it beyond universities to include technical and vocational schools. But his government's first budget (Science, 23 May, p. 998), which featured a near doubling of that fund, also contained a $60 million cut to the $1 billion CSIRO as part of a 2% belt-tightening across all agencies. Already unhappy over what they see as the government's disregard for the basic research they conduct for industry, CSIRO scientists staged a 1-day protest last week over delays in contract negotiations. Next month, a blue-ribbon commission reviewing the nation's innovation system—all government policies affecting science and technology—is due to submit its report, a document that is expected to have a major impact on future budgets.

    Carr is a cheerleader for the country's manufacturing sector, and last week he went to Japan with his boss to trumpet a $35 million grant to Toyota, from a new $500 million green car fund, to help the company build a hybrid Camry in Melbourne. “It drives innovation across the economy,” Carr says of the auto industry. “That's why I'm interested in it.”

    However, the government's wooing of automakers runs counter to the recommendations of the independent, influential Productivity Commission, which takes a dim view of industrial subsidies. And some scientists see other motives. “He comes from a labor background that would make him a bit dogmatic on the issue of protecting jobs, and the subsidy to Toyota is possibly symptomatic,” says Kurt Lambeck, a geophysicist at Australian National University in Canberra and president of the Australian Academy of Science.

    Capital visitor.

    Australia's Kim Carr told a Washington, D.C., audience that improving innovation “is an endurance event, not a sprint.”

    CREDIT: COURTESY OF THE EMBASSY OF AUSTRALIA, WASHINGTON, D.C.

    At the same time, Lambeck says that Carr, who held the same portfolio in Labor's shadow Cabinet, is passionate about innovation: “He recognizes that science and technology are important, and he's certainly seeking input from the community. But it's too soon to tell if he's listening.”

    Carr is currently visiting the United States to drum up support for all manner of industrial collaborations. He's also promoting Australia's bid to host the Square Kilometre Array (SKA), a proposed $2 billion radio telescope with a 50-fold increase in sensitivity that will allow scientists to “see” back to the very young universe (Science, 18 August 2006, p. 910). Science caught up with him during a daylong visit to Washington, D.C., that was sandwiched between stops in Detroit, Michigan.

    On the innovation review:

    K.C.: There are a whole range of approaches, which boil down to one simple proposition. And that's talking to people. We have received more than 700 submissions. [The commission] has also undertaken considerable research on the nation's innovation system and the gaps in it.

    On university-industry ties:

    K.C.: There is the expectation among universities that business is there to be plundered, that is, that industry will fund whatever we say. And there is the view in some sectors of business that universities are there simply to do the research that companies won't do themselves and that scientists should be at the direction of the private firm. Well, none of those things are right. But there is a need to find projects that people see the value of working together on.

    In Australia, the historical pattern is such that business will simply not be spending a lot of money on basic research. … We [also] don't have the same strength of philanthropy that you do in the States.

    On the infrastructure fund:

    K.C.: There's an infinite level of demand and a finite level of resources. But the current arrangement shouldn't be seen as the end of the story. There will be additional funding made available in future budgets. … We are also looking at [ways] to get universities to take more responsibility for what they do with the money they receive from the government.

    On the value of SKA:

    K.C.: We are using technologies developed in conjunction with the SKA for monitoring baggage in airports, testing engines, diagnosis and treatment of cancer, tracking of icebergs, identification of crime suspects, and monitoring climate change. … Basic research in radio astronomy, or astronomy more generally, has produced these practical outcomes. That's not the reason you do it. But it's a good place to start.

  7. VIROLOGY

    Alzheimer's Risk Factor Also Aids HIV

    1. Elsa Youngsteadt

    The defective lipid carrier apolipoprotein E4 (apoE4) has accumulated a nasty record. Not only are people who have the gene for apoE4 famously predisposed to Alzheimer's disease, but the same risk factor can also worsen several nervous system disorders and promote cardiovascular disease. A study out this week suggests that apoE4 also hastens the death of people infected with HIV, possibly by allowing the virus easy entry into cells.

    This discovery already has scientists wondering if the new genetic risk factor could help guide AIDS treatment and if prospective Alzheimer's drugs could one day fight HIV as well. But some remain skeptical, noting that it's not clear how many of the apoE4-linked deaths were due to AIDS.

    Apolipoprotein E is supposed to ferry cholesterol and lipoproteins through the plasma and cerebrospinal fluid to cells. One version of the protein, apoE3, does the job just fine—but two other versions, E2 and E4, are associated with different diseases. People possess two copies of the gene for apoE, and having even one copy of the E4 variant is a major risk factor for Alzheimer's disease.

    Some research has also hinted at a role for E4 in infectious diseases. One 1998 study, for example, found that among 44 HIV patients, individuals with at least one copy of the E4 gene were twice as likely as other patients to develop HIV-associated dementia.

    Intrigued by that dementia link, a team led by physician-scientist Sunil Ahuja of the University of Texas Health Science Center in San Antonio recently studied the apoE gene status of 1267 people with HIV—mostly servicemen—who received free military health care for up to 20 years. The E4-dementia connection did not hold up in this larger study—perhaps because new antiviral drugs have nearly eliminated the condition, suggests Colin Hall, a neurologist at the University of North Carolina, Chapel Hill, who led the 1998 study.

    What Ahuja and his team do report this week in the Proceedings of the National Academy of Sciences is a dramatic effect of E4 status on survival: 10 years after diagnosis, almost all of the 27 patients who had two copies of the E4 gene had died, but among those with other gene combinations, including patients carrying a single E4 copy, half were still alive. Double E4 patients also developed AIDS-related diseases such as toxoplasmosis twice as fast as the other patients.

    “The clinical differences … are really not subtle,” says Michael Lederman, a molecular biologist at Case Western Reserve University in Cleveland, Ohio. “People should hustle after [this].”

    It's not clear whether E4 is detrimental or E3 is protective. But by adding purified E4 or E3 to an HIV cell infection assay, Ahuja's team showed that the virus most frequently infected cells incubated with E4.

    Ahuja predicts that a role for apoE in infectious disease is “a broader theme that's going to emerge down the road.” Linking E4 to AIDS progression could lead to new HIV drugs. Robert Mahley, a pathologist at the J. David Gladstone Institutes in San Francisco, California, and an author of the study, is working with Merck to develop Alzheimer's drugs that would force apoE4 to assume an E3-like structure. Those compounds, however, aren't even ready for animal tests.

    Risky.

    HIV (inset) may more easily infect cells of people who produce only apolipoprotein E4 (right) instead of the more common E3 (left).

    CREDITS: COURTESY OF R. W. MAHLEY AND K. H. WEISGRABER, THE J. DAVID GLADSTONE INSTITUTES; (INSET) K. SUTLIFF/SCIENCE

    Robert Shamburek, who specializes in lipoprotein metabolism at the National Heart, Lung, and Blood Institute in Bethesda, Maryland, agrees that the new study is provocative. But he worries that Ahuja's team doesn't define the cause of death among those with the E4 gene. Instead of AIDS, they may be dying of other causes, such as cardiovascular problems that would respond better to cholesterol-lowering drugs than to aggressive antiviral treatment.

  8. GEOPHYSICS

    An Unpredictably Violent Fault

    1. Richard Stone*
    1. With reporting by Hao Xin.

    Chinese researchers placed a dense array of seismometers around a dangerous-looking seam in the rocks of Sichuan—only to be blindsided by the true killer.

    Taken by surprise.

    Locals evacuate from devastated Beichuan County.

    CREDIT: CHEN XIE/XINHUA PRESS/CORBIS

    BEIJING—Geophysicists knew that the rugged mountains of Sichuan Province were primed for a “big one.” But they didn't know when or which fault would give way first. Two years ago, Liu Qiyuan, a geophysicist at the China Earthquake Administration's (CEA's) Institute of Geology, bet on the Anninghe fault, which has been shifting to the east about 10 millimeters a year as the Indian subcontinent shoves the Tibetan Plateau against the Sichuan Basin. Liu deployed 300 broadband seismometers—the $6 million array is one of the densest in the world—around Anninghe and two faults to the north. Thanks to a prodigious 1.8 terabytes of seismic data per year, he began compiling a high-resolution three-dimensional map of the underlying crust.

    Liu and his colleagues guessed wrong.

    Mountain building.

    During the Wenchuan earthquake, land west of Longmenshan lurched eastward. At Shiyan village in Beichuan County, the rupture lifted a road more than 4 meters, destroying houses along the scarp.

    CREDITS: JIE CHEN; SOURCE: USGS

    Northeast of Anninghe, on 12 May, a complex fault system ruptured under the Longmenshan, or Dragon's Gate Mountains, releasing energy equivalent to about 2000 Hiroshima-size atomic bombs. Nearly 70,000 people are known to have died, thousands are missing, and more than 1.5 million people lost their homes in the magnitude-7.9 Wenchuan earthquake. Land west of the Longmenshan fault system had been edging eastward toward the Sichuan Basin at a rate of only a couple of millimeters per year, according to Global Positioning System (GPS) measurements. Liu says the GPS readings blinded researchers to the real threat: “We did not imagine such a big event happening in Longmenshan.”

    Scientists will need to redraw seismic-intensity maps that guide planners on safe areas for construction and how much shaking buildings must be designed to withstand. The U.S. Geological Survey (USGS) estimates that the Wenchuan earthquake subjected 1.3 million people to violent or extreme shaking. The revisions must be made posthaste in Sichuan, which is keen to begin reconstruction.

    But with many of China's 1.3 billion people living in seismic zones, every province will have to check its intensity maps. Municipalities will have to upgrade building codes and strengthen enforcement. “They can begin to prepare for the inevitable,” says geophysicist Walter Mooney, USGS's top expert on China. But in the long run, a deeper understanding of the titanic forces at play in Earth's crust is necessary to refine predictions of where future big quakes might strike.

    Other revelations are sure to follow. For the past month, a few dozen Chinese scientists have been braving strong aftershocks to survey the rupture on the central Longmenshan fault and a shorter gash on a parallel fault. Their observations, coupled with readings from the northeastern corner of Liu's Sichuan array, which covers much of the Longmenshan system, should give an unprecedented look at how a powerful temblor warps geological structure. “The Sichuan earthquake is very important because it's rare to see this happen on a thrust fault inside a continent,” says seismologist David Simpson, president of Incorporated Research Institutions for Seismology in Washington, D.C., a university consortium that has wired up the world with digital seismometers. Not surprisingly, Liu has been barraged with requests for data from the CEA array; he expects to release preliminary analyses in the next several weeks.

    In the shadow of Dragon's Gate

    For someone who has spent nearly three solid weeks surveying Longmenshan and ducking aftershock-induced landslides, geologist Wen Xue-ze looks surprisingly refreshed. Chengdu is sweltering on 5 June, but Wen, after a couple of arduous days in the field around Wenchuan, has just showered and changed and can now review his team's copious findings at the Sichuan Seismological Bureau (SSB) in Chengdu, CEA's biggest provincial bureau.

    At 2:28 p.m. on 12 May, when the earthquake shook the bureau violently, “we realized it was a big one, but we didn't know where it struck,” Wen says. In a few minutes, they learned that the epicenter was Wenchuan, just 70 kilometers northwest of Chengdu. That brought a fresh worry: the possibility that the 156-meter-high Zipingpu Dam, several kilometers east of the epicenter, would collapse. Engineers determined that Zipingpu suffered cracking but was structurally sound. Some Chinese and Western geophysicists privately say it's necessary to investigate whether the dam, completed less than 2 years ago, triggered the earthquake.

    A couple of hours after the quake struck, Wen and other SSB staff reached the historic town of Dujiangyan, situated on the edge of the Sichuan Basin in the shadow of the Longmenshan. Much of the town lay in ruins, the road wending through the mountains to Wenchuan blocked by landslides. To aid relief efforts, Sichuan officials in Dujiangyan asked SSB to produce a map of the hardest hit areas. Based on aftershocks in the first several hours after the main shock, Wen and SSB Director Wu Yao-qiang circled an area encompassing a mind-numbing 20,000 square kilometers. That night, SSB delivered the sobering map to authorities. “They realized the destroyed area was enormous,” Wen says.

    Wen led a team into the field on 17 May to look for surface ruptures. Their first stop was Beichuan, which straddled the main fault and had been reduced to rubble. Wen's group found that land on the northwestern side of the fault had been thrust up as much as 5 meters. “This is mountain building,” says Liu. (The third-largest temblor ever recorded—the magnitude-9.3 earthquake off Sumatra that spawned the tsunami in 2004—ruptured 1600 kilometers, shifting the seabed fault a staggering 20 meters in places.)

    Based on the dramatic scar at Beichuan, CEA chiefs in Dujiangyan asked Wen to rev up his survey work and assigned him a team of 30 scientists. They fanned out in eight groups and over 2 weeks mapped a rupture running more than 200 kilometers along the main fault. In addition to lifting 3 to 5 meters, the fault had shifted areas to the west 1 to 4 meters relative to those in the east. “The earthquake lifted up the mountains and pushed them to the side,” says Mooney. The section near Beichuan showed a strike-slip movement, a grinding twist as two slabs of crust moved in opposite directions. Aftershocks have rattled 300 kilometers of the main fault, including a roughly 100-kilometer section to the northeast that Wen says did not rupture. His team also discovered a rupture more than 50 kilometers long on a secondary fault 10 to 20 kilometers to the southeast. A third fault in the Longmenshan system northwest of the main one appears not to have ruptured.

    Before the Wenchuan earthquake, Wen and his colleagues, like Liu, perceived two immediate threats. One was the Anninghe fault—which has a 90-kilometer seismic gap, or eerily quiet stretch with few tremors. The other was the Xianshuihe fault, which runs southwest to northeast, forming a “V” with the Longmenshan fault, and which, like Anninghe, has been moving about 10 millimeters per year. Longmenshan's giving way before the others, says Wen, “is a challenge to the traditional idea of active fault segmentation.”

    Out of the blue?

    The Wenchuan earthquake certainly wrong-footed CEA headquarters. After the quake struck, it took less than 6 minutes for the initial seismic waves to leave Chinese soil. By the time the waves had reached the far corners of the globe 20 minutes later, USGS's National Earthquake Information Center in Colorado had pinpointed the epicenter and assigned a preliminary magnitude of 7.5. Meanwhile, CEA's automated software determined, erroneously, that a magnitude-3.9 earthquake had struck the Beijing suburb of Tongzhou and posted the information to CEA's Web site. Minutes later, CEA identified the correct epicenter and updated the Web site. By evening, a 230-person-strong CEA team had arrived in Dujiangyan (Science, 23 May, p. 996). But an incorrect calculation of the moment tensor—a mathematical description of a fault's movement during a quake—lingered on the Web site for 4 days.

    The errors left CEA's cadre of scientists red-faced. The massive agency employs some 10,000 people, but only about 100 are Ph.D. scientists. “The problem is at the top,” says Chen Yuntai, honorary director of CEA's Institute of Geophysics. “The root cause of the mistakes is not placing importance on the science.” Perhaps as a result, says Peking (Beijing) University geophysicist Huang Qinghua, “CEA is isolated from the scientific community.”

    One long-standing gripe is a lack of data sharing. The main entities involved in earthquake research—CEA, China's Geological Survey, the Chinese Academy of Sciences, and universities under the Ministry of Education—each collect similar data using their own instruments. “We need a policy to force scientists to upload data to a common server,” says Zhou Shiyong, a geophysicist at Peking University. Collaboration across disciplines must also improve. “Before, seismologists and geohazards researchers were working completely separately. Now we realize we have to cooperate,” says Cui Peng, a geomorphologist at the Institute of Mountain Hazards and Environment in Chengdu.

    Peking University and CEA last year established a joint seismology research center. “Theoretically, we can get data from CEA,” says Huang. But it hasn't worked very well. U.S. experts, meanwhile, complain that China's policy of delaying the release of seismic data by 30 minutes impedes emergency response. Chinese researchers say their hands are tied by the military, but some say the Wenchuan quake may give momentum to arguments for real-time data release.

    A more fundamental issue is CEA's mission: not only to monitor and respond to earthquakes but also to predict them. “CEA issues many earthquake warnings, but they are just guessing. It's not science,” says Zhou. One recognized precursor of some major quakes is foreshocks that increase in frequency and intensity, but “very few earthquakes have identifiable foreshocks,” says Simpson—and Wenchuan had none.

    Reading portents

    Nonseismic warning signs are even more problematic. In the hours before the Wenchuan earthquake, a Taiwanese meteorology satellite reportedly detected a decrease in density of charged particles in the ionosphere above Wenchuan. Although some researchers speculate that it may have been due to radon seeping into the air, Huang notes that a link between earthquakes and ionosphere anomalies is controversial. A few days before that, the streets of a Sichuan village near the fault were filled with toads migrating from the mountains. “Everyone hopes that animals can tell us something us humans don't know,” says Mooney. “But animal behavior is way too unreliable.” Even in hindsight, he and others say, they've seen no geophysical connection between these anomalies and the earthquake.

    Chinese scientists devoted considerable energy to research on potential precursors after the late Premier Zhou Enlai in 1966 tasked CEA with earthquake prediction. But a decade later, a disaster laid bare the limitations of this effort. On 28 July 1976, a magnitude-7.8 earthquake leveled Tangshan, 160 kilometers east of Beijing. Officially, 240,000 people died, the highest earthquake death toll in the 20th century. Before the quake, the geology beneath Tangshan was restless: In early July, for example, locals reported fluctuations in the water table, and on the eve of the disaster, there were reports of odd lights emanating from the ground. Some experts were convinced an earthquake was imminent, but others “were waiting for the foreshocks,” says Chen Xiaofei, a geophysicist at Peking University. None came. “Tangshan is why I believe that precursors exist, but we don't understand them yet,” says Huang. The primitive state of the field, says a senior CEA geophysicist, is similar to that of weather forecasting a century ago, when people relied on sky observations and animal behavior. “Meteorologists have made the transition from empirical to physical prediction,” he says. “We haven't.”

    The danger below.

    CEA's seismometer array near Beijing has revealed a tumultuous—and still dangerous—geology beneath the city of Tangshan, which was leveled by an earthquake in 1976.

    CREDIT: COURTESY OF LIU QIYUAN

    In the past decade, Chinese research has largely followed the lead of Japanese and U.S. efforts that focus on deep geophysical processes. For instance, Liu notes, CEA's other broadband array—107 seismometers in the “Capital Circle” region around Beijing—has revealed a convoluted geology under Tangshan caused by a localized upwelling of magma into the crust (see diagram, above). Such mapping can flag hot spots for future megaquakes where GPS reveals little deformation. “The findings suggest that Tangshan remains perilous,” warns Liu.

    Other Chinese scientists argue that their country should chart its own course, with an emphasis on characterizing nonseismic anomalies preceding major temblors. “We pay too much attention to deep structure,” argues Zhou, who would like to see China's seismically restive Xinjiang Province turned into an earthquake prediction laboratory. Huang says that “rigorous and reliable” research could allow scientists to design a precursor monitoring network. “We should study geophysical and geochemical signals,” says Liu. But he cautions that precursors are likely to be more complicated than the earthquakes they presumably foretell. “Some of my colleagues are in too big a hurry to succeed in earthquake prediction,” he says.

    In the short term, all eyes will be on Sichuan. Liu's team is processing data from the Longmenshan broadband stations; only three of the pricey seismometers were damaged. Wen's group will mount field surveys and comb records of past earthquakes in Longmenshan to better estimate the intervals between major quakes. “We want to understand the relationship between this large earthquake and historical seismicity,” he says. The SSB researchers will also investigate whether the Wenchuan earthquake transferred stress to surrounding faults such as southwestern Longmenshan and Anninghe.

    Geophysicists will help guide reconstruction, which the Sichuan government aims to complete in 3 years. “We want people to have a better life than they had before the earthquake,” says Cui, whose mountain hazards institute is one of dozens of organizations participating in reconstruction planning. The most urgent task, Wen says, is to remap the faults. Within the next few weeks, he says, a CEA team will produce an active fault map of the region, which will be revised as new information comes in.

    Although geophysicists do not expect another huge earthquake in the 12 May rupture zone for another century or two, Sichuan authorities have already chosen another location for a new Beichuan; there are no plans to rebuild other villages on the Longmenshan fault. “People feel the area is no good,” says Cui. Adding to the agony is the observation that many buildings that collapsed were poorly constructed. That was the biggest lesson of the Wenchuan earthquake, says the senior CEA geophysicist. “We got this knowledge at the expense of many lives. We should never let it happen again.”

  9. BEHAVIORAL GENETICS

    Abuzz About Behavior

    1. Elizabeth Pennisi

    Researchers are tracking down the genes underlying variations in alcohol dependence, sleepiness, and other behaviors by studying specially bred fruit flies.

    RALEIGH, NORTH CAROLINA—Humans can be short-tempered or mild mannered, shy or boisterous, neat or slovenly—and every combination in between. Understanding the complex genetic networks that underlie behavior—and, ultimately, what makes each of us unique—is a mind-boggling task. Now, Robert Anholt, Trudy Mackay, and their colleagues have developed a resource that may help researchers begin to figure out how genes make us who we are.

    Together with a dozen colleagues at North Carolina State University (NCSU) in Raleigh, as well as collaborators in Europe and Canada, this husband-and-wife team has established a collection of inbred fruit flies (Drosophila melanogaster) from a wild population in North Carolina and are correlating patterns of gene expression with specific behaviors in the insects. The work could steer biomedical researchers to genes that influence aspects of human behavior, says Anholt. “What Trudy and Robert are doing will be very important for the discovery of genetic changes that contribute to behavior,” says Catherine “Katie” Peichel, an evolutionary geneticist at the Fred Hutchinson Cancer Research Center in Seattle, Washington.

    The project began a decade ago. Twice, in 1999 and 2002, Mackay's NCSU collaborator Richard Lyman showed up at Raleigh's farmers' market and picked off the fruit flies that emerged as crates of freshly harvested peaches were opened. Individual females were placed in vials, and each fly that reproduced became the progenitor of a single line of flies. Their offspring were allowed to mate only with each other, resulting, after several generations, in a line of genetically identical individuals that display consistent behavior. Each line is genetically—and behaviorally—different from all the others. The overall goal is to capture the genetic variation in the North Carolina wild fruit fly population in these wild-derived inbred lines.

    To date, the NCSU group has established 345 lines, and the researchers have used microarrays to determine the level of activity of 18,000 genes in 40 of them. Surprisingly, the expression of about 10,000 genes varies from one line to the next, Mackay and Anholt reported here earlier this month at the annual meeting of the American Genetic Association.

    The team has also tracked variation in behavior from line to line under different conditions. Then, by looking for differences in gene expression in lines that differ most in a particular behavior, they can zoom in on genes likely to underlie that particular behavior. And, drilling down further, the researchers use existing lines of D. melanogaster with mutations in every gene, as well as techniques for manipulating genes in this species, to pin down what each gene does and how it might influence a particular behavior.
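
    As a rough illustration of that screening step (a minimal sketch with made-up numbers, not the NCSU team's actual analysis pipeline), one can rank genes by how strongly their per-line expression tracks a per-line behavioral score:

    ```python
    # Toy sketch: rank genes by the correlation between their expression across
    # inbred lines and a behavioral score measured for the same lines.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_lines, n_genes = 40, 1000                        # e.g., 40 lines, 1000 genes
    expression = rng.normal(size=(n_lines, n_genes))   # microarray-style matrix
    behavior = rng.normal(size=n_lines)                # e.g., hours of sleep per line

    # Correlate each gene's expression (across lines) with the behavior.
    results = []
    for g in range(n_genes):
        r, p = pearsonr(expression[:, g], behavior)
        results.append((g, r, p))

    # The most strongly associated genes become candidates for follow-up with
    # mutant lines and targeted genetic manipulation.
    for g, r, p in sorted(results, key=lambda t: t[2])[:10]:
        print(f"gene {g:4d}  r = {r:+.2f}  p = {p:.3g}")
    ```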

    Genes in a bottle.

    In fruit flies, variation in sleep correlates with gene activity, helping to pinpoint relevant genes.

    CREDIT: S. HARBISON AND C. COUCH/NCSU

    The analyses reveal not just individual genes but sets of genes that act in concert. Taken together, “these are the genes that are contributing to the variation in the trait you are looking at,” says William Etges, an evolutionary biologist at the University of Arkansas, Fayetteville. Synchronized genes are likely to be part of a common biochemical pathway; thus known genes in a set provide clues about the function of uncharacterized genes in that cluster.

    To get at the genetic underpinnings of sleep, for example, NCSU's Susan Harbison has compared gene-expression patterns of flies that keep quite different schedules, some spending 20 hours a day resting, others barely pausing for four. She reported that different sets of genes were turned up for nappers and for nighttime snoozers—out of the hundreds correlated with sleep, only 78 were common to night and day resting.

    Another project, spearheaded by NCSU postdoc Tatiana Morozova, is probing the genetic underpinnings of the fruit flies' responses to alcohol. Morozova monitors which lines are sensitive to ethanol vapor—measured by how long it takes individuals to lose their ability to cling to a tilted screen—and which ones become more tolerant when the exposure is repeated. In people, tolerance is a risk factor for alcoholism. She found a wide range in both sensitivity and tolerance, but the two traits were not linked. “You can't tell who will develop tolerance” based on who is sensitive, Morozova reported.

    The gene-expression analysis revealed 195 genes that appear to play a role in sensitivity to alcohol and about 600 linked to tolerance. Many of the genes that underlie sensitivity and tolerance play a role in metabolism, but few were common to both responses. And many have human counterparts. Anholt, Mackay, and Morozova are now investigating whether some of these genes correlate with alcohol tolerance in humans.

    NCSU postdoctoral fellow Katherine Jordan is using the inbred lines to look for genes that regulate the insects' responses to a variety of psychoactive drugs. “There's no uniform pattern,” she reported. “It's very similar to how humans react to these drugs.” She has identified several dozen genes whose variations in activity correlated with differences in the flies' responses, and she is now focusing on a handful that might influence the efficacy of these medications.

    These early results are fueling widespread interest in the inbred lines. Soon, anyone will be able to order the lines from a stock center, and microarray data will be available on a public database. And the U.S. National Human Genome Research Institute in Bethesda, Maryland, has just awarded Baylor College of Medicine $5.75 million to sequence the genomes of individuals from 192 of the lines; the sequences will be publicly available once they are completed, Mackay notes. “It's the next generation [of genetic studies],” predicts Michael Ritchie, an evolutionary biologist at the University of St. Andrews in Fife, U.K. “You can see people asking not just about two or three genes but about [whole] networks.”

  10. MICROBIAL ECOLOGY

    Out of Thin Air

    1. Christopher Pala*
    1. Christopher Pala is a writer based in Hawaii.

    On a volcano, microbiologists take a trip back through time to understand how microbes help restart life on lava fields and regulate the air we breathe.

    New beginning.

    Lava flows from Kilauea in Hawaii create new land and reset the volcano's biological clock.

    CREDIT: JIM SUGAR/CORBIS

    VOLCANOES NATIONAL PARK, HAWAII—On a warm, humid day in early May, Gary King is watching an ecosystem come alive. Here, the glowing lava of Kilauea regularly destroys the slopes' forests, then cools into dark gray, gold-flaked phantasmagorical forms. For decades, these lava fields bake under the hot sun, seemingly lifeless. But eventually, a fern or a koa sapling springs up timidly from a crevice, precursors of the forest that will ultimately rise again. A walk across the park takes King back through time, allowing him to get a close look at the specialized bacteria that are midwives to this rebirth.

    “Here, you can study how the microbes colonize the lava, how they provide nutrients for the first plants, and how the microbial community evolves as the amount of plants grows,” says King, a slight, soft-spoken 54-year-old microbiologist from Louisiana State University in Baton Rouge.

    Through field and lab work, he and his colleagues have discovered that these microbial pioneers first survive by processing dust, rainwater, atmospheric hydrogen, and, to a surprising degree, carbon monoxide. Only a few dozen species were known to use this typically toxic gas; now, King's genetic analyses have uncovered more than 100. His studies of the interactions between the bacteria, flora, and atmosphere are showing how these microbes help control Earth's balance of greenhouse gases—both directly, through their metabolism, and indirectly, by creating hospitable landscapes for plants that can sequester carbon.

    “King is studying this very important step of how life takes hold and discovering keystone organisms that play a critical role in the Earth's biogeochemical cycles,” says Matthew Kane, a program director at the U.S. National Science Foundation in Washington, D.C. As King pins down the carbon monoxide-oxidizing species, “he will eventually figure out what they like and don't like,” adds Ortwin Meyer, a microbiologist at the University of Bayreuth in Germany. “Then we should be able to tell decision-makers what to do and not to do” about putting these organisms to good use in the fight against global warming.

    Life returns.

    Bacterial pioneers enrich sunbaked lava soil with ammonia, enabling plants to take hold.

    CREDIT: CHRISTOPHER PALA

    Today, with Kane looking on, King bends over a crevice at the rim of a 30-meter-deep smoking caldera. He scrapes off tiny bits of 34-year-old lava and puts them into a tube filled with a solution that preserves any DNA that they contain. There's probably a billion bacteria per gram, he says. Examining the 50,000 expected to have carbon monoxide-processing capabilities “will tell us what the diversity is like at this site, and we can compare it to the diversity at other sites, older or younger, and that way we can track the whole process of microbial succession.”

    He and Kane then move on to the next site, one of eight scheduled for sampling on this 2-day visit. Later, they head to the leafy, ever-rainy Volcano Village and stop at the rented house that serves as the temporary lab for King's team. There, graduate student Carolyn Weber is running a gas chromatograph that measures how much carbon monoxide is being consumed by bacteria samples brought back from the volcanoes.

    The work being done here, says Ralf Conrad, a biochemist at the Max Planck Institute for Terrestrial Microbiology in Marburg, Germany, will make it easier to find these bacteria in more complex soils around the world, and that “will help us understand their global role.”

    Hungry for carbon monoxide

    Although carbon dioxide, the chief greenhouse gas, grabs many of the global warming headlines, its oxygen-deprived cousin, carbon monoxide, plays a role as well. Industrialization has caused carbon monoxide concentrations to rise. Once in the air, it turns into carbon dioxide within 3 months by reacting with hydroxyl radicals. These radicals also break down methane, another greenhouse gas. “So when carbon monoxide uses up these radicals, there's less left to break down the methane, and the methane increases,” King explains. Thus, removing carbon monoxide from the air “is important.”
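
    The tradeoff King describes comes down to two standard atmospheric reactions (not spelled out in the article) competing for the same hydroxyl radicals:

    \[
    \mathrm{CO} + \mathrm{OH}^{\bullet} \longrightarrow \mathrm{CO}_2 + \mathrm{H}^{\bullet}
    \qquad\qquad
    \mathrm{CH}_4 + \mathrm{OH}^{\bullet} \longrightarrow \mathrm{CH}_3^{\bullet} + \mathrm{H}_2\mathrm{O}
    \]

    The more carbon monoxide there is to consume hydroxyl radicals through the first reaction, the fewer remain for the second, so methane lingers longer in the atmosphere.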

    Meyer estimates that at least 15% of the carbon monoxide being released into the atmosphere is absorbed by bacteria living on the first centimeter of the world's soil cover, even though this gas is toxic to most organisms, including microbes. King, long curious about how bacteria affect carbon monoxide concentrations in the atmosphere, had been comparing samples from Georgia and Maine; he first came to Hawaii a decade ago to see how these bacteria worked in a seasonless environment.

    Intrigued by what new “soils”—such as newly hardened lava deposits—would do, King measured the rates of carbon monoxide and hydrogen removal from the air on lava flows laid down anywhere from 25 to 300 years ago. He was surprised to find that some of the youngest—seemingly barren, fractured surfaces—were able to remove these gases from the air as rapidly as mature continental forest soils that support rich communities of bacteria. He wondered if such high consumption was the key to survival for microbes settling on new lava, given the scarcity of other energy sources.

    By comparing rates of carbon monoxide and hydrogen uptake with total rates of respiration in the bacteria, King showed in 2003 that the two gases account for more than 20% of the bacteria's energy needs. He then used genetics to learn more about these microbes. From an assortment of soil samples, his team isolated all the copies of a gene for an enzyme critical to carbon monoxide use. By counting and comparing the different versions of the gene—different ones represent different species—the researchers got a sense of the number and diversity of carbon monoxide consumers at various sites. “King is the first to identify bacteria by their carbon monoxide function using genetic probes,” says Meyer.
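
    The article does not describe the team's analysis pipeline, but the basic bookkeeping behind "counting and comparing" gene versions can be sketched in a few lines of code. In the snippet below, the variant names and counts are invented for illustration; richness is simply the number of distinct variants, and Shannon's index is one standard way to summarize how evenly they are distributed.

        import math

        # Hypothetical counts of distinct versions of the carbon monoxide-oxidation
        # gene recovered from one lava site (names and numbers are illustrative only).
        variant_counts = {"variant_A": 42, "variant_B": 17, "variant_C": 8, "variant_D": 3}

        # Richness: each distinct gene version stands in for a different species.
        richness = len(variant_counts)

        # Shannon diversity H' = -sum(p_i * ln p_i); higher values mean more variants
        # spread more evenly across the sample.
        total = sum(variant_counts.values())
        shannon = -sum((n / total) * math.log(n / total) for n in variant_counts.values())

        print(f"richness = {richness}, Shannon H' = {shannon:.2f}")

    Comparing such numbers across lava flows of different ages is what lets the team track microbial succession from site to site.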

    The resulting data showed that microbes that use this gas are far more diverse than he had previously imagined, King says. Moreover, distinctly different communities of carbon monoxide-oxidizing microbes exist on different types of lava—from the ropy “pahoehoe” to the wisps of lava called Pele's hair—and in different types of soils. The newly recognized oxidizers of the gas include important symbiotic partners for peanuts, soybeans, and other plants, as well as plant, animal, and human pathogens. The ability of symbionts and pathogens to use the gas may help explain their survival outside their host organisms, he says.

    While they are consuming carbon monoxide, these bacteria are also taking nitrogen from the atmosphere and converting it into ammonia, a fertilizer that enriches the lava and encourages plant growth. King has shown that the iconic acacia koa tree, a Hawaii endemic that's the preferred wood for the oceangoing canoes of the Polynesians (as well as for many of the handicrafts sold to tourists), thrives on barren soils because lumpy nodules on its roots host these carbon monoxide- and nitrogen-processing bacteria.
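
    The nitrogen-fixing half of that partnership follows the familiar nitrogenase stoichiometry (a textbook equation, not a measurement from these root nodules):

        \[
        \mathrm{N_2} + 8\,\mathrm{H^{+}} + 8\,e^{-} + 16\,\mathrm{ATP} \rightarrow 2\,\mathrm{NH_3} + \mathrm{H_2} + 16\,\mathrm{ADP} + 16\,\mathrm{P_i}
        \]

    The heavy ATP bill helps explain why such symbioses pay off most handsomely where nitrogen is scarce, as it is on fresh lava.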

    As plant communities take hold, the bacteria “eat” less carbon monoxide, King and Weber reported online 29 November 2007 in The ISME Journal. Now the microbes feast on organic matter and on carbon dioxide and really begin to thrive. “We've found that in the forested area, there are five times as many species of these carbon monoxide oxidizers as in the dry area,” Weber says. In addition, the expanding plant cover begins to turn the area into a carbon sink.

    Pay dirt.

    King gathers tiny pieces of lava called Pele's hair (inset) to look for carbon monoxide-consuming bacteria.

    CREDITS: CHRISTOPHER PALA

    Warehousing carbon

    During this field visit, King heads beyond the bare lava fields to the cool, humid Pu'u Puai forest. He sticks his trowel into the soft ground, a mix of bits of lightweight dead leaves and humus, holds it up, and says, “This is carbon sequestration.” The leaves of the endemic flowering tree ohia lehua and the invasive fire tree above King absorb carbon dioxide from the atmosphere, lock up the carbon in carbohydrates, fall on the ground, and die. Bacteria then process that plant matter and release some captured carbon as carbon dioxide. The rest of the carbon stays put, sequestered from the atmosphere. Waving his dirt-filled trowel, he continues: “This stuff is at least 20% carbon, which isn't unusual for an old-growth forest. What is unique is that this forest is only 50 years old. In other places, in that time span, you'll find much less soil and carbon.” He doesn't understand why these differences develop but thinks that the volcano is the place to find out.

    “This is a place that can show us the rules that govern sequestration, because we can follow the process from the beginning,” he explains. King stands up and walks out of the forest into a sunlit area of tephra, gravel hurled from exploding volcanoes. The ground is sparsely punctuated by ferns and by ohelo bushes, another Hawaiian endemic, laden with bright-red berries. “On one side, we have an area that is storing a lot of carbon, and in the other we have an area where there is nothing,” he explains. “If we can understand the basic principles of how you go from nothing to a lot in 50 years, then we might be able to better manage carbon storage in other soils to help reduce the rise of carbon dioxide in the atmosphere.”

    Christian Giardina of the U.S. Department of Agriculture's Forest Service in Hilo, Hawaii, agrees. “Gary's research is providing fundamental information on how the composition and structure of these microbial communities affect the rate at which they release CO2.”

    Later, as we drive to the next site, King cautions that carbon sequestration in plants is no magic bullet against global warming. “We're never going to be able to use plants to remove more greenhouse gases than we're putting in,” he says. “But if we understand how these microbes affect carbon in soil, we might be able to manipulate the growth of plants and their decomposition in a way that would influence their effect on these gases in the atmosphere.”

  11. NANO SCIENCE AND TECHNOLOGY INSTITUTE NANOTECH 2008

    Membrane Makes Plastic Precursor Deliver More Bag for the Buck

    1. Robert F. Service

    Researchers at the Nano Science and Technology Institute Nanotech 2008 meeting, held in Boston from 1 to 5 June, announced the development of a novel metal-ceramic membrane that enables them to produce ethylene, the starting material for polyethylene, more cheaply than current methods.

    To make the biggest impact, tackle the biggest problem. That's what Balu Balachandran and his colleagues at Argonne National Laboratory in Illinois did when they set out to reform the making of polyethylene, the world's most abundant commodity plastic. At the meeting, Balachandran reported that his team has developed a novel metal-ceramic membrane that enables them to produce ethylene, the starting material for polyethylene. If adopted widely, the process could cut production costs by 15%, saving millions of dollars a year and dramatically cutting the amount of greenhouse gases released into the atmosphere.

    Ronald Pate, an energy and water analyst at Sandia National Laboratories in Albuquerque, New Mexico, says improving the production of ethylene “is a big leverage opportunity.” Pate says that the technology still needs to prove itself as a viable industrial-scale process but that “it ought to be looked at to bring the carbon footprint down.”

    In the world of commodity chemicals, petite ethylene (C2H4) is a behemoth. More than 75 million metric tons of the gas are produced each year to make the plastics that go into everything from grocery bags and milk jugs to compact disc cases and wire sheathing. The simple organic molecule can be made from many materials, most commonly by breaking apart, or “cracking,” light liquid hydrocarbons with high-temperature steam. Although simple, the process converts only about 64% of the starting materials into ethylene. One reason is that the carbon in the starting hydrocarbons can combine with oxygen from the steam to make CO2 instead of pairing up with hydrogens to make ethylene.

    Big impact.

    A new approach could slash the cost and carbon footprint of making polyethylene.

    CREDIT: LEW ROBERTSON/STOCKFOOD/GETTY IMAGES

    To make the process more efficient, Balachandran and colleagues looked for a way to crack ethane (C2H6) and other hydrocarbons while keeping the oxygen and carbons apart. They settled on using a thick membrane made from a mixture of palladium and a ceramic called yttria-stabilized zirconia. Although Balachandran did not reveal the precise makeup of the new membrane or how it transports hydrogen, he and his Argonne colleagues have developed related membranes to separate hydrogen gas for use in fuel cells. For their current study, Balachandran reported, they put ethane on one side of the membrane and air on the other. Heating the ethane caused most of the molecules to break apart into ethylene and H2 molecules. The H2 molecules then traveled through the membrane and combined with oxygen from the air, a reaction that generates heat. In turn, the heat traveled back through the membrane to sustain the ethane-cracking reaction.
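
    In outline, the scheme pairs an endothermic step on one side of the membrane with an exothermic one on the other (the equations below are the standard reactions implied by Balachandran's description, not details he disclosed):

        \[
        \mathrm{C_2H_6} \rightarrow \mathrm{C_2H_4} + \mathrm{H_2} \qquad \text{(ethane side, absorbs heat)}
        \]
        \[
        \mathrm{H_2} + \tfrac{1}{2}\,\mathrm{O_2} \rightarrow \mathrm{H_2O} \qquad \text{(air side, releases heat)}
        \]

    Pulling hydrogen through the membrane should also nudge the first reaction's equilibrium toward ethylene, which would be consistent with the higher conversion the team reported.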

    The process converted 73.5% of the ethane to ethylene, nearly 10 percentage points more than conventional steam cracking achieves. As a bonus, the new process requires no energy to generate superheated steam. Balachandran says he suspects that could simplify reactor designs and help drop production costs.

    Balachandran acknowledges that the new process is still in its infancy. The next steps are to scale up the process and see if it works with other hydrocarbon feedstocks. If they succeed, your plastic milk jugs of the future may well become a little greener.

  12. NANO SCIENCE AND TECHNOLOGY INSTITUTE NANOTECH 2008

    Don't Sweat the Small Stuff

    1. Robert F. Service

    At the Nano Science and Technology Institute Nanotech 2008 meeting, held in Boston from 1 to 5 June, chemists reported an easy-to-use general technique for making dozens of different types of metal oxide nanoparticles that could have a major impact on everything from catalysts to electronics.

    Nanoparticles are known for packing macro-sized surprises. And that's just what chemist Brian Woodfield and his colleagues at Brigham Young University in Provo, Utah, got when they set out to solve a nanoparticle mystery last year.

    Their cerium oxide nanoparticles were displaying odd magnetic behaviors. But they were also spiked with impurities. To see whether the impurities were causing the odd readings, Woodfield's postdoctoral assistant Shengfeng Liu came up with a new scheme for synthesizing high-purity cerium oxide particles. He found that the impurities were indeed to blame. But more important, as Woodfield reported at the meeting, Liu hit upon an easy-to-use general technique for making dozens of different types of metal oxide nanoparticles that could have a major impact on everything from catalysts to electronics.

    “It opens a very general, cheap, clean, flexible route to nanoparticles for all sorts of applications,” says Alexandra Navrotsky, a chemist at the University of California, Davis. Researchers around the globe are already looking to metal oxide nanoparticles to improve everything from fuel cells to optical films. So, Navrotsky says, “the opportunities are pretty widespread.”

    There are many ways to make nanoparticles. A common method uses heat to vaporize bulk-sized starting materials; as the vapor cools, its atoms condense into nanoparticles. Another approach precipitates nanoparticles from ions in liquids. But such techniques produce highly pure nanoparticles only when the chemistry and kinetics are just right.

    Liu chose instead to start with high-purity metal salts, which, like table salt, are a fusion of positively and negatively charged ions that come together to form a neutral compound. He then ground the salt together with ammonium bicarbonate, a white, powdery material used as everything from a leavening agent in breads to fertilizer. The grinding rearranged the chemical partners. Mixing aluminum nitrate and ammonium bicarbonate, for example, produced aluminum hydroxide and ammonium nitrate, with excess carbon and oxygen bubbling off as CO2 (see diagram). Finally, Liu baked his mixture at 300°C for about an hour. The heating drove off several additional components as gases, leaving behind aluminum oxide nanoparticles. “Voilà, you get nanopowder,” Woodfield says.
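
    One way to balance the swap described above, consistent with the reported products but not a stoichiometry the team has published, is:

        \[
        \mathrm{Al(NO_3)_3} + 3\,\mathrm{NH_4HCO_3} \rightarrow \mathrm{Al(OH)_3} + 3\,\mathrm{NH_4NO_3} + 3\,\mathrm{CO_2}
        \]

    with the subsequent bake converting the hydroxide to the oxide:

        \[
        2\,\mathrm{Al(OH)_3} \xrightarrow{\ \Delta\ } \mathrm{Al_2O_3} + 3\,\mathrm{H_2O}
        \]

    The ammonium nitrate and residual water are presumably among the additional components driven off as gases during heating.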

    The simple process is a nearly foolproof way to make uniformly sized metal oxide nanoparticles, Woodfield says. He and his colleagues have used their technique to make some two dozen different metal oxide and mixed-metal oxide particles. And Woodfield says researchers should have little trouble in scaling up the technique. He and his colleagues recently formed a company called Cosmas Inc. to commercialize the process.

    Although all the current particles are oxides, Navrotsky says she suspects that the technique could be extended to combine other negatively charged ions with the metals. That should open the door to making a variety of chloride, nitride, and phosphide nanoparticles with a broad palette of exotic optic, electronic, and catalytic properties. If she's right, what is already a powerful technique could become a powerhouse.

    Works every time.

    Chemists probing a basic mystery of magnetism in cerium oxide nanoparticles discovered this general recipe for making numerous flavors of the tiny grains. Start with a metal salt, add ammonium bicarbonate, stir, heat, and presto! Instant nanoparticles.

    CREDIT: L. CREVELING/SCIENCE
  13. NANO SCIENCE AND TECHNOLOGY INSTITUTE NANOTECH 2008

    Solar Cells Gear Up to Go Somewhere Under the Rainbow

    1. Robert F. Service

    At the Nano Science and Technology Institute Nanotech 2008 meeting, held in Boston from 1 to 5 June, researchers reported harvesting infrared photons with arrays of antennas akin to those on televisions and in cell phones, a first step toward solar cells that convert heat to electricity.

    Today's solar cells do a fair job of converting visible light into electricity, but they ignore lower energy infrared (IR) photons, or heat, which don't have enough energy to generate electricity in semiconductors. At the meeting, researchers from the Idaho National Laboratory (INL) in Idaho Falls reported harvesting IR photons with arrays of antennas akin to those on televisions and in cell phones, a first step toward solar cells that convert heat to electricity. If the approach pans out, it could lead to solar cells capable of generating electricity after sunset and using the waste heat from industrial plants.
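
    A back-of-the-envelope comparison, using standard constants rather than figures from the INL talk, shows why ordinary cells ignore this light. A thermal-infrared photon with a wavelength of about 10 micrometers carries

        \[
        E = \frac{hc}{\lambda} \approx \frac{1240\ \mathrm{eV\cdot nm}}{10\,000\ \mathrm{nm}} \approx 0.12\ \mathrm{eV},
        \]

    roughly a tenth of the 1.1-electron-volt band gap of crystalline silicon, so it cannot kick an electron across the gap and is simply lost as heat.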

    “It's certainly an intriguing idea,” says Michael Naughton, a physicist at Boston College in Chestnut Hill, Massachusetts, whose group has built related antennas. But he notes that converting the energy from the collected IR light to electricity will require a separate set of advances. Says Naughton: “Either it has no chance of working, or it will be fantastic.”

    The notion of using antennas to capture electromagnetic waves and then convert that energy to electricity is decades old. In 1964, William Brown, an engineer at the U.S. aerospace company Raytheon, demonstrated a flying helicopter that absorbed microwaves and converted their energy to DC power to run a small engine. At the heart of the helicopter's success was a two-part device called a “rectenna”: a microwave-absorbing antenna combined with a “rectifier” that converts the microwave energy to electricity. More recent are proposals to transmit microwave energy to Earth from arrays of solar collectors in space.

    Several years ago, researchers led by Steven Novack at INL set out to capture and convert IR light, which has a wavelength two to five orders of magnitude shorter than microwaves. That meant the size of each antenna needed to be on the micrometer scale, with numerous features in the nanometer range. To capture enough IR photons, Novack and his colleagues needed arrays with millions of the antennas side by side. The good news was that instead of having to use exotic semiconductor alloys to capture the light, they could do so by patterning gold in square spiral structures. Novack's team worked out a way to stamp out millions of the gold spirals in arrays on either silicon or cheap, flexible plastics. At the meeting, Novack reported that the arrays on silicon capture some 80% of the IR photons that hit them, whereas those on plastic manage a respectable 40% to 50%.
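
    The micrometer scale follows directly from the resonance condition that an antenna's dimensions be comparable to the wavelength it is meant to capture. For the roughly 30-terahertz thermal infrared discussed below (a generic estimate, not INL's design figure):

        \[
        \lambda = \frac{c}{f} = \frac{3\times10^{8}\ \mathrm{m\,s^{-1}}}{3\times10^{13}\ \mathrm{s^{-1}}} = 10\ \mathrm{\mu m},
        \]

    which puts full- and half-wave elements at a few micrometers and pushes gaps and feed features down into the nanometer range.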

    Gotcha.

    Arrays of gold spiral-shaped antennas absorb infrared photons, or heat, causing electrons in the antennas to oscillate 30 trillion times per second. Researchers hope those excited electrons will lead to a new form of solar power.

    CREDIT: IDAHO NATIONAL LABORATORY

    Novack and his colleagues still need to figure out how to get the power out of the antennas. When IR photons hit the array, they cause electrons in the gold to oscillate back and forth at a frequency of 30 terahertz, or 30 trillion times a second. Conventional electronics operate with a current that oscillates at a plodding 60 times a second. That means Novack's team needs to find devices that can either step down the terahertz oscillations or convert them into a DC current.

    Unfortunately, Novack and Naughton know of no devices—commercial or otherwise—that can do that, though diodes and rectifiers do the job at lower frequencies. But Novack says theoretical work suggests that sandwichlike devices made from three metal layers separated by ultrathin insulating layers might step down the frequency. And both Novack and Naughton say that a recent surge in terahertz-frequency research is producing rapid advances. Novack says devices that convert even 30% to 40% of the IR energy absorbed by the antennas into electricity could lead to solar cells that beat the efficiency of crystalline silicon cells with a cheap and simple technology that can be printed like newspapers.
