News this Week

Science  14 Aug 2009:
Vol. 325, Issue 5942, pp. 798
  1. Hydrology

    Northern India's Groundwater Is Going, Going, Going …

    1. Richard A. Kerr

    Farming is a thirsty business on the Indian subcontinent. But how thirsty, exactly? Satellite remote sensing of a 2000-kilometer swath running from eastern Pakistan across northern India and into Bangladesh has for the first time put a solid number on how quickly the region is depleting its groundwater. The number “is big,” says hydrologist James Famiglietti of the University of California, Irvine—big as in 54 cubic kilometers of groundwater lost per year from the world's most intensively irrigated region, home to 600 million people. “I don't think anybody knew how quickly it was being depleted over that large an area,” Famiglietti says.

    The big picture of Indian groundwater comes from the Gravity Recovery and Climate Experiment (GRACE) satellite mission, launched in March 2002 as a joint effort by NASA and the German Aerospace Center. Actually two satellites orbiting in tandem 220 kilometers apart, GRACE measures subtle variations in the pull of Earth's gravity by using microwaves to precisely gauge the changing distance between the two spacecraft.

    Going down.

    Several centimeters' worth of water (pink) disappears each year from beneath the northern Indian subcontinent.

    CREDITS (TOP TO BOTTOM): AP IMAGES; ADAPTED FROM V. M. TIWARI ET AL.

    As the lead spacecraft passes over a patch of anomalously strong gravity, it accelerates ahead of the trailing spacecraft. Once past the anomaly, the lead satellite slows back down. Then the trailing spacecraft gets accelerated and again closes on the leader. By making repeated passes over the same spot, GRACE measures changes in Earth's gravity, which are due mainly to water moving on and under the surface. Most famously, GRACE has recorded the shrinking of ice sheets (Science, 24 March 2006, p. 1698); it has also detected shifting ocean currents, the desiccation wrought by droughts, and the draining of large lakes.

    Outside of wasting ice sheets, the world's largest broad-scale decline in gravity during GRACE's first 6 years came across a 2.7-million-square-kilometer east-west swath centered on New Delhi. That's according to a study in press in Geophysical Research Letters by geophysicists Virendra Tiwari of the National Geophysical Research Institute in Hyderabad, India; John Wahr of the University of Colorado, Boulder; and Sean Swenson of the National Center for Atmospheric Research in Boulder. Adjusted for natural variations due to changing precipitation and evaporation, the decline in gravity that GRACE determined equates to a net loss of 54 ± 9 cubic kilometers of groundwater per year, the group reports. That would produce a fall in the water table of about 10 centimeters per year averaged over the entire region.
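
    The arithmetic linking those two figures is simple to check. The minimal sketch below assumes the 54 cubic kilometers is spread evenly over the reported 2.7-million-square-kilometer swath and that the aquifer has a specific yield of roughly 0.2—a typical value for alluvial aquifers and an assumption made here, not a number from the study: the loss works out to about 2 centimeters of water per year over the region, which corresponds to a water-table drop of roughly 10 centimeters per year.

        # Back-of-the-envelope check of the GRACE-derived depletion figures.
        # The specific yield of ~0.2 is an assumed, typical value for alluvial
        # aquifers; it is not given in the article.
        volume_km3_per_yr = 54.0      # net groundwater loss reported by Tiwari et al.
        area_km2 = 2.7e6              # east-west swath centered on New Delhi
        specific_yield = 0.2          # assumed drainable fraction of aquifer volume

        # Equivalent depth of water removed each year, spread over the region.
        water_depth_cm_per_yr = volume_km3_per_yr / area_km2 * 1e5   # 1 km = 1e5 cm

        # The water table falls farther than the equivalent water depth because
        # only a fraction of the aquifer volume is drainable water.
        table_drop_cm_per_yr = water_depth_cm_per_yr / specific_yield

        print(round(water_depth_cm_per_yr, 1))   # ~2.0 cm of water per year
        print(round(table_drop_cm_per_yr))       # ~10 cm water-table drop per year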

    A paper published this week in Nature reaches similar conclusions. Geoscientists combining GRACE data with hydrologic models for part of the same area calculate that groundwater levels fell an average of 4 centimeters per year between 2002 and 2008—more than official estimates and unsustainable in the long run.

    A falling water table across the northern Indian subcontinent comes as no great surprise. The GRACE region of sharp groundwater depletion coincides with the world's most intensely irrigated land: 50% to more than 75% of the land is equipped for irrigation with pumped groundwater or reservoir water. And then there are those 600 million people drawing heavily on groundwater. But, the group calculates, the GRACE-determined depletion rate implies that groundwater was being pumped out 70% faster in this decade than the Central Ground Water Board of India estimated it was in the mid-1990s. The apparent surge in withdrawal would have been large enough to turn a once-stable water table into a falling one that demands ever-deeper wells and bigger pumps and may draw in salty or polluted water.

    GRACE “has shown us we can do a pretty reasonable job from space” gauging groundwater depletion, says Famiglietti. “We can help regional water managers by giving them a holistic view of a whole system.” Still, across the subcontinent, no one knows how far down the water goes. They just know, as Famiglietti notes, that “it's not bottomless.”

  2. Particle Physics

    Running at Half-Energy Keeps LHC in Race for Discoveries

    1. Adrian Cho

    The world's largest atom smasher will finally start blasting particles together this winter, but at only half of its maximum energy, officials at the European particle physics laboratory, CERN, near Geneva, Switzerland, decided last week. Officials say that energy is low enough to ensure that the 27-kilometer-long, $5.5 billion Large Hadron Collider (LHC) will not wreck itself the way it did last fall, just 9 days after circulating its first beams (Science, 26 September 2008, p. 1753). But the energy is also high enough, physicists say, to give the LHC a shot next year at surpassing its rival, the aging Tevatron collider at the Fermi National Accelerator Laboratory in Batavia, Illinois, which holds the record for the highest-energy collisions.

    CERN officials have dialed down the energy to keep from overloading faulty electrical connections between the thousands of superconducting magnets that guide protons around the LHC (Science, 31 July, p. 522). Current doesn't normally flow through the questionable connections, but they need to be able to carry thousands of amps should the superconducting wire in the magnets warm up and lose its ability to carry electricity without resistance. Running at half-energy, “we have a safety margin of 2 or 2.5,” says Stephen Myers, director of accelerators and technology at CERN. “It's a very conservative approach.”

    Get real.

    A simulation of a collision in the CMS detector. Real data could come by year's end.

    CREDIT: CERN

    Nevertheless, the energy is high enough to give experimenters a chance next year at seeing things that cannot be spotted at the Tevatron, says CERN's Fabiola Gianotti, spokesperson for the 2500-member team working with the LHC's ATLAS particle detector. The Tevatron smashes protons into antiprotons at 2 tera–electron volts (TeV). The LHC is designed to blast protons into protons at 14 TeV but will start at 7 TeV. Running at 7 TeV for a year, the LHC might spot new particles—such as those predicted by a theory called supersymmetry—that are slightly too heavy for the Tevatron to blast into existence. Running at 6 TeV would not suffice, Gianotti says, because the rates for making such particles would be so low that a definite discovery could not be made in a year.

    Even at 7 TeV, researchers acknowledge, such a coup is a long shot. But Tejinder Virdee of Imperial College London, spokesperson for the 3600 researchers working with the CMS detector, notes that CERN officials hope to ramp the LHC's energy up to 10 TeV later next year. “If the bulk of the data is taken there,” he says, “the question [of what can be seen at 7 TeV] becomes moot.”

  3. Regulatory Policy

    U.S. Panel Urges Clearer, Cleaner Role for Science

    1. Jeffrey Mervis

    A bipartisan panel of experts suggested last week how U.S. regulatory agencies can make better use of scientific advice by being more open in selecting and vetting outside experts, clearer in defining the questions they want answered, and more rigorous in reviewing the relevant literature.

    Although the 13-member panel included Democrats and Republicans, academics, environmentalists, and industry leaders, its recommendations may still set off political debate. Critics of the Bush Administration say that the panel's proposals for separating scientific and policy issues would go a long way toward correcting the mistakes of the past 8 years, whereas Bush supporters say the report extends efforts initiated by that Administration.

    “The report is a sign that it's a new ball game,” says Jane Houlihan of the Environmental Working Group, an advocacy organization that battled Bush-era policies on a variety of health and environmental safety issues. “We've seen lots of corruption and closed doors, and the transparency and diversity called for in this report would be a big improvement.”

    Not quite, says panel member John Graham, who led White House oversight of regulatory efforts under President George W. Bush. Graham, now dean of the Indiana University School of Public and Environmental Affairs in Bloomington, says the report mirrors the Bush Administration's “call for clear disclosure of scientific uncertainties in regulatory analysis … and states clearly that the problems are not unique to any particular Administration.”

    The report comes from the Science for Policy Project at the Bipartisan Policy Center in Washington, D.C. (bipartisanpolicy.org). “Right now we have a mishmash of policies and no uniformity,” says co-chair Sherwood Boehlert, a retired Republican representative from upstate New York. “What we need is a system that's as open as possible and one that's also consistent from one agency to the next.”

    Panel members also delivered a warning to the Obama Administration: Asking scientists to make policy undermines science and leads to bad policies. “We need to separate the science from the policy,” says co-chair Donald Kennedy, a former editor-in-chief of Science who led the Food and Drug Administration during the Carter Administration. “Otherwise, groups end up criticizing the science because they don't like the policy.”

    The report concentrates on the use of advisory panels at regulatory agencies, only one aspect of the broader question of how to maintain scientific integrity across the government. The White House is currently reviewing recommendations from science adviser John Holdren for policy changes on the larger issue based on principles outlined in a 9 March presidential memorandum.

  4. U.S. Science Policy

    Chu Lays Out an Agenda for PCAST and Asks for Help

    1. Jeffrey Mervis

    Energy Secretary Steven Chu says that one of his top budget priorities—to spend $288 million on eight break-the-mold energy-research centers modeled after the legendary Bell Labs—was rejected by the U.S. Congress because he failed to deliver the sales pitch in person. It was an uncharacteristic misstep for the 61-year-old Nobelist, who in his 6 months on the job has rarely missed an opportunity to tell policymakers exactly what's on his mind.

    Chu described his lapse last week during a freewheeling discussion with the President's Council of Advisors on Science and Technology (PCAST) in Washington, D.C. Meeting for the first time since its 21 members were appointed in late April, PCAST had invited Chu and other Obama Administration officials to suggest subjects it might want to tackle. The peripatetic Chu—who 4 hours earlier had keynoted a forum at Harvard University on the House of Representatives–passed climate change legislation with the bill's co-author, Representative Edward Markey (D–MA)—obliged by giving the star-studded council an earful.

    Staying in touch.

    Steven Chu is a popular witness at congressional hearings.

    CREDIT: AP IMAGES

    Chu began by fingering two key issues for the new Administration. The first is the difficulty of evaluating and scaling up science-education programs (“How do you analyze what works? And how exportable is it?”), and the second is barriers to industrial innovation (“Wall Street often punishes a company for deciding to invest in research”). Then he waded into more contentious waters. Chu asserted that the Defense Department has retreated from funding basic research over the past 2 decades, pinching support for high-risk, high-payoff ideas. And he tweaked the U.S. biomedical research community for being unduly critical of grant applications to the National Institutes of Health (NIH) from colleagues in the physical sciences. “I think they're afraid some computer scientist might take money from a biologist,” he argued, with a nod to PCAST co-chair Harold Varmus, who had pushed to expand such interdisciplinary research while leading NIH during the Clinton Administration.

    Chu also wanted something from PCAST. He urged the council, with its three Nobelists as well as industrial titans such as Google CEO Eric Schmidt, to help him assess the strengths and weaknesses of the sprawling Department of Energy (DOE). The agency has been assigned a leading role in meeting the Obama Administration's promises to reduce carbon emissions, lessen the country's dependence on foreign oil, and promote sustainable energy, along with $39 billion in one-time stimulus money to complement its $24-billion-a-year budget. And Chu made it clear that he's already reached some conclusions.

    “[DOE's] Office of Science is as good as NSF [in supporting basic research], I think, but I have some questions about the applied areas,” says Chu. “I need some ammunition from you guys, … because we're spending billions of dollars [on scaling up research with commercial potential]. I'd love for PCAST to look at what we're doing [in this area].”

    Referring to his campaign for research hubs, he said he prefers to call them “Bell Lablets” because he'd like to emulate the old phone monopoly's wildly productive approach to supporting basic research. The idea is to identify an important problem and then work relentlessly at finding a solution, testing a multitude of ideas until one succeeds. In contrast, he says, most scientists funded by federal research agencies are told “you've got 3 years to get refunded, so that's how you work.” The result, he says, is too often incremental progress on low-risk ideas.

    A 2010 spending bill passed last month by the Senate contains money for three DOE-funded energy research hubs, whereas its House counterpart funds only one. Although Chu hopes that a House-Senate conference will settle on the larger number, he promises to do a better selling job next year.

    “Congress asked the people below me, and they didn't know what I was thinking,” Chu told PCAST members. “I didn't take the time to explain it personally to each member. That's the way it works around here. And I've learned that.”

  5. ScienceNOW.org

    From Science's Online Daily News Site

    Flu Virus May Trigger Parkinson's

    Decades after the 1918 influenza pandemic, epidemiologists noted an uptick in the number of people with diminished mobility and other neurological symptoms reminiscent of Parkinson's disease. But despite this and other hints, the idea that viruses can trigger neurodegenerative diseases has remained controversial. Now researchers report new evidence for such a link: Mice infected with the H5N1 avian influenza virus lose the same dopamine-releasing neurons that are destroyed by Parkinson's disease.

    Horses Get Help From Their Friends

    If you're a female horse, it pays to have a few girlfriends. Mares who form stronger social bonds produce healthier offspring and more of them, according to a new study. The finding adds to the growing evidence that friendship is an adaptation with deep evolutionary roots.

    Whale Stranding: Sonar or Lunar?

    On the morning of 3 July 2004, more than 150 melon-headed whales tried to beach themselves in Hawaii. A rescue team organized by the National Oceanic and Atmospheric Administration herded the whales back to sea the next day, though a calf died. One study blamed the incident on U.S. Navy sonar; another blamed the moon. Now, researchers believe they've finally gotten to the bottom of this attempted mass stranding.

    Do Clouds Come From Outer Space?

    Most of Earth's clouds get their start in deep space. That's the surprising conclusion from a team of researchers who argue that interstellar cosmic rays collide with water molecules in our atmosphere to form overcast skies.

    Read the full postings, comments, and more on sciencenow.sciencemag.org.

  6. Newsmaker Interview

    Margaret Hamburg Aims to Strengthen FDA Science

    1. Robert Koenig

    Since taking charge of the U.S. Food and Drug Administration (FDA) in May, Commissioner Margaret Hamburg has been a whirlwind of activity. FDA is implementing a new law that gives it the authority to regulate tobacco products, taking a key role in the approval of the planned H1N1 pandemic flu vaccine, revamping its food-safety approach, and trying to make the agency's workings more transparent. Hamburg, a physician and scientist, served as New York City's health commissioner from 1991 to 1997 and led that city's fight against a resurgent TB epidemic. She has also worked as an assistant secretary of the U.S. Department of Health and Human Services and vice president for biological programs of the Nuclear Threat Initiative. In an interview with Science last week, she discussed her plans and dismissed rumors that she and Deputy Commissioner Joshua Sharfstein would each manage different FDA functions. This interview has been edited for brevity.

    Q:What's being done to bolster the quality of science at FDA?

    M.H.:One of my highest priorities … is to strengthen the science base within FDA and also extend our engagement with the scientific community. … We're making huge investments in exciting biomedical research. But if those investments and discoveries aren't married to investments in regulatory science that allow us to understand what products are safe and effective and ready for widespread use, I don't think we can fully realize those investments. … I'd like to see us … working with the Science Board and a range of scientific organizations, professional societies, research universities, industry scientists, and others. I'd like to develop a road map for strengthening regulatory science for the FDA and public health.

    Busy agenda.

    Margaret Hamburg wants to emphasize the “science base” at FDA and tackle a range of regulatory issues.

    CREDIT: GETTY IMAGES

    Q:Congress has given FDA the mandate to create a new Center for Tobacco Products. What is its focus?

    M.H.:There's a very important set of research questions. We need to study the composition of tobacco products and understand both the addictive components of tobacco and the toxic chemical additives. We need to address how these substances are impacting health and ensure that there are not additional innovations by the tobacco industry that will put new products in the market that may be more addictive or more attractive to youth. There's been a new wave of flavored cigarettes and … other things that are of great concern, in terms of their attractiveness to young people as possibly gateways to addiction and smoking. We also want to look at issues of labeling … [and] marketing, with an emphasis on addressing some of the concerns about fraudulent claims and marketing that targets youth.

    Q:How does FDA approach its role in evaluating H1N1 pandemic vaccines?

    M.H.:It's very likely that we will move forward with a vaccine that is licensed using the traditional, standard method of production that's used for seasonal flu vaccines—a “strain change” framework. … Clinical trials to be done by industry and at NIH will provide information on the dose required, whether or not some populations will need two doses, and also the question of adjuvants. We are moving forward with a flexible response on both looking at the data as it emerges and monitoring and tracking the evolution of the virus as it spreads and as we see how cases are evolving.

    Q:Is FDA encouraging firms to design gene-based treatments that may only be effective or safe in certain subsets of the population?

    M.H.:The issues around personalized medicine are intriguing but also very challenging from a regulatory point of view. I'm excited by the opportunity to work with [NIH Director] Francis Collins on these issues, and I think that, working together, we will be able to put in place a strategic framework to make sure that the scientific opportunities can translate into available products, with appropriate regulatory review.

    Q:Some FDA scientists have expressed unhappiness with the agency in the past. Will FDA be open to dissenting views?

    M.H.:I hope a hallmark of my tenure as FDA commissioner is to establish that FDA is a science-driven agency and that, as we make critical regulatory decisions, we welcome and foster the input of all of our scientists and outside experts so we can have a robust review of available data and a forum in which all voices can be heard.

    Q:You and Joshua Sharfstein recently wrote an article describing FDA as a public health agency at heart. Does that change the way you approach regulation?

    M.H.:I think it does. From the earliest days of the FDA's existence, it has always been a public health agency, and its mission is defined as promoting and protecting the health of the public. But because the FDA is also a regulatory agency, that core public health mission has sometimes been lost. Dr. Sharfstein and I want to make that the foundation on which our work is built. … We need to always ask the question: How does this impact health?

    Q:How do you and Dr. Sharfstein define your responsibilities at FDA?

    M.H.:When our appointments were announced, there was a crazy rumor that he would focus on drugs and I would do food safety and nutrition and tobacco. And there was no basis for that. I'm very much the commissioner, involved in every aspect of FDA activity. But I think we make a good team.

  7. Stem Cell Research

    Recipe for Induced Pluripotent Stem Cells Just Got Clearer

    1. Dennis Normile

    Three years of intense worldwide research efforts into induced pluripotent stem (iPS) cells have failed to decipher exactly how inserting just four or even three genes into an adult mammalian cell causes it to revert to an embryo-like state apparently capable of then diversifying into any of the body's tissues (Science, 1 February 2008, p. 560). A better understanding of this reprogramming, which could provide an alternative to controversial embryonic stem cells, might, among other things, enable researchers to boost the efficiency of the process, which works in just a few percent of the targeted cells.

    A clutch of five letters published online in Nature on 9 August cracks this black box by showing that p53, a gene noted for its role in suppressing tumors, hinders the generation of iPS cells. The work suggests ways to control the formation of iPS cells, although that capability will have to be used judiciously because p53 helps ensure the fitness of generated cells.

    Four of the five groups took different approaches in focusing on p53. A team led by Shinya Yamanaka of Kyoto University in Japan, who reported the discovery of iPS cells in July 2006, shows that the suppression of p53 markedly enhances the efficiency of iPS cell generation. Juan Carlos Izpisúa Belmonte of the Salk Institute in San Diego, California, and co-workers improved the efficiency of iPS cell generation by inactivating both p53 and another tumor suppressor gene while using only two of the original four reprogramming genes. Konrad Hochedlinger of Massachusetts General Hospital in Boston and Harvard University and colleagues report that deleting p53 from cell populations that normally do not produce iPS cells gave them the ability to do so. And Maria Blasco and researchers at the Spanish National Cancer Research Center in Madrid show that p53 prevents the reprogramming of cells carrying various types of DNA damage and that blocking p53 restores their ability to be reprogrammed. A fifth team led by Manuel Serrano, also of the Spanish National Cancer Research Center, looked at another tumor suppressor locus and found that this also limits reprogramming.

    Blocked!

    Adding a molecule (Trp53 shRNA) that inhibits the activity of p53 on any of several days during reprogramming enables the proliferation of iPS cells (red clusters) as compared to a control (left).

    CREDIT: JOSE M. POLO AND KONRAD HOCHEDLINGER

    The letters “prove p53 to be a major barrier in reprogramming … and further describe the underlying mechanisms,” says Hongkui Deng, a cell biologist at Peking University, whose group's Cell Stem Cell report of last November indicated that blocking p53 activity dramatically improves the efficiency of generating iPS cells. The additional detail on how the reprogramming process activates p53 “is important, indeed,” says Shin-ichi Nishikawa, a stem cell researcher at the RIKEN Center for Developmental Biology in Kobe, Japan.

    Although the involvement of p53 seems to be confirmed, the nature of its role is still up for debate. But “whether p53 and [a gene it regulates] have direct roles in reprogramming, or [whether] they simply regulate proliferation and survival during iPS cell generation, remains to be determined,” says Yamanaka.

    It does seem that manipulating p53 to generate iPS cells should be done sparingly. Blasco says it is clear that p53 is “limiting the reprogramming of suboptimal cells”—normally a good thing, so switching off p53 could allow cells from aged or diseased patients to be reprogrammed for research or therapy. But “once the iPS cells are obtained, p53 has to be turned back on to eliminate damaged and unhealthy iPS cells,” she says. She adds that the papers “represent a conceptual advance on what we know about the reprogramming process.” But this is still just a peek into the black box of reprogramming.

  8. ScienceInsider

    From the Science Policy Blog

    Senate appropriators say that staff members of an ambitious children's health study at the U.S. National Institutes of Health have committed a "serious breach of trust" in withholding the ballooning costs of the project. The director of the study, which hopes to follow the health of 100,000 children from before birth through age 21, has changed jobs, and there are rumors that Duane Alexander, the 69-year-old head of the institute overseeing the project, may soon retire.

    The cash-strapped University of California plans to lend $200 million to the state government, which will then give the money back. The university has a better credit rating than the state, which means it can borrow money at a lower interest rate.

    Two ships that service the U.S. Antarctic research program are likely to fall victim to an expected ban on the heavy-grade fuel oil they use. The National Science Foundation has 2 years to come up with a solution.

    Congress has given itself 2 months to reconcile differences in revamping the $2.5-billion-a-year Small Business Innovation Research and the Small Business Technology Transfer programs.

    A National Academies panel wants NASA to reopen a research shop that it shut down for budgetary reasons 2 years ago. The report says the space agency could use some of the “creativity” that marked the NASA Institute for Advanced Concepts.

    Last week, the U.S. Senate confirmed Francis Collins to be head of the National Institutes of Health and David Kappos to lead the Patent and Trademark Office. The nomination of Cass Sunstein to oversee regulatory policy at the White House remains stalled, however, although Majority Leader Harry Reid (D–NV) has promised a vote in September. Meanwhile, bioethics expert R. Alta Charo is joining the Food and Drug Administration as a senior adviser to Commissioner Margaret Hamburg.

    For more science policy news, visit blogs.sciencemag.org/scienceinsider.

  9. Energy Efficiency

    Leaping the Efficiency Gap

    1. Dan Charles

    Experience has shown that there is more to saving energy than designing better light bulbs and refrigerators. Researchers say it will need a mixture of persuasion, regulation, and taxation.

    Waste not, want not.

    Everywhere you look, energy can be used more efficiently, but doing so requires care and cash. The potential gains are huge, dwarfing expected increases in production of renewable energy.

    CREDIT: JUPITERIMAGES

    Thirty-five years ago in Berkeley, California, two young physicists named Steven Chu and John Holdren were present at the birth of a campaign to curb Americans' appetite for energy. They saw their colleague Arthur Rosenfeld abandon a successful career in particle physics and set up a new research division at Lawrence Berkeley National Laboratory (LBNL) devoted to energy efficiency. Then-Governor Jerry Brown and state regulatory agencies adopted Rosenfeld's ideas with astonishing speed. California canceled planned nuclear power plants, passed pathbreaking efficiency standards for refrigerators and buildings, and ordered electric utilities to spend money persuading their customers to use less power.

    Today, Chu, now the U.S. secretary of energy, cites Rosenfeld as a model for scientists and California as an example for the nation. He points out that per capita electricity consumption in California stayed flat for the past 30 years yet rose 40% in the rest of the United States. That flattened curve even has a name: the Rosenfeld Effect. Together with Holdren, now President Barack Obama's science adviser, Chu has made efficiency the heart of the Obama Administration's energy strategy. Tighter appliance standards are on a fast track through the Department of Energy bureaucracy. Billions of dollars from the stimulus package are pouring into programs to weatherize and retrofit homes with energy-saving technology. Chu says such investments quickly pay for themselves in lower energy bills: “Energy efficiency isn't just low hanging fruit; it's fruit lying on the ground.”

    Efficiency pioneer.

    Arthur Rosenfeld traded particle physics for cutting-edge research into energy-saving technologies.

    CREDIT: LONNY SHAVELSON/WWW.PHOTOWORDS/NEWSCOM

    David Goldstein, who studied with Rosenfeld and now co-directs work on energy policy for the Natural Resources Defense Council (NRDC), says California's experience proves that carbon emissions can be contained and even reduced at minimal cost. “The most important lesson is: Success is possible, and a fairly limited set of policies gets you most of the way there,” Goldstein says. And, he adds, it's not hard to go even further with energy saving: “The practical limits [of increased efficiency] have never been tested.”

    But not everyone views California's success story as so clear-cut. Alan Sanstad, an LBNL researcher who also worked with Rosenfeld, looks at the same data and concludes that California's efficiency offensive wasn't nearly effective enough. He points out that California's total energy use over the past 3 decades grew at almost the same rate as it did in the rest of the country, while the state's population soared. Anant Sudarshan and James Sweeney of Stanford University's Precourt Energy Efficiency Center (PEEC) recently calculated that the state's energy policies can take credit for only a quarter of California's lower per capita electricity use. The rest is due to “structural factors” such as mild weather, increasing urbanization, larger numbers of people in each household, and high prices for energy and land that drove heavy industry out of the state.

    For Sanstad, there's a clear lesson: Meeting the more ambitious goal of reducing greenhouse gas emissions will require more aggressive measures that cause some economic pain. “The real potential of energy efficiency is not going to be realized until we get away from the idea that it has to pay for itself,” he says.

    The biggest challenge is not inventing new technology but persuading more people to adopt technology and practices that already exist. A new generation of researchers and government officials is now examining new strategies for energy efficiency, looking for the key—or a whole ring of keys—that will unlock its full potential. “It's a wonderful opportunity to which we have to rise,” says Ashok Gadgil, an energy technology researcher at LBNL. “We were preparing for this for 20 years; now come under the spotlight and sing!”

    The human dimension

    Rosenfeld and Edward Vine had a friendly, long-running argument during their 2 decades as colleagues at LBNL. Rosenfeld believed in technology. When he testified before the U.S. Congress, as he did frequently in the early 1980s, he always came with props in hand: compact fluorescent light bulbs, heat-shielding windows, or computer programs for predicting the energy use of new buildings. But Vine, whose Ph.D. is in human ecology, wasn't convinced of technology's power. “We can't assume, if we have a great technology, that people will rush to stores and buy it,” Vine says. “We need to find out how people behave, how they make decisions, how they use energy, and we need to work with them.”

    For the most part, energy-efficiency programs around the country have followed Rosenfeld's line. They offer financial incentives for adopting energy-saving, cost-effective technology, and trust that consumers will follow their economic self-interest.

    Yet many researchers are now coming around to Vine's point of view. Consumers don't seem to act like fully informed, rational decision-makers when they make energy choices. Many avoid making choices at all. Give them a programmable thermostat, and they won't program it. Offer them an efficient light bulb that pays for itself in 2 years, and they won't buy it. Builders don't take full advantage of the cheapest source of lighting, the sun. Even profit-seeking businesses sometimes make little effort to control their energy use, says Ernst Worrell, who teaches at Utrecht University in the Netherlands and studies companies all over the world. “There are companies that spend 20% of their operating cost on energy, but upper management doesn't know where that money is going,” Worrell says. “They see energy costs as an act of God.”

    Rosenfeld effect.

    The average Californian uses less electricity than a typical person uses in the rest of the country. That gap has grown wider over the past 30 years, even though California has become relatively more wealthy.

    SOURCES (LEFT TO RIGHT): CALIFORNIA ENERGY COMMISSION; 2008 ELECTRIC POWER RESEARCH INSTITUTE

    Every once in a while, however, circumstances force people to focus on energy. When they do, the results can be astonishing. In April 2008, an avalanche cut a transmission line that supplied Juneau, Alaska, with cheap hydropower. The city switched over to diesel generators, but the electricity they produced cost five times as much. City officials went looking for help and contacted Alan Meier, an LBNL conservation expert.

    “In a crisis, you can talk about behavior,” says Meier. The city spread the word that “good citizens save electricity.” And they did, lowering thermostats, turning off lights, and unplugging electronic equipment. Over 6 weeks, Juneau's electricity consumption fell by 40%, yet Juneau's economy did not falter. The transmission line was repaired within 3 months; electricity use rebounded, but it remains about 6% below its preavalanche level. A similar phenomenon, but on a much larger scale, happened during a 2001 energy crisis in Brazil. The country “cut its power consumption by 20% in 6 weeks. That shows you how much behavior can get you,” Meier says.

    Energy pie.

    Most energy in the United States is used for one of three purposes: transportation; heating, cooling, and lighting buildings; or industrial production.

    CREDITS (TOP): APS REPORT (SEPTEMBER 2008); LINDA A. CICERO/STANFORD NEWS SERVICE

    Stories such as this one have fueled a recent explosion of interest in ways to influence people's energy-using behavior. When Carrie Armel, a neuroscientist at Stanford's PEEC, helped organize a conference on the topic in 2007, “we were expecting 150 people and sold out at 500,” she says. “Last year we were sold out at 700. This year we're opening it up to 800 people.”

    Research has produced some intriguing insights. For instance, people believe that others waste energy because of their inner characters, but they regard their own wasteful practices as the product of circumstances. More information doesn't usually produce energy-saving behavior; experts leave the lights on, too. The concrete example of a friend or neighbor who walks her children to school is much more powerful than any impersonal exhortation to drive less. And don't tell someone that he needs to save energy because nobody else does. “It could end up backfiring,” Armel says, because most people don't like the feeling of being in the minority.

    When people are asked to choose among options that they don't fully understand, such as a list of investment plans, they tend to select the “default option”: the one that doesn't require them to change anything or that seems most popular. Right now, that tendency works against efficiency. In appliance stores, says LBNL's Jonathan Koomey, who also works as a consultant for companies, the most efficient “Energy Star” machines are usually aimed at high-end customers. They're manufactured in low volumes and come with additional features that drive up the price. The marketing strategy sends a clear signal that these are not appliances that the store expects most customers to buy. “You can change that,” says Koomey. “If Costco or Wal-Mart announce that they are only carrying Energy Star products, suddenly the efficient product becomes the standard product.”

    Watching your watts.

    Consumers may live or drive differently when utility bills or dashboard displays show vividly how much energy they are consuming.

    CREDIT: SMUD; TAI STILLWATER

    A few utilities are now designing programs based on the conclusions of behavioral science. Because people like to keep up with their neighbors, the Sacramento Municipal Utility District (SMUD) began an experiment in competitive fuel saving. Starting in April 2008, 35,000 randomly selected customers got bills showing how their energy use stacked up against the average usage of their neighbors. According to SMUD, the typical customer in the experiment responded by cutting consumption by about 2%. SMUD also got some angry mail. “I resent being told I am below average,” one customer wrote. “I pay my bill on time; … leave me alone.” SMUD plans to expand the program to 50,000 customers next year.

    Some energy-conservation advocates are rediscovering the old-fashioned virtues of porch conversations and town meetings, now renamed “social marketing.” “There's more interest now in looking at people as part of a community, a culture, a neighborhood, a church group,” says Vine. That approach paid off 20 years ago during energy-efficiency projects in Hood River, Oregon, and Espanola, Ontario, which reached an impressive proportion of the citizenry. According to Hugh Peach, who helped manage the Hood River project for the energy company Pacific Power & Light, 85% of homes in the community received state-of-the-art energy audits and free efficiency upgrades. In Espanola, more than 90% of homes participated.

    Peach compared the process to a political campaign. The utility sat down with local leaders, followed their advice, and relied heavily on local volunteers. The process was time-consuming and labor-intensive but, Peach says, a pleasure. There was “a lot of community spirit. People just saw it as the right thing to do.”

    The next big force for behavioral change may be technology that brings consumers face-to-face with their energy consumption. A simple version of such energy feedback is the dashboard of a Toyota Prius hybrid car, which displays the rate at which the car is burning gasoline. No one has carried out a controlled study of how drivers react to it, but “every person I know who has a Prius, they get a big grin when I mention feedback, and they have to tell me their personal story about how they've reduced their energy use,” says Armel. At the Institute of Transportation Studies at the University of California, Davis, 12 Prius cars have been outfitted with more detailed dashboard displays. Researchers will use them to study how drivers react to different kinds of information, such as energy consumption, emissions, or the cost of fuel being burned.

    The same feedback is now becoming available for homes and businesses. About 40 million homes will soon get “smart meters” that record every spike or dip in electricity use, hour by hour. Various companies, including Google, are devising ways to deliver that information directly to consumers, either via the Internet or by using displays that are linked to the smart meters themselves. Studies show that consumers usually respond to such feedback by cutting their energy use by 5% to 10%. But Sanstad thinks that may be only the first step because this information could create new markets for energy efficiency. “I think it will open a lot of doors,” he says. “When people have this, what new things will they want?” At a meeting of utility regulators in February, Jason Grumet, executive director of the National Commission on Energy Policy in Washington, D.C., said, “this may change the mood surrounding efficiency. It could make it cool.”

    Battling perverse incentives

    Behavioral change was never a top priority for NRDC's Goldstein. “It's real and important,” he says, but it's something “you can only do once.” Technological innovation, on the other hand, leads down a path of continuous improvement. What keeps people from adopting efficient technology is not a quirk of human psychology, he says, but institutional roadblocks—what he calls “market failures.”

    Snapshot of waste.

    Infrared cameras quickly show where heat is escaping from a building. The older building on the right, for instance, has leaky windows.

    CREDIT: PIXEL THERMOGRAPHICS, WWW.PIXELTHERMOGRAPHICS.CO.UK

    Much has been written about market failures, with little demonstrated success in overcoming them. A prime offender is the “principal-agent problem,” which occurs when someone gets to spend another person's money. Hotel guests, for instance, can waste hot water because they don't pay for it. Landlords buy cheap, inefficient appliances because their tenants pay the utility bills. LBNL's Meier found his own favorite example: the humble set-top box that comes with cable TV services. Each box can consume up to 40 watts of electricity continuously—more than an efficient refrigerator. Cable subscribers can't choose which box they get, and cable companies have no incentive to make the boxes more efficient.

    “This situation is more important than you might think,” says Meier. According to his calculations, some form of the principal-agent problem afflicts a quarter of all residential energy use in the United States. There are ways to solve it, he says. In Japan, companies that deliver a vending machine to a site also pay for the electricity that the vending machine consumes. Not surprisingly, those companies now use more energy-efficient vending machines.

    Koomey has studied energy use in big data-processing centers and found similar problems. “The IT department and the facilities department have separate budgets,” he says. “The IT department buys the equipment, but they don't pay the electric bill. They don't have an incentive to spend even $1 to buy a more efficient server.”

    Such institutional barriers bedevil the fragmented, tradition-bound construction industry. Buildings account for 40% of the country's energy use. Amory Lovins of the Rocky Mountain Institute, perhaps the country's most eloquent prophet of efficiency, wrote in 2005 that architects, engineers, builders, and maintenance workers are “systematically rewarded for inefficiency and penalized for efficiency.” Builders are trained to satisfy the minimal standards of construction codes, but they rarely exceed them.

    Energy Secretary Chu told a congressional committee in July that the average new building could use 40% less energy by simply adopting off-the-shelf technology such as automatic controls that turn off lights when they aren't needed and highly insulating windows that also reflect much of the sun's heat in summertime. Retrofits to older buildings, he said, could cut energy use in half and eventually pay for themselves. Innovative architectural designs, arranging windows, shade, and ventilation so as to minimize the need for additional light or cooling, could cut energy use by 80% below today's average (see sidebar, p. 808).

    Frustratingly, “green” buildings often don't deliver what their designers promised because of mistakes in design, shoddy construction, or poor maintenance. “No one measures building performance,” says Stephen Selkowitz, head of the Building Technologies Division at LBNL. “I'll ask 100 architects, ‘How many of you design energy-efficient buildings?’ Almost all of them. Then I'll ask, ‘How many of you know the measured performance of your last building?’ Not a soul! If you don't know how well you did, how will you ever do any better?”

    The California Energy Commission plans to require all new buildings in California to consume no net energy by 2030. Rooftop solar panels will generate as much energy as the building requires. In Europe, an even more ambitious model is gaining ground: the superinsulated, airtight “passive house,” born in Germany, which consumes 10% of the energy of a typical house.

    Such buildings are possible, and hundreds already exist, but most are relatively small. When it comes to large office buildings, many architects and developers struggle to reach more modest goals, such as cutting energy use in half. “A lot of people start down this path, but they get hung up on cost, they get hung up on complexity, they can't find vendors, they can't find designers who can do it. Owners lose faith,” says Selkowitz.

    Tougher efficiency standards for buildings could change that, creating a network of architects, equipment suppliers, and construction companies that know how to make highly efficient buildings. Such regulations were the first steps in California's efficiency campaign 30 years ago. The long-term benefits, especially if one includes benefits to the environment, can be substantial. “I'm slowly drifting to the position—let's mandate as much stuff as we can,” Selkowitz says.

    A few communities in California, including the city of Berkeley, are trying a new approach to overcoming the reluctance of many homeowners to spend money on energy-saving equipment. Instead of using tax breaks or subsidies to get their attention, local governments or counties are going ahead and funding the work themselves. Local governments recover the cost of the retrofit by adding a small monthly charge to that home's property taxes or utility bill. But homeowners should still come out ahead, because their monthly energy savings are greater than the extra charge.

    Paying the cost

    Lee Schipper of Stanford's PEEC is a grizzled veteran of campaigns to save energy around the world. And after many years in the trenches, he's changed his mind. In the early 1970s, when Schipper was studying astrophysics at Berkeley (where he shared a graduate student office with Chu), he started teaching classes and giving lectures on the physics of energy. When the energy crisis hit, he quickly earned a reputation as an efficiency enthusiast of the most irrepressible sort. He eventually joined Rosenfeld's research team at LBNL.

    Schipper couldn't restrain himself when, in 1977, President Jimmy Carter urged Americans to conserve energy using arguments that Schipper considered unfounded. Carter said that conserving energy “will demand that we make sacrifices and changes in our lives. To some degree, the sacrifices will be painful.” Schipper wrote an angry letter to Representative John Dingell (D–MI), arguing that conserving energy did not, in fact, require painful sacrifices. He explained that new energy-saving lights, windows, and car engines allowed consumers to live just as they always had yet burn less oil and coal. “You know what?” Schipper says today. “I was wrong. Carter was right.”

    Schipper has worked at the International Energy Agency in Paris, the World Resources Institute Center for Sustainable Transport in Washington, D.C., and now at PEEC. He's seen the push for efficiency repeatedly run into limits. Some of those limits, he says, are perfectly understandable. Energy is not a big-ticket item for most people, and even when new technology is cost-effective, the switch often takes more time and effort than people feel it's worth. And sometimes the technology doesn't live up to its promise. Many consumers haven't been happy with compact fluorescent lighting, either because they don't like the quality of the light or because the light bulbs haven't been as durable as advertised.

    More important, efforts to push efficiency ran into intense political opposition, especially in the United States. “There's a battle, and that battle is vicious. It's like abortion, gun control—it's one of those ‘apple pie’ things,” Schipper says.

    Schipper's views are shaped by his own particular specialty: transportation, including cars. Since 1980, new cars have doubled the amount of mass they move with a gallon of gasoline, but U.S. car manufacturers used most of that efficiency gain to make cars bigger and more powerful, not more fuel-conserving. The simplest and cheapest way to reduce energy use in transportation, Schipper says, is simply to require cars that are lighter, smaller, and less powerful. But because of fierce resistance to that idea, “we get all these interesting technological fixes, like plug-in hybrids, that are actually quite expensive.”

    So Schipper has come around to the idea that conserving energy really does demand that people change their attitudes and the way they live. The single most important step in that direction, he says, is to make energy more expensive. “We're still playing 1970s games, thinking that we don't have to confront consumers and industries with the real price of energy and carbon,” he says.

    Some efficiency advocates are wary of such talk. “I'm a nonenthusiast about price. Low energy prices and efficiency can coexist,” says Goldstein. He points to the example of Seattle, where electricity is cheap but people use relatively low amounts of it. Goldstein credits Seattle's tough building codes, cooperative electric utility, and a strong conservation ethic in the population. Koomey thinks conventional economic thinking may underestimate efficiency's growth potential. Perhaps, he says, it's more like the Internet: As more people adopt energy-conserving practices, the infrastructure of efficiency becomes more widespread, making it easier and cheaper for others as well. The phenomenon, he thinks, could gain momentum like a ball rolling downhill.

    Rosenfeld, the man who once provided a professional home to many of these efficiency researchers, quietly agrees with Schipper. “Of course we need an energy tax,” he says simply. The “father of energy efficiency” is modest in physical stature and demeanor. He still lives in Berkeley but spends just as much time in Sacramento, where he's a member of the California Energy Commission. It's a sad time in his life; his wife, Roz, died suddenly of a stroke in early June.

    But it makes him feel “very well,” he says, to hear Energy Secretary Chu extol his accomplishments. He's as devoted to saving energy today as he was 35 years ago. His latest cause: promoting white roofs that reflect sunlight, reducing the load on air conditioners and cooling the planet. “I get listened to,” he says with a smile. “So I continue to say, ‘Energy efficiency is the first thing you want to do, and I know a lot of tricks for doing it.’ Steve Chu does answer my phone calls.”

  10. Energy Efficiency

    Soap Operas to Save Energy

    1. Dan Charles

    Filmmaker John Johnson is collaborating with the creators of popular video programs on the Web to develop scripts that show people conserving energy and water and considering how their consumption choices might affect the planet.

    In developing countries such as Mexico and Ethiopia, serial dramas on radio and television have proved to be successful tools for social change. Their fictional characters have become role models for real life, encouraging women to use birth control or stay in school.

    Filmmaker John Johnson is deploying a similar technique, adapted to the YouTube age, to persuade Americans to act against climate change. Two years ago, he set up the Harmony Institute, an environmental media group based in New York City. Now it is collaborating with the creators of popular video programs on the Web to develop scripts that show people conserving energy and water and considering how their consumption choices might affect the planet. The first programs will go online later this year.

    “We were fascinated by this amazing way of reaching people through the medium that they already are using,” says the institute's deputy director, Debika Shome. Shome, who previously worked at Columbia University's Center for Research on Environmental Decisions, says the online dramas will harness ideas from behavioral science—for instance, that “people are more likely to make changes if it's not about sacrifice but about community.”

    Shome won't reveal where on the Web the institute's “product placement for ideas” will appear because she says publicity would make it harder to measure the show's impact. The Harmony Institute plans to survey viewers both before and after the new episodes to see if there's any change in their attitudes and behavior.

  11. Energy Efficiency

    Many More More-Efficient Computers

    1. Adrian Cho

    One watt in every 50 now goes to powering computers, but big savings can still be made by using more-efficient power supplies and automatically putting idling computers into an energy-saving "sleep" mode.

    Few technologies can match the efficiency gains made in computing. Compared with the first personal computers introduced in 1981, today's machines need a millionth as much energy to flip a bit. However, they also flip a million times as many bits per second, and there are more than a billion of them in the world. One watt in every 50 now goes to powering computers, and industry leaders are eager to keep that figure from growing.
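
    Those two million-fold factors cancel almost exactly at the level of a single machine, as the toy calculation below illustrates; the values are round, illustrative numbers taken from the factors quoted above, not measurements. Because per-machine power stays roughly flat, total consumption has grown mainly with the number of machines in use.

        # Toy illustration: why a million-fold efficiency gain has not cut
        # computing's power draw. Values are illustrative round numbers based
        # on the factors quoted in the text, not measured data.
        energy_per_bit_1981 = 1.0                            # arbitrary energy unit per bit flip
        energy_per_bit_today = energy_per_bit_1981 / 1e6     # "a millionth as much energy"

        bits_per_second_1981 = 1.0                           # arbitrary rate unit
        bits_per_second_today = bits_per_second_1981 * 1e6   # "a million times as many bits"

        power_1981 = energy_per_bit_1981 * bits_per_second_1981
        power_today = energy_per_bit_today * bits_per_second_today

        print(power_today / power_1981)   # 1.0 -- the two factors cancel per machine
        # With more than a billion machines installed, total demand scales with
        # the installed base, which is how computing came to draw about 1 watt in 50.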

    Big savings can still be made by using more-efficient power supplies and automatically putting idling computers into an energy-saving “sleep” mode. The 2-year-old Climate Savers Computing Initiative—begun by search-engine giant Google, chipmaker Intel, and the World Wildlife Fund—asks companies to pledge to do that in hopes of reducing annual carbon emissions from computing by 50%, or 54 million tons, by 2010.

    Hardware engineers are applying the turn-out-the-lights strategy within microprocessors themselves. “The number-one issue with processors is that they've become pretty good at what they do, so they spend a lot of time waiting for something to do,” says William Swope, vice president and general manager of Intel's Corporate Sustainability Group in Portland, Oregon. To prevent idling, Intel's latest high-end processor runs algorithms that slow down or stop parts of the chip that aren't being heavily used. The new chip consumes 90% less energy per computation than its predecessor from 4 years ago.

    At the software level, engineers are making gains by using a process known as virtualization to run several copies of an operating system on a single processor and to shift those “virtual machines” from one physical processor to another. In times of low demand, a big data center can now shift work onto a fraction of its thousands of servers. Erik Teetzel, Google's energy program manager in Mountain View, California, predicts that “cloud computing” will move almost all computing into such highly efficient data centers. “You're going to have a lot of people with 5-watt devices accessing these centralized resources,” Teetzel says.

    Nap time.

    Computer chips can be designed to put subunits to “rest” when not needed.

    CREDIT: JUPITERIMAGES

    Although computers' energy demand has increased, that expenditure must be weighed against the savings it brings to other machines such as cars and refrigerators, Swope says. “Computers consume about 2% of the power used in the world,” he says. “And yet every aspect of computing has made the other 98% [of energy use] more efficient.”

  12. Energy Efficiency

    Wanted: Help With Building Design

    1. Dan Charles

    Current computer-aided design tools are not making it easy for architects to design buildings for energy efficiency. New software is needed.

    To create a truly efficient building, don't just buy more insulation, better windows, and efficient lighting. “That gets you a 10 to 30% improvement,” says Stephen Selkowitz of Lawrence Berkeley National Laboratory (LBNL) in California. Bigger energy savings, at a lower cost, come from designing a whole building to manage heat and light in an energy-saving way. But current computer-aided design tools are not making it easy for architects to design for efficiency. New software is needed.

    An inherently efficient building demands a delicate balance of opposing forces. Big windows provide natural light, for instance, but can place heavy demands on a cooling system in the summer. To make it work, architects need to predict the flow of air and heat through a structure, arranging windows and the heat-storing “thermal mass” of walls and floors in ways that maintain a stable, comfortable temperature inside.

    Software can simulate all these phenomena, but the most accurate tools, such as the U.S. Department of Energy's (DOE's) EnergyPlus software, are “unfriendly to nonengineers,” says LBNL's Ashok Gadgil. John Haymaker, a specialist on building design at Stanford University in Palo Alto, California, says architects and engineers often use EnergyPlus simply to show that a planned structure will meet building codes or satisfy a client's wishes. What's needed, he says, are more user-friendly tools that let architects experiment with different configurations of a building and find more energy-saving solutions.

    Light, naturally.

    Movable glass shutters on this office building near Zurich, Switzerland, let in sunlight but keep out unwanted heat.

    CREDIT: GAETAN BALLY/KEYSTONE/LANDOV

    Many groups are taking on that challenge. The software giant Autodesk and Integrated Environmental Solutions, based in Glasgow, U.K., are trying to incorporate more sophisticated energy simulations into their design software.

    DOE, meanwhile, is funding an effort to mate EnergyPlus with Google's user-friendly SketchUp software. “I am encouraged that this will be the new face of design,” says Haymaker.

    All simulation tools have one big limitation, however. “You never know how people will use a building,” Haymaker says. His own office building at Stanford is an example: It is not living up to its energy-saving promises because its inhabitants brought in unanticipated lighting, computer equipment, and space heaters. So the best design software will predict not just a building's behavior but also the actions of people inside.

  13. Energy Efficiency

    The Quest for White LEDs Hits the Home Stretch

    1. Robert F. Service

    White light–emitting diodes are still not ready to go head to head with cheaper incandescent bulbs and fluorescents. A new spate of advances, however, suggests that the whitecoats are coming.

    White light–emitting diodes (LEDs) have already cracked several niche lighting markets, such as flashlights and bike lights. But they're still not ready to go head to head with cheaper incandescent bulbs and fluorescents that dominate the nearly $100 billion global lighting market. A new spate of advances, however, suggests that the whitecoats are coming. “There is steady movement and progress in the field,” says E. Fred Schubert, an electrical engineer and LED expert at Rensselaer Polytechnic Institute (RPI) in Troy, New York.

    Much of that progress is coming from the current generation of white LEDs that use a blue LED in combination with a yellow phosphor to produce white light. In April, North Carolina–based Cree reported that its latest commercial white LED bulb puts out an impressive 132 lumens of light per watt of electricity. Incandescent bulbs, by contrast, put out 15 lumens/watt (lm/W), and compact fluorescents bump that up to about 65 lm/W. And earlier this year, Nichia, a Japanese LED company, reported that in a lab demonstration white LEDs had turned out a stunning 249 lm/W at low current.

    Progress is also coming from combining separate blue, green, and red LEDs, which not only can mix their primary colors to produce bright white light but also can be tuned to shine in any color. The holdup right now is that green LEDs are less efficient than the reds and blues. But over the past 2 years, key advances have come from the University of California, Santa Barbara (UCSB), Purdue University, and RPI.

    Bright future?

    White light-emitting diodes could slash the need for electricity.

    CREDIT: JUPITERIMAGES

    If progress continues, the payoff could be enormous. According to UCSB researchers, if an affordable 150-lm/W white LED were developed, the efficiency gains from replacing conventional bulbs would save the United States alone some $115 billion in lighting costs by 2025, alleviate the need for 133 power stations, and prevent the release of 258 million metric tons of carbon dioxide.

  14. Energy Efficiency

    Aircraft Designers Shoot for Savings on the Wing

    1. Daniel Clery

    Commercial aircraft have become steadily more fuel-efficient by using better engines and reducing the drag and weight of the planes themselves. But efficiency increases beyond that may require a radical change in the shape of the aircraft, which carries enormous risks for manufacturers and airlines.

    Since jet engines appeared in the mid-1950s, commercial aircraft have become steadily more fuel-efficient—simply in order to fly farther and cheaper. According to the International Air Transport Association, new aircraft are 70% more fuel-efficient than they were 40 years ago. In 1998, passenger aircraft averaged 4.8 liters of fuel per 100 kilometers per passenger; the newest models, the Airbus A380 and Boeing 787, claim 3 liters. Even so, as air travel expands while fuel prices spiral upward, there is more that aircraft designers can do.

    The greatest gains in the past, says aerospace engineer Ian Poll of Cranfield University in the United Kingdom, have come from better engines. The earliest engines were turbojets in which all the air sucked in at the front is compressed, mixed with fuel, and burned, providing thrust through a jet out the back. Engineers soon realized that they could get greater efficiency by using some of the power of the jet to drive a fan that pushes some of the intake air through ducts around the core, a design known as a turbofan. Other boosts have come from better compressors and materials to let the core burn at higher pressure and temperature. Poll says engineers might make turbofans yet more efficient by leaving the fan in the open. Such a ductless “open rotor” design—essentially a high-tech propeller—would make possible larger fans, Poll says, if engineers could solve noise problems and figure out how to fit such engines onto the airframe.

    Changes in aircraft bodies have led to more modest improvements. Computational fluid dynamics has enabled designers to refine the shape to reduce drag—an enemy of efficiency. Manufacturers have also reduced weight with lightweight materials such as plastic. The Boeing 787 is made of 50% composite materials by weight, mostly carbon fiber–reinforced plastic, and is the first airliner to use them extensively in the fuselage, wings, and tail. But increasing efficiency this way is a hard fight: Each 1% reduction in weight cuts fuel consumption by only about 0.75%.
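    As a rough illustration, the Python sketch below applies that 0.75%-per-1% rule of thumb to larger weight cuts. Treating it as a compounding relationship is an assumption made here for simplicity; the article quotes only the marginal figure.

    ```python
    # Illustrative compounding of the ~0.75% fuel saving per 1% weight reduction
    # quoted in the article (treated here as a simple rule of thumb).
    def fuel_after_weight_cut(weight_reduction_pct, sensitivity=0.75):
        """Relative fuel consumption after an overall weight reduction (percent)."""
        return (1 - weight_reduction_pct / 100) ** sensitivity

    for cut in (1, 5, 10, 20):
        saving = (1 - fuel_after_weight_cut(cut)) * 100
        print(f"{cut:>2}% lighter airframe -> ~{saving:.1f}% less fuel")
    ```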

    Poll thinks manufacturers could wring out another 50% improvement in efficiency by using open-rotor engines and more composite materials, but beyond that they may need to radically change the shape of the aircraft. In traditional airliners, the fuselage is a dead weight that contributes no lift. A possible alternative is a blended wing body (BWB), in which the fuselage flows into the wings and is itself a lift-generating airfoil.

    Delta force.

    Boeing and NASA are testing fuel-efficient blended-wing body designs to see how they fly.

    CREDIT: BOEING IMAGE

    NASA and Boeing have been collaborating on an experimental BWB craft known as the X-48B. Since 2007, they have been flight-testing a remotely piloted 6.4-meter-wide model of the plane. Making the jump to such a different technology carries enormous risks for manufacturers and airlines because of the development and testing involved. They would take that step only if forced to by high fuel costs. “We know [a BWB] is more fuel efficient,” says Poll. “But it's too early to say if it will be the next generation.”

  15. Energy Efficiency

    Making Use of Excess Heat

    1. Dan Charles

    The single biggest opportunity to increase the “energy productivity” of American industry is to make use of heat that would otherwise be thrown away. One way to do that is via cogeneration, or “combined heat and power,” a technique that is more than a century old but newly fashionable.

    Green campus.

    Duquesne University, in the heart of Pittsburgh, built a cogeneration plant (below, with smokestack) that supplies the campus with electrical power and steam heating.

    CREDITS: DUQUESNE UNIVERSITY

    The single biggest opportunity to increase the “energy productivity” of American industry, according to a report issued in July by the consulting group McKinsey & Co., lies untapped in the furnaces of ethanol refineries, paper mills, and other heat-consuming industries. The key is to make use of heat that would otherwise be thrown away.

    One way to do that is via cogeneration, or “combined heat and power” (CHP), a technique that is more than a century old but newly fashionable. “District heating,” common in Scandinavia and Eastern Europe, uses leftover steam from power plants to heat nearby buildings. Alternatively, a factory that needs steam can build a gas-fired generating plant, sell the electricity or use it on-site, and use the waste heat to produce the steam it needs.
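    The efficiency argument can be made concrete with a toy comparison: meet the same heat and electricity demand once with separate plants and once with a single CHP plant. All efficiencies and demand figures in the Python sketch below are illustrative assumptions, not numbers from the McKinsey study.

    ```python
    # Toy comparison of fuel needed to meet the same heat and power demand
    # separately versus with combined heat and power (CHP).  Efficiencies and
    # demands are illustrative assumptions only.
    power_demand = 30.0    # units of electricity needed
    heat_demand = 45.0     # units of useful heat (steam) needed

    # Separate production: a condensing power plant plus an on-site boiler.
    fuel_separate = power_demand / 0.40 + heat_demand / 0.85

    # CHP: one gas-fired plant whose exhaust heat is captured as steam,
    # sized so that it covers whichever load is more demanding.
    fuel_chp = max(power_demand / 0.35, heat_demand / 0.50)

    print(f"Separate plants: {fuel_separate:.0f} fuel units")
    print(f"CHP plant:       {fuel_chp:.0f} fuel units")
    ```

    Under these assumed efficiencies the combined plant needs roughly 30% less fuel, which is the kind of saving that makes the projects' financial returns attractive.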

    Such combined operations are very efficient. The McKinsey study estimates that linking heat and power generation in U.S. industry could save nearly a trillion megajoules of energy over the next 20 years, and the average project would generate a healthy financial return of 36%.

    Currently, Denmark is the world's cogeneration leader; more than half of the country's electricity is produced in CHP plants. In many other countries, including Brazil, Canada, France, the United Kingdom, South Africa, and India, the CHP share is less than 8%. According to the International Energy Agency, these countries could double their CHP output in 10 years and triple it by 2030 if they set up the right incentives.

    The Netherlands showed that it's possible. Starting in the mid-1980s, the Dutch government guaranteed favorable prices for electricity from CHP plants, then encouraged electric utilities to set up CHP plants as joint ventures with industrial companies. “Suddenly, utilities started to build cogen plants—because they weren't competition anymore!” says Ernst Worrell, an energy researcher at Utrecht University in the Netherlands. Between the early 1980s and the late 1990s, the share of Dutch electricity generation that came from CHP plants rose from 8% to almost 30%.
