News this Week

Science  12 Dec 2008:
Vol. 322, Issue 5908, pp. 1618
  1. NASA

    Delays in Mars Mission Will Ripple Across Space Science

    1. Andrew Lawler*
    1. With reporting by Daniel Clery.

    Faced with mounting technical glitches, NASA decided last week to delay its Mars Science Laboratory (MSL) by at least 2 years. That means the spacecraft will not get off the ground before fall 2011. The new date is a major disappointment for researchers eager to learn whether Mars ever supported microbial life. But what worries them even more is the price that other planetary missions may have to pay to free up the additional $400 million that a delayed MSL will cost NASA.

    The troubles surrounding the sophisticated lab, twice as long and three times as heavy as the rovers that have explored Mars since 2004, are part of a growing crisis that President-elect Barack Obama will inherit in January. NASA's science program needs more money for a host of planned spacecraft. But so do the agency's efforts to complete the space station, retire the space shuttle, and build a new launcher to send humans to the moon. Obama pledged during his campaign to back a new generation of science missions as well as the new launcher, which would add billions to NASA's current $18 billion annual budget.

    Speaking to reporters on 4 December, grim-faced NASA managers blamed problems with the lab's small but complex motors, called actuators, that control many of its critical moving parts. If they don't function properly, said NASA Mars Exploration Director Doug McCuistion, the massive lander could turn into “a metric ton of junk.” He says completing tests and keeping the MSL team together will cost $400 million over several years, bringing the total price tag to as much as $2.3 billion. The lab was already 15% over budget before the latest estimate.

    The extra money, says NASA science chief Edward Weiler, will come from other projects involving Mars and possibly other planets. The queue contains three spacecraft to be launched in 2011—a small atmospheric probe to Mars, a robot to survey Jupiter, and a mission to chart the moon's interior. A 2016 Mars mission that has not been fleshed out will cost $800 million to $1.4 billion. And next month, NASA is expected to select a $3 billion spacecraft to study a moon of either Jupiter or Saturn that would be launched by 2020. “There will be impacts,” says Weiler. “We probably will have to delay a major planetary mission.”

    That's not what planetary scientists want to hear. “It's all bad news,” says Fran Bagenal, a planetary scientist at the University of Colorado, Boulder. “There is no silver lining.” But Weiler does see an upside in combining forces with the European Space Agency (ESA) on a common plan for Mars exploration. “We could probably do a heck of a lot better missions if we do it together,” he said. “It's a no-brainer.”

    Extra protection.

    A 2-year delay in the Mars Science Laboratory mission gives NASA time to test all its components, including the “clamshell” to buffer the landing.


    David Southwood, ESA's director of science, is reluctant to discuss NASA's possible contributions to ESA's ExoMars mission, proposed for 2016, before joint technical discussions are held early next year. He says that “some things are at the heart of the European investment,” noting that ESA wants to test its mettle at landing on Mars and roving across its surface. But providing an orbiter to aid communication with Earth or instruments for the lander are areas in which NASA might be able to help, he adds. Cooperation would also be vital for any sample-return mission. This collaboration “is lifting a curtain on the future,” Southwood says.

    The causes of MSL's overruns are controversial. Weiler's predecessor, S. Alan Stern, wrote recently in The New York Times that his boss, Administrator Michael Griffin, feared retribution from the U.S. Congress if he canceled the mission, and that NASA managers have failed to control MSL and other space science overruns. The delay is both “disheartening” and “predictable,” he told Science last week, adding that further MSL overruns are likely to surface.

    But some scientists point the finger at Stern for planning to divert funds from the Mars program before he resigned from the agency 8 months ago. “If Stern hadn't mucked around, the Mars program would have been able to handle” the overruns, says John Mustard, a planetary scientist at Brown University who also chairs a Mars advisory committee. Stern dismissed such charges as “scapegoating.”

    Much ire has also been directed at the Jet Propulsion Laboratory in Pasadena, California, which is handling the MSL mission. Its director, Charles Elachi, admitted at the press conference that “we have not been good at doing cost estimates.” But he says the lab is proud of its record of three successful Mars landings in the past 5 years.

    Space scientists agree that NASA and the science community must rein in costs to avoid a data-dry future. Bagenal says that the “MSL debacle” cannot be allowed to happen again, lest planned missions never get off the ground. “We have to put more attention on careful costing,” she says.

    In the meantime, Griffin promises to find “the least damaging way” out of the mess by talking with Congress, the White House, and the scientific community. “This is math we will be doing in public,” he says. “We just don't have the answers.”


    A Fresh Start for Embryonic Stem Cells

    1. Constance Holden

    U.S. researchers are eagerly anticipating the moment that President-elect Barack Obama takes office and sweeps away the Bush Administration's restrictions on federal funding for research with human embryonic stem (ES) cells. Scientists at the U.S. National Institutes of Health (NIH) are making no secret of their glee. “I think everybody here is incredibly excited about the new Administration,” says Story Landis, director of the National Institute of Neurological Disorders and Stroke and chair of the NIH Stem Cell Task Force.

    Landis predicts that despite the static NIH budget, more federal money will be going into basic research on human ES cells—funded at $42 million this year. Grants will be highly competitive, she says, because “there are incredibly promising things to do.” Since President George W. Bush's directive of 9 August 2001, federally funded researchers have been limited to 21 human ES cell lines that were derived before that date. But as soon as Obama says the word, as he has repeatedly promised to do, they will be able to use hundreds of lines derived since.

    Picking up where the Clinton Administration left off in 2000, NIH will be drafting new guidelines on matters such as informed consent by donors of embryos from which cell lines were derived. Just how many of the more recently derived lines will qualify under the guidelines—and how the guidelines will be implemented—is yet to be determined. Nonetheless, Harvard University researcher Kevin Eggan predicts that “there will be a massive switch in the lines that people use.”

    Gone will be onerous requirements to keep federally funded ES cell research separate from privately funded research using non-approved lines, which for some investigators has entailed the construction and outfitting of separate labs. It's been “a pain in the behind even for someone [like me] with plenty of private cash,” says Eggan. “This was a crazy artifact of the Bush policy that is antithetical to everything that good science is supposed to be about,” says Sean Morrison of the University of Michigan, Ann Arbor.


    Now, says Columbia University motor neuron disease researcher Christopher Henderson, “we will be able to design projects in the most efficient way possible using the most appropriate lines,” doing away with duplication in both research and equipment. For example, says Landis, federally funded researchers will now have access to ES cells from embryos that carry the genes for Huntington's disease, donated from fertility clinics after preimplantation genetic diagnosis. They will be able to test ES-like induced pluripotent stem cells generated from the skin cells of Huntington's patients by comparing them with ES cells, the “gold standard.” And they'll be able to test new drugs on neurons grown from ES cell lines.

    Money problems aside, research still won't proceed unfettered. The government will not be funding the derivation of new ES cell lines or the practice of research cloning, known as somatic cell nuclear transfer (SCNT). Both are out of bounds because language imposing a federal ban on embryo research, known as the Dickey-Wicker Amendment, has been attached to NIH appropriations bills since 1995.

    Congress is expected to ratify Obama's new policy by again passing legislation that Bush twice vetoed. But no one is currently expecting lawmakers to attack Dickey-Wicker, and although scientists want to be able to use SCNT, they say the field can advance without it. For the near future, says Morrison, “my sense is that the field will be more focused on adult cell reprogramming”—that is, finding new ways to turn a patient's own skin cells directly into liver cells, for example, rather than using an egg to reprogram them.

    Zach Hall, former president of the California Institute for Regenerative Medicine (CIRM) in San Francisco, thinks the upcoming federal policy change will have a “multiplier” effect on state projects that were started largely to circumvent the Bush restrictions. These initiatives may provide more leeway than federal guidelines, allowing researchers to generate new lines and possibly try SCNT. As for California, CIRM mastermind Robert Klein says the agency is in good shape, at least through mid-2009, despite the state's $28 billion budget deficit.

    A new Obama policy relaxing controls on ES cells should help foster more realistic attitudes about the promise of stem cell therapies, says Amy Comstock Rick, president of the Coalition for the Advancement of Medical Research in Washington, D.C. At a recent meeting, she said, “There's been a shift in understanding” on the part of people in the patient community—they are less inclined to blame all obstacles on the restrictions imposed by Bush and recognize that in any case “the field has an awfully long way to go.”


    Sotto Voce, LHC Repair Plan Points to Weaknesses in Original Design

    1. Adrian Cho

    Officials at CERN, the European particle physics lab near Geneva, Switzerland, issued a four-page report last week tersely describing how they plan to get the 27-kilometer-long Large Hadron Collider (LHC), the world's biggest particle smasher, working again after its 19 September breakdown. Although the report doesn't mention errors in design, the list of fixes does point to flaws, including one that some physicists say cannot be completely eliminated. “There are some questions about the design, and they are fixing some of them and some of them cannot be fixed,” says Peter Limon, an accelerator physicist at Fermi National Accelerator Laboratory in Batavia, Illinois.

    The breakdown occurred just 9 days after researchers first circulated protons through the $5.5 billion LHC. The collider comprises more than 1600 electromagnets linked like sausages to form a ring. Researchers were raising the current past 9000 amps in a string of 209 magnets when an electrical connection between two of them melted. This burned a hole in the pipe that surrounds the superconducting niobium titanium wire and keeps it bathed in frigid liquid helium. Boiling helium flooded the magnets' casings, creating a pressure wave that pushed around some of the 35-metric-ton magnets and propelled soot and debris down the pristine beam pipe through which the protons speed.

    CERN workers have begun to remove 53 damaged magnets from the LHC's tunnel. They will rebuild 39 and clean the rest. The LHC will not collide particles until July at the earliest, officials now say.

    The pressure built up in the magnet casings because only every fourth one had an emergency relief valve and those were too small to release the errant helium into the LHC tunnel quickly enough. To correct that shortcoming, workers will install a larger valve on every casing, the report says.

    It will be trickier to ensure that an electrical connection does not melt again. The faulty connection lay in a straight-shot return line called a “bus bar.” After current winds its way through every magnet in a string, the bus bar carries it back through the magnet casings to the power supply. Between casings, the bar has a solder splice like the one that melted. Although the LHC has a system to detect when individual magnets overheat and lose their superconductivity, the bus bar is protected by a relatively insensitive system that looks for a voltage increase—a sign of heating—across the full length of the bar. That's a design flaw, says Thomas Roser, an accelerator physicist at Brookhaven National Laboratory in Upton, New York.

    To better detect overheating, CERN will install equipment to individually monitor the voltages across each connection. They will also monitor the temperature of the surrounding liquid helium. Still, it takes 100 seconds to shunt current out of a magnet string. That time is too long to keep a very hot connection from melting, but the machinery cannot feasibly be changed to shorten it. So the only “fix” is to spot a warm splice before it gets too hot. “It absolutely must not be allowed to happen again,” says CERN's Lyn Evans, who leads the LHC project. “It's a machine on the edge, and all effort must go into learning the warning signs” of trouble.


    A pressure wave within the LHC shoved 35-metric-ton magnets off their supports and mangled the connections between them.


    Physicists hope that the 20 nations that fund CERN will be patient while they work out the kinks. As Science went to press, delegates to the CERN Council were meeting at the lab. “I expect the council will understand that this is a difficult machine and that we will have a few hiccups,” Evans says. Patience may wane if the LHC suffers another setback, however. “If they have another one, then it's going to get more serious,” Limon says. “The council is going to say, 'What's going on here?'” So relief valves won't reduce the pressure on CERN physicists next summer.


    How Kansas Nabbed the New Bio- and Agro-Defense Lab

    1. Yudhijit Bhattacharjee*
    1. With reporting by Jocelyn Kaiser.

    If you build it, they will come. The maxim seems to have worked for Kansas State University (KSU) in Manhattan, which the U.S. Department of Homeland Security (DHS) selected last week as its preferred site for a $450 million facility to replace the aging Plum Island Animal Disease Center off Long Island, New York. What made the university stand out from the competition, KSU and federal officials say, was a recently completed $54 million biocontainment lab on campus for animal research, funded by the state, as well as the lack of any vigorous local opposition to work involving dangerous animal pathogens.


    Kansas Senator Pat Roberts spearheaded the state's aggressive lobbying, which included building this BSL-3 lab on the KSU campus.


    DHS announced in 2006 that it would replace the Plum Island center, a biosafety level 3 (BSL-3) lab where researchers have been studying livestock diseases such as foot-and-mouth disease and anthrax for 5 decades, with a BSL-4 National Bio- and Agro-Defense Facility (NBAF) to be located on Plum Island or somewhere on the mainland. Although some environmental groups opposed the decision, DHS argued that the new facility was needed to study emerging animal diseases such as African swine fever—which can only be handled in a BSL-4 lab—and that building it would cost less than upgrading the existing lab at Plum Island.

    A total of 29 institutions submitted proposals to host the facility. DHS short-listed six of them, including Plum Island, in July 2007. Last week, DHS published a final Environmental Impact Statement for all six candidates, recommending KSU as its top choice and placing the University of Texas at San Antonio and the University of Georgia in Athens in second and third place, respectively. Governors of two of the losing states, Texas and Mississippi, have indicated that they may challenge the decision during the 30-day comment period before the selection becomes final. But federal officials say the choice is unlikely to be reversed.

    “Kansas by far had more strengths than any of the other sites,” says Jamie Johnson, director of national labs at DHS. The new Biosecurity Research Institute (BRI) on campus means there is already considerable research expertise in livestock diseases there, he says. “They also had strong community acceptance, and they offered a solid package to help offset the cost of building the facility.”

    KSU has offered 15 hectares of land for the project, and the Kansas legislature has granted permission to the state to spend up to $105 million to bring roads and other infrastructure to the site. State officials expect NBAF to bring 1500 construction jobs, boost the state's veterinary and animal-care industry, and add a net $3.5 billion to the state economy.

    For more than a decade, political leaders, scientists, and businesses in Kansas have been trying to make their state the leading venue for research on food safety and animal diseases. Efforts began in 1998 when Kansas Senator Pat Roberts (R), then chair of the U.S. Senate's Emerging Threats Subcommittee, invited KSU researchers to testify on the potential risks and consequences of bioterrorism on livestock and agriculture. “Nobody at the time was talking about the terrorist threat within the agricultural community,” says Jerry Jaax, a former U.S. Army veterinarian who is KSU's associate vice provost for research compliance.

    At Roberts's urging, KSU leaders came up with a proposal to build a research facility to study diseases that could devastate Kansas's cattle-based economy. After failing to get funding from the U.S. Department of Agriculture, the university approached the state legislature, which appropriated funding in 2002. The facility was completed in 2007 and is now being set up for experiments.

    Ron Trewyn, vice president for research, says the bipartisan political support behind BRI paved the way for the state's proposal to DHS. The university had already invested a lot of effort to reassure ranchers and business owners that the BSL-3 facility did not pose a hazard to the surrounding community. Jaax, who did much of the outreach, said the university also emphasized “the importance of this research to the economic security of Kansas—how much Kansas would lose if there were a major bioterrorism event.”

    It was a “well-organized, extremely well-funded consortium of proponents,” says Tom Manney, a retired KSU biophysics professor who led a local campaign to stop the facility from coming to Manhattan. Although he says he knows “a lot of ranchers who are very upset about doing foot-and-mouth research in the middle of this area,” Manney said he had difficulty rallying opposition to the project.

    If the choice is finalized, DHS officials say they expect to have the facility running by 2015 and staff it with 350 employees, half of them scientists. Some researchers will likely be hired locally, DHS officials say. Until NBAF is ready for operation, Trewyn says, the federal government is welcome to “take over” BRI to start some transition projects.


    Researchers Could Face More Scrutiny of Outside Income

    1. Jocelyn Kaiser,
    2. Lila Guterman*
    1. Lila Guterman is a science writer in Washington, D.C.

    Faced with an escalating controversy about researchers who allegedly broke rules on reporting income from drug companies, U.S. National Institutes of Health officials outlined plans last week to tighten federal conflict-of-interest regulations for NIH grantees within the next 6 to 12 months. The news came the same week that several research institutions said they intend to make industry payments to their faculty members public.


    Acting NIH chief Raynard Kington gets the first crack at fixing the conflicts-of-interest furor.


    NIH standards have been under intense public scrutiny since last summer, when Senator Chuck Grassley (R-IA) claimed that three prominent Harvard University psychiatrists with NIH grants had failed to report millions of dollars in outside income to their institutions (Science, 27 June, p. 1708). Until now, NIH has said little about these and half a dozen similar cases that Grassley has identified. Last week at a meeting of the NIH director's advisory committee, however, the agency disclosed that it has been investigating 20 individuals over the past year, some identified in press reports and in Grassley's probe and others identified by NIH. According to Sally Rockey, acting director of the NIH office of extramural research, in 13 of the cases no rules were broken, and six are still being investigated. One “serious” case led to the suspension of a $9.3 million grant to Emory University (Science, 24 October, p. 513).

    Although current federal conflict-of-interest regulations are “likely being followed in the majority of cases,” said NIH acting director Raynard Kington, NIH and officials at the Department of Health and Human Services (HHS) are now considering a revision. NIH and HHS have prepared a draft notice soliciting public input through a series of questions. It asks: Should researchers be required to report all income, not just income exceeding $10,000 per year? Should NIH review the nature of conflicts, including details that are now reviewed only by institutions? Should NIH require a policy for institutional conflicts of interest? Last week, HHS submitted the notice to the White House Office of Management and Budget, which must act within 90 days; a 60-day public comment period follows.

    Grassley is proposing his own solution: new legislation that would require drug and device companies to post all payments to physicians in a national public database. Anticipating this action, the Cleveland Clinic in Ohio announced on 3 December that it is posting information in its physician directory about outside income over $5000 a year and royalty and equity interests for bench scientists and clinical researchers. This step is a first for a major academic research center, the clinic says. Guy Chisolm, a professor of cell biology and director of the clinic's new conflict-of-interest program, explains that the Cleveland Clinic is listing names but holding off on posting the payment amounts until it can discuss with drug companies and Grassley's staff how they will distinguish among different types of payments, such as those for travel expenses and consulting fees. “I don't want our faculty in jeopardy” of having to explain discrepancies, Chisolm says.

    The University of Pennsylvania plans to post similar information in the spring, says Arthur H. Rubenstein, dean of the medical school. Duke University in Durham, North Carolina, where an affiliated institute that performs industry-sponsored clinical trials already posts complete disclosure forms for 21 faculty members, is considering doing the same, says Ross McKinney Jr., a professor of pediatric infectious diseases at Duke. Chisolm says Johns Hopkins University and Stanford University are also “interested in the concept.”


    Malaria Vaccine Comes Another Step Closer

    1. Martin Enserink

    NEW ORLEANS, LOUISIANA—The most advanced candidate vaccine for malaria has cleared another major hurdle and is now ready for its last and biggest test: a phase III trial of 12,000 to 16,000 children at 11 locations in seven African countries. Two new studies published this week by The New England Journal of Medicine—and presented, to considerable excitement, at a meeting* here on Monday—confirm that the vaccine works and is safe, and, perhaps most important, they show it can be given along with a series of other vaccines that many babies in the developing world receive shortly after birth.

    The vaccine, called RTS,S, is far from perfect: It appears to prevent roughly half of all clinical episodes of malaria at best. But researchers believe that's good enough to help make a major dent in the disease, especially when applied in tandem with other tools, such as long-lasting insecticide-impregnated bed nets and a new generation of drugs called ACTs (artemisinin-based combination therapies). And they're encouraged by the fact that trial outcomes so far have been consistent. After a “long history of sob stories” in malaria vaccine development, “this seems like a solid vaccine, study after study,” says Christopher Plowe of the University of Maryland School of Medicine's Center for Vaccine Development in Baltimore.

    RTS,S, designed in the late 1980s by an ancestor company of GlaxoSmithKline (GSK) in Rixensart, Belgium, aims to prevent the parasite Plasmodium falciparum from infecting and surviving in human liver cells. It consists of a fragment of a protein from the parasite's outer surface, fused with a hepatitis B virus protein that helps trigger a more vigorous immune response. Since 2001, the vaccine has been developed in a partnership between GSK and the PATH Malaria Vaccine Initiative, with financial backing from the Bill and Melinda Gates Foundation.

    Previously, a study among children aged 1 to 4 in Mozambique showed that the vaccine, coupled with a GSK-developed immune booster or “adjuvant” named AS02, prevented 30% of malaria cases in the first 6 months after the shots were given, and perhaps as many as 58% of severe cases (Science, 22 October 2004, p. 587). Another study published last year showed that it protected infants—its intended target audience—as well.

    Researchers hope to keep administration of the malaria vaccine simple and cheap by making it part of the Expanded Program on Immunization (EPI), a series of shots against diphtheria, tetanus, polio, and Haemophilus influenzae B given shortly after birth. To test whether that can be done safely, the researchers gave RTS,S with AS02 along with the EPI vaccines to 340 Tanzanian babies at 8, 12, and 16 weeks after birth. The new shot did not reduce antibodies against the other vaccines, they found, and reduced the incidence of malaria during the first 6 months after vaccination by more than half.

    A shot at prevention.

    Salim Abdulla, a lead author on one of the vaccine studies, examines a Tanzanian child with malaria.


    The second new study tested whether it's safe to give RTS,S with a slightly different adjuvant called AS01, which GSK believes to be more potent. Carried out in 894 children aged 5 to 17 months in Tanzania and Kenya, the study found that the combination was safe and led to a 53% reduction of clinical malaria in the 8 months following vaccination. Based on those findings and another study in Ghana, also presented here, the partnership has picked AS01 as the adjuvant for its phase III trial. That study, still awaiting approval in some of the participating countries, will start within 3 months, says GSK researcher Joe Cohen, a co-inventor of RTS,S.

    “These are rigorous studies. I'm impressed,” says Blaise Genton, a malaria researcher at the Ifakara Health Research & Development Centre in Tanzania, who was not involved in the trials. Several worries remain, however. One is that rather than preventing serious malaria, the vaccine may simply be delaying it by a number of years. But even if RTS,S should fail its final exam, researchers take heart from the fact that more possible replacements than ever are behind it in the development pipeline.

    • * 57th Annual Meeting of the American Society of Tropical Medicine and Hygiene.


    Biosummit Seeks to Draw Obama's Attention to the Life Sciences

    1. Eliot Marshall

    How do you make a pitch for spending more money on basic science at a time when the whole world seems to be going broke? That question faces a group of scientists who met last week in Washington, D.C., at a “Biosummit,” part of a new campaign to focus public attention on the importance of the life sciences. It's not clear whether the meeting and two planned follow-up reports will ask for an explicit increase in public spending. But one organizer—biologist Keith Yamamoto of the University of California, San Francisco—says he expects the project will give “very explicit” proposals for improving science education and grant mechanisms. The idea, he said, is to focus on “broad issues that will have a very big impact,” such as removing barriers between fields. The first step, he said, will be to deliver an overview “white paper” to the Obama Administration by February.

    Speakers at the 3 December Biosummit, organized by the U.S. National Academies, included two former directors of the U.S. National Institutes of Health (NIH), Elias Zerhouni and Harold Varmus; the president of the Howard Hughes Medical Institute, Thomas Cech; Massachusetts Institute of Technology (MIT) President Susan Hockfield; and others. Their comments went to a 17-member panel chaired by MIT biologist Phillip Sharp and E. I. du Pont de Nemours Vice President Thomas M. Connelly Jr., who will oversee the project and deliver a full report with recommendations in August. For the academies, that's “warp speed,” says project officer Ann Reid.


    Keith Yamamoto helped orchestrate a review of what's needed in basic research.


    The project began to take shape about a year ago, says Yamamoto, who chairs the Academies Board on Life Sciences. In a way, he concedes, it was a reaction to an earlier pitch for science, a report from the academies in 2005 called Rising Above the Gathering Storm. Many credit that document with galvanizing support in Congress for the physical sciences (Science, 9 May, p. 728). But Gathering Storm said little about bioscience. Yamamoto notes that “it sectored out the physical sciences and engineering away from life sciences.” This was understandable, he adds, because NIH had just come off a 5-year doubling of its budget. But NIH's budget is now stagnant.

    The Board on Life Sciences recruited two sponsors—NIH and the U.S. National Science Foundation—to finance the anticipated $500,000 cost of the Biosummit project. Yamamoto says he hopes the new “gathering storm for biology” will help open up the life sciences to other disciplines such as physics, engineering, and mathematics. Setting priorities will be more important than seeking “lots of new dollars,” he says.

    In his talk, Zerhouni stressed the need to “create venture space” where scientists can take risks. The peer-review system, although good at stopping weak science, he said, is “not very good” at identifying new ideas. He disparaged the “herd mentality” reflected in much NIH research and gave a short list of what he'd like to see, including more methods to analyze dynamic biology “in a standardized and reproducible fashion.” In an interview, Varmus spoke of the need for inclusiveness and the importance of avoiding a “tit for tat” competition among fields.

    One person who says he wouldn't hesitate to ask for a hefty increase in bioscience funding is the chief author of the original Gathering Storm report, former aerospace executive Norman Augustine. “The life sciences need to be given additional resources,” he says. Legislators would be “proud” to invest in science, education, and “fixing the long-term problems” of the economy rather than bailing out failed companies. And the cost would be small in comparison to what's being spent on financial rescues, he says: “You wouldn't even have to replot the charts.”


    Crazy Money

    1. Chelsea Wald*
    1. Chelsea Wald is a freelance writer in New York City.

    Humans aren't rational, as the recent economic crisis shows. So why should financial theories assume that they are?


    NEW YORK CITY—Falling housing prices, crashing stock markets, contracting credit: Even the experts seem bewildered by the current economic crisis. Quantitative analysts (quants)—the whiz-kid financial engineers whose algorithms have dominated Wall Street trading in recent years—have watched those algorithms fail. Former Federal Reserve Chair Alan Greenspan acknowledged in October that there was “a flaw in the model that I perceived … defines how the world works.” U.S. Treasury Secretary Henry Paulson flip-flops about the most effective way to spend an increasingly inadequate-seeming $700 billion of taxpayer money.

    How could the experts be at such a loss? “One of the biggest factors in the current crisis is human behavior,” says Andrew Lo, a financial economist at MIT Sloan School of Management in Cambridge, Massachusetts. However, the classical theory of finance simply does not address human psychology. It looks more like a physical science than a social science—relying on the premises that markets are “efficient,” immediately reflecting new information, and investors are “rational,” always acting in their self-interest. Although industry and regulators don't adopt classical finance theory wholesale, its assumptions underlie many of their choices.

    Most of the time, classical finance theory works very well—and there's the rub. Lo compares it to Newtonian gravitation. During normal circumstances, people are generally rational, and the theory describes observed reality quite well. But in extreme circumstances—such as the past months for our economy—people panic, and the theory fails. Is a grander, general relativity-like theory of finance in reach? Lo says it is. Others, such as Jeffrey Wurgler, a financial economist at New York University's (NYU's) Stern School of Business, say the effort would be more like searching for a financial “string theory, which is unbelievably more complicated”—and possibly futile.

    What went wrong

    Most accounts of the current financial crisis begin earlier this decade, when, inspired by the booming real estate market, mortgage lenders started giving loans to just about anyone who asked for one. Those loans got bundled and repackaged many times into risk-obscuring financial instruments, such as the now-infamous collateralized debt obligations (CDOs).

    Through the unregulated “shadow banking system,” these instruments ended up in the portfolios of nearly every bank and financial firm around the world. Because of the lack of regulation, members of this shadow system habitually borrowed dozens of times their own worth in cash—a debt that would be perilous if their bets didn't pan out.

    And that's what happened when the real estate market, and thus the CDOs, turned sour. The losses surged through the world economy. Many firms—and, in some cases, their associated commercial banks—couldn't stay above water. Not knowing who would fail next, banks stopped lending, leading to further failures. To raise money, investors were forced to sell perfectly good stocks, causing stock prices to fall. “It's a very complicated system that's malfunctioning,” says Yale University economist Robert Shiller.

    Blame has fallen on quants for various aspects of the crisis. First, mathematical models were increasingly used to determine whether someone deserved a loan, bypassing individual judgments. “In the end, there was very little sound credit judgment going into making these credit calls,” says Bjorn Flesaker, a senior quant at Bloomberg in New York. Then, quant models were used to rate the riskiness of financial instruments, including the CDOs. “We never necessarily viewed the rating agencies as having the greatest rocket scientists around,” says Flesaker, yet investors accepted those ratings, taking on more risk than even they realized. Finally, Value at Risk models claimed to tell trading departments of Wall Street companies the maximum loss they could expect to see on any day and therefore how much money they needed on hand to avoid total collapse. These models were “the wink-and-a-nod of Wall Street,” says financial engineer Lee Maclin of NYU. Risk managers should have known to use common sense, but, in some cases, Maclin says, “the models were used to justify a bigger appetite for risk.”
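
    To make the Value at Risk idea concrete, here is a minimal sketch of its historical-simulation variant, one common way such models are built. It is an illustration only, not any firm's actual model: the portfolio size, the simulated return series, and the 99% confidence level are all assumptions chosen for the example.

    ```python
    # Minimal historical-simulation Value at Risk (VaR) sketch.
    # Illustrative assumptions: a $100 million portfolio, a made-up history of
    # daily returns, and a 99% confidence level.
    import numpy as np

    def historical_var(daily_returns, portfolio_value, confidence=0.99):
        """1-day VaR: the loss that past data suggest should be exceeded
        on only (1 - confidence) of trading days."""
        losses = -np.asarray(daily_returns) * portfolio_value  # positive = loss
        return np.percentile(losses, 100 * confidence)

    # Toy data: 1,000 days of fat-tailed returns (Student's t), a rough
    # stand-in for real market history.
    rng = np.random.default_rng(0)
    returns = 0.01 * rng.standard_t(df=3, size=1000)

    var_99 = historical_var(returns, portfolio_value=100e6)
    print(f"1-day 99% VaR: ${var_99:,.0f}")
    # The catch the article points to: the estimate is only as good as the
    # history it is fed. A decade of calm data says little about a panic.
    ```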

    Despite their errors, Flesaker says he is “not in the 'blame the quants' camp,” and he's not alone. Shiller says that, like many of the elements that economists and the media have focused on, the quant models are simply “proximate causes.” Ultimately, experts must examine human behavior to find out why the crisis happened. Why did so many people take on mortgages that they would not be able to pay? Why did the best minds of Wall Street ignore warnings about a housing bubble? “The bottom-line question that economists, I think, still are struggling with is: 'Did anybody know that the risks were so great and, if so, why did they continue investing?'” says Lo. “I don't think they have an answer for that.”

    The madness of crowds

    For more than 2 decades, researchers in behavioral finance have sought out the signatures of human irrationality in markets. “Behavioral finance is an intellectual revolution that tries to give a broader perspective from multiple social sciences,” including psychology and sociology, says Shiller, a founder of the field.

    For example, says economist José Scheinkman of Princeton University, classical finance theory's model of speculative bubbles, such as the dot-com bubble of the late '90s and the recent housing bubble, does not match real-life observations. Classical finance contends that rational investors will always have the best possible portfolio, so they will not buy or sell unless they have extra money to invest or need to cash in their investments. However, researchers have observed that people buy and sell much more often than that during a bubble—with the rate of transactions becoming increasingly manic the bigger the bubble gets.

    Lacking a good classical model for stock-market bubbles, Scheinkman, whose work is primarily classical, turned to a concept in behavioral finance. Psychologists have found that people often overestimate the precision of their knowledge. Scheinkman and his Princeton colleague Wei Xiong guessed that overconfident investors would trust their own opinions about the price of an asset, so they would consider others' opinions, if different, a little “crazy,” says Scheinkman. Looking to make money off others' crazy opinions, investors would be willing to pay more than they think an asset is actually worth because they believe that they will be able to sell it in the future to an overeager buyer. This process would inflate prices and cause a trading frenzy. Incorporating investor overconfidence into a theoretical model published in 2003 in the Journal of Political Economy, Scheinkman and Xiong were able to recreate more accurately the hyperactive trading in bubbles.
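
    A toy numerical sketch of that intuition follows. It is not the Scheinkman-Xiong model itself (their model prices an explicit resale option); it simply shows that when two groups of traders each trust their own fluctuating opinion, the asset changes hands whenever the opinions cross, and bigger opinion swings, a crude stand-in for overconfidence, produce more trades at prices above the asset's underlying value. All parameters are illustrative assumptions.

    ```python
    # Toy sketch of overconfident traders (illustrative only; not the
    # Scheinkman-Xiong model). Two groups each hold a noisy, mean-reverting
    # opinion of an asset whose underlying value is 100. The asset sits with
    # the more optimistic group; when the other group's opinion climbs past
    # the holder's by more than a small trading cost, it buys at its own
    # valuation.
    import numpy as np

    def simulate(opinion_volatility, n_periods=20000, trading_cost=1.0, seed=1):
        rng = np.random.default_rng(seed)
        true_value = 100.0
        opinions = np.full(2, true_value)   # each group's current valuation
        holder, trades, premiums = 0, 0, []
        for _ in range(n_periods):
            # Mean-reverting random walk: bigger volatility = wilder opinions.
            opinions += 0.1 * (true_value - opinions) \
                        + opinion_volatility * rng.standard_normal(2)
            other = 1 - holder
            if opinions[other] > opinions[holder] + trading_cost:
                premiums.append(opinions[other] - true_value)  # price paid vs. underlying value
                holder, trades = other, trades + 1
        return trades, float(np.mean(premiums)) if premiums else 0.0

    for vol in (0.5, 2.0):
        trades, avg_premium = simulate(vol)
        print(f"opinion volatility {vol}: {trades} trades, "
              f"average price {avg_premium:+.2f} relative to underlying value")
    ```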

    As Wurgler sees it, the time of most extreme irrationality was the housing bubble of the past several years, not the crash of the past few months. “The crash is the healthy return to normalcy,” he says.

    Evolving a theory

    Although behavioral theory has succeeded at challenging classical theory—and has spawned several bestsellers, including this year's Nudge—some complain that it is only a series of critiques, not an alternative framework for understanding how markets work. “I would argue it takes a theory to beat a theory,” says Lo.

    Lo started his career developing quantitative models in line with classical finance theory, he says, but when he compared the models' predictions with stock-market data, “the models I developed ended up being rejected by the data pretty soundly.” So for the past several years, he has been working on a new theory that combines insights from evolutionary biology and neuroscience, called the Adaptive Market Hypothesis (AMH), in contrast to the Efficient Market Hypothesis. Instead of seeing market participants as computers, which can always calculate the best way to achieve their goals, the AMH sees them as “species,” which evolve strategies to compete for limited profits through trial and error. Overconfidence, he explains, is an adaptation. Overconfident investors who bet big and win get noticed; underconfident investors never even get in the game. He has found that quantitative models, based on the AMH's principles and equations from evolutionary biology, can replicate similar behavioral concepts.

    Unlike investors in classical finance theory, Lo's species behave differently based on what part of their brains they are using. When things go well and people make money, as they did for the past decade, the experience stimulates investors' reward circuitry. This causes them to seek more profits and ignore possible risk, leading, for example, to a bubble. When things take a turn for the worse, panic overrides rational decision-making, leading to a crash. Only when the market is steady does the rational brain take over. Lo is starting to use functional magnetic resonance imaging and other tools of neuroscience to quantify these behaviors and incorporate them into his models. He also needs more real-world data on the way different funds invest money—data that are now secret or that no one bothers to collect.

    Although Lo's idiosyncratic approach lies outside of the behavioral and classical theories, he says it reconciles them. “If you were an efficient-markets type, I think you'd be hard-pressed to explain what happened over the last few weeks. And if you were an irrational finance person, you'd be hard-pressed to explain what happened over the previous 10 years. So I think that the only way to reconcile the two is to acknowledge that both are different aspects of the exact same truth.”

    Keeping it real

    Others say that the very message of behavioral finance is that a comprehensive theory is unrealistic, not only in practice but also in principle. “Economists deeply admire the physical sciences and wish we could be like that,” says Shiller. But, he says, “there's a human element that can get lost” when researchers emphasize quantitative methods instead of, say, historical ones. “I don't want to slavishly copy what is superficially scientific.”

    Nonetheless, behavioral researchers are eager to prove that their ideas mirror nature by using quantitative methods to link them directly to real-life data, NYU's Wurgler says. Stock pricing lends itself to such studies, because valuing a stock involves conjecture—which is subject to psychological factors—and a lot of stock-market data have recently become available to academic researchers.

    In a 2007 paper in the Journal of Economic Perspectives, Wurgler and co-author Malcolm Baker, a financial economist at Harvard Business School, looked for signatures of investor sentiment—irrational optimism or pessimism—in stock-market data since the 1960s. They hypothesized that certain stocks would be more subject to sentiment than others: broadly speaking, stocks for which the true value is difficult to determine. For example, a young, promising company would fit the bill. “The combination of no earnings history and a highly uncertain future allows investors to defend valuations ranging from much too low to much too high,” they write.

    Comparing the stock-market data with their measure of investor sentiment, they found what they had expected. In optimistic times, difficult-to-value stocks were wildly popular and therefore made much more money than average. In pessimistic times, they were wildly unpopular and therefore made much less money than average. On the other hand, easy-to-value stocks, which are considered safer, were more popular in pessimistic times than optimistic ones, but their prices stayed much closer to average. This helps explain past bubbles in certain types of stocks—say, dot-com stocks in the 1990s—and is also useful for making predictions for the future, Wurgler says.


    Jeffrey Wurgler (left) and Andrew Lo (right) incorporate irrational behavior into models of financial markets.


    Mathematician Gunduz Caginalp of the University of Pittsburgh in Pennsylvania unifies theoretical modeling, laboratory experiments, and statistical studies of real-world market data. He and his colleagues, including Nobel Prize-winning economist Vernon Smith of George Mason University in Fairfax, Virginia, wanted to know how increased cash in the stock market would affect a bubble. In classical finance theory, more cash would make no difference, Caginalp says. “The fact that you have extra money, why should that push up prices?” he asks. “But that's exactly what we saw.” In a 2001 paper in the Journal of Psychology and Financial Markets, Caginalp and his colleagues had volunteers trade for real money in an experimental market. They found that adding $1 per share to the market pushed up the price of shares by about $1. The result fit their predictions in an earlier theoretical modeling paper. Later, they also saw the effect in an analysis of New York Stock Exchange data. Caginalp draws the connection with the recent housing bubble: Loosened mortgage lending policies poured money into the housing market. “There's really no place for that money to go except to fuel various bubbles,” he says.

    Using behavioral finance, researchers will get better at explaining specific phenomena such as bubbles, says Wurgler. But, he argues, human psychology is so complex that a unifying theory is not within reach. “We'll putter along with an increasingly complex and diverse set of models,” he says.

    Putting it into practice

    New theoretical models from academics may not affect Wall Street quants all that much. Quant models can be loosely divided into two types, says Flesaker: “trying to forecast future states of the world and trying to forecast the behavior of a particular financial instrument given the state of the world.” The algorithms that help make decisions on when to buy and sell assets fall into the latter category, and they have held up reasonably well, he says. They work by exploiting tiny insights that others don't yet have, so anything published in the academic literature will be useless almost immediately.

    The quant models that failed catastrophically—ratings models, Value at Risk models—are all in the former category, says Flesaker. But even these can be “pretty useful, if local, approximations to a much more complex reality,” says Lo. A new theoretical framework could help quants better understand where their models are likely to be wrong. That could prevent managers from putting too much trust in their models, a factor in the failure of many companies. “The true skill of the practitioner is to know the ins and outs of the models,” says financial engineer Petter Kolm of NYU. Practical decisions “can't just be based on some numbers that a model spits out.”

    On the other hand, new models could be very useful on the regulatory side. Last month, Lo testified at a hearing on hedge funds held by the U.S. House Committee on Oversight and Government Reform. If markets are adaptive, he argued, then regulation should also be adaptive. For example, instead of making a rule about a particular new financial instrument, regulators could make a rule that applies to any new financial instrument that becomes “too big to fail.” In that way, innovation would not get ahead of regulation, as it did in the shadow banking system.

    Unfortunately, although new models—small or grand—may help experts understand bubbles and crashes better, behavioral finance predicts that they will never be able to prevent the behaviors that cause them. “Financial crises may be an unavoidable aspect of modern capitalism, a consequence of the interactions between hard-wired human behavior and the unfettered ability to innovate, compete, and evolve,” Lo told the committee. In other words, says Caginalp, “human nature being what it is,” when it comes to money, people will always be just a little bit crazy.


    FerryBoxes Begin to Make Waves

    1. Claire Ainsworth*
    1. Claire Ainsworth is a freelance writer in Southampton, U.K.

    European researchers have enlisted a fleet of ferries to inexpensively gather valuable data about the region's waters.

    Regular customer.

    With 32 daily crossings, this Dutch ferry makes regular, detailed measurements of a tidal inlet.


    White and navy livery glinting in the late afternoon sun, engines quietly thundering, the passenger ferry Pride of Bilbao steams purposefully into dock, on the final leg of a 2-day voyage that started on the northern coast of Spain. Her passage has taken her past pods of dolphins playing in the Bay of Biscay, across the ship-filled English Channel, and into Portsmouth Harbour on the south coast of Britain. Once moored, her ramp swings down to disgorge her cargo of cars, trucks, and chattering passengers. Dressed in fluorescent yellow jackets and ear defenders are marine scientists David Hydes and Mark Hartman, squeezing their way against the melee. They've come to get their data.

    The pair heads toward the hot, noisy engine room in the belly of Pride's 37,500-ton hull. There, tapped into a pipe that brings seawater into the ship to cool its refrigerator spaces and air conditioning is a metal box, about the size of a domestic washing machine. Inside is an array of sensors that sample the water as it flows through the box, sending measurements directly back to the team's Web site. But the satellite link can't handle all the data, which are collected once per second, so Hydes retrieves a memory card containing the rest of the readings. He and Hartman then carefully clean and check the device's individual sensors. It's a ritual the two, both based at the National Oceanography Centre, Southampton (NOCS), United Kingdom, perform once every 9 days, after the Pride of Bilbao has made three round trips to Spain.

    The machine they're servicing is a “FerryBox,” and it is an attempt to solve a problem plaguing oceanographers the world over: how to get enough data to understand our planet's oceans, without breaking the bank. FerryBoxes consist of suites of sensors that measure water variables such as temperature, salinity, carbon dioxide, and oxygen content, as well as biological features such as the amount of chlorophyll a, a molecule that betrays the activity of photosynthesizing plankton in the surface waters of the sea. More than 30 FerryBoxes now quietly chug between ports in Europe, with similar systems at work in the waters of countries including Japan, South Korea, Australia, and the United States. At a meeting in Southampton at the end of September on research uses of FerryBoxes, scientists also presented plans to establish a program in the waters around Africa.

    On board.

    FerryBox advocate David Hydes works with a ship-based sensor of CO2 in water.


    The idea of using a “ship of opportunity” (SOOP) for science has been around for centuries, of course. Marine researchers of yore would hand cruise liner captains a bucket and a thermometer before waving them on their way. (And some modern researchers are retroactively prying data out of ancient ships—see sidebar on page 1629.) Modern-day SOOPs are rather more sophisticated in the way they make measurements. In particular, says Hydes, vessels equipped with FerryBoxes or their relatives represent a new generation of SOOPs that has moved from making just basic hydrographical measurements to encompassing chemistry and biology.

    Ferries are cost-effective and reliable SOOPs, operating under a range of weather conditions that may restrict smaller vessels, says Hans Paerl of the University of North Carolina, Chapel Hill. Paerl heads a research team running the first ferry-box-type project in the United States, FerryMon, which began in 2000 in North Carolina's Pamlico Sound. What's more, the regular nature of ferry crossings lets sensors pick up on short-term phenomena that would otherwise be missed. FerryBox data can also be used to calibrate remote-sensing data. “The quality of the data is just as good [as], if not better than, existing water-monitoring programs,” says Paerl.

    By providing repeated observations of dynamic waters, these relatively low-cost sensor systems are tackling scientific puzzles such as the source of an odd freshwater layer in the English Channel and how currents flow in the Mediterranean Sea. FerryBoxes, say their fans, could also help address one of the biggest issues in oceanography: how the seas are doing at absorbing all the carbon dioxide humans have been pumping into the atmosphere, and how this may change in the future.

    Smooth waters

    Between the world's dedicated research vessels, oceanographic buoys, and remote-sensing satellites, scientists would seem to have the ocean well-covered. And they do—up to a point. But to really understand how Earth's oceans and marine ecosystems behave, researchers need widespread, detailed, regular monitoring sustained over the long term. If oceanographers were to use research vessels for all the operational monitoring that they want, however, the costs would be astronomical. Running just one U.K. research ship, for example, costs between £10,000 and £15,000 (U.S.$17,000 to $26,000) per day.

    That is why in the 1990s European researchers started eyeing the 800 or so ferries that ply the region's waters. The FerryBox in the bowels of the Pride of Bilbao, for example, cost about £35,000 to set up and has minimal ongoing costs, says Hydes. “It's getting the regular data you can't get in any other way,” he says.

    Indeed, one Dutch FerryBox-equipped ship operating in the Wadden Sea between Den Helder and the island of Texel makes two crossings per hour, 16 hours per day, 320 days per year. This has allowed Herman Ridderinkhof and his team at NIOZ Royal Netherlands Institute for Sea Research on Texel to build up a detailed picture of how sediments are transported in the Marsdiep tidal inlet, a key issue for the coastal ecosystem there. At the conference, Ridderinkhof showed a video made from FerryBox data that depicted how sand waves propagate across the estuary floor. “The longer you measure, the more interesting these kinds of things become,” he says.

    In the early 1990s, scientists at the Finnish Institute of Marine Research, in partnership with other institutes and companies in Finland and Estonia, pioneered the use of water-monitoring devices on ferries that sailed the Gulf of Finland and the Baltic Sea. Data from these ships were combined with measurements from other sources, such as satellite imaging and buoy recordings, to monitor the development of the harmful algal blooms that plague the area. The success of the project, called Alg@Line, prompted 10 other research groups to put together a proposal that won E.U. funding to develop the FerryBox technology and infrastructure. In that project, which ran from 2002 to 2005, eight ferry lines successfully collected data.

    Marine science.

    The routes of some of the data-gathering ferries, such as the Pride of Bilbao (above), that crisscross European waters.


    The 30 or so European lines now in operation are allowing oceanographers to study marine processes in a way that would not have been possible otherwise. For example, researchers had long noted that the western part of the English Channel often developed a puzzling freshwater layer during the summer, which varied in size from year to year. Conventional sampling methods failed to explain why, but the regular FerryBox measurements from the Pride of Bilbao revealed the appearance of fresh water off the coast of Ushant in western Brittany during March and April each year. French researchers, meanwhile, had reported that freshwater runoff from the Loire and Gironde areas appeared in the Bay of Biscay during winter. Boris Kelly-Gerreyn, a marine biogeochemist at NOCS, and his colleagues put those clues together and in 2006 concluded that Loire and Gironde freshwater was being blown all the way up into the Channel. The extent to which the fresh water spread into the channel depended on the state of the tides and the direction of the prevailing wind.

    Understanding the origin of the freshwater layer could be key to predicting algal blooms harmful to fisheries and tourism, because that water may spur algal growth. The team utilizing the Pride of Bilbao found that the largest algal bloom ever recorded in the channel, which happened in 2003, also coincided with the greatest freshwater intrusion detected by the FerryBox.

    Tackling CO2

    As scientists consider how they can expand the use of FerryBoxes, the Mediterranean Sea is an inviting target. The Mediterranean is surprisingly underexplored, Isabelle Taupier-Letage, an oceanographer based in Toulon at the French Research Institute for Exploitation of the Sea, said at the Southampton conference. She proposed that a network of FerryBox-like lines could help fill in data gaps, such as salinity measurements, and also resolve confusion about the prevailing path of the surface current in the southern part of the eastern basin. After the water from the Atlantic enters at Gibraltar, it forms currents that tend to flow eastward along the southern coastlines. In both basins, these currents generate large clockwise eddies (∼100-km diameter), which propagate across each basin, transiently altering the direction of water flow. “If you don't sample these eddies correctly, you get the wrong picture,” says Taupier-Letage. “You have to monitor at the basin scale.”

    Another potential target is the U.S. coast. There are currently only three ferry-box projects operating in the United States, although two more are in the works. In general, it's getting easier for scientists to set up ferry-based sensors. Manufacturers are now making off-the-shelf systems to save researchers the time and trouble of building their own. “We have a firm belief that the market is developing,” says Justin Dunning, sales manager at Chelsea Technologies, a U.K. marine sensors company that sponsored the Southampton conference.

    The technology is also improving, Dunning says. One goal is to reduce fouling—the growth of marine organisms such as algae on equipment—and the need for regular servicing by a trained technician. “You are never going to have a completely standalone vessel that you can just leave for a year,” says Dunning. But the technology to increase the length of time between servicing and data-collection visits, including some self-cleaning ability, is developing, he says.

    FerryBoxes are also tackling new problems. For example, they are starting to play an important role in monitoring the ability of the world's oceans to absorb the carbon dioxide emitted as a result of human activities. The oceans are an important “sink” for the greenhouse gas (they have managed to absorb about half of all our CO2 emissions so far), but no one can be sure whether this check against global warming will continue. And even if it does, will the oceans' rising acidity increasingly damage marine ecosystems? “We need in situ data like never before,” says Nick Hardman-Mountford, acting director for the Centre for observation of Air-Sea Interactions & fluXes at the Plymouth Marine Laboratory in the U.K. Consequently, he and his colleagues, in collaboration with the engineering company Dartcom, have developed a system that senses the amount of carbon dioxide in surface waters, a measurement known as the partial pressure of CO2 (pCO2), for FerryBox use.
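
    The article does not spell out how a pCO2 reading relates to the carbon actually held in the water, so here is the standard bookkeeping, with notation that is ours rather than Dartcom's or NOCS's. The concentration of dissolved CO2 follows the solubility relation

    \[ [\mathrm{CO_2}]_{aq} = K_0(T, S)\; p\mathrm{CO_2}, \]

    where K_0 is the solubility of the gas in seawater, which depends on temperature T and salinity S. The net air-sea exchange is then commonly estimated as

    \[ F = k\, K_0\left( p\mathrm{CO_2}^{\mathrm{sea}} - p\mathrm{CO_2}^{\mathrm{air}} \right), \]

    where k is a gas transfer velocity that rises with wind speed. A ferry line measuring pCO2 along the same track week after week supplies the seawater term of that difference at a spatial and temporal resolution that occasional research cruises cannot match.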

    World traveler.

    The Pacific Celebes circles the globe collecting CO2 data.


    The overall sparseness of pCO2 data is a huge stumbling block, notes Ute Schuster, an oceanographer at the University of East Anglia in Norwich, U.K. “Up until now, the E.U. has funded [pCO2 studies] in short-term projects, but much work needs to be done to persuade both the E.U. and national funding bodies that long-term sustained monitoring of pCO2 is vital and can be done very cost effectively working with commercial ships,” she says.

    Ferry and merchant shipping companies seem happy to help. After reading about successful pCO2 measurements on research vessels in a NOCS newsletter, an executive from a subsidiary of Swire Shipping, a merchant shipping company that operates mainly in Asia-Pacific, contacted Hydes to ask whether similar measurements could be made from its ships. The Swire Group Charitable Trust then funded Hydes to install a sensor system on the company's Pacific Celebes, a ship that has since circumnavigated the globe, collecting data on pCO2 and other variables. The company has pledged more funding to continue the work. It has even provided money for a Ph.D. studentship linking the work at NOCS with marine scientists at Xiamen University in China.

    FerryBox scientists would love to see the day when their sensors are routinely built into new ships. Some are even more ambitious. Thanks to good relations with the TESO ferry company running the Marsdiep route, Ridderinkhof managed to persuade executives to design their latest ship with a dedicated lab and a moon pool, an opening in the hull from which sensors can be lowered into the water. “We asked them if they wanted to make a hole in their brand-new ferry and they said, 'Yes,'” he quips. Hydes would like to see more of this sort of partnership. “There are a lot of ships out there,” he says. “We want to spread the message.”


    Logbooks Record Weather's History

    1. Claire Ainsworth

    Researchers are using old maritime logbooks to reconstruct a record of the weather stretching back to the 17th century.

    “What better data could a patient meteorological philosopher desire?” So wrote Francis Beaufort, inventor of the famous wind force scale that bears his name, in 1809, referring to the British Royal Navy's logbooks, in which captains made detailed daily recordings of sea and weather conditions as they plied the waters of the globe.

    Weather report.

    HMS Experiment's logbook.


    Dennis Wheeler, a geographer at the University of Sunderland in the U.K., is following Beaufort's example: He is using old maritime logbooks to reconstruct a record of the weather stretching back to the 17th century. Wheeler got interested in the scientific potential of these historical logbooks while researching the weather conditions of great maritime battles. “It just occurred to me that these logbooks had a great deal of weather data in them,” he says.

    In 2000, Wheeler teamed up with colleagues in England, Spain, and the Netherlands to produce the world's first daily climatological record for the period 1750 to 1850 by studying logbooks from different European countries. Given that traditional meteorological records date back only 150 years, says Wheeler, the hope is that the logbook work will bring improved understanding of the current climate and how to model its future trends.

    Captains of naval and merchant ships used to record observations, including wind force and direction and general weather, as many as three times a day. Because navy ships sailed in fleets, Wheeler and his colleagues confirmed the consistency and reliability of such data by comparing logbooks of ships sailing together on the same day. They've compiled the information into a repository called the Climatological Database for the World's Oceans (CLIWOC).


    In 2006, Wheeler and his team reported that the Royal Navy logbook data showed that during the 1680s and 1690s, the waters off Britain experienced much stormier summers than normal. Whereas such a trend today might be attributed to global warming, Europe was experiencing a “Little Ice Age” at the time. Such findings illustrate how a long-term historical record could help distinguish between natural variability and human-caused climate change, says Wheeler.

    The logbook data are a useful addition to historical climate records, agrees Rob Allan, a climate scientist at the Met Office Hadley Centre in Exeter, U.K. Allan heads an international project, known as Atmospheric Circulation Reconstructions over the Earth, that aims to build up a long historical data set of global marine and terrestrial instrumental weather observations. These observations can help computer models produce “hindcasts” of past weather, both to gain a better understanding of how the weather affects the environment, ecosystems, and society, and to test the accuracy of the models themselves. “We can check whether computer models and our reconstructions of past weather agree on what has happened and thus improve our confidence in computer-model simulations of future climate,” says Allan.

    CLIWOC, completed in 2003, contains 273,269 observations gleaned from 1,624 logbooks. Wheeler and his colleagues in Europe and the United States are now working on an international project to digitize selected subgroups from the tens of thousands of logbooks that remain to be studied, including those of the English East India Company and the U.S. Navy. Given that there are more than 120,000 pre-instrumental logbooks out there, the task is likely to make patient meteorological philosophers of them all. “There's still an awful lot to be done,” says Wheeler.


    When Juniper and Woody Plants Invade, Water May Retreat

    1. Michael Tennesen*
    1. Michael Tennesen, a writer in Lomita Pines, California, was a media fellow at the Nicholas School of the Environment and Earth Sciences at Duke University.

    Dense plants are taking over grasslands in many areas; researchers in the U.S. Southwest are studying how they tap into water supplies, and how to keep them in check.


    Deep thirst.

    Junipers, the conical trees concentrated in the center of this landscape, are proliferating in grasslands.


    AUSTIN—The day is hot and humid, as Robert Jackson descends into the mouth of Powell Cave and down a ladder that drops into the limestone bedrock of the Edwards Plateau about 240 kilometers west of Austin. He follows an ancient streambed through several large caverns and a few tight crawl spaces, until he arrives at a point about 18 meters below the surface, where a crystal-clear stream bubbles out of the rock.

    Several thick tree roots burst out from the limestone, reaching down to draw water from the stream. Some of them have been wired with electronic probes. By applying pulses of heat, Jackson and his lab can gauge water flow from how fast that heat dissipates. Jackson, a biologist at Duke University in Durham, North Carolina, is investigating how roots transfer water from this depth up to the surface. “A single taproot can provide a third or more of the tree's water during a drought,” says Jackson.
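
    The article does not say which heat-pulse technique the Jackson lab uses, so take the following only as an illustrative sketch of how such probes work. In the widely used heat ratio method, two temperature sensors are placed equal distances upstream and downstream of a small heater in the root; the velocity of the heat pulse, carried along by the moving sap, is

    \[ V_h = \frac{\kappa}{x} \ln\!\left( \frac{\Delta T_{\mathrm{down}}}{\Delta T_{\mathrm{up}}} \right), \]

    where κ is the thermal diffusivity of the wet wood, x is the spacing between the heater and each sensor, and the logarithm compares the temperature rises recorded at the two sensors. When no water is moving, the two rises are equal and the formula gives zero flow; the faster the sap moves, the more the heat is swept toward the downstream sensor.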

    His study aims to determine what enables junipers to survive in these arid environments and how much groundwater they are using. Funded by the U.S. National Science Foundation, the U.S. Department of Agriculture, and the Andrew W. Mellon Foundation, this research indicates that wildlife and water management would benefit from fewer of these trees.

    Woody shrubs and trees like juniper have in recent decades replaced arid and semiarid grasslands and savannas throughout much of the western United States, from the Great Plains to the Gulf Coast. “Encroachment of woody plants limits the amount of forage available to livestock, alters the natural landscape for native wildlife, impedes the flow of water available at the surface, and creates conditions for more catastrophic fires,” Jackson says.

    With global warming predictions calling for increased droughts, expansion could continue. And it's not just a U.S. problem: The shrubs and trees are proliferating in grasslands and savannas in South America, Africa, and Australia.

    Depleted streams

    The effect of expanding woodlands on water flow can be dramatic. Kathleen Farley of San Diego State University in California and Jackson reviewed data sets from efforts to establish new forests in Africa, New Zealand, Australia, and Europe and found that increased tree and shrub growth typically resulted in the loss of one-third to three-quarters of stream flow. “In areas where natural runoff is less than 10% of mean annual precipitation, afforestation can result in complete loss of runoff,” says Farley. As a consequence, some say, afforestation efforts that have been proposed as carbon offsets need close scrutiny.

    The same types of growth appear to be draining water from the Swiss-cheese structure of the limestone bedrock, known as karst, on the Edwards Plateau. Such systems cover one-fifth of Texas and 7% to 10% of the globe's land surface. This area in Texas gets about 63.5 centimeters of rain a year, yet there are few aboveground streams. Jackson's research shows that the drinking habits of these trees may be partly responsible.

    One hundred and fifty years ago, this area consisted of grasslands and savanna in which oak trees dotted the landscape. Wildfires swept through more frequently, one of several factors that kept woody plants at bay. “There is likely no single driver” of the change to denser growth, says Steve Archer, a professor at the School of Natural Resources at the University of Arizona, Tucson. Fire suppression, overgrazing, droughts, and climate change have all played a part in helping woody plants succeed, he says.

    Juniper is the dominant woody invader throughout much of the Great Plains and the West, but mesquite plays a role in the southern Great Plains and the Southwest, creosote in the Southwest, and Chinese tallow in the Gulf Coast Prairies. Juniper and mesquite have invaded the plateau, forming dense monocultures in which birds and wildlife don't do well. Black-capped vireos and golden-cheeked warblers, two Texas birds that are both endangered, are among those that require a mixed-forest habitat to thrive.

    Large mammals are affected as well. The bighorn sheep of New Mexico, already endangered, are increasingly threatened by attack from mountain lions, which are moving into wooded mountaintops that were bald in the past. According to Eric Rominger, a bighorn sheep biologist at the New Mexico Department of Game and Fish in Santa Fe, “Mountain lions are ambush predators, and they need cover to launch their attacks.” The emboldened predators are also taking a toll on ranchers' livestock, mainly young cattle.


    One goal of Jackson's study is to learn how these trees survive drought. Finding water is only half the battle. All woody plants have to lift that water aboveground, which they do by creating an evaporative flow. The leaves are speckled with pores that open to absorb carbon dioxide (CO2). The leaves of a juniper are needlelike, or awl-shaped. When the pores open, they release moisture, setting up a powerful vacuum through short, interconnected passages called tracheids that draw moisture from the tree's root system in the same way one might pull soda through a straw. “When a tree doesn't have enough water to conduct this process, it doesn't die of thirst but of starvation”—because the intake of CO2 drops, and “it can't fix carbon to make food,” says Jackson.
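
    To give a rough sense of the pull involved (a back-of-the-envelope figure of ours, not a number from Jackson's study): simply lifting water against gravity from the stream 18 meters below the surface requires a pressure difference of

    \[ \Delta P = \rho g h \approx 1000 \times 9.8 \times 18 \;\mathrm{Pa} \approx 0.18\;\mathrm{MPa}, \]

    or nearly 1.8 atmospheres, before accounting for friction in the tracheids or the far larger tensions needed to pull water out of drying soil. The evaporative suction generated at the leaves must cover all of it.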

    Tim Bleby, a research associate at the University of Western Australia in Perth, who worked with Jackson, studied root water uptake in juniper and found that water is shunted from one part of the tree to another to even out its distribution. This gives such plants, which have multilevel root systems, a distinct advantage over shallow-rooted grasses. “When the surface is dry, they use deep roots; when it rains, they switch over to shallow roots,” says Bleby.

    The water-uptake mechanisms of juniper aren't fundamentally different from those of other woody plants, just more efficient. Cynthia Wilson, a former Ph.D. student in Jackson's lab, sampled junipers all over the West and analyzed their anatomy, wood density, hydraulic conductivity, and resistance to cavitation, the formation of bubbles that interrupt the flow of water through the tracheids. Cavitation resistance is a marker for drought resistance. Wilson took branches from different junipers and spun them in a centrifuge, mimicking conditions under which the branches would cavitate and no longer transport water. According to Wilson, “Juniper are among the most cavitation-resistant tree species ever studied.”

    These attributes, coupled with fire suppression and overgrazing, have given woody plants like juniper an advantage in the West. Archer refers to the result as “thicketization.” Says Archer: “When we introduced grazing in the West in the late 1800s, fine-grass fuels were removed, and that greatly reduced fire frequency. Woody plants then had wider windows of opportunity to establish and become large enough to resist the effects of future fires.”

    Pushing back

    Ranchers and land managers have been fighting woody encroachment for years using everything from chemicals to goats. The straightforward approach—deploying chain saws and bulldozers—can be effective.

    For example, New Mexico has hired crews to cut piñon pine and juniper on bighorn sheep habitat, part of a management process, including relocation and predator control, that may soon see the species downlisted. But Archer warns against trying to stop the woody invasion with a “wall-to-wall clearing of brush.” It's critical, he says, to take account of plants' “historical distribution and abundance.”

    Tapping in.

    Biologist Robert Jackson wires up tree roots in Texas's Edwards Plateau to monitor their water uptake.


    Some groups have developed clever ways to use brush control to restore historical grasslands. Jackson cites the example of J. David Bamberger, a co-founder of the Church's Fried Chicken franchise. Bamberger has been clearing juniper thickets on his land since 1969 and replanting the cleared ground with grass. According to Colleen Gardner, executive director of the Bamberger Ranch Preserve in Johnson City, Texas, the ranch has established a native mix of grasses that includes 80 species and has restored 27 ponds and lakes.

    Biologist Marsha May of Texas Parks and Wildlife conducts tri-annual Audubon bird counts on the ranch and notes that the counts have grown from 48 to 205 species. She and other state biologists have located breeding pairs of black-capped vireos and golden-cheeked warblers on the property.

    Texas A&M University scientists conducted research on shrub control and water yield that addresses different vegetation types and geographic zones on the Edwards Plateau. The Edwards Aquifer Authority, in conjunction with the Natural Resources Conservation Service, is using the study as a basis to provide cash assistance to ranchers and land managers to help clear mesquite and juniper thickets from their lands. Ranchers can get as much as 70% of their expenses back. Texas Parks and Wildlife has a program to assist landowners with what they call “brush sculpting,” a selective clearing method intended to return a property to something like its historical condition.

    If successful, the effort should help increase water flow as far away as San Antonio and Austin. Says Jackson: “It's not the typical case where you're faced with environmental and economic tradeoffs. Brush clearing here has advantages for water availability, forage production, and native wildlife. We are reaping multiple benefits.”
