News this Week

Science  27 Apr 2007:
Vol. 316, Issue 5824, pp. 526
  1. STEM CELLS

    Stem Cell President Quits After Acrimonious Meeting

    1. Constance Holden
    Burned out.

    Zach Hall, facing surgery, is leaving CIRM this month.

    CREDIT: HOWARD LIPIN/ZUMA PRESS/NEWSCOM

    Zach Hall was so rattled by a recent meeting at the California Institute for Regenerative Medicine (CIRM)—the $3 billion stem cell institute set up by statewide referendum in 2004—that he decided to quit as president earlier than he had planned. Hall cited the “contentious” nature of the meeting as well as his disappointment over likely delays in disbursing money for construction of new research facilities that scientists say are crucially needed.

    CIRM has scheduled a special teleconference meeting of its board for 2 May to respond to Hall's 30 April departure and the conflict over how to proceed with a $222 million construction program.

    Both issues arose from a 13 April meeting of CIRM's facilities working group, at which patient advocates balked at the idea of having a Request for Applications (RFA) ready by June for the so-called large facilities grant program. The members of the group wanted more time to consult experts on technical issues and sound out the public on what and where facilities are needed.

    Neuroscientist Hall, CIRM's founding president, had earlier intended, despite prostate surgery planned for May, to stay through the 5 June meeting of the Independent Citizens' Oversight Committee (ICOC). But, he wrote the board, “the exceedingly contentious and occasionally personal tone of the … meeting suggests that it is in both my best interest and that of the Institute for me to step down at this time.”

    The state's universities see construction of new research facilities as an essential part of the grand plan for CIRM. At a 10 April ICOC meeting, members representing research institutions expressed the need to move speedily. ICOC Chair Robert Klein observed that costs are rising, and at a 10% inflation rate, a 1-year delay would cost $60 million. The panel decided in a straw vote not to lose more time by conducting a “survey of institutional plans” to gain more information on which to base the RFA. Hall confidently predicted that the RFA—covering $150 million for a handful of big construction grants and $72 million for smaller grants of $5 million to $10 million each—would be ready by July at the latest.

    Hall was taken aback by the very different reception he got at the facilities group meeting 3 days later. That group is made up of disease advocates who are also members of ICOC, as well as California real estate specialists; for conflict-of-interest reasons, it contains no researchers or university officials.

    Arguing that they were ill-prepared to gauge the need for facilities in the state, members of the working group lobbied for more time for assessment. “If we don't, we're going to be in a situation where we're backing the Brinks truck up to a couple of really well-established institutions that have access to a ton of wealth,” warned AIDS patient advocate Jeff Sheehy. Diabetes patient advocate Marcy Feit said the public has to be consulted: “I don't care if we have to meet with a hundred people or a million people. … That's our responsibility.”

    Judging by the meeting transcript, the atmosphere got a bit tense. Hall seemed perplexed, saying that he faced a “dilemma” because “there is a real split between what this … working group is saying, and what was said at the ICOC meeting by … those representing the scientific community.” The facilities group ended up voting unanimously for public hearings. There is a “cultural difference” between the disease advocates and scientists “who understand the urgency” of the program, Hall concluded.

    Such a difference was evident in comments by Joan Samuelson, who represents the Parkinson's Action Network. “I've been hearing from lots of people [who say], 'Don't throw a lot of money at facilities,'” said Samuelson. She added that it's private companies, not universities, that come up with cures. Sheehy later told Science, “I'm stunned. … I feel betrayed” by Hall's attempt to dismiss the arguments of the disease advocates. At this point, he says, the working group has “no evidence basis from which to proceed.”

    CIRM's board faces a full agenda at next week's meeting: whether to go ahead with hearings on the facilities program, and whom to appoint as interim head of CIRM. Also needed is a new head for the facilities committee, whose chair, California developer Albert “Rusty” Doms, resigned abruptly without explanation after the 13 April meeting.

    But there's light at the end of the tunnel. The presidential search is moving ahead apace. The search committee will be interviewing a half-dozen top contenders in May, with final candidates to be considered at the June ICOC meeting. CIRM also faces its final hurdle in the lawsuits that have stymied its efforts to raise money. The California Supreme Court is expected shortly to turn down a final appeal from groups that have been trying to get CIRM declared unconstitutional, in which case money from bond sales may start rolling in as early as this summer.

  2. GEOCHEMISTRY

    Humongous Eruptions Linked to Dramatic Environmental Changes

    1. Richard A. Kerr
    Pipsqueak.

    Large igneous provinces can boast a million times the lava produced by Iceland's 1783 Laki eruption (buses, left, for scale).

    CREDIT: ALAN ROBOCK

    Researchers looking for the cause of big, catastrophic changes on planet Earth have fingered a new one: so-called flood basalt eruptions, also known as large igneous province (LIP) eruptions. These are no Mount St. Helenses or even Krakataus, which cooled the planet a degree or so and painted pretty sunsets for a couple of years. No, a single LIP eruption can spew 100 times the magma of anything seen in historical times. The 1000 such eruptions that can follow the first could build a lava pile of millions of cubic kilometers. Such massive volcanic activity seems to have dramatically altered the atmosphere and oceans for hundreds of thousands of years 94 million years ago and again 56 million years ago, according to two new studies.

    The newly strengthened link between megaeruptions and major environmental events comes in studies that draw on a single geologic record containing two signatures: that of a LIP eruption and another of a geologically abrupt environmental change. On page 587, geochronologist Michael Storey of Roskilde University in Denmark and colleagues use precise rock dating to tie the outpourings of a LIP—whose remains now span the North Atlantic from Greenland to Great Britain—to the sudden 5°C warming 56 million years ago known as the Paleocene-Eocene thermal maximum, or PETM (Science, 19 November 1999, p. 1465).

    Scientists have long thought that the gigaton burst of greenhouse gas—carbon dioxide or methane—that marked the beginning of the PETM must be linked to the 5 million to 10 million cubic kilometers of erupted North Atlantic magma, if only because they happened at about the same time. But having to date the two events in different records using different techniques made the case less than convincing. So Storey and his colleagues dated more rocks from the LIP using the argon-argon technique based on the radioactive decay of potassium-40. Combined with previously published data, the dating places one of the largest surges of magma of the past quarter-billion years at 56.1 ± 0.5 million years ago.
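
    In rough terms, the technique turns a measured argon isotope ratio into an age through the standard radiometric decay relation. As a sketch of the arithmetic (J is an irradiation parameter calibrated against a mineral standard of known age; the decay constant is the conventional value for potassium-40):

    \[ t \;=\; \frac{1}{\lambda}\,\ln\!\left(1 + J\,\frac{^{40}\mathrm{Ar}^{*}}{^{39}\mathrm{Ar}_{K}}\right), \qquad \lambda \approx 5.54 \times 10^{-10}\ \mathrm{yr}^{-1} \]

    Here ⁴⁰Ar* is radiogenic argon and ³⁹Ar_K is argon produced from the sample's potassium during neutron irradiation, so the measured ratio stands in for the daughter-to-parent ratio.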

    The team also applied argon-argon dating to volcanic ash buried in marine sediments southwest of Great Britain that also contain a record of the PETM. That ash layer had been linked to a LIP ash deposit in East Greenland with a similar age, but the researchers beat down the uncertainty by making a total of 50 age measurements on the two ashes; pooling measurements shrinks the standard error of the mean age roughly as the square root of their number. Using additional published dating of the sediment between the ash layer and the start of the PETM, Storey and his colleagues put the beginning of the PETM at 55.6 million years ago.

    The new dating thus places the most dramatic warming of its kind just within the uncertainty of the beginning of one of the largest volcanic outpourings ever (56.1 − 0.5 = 55.6 million years). “I think that the dating is quite good,” says Paul Renne of the Berkeley Geochronology Center in California. It “certainly provides strong linkage between the PETM and the [LIP].”

    Another study has strengthened the linkage between massive volcanism in the Caribbean and an abrupt transformation of the oceans 94 million years ago, known as oceanic anoxic event 2 (OAE2). OAEs were a half-dozen episodes in the warm mid-Cretaceous period 120 million to 80 million years ago when ocean sediments accumulated with so much organic matter that the sediments turned black. Something shifted ocean conditions to produce these “black shale” sediments, perhaps by eliminating oxygen from the deep sea. The leading candidate for a trigger is large volcanic eruptions.

    OAE2, the archetypal OAE event, had been linked to the massive Caribbean LIP through dating, but geochemist Junichiro Kuroda of the Institute for Research on Earth Evolution in Yokosuka, Japan, and colleagues took a different approach. They harked back to the search in the 1980s for markers of a large impact buried along with the remains of dinosaurs and other life snuffed out 65 million years ago. Instead of the element iridium brought in by an impacting asteroid, they looked at sedimentary lead, a potential marker of a rock's source. They traced lead's isotopic composition across the onset of OAE2 at an outcrop in Italy.

    In a few centimeters of sediment leading up to the start of OAE2 and beyond, the relative proportion of lead-208 dropped precipitously, they found. “The way it moves is difficult to explain without a volcano” contributing its distinctive mix of lead isotopes, says geochemist Catherine Chauvel of the University of Grenoble, France. In addition, the new lead-isotope composition bears a particular resemblance to that of the Caribbean LIP.
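
    To see why a drifting isotope ratio points to a volcanic contribution, consider a bare-bones two-endmember mixing calculation. The Python sketch below uses invented ratios purely for illustration; real studies weight the mixture by lead concentration and track several isotope pairs at once.

    ```python
    # Illustrative two-component lead-isotope mixing. All numbers are
    # hypothetical placeholders, not values from the Kuroda study: the
    # point is only that a modest admixture of volcanic lead with a
    # distinctive signature visibly shifts a sediment's measured ratio.

    def mixed_ratio(r_sediment, r_volcanic, f_volcanic):
        """Linear mixing of an isotope ratio, assuming both endmembers
        carry similar concentrations of the reference isotope (a common
        simplification; real mixing weights by lead content)."""
        return (1 - f_volcanic) * r_sediment + f_volcanic * r_volcanic

    r_background = 2.08   # hypothetical 208Pb/206Pb of ordinary sediment
    r_lip        = 1.95   # hypothetical 208Pb/206Pb of LIP-derived lead

    for f in (0.0, 0.1, 0.3, 0.5):
        print(f"volcanic fraction {f:.0%}: 208Pb/206Pb = "
              f"{mixed_ratio(r_background, r_lip, f):.3f}")
    ```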

    So, rare and extraordinary volcanic eruptions coincide in time with rare and exceptional environmental changes, strongly linking eruptive cause to environmental effect. However, that link isn't yet clarifying just how LIPs wreak their havoc. For that, researchers will need more timings on more of the cascading effects of humongous eruptions.

  3. LUNAR SCIENCE

    Congress Restores Funds for NASA Robotic Landers

    1. Andrew Lawler

    Angry U.S. lawmakers have come to the rescue of NASA's robotic lunar lander program. NASA chief Michael Griffin had pledged to shut down the program to save money, but after strong pressure from both House and Senate members, the space agency has granted it a reprieve. The reversal, although welcomed by lunar researchers, puts more pressure on Griffin to pare other missions or win additional funding from Congress.

    In a 10 April letter, the chairs of NASA's two spending panels, Senator Barbara Mikulski (D-MD) and Representative Alan Mollohan (D-WV), ordered Griffin to restore $20 million to operate the lunar robotics office based at Marshall Space Flight Center in Huntsville, Alabama. The letter is a response to the agency's 2007 operating plan detailing how it intends to spend its $16.2 billion budget, approved in February; the plan must pass muster with Congress. As late as 12 April, Griffin was insisting that there is no need for robots beyond the Lunar Reconnaissance Orbiter planned for launch next year. But on 19 April, a NASA spokesperson said that “right now there are no plans to close” the lunar robotics office.

    Moon reversal.

    NASA's Michael Griffin won't close the lunar robotics office.

    CREDIT: 23RD NATIONAL SPACE SYMPOSIUM

    The about-face has more to do with jobs than lunar data. Faced with a $700 million shortfall in NASA's exploration program, Griffin decided this winter that the landers—the details of which have not yet been defined—were a luxury he could not afford (Science, 16 March, p. 1482). That decision upset Alabama Republican Senator Richard Shelby, who spearheaded the effort to keep open the Marshall office, with its 32 employees. In an 18 April speech, according to The Huntsville Times, the senator noted that he was “counting the days—1 year and eight-and-a-half months—[until] we have a new [NASA] administrator.” Two days earlier, Griffin had reminded an Alabama delegation visiting Washington about Marshall's central role in the human exploration effort, which aims to return astronauts to the moon by 2020.

    NASA's operating plan for the fiscal year that ends on 30 September also reflects the rising costs of several science missions. NASA will spend $63 million more in 2007 than it initially planned to keep the launch date for its Mars Science Laboratory from slipping beyond 2009. It will add $17 million to ensure a November launch of the Gamma Ray Observatory and $37 million above what it had anticipated so that the Kepler mission to find extrasolar planets can take off by the end of 2008.

    Those increased costs, combined with completing the space station and building a new launcher, are forcing NASA to find ways to save money. Although the proposed elimination of the lunar robotics program didn't fly with key legislators, NASA's larger budget problems aren't going away. Last week, several Democratic lawmakers urged the White House to meet with congressional leaders to find a way out of the morass. But so far, that call for a space summit has elicited no response.

  4. EXOPLANETS

    Habitable, But Not Much Like Home

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Amersfoort, the Netherlands.

    For the first time, astronomers have found an Earth-like planet that could be habitable. Like an oasis in space, the rocky world, possibly covered with oceans, orbits a puny red dwarf star just over 20 light-years away in the constellation Libra. “On the treasure map of the universe, one would be tempted to mark this planet with an X,” says team member Xavier Delfosse of Grenoble University in France.

    Most of the 200-plus exoplanets found to date are massive balls of gas similar to Jupiter. Only two have been found weighing less than eight times the mass of Earth. One of these is too cold, the other too hot for liquid water to exist on its surface. But the new planet, found by Stéphane Udry of Geneva Observatory in Switzerland and his colleagues, orbits right in the habitable zone of its mother star, Gliese 581, where temperatures are between 0° and 40°C.

    Because Gliese 581 is a cool red dwarf, its habitable zone is close-in: The planet is a mere 10.7 million kilometers from the star—one-fourteenth the distance of Earth from the sun—and completes an orbit every 13 days. Two years ago, the team found a more massive planet in an even closer orbit around the same star. And in the new data, taken by the European Southern Observatory's 3.6-meter telescope at La Silla in Chile, they also uncovered a third planet in a wider, 84-day orbit. The results have been submitted to Astronomy & Astrophysics.

    Tiny periodic wobbles of the star indicate that the mass of the new planet could be as small as five times that of Earth, strongly suggesting a ball of rock, not gas. Udry concedes that the true mass might be larger, depending on the angle between the orbit and our line of sight. But, he says, the mass cannot be much larger or the planetary system would be unstable.
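
    The quoted orbital distance follows directly from Kepler's third law once a stellar mass is assumed. The Python sketch below adopts roughly 0.31 solar masses for Gliese 581, a typical literature value for this red dwarf; treat that as an input assumption rather than a result of the new study.

    ```python
    # Back-of-the-envelope check of the reported orbit from Kepler's
    # third law. The stellar mass is an assumed input (~0.31 solar
    # masses, a commonly cited figure for Gliese 581).
    import math

    G      = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    M_SUN  = 1.989e30       # solar mass, kg
    m_star = 0.31 * M_SUN   # assumed mass of Gliese 581
    period = 13 * 86400.0   # 13-day orbit, in seconds

    # Kepler's third law: a^3 = G * M * P^2 / (4 * pi^2)
    a = (G * m_star * period**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
    print(f"semimajor axis ≈ {a / 1e9:.1f} million km")
    # ≈ 11 million km, consistent with the reported 10.7 million km
    # given the uncertainty in the stellar mass.
    ```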

    The new discovery is “wonderful news,” says Geoffrey Marcy of the University of California, Berkeley, whose team has found more than half of all exoplanets so far. But planet hunter William Cochran of the University of Texas, Austin, says, “It remains to be seen how habitable this planet actually is.” Cochran points out that the planet may always keep one face toward its mother star. Moreover, some theorists think that because of the way they form, planets close to red dwarfs may accumulate little water.

    Although it could in principle harbor liquid water, anyone visiting this cosmic oasis would find it very different from Earth. Says Udry: “The Holy Grail would of course be a planet with the mass of the Earth, orbiting a star like the sun, in a 365-day orbit. But we have to go step by step.”

  5. BIODEFENSE

    Proposed Biosecurity Review Plan Endorses Self-Regulation

    1. Jocelyn Kaiser
    Screen test.

    Proposed guidelines would have investigators decide whether their research could be useful to bioterrorists.

    CREDIT: JAMES GATHANY/CDC

    A federal advisory group has come up with a long-awaited blueprint for how the U.S. government should oversee biological research known as “dual use,” or experiments that could potentially be used by bioterrorists to cause harm. The voluntary plan would let scientists themselves decide whether their project raises concerns, which would then trigger a higher-level review—a process some critics think is woefully inadequate.

    Many microbiologists like the idea of self-regulation. But even supporters are frustrated by the lack of details provided by the 25-member National Science Advisory Board for Biosecurity (NSABB) after 2 years of work. Meanwhile, a few universities have begun reviewing all genetic engineering experiments for dual use, an approach that some say is inevitable.

    The report follows the explosion of federal biodefense research in response to the 2001 anthrax attacks. A 2004 National Research Council (NRC) report warned that stringent regulations could impede legitimate research and called for a self-governing system of oversight. That panel described seven types of “experiments of concern” that would automatically be reviewed, such as enhancing the virulence of a pathogen, but left it to a new federal advisory committee—NSABB—to develop guidance.

    NSABB, chaired by microbiologist Dennis Kasper of Harvard Medical School in Boston, has now done that. In a 50-page draft report released last week, it says scientists should report annually whether their research is potentially “dual use of concern,” perhaps starting with a check box on their grant proposal. A committee, perhaps an expanded version of the institutional biosafety committees (IBCs) that now oversee genetic engineering experiments, would then review the flagged projects.

    Although the microbiology community is generally pleased with the plan, it is not entirely clear how it might work. Ronald Atlas of the University of Louisville in Kentucky says it is “somewhat schizophrenic” that the report calls for a voluntary system yet suggests that funding agencies make compliance a condition of funding. Nor does the report tackle synthetic biology directly, he says, leaving it unclear whether an experiment 5 years ago that built a poliovirus from scratch would even be covered. Also left undecided is whether the rules should cover fields outside the life sciences, such as chemical engineering.

    Richard Ebright of Rutgers University in Piscataway, New Jersey, is much harsher. He lambastes the committee's recommendation that even if an experiment fits into one of the NRC report's seven categories, an investigator could decide that the work isn't risky enough to be “of concern.” These subjective criteria “preclude meaningful oversight,” Ebright says. Ebright and others, such as Alan Pearson of the Center for Arms Control and Non-Proliferation in Washington, D.C., also say the guidelines should be mandatory and should cover privately funded research.

    The report will now go to an interagency committee, which will seek public comment and likely ask NSABB to hone the guidelines. But some universities are going ahead on their own. At Duke University in Durham, North Carolina, and two other schools participating in a regional biodefense center, IBCs are already screening all genetic engineering projects for biosecurity risks, on the grounds that scientists don't have the expertise or objectivity to decide, says Megan Davidson of Duke.

  6. GENETICS

    Erasing MicroRNAs Reveals Their Powerful Punch

    1. Jennifer Couzin

    For more than 2 decades, biologists have illuminated the roles of genes by deleting them in mice and studying these “knockout” animals, which lack the proteins encoded by the targeted genes. Now, scientists say they're beginning to uncover an entirely new layer of gene regulation by using the same strategy to erase portions of genes that make snippets of RNA. Just as knockouts of traditional protein-coding genes yielded a treasure trove of knowledge about how different genes govern health and disease, this next generation of knockouts could fill in the gaps that remain.

    In a flurry of papers, four independent groups have for the first time deleted mouse genes for microRNAs, RNA molecules that can modulate gene behavior. Each time, the rodents were profoundly affected, with some animals dropping dead of heart trouble and others suffering crippling immune defects.

    Since their discovery more than a decade ago, microRNAs have electrified biologists. Geneticists estimate that the human body employs at least 500 during development and adult life. But it wasn't clear, especially in mammals, how important individual microRNAs were, because some evidence suggested that these gene-regulators had backups. In worms, for example, erasing a particular microRNA by deleting the relevant stretch of DNA occasionally had a dramatic effect but more often didn't appear to do much.

    “I think there was a fear that nothing could be found” by deleting microRNA genes in mammals one at a time, says David Corry, an immunologist at Baylor College of Medicine in Houston, Texas. As it turns out, the opposite is true. “There's a lot more that the microRNAs are doing that we didn't appreciate until now,” says Frank Slack, a developmental biologist at Yale University who studies microRNAs in worms.

    Two of the groups that produced the mammalian microRNA knockouts deleted the same sequence, for miR-155, and describe the effects on the mouse immune system on pages 604 and 608. One team was led by Allan Bradley at the Wellcome Trust Sanger Institute and Martin Turner of the Babraham Institute, both in Cambridge, U.K., and the other by Klaus Rajewsky of Harvard Medical School in Boston. The other teams, one whose results were published online by Science on 22 March (www.sciencemag.org/cgi/content/abstract/1139089) and one whose work appears in the 20 April issue of Cell, eliminated different microRNAs and documented defects in mouse hearts.

    The two groups that deleted miR-155 found that the rodents' T cells, B cells, and dendritic cells did not function properly, leaving the animals immunodeficient. The mutation also cut down the number of B cells in the gut, where the cells help fight infection, and triggered structural changes in the airways of the lungs, akin to what happens in asthma.

    Missing molecules.

    Compared to a normal mouse heart (top, left), one from a mouse with a deleted microRNA (top, right) overexpresses a skeletal muscle gene (in red), among other defects. Erasing a different microRNA increased collagen deposits (green) in mouse lungs (above, right) compared to a normal organ (above, left).

    CREDITS (TOP TO BOTTOM): EVA VAN ROOIJ ET AL., SCIENCE; MADHURI V. WARREN

    Still, left alone in a relatively sterile lab, mice lacking miR-155 survived easily. But when vaccinated against a strain of salmonella, the animals failed to develop protection against the bacterium—as quickly became apparent when most of the animals exposed to it died within a month. “The animals were no longer able to generate immunity,” says Turner, an immunologist.

    Biologists typically see a specific defect when they knock out a protein-coding gene, but eliminating a microRNA may pack a bigger punch, because many are thought to control multiple genes. In the case of miR-155, “you get much broader brush strokes … [and] very diverse immunological perturbations,” says Corry.

    There's a flip side to the promiscuity of microRNAs: A single gene may be the target of many microRNAs. That led some biologists to speculate that built-in redundancy would limit damage caused by deleting individual microRNAs. In the Cell study in which miR-1-2 was deleted, the microRNA actually has an identical twin that's encoded by a gene on another chromosome. “We thought that we'd have to delete both of them to see any abnormality in the animal,” says Deepak Srivastava of the University of California, San Francisco, who led the work. But half of his group's mice died young of holes in the heart. Others later died suddenly, prompting Srivastava and his colleagues to look for, and find, heart rhythm disturbances.

    The heart problems discovered by Eric Olson of the University of Texas Southwestern Medical Center in Dallas and his colleagues, which are also described on page 575, were more subtle. They erased the microRNA miR-208 and at first thought the mice were normal. Only when they subjected the animals to cardiac stress, by mimicking atherosclerosis and blocking thyroid signaling, did they observe that the animals' hearts reacted inappropriately to such strain.

    The four teams that knocked out the various microRNAs still don't know all the gene targets of each molecule. The findings, says Turner, “really do leave open a lot more questions than perhaps there are answers.” One is whether these and other microRNAs help explain inherited defects in diseases for which genes have been elusive. Ailments from cancer to Alzheimer's disease, says Carlo Croce of Ohio State University in Columbus, who is studying microRNAs in malignancies, may “have a microRNA component.” It's one that scientists are beginning to hunt for in earnest.

  7. BIG FACILITIES

    Researchers Get in Synch Down Under

    1. John Bohannon
    Southern lodestar.

    Australian scientists will no longer have to beg for beam time in far-flung lands.

    CREDIT: DANIEL MENDELBAUM/STATE OF VICTORIA

    MELBOURNE, AUSTRALIA—When his protein crystals melted en route to Japan last June, Jose Varghese bemoaned the loss of “months of work.” Varghese, a protein crystallographer who directs the structural biology program at CSIRO, Australia's national science agency, had planned to use Japan's Photon Factory to study the structure of human β amyloid, a protein implicated in Alzheimer's disease. Now he no longer has to worry about project-wrecking long-distance journeys: Starting this summer, he will be able to carry out the same studies without leaving the continent.

    Last week, the state of Victoria unveiled the $170 million Australian Synchrotron—the nation's first. “We've always been the poor neighbor who can't come to the party,” says Dean Morris, a physicist who has directed the machine's construction and fine-tuning. But with a synchrotron of their own—and the only one on this side of the Southern Hemisphere—set to come online in July, Morris says, “Australia will be a destination for researchers from around the world.”

    Australia is pinning much of its hope for blossoming into a science powerhouse on what is essentially a gigantic doughnut-shaped microscope. By accelerating electrons to nearly the speed of light and bending their path within a 200-meter-long magnetic racetrack, the synchrotron produces pencil-width beams of photons a million times more intense than sunlight. The Australian Synchrotron will not be the most powerful in the world; that title belongs to the SPring-8 synchrotron in Hyogo, Japan. But its design allows for a wide range of applications, from nanotechnology and cell biology to forensic sciences. Because of this versatility, the synchrotron “has attracted more support across the whole spectrum of national science than any other project in Australia's research history,” says John Brumby, Victoria's minister for innovation.
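
    The photon energies such a machine delivers follow from a standard formula for bending-magnet radiation. For electrons of energy E steered by dipoles of field B, the spectrum's critical energy is approximately

    \[ \varepsilon_c\,[\mathrm{keV}] \;\approx\; 0.665\; E^{2}\,[\mathrm{GeV}^{2}]\; B\,[\mathrm{T}] \]

    For a 3-gigaelectronvolt machine such as the Australian Synchrotron, and assuming an illustrative dipole field of 1.3 tesla, that works out to roughly 8 keV: hard x-rays of the kind protein crystallographers like Varghese depend on.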

    At full capacity, the synchrotron is expected to host as many as 1200 scientists a year, up to a third of whom will be from abroad. (Four of 13 planned beamlines will be available by summer.) The dream, Morris says, “is to put Australia on the scientific map for big international collaborations.” He says that many here were chagrined that Australia was not invited to join the ITER fusion reactor now being built in Cadarache, France. “We have the expertise to take part in these sorts of projects, but without any world-class research facilities of our own, we're not considered as being in the same league.” The new synchrotron is half of the solution, Morris says. The other half is a new research reactor near Sydney—an upgrade of an older facility—that provides neutron beams for materials science experiments.

    Earning respect isn't the only aim. The synchrotron should also boost homegrown products: Casting the high beams on wool, for instance, will reveal the fine structure of fibers and enable scientists to tinker with textile properties. And the country's mining establishment will benefit from a future beamline dedicated to minerals research. The facility “will transform the technical nature of many Australian industries,” predicts synchrotron director Robert Lamb.

    Lamb and others hope the new machine will help squelch one export: scientific talent. By opening major science facilities, Australian universities hope to entice top expatriate scientists to come back home. “These tools … will enable Australia to compete effectively with researchers in the strongest Northern Hemisphere countries,” says Robert Robinson, head of the Bragg Institute in Sydney. The Australian Synchrotron puts out its first call for project proposals next month.

  8. MARINE BIOLOGY

    Killing Whales for Science?

    1. Virginia Morell

    A storm is brewing over plans to expand Japan's scientific whaling program

    On the rise.

    Humpback populations are rebounding, sparking plans to hunt them for research purposes.

    CREDIT: THE DOLPHIN INSTITUTE. PHOTO OBTAINED UNDER THE AUTHORITY OF NMFS PERMIT NO. 1071-1770-03

    When Louis Herman, professor emeritus at the University of Hawaii, Manoa, sets out to study humpback whales in Hawaii, the goal is to see the animals as individuals. His team identifies whales genetically, with small skin samples taken with a retractable dart, and physically, with photos of their tail flukes. Whale by whale, he and other marine biologists around the world are building a picture of a population rebounding from the overhunting of the last century. At the same time, however, another kind of study is planned for Antarctic humpbacks: Japanese researchers plan to kill 50 annually in an effort they claim will help explain ecosystem dynamics in the Southern Ocean. It would be the first time in 33 years that humpbacks have been killed for science.

    Japan's intention to expand its scientific whaling, which has been condemned by many Western scientists, will be discussed at what all expect to be a fiery meeting next month in Anchorage, Alaska, when some 200 whale researchers gather for the Scientific Committee meeting of the International Whaling Commission (IWC); it will be followed by the full commission meeting, which is expected to be equally rancorous.* “There are going to be some fireworks,” predicts Doug DeMaster, a marine mammal biologist, director of the Alaska Fisheries Science Center in Seattle, Washington, and deputy commissioner of the U.S. delegation. The roster is peppered with contentious topics, including aboriginal subsistence whaling and whales as bycatch, but none is as explosive as scientific whaling.

    Even before the delegates have gathered, tempers are flaring over Japan's larger catch of Antarctic minke whales (in 2005, it upped its annual take from 440 to 935) and its plans to kill 50 humpback and 50 fin whales each year. “If Japan wants to resume commercial whaling, it should just come out and say that's what it's doing,” fumes marine biologist Nick Gales of the Australian Antarctic Division in Kingston, Tasmania, who is a member of IWC's Scientific Committee (IWC/SC). “But to do this in the name of science is simply not defensible.”

    Scientists at the Government of Japan's Fisheries Agency, which oversees the hunts, contend that their project is indeed scientific. “We are attempting to build an ecosystem model of the Antarctic's Southern Ocean,” explains Joji Morishita, director for the agency's international negotiations. “And to do that, we need to include data from the humpback and fin whales, since their biomass now equals that of the minke whales. We need to know their numbers, what they eat, how much, when and where, and whether they are outcompeting other whale species.”

    The issue highlights the sharply differing perspectives of wildlife conservation and resource management. Humpbacks, for example, were nearly hunted to extinction in the 20th century and now serve as the poster child for many conservation organizations; most Western nations consider them, as well as the fin whales, to still be endangered. But Morishita takes a different view. “It's dangerous to make the humpback a special animal that cannot be used,” he says. “What's wrong with using an abundant species while we still protect the endangered ones?”

    Some fear that the tension may ultimately break the fragile convention itself. The 73-member voluntary organization is split almost evenly between pro- and anti-whaling nations and suffers from unhappy memories of previous meetings marred by insults and physical attacks. IWC, many say, is sinking like a harpooned humpback (although at least six new countries will join this year, as each side cultivates new members). Scientific whaling “has polarized the [IWC's] Scientific Committee,” says Scott Baker, a conservation geneticist at Oregon State University's (OSU's) Marine Mammal Institute in Newport. “We're asked to review Japan's proposals, to treat them as science when they are not. And that is objectionable.”

    In the beginning

    Scientific whaling was not the original purpose behind IWC, which serves as the decision-making body for the International Convention for the Regulation of Whaling (ICRW). Rather, it was set up in 1948 for the interests of commercial whaling. At the time, various nations, including the United States, were concerned that many species of the great whales were being overhunted. According to ICRW's charter, it was organized “to provide for the proper conservation of whale stocks and thus make possible the orderly development of the whaling industry.” The convention also sanctioned scientific whaling under the four sentences of Article VIII, which allows members to catch whales for scientific purposes. Countries doing so are charged with regulating their own hunts, with no catch limits or oversight from member nations.

    Article VIII was drafted by Norwegian whaling expert and first chair of IWC, Birger Bergersen, now deceased. “It's clear that in his mind he was thinking that the number of whales a country could take for science was less than 10; he didn't intend for hundreds to be killed for this purpose,” says Lars Walløe, a physiological biologist at the University of Oslo, Norway, who has written about Bergersen and heads the Norwegian delegation to the Scientific Committee. “He had in mind, for instance, the possibility of finding a new animal and thus needing to take some in order to describe them scientifically.”

    In 1982, with many populations plummeting to near-extinction levels, IWC enacted a moratorium on commercial whaling, which took effect in 1986, and its focus shifted to conservation. “The moratorium is probably one of the greatest conservation success stories of the 20th century,” says Phillip Clapham, a marine biologist with the Alaska Fisheries Science Center in Seattle. “Many species of whales that were really hammered are now making remarkable comebacks,” including some populations of humpback and fin whales. But some blue, right, and bowhead whale populations remain worrisomely low, he adds.

    Not every IWC nation joined the moratorium. Member nations can lodge formal objections to the body's decisions, which it has no authority to enforce. Norway objected and has continued commercial hunting of minke whales, which are smaller (8 meters in length) baleen whales thought to number in the hundreds of thousands. Last year, Norway unilaterally upped its annual quota from 745 to 1052. Japan settled on a different tack, withdrawing its formal objection but launching scientific whaling programs in the Southern Ocean and North Pacific under Article VIII. In the past 5 years, Iceland has also started both scientific and commercial whaling programs targeting minke and fin whales, although its take is only a fraction of Japan's (see table, below).

    Although many whale researchers decry Iceland's decision, they are even more alarmed by the ever-increasing scale of Japan's scientific program and the fact that Japan kills whales within IWC's Southern Ocean Whale Sanctuary. Under the scientific whaling program launched in 1987 (called JARPA, for Japan's Whale Research Program under Special Permit in the Antarctic), the Japanese have killed an estimated 6500 minke whales there; that compares to about 2100 whales killed worldwide under Article VIII by all nations combined between 1952 and 1986.

    Japan began its second scientific whaling operation (JARPN) in the North Pacific in 1994, where it targets minke, Bryde's, sei, and sperm whales. According to Article VIII, the meat from these hunts should be used, and despite low demand, it is available in Japanese markets. Some is now stewed in ketchup for school lunches, and some can be found in restaurants and for sale online, says Naoko Funahashi, a conservationist with the International Fund for Animal Welfare in Tokyo.

    In 2005, at the 57th IWC meeting in Ulsan, South Korea, Japan stunned IWC/SC with its announcement that it was beginning a new operation (JARPA II), which would include taking humpback and fin whales in the Southern Ocean Sanctuary. So far, it has harpooned 12 fin whales and intends to begin killing humpbacks in 2007-08.

    Science under scrutiny

    Under the convention, the Scientific Committee is required to review scientific whaling proposals, and many researchers are sharply critical of the results of JARPA I. “The science and data are very poor,” says Clapham, echoing a complaint voiced by many other IWC/SC members. “It's outrageous to call this science; it's a complete charade,” charges Daniel Pauly, director of the Fisheries Centre at the University of British Columbia in Vancouver.

    The committee produced a consensus review of the 18-year JARPA I study last December, but the document includes few areas of agreement. On minke whale abundance: “The workshop has not developed any agreed estimates.” On the role of whales in the marine ecosystem, “relatively little progress has been made.”

    Yet the Japanese stand firmly by the science behind their whaling program. “We hear these criticisms all the time,” says Morishita. “A lot of non-Japanese scientists are always calling for us to submit our data, and we present our research results every year to the Scientific Committee and at other scientific meetings. If they think our data is so useless, I don't think they'd demand it. We would also like to publish our papers in more leading Western science journals,” but Morishita perceives these as being biased against scientific whaling. “We are also the only scientists collecting age data on these populations.” Scientists determine a whale's age by its waxy ear plugs, which can only be studied if the whale is dead. Morishita argues that humpback and fin whales are now competing with the minke for krill and says their new program will test this idea.

    Some researchers agree that the Japanese data are important. “They are doing valid science,” says Norway's Walløe, pointing in particular to Japanese genetic data that suggest the minke whale numbers in the Southern Ocean are declining, and that minkes there are growing slimmer, losing blubber. “Whether or not it is necessary for their study to take so many hundreds of whales every year for science, I cannot comment.” Walløe adds that the Japanese also provide biopsy samples, which are rare from large baleen whales in the Southern Ocean.

    Taken.

    Japanese ships catch minke whales like this one, as well as a few other species, under scientific programs.

    CREDIT: IFAW

    But these data can be gathered without killing the whale, say Herman and others. “The Japanese want to ask which breeding populations the whales belong to, if these are growing, and where do they feed,” says Gales. “These are all questions which can be answered using nonlethal techniques including observation, satellite tracking, and genetic studies.” He and many others are unconvinced by the idea of food competition and say that it betrays an overly simplistic view of complex marine ecosystems.

    Researchers on all sides agree that the humpback whales' numbers in the Southern Ocean are increasing. Indeed, the data should “make everyone happy,” says Morishita. “Their numbers are so large now that their increase seems to be adversely affecting the minke whale. We want to see if that is the case.”

    But Clapham says not all southern humpback populations are rebounding. Whales from a variety of breeding populations congregate in the feeding area of the Southern Ocean. Most are part of two fairly large populations (totaling nearly 20,000) that travel from Antarctica to Australia's coasts, where they mate and birth their calves. Others, however, hail from far smaller populations that breed in the waters off Fiji, New Caledonia, and Tonga. “These stocks were devastated by illegal Soviet whaling in the late 1950s and '60s,” says Clapham. “They've never recovered and still number in the mere hundreds or fewer. But they feed in Antarctica with the whales from Australia. It's impossible to tell them apart; they don't have signs on their backs. How are the Japanese going to be sure they don't take humpbacks from these highly endangered populations?”

    Japan's program suggests to OSU's Baker that the science is largely about managing whales for future harvest. Whaling “can be done sustainably, which is why Japan collects the kind of data it does,” says Walløe. “If whales are going to be hunted in a sustainable manner, then we need this kind of information. But, if we're not going to kill any whales, then it could be argued we don't need it.” And the killing of whales, he notes, has now become more of a political than a scientific question.

    Because the scientific whaling program is “out of control,” says former U.S. Whaling Commissioner Rollie Schmitten, it might be better to just phase it out and permit tightly controlled commercial whaling, while prohibiting any international trade in whale meat. IWC has attempted to negotiate similar agreements at its annual meetings since 1996—but it has always failed, partly because some countries, notably Australia, New Zealand, and the United Kingdom, refuse to consider removing the ban. Meanwhile, subsistence hunts by aboriginal peoples in the United States, Russia, Greenland, and the Caribbean nation of St. Vincent and the Grenadines are also up for renewal this year. All this sets the stage for a contentious meeting when the full IWC gathers at the end of May.

    As a small island nation, Japan defends its right to marine resources. Japanese generally perceive antiwhaling sentiment as anti-Japanese, says Funahashi. But she holds out hope for change. “Most Japanese don't know that we hunt whales in Antarctica,” she says. “They think it's only in Japanese waters. When they hear about this other hunting, they don't approve. Now more Japanese are going whale watching, and this is changing people's attitudes.” It's harder, after all, to eat an animal you know.

    • *59th Annual Meeting of the International Whaling Commission, 4-31 May, Anchorage, Alaska.

  9. CROSS-CULTURAL RESEARCH

    Pentagon Asks Academics for Help in Understanding Its Enemies

    1. Yudhijit Bhattacharjee

    A new program at the U.S. Department of Defense would support research on how local populations behave in a war zone

    The Iraq War was going badly in Diyala, a northern province bordering Iran, in late 2005. A rash of kidnappings and roadside explosions was threatening to give insurgents the upper hand. Looking for insights on how to quell the violence, the U.S. Department of Defense invited a handful of researchers funded by the agency to build computer models of the situation combining recent activity with cultural, political, and economic data about the region collected by DOD-funded anthropologists.

    The output from one model, developed by sociologist Kathleen Carley and her colleagues at Carnegie Mellon University in Pittsburgh, Pennsylvania, connected a series of seemingly disparate incidents to local mosques. Results from another model, built by computer scientist Alexander Levis and his colleagues at George Mason University (GMU) in Fairfax, Virginia, offered a better strategy for controlling the insurgency: Getting Iraqis to take over the security of two major highways, and turning a blind eye to the smuggling of goods along those routes, the model found, would be more effective than deploying additional troops. The model also suggested that a planned information campaign in the province was unlikely to produce results within an acceptable period of time.

    Researchers and DOD officials say these insights, however limited, demonstrate a role for the social and behavioral sciences in combat zones. And a new program called Human Social Culture Behavior Modeling will greatly expand that role. John Young Jr., director of Defense Research and Engineering and architect of the program, has asked Congress for $7 million for fiscal year 2008, which begins on 1 October, as a down payment on a 6-year, $70 million effort. Agency officials expect to direct an additional $54 million in existing funds to social science modeling over the next 6 years. Under the new program, the agency will solicit proposals from the research community on broad topic areas announced periodically, and grants will be awarded after an open competition.

    Officials hope that the knowledge gained from such research will help U.S. forces fight what the Bush Administration calls a global war on terror and help commanders cope with an incendiary mix of poverty, civil and religious enmity, and public opposition to the U.S.-led occupation of Iraq. “We want to avoid situations where nation states have unstable governments and instability within populations, with disenfranchised groups creating violence on unsuspecting citizens,” says Young. “Toward that goal, we need computational tools to understand to the fullest extent possible the society we are dealing with, the political forces within that government, the social and cultural and religious influences on that population, and how that population is likely to react to stimuli—from aid programs to the presence of U.S. troops.”

    Beyond bombs and guns.

    DOD officials say social science models can supplement the use of force to reduce violence in Iraq.

    CREDIT: MAURICIO LIMA/AFP/GETTY IMAGES

    The approach represents a broader and more scientific way to achieve military objectives than by using force alone, according to Young. “The military is used to thinking about bombs, aircraft, and guns,” he says. “This is about creating a population environment where people feel that they have a voice and opportunity.” Such tools would not replace the war games that military commanders currently use to simulate combat between conventional defense forces. Instead, the models would give military leaders knowledge about other options, such as whether improving economic opportunity in a disturbed region is more likely to restore order than imposing martial law and hunting down insurgents. Once developed in academic labs, the software would be installed in command and control systems.

    The plan has drawn mixed reactions from defense experts. “They are smoking something they shouldn't be,” says Paul Van Riper, a retired lieutenant general who served as director of intelligence for the U.S. Army in the mid-1990s. Human systems are far too complex to be modeled, he says: “Only those who don't know how the real world works will be suckers for this stuff.”

    But retired general Anthony Zinni, former chief of U.S. Central Command and a vocal critic of the Administration's handling of the Iraq War, sees value in the program. “Even if these models turn out to be basic,” he says, “they would at least open up a way for commanders to think about cultural and behavioral factors when they make decisions—for example, the fact that a population's reaction to something may not be what one might expect based on the Western brand of logic.”

    The new program is not the first time the military has tried to integrate cultural, behavioral, and economic aspects of an adversary into its battle plans. During the Cold War, for example, U.S. defense and intelligence agencies hired dozens of anthropologists to prepare dossiers on Soviet society. Similar efforts were made during the U.S. war in Vietnam, with little success. But proponents say that today's researchers have a much greater ability to gather relevant data and analyze the information using algorithms capable of detecting hidden patterns.

    A few such projects are already under way. At the University of Maryland, College Park, computer scientist V. S. Subrahmanian and his colleagues have developed software tools to extract specific information about violent incidents from a plethora of news sources. They then use that information to tease out rules about the enemy's behavior. For example, an analysis of strikes carried out by Hezbollah, the terrorist group in Lebanon, showed that the group was much more likely to carry out suicide bombings during times when it was not actively engaged in education and propaganda. The insight could potentially help security forces predict and counter suicide attacks. “This is a very coarse finding, not the last word by any means,” cautions Subrahmanian, adding that a lot more data and analysis would be needed to refine that rule as well as come up with other, more useful ones. Last year, the researchers applied their tools to provide the U.S. Army with a detailed catalog of violence committed against the United States and each other by tribes in the Pakistan-Afghanistan region.
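
    A stripped-down illustration of that kind of rule mining appears below. The event records are invented and the Maryland group's actual algorithms are far more elaborate; the sketch shows only the core idea of contrasting conditional probabilities across contexts.

    ```python
    # Toy version of mining a behavioral rule from event data. The
    # records are fabricated for illustration; they mimic the kind of
    # contrast reported (attacks likelier when a group is not running
    # education/propaganda campaigns), not any real dataset.

    events = [
        # (propaganda_campaign_active, suicide_attack_occurred)
        (True, False), (True, False), (True, True),  (True, False),
        (False, True), (False, True), (False, False), (False, True),
    ]

    def p_attack(propaganda_active):
        subset = [attack for prop, attack in events
                  if prop == propaganda_active]
        return sum(subset) / len(subset)

    print(f"P(attack | propaganda active)   = {p_attack(True):.2f}")
    print(f"P(attack | propaganda inactive) = {p_attack(False):.2f}")
    ```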

    Other modeling projects are addressing more fundamental questions. With funding from the Air Force Office of Scientific Research, mathematical economist Scott Page of the University of Michigan, Ann Arbor, and his colleagues are modeling societal change under the competing influences of an individual's desire to act according to his or her values and the pressure to conform to social norms. The work could shed light on which environments are most supportive of terrorist cells, information that could help decide where to focus intelligence-gathering efforts and how to bust those cells. The research could also help estimate, by looking at factors such as rise in unemployment and growing social acceptance of violent behavior, when a population may be plunging into chaos. That in turn could help commanders and policy-makers decide when and how to intervene.
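
    The flavor of such work can be conveyed with a toy simulation. The sketch below is a generic illustration of the values-versus-conformity tension, not Page's actual model: each agent's public behavior blends its private value with the group average, and a higher conformity weight compresses the spread of observed behavior.

    ```python
    # Toy agent model: behavior = (1 - c) * private value + c * group
    # mean. Entirely illustrative; not the Michigan group's model.
    import random

    N, STEPS, C = 50, 100, 0.6   # C is the weight on social pressure

    values   = [random.random() for _ in range(N)]
    behavior = values[:]         # agents start out acting on their values

    for _ in range(STEPS):
        mean_b = sum(behavior) / N
        behavior = [(1 - C) * v + C * mean_b for v in values]

    # With C = 0 the spread of behavior equals the spread of values;
    # as C approaches 1, everyone converges on the group mean.
    print(f"spread of values:   {max(values) - min(values):.3f}")
    print(f"spread of behavior: {max(behavior) - min(behavior):.3f}")
    ```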

    Accomplishing those goals is a tall order, Page admits. “Despite tons and tons of data from U.S. elections,” he says, “we are still not very good at predicting how people will vote.”

    Building comprehensive and realistic models of societies is a challenge that will require enormous amounts of empirical data, says GMU's Levis, a former chief scientist of the U.S. Air Force. But it is doable, he says, adding that the field will benefit greatly from linking social science researchers and computer scientists. “The goal here is to win popular support in the conflict zone,” he says.

  10. CARBON EMISSIONS

    Improved Monitoring of Rainforests Helps Pierce Haze of Deforestation

    1. Eli Kintisch

    Deforestation produces a significant amount of greenhouse gas emissions through burning, clearing, and decay. But exactly how much?

    Twenty-five years ago, the best way for Brazilian scientists to gauge the rate of deforestation in the Amazon was to superimpose dots on satellite photos of the world's largest rainforest and use them to measure the size of the affected area. INPE, Brazil's National Institute for Space Research and the government agency responsible for remote deforestation monitoring, didn't release regional maps and refused to explain its analytical methods. The result was data that few experts found credible.

    Today, Brazil's monitoring system is the envy of the world. INPE has its own remote-sensing satellite, a joint effort with China launched in 1999, that allows it to publish yearly totals of deforested land that scientists regard as reliable. Using data from NASA's 7-year-old Terra satellite, INPE also provides automated weekly clear-cutting alerts that other tropical nations would love to emulate. And image-analysis algorithms have eliminated the need for measurement dots. “They've really turned things around,” says forestry scientist David Skole of Michigan State University in East Lansing.

    Generating good data on deforestation is more than an academic exercise. The process of cutting down forests and clearing the land—by burning the wood, churning soil for agriculture or grazing, and allowing the remaining biomass to decay—produces as much as 25% of the world's yearly emissions of greenhouse gases. That makes keeping tabs on deforestation a crucial issue for government officials negotiating future climate agreements—including a meeting next month in Bonn, Germany, and one next year in Bali to extend the 1997 Kyoto agreement after its 2012 expiration.

    Despite solid improvements by scientists in monitoring deforestation, the uncertainties are still substantial. The gap between remote-sensing data and field measurements on the amount of deforested land is between 5% and 10%, say researchers. And the error bars on estimates of the amount of CO2 released by clear-cutting those tracts, they note, are 25% to 50%. Those errors, related to gaps in fundamental understanding of forest carbon, will make it harder for developing nations to verify the extent to which they have managed to reduce deforestation and, thus, reduce their output of greenhouse gases. In turn, the uncertainty undermines efforts to convince skeptical lawmakers in industrialized countries that efforts to diminish deforestation should be a part of future climate-change agreements.
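
    The arithmetic behind that worry is simple: emissions are essentially the product of deforested area and the carbon density of what was cleared, so (assuming roughly independent errors) the relative uncertainties combine in quadrature. With illustrative values of 7% for area and 30% for carbon density,

    \[ \frac{\sigma_E}{E} \;=\; \sqrt{\left(\frac{\sigma_A}{A}\right)^{2} + \left(\frac{\sigma_\rho}{\rho}\right)^{2}} \;\approx\; \sqrt{0.07^{2} + 0.30^{2}} \;\approx\; 0.31 \]

    so it is the poorly known carbon density, not the satellite mapping, that dominates the total error.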

    “We need to get these error bars down,” says climate negotiations veteran Annie Petsonk of Environmental Defense (ED), a New York City-based nonprofit. More precise satellite data for calculating carbon flux could also shed light on the role of trees in the global carbon cycle, a key ingredient in understanding whether global warming will accelerate.

    Margins of error

    When negotiators in 2001 agreed on what the Kyoto treaty would cover, they omitted deforestation. One reason was fear that clear-cutting halted in one country trying to achieve its Kyoto goals would move to another country under less pressure to curb the practice. But uncertainty about the science didn't help. At the time, INPE was releasing only totals, not maps, and few nations had experience turning visual data from Landsat 5 and other satellites (see chart) into integrated totals. “You'd have [negotiators] saying that it's impossible to measure deforestation,” says ecologist Paulo Moutinho of the Amazon Institute of Environmental Research in Pará state, Brazil. “There was all this data but not enough know-how,” adds regional ecologist Greg Asner of the Carnegie Institution of Washington in Stanford, California.

    Not so hazy.

    An image of the Amazonian rainforest by Landsat 5 (left) includes clouds that obscure deforested areas visible on a radar image by ALOS (right). The two satellites are among a number of key sensors (below) that help researchers monitor deforestation in the tropics.

    CREDITS (LEFT TO RIGHT): JAXA

    In the last 5 years, a growing cadre of researchers in rainforest nations has begun tapping satellite data to monitor their forests; the list includes India, Thailand, and Indonesia. In addition to Brazil's weekly alert system, experts across the Americas are making increased use of NASA's medium-resolution Terra, which can scan any point on Earth roughly once a day at a resolution of about 250 meters.

    Policymakers are taking notice of that increased capacity. A side presentation on detecting logging that Asner offered at the international climate meeting in Montreal in December 2005 drew hundreds of negotiators. There, Papua New Guinea and Costa Rica proposed including credit, after 2012, for efforts to curb deforestation. The idea has gathered momentum, and environmentalists are hoping that next month's meeting in Bonn, convened by a United Nations technical body, will lay the groundwork to measure and credit action against deforestation by developing countries. “The science has really driven the policy,” says ED's Stephen Schwartzman.

    The Bonn delegates will confront a number of technical challenges. The first is how to reduce primary errors in detecting forest losses from space. Brazil's yearly survey, dubbed PRODES, is based on the situation each August, before fall clear-cutting season, and uses software that searches images for bare ground. But Landsat passes over any one forest area only twice in a month, and clouds can obscure areas during one or both passes. Any gaps are filled with data from July or September, massaged with algorithms. “You're providing the best of your knowledge,” says mathematician Thelma Krug of INPE, which reported that 18,793 km² of Amazon forest, with a 4% margin of error, were destroyed in 2005. That figure includes only clear-cutting, because the satellites' 20- to 30-meter resolution cannot detect less dramatic disturbances.
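    INPE's production software is not spelled out in the article, but the two operations it describes (thresholding images for bare ground, and patching cloud gaps with data from adjacent months) can be sketched in a few lines. The Python below is a hypothetical illustration, not PRODES code; the NDVI threshold, band values, and function names are all invented:

    ```python
    import numpy as np

    def ndvi(red, nir):
        """Normalized difference vegetation index; bare ground scores low."""
        return (nir - red) / (nir + red + 1e-9)

    def detect_clearing(red, nir, cloud_mask, fallback_ndvi, threshold=0.4):
        """Flag likely clear-cut pixels where NDVI falls below `threshold`.

        Cloud-obscured pixels are filled with NDVI from an adjacent month's
        pass (`fallback_ndvi`), loosely mimicking PRODES-style gap filling.
        """
        v = ndvi(red, nir)
        v = np.where(cloud_mask, fallback_ndvi, v)
        return v < threshold

    # Toy 3x3 scene: high-red/low-NIR pixels read as bare ground.
    red = np.array([[0.05, 0.30, 0.05],
                    [0.05, 0.05, 0.31],
                    [0.28, 0.05, 0.05]])
    nir = np.array([[0.45, 0.32, 0.44],
                    [0.46, 0.43, 0.33],
                    [0.30, 0.45, 0.46]])
    cloud = np.zeros((3, 3), dtype=bool)
    cloud[0, 1] = True                   # one pixel hidden by cloud
    fallback = np.full((3, 3), 0.10)     # adjacent-month NDVI for masked pixels
    cleared = detect_clearing(red, nir, cloud, fallback)
    print(cleared.sum(), "of 9 pixels flagged as cleared")
    ```

    On real Landsat scenes, the same logic would run over full red and near-infrared bands, with each flagged pixel contributing roughly 900 m² (at 30-meter resolution) to the clearing total.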

    One important omission is selective logging for timber, says Asner. In 2005, his team determined the fraction of each Landsat pixel covered by green canopy, aided by considerable fieldwork to calibrate how nonvisible wavelengths could inform that calculation. They concluded that Brazil was omitting a whopping 12,000 km² or more of so-called selectively logged forest areas per year (Science, 21 October 2005, p. 480). Asner fears that any system rewarding efforts to halt deforestation could miss a substantial source of emitted carbon if selective logging is not included. Others believe that logging has less of an impact: Skole says Asner could have mistaken thin forests or wetlands for logged forests because their infrared image can “mimic … a logged forest.” He also notes that many logged areas grow back. INPE estimates that the per-hectare emissions from selective logging are 2% of those from clear-cutting.
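    Asner's pixel-fraction approach belongs to the family of spectral mixture analysis: each pixel's spectrum is treated as a linear blend of pure “endmember” spectra, and the fitted canopy fraction exposes partial disturbance that a bare-ground test misses. The Python sketch below shows the general technique only; the endmember values and band count are invented for illustration and are not Asner's calibrated numbers:

    ```python
    import numpy as np

    # Hypothetical endmember spectra over three bands (values invented):
    # columns are intact canopy, bare soil, and dead wood.
    E = np.array([[0.05, 0.25, 0.15],
                  [0.45, 0.30, 0.25],
                  [0.22, 0.35, 0.30]])

    def unmix(pixel):
        """Least-squares fractions of each endmember in one pixel's spectrum."""
        f, *_ = np.linalg.lstsq(E, pixel, rcond=None)
        f = np.clip(f, 0.0, None)
        return f / f.sum()               # normalize fractions to sum to 1

    # A pixel that is 60% canopy, 30% soil, 10% dead wood:
    pixel = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]
    canopy, soil, wood = unmix(pixel)
    print(f"canopy fraction = {canopy:.2f}")
    ```

    A pixel whose canopy fraction falls between intact forest and clear-cut is a candidate for selective logging, and this is also where Skole's caution bites: thin forest or wetland can yield similar intermediate fractions.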

    Going, going…

    Tropical deforestation, in hot spots including Brazil, Madagascar, Indonesia, and West Africa, is a big driver of rising CO2 levels.

    SOURCE: THE NATURE CONSERVANCY AND HANSEN AND DEFRIES, 2004

    Even if scientists improve their monitoring of activities on the ground, however, they have only crude methods of calculating how much carbon a particular area of rainforest will emit once cleared. Estimates of the Amazon's total organic stock of carbon—including living and dead trees—range from 60 billion to 120 billion tons. National estimates are equally uncertain: Brazil calculated that deforestation and loss of grassland had emitted roughly a billion tons of CO2 into the atmosphere in 2004, plus or minus 30%. Several experts told Science that the margin of error is even larger.
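    The scale of those figures is easy to check with a back-of-envelope version of the emissions calculation. In the sketch below, the per-hectare carbon density is an assumed, illustrative value, not Brazil's official input:

    ```latex
    % Emissions ~ cleared area x carbon density x (44/12, converting C to CO2)
    \begin{align*}
    A      &\approx 18{,}793~\mathrm{km^2} \approx 1.9\times10^{6}~\mathrm{ha}\\
    \rho_C &\approx 150~\mathrm{t\,C/ha} \quad \text{(assumed, illustrative)}\\
    E      &\approx 1.9\times10^{6} \times 150 \times \tfrac{44}{12}
            \approx 1.0\times10^{9}~\mathrm{t\,CO_2}
    \end{align*}
    ```

    Because a 4% error in the mapped area is dwarfed by the 30%-or-more uncertainty in the carbon density, tightening biomass estimates would do more for the bottom line than sharper maps.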

    One problem is the heterogeneity of forests and the inability to identify denser, taller forest areas within larger regions. Historical sampling measurements in western Brazil include only trees at least 10 cm in diameter. “We need more science,” says geographer Ruth DeFries of the University of Maryland, College Park. One low-tech step, says ecologist Richard Houghton of Woods Hole Research Center in Massachusetts, would be repeated sampling of trunks and better biomass equations that encompass the whole tree. “We don't have many studies that have looked below ground at the roots,” he says. Even within a 1-hectare site, he says, the variability is maddening.

    Better eyes would also help. Japan's Advanced Land Observing Satellite (ALOS), launched last year, uses radar to see through the canopy and spot cleared sites that Landsat's cameras would miss. Initial results show decent contrast between forested and nonforested areas to a 50-meter resolution, says Woods Hole's Josef Kellndorfer. Upcoming ground studies in Brazil, Congo, and Uganda will aim to calibrate ALOS's ability to estimate biomass, aided by interferometry that could infer tree heights.

    Radar would also be a boon to cloudy countries such as Gabon, whose rainforests have been largely hidden from satellites until now. And ALOS's youth is also welcome. Widely available and relied upon, Landsat 5 was built for a 3-year stint and is nearing a quarter-century of labor. It “could go any moment,” worries DeFries. Christopher Justice of the University of Maryland, College Park, says that possibility highlights the need for “better international cooperation” to make sure data from other sources are just as easy to share.

    Ground truth

    DeFries says that those who care about rainforests shouldn't let the quest for improved detection stand in the way of making good use of what is already clearly visible. She's cheered by a campaign that has protected tens of thousands of square kilometers of Brazilian rainforest since 2004. A general trend of falling beef and soy prices has helped by cutting demand for land, environmentalists say. So have daily data from Terra, analyzed by INPE, which Brazilian officials have used to probe roughly 100 instances of possibly illegal deforestation, says INPE's Dalton Valeriano.

    The government could step up its enforcement activities, says geographer Carlos de Souza Jr. of independent watchdog Imazon in Brazil, if its mapping work were more solid. Using the same data that INPE collects, de Souza has calculated monthly totals that exceed or fall short of the government's number by thousands of square kilometers. He fingers data-sampling techniques, clouds, or different aggregating methods as possible culprits. And he worries that the government is learning about some illegal clear-cutting belatedly, from the yearly PRODES survey. “The most important thing is stopping deforestation as it is happening, not after,” says de Souza.

    An international incentive system could strengthen Brazilian resolve, says Daniel Nepstad of Woods Hole. “If this is happening without a carrot, imagine what would happen with a carrot,” he says.

  11. AMERICAN PHYSICAL SOCIETY MEETING

    Gravity Probe Researchers Report 'Glimpses' of Long-Awaited Payoff

    1. Tom Siegfried*
    1. Tom Siegfried is a writer in Los Angeles, California.

    AMERICAN PHYSICAL SOCIETY, 14-17 APRIL, JACKSONVILLE, FLORIDA

    After nearly half a century of plans, proposals, probing, and problems, NASA's Gravity Probe B satellite has finally reported a scientific finding. But physicists will have to bide their time a few months longer for the result they have awaited all those decades.

    Conceived in the late 1950s, Gravity Probe B was finally launched in 2004 to test subtle predictions of Einstein's theory of general relativity (Science, 16 April 2004, p. 385). Essentially a 3-ton Thermos bottle housing four precise gyroscopes, the probe was designed to measure the dragging of spacetime around a spinning body—in this case, Earth.

    Physicists reported here on 14 April that after a year of data collection and a year and a half of analysis, they have found “glimpses” of the frame-dragging effect. But they offered no numbers. A specific measurement was, however, reported for a second effect, caused by relativistic warping of spacetime. That “geodetic” effect tilted the gyroscopes by an angle of 6638 milli-arc seconds (mas) per year (give or take 97).

    If Isaac Newton's simple laws of gravity ruled the universe, each gyroscope's axis of spin would stay pointed toward a “guide star” tracked by a telescope on board the probe, which flew in a 640-kilometer-high polar orbit. Relativity's effects, however, should tilt that spin axis slightly away from the star by about 6600 mas per year, within the range of Gravity Probe B's margin of error (see figure).

    CREDIT: BOB KAHN, GRAVITY PROBE B, STANFORD UNIVERSITY

    But the geodetic effect has previously been measured more precisely by other methods. Gravity Probe B's primary purpose was to measure the much smaller effect caused by frame dragging, predicted to be a mere 39 mas per year: about the width of a human hair seen from 400 meters away.
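    Both magnitudes follow from textbook general relativity. For a gyroscope in a circular orbit of radius r, the geodetic precession is given by the standard de Sitter formula; the numbers below are a rough check with round values, not the mission's own error analysis:

    ```latex
    \Omega_{\mathrm{geo}} \;=\; \frac{3}{2}\,\frac{(GM_\oplus)^{3/2}}{c^{2}\,r^{5/2}}
    % GM_earth ~ 3.986e14 m^3/s^2 and r ~ 7.0e6 m (640-km orbit) give
    % Omega ~ 1.0e-12 rad/s ~ 6.6e3 mas/yr, consistent with 6638 +/- 97.
    % Frame-dragging check: 39 mas ~ 1.9e-7 rad; at 400 m that subtends
    % 1.9e-7 x 400 m ~ 76 micrometers, about a human hair's width.
    ```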

    Measuring the tilt caused by frame dragging—at right angles to the geodetic drift—was complicated, however, by unanticipated wobbles and twists in the gyroscopes' motion. Eight more months of data analysis will be needed to smooth those glitches out of the data, said physicist C. W. Francis Everitt of Stanford University in Palo Alto, California, Gravity Probe B's principal investigator.

    “We have some glimpses of the frame-dragging effect, and I think they are probably authentic,” he said. “We're on track, but we're not there yet.”

    The latest problem in Gravity Probe B's long history arose from an oversight involving the gyroscopes, quartz spheres the size of Ping-Pong balls, coated with a thin layer of niobium, and caged in a quartz housing. Detectors tracked the spheres' spin direction by sensing magnetism induced in the niobium coatings. The mission was designed to avoid many possible sources of error, Everitt said. “But there were a couple of things that came in to surprise us.”

    Most serious of those surprises was the presence of electrostatic charges on the gyros' quartz housings. The rogue charge caused misalignment between the spin of the gyros and the spin of the satellite itself, which was pointed toward the star IM Pegasi to provide a constant frame of reference. Fortunately, Everitt said, the orientation of the satellite was precisely monitored throughout the mission, so researchers can recalibrate the data to correct for the error. But it will take them until December to extract the final result.

    “We're optimistic that we're going to have more interesting results” when the data analysis is finished, said Stanford's G. Mac Keiser, the mission's chief scientist.

    Ideally, Gravity Probe B would be able to measure frame dragging's effect to within half a mas, about 1% of its expected value. But the new problems suggest that goal might be unrealistic, said Clifford Will, a gravity expert at Washington University in St. Louis, Missouri, and chair of an expert team that advises the Gravity Probe B project. “It's not clear, given these effects that arose, that they'll reach the half a milli-arc second a year that NASA was thinking about 10 years before launch,” he said.

    Depending on its final level of precision, Gravity Probe B's result may not be much better than a previous frame-dragging measurement reported in 2004, based on experiments with the Earth-orbiting LAGEOS satellites (Science, 22 October 2004, p. 592). That report, published in Nature, claimed an accuracy of 10%, although Will said many experts suspect that the actual uncertainties may be 20% or more.

    In any event, some physicists have suggested that Gravity Probe B's scientific payoffs might not match the mission's $760 million price tag. If the probe confirms general relativity with only moderate precision, it will have accomplished nothing new; if its ultimate answer contradicts Einstein, many physicists will more likely conclude that the experiment, not Einstein, was in error.

    If the result does differ from relativity's forecast, Everitt said, “we would publish it as long as we were convinced it was truth. Then we would leave other people to worry about it.”

    Everitt and other scientists on the Gravity Probe B team point out that the experiment has value beyond just measuring relativistic effects. It has produced technical advances already used on other space missions and has provided helpful lessons for planning future precision space probes, such as the proposed LISA mission to measure gravitational radiation from space.

    And Will said that the very fact that the mission flew, and worked as well as it did after decades of waiting, should be considered a triumph.

    “Everything worked almost perfectly,” he said. “A few things didn't work as well. And there are these strange effects that nobody could have imagined beforehand. But that's physics.”

  12. AMERICAN PHYSICAL SOCIETY MEETING

    Neutrino Study Finds Four's a Crowd

    1. Tom Siegfried*
    1. Tom Siegfried is a writer in Los Angeles, California.

    AMERICAN PHYSICAL SOCIETY, 14-17 APRIL, JACKSONVILLE, FLORIDA

    The family of self-effacing subatomic particles known as neutrinos should give up hope for the existence of an eccentric cousin, new results from the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, suggest.

    Physicists know of three types, or “flavors,” of neutrino, designated electron, muon, and tau for their associations with other particles with those names. Neutrinos are notoriously difficult to detect and are very nearly massless. They must, however, possess at least a small mass, as they have shown the ability to switch identity in flight, a trick impossible for massless particles.

    These identity shifts, or “flavor oscillations,” have been well established for years. Measurements of the oscillation rate provide clues to the differences in mass among the three known flavors. Such experiments at New Mexico's Los Alamos National Laboratory in the 1990s implied an unusually large mass difference, hinting that a fourth neutrino flavor ought to exist, in a “sterile” form that does not interact with other particles as ordinary neutrinos do.

    Over the past decade, an international team of researchers using a neutrino beam at a Fermilab particle accelerator has sought evidence to confirm or refute the Los Alamos results. If correct, the Los Alamos findings imply that many of the muon neutrinos in the Fermilab beam should oscillate into electron neutrinos before reaching a detector 500 meters away. But the Fermilab experiment, known as MiniBooNE (for “Mini Booster Neutrino Experiment”), found no evidence for the brand of flavor shifting reported at Los Alamos.
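    In the standard two-flavor approximation, the appearance probability depends only on the ratio of baseline to beam energy, which is what let MiniBooNE test the Los Alamos claim from a different distance (a sketch, with L in meters and E in MeV):

    ```latex
    P(\nu_\mu \to \nu_e) \;=\; \sin^{2}(2\theta)\,
      \sin^{2}\!\left(1.27\,\frac{\Delta m^{2}\,[\mathrm{eV^{2}}]\;L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]}\right)
    ```

    Pairing the roughly 500-meter baseline with a correspondingly higher beam energy keeps L/E, and hence the probed mass-difference region, close to that of the original Los Alamos setup.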

    No trace.

    MiniBooNE's sensor array failed to confirm earlier hints of noninteracting “sterile” neutrinos.

    CREDIT: FERMILAB

    “We do not see any evidence for muon neutrinos oscillating into electron neutrinos,” Los Alamos physicist Heather Ray, a member of the MiniBooNE team, said at the meeting.

    Although apparently ruling out the Los Alamos evidence for a sterile neutrino, the MiniBooNE experiment turned up a possible new mystery: a higher number of low-energy electron neutrinos than expected from “background” sources.

    “They may be a misestimation of the background, but they may be interesting,” said team member Eric Zimmerman of the University of Colorado, Boulder.

    Further analysis of the data and tests of new data now being gathered will be needed to clarify the reason for the low-energy anomalies, said Janet Conrad of Columbia University, one of the leaders of the MiniBooNE team. “There's still some possibility that there are some bizarre effects going on,” she said.

  13. AMERICAN PHYSICAL SOCIETY MEETING

    Snapshots From the Meeting

    1. Tom Siegfried*
    1. Tom Siegfried is a writer in Los Angeles, California.

    AMERICAN PHYSICAL SOCIETY, 14-17 APRIL, JACKSONVILLE, FLORIDA

    Safe zone?

    Galactic plane may block deadly rays.

    CREDIT: THE COBE PROJECT, DIRBE, NASA

    Top mass. Fermilab's Tevatron accelerator has reported a new value for the mass of the top quark, nature's heaviest basic particle. Combining data from the Tevatron's two experimental collaborations gives a top quark mass of 170.9 ± 1.8 giga-electron volts, Kevin Lannon of Ohio State University, Columbus, reported at the meeting.
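    The article does not detail the combination procedure; the Tevatron's actual average accounts for correlated systematic errors across the two experiments, but the simplest version is an inverse-variance weighted mean:

    ```latex
    \bar{m} \;=\; \frac{\sum_i m_i/\sigma_i^{2}}{\sum_i 1/\sigma_i^{2}},
    \qquad
    \sigma_{\bar{m}} \;=\; \Bigl(\sum_i 1/\sigma_i^{2}\Bigr)^{-1/2}
    ```

    Each measurement is weighted by the inverse square of its uncertainty, so the more precise experiment dominates the combined value.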

    Precise knowledge of the top quark's mass is important for determining the likely mass of the Higgs boson, the as-yet-undiscovered particle that physicists believe must exist in order for other particles to possess mass. Combined with the latest value for the mass of another relevant particle, the W boson, the new top mass implies that the Higgs may still lie in the range detectable by Fermilab experiments. The Higgs will be one of the main quarries of the more powerful Large Hadron Collider near Geneva, Switzerland, scheduled to begin operations later this year.

    Cosmic extinction. For years, scientists have pondered fossil evidence that mass extinctions of life on Earth occur at regular intervals, suggesting some periodic cosmic influence. The latest analyses find a 62-million-year cycle in biodiversity, curiously close to the 64-million-year oscillation of the solar system above and below the plane of the Milky Way galaxy. But if radiation from sources within the galactic plane is to blame for biodiversity swings, the periodicity should be half as long, because Earth passes through the galactic plane twice each trip: on the way up, and on the way back. On the other hand, the 62-million-year cycle makes sense if the radiation attacks come from outside the galaxy, and only from one side, say Mikhail Medvedev and Adrian Melott of the University of Kansas, Lawrence.

    “The oscillation of the solar system gives you the right periodicity, but only if you have the effect on the north side,” Medvedev said at the meeting. The reason, the Kansas scientists say, is that the whole galaxy is rushing toward the Virgo Cluster, north side foremost. When the solar system is north of the galactic plane, galactic magnetic fields no longer shield it from radiation assaults emanating from Virgo. A paper describing the analysis has been accepted for publication in the Astrophysical Journal.
