News this Week

Science  16 Jan 2004:
Vol. 303, Issue 5656, pp. 292

    Viral DNA Match Spurs China's Civet Roundup

    Dennis Normile, with reporting by Ding Yimin in Beijing.

    SARS is back. The deadly respiratory virus has been confirmed in one patient—now recovering—in China's southern province of Guangdong. And three other possible cases in the area are under observation. But international health experts say that, in contrast to last year, China is responding aggressively. Officials are quickly isolating patients who develop SARS (severe acute respiratory syndrome), identifying their recent contacts, and sharing information with outside groups such as the World Health Organization (WHO). “This is exactly what [China] should be doing,” says WHO spokesperson Dick Thompson.

    Hoping to cut off the source, Guangdong Province officials also ordered a sweep through farms and food markets to find and kill animals that may harbor the SARS virus, including masked palm civets and raccoon dogs. Some outside observers were initially skeptical of the cull, warning that it could spread the virus if not strictly supervised. But the purge is strongly supported by some researchers, such as a group at the University of Hong Kong (HKU), which provided key evidence that set the animal roundup in motion. The group's leader, HKU virologist Guan Yi, says that the team established a monitoring program in animal markets that identified a viral DNA sequence in civets that precisely matched a sequence found in the confirmed SARS patient.

    Clean sweep.

    Workers spray cages in a Guangzhou market and remove animals thought to harbor the SARS virus.


    Guan thinks the surveillance and intervention were critical: “Maybe we have been successful in avoiding a second outbreak.” WHO epidemiologist Andrea Ellis is more cautious. Although Guan's team made an “important contribution,” she says, “we need to keep in mind that other animals could be involved.” She warns that it is too early to celebrate.

    China's second battle with the virus began in late December after Chinese authorities announced that they had identified a suspected SARS case; they confirmed the diagnosis on 5 January. Since then they have identified three more suspected cases, still pending confirmation. They are all in Guangdong Province—which borders on Hong Kong—where SARS first emerged in late 2002.

    The decision to slaughter an estimated 10,000 civets and a lesser number of other animals in Guangdong's markets came on the same day that China confirmed its first SARS case of the 2003–04 winter. But it is not the first time that Guan and his colleagues in the microbiology department at HKU have helped authorities make a tough call. Since the early 1990s, the group has cooperated with Robert Webster, a flu expert at St. Jude Children's Research Hospital in Memphis, Tennessee, on a program funded by the U.S. National Institutes of Health to monitor viral strains circulating in chickens that could threaten humans. To eliminate a flu that proved fatal to humans in 1997, the group advised Hong Kong to slaughter all the chickens in the territory; Hong Kong followed the advice.

    Last spring, Guan was the first researcher to find the SARS virus in civets and two other species sampled at Guangdong's live animal markets (Science, 30 May 2003, p. 1351). Chinese authorities promptly banned sales of those animals until human cases stopped appearing and the SARS scare passed last summer.

    Virus hunter.

    A team led by Guan Yi of the University of Hong Kong found a match in viral DNA from a SARS patient and a civet.


    Convinced that the live animal trade still posed a risk, Guan started systematically screening civets and other exotic species in Guangdong's markets last October in cooperation with the Guangdong Center for Disease Control and Prevention. They found, he says, “many civets with high [SARS] viral loads.” Skeptical provincial authorities hesitated to take action. But they ordered a cull, Guan says, after the researchers showed that critical genetic sequences were nearly identical in both the human and the civet virus isolates.

    “This time there is no doubt that the [virus] came from the animal,” says Webster. Even though the patient claims never to have eaten or even seen a civet, and none of his contacts show evidence of infection, Webster says, he could have been infected indirectly by virus from an animal.

    Public health experts around the world were generally supportive of moves to get civets out of the markets. “Banning game animals [from the markets] is what should be done,” says Albert Osterhaus, a virologist at Erasmus University in Rotterdam, the Netherlands.

    At the same time, most scientists would like to see more efforts to define the geographical distribution of infected animals and trace the virus back to its presumed animal reservoir, which may be localized. “Animals in different areas may not all have caught the SARS virus,” says Xu Jianguo, director of the National Institute for Communicable Disease Control, a part of the national Center for Disease Control and Prevention in Beijing. His group is now working with other provinces to confirm where the virus is endemic and where it isn't, as well as to search for the reservoir. “This is not the end of the story, it's [just] the beginning,” Webster says.


    President Bush Reaches for the Moon

    Andrew Lawler

    After months of closed-door debate, President George W. Bush was expected this week to put the American space program on a new trajectory. Critics have already attacked his proposal to send humans in the next decade to the moon and then Mars, which was leaked to the media late last week, and it is certain to face high hurdles on Capitol Hill. But Administration officials insist they are ready to back up their rhetoric with funding and political support.

    In addition to discussing a lunar base and, eventually, human expeditions to Mars (Science, 12 December 2003, p. 1873), Bush was also expected to propose retiring the space shuttle and replacing it with a more versatile vehicle capable of achieving those ambitious goals. Administration sources say that the president will ask Congress for an $800 million boost in NASA's current $15.5 billion budget for 2005, with similar annual increases of 5% over the next 5 years.

    “What you'll see is the means to carry it out: the budget, the dollars, the bucks, the capacity to actually do it,” said NASA Administrator Sean O'Keefe in an 8 January interview. He insisted that Bush's proposal will avoid the fate of his father's 1989 call for human missions to the moon and Mars, which Congress ignored and the White House soon forgot. The current president, says O'Keefe, is not only setting the direction but is willing to fight for “the dollars to carry it out.”

    Lunar roving.

    Artist's conception of a base on the moon, part of President Bush's long-range vision for NASA.


    The push for what O'Keefe labeled a “coldly pragmatic” plan comes as a relief to retired Admiral Harold Gehman, who chaired the Columbia accident board. In a 10 January interview, he told Science that NASA has been hampered since the 1970s by the lack of a politically popular vision: “If this becomes an initiative to solve our problem of getting into and out of low-Earth orbit, we're all for it.” His panel strongly recommended retiring the shuttle as soon as practical. NASA currently wants to build an Orbital Space Plane to replace the shuttle later in this decade; the new proposal will include a Crew Exploration Vehicle that likely would be similar in scope.

    The White House announcement comes after several months of internal interagency wrangling. O'Keefe was part of a group led by the National Security Council that hammered out the proposal with little input from Congress, industry, or the scientific community. While applauding the president for seeking a better vision for NASA, Science Committee chair Representative Sherwood Boehlert (R-NY) noted that “there are a wide variety of opinions on our committee” on NASA's future. “Any decisions on the future of manned space flight must be made in the context of budget realities,” he added.

    Democratic presidential candidates reacted very negatively. Howard Dean warned that the program could bankrupt the country, and Senator Joe Lieberman (D-CT) said other national priorities should take precedence.


    Stem Cell Research Could Be Ballot Issue

    Constance Holden

    Can a state on its financial uppers become a mecca for stem cell research? Some optimistic Californians have come up with a plan to raise $3 billion for research involving human embryonic stem cells. But first they need to convince 1 million voters that it deserves to appear on the November ballot.

    In late 2002 California became the first state to pass a law encouraging scientists to develop new cell lines—work that cannot be done with federal money under President George W. Bush's directive of August 2001 (Science, 27 September 2002, p. 2185). But the law provided no new funds, so Stanford researcher Irving Weissman and others joined with Palo Alto financier Robert Klein, father of a diabetic son and member of the board of the Juvenile Diabetes Research Foundation, to design the ballot initiative, which was launched last week.

    The proposed “California Stem Cell Research and Cures Act” would create the California Institute for Regenerative Medicine to administer grants and loans for stem cell research, including nuclear transfer or research cloning. Reproductive cloning would be banned. The money would be raised from the sale of state-backed bonds. About one-third would be spent on separate facilities to keep federally funded research at arm's length.

    The initiative has gained the backing of other prominent scientists, such as Stanford's Paul Berg and California Institute of Technology president David Baltimore. Its sponsors claim that the institute would generate enough tax revenues to offset interest payments for the first 5 years, and that repaying principal could be delayed until the sixth year. Backers claim that the research may easily pay for itself by contributing to treatments that would eventually lower health bills.

    “I think we can recruit the best and brightest,” says Lawrence Goldstein of the University of California, San Diego. “It would make us the center of the world” for stem cell research. Peter Warren, spokesperson for the California Medical Association in Sacramento, says, “We would seriously consider supporting it.” Asked about its chances of passage, Warren says the recent election of Arnold Schwarzenegger as governor suggests that “you can pull off anything in this state.”


    2005 Bush Budget Pulls NSF Schools Funding

    Jeffrey Mervis

    Academic researchers may be left behind by a new wrinkle in President George W. Bush's signature “No Child Left Behind” education reform program. Science has learned that the president's 2005 budget request, due out early next month, would phase out the National Science Foundation's (NSF's) largest program to improve student achievement in science and math and shift responsibility for it to the Department of Education (ED), which now runs a similar program. The change would replace a national competition based on peer review with a congressionally mandated formula to distribute money to every state based on the size of its student population.

    Bush's first budget had proposed the Math and Science Partnership (MSP) as a 5-year, $1 billion initiative to strengthen student achievement by linking university scientists and local educators. NSF heralded the program as a science-based replacement for a decade-long effort to carry out “systemic reform” in dozens of cities and states. Since then, NSF has funded two rounds of grants, totaling $260 million, and expects to receive another $140 million in 2004.

    Congress created the ED program to complement NSF's efforts, and it has grown rapidly from its initial $12.5 million budget to a planned $149 million in 2004. But there's one big difference: The ED money is distributed to states as block grants rather than by peer review once the size of the program tops $100 million (Science, 11 January 2002, p. 265). Sources say that the president's 2005 request doesn't raise the total funding for the combined MSP programs, now about $290 million, and that NSF would receive enough money to finish up projects already under way.

    Proud parent.

    President Bush visited Knoxville, Tennessee, last week to tout the second anniversary of his school reform program.


    The phaseout of the MSP program would be a blow to university researchers, who use NSF funding to support programs in local school districts to train teachers, improve curricula, and devise better ways to measure student progress in math and science. NSF currently supports 52 such projects. This fall, for example, it plans to reintroduce its wildly successful teacher institutes, which helped train a post-Sputnik generation of math and science teachers now nearing retirement age. “The change would not be good,” says Jodi Peterson of the National Science Teachers Association, which has lobbied for both programs. “It doesn't make any sense.”

    No federal officials would talk publicly about the new approach, citing the prohibition on discussing the 2005 budget until the president unveils it on 2 February. Susan Sclafani, who oversees the ED program as counselor to Education Secretary Rodney Paige, says only that “we are delighted to see an increased emphasis on mathematics and science and will work with states to make it a success.” Despite the gag rule, administrators at both agencies privately confirmed that the shift is part of the new budget.

    One source familiar with both programs says that the shift was made because White House officials felt that NSF's current MSP programs “were too close to its previous systemic initiative and not specific to NCLB [No Child Left Behind].” States that receive federal NCLB funds are required to begin testing students in science in 2007, the source noted, “and NSF's programs aren't helping them get ready fast enough.” Another official attributed it to the Administration's desire “to give as much money as possible directly to the states.”

    Although legislators welcome additional state aid, many also think that NSF is better equipped than ED to run a high-quality program with a lasting impact on student achievement. And it's Congress that has the last word on the annual budget. David Goldston, staff director for the House Science Committee, which oversees NSF programs, says that “we would vigorously oppose such a change,” although he emphasized that the committee had not been told anything. “The president chose to put it at NSF for the right reasons,” he adds, “and switching it would be very damaging to the program.”


    Panel to Prepare Plan for Underwater Network

    David Malakoff

    SAN JUAN, PUERTO RICO—After a decade spent standing at the dock, U.S. marine scientists are getting ready to launch a network of ocean observatories. Drawing on 5 days of talks at a meeting here attended by more than 300 scientists and engineers, an independent panel will assemble a detailed plan for a $200 million project. If all goes well, the National Science Foundation (NSF) hopes that the plan can sail through Congress in time for funding to begin in 2006. “We have a bakery full of ideas [that] we have to get down to a breadbox,” says Larry Clark, an administrator in NSF's ocean sciences division.

    Researchers have long craved an escape from the confines of ship-based expeditions that record mere snapshots of long-term ocean changes. Now improved technologies, including buoys that can bounce data off satellites and sensors that can pump vast streams of information through sea-floor cables, may sate their hunger. Three years ago, NSF endorsed the concept of building a trio of new facilities that would continuously stream marine data directly to researchers over the Internet. One observatory would consist of a set of movable buoys moored deep in the open ocean. Another would expand a nascent network of near-shore sensors. The third, and most expensive, element would be a regional observatory that would spread cable-linked sensors and automated submersibles over thousands of kilometers of sea floor.

    Scientific seascapes.

    Researchers are refining plans for ocean observatories, such as this cabled network.


    However, some scientists—including biologists and physical oceanographers—have been lukewarm to the idea. They fret that it might not work and fear that it could drain funds from existing projects. “Chunks of the ocean science community are still figuring out where they fit in,” says oceanographer Kenneth Brink of the Woods Hole Oceanographic Institution (WHOI) in Massachusetts.

    Last week's meeting* was an opportunity to share ideas, said organizers Meg Tivey of WHOI and Oscar Schofield of Rutgers University in New Brunswick, New Jersey. The concepts came in many sizes and shapes. One popular one is to use sea-floor sensors to monitor colliding crustal plates or to dispatch robotic submersibles to check on an erupting underwater volcano. “You could capture events that are rare but important,” says Brink. Other scientists offered ideas for studying extremophile bacteria living within the crust. A few pushed for installing coastal sensors first, because they might produce a quick payoff for policymakers concerned about pollution or fisheries. “We got an incredible amount of advice,” says Schofield.

    The next step is to produce a detailed plan that sets priorities—and totes up what it will all cost. A new NSF-funded planning group led by Brink hopes by the end of the year to have nailed down a half-dozen compelling science goals. That would be followed in late 2005 by a polished blueprint for surmounting a host of technical and logistical challenges, from developing sensors and information systems to finding enough ship time. “We know how to build ships and launch satellites, but nobody's ever put together an integrated ocean-observing network,” says Clark.

    By then, NSF hopes to have secured funding for the first observatories. Ultimately, researchers hope to link the pioneering U.S. sites to similar facilities elsewhere, creating a truly global ocean-observing system.

    In the meantime, there is work to do. “Even if you could put all that infrastructure out there now,” notes Tivey, “we wouldn't have the instruments to hang on it.”

  INDIA

    Book Triggers Attack on Research Institute

    Pallava Bagla

    NEW DELHI—To the scholar, the pen may be mightier than the sword. But last week a Hindu mob in western India inflicted serious damage on more than a millennium of scholarship by ransacking the Bhandarkar Oriental Research Institute in Pune. In addition to the loss of rare and valuable manuscripts and other artifacts, the unprecedented attack on a prominent research facility is being mourned as the latest example of the country's growing religious intolerance.

    The rioters belonged to Sambhaji Brigade, a right-wing Hindu nationalist organization. According to local police, the attack was carried out in response to “disparaging” remarks about the lineage of a legendary Hindu king, Chhatrapati Shivaji, contained in a 2003 book by James Laine, a professor of religious studies at Macalester College in St. Paul, Minnesota. The book analyzes Hindu-Muslim relations through a look at Shivaji's attempts to reduce the influence of Islam in 17th century India. The mob appears to have targeted the institute because Laine's book thanks senior manager Shrikant Bahulkar and other institute researchers for their help. Laine declined comment.

    At a loss.

    Staff of the Bhandarkar Oriental Research Institute survey mob damage.


    On the morning of 5 January, according to witnesses, about 150 people barged into the institute, snapped the telephone lines, ransacked the cupboards, tore thousands of books, and damaged the writings on palm leaves, rare artifacts, and old photographs in the library. The mob also grabbed several rare books, say police, who have charged 72 persons with trespass, rioting, and arson.

    The independent institute, founded in 1917, has a staff of 50 scholars and a collection of 120,000 books covering Indian culture, Indus Valley civilization, Sanskrit texts, and writings on the Ayurveda, the ancient Indian system of medicine. The damage to the building and equipment is estimated at $250,000, according to trustee M. K. Dhavalkar, but much of the collection is in disarray and may be difficult or impossible to repair. The government has provided $30,000 in relief, and local citizens have already raised $6000.

    In November Laine's publisher, Oxford University Press (OUP), apologized for the book and pulled it off the shelves. “It was creating some problems,” says OUP's Susan Froud, “so we decided to withdraw it from the Indian market. It's a rather sensitive matter.” The book remains available on the publisher's Web site.


    Panel Urges Shakeup of NOAA Research

    David Malakoff

    Under assault from Congress, U.S. oceanic and atmospheric science programs appear headed for a significant shakeup. An advisory panel last week recommended that the National Oceanic and Atmospheric Administration (NOAA) strengthen its sprawling research effort by consolidating some laboratories, appointing a new high-level research czar, and crafting a new long-term science plan.

    With a research budget of $650 million, NOAA is one of the world's largest funders of marine and atmospheric science, ranging from efforts to predict the weather to projects that tally fish populations. Recently, however, critics have stepped up complaints that some NOAA science programs are lackluster and disjointed. Last year, Congress asked the agency to examine ways to streamline—or even eliminate—one of its major research arms, the $350 million Office of Oceanic and Atmospheric Research (OAR), which administers roughly half the agency's R&D funds, operates a dozen laboratories, and employs about 900 people. To conduct the review, NOAA Administrator Conrad Lautenbacher last October created a five-member team of three outside experts and two senior agency officials; the panel presented its preliminary findings on 6 January in Washington, D.C.


    In an often bluntly worded report,* the panel concluded that NOAA sponsors a lot of important science but has failed miserably in explaining itself to Congress and the public. “There is no clear rationale for where some research programs are located … or how they fit into the big picture,” said climate scientist Berrien Moore III of the University of New Hampshire, Durham, the lead author of the draft report. The panel urges the agency to quickly write a comprehensive strategic plan for the next 20 years. And although the panel rejected the idea of dismantling OAR and giving its programs and labs to other offices, it urged NOAA to appoint a high-ranking “science czar” able to set research priorities and move money among programs. The agency also needs to consolidate some of its 40-plus laboratories, the panel said, focusing first on OAR's dozen facilities, six of which are in Boulder, Colorado. The panel stopped short of laying out specific mergers or program shifts, however, saying it will provide more detail in a final report due in May that will form the basis for NOAA's response to Congress.

    The advice is drawing generally positive reviews, although some observers predict that several recommendations—such as merging labs and appointing a science czar—are likely to get bogged down in bureaucratic and political wrangling. Still, Lautenbacher says, “most folks would look at the principles outlined in the report and find them pretty reasonable.” And new OAR head Richard Rosen, who was on the panel, predicts that some change “is inevitable.”


    Ebola Outbreaks May Have Had Independent Sources

    Gretchen Vogel

    Since 1995, when the world was horrified by a deadly Ebola outbreak in Kikwit, Zaire, central Africa has suffered nearly a dozen outbreaks of the hemorrhagic fever, which can kill more than 80% of its victims. The most recent bout killed at least 29 people in the Republic of Congo between October and December 2003. The human toll is tragic enough, but the disease is also threatening one of the world's largest populations of chimpanzees and gorillas: Researchers estimate that it has killed thousands of great apes in the past 5 years (Science, 13 June 2003, p. 1645).

    Public health authorities and conservationists are urgently trying to pinpoint the source of the continuing—and apparently increasing—outbreaks. They suspect that the virus lurks in a species that is somehow impervious to the disease but can infect vulnerable animals. One of the main puzzles is whether the recent outbreaks are all part of one larger epidemic that is spreading steadily through the forest from diseased animals to new victims, or whether each cluster stems from an independent introduction of the virus from its elusive host. Now, a team of virologists, epidemiologists, and veterinarians reports on page 387 that the evidence points to multiple, independent introductions.

    Eric Leroy of the Institute of Research for Development in Franceville, Gabon, and his colleagues sequenced virus samples from human and animal victims of five outbreaks in Gabon and the Republic of Congo between 2001 and 2003. In the five outbreaks, they found eight distinct strains of the virus. Previous studies have suggested that the Ebola virus is relatively stable; isolates from nine human patients infected during an outbreak in 1996 and 1997 were identical, and sequences from a 1976 outbreak in Zaire and a 1996 flare-up 3000 kilometers away in Gabon differed by less than 2%. Therefore, the authors of the new study conclude, the eight distinct strains they found in Gabon probably diverged over decades or even centuries, suggesting that they came from different sources.

    Tracking a killer.

    Ebola passes from an unknown reservoir to humans. In 1995 it killed 255 people in Kikwit, Zaire.


    Not everyone is convinced. Peter Walsh, an ecologist at Princeton University in New Jersey, who has argued that the human and ape outbreaks are part of a single epidemic, says the apparently different strains do not rule out a spreading wave front of Ebola. If the virus passes through multiple animal hosts as it moves through the forest, it could quickly collect enough mutations to produce the observed genetic differences, he says. “Under their theory, [Ebola] should be popping up all over the place, but it is always in areas adjacent to previous outbreaks,” he says. The pattern “provides strong evidence that there is not a magic bat cave that is spitting out infections in multiple directions.”

    Both scenarios are theoretically possible, says Les Real, a disease ecologist at Emory University in Atlanta, Georgia. The new set of data “leaves us where science almost always leaves us: unsure. It is really pointing to the necessity of finding the reservoir,” he says.

    The answer may not be so far off. Teams searching for evidence of the reservoir “are making good progress,” says William Karesh of the Wildlife Conservation Society in New York City, a study co-author. “I'm confident they'll have it nailed down this year.” Although researchers have been looking for evidence of the virus in hundreds of species, a leading suspect has long been bats, Karesh says. Lab experiments have demonstrated that bats can be infected with the virus and even shed it without becoming deathly ill, he says. But it has been difficult to pin evidence on any of the dozens of bat species native to the region. And co-author Pierre Formenty of the World Health Organization in Geneva raises a disturbing possibility: The different strains suggest that there may be more than one host species, perhaps insects as well as bats, rats, or shrews, he says.

    One observation is clear: The human outbreaks can all be traced to hunters or other villagers coming in contact with dead animals in the forest. Both sides agree that more careful monitoring of animal outbreaks will help prevent further human cases, and continued efforts to discourage hunting of great apes and other wild animals would benefit both apes and humans. The new study “does give us more leverage to work with local people about why they shouldn't be hunting these animals,” Karesh says, because it suggests that the virus is endemic in the forest and that the danger does not decrease after an epidemic has passed through a region. If that message gets across, he says, it's “going to reduce a tremendous amount of pressure on the great ape populations.”


    Hall Found Guilty of Lesser Misconduct

    Leigh Dayton, writing from Sydney, Australia.

    SYDNEY—A long-running case of alleged scientific misconduct involving a prominent Australian medical researcher and clinician shows no signs of ending. The latest twists involve a decision last month by the vice chancellor of the University of New South Wales (UNSW) to throw out the most serious allegations against renal transplant specialist Bruce Hall, followed by a vote last week of the university's governing council to explore the release of the entire report by an independent body that found Hall guilty of scientific misconduct.

    The original complaint against Hall, brought by two members of his lab and a graduate student, alleged that he had misrepresented and fabricated experimental results, manipulated authorship credit in presentations and papers, and provided false data in a grant application to the National Health and Medical Research Council. They also accused Hall of workplace intimidation.

    An initial university inquiry cleared Hall of wrongdoing. But an outside panel convened by the UNSW council and led by former High Court Chief Justice Gerard Brennan found early last year that he had “acted with intent to deceive” and with a “reckless disregard of the truth.” On the basis of those findings, the South Western Sydney Area Health Service—which houses Hall's laboratory and pays part of his salary—removed Hall from all supervisory duties.

    More questions.

    A University of New South Wales finding against Bruce Hall may not be the last word.


    On 24 December Vice Chancellor Rory Hume announced that his own review of the case led him to clear Hall of six allegations of scientific misconduct. Instead, he found that Hall had committed five lesser acts of “academic misconduct” warranting censure but not the loss of his job or laboratory. “None of my findings … struck at the heart of Professor Hall's science … and none warranted consideration of dismissal,” Hume wrote in a 43-page decision, which also cited mitigating circumstances such as illness and workplace stress.

    Hall has admitted making a “trivial” error on a grant application but denies any serious misconduct and has told reporters that he may fight Hume's decision. He has been highly critical of the 11-volume Brennan report and tried to obtain a court order suppressing its publication, although a judge's summary of the findings was eventually released. The UNSW council voted last week to ask its lawyers whether it could release the full report.

    Some leading academics have also criticized Hume's decision, but for the opposite reason. Ian Lowe, a policy analyst at Griffith University in Queensland, calls it a “blatant” case of favoring “the powerful over the weak.” Some academic members of the UNSW council, who wish to remain anonymous, agree with Lowe. They also share his fear that Hume's actions will “tarnish” the reputation of Australian universities.

    Meanwhile, three government bodies are pursuing separate investigations into Hall's alleged financial mismanagement of his grants and the university's handling of the whistleblowers' complaints.


    NSF Told to Open Process for Picking New Projects

    1. Jeffrey Mervis

    The National Science Foundation (NSF) may be the agency par excellence for meeting the needs of individual investigators and small research teams. But it could do a better job of deciding what big facilities the next generation of scientists will need and which ones should take priority.

    That's the conclusion of a National Research Council (NRC) report* on how NSF goes about choosing and building the likes of the Laser Interferometer Gravitational Wave Observatories in Louisiana and Washington state or the Atacama Large Millimeter Array in Chile. “I don't think that NSF's culture is appropriate for big projects,” says NRC panel chair William Brinkman, a research physicist at Princeton University and former head of research at Bell Laboratories. The private process that NSF uses so effectively to judge individual proposals needs to be more transparent and more inclusive when hundreds of millions of dollars are at stake, he says: “We hope that this report nudges them in that direction.”

    The account, called Major Research Equipment and Facilities Construction, represents less than 5% of NSF's annual $5.5 billion budget. But federal legislators began taking a closer look at it a few years ago after hearing from scientists that their projects had been approved but were awaiting funding (Science, 12 July 2002, p. 183). A proposal to build a series of ecological observatories, appearing suddenly in NSF's 2001 budget request after leapfrogging older projects, raised additional questions about how NSF sets its priorities (Science, 20 June 2003, p. 1869). Congress was also troubled by a series of reports from NSF's inspector general criticizing how the agency kept the books on some projects.

    Driver for change.

    William Brinkman says NSF needs a better road map for building large facilities.


    Responding to these concerns, NSF last spring hired a senior administrator to bird-dog every project from cradle to grave. The NRC report, requested by Congress, addresses the more fundamental question of how to ensure, given the increasing competition for large facilities, that NSF's decisions are fair. The answer, according to Brinkman, is to let the sun shine in.

    “NSF needs a road map for large research facilities that it is planning to build over the next 10 to 20 years,” Brinkman says. “Let each of NSF's [disciplinary] divisions hold meetings to hear new ideas and decide which ones deserve a closer look.” Brinkman pointed to the recent Department of Energy report setting priorities for new facilities as a model (Science, 14 November 2003, p. 1126) and observed that some fields, notably astronomy, have a long and successful history of carrying out such long-range planning exercises. Such a road map, updated regularly and including lifetime cost estimates for every project, would also allow NSF to squeeze as much science as possible out of every dollar invested.

    “Peer review behind closed doors is fine for individual grants. But the process for choosing large facilities needs to be as open as possible,” Brinkman says. “Granted, these studies take a lot of time and effort. But how else are you going to make sure that everybody has had a chance to be heard?”

    The report makes clear that the final say on projects must remain with NSF's director and its presidentially appointed oversight body, the National Science Board. Board chair Warren Washington says he likes the idea of a road map, and “I agree that we need to broaden participation.” Toward that end, Brinkman says, NSF has made significant progress in declaring which projects are in the queue and in managing those already in the pipeline since Congress and the community first started beating the drum for reform. “But I also have to ask why the heck they didn't do it faster.”

    • * Setting Priorities for Large Research Facilities Projects at NSF (2004).


    CESR Launches Last Campaign

    1. Adrian Cho*
    1. Adrian Cho is a freelance science writer in Grosse Pointe Woods, Michigan.

    Physicists working with Cornell University's CESR particle collider embark on the venerable machine's final mission: to decipher the strong force that binds quarks to one another

    ITHACA, NEW YORK—Perforating the woolly hum of electronics, a voice crackles over the intercom: “Tuning is complete. Experimenters please acknowledge. Tuning is complete.” In a windowless, slightly disheveled room that has the feel of a partially finished basement, technician Donald Beyer III sits before a small wall of computer monitors. He makes several clicks with his computer's mouse and studies the screens as brightly colored graphs climb and numbers increase. Recorded applause issues softly from a speaker, followed by the playful strains of funk music. Leaning toward the microphone to his right, Beyer responds smartly: “CLEO acknowledges—and thanks you.”

    Thus, on a blustery Friday night here another run begins at Cornell University's Wilson Synchrotron Laboratory. In a circular tunnel not far below, electrons and their antimatter partners, positrons, whirl at near light speed through the 768-meter-long Cornell Electron Storage Ring (CESR). Speeding in opposite directions, the electrons and positrons slam into each other to produce more massive particles in the heart of the accelerator's mate, the three-story-tall particle detector CLEO, which squats in a cavernous hall on the other side of a cement wall. Those exotic particles quickly explode into more-familiar ones, and barrel-like CLEO faithfully tracks the shards.

    For nearly 25 years, CESR has cranked out collisions and CLEO has studied the particles they produce. Now the two machines are at it again, as they embark on their last major project. When this series of experiments is finished in 3 or 4 years' time, CESR and CLEO will stop taking data for good. The end of their romance will mark the closing of an era in particle physics, for Wilson Lab is the last major particle physics laboratory in the center of a university campus. No longer will professors or graduate students be able to stroll from lecture hall to laboratory to tinker with a world-class high-energy physics experiment. A tie to the field's less formal and more freewheeling past will be cut. “I do think something will be lost,” says Cornell's Albert Silverman, who directed construction of the original CLEO detector. “But it's inevitable.” High-energy physics “no longer fits a university.”

    In the world of particle physics—the exemplar of big science—Cornell's Wilson Lab is the little lab that could. Whereas more than 1000 physicists may work together on a single particle detector at larger national or international laboratories, no more than about 250 physicists have worked at Wilson Lab at any one time. Other laboratories have annual budgets pushing into the hundreds of millions of dollars, but the Cornell lab gets by on $20 million each year from the U.S. National Science Foundation. Yet, through a combination of ingenuity, flexibility, daring, and plain good luck, researchers working on CESR and CLEO have managed to stay at the forefront of particle physics.

    Hail, CESR!

    Since 1979, the Cornell Electron Storage Ring has produced world-class physics.


    “Per person, they have accomplished much more than anyone else,” says Martin Perl of the Stanford Linear Accelerator Center (SLAC) in Menlo Park, California, who worked on CLEO briefly in the 1990s. Gordon Kane, a theoretical physicist at the University of Michigan, Ann Arbor, attributes the lab's success to the relative informality of the place. “They do things the way you do them in your kitchen,” Kane says. “So, of course, they get a lot done.”

    And now physicists at Wilson Lab have found themselves one last meaty task by redirecting their efforts in an unusual way. Whereas particle physicists generally strive for ever-higher energies, researchers at Wilson Lab are tuning their accelerator to lower energies to produce particles that have already been studied and look for some that may have been missed. The new high-precision measurements should nail down parameters and test new theoretical tools that are crucial for making sense of data taken at higher energies, says Cornell physicist and former lab director Karl Berkelman, who shares the Friday evening shift with Beyer. “CLEO keeps going,” he says, “because it keeps reinventing itself.”

    Tops at the bottom

    Wilson Lab nestles into a steep hillside beside a creek that runs into the woods and down into one of Ithaca's famous gorges. Apart from its gracefully rounded corners, the chocolate-brown brick building looks like a small factory. The accelerator tunnel loops behind the building northward, 15 meters below a parking lot and an athletic field.

    Within the tube run two accelerators. Hugging the inner wall is CESR's predecessor, a 35-year-old synchrotron that now accelerates the electrons and positrons and injects them into the newer machine. CESR runs along the outer wall, bigger and beefier, a mechanical caterpillar consisting of orange, blue, and yellow magnets and shiny steel “RF cavities” that push particles along on electromagnetic waves. Here and there tiny stalactites hang from the tunnel's white ceiling, and in places technicians have cobbled together plastic shields to protect the machines from dripping water. Viewed from the tight confines of the tunnel, CESR looks rather humble and homemade.

    And yet, for more than a decade it produced the world's most intense colliding beams. Thanks to that torrent of data—and some extraordinary good fortune—CLEO researchers have played a key role in fleshing out the reigning theory of particle interactions, the Standard Model.

    In the 1970s, while researchers at other labs planned higher energy electron-positron colliders, researchers at Cornell designed CESR to fit inside the tunnel they already had. At the time, physicists knew that the protons and neutrons in atomic nuclei consist of smaller fundamental particles they dubbed the up and down quarks. They also knew that those quarks possess a pair of heavier cousins, the charm and strange quarks, which appear in particles generated in high-energy collisions. And some theorists suspected that there were at least two more quarks, the top and bottom. But no one knew how massive they might be.

    Then, in 1977, 2 years before CESR collided its first particle beams, researchers at the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, discovered the bottom quark. Its mass was roughly five times that of a proton, smack in the middle of CESR's energy sweet spot. “The fact that the bottom quark was in the range of the collider we were designing was just dumb luck,” says Cornell's Silverman.

    Throughout the 1980s and '90s, Wilson Lab researchers pumped out particles called B mesons, which consist of a bottom antiquark and an up or down quark. CLEO researchers concentrated on the so-called weak interactions through which the bottom quark inside the meson decays into a charm or up quark. At the same time, CESR researchers developed clever schemes to cram ever more electrons and positrons into the ring. “There was a period of about 15 years where we had the best tool for doing that work,” Silverman says.

    Old school.

    With its perpetual dark and motley electronics, the CESR control room harks back to the early days of particle physics.


    In the end, however, Wilson Lab physicists found themselves hoist with their own petard. Their measurements and others showed that the weak decays of B mesons might yield new insights into the subtle flaw in the mirrorlike symmetry between matter and antimatter, an imbalance known as CP violation that's key to understanding why the universe contains so much more matter than antimatter. Four years ago, two specially designed “B factories”—one at SLAC and the other at the Japanese High Energy Accelerator Research Organization (KEK) in Tsukuba—started spewing B mesons at a far greater rate than CESR can. In 2001, experimenters at both laboratories observed CP violation in B meson decays.

    So Wilson Lab researchers have moved down in energy to measure the properties of other particles. Ironically, the data they're collecting may prove crucial to understanding the results from the B factories.

    Pouring on the charm

    One door down from the CLEO “counting room” where Beyer and Berkelman monitor the flow of data—and separated from it by a row of bookshelves serving as a wall—lies the CESR control room. Stepping into it is like stepping back in time, at least with one foot. Whereas the control rooms of other accelerators now gleam with bright lights and fancy computer monitors, the CESR control room remains almost perpetually dark, as once was common. A broad console stretches across racks of electronics festooned with old-fashioned cables, dials, and switches. Yet, within the racks, old and new mingle; a sleek new digital oscilloscope sits cheek-by-jowl with an aging television screen. “At first all the knobs and switches are kind of overwhelming,” says operator Lawrence Wilkins. “There are over a thousand things you can adjust.” Seemingly, none of them is clearly labeled.

    Researchers have tuned CESR to produce D mesons, particles that consist of a charm quark, which is roughly one-and-a-half times as massive as a proton, and an up or down antiquark. And they have shifted their focus from the weak interactions, through which heavier quarks decay to lighter ones, to the strong interactions, through which quarks and antiquarks bind to one another by exchanging shadowy particles called gluons. “The strong interaction is like a curtain over the window” to the weak interaction, says Ian Shipsey of Purdue University in West Lafayette, Indiana.

    That's because the strong force is so strong that it's physically impossible to isolate a quark and watch it decay. Instead, researchers must study the decays of particles that contain quarks, whose tangled webs of strong interactions inevitably affect the results in ways that are nearly impossible to calculate.

    Recently, some theorists have claimed that a computer-intensive technique called lattice quantum chromodynamics (lattice QCD) can finally provide truly precise calculations of the effects of the strong interactions (Science, 16 May 2003, p. 1076). That claim is controversial, and to test it, Wilson Lab researchers will compare lattice QCD predictions against their very precise measurements on D mesons, says Cornell's David Cassel. “If theorists can successfully calculate the parameters that we can measure,” he says, “that will give us a great deal of confidence in their calculations of the parameters in B decays, which we desperately need.”

    CLEO researchers will also measure the precise rates at which D mesons decay into specific combinations of lighter particles. Those measurements will yield a more immediate payoff for physicists studying B mesons, because B's decay primarily into D's. So to know the absolute probability that a B meson will decay in a particular way, researchers need to know the absolute probability of the subsequent D meson decay. Finally, physicists at Wilson Lab will hunt for oddball particles that contain no quarks but only gluons. The theory of the strong interaction predicts that such “glueballs” must exist, but none has been definitively identified yet. In formulating their assault on the strong interaction, Wilson Lab researchers have “found an important piece of physics that had been left behind,” says Michael Witherell, director of Fermilab and a former member of the CLEO collaboration.

    Dialing the accelerator down to lower energies isn't as simple as turning down a dimmer switch, however. As the electrons and positrons zing around the ring at nearly 400,000 revolutions per second, they radiate x-rays, and that radiation helps cool the beam so the particles can snuggle into tightly squeezed bunches. At lower energy, that helpful energy loss declines dramatically, so CESR researchers have installed special “wiggler” magnets that shake the beam back and forth and squeeze out x-rays. In the new configuration, the wigglers provide 90% of the cooling, says Cornell accelerator physicist David Rice, far more than at any other accelerator.

    Taking chances

    A curious mixture of the practical and the purely aesthetic flavors Wilson Lab. Full-scale blueprints of particle detectors, pieces of accelerators, displays of key particle decays, and pictures of important moments in lab history adorn the paneled hallways. In a stairwell hangs a reproduction of John Singer Sargent's controversial portrait Madame X. In the lab's upper entrance, an enigmatic sculpture—the figurehead from a wooden ship—greets visitors. The jumble must have pleased the lab's founder, the eponymous Robert Rathbun Wilson, who died in 2000, and whom all credit for establishing the institution's can-do attitude.

    Wilson arrived at Cornell in 1947, a time when many major universities had particle accelerators. With right-hand man Boyce McDaniel, who died in 2002, he spent 2 decades building ever-more-powerful machines, including the synchrotron that preceded CESR, before leaving to direct the construction of Fermilab in 1967. Wilson possessed an artist's refinement and compared the value of high-energy physics to that of great poetry or painting, but he also had a ranch hand's sense of how to get things done quickly and cheaply. “If we had any secret in constructing machines rapidly and at not great cost,” he once wrote, “that secret was our willingness, almost our eagerness to make mistakes—to get a piece of equipment together first and then to change it so that it will work.”

    Big machine on campus.

    The last particle physics accelerator at a university, CESR circles beneath the Cornell running track.


    Although researchers at Wilson Lab may not be as plainly reckless as Wilson could be (he also writes of crawling into a magnet so strong it made him see strange colors), they have maintained a tradition of taking risks. That's true not only technologically but personally as well, says particle physicist Persis Drell, director of research at SLAC and a former member of the Cornell faculty. “People [at Wilson Lab] are flexible in what they're willing to do and what they're willing to try,” she says. “Maybe it's the long winters, but they get more out of people than any other place I've ever been.”

    By maintaining unusually close ties between particle and accelerator physicists, professors and students, the homey lab has fostered innovations in accelerator technology that may constitute its most far-reaching legacy, says John Seeman, an accelerator physicist at SLAC who was a graduate student at Wilson Lab. “There was a sense that, even as a student, if you had a good idea they would find a way to do it,” Seeman says.

    Now, however, Wilson Lab researchers are preparing for what may be their biggest challenge: sustaining that culture of innovation without the machine in their basement. When the current study of D mesons and glueballs ends around 2007, CESR will no longer generate particle collisions for CLEO. The accelerator may continue to run for a while as an x-ray source, but CESR's reign as a leading particle physics machine will come to an end.

    Wilson Lab researchers face that eventuality matter-of-factly. Many are preparing to pour their efforts into a proposed linear electron-positron collider that will stretch more than 30 kilometers in length and may reveal the properties of a whole slew of new particles (Science, 21 February 2003, p. 1168). Decked out with its wigglers, CESR is already a prototype for the two “cooling rings” needed to compress the gargantuan collider's beams. Particle physicists are hoping that Wilson Lab will serve as one of several centers around the world from which the collider may be remotely controlled. The little lab may play a big part in the international project thanks, once again, to its strength in both accelerator and particle physics, says lab director Maury Tigner. “That's going to give us an opportunity to make contributions that are disproportionate to our numbers,” says Tigner, who also heads the International Linear Collider Steering Committee.

    For the moment, however, there are D mesons to be made and studied. So while the wind whips snow flurries along the ground outside, 15 meters below, matter rushes into antimatter, otherworldly particles are born and die, and slowly fragments of a deeper understanding accumulate in the heart of a grand machine.


    Joining a Trend, Scientists Increasingly Say 'Call My Agent'

    1. Trisha Gura*
    1. Trisha Gura is a writer in Cambridge, Massachusetts.

    Temporary work doesn't just mean doing clerical work; a growing number of high-level scientists are discovering that they like the independence it offers

    When cell biologist David Voehringer finished his postdoc in the Stanford University laboratory of Leonard Herzenberg, he faced a chasm in his career. Ready to strike out on his own, Voehringer drew up a list of contacts and made inquiries. But the market in 2001 was crashing, and highly qualified candidates like him had flooded it with résumés. He recalls the search for an independent job as “really the first time that a lot of us are faced with chronic rejection.”

    But Voehringer did get an encouraging call—from a representative at Lab Support, a Calabasas, California-based agency that places scientific personnel in temporary assignments. Would Voehringer be interested in a 3-month stint at a biotech company called Sugen, a South San Francisco firm specializing in signal transduction research? It wasn't exactly what Voehringer was looking for, but he took the job anyway. He picked up industry experience and ultimately landed a permanent position at Cell Biosciences, a biotech company based in nearby Palo Alto.

    Like Voehringer, a growing number of high-level scientists are discovering the merits of temporary employment. Gone are the days when temping meant serving as a clerical worker, filling in for a secretary on vacation. Instead, industry and government employers are now recruiting science experts on a contingency basis to complete projects, add support at times of peak demand, or simply test for a good fit before bringing in a worker permanently.

    Scientists, for their part, are using temporary positions to get a toehold in the marketplace or to jump fields in midcareer. Some need visa sponsorship, which a temp agency can offer. Others want a chance to take care of children, pursue hobbies, or enter into quasi-retirement.

    “People are recognizing that the dynamic of the economy is far different [from what it was] a generation ago,” says Steven Berchem, vice president of the American Staffing Association (ASA) in Alexandria, Virginia. “Businesses are being more conservative about hiring. And more and more workers are seeking flexible work arrangements.” Indeed, the number of contingency workers is rising across the board. And some scientists who find the traditional academic track too rigid are using temping as a new way out.

    Guns for hire

    According to ASA, which conducts quarterly surveys of temp agencies, the number of workers performing temporary or contract work through staffing agencies reached 9.7 million in 2002, up from 800,000 in 1986. Of those, about three-quarters transitioned to permanent jobs.

    The numbers do not include job-seekers who bypass agencies and enter into direct consulting contracts with employers. To get an estimate of this group, Kelly Services, a Fortune 500 temp agency based in Troy, Michigan, conducted its own surveys. Kelly found that companies have been increasing their hires from “the free agent workforce”—defined as temporary and contract employees, freelancers, independent professionals, and consultants. This pool increased from 22% of the U.S. workforce in 1998, to 26% in 2000, and 28% in 2002—amounting currently to 30 million people.

    Big leap.

    Faced with “chronic rejection” in the job market, Stanford biologist David Voehringer became a scientific temp.


    Scientists are part of this trend, but just how big a part isn't known. “Nobody has the data yet,” says Berchem, whose group is now conducting an analysis based on unpublished data from the Bureau of Labor Statistics.

    There's general agreement that big pharma is leading the way in the use of temps, with biotech and food-related companies a distant second and third. Many drug companies need a fluid workforce. “A company will call and say, ‘I need a high-level organic-synthesis chemist right now who will not necessarily continue on after the project gets to the next step in the pipeline,’” says Shelly Carolan, vice president of Lab Support. Most drug companies making such requests are based on the U.S. East Coast, which translates into a booming local market for scientific temp positions.

    Different challenges can be found on the West Coast, where start-up biotech prevails. For these smaller firms, “the issue is burn rate,” says Chris Jock, director of Kelly Scientific Resources, a specialty arm spun off in 1995 as part of the larger temp agency, which employs about 5000 scientists globally. Biotech start-ups have to grapple with where and how fast they spend limited investment dollars. More often than not, success hinges on only one or two technologies. If they hit a snag, the company has to pare down expenses fast, usually laying off employees. If the product takes off, the company has to expand quickly to stay ahead of competitors.

    Biotech is just beginning to recognize temp workers as a solution, as indicated in a report published in October by the U.S. Department of Commerce's Technology Administration and Bureau of Industry and Security. The agency conducted a survey of 1031 U.S. biotech firms, finding that nearly half of the small firms (fewer than 50 employees) reported that more than 20% of their biotech-related positions had been unfilled for more than 3 months (compared with less than 1% for companies with more than 50 employees). Lacking the capital to offer signing bonuses and other big cash incentives, these companies had to look for more creative ways to attract talent.

    Enter the contingency scientist. With computer specialists leading the way, these temp workers and consultants have become part of the standard workforce at biotech firms, which are willing to meet high compensation demands, as long as workers accept a limited time commitment. In 2002, biotech companies employed 94% of their computer specialists on a contract basis. In fact, the hiring rate of contingency computer scientists rose 21.8% from 2000 to 2002, according to the government survey.

    “Companies need temporary workers for different reasons; for biotechs, the issue is burn rate.”

    Chris Jock, Kelly Scientific Resources


    Temp employees with the requisite skills can jump from company to company, commanding a high salary as they gain experience. Alex Chang, an account representative at Kelly Scientific, tells how he helped launch one of these stars. A company, which Chang declines to name, was having trouble finding someone with expertise in both molecular biology and computer programming. Chang, himself a Ph.D. in molecular biology, says he came up with a perfect match: an individual who, “knowing his net worth, liked to work on a contractual basis,” and a company that wanted to hire a temporary employee. Kelly is now an agent for this scientist, soliciting employment proposals that the worker might not have time to seek out himself.

    Chemists and other clinical specialists who can help out with drug trials are also in hot demand. They are popular at companies that specialize in clinical testing of potential drugs (called contract research organizations, or CROs). Pharma firms had been outsourcing to CROs but having difficulty “bringing projects in on a timely basis and within budget,” says Ray Cooke, director of clinical research at Kelly Scientific. Slippage in the late phases of a drug trial can cost the company as much as $1 million a day, Cooke says. So both pharmas and CROs are relying on temps to speed the projects through.

    Ground-floor view

    There are advantages to short-term employment, but also uncertainties. Take the case of analytical chemist Mei Hu, who says her experience as a temporary scientist was both opportune and ever-changing. She arrived from Japan in 1999 to work as a postdoc at Xenobiotic Laboratories, a biotech company in Plainsboro, New Jersey. She then got an offer from pharma giant Roche, but the company, faced with dozens of equally qualified applicants, hesitated because Hu needed approval to transfer her H1 visa from one employer to another. So she registered with a temp agency, which became her visa sponsor, enabling her to move to a new job.

    “Although [the work] was temporary, it was much better than not having any chance at all,” says Hu. Within 3 years of temping, and amid several offers of permanent employment, she ended up accepting a permanent job at Amicus Therapeutics, a small biotech company in North Brunswick, New Jersey.

    Often scientists choose contingency work for personal reasons. Voehringer points to colleagues at Sugen, who used their temp positions to get back into the workforce after staying home with children—for example, a nurse who used the opportunity to transition into the biotech market.

    But perhaps the fastest-growing group of temps is aging baby boomer scientists. Not quite ready to retire, they see contingency work as a way to stay employed while having more free time for family or other interests. And as they move into full retirement, demand for high-tech workers is likely to rise, abetted by technology's ever-expanding need for skilled labor. “In a couple of short years, we will probably be facing another labor shortage,” ASA's Berchem predicts. “That bodes well for highly educated people who can essentially write their own ticket.”

    But there are disadvantages to temporary employment. Apart from the instability, benefits may be meager; some temp agencies will not pay for health care or offer retirement plans. Although companies such as Kelly and Lab Support say they do provide competitive health and retirement perks based on the length of time that workers remain with the agency, often these packages fall short of what permanent employees would receive.

    Upscale shift.

    Temporary jobs for educated and highly trained workers appear to be growing more rapidly than traditional jobs.


    One of the most vexing issues a science temp must deal with is proprietary information. If a temp employee develops something brand-new, who owns it? Generally, an invention belongs to the employer, but the degree to which the individual shares in the rewards varies from company to company. “We spell it all out in the contract,” says Nancy Allen-Smith, vice president of human resources at BD Biosciences in San Jose, California. Contracts also often include a prohibition on sharing information from one company with another or—in the case of consultants—on working for direct competitors.

    Social restrictions, which are more subtle, also separate temps and permanent employees. If a company gets too chummy with a temp worker but denies benefits and perks, it can become vulnerable to a lawsuit claiming that standard employee benefits have been withheld from what amounts to a full-time worker. Some companies try to steer clear of such risks, for example, by limiting certain social events—such as Christmas parties—to permanent employees only.

    Despite such negatives, however, both employers and employees are upbeat about the future of temping. “For niche workers who are educated,” says Carolan of Lab Support, “the forecast is very bright.”


    An Early Start for Greenhouse Warming?

    1. Richard A. Kerr

    SAN FRANCISCO, CALIFORNIA—A record 10,000 earth, ocean, atmospheric, and planetary scientists gathered here last month for the American Geophysical Union's fall meeting.

    Humans held sway over climate long before the belching smokestacks and spewing tailpipes of the past century or two, according to paleoclimatologist William Ruddiman. Nineteenth-century industrialization didn't kick off global warming, Ruddiman told a packed ballroom at the meeting. Instead, humans started slowly ratcheting up the thermostat as early as 8000 years ago, when they began clearing forests for agriculture, and 5000 years ago with the arrival of wet-rice cultivation. The greenhouse gases carbon dioxide and methane given off by these changes in land use would have warmed the world by about 0.8°C, Ruddiman calculates.

    “This doubles what we thought humans have done” in the way of greenhouse warming, Ruddiman said. Paleoclimatologist Thomas Crowley of Duke University in Durham, North Carolina, says, “It's probably the most interesting and thought-provoking talk at the meeting. I liked it.”

    Humans must go way back as climate-makers, argues Ruddiman, a professor emeritus at the University of Virginia, Charlottesville, because both major greenhouse gases began unnatural rises just when humans began altering the world's vegetation. And the surges diverge from patterns of changing greenhouse gas concentrations that held for hundreds of thousands of years before humans took up agriculture after the last ice age, he says. Those patterns are recorded in the Vostok ice core from Antarctica, which trapped air as the ice formed over the past 400,000 years, as four ice ages came and went. According to the Vostok record, atmospheric concentrations of both carbon dioxide and methane declined with each of the glaciations and rose going into each of the warm interglacial periods.

    Stoking the greenhouse.

    Wet-rice cultivation starting 5000 years ago may have produced enough methane to begin warming the climate.


    The waxing and waning of greenhouse gases also stayed in step with the combined effects of Earth's orbital variations: its changing tilt, wobble, and orbital shape. Orbital variations are thought to be the pacemaker of the ice ages, at least in part through their influence on greenhouse gases. The physical link between orbital variations and carbon dioxide is murky, but the orbital redistribution of sunlight across the globe apparently redistributes precipitation as well. That in turn increases or decreases the extent of wetlands and thus the amount of methane they produce through the rotting of vegetation.

    Orbital variations neatly paced climate change for almost 400,000 years, says Ruddiman, until humans intervened. Based on orbital variations, carbon dioxide concentrations should have peaked about 11,000 years ago and then steadily declined. Carbon dioxide did peak 11,000 years ago, but its decline reversed about 8000 years ago, and concentrations rose steadily until 2000 years ago. Eight thousand years ago, it turns out, is when humans started clearing forests in earnest, according to Ruddiman's reading of the literature, making way for agriculture by converting wood to carbon dioxide through burning or simple rot. By a crude calculation, Ruddiman finds that humans would have cleared enough forest to account for the observed rise in carbon dioxide.

    Likewise, methane peaked in time with orbital variations 10,000 years ago and then declined, but it rose again starting about 5000 years ago. That was when wet-rice cultivation took off, creating what are basically human-made wetlands. And again, the methane production Ruddiman calculates would have sufficed to produce the observed rise in methane up to 1700. “We know what methane and carbon dioxide did for almost 400,000 years,” Ruddiman said, “but in the past 10,000 years they went the opposite way. Nature is not in control.” Even the beginning of the next North American ice sheet called for by orbital variations seems to have been forestalled by the rising greenhouse gases.

    Ruddiman's provocative analyses of early human intervention in the climate system got people's attention. “Maybe he's right,” says Crowley. “I didn't hear anyone walking out saying it couldn't be, but such things aren't accepted overnight.” Several members of the audience cautioned Ruddiman that the interaction of humans and climate may have been more complicated. Climate modeler Alan Robock of Rutgers University in New Brunswick, New Jersey, pointed out, for example, that early forest clearing by burning would have produced hazes that cool climate, not warm it. And Crowley says that “people are going to have to do some more model calculations” to better gauge humans' actual effect on greenhouse gases before they blame even more climate change on us.


    An Ill-Mannered San Andreas?

    1. Richard A. Kerr


    Forecasting the next big one in southern California would be a piece of cake, if only the San Andreas fault would behave itself. Early on, seismologists thought at least one small segment of the San Andreas running through tiny Parkfield in central California was reasonably well-behaved, with a moderate quake rupturing the same 25-kilometer section of fault every 22 years or so (Science, 28 January 2000, p. 577). But the next one in the series is 10 years overdue. Now paleoseismologists dissecting the geologic record of the San Andreas near Wrightwood, California, 100 kilometers northeast of Los Angeles, report more irregularity. A 27-quake history, they say, reveals a disturbing amount of erratic behavior.

    “It looks like whatever triggers big earthquakes is variable,” says paleoseismologist Katherine Scharer of the University of Oregon in Eugene. That's bad news for quake forecasters, but the Wrightwood record may still turn out to be misleading, some researchers warn.

    Compiling a long history of the San Andreas requires paleoseismologists to find the right spot on the fault. At the Wrightwood site in the San Gabriel Mountains, the fault cuts through a small valley, and storm waters carry gravelly debris down into the boggy valley floor. Researchers have dug more than 40 trenches across the fault in the past decade, exposing spots where fault ruptures have disrupted fresh storm deposits as well as the organic-rich bog layers that can be carbon-14 dated.

    All told, researchers have identified and dated 27 earthquakes at Wrightwood in two time intervals, 500 C.E. to the present and 3000 to 1200 B.C.E. The fault there has slipped 3.2 centimeters per year on average, Scharer and her colleagues reported at the meeting. That's just how fast modern geodetic measurements say the San Andreas as a whole has slipped, suggesting that the Wrightwood paleoseismic record is not wildly off base.

    But the earthquakes that rupture the Wrightwood site vary greatly. In the younger time interval, the average time between quakes is about 105 years, but the interval between quakes ranges from 10 years to 224 years. Fault slip per quake ranges from 1 meter to 7 meters. Scharer sees little prospect of predicting the time to the next earthquake at Wrightwood. However, the record is not devoid of pattern. Quakes tend to come more frequently and perhaps be larger for a time, she says, which reduces the strain on the fault. Then quakes tend to be less frequent and smaller, allowing strain to build back up. At the moment, strain is relatively high, according to Scharer's reading of the Wrightwood record, a condition that has typically been followed within a few decades by a very large quake or a flurry of average to large ones.

    If most faults behave as erratically as the Wrightwood site seems to, seismologists wouldn't be able to place much faith in their long-term earthquake forecasts (Science, 18 June 1993, p. 1724). But forecasting shouldn't be abandoned just yet, says seismologist William Ellsworth of the U.S. Geological Survey in Menlo Park, California. “I would urge caution in interpreting what is an extremely important record” at Wrightwood, he says. For one, Wrightwood may be atypical. The 450-kilometer southern San Andreas is thought to be composed of six segments. If, as some have suggested, the southernmost segments break together in large quakes, the northernmost ones break together in their own large quakes, and all break at once in great quakes, then the central segment bearing the Wrightwood site might get caught up in quakes largely driven from the south or the north, making a hodgepodge of the Wrightwood record. Resolution will come with more long records, says Ellsworth, and a fundamental understanding of how faults work.


    Vicissitudes of Ancient Climate

    1. Richard A. Kerr


    Earth has often swung between chills and fever. Paleoclimatologists generally seek an explanation in swings in the abundance of greenhouse gases such as carbon dioxide (see p. 306), because CO2 levels have seemed to rise as the world warmed and to fall as it cooled into the great ice ages. But conventional thinking invites challenges, and last year it took a hit when a pair of researchers published an analysis indicating that past CO2 levels are not closely correlated with long-term climate variations. Now comes the response. At the meeting, paleoclimatologist Dana Royer of Pennsylvania State University, University Park, and geochemical modeler Robert Berner of Yale University reported that an updated record of CO2 variations during the past 500 million years does indeed produce a good fit between CO2 levels and both model predictions and one record of major climate swings. “It's a restatement of the importance of CO2,” says Royer. Many, but not all, researchers find it persuasive.

    Carbon dioxide was in need of a boost after geochemists Nir Shaviv of the Hebrew University of Jerusalem and Ján Veizer of the University of Ottawa, Canada, published a paper in last July's GSA Today in which they found a poor correlation between CO2 and Veizer's temperature record derived from the oxygen isotope composition of carbonates deposited on the ocean floor. But his isotopic climate record did fit well with the expected variations in the flux of cosmic rays during the past 500 million years. Cosmic rays, they suggested, might have modulated climate by affecting cloud brightness.

    Royer and Berner weren't convinced. First they updated the record of atmospheric CO2 levels. This 450-million-year record is based on measurements of atmospheric CO2 preserved in the geologic record, including the carbon isotopic composition of fossil soil carbonates and the abundance of gas-exchange pores on fossil leaves. The merged record of four such measures shows a double-hump curve of CO2 concentrations. High values more than 400 million years ago fall through 2000 parts per million to a few hundred ppm by about 300 million years ago, peak again about 200 million years ago, and fall once more toward the present's several hundred ppm. “I was surprised by how consistent a pattern has emerged,” says Royer.

    CO2 gauge.

    The sparser pores (roundish features) on a fossil leaf (top) show that CO2 levels were higher 65 million years ago.


    Not only are different CO2 measurements consistent with each other, says Royer, but the composite curve bears a strong resemblance to what many researchers expected. Computer models that simulate the processes controlling the abundance of CO2, such as rock weathering and the burial of organic matter on the sea floor, produce much the same double-hump pattern as the proxies do. And the great ages of glacial ice—the past 30 million years or so and the 60 million years around 300 million years ago—fell in the deep dips in CO2, whereas only a few, brief glacial intervals came during periods of higher CO2 levels, Royer and Berner noted.

    Royer and Berner also adjusted Veizer's isotopic curve for the effects of changing seawater pH, a factor only recently recognized as important. That brought some periods more in line with other temperature indicators, says Royer, and much reduced the prominence of coolings that Shaviv and Veizer attribute to the cosmic ray effect.

    Veizer and Shaviv, in turn, are not convinced that the two-hump pattern of CO2 and climate is better than their plot of cosmic rays and climate, which has four peaks rather than two. They find the pH correction “an interesting modification,” but they believe Royer overdoes it, making the oceans at times unrealistically acidic. A more reasonable correction, they say, leaves the four-peak climate pattern intact. And that pattern of isotopic temperature is reasonably consistent with other climate indicators and the inferred flux of cosmic rays, they say.

    Many researchers are sticking with conventional thinking. “You can't say CO2 explains everything,” says paleoclimatologist Thomas Crowley of Duke University in Durham, North Carolina, but “it does explain a heck of a lot,” at least in the broad-brush picture of climate. No doubt, more details need to be painted in before everyone sees the same picture.
