News this Week

Science  30 May 2003:
Vol. 300, Issue 5624, pp. 1350

    Industry Groups Petition for Data on Salt and Hypertension

    1. Jocelyn Kaiser

    Industry groups have invoked a new federal law to try to force university researchers to release data generated by an influential study of sodium intake and hypertension. The groups argue that the data are needed to verify the conclusions of two widely cited papers and to justify pronouncements by the National Heart, Lung, and Blood Institute (NHLBI)—most recently, this month—that Americans should eat less salt.

    The petition could be a key test of how far the so-called Data Quality Act could reach into academic researchers' data files. It is also raising questions about when data from large multicenter clinical trials should be released. The researchers say they are still publishing their results, and they are reluctant to turn over their data to a scientist who wants to conduct an analysis “of questionable scientific value,” says hypertension specialist Lawrence Appel of Johns Hopkins University.

    The study at the center of this fight, called DASH-Sodium, found that when 412 participants lowered the amount of sodium in their diet, overall their blood pressure went down, according to results first presented at a meeting 3 years ago (Science, 26 May 2000, p. 1319). The study, which was led by Harvard's Frank Sacks and involved six institutions, resulted in papers in January 2001 in The New England Journal of Medicine (NEJM) and in December 2001 in the Annals of Internal Medicine. NHLBI has cited the results as evidence that the public should consume fewer than 6 grams of salt a day.

    But David McCarron, a hypertension specialist formerly with Oregon Health & Science University in Portland, who is now a visiting professor at the University of California, Davis, wasn't convinced. A consultant to the Salt Institute, McCarron contends that certain subgroups within the 412 subjects—such as white men under 45 without hypertension—may not benefit from eating less salt if they are already consuming a healthy diet. If so, the advice that “all Americans” cut back on salt is incorrect, he says. To know for sure, he argues, the authors need to publish details for more subgroups. “Those simple sets of data have never appeared,” McCarron says.

    McCarron says he unsuccessfully appealed to the leaders of the DASH-Sodium study to release these data in a 2001 letter to NEJM, and he later asked the journal itself to intervene. His request “went over a line as to what an editor should do,” says NEJM executive editor Gregory Curfman. McCarron vented his frustrations in an editorial in the January 2003 American Journal of Hypertension, writing that “critical data from a federally sponsored trial have been withheld.”

    Salty debate.

    The Salt Institute is testing a new law on data quality.


    Last month, McCarron upped the ante. The Salt Institute, along with the U.S. Chamber of Commerce, invoked the Data Quality Act. According to the White House Office of Management and Budget's interpretation of the act, which took effect last October, agencies that promulgate “influential” results may have to provide enough data and methods for a “qualified member of the public” to conduct a reanalysis. In a 15 May petition to NHLBI, the two industry groups argue that the institute's statements on salt intake fall under this definition. And because those statements rely on the DASH-Sodium findings, they argue, the subgroup data should be made publicly available.

    NHLBI has 60 days to respond. Spokesperson Susan Sagusti notes that NHLBI had already been planning to make available a complete data set for the DASH-Sodium study, with patient identifying information removed, in January 2004, following an existing “internal guideline” that all data should be released 3 years after the main findings are published. [The National Institutes of Health issued separate guidelines earlier this year requiring researchers to share final data sets as soon as their main findings are accepted for publication, but those apply only to new grants, Sagusti notes (Science, 14 March, p. 1643)].

    The DASH-Sodium researchers, meanwhile, point out that they are continuing to publish. William Vollmer of the Kaiser Permanente Center for Health Research in Portland, the data center for the DASH-Sodium study, says his group has a dozen more papers in the works, including one that may answer some of McCarron's questions. Vollmer argues that it is reasonable to give researchers running large, multicenter trials, who have invested much time and money in gathering data, 3 years to publish the results, “given all of the planned but secondary papers that get written in these studies.”

    But preserving publication rights is not the main reason why the DASH-Sodium researchers have not given McCarron the data, Appel says. He says his team, which has submitted a response to McCarron's American Journal of Hypertension editorial, is worried that McCarron will “dredge the data”: perform statistical analyses on groups that are too small to be meaningful. If the approach isn't “appropriate,” the request for data “isn't valid,” he says.
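
    The arithmetic behind that worry is simple. The sketch below is illustrative only, using conventional assumptions (independent tests, each at the 5% significance level) that do not come from the DASH-Sodium study itself.

```python
# Illustrative only: the multiple-comparisons problem behind worries
# about "data dredging." If each subgroup comparison is tested at the
# conventional 5% level, the chance of at least one false-positive
# "effect" grows quickly with the number of subgroups examined
# (assuming independent tests).
alpha = 0.05
for k in (1, 5, 10, 20):
    p_any_false_positive = 1 - (1 - alpha) ** k
    print(f"{k:2d} subgroup tests -> "
          f"{p_any_false_positive:.0%} chance of a spurious finding")
```

With 20 unplanned subgroup tests, the chance of at least one spurious “significant” result is roughly 64%, which is why Appel's team wants the analysis plan judged before the data change hands.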

    Supporting Appel's concerns, some observers have warned that the industry-backed Data Quality Act could be used to harass researchers working on controversial environmental and health topics. On the other hand, some hypertension experts agree with McCarron that more analysis could be done: “I think it's a fair question on the part of Dr. McCarron,” says Curtis Morris of the University of California, San Francisco.


    Clues to the Animal Origins of SARS

    1. Martin Enserink*
    1. With reporting by Dennis Normile and Gretchen Vogel.

    A food market in southern China has yielded the first solid clues to the origins of severe acute respiratory syndrome (SARS). On 23 May, Chinese researchers announced that they had found the virus in civet cats, a species eaten as a delicacy in China. Just as the results were announced, a sobering warning about SARS's staying power arrived from Toronto, where a new cluster of 11 probable cases surfaced, 2 weeks after the Canadian city had been declared SARS-free.

    SARS has so far sickened more than 8200 people worldwide, according to the World Health Organization (WHO), and killed more than 700. The virus's unique genome leads many researchers to believe that it had been lurking in some animal species before it jumped into humans. Finding that host would not likely help stem the current epidemic, which is fueled by human-to-human transmission, but it might help prevent future outbreaks.

    Based on reports that a disproportionate number of the first patients were working in southern China's food industry, University of Hong Kong microbiologist Yuen Kwok-Yung and colleagues at the Center for Disease Control and Prevention in Shenzhen focused on exotic animals for sale in a market in Guangdong. They bought a total of 25 animals spanning eight different species; in the lab, they were able to culture a coronavirus from all six masked palm civets they sampled. They also found evidence of the virus in single samples from two other species.

    Yuen calls the virus's predominance in civet cats “extremely important,” because it suggests that the animals—which are only distantly related to house cats—may have been the virus's springboard to humans. Genetic analysis showed that the virus is almost identical to that found in human SARS patients, but its sequence had 29 extra nucleotides.

    Risky business?

    The SARS coronavirus was found in civet cats for sale at a food market.


    Other researchers remain cautious. Even if the civets carry the virus, they are not necessarily the natural reservoir, says WHO virologist Klaus Stöhr. The fact that several species appear to have been infected could mean that they all got the virus from a common food source or from yet another species. “We have to sample a huge number of animals before we can be 100% certain that the civet is the primary origin,” Yuen says.

    Even so, the 29 nucleotides present in the civet virus but missing from human isolates are intriguing, says coronavirologist Peter Rottier of Utrecht University in the Netherlands; the loss of these few nucleotides may have made the virus more adept at preying on humans, he says. Henry Niman of Harvard University notes that of the almost 20 human coronavirus genomes posted online so far, one still has the 29 nucleotides. The patient from whom that virus was derived, hailing from Guangdong, may have been an early case who got infected before the virus lost the short RNA stretch, Niman says.

    Just how difficult SARS eradication will be was illustrated by the disheartening rash of suspected new cases in Toronto. At first, health officials feared that the new wave meant that the virus might be capable of hiding for a long time in asymptomatic carriers or the environment—a nightmare scenario. But on Monday, Ontario Commissioner of Public Health Colin D'Cunha said all the new cases were linked to one 96-year-old patient who was infected by other known cases in mid-April but diagnosed only later. “The take-home message,” said D'Cunha, “is that we can never drop our guard.”


    Whitman Leaves Science Legacy at EPA

    1. Erik Stokstad

    When Christine Todd Whitman resigned as administrator of the Environmental Protection Agency (EPA) last week, she faced yet another barrage of criticism. Environmental groups said that under her tenure, the agency had caved in to industry interests and weakened environmental laws and regulations. Whitman tends to receive higher marks for EPA science, however. “There's momentum to improve its role in the agency,” says William Glaze of Oregon Health & Science University in Portland, chair of EPA's Science Advisory Board. “I believe that's the result of the tone of the Whitman administration.”

    A moderate Republican, Whitman came to the agency with a track record of improving New Jersey's environment while governor. One of the first firestorms erupted when Whitman, citing high cleanup costs, decided to ditch a Clinton-era plan to lower the allowable level of arsenic in drinking water—a standard she ultimately accepted on advice from the National Research Council (NRC). Most recently, Whitman took heat for the agency's calculations that seniors' lives are worth less than those of younger folks in cost-benefit analyses of pollution regulations (Science, 21 March, p. 1836).

    Bright spot.

    Whitman won praise for boosting EPA science, despite criticism from environmentalists.


    But during her 2.5 years at EPA, Whitman also began to strengthen research and boost its role in decision-making, say several observers. A key move was her appointment of Paul Gilman—a Celera executive and Washington hand—to head EPA's Office of Research and Development (ORD) and to serve as her science adviser, they say (Science, 10 May 2002, p. 1005). Although NRC had recommended a more highly placed science czar, “Gilman has provided a level of both scientific and political sophistication that has been very good for the ORD,” says David Blockstein of the National Council for Science and the Environment.

    Gilman points to a doubling of the number of EPA scientists involved in the regulatory process, to about 300, during Whitman's tenure. And the agency has bolstered its emphasis on peer review; Gilman says about 91% of research and other reports that are reviewed now go out for external comment, up substantially from the 1990s. The agency is also drafting guidelines to improve its computer modeling and has begun a review of its methods for collecting data.

    Although these efforts are winning praise, Blockstein and others say ORD has been hamstrung by chronically flat budgets, especially in comparison with other science agencies. And some are irritated that EPA requested 50% less than usual for an extramural graduate fellowship program, STAR, that NRC praised in a recent report. “To cut it makes absolutely no sense whatsoever,” says Robert Huggett, vice president for research and graduate studies at Michigan State University in East Lansing and an ORD chief during the Clinton Administration. It's not clear who might replace Whitman, but Glaze and others expect that EPA's science will continue its upswing. Bolstering their hope, Gilman says he doesn't have any plans to leave.


    Secrecy on Big Projects Breeds Earmarks, Panel Is Told

    1. Jeffrey Mervis

    Senate aide Cheh Kim winces whenever a certain lobbyist visits his Capitol Hill office to talk about earmarking money for a new research facility in the budget of the National Science Foundation (NSF). Kim believes that peer review, not pork-barrel politics—the inclusion of money for projects not requested by an agency—should be the guiding principle in carving up NSF's budget. But he says the foundation's process of deciding which large new projects to fund in a particular year is so shrouded in mystery that institutions have been forced to hire lobbyists to plead their case.

    “Right now it doesn't seem like a fair process,” Kim said last week at the first meeting of a National Academies committee asked by Congress to recommend how NSF can make the reasons behind its funding decisions more transparent to the scientific community and less vulnerable to political influences. “We want to take the politics out of this account, because we don't have the expertise to make the right choices. But we're going down a dangerous road. We've seen [an increase in earmarks] happen with NASA and the Environmental Protection Agency, and it's going to happen to NSF,” Kim predicted, unless legislators can understand how NSF selects projects for its Major Research Equipment and Facility Construction (MREFC) account.

    NSF spends about one-fifth of its $5.4 billion budget on scientific “tools”—everything from telescopes and research vessels to supercomputing networks and seismic arrays. But its bread and butter is grants to individual investigators, and in 1993 it created the MREFC account to keep the cost of new facilities from eating into those grants. The approach has worked admirably in building unique facilities such as the Laser Interferometer Gravitational Wave Observatory (LIGO) and a new research station at the South Pole. But demand for such expensive facilities has grown even faster. The result is a backlog of projects that have passed peer review but have not yet been selected for funding—and an increasingly fierce competition among research teams to move their projects to the front of the queue (Science, 27 July 2001, p. 586).

    Down a long road.

    LIGO was the first project funded by NSF's controversial large-facilities account.


    The foundation has a comprehensive, multilayer process to ensure that all proposed large construction projects receive a thorough review, NSF Director Rita Colwell told the 15-member academy committee, chaired by physicist William Brinkman of Princeton University. But winning NSF's approval isn't enough, she explained. White House budget officials must also sign off on a project before it gets into the president's budget request to Congress. And that can involve some horse-trading, Colwell said, based on the cost of the project, its timeliness, the discipline to be served, international commitments, and other factors. “They are not assigned relative weights,” she explained. “And the process cannot be quantified. We are not manufacturing shoes.”

    Even so, Congress last year ordered NSF and its oversight body, the National Science Board, to rank every approved project, accompanied by an explanation, and to revise the list each time one is added. Legislators hope the exercise will give them scientifically valid reasons for NSF's decisions—and reduce the stream of visitors who are unhappy with the president's budget request. “We definitely want the community to feel that the process is fair,” said David Goldston, staff director for the House Committee on Science, which folded the requirement into authorizing legislation passed last fall.

    Although everyone in the room appeared to agree that such an approach seems reasonable, few think that it will be easy to do. Some disciplines, notably astronomy, have done such rankings for decades, but few scientific bodies have been bold enough to set priorities across all fields of science. “I think that people are willing to make these choices, but they are often based on intuition,” said Brinkman. “It's subjective,” he told the congressional staffers, “and I hope you understand that.”

    “Yes, we do,” Goldston replied. “That's why we want somebody else to do it.”

    Brinkman promised that the committee would meet its deadline to deliver a report by the end of the year. In the meantime, the betting is that the final version of NSF's 2004 budget, now pending before Congress, will contain some of the very earmarks that Kim and others want to root out.


    Report Asks Colleges to Plug A Leaky People Pipeline

    1. Jeffrey Mervis

    The federal government needs to take action on several fronts to guarantee an adequate supply of U.S. scientific workers, according to a new report by the National Science Board. The report calls for a variety of measures, ranging from better salaries for public school science and math teachers to increased funding for basic research. But the quickest payoff, it says, could come from efforts by universities to bolster retention rates among undergraduates who declare an interest in earning science and engineering (S&E) degrees.

    “It will require a culture change within departments,” says biologist George Langford of Dartmouth College in Hanover, New Hampshire, chair of the board's education panel. “But if we succeed in improving the climate for undergraduate and graduate students, we can have a dramatic impact [on the number of students trained for scientific careers] by 2010.” Making the most of homegrown talent is even more important in today's global economy, adds National Science Foundation (NSF) Director Rita Colwell, who says “we've become overly dependent on the global workforce.”

    The board, a presidentially appointed oversight body for NSF, has spent nearly 3 years on the report, which remains untitled and in draft form. It avoids such controversial terms as “shortage” and “shortfall” (Science, 16 May, p. 1070), opting instead for the more nuanced concept of “underproduction” in warning of “a likely decline” in the number of “native-born science and engineering graduates.”

    The report laments “the movement of undergraduate students out of S&E fields and into other majors.” It also says that women and minorities are “underused resources” and notes that the country “may not be able to rely on the international labor market” to meet its needs.

    A matter of degrees.

    Women and most minority groups are less likely to earn natural science and engineering degrees than the population as a whole.

    SOURCE: NSF, 2002

    The board's message dovetails with recent reports by other science policy bodies, including the National Academies' Government-University-Industry Research Roundtable and the nonprofit Building Engineering and Science Talent. The President's Council of Advisors on Science and Technology (PCAST) is expected to add its voice to the chorus: It has just embarked on a similar study due out next year.

    The report calls for increased spending and attention at every point in the pipeline, with an emphasis on programs aimed at broadening participation among underrepresented groups and reducing attrition. Langford says the board refrained from attaching any price tags until “all the stakeholders”—other federal agencies, the university community, state and local education officials, and industry—have weighed in. “We've talked with [presidential science adviser] Jack Marburger about the need for a sizable federal investment, and he didn't seem concerned. It will definitely cost a lot of money.”

    Ralph Gomory, president of the Alfred P. Sloan Foundation, applauds the board's focus on the undergraduate years. “It's more amenable to fixing than K-12 education,” he notes. Marye Anne Fox, chancellor of North Carolina State University in Raleigh and a PCAST member, agrees that “the freshman year is critical” for keeping promising students on the scientific track. She says that successful techniques include a shift from lectures to hands-on activities, more research opportunities, smaller class sizes, and better mentoring and career counseling.

    In calling for more S&E workers, Langford acknowledges the rising unemployment rates in most high-tech fields. But he labels them a “temporary condition.” What's more important, he says, is having enough talent on hand to take full advantage of science's role as “the engine for U.S. economic growth and national security.”


    First Cloned Mule Races to Finish Line

    1. Constance Holden

    The first equine has joined bovines, ovines, felines, rabbits, rodents, and porkers in the ranks of the cloned. On 5 May a mule named Idaho Gem was born after a normal 346-day gestation in the womb of a mare, researchers report online in Science this week. That makes him not only the first member of the horse family but also the first sterile animal to be cloned. Mules, sired by donkeys and borne by horses, are incapable of reproduction. But a team at the University of Idaho in Moscow headed by Gordon Woods has now shown that a mule cell nucleus, despite its odd chromosome number, can cut the mustard.

    Idaho Gem is a sibling of a world-champion racing mule named Taz. The scientists didn't want to clone from an adult animal because they “wanted to take the aging component out of the equation,” Woods says. Some researchers suspect that the first clone, Dolly the sheep, aged prematurely because her DNA was derived from an adult cell. So the team rebred Taz's parents, took a somatic cell from the 45-day-old fetus, and fused it with an enucleated horse oocyte that they then inserted into a mare.

    Equines have proved difficult to clone; horse oocytes don't mature well in a dish and it's hard to get embryonic cells to divide. Woods's group found that calcium levels inside equine red blood cells are low compared with those from cows, leading the researchers to suspect that low calcium levels could be inhibiting cloned equine embryos' growth. They jacked up the calcium in the cultures and got some embryos to thrive.

    Up and running.

    Idaho Gem is a brother of a racing champ.


    The work was financed by donations from Taz's owner, Donald Jacklin, as well as tax money from the racing industry that is earmarked for horse research. Two other mares are expected to deliver twins of Idaho Gem in June and August.

    And more equine clones are in the works. Katrin Hinrichs of Texas A&M University in College Station says, “We're hoping to have the first horse.” Her group has a mare almost halfway through the gestation period with a clone. But the Italians may win this horserace. Cesare Galli of the Laboratory of Reproductive Technology in Cremona says his group has a cloned foal due in late May.

    But don't expect to see a clone of Funny Cide, the gelding that is bidding to become a Triple Crown winner. The Thoroughbreds' Jockey Club doesn't even allow artificial insemination, much less cloning, and the American Quarter Horse Association has turned its thumbs down on registering clones. Woods thinks there's a place for them, though, in competitions not involving registered breeds. That would include the Olympics, where most of the horses are geldings.

    The birth of Idaho Gem also bodes well for preserving endangered species. It's “awesome … I'm delighted,” says Oliver Ryder of the San Diego Zoo. Now, he says, cloning may become an option for conservation efforts among the perissodactyls (creatures with odd numbers of toes) such as the endangered Przewalski's horses and Somali wild asses.


    Industrial Renaissance or a New Dark Age?

    1. John Bohannon,
    2. Alexander Hellemans*
    1. John Bohannon reported from Lyon, France, and Alexander Hellemans from Naples.

    NAPLES—Italy's scientific community is in an uproar over long-dreaded reforms just unveiled by the government. The country's top science official has resigned and another key figure has stepped down, while rank-and-file researchers are bemoaning a shakeup that could mark a dramatic turn toward applied projects. “Basic research will die,” contends physicist Carlo Bernardini of the National Institute for the Physics of Matter (INFM) in Genoa.

    The measures announced on 16 May are aimed principally at the National Research Council (CNR), Italy's largest scientific organization, which has some 4300 researchers on its payroll. Several centers will be merged, and from now on, institute directors and science chiefs will be appointed by the government. Scientists say they were never consulted and fear that the reforms will allow politicians to dictate the research agenda. CNR president Lucio Bianco quit on 13 May, and Flavio Toigo, president of the powerful INFM, followed him out the door, resigning in protest on 17 May.

    Scientists had been keeping a tense vigil since last August, when the plans were leaked (Science, 16 August 2002, p. 1106). The controversy intensified late last year when the government slashed funding for some institutes by as much as 30%.

    New direction.

    At CNR, president Lucio Bianco (left) is out and applied research is in.


    When the reform legislation finally emerged, it hewed closely to the leaked proposal. Several measures are designed to streamline Italy's scientific infrastructure. For instance, the independent INFM and four other institutes will become part of CNR. Government officials rather than scientists will now choose CNR institute directors, and a government-appointed administrator would take CNR's helm in case of “financial difficulties” or to “redirect” its mission. All that was too much for Toigo. “Science policy should be determined by scientists, not politicians,” he fumes. Bianco, who is said to have opposed the reforms, could not be reached for comment.

    Government officials say the reforms are needed to trim a bloated bureaucracy. And some in the community applaud the applied emphasis. Industry must play a greater role in funding research, says Fabio Pistella, president of the National Institute of Applied Optics in Florence, a center that will now report to CNR. Bianco's temporary replacement at CNR is widely tipped to be electronics engineer Adriano De Maio, rector of Luiss Guido Carli University in Rome.

    As Science went to press, more than 9000 people had signed a petition denouncing the reforms. Outsiders, meanwhile, are bemused. “The government and the scientists only see caricatures of each other,” says a member of an international team that audited CNR. Legislation implementing the reforms, already approved by Parliament, is now awaiting President Carlo Azeglio Ciampi's signature.


    Experts Say Big Cats Don't Leave Useful Tracks

    1. Pallava Bagla

    NEW DELHI—India may be the last stronghold for the endangered Bengal tiger. But the way the government keeps tabs on the majestic animal is so flawed as to be nearly worthless for conservation purposes, says a group of scientists.

    “Three decades of tiger monitoring has basically failed in India,” declare the authors of a report in the current issue of Animal Conservation, published by the Zoological Society of London. The study, by nine U.S. and Indian scientists, goes public with long-running concerns among conservationists about India's use of pugmarks—tiger footprints—to count the big cats in the wild.

    Pugmarks were thought to be unique, allowing trained eyes to track specific animals. But the authors say that even experts flunked a recent controlled test in which they were asked to distinguish the pugmarks of individual tigers. To better measure tiger trends, they recommend that India adopt statistically sound sampling methods such as transects and modern camera traps set in prime tiger habitat. But Indian officials defend their use of pugmarks, which are preserved in plaster of Paris casts or through tracings, and say they are taking steps to make the technique more accurate.
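
    The statistical methods the authors favor rest on a simple idea. Because each tiger's stripe pattern is unique, camera-trap photographs can identify individuals, and a standard mark-recapture calculation then turns two survey rounds into a population estimate. The sketch below uses the classic Lincoln-Petersen estimator with hypothetical counts; none of the numbers come from the report.

```python
# A minimal sketch of the mark-recapture logic behind camera-trap
# censuses. All counts here are hypothetical, not from the report.
# Tigers photographed in a first survey are "marked" by their unique
# stripe patterns; the fraction re-photographed in a second survey
# shows how completely the area was sampled.
def lincoln_petersen(first: int, second: int, recaptured: int) -> float:
    """Chapman's bias-corrected form of the Lincoln-Petersen estimator."""
    return (first + 1) * (second + 1) / (recaptured + 1) - 1

# Example: 30 tigers photographed in survey 1, 25 in survey 2,
# 10 of them already known from survey 1.
print(f"Estimated population: {lincoln_petersen(30, 25, 10):.0f} tigers")
```

Unlike a pugmark tally, the estimate comes with a defined sampling frame, so its uncertainty can be quantified rather than guessed at.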

    India is believed to be home to the largest number of royal Bengal tigers, which a century ago numbered about 100,000. Its latest estimate of 3642—out of a worldwide tiger population of roughly 7500—is a drop of nearly 200 from 2000. That decline has called into question the effectiveness of the government's $7 million Project Tiger, which carries out conservation activities in 27 designated reserves and elsewhere throughout the country.

    Out of step.

    New report says counting tiger footprints doesn't produce a reliable census.


    The new report says that pugmarks fall short as a counting tool because they are drawn from an “unknown fraction” of the 300,000 square kilometers of tiger habitat in India and are difficult to locate in some terrain, including hard or rocky soil as well as mangrove swamps. “The discrimination ability of the pugmark approach completely breaks down when data from different substrates is pooled in,” notes co-author K. Ullas Karanth, a carnivore ecologist at the Wildlife Conservation Society in New York City.

    Rajesh Gopal, director of Project Tiger, defends the use of pugmarks as “in tune with the local conditions” and says that the technique will be refined as part of a $1.1 million project now under way to map all tiger habitats. But Melvin Sunquist, an expert on tiger ecology at the University of Florida, Gainesville, says “there is too much room for identification error in the pugmark approach because of variation associated with substrate, travel rates, and stride length.” For the method to work reliably, he says, park managers would need to be able to recognize each individual tiger in their area, an improbable standard for a creature that survives by its remarkable camouflage.


    The Milky Way's Dark, Starving Pit

    1. Robert Irion

    New observations all but prove that our galaxy harbors a huge black hole. Now astronomers are trying to understand its measly diet—and its youthful companions

    When it comes to speed, astronomers have seen it all: supernovas rifling pulsars into space at more than 1000 kilometers per second, gamma ray bursts spewing matter at nearly the speed of light. But recently discovered cosmic speed demons at the center of our Milky Way galaxy have left even veteran high-energy astrophysicists gasping in delight.

    Probing deeper than ever before into the galaxy's heart, rival teams of German and U.S. astronomers have detected giant stars hurtling around an unseen mass in tight orbits at as much as 9000 kilometers per second—3% of the speed of light. The breakneck motions of these stars provide convincing evidence that our galaxy hosts a black hole nearly 4 million times more massive than our sun, the best mass estimate yet derived.

    This dramatic unveiling has raised new mysteries. For example, no one can explain how the stars—which are 15 times heftier than our sun—got there. According to most astronomical models, they are too big to have formed in the chaos of the galactic center but appear to be too young to have moved there from farther out. Even more enigmatic is the ancient black hole they orbit. It is far less active than the holes that fuel energetic geysers erupting from some other galaxies. Indeed, radio and x-ray data suggest that it is a surprisingly picky eater, consuming only perhaps 1/100,000th of the available gas. Its meals are fitful; nearly every day, for reasons astronomers don't yet understand, faint hourlong belches erupt from the hot swirl of matter that must encircle the hole.

    Solving these conundrums will offer fresh insights into “the grand design of doddering old black holes,” says astronomer Frederick Baganoff of the Massachusetts Institute of Technology (MIT) in Cambridge. “It's the closest look we've ever had at a supermassive black hole, and it will always be the one we can observe in the greatest detail.”

    Darting like comets

    Billions of years ago, the Milky Way's black hole may have powered a bright beacon of energy, like a quasar in the distant universe. However, it has long since settled into a quiet adulthood. “It seems like a perfectly ordinary, garden-variety, galactic center black hole,” says astronomer Mark Morris of the University of California, Los Angeles (UCLA). “It appears that almost every galaxy has these.”

    For something so commonplace, the Milky Way's black hole has proven elusive. Early observations of gas whirling near the galaxy's core hinted at the pull of something massive. But no one could say for sure whether the tug came from a black hole, strong magnetic fields lacing through the charged gas, or a spread-out nest of neutron stars or other dense matter.

    Whip it.

    In the Milky Way's crowded center (top), massive stars dash around the galaxy's black hole on cometlike orbits.


    To narrow down the size of the suspected central mass, astronomers had to trace the orbits of the innermost stars. But when they aimed their telescopes at the Milky Way's central point, a radio source called Sagittarius (Sgr) A* (pronounced “sadge A-star”), shrouds of dust blocked the way. That changed thanks to a new tool: dust-piercing infrared light massaged with advanced optics.

    In the early 1990s, a group led by Reinhard Genzel and Andreas Eckart of the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, began following several stars near Sgr A* using the 3.6-meter New Technology Telescope at the European Southern Observatory (ESO) outpost on La Silla, Chile. A team directed by Andrea Ghez of UCLA soon took on the same project with one of the twin 10-meter Keck Telescopes on Mauna Kea, Hawaii. For nearly a decade the teams have pushed each other in a rivalry that one colleague describes as “sometimes more friendly than at other times.” For years both teams battled atmospheric distortion, which fuzzes the paths of photons during the last 30 microseconds of their 26,000-year journeys from the stars. By taking short, rapid-fire “speckle” images, the teams cut down the fuzz and built up the case for a compact central mass. But the data still left room for skepticism.

    In the past year, dramatic gains have erased all doubts. Genzel's group now uses one of the four 8.2-meter mirrors in ESO's Very Large Telescope array on Cerro Paranal, Chile. More critically, both teams sharpened their views using adaptive optics, in which small, flexible mirrors counteract the jittering of air overhead. Dozens more stars popped into focus, including some that zoom perilously close to the galaxy's vortex.

    When Genzel and Ghez assembled their star positions into time-lapse sequences, the videos were startling. Stars dart on their orbits like comets plunging toward the sun. One star, which Ghez announced last summer and Genzel's team discussed in the 17 October 2002 issue of Nature, swung within 17 light-hours of Sgr A* early last year—just 120 times Earth's distance from the sun. The data trace out two-thirds of the star's 15-year oval orbit. “You can take out your ruler: It's a true Keplerian ellipse,” says Genzel, referring to the laws of planetary motion devised by Johannes Kepler nearly 400 years ago. When combined with Newton's law of universal gravitation, the observed orbit reveals that the central body is roughly 3.7 million times as massive as our sun.
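
    The mass figure follows from textbook orbital mechanics. As a back-of-envelope check, Kepler's third law, M = 4π²a³/GT², converts the star's 15-year period into a central mass once the orbit's semi-major axis is known; the roughly 940-astronomical-unit value used below is an assumption drawn from published orbit estimates, not a number stated in this article.

```python
import math

# Back-of-envelope check of the central mass via Kepler's third law,
#   M = 4 * pi^2 * a^3 / (G * T^2).
# The 15-year period comes from the article; the ~940 AU semi-major
# axis is an assumed value taken from published orbit estimates.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
AU = 1.496e11          # astronomical unit, m
YEAR = 3.156e7         # year, s
M_SUN = 1.989e30       # solar mass, kg

a = 940 * AU           # semi-major axis (assumption)
T = 15 * YEAR          # orbital period (from the article)

M = 4 * math.pi**2 * a**3 / (G * T**2)
print(f"Central mass ~ {M / M_SUN:.1e} solar masses")
# -> ~3.7e6 solar masses, matching the figure quoted in the text
```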

    By tracing another star's orbit into the past, Ghez's team found that the star dove half as far from the center in 1999. At the most frantic point of its cigar-shaped orbit, that star raced at 1/30th the speed of light, the fastest star yet observed. “It must have been quite a ride,” says Ghez, who posted her group's analysis online in March.

    Both scientists say the stars' orbits are so tight that only a giant black hole makes physical sense as the object holding the stars in its sway. If a swarm of little black holes or other compact objects were packed so closely together, gravitational encounters would scatter them within 100,000 years. “The dynamical case [for a supermassive black hole] is just brilliant,” agrees astronomer Jonathan Grindlay of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts. “These are beautiful observations.”

    Fickle feeder

    With the hole story in place, some astronomers are shifting their focus to the physics of the galactic center. “I'm willing to believe this is a black hole; it's the best case we have,” says Sandra Faber of UC Santa Cruz. “I'm much more interested in how the black hole is influencing its environment.”

    It's clear that the Milky Way's central engine is not a scaled-down version of the gas-guzzling billion-solar-mass behemoths at the cores of gigantic galaxies, which spew jets and blaze at all energies. Astronomers think that drifting molecular clouds and the winds of massive stars provide space near the black hole with a modest but steady supply of dust and gas. If the black hole devoured this matter and radiated in the expected way, it would appear more than 100,000 times brighter than it is. “It's hard to understand how gas can flow into a black hole without producing a lot of radiation,” says astrophysicist Eliot Quataert of UC Berkeley. “An attractive idea is that only a tiny fraction of the available gas actually goes in, but most is driven away.”


    An x-ray image, 100 light-years across, reveals two lobes of hot gas (2 o'clock and 7 o'clock positions) from past outbursts of the Milky Way's central black hole.


    Studies of radio waves support Quataert's view. Astronomer Geoffrey Bower of UC Berkeley and colleagues used the Berkeley-Illinois-Maryland Association array of telescopes in northern California to study the degree to which radio emission from Sgr A* is polarized, or aligned along a certain angle. A heavy flow of infalling gas and dust would scatter waves randomly, whereas a trickle of matter would let some waves stay aligned.

    As the team reports in an upcoming issue of Astrophysical Journal, about 7% of the radio signals are polarized. The result—which bolsters an earlier detection from a team led by astronomer David Aitken of the University of Hertfordshire, U.K.—suggests that gas equivalent to about 1% of Earth's mass cascades onto the black hole each year, hardly an impressive bounty. “The black hole seems to be rather isolated and starved,” says Bower's co-author, Heino Falcke of the Max Planck Institute for Radio Astronomy in Bonn, Germany.

    Theorists don't know why so little gas flows into the hole. Winds streaming from the stars orbiting nearby probably keep much of the gas at bay, says UCLA's Morris. Baganoff of MIT also notes that a supernova exploded near the galactic center 10,000 to 50,000 years ago. Although the exact spatial position of Sgr A* isn't known, it appears to sit inside this supernova remnant, a vast bubble that shines nebulously in x-rays and radio waves. The remnant's blast wave may persist, pushing gas and dust away from Sgr A*.

    Still, some matter does plummet past the point of no return—and it's not a smooth ride. Two orbiting x-ray observatories, NASA's Chandra and the European Space Agency's XMM-Newton, both see outbursts in which x-ray emission from Sgr A* soars by a factor of 5 to 50 for an hour or more. In May 2002, Chandra watched for 6 days and saw seven such flares.

    The events are more hiccups than cosmos-shaking blasts, Baganoff observes. “If they happened in the [neighboring] Andromeda galaxy, the flares would be too faint for us to see,” he notes. Even so, unraveling their source could explain much about the hole's lifestyle.

    Ideas are flying at conferences and in online preprints. Baganoff thinks the flares erupt because the black hole eats comet-sized chunks of gas, like daily supplements to augment its meager diet. Astrophysicist Sera Markoff of MIT thinks the convulsions are akin to solar flares on the sun, triggered by sudden linkages of magnetic field lines at the base of a jet of charged gas rooted near the black hole. And in a paper to appear in Astronomy and Astrophysics, Sergei Nayakshin of the Max Planck Institute for Astrophysics in Garching and colleagues propose that orbiting stars trigger the flares as they plow through a disk of matter surrounding the black hole. Future observations of flares by the Chandra and XMM-Newton x-ray satellites in coordination with ground-based facilities at other wavelengths should reveal the cause, Baganoff says.

    Young and restless

    Meanwhile, Grindlay says, “the bizarre question of the hour” is what the young stars are doing there at all. Clouds of gas need a calm and cold setting to collapse into a ball dense enough to ignite nuclear fusion. Yet gravitational tidal forces—from the black hole and from stars in the galaxy's nucleus—make the galactic center the antithesis of such a nursery.

    Could the stars have formed elsewhere? Ghez's team struck a blow against that hypothesis in the 1 April issue of Astrophysical Journal Letters, with the first analysis of the spectrum of light from a star close to the center. Telltale signatures revealed that the star is a heavyweight, perhaps 15 times more massive than our sun. Other stars tracked by the Ghez and Genzel groups appear to be similarly obese. Such stars burn fiercely and live only about 10 million years—too short a time, most astronomers think, for them to rush inward from a more distant cradle. “They have no damn right to be there,” Genzel says.

    As with the x-ray flares, hypotheses abound. Genzel and his team believe the stars must arise from mergers among smaller stars near the hole, as yet unseen by their telescopes. Others think that collisions couldn't produce so many giant stars. Rather, says astrophysicist Simon Portegies Zwart of the University of Amsterdam, massive clusters of stars could have condensed a couple of dozen light-years from the center and rapidly migrated inward, falling apart within a million years and scattering their stars like seeds.

    Morris maintains that big stars could have formed right where they are. “At the galactic center, you're looking at the bottom of a gravitational well,” he says. “There's a vast reservoir of gas ready to move in, as soon as it's allowed.” The trigger, says Morris, is the death of each generation of windy, massive stars near the core. When their short lifetimes expire at nearly the same time, Morris proposes, their emissions no longer keep the gas at bay; gas floods in from dozens of light-years away and piles up. This compression might overwhelm the gravitational tidal forces at the galactic center and unleash a “star formation catastrophe” in a disk around the black hole.

    In any of the scenarios, the time constraint might be less severe than it seems. A close passage near the black hole might strip away a bit of the stars' atmospheres, eliminating heavier molecules such as carbon monoxide that typically build up in the outer layers of older stars. With “fresh” gas exposed, such tidally stripped stars could look younger than their true ages. “They could be older stars masquerading as youths, a phenomenon we understand quite well in Los Angeles,” Ghez says with a smile.

    More clues should surface as the Genzel and Ghez teams race closer to the galaxy's heart. Adaptive optics methods will improve, and each group plans to combine light from multiple telescopes at its observatory to form sharper images using interferometry. The astronomers expect to see smaller stars whirling near the center, although collisions and unusual star-birth processes may conspire to tilt the balance of stars toward the heavy end.

    Ultimately, the teams hope to see beyond the closest stars to the boundary of the black hole itself. According to astronomer Fulvio Melia of the University of Arizona, Tucson, a global network of millimeter-wave radio telescopes should have enough acuity by 2010 to see the event horizon—the threshold beyond which light cannot escape—as a small shadow against brighter background gas.
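
    A rough calculation shows why that goal is within reach. Assuming the roughly 3.7-million-solar-mass figure and the 26,000-light-year distance mentioned earlier, and the general-relativistic prediction that the shadow spans about 5.2 Schwarzschild radii, the target is a few tens of microarcseconds across, a scale millimeter-wave interferometry can plausibly resolve.

```python
import math

# Rough estimate of the angular size of the hole's "shadow," assuming
# the ~3.7-million-solar-mass figure and the ~26,000-light-year
# distance mentioned earlier in the article. General relativity puts
# the shadow's diameter at roughly 5.2 Schwarzschild radii.
G = 6.674e-11                  # m^3 kg^-1 s^-2
c = 2.998e8                    # speed of light, m/s
M_SUN = 1.989e30               # kg
LY = 9.461e15                  # light-year, m

M = 3.7e6 * M_SUN
r_s = 2 * G * M / c**2         # Schwarzschild radius, ~1.1e10 m
d = 26_000 * LY                # distance to the galactic center

theta = 5.2 * r_s / d          # small-angle approximation, radians
microarcsec = math.degrees(theta) * 3600 * 1e6
print(f"Shadow diameter ~ {microarcsec:.0f} microarcseconds")
# -> a few tens of microarcseconds, just within reach of a global
#    millimeter-wave interferometer network
```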

    That will complete a remarkable transformation for the Milky Way's nucleus, says Melia, who describes its history and science in his new trade book The Black Hole at the Center of Our Galaxy. “The galactic center was the most boring part, heavily obscured by dust for centuries,” he says. “Now observations are pushing physical theories to the extreme, and they may provide the ultimate test of general relativity within a few years.” Not bad for a black hole that can't even eat a proper meal.


    North Korea's Not-So-Hidden Agenda Raises the Ante

    1. Charles Seife

    While hinting that it has plutonium-based nuclear weapons, North Korea is pursuing a uranium-separation program that could be an even more serious threat

    North Korea's off-the-cuff claim last month that it has nuclear weapons came as sobering news to world leaders. Yet its apparent confirmation of a decade-old suspicion isn't what is keeping weapons experts up at night. A more fundamental issue, they say, is the recent discovery of a new wrinkle in the country's weapons program that makes it less, rather than more, transparent: the apparent addition of a uranium bombmaking project to its presumed plutonium-based effort.

    Analyses of the opaque regime are fraught with uncertainty. For one, experts don't know whether North Korea has attempted to smuggle weapons-grade fissile materials into the country. Presuming that North Korea is starting from scratch, a uranium-enrichment program is much more attractive than a plutonium program because it is much easier to hide. Making plutonium requires a nuclear reactor, which is fairly easy for outside authorities to monitor. It's big, it's got a relatively large infrastructure with relatively heavy security, and it generates a lot of heat.

    Uranium programs, on the other hand, can use small and scattered facilities that are more easily hidden than those needed to produce and separate plutonium. Indeed, a vast uranium-enrichment program in Iran, a far more open country by comparison, came to light only after opposition groups revealed the secret program last August. Even more unnerving than nuclear weapons on the Korean peninsula, say U.S. officials, is the possibility that North Korea could sell nuclear technology to nations eager to join the nuclear weapons club.

    According to Robert Gallucci, a former U.S. ambassador and arms-control expert, North Korea is believed to have started serious work on a plutonium-based nuclear weapons program in the 1980s. It began with a small Russian reactor before shifting to a larger, 20-megawatt reactor that went online in the mid- to late '80s. This worried proliferation experts, because it was a first step in building a plutonium-based bomb.

    Smoking gun.

    Steam rising from a cooling tower at North Korea's Yongbyon reactor in March is a probable sign of weapons activity.


    To make plutonium, which doesn't exist naturally, weaponeers blitz uranium with neutrons in nuclear reactors such as the one at Yongbyon, 90 kilometers north of the North Korean capital, Pyongyang. Some of the uranium in the fuel rods turns into plutonium as the reactor runs, as would uranium inserted into the reactor core. Then the plutonium is chemically separated from the uranium, after which it can be fashioned into a weapon. “If you go the plutonium route, you need to have a reactor, which is very hard to hide,” says Matthew Bunn, a nuclear proliferation expert at Harvard University.

    What's more, a specialized facility is required for extracting, or reprocessing, the plutonium from spent fuel rods—a dead giveaway for a nuclear weapons program. North Korea's nuclear ambitions became apparent in the late 1980s and early 1990s when satellite photographs revealed a reprocessing plant for purifying plutonium. One tip-off is the presence of facilities and equipment needed to handle the acids that dissolve spent fuel rods and separate the plutonium from uranium and other radioactive products. “Analytically, you start seeing buildings whose walls are really thick for such small facilities, which you can pick up from overhead,” says Gallucci. The site starts getting stocked with “heavy glass, remote handlers, glove boxes,” and other technology that is necessary for handling highly radioactive materials. Another telltale sign is the radioactive gases released from spent fuel rods. Those gases, such as krypton-85, can be detected by radiation monitors downwind of the plant.

    News of the reprocessing plant, which confirmed a plutonium-production program, meant that North Korea was on the verge of violating agreements that prohibit manufacturing fissile materials for bombs. Behind-the-scenes diplomacy failed to resolve the crisis, which came to a head in 1993. That's when North Korea threatened to withdraw from the United Nations nonproliferation treaty, the main safeguard against nuclear proliferation, and move ahead with its bombmaking program.

    It never did, however, and tensions were eased the next year when North Korea agreed to freeze its weapons program in return for various considerations by the United States. That arrangement, called the Agreed Framework, also allowed inspectors from the International Atomic Energy Agency (IAEA) to monitor the Yongbyon facilities to make sure that no plutonium was being extracted from the plant's fuel.

    What happened to the plutonium produced in the early 1990s before the inspectors arrived remains a mystery. “A CIA assessment was that North Korea probably has one or two nuclear weapons, based on the size of the reactor and how long it had operated,” says Bunn. “But nobody outside North Korea knows how much fuel was unloaded and reprocessed” before the plant was shut down, he says.

    Despite that uncertainty, experts believed that they had put a lid on North Korea's nuclear weapons program. But the status quo was shattered last October, when, according to U.S. officials, North Korea admitted that it was pursuing a clandestine uranium-based bomb project. “Ten years ago we were in a process spookily like this one, in which North Korea gets caught cheating on [a nonproliferation] agreement,” says Gallucci. “Ten years later, they are caught cheating again.” This time, the matter is even more serious.

    Uranium is a naturally occurring element, but most of the metal lying about is of the wrong type for bombmaking. Only a small minority is the bomb-ready uranium-235; more than 99% is its heavier sibling, uranium-238, which is not bomb worthy. To make a uranium bomb, then, a weaponeer must separate the useful U-235 isotope from the uranium chaff. This is a laborious process, and it's difficult to get the isotope separation equipment (such as centrifuges that spin heavier U-238 away from lighter U-235) working with enough efficiency to get high-purity U-235.

    Even though centrifuges might seem straightforward, getting them to work properly is quite difficult. If not done precisely right with finely milled equipment, the uranium-laden gases inside the spinning cylinders can blend together because of turbulence, for example, wiping out any enrichment by centrifugal force. To get around some of the problems with designing and building a uranium-separation project, North Korea reportedly secured assistance from Pakistan, which in return received missile technology.

    The difficulties are outweighed by the payoff, however: Centrifuges can be hidden relatively easily by scattering them throughout the country. “You could easily have 10 or 100 at a site,” says Princeton University nuclear weapons expert Frank von Hippel. “A few thousand is what you need for one bomb per year.” Notes Bunn, “If you want to have a secret program, centrifuges are the only way to go.”
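
    Von Hippel's “few thousand” figure can be sanity-checked with the standard separative-work calculation used throughout the enrichment industry. In the sketch below, the tails assay (0.3% U-235), the 25-kilogram product quantity, and the per-machine output of about 3 separative work units (SWU) per year are assumptions, not numbers from this article.

```python
import math

# Sanity check of the "few thousand centrifuges per bomb per year"
# figure, using the standard separative-work (SWU) calculation.
# Assumptions (not from the article): natural-uranium feed (0.711%
# U-235), 0.3% tails, 90% product, and ~25 kg of HEU per weapon.
def value(x: float) -> float:
    """Value function for separative work."""
    return (1 - 2 * x) * math.log((1 - x) / x)

x_product, x_feed, x_tails = 0.90, 0.00711, 0.003

product_kg = 25.0
feed_kg = product_kg * (x_product - x_tails) / (x_feed - x_tails)
tails_kg = feed_kg - product_kg

swu = (product_kg * value(x_product)
       + tails_kg * value(x_tails)
       - feed_kg * value(x_feed))
print(f"~{swu:,.0f} SWU per bomb; "
      f"~{swu / 3:,.0f} centrifuges at 3 SWU per machine-year")
```

At 3 SWU per machine-year the count comes out near 1,600; first-generation machines often deliver only 1 to 2 SWU, which pushes the total into the few thousands von Hippel cites.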

    The current situation is unsettling for several reasons. In December, Pyongyang expelled the IAEA inspectors, and it has since restarted the Yongbyon nuclear reactor. U.S. officials say it is also pursuing a uranium-enrichment project. And then there are North Korea's recent hints that it has nuclear weapons. “If the North Korean statement is true, we can't actually tell if North Korea has had nuclear weapons for years or days,” says David Albright, president of the Institute for Science and International Security in Washington, D.C.

    Most chilling of all, says Gallucci, who helped negotiate the Agreed Framework, is that U.S. officials believe North Korea would consider selling nuclear material on the open market to anyone with enough money. The items up for sale could include enriched uranium, plutonium, or perhaps even complete weapons. And there are plenty of potential buyers who might wish to join the nuclear club. “This is a qualitatively different kind of threat,” he says. “It is a threat that we would have great difficulty defending against or deterring.”

    The new uranium-based program complicates future nonproliferation efforts involving North Korea. Princeton's von Hippel believes that it might, in theory, be possible to detect a uranium-enrichment program by monitoring whether suspicious compounds, such as uranium hexafluoride, are floating around in the environment. But it would be a daunting task to verify a shutdown of a uranium project in this way, especially without North Korea's approval to take soil and air samples from suspect sites.

    Albright says that any future nonproliferation agreement would need very intrusive inspections to make sure that the uranium program was truly shut down. “They would need access to any place at any time,” he says. Even then, it would be a tough task to prove that the uranium project had ceased.

    The Agreed Framework now lies in tatters, and North Korea's alleged pursuit of a uranium-based program is a serious blow to antiproliferation efforts on the Korean peninsula. Add in the country's dire economic situation, says Gallucci, and you've got a recipe for disaster. “How do we prevent North Korea from selling [nuclear material]?” he asks. “I worry about this more than anything else.”


    Pick Your Poison: U vs. Pu

    1. Charles Seife

    The key problem of detonating an atom bomb is getting a mass of plutonium-239 or uranium-235 into a single, closely packed chunk of metal that can go critical and explode with enormous energy. With uranium-235, all it takes is slamming two masses into each other with sufficient force, plus a tiny burst of neutrons to make sure the reaction gets going. The neutrons split uranium atoms, which produce more neutrons, splitting more uranium atoms, and so forth. In that way, a physical collision of a few tens of kilograms of metal turns into an enormous runaway reaction that can level a city. In fact, the first uranium bomb, exploded on 6 August 1945 above Hiroshima, was little more than a modified artillery gun. Dubbed Little Boy, it slammed a uranium shell into a uranium target at the other end of the barrel.

    Plutonium, on the other hand, is much more finicky. An artillery-barrel design won't work because the nascent chain reaction would blow the shell back from whence it came and the explosion would fizzle. A plutonium bomb requires a spherical shell of explosives that drive the metal inward with no path to escape; only then can the bomb ignite. This is why the plutonium bomb dropped on Nagasaki a few days later, called Fat Man, was round, heavy, and unwieldy.

    If North Korea has a plutonium-based bomb, it must have an implosion-type design already, so it would make sense for it to use that same design for uranium. Although more complicated than the gun-type weapon, it is more efficient and requires less uranium per bomb. Either one, of course, would earn North Korea membership in the nuclear club.


    U.S. Defense Labs Brace for a Blast From Their Bosses

    1. David Malakoff

    Pentagon planners aim to close up to 25% of military facilities over the next 3 years. They are taking an especially hard look at the sprawling network of defense R&D labs

    On their recent dash to Baghdad, U.S. military forces were aided by some of the Pentagon's most advanced gear, including powerful explosives and sophisticated sensors developed by scientists and engineers at Kirtland Air Force Base in Albuquerque, New Mexico. U.S. Navy researchers in California and Virginia also played a role, as did U.S. Army engineers in several states. To defense research advocates, that scientific overlap helps ensure victory on the battlefield. But some Pentagon officials see it as overkill—duplication that the Department of Defense's (DOD's) $11 billion science and technology enterprise can no longer afford.

    Over the next 3 years, Pentagon leaders plan to shed up to one-quarter of U.S. military bases and R&D facilities and consolidate some of their activities. The goal will be to free up billions of dollars for other uses, from new weapons to higher pay. Four previous post-Cold War downsizings have already axed nearly 400 facilities, among them about 60 research, engineering, and test centers, and eliminated some 30,000 technical jobs. But Secretary of Defense Donald Rumsfeld predicts that this round of Base Realignment and Closure (BRAC) could be as big as all the prior rounds combined. And a recently leaked memo suggests that the Pentagon could dramatically reshape its more than 80 research-related facilities, which this year will spend more than $5 billion on basic and applied research into everything from new artillery shells to better bioterror defenses. “The defense labs weren't seriously touched in the previous rounds, so you've got to believe that there are going to be some substantial changes this time,” says Paul Hirsch, a BRAC specialist and president of Madison Government Affairs, a lobbying firm in Washington, D.C.

    Other Pentagon watchers, however, aren't convinced that Rumsfeld will get his way. Many members of Congress are fighting the new BRAC round to protect their districts from potentially huge economic losses. Host communities have already begun to hire pricey consultants to advise them on ways to “BRAC proof” their facilities. And they have history on their side. Turf battles, disagreements over how to evaluate and compare facilities, and the reluctance of top science talent to move to new surroundings have stalled previous consolidation initiatives.

    In New Mexico, a coalition of state and local officials is already working to protect Kirtland, which barely survived the last BRAC process in 1995. The lumbering World War II bombers it was built to serve 60 years ago are long gone, but Kirtland has metamorphosed into a mecca for high-tech weapons and homeland security R&D. It boasts Air Force directorates working on advanced optics and futuristic directed energy weapons, such as airborne lasers, as well as a Pentagon-wide program to improve computer simulations of everything from missile tests to electronic warfare. The Department of Energy's Sandia National Laboratory also opened behind Kirtland's gates in 1949, and a slew of high-tech firms have sprouted nearby. State officials estimate that the base funnels about $2 billion annually into the local economy, creating 20,000 jobs.

    In the spotlight.

    The Pentagon will be closely examining its R&D centers, such as this advanced optics facility in New Mexico.


    To keep Kirtland open, however, backers must convince the Pentagon that the base's several thousand R&D-related employees do work that can't be done elsewhere. The Navy and Army, for instance, run some similar applied science programs, and academia and industry have traditionally argued that they could take on much of the work now done by government employees. But supporters hope the Pentagon will develop special criteria for evaluating its R&D operations that create a level playing field for Kirtland and other research-heavy bases. “You can't weigh the value of an [applied] technical center the way you would a strategic air wing or a basic research laboratory,” says Michael Hogan, president of MassDevelopment, Massachusetts's economic development authority. He is active in efforts to protect Hanscom Air Force Base in Bedford, another Air Force facility with technical know-how.

    The lack of suitable yardsticks helped doom previous consolidation efforts. During the 1995 BRAC, for instance, Pentagon leaders concluded that there was about 35% “excess capacity” in defense labs after examining workloads and trends in total work hours at various labs. But that “methodology was flawed … it unrealistically treated scientists and engineers as interchangeable, conveyable, replicable items—such as hospital beds and hangar space,” concluded Don DeYoung, a research fellow at the Pentagon's National Defense University, in a recent paper.* The experience, DeYoung says, suggests that “the only viable metric for evaluating a laboratory is its track record” of results.

    Kirtland advocates believe that they have delivered the goods, citing work through the years on synthetic aperture radar and devices for detecting and disarming explosives. “Kirtland has created a lot of technology for both military and [civilian] uses,” says Charles Thomas, a Sandia manager and leader of the Kirtland Partnership, a pro-base advocacy group.

    Kirtland and Hanscom also see themselves as potential beneficiaries of Rumsfeld's insistence that the three services unify similar R&D efforts. Hogan, for instance, suggests that Hanscom could easily accommodate similar research and technology purchasing programs from other services. “We've got the red carpet out,” he says.

    A trump card for Kirtland and Hanscom may be their skilled workforces and links to surrounding labs. “If they decided tomorrow to move Hanscom to Ohio, most of the physicists and scientists working at [Massachusetts Institute of Technology and nearby companies] probably wouldn't go,” says Hogan. And any move could compound current Pentagon difficulties in recruiting new technical talent. “It's easier to move a [fighter jet] wing than R&D,” agrees Hirsch.

    It's not clear how such arguments will play with Pentagon brass and the independent commission that will make final closure recommendations in 2005. But a memo leaked last October has already stirred up some defense R&D advocates. In it, DOD Deputy Undersecretary Michael Wynne concluded that the major defense “labs are out of favor and … their overall utility is in question.” To solve the problem, he recommended appointing an internal commission that would identify “those laboratories that are imperative for defense to retain,” close or privatize the rest, and then combine “the remnants” into a DOD-wide research facility—a long-controversial concept. The memo has helped “spark discussion” and “renewed attention to the labs,” Wynne reported wryly at a Senate hearing last month.

    The episode suggests that the Pentagon's political leaders are aiming to “manage the BRAC process with an iron hand” and not allow the turf-conscious services to shelter favorite facilities, says Steve Karalekas, a Washington, D.C.-based consultant working on the Hanscom campaign. “There isn't going to be anywhere [for the labs] to hide.”

    That prospect doesn't bother Kirtland advocates, who are inviting BRAC planners to take a close, hard look. They hope to emerge stronger from a process that may prune sister facilities that fail to show that Uncle Sam needs them, too.


    Taming Pathogens: An Elegant Idea, But Does It Work?

    1. Carl Zimmer*
    1. Carl Zimmer is the author of Evolution: The Triumph of an Idea.

    Skeptics are challenging the popular idea that an evolutionary tradeoff faced by pathogens may be the secret to making diseases less harmful

    In 1859 a rancher decided to introduce European rabbits into Australia so that he could have something to hunt. Before long the rabbits had exploded across the continent, eating so much vegetation that they began to cause serious soil erosion. In the 1950s scientists deployed a biological counteroffensive, myxoma virus, a pathogen from South America. It didn't eliminate the rabbits, but it did provide grist for an ongoing debate about virulence.

    Although myxoma was almost 100% deadly at first, its virulence soon dropped significantly. This fit with a widely held view at the time that all parasites evolve into milder forms as they adapt to the environment. Evolution favors mildness, the argument went, because it allows host and pathogen to enjoy a peaceful coexistence. “If you found a virulent association, you assumed it was recent,” says evolutionary biologist Bruce Levin of Emory University in Atlanta, Georgia. “It was even used as a way of dating parasite-host associations.”

    In hindsight, this conventional wisdom seems naïve. Myxoma became milder, but its decline stopped after a few years and it still remains lethal to many of its rabbit hosts. When evolutionary biologists took a hard look in the 1980s, they realized that a pathogen can evolve to become harmless, more deadly, or anything in between, depending on the forces guiding natural selection. Such forces can also pull the pathogen in opposite directions at the same time, creating an evolutionary tradeoff.

    Experts began to model the competing forces, and some confidently suggested that the models could serve public health, through “virulence management.” For example, by making it more difficult for a cholera-causing bacterium to be transmitted from one individual to the next, they argued, health programs could take advantage of an evolutionary balance to favor benign strains. Deadly organisms, whose toxic machinery is somewhat burdensome, might compete poorly and yield to milder strains. The notion that virulence could be managed in this way has grown increasingly popular; last year Cambridge University Press published an entire book on the topic.

    Now, two prominent evolutionary biologists—James Bull of the University of Texas, Austin, and Dieter Ebert of the University of Fribourg, Switzerland—have raised a provocative dissent. They directly challenge the assumptions that underlie virulence management, arguing that the models used to forecast pathogens' behavior collapse in the face of the true complexity of diseases, and that the quest to manage the virulence of widespread infestations is doomed to failure. Yet even the skeptics agree that the idea might be useful in planning vaccination campaigns (see sidebar).

    Born to be mild

    The concept of virulence management emerged from a particular tradeoff in evolution: the one between how fast a pathogen breeds and how easily it can infect new hosts. Evolution favors parasites that can produce lots of offspring, the argument goes, but producing offspring takes a toll on the host. The more the pathogen feeds on its host, spews out toxins, or otherwise causes damage, the more likely it is to kill. If it kills its host before its offspring can get to a new host, all its efforts are for naught. Some biologists predicted that a parasite's virulence would evolve out of the balance between its competing needs to breed and spread. Changing the balance might change a disease's deadliness. As it becomes easier for a parasite to infect new hosts, pathogens can afford to evolve into deadlier forms. As transmission gets rarer, gentler strains should take over.
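    The tradeoff has a compact mathematical expression, worth sketching here (the notation is the field's usual one, not anything specific to the researchers quoted in this article). A pathogen's evolutionary success can be scored by its basic reproductive number,

    \[ R_0(\alpha) \;=\; \frac{\beta(\alpha)\, S}{\mu + \nu + \alpha} \]

    where $\alpha$ is virulence (the extra host death rate the infection causes), $\beta(\alpha)$ is the transmission rate, $S$ is the supply of susceptible hosts, $\mu$ is the background host death rate, and $\nu$ is the recovery rate. If $\beta$ rises with $\alpha$ but with diminishing returns, because replicating faster transmits better yet kills the host sooner, then $R_0$ peaks at an intermediate virulence $\alpha^*$: selection favors neither harmlessness nor maximal nastiness. Virulence management amounts to reshaping $\beta(\alpha)$, for instance by making new hosts harder to reach, so that the optimum $\alpha^*$ slides downward.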

    Pathogen tamer.

    Paul Ewald argues that virulence can be reduced by shaping the environment to favor milder organisms.


    Many evolutionary biologists were charmed by the elegance of the tradeoff model. Bull himself found support for it in experiments with a virus that infects bacteria. “I confess that when I first heard the ideas I bought them hook, line, and sinker,” says Bull. “A lot of people in the field did. The appeal of the original proposal was that you could take this simple concept of a tradeoff and you could apply it to any infectious disease.”

    As the tradeoff model gained strength, some evolutionary biologists thought it could become the basis for fighting human diseases. Paul Ewald, a biologist now at the University of Louisville, Kentucky, argued that altering the transmission of diseases could make them less dangerous. Putting up mosquito nets over beds and windows, for example, would make it harder for mosquitoes to carry malaria parasites from an infected person to a new host. The virulent form of malaria, which makes its hosts bedridden and often kills them, would be put at a disadvantage. “You should be able to have the milder strains favored strongly by natural selection,” Ewald predicts.

    But some biologists—Bull among them—started to have misgivings. Bull and his colleagues ran new experiments in which bacteria-dwelling viruses evolved under more realistic conditions. They found that faster transmission made the viruses more harmful, as the tradeoff model had predicted, but the difference was so slight that Bull “became much less impressed with the results,” he now says. “I began to think, hmm, this doesn't work too well.”

    Bull last year discovered a kindred spirit in Ebert, who has independently carried out some of the most important experiments on the evolution of virulence. “They're two giants of the field,” says Michael Hochberg of the University of Montpellier, France. Although Ebert's results seemed to support the tradeoff model, he had grown disenchanted as well. “We had similar frustrations,” says Ebert. The two decided to attack the tradeoff model in the January issue of Trends in Microbiology. “We're not pioneers here,” says Bull. “We know lots of other people who feel the same way.”

    They claim that most of the support for the model comes from extreme, unnatural conditions. Myxoma, a veritable poster child for virulence, was not in any sort of equilibrium with its host; an imported pathogen, it was matched against a vulnerable population that had never been exposed to it before. As for experiments, the more natural their conditions, the fuzzier their results became. “You always need additional explanations to keep the tradeoff model working,” says Ebert.

    Ebert and Bull also point out that many real-world diseases fail to support the model. The Spanish flu epidemic of 1918 broke out in the cramped, foul conditions of World War I in which transmission was easy, killing millions, Ebert and Bull acknowledge. But they wonder why foul, cramped conditions have never triggered another flu epidemic since then. And even when the tradeoff model is relevant, its practical value may be remote. “If I told you I can do something about malaria, but it will take me 10,000 years, you'd tell me to forget about it,” says Ebert.

    Ebert and Bull's challenge has been seconded by other researchers. “It's long overdue,” says Levin. Marc Lipsitch of Harvard University adds, “They're quite correct, and that's why I don't work in the field anymore.”

    The mobility factor

    The original architects of the tradeoff model are not impressed by Ebert and Bull's arguments, however. “Nothing very new in this,” says Roy Anderson of Imperial College in London. Ewald calls it “very sad and dangerous.”

    Ewald complains that the critics are trying to demolish a straw man. He says that they leave out a crucial component of his work, for example, the mode by which a disease infects new hosts. If hosts become so sick they can't move, a parasite can only infect other people who come close, unless a vector such as a mosquito can transport it. This factor is crucial in Ewald's explanation of Spanish flu. He doesn't ascribe the deadliness of the epidemic simply to cramped conditions. “That wasn't my argument,” says Ewald. “My argument was that at the Western Front you had conditions in which people who were completely immobilized could contact hundreds or thousands of people.” Sick soldiers were moved on stretchers to triage areas, then to makeshift hospitals, then onto crowded trains. In these conditions, a flu virus could devastate its host but still infect vast numbers of people. “My argument was that we wouldn't see a 1918 pandemic arise unless we duplicated this situation which occurred on the Western Front,” says Ewald.

    Nor does Ewald think critics have addressed his evidence on cholera. Vibrio cholerae makes people sick by releasing a toxin that triggers diarrhea. As a result, competing organisms get flushed out of the bowels while V. cholerae clings to the intestinal lining. It can then release its offspring into the diarrhea to infect new hosts. The bacteria reach those new hosts by two routes. Untreated sewage or runoff from laundered sheets can contaminate drinking water. Alternatively, an infected person can transmit the bacteria while handling food, shaking hands, or engaging in other social interactions—which generally require a host healthy enough to get out of bed.

    Ewald argues that in places with poor sanitation, cholera can make hosts deathly ill but still find new hosts. As a result, it will evolve to high virulence. On the other hand, in places with protected water supplies, that route is cut off. The bacteria's only option for survival is to let the host move around, which translates into reduced virulence.

    Ewald's observations of the cholera outbreak that struck South America in the 1990s support the hypothesis. In countries with poor sanitation, such as Ecuador, the outbreak was far deadlier than in countries with clean water, such as Chile. Ewald also measured the toxins produced by strains of cholera from different countries. He found that toxin production in Chile dwindled through the 1990s. “In Ecuador, it's the mirror image of Chile,” says Ewald. “Over a 6-year period, you have only the virulent strains winning out.”


    An expert on the evolution of pathogens, Dieter Ebert doubts that virulence management can work.


    Other experts agree that the critics have not yet made their case. Andrew Read of the University of Edinburgh, U.K., says of Bull and Ebert, “I think they're reacting to a quite old view. … There was a lot of optimism flung around in the late 1980s and early 1990s. I think the last 10 years have given everybody a feel for the complexity involved and the lack of data, so that nobody that I know of is making wildly optimistic statements. So they're tilting at a caricature.”

    Read nevertheless concedes that evolutionary biologists are a long way from becoming virulence managers: “We don't know enough about any one disease to be enacting anything now.” Even Ewald grants that it will be a tough hike. Just demonstrating that a change in the transmission of a pathogen can make it less harmful to humans would take a colossal study of thousands of people. In some cases the scale would require “billion-dollar experiments.” For now, Ewald suggests, we may have no choice but to continue studying “natural experiments” to see whether virulent pathogens behave as the theorists have predicted.


    Darwinian Vaccines

    1. Carl Zimmer

    Dieter Ebert of the University of Fribourg, Switzerland, and Paul Ewald of the University of Louisville, Kentucky, are for the most part on opposite sides of the current debate over taming pathogens. But there is one point on which they agree: Vaccines may offer a good opportunity for “virulence management.” Ebert calls them “a promising example,” and Ewald says that “you should be able to engineer these vaccinations so that you can drive pathogens towards mildness.”

    The most cited example of this evolutionary effect is diphtheria. The bacteria that cause it come in two forms. A harmless form usually found in the throat takes up whatever nutrients happen to float by, whereas a deadly form produces a toxin that rips open surrounding cells to feast on their contents. Making toxin lets the bacteria reproduce faster; it also triggers coughs that may help it spread. But the toxic strategy is risky because it can be fatal to the host.

    The diphtheria vaccine targets the toxin produced by the bacteria. It has proven very successful since its introduction in the 1920s; in countries with good vaccination programs, death rates from diphtheria have dropped dramatically. According to Benoit Soubeyrand of Aventis Pasteur in Lyon, France, disease records suggest that the vaccine has steered the evolution of the bacteria as well. The harmless bacteria have become more common while the toxin-producing ones have become rare. Yet in places where vaccination programs have been allowed to slip—such as Russia—virulent forms have made a comeback.

    Jekyll and Hyde.

    Diphtheria has two forms, and studies show that vaccination helps cut down the virulent type, even among unvaccinated people.


    Soubeyrand and others have suggested that vaccines drive the evolution of milder forms because they target the toxin and not the organism, selectively increasing the cost of virulence. As a result, the deadly bacteria end up at an evolutionary disadvantage. This may explain why vaccination benefits everyone in a diphtheria-vaccinated area—even the unvaccinated.
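    A toy calculation makes that selective logic concrete (the symbols are ours, for illustration; this is not Soubeyrand's or anyone's published model). Score each strain by transmissions per infection, roughly its transmission rate divided by the rate at which the infection ends. Let the benign strain transmit at rate $\beta$ in hosts that clear infection at rate $\mu$, and let the toxigenic strain gain an extra $\Delta\beta$ in transmission at the price of a metabolic cost $c$ and extra host mortality $\alpha$. With a fraction $p$ of hosts vaccinated against the toxin,

    \[ W_{\text{tox}} \;=\; p\,\frac{\beta - c}{\mu} \;+\; (1-p)\,\frac{\beta + \Delta\beta - c}{\mu + \alpha}, \qquad W_{\text{benign}} \;=\; \frac{\beta}{\mu}. \]

    In a vaccinated host the toxin buys nothing, no $\Delta\beta$ and no $\alpha$, but its production cost $c$ remains, so there the toxigenic strain is strictly worse off than the benign one. As coverage $p$ rises, $W_{\text{tox}}$ sinks below $W_{\text{benign}}$ and the harmless form spreads, even among the unvaccinated. The same bookkeeping captures the worry described below: a vaccine that merely shrinks $\alpha$ while leaving $\Delta\beta$ intact raises $W_{\text{tox}}$ rather than lowering it.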

    Some researchers believe that diphtheria can point the way to the design of other evolution-altering vaccines. But Andrew Read of the University of Edinburgh, U.K., questions some parts of this hypothesis. “I think the data on that is pretty poor,” he says. Read believes that vaccines can have evolutionary effects, but the mathematical models he and his colleagues have constructed hint that those effects could end up being harmful. “If you vaccinate a population, you're protecting the hosts from dying, but you're also protecting the nasty parasites” from the consequences of being nasty, Read explains. The penalty for toxicity isn't high enough. According to Read, unless a vaccine program is 100% effective, newly virulent pathogens will eventually find some unprotected hosts to attack. “In unvaccinated individuals, they'll be nastier than they were before,” Read warns. It may be possible in the future to tailor vaccines to alter the evolution of other pathogens, he thinks, but scientists will have to be careful.


    Will Oil Spell Trouble for Western Pacific Gray Whales?

    1. Paul Webster*
    1. Paul Webster is a writer in Moscow.

    Experts dispute whether efforts to exploit the vast oil reserves off Russia's Far East are harming an endangered whale species

    MOSCOW—In the mid-1990s, scientists who study the endangered Western Pacific gray whale got an unexpected windfall: millions of dollars of funding from energy firms hoping to exploit petroleum reserves in the whale's summer feeding grounds, off Russia's Sakhalin Island. But the money came with a string attached. The companies—Shell Oil's Sakhalin Energy Investment Corp. (SE) and ExxonMobil's Exxon Neftegas Ltd. (ENL)—insisted that grantees sign confidentiality agreements. “I became suspicious. It seemed to me … [that] they had something to hide,” says zoologist Masha Vorontsova, Russia director of the International Fund for Animal Welfare, an environmental group.

    SE is now hoping to dispel such concerns. The company has pledged $5 million for a new 5-year research program to probe whether oil and gas operations could harm the whales and says that scientists will now be freely allowed to publish their findings. Some researchers, however, think the verdict is already in, pointing to harm that they contend the whales suffered during seismic exploration near their feeding areas. The soundings “are disturbing the whales by forcing them to leave the places [where] they fatten in summer,” asserts Eugene Sobolevsky of the Institute of Marine Biology in Vladivostok.

    SE, backed by other scientists, maintains that there's no evidence the whales are affected by its operations and has rejected calls to discontinue seismic tests when whales are nearby and to reroute a planned pipeline around the feeding grounds. A call for proposals for SE's new research effort is due out next month; the resulting studies are meant to supply the data needed to settle the dispute.

    Scientists marvel that there are any of these whales to study. The Pacific grays were thought extinct until 1974, when Robert Brownell Jr., now with the National Oceanic and Atmospheric Administration (NOAA) in Pacific Grove, California, established the existence of a small population of surviving whales, close cousins of the Eastern Pacific grays that were rescued from extinction off California. Hard on the heels of this rediscovery came another revelation: massive oil and gas deposits near Piltun Bay, where the whales feed before migrating to the South Pacific each autumn. Under a 1993 U.S.-Russian environmental agreement, SE and ENL help pay for studies on how development of the petroleum reserves might affect the mammals. Since 1997, $4 million of funding from the two companies, along with money from nonprofits and a U.S.-Russian research program, has nourished a raft of studies.

    Data in the open literature raise disturbing questions. For instance, findings from Brownell, NOAA colleague David Weller, and a Russian team led by Alexander Burdin of the Kamchatka Institute of Ecology and Nature Management suggest that the Western grays are under duress. Based on photographic tracking of the pod, they estimate there may be as few as 100 individuals, of which only 17 are capable of bearing calves. “These whales are among the top five most endangered whale species in the world,” says Brownell. Their condition seems to be deteriorating, he says: 48 whales have recently lost weight.
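    The article does not spell out how photographs become a head count, but the standard photo-identification method is mark-recapture: individual whales are recognized by their unique markings across surveys, and the overlap between sighting sessions yields an abundance estimate. In the simplest (Lincoln-Petersen) version, offered here only as an illustration of the approach,

    \[ \hat{N} \;=\; \frac{n_1\, n_2}{m} \]

    where $n_1$ animals are photographed in one session, $n_2$ in a second, and $m$ appear in both. The closer the overlap $m$ comes to $n_1$ and $n_2$, the smaller the population must be; a catalog in which nearly every photographed whale is a repeat sighting is what pins an estimate near 100.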

    Like oil and water?

    A new research effort aims to settle whether gray whales and oil operations can coexist.


    With the whales on the edge of extinction, Weller argues, “any disruption [of normal feeding patterns] is of concern.” He and his team noticed just such a disruption in 2001, when they found that the distribution of whales in the feeding grounds had shifted significantly during 6 weeks of seismic testing by ENL 4 kilometers away. Sobolevsky observed the Piltun Bay area by helicopter at the time and says that “the whales were forced to leave” during their crucial fattening period; many whales were later spotted in another feeding area further offshore.

    Some experts believe the testing did not harm the whales. Steve Johnson, a biologist with LGL Ltd., an environmental consulting firm in Sidney, Canada, that reviewed the seismic testing program for ENL, says the underwater detonations had “very low-level” impacts: The whales moved around the feeding area more and spent less time resting or eating. What this apparent restlessness means, though, “is difficult to assess,” Johnson says, although he sees no evidence that the whales suffered. Gerry Matthews, director of external affairs for SE, says that in the absence of proof of harm, his company too will continue with seismic work while making efforts to reduce noise and test as far away from the whales as possible.

    Seismic disturbances are not the only potential threat. A toxicological study by V. V. Andreev of the All-Russian Research Institute for Nature Protection in Vladivostok suggests that pollution from drilling will have serious adverse effects on sea urchins, sand dollars, crabs, and shrimp that the gray whales eat. Vorontsova says she is especially worried about a pipeline proposed to run through part of the feeding area, but SE says it will consult marine mammal specialists to develop measures to offset possible impacts.

    With tens of billions of dollars of oil development money awaiting environmental approval from the Russian government, SE wants to speed things along by broadening the scope of the research. “At this point we want to hear from the scientific community” in Russia and internationally, says Matthews. “We'll base our plans on their suggestions.” He adds that the new effort will be managed by an independent panel including sponsors, government officials, and key researchers. And in a change of tactic, SE will push for data to appear in peer-reviewed publications.

    Vorontsova and others argue that the whales can't wait for the outcome of five more years of research. They are lobbying the energy companies to reroute planned pipelines, move drilling platforms farther offshore, and protect the entire feeding area during summertime. There is “enough data to justify [the whale's] critically endangered status,” Vorontsova says. If findings from the new research program are indeed released publicly, then it should soon be evident what steps, if any, must be taken to prevent the Pacific grays from vanishing once again.


    From Bench to Boardroom: Promoting Brazilian Biotech

    1. Alessandro Greco*
    1. Alessandro Greco has just completed a Knight science journalism fellowship at the Massachusetts Institute of Technology.

    The dynamo behind Brazil's sequencing of a crop killer is now calling the shots for the country's biggest biotech venture capital fund

    SAO PAULO—Brazilian biologist Fernando de Castro Reinach remembers exactly when his nation decided to become a global player in genomics. It was during a 1 May 1997 phone call with Jose Fernando Perez, scientific director of the State of São Paulo Research Foundation (Fapesp), the country's third-largest science and technology funding agency. “Let's sequence a genome,” Reinach told Perez, who had recently visited several prominent U.S. labs and come away depressed about the state of molecular biology in Brazil. Perez didn't hesitate. “Good. Send me a proposal,” he told Reinach.

    Three-plus years later, the work that Reinach and his colleagues performed landed on the cover of Nature (13 July 2000). The scientists had completed the full sequence of Xylella fastidiosa, a bacterium that destroys $100 million worth of Brazilian citrus every year. In addition to putting Brazilian genomics on the world scientific map, the achievement highlighted Reinach's pivotal role in efforts to apply modern molecular techniques to one of the country's most important economic sectors.

    A 47-year-old professor at the University of São Paulo (USP), Reinach has for more than a decade led a double life as a scientist and entrepreneur—a rarity in Brazil. Last year, he picked up the pace, becoming general partner for life sciences at one of the country's largest venture capital funds, Votorantim Ventures. Reinach hopes his scientific acumen, political savvy, and disarming smile will help nurture his nation's nascent biotechnology industry. “Brazilian agriculture is already very competitive,” he says. “We want to make it even better.”

    Challenging the king

    Clad in jeans and a casual shirt, Reinach doesn't look like the typical captain of industry. And don't ask for his business card: He's probably left it at home and will scribble down his phone number and e-mail on the back of a card belonging to his second wife, Lucia Hauptman. “He doesn't like to think about everyday things,” says Hauptman, a banker formerly with Credit Suisse First Boston and JP Morgan. “He sees his mind like RAM on a computer: It is limited, so you don't want to fill it with trivial stuff.”

    Reinach has been eager to link the public and private sectors of Brazilian science ever since he completed a Ph.D. with Donald Fischman at Cornell University Medical School and a postdoc with Alexander McLeod at the Medical Research Council's (MRC's) Laboratory of Molecular Biology in Cambridge, U.K. “In the early 1990s, the most successful funds in the U.S. were from the biotech area,” his brother Marcos recalls. “So Manoel [de Sa Benevides, a colleague] and I talked to my brother about what kind of business we could start.”

    Pooling their savings, the three men joined one of Reinach's former graduate students, Martin Whittle, to form Genomic. Founded in 1990, it was one of the first Brazilian companies to perform DNA tests, and one of the first to reach the market with a product. The company immediately became involved in a paternity scandal surrounding Brazil's soccer king, Pelé. At the request of a Brazilian court, Genomic was tapped to test the DNA of a woman who claimed to be Pelé's daughter. Seven years later, a sheepish Pelé acknowledged the woman as his offspring. The very public legal battle was good business for Genomic, now one of the largest DNA testing companies in Brazil.

    Capital ideas.

    Alellyx is one of three biotech companies that Fernando Reinach has started with money from Votorantim Ventures.


    Founding and running the company didn't keep Reinach from his research on proteins that control muscle contraction. A year after the company was founded, he became a full professor at the University of São Paulo. At 35, he was one of the youngest ever to attain that rank. In 1997, he was among the first group of seven Brazilian scientists to receive a grant from the Howard Hughes Medical Institute. “He is not only good, he is one of the best,” says Paul Matsudaira, a biologist at the Massachusetts Institute of Technology in Cambridge who worked with Reinach as an MRC postdoc.

    But even a Hughes grant wasn't enough to occupy Reinach. After getting a green light from Perez, Reinach put together a proposal to sequence X. fastidiosa by divvying up the work among many labs. At $15 million, the project was not only the biggest ever funded by Fapesp, but it also broke the mold of awarding small grants to individuals. Not surprisingly, many senior scientists saw the project as a threat to the status quo and an unwise use of scarce resources. “I was being criticized for taking all this money to do what they saw as Fernando's project,” Reinach recalls. Fortunately, he had strong backing from Perez, who jokes that “I usually don't have good ideas, but I can recognize them.”

    Worried that his visibility could undermine the project, Reinach declined to even apply for the job of overall project coordinator. Instead, the post went to Andrew Simpson, a senior cancer geneticist at the Ludwig Institute for Cancer Research in São Paulo. Reinach quietly agreed to head up one of two main labs that oversaw the sequencing going on at the 30 other labs, keeping the scientists on track and training researchers. Against all odds, the project was a smashing success. “Everything was new for us,” says Simpson. “We weren't experts in sequencing or in pathogens, but we did it.”


    Through it all, Reinach remained active in his own lab. The Science Citation Index from 1981 to 1999 (a year before the X. fastidiosa paper was published) places him in the top 0.1% of biomedical researchers in Latin America, and in the top 2.5% worldwide.

    Despite a full plate, Reinach decided in 1999 to launch another private venture, .comDominio, an Internet hosting service that aimed to bring major sectors of the country online. Reinach worked closely with Persio Arida, former head of Brazil's Central Bank and a major backer of the new venture, which soon became the country's second-largest hosting service provider. “I learned to be a businessman by looking at the way Lucia and Persio worked,” says Reinach, who served as the company's chief technology officer.

    A personal touch

    Reinach's winning personality is as important to his success as are his scientific skills, say his friends. “There is always something interesting to experience with Fernando,” says Matsudaira. “If you see him sitting at the lunch table, you want to sit down next to him and talk.”

    That combination of talent and charisma also captivated venture capitalist Paulo Henrique Oliveira Santos, president of Votorantim Ventures. Santos met Reinach after his company, a $300 million, multisector venture capital fund affiliated with Brazil's largest industrial conglomerate, invested in .comDominio. Many researchers “know a lot about science but little about business,” Santos says. “Fernando has both skills.”

    On the job for a little more than a year, Reinach has moved quickly to promote private sector liaisons in the life sciences. The fund put up $1 million to start Scylla, a bioinformatics start-up headed by an X. fastidiosa project veteran, João Meidanis. It also ponied up $11 million for Alellyx (another company had already claimed its intended name, which is Xylella spelled backward), which hopes to apply genomic information to improve crop quality and productivity. Besides retaining Reinach as temporary CEO, Alellyx includes four veterans from the sequencing project: Jesus Ferro, Paulo Arruda, Ana Claudia Rasera da Silva, and João Paulo Kitajima.

    Reinach is also temporary boss of a company formed this spring with $7 million in seed money from Votorantim Ventures. The company, Canavialis, intends to develop disease-resistant varieties of sugar cane and to improve productivity using genetic engineering techniques. Again, its five partners are academic scientists who, in the last 30 years, have developed 22 varieties of sugar cane. These varieties represent 60% of the sugar cane grown in south-central Brazil.

    A fast start doesn't guarantee success, of course. “In the short term, I don't think it will be possible to use any sequencing information for practical applications,” says Ernesto Paterniani, an agronomic engineer at Esalq-USP who has spent 40 years working on crop enhancement. Even 20 years may not be long enough, he speculates.

    At the same time, the X. fastidiosa project has greatly increased Brazil's capacity to do additional sequencing. It trained at least 50 young molecular biologists, some of whom have gone on to complete the genome sequences of two other pathogens, Xanthomonas citri and Xanthomonas campestris, which are responsible for citrus canker and black rot in crucifers. At least 10 genomes are now being sequenced in Brazil.

    Some scientists also wonder if Brazil, where most research is carried out in public universities, is even ready for private-sector research. “We don't have the tradition of protecting intellectual property produced in our universities,” says USP biologist Sergio Verjovski de Almeida. Almeida says that most cash-strapped universities are reluctant to encourage their scientists to apply for patents, much less to operate an aggressive technology-transfer office. “It is not clear to me how we will deal with this transfer of technology,” he says.

    Outside Brazil, Reinach's hands-on role in the companies that Votorantim Ventures is funding might represent a huge conflict of interest. Not so in Brazil, says Luiz Orenstein, a partner at Dynamo asset management, an investment company based in Rio de Janeiro. A conflict of interest exists only “if you are the CEO of the company and the fund you manage has in its portfolio companies that compete with it,” he says.

    Indeed, many scientists see the companies that Reinach is creating as a welcome opportunity for researchers to work in the private sector. “The Brazilian universities have not been able to absorb all the Ph.D.s that graduate each year,” says Walter Colli, former director of USP's chemistry institute.

    Reinach agrees that his biotech bets are risky. “This is the first time a large Brazilian private-sector group has put money on an academic spinoff,” he says. And there are cultural barriers to overcome. “The universities can't see why it is worthwhile to spin off companies,” Reinach says. “They generally think they are losing researchers to the private sector.”

    This spring, Alellyx scientists announced that a very aggressive mutant of the citrus tristeza virus is the culprit in a newly described “sudden death” disease that caused $20 million in damage to orange crops in Brazil last year. The company hopes to market a laboratory test for detecting the disease within the next 6 months. It's already landed an $8 million contract with Citrovita, a Brazilian company in the Votorantim group that is the third-largest exporter of orange juice in Brazil. Alellyx and Citrovita will work together to improve citrus productivity and resistance to the new virus.

    Last year Reinach sold his shares of the privately held Genomic and .comDominio to his partners. Although he declined to mention a figure, he says “it was a good deal.” Now he's thinking about new investment targets for Votorantim Ventures. “I am looking in the biodiversity area,” he says with his usual disarming smile. Given the country's vast range of ecosystems, it seems an appropriate setting for Reinach to demonstrate his skills as a scientist, entrepreneur, and visionary.
