News this Week

Science  31 Mar 2000:
Vol. 287, Issue 5462, pp. 2386



    Conflict in Congo Threatens Bonobos and Rare Gorillas

    Gretchen Vogel

    The war that has gripped the Democratic Republic of Congo (DRC) for the past 18 months, killing thousands and displacing many more, is also taking a devastating toll on great apes. The front lines of the war, which involve troops from a half-dozen central African nations, cut through the heart of the range of bonobos, the so-called pygmy chimpanzees famous for their language-learning ability in captivity and their complex social behavior in the wild. Data are scarce and largely anecdotal because most researchers have left the country, but local conservationists have been warning in e-mails over the past few weeks that both bonobos and gorillas are severely threatened in parts of the country.

    The bonobos and other apes are not caught in the crossfire but rather are falling prey to poachers as hungry troops and refugees seek out animals for meat. And they are especially vulnerable because military leaders have disarmed the guards in national parks, leaving them unable to patrol regions that might otherwise provide a safe haven.

    The alarm went out in late February, when a conservation worker reported a sharp increase in the number of bonobo orphans arriving in Kinshasa, capital of the DRC, since January. Claudine Andre-Minesi, who runs a sanctuary in Kinshasa for young bonobos confiscated from illegal traders, told colleagues in an e-mail that in the first 2 months of this year, she confiscated eight orphans. The preceding year, she received only two. Eight orphans may not sound like a crisis, but to capture an infant, hunters usually have killed several family and group members, says Jo Meyers Thompson, a bonobo researcher in Snowmass, Colorado, and head of the Lukuru Wildlife Research Project in central DRC. “Those eight and their family members are the tip of the iceberg,” she adds. “If that many infants are visible, there are an even greater number that are not making it in [to Kinshasa].”

    Although Thompson acknowledges that the recent influx of young bonobos might be due to easier river transport between bonobo habitat and the capital, she sees it as an ominous sign. Conservation workers fear that if hunting continues at these “catastrophic levels,” the species, which lives only in central DRC and is relatively little-studied in the wild, could be wiped out.

    Some researchers are skeptical, however. When Michael Worobey, a graduate student with the late William Hamilton at Oxford University in the United Kingdom, traveled in northeast DRC in January, he and his colleagues were “overwhelmed” by the apparently intact forest that stretches over the region. Hunters the team hired to collect fecal samples from bonobos and chimps had little trouble finding animals in the area around Kisangani, a city in northeast DRC controlled by rebel forces. “The impression was that, at least in the area around Kisangani, chimp and bonobo populations have not suffered greatly,” he says.

    But there is little doubt about the threat facing gorillas farther east. Workers in the Kahuzi-Biega National Park, which overlooks Lake Kivu on the DRC's eastern border with Rwanda, report that poachers are devastating the eastern lowland gorilla, which was already threatened. More than half of the 240 gorillas known in one study section have been killed by poachers, according to an e-mail report from primatologist Juichi Yamagiwa of Kyoto University in Japan, who was in DRC last fall and has regular communication with people there. Several researchers say the situation is likely to be even worse in outlying areas of the park, which are more difficult to patrol. More than 80% of eastern lowland gorillas live in or around the park, and the subspecies “is now in critical danger of extinction,” Yamagiwa says.

    To protect the gorillas, several conservation groups have recently negotiated with rebels who control eastern DRC to rearm the park guards, says Annette Lanjouw of the International Gorilla Conservation Program in Nairobi, Kenya. Although the park has been cut off from government funding since the war began, conservation groups and the German development agency Deutsche Gesellschaft für Technische Zusammenarbeit have funded the guards for the past several years. Training will begin in April, says Lanjouw, and armed patrols should resume soon thereafter.

    It is more difficult to protect the bonobos, says Lanjouw, because most live outside official parks. One of the best deterrents to poachers is the presence of researchers, says Sue Savage-Rumbaugh of Georgia State University in Decatur, who studies the language capability of bonobos in captivity. But that could be risky. Although the warring factions signed a cease-fire last summer and the United Nations has agreed to send peacekeeping troops to the region, clashes between government and rebel troops continue. Indeed, Thompson recently heard that her station had been looted and her Congolese collaborators had fled heavy fighting in the area.

    Nevertheless, several research groups are looking into the possibility of returning to the DRC, whether or not the fighting stops. Photojournalist and conservationist Karl Ammann, for one, believes some limited research might be feasible even now. He traveled to northern DRC in February and says that rebel leader Jean-Pierre Bemba expressed support for conservation efforts and invited researchers back to the territory his troops control. Many researchers are reluctant to be perceived as supporting the rebels but are eager to return.

    “The civil war might take several more years,” says bonobo researcher Ellen Van Krunkelsven of the University of Antwerp in Belgium. “We cannot just sit and wait,” she says, because bonobos might not have that long.


    Allègre Loses Job, Research Split Off

    Michael Balter, with additional reporting by Peter Coles in Paris.

    PARIS—Geochemist Claude Allègre was dumped this week as France's minister of research and education in a Cabinet reshuffle. Allègre, whose 3-year tenure provoked strong reactions from scientists, as well as a series of protests and recent school closings by the powerful teachers' unions, was replaced on 27 March by Roger-Gérard Schwartzenberg, a lawyer and veteran politician. Prime Minister Lionel Jospin also sacked three other ministers and split Allègre's domain into two smaller ministries, with Jack Lang, a former culture minister, taking the education portfolio.

    Firing Allègre was not an easy step for Jospin, who has known the scientist since their university days 40 years ago. But for many researchers and teachers, Allègre had become the man they loved to hate. Allègre combined far-reaching reform proposals with an aggressive, combative style, and the mix was highly combustible (Science, 4 February, p. 781). With support for his Socialist government slipping, Jospin apparently had little choice but to dump Allègre and other unpopular ministers.

    Allègre's departure leaves researchers wondering about the views of his replacement, who has no background in science. A professor of civil law at the University of Paris, Schwartzenberg served as secretary of state for education before being elected to the National Assembly in 1986. However, an initial interview with the French radio station France Info, in which Schwartzenberg stressed the importance of research to economic growth and pledged to encourage French industry to invest more in science, has some French scientists hoping for the best. Many researchers criticized Allègre sharply for pushing them to link up with industry without putting similar pressure on companies to take research more seriously. “Nothing was done to induce industry to treat research as other than a furnisher” of raw data, says Harry Bernas, a physicist at the University of Paris's Orsay campus.

    Allègre's director of research, geophysicist Vincent Courtillot, says his boss had launched much-needed reforms. He cites the creation of hundreds of new research positions in the universities and of a fund to allow young researchers to gain independence early and start their own labs as examples of Allègre's commitment to research. Ironically, Allègre's departure came just days after part of his controversial reform package of the basic research agency CNRS was approved by its executive board. Two key elements, CNRS president Edouard Brézin told Science, are “greater freedom” to set its own research agenda and the creation of a “fully independent scientific council.”

    The reshuffling leaves unclear the status of Courtillot, a longtime Allègre colleague and his right-hand man at the ministry. And although many French scientists may rejoice at Allègre's departure, most agree on the need to shake up French research. “He was asking a lot of the right questions,” says Bernas, “but giving the wrong answers.”


    Mirror Film Is the Fairest of Them All

    Robert F. Service

    Imagine holding a rainbow in your hand—a flimsy plastic bag that glistens red, blue, green, violet, yellow, and orange as light bounces off it from different angles. Imagine holding another flimsy bag that is a perfect mirror for light waves oscillating in one direction, or polarization, while transparent for others. Now combine the two, and you can begin to picture the dance of light on a new plastic film produced by researchers at the 3M Corp. in St. Paul, Minnesota, and reported for the first time on page 2451.

    The new material is an assembly of thin, alternating layers of two common plastics that reflect different colors and amounts of light depending on the angle at which the light strikes them. And unlike previous multilayer mirrors, which are best at reflecting light that's traveling perpendicular to the mirror's surface, the new films can reflect light coming in at all angles equally well. That's likely to make them useful for everything from improving the light emission from laptop computer displays to funneling outdoor light deep inside buildings.

    “It looks like a nice idea that can be used in a general way,” says Shaul Mukamel, a chemical physicist and optics expert at the University of Rochester in New York. Mukamel notes that the work marries two long-studied areas in optics: multilayer mirrors and a property known as birefringence, whereby light moves at different speeds as it travels through a material in different directions. The offspring of the marriage is an inexpensive plastic film capable of reflecting more than 99% of the light that hits it. (A typical silver-on-glass telescope mirror reflects only 95%.)

    The 3M researchers didn't set out to reinvent the mirror. They were developing a new set of mirrors to reflect polarized light out of multiple layers of plastics. Such multilayer mirrors and filters had been around for decades. They take advantage of the fact that light waves bounce off boundaries between two materials that pass light at different speeds, such as air and water. Multilayer reflectors amplify this effect by repeatedly alternating a “slow” material (one with a high refractive index) with another that has a low refractive index. Each boundary between layers reflects a fraction of the incoming light. As light waves reflect off different boundaries, their oscillating peaks and troughs can either line up and reinforce one another or cancel one another out. By controlling the thickness of each layer, researchers can determine how these light waves will interfere and thus which colors of light will be reflected.
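    The interference bookkeeping described above can be sketched with the standard transfer-matrix method for thin-film stacks. The sketch below is illustrative only: the refractive indices (about 1.49 for a PMMA-like layer, 1.65 for a polyester-like layer) and the quarter-wave design are assumptions for a normal-incidence calculation, not 3M's actual recipe, and it ignores the birefringence that makes the 3M films special.

    ```python
    import numpy as np

    def stack_reflectance(n_layers, d_layers, n_in=1.0, n_sub=1.5, wavelength=550e-9):
        """Normal-incidence reflectance of a multilayer stack via the
        characteristic (transfer) matrix method."""
        M = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers):
            delta = 2 * np.pi * n * d / wavelength   # phase thickness of the layer
            layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
            M = M @ layer
        B, C = M @ np.array([1.0, n_sub])
        r = (n_in * B - C) / (n_in * B + C)          # amplitude reflection coefficient
        return abs(r) ** 2

    def quarter_wave_stack(n_hi, n_lo, pairs, wavelength=550e-9):
        """Alternating hi/lo layers, each one quarter-wavelength thick optically."""
        ns = [n_hi, n_lo] * pairs
        ds = [wavelength / (4 * n) for n in ns]
        return ns, ds

    # Illustrative indices: ~1.65 for a polyester-like layer, ~1.49 for PMMA.
    ns, ds = quarter_wave_stack(1.65, 1.49, pairs=150)
    print(f"Reflectance with 300 layers: {stack_reflectance(ns, ds):.4f}")
    ```

    Even with the modest index contrast available between common plastics, a few hundred layers push the reflectance past 99% at the design wavelength; each added boundary reflects a small fraction of the light, and the quarter-wave spacing makes those reflections add in phase.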

    The most common multilayer mirrors are made up of alternating layers of two inorganic materials, such as glass and titanium dioxide. Though effective, such mirrors suffer a common drawback: Their refractive index is always the same no matter at which angle the light moves through the film. One result is that certain kinds of polarized light can pass through at sharp angles, because they don't see a change in refractive index as they move through the layers. Polymers, on the other hand, are birefringent: The refractive index can change depending on which way the long, chainlike molecules are oriented in a film.

    The 3M researchers wanted to see if they could use that property of birefringent plastics to reflect all kinds of polarized light. They came up with a new proprietary way to extrude sheets of hundreds of alternating layers of two or more common plastics, such as polymethylmethacrylate and polyester. They then followed the common practice of heating and stretching their polymer sheets into thin films. And when they did, they got a surprise: The resulting films not only were nearly perfect plastic mirrors, but remained almost perfect reflectors even at sharp angles.

    “When we saw it, we thought something weird was going on,” says report co-author Michael Weber. By controlling each layer's thickness and the orientation of the polymer molecules, they found that they could tailor their films to determine exactly which colors and polarizations of light were reflected in any direction. When they searched the literature, they were surprised to find that they were the first ones to control multilayer films in this manner. “It floored us that no one had ever noticed it before,” says Weber.

    People will be noticing soon. The 3M researchers have already started turning the new films into products both serious and fun. Already on its way to market, Weber says, is a way of using the films to improve the performance of displays for laptop computers and handheld organizers. Set at the back of the display, the 3M film can reflect light from an internal bulb out of the screen, thereby saving energy and battery power. Other soon-to-be-seen products include optical filters, iridescent and reflective packaging, bows and ribbons, and—who knows—off-the-rack rainbows, one size fits all.


    Richardson Puts Laser Project on Short Leash

    David Malakoff

    The Department of Energy (DOE) is tightening its oversight of the world's largest laser project, which is years behind schedule and at least $300 million over budget. Energy Secretary Bill Richardson last week announced a series of steps designed to put the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory in California back on track.

    The moves aren't the final word on the troubled project, observers predict. Next month federal lawmakers are expected to receive a highly critical audit report from the General Accounting Office, Congress's investigative arm, followed by new cost estimates from DOE that could balloon NIF's price tag. “NIF is headed for choppy seas,” predicts one House aide.

    The $1.2 billion NIF is designed to focus 192 laser beams on a lozenge-sized target in a bid to test the feasibility of fusion energy and simulate nuclear weapons behavior without actual testing. A host of technical glitches and management missteps have forced Livermore officials to consider scaling back the project and stretching the timeline beyond its scheduled completion in 2003 (Science, 17 September 1999, p. 1831). On 24 March Richardson took a series of interim steps, including the appointment of Livermore weapons scientist George Miller to a new position leading the project, the hiring of an outside contractor to track the project, and the assignment of more DOE staff to oversight tasks. He also ordered demotions and reassignments for an unspecified number of unidentified Livermore and DOE staff. At the same time, in an apparent reaction to NIF's troubles, the University of California has denied a routine pay raise to Livermore director Bruce Tarter, singling him out from among 26 senior managers at the three DOE labs it manages.

    Richardson's announcement generated a terse reply from Tarter, who said he would “work cooperatively with DOE to ensure NIF's success and funding.” Richardson told the House Appropriations Committee earlier this month that his goal is to wake NIF from its “management nightmare.” And although critics say NIF faces formidable technical challenges, from growing ultrapure crystals to pouring extremely fine glass, Richardson says he's “convinced that the underlying science … remains sound.”


    Chipping Away at the Causes of Aging

    Jean Marx

    Aging is not kind. Our skin wrinkles, our hair may fall out, our bones and muscles weaken, and we become increasingly susceptible to a raft of fatal diseases. But despite ever-increasing interest as the baby boomers age—not to mention extensive research—relatively little is known about what causes this physical degeneration. Now, researchers are getting some clues from a hot new technology: DNA microarrays or chips, which enable them to perform wholesale analysis of gene expression patterns.

    In one of a flurry of new studies, a team led by Richard Lerner and Peter Schultz of The Scripps Research Institute in La Jolla, California, has used microarrays to provide a snapshot of the gene changes that occur in aging fibroblasts, the cells that help form skin and connective tissue. As the researchers report on page 2486, some of the changes they found could produce such signs of old age as skin wrinkling. And they also found evidence for what may be a more global explanation of aging: an impairment of the machinery needed for normal separation of the chromosomes during cell division that could lead to genetic instability and a variety of disturbances in gene function.

    “This is an extremely interesting piece of work,” says aging researcher Leonard Guarente of the Massachusetts Institute of Technology. Still, he and others caution that it will be necessary to verify that the changes the Scripps group detected occur in living people and not just in the cultured cells they are working with.

    The Scripps group compared gene expression in cells from healthy people of various ages and also from children with Hutchinson-Gilford progeria, a rare hereditary disorder that resembles an accelerated form of aging. In essence, microarray analysis involves putting snippets of DNA from known genes on a fingernail-sized chip and seeing which ones light up when the chip is exposed to fluorescently labeled DNA copies of the messenger RNAs from the cells under study. These are the active genes. The researchers found that the expression of just 61 genes—out of a total of some 6300 checked—changed with age. Many of these same changes also occurred in the fibroblasts from the progeria patients, a finding that indicates that these individuals, who often die in their early teens from such conditions as heart disease, are indeed experiencing an accelerated form of aging.
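    The kind of screen described above—scan thousands of genes, flag the handful whose expression tracks donor age—can be illustrated with a toy calculation. Everything below is invented for illustration: the data are synthetic, and the correlation cutoff and per-year drift are arbitrary; the actual Scripps analysis used different statistics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_genes, n_donors = 6300, 11          # scale borrowed from the article
    ages = np.linspace(10, 90, n_donors)  # hypothetical donor ages

    # Synthetic log-expression matrix: mostly noise, plus 61 genes that
    # drift linearly with age (the "age-regulated" set we hope to recover).
    expr = rng.normal(0.0, 1.0, size=(n_genes, n_donors))
    true_hits = rng.choice(n_genes, size=61, replace=False)
    slope = 0.2                           # per-year drift for the regulated genes
    expr[true_hits] += slope * (ages - ages.mean())

    # Screen: Pearson correlation of each gene's expression with donor age.
    z = (ages - ages.mean()) / ages.std()
    ez = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)
    corr = (ez * z).mean(axis=1)

    called = np.flatnonzero(np.abs(corr) > 0.9)   # arbitrary cutoff
    print(f"{len(called)} of {n_genes} genes flagged as age-regulated")
    ```

    With only 11 samples per gene, the cutoff matters: set it too low and chance correlations among the thousands of unregulated genes flood the hit list, which is one reason microarray findings like these need independent confirmation.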

    Although the changes the Scripps group found were intriguing, some were not surprising. For instance, several of the fibroblast genes whose expression patterns were altered are involved in forming and remodeling collagen and other proteins of the extracellular matrix, which provides support for the skin and other tissues. The researchers also observed up-regulation of genes involved in inflammation, which has been linked to a variety of the ills of old age, including heart disease and Alzheimer's. But perhaps the most intriguing change was the down-regulation of a set of some 15 genes that help control mitosis, the part of the cell cycle in which cells actually divide.

    A common consequence of defects in the genes involved in mitosis is chromosome instability, a known contributor to cancer development, as it can lead to loss of genes that suppress tumor formation or activation of genes that promote it. Chromosome instability may also be a more general contributor to aging, by triggering the malfunction of genes other than those involved in cancer. From this, Lerner concludes, “aging is predominantly a disease of mismanagement of cell division checkpoints.” Although other studies had pointed in that direction, the Scripps work “provides much more solid evidence for that idea, because the [microarray] screen picked up a lot of genes involved [in mitosis],” says aging researcher Judith Campisi of Lawrence Berkeley National Laboratory in California.

    Still, Campisi and others would like to see the results confirmed. She notes that the Scripps team tested just 11 cell lines. Another concern is that the cells, which were obtained from commercial sources, might not be comparable in such features as the ability to divide, although Lerner says his team controlled for that possibility.

    But if they're correct, the new findings would also suggest that different gene changes underlie aging in different tissues. In previous work, a team led by Richard Weindruch and Tomas Prolla of the University of Wisconsin, Madison, used microarrays to probe aging in mouse skeletal muscle (Science, 27 August 1999, p. 1390). Like the Scripps team, the Wisconsin team found that aging did not cause widespread alterations in gene expression.

    Just 55 genes, or slightly less than 1% of those assayed, showed decreased activity in aged animals' muscles, while the activity of a comparable number went up. Many of the genes whose activities declined produce proteins needed for energy production and the synthesis of proteins, lipids, and other cell constituents—changes that could account for the muscle weakening that occurs with age. In contrast, many of the genes whose activities increased produce so-called stress proteins, which are needed to repair or eliminate damaged DNA or proteins. What's more, the Wisconsin team found that many of the changes they saw in aged animals fed a normal diet did not occur in mice on a calorie-restricted diet—a finding that provides a long-awaited explanation for how calorie restriction extends rodent life-spans.

    But except for some stress-response genes, there was little overlap between the alterations the two groups saw. That suggests, Prolla says, that the fibroblasts, which are dividing cells, and the skeletal muscle cells, which have lost that ability, “probably undergo aging through two different mechanisms—a very important observation.”

    Whether the type of microarray studies being done by the two teams will help people live longer, or at least healthier, lives, is anyone's guess. Says Lerner: “Is the Fountain of Youth here, because of this paper? I don't think so.” Still, the research is providing ideas to explore about the causes of aging—in other words, a fountain of knowledge, if not youth.


    Drug-Resistant TB on the Rise

    Erik Stokstad

    Tuberculosis is back with a vengeance. Once nearly vanquished by antibiotics, at least in the developed world, tuberculosis resurged in the late 1980s and now kills more than 2 million people a year—second only to AIDS among infectious diseases. Especially frightening is the emergence of drug-resistant strains. The wake-up call came in the early 1990s, when New York City was hit with an epidemic in which about 9% of cases were resistant to two or more TB drugs. The outbreak took years—and cost $1 billion—to quell. But global trends in resistant TB, though the subject of considerable speculation, have been unknown.

    Now, the World Health Organization (WHO) has some answers, and they are grim. Drug-resistant TB is rampant and appears to be spreading, WHO concludes in its most comprehensive report to date. Said WHO Director-General Gro Harlem Brundtland: “This report confirms our worst fears.” WHO warns that these resilient strains could cripple the economies of developing nations and could erupt in Western countries as well.

    Scientists blame the rise of resistant strains on a history of drug misuse. Sometimes doctors do not prescribe the proper course of treatment, which involves taking a mix of drugs for up to 6 months. Understandably, some patients fail to comply. And in many poor or war-torn countries, drugs are not always available.

    WHO's new report covers 72 regions and has statistics on 28% of known TB cases. In three of the 28 areas for which data were available from a 1997 WHO survey, the prevalence of drug-resistant TB has skyrocketed. In both Germany and Denmark, it climbed by 50% since 1996, to 10.3% and 13.1% respectively. New Zealand fared even worse—resistance more than doubled to 12%.

    Experts agree that treating TB properly from the outset can prevent the rise of resistant strains. The best weapon is a strategy called Directly Observed Treatment, Short-course (DOTS), in which health workers make sure that patients swallow every pill over the long course of treatment. The problem is, only 21% of TB patients around the world received DOTS in 1998. And for TB strains resistant to two or more drugs, known as multidrug-resistant (MDR) TB, conventional drugs and DOTS don't work. Instead, health workers must rely on second-line drugs that are less effective and more expensive. Treatment can cost up to $250,000 per person and take 2 years—well beyond the reach of many poor countries.

    Now that MDR TB has arisen, these strains are spreading through communities. The problem is particularly severe in six regions—Estonia, Henan Province in China, Latvia, the Ivanovo and Tomsk regions in Russia, and Iran—where between 5% and 14% of first-time TB cases are multidrug resistant. The spread of resistant strains has ignited intense debate over whether it's best to spend scarce resources on treating the widespread susceptible strains or on tackling MDR strains. One thing that's clear, however, is that the problem is a devil of our own making. Resistance is “almost an inevitable consequence of bacterial evolution and human nature,” says molecular epidemiologist Peter Small of Stanford University. “The bug wins again.”


    Bat Researchers Dispute Rabies Policy

    Christine Mlot
    Christine Mlot is a science writer in Madison, Wisconsin.

    How noticeable is a bat bite? That seemingly esoteric question is center stage in a dispute over the small risk of catching rabies from bats. Based on some puzzling human rabies deaths, public health officials, including the Centers for Disease Control and Prevention (CDC) in Atlanta, recommend that in some circumstances people exposed to bats get rabies shots, even if there's no evidence of a bite. Bat researchers counter that the animals don't attack and leave stealth bites and that the policy sparks unwarranted fear of bats. The “infinitesimally small risk” of rabies from bats, says bat researcher Thomas Kunz of Boston University, has been blown “out of proportion.”

    After years of grumbling about the effects of CDC's stance, participants in the annual North American Symposium on Bat Research—which after 30 years of meetings has recently been organized into an official scientific society—are releasing a statement next week to address what they see as bad science and bad press on the issue. The researchers “find no credible support” for what they call “the undetected bite hypothesis” and argue that “it should not drive public policy.”

    Federal health officials, however, are unapologetic for what they see as a cautious stance concerning a frightening if rare disease. Of the 27 rabies deaths in the United States in the 1990s, 20 were traced to viral strains associated with bats, the rest to canine strains. Current federal immunization guidelines, established by an advisory committee to CDC, say that rabies vaccination “is appropriate even in the absence of a demonstrable bite, scratch, or mucous membrane exposure in situations in which there is reasonable probability that such contact occurred (e.g., a sleeping person awakens to find a bat in the room …).” “I think our message is pretty reasonable,” says Charles Rupprecht, chief of the CDC's rabies section.

    Bat researchers say that this advice is based on a faulty premise. CDC maintains that only one of the recent deaths “had a definite history of a bat bite,” implying that the others may have been the result of undetected or unreported bites. But bat researchers insist that a nip from a bat wouldn't go unnoticed. It's “not a bad bite, but it gets your attention,” says biologist Thomas A. Griffiths of Illinois Wesleyan University in Bloomington.

    Bat researchers agree that there's good evidence of a bat on the scene in all but a few of the cases of bat-associated rabies. They argue, however, that some victims may have been bitten but died before reporting a bite, or were bitten long ago and forgot the encounter, as apparently happened in some canine rabies deaths. In the few cases with no evidence of bats at all, a cat might have killed a rabid bat, then transmitted the virus to a person, says Denny Constantine, a retired rabies researcher with the state of California and CDC. Supporting that idea is the fact that these deaths were caused by a virus linked to two species rarely seen around people.

    The federal guidelines have sparked overreaction, observers say, such that many people get rabies shots—costing an average of $2000 a series—after simply being in the same room with a loose bat. “What has happened in many places is people have gone beyond [the CDC guidelines],” says Stephen Frantz, a disease-vector specialist with the New York State Department of Health. In 1998, for example, to comply with state and federal guidelines, 52 boys at a summer camp in New York were vaccinated against rabies after a bat flew through their cabins.

    The bat researchers' statement warns that such overreaction can have “negative consequences for bats,” many of which are endangered or in rapid decline, notes Merlin D. Tuttle of Bat Conservation International in Austin, Texas. Whenever a species is made out to be a public health threat, says Kunz, it counters conservation efforts. Declared Tuttle at a bat research symposium last October in Madison, Wisconsin: “This has set back conservation efforts by about 2 decades.”

    2001 BUDGET

    Austerity Push Begins a Bumpy Ride for R&D

    Andrew Lawler

    A booming economy, an enthusiastic president, and a supportive Congress should provide science and technology with safe passage through the turbulent annual budget cycle. That's the conventional wisdom. And while many budget watchers predict that science will eventually prevail, a bitter partisan battle now under way over government restraint in an era of surpluses is giving R&D advocates a collective case of the jitters. “It's going to be ugly,” bemoans one congressional aide.

    The weapon of choice among conservative Republicans in their battle against increased government spending is the budget resolution, a measure that sets overall funding limits within the one-third of the $1.8 trillion budget funded at Congress's discretion. Conservatives such as Senator Phil Gramm (R-TX) want to adhere as closely as possible to spending caps imposed when the government ran an operating deficit, while Democrats and the Administration argue for the need to increase spending on education and other domestic programs (Science, 11 February, p. 952).

    The House took the first step down the 2001 budget road last week, narrowly passing a plan that would set discretionary spending well below the president's request. The Senate is still struggling with its version, which must be reconciled with the House in a process that Republican leaders have vowed to complete next month.

    The House figure of $596 billion for discretionary spending exceeds Gramm's demand for a freeze at this year's level of $586 billion, and it is $45 billion over the scheduled cap. But Clinton requested $625 billion. And not only did the House cut the total, it also wants to boost the defense budget—which takes up slightly more than half of all discretionary spending. That will make it even tougher to fund the vaunted 17% increase for the National Science Foundation (NSF) and boosts for other civilian R&D agencies. “I think everyone should prepare for a bumpy ride again,” warns Representative Jim Walsh (R-NY), who chairs the panel that funds NSF, NASA, and the Environmental Protection Agency.

    Even NASA's modest 3% requested boost could be in trouble. Representative Alan Mollohan (D-WV), ranking minority member on the panel, says his subcommittee “will have a hard time preventing another round of cuts in NASA's budget” despite strong support for its mission. The panel's priorities will be to fund hefty increases in housing and veterans' medical care, says aide Frank Cushing, who adds, “we need [more] dollars.” The combination of increased defense spending and the push for more education funding will make it doubly difficult to repeat the 15% boosts of the past 2 years for the politically popular National Institutes of Health (NIH). And election-year politics may result in another year of legislative gridlock for the NIH appropriations bill. “We're paralyzed,” complains one aide.

    Alarmed at such talk, Representative Vern Ehlers (R-MI) last week began rallying support behind civilian science. “We are concerned that funding for science may take a back seat” to housing, veterans' health care, and education programs, he wrote in a 20 March draft of a letter to colleagues that is aimed at the House leadership. “We ask you, in the strongest words possible, to assign a high priority to basic scientific research.” There is evidence that the leadership is listening. Ehlers and Representative Rush Holt (D-NJ), both physicists, successfully lobbied last week for civilian R&D spending to get a tiny sliver more—some $100 million—of the discretionary budget pie. Although appropriators are not bound by those numbers, “it shows there is political interest” in funding science, one House staffer says.

    So despite the inevitable posturing and election-year rhetoric, many observers insist that R&D will ultimately prevail. Last year, for example, NASA's spending panel, after strong pressure from the White House, found a way to restore steep cuts in space science. “The outlook for federal R&D in 2001 is highly favorable,” states a preview of the annual budget report published by the American Association for the Advancement of Science (which publishes Science). “It seems almost certain,” the report predicts, that science and technology will achieve funding levels equal to or even surpassing those requested by Clinton. Even more certain, however, is that it will take a lot of political horse trading for that conventional wisdom to prevail.


    Aging NASA Satellite Headed for Fiery End

    1. Laura Helmuth

    NASA has decided to euthanize an ailing but still functioning satellite rather than run the slim risk that it could spin out of control and crash in a populated area. Last week NASA described plans for a late-spring maneuver that will cause the Compton Gamma Ray Observatory (CGRO) to enter the atmosphere and break apart across a swath of the Pacific Ocean.

    The satellite is equipped with three gyroscopes, one of which failed late last year (Science, 21 January, p. 403). Although it needs only two to perform its scientific duties, a second failure—a possibility the Goddard Space Flight Center in Greenbelt, Maryland, puts at 10% in the next 3 years—would make the craft more difficult to control. “It was my decision [to put the craft down],” says Edward Weiler, NASA's associate administrator for space science, noting that his decision follows the original contingency plan. Although the craft is still productive, he says, NASA was faced with trying to calculate “how many papers are worth an increased risk to human life?”

    Launched in 1991, the CGRO has already lasted almost twice its intended 5-year life-span. It mapped the sky's gamma rays, found the first gamma ray pulsars, and determined that gamma ray bursts originate beyond the Milky Way and are likely signs of the largest explosions since the big bang. Researchers have published more than 1000 papers based on CGRO data, including a paper last week linking some gamma ray sources to a new class of mysterious objects (see next story).

    CGRO's scientific instruments will be turned off on 26 May. Five days later, a Goddard team will execute the first of four burns designed to shove the craft out of its orbit. The final two burns are planned for 3 June. Most of the 15,000-kilogram spacecraft will disintegrate and burn up when it hits the atmosphere, but 30 to 40 pieces will survive, ranging from bolt-sized fragments to chunks possibly as heavy as 1000 kilograms. The detritus is expected to scatter over an area 25 kilometers wide and 1550 kilometers long in the Pacific Ocean, about 4000 kilometers southeast of Hawaii.

    CGRO's demise will temporarily close a window onto gamma rays, which are blocked by the atmosphere. “Astronomers around the world will be quite disappointed,” says CGRO project scientist Neil Gehrels. In 2003 NASA plans to launch a satellite, Swift, that will monitor gamma ray bursts, and in 2005 GLAST will focus on high-energy gamma ray phenomena. In the meantime, says Gehrels, “we have 62 days, 19 hours, and 42 minutes left” before the receivers are shut down. “We're going to make the best use of every one of those minutes.”


    Sky Survey Finds Mysterious Strangers

    1. Charles Seife

    Amid the diffuse bath of gamma rays coming from the galaxy, about 200 point sources—tiny gamma ray beacons—twinkle within the haze. For 2 decades, astronomers have been puzzling over what they are. Now astronomers at NASA's Goddard Space Flight Center in Greenbelt, Maryland, have doubled the mystery: They have discovered that these point sources come in two different varieties.

    Neil Gehrels and his colleagues at Goddard analyzed the data from more than 4 years of observations with the EGRET telescope aboard the Compton Gamma Ray Observatory. Most of the sources lie along the galactic plane, but a few dozen lie in the middle latitudes, as much as 40 degrees above and below the galaxy's equator. In last week's issue of Nature, the team reported that the sources away from the galactic plane turned out to be surprisingly dim. “We didn't know what to make of the middle-latitude sources,” says Gehrels. “The new population is much weaker than ones along the plane.”

    The mid-latitude sources appear to lie in the so-called Gould belt, a broad, expanding ring of stars, gas, and dust about 3000 light-years across. If so, they are our galactic neighbors. The Gould belt formed 30 million or 40 million years ago, probably when a series of stars exploded or some other powerful event disrupted the matter in our region of the galaxy. Clouds of gas, shocked outward, created a burst of new, massive stars. The bright sources in the galactic plane, in contrast, are likely to be tens of times farther away and thus, in absolute terms, vastly more powerful.

    Isabelle Grenier, an astronomer at the University of Paris VII, finds the argument compelling. “What [Gehrels] has shown is that there is a clear difference between those out of the plane and those along the Milky Way,” she says. “Now we're all convinced that there are two populations.” Grenier also agrees that the weak sources are located in the Gould belt. “If you look at the distribution, you can really find that it follows the Gould belt,” she says.

    If Gehrels's team is correct, then the weak mid-latitude gamma ray sources and the intense equatorial sources may be entirely different types of objects. “The luminosity is so different between the two populations, there must be very different physical mechanisms,” says Gehrels. Perhaps the mid-latitude sources are gamma ray pulsars, rapidly spinning geriatric stars that for some reason bathe us in gamma rays rather than the usual lower frequency radio waves. (If so, the gamma rays should come in periodic bursts, but current instruments cannot tell whether they do.) The galactic-plane sources might be “microblazars,” black holes that are spitting jets of mass and energy directly at us, or high-velocity gas clouds emitting gamma rays as they slam into massive stars.

    Until new gamma ray observatories, such as NASA's Gamma Ray Large Area Space Telescope (GLAST), start to go into orbit in 2005, astronomers will have to try to solve the puzzle by finding and studying the sources using other kinds of radiation, chiefly x-rays. But if the gamma ray sources are invisible in those frequencies, Grenier says, the mystery will remain unsolved: “If we do not find counterparts, we are forced to wait for GLAST.”


    Physicists Unveil Schrödinger's SQUID

    1. Adrian Cho
    1. *APS March Meeting, 20 to 24 March.

    MINNEAPOLIS It doesn't purr, but a tiny superconducting ring is the closest thing yet to Erwin Schrödinger's famous dead-and-alive cat. At last week's meeting here of the American Physical Society,* physicists announced that, under the right conditions, such rings can carry current in opposite directions at the same time—a feat never before performed in an object so big.

    For decades, Schrödinger's cat has been the stock example of a paradoxical but fundamental property of quantum mechanics: that an object can be in two or more states at the same time. But physicists' favorite feline remains purely hypothetical, because objects much bigger than individual atoms, photons, and molecules generally interact strongly with their surroundings, which force them to choose one state or another. Now two teams of researchers have induced millions of electrons to flow simultaneously both ways around a small superconducting ring with a nonsuperconducting notch in it, a gizmo known as a superconducting quantum interference device, or SQUID.

    A SQUID prefers the total magnetic flux passing through it to equal an exact multiple of a fundamental constant known as the flux quantum. Add or subtract a fraction of a flux quantum by changing the field the SQUID sits in, and the ring tries to round off to the nearest whole value by creating an electric current that adds to or subtracts from the imposed magnetic field. Because the SQUID can round up or down, the current can flow in either direction.
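    The rounding behavior can be sketched in a few lines. This is an illustrative Python snippet, not anything from the experiments themselves; the function name and the convention of measuring flux in units of the flux quantum are assumptions of mine:

```python
# Illustrative sketch of the flux-rounding behavior described above.
# Units: flux measured in flux quanta (Phi_0 = h/2e); the function name
# and sign convention are mine, not from the experiments.

def screening_response(applied_flux: float) -> tuple[float, str]:
    """Return (flux the ring's current must supply, direction of flow)."""
    nearest_whole = round(applied_flux)        # the value the ring "prefers"
    correction = nearest_whole - applied_flux  # flux the current must add
    if correction > 0:
        direction = "flows one way, adding to the imposed field"
    elif correction < 0:
        direction = "flows the other way, opposing the imposed field"
    else:
        direction = "no current needed"
    return correction, direction

print(screening_response(0.3))  # rounds down: current opposes the field
print(screening_response(0.7))  # rounds up: current adds to the field
# At exactly 0.5, rounding up and rounding down cost equal energy --
# the degenerate case where the quantum oddities set in.
```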

    It's when the SQUID tries to polish off exactly half a flux quantum that quantum oddities begin. In that case, the energy in the ring is equal for current flowing either way, clockwise or counterclockwise, and the SQUID cannot absorb energy by jumping between the two. However, the both-ways-at-once case is different. Such mixed quantum states come in pairs, one with slightly higher energy than the other. As a result, a SQUID can consume energy by hopping from one mixed state to the other.
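    In textbook terms this is a two-level system. The sketch below is standard quantum mechanics rather than anything drawn from the two papers: |L> and |R> denote the clockwise and counterclockwise current states, and Delta is the tunneling amplitude that mixes them.

```latex
% Two-level sketch (standard QM, not taken from the papers themselves).
% \epsilon is the energy bias between the two current directions;
% at exactly half a flux quantum, \epsilon = 0.
H = \begin{pmatrix} \epsilon & -\Delta \\ -\Delta & -\epsilon \end{pmatrix},
\qquad
E_\pm = \pm\sqrt{\epsilon^2 + \Delta^2}
% At the degeneracy point the eigenstates are the mixed states
% |\pm\rangle = (|L\rangle \mp |R\rangle)/\sqrt{2},
% split by E_+ - E_- = 2\Delta, so a microwave photon with
% h\nu = 2\Delta hops the SQUID from one mixed state to the other.
```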

    To test whether they had really achieved a two-way flow, the two teams of researchers fed their SQUIDs energy by shining microwaves on them. Physicists Hans Mooij, Caspar van der Wal, and their colleagues at Delft University of Technology in the Netherlands used microwaves with just enough energy to make their SQUID jump between one mixed state and its partner and took the absorption as evidence that the mixed states were there. In contrast, physicists James Lukens, Jonathan Friedman, and colleagues at the State University of New York, Stony Brook, gave their SQUIDs microwaves with an even bigger energy boost. The SQUID absorbed at two nearly equal frequencies, revealing a pair of mixed states well above the energy of the initial state. Without mixed states, it would have absorbed only one frequency. Both teams concluded that their SQUIDs had achieved the mixed state of two-way flow.

    Their both-ways-at-once currents may earn SQUIDs a starring role in the processors of quantum computers. Whereas ordinary computers use bits that can be either 0 or 1, quantum computers require “qubits,” which can be 0, 1, or 0-and-1. Researchers have fashioned handfuls of qubits out of individual atoms, molecules, and photons. But SQUIDs should be easier to manipulate, Friedman says, because they can be a million times bigger than an atom and mass-produced on silicon chips. Still, Eli Yablonovitch, an electrical engineer at the University of California, Los Angeles, says SQUID researchers face the same daunting challenge as others trying to develop practical qubits: preserving the mixed quantum states in a hostile world. “They've got to get down in the trenches with the rest of us to solve that problem.”


    Lab Accident Damages Solar Flare Satellite

    1. Andrew Lawler

    A NASA mission to study solar flares has suffered a rough ride even before it leaves the ground. A vibration test of the High Energy Solar Spectroscopic Imager (HESSI) went mysteriously awry on 21 March, damaging the spacecraft's solar panels and possibly other components and forcing NASA to postpone its launch from July to next January. “It's devastating,” says principal investigator Robert Lin, a physicist at the University of California, Berkeley.

    The unusual incident also was bad news for NASA managers, who last week were busy on Capitol Hill answering questions about a spate of recent agency failures and snafus. The HESSI program to date has been a model of NASA Administrator Dan Goldin's faster, cheaper, better approach to science, say program officials. The $70 million mission to explore the physics of particle acceleration and energy releases in solar flares was conceived and executed in 3 years, and the spacecraft was slated to go up this summer, in time to monitor the solar maximum later this year.

    It's not known what went wrong, says Lin. He and his team contracted with the Jet Propulsion Laboratory in Pasadena, California, to conduct various tests at the lab's sophisticated facilities. During a vibration test, a standard procedure to mimic the stresses and strains encountered during launch, HESSI was mistakenly subjected to forces more than 10 times the appropriate level. Before the test could be halted, the solar panels cracked and the structure was damaged. “We still don't know what's been damaged,” says Lin, or how severely. “We'll have to take everything apart.”

    A January launch will still give researchers good data on the nature of solar flares, he says, as some of the most violent typically occur shortly after the peak of solar activity. “But we wanted to get the maximum of flares,” he adds. NASA has asked a team of investigators to report by May on what went wrong.


    In the Crossfire: Collins on Genomes, Patents, and 'Rivalry'

    1. Eliot Marshall,
    2. Elizabeth Pennisi,
    3. Leslie Roberts

    As the public effort to sequence the human genome comes into the final stretch, its methods and goals are increasingly being challenged by private firms

    Francis Collins, head of the National Human Genome Research Institute (NHGRI), is in the hot seat. As the leader of the public effort to sequence all 3 billion bases of the human genome, he's helping steer the most ambitious and visible project ever undertaken in biology. And he is facing a huge challenge on several levels from a privately funded team led by J. Craig Venter, president of Celera Genomics in Rockville, Maryland. Both teams are racing to complete a draft of the human genome in the next few months, and both are engaged in a vigorous, and sometimes noisy, competition.

    Just in the past few weeks, any hope of collaboration between the rival teams broke down, with each side accusing the other of bad faith. At issue are the terms of data release: The leaders of the public project insist it be immediate and unrestricted; Celera wants those who use its data to agree not to redistribute them to others. On 14 March, the disagreement ratcheted up a notch when President Bill Clinton and British Prime Minister Tony Blair applauded the policy of instant data release. But the statement sent biotech stocks—including Celera's—into a nose dive.

    Controversy is not new to the genome project. Indeed, 7 years ago when Collins, an M.D.-Ph.D., took the helm, many scientists were arguing that the project was too ambitious for its own good. It was folly, they argued, to promise the public that the entire human genome could be sequenced by 2005, the initial target date. Biologists bemoaned the entry into their field of “big science.” Today, such concerns seem remote.

    In 1995, NHGRI began to accelerate the effort, funding six pilot projects in high-volume sequencing. A turning point came in 1998 when Robert Waterston at Washington University in St. Louis, who is funded by NHGRI, and his collaborator John Sulston of the Sanger Centre near Cambridge, U.K., who is funded by the Wellcome Trust, announced that they had deciphered the complete genome (97 million bases) of the nematode Caenorhabditis elegans.

    Meanwhile, at The Institute for Genomic Research, a nonprofit in Rockville, Maryland, Venter was perfecting a faster “whole-genome shotgun” approach. He wowed the community in 1995 by producing the complete genome of the bacterium Haemophilus influenzae (1.8 million bases long) at record speed. In May 1998, Venter dropped a bombshell: Backed by PE Corp. of Norwalk, Connecticut, he launched Celera and announced that it would sequence the entire human genome by 2001 using the whole-genome shotgun method.

    Collins and members of the Human Genome Project (HGP) consortium responded by speeding up their own timetable. In September 1998, they announced that they would produce a “working draft” by 2001, covering 90% of the human genome with one error per 100 bases. By 2003, the public consortium promised to deliver a 99.99% complete human genome.

    In a meeting at his National Institutes of Health (NIH) office on 14 March, Science asked Collins to discuss his views on this controversy, the rivalry with Celera, and future priorities of the HGP. The following is a transcript of the interview edited for brevity.

    Q: Are you on target for finishing the draft human genome? Has your goal for completeness and accuracy changed?

    A: We are on target and expect to reach that goal this spring. In fact, we just passed the 2 billion base pair mark, which is about 70%. It took 4 years to obtain the first billion and 4 months to get the second billion. We're adding 10% of the genome to GenBank each month now. The goal for completing the working draft has not changed since it was first announced: 90% coverage of the euchromatic [informative] portion of the human genome sequence.
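    Collins's figures are easy to check with back-of-the-envelope arithmetic; in the sketch below, the 3-billion-base total comes from the article's opening description, and the rest are the numbers he cites:

```python
# Back-of-the-envelope check of the sequencing pace Collins cites.
TOTAL_BASES = 3_000_000_000   # full human genome, per the article
sequenced = 2_000_000_000     # "just passed the 2 billion base pair mark"

fraction_done = sequenced / TOTAL_BASES
print(f"{fraction_done:.0%}")  # 67% -- "about 70%"

# "adding 10% of the genome to GenBank each month": months until the
# working draft's 90% coverage target is reached
monthly = 0.10 * TOTAL_BASES
months_left = (0.90 * TOTAL_BASES - sequenced) / monthly
print(round(months_left, 1))  # 2.3 -- consistent with "this spring"
```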

    Q: What do you think of the fruit fly sequence, recently published by Celera Genomics in collaboration with the Berkeley Drosophila Genome Project?

    A: It is a remarkable accomplishment—an extremely exciting opportunity to look at the complete instruction book of an organism that has occupied the minds of geneticists for 100 years. It will require another year of cleanup to close the gaps. But there is no question that as it stands now, it is of tremendous value to the community.

    Q: Does this prove that Celera's whole-genome shotgun strategy works?

    A: The paper tells you that, at least for Drosophila, with its 120 megabases of euchromatic DNA, the whole-genome shotgun method works quite well. Not perfectly. There are certainly places where the algorithm could not assemble the sequence; there are more than 1000 gaps. But it is an excellent proof of principle, though it also indicates that there's a lot more refining to do to make it practical for larger genomes.

    Q: Has the rivalry between Celera and the Human Genome Project gone beyond healthy competition?

    A: There are substantive issues about data access at the heart of the situation: Will the sequence of the human genome be freely accessible without restrictions of any sort to researchers in the private and public sectors, or will it not? Regrettably, relatively little of the press attention has focused on those bedrock issues. Far too much has been written about the personalities and the “rivalry.”

    Q: Have companies complained to Congress or the Administration about your handling of data release?

    A: We have no way of tracking such complaints. But many companies have repeatedly expressed to NHGRI, the Congress, and the Administration their strong support of the Human Genome Project's data release policy. Congress remains strongly supportive of the HGP's data access policy, as evidenced by the recent remarks of Congressman John Porter [R-IL] at the NIH Hearing on 8 March, and Congressman David Obey [D-WI] 2 weeks earlier. As shown by the recent Clinton-Blair statement, the Administration's support for the HGP and its position on immediate data release could hardly be more enthusiastic.

    Q: You and others wrote to Celera in February that it would be unethical for them to publish the human genome without the approval of scientists who contributed to the public database. Why?

    A: From the point of view of publishing ethics, I think the sequencing centers that have done the labor ought to have the chance to say what they did for themselves—and to have the whole body of work go through peer review—before somebody else does it for them. For instance, it would have been outside the normal boundaries of publishing ethics for someone not in one of the labs that actually produced the Drosophila data to massage information from the public databases and rush a paper about the total genome sequence into print.

    The other principle is that if you publish a paper, it traditionally means you've looked at the raw data and you can vouch for it. The raw data for sequencing is sequence traces that don't find their way into GenBank. (We wouldn't know where to put them all.) So if you are downloading a very large amount of data from a public database and publishing it when it hasn't previously been published, you're treading a bit upon that particular principle.

    Q: How do you respond to biotech investors who think the Clinton-Blair statement urging the release of genome data hurt them? What was the statement's purpose?

    A: In retrospect, most analysts agree there was nothing in the text of the Clinton-Blair statement that justified the market reaction. This was an exhortation to take genomic data—the raw fundamental stuff—and make it publicly accessible without restrictions. Part of the exhortation was to say, ‘Let's get all the sequence into the public domain so that if patents have not already been filed, it will be less attractive to do so.’ We've probably got enough patents filed already. … Thousands of patent applications are sitting in the Patent and Trademark Office [PTO] right now waiting to be decided upon. … Patents covering large numbers of genes could turn out to be quite deadly to the future of genomics if licenses are negotiated in an exclusive way.

    Q: You've said you support responsible patenting. What is that?

    A: I think the Patent Office deserves credit for moving toward a stronger requirement for utility. Several years ago, it looked as if any DNA sequence would be considered useful because it could be used as a probe. Now, in their proposed new guidelines, PTO says such a claim would not be specific enough to demonstrate utility.

    The Patent Office is seeing fewer of what they call “generation one” patents, where there's just a sequence and no clue as to what it does. PTO intends to reject those. They are seeing a reasonable number of “generation two” applications, where there's a sequence, and homology suggests a function. NIH views such applications as problematic, since homology often provides only a sketchy view of function. Increasingly, PTO is seeing more in the “generation three” category, which I think most people would agree is more appropriate for patent protection. These are gene sequences for which you have biochemical, or cell biological, or genetic data describing function. So we are seeing a shift in the sensible direction.

    Q: Will sequencing grants soon become a lower priority for the program?

    A: Sequencing will continue to have a critical place in the armamentarium of genome activities for at least the next 5 years. Understanding what the sequence means will require us to make multiple comparisons. For that reason, we are already plunging into the mouse genome, which is every bit as hard as the human. We'll do it differently in terms of the strategy, but it will still require around 60 million reads to get it done. That's a lot of work, and there's no way around that now. The arguments are quite strong for sequencing other mammals besides human and mouse. Having two genomes to compare will be useful, but having three would be really useful, particularly if you're looking for smaller conserved elements that are involved in regulation of gene expression. We will clearly not want to stop the vertebrate genome list after the mouse. The zebrafish and the rat genome will be highly useful to sequence. There will be strong arguments for doing the pig, the dog, or the cow—and for doing another primate. … Whether other vertebrate genomes will need a full finished genome sequence, or whether most of the value could be derived from a draft, is still being discussed.

    Q: Why not offer contracts to sequence these genomes efficiently?

    A: The contract model, one might argue, is the most efficient way to get sequencing done, if we are really moving into a production mode. But a compelling argument can be made for doing large-scale sequencing at academic institutions, where it can have lots of useful spin-offs. … If you had this activity segregated off in an industrial atmosphere, you would lose something both in terms of training and in terms of other research ideas. Here the Genome Institute will need to be heavily guided by what the biological community says the priorities ought to be.

    Q: How do you plan to encourage new ideas for genomics research?

    A: A major new initiative, approved by our council last month, is to establish Centers of Excellence in Genomic Science. We believe that ideas about technology development, computational approaches, population genetics, expression analysis, and proteomics are most likely to bubble up in academic environments where multidisciplinary teams of several investigators are focused around a common theme. We would like to see a lot of our research funding shift into that mode. … The annual budget cap will be in the neighborhood of $4 million to $5 million, with an initial grant period of 5 years. If these centers are to have the kind of stability that encourages people to take risks, we have to give them a longer lifetime than the normal 3-year genome grant. Center grants would be renewable for one cycle, but after roughly 10 years, they would need to find other sources of funding.

    Q: Where does the genome program most need support right now?

    A: I would say in bioinformatics. That poses a real challenge, given the paucity of trained individuals who are expert in both computational methods and biology. And, boy, do we need them. The talent pool that does exist has migrated heavily to the private sector, and that has injured the ability to train the next generation. I think the scientific community is really revved up to solve this problem. When I talk in academic institutions, undergraduates or beginning graduate students come up to me and ask how they can get into computational biology. They can see this coming. We just have to be sure that we're providing them with superb training experiences. One of the intentions of these centers, besides doing great science, is to provide such great training opportunities.

    Q: People expect the genome project to yield benefits soon. Are you disappointed with the clinical payoffs to date?

    A: No, I am not at all disappointed, but I am impatient. Anybody who has thought about the path from gene discovery to clinical use knows that it includes complicated and unpredictable steps. … The Herceptin story is a good example of how molecular understanding of breast cancer has led to a therapy that has significant clinical benefit. But it's hard to point to a long list where we have a home run. … We have to be realistic that the full flowering of the medical benefits of understanding the human genome probably lies 15 to 20 years away. The message we have been trying to convey is that this is the greatest revolution medicine has experienced since the introduction of antibiotics, but it's not a revolution that happens overnight.

    Q: Was gene therapy hurt by the death of a patient reported last fall?

    A: Everybody is quite shaken. … A young man has lost his life, and this tragedy has rocked the field down to its toes. It is sobering to consider that this approach not only hasn't led to cures in the past 10 years but is actually capable of harm. … It has caused everybody to tighten up a lot. I think we'll get past this. Gene therapy will find a very important niche for the treatment of a number of diseases in 10 or 15 years.

    Collins was interviewed by Eliot Marshall, Elizabeth Pennisi, and Leslie Roberts.


    When an Entire Country Is a Cohort

    1. Lone Frank*
    1. Lone Frank writes from Copenhagen, Denmark.

    Denmark has gathered more data on its citizens than any other country. Now scientists are pushing to make this vast array of statistics even more useful

    For years, any woman who got an abortion had to accept more than the loss of her fetus: For some unknown reason, she also faced an elevated risk for breast cancer. At least that was what several small case-control studies had suggested before Mads Melbye, an epidemiologist at the Statens Serum Institute in Copenhagen, undertook the largest effort ever to explore the link. He and his colleagues obtained records on 400,000 women in Denmark's national Abortion Register, then checked how many of the same women were listed in the Danish Cancer Register. Their foray into the two databases led to a surprising result: As they reported in The New England Journal of Medicine in 1997, there appears to be no connection between abortion and breast cancer.

    Their success underscores the value of a trove of data the Danish government has accumulated on its citizenry, which today totals about 5 million people. Other Scandinavian countries have created powerful database systems, but Denmark has earned a preeminent reputation for possessing the most complete and interwoven collection of statistics touching on almost every aspect of life. The Danish government has compiled nearly 200 databases, some begun in the 1930s, on everything from medical records to socioeconomic data on jobs and salaries. What makes the databases a plum research tool is the fact that they can all be linked by a 10-digit personal identification number, called the CPR, that follows each Dane from cradle to grave. According to Melbye, “our registers allow for instant, large cohort studies that are impossible in most countries.”
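    The “instant cohort study” Melbye describes is, at bottom, a join on the CPR number. The Python sketch below is hypothetical: the register contents, field names, and CPR values are all invented for illustration, and real register data are far more tightly controlled.

```python
# Hypothetical illustration of CPR-keyed register linkage.
# All records, field names, and CPR values here are invented.

abortion_register = [
    {"cpr": "010170-1234", "year": 1985},
    {"cpr": "050575-5678", "year": 1992},
    {"cpr": "220380-9012", "year": 1995},
]

cancer_register = [
    {"cpr": "050575-5678", "diagnosis": "breast cancer", "year": 1998},
    {"cpr": "310145-3456", "diagnosis": "lung cancer", "year": 1996},
]

# Index one register by the shared person-level key, then look up each
# cohort member -- conceptually the same join, whatever the registers.
cancer_by_cpr = {rec["cpr"]: rec for rec in cancer_register}
linked = [
    {**rec, "cancer": cancer_by_cpr.get(rec["cpr"])}
    for rec in abortion_register
]

matches = [rec for rec in linked if rec["cancer"] is not None]
print(len(matches))  # 1 of the 3 cohort members has a cancer record
```

In the Melbye study the cohort was 400,000 women and the outcome analysis statistical, of course; the linkage itself is this simple only because every register shares the CPR key.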

    But Melbye and other scientists think they can extract even more from this data gold mine. They argue that not enough money is being spent on maintaining and expanding existing databases, and they say that red tape is hampering studies that require correlation of health and demographic data. The problem is that, while they have unfettered access to more than 80 medical databases maintained by the Danish Board of Health and public hospitals, their use of 120 demographic databases overseen by the agency Statistics Denmark is tightly restricted. Statistics Denmark won't allow researchers to remove from its premises data coded by CPR, and its procedures for accessing information are unwieldy and expensive.

    Statistics Denmark officials are reluctant to release data tied to CPRs, citing privacy concerns. “The public should have confidence that information identifying them as individuals does not reside outside of this institution,” says the agency's Otto Andersen. Last month, Danish research minister Birte Weiss formed a committee to break the impasse. Denmark's databases are “a resource which can be used more optimally,” she told Science. “This should be a scientific flagship.”

    Working the health databases can yield powerful results. For years the U.S. National Institutes of Health has supported a study following twins, hoping to tease out the relative contributions of genes and lifestyle to aging. Led by University of Southern Denmark gerontologist Kaare Christensen, the project has tapped the Danish Twin Register, which includes 110,000 pairs of twins born since 1870. After following more than 2000 pairs of twins aged 70 or older, Christensen's group has so far attributed about a quarter of the variation in human longevity to genes. “The project is made possible by the unmatched age and completeness of the Danish Twin Register,” he says.

    The health databases have proven invaluable for probing contradictions raised by smaller studies and following disease progression. Tapping the Danish Cancer Register and a central blood bank database, Melbye's team recently debunked the idea that getting a blood transfusion increases cancer risk. And in another large project, they resolved the long-standing question of why young women with breast cancer have a poorer prognosis than older women with the disease. Compiling data on 35,000 breast cancer cases, the researchers found that young victims live longer if chemotherapy is started as soon as the cancer is diagnosed, rather than reserving this treatment approach only for tumors that have grown to a certain size.

    The health databases are also useful for unraveling complex diseases. Psychiatric epidemiologist Preben Bo Mortensen of Aarhus University Hospital has mined the Danish Central Psychiatric Register, which contains information on all Danes who have come into contact with the public psychiatric hospital system since the 1930s. He has identified a host of environmental factors, such as prenatal viral infections and season of birth, which appear to influence the development of schizophrenia and bipolar disorder. “The register allows us to tease out the relative contribution of genetic and nongenetic factors and thereby point to possible strategies for preventing disease,” says Mortensen.

    But at a meeting on database research in Copenhagen earlier this year, scientists complained that they have been prevented from taking full advantage of the wealth of registered information. Part of the problem is that the agencies that maintain databases are reeling from budget cuts. “Increased funds are needed to secure the quality of existing and future databases if we want to keep our lead in the field,” says Olaf Ingerslev of the Board of Health. Experts estimate it would take only a modest additional cash infusion—$500,000 to $1 million a year—to better maintain and expand the Board of Health's databases.

    Budget shortfalls, however, are not as contentious as the question of access. “The central issue is the refusal by Statistics Denmark to release personally identifiable CPR-related data,” says epidemiologist Thorkild I. A. Sørensen of Copenhagen's Institute for Disease Prevention. The agency's rules, stricter than mandated by law, make accessing data a cumbersome process. If researchers want to link data from a health database to data from a demographic database, for example, they pay a steep fee for an appointment at Statistics Denmark's Copenhagen office, where they wait while a bureaucrat carries out the request. Researchers can't take data coded by CPR home with them, which constrains how they manipulate data at their institutions. Performing follow-up analyses that require linking data by CPRs means returning to Statistics Denmark and paying another fee.

    Researchers argue that the benefits of entrusting them with the CPR outweigh the risk of compromising the identifying information. It's “better to meet concerns by tightening possible sanctions than to limit research that benefits society in general,” says Sørensen. While the company deCODE's plan to create and mine an Icelandic health database has provoked heated debate in Iceland and abroad (Science, 11 February, p. 951), the database issue has aroused little concern in Denmark. In part that's because no one has suffered the embarrassment of having their medical records inadvertently released into the public domain: Researchers must strip CPRs from health data before publishing analyses, and they have no access to the names that go with the CPRs.

    As the government committee considers their request, the researchers are forging ahead with new projects that aim to marry advances in genetics with the vast database resources. “The ability to track related individuals in the many different databases makes it possible to shed light on the complex interplay between familial predisposition and environment,” says Melbye. Christensen's twins, for example, donate blood samples that are used to analyze genes implicated in aging. And in the Danish National Birth Cohort, 100,000 pregnant women and their babies are being recruited to donate blood and undergo physical exams during pregnancy. The emerging database may point to new connections between prenatal factors and congenital disorders, as well as chronic diseases that occur later in life. “In the future we can go back and analyze data concerning the mothers' health during pregnancy and test suspected genetic and nongenetic factors in blood,” says epidemiologist Jørn Olsen of Aarhus University, one of the project's leaders. The data could be all the more useful, he says, if the current restraints on linking information across databases are lifted. Indeed, Christensen sees it as a moral obligation to exploit data gathered at great expense. “Would it not be unethical not to use it to improve the population's health and health care?” he asks.


    Patent Suit Pits Postdoc Against Former Mentor

    1. Eliot Marshall

    A judge has thrown out a former postdoc's claim that her work was patented without her knowledge, ruling that the university owns her work so she lacks standing for a suit

    Two years ago, biologist Joany Chou found herself in a postdoc's nightmare. She learned that what she considered her main accomplishment in 14 years of research on the herpesvirus—the discovery of a new gene—had without her knowledge been included in a patent by her mentor at the University of Chicago (U. of C.). She appealed to the university but was rebuffed.

    Then came a mentor's nightmare. Seeking $25 million for the alleged misuse of her research, Chou took four defendants to court last July: her former professor and lab chief Bernard Roizman, a renowned herpes expert at the University of Chicago and member of the National Academy of Sciences; the university; the university's patent agency, the ARCH Development Corp.; and Aviron, a company co-founded by Roizman in Mountain View, California, which has rights to the disputed patent.

    All the defendants have denied Chou's claims in court documents, and Roizman has argued that her contributions to the patented work did not merit co-inventor status. But in a ruling last month, Judge James Zagel of the federal court for Northern Illinois said the question of whether Chou or Roizman should get credit for the gene discovery is irrelevant. In an opinion that may be instructive for other postdocs, Zagel threw out Chou's inventorship claim on a legal point. Chou lacked “standing,” Zagel said, because she was an employee of the U. of C. when the discovery was made. (The research was funded partly by the federal government and partly by the French firm Pasteur Mérieux.) The university's rules make it clear that it owns employees' inventions, Zagel noted, and this means that Chou never had any prospect of owning the patent. Zagel wrote: “One who claims no ownership of the patent has no standing to seek relief.” Chou intends to appeal Zagel's decision.

    Roizman declined to respond to questions. But his Chicago attorney, Paul Stephens, said in a faxed letter that “Chou's baseless allegations have been denied by the University of Chicago and by ARCH and by Dr. Roizman. Chou has already caused unnecessary upset and expense with her rash and unfounded charges. … So far, they have no legal merit. …”

    This bitter struggle holds lessons for postdocs and universities, says attorney Richard Aron Osman of Hillsborough, California, an expert on biotech property law. (Osman read the legal briefs and judge's decision at Science's request.) Researchers “shouldn't be naïve,” says Osman. “For better or worse, there is a lot of money surrounding biomedical inventions, and this has changed the relationship of trust that many students assume exists between them and their faculty advisers.” He notes that most graduate students and postdocs have “limited knowledge of the patent system,” and he wonders whether universities should provide an “ombudsman” to advise or advocate for student-inventors. This might forestall some litigation, he thinks.

    The dispute began in 1997, Chou says, when she first learned of Roizman's patent. The previous year, Chou had left Roizman's lab after a disagreement with him. She had worked there as a graduate student and later as a $25,000- to $35,000-per-year postdoc. Chou first saw the patent, which had been awarded in 1994 (U.S. patent 5,328,688), by chance, she recalls, when a researcher came up to her as she was being interviewed for another job and “waved this paper at me.” Chou says she was “devastated.”

    In 1990, Chou and Roizman co-authored papers on a key herpes simplex virus (HSV) gene known as γ134.5. They showed that deleting or blocking the γ134.5 gene blocks the virulence of HSV by making it unable to infect the central nervous system (CNS). Such mutant viruses can still grow harmlessly in the body. This “tame” form of HSV might be useful in an anti-HSV vaccine. Chou says she discovered the γ134.5 gene “by accident” when she was dissecting HSV under Roizman's tutelage in the 1980s. Not only was the gene novel, but some HSV researchers refused to believe it was real. Two well-known scientists recall how leaders in the field nearly came to blows at a meeting where a British scientist challenged the discovery. But Chou and Roizman stood by the results.

    Today, some researchers are unsure how they would allocate credit on the patent, even though Chou and Roizman co-authored the key papers. Joseph Glorioso of the University of Pittsburgh says, “Chou's work was very well done, with a high degree of scrutiny by Bernard [Roizman], I'm sure.” But it's “easily conceivable,” Glorioso says, “that someone could be first author on a paper and not included on a patent.”

    Two other researchers who asked not to be named say they regard Chou as the gene's discoverer, although Roizman guided the research. A third, virologist Priscilla Schaffer of the University of Pennsylvania, Philadelphia, strongly agrees that Chou should get credit as a discoverer: “I was absolutely blown away when I found that Joany's name was not on that patent,” she says. Schaffer argues that “it's very difficult” to separate intellectual contributions and benchwork in science. “Bernard may be saying that Joany had no intellectual input into this discovery, but I find that hard to believe. I think she has been done an injustice.”

    Although Roizman declined to respond to questions, his filings with the Patent and Trademark Office (PTO) and legal briefs offer a glimpse of his position. When Roizman submitted a patent application in 1990 on a virus in which the γ134.5 gene was mutated and associated vaccines, the PTO examiner initially rejected it for several reasons, including that it seemed obvious. The examiner argued that one could easily combine papers by Chou and Roizman on the γ134.5 gene with a patent Roizman obtained in 1989 for a similar vaccine using a different gene. Combining this information—as any expert could do—would make the 1990 vaccine idea obvious, the examiner said. In a series of letters and personal visits, however, Roizman persuaded the PTO to reconsider.

    Roizman argued in 1992, for example, that the γ134.5 discovery made this invention entirely novel: A virus with this mutation, he wrote, “exhibits significantly less virulence” than the mutation he had patented in 1989. Roizman also told the PTO that Chou's contribution to this improvement was not critical. In a sworn affidavit on 8 July 1993, he stated that he was the “sole inventor” and had directed the research in the key scientific paper. “While co-authoring the publication, Joany Chou is not an inventor of the subject matter reported therein,” Roizman wrote.

    Contesting this assertion last year, Chou gave the court a letter Roizman sent to the U. of C. molecular genetics department in May 1994 recommending her for promotion. In it, Roizman described Chou's research as “outstanding,” specifically her “original and seminal work” on HSV. “Several years ago,” Roizman wrote, “Joany discovered a hitherto unknown herpes simplex virus open reading frame,” the γ134.5 gene. Roizman wrote that “in short order she made a series of deletions within the gene and demonstrated” that “deletion mutants are totally unable to grow in CNS of experimental animal systems even though the mutants do multiply in non CNS tissues.” He concluded modestly: “There is significant interest in this gene and in the deletion mutants.”

    Neither Roizman nor the university informed Chou that a patent was pending on the γ134.5 gene discovery, Zagel noted. Nor did they inform Chou of the patent's issuance in 1994. Roizman's legal brief argues that as a mentor he was not “bound by an affirmative duty to discuss with Chou every patent application filed … to ensure that Chou had not been incorrectly excluded.” Zagel agreed.

    In 1996, Chou says, Roizman asked her to leave his lab because he considered her disruptive and because she had filed a supplemental grant application as an independent investigator, but hadn't cleared the text with Roizman. Chou has been on her own since 1997.

    University officials declined to discuss the case. But press officer John Easton provided a brief statement saying that the university's rule is that all researchers should be “accorded full credit for their work” and that outside attorneys help allocate credit properly. The note states that Chou is listed as an inventor on other patent applications and on a 1998 patent for a method for screening proteins. University officials have reviewed Chou's allegations carefully, the statement says, and “concluded that Dr. Chou has been treated fairly.”

    One irony in this dispute is that so far the patent has produced no vaccine, and Chou, by her own estimate, has spent $200,000 contesting it.


    Oddities Both Lunar and Martian

    1. Richard A. Kerr

    HOUSTON -- Earlier this month, planetary scientists with a particular affection for the rocky or icy bodies of the solar system met for the 31st Lunar and Planetary Science Conference. Talks touched on the man in the moon, putative martian microbes, and putting a squeeze on Mars.

    Man in the Moon's Birth

    For planetary scientists, the man in the moon is more than folklore; he is a conundrum. Why, they ask, did almost all of the dark lavas that outline the profile erupt on one side of the moon? The farside is a blank slate nearly devoid of dark lavas. At the meeting, geophysicists Marc Parmentier of Brown University and Shijie Zhong and Maria Zuber of the Massachusetts Institute of Technology (MIT) described how the interplay of lunar mineralogy and geophysics could have focused volcanism on one side of the moon. “There have been detailed mineralogical models, but they didn't link to the geology,” says planetary geologist James Head of Brown. This approach “links plausible models for the thermal evolution of the moon with what we see on the surface.” And if it's right, the new approach would solve several other lunar enigmas as well.

    Planetary scientists began noticing these oddities as soon as the first spacecraft revealed the near absence of dark lava “seas” or maria on the moon's farside. Then analyses of mare rocks returned by Apollo astronauts showed a late start for the main pulse of volcanism that painted the nearside figure. Instead of beginning in the moon's first couple of hundred million years, when an ocean of magma remained beneath a thin crust, the volcanism set in a half-billion years later and went on for a billion years. And the minerals that had melted to yield these mare lavas had formed near the top of the solidifying magma ocean but somehow had sunk about 500 kilometers before melting. Then Lunar Prospector confirmed that vast subsurface deposits of a chemically distinctive rock, known by its acronym, KREEP, lie beneath the region of most abundant mare lavas but not elsewhere.

    A plausible account of the man in the moon's birth would probably have to incorporate these lunar quirks in a single mechanism, Parmentier and his colleagues realized. The most obvious driving force was the unstable layering of rock left by the magma ocean. A light scum formed a crust on the cooling magma, while denser minerals crystallized one by one and sank to the ocean bottom. But minerals don't necessarily crystallize in the order of density; some of the densest minerals—a mix rich in ilmenite and pyroxene—would have crystallized late, forming a thin layer right under the crust near the end of ocean freeze-up.

    Parmentier and his colleagues suggest that this so-called ilmenite cumulate could have slowly sunk from shallow depths through the rock beneath it, like a stone through a plum pudding, draining the still-lingering molten KREEP with it down hundreds of kilometers. Enriched in radioactive elements whose decay generates heat, the KREEP could have slowly raised the temperature of the ilmenite cumulate to the melting point and produced the needed lavas hundreds of millions of years later.

    But why did these rocks melt on only one side of the moon? To answer that, the Brown-MIT researchers considered the viscosity of the moon's interior. If an outer layer of the young moon was at least as viscous as the rock beneath it, the ilmenite cumulate would sink downward in fingers all around the moon, not just on one side. But, calculated the researchers, if heat from the KREEP had softened the upper ilmenite layer until it had 1/1000th the viscosity of the rock beneath it, small fingers couldn't sink into the resistant lower layer. Only an expanse of the upper layer as broad as the width of the moon could penetrate the lower layer. This broad downwelling would drain KREEP and upper layer material from around the moon, giving the one-sided concentration of heat. As for why the man faces Earth, no one really knows.

    The one-sided sinking explanation for the man in the moon got a cautious reception at the meeting. “I think what Parmentier is doing is reasonable,” says geophysicist Roger Phillips of Washington University in St. Louis. “It's the first time somebody has attempted a quantitative model.” As the first, it's getting its share of critical attention. “I have a great deal of difficulty seeing how” less viscous, weaker rock could be layered over stronger rock, says petrologist Timothy Grove of MIT. Warming of the upper layer could easily go too far and melt some of it, decreasing its density and preventing its sinking, he says. Planetary physicist David Stevenson of the California Institute of Technology in Pasadena has more philosophical reservations. “It's a possible but not a particularly convincing argument,” he says. “There's no natural reason to expect the large change of viscosity with depth that they need.” “There are uncertainties,” concedes Zuber. “We aren't claiming we've found ‘the’ answer. This is a step in the right direction.” The man in the moon still has a secret.

    Martian Microbes Déjà Vu?

    Almost 4 years ago, microscopic bits of minerals found in a meteorite from Mars leapt onto front pages around the world as possible signs of ancient life on another planet. The excitement faded, however, as it became obvious that none of the evidence clinched the case for life (Science, 20 November 1998, p. 1398). Now, researchers have cast more doubt on the data: They have cooked up much the same minerals in the laboratory with nary a microbe in sight. Many in the field see the work as a confirmation of what they've long suspected. “It all fits so well with what we see [in the meteorite] that I can't believe what happened on Mars was very different” from the lab experiments, says meteoriticist Ralph Harvey of Case Western Reserve University in Cleveland. But the fit is not yet perfect, leaving a slim opening for martian life.

    Geologist David McKay and his colleagues at NASA's Johnson Space Center (JSC) in Houston built the case for traces of life in martian meteorite ALH84001 on four pillars: mineral shapes that look like fossilized bacteria, traces of organic matter, globules of minerals resembling some produced through bacterial action, and grains of a magnetic mineral similar to those produced by bacteria. The McKay group has since conceded that the buggy-looking minerals are likely artifacts or are too small to be intact bacteria. And the organic matter—called polycyclic aromatic hydrocarbons—turns out to be ubiquitous in the solar system, from soot to the inorganically produced primordial goo of meteorites. The globules of concentrically layered carbonate minerals spiked with the sulfide mineral pyrrhotite were more enticing but hardly definitive either; they could have been made by a sequence of mineral-laden groundwaters flushing rock fractures on Mars, critics said.

    The one potential mineralogical trace of former life that still held some appeal was tiny grains of the iron-oxide mineral magnetite concentrated in the rims of carbonate globules in fractures of ALH84001. Some of the magnetite grains resembled in size and shape those made by terrestrial bacteria as magnetic compasses. No one had ever reported finding inorganically produced grains that size and shape, but soil mineralogist D. C. Golden of Hernandez Engineering and JSC and his JSC colleagues—all in the same JSC division as the McKay group—decided to find out if it could be done.

    First, Golden and his colleagues made the carbonate globules by heating bicarbonate solutions mixed with rock chips to 150°C. The bicarbonate decomposed, and carbonate deposited in the fractures of the rock chips. To get the ringed look of the meteorite globules, they changed the solution composition in four steps, as martian groundwater may have changed composition. Finally, they reproduced the heating that ALH84001 appears to have suffered after the globules formed, perhaps from a meteorite impact. That decomposed iron carbonates into the required magnetite and pyrite into pyrrhotite. In the end, “we were able to produce minerals very similar chemically and mineralogically to those in 84001,” says Golden's colleague, soil mineralogist Douglas Ming of JSC. In particular, the magnetite produced by iron carbonate decomposition falls in a tight size range of about 35 to 60 nanometers, almost identical to the size range of bacterial magnetite.

    Golden and his JSC colleagues conclude that “it is possible to synthesize carbonate globules quite similar to those of ALH84001 by simple inorganic processes that may have occurred on early Mars.” “They did an outstanding job showing you could produce these globules,” says mineralogist Adrian Brearley of the University of New Mexico, Albuquerque, who originally suggested carbonate decomposition as the magnetite source 2 years ago.

    But the case for traces of life in ALH84001 isn't dead yet. “We've shown a potential inorganic formation,” says Ming. “What we haven't done yet is show that our magnetites look biogenic.” Since the McKay group first focused on one subset of magnetites as possibly biogenic, microscopist Kathie L. Thomas-Keprta of Lockheed Martin and the McKay group has looked again at ALH84001 and identified mineralogical and chemical characteristics beyond size and shape that match those of bacterial magnetite. Golden and his colleagues haven't reached a similar level of detail in characterizing their experimental magnetites. But in the meantime, many researchers are following Brearley in applying Occam's razor and concluding that ALH84001's magnetites “are all produced by the [inorganic] mechanism.”

    Punching Up Mars

    Mars, it seems, has been beaten up by more than asteroid and comet impacts. Geophysicist Roger Phillips of Washington University in St. Louis and his colleagues reported at the meeting that the weight of the solar system's most massive volcanic outpouring not only puts a dent in one side of the planet but also pushes up the opposite side. These planetary distortions, it now appears, redirected the flow of the massive amounts of water that carved large areas of the planet early in martian history.

    Planetary scientists have known since Mariner 9's arrival at Mars in 1971 that the Tharsis region, spanning more than 10 million square kilometers centered on the equator, weighs heavily on the planet. Built by eons of volcanic outpouring, Tharsis sports three massive volcanic shields rising higher than 10 kilometers above the mean elevation; mighty Olympus Mons rises 22 kilometers above its base. Earth's tallest volcano, Mauna Kea, stands only 9 kilometers above the sea floor. Geophysicists could calculate how much the whole pile of volcanics should depress the planet beneath it, but spacecraft had returned only the most rudimentary topographic data to test how well those calculations hold up. Now the Mars Global Surveyor (MGS) laser altimeter is giving them highly accurate data to work with.

    When Phillips and his colleagues placed the mass of Tharsis on a 15-year-old computer model of Mars, it depressed the surface at least several kilometers over much of the western hemisphere of the planet. The MGS topography confirmed that this, indeed, occurred. But in a variant of every action having an equal and opposite reaction, the model showed that the region antipodal to Tharsis, called Arabia Terra, should be lifted up by a few kilometers. That's like punching your fist into a balloon—albeit a very stiff one—and having the far side bulge outward, says Phillips. That prediction was also borne out: MGS gravity and topographic data recently indicated that Arabia Terra had been uplifted, although the data couldn't show why.

    “It's an incredibly simple model that works with almost no assumptions,” says Phillips, “and it explains details all around the planet.” Geophysicists such as David Stevenson of the California Institute of Technology in Pasadena tend to agree that Phillips's modeling is giving plausible results. For geologists, it's also “a reasonable hypothesis,” says planetary geologist James Head of Brown University. “It's exciting to think about the implications for such a large part of the planet.” For one, the planetary bulging seems to have hijacked Arabia Terra from the northern lowlands and made it, at least topographically, part of the southern highlands. As a result, geologists can see the original lowlands crust that is elsewhere covered by sediment or lavas. The uplift also explains the abrupt termination of ancient erosional valleys that trace the flow of water off the highlands more than 3 billion years ago: The rising land turned back the flow.

    Around Tharsis, the ring of low ground had its own geological effects. It appears to have directed the flow of water during much of martian history, says Phillips. Great floods drained through it on the way to the northern lowlands, cutting huge outflow channels. The larger buried channels proposed on the basis of MGS gravity data (Science, 10 March, p. 1727) also pass through the Tharsis ring, presumably carrying the huge amounts of sediment that MGS altimetry is now showing were removed from the region. “All this happened very early in the history of Mars,” says Phillips. “We're learning that the magnitude was tremendous.”