News this Week

Science  30 Jun 2000:
Vol. 288, Issue 5475, pp. 2294

    Rival Genome Sequencers Celebrate a Milestone Together

    1. Eliot Marshall

    With pomp and ceremony, including a trumpet flourish, President Clinton strode into the East Room of the White House on 26 June to announce that molecular biologists have generated “the most wondrous map ever produced by humankind”—a nearly complete readout of the roughly 3.1 billion nucleotides in the human genome. Two scientific groups, one private and the other public, have reached a turning point in this work, the president said, and he wanted to celebrate an “epic-making triumph of science and reason.”

    The room was packed with research managers, senators, ambassadors, reporters, and a few famous scientists. At one point, the president paused to pay tribute to James Watson, co-discoverer of DNA's double-helix structure, seated near the back. Britain's Prime Minister Tony Blair took part in the events from London, appearing in a satellite broadcast on two giant video screens and predicting that genome-based studies will lead to “a revolution in medical science whose implications will far surpass even the discovery of antibiotics.” Government leaders in Paris and Tokyo also held press conferences to honor local scientists who contributed data.

    The White House ceremony was more than a celebration; it was also designed to heal a split in the research community. The ceremony brought together leaders of the rival public and private groups in a kind of truce, cooling off a competition that had grown intense in recent months. Clinton appeared on the podium flanked by Francis Collins, director of the U.S. National Human Genome Research Institute, representing 16 centers in the public effort, and J. Craig Venter, president of Celera Genomics of Rockville, Maryland, the company that last year began sequencing the entire human genome on its own. The two groups had talked of working together to complete the genome, but negotiations broke down in acrimony in February (Science, 10 March, p. 1723).

    Facing the cameras this week, the two praised one another. Collins, speaking first, expressed “personal gratitude” to Venter for his “openness in the cooperative planning process that led to this joint announcement.” Venter spoke of the “tremendous effort” by the international team, adding that “I'd also like to personally thank Francis for his direct actions in working with me to foster cooperation in the genome community. …”

    As reporters learned afterward, this display of harmony came about because a third leader—Ari Patrinos, chief of genome research at the Department of Energy—intervened. Upset about how the rivalry might detract from the scientific achievement, Patrinos invited Collins and Venter to a “secret meeting” at his house. “I've known both of these guys for a long time—as scientists and as friends,” he says. They met for the first time on Sunday, 7 May, but didn't make much progress. They continued talks over beer and pizza at several meetings, finally reaching an agreement on 21 June on details for the press conference.

    Although Patrinos apparently got an army of genome researchers to march in step, Celera set the pace. Celera reached its corporate milestone—assembling the raw human genome data it produced, representing 99% of the genome, into an ordered sequence—long before the public group reached its own objective. Tony White, chief executive officer of Celera's financial parent firm—PE Biosystems Corp. of Norwalk, Connecticut—said the announcement was held up until 26 June because “it took several weeks to orchestrate the dance.”

    The public consortium didn't quite reach the objective it set for itself—producing 90% of a draft genome (in which the average DNA sequence is 99.9% complete) by the spring of 2000 (see p. 2304). Collins noted that the public draft is only 85% assembled. “You could say we're still 5% short,” he acknowledges, but adds that with 97% of the genome covered by clones whose location is known, “we are substantially ahead of where we expected to be at this time.”

    Collins estimated that the cost to produce the public draft genome (not counting related research or building costs) will be about $300 million in total, of which roughly $150 million will be paid by his agency. The public consortium will finish the draft this year, then produce a polished human genome (99.99% complete) by 2003 or sooner, while moving on to sequence other organisms, including the rat and mouse. Venter declined to discuss costs, other than to say Celera's human genome effort required 27 million DNA sequencing “reads” at less than $2 each.
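A rough sanity check on the figures quoted above is possible; the sketch below treats Venter's “less than $2” per read as a hard $2 ceiling (an assumption, since he declined to discuss totals) and compares the resulting bound with Collins's estimate for the public draft.

```python
# Back-of-the-envelope check on the costs cited in the article.
# Assumption: Venter's "less than $2" per read is taken as a $2 ceiling;
# this bounds only the sequencing reads, not Celera's total spending.
celera_reads = 27_000_000          # sequencing reads Venter cited
cost_ceiling_per_read = 2.00       # dollars, upper bound per read
celera_read_cost_max = celera_reads * cost_ceiling_per_read

public_draft_cost = 300_000_000    # Collins's estimate for the public draft
nhgri_share = 150_000_000          # roughly half, paid by his agency

print(f"Celera raw-read cost, at most: ${celera_read_cost_max / 1e6:.0f} million")
print(f"Public draft estimate:         ${public_draft_cost / 1e6:.0f} million")
```

By this ceiling, Celera's raw reads cost at most about $54 million, though the two numbers are not directly comparable: the $300 million covers the whole public draft effort, not reads alone.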

    Celera has not changed its policies on data release or patenting. Academic researchers who agree to use Celera's data for noncommercial purposes will be permitted free access to its raw human genome data with some minimal annotation, but not its detailed annotation of gene function and structure. Venter says the company has already filed “about two dozen unique gene patents” and will file more.

    It's not clear at this writing how substantive the cooperation between Celera and the public consortium will be. Collins, for one, said the current truce amounted to “coordination, not collaboration.” For now, the public and private teams are planning to produce independent scientific papers on the sequence data and, after that, to annotate the data independently. Collins explains that he doesn't expect Celera to share such information publicly because to do so would require giving away proprietary information. But the public consortium and Celera are expecting to hold a joint conference next year to share information on their different methods of sequencing the genome. Eric Lander, director of the Whitehead Institute/MIT Center for Genome Research in Cambridge, Massachusetts, called this an “exciting” prospect, because the approaches were “complementary,” producing “two different looks” at the genome.

    No one can say at this time where or when any of the data will be published. Patrinos hopes it will appear “back to back” in the same journal this fall. But Venter, who's closer to having results in publishable form, says he has no idea where the manuscript will go: “We haven't decided yet.”

    The scientists did agree on one thing, though: The president was right when he said that “today's historic achievement is only a starting point. There is much hard work yet to be done.”


    Making a Splash With a Hint of Mars Water

    1. Richard A. Kerr

    It began as a whisper on the Web a week ago Monday evening, grew to a noisy torrent of media babble by Wednesday, and on Thursday morning crashed onto the front pages. Moving at the light-speed pace of modern media, a wave of chatter about water and therefore possible life on Mars swept a paper at Science into headline news a week before its scheduled publication.

    The paper, on page 2330 of this issue, features high-resolution pictures of muddy-looking gullies on the sides of martian craters, suggesting the prospect of liquid water on, or at least near, the surface of the planet. That prospect has thrilled planetary scientists who have been scouring a seemingly bone-dry planet for 30 years. “It's the smoking gun that says there's liquid water and Mars has all the requirements for life,” astrobiologist Bruce Jakosky of the University of Colorado, Boulder, told a packed NASA press conference last Thursday, at which the paper was released early. Not so fast, caution a number of planetary scientists. “I'm skeptical just because of how difficult it is to have liquid water on or near the surface of Mars,” longtime Mars geologist Michael Carr of the U.S. Geological Survey (USGS) in Menlo Park, California, told the press conference. “It's just simply too cold, incredibly cold.” Carr and others are already coming up with alternative explanations for the rivulet-ridden piles of debris that exclude stores of liquid water and therefore readily accessible life.

    Opening the press conference, planetary geologist Michael Malin of Malin Space Science Systems Inc. (MSSS) in San Diego warned that “the actual science may pale before the science fiction that has been written.” The fiction grew out of an accurate, if vague, item posted on the independent watchdog Web site NASA Watch late in the afternoon of 19 June. It reported, apparently from sources in the astrobiology community, that NASA had briefed the White House (presidential science adviser Neal Lane, as it turned out) on a major discovery involving water on Mars. Other Web sites added details through Tuesday, 20 June; USA Today put a Web-sourced story at the top of its front page Wednesday morning. The information gleaned anonymously from NASA headquarters personnel and researchers around the country ranged from on target—signs of recent spring activity—to unlikely: ponds and even the possibility of geysers. Although no reporters appeared to have seen the paper (by Malin and his MSSS colleague Kenneth Edgett), Science decided to stem the flow of misinformation by releasing it.

    Fiction aside, the reality proved enticing enough. The evidence for water flowing on the surface of Mars comes from Malin's high-resolution camera orbiting the planet on Mars Global Surveyor for the past 2 years. In about 200 of the 65,000 images returned so far, Malin and Edgett found places where water appears to have emerged from a crater wall or valley side. All the sites are above 30° latitude, mostly in the southern hemisphere. It looks as though the emerging water ate away at these steep slopes, the water and debris flowing down to form a channel-riddled pile or apron. “Had this been seen on Earth, there would be no question water is associated with it,” said Malin. And these “aproned alcoves” are so devoid of impact cratering and other ravages of time that they must be “very, very young,” said Edgett. They could have been active yesterday, he said, but conceded that, given the difficulties of gauging time on Mars, they could be as old as 1 million or 2 million years.

    No one doubts that a fluid emerging from the martian rock formed these stunningly Earth-like features. And, by analogy with Earth, the likely fluid is water draining from an aquifer—a permeable, water-filled layer of rock—cut by a crater or valley. “That water has to be kept warm somehow,” said Malin. “I don't know how. There has to be some geothermal component” to the warming. Because there are no volcanic heat sources apparent, as are found at Yellowstone or in Iceland, Malin considers the possibility that “our idea of what [the inside of] Mars is like thermally is all wrong.”

    However liquid water makes it as far as a crater wall, Malin and Edgett then draw on the warmth of sunlight to explain an oddity of the geographical distribution of seep sites. They find that sites occur about two and a half times more often on pole-facing slopes—the most shadowed and therefore coldest surfaces at a given latitude—than they do on warmer, equator-facing slopes. They argue that the sun's warmth on equator-facing slopes keeps aquifer water flowing out of the surface by rapidly evaporating it, avoiding any obvious erosion. On colder, pole-facing slopes, the water freezes to form an icy barrier. That barrier eventually breaks, perhaps after pressure builds in the aquifer, wasting away the wall face and releasing a burst of pent-up liquid water to form the aprons.

    This scenario of continuously liquid water doesn't sit well with some planetary scientists. “It's simply not credible to create a near-surface aquifer” on Mars, says planetary scientist Stephen Clifford of the Lunar and Planetary Institute in Houston. The surface of Mars is so cold—on average −70° to −100°C—and the internal fires of the planet so feeble that any water within 2 or 3 kilometers of the surface should be permanently frozen solid, Clifford notes. Yet the apparent martian seeps spring from rock exposed at the now-frigid surface, and they presumably flowed through layers as little as 150 meters below to get there.

    These drawbacks have many researchers reaching for alternatives. Carr, Clifford, and others are considering clathrates. These ices of water and a second component, such as carbon dioxide, form at low temperatures and high pressures but decompose to gas when warmed or depressurized. Clathrates of carbon dioxide, the most abundant gas in the martian atmosphere, may have formed in the crust, Carr noted, and could burst from rock walls to form fluid masses of gas and debris that would flow down like water, the way streams of hot gas and ash flow down from volcanic eruptions.

    A less exotic explanation is water ice frozen into rock layers that melts only on geologically rare occasions. Clifford and hydrologist Victor Baker of the University of Arizona, Tucson, each independently suggested the same mechanism to Science that Mars geologist Kenneth Tanaka of the USGS in Flagstaff, Arizona, presents in his Perspective on page 2325. All three were struck by how the seeps prefer pole-facing slopes. Although among the coldest spots on Mars today, they note, such slopes would have been among the warmest 4 million or 5 million years ago. Planetary dynamicists calculate that back then a wobbly Mars was temporarily tipped over as far as 45° compared to its current 25° obliquity or inclination of its spin axis. That would have warmed Mars generally by sending part of the water ice in the southern polar cap into the atmosphere, strengthening the greenhouse effect. The tilt would have warmed high-latitude, pole-facing slopes even more, by putting them in full sun through long summers. “I'm more and more persuaded that what they're seeing is a reflection of what happens during high obliquity,” says Clifford. “It's the most plausible explanation.”

    Whatever happened, researchers are excited. Evidence of near-surface water, whether liquid, solid, or clathrate, “is an important result,” says Baker. The muddy rivulets, whether a day or a million years old, “show the ground ice is there today,” he adds. That the water somehow got loose recently calls into question the notion that Mars has been “cold, dry, and inactive since early times.”


    Cholesterol Drugs Show Promise as Bone Builders

    1. Dan Ferber*
    * Dan Ferber is a writer in Urbana, Illinois.

    For the millions of people worldwide with osteoporosis, one tumble can break a hip, and a hug can crack a rib. Drugs called bisphosphonates can prevent many fractures by stopping the body from breaking down bone. But even today's best drugs prevent only about half the fractures, and none of them do much to spur the body to rebuild healthy bone.

    That could soon change. Not only do statins, a group of drugs used by millions to head off heart disease, seem to prevent fractures, but they may also trigger significant bone regrowth in older people, according to four studies reported in the 28 June issue of The Journal of the American Medical Association (JAMA) and the 24 June issue of The Lancet. And another promising treatment, a recombinant fragment of human parathyroid hormone called rhPTH, is even closer to the clinic: Two clinical trials reported at meetings in the past 2 weeks show that the compound builds bone and lowers the risk of fracture by more than half. “These are really quite striking reductions in fractures,” says endocrinologist Conrad Johnston of Indiana University School of Medicine in Indianapolis, president of the National Osteoporosis Foundation.

    Like a work crew repairing an aging street, the body normally maintains bones by digging holes, then refilling them with fresh material. Osteoporosis, which afflicts 10 million Americans, most of them postmenopausal women, occurs when the body breaks down bone faster than it can replace it, rendering the bones thin and brittle. Bisphosphonates such as alendronate and risedronate, as well as estrogen replacement therapy, all slow bone loss by blocking cells called osteoclasts, which dig the holes. But none of these drugs stimulates the cells, called osteoblasts, that fill in the holes. As a result, treatment works best on people diagnosed early, while they still have most of their bone mass. But because many patients have already lost 20% to 30% of their bone mass by the time of diagnosis, Johnston says, “we want something that will build it back.”

    In a surprising finding last December, a team led by endocrinologist Greg Mundy of the biotech company OsteoScreen and the University of Texas Health Science Center in San Antonio showed that statins—cholesterol-lowering drugs taken by more than 12 million people in the United States alone—dramatically boosted new bone growth in mice and rats (Science, 3 December 1999, p. 1946). But no one knew whether they would work in humans.

    The new studies suggest—but do not prove—that they do. In the largest of the four studies, a team led by Herschel Jick of Boston University School of Medicine in Lexington, Massachusetts, compared the medical records of 3940 British patients aged 50 and over who had suffered fractures with those of 23,379 control subjects who had not, by matching each patient to controls of the same age and sex who saw the same doctor. After adjusting for the effects of other factors that affect bone strength, they reported in JAMA that statin users were 45% less likely to have suffered fractures than subjects who were not taking these drugs. “We were amazed,” Jick says. “Life is usually not that simple.”
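The case-control comparison Jick's team ran can be illustrated with a toy odds-ratio calculation. The counts below are hypothetical, invented only to show the arithmetic; the actual study also matched each patient to controls by age, sex, and physician, and adjusted for other factors affecting bone strength.

```python
# Toy case-control odds ratio. All counts are hypothetical, chosen only
# to illustrate the arithmetic; they are NOT the Jick et al. data, and
# no matching or covariate adjustment is done here.
cases_on_statins = 120        # fracture patients exposed to statins
cases_not_on_statins = 880
controls_on_statins = 200     # fracture-free controls exposed to statins
controls_not_on_statins = 800

odds_in_cases = cases_on_statins / cases_not_on_statins
odds_in_controls = controls_on_statins / controls_not_on_statins
odds_ratio = odds_in_cases / odds_in_controls

# An odds ratio below 1 means the exposure looks protective; a value of
# about 0.55 corresponds to the reported "45% less likely" figure.
print(f"odds ratio = {odds_ratio:.2f}")
```

With these illustrative counts the odds ratio comes out near 0.55, i.e., roughly 45% lower odds of fracture among statin users, matching the magnitude the study reported after adjustment.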

    Repair crew. Today's drugs, such as the bisphosphonates, block osteoclasts from breaking down bone; statins may stimulate osteoblasts to build new bone.


    Two other teams conducted similar case-control studies, with similar results. In the same issue of JAMA, a team led by pharmacoepidemiologist Jerry Avorn and Philip Wang of Brigham and Women's Hospital in Boston showed that subjects over age 65 who used statins were half as likely as nonusers to suffer hip fractures, even after adjusting for the effects of race, smoking, other diseases, and other drugs. And in The Lancet, a separate team at the same hospital led by pharmacoepidemiologist K. Arnold Chan reports equally encouraging findings.

    By examining medical data from six health maintenance organizations in different regions of the United States, they found that women over 60 who used statins regularly were half as likely to suffer a hip, vertebra, or wrist fracture as similar women who didn't take the drugs. What's more, a team led by rheumatologist Tim Spector of St. Thomas' Hospital in London showed in The Lancet that statins seem to boost bone density significantly—a key measure of bone strength. After excluding the effects of hormone replacement therapy, age, height, and weight, the researchers found that the bones of women who took statins had 8% more mass than the bones of those who didn't.

    Johnston and other experts caution that it's too soon for doctors to prescribe statins to treat osteoporosis. To prove that statins really strengthen human bones and prevent fractures, researchers need to perform large clinical trials in which patients are randomly assigned to take statins or placebos. “They've done a good job [controlling for] everything you'd expect, but maybe there's some difference between people who take statins and people who don't,” says Johnston.

    Whether or not statins pan out in clinical trials, rhPTH already has. Last week at the Endocrine Society meeting in Toronto, Robert Neer, director of the Osteoporosis Center at Massachusetts General Hospital in Boston, presented results of a randomized trial that show that rhPTH reduces the risk of recurring fractures in women. Subjects who had already suffered vertebral fractures were 65% less likely to suffer a second spine fracture and 54% less likely than controls to suffer nonspine fractures after self-injecting rhPTH for 1 to 2 years. The drug also seems to enhance the bone-preserving benefits of hormone replacement therapy, according to trial results presented 2 weeks ago at the World Congress on Osteoporosis by Felicia Cosman of Helen Hayes Hospital in West Haverstraw, New York, and her colleagues.

    But on the downside, rhPTH, unlike statins, must be injected. “Patients generally don't like to give themselves an injection every day,” says epidemiologist Steven Cummings of the University of California, San Francisco. “We need options.” And despite the plethora of auspicious results, he cautions that it's too early to abandon bisphosphonate drugs, which have passed muster in several large, randomized trials. Even so, the new findings are generating quite a buzz among both researchers and clinicians. “It all looks very promising,” says Johnston. “We may have a lot of good drugs before long.”


    Bug Bastille to Open Under New Management

    1. Michael Balter

    Paris—In the next few weeks, Europe's most advanced high-security pathogen lab is expected to open for business in the southern French city of Lyons. A striking glass-and-steel structure built on stilts over an existing building, the lab will join a handful of facilities around the world capable of handling the most dangerous human pathogens, such as the Ebola and Lassa fever viruses (Science, 26 May, p. 1320). But the long-anticipated debut of this “hot” lab is not turning out to be a joyous occasion for some of those involved.

    Last week, the Marcel Mérieux Foundation, which paid for the lab's construction, announced that the Pasteur Institute will take over its “scientific direction.” The arrangement, which took a consortium of researchers planning to use the lab by surprise, gives Pasteur the authority to name the lab's director. The leading candidate is a Pasteur scientist, and that would leave the lab's current director, virologist Susan Fisher-Hoch, out in the cold. Fisher-Hoch says she's a victim of a smear campaign aimed at shunting her aside.

    This inauspicious spat is the latest twist in a long saga. In 1996, physician Charles Mérieux, patriarch of the Lyons-based family of vaccine producers and creator of the foundation bearing the name of his father, a student of Louis Pasteur, decided to build the lab privately after agencies in Europe balked at its $8 million price tag. Mérieux hired Fisher-Hoch, who had spent much of her career in a biosafety level 4 (BSL-4) facility at the Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, to design and build the lab (Science, 13 March 1998, p. 1630). Technical glitches and safety concerns have delayed the lab's opening, originally planned for late 1998.

    Putting the Pasteur in charge of the lab's science makes sense for both Mérieux and the Pasteur. The European Union and French agencies turned down repeated requests to help cover the estimated $1.4 million per year to keep the lab running; the foundation and Pasteur are now negotiating the institute's financial contributions to the lab. “The Pasteur Institute was a logical choice,” says David Heymann, director of the World Health Organization's division of emerging and other communicable diseases in Geneva. And Pasteur Director-General Philippe Kourilsky says the arrangement allows Pasteur to run a BSL-4 facility without the hassle and expense of trying to build one on its already cramped campus in the heart of Paris.

    But some members of the Lyons-based European Center for Research in Virology and Immunology (CERVI), a federation of teams associated with the BSL-4 facility, are left wondering how their plans will be affected. “None of the directors [of the CERVI research units] were consulted, and we do not know what [Pasteur's] scientific program is going to be,” says CERVI member Jean-Luc Darlix, head of a human virology lab run by the biomedical research agency INSERM.

    Fisher-Hoch is even less certain about her future at the lab. Confidential documents from CERVI and the foundation obtained by Science indicate that Pasteur and foundation officials intend to appoint Pasteur virologist Vincent Deubel as the new director, effective this fall. Deubel has searched in Africa for reservoirs of the deadly Ebola virus, although Darlix and others say that he has no experience in a BSL-4 lab. Deubel declined to comment, but Kourilsky defends the putative appointment of a Pasteur scientist: “If the Pasteur Institute is associated with the [BSL-4 facility],” he says, “it is normal that the scientific direction would be assured by a Pasteurian.”

    Fisher-Hoch sees darker forces at work. For the past several months, articles in Lyons newspapers and in the national press have suggested that the lab might pose a health threat to the local community. A story in the 30 March issue of the weekly magazine L'Express, for example, reported that Fisher-Hoch last fall was given a number of possibly virally infected blood samples from Sierra Leone by her husband, Joseph McCormick, and that she violated safety procedures by putting them in a freezer in a BSL-2 lab, which has fewer safeguards than a BSL-4 lab. (McCormick, also a former CDC virus hunter who works at the Lyons-based drug company Aventis-Pasteur, has had his own troubles with Pasteur; see Science, 13 November 1998, p. 1241.) Charles Mérieux refers repeatedly to this alleged incident in letters to the WHO's Heymann this spring, in which he asks for help in replacing Fisher-Hoch. Mérieux also complained about Fisher-Hoch in letters to Kourilsky. (Heymann says he did not respond to the request, and Kourilsky declined to comment, saying the issue is an internal foundation matter.)

    Fisher-Hoch and McCormick dispute the press accounts. They say the samples were from healthy Western donors, including themselves, and were drawn during a workshop they conducted in Liberia, not Sierra Leone, to teach medical personnel how to perform diagnostic tests for Lassa fever. Fisher-Hoch says she laid this out in an 11 April letter to Mérieux, explaining that she intended to use the uninfected samples as controls in future work on lethal viruses. Mérieux, 93, told Science that whether or not the alleged incidents were true, they “created a bad image of the [BSL-4 facility]” in the press which “I cannot tolerate.” Fisher-Hoch's contract to direct the lab runs until February 2002, although foundation officials say she will now be asked to accept a lesser role. But she speculates that once the lab was ready to come online it was too tempting a prize: “As the French say, the cake was too beautiful; everyone wanted to eat it.”


    Enzyme Blocker Prompts Mice to Shed Weight

    1. Trisha Gura*
    * Trisha Gura is a science writer in Cleveland, Ohio.

    When it comes to body fat, the laws of thermodynamics hold weight: Take in more calories than the body burns to produce energy, and the excess will be shunted into fat. To regulate this thermodynamic system, the body somehow keeps the brain apprised of the energy balance so it can dampen our appetites if we are overeating. Now, a multidisciplinary team from Johns Hopkins University may have discovered an important new clue about how the body performs this feat of calorie—and thus weight—control.

    The team, led by pathologist Francis Kuhajda, chemist Craig Townsend, biochemist M. Daniel Lane, cell biologist Thomas Loftus, and neuroscientist Gabriele Ronnett, reports on page 2379 of this issue that a molecule called malonyl-coenzyme A (malonyl-CoA), which is needed for fat synthesis in the body, may play a key role in appetite signaling in the brain. Moreover, the investigators produced a synthetic inhibitor that prevents this molecule from being converted to fat, causing it to build up in the body. In mice, the inhibitor spurs a dramatic, but reversible, drop in appetite and weight.

    “This is provocative and exciting, and I think we will see an avalanche of work to see if it has validity,” says Dennis McGarry, a fat metabolism researcher at the University of Texas Southwestern Medical Center in Dallas. Indeed, with an estimated third of Americans now grappling with obesity and its subsequent health problems and costs, any drug that could safely and effectively block appetite and lead to weight loss could be a big money-spinner.

    The discovery was sparked by Kuhajda's studies of an enzyme called fatty acid synthase (FAS). When the body wants to store excess fuel, this enzyme makes the long-chained fatty acids that are the building blocks of fats by transferring two-carbon units from malonyl-CoA to the growing fatty acid. Thirty years ago, Nobel Prize-winner Konrad Bloch had shown that cerulenin, an epoxide produced by fungi, inhibits FAS. But epoxides are notoriously unstable and reactive, so Kuhajda teamed up with Townsend, who synthesized a series of cerulenin derivatives that might serve as less reactive, and therefore safer, FAS inhibitors.

    Of the hundreds of compounds tested, one, dubbed C75, looked especially promising. It easily latched onto and blocked FAS with the same potency as cerulenin, but without the toxicity problems. It had a dramatic effect when given to mice: The treated animals began losing weight almost immediately. Because blocking FAS causes a buildup of the enzyme's target—malonyl-CoA—in the liver, the investigators wondered whether that compound might be somehow signaling the brain to dampen appetite.

    To explore that possibility, Kuhajda teamed up with Lane and Loftus. Loftus quickly confirmed that C75 suppresses appetite, showing that treated animals eat just 10% of the food their untreated littermates consume. The animals dropped, on average, almost a third of their body weight. Even more surprising, the treated animals lost 45% more weight than untreated mice fed the same reduced amounts of food. Fasted animals normally turn down their metabolic activity to compensate for their reduced food intake—that's one reason losing weight can be so difficult—but C75 may prevent this metabolic slowdown.

    The researchers also found that malonyl-CoA concentrations remained high in the livers of the C75 animals but not in those of the fasted mice, lending further weight to the idea that the compound might mediate the physical and metabolic changes. And tests with another inhibitor, a compound called TOFA that blocks the enzyme that makes malonyl-CoA, bolstered the hypothesis. The investigators reasoned that if malonyl-CoA is the key signal that tells the brain to quench appetite in response to C75, then TOFA should block the drug's effect by preventing malonyl-CoA synthesis. When they injected mice with TOFA before giving them the C75, the appetite suppression was indeed attenuated.

    “This lends some degree of credibility to the results,” says McGarry, although he says he still questions whether malonyl-CoA in this pathway is the sole signal orchestrating the feeding effects. “The question now is how is malonyl-CoA doing this and in which neuronal compartment?”

    With the help of neuroscientist Ronnett, the group set out to answer that question. The researchers showed that C75 works when pumped directly into the brains of mice. Surprisingly, however, the well-known antiobesity hormone leptin did not appear to conduct C75's effects: The drug quelled the appetites of mutant mice lacking the fat-busting hormone. But another peptide—the appetite-stimulating neuropeptide Y (NPY)—did prove to be involved.

    The investigators found that levels of the messenger RNA (mRNA) for NPY rose quickly in the brains of fasted animals—an indication that they were making large amounts of the protein, presumably to stimulate feeding. But even though C75-treated mice were eating very little, NPY mRNA levels plummeted in the rodents' brains. What's apparently happening, Loftus suggests, is that C75, by keeping malonyl-CoA concentrations high, is “fooling the system. We are making the system think that it has fuel when it actually hasn't.” Thus, NPY levels fail to rise, keeping appetite down.

    Longtime NPY researcher Michael Schwartz says that C75 may affect more than just NPY levels, though. “Simply blocking NPY is not likely to cause that profound of an inhibition of food intake,” he notes. Indeed, obesity researchers have identified plenty of candidates for C75 partners besides leptin and NPY (Science, 10 March, p. 1738).

    Beyond pinning down exactly how C75 works, there's the multimillion-dollar question of whether the drug or chemical derivatives of it will ever prove useful in curbing human obesity. “That would be the hope,” says McGarry, who cautions that such a scenario is still a long way off, especially as C75 is rather draconian in its suppression of appetite. It might be possible, though, to design milder versions. “We may have in our hands a mechanism that works at the level of the brain for the control of feeding behavior,” Lane says. If researchers can target it safely, he adds, “I think it has the potential to deal with obesity and all the consequences thereof.”


    Report Details Spying on Touring Scientists

    1. David Malakoff

    Foreign spies apparently find traveling U.S. nuclear scientists irresistible. A congressional report released this week details dozens of sometimes clumsy attempts by foreign agents to obtain nuclear secrets, from offering scientists prostitutes to prying off the backs of their laptop computers. The report highlights the need to better prepare traveling researchers to safeguard secrets and resist such temptations, say the two lawmakers who requested the report and officials at the Department of Energy (DOE), which employs the scientists.

    The study by the General Accounting Office (GAO), Congress's investigative arm, reviewed DOE reports on nearly 5000 foreign excursions by scientists from four national laboratories: Sandia and Los Alamos in New Mexico; Livermore in California; and Oak Ridge in Tennessee. It found more than 75 incidents between 1995 and 1999 in which researchers reported the possibility of eavesdropping and luggage tampering or said they were offered sexual favors. The report does not identify specific researchers, laboratories, or the nations visited, and DOE officials say no secrets were revealed. Some of the travel involved the 25 nations on DOE's “sensitive” list, which includes Russia, China, and Ukraine.

    The report makes for racy reading. In one case, a scientist visiting a sensitive nation was repeatedly propositioned by women who called his hotel room and knocked on his door. Another DOE researcher, in a posttrip debriefing with security officials, admitted to having sex with at least four women, including a prostitute, a waitress, and two employees of a laboratory he was visiting. Security officials were “particularly concerned about these activities because of the potential for blackmail,” the report notes. There were also reports of tampering with personal equipment, including riffling through and then locking a previously unlocked briefcase, turning on a previously shut-down computer, and trying to pry open the back of a laptop.

    Some of the incidents were almost comical. One researcher who telephoned his wife at home and chatted about her upcoming plans to play the game Bingo at a social gathering was later asked in the hotel bar about those plans. The next day another host asked him: “What is Bingo?” Some researchers even used the suspected eavesdropping to their advantage. After talking to their hotel walls about the desire for an extra roll of toilet paper or a television set, two scientists were pleasantly surprised to see the items appear within hours. Other episodes included “maids” interrupting a meeting to move potted plants closer to visiting U.S. scientists, and a technician who entered a conference room to change the tape in recorders previously hidden behind a wall. Dismayed U.S. officials hadn't been told the meeting was being recorded.

    GAO investigators say the episodes highlight the need to brief researchers more carefully and to review all travel plans, because spies “can operate worldwide.” They recommend that the weapons laboratories consult with counterintelligence agents and other scientists, who would be able to spot potentially sensitive information in planned presentations. Livermore and Oak Ridge currently conduct such reviews, which have prompted some scientists to alter or cancel travel plans.

    DOE officials agree with the findings and say they are expanding reviews and paying more attention to activities involving nonsensitive nations. But given limited funds, says one official, the agency “will probably continue to target the primary threat, and that is the sensitive nations.”


    University Company to Exploit Heart Data

    1. Andrew Lawler

    Boston—As a boy growing up in the small town of Framingham, Massachusetts, medical ethicist Arthur Caplan remembers watching excitedly as distinguished scientists from nearby Boston visited his father's drugstore. They came to inspect the pharmacy's records of patients enrolled in the federally funded Framingham Heart Study, a massive government effort begun in 1948 to monitor the cardiovascular health of more than 10,000 townsfolk. “It was a great event,” recalls Caplan, 50, who has long since left town for the University of Pennsylvania in Philadelphia. But the Framingham study, which helped establish a link between cigarette smoking and heart disease and between high blood pressure and stroke, continues to chart new territory—and Caplan is poised to play a role in its future development.

    This month, Boston University (BU), which directs the study and maintains the records, announced plans to form a bioinformatics company that will mine the data. The university will own 20% of Framingham Genomic Medicine Inc., which hopes to raise $21 million to begin modernizing the immense database and packaging it in a format that will be valuable to the pharmaceutical industry. The plan raises a host of difficult ethical issues, including patient privacy, academic conflicts of interest, and reciprocal value to the Framingham residents whose medical data will form the basis for the new enterprise. “These are all choppy waters,” says Caplan, who may become a paid ombudsman for the community in its dealings with the company and the university. But he thinks it's a voyage that may be worth taking: “We're talking about the gold standard of epidemiology.”

    Fred Ledley, chief scientist for the new company and its only full-time employee to date, also sees a golden opportunity to use what is now largely gathering dust in warehouses. “There's an enormous amount of data that's never been pulled out of boxes,” he says, “and I don't think the government has the money to do it.” However, the university's actions touch on issues similar to those raised by a controversial decision by the government of Iceland to provide a private company with health records on all its residents in return for an upgraded record-keeping system and free access to any new drugs the company develops (Science, 30 October 1998, p. 859). The University of Utah, Salt Lake City, also has provided private companies with genealogical data from Mormon church records, says Richard Koehn, Utah's vice president of research, after taking steps to ensure confidentiality and requiring involvement by faculty members.

    The Framingham company's first move will be to build a comprehensive electronic database over the next several months. Its second, more ambitious, step will be to correlate clinical records with DNA analyses from blood samples on file, with the goal of identifying some 50,000 genetic markers in individuals that are linked to specific abnormalities or diseases. Now that the human genome has been nearly sequenced (see p. 2294), Ledley and BU are betting that biotech firms also will find the Framingham data a valuable tool.

    Before Ledley can realize that dream, however, the company must win the support of residents, other universities involved in the study, and the National Heart, Lung, and Blood Institute (NHLBI) in Bethesda, Maryland, which has put up $34 million over the years. NHLBI director Claude Lenfant could not be reached for comment, but his staff says he plans to visit BU soon to discuss the new company and the institute's concerns about privacy, data access, and conflict of interest. NHLBI officials and researchers outside BU want continued access to the data, whereas residents—who for the past decade have signed consent forms for genetic analyses of their blood and tissue—want to safeguard their health records, which include psychosocial data. The relationship of BU researchers to the new company must also be resolved. “This is a difficult dance,” says Caplan.

    But Ledley and BU managers say they know the moves. The revamped database will remain available to researchers at no cost. “We're not taking any data out of the public domain, and we're not selling patient data,” Ledley says. “We're selling tools to analyze that data.” In addition, BU researchers involved in the study will be precluded from owning company stock, although they will be able to serve as consultants. “We want to preserve the integrity of the study,” says BU associate vice president David Lampe.

    A 25 April letter to the 1000 or so surviving study participants spoke about “entering an important new era of medical research” and promised to maintain “exemplary ethical standards.” It also proposed an ethical review group, to be based in Framingham, and said “a portion of its resources”—perhaps a chunk of stock—would be put in a trust controlled by a community board. “You can never pay people back, but you can show social responsibility,” Ledley says.

    Jay Lander, a Framingham attorney and vice chair of an organization called Friends of Framingham Heart Study, which represents participants, says so far the community feels “surprised and somewhat apprehensive” about the new company. Pending a clearer idea of how the venture might affect the study, he says, “this thing isn't going anywhere.” But some ethicists are intrigued by the plan and see its potential value to society. “I would caution against a knee-jerk reaction about this. It's not a bad thing,” says Norman Fost, director of the medical ethics program at the University of Wisconsin, Madison.

    To signal its good intentions, the company intends to give the proposed board $150,000 to hire an ethicist. Ledley has suggested native son Caplan, noting that “he would be accountable to the community, not to us.” Caplan says the unusual arrangement would be workable if the company's contribution doesn't come with any strings attached. And he thinks that BU officials realize they are under close scrutiny. “This is a monumental study,” he adds. “Doing it right is crucial.”


    Biotech Giants Butt Heads Over Cancer Drug

    1. Eliot Marshall

    Mountain sheep settle disputes by knocking their heads together until one of them gives up and walks away; biotech companies do much the same, except they enlist patent lawyers to do the head-butting. The most recent display of this kind centers on an important new breast cancer drug, Herceptin. Developed by Genentech Inc. of South San Francisco in the 1990s, Herceptin has been generally available only since November 1998. Already, though, it has won acceptance as an adjunct to other therapy and is earning big revenues for the company—$68.7 million in this year's first quarter alone, according to Genentech. But success breeds competition. On 8 June, Chiron Corp. of Emeryville, California, challenged Genentech's patent claims and sued for a share of the profits.

    Chiron's 4-page complaint, filed in the federal district court in Sacramento, California, accuses Genentech of “willful, wanton, and deliberate” infringement of one of its patents. It seeks an unspecified amount of money for damages, including a trebling of normal penalties “due to the willful nature of Genentech's infringement.” Sean Johnston, vice president for intellectual property at Genentech, says Chiron's patent is “invalid”; the company plans to say so in an answer to be filed with the court in August. The case is being closely watched in the biotech industry not just because of the money at stake but also because it involves one of the first therapies to emerge from the burgeoning field of cancer genetics.

    Chiron launched its attack after winning what some observers call a “submarine patent”—one that had been quietly wending its way through the U.S. Patent and Trademark Office (PTO) for the past 16 years. On 25 April, PTO awarded Chiron U.S. Patent 6,054,561, which traces its lineage back to an application filed at the PTO in February 1984 by scientists from another California biotech firm, the Cetus Corp. Cetus was merged into Chiron in 1991. Among the patent's 31 claims is the invention of a monoclonal antibody that binds to a cell surface receptor called c-erbB-2, also known as HER2—the very target that Herceptin binds.

    For its part, Genentech owns six or seven patents in the area, according to a spokesperson, including one (U.S. Patent 5,677,171) that claims “a monoclonal antibody which specifically binds to the extracellular domain of the HER2 receptor and inhibits growth of SK-BR-3 breast tumor cells.…” Genentech filed for its patent in 1988 and received it in 1997.

    It's “not uncommon at all” to have patents appear to overlap, says Robert Blackburn, Chiron's chief patent counsel. He suggests that the Cetus-Chiron patent is broader and, more important, was filed earlier. Blackburn claims Genentech talked about getting a license from Chiron several years ago, but “they seemed to lose interest and go away.”

    “If [Chiron is] saying they offered reasonable royalty terms, I would disagree,” says Johnston, who acknowledges that the two companies did discuss a license. Johnston argues that in this case Chiron owes its success at the PTO more to clever management of a fragmentary legal claim than to diligent investigation of the clinical uses of HER2. “We're confident that we can demonstrate that the Cetus-Chiron scientists were not the first to make antibodies [to the c-erbB-2 receptor],” Johnston adds. For example, he notes that Robert Weinberg of the Whitehead Institute in Cambridge, Massachusetts, identified the key protein in 1982. This and other early research, Johnston claims, can be used to disprove Chiron's claim of priority in 1984. Blackburn, without going into details, dismisses these arguments as “a red herring.”

    Why is it taking so long for these disputes to surface? “Unlike chip technology,” says Rachel Krevans, lead outside counsel for Genentech at the firm of Morrison & Foerster in San Francisco, “biotech products take a long time to mature.” Questions about who profits from them take even longer to answer, “and that's why we're litigating the science of the early 1980s in the year 2000.”


    Finally, the Book of Life and Instructions for Navigating It

    1. Elizabeth Pennisi

    The publicly and privately funded teams have both finished drafts of the human genome. Now comes the daunting task of developing tools to figure out just what these volumes say

    Beaming at each other, longtime rivals Francis Collins and J. Craig Venter shook hands in the East Room of the White House on 26 June as they declared joint victory—and announced an implicit truce—in their race to decipher the “book of life.” President Clinton presided over the event, attended by a stellar cast of genome scientists, a few members of Congress, and a handful of foreign ambassadors, to celebrate completion of the “first survey of the entire human genome … the most wondrous map ever produced by humankind.” In fact, neither team has completely deciphered the human genome—that is, determined the exact order of all 3.12 billion or 3.15 billion bases, depending upon whom you ask, that make up our DNA. But each has completed a version of this book, which, hyperbole aside, promises to propel biology and medicine headlong through the 21st century. What's more, the two former adversaries, who until recently minced no words disparaging each other's work, said they hope to publish their work simultaneously in a peer-reviewed journal sometime this fall (see p. 2294).

    This very public and very carefully orchestrated denouement—which required diplomatic skills akin to those behind the Camp David Peace Accord—brings to an end one of the most high-profile fights in recent biology, one that pitted a publicly funded consortium of scientists, led by Collins, against Venter's upstart company, Celera Genomics of Rockville, Maryland. With obvious relief, Collins and Venter agreed to forgo the barbs and share the credit for a biological tour de force that many scientists thought was impossible a mere 15 years ago.

    So what, exactly, have they produced, and how will they fine-tune it so that everyone from workaday biologists to pharmaceutical giants can mine its gold?

    The public consortium has finished a “working draft,” which covers 85% of the genome's coding regions in rough form. Although the sentences on some pages are mixed up and some words are missing letters, the data are freely available in several public genome databases. A polished version will be out in 2003 or sooner, promises Collins, director of the National Human Genome Research Institute, which funds most of the U.S. contribution to this international endeavor. By all accounts, Celera's version is considerably more polished, thanks to a bold new sequencing strategy, deep corporate pockets, and Venter's ability to pool the public data with his own proprietary data. Venter promises to make his draft freely available to academic researchers at the time of publication; it is available now to subscribers who paid to get a first peek.

    Both books are clearly works in progress, the public's more so. As Venter is the first to admit, sequence data by themselves are of minimal use, so both teams have been scrambling over the past few months to improve the computer tools and analysis, known as annotation, that will enable biologists to make sense of the billions of A's, T's, G's, and C's contained in both databases. Although such efforts are already under way and some ingenious new strategies are in the works, full annotation of the human genome will continue well into this century.

    Before the announcement, speculation was rampant that Venter and Collins might collaborate on annotating the genome, turning the truce into a real partnership. President Clinton encouraged such hopes at the White House briefing when he said that both sides had agreed to hold a historic sequence analysis conference. At a subsequent press briefing, however, both Venter and Collins went out of their way to downplay such expectations, saying that they were exploring the possibility of a workshop to compare their approaches after publication. For now, on this ever-shifting stage, it looks as though the two annotation efforts will proceed independently—as with the sequence itself, undoubtedly speeded by the competition.

    The books

    These books, the starting points for annotation, are distinct, reflecting the different processes used to create them. From the outset, the publicly funded Human Genome Project worked by consensus, using a painstaking approach that wins kudos in terms of democracy but is not conducive to speed. Starting in about 1990, researchers across the globe divvied up the work, first making genome maps of increasing resolution, then improving the technology for sequencing, testing it on model organisms, and finally, in 1997, launching into full-scale sequencing of the human genome (Science, 12 April 1996, p. 188). Across the Atlantic, the Wellcome Trust set up the Sanger Centre in Hinxton, guaranteeing that the United Kingdom would be a big player in genome sequencing. From the outset, the goal of the public effort was to produce a “finished,” highly accurate sequence—to the extent possible (there will always be some holes), a continuous stretch of A's, T's, G's, and C's, arrayed in the exact order in which they appear along the chromosomes. But in a nod to the limitations of their technology, the team decided to sequence just the regions of the genome known to contain most of the genes—not the entire genome—but to do so with fewer than 1 error per 10,000 bases. Completion of this estimated $3 billion project was slated for 2005.

    That changed when Venter, a former NIH scientist turned entrepreneur, threw down the gauntlet in 1998, declaring that his new company would single-handedly sequence the entire genome in just 3 years—4 years ahead of the public project. As a CEO, Venter had several tactical advantages over an NIH institute director. For one, he did not have to contend with peer review, nor did he have to strive for consensus. Instead, he adopted a radical sequencing strategy that depended upon some 300 of the fastest sequencing machines—made by PE Biosystems Corp., Celera's parent company—and one of the world's most powerful supercomputers (Science, 18 June 1999, p. 1906). What's more, Venter could build on—and later incorporate—the work of the public project.

    Fearing that Venter planned to patent the sequence and sell it for profit—as well as hog all the credit—the public consortium rallied. The Wellcome Trust immediately increased its support for the project, promising that the Sanger Centre would do a third of the genome. The United States consolidated its sequencing effort, and together the two countries created five sequencing supercenters that drastically scaled up their efforts. (The five sequencing shops are the Sanger Centre; the U.S. Department of Energy Joint Genome Institute in Walnut Creek, California; Washington University School of Medicine in St. Louis; the Whitehead Institute for Biomedical Research in Cambridge, Massachusetts; and Baylor College of Medicine in Houston.)

    And in September 1998, the consortium announced a brand-new game plan: Instead of concentrating on finished sequence, it would produce a rough draft of 90% of the genome by spring 2001—about the same time as Celera's target date for producing the human genome. A year later, they moved up the completion date to spring 2000. The goal, said John Sulston, director of the Sanger Centre, was to get as much of the human genome sequence into the public domain as possible before Venter could lock it up.

    To decipher the genome—actually, a mosaic of six to 10 anonymous individuals—the public consortium opted for a careful, if tedious, piece-by-piece approach. It's akin to ripping out a page of a book, shredding it multiple times, then taping it back together by looking for overlapping letters. But in this case, researchers start with a 150,000-base chunk of DNA (the page)—known as a BAC, or bacterial artificial chromosome, in which it is cloned—then chop it up into many smaller pieces, or subclones, that have overlapping ends. These pieces are then run through the sequencing machines and reassembled by a computer into ever longer pieces, called contigs. Then the group moves on to the next clone.
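    The shred-and-retape idea above can be caricatured in a few lines of code. The sketch below is a toy greedy assembler working on made-up fragments of a short string; real assemblers handle sequencing errors, repeats, and both DNA strands, none of which this illustration attempts.

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments):
    """Greedily merge the pair with the largest overlap until no overlaps remain."""
    frags = list(fragments)
    while len(frags) > 1:
        best = (0, None, None)
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    n = overlap(a, b)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        if n == 0:          # no remaining overlaps: the pieces stay as separate contigs
            break
        merged = frags[i] + frags[j][n:]
        frags = [f for k, f in enumerate(frags) if k not in (i, j)] + [merged]
    return frags

# Hypothetical "subclones" shredded from the string ATGGCGTACGTTAGC
reads = ["ATGGCGTA", "GCGTACGT", "ACGTTAGC"]
print(assemble(reads))
```

Run on these three overlapping reads, the greedy loop stitches them back into the single original string, which is the essence of building a contig.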

    In reality, the process is more complicated. Each piece is sequenced not once but multiple times, because the more times a base shows up at the same position in these subclones, the more certain the computer is that the identification is correct. As a result, the whole genome is sequenced several times over—four times for the rough draft to have an error rate less than 1 in 100, and somewhere between eight and 11 times to reach the higher standard of no more than 1 error in every 10,000 bases.

    In this piece-by-piece approach, as a given BAC is worked through, its “sequence” will first show up as a series of short, unconnected strings of bases that may be in the wrong place on the BAC or even backward. Ideally, given enough time, as the computer slogs through additional sequence, it begins to fill in the holes, linking the small pieces of DNA together to form ever-longer stretches until most of the BAC is represented in the correct order. The next job is to align all the BACs, which also contain overlapping ends to aid in assembly—a task that sounds easy but is dauntingly difficult.

    The rough draft has not yet achieved this level of completion. The current draft consists of BACs covering 85% (5% short of their announced goal) of the gene-containing regions of chromosomes. The BACs are in order, thanks to the efforts of Washington University's Robert Waterston and John McPherson, who worked out how to put each BAC in its proper place. But the completeness of each BAC can vary from being quite jumbled to having just a few bases missing. Some 24% of the sequence is in finished, highly accurate form, said Collins at the briefing; another 22% is in near-finished form; and 38% is in draft form. Most of the remaining 15% is currently being sequenced, except for a pesky 3% that refuses to be cloned.

    Celera, on the other hand, relied on the “whole-genome shotgun” strategy that Venter had pioneered for sequencing microbial genomes (Science, 28 July 1995, p. 496). Instead of going piece by piece, or shredding one page at a time, Celera shreds the entire volume—or more accurately, an entire set of encyclopedias—into millions of tiny overlapping pieces and then reassembles them with the aid of a supercomputer. Although the company has not revealed its exact sequencing strategy for the human, it presumably resembles that used to sequence the Drosophila genome. It blasted the genome of one man first into 2000-base pieces, then into 10,000-base pieces, and again into 50,000-base pieces, covering the genome three times, between September 1999 and April 2000. Then, to fill in the gaps and increase accuracy, Celera sequenced parts of the genomes of three women and one additional man of diverse ethnic backgrounds, finishing that work by 23 June.

    Celera also took advantage of the fact that the public consortium deposits its data nightly into GenBank. Each day, Celera scientists downloaded the human sequence data in GenBank, manipulated them so they looked like its own raw sequences, and fed both data sets into the company's supercomputers for comparison. By incorporating the public data into its analysis, Celera ensured that each base, in theory, had been sequenced six times or more, significantly boosting the odds that it is accurate—and shaving a year or two off its project, says Venter. Analyses of the recently completed Drosophila sequence data suggest that Celera can get reasonably accurate and assembled coverage of the genome by sequencing it just 6.5 times, rather than 10 times as was originally thought.

    Celera then assembles these data into “scaffolds,” which are sets of contigs whose locations along a chromosome are determined by matching up known DNA landmarks. Although there are likely to be some 200,000 gaps between and within scaffolds, the Celera genome comes closer to covering all the gene-containing regions of the genome than does the public draft. Because the assembly is based solely on the overlaps—and not on the supposedly preestablished order of the pieces, as in the public project—more of the Celera genome is in the right place and in the right order. However, at this point, the human genome “will not be as good as the Drosophila genome” that was published in March, says Norton Zinder of Rockefeller University in New York City, who is also a scientific adviser to Celera.

    Charts, sextants, and compasses

    Getting those billions of bases in order is just the first step. Next comes figuring out what they mean. With so much data now in hand, the race has shifted to developing ever slicker algorithms and more user-friendly packaging for the tools needed to analyze, or annotate, the genome. Here again, companies—and not just Celera but DoubleTwist, Incyte, Compugen, and others—would seem to have the edge, as they have more money to invest in glitzy new software and high-powered hardware. Indeed, they are banking on making millions by selling their analyses to groups who aren't equipped to do it themselves. Even so, new databases and computer programs are cropping up monthly in the public arena, some at GenBank and others at GenBank's European counterpart, the European Bioinformatics Institute (EBI), and the DNA Databank of Japan.

    What race? Venter and Collins deny that they have been racing to finish the human genome; at any rate, both agree that the real work of deciphering it has only just begun.

    Together, they are providing the sextant, compass, and charts that will enable researchers to navigate the genome—to look for genes, compare genomes, and find information relevant to the stretch of sequence they want to study. “In the end it will not be the data that makes the difference; it will be the software,” predicts J. Michael Cherry, a bioinformaticist at Stanford. “If [a company] can provide their customers with good tools to mine the data, they will do very well.”

    For both public and private annotation efforts, the basic task is the same; the products differ mostly in the bells and whistles they provide. The first priority of any annotation software is to pinpoint the genes. Only computers have the ability to scan billions of bases and pick out the potential genes. They do this by looking for characteristic sequences at the beginnings and ends of genes, or by comparing new sequence to known genes or bits of genes. Additional computer programs translate those genes into proteins and, based on similarities to other proteins, attempt to assign a function to each one. Still other programs, such as the National Center for Biotechnology Information's (NCBI's) BLAST, compare the new genome data to that from other organisms, such as the fruit fly or the nematode. At Celera, Venter's crew uses its supercomputer to routinely perform “all against all” searches—comparisons of the newly generated sequence with that in all available databases. The sequence similarities such searches turn up highlight regions, such as genes or regulatory DNA, that might be missed by other gene-hunting programs.
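    The comparison step those programs perform can be sketched with a k-mer "seed" search, the indexing idea at the heart of BLAST-style tools: find short exact word matches between a query and a known sequence, then extend them into longer alignments. The code below is a toy illustration on made-up strings; real gene finders layer statistical models and scoring matrices on top of this.

```python
def kmer_index(seq, k=4):
    """Map every k-letter word in seq to the positions where it occurs."""
    index = {}
    for i in range(len(seq) - k + 1):
        index.setdefault(seq[i:i + k], []).append(i)
    return index

def find_seeds(query, target, k=4):
    """Return (query_pos, target_pos) pairs where k-mers match exactly.
    BLAST-style programs extend such seeds into full alignments."""
    index = kmer_index(target, k)
    hits = []
    for i in range(len(query) - k + 1):
        for j in index.get(query[i:i + k], []):
            hits.append((i, j))
    return hits

# Hypothetical query compared against a "known gene" fragment
print(find_seeds("TTGGCACGA", "ACGGCACGTT", k=4))
```

A run of adjacent seed hits—here three consecutive matching 4-letter words—flags a region of similarity worth extending, which is how a stretch of anonymous sequence gets tentatively linked to a known gene.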

    According to Zinder, when Celera publishes its version of the human genome, it will make available basic annotation—locations of the genes, with their coding and noncoding regions defined, and the predicted functions of their proteins—but not information based on genome-to-genome comparisons. Those comparisons and Celera's programs for manipulating and presenting that information will be the company's bread and butter, so Venter isn't cutting any corners or sparing any expense.

    Instead of sextants and compasses, Venter plans to have the genomics equivalent of the computer-linked Global Positioning System that guides his yacht. “It takes a lot more to navigate around the genome than to navigate around the world,” he says. Scientists who had a preview of what's to come are enthusiastic. “Celera's annotation and database programs are excellent,” says J. Troy Littleton, a neurobiologist at the Massachusetts Institute of Technology who worked with Celera to annotate the fly genome in November 1999.

    At the same time, bioinformatics experts working with the Human Genome Project are scrambling to complete a set of navigating tools that they plan to provide online for free. True to the democratic and somewhat individualistic nature of the public endeavor, several annotation efforts have sprung up in conjunction with the major players in the project. One called the Genome Channel is an offshoot of the U.S. Department of Energy's genome effort; others, such as that at NCBI, GenBank's home, and EBI, were spawned to help users make sense of archived data. Also, because incoming sequence is immediately available, no matter how patchy and incomplete, EBI and NCBI have been working hard to make clear what's what and where to find the best sequence for the part of the genome being studied.

    By late June, EBI's program, ENSEMBL, had identified some 38,000 genes in the existing rough draft; the total number of human genes remains a mystery, with estimates ranging from 28,000 to 120,000, although many genome scientists are now betting that the answer is close to 50,000 (Science, 19 May, p. 1146). Also, to compensate for the roughness of the public draft, NCBI plans to expand its repertoire of tools in the coming weeks. Because it's easier to find genes when the small chunks of sequence within each BAC are in the right order, NCBI will perform virtual “assemblies” that will clean up the rough draft electronically without generating additional sequence data.
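The idea behind such a virtual assembly—putting unordered chunks of sequence into a consistent order without new lab work—can be sketched as a greedy overlap merge. This is a toy under strong assumptions (exact overlaps, no errors or repeats), not NCBI's actual method:

```python
# Toy "virtual assembly": order and merge unordered sequence chunks by
# finding exact suffix-prefix overlaps. Real assemblers score overlaps
# statistically and must cope with sequencing errors and repeats.

def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(chunks, min_len=3):
    chunks = list(chunks)
    while len(chunks) > 1:
        best = (0, 0, 1)  # (overlap length, i, j)
        for i, a in enumerate(chunks):
            for j, b in enumerate(chunks):
                if i != j:
                    n = overlap(a, b, min_len)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        if n == 0:
            break  # no overlaps left; remaining chunks stay unordered
        merged = chunks[i] + chunks[j][n:]
        chunks = [c for k, c in enumerate(chunks) if k not in (i, j)]
        chunks.append(merged)
    return chunks

if __name__ == "__main__":
    pieces = ["GGTACTGAT", "TTACCAGGTAC", "CTGATCCA"]
    print(greedy_assemble(pieces))  # → ['TTACCAGGTACTGATCCA']
```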

    With these tools in hand, asserts NCBI director David Lipman, the rough draft should be of sufficient resolution for most tasks biologists want to perform. Indeed, he says, a dry run using a subset of the data indicated that gene-finding programs do almost as well with rough-draft sequence as with finished sequence in finding at least some part of a gene.

    Mary-Claire King, a human geneticist at the University of Washington, Seattle, concurs. “It's very rough, but very useful,” she says. Increasingly, says King, gene hunters like herself determine the general location of a gene and then pull the sequenced version of that region out of GenBank to find the gene itself.

    Setting a straight course

    Over the next few months, these first computer-based expeditions will be overtaken by human explorations of the genome. Gene-prediction programs make mistakes, identifying fossil genes that are never expressed or fusing two genes together, for example. Protein classification programs also have trouble—one part of a protein may make it look like a transmembrane receptor while another part suggests it is a DNA binding protein. The human eye sees new patterns and possibilities in sequences that computer programmers never dreamed of. “You really need to look at the data,” points out Gerald Rubin, vice president for biomedical research at the Howard Hughes Medical Institute in Bethesda, Maryland. “Any kind of [automated] annotation will not substitute for humans,” says Rubin, who ought to know, as he and Venter arranged an “annotation jamboree,” or research fest, to make sense of the Drosophila genome.

    In November 1999, Celera brought together about 45 biologists and bioinformatics experts to take a first look at the newly assembled fly genome. The synergy that resulted led to many discoveries about the fruit fly and even some new ideas about how organisms in general evolve greater complexity (Science, 24 March, p. 2182). Venter is planning another jamboree, or likely several, over the summer and fall to annotate the human genome; for now, Celera is not saying whether the insights gained in those jamborees will be included in its initial publication.

    Although a jamboree is great for a first pass, full annotation will take years, both Venter and Collins agree, and will increasingly depend on the contributions of bench biologists who are studying individual genes and proteins. For that reason, bioinformatics experts in the public consortium are focusing on ways to elicit continuing input from the biological community. EBI has embraced a strategy developed in large part by Lincoln Stein of the Cold Spring Harbor Laboratory in New York to keep nematode researchers involved with adding new results to the nematode genome database.

    Called a Distributed Annotation System (DAS), it enables any researcher to add his or her two cents to the database, provided the contribution follows the DAS format for presenting the information. EBI has the rudiments of the system in place; by August, Stein hopes to finish the final bit of software so the system can go online. The input won't come in time for the first publications, but the system may eventually become a powerful reservoir of biological knowledge.
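The core of the distributed-annotation idea can be sketched as follows: a reference sequence lives in one place, independent groups publish feature "layers" keyed to its coordinates, and a client merges whichever layers it trusts. The data structures below are illustrative only—the real DAS is an XML protocol served over HTTP, and the two lab sources are hypothetical:

```python
# Sketch of distributed annotation: independent feature layers, keyed to
# reference-sequence coordinates, merged on the client side.
# (Illustrative data structures only, not the actual DAS wire format.)

def merge_annotations(*layers):
    """Combine feature layers from several sources, sorted by position."""
    merged = [feat for layer in layers for feat in layer]
    return sorted(merged, key=lambda f: (f["start"], f["end"]))

# Two hypothetical annotation sources describing the same reference region:
lab_a = [
    {"start": 1200, "end": 2400, "type": "exon", "source": "lab_a"},
]
lab_b = [
    {"start": 300, "end": 900, "type": "promoter", "source": "lab_b"},
    {"start": 1200, "end": 2400, "type": "exon", "source": "lab_b"},
]

if __name__ == "__main__":
    for feat in merge_annotations(lab_a, lab_b):
        print(feat["start"], feat["end"], feat["type"], feat["source"])
```

Because every layer is anchored to the same coordinate system, a new source can be added—or a dubious one dropped—without touching the reference data.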

    Collins thinks even more is needed. “We need to figure out a way to capture community input in a way that doesn't contaminate the databases [with non-peer-reviewed information],” he says. Along those lines, Lipman envisions an army of curators who will scan the literature and cull important findings to add to the existing annotation.


    Lipman already has a team that is developing a definitive reference list of genes—a task that is not as easy as it sounds. Many genes have multiple names and more than one predicted function. The genes in the database often come in multiple versions as well, of varying degrees of accuracy (Science, 15 October 1999, p. 447). For about a year, these curators have been resolving discrepancies and picking one reference sequence—hence the name RefSeq—as a prelude to more in-depth annotation of the human genome. When necessary, they call in outside troubleshooters to help. So far they have double-checked 1500 genes and expect that number to increase rapidly.

    Eventually, Lipman would like to set up an electronic journal in which biologists would publish minireviews on their favorite gene families. These reviews would be hot-linked to the sequence and take advantage of “a model that's existed for a couple of decades,” he explains, “[that] of combining databases and [scientific] literature.”

    But all of this will take time—and researchers are impatient. So both teams are turning to the mouse for help: They are planning to sequence its 3-billion-base genome and compare it to the human genome. “The mouse [sequence] will identify all the human genes like no prediction program could do,” explains David Nelson, a biochemist at the University of Tennessee, Memphis.
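The logic behind cross-species gene-finding can be sketched simply: slide a window along aligned human and mouse sequence and flag stretches that remain highly similar, because functional DNA—exons, regulatory elements—resists the mutations that scramble the rest. This toy scores identity over pre-aligned sequence (real pipelines align first, then score, with far subtler statistics):

```python
# Toy conservation scan: flag windows where pre-aligned human and mouse
# sequences agree above a cutoff, marking candidate functional regions.
# Illustrative sketch only, not an actual comparative-genomics pipeline.

def conserved_windows(human, mouse, window=10, min_identity=0.8):
    """Return (start, identity) for windows above the conservation cutoff."""
    hits = []
    for i in range(0, min(len(human), len(mouse)) - window + 1):
        pairs = zip(human[i:i + window], mouse[i:i + window])
        identity = sum(h == m for h, m in pairs) / window
        if identity >= min_identity:
            hits.append((i, identity))
    return hits
```

Diverged sequence scores near the 25% expected by chance and is ignored; a conserved stretch stands out as a run of high-identity windows worth annotating.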

    Celera, in characteristic style, is blazing the trail. As soon as it finished sequencing one human genome in April, Celera began blasting through the mouse genome. By late June, it was halfway there, says Zinder, and would be finished by the end of the year. When Celera overlays the mouse sequence on the human genome, it expects to be able to find many of the 35% of the human genes missed by other approaches, as well as identify regulatory regions and other key pieces of DNA. In addition, after Celera knocks off the mouse later this summer, it plans to turn its sequencing prowess to the rat, and perhaps the zebrafish, the dog, or a primate.

    That's about the same list as the public project proposes to sequence over the coming years. But true to its consensus-building style, the public consortium is still working out a sequencing strategy for the next year or so. In October 1999, it had planned to start on the mouse and had divided the task among 10 centers. But that work has barely started. Some researchers, like Doug Smith at Genome Therapeutics in Waltham, Massachusetts, are urging NIH to pick up the pace and sequence the mouse even more quickly than planned—especially as Celera intends to keep its valuable mouse data private.

    Yet there's also a great need to finish the human genome. Finished sequence “will be critical” for a variety of experiments, says cell biologist Shirley Tilghman of Princeton University—for example, for making sense of very large genes or for figuring out how the shape and structure of the chromosomes influence gene regulation. As a result, Collins and his advisers have been debating for months whether to push through the rest of the human genome quickly or turn to the mouse. The emerging consensus seems to be to do both: continue sequencing human but devote substantial capacity to mouse, so that a rough draft will be available in 6 to 9 months.

    Does all this mean that Celera's database is a must for genomics researchers? Opinion is divided. As long as Celera stays ahead and provides comparative analyses of an increasing menagerie of organisms, predicts Lipman, many genome researchers will likely ante up the funds to subscribe to it. Others disagree, saying that Celera's real advantage will be short-lived. “Once the human and mouse genomes are done and the genes have been identified by comparison of the two genomes, much of the excitement will pass,” says Nelson. Adds Tilghman, “Why should I pay for something I can get for free?”


    Biotech Research Proves a Draw in Canada

    1. Anne Simon Moffat

    Toronto, Canada—Despite protests from Greenpeace members, some of whom dressed up as “corn fakes” to show their opposition to genetically modified organisms, more than 500 industrial and academic researchers, lawyers, and business people from about 25 countries gathered here from 5 to 8 June for the third biennial Agricultural Biotechnology International Conference. Highlights of the meeting included reports of progress toward making plants that resist nematode pests or stresses such as salt, frost, and drought.

    Making Plants More Stress Tolerant

    As farmers know all too well, drought or an unexpected cold snap can play havoc with their harvests. Indeed, drought and frost intolerance, together with intolerance to salt—a growing problem thanks to irrigation, which leads to salt accumulation in the soil—are the three major problems that restrict the growth of plants worldwide. Now, researchers are beginning to make progress in developing new strains of crop plants more capable of withstanding these stresses.

    At the meeting, plant molecular geneticist Michael Thomashow of Michigan State University in East Lansing described work by his team indicating that a relatively simple genetic alteration—the introduction of a single gene involved in controlling the synthesis of certain protective compounds—into canola plants can improve both their frost and salt tolerance. “This is a very sound approach that is going to be useful,” predicts Cornell University plant biochemist Ray Wu.

    Thomashow's result is an outgrowth of findings made over the past 10 years in several labs, including his own and that of the husband-wife team of Kazuo Shinozaki and Kazuko Yamaguchi-Shinozaki at the Japan International Research Center for Agricultural Science in Tsukuba. The work has shown that some plants can acclimate to cold, drought, and salt—all of which ultimately exert their deleterious effects by dehydrating cells—by generating a variety of enzymes that cells use to produce chemicals and proteins that can protect against dehydration. More recently, the researchers identified a small family of transcription factor proteins that turn on the genes needed to make the protective compounds. This suggested that genetically altering plants to make more of the transcription factors could improve their tolerance to drought, salt, and freezing—an idea that was subsequently borne out in gene transfer experiments on the small plant Arabidopsis thaliana.

    In experiments done 2 years ago, for example, the Thomashow team linked the gene for a transcription factor they identified to a regulatory sequence that promotes its activity and then introduced this hybrid gene into Arabidopsis plants. They found that the altered plants could survive temperatures 3°C colder than those tolerated by unaltered plants. Later that year, the Shinozakis showed that two different transcription factors identified by their team trigger tolerance to drought and salt loading, as well as to freezing, in Arabidopsis. But as Thomashow points out, “This work was in Arabidopsis. The challenge is to extend these findings to a real crop.”

    That's what the Thomashow team has now done in canola, an oilseed crop closely related to Arabidopsis. The researchers introduced each of the three transcription factor genes individually into canola plants. They obtained similar results with all of the resulting plants. At least under growth-chamber conditions, the plants showed improved freezing tolerance ranging from 2°C for plants that had not yet been exposed to cold temperatures, to 6° for plants that had already been acclimated by exposure to cold. Although a 1° or 2° improvement might be useful, Thomashow says, a 6° improvement would be significant.

    And further improvements may be on the way. Cornell University low-temperature expert Peter Steponkus, who has worked with both Thomashow and the Shinozakis, has recently transformed Arabidopsis plants with one of the Shinozakis' transcription factor genes. “It is a real winner, maximizing freezing tolerance up to 10°C,” he says. In contrast, he found only a 1° improvement using the gene that gave a 3° improvement in the Thomashow team's Arabidopsis experiments.

    The reason for that discrepancy is currently unclear, and in any event, it's likely that researchers will have to tinker with the gene constructs they are introducing into plants to get maximal improvements in tolerance to cold and other stresses. They will also have to show that the genetic manipulations that work in lab and greenhouse conditions will work in crops planted in the field. But if all goes well, farmers may worry less about drought or an unexpected freeze nipping their crops in the bud, a real threat as floral parts are among a plant's most sensitive organs.

    New Nematode Protection?

    Each year, tiny but voracious nematode worms cause $78 billion in worldwide crop losses. Now, scientists are hoping to thwart these rapacious pests by a clever trick: manipulating the cell cycle of the plants they feed on.

    At the meeting, plant molecular biologist Dirk Inze of the University of Ghent in Belgium and the agbiotech company CropDesign NV near Ghent reported that a genetic manipulation that blocks the cell cycle in plant roots invaded by a nematode prevents this pest from killing the plant. The findings open the door to developing new strains of nematode-resistant plants, which would be a major advance given the amount of crop destruction these worms cause. “Nematodes are a huge problem, and available nematocides are environmentally undesirable compounds,” says plant scientist Chris Lamb, director of the John Innes Centre in Norwich, U.K. Using cell cycle disrupters to upset nematode feeding “is a new and interesting idea,” he adds, although he cautions that it's still “untested.”

    When the tiny worms invade the roots of plants, they trigger the formation of either tumorlike growths called “galls,” which contain large, multinucleated cells with dense cytoplasm and numerous mitochondria, or syncytia, large collections of cells that have fused. Either way, the nematodes feed on these abnormal cells, sucking out nutrients for themselves and severely damaging or killing the plant. Apparently, the worms trigger the formation of these abnormal cells by turning on a variety of plant genes, including some needed to drive the cell cycle. About a year ago, Inze and his colleagues showed that cell cycle inhibitors can prevent the formation of the multinucleated cells or syncytia—a change that could deprive nematodes of their preferred feeding grounds.

    The Belgian workers have now performed a genetic manipulation on the small experimental plant Arabidopsis that essentially tricks invading nematodes into turning on a cell cycle blocker. The researchers had previously identified a gene regulatory element called a promoter that is activated in plant cells by nematode feeding. This leads to activation of the cell cycle and other genes needed to induce gall or syncytium formation. Inze and his colleagues have coupled this promoter to a gene that produces a kinase enzyme known to inhibit the cell cycle. When this hybrid gene was introduced into Arabidopsis, the cell cycle arrested specifically in the parts of the roots invaded by nematodes, preventing the formation of the large feeding cells. The nematodes could not feed anymore, and the plants became free of nematode infection.

    So far, Inze says, “cell cycle studies in plants are in the early stages.” Researchers will need to show that similar genetic manipulations can be achieved in crop plants. But Inze predicts that the work has the potential to produce plants resistant to nematodes and other pests and pathogens. “Plant development is quite plastic,” he notes. “It can be managed.”


    Will the U.S. Bring Down the Curtain on Landsat?

    1. David Malakoff

    Researchers are fawning over improved images from the new Landsat 7 satellite. But they also worry that there may not be a suitable successor to the government-built spacecraft

    Last month a small group of earth scientists got their first detailed look at data from a $700 million U.S. earth-monitoring satellite. It was a knockout. Landsat 7, launched in April 1999, was performing far above expectations, producing detailed images of forests, volcanoes, ice sheets, and other signs of global changes. “This is the finest terrestrial observatory we have ever flown,” crowed Sam Goward, a geologist at the University of Maryland, College Park.

    Their delight, however, was tempered by a big concern: Landsat 7 could be the last of a line of satellites first launched in 1972. Although the craft is scheduled to operate until at least 2006, there's already a struggle under way to decide who—if anyone—should build and operate a successor, at a cost of at least $400 million. That decision will shape the future of Landsat's 27-year-old data archive, which has been used for everything from monitoring desertification to identifying growing suburbs ripe for new fast-food outlets.

    Private companies say they are the rightful heirs to the earth-sensing throne, and they want the government to get out of the burgeoning imaging business. But many researchers worry that science will suffer if private companies call the shots. They want the federal government to remain in charge, perhaps as part of an international consortium. “The question is how to make a transition without jeopardizing the [extension of the] largest existing land-observation data set in the world,” says Donald Lauer of the U.S. Geological Survey (USGS) in Sioux Falls, South Dakota.

    An eye on change

    While other earth-sensing satellites are focused on the oceans or atmosphere, Landsat keeps an eye on terra firma. The 4-meter-long, 2200-kilogram current model, for instance, carries sensors that collect data in eight wavelengths of visible and infrared radiation, producing snapshots that cover 183-km-by-170-km patches of ground. The images, which allow researchers to identify different kinds of soil, vegetation, and land uses, detail objects down to 30 meters across and feed a community that ranges from military planners to geologists. Although new commercial satellites have much finer resolution—down to 1 meter—Landsat's broader view “is more appropriate for studying large-scale changes,” says earth scientist Curtis Woodcock of Boston University (BU). And because Landsat 7 returns to the same spots every 16 days and follows paths blazed by older siblings, researchers can monitor changes over time scales ranging from weeks to decades.

    The Landsat archive's value for tracking long-term landscape changes was highlighted at a recent meeting in Boulder, Colorado, of the satellite's science team, a group of 14 investigators funded by NASA and USGS. Woodcock, for instance, documented the growing holes that loggers have carved into Oregon's old-growth forests over the past 15 years (see images). Geologist Alexander Goetz of the University of Colorado, Boulder, has quantified the vast expansion of pivot-arm irrigation—in which a long sprinkler arm turns around a central pivot, like the spoke of a wheel—over the same period in a 1-million-square-kilometer arid patch of the western United States. The irrigation pattern could influence how ancient sand dunes in the area behave if an extended drought strikes, or when farmers exhaust groundwater supplies. “We're trying to develop a model that will tell us if we're going to get a dust bowl, or something even worse,” he says.

    The improved performance of Landsat 7's Enhanced Thematic Mapper Plus (ETM+)—its primary instrument—was the focus of other researchers. David Skole of Michigan State University in East Lansing showed that new sensors carried by the ETM+ are better able to spot the subtle electromagnetic clues left behind by logging and other activities occurring beneath the canopy of the Amazon rainforest. Those changes were invisible to earlier sensors, raising questions about the accuracy of previous estimates of Amazon deforestation using Landsat data. Drawing on images taken thousands of kilometers to the south, Robert Bindschadler of NASA's Goddard Space Flight Center in Greenbelt, Maryland, is assembling the first comprehensive picture of Antarctica since the first Landsat.

    There are several reasons why researchers are excited about the Landsat 7 data. Better calibration means they won't have to massage the data to determine exact geographic coordinates or to compensate for glare or hardware glitches. That “will save the research community untold hours and expense,” says Goward, the science team's leader. The improved quality also comes with a lower price tag and faster service: Landsat 7 images cost just $600 each, compared with $4400 for one picture in previous editions, and are available to users within days rather than months.

    Orbital soap opera

    Landsat's troubled history, however, suggests that researchers can't assume that the current flow of good, cheap, quick data will continue from a new satellite. Bureaucratic turf wars and funding crises have plagued the program since USGS researchers first proposed Landsat in 1969. “Each satellite has faced a Perils of Pauline situation,” recalled Lawrence Pettinger, a remote-sensing scientist with the USGS in Reston, Virginia, at a recent meeting on Landsat's future.* Congress has never been a great fan of earth-imaging satellites, which lack the economic and lifesaving lure of weather satellites or the romance of interplanetary probes. In addition, private companies have long argued that the government should stay out of the field of moderate-resolution imaging altogether.

    In the late 1980s, such claims helped convince legislators, over the objections of some federal agency officials, to begin privatizing the program. Landsats 4, 5, and 6 (the last failed to reach orbit in 1993) were essentially run by contractors in an arrangement that proved disastrous for researchers. The companies “decided to collect not very much data, and to charge a whole lot for it,” says BU's Woodcock. The rising costs of imagery “basically killed the [research] program” for years, he says.

    That and other “truly ugly” privatization problems, Lauer says, prompted Congress to reverse course and make Landsat 7 a government project. But in another confusing move, legislators decreed that, after Landsat 7 died, the government should strive to obtain new earth images from private sources. As a result, NASA is looking for companies that can match the quality and coverage of Landsat 7 at an affordable price, and at least one has responded affirmatively.

    “We can maintain Landsat continuity,” asserts Tom Koger, an executive with Resource 21, a Boeing-backed satellite effort. The company wants to launch four craft that would provide farmers with information about everything from soil moisture to weed growth, helping them pinpoint where to fertilize, spray, or irrigate. Resource 21's data “will be very similar to Landsat's,” he says.

    But some researchers are skeptical of such assurances. One problem, they say, is that private firms have little motivation to build the expensive sensors that collect data in all of the wavelengths covered by Landsat 7, as most customers can get by with less. Another is that the government may be unwilling to buy the global coverage—amounting to some 250 scenes per day—that Landsat now provides. Goddard's Darryl Williams, chief Landsat scientist, wonders whether the private sector will bring “the same passion and concern for detail” to the project. NASA engineers, for instance, delayed Landsat 7 for nearly a year to improve balky diodes that lowered data quality. A company focused on profits would be unlikely to do the same, he says.

    Finally, Williams and others doubt that the market for moderate-resolution satellite images is big enough for Resource 21, or any company, to make a profit. “Landsat is not commercially viable for cost recovery,” says Williams, noting that other countries subsidize their moderate-resolution imagers. In light of such issues, “why is the government-owned option the leper here?” he asks. But Koger says his company isn't counting on imagery sales to make a profit and that any government contract to continue the Landsat archive would be a bonus. “The Landsat heritage is very important to us,” he says.

    One public alternative to a U.S.-built satellite, says USGS's Lauer, is an international consortium. Nearly a dozen nations already fly or are planning to launch moderate-resolution imagers, he notes. “If they could agree on a common set of goals,” Lauer says, they could save money and researchers could be assured of a steady flow of data.

    NASA officials are expected to decide on a strategy by early next year, meaning that Landsat's backers must move quickly to preserve the program in anything like its current form. That timetable could pose an obstacle to researchers, says Woodcock, given the diverging interests of the satellite's users. “In some ways our strength is also our weakness,” he says. “Where out of all that diversity do you find a common voice?”

    *Viewing the Earth: The Role of Satellite Earth Observations and Global Monitoring in International Affairs, George Washington University, 6 to 7 June.


    China Awakens to Fight Projected AIDS Crisis

    1. Dennis Normile*
    1. With reporting by Justin Wang and Li Hui of China Features.

    An increase in drug use and a boom in commercial sex have led China to the brink of an AIDS explosion. But its historic isolation also gives the country an advantage in testing the latest vaccines

    Beijing—China is poised to become the next AIDS battleground. The country has so far escaped the global HIV/AIDS onslaught: The official tallies count only 670 confirmed AIDS cases and 18,143 confirmed HIV-infected people among the 1.2 billion population. But by all indications, the epidemic is about to sweep through the world's most populous nation with a vengeance. The real number of infected people probably tops 500,000, according to China's National Center for AIDS Prevention and Control (NCAIDS), an estimate that has risen fivefold since 1996 based on increasing intravenous drug use, changing sexual mores, and a burgeoning commercial sex industry. If current trends continue, NCAIDS projects that China could have 10 million HIV-infected people by 2010.

    “If China does not take effective measures, AIDS will become a national disaster,” virologist Zeng Yi, former head of the Chinese Academy of Preventive Medicine in Beijing, warned in a recent report to the Chinese Academy of Sciences. The cost of treating AIDS patients and the lost productivity, he says, could “ruin the economic gains China has made” since the mid-1980s.

    One ray of sunlight in this dark picture is that China may be one of the best places to test AIDS vaccines. Researchers hope that getting an early start, before different HIV strains commingle, will make it easier to find concentrated populations suitable for testing vaccines aimed at specific subtypes of the virus. And China's well-developed public health infrastructure could help facilitate those trials.


    In addition, Zeng says he's “encouraged” by the recent actions of national leaders, who he says were initially slow to react. The 2-year-old NCAIDS, part of the Academy of Preventive Medicine and funded by the Ministry of Health, has expanded the academy's activities by supporting epidemiological studies and a behavioral intervention unit. It has also been promised a new building, and the state-controlled media have given high-profile coverage to the topic. China's increased activity not only benefits its own population, notes Wayne Koff, vice president for research and development at the International AIDS Vaccine Initiative (IAVI) in New York City, but “it also will complement ongoing [global] efforts by IAVI and others.”

    Yet the overall level of spending in China remains woefully low. China's national government now spends just 15 million yuan ($1.75 million) on HIV/AIDS programs, although officials hope that number will quadruple in the next fiscal year. Provincial and local governments also spend a small but unknown amount. And although other countries and international agencies such as the Joint United Nations Programme on HIV/AIDS and the World Bank are beginning to provide funds, their support only scratches the surface.

    Even a bigger budget can't overcome some structural problems, however. Wu Zunyou, NCAIDS's director of behavioral intervention, says he has not been able to establish needle-exchange programs for IV drug users or distribute condoms to commercial sex workers because the government views such efforts as condoning illegal activity. And many local officials are still unaware of the need to act. “Even with a national mandate, without local support it's very difficult to carry out effective intervention,” says Wu.

    AIDS education efforts are also hamstrung by the fact that most public health workers are hired and paid by local governments. And although many localities are actively supporting HIV/AIDS prevention activities, others are not. Zeng says some regional officials are probably underreporting cases to cover up lax supervision of such things as blood collection and to avoid scaring off foreign investors. Wu adds that police in some areas still arrest women carrying condoms on suspicion of prostitution, despite national campaigns that emphasize their legitimate public health role. As a result, Wu says, education efforts must be aimed at local officials as well as the general population. “But progress is not quick enough to match the speed of the epidemic,” he says.

    Opening the gates

    Zeng says China's leaders initially were lulled into complacency by the low number of actual AIDS cases and what was, until recently, a low estimated rate of infection. “The problem looked very small, given the 1.2 billion population,” he says. The HIV/AIDS problem came late to China because of its limited contacts with other countries, the government's strict enforcement of antidrug and antiprostitution laws, and culturally conservative attitudes about sex in general.

    But the accelerating economic reforms and increasing contacts with the rest of the world have opened up the same HIV infection routes in China that have plagued other countries. First among these is growing intravenous drug use, particularly along China's southern and western borders, and the sharing of needles. China's State Antidrug Commission estimated in 1999 that the country had 680,000 drug users, a 14% increase over the previous year. About 72% of the 18,000 confirmed HIV-infected people are IV drug users (see pie chart).

    More worrisome to public health officials is an increasingly sexually active population, at least as measured by the rise in commercial sex and sexually transmitted diseases (STDs). Although heterosexual transmission accounts for only 6.7% of HIV infections, the number of reported STDs has doubled in the last 2 years, to more than 800,000 cases in 1999. Qu Shuquan, deputy director of epidemiology at NCAIDS, says that the actual number of new cases is probably three to five times higher because the country's surveillance network doesn't cover private clinics.

    The return of STDs is a dramatic turnabout in a country where such health problems were rarely seen in the 1960s. Wu says today's Chinese are throwing off old sexual taboos. “Having multiple sex partners [over the course of a lifetime] is not necessarily considered immoral anymore,” he says. And migrant workers are compounding the problem. The collapse of state-owned enterprises has forced rural men to seek work in the coastal cities, where the commercial sex trade is flourishing.

    Finally, China seems to be sitting on an HIV time bomb triggered by tainted blood. Much of the problem can be traced to poor, rural Chinese who sold blood or plasma to commercial collection centers. A 1999 study in a small village in Henan Province found that 80 of 140 residents who had sold blood or plasma had contracted HIV, presumably through inadequate sterilization procedures at a commercial blood collection center. They are not included in the official tally of persons who contracted HIV through blood transfusions, however, because that category includes only recipients of tainted blood.

    In 1998 the government banned paid blood “donations,” closed a number of commercial blood collection centers that didn't meet safety standards, and introduced screening. But regional newspapers have reported that many donors continue to be paid because of a dearth of volunteers.

    Local trials

    While public health officials struggle with preventive strategies, researchers are emphasizing the need to boost the country's basic epidemiological and biomedical research capabilities. Shao Yiming, a virologist and deputy director of NCAIDS, says that biomedical research now receives about 5% of the government's HIV/AIDS budget, a proportion he hopes will be maintained as the total budget rises. He also hopes to convince the Ministry of Science and Technology to start funding HIV/AIDS-related research that would complement work supported by the Ministry of Health.

    Another goal is to steer at least a small portion of the international aid now going into education and prevention into research to build China's scientific capacity. Researchers here are particularly excited about a new U.S. National Institutes of Health grant program that will be open to HIV/AIDS researchers based outside the United States (Science, 2 June, p. 1563). Such grants will “help fill the gap” in funding for basic research in China, Shao says. He says HIV/AIDS efforts in other developing countries have suffered when nonresident researchers leave at the end of their project, so it is particularly important for Chinese scientists to be involved in developing vaccines and treatments that are affordable. “We do not want to see a vaccine developed that is just a rich-country vaccine,” he says.

    China's indigenous research capabilities and the state of its HIV/AIDS epidemic, in fact, help make the country a promising location for vaccine trials. For one, Shao notes that the incidence of new infections in much of the developed world has plateaued. “If you try a vaccine in a region with low incidence of new infections, you need a huge cohort to show statistically significant efficacy,” he says. With new infections still rising in China, vaccine trials could be based on smaller cohorts and yield quicker results, he says.
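Shao's point about cohort size follows from standard trial statistics: the rarer new infections are, the more participants are needed to detect a given vaccine efficacy. The sketch below (not from the article; the incidence and efficacy figures are illustrative assumptions) uses the textbook two-proportion sample-size formula to show the effect:

```python
# Illustrative power calculation: approximate per-arm cohort size for a
# two-arm vaccine efficacy trial, via the normal-approximation formula
# for comparing two proportions. All rates here are hypothetical.
from math import sqrt

def per_arm_size(incidence, efficacy, z_alpha=1.96, z_beta=0.84):
    """Participants per arm to detect `efficacy` (relative reduction in
    infection risk) when the placebo-arm infection rate over the trial
    period is `incidence`; defaults give alpha=0.05 (two-sided), 80% power."""
    p1 = incidence                   # infection probability, placebo arm
    p2 = incidence * (1 - efficacy)  # infection probability, vaccine arm
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return numerator / (p1 - p2) ** 2

# Lower incidence sharply inflates the required cohort for the same
# 50% efficacy target:
for rate in (0.04, 0.01, 0.002):
    print(f"incidence {rate:.1%}: ~{per_arm_size(rate, 0.5):,.0f} per arm")
```

At a 4% incidence the formula calls for roughly a thousand participants per arm, but at 0.2% it demands tens of thousands, which is why regions with rising infection rates can run smaller, faster trials.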

    Hans Wolf, a virologist at the University of Regensburg in Germany, who is working with NCAIDS on an upcoming trial in Xinjiang, adds that each subtype of HIV is likely to require its own vaccine, much as influenza vaccines must be updated for each new outbreak. Other researchers are hopeful that a single vaccine will eventually prove to work against different subtypes, or clades, of the virus. But to reduce the variables involved, many HIV vaccines now in the works target a particular clade and will be tested in regions where that clade predominates.

    Xiao-Fang Yu, a molecular biologist at Johns Hopkins University in Baltimore, says that variation even within clades is less in China than in other countries. In 4 years of studies of the B/C recombinant strain of HIV that is common in Guangxi Province, samples from five regions have turned up just 1% to 2% variation in the genetic sequence of the HIV envelope protein. In contrast, he says, there can be as much as 15% to 20% variation in samples taken from just two patients in Baltimore. “If [developing] a vaccine depends on the homogeneity of the virus, then China will be an ideal place for that vaccine trial,” he says. The homogeneity of the virus in China “at least allows you to test whether some of the [vaccine] strategies have any efficacy under the most ideal situation.”

    A final factor is what David Ho, director of the Aaron Diamond AIDS Research Center of Rockefeller University in New York City, calls China's “reasonable infrastructure.” A well-established hygiene and epidemic surveillance network that has defined the number and distribution of the different HIV strains also can support the kind of cohort studies required for vaccine trials. After 3 years of preparatory effort, Yu's group just recently set up China's first HIV/AIDS cohort study based on 700 IV drug users in Guangxi Province. “This will be very significant in determining if the location is suitable for phase III efficacy trials,” he says.

    China's epidemiological and infrastructure advantages have attracted several groups (see table), all of which are planning to test vaccines that contain HIV genes stitched into a stretch of DNA called a plasmid. Although Wolf says he would prefer to see greater variety, IAVI's Koff notes that these tests will complement trials elsewhere of other types of vaccines.

    Although Zeng supports vaccine development, he says that its long time horizon demonstrates the need for other, short-term strategies to combat HIV/AIDS. “Maybe within 10 years we'll have a vaccine,” Zeng says. But until then, “the most urgent [need] is for nationwide education and intervention.”
