News this Week

Science  15 Feb 2008:
Vol. 319, Issue 5865, pp. 884

    Deaths in Diabetes Trial Challenge a Long-Held Theory

    1. Jennifer Couzin

    Medical treatment is often anchored in conjecture and instinct, and every so often a rigorous study proves common wisdom wrong. That happened last week when the U.S. National Heart, Lung, and Blood Institute (NHLBI) in Bethesda, Maryland, suspended a major component of a 10,000-person clinical trial that examined whether strong measures to bring blood sugar down to normal levels could prevent heart attacks and strokes in people with diabetes. There were more deaths, including fatal heart attacks, in the intensive-treatment group than among those who received standard care and had higher glucose levels. “We had every reason to believe this intervention would help,” says Denise Simons-Morton, who oversaw the trial at NHLBI. Instead, the study screeched to a halt 18 months earlier than planned.


    A glucose-reduction trial was halted when researchers found that volunteers receiving aggressive therapy were more likely to die.


    What went wrong? Researchers aren't sure—in fact, they're not even sure that bringing about a steep decline in blood sugar is harmful over the long term. An odd but consistent pattern in a handful of studies designed to control diabetes suggests that aggressive treatment worsens complications in the short term but reduces them as time passes. Indeed, a small study published last week in the New England Journal of Medicine (NEJM) reported that aggressive glucose, cholesterol, and blood pressure control led to slightly more deaths among diabetics after 4 years—the same period as the halted NHLBI trial—but after 13 years, the intensively treated group was better off.

    The NHLBI study aspired to something unprecedented. Even when doctors and patients try to reduce high glucose levels in diabetes, rarely do they bring glucose down to normal levels—a goal considered more trouble than it's worth. Now, scientists realize, it may also be dangerous for some people with higher-than-normal blood sugar.

    The halted trial, Action to Control Cardiovascular Risk in Diabetes (ACCORD), focused on older people with diabetes at high risk of dying from heart disease. Investigators in the United States and Canada recruited 10,251 people with type 2 diabetes who fit this description. Blood sugar was assessed by hemoglobin A1c, which reflects the amount of glucose bound to hemoglobin in red blood cells. A healthy A1c is less than 6%.

    After nearly 4 years, 257 people in intensive treatment, whose A1c averaged 6.4%, had died, compared with 203 in standard care, whose A1c was on average 7.5%. This translates to a death rate of 1.4% per year in the intensive group and 1.1% per year in the standard-care group. The deaths had various causes—surgical complications, sepsis, strokes. But many were heart attacks. NHLBI has declined to release the number of cardiovascular deaths until the findings are published, saying only that it was higher in the intensive-treatment cohort. Puzzlingly, however, nonfatal heart attacks were about 10% less common. Is this because those people had a lower chance of surviving their heart attacks and were dying instead? “That's an interesting conjecture,” says Simons-Morton. “Don't put those words in my mouth.”
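    The quoted death rates can be roughly reproduced from the enrollment and death counts. This back-of-the-envelope check assumes the two arms were about equal in size and that average follow-up was about 3.5 years—neither figure is stated in the article:

    ```python
    # Rough check of the ACCORD death rates quoted above.
    # Assumptions (not given in the article): ~equal arm sizes and
    # ~3.5 years of average follow-up per participant.
    total_enrolled = 10_251
    per_arm = total_enrolled / 2          # ~5,125 participants per arm
    follow_up_years = 3.5                 # "nearly 4 years"

    deaths_intensive = 257
    deaths_standard = 203

    rate_intensive = deaths_intensive / (per_arm * follow_up_years)
    rate_standard = deaths_standard / (per_arm * follow_up_years)

    print(f"intensive: {rate_intensive:.1%} per year")  # ~1.4% per year
    print(f"standard:  {rate_standard:.1%} per year")   # ~1.1% per year
    ```

    Both values land on the rates reported in the story, suggesting the published figures are simple annualized crude rates rather than anything model-adjusted.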

    Although endocrinologists have believed for years that the lower the glucose level, the better off the patient, there have been hints that the relationship is more complicated. In the 1990s, a Department of Veterans Affairs (VA) study found that 32% of volunteers in an intensive glucose-reduction arm suffered cardiovascular events, compared with 20% in the standard-care arm. With only 153 men participating, though, the study wasn't big enough to estimate optimum glucose levels, and VA researchers launched a much larger study of 1792 patients. Results will be unveiled in June at the annual meeting of the American Diabetes Association. In a conference call last Friday, the group monitoring the VA study agreed to continue it as is.

    At the University of Pittsburgh in Pennsylvania, epidemiologist Trevor Orchard has been a minority voice arguing against pushing blood sugar in diabetics down to normal levels, especially if cardiovascular risk is significant. “You get increased atherosclerosis in diabetes, but—and this is the key—it's perhaps more likely to be stable and less likely to rupture” and cause a heart attack, he says. That's because the plaques include extra sugar that toughens them. Postmortems of people with diabetes back this up, he says, as do studies, including the 20-year UK Prospective Diabetes Study, that looked for a relationship between A1c and cardiovascular problems in type 2 diabetes and found little or none.

    One theoretical hazard of ACCORD, says Orchard, is that “by dramatically reducing blood glucose levels,” the investigators “have effectively removed a stabilizing influence on these plaques.” There's no question that lowering blood sugar is essential in diabetes, he emphasizes—but going below an A1c of 7% may be risky for precisely the kind of people in ACCORD, those with heart disease or at high risk for it. In younger people who haven't had diabetes as long and whose hearts are healthy, the consequences of normal glucose may be different.

    But dramatically reducing glucose could have other effects, too. “We had to use every possible medication out there” to get blood sugar down, says endocrinologist Vivian Fonseca of Tulane University in New Orleans, who participated in ACCORD. One treatment, insulin, often causes periods of low blood sugar, which can increase heart rate and have other untoward effects on the cardiovascular system.

    Finally, there's the unanswerable question of what would have happened had the study continued. Although Fonseca supported stopping it, he also believes that “when you intervene in a high-risk population, you may transiently worsen things” before they improve. A small study he conducted in the 1980s, he says, showed that trying to control diabetes initially led to more pain from nerve damage in patients before the pain subsided; scientists have made similar observations with eye damage associated with diabetes.

    “We do not know the molecular mechanisms behind” this phenomenon, says endocrinologist Oluf Pedersen of the Steno Diabetes Center in Gentofte, Denmark, who recorded it himself in his study published last week in NEJM.

    Still, says Pedersen, the ACCORD study was perhaps too ambitious. “Many of us without diabetes do not have an A1c of 6,” he says of ACCORD's target. “My A1c is 6.3, … [and] I'm superhealthy.”


    Alien Planetary System Looks a Lot Like Home

    1. Richard A. Kerr

    After finding nearly 250 alien-looking extrasolar planets, astronomers using a powerful new observing technique have spotted a planetary system—a star and two giant planets—that bears a striking resemblance to our own solar system. “This bodes well for there being a larger number of Earth-like planets” than previous observations had suggested, says planetary dynamicist Jack Lissauer of NASA's Ames Research Center in Mountain View, California. And it “bodes well for astrobiology,” too.

    The new technique, known as gravitational microlensing, involves a star's gravity bending the light of a more distant star behind it the way a magnifying lens bends light, explains astronomer Scott Gaudi of Ohio State University in Columbus. Gaudi heads an aggregation of four collaborations—OGLE, μFUN, PLANET, and RoboNet—consisting of 69 colleagues from 11 countries, including some amateur astronomers. They used telescopes at 11 observatories spread around the Southern Hemisphere to watch continuously as microlensing brightened a star from late March well into April of 2006. As a star slowly drifted in front of a more distant one, observers recorded brief brightenings superimposed on the weeks-long brightening and dimming of the merged image of the two stars.

    Two matches.

    Scaled to the size of their star, two newly discovered extrasolar planets strongly resemble Jupiter and Saturn.


    Calculating from the number, timing, and magnitudes of the subsidiary peaks, the group reports on page 927 the discovery of two planets orbiting the nearer star. The star has only half the mass of the sun, they calculate. The inner planet has a mass about 71% that of our Jupiter and lies about 2.3 times farther from its star than Earth is from the sun (2.3 astronomical units, or 2.3 AU); Jupiter is 5.2 AU from the sun. The outer planet is about 90% as massive as Saturn and lies 4.6 AU from the star, whereas Saturn is 9.5 AU from the sun.

    The result, the group writes, is a planetary system that “bears a remarkable similarity to our own solar system,” only scaled down. The ratio of the planets' masses and the ratio of their distances from the star are similar to those of Jupiter and Saturn. The ratio of the larger planet's mass to that of the star is close to the Jupiter-sun ratio. Even the warmth that the dimmer but nearer star sheds on the planets is similar to what the sun sheds on Jupiter and Saturn.
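    The "scaled-down solar system" comparison can be checked directly from the reported values. The sketch below uses two standard reference values not given in the article—Saturn's mass as roughly 0.30 Jupiter masses and the Jupiter-sun mass ratio as roughly 1/1047:

    ```python
    # Checking the "scaled-down solar system" claims from the reported numbers.
    # SATURN_IN_JUPITERS and JUPITER_SUN_RATIO are standard reference values,
    # not figures from the article.
    SATURN_IN_JUPITERS = 0.30
    JUPITER_SUN_RATIO = 1 / 1047

    # Reported properties of the new system
    inner_mass_jup = 0.71                        # inner planet, in Jupiter masses
    outer_mass_jup = 0.90 * SATURN_IN_JUPITERS   # "90% as massive as Saturn"
    star_mass_suns = 0.5

    # Distance ratios: new pair vs. Saturn/Jupiter
    print(4.6 / 2.3)   # 2.0 for the new pair
    print(9.5 / 5.2)   # ~1.83 for Saturn/Jupiter

    # Mass ratios: inner/outer planet vs. Jupiter/Saturn
    print(inner_mass_jup / outer_mass_jup)   # ~2.6 for the new pair
    print(1 / SATURN_IN_JUPITERS)            # ~3.3 for Jupiter/Saturn

    # Planet-to-star mass ratio vs. the Jupiter-sun ratio
    print(inner_mass_jup * JUPITER_SUN_RATIO / star_mass_suns)  # ~1.4e-3
    print(JUPITER_SUN_RATIO)                                    # ~9.6e-4
    ```

    The ratios agree to within a factor of about 1.5 in each case—"remarkable similarity" in the loose sense the authors intend, not an exact match.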

    The resemblance isn't perfect, Gaudi says, but it certainly beats the competition. Apart from a few lone exo-Jupiters, most known extrasolar planets are too massive or too close to their stars to be reminiscent of anything in the solar system (Science, 21 June 2002, p. 2124). Microlensing is currently the only way to detect Saturn-mass planets at Saturn-like distances from their stars, and this microlensing detection of a giant planet is the first in which astronomers could have detected a second, Saturn-like planet. “That suggests these things might be quite common,” says Gaudi.

    Other researchers agree. “This is a really neat story,” says astrophysicist Alan Boss of the Carnegie Institution of Washington's Department of Terrestrial Magnetism in Washington, D.C. Given the striking resemblance to the solar system, “there's lots of room” between the star and the inner detectable planet for an Earth-like planet or two nestled in life's environmental comfort zone, he says. And microlensing is potentially capable of detecting analogs of all the solar system planets except Mercury.


    Biodefense Watchdog Project Folds, Leaving a Void

    1. Jocelyn Kaiser

    An activist who has been both loathed and lauded for his criticism of safety at the United States's booming biodefense labs is closing his doors. Edward Hammond, director of the Austin, Texas-based watchdog group called the Sunshine Project, earlier this month posted a note on his Web site saying he is suspending operations. For 8 years, he has survived on a shoestring budget, he says, and he has had enough.


    Ed Hammond's digging into safety at biodefense labs found problems.


    The news may come as a relief to microbiologists and university officials who have been subjected to Hammond's relentless probing. But even some of those scientists say Hammond has had a positive influence. Virologist C. J. Peters of the University of Texas Medical Branch in Galveston says that although Hammond was a “pest” who often exaggerated risks to the public, his work has “made the community more careful” about biosafety. “I think the country works best with watchdogs,” he says. “I am, strangely, sad to see him go.”

    Hammond's causes included destroying smallpox stocks and sharing flu strains, but his greatest impact may be his scrutiny of the U.S. biodefense labs that sprang up after the 2001 anthrax attacks. He filed open-records requests for the minutes of nearly 400 institutional boards that oversee safety at biology labs (Science, 6 August 2004, p. 768). This revealed that some met infrequently, if at all. And last summer, after Hammond uncovered an unreported infection and other safety violations at Texas A&M University in College Station, federal authorities suspended the lab's biodefense research. A few months later, Congress held a hearing on safety at biodefense labs and called for stricter oversight.

    Despite these accomplishments, Hammond says he had had enough of a “totally consuming” job on a budget “well under $100,000 a year” cobbled together from small foundation grants. (The Sunshine Project's other staffer in Germany was part-time, he says.) Last year's fund-raising didn't go particularly well, he says, and “I hit my breaking point.” He says he hasn't figured out what he'll do next; for now, he's planning to spend a couple of years in Bogotá, with his wife, a Colombia-born attorney, and young daughter.

    Some observers, meanwhile, are lamenting the Sunshine Project's demise. “[Hammond] called attention to very real problems in the way that biosecurity has been funded and research reviewed,” says Gigi Kwik Gronvall of the Center for Biosecurity of the University of Pittsburgh Medical Center in Baltimore, Maryland. “There's no one else I know of that will look over at that level of detail and keep things transparent.”


    U.S. Prepares to Launch Flawed Satellite

    1. Eli Kintisch

    The U.S. government is planning to fly an Earth-observing satellite that will be essentially colorblind, at least as far as the oceans are concerned.

    The NPP satellite is a prototype for the $12 billion National Polar-orbiting Operational Environmental Satellite System (NPOESS), a series of satellites to be launched between 2013 and 2022. In 2006, scientists learned that there was a problem with the filter on one of NPP's instruments, VIIRS, which will prevent it from accurately measuring ocean color, a window into how sea life responds to and contributes to fluctuating atmospheric and ocean carbon. But officials with NASA and the National Oceanic and Atmospheric Administration (NOAA) rebuffed a request last fall from scientists to correct the problem out of concern that working on the filter could jeopardize the other 21 parameters that VIIRS measures, including clouds, the extent of ice sheets, and forest cover. Last month, NOAA announced that a glitch in the system to regulate the satellite's temperature would push back the mission by 8 months, to 2010, prompting scientists to ask the government to use the delay to at least conduct a risk-benefit analysis of correcting the filter problem.

    Last week, NOAA officials told Science that they have no plans to do such an analysis or to alter the timetable to accommodate any changes to VIIRS's filters. Instead, says NPOESS manager Dan Stockton, the agency will try to fix the ocean-color problem “for the first NPOESS” mission set to launch in 2013.

    Bloomin' brilliant.

    NASA's 9-year-old Terra is living on borrowed time but still provides good data for studies of plankton blooms.


    Scientists fear that the flaws in NPP will disrupt the longitudinal record on ocean color, because two experimental NASA craft now providing good data are near or beyond their 5-year design life. Color data from a European satellite have been difficult to use, and an Indian sensor has yet to be launched, they add. “Are we going to [make it] to NPOESS [in 2012]? I don't think so,” says David Siegel, a marine scientist at the University of California, Santa Barbara.

    But Art Charo, who follows the issue for the U.S. National Academies' National Research Council, points out that the ocean-color community is at least better off than other climate specialties that had instruments removed from NPOESS in 2006 and don't know whether they will be restored (Science, 31 August 2007, p. 1167). “With such a tight budget environment, it's a question of triage,” he says.


    Senate Bill Would Scale Up Forest Restoration

    1. Erik Stokstad

    A new bill emphasizes collaborations to reduce fire risk, using tools such as prescribed burns.


    Focus on the whole forest and think big. That's the intent of a bill, introduced in the U.S. Senate last week, that would direct the Forest Service to fund large, collaborative projects to reduce fire risk, improve forest health, and stimulate economic development. “It's going to be more holistic, and Lord knows we need it,” says Jerry Franklin of the University of Washington, Seattle. Although they praise the bill, scientists and environmentalists say there is still room for improvement.

    Fires have taken an ever-larger toll on forests, communities, and the Forest Service budget. The Healthy Forests Restoration Act of 2003 was designed to lessen the risk of conflagrations by expediting projects to thin forests and clear out flammable undergrowth. But the projects have typically been small and picked in a scattershot fashion, says Laura McCarthy of The Nature Conservancy in Santa Fe, New Mexico. Environmentalists have also objected to the logging of old-growth trees, revenues from which helped fund the projects.

    Under the new bill, S. 2593, the Forest Service would solicit proposals from collaborations involving regional Forest Service staff, local groups, and nongovernmental organizations. Each project would encompass at least 20,000 hectares, although only part of the landscape might be treated. In addition to lessening fire risk, the 10-year projects should benefit the ecosystems by improving fish and wildlife habitat, for example, and clearing out invasive species. Another goal is to stimulate local economies by selling the small wood removed from forests to sawmills. With the guidance of a new science advisory board, the secretary of the U.S. Department of Agriculture would pick up to 10 such projects a year.

    To help pay for the work, the bill authorizes $40 million a year for 10 years. That amount is equivalent to recent increases to the service's “hazardous fuels” reduction program, which has a budget of $320 million in fiscal year 2008. These funds would have to be matched by the partners in the collaboration. The bill calls for 15 years of monitoring for social, economic, and ecological impacts, but observers note that monitoring is often the first part of a project budget to be cut. “That's one of the things that everybody gives lip service to,” says Rick Brown of Defenders of Wildlife in Washington, D.C. “This bill moves us toward thinking that monitoring is part of the job.”

    The bill isn't perfect, supporters say. Wally Covington of Northern Arizona University in Flagstaff thinks the projects should be at least 40,000 hectares to make ecological planning as strategic as possible. Covington also cautions that smaller areas may not provide enough wood to support local mills and bioenergy plants. Randi Spivak of the American Lands Alliance, an advocacy group in Washington, D.C., and others would like to see specific protections for old-growth trees, as well as tighter constraints on road building in project areas.

    The bipartisan bill has a powerful array of sponsors. It was introduced by Jeff Bingaman (D-NM), who chairs the Energy and Natural Resources Committee, and the ranking minority member, Pete Domenici (R-NM). Co-authors include the chair of the relevant appropriations subcommittee, which means there's a shot at actually funding the measure. A companion bill, H.R. 5263, was introduced in the House last week, albeit with less powerful backers. Hearings are planned for this spring.


    Back-to-Basics Push as HIV Prevention Struggles

    1. Jon Cohen

    BOSTON—At the big annual AIDS conference held in the United States, new drug studies once dominated the agenda. But last week at the 15th Conference on Retroviruses and Opportunistic Infections (CROI), treatment took a back seat to prevention. Many powerful anti-HIV drugs now exist, but few attempts to obstruct HIV infection have succeeded. Results presented at CROI, which ran from 3 to 6 February, continued the string of bad news and prompted much soul-searching about how to invigorate the ailing vaccine search. A few sessions did, however, relieve some of the gloom with reports on new ways to stop HIV's spread from mother to child and new insights into how HIV causes an infection and destroys the immune system.

    Vaccine researcher Ronald Desrosiers, head of the New England Primate Research Center in Southborough, Massachusetts, sparked debate by criticizing the funding priorities at the U.S. National Institutes of Health (NIH) in Bethesda, Maryland. NIH devotes nearly one-third of the roughly $600 million it spends annually on AIDS vaccine research to developing and testing products in humans, yet, Desrosiers asserted, no product now under development has “any reasonable hope of being effective.” “Has NIH lost its way in the vaccine arena?” asked Desrosiers, who argued for more basic research. “I think it has.” (ScienceNOW, 5 February.)

    The beleaguered AIDS vaccine field took a serious hit last September, when researchers halted a clinical trial of a promising AIDS vaccine after an interim analysis revealed that it offered no protection against HIV. More disconcerting still, some evidence suggests that preexisting antibodies against an adenovirus strain, Ad5, used in the Merck and Co. vaccine to carry HIV genes, may somehow have made people more susceptible to the AIDS virus (Science, 16 November 2007, p. 1048). Data from Susan Buchbinder, an epidemiologist at the San Francisco Department of Public Health in California and co-chair of the study, offered some reassurance that the vaccine did not cause harm. Circumcision protects men from HIV, and uncircumcised men with high levels of Ad5 antibodies appear to have become infected more readily. “The effect of circumcision seemed at least as strong if not stronger than Ad5 [antibodies],” said Buchbinder. Although it's difficult to unravel cause and effect in post-hoc analyses, Buchbinder said: “I don't think at the end of the day that Ad5 was associated with increased infection.”

    In another blow to the prevention field, Connie Celum of the University of Washington, Seattle, revealed unexpected results from a study aimed at reducing susceptibility to HIV infection by treating preexisting infection with herpes simplex virus-2 (HSV-2). Infection with HSV-2, which causes genital ulcers, makes a person two to three times more vulnerable to HIV infection through sex. In a multi-country study involving more than 3000 people, Celum found that treatment with the anti-HSV-2 drug acyclovir did not reduce HIV transmission. Over the course of 18 months, 75 people who received acyclovir became infected with HIV versus 64 who received a placebo. “This is a surprising, disappointing, and important result,” said Celum. “Many people thought this was going to be a slam dunk.”

    Gutsy virus.

    Several studies suggest that the AIDS virus causes immune “activation” by destroying Th17 cells, a CD4 subset that resides in the gastrointestinal tract and secretes “defensins” that prevent bacteria from entering the bloodstream and lymph nodes.


    Celum said the problem wasn't linked to a failure to take acyclovir and that the treatment did reduce genital ulcers—although not as much as in earlier trials. That means intervention might work with a more powerful anti-HSV-2 drug or an effective HSV-2 vaccine, she said.

    On a more positive note, two studies of thousands of HIV-infected pregnant women in several developing countries showed for the first time that anti-HIV drugs given to their babies could prevent transmission of the virus through breast milk. “The data are very exciting, but there are caveats,” said Michael Thigpen of the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia, which sponsored one of the studies. Babies can develop resistance to the drugs, which can limit treatment options if they do become infected.

    Some intriguing data came from studies of a new type of immune actor, discovered just 2 years ago, called Th17 cells. HIV targets and destroys CD4 white blood cells; Th17 cells are a subset of CD4 cells that secrete interleukin-17. Three labs reported that in monkeys and humans, destruction of Th17 cells in the gut makes it “leaky,” allowing gut microbes, or pieces of them, to flood into the bloodstream. The researchers contend that this turns up the immune system, “activating” CD4 cells that then prematurely die or become targets for HIV themselves. In one study, said Barbara Cervasi, a postdoc in Guido Silvestri's lab at the University of Pennsylvania, Th17 cells were profoundly depleted in the gastrointestinal tracts of HIV-infected people and SIV-infected macaques—species that both develop AIDS—but not in SIV-infected sooty mangabeys that suffer no harm from that virus. “People assume that high [HIV or SIV levels] lead to activation,” said Silvestri. “What if it's the opposite and activation causes the problems?”

    In the final session, George Shaw of the University of Alabama, Birmingham, reported that his group had sequenced the HIV envelope gene in 102 recently infected people. HIV-infected people carry many genetic variants of the virus, but a single one established an infection and dominated in 80% of the subjects, Shaw and co-workers found. Although other studies have shown that a “bottleneck” occurs in sexual transmission of HIV, allowing few viruses to infect, this is the first study to clarify just how few. Four other new studies have had similar findings, said Shaw.

    Shaw, who hopes to discover and target HIV variants that are especially good at transmission, said this work is good news for vaccine researchers. “If all you've got to deal with is one virus,” said Shaw, “surely it shouldn't be so difficult to develop a vaccine.”


    Another Side to the Climate-Cloud Conundrum Finally Revealed

    1. Richard A. Kerr

    Clouds have always given climate modelers fits. The clouds in their models are crude at best, and in the real world, researchers struggle to understand how clouds are responding to—and perhaps magnifying—greenhouse warming. As a result, cloud behavior is the biggest single source of uncertainty in climate prediction. But two new studies now show that much of the worry about clouds' role in the warming has been misdirected. Clouds' response to global temperature changes may be much quicker and more direct—and thus easier to study—than experts have thought.

    “It's a little bit of good news,” says climate researcher Brian Soden of the University of Miami in Florida. “People have been working on [the cloud problem] for 2 decades or more, and we haven't done a lot to decrease the uncertainty. I'm a little more optimistic now about making progress on this problem.”

    Researchers have always considered the cloud problem a matter of feedbacks. In a positive feedback, increasing greenhouse gases warm the surface, and the warmer surface then feeds back somehow to overlying clouds. The nature of the feedback remains mysterious, but if it's positive, it would decrease global cloud cover. With fewer clouds reflecting solar energy back into space, more energy would reach Earth, amplifying the initial warming. But Earth's surface and especially its oceans are slow to warm, so cloud feedbacks operate over decades—or so scientists assumed.

    Two groups have recently looked at just how quickly model clouds actually respond to an increase in greenhouse gases. Climate researchers Jonathan Gregory and Mark Webb, both of the Hadley Centre for Climate Prediction and Research in Exeter, U.K., report in the January Journal of Climate (issue 1) that model clouds, at least, can respond quickly to added carbon dioxide—in months, not decades. In most of the models examined, the classic cloud feedback driven by change at the surface played only a minor role. The real action took place where the clouds themselves were, up in the air. Added carbon dioxide absorbs more long-wave energy radiating from the surface; the air holding that carbon dioxide warms, and clouds evaporate, letting more solar radiation in.

    At risk.

    Greenhouse gases can directly reduce cloud cover and magnify warming.


    In follow-up work in press in Geophysical Research Letters, climate researchers Timothy Andrews and Piers Forster, both of the University of Leeds, U.K., extend and refine the analysis of Gregory and Webb. In seven models, they doubled carbon dioxide while holding the global surface temperature constant and watched how atmospheric temperatures respond. The classic, slow cloud response is only half of previous estimates, they find, and most of the cloud response is fast.

    Scientists “have been looking at the incorrect part of the problem,” says Forster. Properly accounting for fast response is important when modeling rising temperatures under the strengthening greenhouse, Webb and Gregory argue. And because it is fast and therefore has been going on for decades, notes Gregory, researchers may be able to tease the newly appreciated cloud response out of observations and improve their models faster than they have in the past few decades.


    Wolves at the Door of a More Dangerous World

    1. Virginia Morell

    Weeks away from being removed from the endangered species list, wolves in the northern Rockies may soon be hunted once more.

    Three weeks ago, while tracking Yellowstone National Park's gray wolf (Canis lupus) packs from the air, wildlife biologist Douglas Smith darted wolf number 637, a young female from the Cougar Creek pack. Then, handling her on the ground for monitoring, he noticed that she had only three legs, one probably lost to a coyote trap outside the park's boundaries. Smith, leader of the park's wolf project, fears that 637's misfortune could be a harbinger of things to come, because gray wolves here are soon slated to be removed from the endangered species list. The new ruling from the U.S. Fish and Wildlife Service (USFWS) has been in the works for 5 years and is expected to be published at the end of this month in the Federal Register; it would go into effect 30 days later. Wolves on park grounds would still be protected, but “what will happen when they travel outside the boundaries?” asks Smith. “There's a good chance some are going to end up like this one, trapped or killed by hunters.”

    Smith isn't the only one worried about the future for wolves in the northern Rocky Mountains when they lose the protective shield of the federal Endangered Species Act. Yet at first glance, the announcement would seem cause for celebration. After all, wolves were intentionally driven to extinction in this region less than 100 years ago. Now, following successful reintroductions and management, their population hovers around 1500 animals.

    Pushing boundaries.

    Yellowstone's wolves don't stay inside the park, as these partial estimates of their movements show.


    But some of those who have worked to restore the wolf say the new ruling is like the proverbial wolf in sheep's clothing: It turns wolf management over to state and tribal agencies that plan to actively reduce the canid's numbers. The state management plans, already approved by USFWS, will allow trophy hunting and trapping of wolves, plus lethal control of those that harm livestock or eat too many deer and elk. Last year, Idaho Governor C. L. Otter promised to “bid for that first ticket [hunting tag] to shoot a wolf myself,” although he later said that Idaho would manage a viable wolf population. Most controversially, each state is required to maintain a population of only 100 wolves and 10 breeding pairs. That means wolf numbers could drop to a mere 300 and still be considered “recovered,” although most wolf watchers think a tally of 500-plus animals is more likely.

    So instead of popping champagne corks, as usually happens when a species is brought back from the brink, conservation groups are preparing legal briefs to challenge the ruling. They charge that it's based on politics, not science.

    But USFWS officials say they are convinced their science is sound. “That is what the law mandates,” says Edward Bangs, wolf recovery coordinator at USFWS in Helena, Montana, referring to the 1994 federal environmental impact statement that established the minimum numbers for recovery. “We've looked at every minute bit of science.” He adds that the wolf's biological resilience gives him the most hope for their continued success. “Every year, about 23% of the population is killed by people legally and illegally, and yet the wolves are still growing at 24% a year. Biologically, they couldn't be any easier. But politically, wolves are the most difficult to manage.”
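The arithmetic behind Bangs's resilience claim can be checked with a quick back-of-envelope sketch (illustrative only; the starting count and the compounding assumptions here are ours, not USFWS's):

```python
# Back-of-envelope check of the growth figures Bangs cites.
# Assumption: the 24% figure is net annual growth, already including
# the 23% human-caused mortality.

def project(n0, net_growth, years):
    """Project a population under constant net annual growth."""
    n = n0
    for _ in range(years):
        n *= 1 + net_growth
    return n

start = 1500  # approximate northern Rockies population cited in the article
print(round(project(start, 0.24, 5)))  # ~4397 wolves after 5 unchecked years

# Implied gross recruitment: the 77% of wolves that survive human
# killing must still multiply to 124% of the previous year's count.
gross_growth = 1.24 / (1 - 0.23) - 1
print(round(gross_growth, 2))  # ~0.61, i.e. roughly 61% gross increase
```

The implied gross recruitment of roughly 61% a year is what makes wolves, in Bangs's phrase, biologically easy: even heavy human-caused losses leave the population growing.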

    Hunted with passion

    Before Lewis and Clark, some 350,000 wolves inhabited the lower 48 states, preying on bison, deer, and elk, according to genetic studies. As pioneers decimated the bison, wolves turned to livestock, and settlers and the federal government fought back with guns and poison. Ironically, it was the job of USFWS to wipe out wolves. They succeeded by the 1930s, extirpating the canids from more than 95% of their historic range. “Wolves were hunted and killed with more passion than any other animal in U.S. history,” says a USFWS publication.

    Placed on the federal endangered species list in 1974, gray wolves began making a comeback in the 1980s, when a few Canadian wolves (the Canadian population may be as high as 60,000) crossed the border and settled in Montana. In the 1990s, USFWS brought 66 Canadian and 10 Montana wolves to Yellowstone and a separate area in Idaho. Ranchers, farmers, and hunters fought the restoration, but USFWS surveys showed that many Americans wanted this top predator back on the landscape. “For many people, wolves are the symbol of Yellowstone,” says Bangs. “They think that we should find a way to live with wolves,” although he adds that this idea is more prevalent among city dwellers who don't live near wolves.

    Top dog.

    Some hunters worry that wolves may compete with them for elk and deer.


    The reintroductions, which cost a total of $27 million over 33 years, have been hailed worldwide as great successes, particularly in Yellowstone, where the wolves are helping to bring back a more balanced ecosystem (Science, 27 July 2007, p. 438). They also serve as key subjects in a natural laboratory for scientists. Research has shown the ecological benefits of reintroduction, many scientists say: “The most trenchant message from conservation science in the last decade comes from studies about the role of top predators in maintaining the health of ecosystems,” says Michael Soulé, a professor emeritus at the University of California (UC), Santa Cruz.

    With abundant prey and open territory, the reintroduced wolves rocketed back, doubling their numbers in the first few years. Young wolves regularly disperse in neighboring states such as Utah and Oregon, although packs have not yet been established there. And although the wolves are currently considered an endangered species, USFWS is allowed to manage them, which includes killing or relocating them. The agency removes packs that have spread into problem areas and has killed about 700 wolves since 1987.

    Given the wolf's recovery, it's now time for the next step, says Bangs: removing wolves from the Endangered Species List. To gauge scientists' reactions to the delisting and the minimum population target, USFWS “surveyed 80 scientists around the world,” says Bangs. “Between 75% and 80% of them thought that this goal [of 300 wolves] was good enough, although I, personally, think it is too low. But the broad consensus was that this definition represents a minimum viable population.” Bangs adds that the “states have already committed to managing for more than the minimum, so that there will be a cushion” of about 45 breeding pairs and more than 450 wolves.

    That's still a reduction of about two-thirds of their numbers. Indeed, traces of earlier attitudes toward wolves linger. Many ranchers, farmers, and hunters despise the canids because they kill livestock and pets and compete for elk and deer. Posters put up by antiwolf groups label the wolf “The Saddam Hussein of the Animal World.” Terry Cleveland, director of the Wyoming Game and Fish Department, says that “state law requires us to have an aggressive management plan for wolves,” although he adds that this will include monitoring as well as hunting. Outside of the greater Yellowstone area, wolves will be classified as predatory animals. That means that, once delisted, they can be killed without a hunting license and by many methods, including intentionally running over them with a car or in “wolf-killing contests.” Cleveland says that “our floor wolf population here will be roughly 150 wolves. The ceiling has yet to be determined.”

    Idaho, too, plans a hunting season for its 700-some wolves, and populations will be thinned in areas of high conflict, says Steve Nadeau, a large carnivore manager for Idaho's Fish and Game Department. “But we're going to go slow and conservative to see how the harvest works.” In Montana, where about 400 wolves reside, the numbers are also certain to drop because the plan describes wolves as a “species in need of management.” Carolyn Sime, the wolf program coordinator for Montana's Fish, Wildlife, and Parks Department, says that “when there are at least 15 breeding pairs, hunting and trapping could occur.”

    The wildlife agencies insist they're not planning to send the canids back to the brink. “We manage big game for a living, and we're good at it,” says Nadeau. “We'll do a good job with the wolves, too. The whole world is watching, and we know it.”

    The states' plans to treat wolves as big game animals available for trophy hunting may actually end up helping the canids, suggests Bangs. He expects hunters will likely become some of wolves' staunchest supporters, “just as they are now for mountain lions and black bears.”

    Born to run.

    Reintroduced wolves are recolonizing their old territories.


    Battling over the numbers

    Despite Bangs's description of broad support for the delisting among the USFWS survey of scientists, many university scientists and conservation organization researchers interviewed by Science find the plan premature and unwise. In particular, they object to the notion that a population of 300 wolves is viable. “They don't even need a scientist to tell them that,” says Robert Wayne, an evolutionary biologist at UC Los Angeles, whose lab has reconstructed the genetic history of North America's gray wolves. In a letter he sent to USFWS last February in response to the service's request for his comments on the delisting proposal, Wayne wrote that the recovery goal “severely underestimates the number of wolves required for maintaining a genetically healthy, self-sustaining meta-population.” He also notes that the delisting proposal makes no effort to assure that the populations in the three states and Canada are interconnected via corridors so that the wolves can mix genetically and form a metapopulation. He and others argue that such a metapopulation was one of the goals of the original 1987 federal wolf recovery plan.

    The lack of gene flow most threatens the 171 wolves in Yellowstone National Park, which are all descendants of the first 41 released there between 1995 and 1997. Without new wolves, the population's genetic health is certain to decline, say Wayne and his graduate student Bridgett vonHoldt, who analyzed the genealogy and genetic viability of the Yellowstone wolves last year. They note that recent studies of a highly inbred population of Swedish wolves indicate that within 60 years, the Yellowstone wolves will begin suffering from “significant inbreeding depression,” which will lead to a lower population. “It will be the equivalent of having one less pup a year,” says Wayne.
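How a loss as small as “one less pup a year” compounds in an isolated population can be seen in a toy projection (a minimal sketch with assumed survival and litter figures, not the vonHoldt–Wayne analysis):

```python
# Illustrative sketch only: all rates below are assumed for the example,
# not taken from the Yellowstone genealogy study.

def simulate(packs, pups_per_pack, survival, years, inbreeding_penalty=0.0):
    """Project total population with fixed per-pack recruitment.
    `inbreeding_penalty` removes that many pups per pack per year."""
    total = 171.0  # Yellowstone's wolf count from the article
    for _ in range(years):
        recruits = packs * max(pups_per_pack - inbreeding_penalty, 0)
        total = total * survival + recruits
    return total

healthy = simulate(11, 4, 0.8, 20)
inbred = simulate(11, 4, 0.8, 20, inbreeding_penalty=1.0)
print(round(healthy - inbred))  # ~54 fewer wolves after 20 years
```

Under these assumed rates, losing one pup per pack per year leaves the population dozens of animals smaller within two decades, which is the kind of slow erosion Wayne and vonHoldt warn about.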

    But Bangs counters that the Endangered Species Act requires only that wolf numbers stay above the threatened or endangered level. “It isn't about maintaining genetic diversity,” he says. If inbreeding problems arise, new wolves can always be reintroduced to the park later. “Connectivity can happen through a ride in the back of a truck,” he says. That attitude dismays vonHoldt. “The impact is there on the horizon for anyone to see,” she says. “Why create a problem for others to solve down the line? Why not fix the recovery plan now?”

    “Basically, the goals of the USFWS's wolf recovery plan aren't in sync with the latest thinking in conservation science,” says Carlos Carroll, a wildlife biologist with the Klamath Center for Conservation Research in Orleans, California, who has modeled the restored wolf populations. “Biologists have moved away from the idea of a minimum viable population [MVP] to a more comprehensive population analysis.” The problem with MVP numbers, he adds, is that “wildlife managers focus solely on that number,” as they are in the three states. Instead, he and other researchers say that management plans need to include the “range of factors that might threaten a population and determine ways to make it more resilient to unexpected events,” such as a new disease. “That 300 figure reflects old thinking; new data suggest that several thousand wolves” may be needed before delisting should be considered, says Carroll. He and others note that USFWS delisted the Great Lakes gray wolves only last year in Michigan, Wisconsin, and Minnesota, when the population totaled 4000 individuals. (Although all three states now consider wolves as big game animals, none has yet initiated a hunting season.)

    And then there are the wolves of Yellowstone. Smith and others have monitored them for 13 years, collecting data that should help settle long-standing issues such as how great an impact wolves have on prey populations and how natural wolf populations fluctuate. None of the states' plans makes special provisions or buffer zones to protect these wolves; one of Montana's proposed wolf-hunting zones abuts the park's boundary. Six of the park's 11 wolf packs travel outside the park's boundaries every year (see map); and two of these six do so for extensive periods of time, largely in pursuit of elk, the wolves' main prey. “They'll get into trouble,” predicts Smith. “I support delisting. But [this] concerns me, because the parks' mission is one of protection and preservation. And we will most certainly lose some of our wolves.”

    State wildlife managers make no promises on this issue, saying that wolves in their territory are fair game. “The Yellowstone wolves will be treated the same as elk that also travel outside of the park and are hunted,” says Sime. Counters Smith, “These are park wolves; most spend 99.9% of their time here, yet they may get killed on that one trip outside. The public knows them as individuals. Which state official is going to take the call when someone's favorite wolf is shot?” Further, the loss of park wolves to hunters will “squander our research.”

    Many scientists would prefer to see the wolves remain on the endangered list until they reach a point at which they can be self-sustaining without the need for heavy human management. “It's frustrating,” says Sylvia Fallon, an ecologist with the Natural Resources Defense Council in Washington, D.C. “Having a natural population of wolves is achievable and sustainable, and we're close to being there. But now, they're going to be knocked back down. We have to stop the delisting.”

    Environmental organizations are already running ads decrying the planned delisting and have joined forces to ask for an injunction against USFWS's proposal as soon as it is published. They have also already filed a lawsuit to try to block another USFWS ruling, published in late January, that would essentially let the three states begin lethal management of the wolves (although not a public hunting season), even if the delisting is blocked in court.

    Conservationists argue that wolves should stay on the land and fulfill their ecological niche where possible. But for that to happen, people must accept the presence of wolves—and change their behavior accordingly, says Timmothy Kaminski, a wildlife biologist with the Mountain Livestock Cooperative in Augusta, Montana. Otherwise, a sad, repetitive scenario ensues, with wolves moving onto the same ranchlands, killing cattle, and then being killed, over and over. “Wolves are here; grizzly bears and mountain lions are here. You can't turn your cows out into a mountain pasture without being as vigilant as an elk,” says Kaminski. “This is no longer a 20th century landscape.”


    Framework Materials Grab CO2 and Researchers' Attention

    1. Robert F. Service

    Porous solids have become a rich playground for chemists, who can tailor the materials' makeup for use in gas storage, filtering (see p. 939), and catalysis.


    In most synthetic chemistry projects, researchers struggle to stitch molecules together one bond at a time. Not so in the lab of Omar Yaghi, a chemist at the University of California (UC), Los Angeles. Yaghi and his colleagues work to find just the right set of conditions so that entire networks of materials fabricate themselves when given the go-ahead.

    In the late 1990s, Yaghi first worked out the formula for creating a family of highly porous, yet stable, crystalline materials known as metal organic frameworks. MOFs have a Tinkertoy-like construction with metals that serve as the hubs and connecting struts made from organic compounds. By tweaking his recipe, Yaghi and others have since made thousands of such porous crystals. That's made MOFs and related compounds one of the hottest playgrounds in chemistry, and Yaghi their greatest inventor. “His work is terrific,” says Thomas Mallouk, a chemist at Pennsylvania State University in State College. “He does beautiful fundamental science that is knocking on the door of important applications.”

    On page 939, Yaghi and colleagues report a new robotic high-throughput scheme for creating MOF relatives known as zeolitic imidazolate frameworks (ZIFs). And Mallouk and others say that the work is again an important blend of fundamental research and a critical application: materials that might help coal-fired power plants filter out carbon dioxide from their smokestacks. Mallouk calls the new work “very clever” because Yaghi and his colleagues have designed their hubs and linkers to mimic the construction of zeolites, a family of natural porous compounds widely used as catalysts and filters in industry. But because ZIFs are stable at high temperatures and are easier to tailor by adding desired chemical functional groups, they may prove even more useful in the long run.

    Attempts to gain control over open-framework materials have a long and frustrating history. The frameworks are synthesized in solution and can take on a wide variety of structures depending on the hubs and linkers used. For decades, however, researchers found that their frameworks almost always collapsed when they removed the solvent. Equally troubling, they found it nearly impossible to make large pores, as multiple networks would form simultaneously and interpenetrate one another. Yaghi and colleagues solved both of these problems in the late 1990s. They increased the strength of their frameworks by selecting starting materials that preferentially assembled into a network of rigid prisms and cages. They also worked out designs that keep separate frameworks from interpenetrating. It's been off to the races ever since.

    Carbon traps.

    Cagelike zeolitic imidazolate frameworks and their kin excel at straining carbon dioxide out of a mixture of gases, a knack that could lead to CO2 scrubbers for power-plant smokestacks.


    One key race is to create a MOF that can store hydrogen for use in future fuel-cell cars. High-pressure gas tanks do the job fairly well. But pressurizing gases is a big energy drain and can create a hazard if the gas tank is punctured in a crash. By filling part of the tank with a MOF's cagelike network built with hydrogen-absorbent metal hubs and organic struts, however, it is possible—at least in theory—to store more of the gas at a lower pressure. Slightly raising the temperature or releasing the pressure then liberates the gas. Yaghi's group and Jeffrey Long's group at UC Berkeley both recently created MOFs that can hold up to 7.5% of their weight in hydrogen, better than a benchmark for hydrogen storage set by the U.S. Department of Energy. Unfortunately, they only do so at 77 kelvin (−196°C), making them impractical for real-world use.
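What a gravimetric capacity of 7.5 weight percent means in practice is simple arithmetic (the bed mass below is an assumed example, not a figure from the article):

```python
# Hydrogen mass stored by a MOF bed at its rated gravimetric capacity.
# The 7.5% figure is from the article; the 100 kg bed is hypothetical.

def h2_stored_kg(mof_mass_kg, wt_fraction=0.075):
    """Return the hydrogen mass (kg) a MOF bed holds at capacity."""
    return mof_mass_kg * wt_fraction

print(h2_stored_kg(100))  # 7.5 kg of H2 from a 100 kg MOF bed
```

A few kilograms of hydrogen is roughly what a fuel-cell car needs for a useful driving range, which is why a capacity in this range, if it held at room temperature, would matter.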

    In July 2007, researchers led by William Goddard III of the California Institute of Technology in Pasadena reported in the Journal of the American Chemical Society that adding lithium to a MOF should make it possible to store 6% of its weight in hydrogen at room temperature. Long says many groups are working on it, but “it's not trivial.” Lithium, he points out, tends to hold strongly to solvent molecules after synthesis, and removing the solvent requires so much energy that it typically blows apart the framework.

    Other MOF applications are pushing ahead as well. Several of the new ZIFs appear to have a strong preference for binding CO2. Yaghi suspects that carbon-rich benzene rings in their struts act as valves that let CO2 molecules pass in and out of the pores. Once inside, the CO2's carbon atoms, which have a partial positive charge, readily bind to nitrogen atoms in the framework, which carry a partial negative charge. Yaghi says ZIFs could be used to capture CO2 in power-plant smokestacks. Once full, the ZIFs can be removed, and the ensuing pressure drop would release the CO2 from the pores, allowing the ZIFs to be reused. MOFs are also being looked at as filters for a variety of hydrocarbons.

    Other teams are beginning to explore using the porous solids as scaffolds for catalysts. By tuning the materials to allow certain gases inside easily while excluding others, researchers can control which compounds in a mixture gain access to a catalytic metal atom inside. Because the materials are solids, they can easily be recovered and reused after running a reaction, unlike many highly active catalysts that must be separated from a solution. With all the possible ways to construct MOFs and their many applications, today dozens of groups around the world are streaming into the field. “Interest in these materials has been increasing extremely rapidly,” says Long. “The trajectory is still going way upward.”


    In South Africa, XDR TB and HIV Prove a Deadly Combination

    1. Robert Koenig

    Since the 2005-2006 outbreak of extensively drug-resistant tuberculosis in KwaZulu-Natal Province, health experts have been grappling with how to detect and treat the disease.


    Tricky diagnosis.

    X-rays, like this one taken in Port Elizabeth, show TB infection, but tests to distinguish normal from drug-resistant TB can take weeks.


    CAPE TOWN, SOUTH AFRICA—A gaunt man with dark, deep-set eyes nods toward the uniformed security guards at the gate and the nurses who wear double-thick “respirator” masks when they make their rounds. The cheerless ward, surrounded by a 3-meter fence, is “more like a prison than a hospital,” he says. “Many patients are depressed; they don't want to be here,” the chief nurse tells a visitor as a TV soap opera drones in a nearby room.

    That feeling is understandable. The two dozen men and women in the isolated ward are undergoing harsh and possibly futile treatment for the often lethal, contagious, and stigmatized disease that has brought them to Brooklyn Chest Hospital: extensively drug-resistant tuberculosis (XDR TB). The emergence over the past 2 years of the disease—which is even more difficult to treat effectively when patients are coinfected with HIV, as many are—is posing complex medical, ethical, and scientific issues in South Africa, the site of the largest and deadliest XDR TB outbreak to date. Last year, more than 500 cases of XDR TB were diagnosed here, and the total number was probably far higher.

    On the medical front, the challenges include treating an infection that resists even last-ditch medications and finding the best ways to prevent hospital transmission of the disease (see sidebar, p. 897). Among the research challenges are identifying new drug targets and rapid diagnostics, as well as investigating the molecular evolution of the TB strains that led to the emergence of this new threat. The main ethical quandary is the extent to which hospitals can or should isolate XDR TB patients against their will or force them to take potentially lifesaving yet toxic drugs—perhaps for years.

    Few warning signs

    In August 2006, researchers made headlines at the annual AIDS meeting in Toronto, Canada, with a report that a new strain of TB, apparently resistant to almost all known drugs, had emerged in South Africa. The cases had been detected in 2005–2006 in the poor, mainly Zulu community of Tugela Ferry in South Africa's KwaZulu-Natal (KZN) Province; nearly all the victims were also coinfected with HIV. Especially alarming was the fatality rate: 52 of 53 patients had died within a median of 16 days after being tested for TB (Science, 15 September 2006, p. 1554).

    XDR TB caught health care workers off guard and sparked fears of a new wave of “killer TB” outbreaks—especially in countries with high rates of HIV infection—that could jeopardize the progress in global TB control. The outbreak provided a “wake-up call,” says Mario Raviglione, director of the World Health Organization's (WHO's) Stop TB Department, which had first discussed the emergence of XDR TB in Tugela Ferry and elsewhere at a meeting in May 2006. WHO quickly formed a global XDR TB task force that soon made recommendations for dealing with the threat. These include better TB and HIV/AIDS control and stricter management of drug-resistant TB, as well as better laboratory services and more extensive surveillance.

    Although the Tugela Ferry outbreak was startling, XDR TB wasn't brand-new. Sporadic cases had been reported in the United States, Latvia, Russia, and elsewhere; WHO and the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia, had first defined the strain in a March 2006 article. Nor was the new bug totally unexpected, given the poor record of treating TB in many countries. After multidrug-resistant (MDR) strains of TB surfaced a couple of decades ago, some scientists warned that it was only a matter of time before new strains, resistant to even more drugs, would emerge.

    MDR TB first garnered widespread attention in the 1990s, when researchers and clinicians around the globe began identifying an alarming number of cases that were resistant to at least two of the four standard drugs used to treat TB. Suddenly, the already arduous task of treating TB became even more difficult and expensive. MDR TB can take as long as 2 years to treat, compared with 6 to 8 months for drug-sensitive TB. Costs run 3 to 100 times higher, depending on the country and the drug-resistance pattern. WHO now estimates that of the 8 million cases of active TB diagnosed each year, more than 400,000 are MDR. Cases tend to be concentrated in regions where inadequate healthcare services make it harder to ensure that patients can follow the lengthy drug regimen.

    Resistance can arise when patients fail to complete their therapy, thereby giving the TB bacteria an opportunity to mutate to evade the drugs. That's why a cornerstone of TB therapy has long been directly observed treatment, short-course (DOTS), which focuses on supervised adherence to a fixed combination of anti-TB drugs. However, DOTS does not require drug-resistance testing, meaning that many undiagnosed MDR TB patients have been treated by an ineffective DOTS drug regimen that may have allowed those MDR TB strains to develop even further drug resistance. To help address that problem, WHO in March 2006 began recommending what's called the “DOTS-Plus” protocol—which calls for using second-line TB drugs for people with confirmed or presumed MDR TB—for some high-incidence countries.

    “The major challenge is to see that TB patients stay on the treatment regimen,” says Karin Weyer, head of the TB program at South Africa's Medical Research Council (MRC). Lindiwe Mvusi, who heads the South African Health Department's TB Program Directorate, estimates that at least 20% of the country's MDR TB patients are defaulting, making it more likely that some may eventually end up with XDR TB. Because XDR TB is resistant to most of the second-line drugs that are used to treat MDR TB (including fluoroquinolone-category medications as well as either amikacin, capreomycin, or kanamycin), clinicians have few options, other than trying older drugs or new combinations of drugs.

    Paul van Helden, co-director of the Centre of Excellence in Biomedical TB Research at Stellenbosch University, questions whether the DOTS drug protocols are always the best approach in high-incidence TB countries such as South Africa. He believes more investigations are needed to determine the best mixture of drugs to treat MDR and XDR TB in different regions.

    At this point, no one knows exactly how many cases of XDR TB there are globally, because most go undiagnosed and are not reported. WHO recently estimated that XDR TB may infect about 27,000 people a year in at least 41 countries. But this is just an educated guess, based on a percentage of the MDR TB cases diagnosed each year. Later this month, a new WHO report will give a more detailed picture of the spread of drug-resistant TB.
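The basis of such an estimate can be reconstructed in rough outline (an assumption-laden sketch; WHO's actual surveillance method is more involved):

```python
# Rough reconstruction of a "share of MDR" estimate for XDR TB.
# Both totals are the WHO figures cited in the article; treating the
# XDR estimate as a fixed fraction of MDR cases is our simplification.
mdr_per_year = 400_000   # WHO's annual MDR TB estimate
xdr_estimate = 27_000    # WHO's annual XDR TB estimate
xdr_share = xdr_estimate / mdr_per_year
print(xdr_share)  # 0.0675, i.e. roughly 7% of MDR cases assumed to be XDR
```

Because the underlying MDR count is itself an estimate, the 27,000 figure inherits all of its uncertainty, which is why WHO calls it an educated guess.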

    Flash point at Tugela Ferry

    In retrospect, it's not surprising that the 2005–2006 outbreak occurred in KZN Province, which includes areas of extreme poverty. Although for centuries tuberculosis has been called The White Plague, in South Africa it is predominantly a disease of black Africans, a byproduct of poverty, poor health care, and—perhaps most perniciously—a high HIV infection rate. About 5.5 million South Africans are HIV-infected, about 11% of the population, with the highest infection rate in KZN. The combination of drug-resistant TB and HIV is especially dangerous because the weakened immune systems of HIV-infected persons make them more vulnerable to TB and also more difficult to treat.

    The Tugela Ferry outbreak was detected when doctors at Church of Scotland (COS) Hospital began investigating the unexpectedly high mortality rate among TB-HIV-coinfected patients. Drug-sensitivity tests revealed that not only was MDR TB rampant, but even more patients had the superresistant XDR strain. Before then, few clinicians tested for drug resistance because it was expensive and time-consuming. That has changed over the last 2 years; today, many South Africans who test positive for TB are started on first-line drugs while being tested for drug resistance.


    The new XDR TB ward at Brooklyn Chest Hospital in Cape Town is guarded around the clock and surrounded by a high chain-link fence. A patient who tested positive for XDR TB awaited treatment at a rural hospital in Tugela Ferry in 2006 (right).


    Since the initial reports, a total of 217 XDR TB cases have been found in Tugela Ferry, with a mortality rate of 84% between June 2005 and last March. Paul Nunn, the TB-HIV and drug-resistance coordinator at WHO's Stop TB Department, calls the Tugela Ferry outbreak “the worst of its kind” worldwide, in terms of the number of cases, fatality rate, and the high ratio of XDR to MDR cases.

    Was Tugela Ferry the harbinger of other severe XDR TB outbreaks or an anomaly resulting from an unusual convergence of risk factors? Gerald Friedland of Yale University School of Medicine—whose research group reported the outbreak at the 2006 AIDS conference as part of its collaboration with physician Anthony Moll's COS hospital staff and other institutions—worries that interlinked HIV and XDR TB epidemics could “create a firestorm” in many South African communities. He argues that the current South African statistics are unreliable and the extent of the problem underestimated because “there has been a marked underreporting of XDR TB.”

    But other TB experts, including Weyer and Mvusi, regard Tugela Ferry as atypical, in large part because its mortality rate has not been matched anywhere else in South Africa. Mvusi says there were 183 confirmed deaths from XDR TB in South Africa last year, but 342 XDR TB patients were still under treatment—giving hope that some cases can be managed. Although the Eastern Cape and KZN provinces had the most XDR TB cases, the strain has been found in all nine South African provinces. Many of those XDR TB patients were HIV-infected and many others had defaulted on TB drug regimens.

    Searching for origins

    In the wake of the Tugela Ferry outbreak, scientists have been using molecular fingerprinting techniques to analyze thousands of old, frozen TB samples to try to reconstruct the history of XDR TB's emergence in South Africa. At the University of KwaZulu-Natal, A. Willem Sturm's team discovered that XDR isolates of the KZN strain had existed undetected as far back as 2001. About 9% of the province's 2634 MDR TB cases had actually been XDR infections.

    XDR isolates dating back to 2001 were also found in western South Africa, where biologists at Stellenbosch University's TB research center are conducting a retrospective analysis of thousands of TB samples. They are also cooperating with the Broad Institute and Harvard School of Public Health in Boston to sequence and compare the genomes of several drug-resistant TB strains isolated here (Science, 9 November 2007, p. 901).

    Meanwhile, other groups are searching for faster and cheaper ways to detect XDR TB, as well as new drugs to treat it. Testing for resistance to second-line drugs can take up to 2 months using the standard techniques, by which time patients may have been treated with the wrong drugs.

    In South Africa and elsewhere, clinical trials of new, molecular-based tests for drug resistance are already under way by MRC, in cooperation with the Foundation for Innovative New Diagnostics in Geneva, Switzerland. If WHO validates the results, approval seems likely for a German firm's test that can quickly detect MDR TB and is being modified to detect XDR TB. Other groups are also testing their own molecular-based techniques.

    Drugs for treating XDR TB are much further away, although several trials are under way. For now, doctors prescribe older TB drugs such as capreomycin and ADT that have not been used in typical second-line drug regimens.

    Despite the daunting challenges, there is some reason to hope that XDR TB patients can be effectively treated, if not cured. In contrast to the Tugela Ferry outbreak, two-thirds of the 133 XDR TB patients treated at King George V Hospital in Durban during 2007 were still alive at the year's end, says Iqbal Master, the hospital's chief of medicine for drug-resistant TB. Researchers in Latvia suggest that up to 30% of XDR TB patients who are not HIV-infected could be effectively treated. That is good news for patients but poses problems for medical officials who must decide whether and how to separate XDR TB patients from others during the lengthy treatment period.

    The isolation debate

    “We were caught off guard by XDR TB,” concedes Marlene Poolman, the deputy director for TB control at the Western Cape province's health department in Cape Town. No cases were diagnosed until the end of 2006, and the following year, the number of XDR TB admissions at Brooklyn Chest Hospital soared to 72. “Virtually overnight, we had to convert an empty ward into a new XDR unit.” The chain-link fence and 24-hour guards were added in October after several patients left the hospital and had to be returned under court order.

    Involuntary isolation or confinement of XDR TB patients is controversial, allowed in some nations only if the disease is found to pose an immediate threat to public health. WHO recommends separating XDR TB patients from others, especially in regions with high HIV prevalence, and South Africa's health department has adopted that policy.

    But even high fences and guards at some specialized TB hospitals in South Africa haven't kept all patients inside. In December 2007, 20 XDR TB patients and 28 MDR TB patients in another ward cut a hole in the fence and fled a TB hospital in Port Elizabeth. A month later, eight of those patients had not returned, despite court orders.

    Mvusi says overcrowding at some hospitals and clinics, especially in high-incidence areas such as KZN, has made it difficult to separate XDR TB patients. King George V Hospital had a waiting list of 120 drug-resistant TB patients at the end of last year. Many of those were being treated as outpatients.

    Master says the caseload is challenging the health care system. If patients with M(X)DR TB survive their entire 2-year treatment regimens and still test positive for drug-resistant TB, Master asks, “What do you do then? You can't put everyone in the hospital for an indefinite period.”


    Research Project Mimics TB Transmission

    1. Robert Koenig

    This spring, South African and U.S. researchers will investigate variables in tuberculosis transmission at the new Airborne Infection Research Facility in the coal-mining city of Witbank.

    PRETORIA, SOUTH AFRICA—A half-century ago, Richard L. Riley of Johns Hopkins University in Baltimore, Maryland, and others set up an innovative experiment at a Baltimore Veterans Administration Hospital: venting air exhaled by tuberculosis (TB) patients in a six-bed ward into an “exposure chamber” housing 150 guinea pigs. The challenge was to prove that TB can be transmitted by tiny airborne droplets and that individual patients vary greatly in how infectious they are to others.

    But Riley's classic experiments did not test the effectiveness of interventions such as air filters and bacteria-killing ultraviolet lights that aim to reduce airborne TB transmission. They also took place before the emergence of drug-resistant TB strains and the AIDS epidemic, two key factors that influence airborne spread of TB and patient susceptibility in Africa's crowded hospital wards.

    This spring, South African and U.S. researchers will use a hospital setup similar to Riley's to investigate those and other variables in TB transmission at the new Airborne Infection Research (AIR) Facility in the coal-mining city of Witbank. In helping to plan the studies, TB researcher Edward Nardell of Harvard School of Public Health in Boston consulted with his mentor Riley before Riley's death in 2001, as well as with scientists at South Africa's Medical Research Council (MRC) and the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia.

    Research at the new facility will focus on patients who are coinfected with HIV and drug-resistant TB, Nardell says. The goal is to “tease out the importance of infectious source strength, microbial resistance to environmental interventions, and the critical importance of microbial genotype and host factors” in airborne transmission, says Nardell.

    Lindiwe Mvusi, the chief TB official in South Africa's health department, hopes the AIR experiments will yield more data on the best ways to block airborne transmission of TB. During the deadly extensively drug-resistant (XDR) TB outbreak in Tugela Ferry in 2005–2006 (see main text), hospital transmission was a major factor. Eight hospital staff members later died from drug-resistant TB (half of them from XDR TB, the other half from multidrug-resistant TB) before ventilation was improved and other control steps were taken.


    Air from a TB ward is vented into guinea-pig cages (left) at an experimental facility in Witbank.


    Although the experimental setup is complex, Nardell says Riley's model is the only one developed so far that accurately mimics airborne TB transmission in hospital wards. Other efforts to simulate hospital conditions by exposing lab animals to artificially aerosolized TB bacteria have failed to simulate the natural infection process.

    The AIR experiments will expose as many as 360 guinea pigs at a time to air vented directly from a six-bed TB unit. Preliminary experiments last year validated the model, Nardell says, showing the same sorts of infections found in humans. Karin Weyer, who heads MRC's TB program, agrees that AIR is “an ideal model for studying environmental infection control.”


    Number Theorists' Big Cover-Up Proves Harder Than It Looks

    1. Barry Cipra

    Recent progress in the theory of coverings, collections of number sequences that, taken together, include every single integer, was described at the Joint Mathematics Meeting.


    Every integer is either even or odd. Not the deepest of mathematical theorems, to be sure. But number theorists have molded it into conjectures that challenge the intellect, including one the redoubtable Paul Erdős declared “perhaps my favorite problem.”

    In an invited address at the joint meetings, mathematician Carl Pomerance of Dartmouth College described recent progress in the theory of coverings: collections of number sequences that, taken together, include every single integer—zero, positive, and negative—stretching toward infinity. The research includes computer-intensive investigations into the nature of prime numbers (integers that can't be broken down into smaller factors), including an Internet-wide attack on a poser known as the Sierpiński problem.

    “There is a flurry of activity right now,” Pomerance says. Jeffrey Lagarias, a mathematician at the University of Michigan, Ann Arbor, agrees. “Covering questions are perennially important,” he says. The fact that the problems are simple to state yet notoriously tough to solve, he adds, “illustrates how little we know about the integers.”

    In the even-odd covering, each of the two covering sequences has the same step size, namely 2. One starts at 0, the other at 1. You can also cover the integers with sequences of step size 3 (starting at 0, 1, and 2) or of any other size. The game gets interesting when each sequence has a different step size, or modulus. For example, every integer fits into one (or more) of the following five arithmetic progressions: step sizes 2 and 3 starting at 0, step sizes 4 and 6 starting at 1, and step size 12 starting at 11 (see figure, below).

    Mod squad.

    Progressions with small step sizes, like these five sequences, cover the integers handily, but the problem gets trickier as the gaps increase.
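    Because every step size in the five-progression example divides 12, the whole covering repeats with period 12, so checking the residues 0 through 11 is an exhaustive proof. A minimal sketch in Python:

```python
# Check that five arithmetic progressions together cover every integer.
# All step sizes divide 12, so the pattern repeats with period 12 and
# testing the residues 0..11 is exhaustive.
progressions = [(0, 2), (0, 3), (1, 4), (1, 6), (11, 12)]  # (start, step)

def covered(n):
    """True if n lies in at least one of the progressions."""
    return any(n % step == start % step for start, step in progressions)

assert all(covered(n) for n in range(12))
print("all integers covered")
```

    The same residue check works for any candidate covering: swap in a different list of (start, step) pairs and test one full period, the least common multiple of the steps.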


    There are infinitely many essentially different ways to cover the integers using 2, 3, or 4 as the smallest step size. But above that, things get dicey. The Hungarian mathematician Paul Erdős's “favorite problem,” dating back to about 1950, is to prove that there are coverings with a smallest step size as large as you like. But the record smallest step size for any covering to date is a mere 36.

    That record was set last year by Pace Nielsen of the University of Iowa in Iowa City. Nielsen's 36 topped the record 25 established in 2006 by Jason Gibson, now at Eastern Kentucky University in Richmond. Gibson's paper presented a new way of looking for coverings and concluded with pages of starting points and step sizes for the arithmetic progressions starting at step size 25. Nielsen also gave a systematic procedure for his new covering, but he didn't describe the cover explicitly. For good reason: Nielsen estimates his approach produces a covering with more than 10^40 different progressions—“too many to list individually with current computer resources,” he wryly notes.

    The outrageous number of progressions in Nielsen's covering, and presumably any beyond it, is not a surprise, according to Pomerance. He and colleagues have proved another conjecture, first formulated in 1973 by Erdős and John Selfridge of the University of Illinois, Urbana-Champaign, which implies that coverings whose smallest step size is large must contain an enormous number of progressions to compensate for the wide gaps between the numbers each one covers.

    Coverings have no immediate “real world” applications, but they do carry implications for number theory, a branch of mathematics deeply entwined with practical issues of computing and computer security. Many key algorithms in cryptography, for example, are based on probing large numbers for primality.

    One byproduct of coverings is a method of generating sequences that are guaranteed to avoid primes. It got its start in 1960, when the Polish mathematician Wacław Sierpiński of the University of Warsaw used the theory of coverings to show that there are infinitely many values of k for which every number of the form 2^n·k + 1 (i.e., k + 1, 2k + 1, 4k + 1, 8k + 1, and so on) is nonprime, or composite. In 1973, Selfridge showed that k = 78,557 is one such value and conjectured that it is the smallest: for all smaller values of k, the sequence k + 1, 2k + 1, 4k + 1, 8k + 1, … contains at least one prime.
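    Selfridge's argument is itself a covering, this time of the exponents: every n falls into a residue class that forces some prime from the set {3, 5, 7, 13, 19, 37, 73} (the covering set used in his proof) to divide 78,557 × 2^n + 1, so no term of the sequence can be prime. A Python sketch that spot-checks the claim for the first few hundred exponents:

```python
# Spot-check that every term 78557 * 2**n + 1 has a factor in
# Selfridge's covering set, and hence is composite.
COVERING_SET = [3, 5, 7, 13, 19, 37, 73]
k = 78557

for n in range(300):
    term = k * 2**n + 1
    assert any(term % p == 0 for p in COVERING_SET), f"no cover for n={n}"
print("78557 * 2**n + 1 is composite for n = 0..299")
```

    The divisibility pattern repeats with period 36 in n, so checking a few hundred exponents exercises the full covering many times over.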

    Selfridge's idea was easy to check for most small values of k, but a few holdouts still stymie mathematicians. In 2002, Louis Helm of the University of Michigan, Ann Arbor, and David Norris of the University of Illinois, Urbana-Champaign, launched an online effort to put the problem to rest. At the time there were 17 small values of k for which no prime had been found, so they called the project “Seventeen or Bust” and put it on the Web.

    Today, only six small values of k remain unsettled. The most recent value to bite the dust is k = 33,661. Last October, Sturle Sunde of the University of Oslo, a contributor to Seventeen or Bust, reported a computation showing that 2^n × 33,661 + 1 is prime when n = 7,031,232. Earlier calculations had mistakenly reported the number as composite, but double-checking confirmed its prime status. Sunde's prime, which weighs in at 2,116,617 digits, is currently the 10th-largest known prime on the books. The meatiest prime number currently on record is a 9,808,358-digit monster of the type known as a Mersenne prime, 2^32,582,657 − 1.
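    The digit counts quoted above follow from base-10 logarithms: a positive integer m has floor(log10 m) + 1 decimal digits. A quick sketch in Python (the trailing +1 or −1 in these primes cannot change the count, since a × 2^n is never an exact power of 10 or a string of 9s):

```python
from math import floor, log10

def digits(a, n):
    """Decimal digit count of a * 2**n (same as of a * 2**n +/- 1 here)."""
    return floor(log10(a) + n * log10(2)) + 1

print(digits(33661, 7031232))   # Sunde's prime: 2116617 digits
print(digits(1, 32582657))      # the record Mersenne prime: 9808358 digits
```

    The fractional parts of both logarithms sit well away from an integer boundary, so ordinary double precision is more than enough here.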


    A Woman Who Counted

    1. Barry Cipra

    At the Joint Mathematics Meeting, two mathematicians reported that early 19th century mathematician Sophie Germain did far more work in number theory than she has ever been given credit for.



    Sophie Germain was one of the great mathematicians of the early 19th century. Number theorists laud her for “Sophie Germain's theorem,” an insight into Fermat's famous equation x^n + y^n = z^n aimed at establishing its lack of solutions (in positive integers) for certain exponents. Oddly, Germain's fame for her theorem stems not from anything she herself published but from a footnote in a treatise by her fellow Parisian Adrien-Marie Legendre, in which he proved Fermat's Last Theorem for the exponent n = 5. Now, two mathematicians have found that Germain did far more work in number theory than she has ever been given credit for.

    Poring over long-neglected manuscripts and correspondence, David Pengelley of New Mexico State University in Las Cruces and Reinhard Laubenbacher of Virginia Polytechnic Institute and State University in Blacksburg have discovered that Germain had an ambitious strategy and many results aimed at proving not just special cases of Fermat's Last Theorem but the whole enchilada. “What we thought we knew [of her work in number theory] is actually only the tip of the iceberg,” Pengelley said at a session on the history of mathematics.

    The theorem in Legendre's footnote asserts that if the exponent n is a prime number satisfying certain properties, then any solution to x^n + y^n = z^n must have one of the numbers x, y, or z divisible by n. Pengelley and Laubenbacher report this is just the first of numerous theorems in a long manuscript by Germain now housed at the Bibliothèque Nationale in Paris. Germain also wrote another bulky manuscript on the subject and covered it at length in a letter to the German mathematician Carl Friedrich Gauss. In the three sources, Germain elaborated programs for proving, first, that all primes satisfy the necessary properties; and second, that the exponent n cannot divide any of the numbers x, y, or z. Together, those two results would have proved Fermat's Last Theorem. Along the way, she showed that any counterexample would involve numbers “whose size frightens the imagination,” as she put it to Gauss.

    A proper Hollywood ending would have Germain's proof of Fermat's Last Theorem take precedence over Andrew Wiles's monumental accomplishment of 15 years ago. However, Pengelley notes, that's not how the story goes. Much of Germain's approach was rediscovered later by others and found to fall short of its goal. Nonetheless, Pengelley says, the fact that she had developed far more than a single footnoted theorem “calls for a reexamination of the scope and depth of her work.”


    Exact-Postage Poser Still Not Licked

    1. Barry Cipra

    Using techniques from computational algebraic geometry, researchers announced stunning progress in solving large cases of what's known as the three-stamp problem at the Joint Mathematics Meeting.


    Quick: What's the largest amount of postage it's impossible to pay exactly using 41-cent and 26-cent stamps? What if you have a supply of 58-cent stamps as well?

    Stumped? So are mathematicians—at least when the numbers are large and there are more than three different stamps involved. But using techniques from computational algebraic geometry, researchers have recently made stunning progress in solving large cases of what they call the linear Diophantine problem of Frobenius. Should inflation take the price of stamps into 10-digit territory, mathematicians are confident they can quickly solve the impossible postage problem with as many as a dozen different denominations of stamps.

    That such a simple-sounding problem turns out to be so hard is part of the Frobenius problem's appeal, says Matthias Beck of San Francisco State University in California. “It has a certain flair to it,” he says. Moreover, “all kinds of mathematical areas connect to it,” including number theory and theoretical computer science.

    The problem, named for the 19th century German mathematician who popularized it, was first posed by the English mathematician James Sylvester in 1884, as a money-changing problem. Sylvester showed that if two denominations, A and B, have no common factor, then the largest amount that cannot be formed is given by the formula AB − A − B. This makes the 41/26-cent problem a snap; the reader will find the answer pleasantly surprising.
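    Sylvester's formula is easy to confirm by brute force for the two denominations in the opening question; a minimal sketch in Python (this computes the answer the text alludes to):

```python
# Largest amount not payable exactly with 41- and 26-cent stamps.
# Sylvester's formula gives A*B - A - B when gcd(A, B) = 1.
A, B = 41, 26
frobenius = A * B - A - B

# Brute force for comparison: collect every payable amount up to A*B.
limit = A * B
payable = {a * A + b * B
           for a in range(limit // A + 1)
           for b in range(limit // B + 1)}
largest_unpayable = max(n for n in range(limit + 1) if n not in payable)

assert largest_unpayable == frobenius
print(largest_unpayable)  # 999
```

    Checking up to A·B suffices because Sylvester's result guarantees that every amount above AB − A − B is payable.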

    Mathematicians have proved that with three or more stamps, there is no simple, Sylvester-style formula for the largest impossible amount. In fact, researchers have shown that the problem is what computer scientists call NP-hard: Any algorithm powerful enough to give an efficient general solution would automatically solve an entire class of problems, including everything that currently underlies the cryptographic security of Internet transactions.


    Various researchers found practical but complicated algorithms for the three-stamp problem in the 1970s and '80s. In 1992, Ravi Kannan, now at Yale University, showed that there are theoretically efficient algorithms for each fixed number of denominations, but his approach is so wildly impractical that “nobody has yet dared to implement it,” Beck notes.

    Nevertheless, researchers are chipping away at the problem. In a special session devoted to the Frobenius problem at the San Diego math meetings, Stan Wagon of Macalester College in St. Paul, Minnesota, reported that he and three colleagues had created an algorithm that works remarkably fast: It solves typical four-stamp problems with 100-digit numbers in under a second and solves up to 11 stamps with 10-digit numbers in under 2 days. Bjarke Roune of the University of Aarhus in Denmark has pushed the envelope even further. Based on techniques from a branch of mathematics called computational algebraic geometry, Roune's algorithm knocks off in seconds the four-stamp problem with numbers up to 10,000 digits long and extends what's feasible with 10-digit numbers to 13 stamps.

    Neither algorithm comes close to cracking Frobenius: A practical solution would be able to handle hundreds of 100-digit numbers without batting an eye. Still, other researchers are impressed. “Stan's algorithm was already a big leap,” Beck says. “Bjarke has taken this to the next level. … I wouldn't be surprised if he comes up with something even better.”
