- STEM CELLS
Teams Reprogram Differentiated Cells--Without Eggs
- Constance Holden
Scientists this week reported major advances toward a central goal of stem cell research: directly reprogramming fetal mouse cells so that they are indistinguishable from embryonic stem (ES) cells. The technique, which they say should also work on adult cells, could one day enable researchers to generate cell lines tailored to individual patients without the use of eggs or embryos.
“The papers are stunning,” says Harvard stem cell researcher Chad Cowan. “Amazing, … very exciting,” adds Stephen Duncan of the Medical College of Wisconsin in Milwaukee. The advances, which are sure to have political implications as well, are reported in two papers in Nature—written by groups headed by Shinya Yamanaka of Kyoto University in Japan and by Rudolf Jaenisch at the Massachusetts Institute of Technology—and in another paper by Konrad Hochedlinger at Harvard, which was published in the inaugural issue of Cell Stem Cell.
The three reports extend a finding made last year by Yamanaka's team. By testing various combinations of pluripotency-related genes active in mouse ES cells, the researchers identified four genes that, when introduced into skin cells from mouse tails, conferred ES-like properties on those cells (Science, 7 July 2006, p. 27).
But the tail cells did not completely mimic ES cells, and many scientists were skeptical. “When Shinya first claimed this, I would say that 90% of the field said 'No way.' It seemed so stunning that it could happen like that,” says Cowan. Now it's clear not only that it happened but also that with an alteration in the original formula, scientists have generated what they believe are fully pluripotent cell populations from both fetal and adult mouse fibroblasts.
The three teams all began by following Yamanaka's procedure, using a viral vector to introduce copies of genes for four transcription factors active in ES cells: Oct4, Sox2, c-Myc, and Klf4. Because the reprogramming works for only one in every 1000 cells, the researchers needed to weed out the nonstarters. Yamanaka did this by looking for the activity of a gene that, as it turned out, selected for cells that were incompletely reprogrammed. In the new studies, the scientists instead selected for the expression of Oct4 and Nanog, well-known pluripotency markers.
The cells selected using these markers appear to have all the same traits as ES cells. To test this hypothesis, the researchers tagged the reprogrammed cells, called induced pluripotent stem (iPS) cells, with a fluorescent marker and injected them into early-stage mouse embryos. Some of the resulting chimeric animals had descendants of the iPS cells throughout their bodies. The researchers confirmed this by successfully breeding the chimeras to normal mice, which showed that the iPS cells had contributed to the germ line.
Together, the three papers give a convincing picture of the reprogramming phenomenon, says Cowan. Jaenisch says his iPS cells passed the “most stringent” test of pluripotency, which involves being injected into tetraploid embryos. Created by fusing two normal embryos, tetraploid embryos are incapable of development beyond forming a placenta. But when they are injected with normal pluripotent cells, the embryos develop into mice that are entirely the products of the introduced cells. In Jaenisch's experiment, the iPS cells came from reprogrammed fetal cells, but the researchers are confident that the strategy will work with cells generated from adult fibroblasts, says Hochedlinger, a co-author.
In the study led by Hochedlinger, the team thoroughly explored the epigenetics—changes that modify chromosomes and control gene expression—of iPS cells, demonstrating that the epigenetic signatures of their genes are the same as those in ES cells.
But the Yamanaka study revealed a big downside to the strategy. Yamanaka's team, the only group to study the offspring of the chimeras after birth, observed that 20% of the 121 mice developed tumors. That finding, Yamanaka notes, shows the danger of using retroviral vectors, which can turn on cancer-causing genes.
This highlights what Jaenisch calls “major, major problems which need to be resolved” if this work is to be applied to humans. Once the appropriate human transcription factors are identified, scientists will have to find ways to introduce the factors without retroviruses or to safely activate the relevant genes within the cell.
A likely alternative would be the use of small molecules that penetrate cell membranes and turn on production of the necessary transcription factors. “Now that we know there's a mechanism by which this [reprogramming] can happen, there will be an aggressive search to find small molecules that can activate these pathways,” predicts Duncan. “It will take a lot of work, but the fact that the pathways exist gives you a [place] to start. … This is huge.”
It's still a long road to potential therapies with reprogrammed adult cells. But Cowan is optimistic: “The most amazing thing about these papers is you now take this whole idea of reprogramming out of the hands of cloning specialists and put it into the hands of anyone who can do molecular and cell biology.” In some ways, he says, “it's a much simpler system than we thought.” As a result, he predicts, “we're really going to see this process accelerate.”
Coincidentally, the advances were announced the day before the House was scheduled to vote on a Senate-passed bill loosening restrictions on ES cell lines available to federally funded researchers. The measure was expected to pass but to be vetoed by President George W. Bush in a replay of the events of last summer (Science, 28 July 2006, p. 420).
In this environment, the reprogramming studies are likely to be seized on by critics of ES cell research as further evidence that there is no need for the contentious practice of destroying early embryos to obtain stem cells. Hochedlinger and others hasten to point out that research needs to progress on all fronts because all systems “have their limitations.”
- STEM CELLS
Reprogramming, Take Two
- Constance Holden
In the same issue of Nature, Harvard's Kevin Eggan reported another reprogramming advance: nuclear transfer (also called research cloning) using a fertilized mouse egg, or zygote. Early attempts in the 1980s to clone animals by transferring the nucleus of a somatic cell into a zygote failed, leading to a focus on doing the procedure with unfertilized oocytes. Human eggs, however, are difficult to obtain.
Now Eggan and colleagues have shown that if nuclear DNA is removed from the dividing mouse zygote at just the right moment, it will successfully reprogram an introduced nucleus from a somatic cell. In this experiment, the researchers succeeded in creating both new lines of ES cells and apparently healthy cloned mice.
This work, says Cowan, suggests that to create patient-specific human ES cell lines, researchers could make use of fertilized eggs that would otherwise be discarded at fertility clinics, sidestepping the problematic issue of egg donation. “It's an exciting paper,” says cloning researcher Robert Lanza of Advanced Cell Technology Inc. in Worcester, Massachusetts.
- ACADEMIC TENURE
MIT Colleague Quits to Protest Sherley Dismissal
- Yudhijit Bhattacharjee
The head of a biomedical innovation center at the Massachusetts Institute of Technology (MIT) in Cambridge has resigned to protest the school's treatment of an African-American colleague in the biological engineering department. Frank Douglas, who is also black, says he will leave MIT on 30 June—the termination date for his colleague, James Sherley. An MIT spokesperson says that Douglas's decision would have no impact on the institution's stance regarding Sherley, who went on a 12-day hunger strike in February after being denied tenure (Science, 16 February, p. 920).
A stem cell researcher and the winner of a 2006 Director's Pioneer Award from the National Institutes of Health (NIH), Sherley began a fast on 5 February to fight what he claims was a racist decision by MIT to reject his bid for tenure. MIT says that race played no role in the decision. When Sherley ended his strike, MIT put out a statement expressing a commitment to address his concerns about fairness and to “continue to work toward resolution of our differences with Professor Sherley.” In the weeks that followed, however, MIT administrators made it clear that they would neither reopen Sherley's tenure case nor review how his grievances had been handled.
Douglas is a chemist and pharmaceutical industry veteran who was hired in 2005 to establish MIT's Center for Biomedical Innovation. He holds faculty appointments in the science, engineering, and management schools. In a 1 June letter to MIT Associate Provost Claude Canizares, Douglas expressed dismay that MIT, “after having agreed to arbitration, which led to Prof. Sherley ending his hunger strike, now, has negated that agreement and insists on his departure on June 30th, 2007. … I leave because I would neither be able to advise young Blacks about their prospects of flourishing in the current environment, nor about avenues available to effect change when agreements or promises are transgressed.”
Douglas says he has no opinion on whether Sherley deserved tenure but is frustrated by MIT's “lack of desire” to resolve the issue. “Does the institution really believe that Prof. Sherley would have ended his hunger strike if he really understood that 'continue(ing) to work towards resolution of differences' meant no arbitration process and a pre-agreement that he should leave regardless of the outcome of the resolution of those differences?,” he asks, saying that as a minority, he felt obligated to speak out.
In a 3 June reply to Douglas, Canizares wrote, “I can state categorically that MIT did not agree, implicitly or explicitly, to arbitration or to extend Professor Sherley's faculty appointment beyond June 30.” The next day, MIT issued a statement saying that Douglas's decision was based on inaccurate information, and that it hoped he would reconsider his resignation.
Outsiders familiar with MIT's environment do not think Douglas's departure will have much impact. “The administration is inclined to take the same approach that governments take while dealing with terrorists: They won't negotiate,” says Philip Phillips, an African-American solid state physicist at the University of Illinois, Urbana-Champaign, who was denied tenure at MIT 16 years ago. Instead, he predicts, MIT will treat Douglas as it did Sherley—“as just another irrational person.”
- COSMOLOGY
Gravity Distorts Big Bang Afterglow, Opening New Window on Cosmos
- Adrian Cho
The best picture of the infant universe is distorted as if in a fun-house mirror, cosmologists report. Ironically, the aberration may lead to a clearer view of how the young universe evolved.
The observation marries two of cosmologists' more fruitful pursuits. Researchers have analyzed the afterglow of the big bang, known as the cosmic microwave background (CMB), to nail down the precise age and composition of the universe. Meanwhile, others have studied how light from distant galaxies is deflected by the gravity from vast filaments of “dark matter” stretching through space, a phenomenon known as “weak lensing.” Now, a team of young upstarts has detected weak lensing of the CMB itself.
Researchers were certain that the CMB would show the lensing distortion, but they were eager to find the effect because it could be used to probe the evolution of the structure of the universe, says David Spergel, a cosmologist at Princeton University. “What they saw is what they expected,” he says, “but this is groundbreaking because this is a new tool that could be very powerful.”
To spot the effect, Kendrick Smith of the University of Chicago in Illinois and colleagues started with the portrait of the CMB produced by NASA's orbiting Wilkinson Microwave Anisotropy Probe (WMAP). The temperature of the microwaves varies slightly from point to point across the sky. By analyzing the hot and cold blotches, WMAP researchers have deduced that the universe is 13.7 billion years old and consists of 4% ordinary matter, 22% dark matter, and 74% weird space-stretching “dark energy” (Science, 19 December 2003, p. 2038).
With images of galaxies, lensing makes the elliptical shapes tend to align with their neighbors in the sky, like fish in a swirling school (Science, 17 March 2000, p. 1899). In the case of the CMB, a clump of dark matter in the foreground should magnify the splotchy pattern behind it. In principle, researchers could spot the effect by scrutinizing the CMB alone. In practice, that “autocorrelation” technique isn't yet feasible because WMAP can't quite bring the microwave pattern into sharp enough focus.
So the team compared the CMB map to one pinpointing 1.8 million galaxies that was produced by the Very Large Array (VLA) of radio telescopes in New Mexico. The galaxies trace the dark matter filaments, so the spots in the sky where galaxies are most numerous should also line up with the patches in the CMB that appear to be magnified. The comparison makes the effects of lensing clear, much as superimposing two patterns of seemingly random dots might reveal a picture partially encoded in each, the researchers reported in a paper posted last week to the arXiv preprint server (http://www.arxiv.org/).
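The superimposed-dots analogy can be sketched numerically. The toy example below is purely illustrative, with invented noise levels and maps, and is not the team's actual analysis pipeline: a shared hidden pattern is buried in two independently noisy "maps," and while neither map alone shows it clearly, cross-correlating the two pulls it out, because the independent noise averages away while the shared signal does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of "pixels" in each toy map

# A hidden common pattern (standing in for the dark-matter structure)
# buried under much larger independent noise in each of the two maps.
shared = rng.standard_normal(n)
map_a = shared + 3.0 * rng.standard_normal(n)  # e.g. a CMB-like map
map_b = shared + 3.0 * rng.standard_normal(n)  # e.g. a galaxy-count map

# Normalized cross-correlation coefficient between the two maps.
r = np.corrcoef(map_a, map_b)[0, 1]

# With signal variance 1 and noise variance 9, the expected coefficient
# is 1/(1+9) = 0.1: small, but far above the ~1/sqrt(n) ≈ 0.003 scatter
# that two maps of pure noise would produce.
print(round(r, 2))
```

The design point mirrors the article: a correlation too weak to see in either map's autocorrelation becomes statistically unmistakable once two independent tracers of the same underlying structure are compared.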
Lensing of the CMB could enable researchers to probe how the dark-matter web evolved even before galaxies formed. But that will require autocorrelation analysis of higher precision data from future projects such as the South Pole Telescope; the Atacama Cosmology Telescope at Cerro Toco, Chile; and the European Planck satellite, says Christopher Hirata, a cosmologist at the Institute for Advanced Study in Princeton, New Jersey. Hirata and his colleagues will soon publish an independent observation of the CMB lensing.
The advance was made by three of the field's younger members. Smith and Oliver Zahn of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, are graduate students, and Olivier Doré of the University of Toronto is a postdoc. They were able to make their mark because the WMAP and VLA data are open to the public. If that had not been the case, Smith says, “two graduate students and a postdoc probably wouldn't have been the drivers” for the project.
- MONGKOL NA SONGKHLA INTERVIEW
Thai Health Minister Defends Controversial Drug-Patent Policy
- Richard Stone
BANGKOK—Thailand's audacious moves to break drug patents have earned it raves from many governments and patient advocates—and the ire of big pharma. Thailand has asserted its right, under a World Trade Organization provision for coping with public health needs, to make or import generic versions of two AIDS drugs—efavirenz and lopinavir/ritonavir—and clopidogrel, a heart medication (Science, 11 May, p. 816). Several countries say they will follow in Thailand's footsteps. Brazil has already invoked a compulsory license for Merck's efavirenz.
Thailand has paid for its activism. In March, Abbott Laboratories, which makes lopinavir/ritonavir, withdrew applications to sell seven new medications in Thailand. And the Office of the U.S. Trade Representative has put Thailand on its Priority Watch List and will reportedly decide next month whether to slap tariffs on certain Thai exports.
At the center of this firestorm is Mongkol Na Songkhla. A 65-year-old medical doctor, Mongkol was appointed minister of public health by Thailand's military-installed government last October. Described by colleagues as a down-to-earth workaholic with a sharp sense of humor, Mongkol, in an interview with Science last week, insisted that the compulsory licenses are aimed solely at helping poor people and that Thailand has borne “unfair” criticism. He also revealed that his ministry has offered an olive branch to drug companies.
There are reports that Thailand may issue compulsory licenses on two more drugs, and that you have set a limit of five drugs.
Compulsory licensing [CL] is not a routine action. It is a very special, exceptional one. If we want to do CL for our poor people, we cannot do more than five drugs.
Why five? How did you arrive at that limit?
Cost. We don't want to interfere with R&D, with innovation, with the investment of the companies. Twenty percent of Thai people can pay from their own pocket and never want to use generic drugs. Government employees prefer to use the originals. And foreigners who come to get health services in Thailand, they never get the generic drugs. But the majority—about 80% of our people—never has a chance to use the original product.
It's been reported that the next two drugs that might be issued compulsory licenses are cancer drugs.
People guess this from my criteria, from the cost of drugs to people. … I cannot confirm. We have a committee to consider this. I never make [these] decisions by myself. I'm only the gateway.
Some critics claim the licensing is more of a negotiating strategy to force drug companies to come down in price. Is that a fair observation?
Maybe, maybe. You see, more than 2 years ago we invited the drug companies to come and negotiate. They never wanted to sit and talk with our committee. And right now it is easier for us to invite them.
You certainly got their attention.
On balance, do you think the focus on compulsory licensing has been positive for Thailand?
I don't think it's positive, because some people have a double standard. Why don't they criticize other countries, especially developed countries that do CL?
What would you have done differently?
We did it too transparently. I don't think the other countries announced their intention to do compulsory licensing. Everyone knows the details of our negotiations. This is maybe too transparent.
As a medical doctor, you must be particularly distressed with Abbott's action to withdraw drugs it had intended to sell here. Have you conveyed that to them?
They want me to stop doing CL before negotiations. That is very tough.
You will not agree to that?
Yes, I will not.
Do you think the rift with the drug companies can be healed?
It will take time. I have proposed that they sell another version of their original products of the same quality but different packaging for the poor, at a lower price.
Is it possible in Thailand to tightly control the distribution of a second version for poor people?
Sure, because this would only be for noncommercial use. It cannot be used in private clinics or [private] hospitals.
How have drug companies reacted to this idea?
I am still waiting [for their reaction].
- SAUDI ARABIA
Graduate University Launched With $10 Billion Endowment
- Jeffrey Mervis
Saudi Arabia is using some of its vast oil wealth to create what it hopes will be a world-class graduate research institution.
The King Abdullah University of Science and Technology (KAUST), which is scheduled to open in the fall of 2009, will have a $10 billion endowment—the sixth largest in the world. Although the university won't open for more than 2 years, officials said this week that they will prime the pump by awarding some 500 undergraduate scholarships this year and next to students around the world, with the understanding that the recipients will form KAUST's inaugural classes. They are also launching a $100-million-a-year global research partnership program to fund work by scientists who agree to become affiliated with the new university.
“We realized that we cannot wait until when we open, in September 2009, to begin our research program,” says Nadhmi Al-Nasr, KAUST's interim president. “We also wanted to find researchers working on things that have the potential to benefit Saudi Arabia.”
The university is the brainchild of the country's current ruler. It's being built from scratch on a tract along the Red Sea, some 80 km north of Jeddah. The project, which includes constructing a surrounding city of 15,000, was given to Saudi Aramco, the government-owned oil giant, and is being overseen by the Minister of Petroleum and Mineral Resources. But once completed, the university will be financially and administratively independent of the Saudi government, says Al-Nasr, who's on leave from his post as vice president for engineering at Saudi Aramco.
“KAUST will have its own governing board of trustees as well as its own endowment,” he explains. Government officials also decided to place it outside the purview of the ministry of higher education. “The kingdom is in the midst of reassessing its entire system of higher education,” Al-Nasr explains. “So it would not make sense to try to build KAUST within the existing bureaucracy.”
The university will be organized around multidisciplinary research institutes. The initial ones will encompass energy and the environment, biosciences and engineering, materials science and engineering, and applied mathematics and computational science. Accordingly, the first round of global partnerships will lure faculty members by tackling challenges such as desalination, carbon capture and hydrogen-rich fuels, and computational linguistics.
KAUST has hired some academic heavyweights to help it jump-start the school. Al-Nasr says the Institute of International Education, which runs the U.S.-based Fulbright Program, will manage the student scholarship program, while the Washington Advisory Group (WAG) will handle the competitive grants program using a National Science Foundation-style merit-review system. The university will eventually have a capacity for 2000 graduate students and 600 faculty members and researchers, and Al-Nasr hopes as many as one-fifth of the students will be Saudis.
“I think it's a potential game changer for the region,” says WAG's Frank Rhodes, president emeritus of Cornell University and an adviser to the university. Adds Richard Sykes, rector of Imperial College London and another adviser: “I don't think that it'll be too difficult to get good people to come for a short time. The issue will be, 'Can you sustain the quality over time?'” KAUST hopes to have a founding president on board by January 2008.
- CONSERVATION BIOLOGY
Whales (Mostly) Win at Whaling Commission Meeting
- Virginia Morell*
ANCHORAGE, ALASKA—Whales and environmentalists were the big winners at this year's 4-day meeting of the International Whaling Commission (IWC) here, with a majority of member nations reaffirming the organization's 21-year moratorium on commercial whaling. The meeting also passed a resolution calling for Japan to “suspend indefinitely” its scientific whaling in the Southern Ocean Whale Sanctuary and to change its plans for lethal research on endangered humpbacks and fin whales (Science, 27 April, p. 532).
But the meeting of 400-plus delegates from 71 countries from 28 to 31 May was hardly a joyful reunion. Rather, like any unhappy family, IWC members exchanged insults and accused each other of bad faith and broken promises—sentiments that added to the ever-growing divide between those who fancy a whale steak and those who prefer their cetaceans whole, alive, and swimming in the ocean's waters.
And in the final hour, Japan, which strongly favors a resumption of commercial whaling, announced that it was considering a divorce. The IWC has spurned its “original role as a natural resource management organization,” Akira Nakamae, one of Japan's deputy commissioners, said at the meeting. “Japan upholds the principle of using all living marine resources,” he said, and may form “a new organization with other countries” that follow that principle.
Last year's IWC gathering had a very different outcome: Japan and its pro-whaling allies narrowly passed a symbolic resolution calling for an end to the moratorium on commercial whaling. Given the rise in abundance of some whale species, Japan has also asked the Convention on International Trade in Endangered Species to review the status of great whales; CITES currently prohibits trade in these cetaceans or their products. But the IWC voted 37-4 for a resolution saying that the whaling moratorium was still needed and asking CITES, which meets this week, to defer to IWC on the question of whales.
Japan chose “not to participate” in that vote and others, underscoring the widening gulf between pro- and antiwhaling nations. However, the IWC did throw a lifeline to the Gulf of California's highly endangered vaquita dolphin (Phocoena sinus), pledging support for Mexico's plans to save it from extinction.
Some IWC members also strongly criticized Japan's scientific whaling. Japan kills more than 1000 whales each year (primarily minkes) in the Southern Ocean and North Pacific and sells the meat. That is the largest number of whales harpooned by any IWC nation. A majority of the commission—with Japan and its allies not participating—approved a resolution that expresses “deep concern at [Japan's] continuing lethal research” and states that the program does “not address critically important research needs.” Delegates from several nations delivered far harsher words: “The practice of scientific whaling by Japan flies in the face of the whole purpose of this convention and demeans the IWC,” charged Malcolm Turnbull, Australia's deputy commissioner and a member of parliament. “Show goodwill toward the people of Australia, and drop the humpbacks if not the program itself,” he asked in an appeal echoed by others.
Australia's request was based “on emotion, not science,” responded Joji Morishita, a deputy commissioner from Japan. “We are proud of our scientific program and its achievements,” he said, noting that the hunts will go ahead as planned for the winter of 2007-08. The IWC has no authority to force Japan to curb research whale hunts, which require only a “special permit” issued by the Japanese government itself.
The IWC prides itself on using science to guide its decisions, although science was invoked in so many ways that the New Zealand delegation finally noted that the term had a “very elastic meaning.” Prior to the full meeting that ended last week, the IWC's Scientific Committee gathered for 2 weeks in Anchorage to discuss everything from the status of the world's whale species, to whale deaths as a result of entanglement in fishing nets, to whale-watching tours and pollution. The resulting 100-plus-page Report of the Scientific Committee then served as the basis for the full meeting's discussions.
Indeed, it was because the U.S. delegation made the case with strong scientific data that the meeting attendees voted unanimously to support subsistence hunts of Western Arctic bowhead whales by certain Alaskan Eskimo aboriginal whaling communities—a request expected to be controversial. “The data were compelling and show that the Western Arctic bowhead population is healthy and recovering,” said Douglas DeMaster, a deputy commissioner on the U.S. delegation. “This hunt is also sustainable.”
Japan had hoped to persuade the IWC commissioners to allow four coastal communities to resume hunting North Pacific minke whales, a request they have made at the last 20 IWC meetings. Such a limited hunt would not “negatively impact the abundant minkes” and would alleviate “suffering in the villages,” said Nakamae. But in contrast to the Western Arctic bowhead whales, there are few data about these minke whale populations, which are already killed as bycatch, says DeMaster. And few countries agreed that Japan's village whalers, who would sell the whale meat, are equivalent to native subsistence hunters. Japan has threatened to leave the IWC before but has not. Still, it is not clear whether it will participate in the 2008 IWC meeting, planned for Santiago, Chile.
- CLIMATE CHANGE
Pushing the Scary Side of Global Warming
- Richard A. Kerr
Greenhouse warming might be more disastrous than the recent international assessment managed to convey, scientists are realizing. But how can they get the word out without seeming alarmist?
Climate modeler James Hansen knows all about sounding the alarm. In the summer of 1988, drought wracked the country, fire was consuming Yellowstone National Park, and the nation's capital sweltered. Even the Senate hearing room where Hansen was testifying was warm and stuffy—the Democrats had opened the windows the night before. Then Hansen, dubbed NASA's top climate scientist by the media, shouted “Fire!” in the crowded theater: “With a high degree of confidence,” he declared, greenhouse warming had arrived. Although many of his colleagues agreed, none chimed in with support; they could not share his high degree of confidence. Still, Hansen's lone authoritative voice was enough to send the media into a years-long brouhaha over global warming.
That uproar quieted within a few years, but Hansen, still the director of NASA's Goddard Institute for Space Studies (GISS) in New York City, finds himself at the head of an informal movement to again rouse the public and policymakers. This time he worries that sea level could rise several disastrous meters by the end of the century, as the warming he heralded sends the great ice sheets rumbling toward the sea. If nothing is done to rein in greenhouse gas emissions, he says, “I just can't imagine that you could keep sea-level rise under a meter.” Then the sea would flood many kilometers inland along the world's low-lying coasts, from Florida to Bangladesh.
That was Hansen's warning to Congress in late April, but it's not the message that came out of the U.N.'s Intergovernmental Panel on Climate Change (IPCC) in early February. Many news reports gave the impression that the prestigious international assessment actually downgraded the risk of imminent sea-level rise to a small fraction of a meter.
So Hansen seems to be out on a limb, again. This time, however, he's got company. No longer reticent, other scientists are going public about how bad things might get by the end of the century. “The IPCC has been overly cautious in not wanting to give any large number to [future] sea-level rise,” says climate researcher Stefan Rahmstorf of the Potsdam Institute for Climate Impact Research in Germany.
Scientists are still trying to strike a balance between their habitual caution and growing concern over uncertain but disastrous greenhouse outcomes. “Most scientists don't want to, but I think we need a way to explore” the extreme end of the range of possibilities, says glaciologist Robert Thomas of NASA contractor EG&G at Wallops Flight Facility in Virginia. Thomas says scientists need “a better way” than IPCC's consensus approach, “so we can communicate with the public without becoming scaremongers.”
Seldom have mainstream climate scientists spoken out about the scary possibilities of global warming. “Most people [in the field] realize this really is an extremely serious problem we're facing” in sea-level rise, says Thomas. But no one understands just why the great ice sheets of Greenland and West Antarctica have accelerated their slide to the sea in recent years (Science, 24 March 2006, p. 1698). Will the acceleration continue? Speed up? Slow down? Stop? In the face of such uncertainty, most climate scientists have traditionally let IPCC speak for them. When they've gone public, it was usually to counter greenhouse contrarians arguing for an inconsequential warming with trivial impacts.
In the latest report, its fourth since 1990, the IPCC spoke for scientists in a calm, predictably conservative tone (Science, 9 February, p. 754). It is, after all, an exhaustive, many-tiered assessment of the state of climate science based exclusively on the published literature. In IPCC's Working Group I report on the physical science of climate, 600 authors contributed to an 11-chapter report that drew 30,000 comments from reviewers. The report was in turn boiled down to a 21-page “Summary for Policymakers” (SPM). Its central projection of sea-level rise by the century's end—0.34 meter—came within 10% of the 2001 number. And by getting a better handle on some uncertainties, it even brought down the upper limit of its projected range, from 0.89 to 0.59 meter.
The SPM did add that “larger values [of sea-level rise] cannot be excluded.” Whatever has accelerated ice-sheet flow to the sea, the report said, might really take off with further warming—or not. “Understanding of these effects is too limited” to put a number on what might happen at the high end of sea-level rise, it concluded. Lacking such a number, the media tended to go with the comforting 0.34-meter projection or ignore sea level altogether.
Some scientists believe IPCC did as well as it could in assessing the sea-level threat. “Since 2001, nature has revealed some pretty remarkable behavior in the ice sheets,” says glaciologist Waleed Abdalati of NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland, who manages NASA's ice-observation program. That behavior has included the catastrophic collapse of the Larsen B ice shelf—which triggered glacier accelerations—and the galloping glaciers draining the Greenland ice sheet, which have doubled their pace. But “we just don't have the capacity to quantify” that sort of ice-sheet behavior, he notes, so “the best you can do is point to some red flags. The language of the SPM does that, if you're looking for it.”
Quantifying ice-sheet behavior does indeed have its limitations, says climate and sea-level modeler Jonathan Gregory of the University of Reading, U.K., who coordinated the production of the sea-level section in the IPCC projections chapter. A predictive model cited by IPCC would melt more Greenland ice as the air warms, he says, because that is a well-understood and quantifiable process. However, the model would not include the effect of that glacial meltwater lubricating the base of the Greenland ice sheet. Although researchers have seen signs that such lubrication speeds the ice sliding into the sea, they aren't yet able to model it. “If there are no models to give us some numbers,” says Gregory, “all you can do is make numbers up. It wouldn't be appropriate to make up numbers.”
Going beyond such physically based models—for example, by extrapolating from past trends—wouldn't be such a good idea either, Gregory says. Lessons taken from how sea level rose as the 20th century warmed, he says, would be useless in predicting sea-level rise in this century if the underlying causes change. “If you don't know what is causing the relationship” between warming and sea-level rise, he says, “is that really a good basis for making projections?”
A bolder assessment
Scientists are well aware of the hazards of straying far from the hard science of climate change, but some are eager to change the IPCC process and even move beyond it. They would begin with wording. “IPCC gets an A+ for scientific assessment,” says climate modeler Richard Somerville of the Scripps Institution of Oceanography in San Diego, California, “but a gentleman's C for communication.” The communication problem is largely a matter of structure, says geoscientist Michael Oppenheimer of Princeton University. “All the facts are there in the [main-report] chapter,” he says, “but the SPM didn't tie those facts together in a coherent statement of risk that would allow a policymaker to make an informed decision.”
Beyond the IPCC's language, a number of climate scientists think the report missed an opportunity to broaden public appreciation of the risk of the most dangerous climate change. “If you don't understand the physics, your uncertainty is larger,” says Thomas. That greater uncertainty extends the range of possible ice losses to higher, more dangerous levels. But IPCC didn't capture that increased risk, says climate modeler Michael MacCracken of the Climate Institute in Washington, D.C.
A big part of IPCC's problem, say MacCracken and others, was its strict adherence to the use of models. By IPCC standards, “if it's not in a model, it's speculation,” says Rahmstorf. By ignoring factors that can't yet be modeled, he says, IPCC came up with deceptively reassuring numbers.
Despite such warnings, some researchers are generating numbers for public consumption by going beyond physics-based models. In a paper published in Science in January, too late for the IPCC to consider it, Rahmstorf took “a semiempirical approach to projecting future sea-level rise.” He determined how much sea level rose in the 20th century per year per degree and projected that rate through the 21st century, with its expected warming. That projection produced a sea-level rise in 2100 of 0.5 to 1.4 meters above the 1990 level, well above the IPCC's projection of 0.18 to 0.59 meter.
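The semiempirical idea can be illustrated with a back-of-envelope calculation: assume sea level rises at a fixed rate per degree of warming above a baseline, then integrate that rate over an assumed 21st-century warming path. The coefficient and the warming scenario below are illustrative assumptions for this sketch, not figures taken from Rahmstorf's paper.

```python
# Minimal sketch of a semiempirical sea-level projection:
#   dH/dt = a * (T(t) - T0),
# i.e., the rise rate is proportional to warming above a baseline.
# The coefficient `a` and the warming ramp are illustrative assumptions.

def project_rise(a_mm_per_yr_degc=3.4, warming_1990=0.5,
                 warming_2100=3.5, years=110):
    """Integrate dH/dt = a * excess_warming in simple yearly steps."""
    total_mm = 0.0
    for y in range(years):
        frac = y / (years - 1)  # linear warming ramp, 1990 -> 2100
        excess = warming_1990 + (warming_2100 - warming_1990) * frac
        total_mm += a_mm_per_yr_degc * excess
    return total_mm / 1000.0  # meters

rise_m = project_rise()
print(f"Projected rise by 2100: {rise_m:.2f} m")  # lands inside the 0.5-1.4 m range
```

With these assumed inputs the integral comes out to roughly three quarters of a meter, comfortably inside the 0.5-to-1.4-meter span the article quotes; the real projection depends on fitting the coefficient to 20th-century observations and on the warming scenario used.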
Then, Rahmstorf and six co-authors, including Hansen, published a paper in Science on the day the IPCC report was released. They pointed out that warming had been running toward the high side of IPCC projections during the past few decades, while sea levels rose at the upper limit of projections. “These observational data underscore the concerns about global climate change,” the authors wrote. IPCC had clearly not exaggerated sea-level rise, they said, and may even have underestimated it. Reinforcing their message, news stories published a few days before the IPCC report's release quoted Rahmstorf and other scientists lamenting the expected shortcomings on sea-level projections.
That was more media attention than suited some climate researchers. “When we speak to the public, we should not rely on the new result,” argues Hans von Storch of the GKSS Institute for Coastal Research in Geesthacht, Germany. “The newest results are not necessarily the best ones. The IPCC should represent a certain filter. That every taxicab driver knows about [the latest result] is a bit premature.”
Some scientists would have IPCC reach even farther back to try to deal with “factors that you don't understand,” as MacCracken puts it. He notes that paleoclimatologists and geologists have extracted records of ancient sea level for times when Earth was warmer or colder than today. In the case of the penultimate warm interglacial 120,000 years ago, the globe was only about 1°C warmer—a temperature we could reach by 2100—but sea level was 4 to 6 meters higher. Even though that warmth had millennia to shrink the great ice sheets back then, MacCracken says, history still suggests that the world's ice is more vulnerable than IPCC's modeling implies.
Gregory calls projections drawing on such studies “a scientific hunch.” Hansen prefers “insight,” but whatever it is called, Hansen says, you won't find much of it in an IPCC report. IPCC “overall does a good job,” he says, but “there are limitations on that process. Everybody in [sea-level] research is much more concerned than 6 or 7 years ago” when the previous report came out, he says; yet the latest message from IPCC was seemingly unchanged. “There is a role for something in addition.”
As an example of an alternative to the IPCC report, Hansen cites the U.S. National Academy of Sciences' climate change report of 1979. Chaired by the late meteorologist Jule Charney, then at the Massachusetts Institute of Technology, the small committee delivered, among other things, a best estimate and range for the sensitivity of climate to greenhouse gases, a central figure in climate science. IPCC never ranged far from those numbers, and this year it confirmed them. “There were huge uncertainties back then,” says Hansen, “yet the Charney report came up with an estimate by coming at it from different angles. That's what you need to do with sea level.”
Hansen practices a multipronged approach himself. With colleagues at GISS, he draws on several lines of evidence—climate modeling, recent observations, paleoclimate records, and the basic physics of the greenhouse—to “gain insight into how the world works.” That approach helped him to see greenhouse warming under way in 1988, he says. Now it is revealing positive feedbacks in the ice-climate system that can allow modest warming to accelerate losses from the ice sheets. The world is on a “slippery slope,” Hansen has written, that could lead to meters of sea-level rise in the next century or two unless people take immediate actions to cut greenhouse gas emissions.
As in 1988, Hansen's pursuit of such insights has put him at odds with many in the climate community. “Maybe he's still within the error bars,” says glaciologist Robert Bindschadler of GSFC, but “I'm not prepared to put centuries on [the timing] rather than millennia.” No matter. Hansen has again taken to the bully pulpit as NASA's top climate scientist, publishing in peer-reviewed journals, testifying to Congress, writing op-ed articles, and appearing in documentaries. Only last week in a GISS press release announcing a new publication, Hansen warned of disastrous effects—including increasingly rapid sea-level rise—if greenhouse gas emissions continue apace for even a couple more decades. And a few days later, he took his boss, NASA Administrator Michael Griffin, to task for publicly questioning the need to tackle global warming. Hansen's take: “remarkably uninformed.”
Besides a streamlined IPCC process and individual scientist activism, Oppenheimer sees another approach: using expert elicitations to broaden the assessment of uncertainty. This survey technique grills selected experts in private on the state of the science. Without the drive to reach a consensus, the experts “give a different view of the probability of various outcomes,” says Oppenheimer. Ultimately, these and other methods will be needed, he says: “You really need a broad, thorough, and comprehensive assessment of uncertainty and risk. There is no one answer to the assessing of uncertainty.”
- ALZHEIMER'S DISEASE
A New Take on Tau
- Jean Marx
The search for drugs to combat neuron-destroying diseases is prompting researchers to take a fresh look at a familiar suspect
When it comes to combating Alzheimer's disease, neurobiologists need all the help they can get. So far, efforts to develop drugs that halt or reverse the relentless brain degeneration caused by the disorder have met with only modest success. The few approved therapies slow cognitive decline, but only temporarily, possibly because they don't target the root cause of the disease.
Drugs that may do that are in the pipeline, however. They're aimed at reducing production of amyloid beta (Aβ), a protein fragment thought to be the instigator of the nerve cell death driving Alzheimer's disease (Science, 3 November 2006, p. 781). And now researchers are taking a closer look at another possible target, a protein called tau that is involved in the pathology of a number of neurodegenerative diseases, including Alzheimer's.
Tau has taken a back seat to Aβ for the past several years, but recent work with animal models and with cells in lab culture suggests that focusing on tau could pay off. Treatments that reduce the formation within the brain of certain mutant taus, or even of the normal protein itself, can alleviate memory loss and other neurological deficits in mice that have been genetically engineered to develop brain pathology similar to that in human Alzheimer's brains.
Although a lot more work will be needed before any of these early experiments pay off in the clinic, tau researchers are more optimistic than before. The findings “open up a new potential [Alzheimer's] treatment; reducing tau might complement the antiamyloid strategy,” says Lennart Mucke of the University of California, San Francisco, School of Medicine, whose lab is among those doing the work.
And big pharma is taking note as well. For example, Merck & Co. Inc. has just hired Michael Hutton, a tau researcher at the Mayo Clinic in Jacksonville, Florida, to head up a new effort to develop tau-based drugs for Alzheimer's disease. “I'm putting my money where my mouth is,” Hutton says. “I do think tau is a great therapeutic target.”
For more than 20 years, neurobiologists have known that both Aβ and tau are prominent in the abnormal structures that stud the brains of Alzheimer's patients: Aβ is located in the so-called plaques that form outside dead and dying nerve cells, whereas tau appears inside neurons in a mesh of proteins called neurofibrillary tangles (NFTs). For a time, a controversy raged about which protein causes the brain neurons to degenerate. The tide began to turn in Aβ's favor after the discovery about 16 years ago that mutations in the APP gene, which makes the larger protein from which Aβ is clipped, cause some hereditary Alzheimer's cases.
Indeed, no mutations in the tau gene have ever been linked to the disease, and today even many tau experts concede that Aβ-linked abnormalities initiate brain neuron loss. “There's no doubt that Aβ [toxicity] is a fairly early event in Alzheimer's disease pathology and that tau is likely downstream,” says Christopher Eckman, also at the Mayo Clinic in Jacksonville.
But tau researchers got a boost about 10 years ago, when several teams found that mutations in the gene for the tau protein cause some cases of a less common but devastating dementia known as FTLD (for frontotemporal lobar degeneration). This discovery showed that abnormal forms of the protein can cause nerve degeneration and as a result, memory loss and other neurological deficits. The work also buttressed the case that counteracting tau's effects might help Alzheimer's patients. “Focusing on tau [for Alzheimer's therapy] makes perfectly good sense,” says Zaven Khachaturian, senior science adviser to the Alzheimer's Association in Chicago, Illinois.
In the past few years, researchers have begun to test the idea of targeting tau, using various mouse models of Alzheimer's disease. The most recent example, described in the 4 May issue of Science (p. 750), comes from Mucke's group. These researchers used a genetically altered mouse strain that carried a mutant human APP gene and varying numbers of the mouse tau gene—either zero, one, or the normal complement of two copies. Because these mice carry the mutant APP gene, all their brains developed structures resembling the amyloid plaques of Alzheimer's brains. But as is commonly seen in such modified mice, none had tangles or neuron loss.
Mucke and his colleagues found that, as the animals aged, those with two tau gene copies became impaired in their ability to learn the Morris water maze, which requires that the animals find an underwater platform. Animals with one copy were less impaired, whereas those with no tau gene learned as readily as normal controls did—results indicating that the absence of tau somehow prevents the behavioral deficits that would otherwise occur in animals with mutant APP.
Yet the tau reduction had no effect on Aβ deposition. The animals lacking tau “had brains full of amyloid plaques but could solve the water maze in a snap,” Mucke says. Further work indicated that tau might contribute to neuronal malfunction by making brain neurons hyperexcitable, which can ultimately lead to their death.
Other researchers have also found that reducing tau can alleviate memory loss in mouse models. For their experiments, a team led by Karen Hsiao Ashe of the University of Minnesota Medical School in Minneapolis and the Mayo's Hutton created a strain of mice bearing a mutant human tau gene combined with regulatory sequences that allowed it to be turned off by the antibiotic doxycycline. As the researchers reported about 2 years ago, as these mice age, they accumulate NFTs in brain neurons where the tau gene is active.
What's more, the animals' brains shrink as a result of nerve cell death, and the mice show a marked deterioration in their ability to learn the water maze. But suppressing expression of the mutant tau gene with doxycycline improved the animals' memories and halted the neuronal losses without affecting NFT accumulation (Science, 15 July 2005, p. 476).
Last year, Frank LaFerla and his team at the University of California, Irvine, reported on a mouse model that they engineered to develop both plaques and tangles. Treatment with two antibodies, one directed at Aβ and the other at tau, preserved the animals' ability to learn. Both antibodies were needed. “If we reduce Aβ without reducing tau, we don't improve [the animals'] learning and memory behavior,” LaFerla says.
But which tau?
In addition to showing that tau can play a role in cognitive decline, these studies also shed light on what Hutton calls a “massive uncertainty”: What form of tau damages neurons? Identifying the culprit could be important for guiding efforts to develop therapies; those trying to stop Aβ buildup, for example, continue to wrestle with whether to target soluble or insoluble forms.
For most of the 100 years since Alois Alzheimer described the plaques and tangles in his patients' brains, researchers focused on the tangles with their insoluble tau as the species at fault. But Ashe and her colleagues found that when they turned off tau formation in their animals, NFTs continued to accumulate despite the other improvements. Similarly, the LaFerla team's antibodies were directed at soluble tau as well as soluble Aβ and did not reduce plaques or tangles. Some form of tau “before the tangles [develop] is causing the memory problems,” Ashe concludes.
Further evidence that a nontangle form of tau causes neuronal damage comes from Virginia Lee, John Trojanowski, and their colleagues at the University of Pennsylvania School of Medicine in Philadelphia. When these researchers tracked the changes in brain neurons in mice bearing a mutant tau gene, they found that the synapses, which are the connections between neurons, deteriorated at 3 months of age—long before NFTs appeared.
By this same early age, microglial cells, the brain's immune cells, became activated, presumably due to the presence of the mutant tau, the researchers reported in the February issue of Neuron. This suggests that the microglia might be contributing to the neuronal damage by causing inflammation—an idea that got a boost when Lee, Trojanowski, and their colleagues treated the animals with an immunosuppressant. “We delayed everything,” Lee says. “The tangles decreased, the neuronal loss decreased, and the animals lived longer.” These results tie in with other evidence suggesting that inflammation plays a role in Alzheimer's etiology and that anti-inflammatory drugs might help.
Still, tau abnormalities could contribute in other ways to neuronal degeneration and these, too, suggest therapeutic strategies. Normal tau binds to, and stabilizes, the microtubules, which run through the neuronal axon and help transport nutrients, proteins, and other materials back and forth between the cell body and the synapse. For reasons not yet understood, tau becomes excessively phosphorylated in Alzheimer's brains and as a result no longer binds properly to the microtubules. The microtubules in turn deteriorate, ultimately leading to nerve cell death.
Some drugs, including the taxols used to treat breast cancer, stabilize microtubules. About 2 years ago, Lee, Trojanowski, and their colleagues showed that one of these drugs, paclitaxel, improved axonal function and ameliorated the neurological problems of mice carrying a human tau gene. Lee notes, however, that the taxols and other microtubule-stabilizing drugs kill dividing cells and are thus too toxic, especially for long-term use. Her group is currently working to develop less toxic taxol derivatives.
Other researchers are taking a different tack, aiming to inhibit the kinases that add phosphates to tau. Some of these have shown promise in animal models. For example, Hutton's team, working with that of Hanno Roder at Sirenade Pharmaceuticals in Martinsried, Germany, found that an inhibitor of a kinase called ERK2 reduced the excessive tau phosphorylation occurring in mice carrying a human tau gene and also reduced the animals' difficulties in moving.
Similarly, LaFerla and his colleagues have evidence from their mouse model that interventions thought to be protective against Alzheimer's disease, including learning interventions and dietary intake of omega-3 fatty acids, might work partly by decreasing levels of enzymes that phosphorylate tau. But at least in their model, Aβ reductions also appear to be necessary for neurological protection.
All of this work in mice raises the obvious question of whether the results are relevant to Alzheimer's disease and other so-called tauopathies such as FTLD. But there is at least one hint that inhibiting tau phosphorylation could help people afflicted by the conditions. The drug memantine is one of the few approved for treating Alzheimer's disease. And although it wasn't designed to inhibit phosphorylation, Khalid Iqbal and his colleagues at the New York State Institute for Basic Research in Developmental Disabilities on New York's Staten Island have evidence from cell studies that the drug decreases tau phosphorylation and inhibits neurofibrillary degeneration. “I don't understand,” Iqbal says, “how anyone can think about Alzheimer's and not think about tau.”
- SMALLPOX VACCINE
A Tame Virus Runs Amok
- Jocelyn Kaiser
The massive response that saved a child from a rare infection demonstrated the strength of U.S. medicine—and the vulnerability of U.S. biodefenses
On 3 March, doctors treating a 2-year-old boy in a Chicago hospital were alarmed by the angry rash that covered his body. They checked out the usual suspects—chickenpox and herpes simplex, which can have serious consequences for children, like this one, with eczema. But lab tests ruled out both. As the rash worsened days later, forming ring-like, indented pustules on half of the boy's body, the hospital's staff learned that the father, a soldier in Iraq, had been vaccinated for smallpox several weeks before. The vaccine—or contact with recent vaccinees—can trigger a life-threatening rash in people with eczema, one that looks much like smallpox itself.
What happened next was a race straight out of a TV medical drama. Doctors and at least two dozen outside experts, including a veteran of the smallpox eradication campaign, worked around the clock for days to diagnose and treat the boy's disease, known as eczema vaccinatum. Ultimately, they made a bold decision to try an experimental drug; it may have helped save the child's life. Today, despite severe skin loss, the boy is back at home in Indiana, healthy except for possible scarring where the pustules formed.
The incident, described in the 18 May issue of Morbidity and Mortality Weekly Report (MMWR), offers a mixed view of U.S. efforts to defend against a bioweapons attack, had this been the first in a wave of smallpox cases. On the one hand, a half-dozen federal and local agencies deserve good marks for working together to diagnose and treat a mysterious infectious disease. The experimental drug was developed with funds from the U.S. biodefense research program. But as a test of how quickly the system might contain a disease, the results seem not so good: If the boy's illness had been caused by a real smallpox release, notes poxvirus program chief Inger Damon of the Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, there likely would have been casualties.
The MMWR report also highlights the risks of the Department of Defense's (DOD's) smallpox vaccination program, which uses a vaccine formulation that many consider outmoded (Science, 20 December 2002, p. 2312). Moreover, the soldier should not have been vaccinated, the MMWR report says. “It does not mean the program is broken,” because the rate of side effects has been very low overall, says infectious-disease physician Luciana Borio of the Center for Biosecurity at the University of Pittsburgh in Pennsylvania, but the incident underscores the need for safer vaccines, which are now in the works.
The current smallpox vaccine is an old formulation made with live vaccinia virus, a relatively benign cousin of the variola virus that causes smallpox. Serious side effects in immunocompromised patients and children are rare but well-known; former chancellor of the University of Colorado Health Sciences Center Vincent Fulginiti, 75, an expert on these adverse events, saw eczema vaccinatum cases often in the 1960s through 1972, when children were routinely immunized for smallpox.
After wild smallpox was declared eradicated in 1979, vaccinations continued only for members of the U.S. military. They had to be protected, the government decided, because they might be the first to encounter the lethal virus if secret stocks were deployed by an enemy. That program ended in 1990. In 2002, however, in the wake of the post-9/11 anthrax attacks, the Bush Administration resumed vaccinations for soldiers, but included screening so that people at risk for side effects would be excluded.
Why the father of the Indiana boy was not screened out is not known. Before he was vaccinated in January in anticipation of deployment to Iraq, he had received an educational pamphlet and briefing warning about possible side effects. He was also given a form asking whether he had a history of eczema, which he did. People with this skin condition or those who have household contacts with it are not supposed to receive the smallpox vaccination. For privacy reasons, DOD declined to disclose the soldier's answers on the screening form.
Then, in February, when his deployment was delayed, the soldier made a visit home. Although a scab from the vaccination had fallen off, he later told DOD, he kept the site covered as instructed. In late February, his 28-month-old son, who also had eczema, developed a fever and rash. On 3 March, after his family visited an Indiana emergency room, the child was transferred to Comer Children's Hospital, part of the University of Chicago in Illinois.
Doctors there took skin samples from the child and tested them for herpes and varicella; the tests were negative. Meanwhile, the lesions had spread over about 80% of the child's body and had worsened: Each pustule was circular, with a dimple in the middle, says University of Chicago pediatric infectious-disease physician John Marcinak, who coordinated the boy's care. Another viral culture tested positive for an unidentified virus. On 7 March, Marcinak says, he and pediatric dermatologist Sarah Stein, who by this time had been told of the father's vaccination, began to suspect vaccinia: “From a clinical standpoint, it all fit.”
Marcinak and colleague Surabhi Vora contacted CDC, which set up conference calls with various CDC and Army experts to analyze digital photos. The Illinois Department of Public Health also supplied polymerase chain reaction tests of skin scrapings from the child and his mother, who had developed a milder rash. Those tests showed the presence of a poxvirus. At that point, says Fulginiti, who was on the daily calls, he knew “it was eczema vaccinatum.”
The Chicago doctors began the standard treatment, injecting vaccinia immune globulin, or antibodies from people given the vaccine, which a U.S. marshal had hand-carried on a flight from CDC's biodefense stockpile. The boy was given narcotics for pain, sedated, and put on a ventilator. Two days later, his condition worsened. “We weren't sure he was going to survive,” Marcinak says. Next, his team tried a second-line drug, cidofovir, normally stocked by hospitals for AIDS patients with cytomegalovirus eye infections but allowed by the U.S. Food and Drug Administration (FDA) to be used off-label for complications from smallpox vaccination. Cidofovir seemed to be of limited help; the boy's abdomen had filled with fluid and his kidneys were failing, a condition that could have been exacerbated by the second drug.
On the next conference call on Saturday, 10 March, CDC's Damon suggested one more thing: an experimental smallpox antiviral drug, ST-246. “The condition of the child on the weekend was just so grim,” Damon says, that they wanted to try anything that might give him a chance.
The drug had first been discovered through a routine small-molecule screen for compounds that inhibit poxviruses and was later refined to reduce its toxicity. With funding from the National Institutes of Health's biodefense program, SIGA Technologies Inc. in Corvallis, Oregon, reported last year that, in experiments at CDC, ST-246 protected monkeys against variola infection; the company has also conducted a phase I clinical safety trial. Hours after Damon suggested using ST-246 that Saturday, FDA approved emergency use of the drug to treat the sick boy.
That night, SIGA Chief Scientific Officer Dennis Hruby tucked a vial in his pocket and flew to Chicago from Corvallis in a private jet paid for by billionaire Ronald Perelman, an investor in SIGA. The boy got an initial dose through a stomach tube Sunday morning. “The next day, he started to slowly get better,” Marcinak says. He also received skin grafts.
On 7 April, the boy moved out of the intensive care unit into a regular bed; 12 days later, he went home. “He looks very good right now,” says Marcinak. The grafts are healing well, he says, but it is too soon to tell whether he will have scarring.
Researchers say they may never know exactly what saved the boy's life because he was receiving three drugs (cidofovir lingers in the body for a week) as well as extraordinary medical care. Moreover, analyses of daily blood samples that could reveal when virus levels dropped have been inconclusive so far, Damon says. Still, she adds, “I think it [ST-246] was certainly a component” in the boy's recovery.
SIGA received $16.5 million from the National Institute of Allergy and Infectious Diseases last fall to conduct further clinical trials and scale up production of ST-246. But whether the federal government will eventually purchase ST-246 for the biodefense stockpile is not known, Borio says.
Some public health experts are now raising questions about DOD's procedures for identifying soldiers for whom the smallpox vaccine is too risky. But others say DOD, which has excluded 116,000 of 1.3 million potential vaccinees, has done a good job. Overall, side effects have occurred at “a much, much lower rate than would have been expected,” says vaccine expert John Modlin of Dartmouth Medical School, including the one eczema vaccinatum case and 61 cases of mild vaccinia rash in contacts. But the Indiana case is “a very important reminder that we can't become lax about screening,” he says.
Dryvax, the old vaccine, apparently has other side effects as well, including myopericarditis, inflammation of the heart, which has occurred in 140 soldiers so far. Most people recover, says Modlin, but in a few, heart muscles may be permanently weakened.
A new vaccine produced by cell cultures was supposed to be slightly safer. But data presented last month to the FDA vaccine advisory panel by the manufacturer, Acambis in Cambridge, U.K., suggest a comparable rate of myopericarditis cases, says Modlin, a panel member. Although the panel found the vaccine safe and effective for the U.S. stockpile, “in my view, [the myopericarditis risk] should” prompt DOD to conduct a new risk-benefit analysis.
It is sobering to consider what might have happened if the Indiana boy's rash had been the first case in a wave of smallpox cases from a bioterror attack. The intense response—more than 20 experts on daily conference calls for a month—could not have been sustained for dozens of patients or more, Damon notes. The current system would have been overwhelmed “if 10,000 people had had an [smallpox] infection,” says Hruby. That's why it is so important, he says, to develop standard therapies that can be tested in advance and kept on the shelf.
- 210TH MEETING OF THE AMERICAN ASTRONOMICAL SOCIETY
Black Holes: Galactic Homebodies?
210TH MEETING OF THE AMERICAN ASTRONOMICAL SOCIETY, 27-31 MAY, HONOLULU, HAWAII
Rogue black holes hurtling through space may be nothing to worry about, a new analysis concludes.
When galaxies collide, the black holes at their cores should eventually merge in a violent collision, imparting a high-velocity kick that might shoot the newly merged black hole out of the joined galaxies. In fact, recent simulations have suggested that the recoil velocity could occasionally reach 3000 kilometers per second, more than enough for the black hole to escape its galaxy's gravity. Yet searches do not find galaxies without central black holes. “When we actually look around at galaxies, all galaxies appear to have black holes,” says Christopher Reynolds of the University of Maryland, College Park. “So why aren't there these empty-nest galaxies?”
In a new search for black holes traveling away from their host galaxies, Erin Bonning of the Observatoire de Paris, Meudon, and collaborators at the University of Texas, Austin, examined 2600 quasars cataloged by the Sloan Digital Sky Survey. Quasars are fountains of energy powered by supermassive black holes at the centers of distant galaxies. A black hole being ejected from its host galaxy would drag some of the surrounding matter with it and would retain its quasarlike appearance. But the quasar's extra motion would make its redshift differ from that of the galaxy. The survey saw no convincing evidence for such shifts, the astronomers said at the meeting.
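The redshift offset such a search would look for follows from the nonrelativistic Doppler relation (the 3000 km/s figure is the extreme kick from the simulations above, used here purely as an illustration):

```latex
% Redshift offset of a kicked quasar relative to its host galaxy, v \ll c:
\Delta z \approx \frac{v}{c} = \frac{3000\ \mathrm{km/s}}{3\times10^{5}\ \mathrm{km/s}} \approx 0.01
```

An offset of order 0.01 in redshift is well within the precision of survey spectroscopy, which is why a null result across 2600 quasars is meaningful.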
“For some reason, we do not get large kicks in quasars,” said Bonning. “This is an interesting nondiscovery.”
Astrophysicists at the University of Maryland, College Park, presented a possible explanation for the lack of large kicks. The simulations forecasting high-velocity ejections assume that the merging black holes are spinning rapidly, with their spin axes in the same plane as their galactic orbits. According to these models, configurations capable of kicking the final black hole out of the merged galaxies should occur in about 10% of galactic collisions.
But such a spin orientation is unlikely to persist throughout the merger process, Tamara Bogdanovic and her Maryland colleagues reported at the meeting. In gas-rich merging galaxies, their analysis showed, the two black holes will become embedded in a spinning disk of gas that will twist their axes until the black holes are spinning upright. As a result, the kick velocity after their merger will be less than 200 km per second, far short of the speed required to escape the merged galaxies' gravity.
“Ejection should not be common in such cases,” Bogdanovic says.
Collisions of gas-poor galaxies might still eject black holes with high velocities, she says. But without surrounding gas to give off radiation, such wandering black holes would be extremely difficult to detect.
Another possible explanation for the missing rogue black holes is that they are not initially spinning rapidly enough to be ejected, but that seems unlikely, said Reynolds. A survey of black hole spins, reported by Maryland graduate student Laura Brenneman, found a range of spin rates, some approaching the maximum possible spin velocity.
“So the logical possibility,” Reynolds said, “is that something upsets the orientation of the black holes as they are merging.”
- 210TH MEETING OF THE AMERICAN ASTRONOMICAL SOCIETY
Exoplanet Jackpot Shows Astronomers Are Looking for Worlds in All the Right Places
210TH MEETING OF THE AMERICAN ASTRONOMICAL SOCIETY, 27-31 MAY, HONOLULU, HAWAII
As news stories go, finding a new extrasolar planet is now about as surprising as another athlete arrest or celebrity trip to rehab. But recent, rapid increases in the number of known exoplanets have begun to reveal some noteworthy patterns, with implications for understanding the formation of planets and the prospect for life on some of them.
Planets around distant stars typically betray their presence when astronomers observe wobbles in stellar motion induced by the subtle tugs of the planet's gravity. Until recently, the search for such wobbles focused mostly on stars similar to the sun. Now surveys of other star types, from dim red dwarfs to massive subgiants, are showing that all varieties of stellar parents produce planetary offspring.
“Every sort of star we've looked at has a planet of some sort,” says Alan Boss of the Carnegie Institution of Washington.
At the meeting, planet hunter Jason Wright of the University of California (UC), Berkeley, noted that 28 newcomers have been added to the planetary roster in the past year, bringing the total to 236. Of the recent additions, four orbit “A stars,” said UC Berkeley's John A. Johnson.
In general, A stars, which are more than 1.3 times the mass of the sun, are poor candidates for planet searches, as their high temperatures and rapid spins make it hard to get precise velocity measurements by measuring shifts in the color of the light they emit. But as A stars age, they evolve into “retired A stars,” expanded subgiants with cooler temperatures and slower spins, making planet detection more feasible.
Using the Keck Observatory in Hawaii and the Lick Observatory in California, Johnson and collaborators scanned 150 subgiants, adding the four planets orbiting retired A stars to the six that had been discovered previously. The 10 A-star planets orbit at greater distances than planets of sunlike stars do, Johnson notes. All but one are farther from their parent stars than Earth is from the sun, and none is closer than 80% of the Earth-sun distance.
“This is very intriguing,” Johnson said. “It's probably telling us something about how planets migrate in closer to the star from where they were born and how stellar mass affects that migration process.”
A stars also appear more likely than other stars to possess Jupiter-sized planets. Jupiters are found in orbit around 4% to 5% of sunlike stars; 8% to 9% of retired A stars have them, Johnson said.
Such findings hint that increased attention to A stars could further inflate the planetary lineup. “They're twice as likely to have a planet as a sunlike star,” says Johnson. “This means that there's probably a treasure trove of planets waiting for us around these retired A stars.”
On the lightweight side of the solar-mass divide, smaller, dimmer stars known as red dwarfs, or M stars, have also begun to attract more attention from planet hunters. A recent media frenzy accompanied news of a planet orbiting the red dwarf Gliese 581 because of its potential habitability (Science, 27 April, p. 528). The planet's orbit may be locked around its star in a way that keeps one side in the sun and one side in the shade, and the shady side's temperature may resemble Hawaii's, said Edward Guinan of Villanova University in Pennsylvania.
Guinan and collaborators have been studying the long-term habitability of M-star planets, particularly with regard to x-ray or ultraviolet radiation emissions that would endanger life. The danger would be especially high during the early days of the star's life, they reported, because young red dwarfs emit frequent stellar flares that would bathe nearby planets with deadly radiation. But nascent life on planets with atmospheres and magnetic fields would be protected from such emissions, Guinan noted.
After about 2 billion years, red dwarfs settle down into an exceptionally long period of stability, as they burn their nuclear fuel slowly and maintain a fairly constant temperature for billions of years. That makes long-term prospects brighter for life on M-star planets than on Earth, where the sun will become intolerably hot in just a few billion years.
“M stars … are suitable hosts for harboring life,” said Guinan. “Your habitable zone will last for 20, 30, 40 billion years as the star's magnetic activity dies down.”
M stars are the most numerous stars in the galaxy, and many are close enough to Earth to be available for detailed follow-up study, including attempts to directly image planets themselves.
“All these results make the case strongly that we're on the right track” for success in such imaging missions, said Boss.
“These studies are showing us that the frequency of planetary systems is really larger than what one might have guessed as recently as 10 years or so ago,” he said. “Planetary systems are not rare oddballs. They really are quite common.”
- 210TH MEETING OF THE AMERICAN ASTRONOMICAL SOCIETY
'Pristine' Galaxy Gives a Glimpse of Purity
210TH MEETING OF THE AMERICAN ASTRONOMICAL SOCIETY, 27-31 MAY, HONOLULU, HAWAII
A small but speedy dwarf galaxy hurtling toward the Andromeda galaxy has provided astronomers some reassurance about their favorite model of galaxy formation, along with hope for improved understanding of star formation.
Andromeda XII, discovered in 2006, is dim even by dwarf galaxy standards, with a brightness of only about 100,000 suns. But new results, reported at the meeting, show that it is exceptionally fast, falling toward the Andromeda galaxy at 281 kilometers per second. At that speed, it may be moving too rapidly to be trapped by the gravity of the Local Group of galaxies, which includes Andromeda and our own Milky Way.
Jorge Peñarrubia of the University of Victoria in British Columbia and his colleagues established the speed of Andromeda XII using the DEIMOS spectrograph at the Keck II telescope on Mauna Kea, Hawaii. (A paper on the new findings is to be published in the Astrophysical Journal.) The quickness of Andromeda XII suggests that it formed far from the Local Group and is making its first visit, says Peñarrubia. The dwarf's arrival, he adds, fulfills predictions by the standard model of galaxy formation from cold dark matter, exotic nonluminous material that dominates the mass of both dwarfs and larger galaxies.
Dwarf galaxies contain from 50 to 500 times as much dark matter as ordinary matter, Peñarrubia said: “According to cold dark matter theory, the dwarfs are building blocks of galaxies. If you understand dwarfs, you learn about how galaxies form.”
If it is entering the Local Group for the first time, Andromeda XII should be free from gravitational disturbances induced by the gravity of the group's larger galaxies. “Andromeda XII is pristine; its properties are those it had when it was formed,” said Peñarrubia. Consequently, it offers astronomers an unusual opportunity to study star formation and dark matter distribution. In dwarf galaxies that have already entered the Local Group, gravitational interactions may have altered the location of dark matter and obscured the history of star formation.