News this Week

Science  25 Feb 2005:
Vol. 307, Issue 5713, pp. 1178
  1. ASTROPHYSICS

    Giant Neutron-Star Flare Blitzes the Galaxy With Gamma Rays

    1. Robert Irion

    WASHINGTON, D.C.—Starstruck astrophysicists are agog over an explosion toward the center of our galaxy that irradiated Earth with gamma rays and x-rays shortly after Christmas. The outburst was brighter than any solar eruption ever measured, even from its estimated distance of tens of thousands of light-years. The source, scientists believe, was the most exotic kind of neutron star: a “magnetar” shot through with twisted magnetic fields, powerful enough to shred the ultradense surface of the tiny object.

    Lasting just one-fifth of a second, the flare released as much energy as our sun produces in 250,000 years. It spawned a fireball that radio telescopes see blasting into space at 30% of the speed of light. That pattern—an intense spike of energy and a fading afterglow—leads researchers to suspect that they caught a miniature gamma ray burst (GRB), startlingly close to home.

    Although the biggest GRBs probably arise from giant stars that collapse into black holes in remote galaxies, magnetar flares in nearby galaxies might produce some of the short bursts that no one could explain until now. “For all practical purposes, this could be a short GRB in our own galaxy,” says astrophysicist Chryssa Kouveliotou of NASA's Marshall Space Flight Center in Huntsville, Alabama. “It's an amazing event, something I never expected to see in my lifetime.”

    Other scientists used similar superlatives at a NASA press conference here on 18 February, convened well ahead of several reports about the flare slated to appear in Nature. NASA officials fretted that the story would elude their control, thanks to other papers already posted online. The hasty publicity alienated members of some teams, who felt excluded until NASA agreed to cite their work during the briefing.

    But the flare's glow should erase those hard feelings as observers and theorists race to understand its origins. The fierce shell of radiation tripped detectors on about 15 satellites and solar-system probes on 27 December, says astronomer Kevin Hurley of the University of California, Berkeley. Scientists quickly traced the flare to a magnetar called SGR 1806–20, in a crowded and dusty part of the sky toward the Milky Way's center.

    Sizzling shell.

    Radiation floods the galaxy from an ultramagnetic neutron star (left). The intense pulse (right, top graph) ionized Earth's daytime atmosphere down to startlingly low altitudes (bottom graph).

    CREDITS (LEFT TO RIGHT): NASA; UMRAN INAN/STANFORD UNIVERSITY

    Of the dozen or so suspected magnetars astronomers have identified so far, SGR 1806–20 is the third to spew a giant flare seen by modern telescopes. The two previous outbursts, in 1979 and 1998, helped researchers devise their theory that SGR 1806–20 and its kin are slowly rotating neutron stars powered by the most intense magnetic fields known (Science, 23 April 2004, p. 534). The latest flare—about 100 times stronger than the others—probably arose from a sudden transformation of the entire neutron star, theorists believe.

    The flare's impact on Earth was stunning. It struck during the daytime over the Pacific Ocean, from a spot on the sky just 5 angular degrees from the sun. Very-low-frequency radio transmissions between Hawaii and Antarctica showed that Earth's ionosphere compressed tens of kilometers inward from its usual daytime altitude of about 70 kilometers, says atmospheric physicist Umran Inan of Stanford University in California. Moreover, the disturbance lingered for an hour. “That's totally unheard of,” says Inan, who thinks the radiation altered chemical reactions that typically restore the ionosphere after solar flares.

    “I'm awestruck by that,” says Dale Frail of the National Radio Astronomy Observatory in Socorro, New Mexico. “This little thing near the center of our galaxy reached out and physically invaded us, even though it's just 20 kilometers across.”

    The flare's position near the sun prevented most telescopes and satellites from taking a close look. But radio dishes across the globe—immune to sunlight—took aim in the days after. Most notably, two teams used the Very Large Array of 27 radio telescopes in Socorro to look for a glowing blast wave. “All hell broke loose seconds after the data came in,” says Bryan Gaensler of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts. “[The afterglow] was 500 times brighter than we expected. I thought we had pointed the telescope at the wrong place.”

    Meanwhile, astrophysicists worked furiously to figure out the flare's actual energy, a tough task because most radiation detectors were so swamped. For example, NASA's Swift satellite—launched in November to study GRBs—was overwhelmed by nearly a trillion photons that pierced the body of the spacecraft, says astrophysicist Neil Gehrels of NASA's Goddard Space Flight Center in Greenbelt, Maryland. But fingernail-sized particle detectors on a few satellites kept up with the onslaught, enabling researchers to calibrate the flare's 0.2-second fury.

    Two independent teams, led by Hurley and by astrophysicist David Palmer of Los Alamos National Laboratory in New Mexico, conclude in their upcoming Nature papers that the energy flux resembles the spikes of radiation seen from short GRBs in other galaxies. But opinions differ over just how many magnetar flares pop off within about 300 million light-years from the Milky Way, the expected viewing range of satellites. “This type of event conceivably could explain all of [the short GRBs],” says Palmer. Others argue that most short GRBs still must arise from different sources—colliding neutron stars, for instance.

    Debate also rages about the physics that powered the flare. Four years ago, David Eichler of Ben-Gurion University in Beer-Sheva, Israel, forecast that a wholesale decay of magnetic fields outside a magnetar could spark a supergiant flare. Magnetar theorists including Robert Duncan of the University of Texas, Austin, now propose a variation: The flare arose from wound-up magnetic fields deep inside the neutron star that catastrophically “untwisted” near and above the surface. As the fields snapped into new configurations, the theorists believe, they sheared the entire surface of the star and unleashed a 2-billion-degree fireball of electrons and positrons.

    As SGR 1806–20 moves safely away from the sun, more observatories will examine the fading outburst—including infrared telescopes on the ground and x-ray telescopes in space. Astrophysicists eagerly await each result, creating a tumult that the field has not seen since Supernova 1987A flared into view 18 years ago this month. “I heard stories that the supernova was a life-changing experience for many astronomers,” says Gaensler. “I now understand why.”

  2. HUMAN ORIGINS

    Battle Erupts Over the 'Hobbit' Bones

    1. Elizabeth Culotta

    Research on human fossils generally proceeds at a leisurely pace. Those who discover new bones sometimes take years to analyze them, while their colleagues and rivals wait impatiently to get a good look. But that's not the case with the 18,000-year-old “hobbit” skeleton of Indonesia. Ever since the Australian-Indonesian team that discovered the bones made the startling claim that they are the remains of a species of small, archaic human, Homo floresiensis (Nature, 28 October, p. 1055), the bones have been analyzed and reanalyzed at a breathtaking pace. For the past 3 months, however, the studies have been directed not by the discoverers but by a rival who has taken possession of the skeleton.

    The bones were discovered in 2003 in the Liang Bua cave on the Indonesian island of Flores by a team led by archaeologist Mike Morwood of the University of New England, Armidale. But in November last year, the Center for Archaeology in Jakarta agreed to let Indonesian paleoanthropologist Teuku Jacob study the skeleton in his laboratory at Gadjah Mada University in Yogyakarta (Science, 12 November 2004, p. 1116). Jacob has since invited other researchers to inspect it and, in a move that Morwood calls “unethical” and “illegal,” asked a team from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, to conduct DNA and other analyses on a 1-gram sliver of rib. Jean-Jacques Hublin, director of the department of human evolution at the Max Planck, who carried the sample to Leipzig, counters that he has “formal authorization” from Tony Djubiantono, head of the Jakarta archaeology center, to analyze the sample.

    Jacob has announced that, in his view, the skeleton is that of a modern human pygmy with microcephaly. “There are [living] pygmies near there—near Liang Bua,” he notes. Last week, three paleoanthropologists, two of whom had publicly challenged Morwood and his colleagues' analysis—Maciej Henneberg of the University of Adelaide, Alan Thorne of Australian National University in Canberra, and Robert Eckhardt of Pennsylvania State University—announced, after examining the bones with an Australian camera crew looking on, that they agree with Jacob. Morwood calls this interpretation “mind-boggling.”

    Bone of contention.

    Teuku Jacob has commissioned DNA analysis of the skeleton.

    CREDIT: AGUS SUPARTO/AFP/GETTY IMAGES

    DNA analysis could settle the case. If “the DNA sequence falls outside the variability seen in modern human sequences, then we could be confident that it's not a modern human,” says Svante Pääbo of the Max Planck. But he estimates that the sample has a “less than 50%” chance of yielding ancient DNA. And “if it carries a sequence similar to a modern human, we will not be able to exclude contamination [with DNA from those who have handled the sample] completely,” says Pääbo.

    Max Planck scientists also may do stable isotope analysis, which offers clues to diet, and radiocarbon dating. “Professor Jacob has the fullest authority to ask us to perform this analysis,” says Hublin, who has worked with Jacob for years.

    Behind the infighting lies the question of who should have the right to study important fossils. Jacob argues that for decades, archaeologists have brought bones excavated from Liang Bua to his laboratory for anatomical analyses. But Morwood points out that he has a signed agreement with the archaeological center, the official repository for the bones, allowing his team to work with the center's scientists to study the specimen. “We haven't been able to do the analysis we'd want to do because we haven't seen the specimen since [Jacob] took it in November,” says geologist and Morwood colleague Bert Roberts of the University of Wollongong. The deadline for returning the bones was 1 January, but Djubiantono has extended it twice.

    Last week Jacob told Science that he had “almost finished” analyzing the specimen. But Morwood, speaking from the field in East Java, says he is not optimistic about the bones' return to Jakarta. “We're moving on with our research,” he says, planning publications based on previous measurements and also mounting new field expeditions to gather more material. Jacob says he, too, plans “a series of articles on different aspects, with colleagues.” The conflict continues—and for now, at least, so does the research.

  3. SOUTH KOREA

    Radical Reforms Would Shake Up Leading Science Institute

    1. Mark Russell*
    1. Mark Russell is a freelance writer in Seoul.

    SEOUL—The Korean government was looking for fresh ideas when it hired physics Nobelist Robert Laughlin last summer as president of the Korea Advanced Institute of Science and Technology (KAIST), one of the country's top science and engineering universities. It may have gotten more than it bargained for.

    Laughlin has floated a plan to cure the prestigious institution of its “addiction” to government subsidies. His prescription—more undergraduates, higher tuition, and courses that appeal to nonscience majors—has, however, been loudly denounced by some faculty members as a danger to the patient.

    “KAIST is too old and too important to be the experiment of one person's ridiculous ideas,” says Park O-ok, who resigned as dean of office planning after Laughlin circulated a draft of his plan in December. Others say the proposal is simply unworkable. “I'm open-minded about it,” said one member of the faculty committee that reviewed the plan, which is expected to be formally unveiled on 1 March. “But he has no chance of succeeding.”

    Established in 1971, KAIST was intended to provide essential talent for Korea's high-tech sector. But the number of students interested in scientific careers has declined steadily since the mid-1990s, and government officials thought that reforming KAIST might help reverse the trend. Last summer they took such a step by hiring Laughlin, a physics professor at Stanford University whose explanation of how electrons acting together in strong magnetic fields can form new types of “particles” with fractional charges earned him a Nobel Prize in 1998 (Science, 4 June 2004, p. 1427). He was the first foreigner to lead a university or major government research institution in Korea.

    Strong medicine.

    Robert Laughlin says KAIST needs to end its “addiction” to government funding.

    CREDIT: COURTESY OF KAIST

    Laughlin wasted no time. Within 6 months he had decided that KAIST was being “squeezed [financially], with no exit.” The solution, he declared, was to increase income from sources other than the government. In December, he sent the Ministry of Science and Technology and a faculty committee a draft plan that would triple KAIST's current enrollment of 7300 and shift the balance toward undergraduates, quadruple the current $850-a-semester tuition, revamp the undergraduate curriculum to appeal to premed and prelaw students, and tweak its graduate research programs to turn a profit.

    “I do not think that the problem of lack of interest in science should be solved,” he said provocatively at his inauguration. “The right course of action is to change science and technology so it becomes economically competitive.” In other words, reduce the number of poorly enrolled basic research classes in favor of subjects that will attract students.

    Laughlin holds up the Massachusetts Institute of Technology as the model of an institution attuned to finding outside sources of revenue. “KAIST was made to subsidize industry,” he told The Korea Times in January. “MIT was set up to make money.”

    Critics say that Laughlin has ignored an existing 10-year plan aimed at achieving financial independence. “The idea of getting an endowment and finding industry financing was discussed a long time ago,” says physicist Shin Jung-hoon, a member of the 21-person faculty review committee. Laughlin's blunt, forceful style, critics add, has bruised egos in the face-conscious country. Although Laughlin says he clearly labeled his December draft as “the starting point for discussion … with strong language to sharpen the issues,” Park and others say that caveat wasn't on the original documents.

    Despite the controversy, Laughlin remains optimistic that his vision will eventually prevail. “They're being terrific,” he said of the KAIST faculty and the science ministry, which oversees KAIST. “This is an unusual and historic cultural experiment. Of course it isn't easy.”

  4. NATIONAL INSTITUTES OF HEALTH

    NCI Gears Up for Cancer Genome Project

    1. Jocelyn Kaiser

    The National Cancer Institute (NCI) is hoping to launch a $1.5 billion effort to identify all major mutations in the most common human cancers. The 10-year project would gather tumor samples from thousands of patients and scrutinize them for genetic glitches. “If we can sequence the cancer genome, … I think we must,” says Anna Barker, NCI deputy director for strategic scientific initiatives.

    The idea for a Human Cancer Genome Project comes from a working group of the National Cancer Advisory Board (NCAB) commissioned by NCI Director Andrew von Eschenbach to find new opportunities to apply technology to cancer research. The panel, led by Eric Lander of the Massachusetts Institute of Technology Broad Institute in Cambridge and Lee Hartwell of the Fred Hutchinson Cancer Research Center in Seattle, Washington, spent 18 months hammering out a plan, which includes expanding NCI's biomarker efforts (Science, 12 November 2004, p. 1119). The cancer genome is the one “specific project” they came up with, Lander told NCAB last week.

    Insights into the genetic basis of cancer have led to targeted drugs like Gleevec, Herceptin, and Iressa, noted Lander. “But we know a minority of the story” about genetic changes in tumors. A “comprehensive” effort to identify these mutations, he added, “would propel the work of thousands of investigators.”

    As outlined in a 21-page white paper, the project would collect tumor samples from patient volunteers and use technologies such as gene chips to find mutated regions. Those sections would then be resequenced to identify specific mutations. The goal—to identify all mutations occurring at 5% frequency in the 50 most common types of cancer—would require 250 samples per type, or 12,500 samples. The work would be carried out by “sample acquisition centers” and genome analysis centers, and it would include studies of ethical and intellectual-property issues. It would begin with roughly $50 million a year in pilot projects for a few years before ramping up to $200 million a year.
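    Those figures can be sanity-checked with a quick back-of-envelope calculation. The binomial framing in the sketch below is an illustrative assumption, not the power analysis in the white paper itself:

```python
# Back-of-envelope check of the sampling numbers above; the binomial model here
# is an illustrative assumption, not the white paper's actual power calculation.
samples_per_type = 250
cancer_types = 50
mutation_frequency = 0.05   # a mutation carried by 5% of tumors of a given type

total_samples = samples_per_type * cancer_types                      # 12,500
expected_carriers = mutation_frequency * samples_per_type            # ~12.5 per type
p_completely_missed = (1 - mutation_frequency) ** samples_per_type   # ~2.7e-6

print(total_samples, round(expected_carriers, 1), f"{p_completely_missed:.1e}")
```

    Under that simple model, a 5%-frequency mutation would turn up in roughly a dozen of the 250 tumors of each type and would almost never be missed outright, which is consistent with the sample sizes the working group proposed.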

    The project would be jointly managed by NCI and the National Human Genome Research Institute (NHGRI), which is already on board. “I am extremely enthusiastic about this proposal,” NHGRI Director Francis Collins told NCAB. “The potential value here is massive,” said cancer board member Franklyn Prendergast of the Mayo Clinic in Rochester, Minnesota; another member called it “mind-boggling.”

    Advocates hope that the price tag is not. Although Lander predicts that the project would “require new funds” from Congress and perhaps industry, his working group wants NCI to launch the pilot projects next year. To do that, von Eschenbach has asked the board to begin prioritizing other programs that can be cut back or eliminated. Although von Eschenbach didn't offer specific suggestions, he noted that “we are applying a lot of fiscal discipline within our own house.”

  5. ENVIRONMENTAL SCIENCE

    Forging a Global Network to Watch the Planet

    1. Daniel Clery

    CAMBRIDGE, U.K.—The dream of creating a global earth-monitoring network came a step closer to reality last week. Proponents met in Brussels to launch a 10-year program to turn gauges, sensors, buoys, weather stations, and satellites that monitor Earth's surface, atmosphere, and oceans into a unified whole. The Global Earth Observation System of Systems (GEOSS), as it's called, is expected to evolve slowly from national systems into a comprehensive, coordinated, and sustained set of observations for the benefit of everyone, including developing countries. By adding links and standards, “earth science will step up to the next level: a total earth-observing system,” says Conrad Lautenbacher Jr., head of the U.S. National Oceanic and Atmospheric Administration.

    There is little coordination today among the roughly 50 satellites observing Earth—or the more numerous sensors in ground- and ocean-based networks. As a result, there are gaps in coverage as well as a massive duplication of effort. The drive to add coherence began in July 2003 when government ministers from some 30 nations along with heads of various agencies—collectively dubbed the Group on Earth Observations (GEO)—met for an Earth observation summit in Washington, D.C. At a second summit in Tokyo in April 2004, GEO came up with the idea of having a 10-year transition from the current hodgepodge of observations to a global coordinated system. And on 16 February in Brussels, the third summit signed off on GEOSS's 10-year implementation plan. The agreement puts GEO itself—with 60 countries and 33 organizations now on board—on concrete footing, with a permanent secretariat hosted by the U.N.'s World Meteorological Organization in Geneva.

    Center of attention.

    Brussels, shown in this 2003 image from a European environmental satellite, hosted a summit to launch a decadal plan for GEOSS.

    CREDIT: 2004 ESA

    The goals of GEOSS are lofty: Its proponents say it will improve weather forecasts, reduce the devastation of natural disasters, monitor climate change, support sustainable agriculture, help understand the effect of environment on human health, and protect and manage water and energy resources.

    The first job, according to Errol Levy, scientific officer at the European Commission in Brussels, “is to look at what is measured now, what is needed,” and find the gaps. GEOSS will initially build on existing satellites and sensors, such as NASA's Earth Observing System satellites and the European Space Agency's Envisat. The most difficult part, Lautenbacher says, will be achieving an agreement on data sharing. There will have to be some horse trading on who observes what and who launches which satellite. “Some of this is difficult. It will require serious negotiation,” Lautenbacher says.

    One of GEOSS's key aims is to involve developing countries. Lautenbacher says they “have the most to gain,” in terms of helping them tackle problems such as desertification or the spread of malaria. But they can also contribute by launching weather balloons into the upper atmosphere or installing tide gauges to measure sea-level rise.

  6. DRUG SAFETY

    FDA Panel Urges Caution on Many Anti-Inflammatory Drugs

    1. Jennifer Couzin

    GAITHERSBURG, MARYLAND—At a 3-day meeting last week to hash out how a class of painkillers might trigger heart attacks and strokes, new puzzles emerged, even as two U.S. Food and Drug Administration (FDA) advisory committees agreed that use of the drugs, known as COX-2 inhibitors, be allowed but sharply curtailed. Wrestling with questions that affect millions of patients and billions of dollars in drug company revenue, the committees—on arthritis drugs and drug safety—voted in favor of keeping Celebrex, Bextra, and even Vioxx on the market but adding stringent warnings to them and possibly other nonsteroidal anti-inflammatory drugs (NSAIDs). Steven Galson, acting head of FDA's Center for Drug Evaluation and Research, said in advance that FDA would act rapidly on the panel's recommendations, likely within a few weeks.

    But the recommendations belie a confusion that, if anything, has intensified in the 5 months since Merck withdrew its COX-2 inhibitor Vioxx from the market. At issue are two questions, neither of which could be definitively answered last week, despite exhaustive parsing of clinical trials data involving tens of thousands of volunteers. How do drugs that inhibit COX-2, which mediates inflammation, disrupt the cardiovascular system? And which drugs present a significant risk?

    “We need to worry about the data we don't have,” said James Witter, an FDA rheumatologist who reviewed Pfizer's Celebrex and Bextra, the COX-2 inhibitors that remain on the U.S. market.

    COX-2 drugs are no more effective at controlling pain than traditional NSAIDs like naproxen, but they quickly gained favor because they don't cause NSAID-associated stomach problems. The reason is that COX-2 inhibitors can blunt COX-2 while steering mostly clear of COX-1, a related enzyme. Dampening COX-1 can cause stomach ulcers. Traditional NSAIDs as well as COX-2 drugs may also protect against and help treat cancer.

    Tough choices.

    Naproxen, marketed as Aleve, got more backing for its safety profile than either Celebrex or diclofenac, marketed as Voltaren.

    CREDITS (CLOCKWISE FROM TOP LEFT): STEVEN SENNE/AP PHOTO; MARY ALTAFFER/AP PHOTO; NOVARTIS PHARMACEUTICALS

    Merck voluntarily withdrew Vioxx after a colon cancer prevention trial suggested that it nearly doubled the rate of strokes and heart attacks compared with a placebo. At first, many scientists believed that Vioxx's unusually selective targeting of COX-2 was the culprit. That, they thought, might be upsetting a fine balance between two fatty acid derivatives, prostacyclin and thromboxane, that control blood clotting. By that reasoning, related drugs like Celebrex, a slightly less targeted COX-2 inhibitor with a shorter half-life, as well as traditional NSAIDs, might be safe, or at least safer. Even when a 2000-person Celebrex trial was halted in December after participants taking the drug suffered roughly two to three times more cardiovascular problems, the targeting theory remained plausible.

    New data, however, hint that something else might be at work, said committee member Steven Abramson, a rheumatologist at New York University and the Hospital for Joint Diseases in New York City. He and others were struck by a Pfizer study of cardiac surgery patients posted online last week in the New England Journal of Medicine. That study, which Kenneth Verburg, head of the company's arthritis drugs division, described at the meeting, found that participants taking Bextra reported nearly three times the rate of cardiovascular events compared with those on placebo.

    The puzzle for Abramson was that all the study volunteers were also taking aspirin—commonly used by cardiac patients and by those with arthritis. Because aspirin inhibits COX-1 as well as COX-2, noted Abramson, it might be expected to counteract the cardiac risks caused by inhibiting COX-2 alone. That didn't appear to be the case in the Bextra study. “We're left with a lot of unknowns” about why Bextra caused problems despite aspirin, said Verburg.

    If simply blocking COX-2, regardless of what else is also blocked, is at least partly to blame, then a much broader class of drugs—the traditional NSAIDs—could be implicated. “I think you have to tell people there may be a problem here,” said panel member and cardiologist Steven Nissen of the Cleveland Clinic. The group voted unanimously to add new warning labels to NSAIDs, although it cautioned that some clearly pose a greater risk than others.

    That was underscored by a presentation made by FDA drug safety officer David Graham. He described his preliminary results from an epidemiologic study of Medicaid patients; those taking Indocin (generically, indomethacin) had nearly double the number of heart attacks, while those on Mobic (generically, meloxicam) had a 37% increased risk. Many patients switched to these drugs, which are thought to block COX-2 more selectively than naproxen but less selectively than Vioxx, when concerns arose about COX-2 inhibitors.

    Ernest Hawk, a chemoprevention expert at the National Cancer Institute in Bethesda, Maryland, believes that NSAIDs, including COX-2 inhibitors, still show great promise as anticancer agents. “COX-2 remains a relevant oncology target,” he says, backed by dozens of animal and some human studies.

    A critical question, Hawk and others agreed, is whether genetic differences among patients might offer clues to their response to these drugs. Garret FitzGerald, a pharmacologist and cardiologist at the University of Pennsylvania in Philadelphia, for one, emphasized how badly such information is needed. “We've talked about personalized medicine for a long time,” he said. “Here is a situation where we actually have to care” about it.

  7. HIV/AIDS

    Experts Question Danger of 'AIDS Superbug'

    1. Jon Cohen

    “NEW AIDS SUPER BUG: Nightmare strain shows up in city,” blared the 12 February headline on the front page of the New York Post, a tabloid not known for understatement. Even the sober New York Times played the story on the front page, albeit with characteristic reserve. “The story is the story,” says AIDS researcher Steven Wolinsky, who heads the infectious disease department at Northwestern University Medical School in Chicago, Illinois.

    The hubbub, which shows no sign of flagging, began on 11 February, when the New York City Department of Health and Mental Hygiene issued an alert to physicians and held a press conference, warning that a highly promiscuous New York City man was infected with a strain of HIV that was both very virulent and “difficult or impossible to treat,” as it was resistant to three of four classes of anti-HIV drugs. The message was clear: Because of one man's crystal methamphetamine use and unprotected anal sex with many partners, a new killer bug might be on the loose.

    One of a kind.

    David Ho insists the case is unique.

    CREDIT: AARON DIAMOND AIDS RESEARCH CENTER

    But immediately AIDS researchers around the world began to question just how new or dangerous this supposed superbug really is. One of them, Julio Montaner, acting director of the B.C. Centre for Excellence in HIV/AIDS in Vancouver, co-authored a similar report in the May 2003 issue of the journal AIDS about two patients who became infected by drug-resistant viruses that rapidly destroyed their immune systems. Susan Little, Douglas Richman, and co-workers at the University of California, San Diego (UCSD), reported 3 years ago in the New England Journal of Medicine that 10% of recently infected people have mutations in the virus that are associated with resistance to at least two classes of drugs. Several researchers have also questioned just how drug resistant the man's virus is; viruses deemed “resistant” in lab tests sometimes still respond to drugs.

    What's more, many researchers doubt that a multidrug-resistant strain of HIV can transmit efficiently from one person to another. “There's always a fitness cost for mutations, either functionally or structurally,” explains Northwestern's Wolinsky. A study of 220 recently infected people bears this out: As Bernard Hirschel and Luc Perrin of the University of Geneva and colleagues reported in the June 2004 issue of Antiviral Therapy, multidrug-resistant strains were indeed less likely to transmit than “wild-type” strains.

    But David Ho, director of the Aaron Diamond AIDS Research Center in New York City, whose lab first detected this patient's multidrug-resistant strain, stresses that this case is unique. Ho alerted the New York City health department after his clinic saw the patient in January. The man had last tested negative for HIV 20 months earlier and may have become infected as late as October. By the time he came to Aaron Diamond, his CD4 white blood cells, which HIV targets and destroys, had plummeted to below 30. (AIDS is defined as 200 or fewer CD4s.) Typically, it takes 10 years to progress from infection to AIDS. Ho then scoured a national database of recently infected people and looked at their CD4 loss. “There's no case like this,” says Ho.

    Overblown?

    Critics say the New York Health Department overreacted to a single case.

    CREDIT: NEW YORK POST

    UCSD's Richman downplays notions of a superbug and instead suggests that the patient may be genetically susceptible to infection or rapid development of AIDS. But Ho says a preliminary analysis of the man's genetics hasn't “found anything unusual yet.”

    Ho also says this strain replicates in test tube experiments slightly better than the standard reference strain used by his lab. “It's not a wimpy virus,” says Ho. In fact, Aaron Diamond's Viviana Simon, Martin Markowitz, and co-workers in a July 2003 Journal of Virology paper showed that some drug-resistant isolates from recently infected people remain highly infectious and replicate well, suggesting that they had continued to evolve and compensated for the fitness cost the original mutations exacted.

    Several researchers agree that it is premature to dub this man's HIV strain a “supervirus,” as studies have yet to demonstrate that it's highly pathogenic. And many experts question whether the case deserves widespread attention in the absence of data linking the man's infection to others. “Until there's evidence that this virus is highly transmissible or even normally transmissible, it remains an anecdote,” says AIDS researcher John Moore of Weill Medical College of Cornell University in New York City.

    Ho says he and his co-workers hope to publish their findings rapidly online, and they will present data from this case at a large U.S. AIDS meeting in Boston this week, where those in the field can discuss it in context.

  8. ECOLOGY

    Reviving Iraq's Wetlands

    1. Andrew Lawler

    The fight is on to save Mesopotamia's drained marshes. But it's not easy finding a realistic and salable plan—or gathering data in a dangerous environment

    “Redeeming a swamp … comes pretty near to making a world.”

    —Henry David Thoreau

    Azzam Alwash enjoys kayaking with his wife in southern California. But his real dream is to paddle among the high reeds of Mesopotamia's ancient marshes near where he was born. Those marshes exist mostly in his memory, however; in an unprecedented ecological and human disaster, some 90% of the famed Iraqi wetlands were destroyed by 2000. Alwash, a 49-year-old civil engineer who left Iraq a quarter-century ago, envisions a full restoration that would preserve both the vibrant wildlife and the unique culture of the Marsh Arabs in the region. He even quit his high-paying job as a partner in an environmental consulting firm to drum up international support for his effort, which he grandly dubbed Eden Again.

    Alwash has helped energize a coterie of donors, scientists, local leaders, and politicians who are hotly debating the future of the marshes. The first scientific studies of the wetlands in decades appear in this week's issue of Science (see p. 1307), and foreign nations have pledged a total of $30 million. The Iraqi government recently set up an interagency center to draw a blueprint for revitalizing this desperately poor and ecologically battered area. But coming up with a common vision—and financing—in an unstable nation may prove even harder than collecting data. “This is a scientifically difficult and tremendously complex effort,” says Edwin Theriot, a U.S. Army Corps of Engineers official who has advised the Iraqi government. “We're having difficulties with the Everglades and in Louisiana—and we're supposed to have all the resources we need.”

    The sheer scale of the destruction is of biblical proportions. In one generation, some 20,000 square kilometers of marsh shrank to a tenth of that size, as did a population that once numbered a half-million. Three wars and one insurrection played a big role, as did a concerted effort in the 1990s by Saddam Hussein to drain the marshes.

    As the marshes turned to desert, local peoples fled or were forced from their homes. Left behind were vast salt flats laced with insecticides and landmines. The fisheries—which provided a large share of Iraq's overall catch—crashed, while animals from the Goliath heron to the pygmy cormorant face extinction.

    The effects of the destruction radiate far beyond southern Iraq. No longer cleansed by the marshes, the salty and polluted waters flowing into the Persian Gulf from the Tigris and Euphrates rivers are playing havoc with marine life there, including the lucrative shrimp business. And Asian migratory birds have lost a major staging and wintering area on the western Siberian-Caspian-Nile flyway. “The impact on biodiversity has also been catastrophic,” states a 2004 United Nations study on the marshes.

    Down the drain.

    Iraq's famed marshes have largely dried up in the past 2 decades. Some areas are now reflooded, but recovery will take time—and require lots of water.

    SOURCE: AMAR

    Out of Eden

    Few places on Earth have a stronger hold on the imagination than do the Iraq marshes. They are the legendary site of the Garden of Eden and incubator for the first great urban centers, home of the world's first writing system. Their trackless stretches have long hidden both wildlife and rebels. Sumerian princes hunted game there, and Assyrian King Sennacherib led a force into the region in the 7th century B.C.E. to flush out pesky Chaldean rebels.

    Isolated, yes, but far from pristine. “This is the oldest and most tinkered-with landscape on Earth,” says Iraq's new water resources minister Abdul Latif. For at least 5000 years, humans have widened and dredged channels, dried and flooded fields, and built reed houses atop artificial islands of reed bundles. The marshes' lifeblood was the spring floods. “This pulse of sweet fresh water, laden with sediments, flushes the salt, provides nutrients to revitalize the reed beds, and is key to bird migration,” says Alwash.

    Most of that water comes from outside Iraq. The Euphrates and Tigris originate in the eastern mountains of Turkey. More than 90% of the water from the Euphrates comes from Turkey, Syria, and Saudi Arabia; the Tigris's basin covers large parts of Turkey, Iran, and a slice of Syria.

    Beginning in the 1950s, governments began diverting that flow, first by creating lakes within Iraq and later by building large dams on both rivers. There are now nearly three dozen major dams, with eight more under construction and a dozen in the planning stages. Turkey alone can store up to 91 billion cubic meters of water and will need more to irrigate its dry eastern provinces. Iraq and Syria can store as much as 23 billion cubic meters. The 2003 war and its aftermath halted plans to build additional dams in Iraq—there are currently a dozen large ones—but Iran recently embarked on a major dam-building effort on tributaries of the Tigris.

    The result of this half-century of water management has been dramatic. The spring flood is barely noticeable. The maximum flow of the Euphrates during May has dropped by two-thirds since 1974, when dam building began in earnest. Even before the 1991 Gulf War, many experts feared the result would irreparably harm, and eventually destroy, the Iraq marshlands. Severe deforestation and overgrazing upstream, combined with more than a decade of drought in the Middle East, exacerbated the environmental problems to the point at which Minister Latif believes the marshes would soon have been history “even without Saddam.”

    Dream boat.

    Iraqi expatriate Azzam Alwash envisions Marsh Arabs and ecotourists taking advantage of fully restored wetlands.

    CREDIT: DANA SMILLIE

    Brutal ecocide

    But it was Saddam Hussein's regime that delivered the coup de grâce. Part of the Iran-Iraq border runs through the wetlands, and during the 1980s war, both sides built causeways and drained marshy areas for better access to the front. After the first Gulf War and the unsuccessful uprising of Shiite Muslims in the south, the Iraqi government set about draining the remaining marshes. Its goal was to remove the threat of insurgency and replace the marsh culture of fishing and rice production with dry agriculture. Massive dikes and canals were built to divert water from the marshes, quickly turning them to desert.

    “The demise of these once-vast wetlands has been hastened through deliberate drainage by the Iraq regime,” the U.N. study notes. To Latif, Saddam's actions constitute “a brutal ecocide” as well as “a crime against humanity.” Many of the marsh inhabitants are now returning home, although most remain scattered in Iranian refugee camps or in cities.

    The marshes are actually three distinct regions, each with its own particular ecosystem. The once vast Central Marsh, which covered more than 3000 square kilometers in 1973, has shrunk by 97%. Most of what remains are reeds growing in irrigation canals. Another marsh, called al-Hammar, lost 94% of its area, and al-Hawizeh, which borders Iran, is two-thirds smaller than 3 decades ago. Even the Hawr al-Azim Marsh, which is the Iranian extension of al-Hawizeh, is less than half its former size due to reductions in water flow from Iraq.

    Their depletion has led to the extinction of an otter, a bandicoot rat, and a long-fingered bat particular to the marshes, and 66 species of water birds are at risk. Aquatic animals also have suffered, from shrimp to fish, with devastating consequences for coastal fisheries. “The wetlands were like a vast sewage treatment plant for the Euphrates and Tigris system,” says Hassan Partow, who helped write the U.N. report. “They were the kidneys.” Without them, the patient is imperiled.

    Ancient battleground.

    Relief from the palace of Assyrian King Sennacherib, who sent troops to ferret out rebels in the species-rich Mesopotamian marshes in the late 7th century B.C.E.

    CREDIT: WERNER FORMAN/CORBIS

    Just add water?

    How those kidneys function is uncertain. For decades, foreign and even Iraqi researchers were forbidden to enter the marshes, and in the 1990s the government destroyed a research station in the Hammar. As a result, most studies have relied upon Landsat remote-sensing data.

    After the U.S. invasion in 2003, foreign scientists suddenly gained access. As they scramble to create an extensive database, it's hard to stay ahead of the population. When Saddam's regime collapsed, local residents jubilantly broke open the dikes and dams, reflooding nearly half of the marshes. “They did not wait for us,” says Alwash.

    The reflooding has been haphazard, however, and many dikes and dams from the Saddam era remain. But there is now plenty of water available: The U.S. invasion coincided with the end of a long drought in the region. Given the fast pace of dam construction in countries upstream and the possibility of another drought, though, renewed desertification is likely.

    Last April a team funded by the Italian government began the first in situ study of the marshes, focusing on a small marsh of about 200 square kilometers called Abu Zirig. Reflooded in 2003 by locals, it has recovered rapidly. “This is a happy example,” says Italian hydrologist Andrea Cattarossi. “The environmental conditions were pretty good.” Carp, trout, smaller fish, and nearly half of the 50 to 60 species of birds that once flourished in the marsh have returned.

    Although the marsh appears to be recovering, Cattarossi's data show that the volume of water is preventing light from reaching the roots of aquatic plants, threatening their growth. He says new structures are needed to control water circulation. He also worries about overfishing by a culture grown accustomed to using pesticides, explosives, and electrocution. A drought year will spell doom for Abu Zirig, he warns, because the water source first flows through agricultural areas.

    But Abu Zirig is located north of the three main marshes and therefore receives larger quantities of less saline water than wetlands downstream. Those areas to the south will be more difficult to restore, scientists say.

    Curt Richardson, an ecologist at Duke University in Durham, North Carolina, and lead author on the Science paper, used U.S. Agency for International Development (USAID) funding to examine other marsh regions during two visits. He says that reflooded portions of the eastern Hammar marsh are essentially saltwater deserts. “We're not quite sure why,” he says. But he found that the area suffers from high salinity and high levels of hydrogen sulfide, which inhibits plant growth. Located close to the Persian Gulf—at the terminus of the “kidneys”—the marsh must be flushed with clean water to remove the salt and hydrogen sulfide. “The real question is whether there is enough water to do so,” says Richardson.

    One silver lining to the grim security situation, in which foreigners are often targets, is that Iraqi researchers are taking the lead in gathering data. “We had to train these people,” says Cattarossi—no simple task given the lack of equipment and expertise following nearly 2 decades of Iraqi isolation. “And the situation in the field is very difficult; the vegetation is thick, and the [residents] can be a little bit suspicious.” Richardson says that Alwash helped build trust with locals to ensure a steady flow of data for his study.

    More than two dozen Iraqi biologists now help gather data from the Hawizeh, Hammar, and Central marshes for foreign researchers such as Richardson. Iraqi scientists declined to be interviewed, fearing reprisals from insurgents. But Ali Farhan, an Iraqi engineer and adviser to the Iraqi government, explains that two teams visit each marsh every month. They gather data on water quality, phyto- and zooplankton, bottom sediments, fish, and birds. Such practical training, he says, should help the shattered Iraqi scientific establishment gain a place at the table in the marsh discussions. So far, he adds, data gathering has gone smoothly, thanks to careful cultivation of local sheiks. But he says researchers are still harassed at times by the bandits who roam the region.

    Mud to dust.

    The lush marshes around Abu Subat, unproductive desert after Saddam Hussein diverted water in the 1990s (left), are now trying to make a comeback.

    CREDITS: AZZAM ALWASH

    Romance or realism

    During the next year, scientists using Italian funding hope to map the current water flow in the Iraq marshes as a first step toward understanding how to stabilize and revitalize the marshes. The U.S. Army Corps of Engineers is working up a model that details the flow of the Tigris and Euphrates. Meanwhile, the Center for Restoration of the Iraqi Marshes (CRIM), an organization of several Iraqi ministries created last fall in Venice, will put together a “master plan.” “We need an international and Iraqi consensus,” says Thomas Rhodes, an American ecologist and head of USAID's southern Iraq region based in Basra.

    So far that's been elusive. Marsh advocates such as Alwash yearn for a vast reclamation project, with ecotourism to fuel the anemic local economy. He says it could be done for as little as $100 million, using Iraqi labor. Alwash worries that USAID, which is focusing on replanting date palms and ensuring the health of farm animals in southern Iraq, fails to grasp the enormous agricultural and economic benefits to the Marsh Arabs of a successful restoration.

    But others dismiss such visions as unrealistic, given the current fiscal, social, and security situation. USAID has recently discontinued its funding of Alwash's organization, and Andrew Natsios, head of the agency, says, “the marsh people are not interested in restoring all the marshes. They want some restored and some left dry for agriculture.”

    Peter Reiff, an anthropologist who works on the marsh issue as a USAID contractor, notes that local Marsh Arabs have abandoned their old way of life cultivating rice and using water buffalo: “They are becoming farmers, and they get better returns with sheep, wheat, and cattle.” And he accuses outsiders such as Alwash of possessing “a wistfulness about the marshes that is almost romantic. You are not going to make this into an ethnographic museum.”

    Iraqis themselves are divided. Minister Latif says he is “not very keen on the word ‘restoration’—restoring does not help the population.” He favors a plan that focuses on health, education, and transportation needs. And the long-suffering local people appear to want it all. During two recent conferences on the marshes held in southern Iraq, participants urged that the wetlands be restored—and that new schools, clinics, and roads be built to lift them out of their dire poverty.

    CRIM is supposed to bring all these disparate parties together. But some doubt it has the political and financial muscle to do so. “CRIM is understaffed, under-resourced, undertrained, if well-intentioned,” says one foreign scientist involved in its creation. And negotiating a deal with Turkey, Syria, and Iran on water rights—a crucial element in any restoration plan—poses a daunting diplomatic challenge. “It is quite obvious that there isn't enough water to restore all the desiccated marshes,” says Farhan.

    Two cultures

    There is consensus on the need to preserve at least some of the marsh. “There is great potential to restore a portion,” says Cattarossi. Rhodes sees a scattering of core areas of protected, healthy, and biodiverse marsh surrounded by zones of compatible use. The cores would support ecotourism, provide a haven for animal and bird life, and allow the old marsh life to survive and even flourish alongside a more typical lifestyle of dry farming. “There will be enough room for both cultures,” says Farhan. “In fact, Marsh Arabs are currently practicing both cultures.”

    To sell that vision, advocates argue that marsh restoration offers more than environmental and social benefits. “If we can get some of the marshes back, it would [also increase] security and stability,” says Barry Warner, a biologist at the University of Waterloo, Canada, who is organizing a bird count this year in the marshes. Adds Natsios: “U.S. support for the marsh people is also protection for our troops.”

    Making that case will be tough, however. Last year, the U.S. Congress rejected spending any money on marsh restoration. But Natsios says his agency will continue to provide modest sums for marsh research and planning.

    Even the enthusiastic Alwash—who calls the Eden Again project “his mistress”—acknowledges that full restoration is unrealistic, given the constraints of water, money, and political will. But he has not abandoned his dream of threading his way through endless beds of high reeds. Iraqis and their foreign friends may not be able to reconstruct a Garden of Eden. But they are hoping for the chance to recreate at least a piece of paradise.

  9. AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE MEETING

    Ocean Warming Model Again Points to a Human Touch

    1. Richard A. Kerr

    WASHINGTON, D.C.—Befitting its location in the nation's capital, this year's meeting of AAAS (publisher of Science) from 17–21 February addressed the nexus of science and society. More than 9000 attendees heard about great strides in robotics (Science, 18 February 2005, p. 1082), celebrated the Year of Physics and Einstein's legacy (Science, 11 February 2005, p. 865), and presented research that spanned science, medicine, and politics. For additional stories from the AAAS meeting, see ScienceNOW (sciencenow.sciencemag.org).

    Climate researchers concerned that their model might have overlooked something have retested the links between the burning of fossil fuels, greenhouse warming, and the warming of the deep oceans. A closer look at the evidence, they say, has bolstered their earlier conclusion: Humans are indeed warming the world, right down to thousands of meters deep in the oceans.

    The high statistical significance of the new study reported at the AAAS meeting “should wipe out much of the uncertainty about the reality of global warming,” says the study's lead author, climate researcher Tim P. Barnett of Scripps Institution of Oceanography in La Jolla, California.

    For a 2001 study (Science, 13 April 2001, p. 270), Barnett and colleagues ran a state-of-the-art climate model that traced where and when atmospheric heat trapped by rising greenhouse gases of the past century would have entered the oceans. When they compared the model's simulation to actual measurements of ocean temperature, they found a good match. With a confidence of 95%, they calculated, human-produced greenhouse gases are behind real-world warming. Three additional studies using three other models have yielded similar results.
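    A minimal sketch of the flavor of that model-versus-observation comparison appears below. The numbers, noise level, and one-dimensional setup are entirely invented for illustration; the actual analyses use full three-dimensional ocean fields and control-run estimates of natural variability.

```python
import numpy as np

# Toy illustration of a fingerprint-style detection test, with made-up numbers;
# this is not the Scripps group's code or data.
rng = np.random.default_rng(0)
years = np.arange(1955, 2000)
fingerprint = 0.002 * (years - years[0])                          # modeled warming signal
observed = fingerprint + rng.normal(0.0, 0.01, size=years.size)   # toy "observations"

# Least-squares scaling factor beta in: observed ~ beta * fingerprint
beta = observed @ fingerprint / (fingerprint @ fingerprint)
residual = observed - beta * fingerprint
beta_se = np.sqrt(residual.var(ddof=1) / (fingerprint @ fingerprint))

# If the 2-sigma interval around beta excludes zero, the modeled signal is
# "detected" in the observations.
print(f"beta = {beta:.2f} +/- {2 * beta_se:.2f}")
```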

    A match.

    A model's warming (green) fits the actual warming (red) and exceeds climate noise (blue).

    CREDIT: T. P. BARNETT

    Barnett and colleagues at Scripps and Lawrence Livermore National Laboratory in California have now checked the model against better data, paying more attention to possible uncertainties in the model. They used a revised and updated set of ocean observations, this time avoiding a quirk in the data processing that had skewed temperatures in the data-poor Southern Hemisphere. To take account of variations among models, they compared detailed results from a second, independent model and studied how ocean warming in eight other models would affect the results. Unlike most other studies, they also followed the heat deep into separate ocean basins. In the end, the two main models “absolutely nailed the greenhouse signal” seen in the ocean, Barnett says. This time, statistical confidence is much greater than 95%, he says.

    Most climate scientists are reassured. “The fact that multiple models simulate a comparable [ocean] warming gives a robustness to the results,” says climate modeler Thomas Delworth of the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey. But climate researcher and modeler Gerald North of Texas A&M University in College Station still wonders whether the models have realistic enough oceans. More tests no doubt await.

  10. AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE MEETING

    More Infectious Diseases Emerge in North

    1. Jocelyn Kaiser

    CREDIT: PETER DASZAK ET AL.

    A preview of a new global map of emerging infectious diseases turns a common assumption on its head. The map, presented in D.C. by Peter Daszak of the Consortium for Conservation Medicine at Wildlife Trust in New York City, spans the years 1940 to 2004 and indicates roughly 500 locations around the world where specific diseases first emerged. (Red indicates multiple events.) The map suggests that the majority of emerging diseases originated in Europe, North America, and Japan—a result that appears to hold up after correcting for reporting biases, according to Daszak and his co-workers. The media and funding organizations tend to assume that most infectious diseases emerge in the tropics because AIDS, severe acute respiratory syndrome, Ebola, and other high-profile diseases began there, says Daszak. But the preliminary map suggests that food-borne infections and drug-resistant microbes in the northern industrialized countries—the result of factors such as agricultural practices, the overuse of antibiotics, and international travel—are a more significant public health threat. “It's very counterintuitive to what most people think about emerging diseases,” says Joshua Rosenthal of the Fogarty International Center at the National Institutes of Health.

  11. AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE MEETING

    Whaling Endangers More Than Whales

    1. Dan Ferber

    For more than a decade, scientists have gone to great depths to study the unusual deep-ocean communities known as whale falls. When a whale dies, its carcass sinks to the sea floor and provides a long-lived home to worms, clams, mussels, and many other creatures. New results presented at the AAAS meeting suggest that commercial whaling, even at so-called sustainable levels, would drive many of the novel species found at these cetacean gravesites to extinction.

    In 1987, while surveying the sea floor in the submersible Alvin, biological oceanographer Craig Smith, now at the University of Hawaii, Manoa, and co-workers came across a strange menagerie thriving on and near a submerged whale skeleton. Since then, he and other researchers have shown that dead whales, like hydrothermal vents and cold seeps, can for decades support their own deep-sea biological communities. After all, the carcass of a great whale deposits up to 160 tons of blubber, meat, and bone in one fell swoop.

    To estimate how 2 centuries of commercial whaling have affected whale-fall communities, Smith and his colleagues have now combined whale-population estimates with two ecological models: one that links habitat loss to biodiversity and a second that estimates how abundant a species must be to avoid going extinct. The best published estimates indicate that 75% of whale populations—and therefore whale-fall habitats—have been lost in the North Atlantic since large-scale whaling began in the early 1800s, Smith says. Based on those numbers, both models predict that whaling has already caused about 40% of North Atlantic whale-fall species to go extinct. Even at the so-called sustainable levels of whaling being considered by the International Whaling Commission, in which whale populations would be maintained at 50% of historic, prewhaling levels, 15% of the whale-fall species would disappear forever, according to the models.
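
    The presentation did not spell out the models' equations. As a purely illustrative sketch, the habitat-loss half of the calculation can be framed with a standard species-area relationship, in which species richness scales as habitat area raised to a small exponent z; the exponent, the helper function, and the printed figures below are assumptions for demonstration, not the team's actual models.

        def fraction_of_species_lost(habitat_remaining, z=0.25):
            # Species-area relationship: S ~ A**z, so the surviving fraction of
            # species is habitat_remaining**z and the lost fraction is the rest.
            return 1.0 - habitat_remaining ** z

        # 75% of North Atlantic whale-fall habitat already lost (25% remaining):
        print(f"{fraction_of_species_lost(0.25):.0%} of species lost")   # ~29% with z = 0.25
        # Whale populations held at 50% of prewhaling levels:
        print(f"{fraction_of_species_lost(0.50):.0%} of species lost")   # ~16% with z = 0.25

    With a commonly cited exponent of z = 0.25, the second figure lands near the 15% quoted above; the team's 40% estimate for habitat already lost also folds in the separate extinction-threshold model, which is not described here in enough detail to reproduce.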

    Homeless?

    Whaling threatens the survival of creatures such as this worm that live off whale carcasses.

    CREDIT: ADRIAN GLOVER

    In an ambitious effort that may help identify whale-fall species before they do go extinct, Smith and his colleagues have recently towed out to sea the huge carcasses of five whales that had beached themselves and died, sunk them, and periodically returned to each carcass in a submersible. Sleeper sharks, crabs, and hundreds of hagfish munched away at the carcasses for months, and many thousands of pea-sized amphipods nibbled on the smaller pieces. A strange assortment of creatures then colonized the bones and nearby nutrient-rich sediments, including several new species of the bone-eating zombie worm (Science, 30 July 2004, p. 668), which uses symbiotic bacteria to help digest the fatty marrow of whale bones. Over many decades, these microbes and other free-living bacteria break down oils trapped in whale bones, producing sulfide that fuels the growth of an average of 185 species per large whale skeleton. From these studies, the Hawaii team has upped the tally of species potentially unique to whale falls to 32.

    “It's absolutely fascinating,” says biological oceanographer Steven Palumbi of Stanford University in California. The work, he adds, shows that “there's a whole community of organisms in the deep sea that specializes in being the undertaker of these whale carcasses.” And, says Palumbi, the modeling demonstrates that human activity can drive at least some deep-ocean creatures, besides whales, to extinction.

  12. AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE MEETING

    DNA Tells Story of Heart Drug Failure

    1. Jennifer Couzin

    Physicians have long known that a drug commonly prescribed for heart failure helps only about half the patients who receive it. Now, researchers are explaining that puzzle using genetics. A study reported at the AAAS meeting found that a difference of a single amino acid within the drug's protein target may determine whether the drug works. The discovery could ultimately help physicians better juggle drugs in heart failure patients and possibly in those with high blood pressure as well.

    In the late 1990s, pulmonologist Stephen Liggett of the University of Cincinnati, Ohio, and his colleagues identified a polymorphism, or genetic variation, in the gene encoding the beta-1 adrenergic receptor. That's the receptor targeted by the heart drugs known as beta-blockers.

    In the general population, there are two common forms of the receptor gene: One version makes the receptor with arginine at a particular site; the other, which varies by just a single nucleotide, places a glycine there instead. Because every person has two copies of the receptor gene, inheriting one from each parent, an individual can have two copies of the glycine variant, two of the arginine, or one of each. Mice endowed with the human arginine variant are both more susceptible to heart failure and more responsive to beta-blockers, raising the possibility that the same holds true in people.

    So, Liggett's team recruited 1040 volunteers, all people with severe heart failure. Roughly 490 had two copies of arginine, 450 had one of arginine and one of glycine, and the rest had two copies of the glycine version. Patients were randomly assigned to receive either a placebo or the drug bucindolol, a beta-blocker.
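
    As a quick plausibility check, not part of the study itself, the three genotype counts are roughly what two common alleles of a single gene would produce when each patient inherits one copy from each parent. The short sketch below back-computes the arginine-allele frequency and the expected genotype split; the Hardy-Weinberg assumption and the rounding are mine.

        # Reported genotype counts among the roughly 1040 heart failure patients.
        n_arg_arg, n_arg_gly, n_gly_gly = 490, 450, 100
        n = n_arg_arg + n_arg_gly + n_gly_gly

        # Frequency of the arginine allele (each person carries two copies of the gene).
        p_arg = (2 * n_arg_arg + n_arg_gly) / (2 * n)          # ~0.69

        # Expected genotype counts if the two alleles pair up at random (Hardy-Weinberg).
        expected = [p_arg**2 * n, 2 * p_arg * (1 - p_arg) * n, (1 - p_arg)**2 * n]
        print([round(x) for x in expected])                     # ~[492, 447, 102]

    The closeness of the expected and observed counts simply reflects that the arginine and glycine versions behave as ordinary alleles of one gene, as described above.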

    The researchers found that patients with two copies of the arginine variant were helped most by the drug. Compared with the placebo group, the bucindolol users experienced fewer deaths and hospitalizations over about 2 years of follow-up (and, in some cases, up to 5 years). Over the course of the study, 82% of them survived, compared with 65% of those on placebo. But those with one copy of each receptor variant or two copies of the glycine version weren't helped at all by the drug, faring about as poorly as the patients on placebo who had two copies of the arginine variant.

    Liggett intends to put together a larger study to confirm the findings. This time, however, all patients would receive bucindolol because it wouldn't be ethical to give double-copy arginine patients a placebo, Liggett said during his presentation.

    Liggett notes that, on its own, the variation in the beta-adrenergic receptor gene doesn't seem to affect heart failure risk. But people who carry two copies of the arginine variant plus two copies of a particular variant of a second gene—a combination nearly unique to African Americans—have 10 times the risk of heart disease.

    “He's found a polymorphism that seems to predict response” to bucindolol, says molecular pharmacologist Kathy Giacomini of the University of California, San Francisco. The result, she adds, is “very exciting” and “very specific.” It's not clear yet, says Liggett, whether the findings apply to other beta-blockers, which are also used to treat high blood pressure.

  13. ARCHAEOLOGY

    Ancient Alexandria Emerges, By Land and By Sea

    1. Andrew Lawler

    Excavators are finding surprisingly late signs of intellectual life in the ancient capital of Hellenistic Egypt and discovering that geology played a dramatic role in the city's fall

    OXFORD, U.K.—For centuries the massive Pharos lighthouse, one of the seven wonders of the ancient world, guided sailors to the busy wharves that made Alexandria a prosperous center of Mediterranean culture and home to the greatest library of ancient times. Yet while rivals Rome and Constantinople survived the chaotic period following the collapse of the Roman Empire, Alexandria faded from the historical record. By the 8th century C.E. the famed metropolis had fallen into oblivion.

    Today the city of Alexandria, site of Alexander the Great's tomb and Cleopatra's death, attracts scholars the way it once drew merchants and philosophers, as shown by a recent conference at Oxford University.* Rescue archaeology amid rapid urban growth, combined with new underwater mapping technologies, is yielding new insights into the old city's role and history. Archaeologists have uncovered tantalizing hints of surprisingly early beginnings as well as signs that the city's vibrant intellectual life lasted far longer than anyone had expected. “Now we can imagine the functioning of a university in antiquity,” says historian Manfred Clauss of Germany's University of Frankfurt.

    New data also suggest that environmental disaster played an important role in ancient Alexandria's downfall, which has long been attributed primarily to religious and political turmoil; the fate of Alexandria could provide a warning for today's fast-growing cities built on deltas, researchers say.

    An ancient wonder.

    Excavators seek the remains of the Pharos lighthouse, shown here in a Renaissance artist's view.

    CREDIT: BETTMANN/CORBIS

    An ancient think tank

    Most of the new data comes from digs that began in the 1990s, when the Egyptian government lifted a ban on underwater archaeology and began to encourage salvage work on land as the city of 6 million expanded over its ancient foundations. Those foundations were officially laid in 332 B.C.E., when legend has it that the Greek bard Homer appeared to Alexander the Great in a dream and urged him to found a city along the narrow strip of land separating Lake Mareotis from the Mediterranean Sea. It seemed an unlikely choice; the mouth of the Nile was far to the east, where the major Egyptian ports were well established, and only small villages existed on the spot.

    The city grew to prominence after Alexander died in 323 B.C.E., and his general Ptolemy made it the capital and largest port of Egypt. The vast wealth of Ptolemy and his Greek and Macedonian successors built a Greek-style city of temples, lavish palaces, and the famous library, says historian Gunther Grimm of Germany's University of Trier; scholarship in philosophy, physics, mathematics, and astronomy thrived. Under the Roman rule that followed Cleopatra's death in 30 B.C.E., Alexandria served as the nexus for grain exports for the vast empire. But within a few centuries, the city largely vanishes from the historical record.

    Now nearly a dozen teams of excavators are sifting through what remains both in the city and in the harbor. Archaeologist Jean-Yves Empereur of the Center for Scientific Research in Paris is working fast to salvage remains of the ancient city as the new one expands. “Ten to 12 meters beneath the modern city, the old one is very well-preserved,” says Empereur, who was one of several scholars who declined to attend the conference because of their concerns about Oxford's ties to a private underwater archaeologist, Franck Goddio (see sidebar). Empereur adds that the houses, streets, and mosaics that have been uncovered represent “just 1% of what could be rescued.”

    Even that small percentage is rewriting the city's history. Most historians assume that intellectual life in the city withered with the destruction of the library—which likely occurred over hundreds of years—and the rise of Christianity. But among the most intriguing recent finds is a complex of lecture halls that appear to be “the center of [the city's] intellectual and social life in late antiquity,” says Warsaw University's Grzegorz Majcherek of the Polish-Egyptian Archaeology Mission. Each hall includes a single central seat for a notable—likely the teacher—and often a smaller seat on the floor, perhaps for student recitations. The complex is part of the old city's most extensive area of urban architecture. Majcherek estimates that the halls were built in the late 5th and early 6th centuries C.E. and notes that a Roman theater was even converted into a lecture hall at this time. He speculates that what he calls “the Oxford of antiquity” could have survived into the era of Arab control—“surprisingly late.”

    The find intrigues historians, who say there has been little evidence that intellectual life in the city flourished for so long. “This is the most exciting find in years in Alexandria,” says Clauss. “The buildings Professor Majcherek has found demonstrate the existence of a think tank” long after the fall of Rome. “It is surprising that it seems to function in a modern way,” he adds.

    Facing the past.

    Divers have found artifacts such as this sphinx representing Cleopatra's father.

    CREDIT: FRANCK GODDIO/HILTI FOUNDATION/DCI

    Down under

    Just a few hundred meters away, an important part of ancient Alexandria lies undisturbed underwater, meters from the modern breakwater lining the harbor. In the 1990s, Empereur uncovered statuary and blocks that may be portions of the Pharos lighthouse, which survived in ruins until an earthquake in the 14th century. Goddio found a sunken palace from the Ptolemaic era and brought up statues and other artifacts that he hailed as remnants of Cleopatra's palace. That claim, as well as the exact location of the Pharos, remains in dispute.

    More recent finds are less spectacular, but they shed important light on the evolution of the harbor that was Alexandria's heart. For example, Goddio's team now has found evidence of a dock that dates to about 400 B.C.E., predating Alexander. “We were surprised, took new samples, and got the same answer—this was most probably a pre-Ptolemaic structure [and is now] 7.5 m below sea level,” says Goddio. Geologist and team member Jean-Daniel Stanley of Washington, D.C.'s Smithsonian Institution told meeting participants he has found tantalizing hints that inhabitants smelted lead on the site as early as 2000 B.C.E.

    The discoveries are part of an ambitious effort by Goddio to map the entire harbor bottom—one data point for every 25 centimeters—and conduct extensive radiocarbon dating of planks and pilings brought up by divers. The survey of the 2.5-kilometer-by-1.5-kilometer area will give researchers “quite a precise idea” of the location of docks and buildings that lined the harbor, says Goddio: “A ghost from the past is being brought back to life.”

    Meanwhile, geologist Stanley has examined dozens of cores from the harbor and uncovered evidence of the centuries-long battle that ancient engineers waged against both gradual and sudden subsidence. He says the subsidence was brought on by a lethal combination of earthquakes, tsunamis, and the slow but relentless sinking of heavy foundations into unstable soil, which defeated even savvy Roman engineers. Although several wharves appear to have been reconstructed over centuries, no amount of piling could long hold up heavy stone foundations and buildings, he says. “[Adding] on all that material was asking for trouble,” Stanley says. “The additional weight of a wave surge could be powerful enough” to submerge part of Alexandria's shore.

    SOURCE: ADAPTED FROM FRANCK GODDIO/HILTI FOUNDATION

    The historical record also shows an unusually active period of tremors from the 4th to the 6th centuries C.E. Quakes and tsunamis could have transformed sediment into a more fluid state, says Stanley. Sixty-five cores taken from the western harbor show signs of ancient liquefaction, he said, and numerous pieces of red coral not native to the harbor suggest that a tsunami washed them into the basin. But he says it is too early to reconstruct details of ancient collapse and rebuilding. “We need better 3D images of harbor substrate” to understand what repairs were done and when, he said.

    The impact of these geological forces extended beyond Alexandria—and with even more dramatic consequences. Stanley and Goddio also are excavating three submerged cities in nearby Aboukir Bay: Herakleion, Canopus, and Menouthis. The first was an important entrance point to the mouth of the Nile, and the others were well-known pilgrimage sites. The area received huge amounts of sediment from the Nile, which compacted and sank over time. This process, combined with a slow rise in world sea levels, pushed the water at least 2 meters higher between the 6th century B.C.E. and the 7th century C.E. “Arabic texts show a huge Nile flood in 741 and 742 A.D.,” notes Clauss. And by the 8th century—the same time Alexandria slips into obscurity—the historical record on these sites goes silent.

    Stanley's research supports a theory that combines catastrophe and gradual sinking to explain the disappearance. Submergence alone cannot explain why much of the area is now a full 6 meters under water, and Stanley posits that sudden shifts in the flow of Nile branches on the delta—perhaps brought on by the spate of earthquakes—may have triggered more dramatic changes. Unstable sediment would have been laterally displaced, causing sudden destruction as the Nile moved into a new bed. Goddio's team has found evidence of human remains underneath toppled walls at the three sunken cities, backing up this theory.

    In an era of climate change and fast urban growth along coasts, this research may have implications today. Stanley notes that modern cities such as Venice, Bangkok, and New Orleans sit on unstable delta soils. “This is becoming a world problem,” he says. “Understanding the subsidence threat might help such cities avoid the fate of Herakleion, Canopus, and Menouthis.” Alexandria, at least, escaped with only a flooded harbor—a sign that Homer perhaps was as canny a geologist as he was a storyteller.

    • *“City and Harbour: The Archaeology of Ancient Alexandria” at the Oxford Centre for Maritime Archaeology, 18–19 December 2004.

  14. ARCHAEOLOGY

    Oxford Center Raises Controversy

    1. Andrew Lawler

    Ancient Alexandria was famed for its philosophical disputes, and that tradition is very much alive in excavations now under way in the Egyptian port. Scholars are hotly debating a controversial agreement that gives a nonscientist, French businessman Franck Goddio, control over underwater archaeological data collection for Oxford University. At a conference held in December—a coming-out party for Oxford's new Center for Maritime Archaeology—dozens of scholars discussed new finds (see main text). But others avoided the event, arguing that contracting out the leadership of maritime digs to nonscientists sets a poor precedent.

    Under the deal signed 18 months ago, Goddio will oversee undersea excavations; Oxford graduate students, under the guidance of professors, will analyze the data. The Hilti Foundation of Liechtenstein, which has supported Goddio's work for a decade and is funded by a tool company of the same name, will provide at least $300,000 to fuel the center, which for now will focus on Goddio's work in Alexandria and nearby Aboukir Bay.

    Goddio, who has worked for 20 years on more than 50 underwater sites, is a Jacques Cousteau of archaeology, often featured on European television. Although he lacks a degree in archaeology—he studied statistics—Goddio says his experience speaks for itself. But he also seeks academic respectability. The Oxford agreement “is a chance for us to get closer to a university which could back our work and take advantage of our discoveries,” says Goddio. “We were looking for a scientific base or ‘harbor’ for the findings and results from Franck Goddio's excavations,” adds Michael Hilti, who heads the Hilti Foundation. “With Oxford, I think we have found a perfect partner.”

    The arrangement makes sense to university officials, who are eager to enter the burgeoning and expensive field of maritime archaeology. “We were blown over by the quality of Franck's underwater fieldwork,” recalls Barry Cunliffe, the Oxford classical archaeologist who helped broker the deal. “It was an extremely smart piece of archaeology, well-ordered and observed.”

    Modern mariner.

    Former businessman Franck Goddio leads spectacular underwater digs.

    CREDIT: CHRISTOPH GERIGK/FRANCK GODDIO/HILTI FOUNDATION

    But Goddio's deal with Oxford raises concerns among many maritime archaeologists uncomfortable with turning over part of the scientific process to those who lack formal training. “I'd be wary of parceling out one of the links in the chain of science,” says maritime archaeologist Jon Adams of the University of Southampton, U.K. “Archaeology should be conducted by proven and trained archaeologists,” adds George Bass, a professor emeritus at Texas A&M University in College Station who is considered one of the founders of maritime archaeology.

    Robert Grenier, head of Ottawa's Parks Canada maritime archaeology unit, adds that Goddio's record is big on coffee-table books but small on scholarly publications. For example, he says, Goddio excavated the 35-meter-long Spanish galleon San Diego off the coast of the Philippines and produced a glossy catalog but limited scientific data. Grenier worries about data that may not be collected, such as apparently inconsequential fragments that might provide a clue to a ship's identity or place of construction.

    Goddio defends his record, noting that the second San Diego mission lasted more than 4 months and was devoted to understanding the ship's hull construction; he adds that he still hopes to publish more details.

    Cunliffe insists that skilled nonscientists can make an enormous contribution because retrieving information from underwater digs is so technologically intensive and expensive. The choice he sees is to ignore nonscientists' expertise and funding, or to find a creative way to work with it. “The cost of doing this work is almost prohibitive unless you have the backing of a large foundation,” he says.

    A large maritime excavation can cost upward of $1 million a month, forcing many underwater archaeologists to seek foundations or television producers to help fund their work. “We've all done a bit of whoring to get the money we need,” admits one respected maritime archaeologist. And when it comes to Goddio's bountiful financial support, he adds, “I'm jealous.”

  15. RADIO ASTRONOMY

    Bristling With Promise

    1. Kim Krieger*
    1. Kim Krieger is a freelance science writer in Washington, D.C.

    By substituting software for massive signal-gathering dishes, arrays of simple FM antennas offer astronomers a cheap, versatile alternative to traditional radio telescopes

    In a remote Chinese valley sit 25 neat clusters of antennas, each tipped slightly askew. They are testing the airwaves, listening for interference from TV signals. If reception is clear enough and other things go well, within the next year or two the fields of the Ulastai Valley will fill with tilted antennas, like a Christmas tree farm pummeled by wind.

    The valley will become a huge array of 2-meter-long antennas, 10,000 strong, covering 30,000 square meters. The array, dubbed the Primeval Structure Telescope (PaST), is the brainchild of a group of Canadian, Chinese, and American scientists pursuing a low-frequency portrait of the early universe. And they hope to find it out in the vast, quiet stretches of western China, one of the last places on Earth out of the reach of jabbering TV and FM radio broadcasts.

    Though just 25 pods of 127 antennas each right now, PaST is a herald of what's to come. Thanks to recent advances in theory and computing power, radio astronomers can now build telescopes consisting of huge arrays of antennas capable of viewing the universe in a novel palette of low frequencies hitherto rarely used for astronomical observations. “What's most exciting to me [is] that we don't know what we're going to see,” says PaST collaborator Jeffrey Peterson of Carnegie Mellon University in Pittsburgh, Pennsylvania.

    Peterson isn't alone in his enthusiasm. Several other array telescope projects are under way in the Netherlands, Western Australia, and the American Southwest. Their scientific goals include finding radio equivalents of gamma ray bursts and detecting the faint traces of the first stars.

    The arrays will take radio astronomy back to its roots in the 1930s, when Karl Jansky, an engineer at Bell Telephone Laboratories in Holmdel, New Jersey, noticed radio waves emanating from the center of the Milky Way galaxy at 20 MHz. The field took off after radar operators during World War II discovered a technique called interferometry, which enabled astronomers to string together several small antennas to get the same resolution as that of one huge antenna. But researchers soon realized that low frequencies were wrinkled and warped into indecipherability by Earth's ionosphere. Frustrated, they switched their attention toward frequencies above 1 GHz. And that's where radio astronomy stayed until recently, when new calibration techniques opened a window into the low-frequency range.

    The breakthrough came in 1991, when astronomers using computer algorithms to correct for the effects of ionospheric interference jiggered the Very Large Array (VLA), a Y-shaped assemblage of 27 dish antennas in western New Mexico, into receiving at a record low frequency of 74 MHz. Their success blasted open opportunities for large low-frequency arrays. “The old low-frequency telescopes were like a nearsighted person trying to read from far away without his glasses,” says Namir Kassim, a radio astronomer at the Naval Research Laboratory in Washington, D.C. “When we learned how to put glasses on these telescopes, all hell broke loose, and now everyone's trying to build them.”

    VLA and its kin, however, suffer from a major disadvantage: Dish antennas are not good at picking up low frequencies. If the length of the wave is close to the diameter of the dish or longer, the dish can't see it at all. By contrast, arrays of wire antennas—either simple FM dipoles or log-periodic antennas made of multiple dipoles—can be designed to pick up any wavelength astronomers might fancy. All it takes is the right arrangement, enough land, and a supercomputer programmed to convert the jumbles of waves sweeping across the arrays into useful images. Somewhere between 200 MHz and 100 MHz, the efficiency advantage switches over from dishes to dipoles.
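
    The physics behind that trade-off is the wavelength-frequency relation, wavelength = c/frequency, plus the rule of thumb that a dish performs well only when its diameter spans many wavelengths. The minimal sketch below is illustrative: the 25-meter figure is a standard VLA dish diameter quoted for scale, and the sample frequencies are my own choices, not values from the article.

        C = 299_792_458.0            # speed of light, in meters per second

        def wavelength_m(freq_mhz):
            # wavelength = c / f, with the frequency given in megahertz
            return C / (freq_mhz * 1e6)

        for f in (1400, 200, 100, 74, 20):
            lam = wavelength_m(f)
            print(f"{f:>5} MHz  ->  wavelength {lam:6.1f} m  "
                  f"({25.0 / lam:5.1f} wavelengths across a 25 m dish)")

    Below roughly 100 MHz a 25-meter dish is only a few wavelengths across, so its sensitivity and beam quality fall off, whereas a 2-meter dipole is still a healthy fraction of a wavelength and costs almost nothing to build.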

    In the early 1990s, inspired by the new calibration techniques, scientists from the United States, the Netherlands, and Australia started investigating designs and locations for the Low Frequency Array. LOFAR was conceived as a cheap, quick instrument to get a first look at the low-frequency universe, says Kassim, who was LOFAR's international project scientist. But ballooning costs and disagreements over siting and scientific goals crippled the project, and the collaboration disintegrated.

    Today, Germany and the Netherlands are working to build a 25,000-antenna LOFAR array spread over an area 350 kilometers in diameter on both sides of their border; researchers hope to develop sophisticated new techniques to filter out radio and television signals. Other erstwhile LOFAR partners, meanwhile, have hatched projects that rely on geography to solve the problem of interference. Kassim and colleagues at the Naval Research Lab are now collaborators in the Southwest Consortium, a low-frequency radio astronomy project based in the American Southwest, a popular site for traditional radio telescopes. The telescope, known as the Long Wavelength Array (LWA), will examine ultralow frequencies from 20 MHz to 80 MHz with 10,000 dipoles strung along 400 kilometers.

    Array of hope.

    Peterson predicts a bright future for PaST.

    CREDIT: COURTESY OF J. PETERSON

    Other former LOFAR participants—astronomers at the Massachusetts Institute of Technology (MIT), the University of Melbourne, and the Australia Telescope National Facility—have joined with the Harvard-Smithsonian Center for Astrophysics to plan a 3000-antenna low-frequency array in the outback of Western Australia. If all goes well, the Mileura telescope will look at large-scale structure in the universe. The United States and Australia are negotiating joint funding for the $10 million project.

    Location is key. FM radio and television signals are as damaging to radio astronomy as light pollution is to optical astronomy and are far more pervasive. “The best place for this would be the far side of the moon,” says Jacqueline Hewitt, an astronomer at MIT involved in the planning of the Mileura project. “But if we can't go to the moon, we'll go to western Australia.” PaST scientists say that if they could afford to, they would build their array at the South Pole, 2000 km from the nearest TV transmitter and with a 6-month polar night to increase the transparency of the ionosphere.

    Remote real estate aside, antenna arrays offer several advantages over traditional radio telescopes. For one, their cost—a fraction of the $3.5 million price tag of a single VLA dish—puts them within reach of even small astronomical partnerships. In the spring of 2003, for example, PaST was just an idea Carnegie Mellon's Peterson was batting around with Ue-Li Pen of the University of Toronto. Within a year, Xiang-Ping Wu of the National Astronomical Observatory of China had joined the project, and they had secured $600,000 in funding from the Chinese government, found a site, and set up test antennas. The current funding will support an array of 2500 antennas. The researchers expect that support from the National Astronomical Observatory will enable them to install another 7500 antennas by 2006.

    Antenna arrays boast technical advantages as well. Existing radio dish telescopes, such as Arecibo in Puerto Rico, just aren't up to the task of seeing at ultralow frequencies, the researchers say. Big dish telescopes with just one huge antenna have too narrow a field of view. They can sweep the sky over a period of time, but sweeping complicates the delicate calibration needed to compensate for the ionosphere. Dish arrays have the same drawback. Dipole arrays, by contrast, can effectively look in any direction or all directions at once, as long as computers are available to crunch the data. Log-periodic arrays are similar but have narrower fields of reception.

    The 180-degree view of a dipole array means that transient phenomena are far more likely to be caught. For example, researchers say, a very high-resolution array might pick up bursts of ultralow-frequency radio waves from gas-giant planets circling other stars—something a dish telescope could spot only if it were looking in exactly the right direction at the right time. Arrays might also pick up long-theorized radio counterparts to the cosmic energy blasts known as gamma ray bursts. By studying how such radio bursts distort as they cross space, astrophysicists could test cosmological models that predict there should be lots of ionized gas between galaxies.

    The most mouthwatering possibility is that sensitive array telescopes will catch whispers of reionization, the moment early in cosmic history when the first stars flickered on. At that time the universe was filled with neutral hydrogen gas. Ultraviolet light from newly ignited stars stripped electrons off hydrogen atoms, leaving ionized gas that no longer emits the characteristic radio signal of neutral hydrogen. The result, if radio astronomers can spot it, should be big patches of silence between 100 and 150 MHz amid an otherwise steady hum of neutral hydrogen. PaST, Mileura, and LOFAR all hope to detect reionization, although researchers acknowledge that it's a long shot.
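
    That 100-to-150-MHz window follows from redshifting the neutral-hydrogen signal, presumably the 21-centimeter line with a rest frequency of about 1420 MHz; the arithmetic below is a back-of-the-envelope check of my own, not a figure from the projects themselves.

        F_REST_MHZ = 1420.4          # rest frequency of the neutral-hydrogen 21 cm line

        def redshift_for(f_obs_mhz):
            # A line emitted at f_rest is observed at f_rest / (1 + z).
            return F_REST_MHZ / f_obs_mhz - 1

        print(f"150 MHz corresponds to z of about {redshift_for(150):.1f}")   # ~8.5
        print(f"100 MHz corresponds to z of about {redshift_for(100):.1f}")   # ~13.2

    Redshifts of roughly 8 to 13 correspond to the first few hundred million years after the big bang, which is why patches of silence in that band would mark the era when the first stars reionized the gas around them.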

    At least astronomers won't have to wait long for the arrays themselves. PaST spotted galaxies at redshifts of 0.3 last year with just two protopods, and it is scheduled to start collecting data seriously this spring. The Southwest Consortium could have the core of the LWA up and running by 2008. The Mileura project hopes to erect its first test antennas by the end of the month. And more ambitious arrays may be in the works. From now on, when the universe rumbles in the ultralow, we'll be listening.
