News this Week

Science  26 Jun 2009:
Vol. 324, Issue 5935, pp. 1626
  1. Biosecurity

    Discovery of Untracked Pathogen Vials at Army Lab Sparks Concerns

    1. Yudhijit Bhattacharjee

    Like forgotten ice cream in the back of a freezer, more than 9200 unlisted vials of dangerous pathogens and toxins have turned up in an inventory of biological materials at the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) in Fort Detrick, Maryland. USAMRIID officials say most of the samples—representing about 13% of 70,000 vials housed in the institute—were left behind by researchers who worked at the institute before 2005. Officials say although they cannot be “100%” certain that no unlisted samples were lost or stolen, security measures at the institute make it unlikely.

    The finding, announced by USAMRIID deputy commander Mark Kortepeter last week, has surprised some researchers and security experts, who say the institute may have violated federal rules by not conducting such an exhaustive inventory right after the enactment of the 2002 Bioterrorism Act. It has also left some wondering whether other biodefense labs might have incomplete databases of select agents.

    “I am greatly shocked,” says Janet Shoemaker, director of public affairs at the American Society for Microbiology in Washington, D.C., who helped draft the bioterrorism legislation after the 2001 anthrax letter attacks. However, “the fact that USAMRIID is releasing this information shows a new level of transparency, and that's refreshing,” says Paul Keim, a microbiologist at Northern Arizona University (NAU) in Flagstaff.

    The institute has been under intense public scrutiny since last summer, when the Federal Bureau of Investigation implicated former USAMRIID researcher Bruce Ivins in the 2001 anthrax attacks. On 4 February, institute commander John Skvorak ordered a full inventory of the materials after a spot inspection uncovered four vials of Venezuelan equine encephalitis that had never been entered into the institute's electronic database, created in 2005 (Science, 13 February, p. 861). Over the next 4 months, research at the institute stood still as scientists counted every vial in its 330-odd freezers and refrigerators. “The vast majority” of the unlisted samples were “working stocks that had accumulated over several decades,” says Kortepeter.

    The initial inventory for the electronic database 4 years ago had focused on vials that were in active use, Kortepeter says. The process failed to capture materials that had been pushed to the back of the freezers after the researchers working with them had left the institute. “It would have been better to do a complete inventory,” concedes Sam Edwin, USAMRIID's inventory control officer.


    USAMRIID officials say documenting vials from every box in every freezer was a monumental task.


    Shoemaker points out that one of the rules that went into effect after the 2002 Bioterrorism Act was passed, CFR 73.17, requires that labs permitted by the government to possess select agents maintain an “accurate, current inventory” for each agent, including the quantity and where it is stored. USAMRIID's failure to do so may have violated that rule, she says.

    In addition, in late 2002, the Centers for Disease Control and Prevention (CDC) and the Department of Health and Human Services notified all labs across the country that they must dispose of any select agents in their possession or register under the select-agent program. The notification prompted some labs to autoclave their select agents or ship them to other facilities. Shoemaker finds it surprising that managers at USAMRIID “did not go into the institute's freezers” back then “to determine what they had in there.”

    Keim, whose lab at NAU handles anthrax and other pathogens, is surprised, too. After the notification, “we scrambled like crazy and had to divert people from other projects to do a 100% inventory,” he says. “I remember spending hundreds of hours in the biocontainment suite over a 6-month period, putting bar codes on every vial.”

    The reason that was never done at USAMRIID was sheer volume, says Kortepeter, who insists that the institute broke no rules. “There may be some latitude in how the regulations regarding inventory are interpreted,” he explains, adding that the rules don't specify accounting down to the vial level. And documenting every vial is tough, he says. “You open these small drawers, the vials are caked up in frost, it's dark,” he says. “You can't hold the door open very long” because an increase in temperature could destroy samples.

    Army spokesperson Michael Brady says the Army may decide to investigate whether any unlisted vials could have gone missing. He says all of the Army's other biodefense labs, which are much smaller than USAMRIID, have completed 100% inventories in recent months. From now on, says Brady, all Army labs, including USAMRIID, will have to do a complete inventory every year, and the Army will conduct random inspections every quarter.

    Shoemaker says USAMRIID's case raises questions about lab inspections conducted by federal agencies to ensure compliance with select-agent rules. At least in USAMRIID's case, a CDC inspection in September 2008 failed to identify the problem of unlisted samples, say USAMRIID officials.

  2. Fusion Research

    ITER Gets the Nod for Slower, Step-by-Step Approach

    1. Daniel Clery

    The managers of the ITER fusion project are still scrambling to draw up a final design, schedule, and cost estimate for the massive reactor. But the international partners behind the effort agreed last week to build the project in stages so engineers can make corrections if something goes awry early on.

    First, a simple stripped-down reactor will start producing a superhot hydrogen plasma in 2018; then components will gradually be added to prepare it for a power-producing plasma of deuterium and tritium by the end of 2026, some 18 months to 2 years later than previously planned. Member countries agreed to the new plan, known as scenario 1, at the half-yearly meeting of the ITER council in Mito, Japan, contingent on their accepting the full revised design, costing, and schedule, known as the baseline, at their next meeting in November. “We're learning as we go,” says David Campbell, deputy head of ITER's fusion science and technology department. “There is some delay, but it's the first time we've done this.”

    ITER, originally called the International Thermonuclear Experimental Reactor, aims to prove that nuclear fusion, the process that powers the sun and other stars, could provide Earth-bound energy. Researchers spent about 15 years drawing up a design for ITER, and the ground is now prepared at Cadarache in southern France to begin construction this year. But when the ITER organization was officially created in October 2007, staff were working from a baseline drawn up in 2001. Researchers spent the next year working on a redesign incorporating the latest results from plasma physics.

    But the physicists' wish list inevitably pushed up the price tag. The governments of the seven members—China, the European Union, India, Japan, South Korea, Russia, and the United States—were alarmed by early cost estimates, which insiders say ranged as high as twice the original €5 billion construction estimate (Science, 27 June 2008, p. 1707). “There's no question that costs have gone up,” says Steven Cowley, director of the Culham Science Centre near Oxford, the U.K.'s fusion research lab, adding: “I'm pretty optimistic that we'll see a number of measures to bring down those costs.”

    Ready, set, …

    The ground is prepared at Cadarache in southern France. Construction of ITER will begin this year.


    ITER managers proposed scenario 1 mainly to reduce technical risk, says Campbell. “We'll be able to shake down the core system, demonstrate first plasma, then integrate other systems,” he says. “If something goes wrong, it's easier to get at it and fix it.” Earlier fusion reactors, including JET at Culham, currently the world's largest, adopted a similar staged approach. The machine that researchers will fire up in 2018 will pretty much just comprise the vacuum vessel, superconducting magnets to hold the plasma in place, and a cryogenic system to cool the magnet coils. Researchers will first run the reactor with normal hydrogen inside to learn how to control the plasma, and there won't be any fusion reactions.

    In following years, engineers will install diagnostic instruments, microwave and particle beam heating systems to raise the plasma's temperature, a metal blanket on the inside wall of the vessel to absorb neutrons from the fusion reactions, and a divertor to extract spent fuel. Only when all of those components are working properly will researchers fire up a plasma of deuterium and tritium and attempt to generate energy. Once they start, the vessel will be radioactive and harder to modify. Campbell says scenario 1 will delay the start of deuterium-tritium operation from early 2025 to late 2026. “Scenario 1 has a lot of attractions for a machine builder,” says Cowley. “It's a better idea than the original idea.”

    Estimating ITER's cost remains a thorny issue and may be impossible to do accurately, project scientists say. In the 2001 baseline, researchers had calculated a value for ITER's various components so that they could be parceled out fairly to the partners to build and deliver to the site. Each partner decides for itself how much to spend on its components, depending on local market conditions and commodity prices. The ITER organization in Cadarache is in charge of only about 10% of spending. “We're trying to make the best estimate of the costs we're responsible for,” says Campbell.

    As part of that effort, after ITER staff presented the first results of the redesign at a council meeting in June 2008, the council ordered JET's former operations director, Frank Briscoe, to carry out an independent review of costs. As of last week's meeting, that review was still going on, and the council requested that a final revised baseline be presented at its next meeting in November.

    Cowley thinks the council would not even consider scaling down the reactor to cut costs. “You would never get to fusion with anything less,” he says. “ITER [as it stands] is the minimum possible.” In the end, Cowley believes, the members are less concerned about the overall cost than about keeping annual expenditure steady. The phased construction plan has the added benefit that it avoids a huge peak in construction activity and spending around 2015–16.

  3. Depression Gene

    Back to the Drawing Board for Psychiatric Genetics

    1. Constance Holden

    Geneticists have long been immersed in an arduous and largely fruitless search to identify genes involved in psychiatric disorders. In 2003, a team led by Avshalom Caspi, now at Duke University in Durham, North Carolina, finally landed a huge catch: a gene variant that seemed to play a major role in whether people get depressed in response to life's stresses or sail through. The find, a polymorphism related to the neurotransmitter serotonin, was heralded as a prime example of “gene-environment interaction,” whereby an environmental trigger influences the activity of a gene in a way that confers susceptibility.

    “Everybody was excited about this,” recalls Kathleen Merikangas, a genetic epidemiologist at the National Institute of Mental Health (NIMH) in Bethesda, Maryland. “It was very widely embraced.” Because of the well-established link between serotonin and depression, the study offered a plausible biological explanation for why some people are so much more resilient than others in response to life stresses.

    But an exhaustive new analysis published last week in The Journal of the American Medical Association suggests that the big fish may be a minnow at best.

    In a meta-analysis, a multidisciplinary team headed by Merikangas and geneticist Neil Risch of the University of California, San Francisco, reanalyzed data from 14 studies, including Caspi's original, and found that the cumulative data fail to support a connection between the gene, life stress, and depression. It's “disappointing—of all the [candidates for behavior genes,] this seemed the most promising,” says behavioral geneticist Matthew McGue of the University of Minnesota, Twin Cities.

    The Caspi paper concluded from a longitudinal study of 847 New Zealanders that people who have a particular variant of the serotonin transporter gene are more likely to be depressed by stresses, such as divorce and job loss (Science, 18 July 2003, pp. 291, 386). The gene differences had no effect on depression in the absence of adversity. But those with a “short” version of the gene—specifically, an allele of the promoter region of the gene—were more likely to be laid low by unhappy experiences than were those with two copies of the “long” version, presumably because they were getting less serotonin in their brain cells.

    Subsequent research on the gene has produced mixed results. To try to settle the issue, Merikangas says, “we really went through the wringer on this paper.” The group started with 26 studies but eliminated 12 for various reasons, such as the use of noncomparable methods for measuring depression. In the end, they reanalyzed and combined data from 14 studies, including unpublished data on individual subjects for 10 of them.

    When life gets you down …

    Serotonin-linked variant may not have a hand in it after all.


    Of the 14 studies covering some 12,500 individuals, only three of the smaller ones replicated the Caspi findings. A clear relationship emerged between stressful happenings and depression in all the studies. But no matter which way they sliced the accumulated data, the Risch team found no evidence that the people who got depressed from adverse events were more likely to have the suspect allele than were those who didn't.

    Caspi and co-author Terrie Moffitt, also now at Duke, defend their work, saying that the new study “ignores the complete body of scientific evidence.” For example, they say the meta-analysis omitted laboratory studies showing that humans with the short allele have exaggerated biological stress responses and are more vulnerable to depression-related disorders such as anxiety and post-traumatic stress disorder. Risch concedes that his team had to omit several supportive studies. That's because, he says, they wanted to focus as much as possible on attempts to replicate the original research, with comparable measures of depression and stress.

    Many researchers find the meta-analysis persuasive. “I am not surprised by their conclusions,” says psychiatric geneticist Kenneth Kendler of Virginia Commonwealth University in Richmond, an author of one of the supportive studies that was excluded. “Gene discovery in psychiatric illness has been very hard, the hardest kind of science,” he says, because scientists are looking for multiple genes with very small effects.

    Dorrett Boomsma, a behavior geneticist at Amsterdam's Free University, points out that many people have questioned the Caspi finding. Although the gene was reported to have an effect on depression only in the presence of life stress, she thinks it is “extremely unlikely that it would not have an independent effect” as well. Yet recent whole-genome association studies for depression, in which scientists scan the genomes of thousands of subjects for tens of thousands of markers, she adds, “do not say anything about [the gene].”

    Some researchers nonetheless believe it's too soon to close the book on the serotonin transporter. Psychiatric geneticist Joel Gelernter of Yale University agrees with Caspi that the rigorous demands of a meta-analysis may have forced the Risch team to carve away too much relevant material. And NIMH psychiatrist Daniel Weinberger says he's not ready to discount brain-imaging studies showing that the variant in question affects emotion-related brain activity.

    Merikangas believes the meta-analysis reveals the weakness of the “candidate gene” approach: genotyping a group of subjects for a particular gene variant and calculating the effect of the variant on a particular condition, as was done in the Caspi study. “There are probably 30 to 40 genes that have to do with the amount of serotonin in the brain,” she says. So “if we just pull out genes of interest, … we're prone to false positives.” Instead, she says, most geneticists recognize that whole-genome scans are the way to go. McGue agrees that behavioral gene hunters have had to rethink their strategies. Just in the past couple of years, he says, it's become clear that the individual genes affecting behavior are likely to have “much, much smaller effects” than had been thought.

  4. International Collaboration

    Bioscientists Slowly Bridge The Taiwan Strait

    1. Dennis Normile

    TAIPEI—At a recent conference here on bioscience, the presentation by Chen Pei-Jer, a molecular virologist at the National Taiwan University (NTU) College of Medicine in Taipei, attracted attention not just because of his solid work on hepatitis B but more importantly because it was one of very few reports resulting from a collaboration between scientists in Taiwan and the Chinese mainland. “It is quite natural to have those on both sides [of the Taiwan Strait] work together. Both not only share a common cultural heritage but also a very similar history of human disease loads,” Chen says. Unfortunately, despite being natural, “there still is little cooperation.”

    The dearth of joint work among life scientists is puzzling because cross-strait ties in other fields are flourishing. The bottom line, researchers say, is that for various reasons, life scientists in Taiwan and on the mainland simply haven't had the chance to get to know each other yet; fostering those contacts was one of the objectives of the conference. “It's part of the mission of the Society of Chinese Bioscientists in America to foster scientific collaborations,” says SCBA treasurer Dihua Yu, a molecular biologist at the University of Texas (UT) M. D. Anderson Cancer Center in Houston.

    It took a while for Cold War tensions between Taiwan and the mainland to thaw. But economic and scientific links started growing in the 1990s, despite thickets of regulations (Science, 16 May 2003, p. 1074). Scientific cooperation has now become routine in many fields. The Academia Sinica Institute of Astronomy & Astrophysics in Taipei is working with astronomers at the Chinese Academy of Sciences (CAS) Purple Mountain Observatory in Nanjing to develop instruments for radio telescopes. Particle physicists on both sides of the strait are also working on each other's neutrino experiments.


    In a recent advance, just after the May 2008 Wenchuan Earthquake devastated Sichuan Province, CAS president Lu Yongxiang wrote to his counterpart at Academia Sinica, Wong Chi-Huey, suggesting that Taiwanese experts share knowledge gained from the September 1999 earthquake that struck central Taiwan. It was the first ever official contact between the presidents of the two institutions, which both trace their roots to an organization founded on the mainland in 1928. Liu Chao-Han, Academia Sinica vice president for international affairs, led an 11-member team of engineers, geologists, psychologists, and lawyers that met with mainland counterparts last summer. That workshop led to “regular exchanges and joint studies related to earthquake engineering and reconstruction,” says Liu. They are now discussing joint studies of typhoons and other topics.

    Meanwhile, the logistics of cross-strait cooperation have gotten easier. Since July 2008, there have been direct flights between the mainland and Taiwan, eliminating the need to change planes in Hong Kong. And although there are still some bureaucratic hurdles, researchers say such things as visa approvals are going through more smoothly.

    Deadly difference.

    According to a 2005 study, liver cancer is a more prominent cause of cancer deaths in China than in the West.


    Amid this progress, why is cooperation among life scientists lagging? One reason is that unlike physicists and astronomers, life scientists are less dependent on shared facilities and often work in small groups. And if they think of collaborations, many turn to U.S. contacts, in part because the infrastructure in Asia “is not even close to what is in the U.S.,” says Wang Xiaodong, a biochemist at UT Southwestern Medical Center in Dallas who is also co-director of the National Institute of Biological Sciences in Beijing.

    What promises to bring the two sides together is biomedicine. Given the genetics involved, “to answer questions about disease patterns in East Asia, we have to collaborate within the region,” says Yang Pan-Chyr, dean of the NTU College of Medicine in Taipei. UT Southwestern's Wang notes that for clinical studies in general, “China offers so many patients.”

    Chen, for example, went to China to expand his studies of liver diseases, particularly cancer and viral hepatitis B. “I realized the impact of our research would be bigger if it extended to the mainland,” Chen says. The point has been driven home by demand for a hepatitis B mouse model he developed in 2006: 80% of requests for the mice came from China, with the rest split between the United States and Europe.

    Chen formed close ties with Xiamen University, on the mainland coast directly across from Taiwan, and is now codrafting an agreement for visits of graduate students and joint work on diagnostic kits and therapeutics. There are still hurdles. Chen wants to sequence the genome of a Taiwanese—hence ethnic Chinese—liver cancer victim. But the approvals needed to send samples directly to China look daunting, so he is considering working with U.S. colleagues.

    Chen points to other examples: The lead investigator for an Asia-wide clinical trial of a liver cancer drug is his NTU colleague Cheng Ann-Lii, though many of the clinical sites are on the mainland. And Yang, dean of the NTU medical school, says NTU is also working with Guangzhou Institute of Respiratory Diseases on lung cancer studies.

    Just when such collaborations will extend to more basic life science research remains a question, although researchers who gathered here for the conference professed to be interested in making them happen.

    * The 12th Society of Chinese Bioscientists in America International Symposium, 14–18 June 2009.

  5. Science in Society

    China Reins in Wilder Impulses in Treatment of ‘Internet Addiction’

    1. Richard Stone

    BEIJING—World of Warcraft was Sun Jiuqing's undoing. It began to take over his life last autumn, after the 18-year-old had transferred to a new high school with higher academic standards than his previous one. He struggled from the start. “I couldn't catch up to the other students,” he says. Sun chose instead to escape. Before long, he was spending 10 hours a day playing World of Warcraft, an online game. Sun spurned pleas to stop. Finally, in late March, his father drove the young man a few hours from their home in Tianjin to a People's Liberation Army barracks in south Beijing and admitted him to the General Hospital of Beijing Military Region's Addiction Medicine Center (AMC).

    Boot camp.

    For youth diagnosed with “Internet addiction disorder” at the Addiction Medicine Center, psychiatrist Tao Ran (inset) prescribes everything from drugs to military-style discipline.


    No one doubts that logging long hours on the Internet can erode quality of life and on occasion can lead to ruinous consequences. “It's a global phenomenon,” says AMC director Tao Ran, a psychiatrist and senior colonel. In China alone, Tao estimates, 5 million of the country's 300 million Internet users are “Internet addicts.” Adolescents are especially vulnerable. “Youth who compulsively seek social contact on the Internet at the expense of offline activities may be finding it difficult to establish sufficiently gratifying social ties in their regular lives,” says sociologist Zeynep Tufekci of the University of Maryland, Baltimore County.

    But there is no meeting of the minds on whether Internet addiction (IA) is a genuine disorder. “Labeling these behaviors as deviant is being done by older generations who have very different experiences with technology,” argues Shelia Cotten, a sociologist at the University of Alabama, Birmingham. Overlooking or ignoring social factors can result in unnecessary medical interventions for behaviors that “do not fit into the structure of our society as it happens to be now,” adds Tufekci.

    An American Psychiatric Association panel is now weighing whether to include IA in the fifth edition of the field's diagnostic bible, the Diagnostic and Statistical Manual of Mental Disorders (DSM-V), planned for release in 2012. In an editorial in the March 2008 issue of The American Journal of Psychiatry, Jerald Block, a psychiatrist in Portland, Oregon, argued that IA “appears to be a common disorder that merits inclusion.” But Cotten, for one, disagrees. She thinks it is premature to include IA in DSM-V given the lack of consensus about what constitutes IA or “whether Internet addiction disorder even exists.”

    In China, the official view appears to be that Internet addiction is a genuine disorder, but attitudes are shifting about how aggressively it should be treated. Last year, CCTV-12, a central government channel, ran a series of glowing reports on a clinic in Shandong Province in eastern China that has used electric shocks on unanesthetized IA patients as part of what the clinic's director, Yang Yongxin, has called a “holy crusade” to cure IA. Earlier this month, the state-owned newspaper China Daily ran an article raising questions about Yang's methods, indicating an official about-face on the use of electric shocks as a valid IA treatment.

    Sizing up the beast

    Several years ago, when parents started showing up at AMC claiming that their adolescent children were Internet junkies, “at first I was skeptical we were seeing a true disorder,” says center psychologist Huang Xiuqin. But as cases accumulated, she and Tao, her boss, became convinced that IA is an authentic disorder. Of the more than 3000 cases they have chronicled, patients were spending on average 9 hours a day on the Net.

    Tao has been trying to put diagnosis and treatment of IA on a more solid footing. Last November, his group released the first diagnostic criteria for IA; a paper outlining the guidelines is under review at the journal Addiction. The group classifies sufferers in three categories: simple IA (about 40% of cases), IA with accompanying symptoms such as anxiety or depression (30%), and IA with a second disorder, such as attention deficit hyperactivity disorder (30%). About 80% of patients are teenage boys.

    Tao's group has defined seven IA symptoms, including preoccupation with the Internet, disregard of harmful consequences of spending too much time online, and loss of interest in other activities. “Only being on the Internet makes them happy,” says Tao. To qualify for diagnosis as an Internet addict, they have proposed that a person must spend at least 6 hours a day on the Internet (for reasons other than business or academic work) for at least 3 months after showing symptoms.

    In the vast majority of cases, Tao says, the culprit is online games, although female patients also often get hooked on online chat rooms. After arriving at AMC, “almost all” patients suffer withdrawal symptoms, Tao says, including anger, irritation, and restlessness, that fade after a few weeks. (Those reactions may not be too surprising: Adolescents are usually taken against their will to AMC, whose dormitory's entrance has steel bars—a state requirement of psychiatric wards.)

    AMC's treatments include behavioral training, drug therapy for patients with mental symptoms, dancing and sports, reading, karaoke, and elements of the “12 step” program of Alcoholics Anonymous. A “very important” part of the regimen is family therapy, says Tao. “Internet addiction occurs because the parents are doing something wrong,” he asserts. Patients tend to have parents who are strict authoritarians or demand perfection, or come from single-parent households or homes in which the parents are frequently fighting, Tao says. In the beginning, parents tend to blame their children, he says, but after treatment they recognize their failings.

    Beyond the pale.

    As documented in a CCTV-12 program last year, Yang Yongxin administered electric shocks to “H” (top) with an electroconvulsive therapy machine (bottom).


    In the absence of guidance from China's health ministry, which is considering but has not yet adopted the military hospital's IA definition, dubious clinics have sprouted up throughout the country. Some force IA patients to go on kilometers-long hikes day after day as therapy. “That's unscientific. It doesn't treat the nature of the disorder,” says Tao.

    The most infamous, perhaps, is the Yang Yongxin Center for IA Treatment at public hospital number four in Linyi, Shandong. Last year, a CCTV-12 segment recounted how the parents of a young man, “H,” drugged him with a dozen sleeping pills and brought him to Yang's clinic. After “H” had woken up, he protested to Yang that he was over 18 years old and therefore they could not force him to stay without his consent. Yang bundled “H” into a room, and other patients restrained him on a bed, after which Yang administered shocks—for more than 1 hour, the narrator claimed—with a DX-IIA electroconvulsive therapy (ECT) machine, clearly shown in the program. In an 8 May article in China Youth Daily, Yang explained that he uses a weaker current than standard ECT and that the shocks, although “very painful,” are “harmless.”

    After months of appeals from Tao and colleagues, last month three government entities—the central government's information ministry, Shandong Province government, and the Communist Youth League—launched an investigation of Yang's clinic. A Linyi hospital spokesperson declined to comment, saying that as Science went to press, Yang was occupied with a CCTV interview and would not be available to speak with Science.

    Cracking down on extreme treatments is unlikely to alter views in China that IA is a disorder. At the barracks in Beijing's Daxing district, Sun has taken a break from drills in AMC's courtyard. It was difficult, he says, adapting to waking at sunrise, lights out at 9 p.m., and other elements of the center's strict regimen. But after nearly 3 months, he says, “I feel normal now,” and adds that he and his father, who has stayed in Beijing the whole time, are communicating much better with each other.

    In a few days, Sun will return to the real world. Tao's statistics show that there is a 40% chance the teenager will relapse. But for now, Sun is eager to get back to school—and face down the temptation of losing himself, once again, to the Internet.


  6. From Science's Online Daily News Site

    Serial Killers of the Sea

    A criminal investigation tool used to place a suspect at the scene of a crime is now being applied to track vicious killers in the ocean—great white sharks. Typically used in serial crime cases, geographic profiling evaluates crime-scene locations to determine the most likely area of the perpetrator's residence. Now, for the first time, a research team has used the tool to study sharks hunting Cape fur seals off the coast of South Africa.

    Atoms Speeding From the Moon

    Scientists probing the outer reaches of our solar system have hit upon an unusual phenomenon much closer to home. Instruments aboard a NASA spacecraft have detected fast-moving hydrogen atoms emanating from the moon. The atoms, which originated as protons from the sun, may help scientists study the lunar surface and other solar system objects in greater detail than believed possible. When the solar wind hits the moon, most particles stick to the surface, but as many as 10% of solar protons bounce back as hydrogen atoms.


    Fish Throws Away Its Genes as It Grows

    Whether it's its extraterrestrial looks or status as a “living fossil,” there's always been something fishy about the sea lamprey. Now scientists have added another oddity to the creature's repertoire: The lamprey jettisons 20% of its genome during development.

    A Yawn From the Napping Sun

    Maybe old Sol didn't hear the alarm clock. After a mysterious 2-year delay, the next 11-year solar cycle seems ready to begin, scientists say. That means the reemergence of sunspots and with them periodic electromagnetic assaults on global navigation, communications, and power supplies—as well as brilliant auroras in the polar regions.

    Read the full postings, comments, and more on

  7. Biodiversity

    Biodiversity Databases Spread, Prompting Unification Call

    1. Claire Thomas

    LONDON—“For millennia, researchers have hoarded data, because essentially data [are] power,” says Norman MacLeod, keeper of paleontology at London's Natural History Museum. That attitude has faded in recent years, as scientists increasingly recognize the value of collaborative, open-access data sharing for understanding the world. But there's still a wide gap between wanting to share and figuring out how to do it right, as those who attended an international meeting on biodiversity here this month discovered.

    MacLeod was one of the organizers of e-Biosphere 09*, a meeting for creators and users of the Encyclopedia of Life (EOL), the Consortium for the Barcode of Life (CBOL), the Catalogue of Life (CoL), and other major efforts to build and manage open-access biodiversity databases. CoL now lists more than 1.1 million species, for example, and EOL has compiled more than 150,000 vetted species pages and 1.4 million short articles, called “stubs,” that will be expanded as information on each species is gathered. “At e-Biosphere, many groups demonstrated they are now providing actual services, around the clock, with interfaces that large numbers of people can use,” says Jesse Ausubel, program director of the Alfred P. Sloan Foundation in New York City. The goal of the conference was to figure out how to combine data from at least 100 systems into one gigantic, online, open-access database that will eventually cover all life on Earth, with lots of information, including primary research.


    The bar codes for the Great Horned Owl (above, left) and the Barn Owl show great variation in the mitochondrial gene sequenced from these two birds. A global bird bar-code database is under construction.


    But whether these researchers are ready to create one-stop shopping for biodiversity remains to be seen. They are behind in gathering data, some may soon be strapped for cash, and not everyone is eager to share information. “None of the groups has a permanent, sustainable business model,” says Rainer Froese, coordinator of FishBase, a comprehensive database on fish.

    Many of the individual projects have run short of funding or underestimated how long it would take to meet their targets. Begun in 2007 with $12.5 million, EOL hoped to profile 700,000 to 1 million species by 2011 (Science, 11 May 2007, p. 818). But 2 years into the project, it has just 150,000 on its computers and will be happy to hit between 500,000 and 700,000 by that point. CoL, essentially a species list that provides a “taxonomic backbone” for many of the other databases, including EOL, has pushed back by 3 years its 2011 goal of covering all 1.75 million known species. It won't say how much it's spent so far but now needs new money to complete the job.

    Funding has also been an issue for the International Barcode of Life (iBOL) project, a yet-to-launch international project based at the University of Guelph in Canada that is building a database of DNA bar codes—short sequences of DNA that can be used to “tag” species (Science, 18 February 2005, p. 1037). iBOL needs $60.7 million for the next 6 years. It has promises of $24 million from various Canadian agencies but this year got only $1.7 million of $21.7 million expected from the government's Genome Canada. Paul Hebert of iBOL is confident the other $20 million will be awarded by next February, but that still leaves the project searching for another $15 million or so. iBOL's barcoding centers around the world will have to come up with their own funding as well.

    Pick a species.

    Like baseball cards, species pages from the Encyclopedia of Life give the vital statistics on an organism.


    Scientists have also discovered that making the databases comprehensive is a tough process. The Global Biodiversity Information Facility (GBIF) Web site, for example, allows users to generate biodiversity maps. “If you look at [maps on] the GBIF Web site, you'll see a lot of blank spots,” says CBOL's executive secretary, David Schindel. “Those are either countries that don't have a lot of digital records or for some reason have decided not to share their data.”

    Converting scientists from data hoarders to data sharers can still be a problem, says MacLeod. Yet combining data across disciplines can be quite useful. Researchers can, for example, link a species' global-positioning data with its biological makeup, and then with climate-modeling data, to get a clearer picture of threats to its existence.

    Because funding is often an issue, projects that have something to offer government agencies may have an easier time keeping afloat. The Fish Barcode of Life Initiative is developing bar codes for commercial fish for the Regulatory Fish Encyclopedia at the U.S. Food and Drug Administration and also for the National Oceanic and Atmospheric Administration. “In general, we are seeing more interest from government agencies [in bar coding],” says Schindel, who hopes these agencies will eventually provide funding for these efforts.

    But even if the projects can meet their targets, how will they then ensure that they have sustainable funding to maintain the databases and continue to add new species and primary research data? At the London meeting, the participants agreed that some of their efforts may be short-lived. “A lot of competition is going on, and [some] people are creating similar sorts of tools,” says Schindel.

    For projects that do survive, funding will remain a key concern. Dave Roberts of the European Distributed Institute of Taxonomy says that it's “unrealistic to give guarantees” for funding of many projects in the current economic climate—a sentiment echoed by many of the project leaders in attendance. However, many hope they can make a good case for sustained funding, citing the 27-year history of GenBank, a public archive of DNA sequence data. “We know GenBank will still be there 10 years from now,” says CoL's Frank Bisby. “I believe this will also be true of the CoL and several other key components of the biodiversity informatics community.”

    Two in one.

    They look alike, but these two butterflies are different species—so say their distinct bar codes—an observation confirmed by watching what their caterpillars eat.


    For some of the projects, public involvement and interest outside of the scientific community will be crucial to their sustainability. For EOL, allowing users to update pages and add pictures will help keep the project relevant to the wider world, says EOL's species page group director, Cynthia Sims Parr. But access will be controlled to some extent to avoid “the Wikipedia problem,” in which users introduce errors into articles unchecked. EOL is now recruiting site curators for various groups of species; they will monitor additions and verify that new information is correct.

    But as researchers try to fix problems facing individual projects, they need to also figure out how to integrate these initiatives. Bisby notes, for example, that GBIF failed to include species added this year because it was still using last year's version of CoL's species checklist. In general, scientists must now navigate each database site separately to pull together information needed on a species. But aligning software from all the databases to enable interoperability will be a huge challenge, involving reformatting large amounts of data into a standardized form.
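    The interoperability problem described above is largely one of field mapping. A minimal sketch of what “reformatting large amounts of data into a standardized form” involves (the field names and the `normalize` helper are hypothetical; the article names no actual schema or software):

    ```python
    # Illustrative only: the article names no real schema or software.
    # Aligning databases largely means mapping each source's field names
    # onto one shared vocabulary so records can be compared directly.

    def normalize(record: dict, field_map: dict) -> dict:
        """Rename a source record's fields to the shared schema's names."""
        return {field_map.get(key, key): value for key, value in record.items()}

    # Two hypothetical sources describing the same species with different labels.
    source_a = {"scientificName": "Bubo virginianus", "commonName": "Great Horned Owl"}
    source_b = {"name": "Bubo virginianus", "vernacular": "Great Horned Owl"}

    mapped = normalize(source_b, {"name": "scientificName", "vernacular": "commonName"})
    assert mapped == source_a  # both records now match under one schema
    ```

    In practice the shared vocabulary would be an agreed community standard rather than an ad hoc dictionary, which is why the working group's first step is a common list of species names.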

    After the conference ended, a smaller working group from the major initiatives discussed a “digital road map” of how to connect the databases on the Web. The first steps they agreed upon included completing a list of global species names, on which many other databases can be built, and reaching out to potential collaborators in the computer science community to help construct the road map. Initial progress will be presented either at October's GBIF GB16 meeting in Copenhagen or the Taxonomic Database Working Group's November 2009 meeting in Montpellier, France. The e-Biosphere working group doesn't have any milestones set yet but hopes to present incremental progress and outline new action items toward creating the road map. Ballpark estimates are that the “virtual laboratory” envisioned will take 10 years to construct. But EOL Executive Director James Edwards says that the “guts” of the system, an integrated database incorporating the largest projects such as EOL, GBIF, and CoL, should be available within 2 years.

    * The e-Biosphere 09 International Conference on Biodiversity Informatics, London, 3–5 June

  8. ScienceInsider

    From the Science Policy Blog

    ScienceInsider this week reported on a U.S. government official's prediction that H1N1 flu will continue unabated during the summer and into the fall, a decision by the G8 nations to skip science at their summit meeting in Italy this month, and an interruption of clinical trials caused by procedural errors at two U.S. cancer centers.

    ScienceInsider also highlighted a warning that the continuity of U.S. weather data is “at extreme risk” because a program to develop new Earth-observing equipment is poorly managed. The effort to replace older satellites is well under way, but the National Polar-orbiting Operational Environmental Satellite System, as it is called, has doubled in cost to $14 billion.

    Two large companies have offered to provide free vaccine against the H1N1 flu to people in developing countries who can't afford to pay. An offer of 100 million free doses came from Sanofi-Aventis shortly after a similar pledge was announced by GlaxoSmithKline.

    A widely debated plan to change the way the U.K. government doles out research funds to universities has been shot down. The proposal to switch from live peer review to evaluation based on citation analysis and other bibliometric data—both complex and controversial—has been sidelined indefinitely.

    The U.S. Congress is fighting over changes to a $2 billion research program to tap the creativity of small businesses, with the goal of reaching agreement before the program expires next month. Also this week, four legislators asked the U.S. National Academies to tell them what the government needs to do to keep U.S. academic research strong. A similar 2005 letter spawned the influential Rising Above the Gathering Storm report.

    Keep up with the latest science policy news at

  9. The Brain Collector

    1. Greg Miller

    Jacopo Annese plans to create an open-access digital brain library—starting with the most famous amnesic patient of all time.


    A new brain library will archive multiple views of individual human brains, including MRI scans, photographs, and thin slices dyed to reveal detailed anatomy.


    The refrigerators against the back wall of Jacopo Annese's lab are filled with human brains. Some float in tubs of formaldehyde; others have been sliced into ultrathin sections to be treated with stains that will reveal various anatomical features. One of the tubs is labeled HM, after Henry Molaison, the most studied human being in the history of psychology. In 1953, an experimental brain surgery intended to correct a devastating seizure disorder left Molaison unable to form new memories. At age 27, he became frozen in time. He could remember facts he'd learned and names of people he'd encountered before the surgery but virtually nothing after it. For half a century, until his death last December, Molaison gamely cooperated with psychologists and neuroscientists, and his case reshaped scientific thinking on the neural basis of memory.

    Annese, 42, is a neuroanatomist at the University of California, San Diego (UCSD), and he has been charged with preserving Molaison's brain for perpetuity. He takes this responsibility seriously. Among other precautions, the fridge containing Molaison's brain has an alarm system that calls Annese's office, home, and cell phones if the temperature deviates too far from the set point. He has also taken certain security measures he's loath to discuss on the record. “It probably sounds a bit paranoid,” he says, “but we've tried to think of anything that could go wrong and eliminate the risk.”

    With funding from the Dana Foundation, a New York City nonprofit that supports research in neuroscience, and the National Science Foundation, Annese plans to create a digital, zoomable atlas of Molaison's brain and make it freely available online, the first entry in what he hopes will become an open-access brain library to be used by scientists, students, or anyone with an Internet connection and an interest in neuroanatomy. Ultimately, the library will include donated brains from people with Alzheimer's disease and other neurological disorders, Annese says, as well as a collection of healthy brains of different ages. But for now, Molaison's brain is the star attraction.

    “More was learned about memory by research with just that one patient than was learned in the previous 100 years of research on memory,” says Vilayanur Ramachandran, a behavioral neurologist at UCSD who is not directly involved with Annese's project. Postmortem anatomical studies could reveal new clues about the cause of Molaison's amnesia by revealing more precisely the extent of his surgical lesions, as well as subsequent degeneration, Ramachandran says. Preserving the donated brains of such unique patients has both historical and scientific value, he says.

    A long history

    Molaison's story begins decades before Annese got involved. Henry Gustav Molaison was born in 1926 and grew up in and around Hartford, Connecticut. His seizures began at age 10. The cause is uncertain, but his father's family had a history of epilepsy, and a collision with a passing bicycle rider had once knocked Henry unconscious for several minutes. The seizures worsened through his teens and 20s until frequent blackouts and increasingly severe convulsions left him unable to continue his work repairing electric motors, despite high doses of anticonvulsant drugs. He ended up in the care of William Beecher Scoville, a neurosurgeon at Hartford Hospital.

    Because of the incapacitating nature of Molaison's seizures, Scoville decided to try what he later described as a “frankly experimental operation.” He removed, via suction, a finger-sized piece of the temporal lobes on both sides of the brain, including most of the hippocampus, amygdala, and nearby parahippocampal gyrus. Scoville had previously performed such bilateral medial temporal lobotomies on dozens of psychiatric patients, hoping to calm their psychosis without the personality changes associated with the more drastic frontal lobotomies that were beginning to fall out of favor at the time. By coincidence, two of Scoville's previous patients had been prone to seizures, and those diminished postsurgery, says Suzanne Corkin, a neuroscientist at the Massachusetts Institute of Technology in Cambridge who worked with Molaison for 46 years and is writing his biography. As a last resort, Scoville thought the surgery was worth a try with Molaison.

    In a landmark 1957 paper, Scoville and Canadian psychologist Brenda Milner reported that although Molaison's seizures had diminished greatly, he now exhibited profound amnesia. He easily recalled events prior to his operation but had virtually no lasting memories of anything that had happened since. A half-hour after having lunch with Milner in the hospital cafeteria, for example, he couldn't name a single thing he had eaten and did not even remember eating. The paper also described moderate to severe memory deficits in seven other patients with similar surgeries, mainly for schizophrenia. In the coming years, Scoville urged other surgeons not to perform this surgery on both sides of the brain.

    Memory lessons

    Molaison's case startled memory researchers. Conventional wisdom at the time held that memory could not be pinned to any specific location in the brain. This view sprang largely from the work of Karl Lashley, an American physiologist who in the 1930s and '40s had searched in vain for the “engram,” or neural trace of memory, by removing different regions of the cerebral cortex of rats to see whether this would destroy a previously learned memory, such as the location of food in a maze. Because removing even large swaths of cortex did not render the rodents amnesic, Lashley concluded that memory traces must be widely distributed.

    Molaison's case suggested otherwise. His ability to learn and retain new facts, what researchers now call declarative memory, had been devastated by removing a relatively small part of his brain. (Lashley, certain he would find the engram in the cerebral cortex, never tried removing the hippocampus.)

    Further studies provided additional clues about the biology of memory. Although Molaison was unable to create long-lasting new memories, his recollection of distant events—World War II, for example—demonstrated that older memories must reside outside the medial temporal lobes. At the same time, he also retained a fleeting short-term memory: He could remember a string of numbers or words for 30 seconds, about the same as someone with an intact brain. And in another widely cited study, Milner reported in 1962 that Molaison could learn and remember a new skill. She asked him to trace a double star design while watching his hand in a mirror. Over several days, his drawings improved substantially, demonstrating a memory of previous practice sessions. Yet he always insisted that he'd never done this task before and seemed surprised that it turned out to be easier than he'd thought.

    Painstaking work.

    Jacopo Annese aims to digitally reconstruct the brain of amnesic patient Henry Molaison (inset).


    Molaison's case helped establish several modern tenets of memory research, including the notion that there are different types of memory that depend on different brain regions, and the concept of memory consolidation, which holds that new memories formed by the hippocampus are later archived in the cerebral cortex for long-term storage.

    Molaison was interviewed by at least 100 researchers and has been mentioned in thousands of journal articles, says Corkin, who began working with him as Milner's graduate student in 1962 and continued until shortly before his death. “Everyone liked him,” she says. “He was soft-spoken, polite, and he had a good sense of humor.” Yet despite thousands of interactions, Molaison displayed only a flicker of recognition, sometimes saying he'd known Corkin in high school. “Whatever knowledge he had of me was very sketchy,” she says. “He never really knew who I was.”

    In 2002, Corkin set up a committee, which included Annese, to plan for Molaison's eventual death. Molaison, who had no immediate living family, and a distant relative who acted as his conservator signed paperwork agreeing to donate his brain. Corkin delivered Cryopaks to his nursing home in Windsor Locks, Connecticut. “They kept them in the freezer so that the moment he died they could wrap his head to preserve the brain,” she says. When Molaison died of respiratory failure at 5:05 p.m. on 8 December 2008, the plan sprang into action. A hearse took his body to Massachusetts General Hospital (MGH) in Charlestown, where researchers began collecting anatomical magnetic resonance imaging (MRI) scans of his brain at about 9 p.m.—and continued until 6 a.m. the next day, when Annese arrived on a red-eye flight from San Diego.

    Annese assisted MGH neuropathologist Matthew Frosch in removing the brain. “I was sweating bullets,” Annese says. But the delicate procedure went smoothly, and for the next 2 months, the brain remained at MGH. A fresh brain has the consistency of Jell-O that has barely begun to set; only when fixed in formaldehyde would it be firm enough to transport safely.

    Artistic anatomy

    Annese grew up in Florence, Italy, and the Renaissance city's cultural sensibilities seem to have rubbed off on him. His office décor includes old anatomical prints and antique microscopes. Issues of an artsy cooking magazine and the Italian-language version of The New York Review of Books fill a magazine rack in the common area of the lab, and an espresso machine stands ready to jolt sleepy anatomists back into action.

    On a large touch-screen monitor, a student scrolls through the MRI scans of Molaison's head taken the night he died. A green outline shows the software's best guess at the contours of the brain, and he uses a stylus to make adjustments here and there where the computer got it wrong. Once corrected, these MRI data will yield a three-dimensional image Annese likens to the globe that appears on the opening screen of Google Earth—a starting point for navigation. Digital photographs and histological images will provide more detailed neuroanatomical maps, down to the level of individual cells.

    In February, Annese flew back to Boston to collect Molaison's brain. In these security-conscious days, it took some advance planning to arrange to bring the brain—in an ice-packed cooler—as carry-on luggage. An official from the Transportation Security Administration escorted Annese through the security checkpoint. Jet Blue reserved the front row for him. He flew back to San Diego with Molaison's brain sitting beside him in the window seat.

    The brain now soaks in a mixture of formaldehyde and sucrose, which will help prevent ice crystals from forming and poking holes in the tissue when Annese freezes it for slicing. Using a microtome, he will slice the brain into very thin sections. “Like prosciutto,” he says, but less than 1/20 the thickness and a lot more fragile. Annese aims to slice the brain whole instead of first cutting it into smaller chunks as is more routinely done. Small chunks are much easier to work with, but the resulting slices are hard to keep in register with one another. Whole-brain slices will keep more of the tissue intact and result in a more faithful reconstruction of the brain, he says.

    Annese estimates he will end up with about 2600 slices of Molaison's brain. He and his colleagues will mount some of these, perhaps every 12th one to start, on extra-large glass slides—13 by 18 centimeters—and treat them with a stain that colors cell bodies purple. A camera attached to a microscope will photograph each slice at 20× magnification, sufficient to distinguish different cell types. At that magnification, photographing a single slice will require a mosaic of about 40,000 individual images. An automated system Annese designed will carry out the tedious job of moving the slice for each photo, focusing, triggering the shutter, and sending the shot to the San Diego Supercomputer Center, which will store the data and stitch the images together into a composite for each slice.
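    The numbers above imply a substantial data pipeline. A back-of-envelope calculation (all figures come from the article; the script itself is only a rough estimate) suggests why a supercomputer center is involved:

    ```python
    # Figures from the article; the arithmetic itself is only a rough estimate.
    total_slices = 2600        # expected slices of the whole brain
    sampling_interval = 12     # "perhaps every 12th one to start"
    images_per_slice = 40_000  # mosaic tiles per slice at 20x magnification

    mounted_slices = total_slices // sampling_interval
    total_images = mounted_slices * images_per_slice

    print(mounted_slices)  # 216 slides in the initial pass
    print(total_images)    # 8,640,000 photographs to stitch and store
    ```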

    A memorable brain.

    A preliminary reconstruction of Molaison's surgical lesion (red), based on MRI scans.


    Other memory researchers are eager for a better look at Molaison's brain. “The main thing we will all look to is exactly where the boundaries of the surgical lesion were,” says Larry Squire, a neuroscientist and veteran memory researcher at UCSD. Knowing the precise location of the lesion “will sharpen the brain-behavior correlation,” says Corkin, and may shed light on some lingering puzzles. Despite his well-documented declarative memory deficits, for example, Molaison every once in a while surprised researchers with a newly learned fact. “He knew that Archie Bunker's son-in-law was called Meathead” in the 1970s sitcom All in the Family, Corkin says. “When he came up with stuff like that, everyone's jaw just dropped to the ground.” MRI scans lack the resolution to determine with certainty which bits of tissue survived Scoville's surgery and may have provided such residual memory function, Corkin says. Annese's work will provide a much clearer view.

    Comparing Molaison's brain with those of other amnesic patients may yield additional insights, Squire says. He has entrusted Annese with the donated brains of three amnesic patients his group has studied. One, known as E.P., died in early 2008. Viral encephalitis had ravaged his medial temporal lobes in 1992 and left him with memory deficits strikingly similar to Molaison's. Subtle differences in the two cases, paired with anatomical comparisons, might help clarify the function of different regions within the temporal lobes. UCSD's Ramachandran, who works on a variety of unusual neurological conditions, says several of his patients have agreed to donate their brains, which he will send to Annese.

    Annese hopes others will follow suit. But he sees the project as a two-way street, an experiment in open-access science. “The lab is not going to be a fortress,” he says. “Everything is going to be up online and discussed with colleagues.” His Web site will soon include a blog where other scientists can offer advice on methodological issues, ask him to photograph a particular brain region with higher magnification, suggest experiments, or request tissue samples. Non-scientists will be welcome, too. Just as amateur archaeologists have found undiscovered sites by combing Google Earth, Annese says, amateur neuroscientists might make discoveries that have eluded experts.

    Even the slicing will be viewable online in real time, Annese says. His head technician, Natalie Schenker, finds this a bit daunting. “How about a 5-second delay?” she suggests. Annese laughs and says he'll consider it. He plans to slice the brain himself in a marathon 30-hour session, probably sometime in July. If it goes well, all but the most die-hard viewers will be bored silly as he cuts slice after slice and transfers each one from the blade of the microtome to a fluid-filled well in a plastic tray. But a lot could go wrong. The MRI scans reveal deterioration of the white matter, Annese says, which might make the slices especially delicate and prone to tearing. An even more nightmarish scenario is a cracked brain, he says. Sometimes, a brain will freeze unevenly and break apart—destroying it before it can be sliced. Annese is taking every precaution, but he's not taking anything for granted. “Cutting will make or break the project,” he says. “But if the brain cracks, I go back to Italy.”

  10. Microbiology

    Antibiotics in Nature: Beyond Biological Warfare

    1. Christine Mlot*

    A growing body of evidence suggests that the infection-quelling miracle drugs of biomedicine play more basic roles in the metabolism of microbial communities.

    Signal lights.

    Engineered Salmonella cells light up in response to low levels of compounds produced during the crosstalk of the microbial community in a sediment sample. Some of the detected compounds may lead to new antibiotics.


    At the beginning of World War II, Hollywood released the movie Dr. Ehrlich's Magic Bullet, depicting the life of turn-of-the-20th-century German microbiologist Paul Ehrlich and his quest to develop an antibiotic drug to treat syphilis. It was Ehrlich who originated the idea of medical “magische Kugeln,” launching the warfare metaphor that has clung to such drugs ever since. Even their generic name—antibiotics—has a weaponlike ring to it. The compounds are widely regarded simply as microbe-killers, both in medicine and in nature, where they are produced by soil-dwelling fungi and bacteria. “You still read all the time [about] biological warfare in the soil,” says microbiologist Roberto Kolter of Harvard Medical School in Boston.

    But there's scant evidence that bacteria or fungi deploy antibiotics to kill or ward off other microbes. What researchers know about antibiotic-producing microbes comes mainly from studying them in high numbers as pure cultures in the lab—artificial conditions compared with the numbers and diversity found in soil. And in the past 15 years, Kolter and other researchers, with Julian Davies of the University of British Columbia at the forefront, have become convinced that natural antibiotics are doing other things in the complex world of microbial communities. These molecules, they assert, may be less weapons for competition or combat than tools of communication, or even essential cogs in their producers' own metabolism.

    Kolter's team, for example, recently reported a possible “peaceful” role for the commonly prescribed drug nystatin, a molecule with potent antifungal properties isolated in the 1940s from a soil bacterium. When certain bacteria are exposed to nystatin, the microbes form slimelike communities known as biofilms. Kolter, who has also teased out the novel mechanism by which nystatin triggers this transformation, believes this may be just one of the molecule's natural roles.

    Microbiologists' shift away from the biomedical view of antibiotics is no less than “a Copernican turn-about for the role of antibiotics in nature,” José Martínez of the Campus Universidad Autónoma de Madrid and his co-workers have written. Beyond reworking the tenets of microbial ecology, understanding the true roles “antibiotics” play has practical potential, researchers say: It could provide a way to manage the increasing problem of resistance to these once-miraculous drugs and lead to discovery of new pathogen-fighters.

    Antimicrobial hunter.

    Julian Davies is looking for new antibiotics by screening for microbial signals.


    Fossil molecules repurposed

    The term “antibiotic” was coined by Selman Waksman, who shared the 1952 Nobel Prize in physiology or medicine for discovering streptomycin, the first effective treatment for tuberculosis. And it was streptomycin, one of thousands of antibiotics produced by soil bacteria known as actinomycetes, that caught the attention of the Welsh-born Davies, who was trained in organic chemistry but had always been interested in biologically produced natural compounds. (In grammar school, he memorized the structure of penicillin.) While at Harvard Medical School in the early 1960s, Davies figured out that streptomycin inhibits the growth of bacteria such as Mycobacterium tuberculosis by latching onto a part of the ribosome, so the cell can no longer make protein.

    Davies spent the next decades studying antibiotics and antibiotic resistance. By the 1980s, after investigating a class of antimicrobial peptides composed of unusual amino acids that have also been detected on meteorites, he started wondering what antibiotics did in soils and other microbial communities. In 1990, Davies wrote a provocative though little-noticed review proposing that the compounds are evolutionarily old—“fossil molecules” that predate proteins—and could have originated in the primordial soup as “effectors” that interacted with early nucleic acids in a variety of reactions. He theorized that as the protein-making RNA-based machinery of early cells was honed and evolved into the ribosome, these effector molecules were kept around because their bioactivity proved useful in the microbial communities taking shape.

    In the past decade, as researchers started sequencing actinomycete genomes, Davies found more evidence that antibiotics play peaceful roles in nature: An individual actinomycete genome can code for many more apparent antibiotic compounds than scientists had realized, sometimes upward of 25. It didn't make evolutionary sense to Davies that an organism would maintain such an arsenal. It made even less sense when he looked beyond the actinomycetes to the vast genetic diversity of the microbial world. “There are millions and millions of small molecules out there [made by microbes],” says Davies. “I couldn't believe they were all weapons.” (Antibiotics are considered “small” molecules because of their size relative to enzymes and many other cellular proteins.) Davies also noted that the microbes producing these molecules live mainly in the soil, whereas most of the pathogens the antibiotics can inhibit thrive on animals. “They would never see each other” in nature, says Davies.

    There are exceptions, of course. One place where microbes do clearly produce antibiotics that ward off threats from other microbes is the rhizosphere, the nutrient-rich area around plant roots. Linda Thomashow leads a group in the U.S. Department of Agriculture's Agricultural Research Service lab at Washington State University, Pullman, that has demonstrated that wheat plants provide a nurturing habitat on their roots for Pseudomonas bacteria, which in turn produce antibiotics that suppress fungi harmful to the plants. In a similar three-way interaction, Cameron Currie's group at the University of Wisconsin, Madison, has shown that bacterial symbionts living in insects produce antibiotics that suppress fungi that would otherwise harm the insects (Science, 3 October 2008, p. 63).

    But most attempts to detect microbes in their natural environments making antibiotics at the relatively high concentrations needed to inhibit or kill other microbial cells have come up short. So Davies took a cue from a phenomenon known as hormesis (Science, 17 October 2003, p. 376). Many agents, from sunlight to caffeine, can have stimulating effects on cells at low concentrations and lethal effects at high ones. Davies wondered if the same held true for antibiotics in nature.

    In a seminal 2002 study, Davies's group collaborated with Michael Surette at the University of Calgary in Canada to examine what low concentrations of microbially produced antibiotics such as erythromycin and rifampicin did to other microbes. They used a large library of Salmonella typhimurium colonies, each one engineered so that it would bioluminesce the way fireflies do if a specific Salmonella gene promoter was activated.

    The antibiotics put on an unexpected light show, indicating they can repress or activate scores of promoters and thereby influence the activity of many microbial genes, the researchers discovered. “Any given antibiotic will affect as many as 200 promoters,” says Davies, “and different compounds affect different promoters.” The affected genes are involved in a variety of cellular activities, from transport of molecules to DNA repair, Davies and his colleagues have found.

    To Davies, those findings suggested that there is a whole lot of chemical communication going on in the soil, with microbes secreting many compounds at low concentrations that turn on or off the genes of other microbes and orchestrate the growth of the community. “I thought that was exactly right on,” says Melvin Simon, a microbiologist at the University of California, San Diego, who cites the 2002 study as an example of “a really good idea … where you say ‘Of course.’”


    Pseudomonas bacteria make an array of colorful phenazines (above) that are toxic to other cells but play key roles in their producers' metabolism, including handling electrons when oxygen becomes unavailable. With a mutation that disrupts their ability to make phenazines, Pseudomonas cells that normally grow into a smooth dome form a wrinkled colony (right), an arrangement that seems to allow the cells better access to oxygen.


    “These compounds are signals,” Simon adds, “and a sort of language that microorganisms have that allows them to interact with each other.”

    The complex molecular interchanges might explain why so few microbes—less than 1% of the known diversity—can be grown in pure cultures in a laboratory. The microbes likely depend on compounds produced by other members of the community, whether for nutrition, signaling with other cells, or other activities.

    An antibiotic's true colors

    Davies wasn't the only one at the time finding unexpected roles for antibiotics. Dianne Newman was a new professor in geosciences at the California Institute of Technology in Pasadena whose research into how microbes shape the planet's geochemistry had, almost by accident, turned to include antibiotics.

    Newman started her career with an interest in how bacteria use minerals in their metabolism. Instead of consuming oxygen in respiration as plants and animals do, some bacteria turn to iron or other mineral compounds to handle the electrons that get moved about in the course of burning fuel, a process known as reduction-oxidation, or redox. Newman's group figured out how Shewanella bacteria use quinone compounds in soil humus as a shuttle to ferry electrons out of the cellular interior and ultimately to iron in the soil. The researchers noticed something striking: The structure of the quinone-containing shuttle looked a lot like those of phenazines, antibiotics naturally made by Pseudomonas bacteria. These antibiotics also have the thermodynamic capacity to handle electrons in redox reactions, and in doing so they change colors as they accept or release electrons.

    The structural similarity of the quinone redox shuttle and the phenazines was “a light-bulb moment,” recalls Newman, who is now at the Massachusetts Institute of Technology in Cambridge. Suspecting that the antibiotics could function as electron shuttles, not just as weapons, Newman and her co-workers turned their full attention to the phenazines and the Pseudomonas bacteria.

    The well-studied P. aeruginosa can cause infections in people, particularly those with cystic fibrosis, whose lungs are filled with mucus that allows the bacterium to thrive. Infected patients often have blue sputum, an indicator of one particular phenazine that can be toxic to other bacteria as well as to lung cells.

    Researchers in the 1930s had suggested that the pigments play a role in bacterial metabolism, but that idea faded once their toxicity was discovered. Newman's group has revived this interpretation, and in a series of papers, including one last year in Science (29 August 2008, p. 1203), they describe the complex ways P. aeruginosa uses phenazine. The microbe begins to produce phenazine at a late stage of growth, which is typical for all antibiotic-producing microbes and the reason such molecules have been labeled “secondary metabolites.” At this point, cells have piled up to form a biofilm, resulting in pockets of more or less oxygen. With oxygen running out, phenazine can step in to handle the electrons, changing color in the process.

    Using P. aeruginosa mutants unable to make phenazine, the researchers connected this metabolic turning point to a change in the appearance of the biofilm's cell colonies, which typically remain smooth even as they reach their maximum size. But when the bacteria can't produce phenazine, the colonies become wrinkly and spread out to cover a larger area, allowing all cells to access oxygen. “It's like living in Los Angeles or New York,” says Newman. “If you're crowded for breathing room, you either move out to the suburbs or build skyscrapers.”

    Newman's studies indicate that phenazine is a key architect in determining how P. aeruginosa communities get built. “Clearly, [phenazine plays] a role that's different from weaponry here,” says biochemist Bruce Demple of the Harvard School of Public Health in Boston. Understanding that role, he adds, offers the potential for disrupting biofilm formation in Pseudomonas infections.

    Newman's group has found that phenazines can increase or decrease the activity of a number of Pseudomonas genes, including signaling genes that appear to lead to the cells' excreting biofilm-related polymers. Phenazines can even improve a microbe's ability to access iron by transferring electrons to the metal. “Phenazines are functioning at many levels,” says Newman. Rather than being “secondary,” she says, these antibiotics are central to the metabolism of their producers.

    In addition to community building with its own phenazines, P. aeruginosa can also “listen” to antibiotics produced by other microbes. In 2005, Samuel Miller of the University of Washington, Seattle, and his co-workers reported that aminoglycosides, a class of antibiotics including streptomycin that are often used to treat P. aeruginosa infections, can at lower concentrations actually induce the microbe to form biofilms, another apparent example of hormesis. And Martínez and his co-workers reported in 2006 that low concentrations of other antibiotics such as tetracycline also turn on P. aeruginosa genes involved in biofilm formation.

    Kolter's nystatin story, reported in the 6 January issue of the Proceedings of the National Academy of Sciences, is one of the latest connections between antibiotics and biofilms, but with an unexpected twist. Named after New York state—where in the 1940s public health department microbiologists Rachel Brown and Elizabeth Hazen isolated it from a culture of soil bacteria—nystatin is one of the oldest drugs available for fungal infections. Yet Kolter's team surprisingly found that nystatin has an effect on bacteria in addition to fungi.

    Triggers of mass construction.

    Bacillus subtilis growing in broth (left) clump together when the antibiotic nystatin is added (right).


    Nystatin jolts Bacillus subtilis, another soil dweller, into synthesizing the sugar and protein mix that allows the cells to stick together as a biofilm. The researchers further discovered that nystatin does this by creating pores in the bacterial cell that allow potassium ions to flow out. The rapid departure of the potassium ions turns on a key enzyme, which leads to creation of the extracellular material for the biofilm.

    “It's almost like neurons,” says Kolter, referring to how the flow of charged ions in and out of nerve cells allows them to signal each other. Kolter suspects that this antibiotic-induced potassium leakage will turn out to be a common mechanism for translating signals from other cells, so his team is looking for it in other microorganisms. “It makes sense,” says Miller, and uncovering this novel mechanism “is an important contribution.”

    Volume control

    If serving as signals is more commonly the natural role of antibiotics rather than killing or inhibiting growth, then the medical problem of antibiotic resistance may really be an issue of volume control. Although genes that confer resistance to antibiotics, including ones that encode molecular pumps or enzymes that dismantle the drugs, are most often found in microbes in hospitals or other clinical settings, they're also counterintuitively present in soil microbes that live in pristine environments little exposed to pharmaceuticals. In a commentary last year, Martínez suggested that such “resistance determinants,” rather than serving as a defensive countermeasure, fine-tune the communication role of antibiotics in a natural community (Science, 18 July 2008, p. 365).

    Davies shares this perspective, adding that these genes “might be one way of modulating the signal another organism is producing, one way to mute the response of another.” Given that premise, Davies advocates turning to soil to address the antibiotic-resistance problem. He believes that soil contains an abundance of yet-to-be-discovered signaling molecules made by microbes and that screening for ones with bioactivity at lower concentrations—drug companies looking for antibiotics typically screen for activity that depends on high concentrations of compounds—will yield leads that ultimately turn into new infection-fighting drugs. Kolter agrees: “Scan widely for signaling and you might find other antibiotics.”

    Davies's group is currently doing just that, using its library of bioluminescent bacteria to screen soil itself. They are finding that various colonies do react—lighting up, says Davies, “like stars against the sky”—indicating that compounds or organisms in the soil are activating a range of specific gene promoters in S. typhimurium.

    Thanks to the work of Davies and others, no one now suggests that antibiotics made by microbes perform a single function in nature. It's increasingly clear that the molecules are more often a part of microbial crosstalk rather than crossfire. It's just a lucky, lifesaving accident for us that the right dose of a microbial signal can make it a poison, too.

    * Christine Mlot is a science writer based in Madison, Wisconsin.

  11. Space Weather Forecasting

    Are We Ready for the Next Solar Maximum? No Way, Say Scientists

    1. Richard A. Kerr

    Forecasters testing their skills against the sun's mounting ferocity find themselves still in the early days of space weather prediction.

    Solar assailant.

    The sun violently ejects magnetically confined bubbles of charged particles (left) that collide with Earth's magnetic field (right), triggering geomagnetic storms.


    The Big One for space physicists struck on 28 August 1859. The sun had blasted a billion-ton magnetic bubble of protons and the like right at Earth. On smashing into the planet's own magnetic cocoon at several million kilometers per hour, the bubble dumped its energy, pushing the solar-driven aurora from its customary arctic latitudes to overhead of Cuba. This once-in-500-years “solar superstorm” crippled telegraph systems for a day or two across the United States and Europe but otherwise was mainly remembered for its dramatic light show.

    Now that our world has evolved into a so-called cyberelectrosphere of modern electronics, we can hardly hope to fare as well. Today, the charged-particle radiation and electromagnetic fury of a geomagnetic superstorm would fry satellites, degrade GPS navigation, disrupt radio communications, and trigger continent-wide blackouts lasting weeks or longer. Even a storm of the century would wreak havoc. That's why space physicists are so anxious to forecast space weather storms accurately. If predicting a hurricane a few days ahead can help people prepare for a terrestrial storm's onslaught, they reason, predicting solar storms should help operators of susceptible systems prepare for an electromagnetic storm.

    And space weather forecasters' next challenge is coming up soon. The next peak in the 11-year sunspot cycle of solar activity looms in 2012 or 2013. A space weather symposium* last month asked, “Are we ready for Solar Max?” The unanimous answer from participants was “No.” “I think we are better off” than a decade ago, says space physicist Daniel Baker of the University of Colorado, Boulder. Back then, researchers were about to launch their first concerted effort to predict space weather the way meteorologists predict terrestrial weather, using computer models. “But we probably aren't as ready as we ought to be,” Baker adds.

    In fact, space forecasters are about where their meteorological colleagues were in the 1960s: making useful but unimpressive forecasts in the short term and lacking computer models able to improve on longer-term predictions by human forecasters. And even the short-term forecasts could go by the boards if the sole but aging early-warning satellite fails before a replacement—as yet unfunded and unplanned—arrives in orbit.

    High-tech target

    While researchers have been working to improve their forecasting skills, the world has, if anything, become more susceptible to space weather. “The general trend would be increasing vulnerability to the effects of space storms,” says Baker, who chaired a December 2008 workshop report on the subject by the Space Studies Board of the U.S. National Academies. “In general, the systems are becoming softer.” The power grid operates more efficiently, he says, but that gives it less margin for error and less capacity to buffer a storm's disruptions. The surging power-line currents induced by a severe solar storm could push the grid into uncharted territory. GPS technology, especially the highest-precision variety, has become commonplace since the last solar maximum—for navigating planes more autonomously, for example—but it comes with new codes and new signals untested by the ionospheric disturbances of a major solar storm. Now-ubiquitous cell phones are no less vulnerable.

    The academies' report put a huge price tag on a repeat of the 1859 superstorm. Judging by the costs of smaller incidents in recent decades, the panel estimated the economic cost in just the first year after such an extreme storm at $1 trillion to $2 trillion. Full recovery would take 4 to 10 years. Disturbances in the high-altitude ionosphere would disrupt radio communications and GPS for days; surges induced in the power grid could destroy expensive and hard-to-replace transformers. Satellites that survived could cost $100 million apiece to put back into operation. Even a recurrence of the lesser superstorm of May 1921 could lead to blackouts affecting 130 million Americans and half of North America, the panel reported.

    Status quo

    If you pity the poor weatherman, then your sympathies for the space weather forecaster should be unbounded. In May 1996, Ernest Hildner—then director of the National Oceanic and Atmospheric Administration's Space Environment Center in Boulder, Colorado—told an audience that when it comes to predicting space weather, “we don't do very well. We're several decades behind weather forecasters.”

    A big part of the problem, Hildner said, was a dearth of observations. Space weather forecasters used ground-based telescopes to observe sunspots, solar flares, and other signs that the sun was primed to launch storm-inducing disturbances toward Earth. But forecasters had no spacecraft between the sun and Earth to record the passage of threatening solar disturbances, much less whether they were actually going to hit Earth. It was “like predicting Washington, D.C., weather with one weather station in San Francisco,” Hildner said.

    Enter ACE. In 1997, the Advanced Composition Explorer arrived at its station, L1 or Lagrangian point 1, about 1.5 million kilometers sunward of Earth. There it could monitor the high-speed bubbles of protons and other charged particles—called coronal mass ejections (CMEs)—that would slam into the bulbous end of Earth's teardrop-shaped magnetosphere 30 to 60 minutes after passing ACE. The speed and density of a CME reflect its total energy. But ACE also reports the orientation of the magnetic field embedded in a CME, which must be opposite that of Earth's magnetic field if the CME's power is to gain entry to the magnetosphere and drive a storm.

    ACE made the first short-term storm warnings possible in 1999. Issued only 20 to 60 minutes ahead of a storm's arrival by the Space Weather Prediction Center (SWPC, the former Space Environment Center), the warnings have fallen far short of perfection. One-third of major storms arrive unheralded and almost one-quarter of the warnings turn out to be false alarms, according to SWPC's own analysis. More severe storms are so rare that it's hard to say how much skill forecasters have in predicting them, says Christopher Balch, acting head of SWPC's forecast office.

    ACE can offer no help with forecasting storms a day ahead. Next-day SWPC forecasts of geomagnetic activity based on observations of the sun have performed better than simply assuming that the current day's conditions would persist into the next day. But next-day forecasts performed no better than if forecasters assumed the next day would be like the average of the previous 30 days. Accurate forecasting 8 hours to 1 day ahead, Baker concludes, “is just not in the cards right now.”

    Playing catch-up

    To push useful space weather forecasting beyond a few minutes ahead, geophysicists are emulating terrestrial weather forecasters. From the 1950s onward, meteorologists built and refined computer models that ingested a torrent of observations and produced a picture of current weather around the globe. The models could then calculate how the weather would evolve. Over several decades, the models surpassed human forecasters at short-range prediction and pushed useful forecasts out beyond 7 days.

    Coming out blazing.

    The sun (masked here to reveal faint features) will be blasting more coronal mass ejections (lower right) Earth's way as the sun enters its next cycle of rising activity.


    Space weather forecasters face extra hurdles. There's still a severe dearth of observations to feed into the models. And rather than evolving within one relatively uniform atmosphere, space weather progresses from the near-vacuum of a million-degree solar corona—where magnetic fields rule—to Earth's relatively dense, cold upper atmosphere and eventually the ground: “sun to mud,” as they say. That forces researchers to develop a dozen submodels to make a chain linking the sun to Earth. Space scientists hoping to transfer their research models to the forecasting arena should “expect to have your egos hurt,” magnetospheric physicist W. Jeffrey Hughes of Boston University (BU) said at the May meeting of the American Geophysical Union. “It's a painful process.”

    To ease the pain of moving from research modeling to day-to-day forecasting, the American space weather community has developed a loose structure for creating and testing forecast models. Two 8-year-old centers—one at the University of Michigan (UM), Ann Arbor, and the other an 11-institution consortium headed by BU—vie in friendly competition to develop physics-based, sun-to-Earth models. A 10-year-old interagency modeling center at NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland, is evaluating 30 contributed submodels. Finally, a test bed will soon be created at SWPC for debugging candidate submodels before they go operational.

    No model—not even a submodel—has made it through this system to operational status. A submodel called ENLIL is leading the pack, says SWPC Director Thomas Bogdan. ENLIL forecasts how newly formed CMEs will propagate from the sun to ACE. But it won't become operational for 2 to 3 years, around the time of solar max, when it will be run on the same supercomputer National Weather Service forecasters use. Models carrying the disturbance into and through the magnetosphere and the atmosphere and to the ground all trail ENLIL.

    “We've made very good progress in the last decade,” says space physicist Tamas Gombosi, director of the UM modeling center. “But can we forecast? No. We have a long way to go. My hope is that not this solar max but the next, physics-based forecasting” will be a reality.

    In the meantime, scientists are keeping their fingers crossed for ACE. At 12 years old, it has entered satellite old age. It and the 14-year-old SOHO satellite that images CMEs near the sun “can fail any time, no one knows,” notes Michael Hesse, director of the modeling center at GSFC. Although enough time remains to build and launch a backup for ACE's monitoring system, none has been proposed, much less funded.

    * The 2009 Space Weather Enterprise Forum, 19–20 May, Washington, D.C.
