News this Week

Science  02 Jul 2010:
Vol. 329, Issue 5987, pp. 18
  1. Chronic Fatigue Syndrome

    Conflicting Papers on Hold as XMRV Frenzy Reaches New Heights

    1. Martin Enserink

    It was just a snippet of news, reported by an obscure journal in the Netherlands. And yet it lit up the Internet. Twitter was all atwitter, scientists' mailboxes on both sides of the Atlantic began filling up, and dozens of bloggers started jubilating. “It's happened. I cannot tell you all how this changes the world as we have known it for 25+ years,” one patient wrote on her blog. “Now to work on the vindication part!”

    The reason for all the excitement? Scientists at the U.S. National Institutes of Health (NIH) and the Food and Drug Administration (FDA) were reported to have confirmed the link, first published in Science last year, between a human retrovirus and the elusive condition called chronic fatigue syndrome (CFS). Earlier this year, three other groups reported being unable to replicate such a connection. That federal scientists now confirmed it was a huge mood-lifter for patients, many of whom are desperate to find a biological cause, and a cure, for their debilitating ailment.

    But the story wasn't as simple as that. Science has learned that a paper describing the new findings, already accepted by the Proceedings of the National Academy of Sciences (PNAS), has been put on hold because it directly contradicts another as-yet-unpublished study by a third government agency, the U.S. Centers for Disease Control and Prevention (CDC). That paper, a retrovirus scientist says, has been submitted to Retrovirology and is also on hold; it fails to find a link between the xenotropic murine leukemia virus-related virus (XMRV) and CFS. The contradiction has caused “nervousness” both at PNAS and among senior officials within the Department of Health and Human Services, of which all three agencies are part, says one scientist with inside knowledge.

    The remarkable impasse is the latest twist in a story that keeps arousing fierce passions. It started in 2009 when a group of researchers led by Judy Mikovits of the Whittemore Peterson Institute (WPI) for Neuro-Immune Disease in Reno, Nevada, reported in Science finding traces of XMRV in peripheral blood mononuclear cells, a type of white blood cell, of 67% of CFS patients. By contrast, only 3.4% of healthy controls were found to harbor the virus. The team also showed that XMRV could infect human cells and concluded that the virus—which had previously been linked to prostate cancer—might play a role in causing CFS (Science, 23 October 2009, p. 585).

    Many scientists were skeptical, however, and in May, Science published three Technical Comments that tried to poke holes in the study, along with a rebuttal by Mikovits and first author Francis Ruscetti of the National Cancer Institute. By that time, two groups in the United Kingdom and one in the Netherlands had also published papers failing to find a link; in fact, they found little or no evidence of XMRV infection at all, either in patients or in healthy people. Three other groups, two from the United States and one from Europe, have also reported negative findings at meetings, says Kim McCleary, president of the Chronic Fatigue and Immune Dysfunction Syndrome Association of America, a patient advocacy group.

    Going viral.

    News that federal scientists confirmed the virus link reported by Judy Mikovits (inset) sparked thousands of tweets.


    The FDA-NIH paper would offer fresh hope that Mikovits is on to something after all, but so far, details about the work are scant. Ortho, a Dutch magazine about nutrition and food supplements, last week issued a press release saying that Harvey Alter, a renowned virologist at NIH's Clinical Center, mentioned the study when he gave a talk at a blood safety meeting in the Croatian capital Zagreb in late May. In his PowerPoint presentation, Alter wrote that the data in the 2009 study in Science “are extremely strong and likely true, despite the controversy.” Another bullet point said: “We (FDA & NIH) have independently confirmed the Lombardi group findings.” (WPI's Vincent Lombardi was the paper's first author.) But the presentation offered no detail beyond that tantalizing summary, and an NIH spokesperson says Alter is not available for comment.

    Meanwhile, a group working with retrovirologist William Switzer at CDC, which has done an independent study, has held its cards closer to its chest. But Science talked to several scientists who say they have seen the data, and they are negative. Although it's not unprecedented for government scientists to be on opposite ends of a scientific debate, two contradictory press releases on a flashpoint issue like CFS would look odd, scientists say. With publication deferred, “they want to find out what's going on first,” says one researcher who says he has been briefed about the controversy.

    For patients, the stakes are huge. A viral cause would offer a path to better understanding, prevention, and treatment. A paper published in PLoS ONE in April showed that three registered antiretroviral drugs can inhibit XMRV in the test tube, and although many scientists warn that it's premature even to consider them as treatment options, some patients have started testing out different combinations anyway. (One of them is Jamie Deckoff-Jones, a physician from Santa Fe, New Mexico, who is blogging about her experiences and those of her daughter, also a patient.)

    Patients have become a loud voice in the scientific debate as well—and it's taking its toll on scientists who don't support the XMRV hypothesis. “It's ghastly,” says retrovirologist Myra McClure of Imperial College London, the lead author on one of the three published studies that came up empty-handed. “I've had people writing me, and I quote, that I don't know my arse from my elbow, and that I should be fired.” McClure says her first paper on CFS, which came out 4 months ago, will also be her last. “Nothing on God's Earth could persuade me to do more research on CFS,” she says. “I feel bad for the scientists, because it's true, we are a very angry community,” says Wilhelmina Jenkins, a physicist living in Atlanta who has had CFS since 1983.

    The potential involvement of XMRV in CFS is having other ramifications as well. Last week, the AABB, an international association of blood banks, recommended to its members that they discourage CFS patients from donating blood. A special task force on XMRV conceded that the evidence was preliminary but decided it's “prudent” to err on the side of caution, says task force member Louis Katz, the medical director at the Mississippi Valley Regional Blood Center in Davenport, Iowa. “If [XMRV] turns out to be important,” says Katz, “I don't want to be criticized for doing nothing when I could have done something.”

  2. Bhopal

    India Launches New Probe of Gas Disaster

    1. Pallava Bagla

    NEW DELHI—More than a quarter-century after the world's worst chemical disaster, the Indian government last week ordered remediation of the contaminated site in Bhopal and the creation of a research institute to study the toxic legacy.

    The tragedy began on the night of 3 December 1984, when more than 40 tons of methyl isocyanate gas escaped from Union Carbide Corp.'s pesticide plant in Bhopal. Investigators learned that three safety systems, including a flare tower that could have burned off the leaking gas, were all switched off at the time of the accident. The gas killed 5,295 people within days, and, according to local authorities, nearly 10,000 more people have subsequently died of complications from gas-related injuries.

    The government has now instructed the Indian Council of Medical Research (ICMR) to set up a center in Bhopal to study respiratory diseases, cancers, genetic disorders, and birth defects. The decision is controversial because years of research have uncovered only acute effects of the gas. The new center is a case of “political expediency,” charges Sushil Kumar Jain, a lung specialist here at Moolchand Hospital and lead author of many Bhopal studies.

    The move comes in the wake of public outrage over verdicts handed down last month to seven Union Carbide executives, each of whom was sentenced to a maximum of 2 years in prison for negligence. The light sentences in the long-running case are a “travesty of justice,” claims Sunita Narain, director of the Centre for Science and Environment, a nonprofit here. Stung by widespread criticism, the government is looking to retry the accused on the far more serious charge of “culpable homicide.”

    Paying homage.

    D. K. Satpathy of Gandhi Medical College and colleagues conducted over 2500 postmortems in 3 days in 1984.


    Bhopal is already familiar turf for ICMR. In an exhaustive report on the accident in 2004, ICMR found that the gas inflicted acute injuries to eyes and lungs but did not trigger cancers or other “progressive” diseases. ICMR also found no evidence of genetic or congenital abnormalities. There is “no interest among researchers to study this tragedy anymore,” contends ICMR Director General Vishwa Mohan Katoch. In 2008, he notes, a call for proposals for more Bhopal medical studies drew a mere three responses. Still, Katoch says, ICMR must comply with the order to set up a new center within 90 days.

    Also raising hackles is a decision to transfer control of Bhopal Memorial Hospital and Research Centre, established after the accident, to the Department of Biotechnology (DBT) and the Department of Atomic Energy. “This is the limit of absurdity,” says molecular biologist Pushpa M. Bhargava, chair of the Sambhavna Trust in Bhopal, a nonprofit body helping victims. “What experience does the Department of Biotechnology have in patient care?” According to DBT Secretary Maharaj Kishan Bhan, the hospital will become a base for environmental biomedical research. The government has also allocated $77 million to dismantle the plant and for “complete” restoration of the grounds by the end of 2012.

    Earlier this year, the National Environmental Engineering Research Institute (NEERI) in Nagpur found no contamination of groundwater in five wells dug at least 25 meters deep at the site. What saved Bhopal from a bigger catastrophe, says NEERI Acting Director Tapan K. Chakrabarti, is a 17-meter-thick layer of clay that has blocked pollutants from leaching into the water table—a silver lining, perhaps, to Bhopal's gloomy story.

  3. Condensed-Matter Physics

    Evidence for Free-Flowing Supersolid Slipping Away?

    1. Adrian Cho

    Six years ago, frigid solid helium set the world of condensed-matter physics afire when one team found that at temperatures near absolute zero the stuff appears to flow like a liquid without any viscosity (Science, 1 July 2005, p. 38). Physicists have since debated how such “supersolidity” might come about, or whether it even exists. Now, an experiment calls into question the first, best evidence for supersolidity. What physicists took as a sign of resistance-free flow at very low temperatures is in fact a manifestation of an odd softening of the solid at slightly higher temperatures, says John Reppy of Cornell University.

    If the result can be reconciled with previous experiments, “then I think John's conclusion that what we're seeing is not low-temperature supersolidity … is kind of hard to escape,” says John Beamish, an experimenter at the University of Alberta in Edmonton, Canada. But others say that supersolidity and the proposed softening may arise from the same underlying mechanism and that, by virtue of its design, Reppy's experiment may accentuate the latter.

    Which way?

    If solid helium flows, then adding defects to it should make this torsional oscillator twist faster at low temperatures (blue trace). Instead, it twists slower at higher temperatures (red trace).


    Liquid helium—specifically the isotope helium-4—flows without resistance when chilled below 2.17 kelvin as the light atoms in it collect in a quantum wave. Since the late 1960s, some theorists had speculated that such resistance-free flow might persist even when the liquid is pressurized to 25 times atmospheric pressure to form a solid.

    In 2004, Moses Chan of Pennsylvania State University, University Park, and a colleague reported signs of such flow in solid helium. They set a little can of the material twisting atop a small metal shaft in a “torsional oscillator.” When the temperature dropped below about 0.2 kelvin, the frequency of the twisting shot up, suggesting that some helium atoms had let go of their neighbors, effectively lightening the can and reducing its “moment of inertia.” But those atoms would have to stand still as the rest of the helium moved, so they'd have to flow through the solid unimpeded.
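    The logic of that measurement can be sketched in a few lines. A torsional oscillator's resonant frequency follows f = (1/2π)√(κ/I), where κ is the torsion constant and I the moment of inertia, so any helium mass that decouples from the twisting can raises the frequency. The numbers below are illustrative assumptions, not values from Chan's apparatus:

```python
import math

def resonant_frequency(kappa, inertia):
    """Resonant frequency of a torsional oscillator: f = sqrt(kappa/I) / (2*pi)."""
    return math.sqrt(kappa / inertia) / (2 * math.pi)

# Illustrative numbers only (not the actual apparatus):
kappa = 1.0e-3       # torsion constant of the shaft, N*m/rad
I_helium = 1.0e-8    # helium's contribution to the moment of inertia, kg*m^2
I_total = 1.0e-7     # can plus helium, kg*m^2

f_normal = resonant_frequency(kappa, I_total)

# If 1% of the helium decouples (stands still while the can twists),
# the effective moment of inertia drops and the frequency rises.
I_decoupled = I_total - 0.01 * I_helium
f_supersolid = resonant_frequency(kappa, I_decoupled)

print(f_supersolid > f_normal)  # -> True: a lighter effective can twists faster
```

This is why a frequency jump at low temperature was read as mass "letting go" of the oscillating can.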

    In 2006, however, Reppy and a colleague showed that they could reduce the size of the frequency shift if they warmed their helium to close to its melting point (Science, 24 March 2006, p. 1693). Such warming, called annealing, repairs defects in the pattern of atoms in a crystalline solid. So if atoms were flowing, it had to be along defects in the crystal, physicists concluded.

    But Reppy's new result suggests that nothing flows. This time, instead of annealing his helium to reduce defects, he pushed on it to create more. The 79-year-old don of low-temperature physics built a torsional oscillator that holds an annulus of solid helium between an outer cylinder and an inner one that moves like a piston. By shifting the inner cylinder by a micrometer, Reppy could deform a sample he'd already studied and seed it with fresh defects.

    If atoms really do flow freely along defects, then adding more defects should allow more flow, and at low temperatures the frequency of twisting should soar even higher. But adding defects did not increase the frequency at low temperatures, as Reppy reported online on 21 June in Physical Review Letters. Rather, it reduced the frequency at higher temperatures. “And that's completely contrary to the supersolid picture,” he says.

    It's possible to explain the observed frequency shifts without flow, Reppy says. Adding defects may make the solid helium softer so that it acts less like a chunk of diamond and more like a blob of Jell-O that sloshes as the oscillator twists back and forth. That sloshing would reduce the frequency of twisting. The effect would go away at low temperatures as the defects lock up and the solid stiffens.

    How did physicists mistake a frequency decrease at higher temperatures for a frequency increase at lower temperatures? Previous experiments relied on annealing to change the density of defects. But the process causes the helium to settle and generally creates a large, uncontrolled frequency shift that researchers must subtract before comparing data taken before and after annealing, Beamish says. Thinking that all the action was at low temperatures, physicists had assumed that plots of frequency versus temperature should coincide at high temperatures (see figure).

    So is this the end of supersolid helium? Not necessarily, Chan says: “I don't see anything wrong with John's measurement, but probably it can't be taken as evidence that the whole idea of supersolidity is gone.” For example, he says, softening does not explain signs of flow seen in helium crammed into the nanometer-sized pores of a type of glass, where the stiffness of the helium shouldn't make a difference.

    Moreover, Chan notes, theorists think that the flow in supersolidity may occur along stringlike defects called dislocations, but only after they have frozen into place. So higher-temperature softening could be related to lower-temperature flow. Reppy's cell squeezes the helium into a very thin annulus, which may accentuate the effects of softening over those of flow, Chan says.

    The new experiment may only replace one puzzle with another. To fully explain the frequency shifts Reppy sees with softening, the helium would have to become almost as squishy as a liquid, Beamish says. And it's not clear why solid helium should be so soft. “It's completely miraculous,” Reppy says, “as miraculous as the supersolid effect in terms of trying to understand it.” Stay tuned; the mystery of solid helium continues to deepen.


  4. ScienceNOW

    From Science's Online Daily News Site


    Earth-Like Planets May Be Shielded From Solar Scorching

    Many of our galaxy's suns have destroyed the atmospheres of orbiting Earth-like planets—or so astrobiologists have long feared. The Milky Way, they note, is dominated by M dwarf stars: violent, unpredictable suns that frequently hurl high-energy particles and solar flares into space. Because they are much cooler than our sun, any potentially habitable planet would need to orbit them much closer than Earth does, putting it smack in the danger zone. But a new study indicates that these planets may be unexpectedly shielded from solar activity, keeping life safe.

    Lightest Bits of Matter Just Got Lighter

    Cosmologists have used measurements of some of the most massive objects in the universe to place a limit on the mass of the lightest particle in the cosmos. Using data from a survey of 700,000 galaxies, the researchers found that an elusive subatomic particle called a neutrino can have a mass of no more than 0.28 electron volts, which is less than one-billionth of the mass of a hydrogen atom.
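    The scale of that comparison is easy to verify: a hydrogen atom's rest mass is about 938.8 million electron volts, so the quoted bound works out to roughly three parts in ten billion. A quick arithmetic check, using rounded figures:

```python
# Sanity check of the "less than one-billionth" comparison (rounded figures).
neutrino_limit_ev = 0.28       # the survey's upper bound on the neutrino mass, eV
hydrogen_mass_ev = 938.8e6     # hydrogen-atom rest mass, ~938.8 MeV expressed in eV

ratio = neutrino_limit_ev / hydrogen_mass_ev
print(ratio < 1e-9)  # -> True: the ratio is roughly 3e-10
```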

    Filoviruses Go Way Back

    Relatives of the lethal Ebola and Marburg viruses likely began infecting mammals tens of millions of years ago, according to a new study, making this family of viruses, known as filoviruses, far older than scientists had thought. By comparing viral remnants in different species, scientists found that the remnants were nearly identical, indicating that the viruses infected mammals only once early in evolution and that the remnants were then passed down as the groups diverged. Remnants of genes from these viruses exist in the DNA of bats, marsupials, rodents, and other mammals, a finding that may suggest where these deadly microbes lurk before they emerge to kill people.


    Tanning Ability Driven by Evolution

    Sunbathers who bronze beautifully have natural selection to thank. Research published in the Proceedings of the National Academy of Sciences suggests that the ability to tan is a trait that evolved several times in mid-latitude regions, such as China and the Mediterranean, where the sun's intensity varies dramatically from season to season. If inhabitants of these regions had consistently dark skin, which blocks the sun's rays, they wouldn't have produced enough vitamin D in the winter. If they had consistently light skin, their bodies would have been robbed of folate, a light-sensitive vitamin essential for cell division and repair. Folate is especially important during pregnancy; too little can result in birth defects. Indeed, the researchers posit that sun-induced folate deficiency, rather than skin cancer or sunburn, was the driving force behind the evolution of dark skin and tanning.

    Read the full postings, comments, and more at

  5. ScienceInsider

    From the Science Policy Blog

    The Supreme Court this week struck down a controversial business-method patent in its ruling on Bilski v. Kappos. But advocates who wanted the court to spell out clear rules on the patentability of inventions that deal with methods and processes were disappointed with the ambiguous ruling. Biotech advocates, who had feared new barriers to intellectual property protection, were pleased.

    President Barack Obama this week released a National Space Policy that affirms the Administration's commitment to commercial space flight and exploration. The policy gives the mid-2030s as a target for the first human flight to Mars and promises a human mission to a nearby asteroid by 2025.

    Marine scientist Sean Powers says he's seeing more oil and less of a key seaweed called Sargassum in the wake of the gulf oil spill. Meanwhile, the federal government halted a Louisiana plan to build giant sand berms to protect wetlands.

    Former University of Wisconsin, Madison, biologist Elizabeth Goodwin pleaded guilty to falsifying data on a grant progress report. She will pay $50,000 and be barred from federal research for 3 years. A group of her graduate students reported the misconduct.

    Kentucky agrarian literary legend Wendell Berry has pulled his personal papers from the University of Kentucky in protest over the school's emphasis on research over teaching and its environmental policies. “The University of Kentucky has a mandate to look after the country people and the rural landscapes,” he told Insider.

    Presidential science adviser John Holdren says that a report on scientific integrity, nearly a year late, will simply augment efforts already in place that are aimed at improving government performance on the issue. President Obama had requested a full “plan” to tackle the problem.

    For the full postings and more, visit

  6. Gulf Oil Disaster

    Hunting for Plumes, Learning to Live in a Media Spotlight

    1. Erik Stokstad

    Not long after the Deepwater Horizon rig burned and sank, Samantha Joye had a hunch about how the impending disaster would unfold. As oil and gas gushed out of the ruptured well into the Gulf of Mexico, she reasoned, many of the hydrocarbons would spread underwater in deep plumes rather than all rise to the surface, as many expected. But she had no inkling of what her confirmation of this insidious subsurface dispersal would mean for her life and career in the crazy weeks and months to follow.

    Reluctant star.

    Samantha Joye has become one of the most-quoted scientists on the oil spill.


    Thanks to a combination of expert insight and the good fortune of having the first research vessel out of the docks, Joye and her colleagues were the earliest of several groups to find evidence of the plumes. For Joye, a biogeochemist at the University of Georgia, Athens, the impact of the discovery was immediate and unrelenting. Since the findings made national headlines in mid-May—after Joye tipped off a reporter from The New York Times—she has been the focus of persistent media attention. “This dwarfs anything I could have imagined,” says Joye, who has become one of the chief independent scientific voices of the spill. “I think she's done a great service drawing attention to issues that otherwise would have possibly gotten swept under the rug,” says chemist Jeffrey Short of Oceana, an advocacy group in Washington, D.C.

    Joye, 44, first heard rumors that something was badly amiss with an oil platform from colleagues who had spotted large columns of smoke while doing research nearby. Shortly after the oil rig sank, Joye e-mailed the program manager who was handling her upcoming cruise on the R/V Pelican, which was being funded by the National Oceanic and Atmospheric Administration (NOAA). The original plan had been to map gas vents and other features on the sea floor, but Joye wanted to change the focus to the impact of the oil spill. He quickly agreed.

    Based on her previous studies of naturally released hydrocarbons in the gulf, Joye suspected that oil and gas from the leaking well would become entrained in the water column and then move laterally with the prevailing currents. A bad back prevented Joye from going on the cruise, but she helped coordinate research and real-time analyses from her office.

    On 12 May, scientists on the Pelican detected signals that indicated an unusual layer of dissolved organic matter that suggested plumes between 700 and 1300 meters deep. Oxygen levels were lower than normal in the suspected plumes, probably because bacteria were metabolizing methane gas. Back in Georgia a few days later, Joye called the Times reporter, who wrote about the finding on 15 May. The story described Joye's alarm: “There's a shocking amount of oil in the deep water, relative to what you see in the surface water,” she said.

    Reporters eagerly chased the front-page story. Just 10 minutes after the story went online, Joye's phone began to ring. Reporters kept calling until 2 a.m., she says, when she unplugged the phone. News trucks were parked in front of her office on the campus. “I didn't have a minute's peace,” she says. “It was absolutely insane.”

    The media maelstrom did not go unnoticed at NOAA. On Monday, 17 May, NOAA Administrator Jane Lubchenco released a cautious statement emphasizing that the Pelican cruise hadn't yet confirmed the presence of oil in the plumes and that the falling oxygen levels weren't an immediate danger to marine life. Lubchenco added: “We eagerly await results from their analyses and share with them the goal of disseminating accurate information.” In a subsequent lecture at a large research meeting, while discussing preliminary plume measurements from a NOAA vessel, Lubchenco warned the audience: “If we jump to conclusions, that doesn't serve science well. Verification, not conjecture, is what I would urge.”

    Joye and others were puzzled by the reaction to the Pelican findings. “A lot of academicians were surprised by NOAA's behavior,” says Ernst Peebles of the University of South Florida in St. Petersburg, who has also identified plumes in the gulf. Ian MacDonald of Florida State University, Tallahassee, felt that NOAA was “basically challenging Samantha's interpretation of this data.” NOAA says it wanted to correct misleading news stories and put out accurate information. “The intent was to distinguish what we actually knew from the cruise from speculation,” according to Steve Murawski of NOAA, who heads an interagency group on subsurface sampling.

    Peebles thinks Joye has prompted NOAA to be more open; the agency now regularly puts out reports on preliminary results from cruises. “NOAA has changed its policy … as a result of interaction with the Pelican cruise, for the better,” he says. NOAA's most recent report, released last week, confirms the presence of plumes at a concentration of 1 to 2 parts per million within a few kilometers of the well.

    Joye says she was also motivated to keep talking with reporters because she was frustrated that more research wasn't under way on tracking the hydrocarbons and their impact in the deep sea. Since then, she has been using her newfound soapbox to advocate for more research, even testifying on Capitol Hill.

    Just a few days after the NOAA statement, Joye got an unexpected chance to do more science. The University of Miami's R/V F. G. Walton Smith had become available for a research cruise, but she had only 4 days to organize the mission—a task that normally takes more than 2 months. “It wasn't the most pleasant cruise, but I had to be out there,” she says. To manage her back pain, she brought along a bottle of ibuprofen—the extra-large pills she normally gives to her horse.

    The team of 11 scientists from four institutions worked around the clock for 2 weeks, collecting samples to track and analyze the plume. As Joye described in her daily blog from the cruise and in press conferences, the team found the first visible signs of oil in the deep-water samples, found methane concentrations up to 100,000 times higher than normal, and discovered that oxygen levels were as much as 40% below normal. Press interest remained high: A TV news crew chartered a boat to visit the ship at sea; more were waiting at the docks when the ship returned.

    Joye's main goals right now are to figure out whether the microbes are feeding more on oil or gas and which nutrients might limit their action; these factors likely determine how much oxygen the microbes are using, which is important to understand possible impacts on other marine life. She's hoping to get another cruise next month and ideally take a submarine into the plume in August to look for impacts on fish and other marine life. “It's a disaster, and I love the Gulf of Mexico,” she says. “I'm going to do whatever I have to do to figure out what's going on out there.”

  7. Gulf Oil Disaster

    How to Kill a Well So That It's Really Most Sincerely Dead

    1. Richard A. Kerr

    After a run of “top kill” attempts failed to stanch the flow of crude oil from its Macondo well in the Gulf of Mexico, oil giant BP has one major option left: plug the flow near its source by drilling into the bottom of the runaway well. That will require hitting an 18-centimeter-wide well 5500 meters beneath the gulf's surface. It may sound like a tall order, but drillers are confident. It “is going to be the ultimate answer,” says Paul Bommer, a petroleum engineer at the University of Texas (UT), Austin. The technology “is miraculous,” adds Bommer's UT Austin colleague Tadeus Patzek. “It's difficult, it's going to take time, and it will be frustrating. [But] eventually they'll get there, absolutely.” And a kill looks almost as certain.

    A bottom kill may take months of drilling to reach its target, but it has an innate advantage over the top-down approach. As in a top kill, the goal is to fill the well with enough heavy drilling “mud” to counter the pressure of oil and gas rushing up from the reservoir below. Mud injected at the top of the well needs enough extra back pressure to slow the flow so it can sink down the well. That never happened on BP's repeated attempts. In a bottom kill, by contrast, the rising oil and gas can actually give the mud a boost.
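    The pressure balance behind both kill strategies follows the hydrostatic relation p = ρgh: the mud wins once its bottom-hole pressure exceeds the formation pressure driving the flow. The density, depth, and reservoir pressure below are illustrative assumptions, not figures from the Macondo well:

```python
G = 9.81  # gravitational acceleration, m/s^2

def hydrostatic_pressure(density_kg_m3, height_m):
    """Pressure at the bottom of a static fluid column: p = rho * g * h (pascals)."""
    return density_kg_m3 * G * height_m

# Illustrative assumptions only (not actual Macondo figures):
mud_density = 2000.0        # heavy kill mud, kg/m^3
well_depth = 5500.0         # mud column height down to the reservoir, m
reservoir_pressure = 90e6   # formation pressure pushing oil up, Pa (~90 MPa, assumed)

mud_pressure = hydrostatic_pressure(mud_density, well_depth)
print(mud_pressure > reservoir_pressure)  # -> True: a full column overbalances the reservoir
```

Filling the well from the bottom lets the column build without first fighting the upward flow, which is the advantage the paragraph above describes.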

    On target.

    Two relief wells (yellow) are being guided toward the gushing Macondo well (red) by drillers using onboard navigation and remote-sensing instrumentation.


    First, however, the drillers must find the bottom of the target well. They start with a navigation device invented for submarines and missiles, says borehole geophysicist Roger N. Anderson of Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York. Twenty years ago, workers at Los Alamos National Laboratory developed a set of accelerometers so compact they could be installed in a missile, or just above a drill bit. By precisely measuring acceleration in three dimensions throughout drilling, the navigation package can tell drillers where the drill bit is within several meters at all times, says Anderson.

    Another technology, directional drilling, lets drillers act on that precise navigation information. The fastest method for turning a borehole in a preferred direction is the “point the bit” approach, says Bommer. The entire string of drill pipe is turned from the drilling platform; in addition, the assembly at the bottom of the drill string can be set to give the bit an “extra bump” in the preferred direction with each rotation. Knowing the location of both drill and target within a few meters isn't good enough when drilling into an 18-centimeter wellbore, so drillers eventually bring into play an active technique for locating the target well.

    In the current operation, BP's drillers started their relief well (the first of two the federal government required) 850 meters from the Macondo well. They drilled about 3660 meters straight down and then bent toward the target well. Using an electromagnetic technique that senses the steel casing of the well, they steered the drill bit to 6 meters from the Macondo well and are now drilling parallel to it with another 300 meters to go to the intended intersection point. Ranging continuously, drillers will home in on the intersection point in the last 60 meters of drilling.
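    The geometry those figures describe can be roughed out with a straight-line approximation. Real relief wells follow smoothly curved, continuously steered paths, so the angle below is only a rough sketch based on the distances quoted above:

```python
import math

# Crude straight-line sketch of the relief-well geometry: starting 850 m from
# the target, drilling straight down to the kickoff point, then angling over
# to converge near the intersection depth. Figures are from the story above;
# the straight-line path is a simplifying assumption.
lateral_offset_m = 850.0    # horizontal distance from the Macondo well at the start
kickoff_depth_m = 3660.0    # depth drilled straight down before bending
target_depth_m = 5500.0     # approximate depth of the intended intersection

remaining_vertical = target_depth_m - kickoff_depth_m
avg_inclination = math.degrees(math.atan2(lateral_offset_m, remaining_vertical))
print(round(avg_inclination, 1))  # -> 24.8 (average tilt from vertical, degrees)
```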

    “This is the point when we have to be very good, and we are,” BP Senior Vice President Kent Wells said at a press briefing on 28 June. The record bears him out. As Wells noted, John Wright of Boots & Coots International Well Control Inc., which is running the relief well operation, has overseen 40 relief wells and borehole intersection operations. All 40 successfully intersected their targets. “It's not so much ‘if’ but ‘when’” the well is intersected, said Wells.

    Intersection is not a kill, however. Patzek cautions that the way this well appears to have failed—at the point where cement was supposed to have sealed the space between two steel pipes—may make stanching the flow more difficult than usual. With such potential complications in mind, Anderson puts the odds of a kill using two wells at 98%. No blowout gushes endlessly, Anderson notes. “They will succeed,” he says. “The only question is how long it will take.”

  8. Epigenetics

    The Seductive Allure of Behavioral Epigenetics

    1. Greg Miller

    Could chemical changes to DNA underlie some of society's more vexing problems? Or is this hot new field getting ahead of itself?


    Michael Meaney and Moshe Szyf work in the same Canadian city, but it took a chance meeting at a Spanish pub more than 15 years ago to jump-start a collaboration that helped create a new discipline. Meaney, a neuroscientist at the Douglas Mental Health University Institute in Montreal, studies how early life experiences shape behavior later in life. Across town at McGill University, Szyf is a leading expert on chemical alterations to DNA that affect gene activity. Sometime in the mid-1990s, both men attended the same meeting in Madrid and ended up at a bar talking and drinking beer. “A lot of it,” Szyf recalls.

    Meaney told Szyf about his findings that rat pups raised by inattentive mothers tend to be more anxious as adults than pups raised by more nurturing mothers. He also described how the activity of stress-related genes was altered in the undernurtured pups. At some point in the conversation, Szyf had a flash of insight: This difference must be due to DNA methylation—the chemical alteration he had been studying in stem cells and tumor cells.

    The idea cut against the conventional thinking in both fields. In neuroscience, the prevailing wisdom held that long-term changes in behavior result from physical changes in neural circuits—such as when neurons build new synapses and become more sensitive to messages from their neighbors. And most scientists who studied DNA methylation thought the process was restricted to embryonic development or cancer cells.

    Back in Montreal, the pair eventually began collaborating, and Szyf's hunch turned out to be right. Their research, along with work from a handful of other labs, has sparked an explosion of interest in so-called epigenetic mechanisms of gene regulation in the brain. Long familiar to developmental and cancer biologists, these molecular mechanisms alter the activity of genes without changing their DNA sequence (Science, 10 August 2001, p. 1064). And they have recently become a white-hot topic in neuroscience. Meaney and Szyf's work suggests that epigenetics could explain how early life experiences can leave an indelible mark on the brain and influence both behavior and physical health later in life. These effects may even carry over to subsequent generations. Meanwhile, other researchers have implicated epigenetics in drug addiction. Still others have described important roles in cognition (see sidebar, p. 27).

    Some researchers speculate that if these rodent findings extend to humans, epigenetics could turn out to be at the heart of some of the most vexing problems in society. These ills include the long-term health problems of people raised in lower socioeconomic environments, the vicious cycle in which abused children grow up to be abusive parents, and the struggles of drug addicts trying to kick the habit.

    Tempting as such speculation may be, others worry that the young but fast-growing field of behavioral epigenetics is getting ahead of itself. They point out that so far there's very little evidence in humans that epigenetics connects early life experience to behavioral or health problems later in life. Moreover, several experimental obstacles will make finding proof exceedingly difficult. “I think there's been a lot of putting the cart before the horse,” says Gregory Miller (no relation), a psychologist at the University of British Columbia (UBC), Vancouver.

    The importance of a loving mother

    In 2004, Szyf and Meaney published a paper in Nature Neuroscience that helped launch the behavioral epigenetics revolution. It remains one of the most cited papers that journal has ever published. The paper built on more than a decade of research in Meaney's lab on rodent mothering styles.

    Rat moms vary naturally in their nurturing tendencies. Some lick and groom their pups extensively and arch their backs to make it easier for their young to nurse. Others spend far less time doting on their pups in this way.

    Meaney had found that the type of mothering a rat receives as a pup calibrates how its brain responds to stress throughout its life. Rats raised by less-nurturing mothers are more sensitive to stress when they grow up. When confined to a Plexiglas tube that restricts their movement, for example, they exhibit a greater surge in corticosterone, a hormone pumped out by the adrenal glands in times of stress. The likely cause is reduced numbers of a receptor for steroid hormones in the brain. This so-called glucocorticoid receptor is part of a negative feedback loop that dials down the volume on communication between the brain and adrenal glands, thereby reducing reactivity to stress.

    The Nature Neuroscience paper linked this reduction in glucocorticoid receptors to DNA methylation. Rats raised by less-nurturing moms tended to have more methyl groups attached to the promoter region, the “on” switch, of the glucocorticoid receptor gene. These methyl groups block access by the transcription factors that turn the gene on. As a result, fewer receptors are produced. Subsequent experiments showed that enzymes that reverse DNA methylation of the glucocorticoid receptor gene also reverse the effects of unenthusiastic mothering on the offspring's hormonal and behavioral responses to stress.

    Several of Meaney's students have carried on with this work and extended it in new directions. Frances Champagne, a co-author of the 2004 paper, went on to show that female rats raised by nurturing mothers are more nurturing mothers themselves. She also found that pups raised by less-nurturing moms exhibit greater methylation—and reduced expression—of the gene for a particular estrogen receptor in the hypothalamus, a brain region involved with reproductive behavior. This receptor amplifies signaling by oxytocin, a hormone that promotes mother-infant bonding.

    Now at Columbia University, Champagne has been investigating the long-term effects of other kinds of social experiences early in life. She has found that mice raised communally by multiple mothers (as rodents raise their young in the wild) are better socially adjusted as adults: They are less likely to pick a fight with a stranger put into their cage, for example. Communally raised mice also spend more time fussing over their own offspring. Fussed-over daughters in turn tend to grow up to be nurturing mothers. These changes correlate with a higher density of oxytocin receptors in some brain regions, the researchers reported last September in Frontiers in Behavioral Neuroscience. Champagne's lab is now investigating whether that higher density could be due to epigenetic modifications. “What's exciting to me is that the social world, which can be perceived as being this ethereal thing that may not have a biological basis, can affect these mechanisms,” she says.

    Adversity takes its toll

    Whereas a nurturing environment can predispose a rodent to be calmer in adulthood and raise a nurturing family of its own, an adverse environment can have the opposite effect. There's evidence that this effect, too, may involve epigenetic changes. Last year, researchers led by Tania Roth and J. David Sweatt of the University of Alabama, Birmingham, built on earlier work showing that rat mothers denied access to the materials needed to make a proper nest become anxious and spend less time nurturing their young. Pups raised by these stressed-out rat moms exhibited increased methylation of the gene for BDNF, a neural growth factor, in the brain's prefrontal cortex, the team reported in the 1 May 2009 issue of Biological Psychiatry. In addition, this methylation pattern, which would tend to reduce the amount of BDNF produced, was passed on to the subsequent generation.

    Different upbringings.

    Being raised by a nurturing (top left) or a lackadaisical (top right) mother can cause epigenetic differences that affect a rat pup's behavior later in life. Whether similar differences occur in people raised in wealthy (bottom left) or impoverished (bottom right) neighborhoods remains an open question.


    Exactly what low BDNF levels mean for a mouse isn't entirely known. However, levels of this growth factor are reduced in mouse models of depression and anxiety, at least in certain brain regions, and restoring it can mimic the effect of antidepressant drugs. In 2006, Eric Nestler, currently at Mount Sinai Medical Center in New York City, and colleagues reported in Nature Neuroscience that the Bdnf gene is down-regulated in the hippocampus of adult mice exposed to social stress—in the form of chronic bullying by a bigger mouse. In the same paper, Nestler's team linked this reduction in Bdnf activity to epigenetic modifications involving histones, tiny protein spools that keep DNA wrapped up. Chronic stress triggered an increase in a type of histone methylation that suppresses gene activity by keeping the DNA containing the Bdnf gene tightly wound. Antidepressant drugs, on the other hand, boosted histone acetylation, which helps unwind DNA from histones and promote Bdnf activity. Such findings hint that epigenetic modifications could be an important link between adverse life experiences and the risk of psychiatric disorders such as depression and anxiety, Nestler says.

    Another line of work in Nestler's lab suggests that epigenetic mechanisms could also play an important role in drug addiction. His team has now documented several epigenetic changes provoked by cocaine administration in rodents. The drug increases acetylation and decreases methylation of histones—both of which tend to promote gene activity—in the brain's reward circuitry. “Cocaine produces changes in the genome that make the brain more sensitive to the next dose of cocaine,” Nestler says.

    For example, in the 8 January issue of Science (p. 213), Nestler and colleagues reported that repeated cocaine administration suppresses methylation of a particular histone in one reward region, the nucleus accumbens. Suppressing this type of histone methylation in mice that had never received cocaine triggered the growth of extra dendritic spines—tiny extensions on neurons that tend to sensitize them—in the nucleus accumbens. Suppressing methylation also increased the animals' preference for cocaine once they finally tried it. In contrast, enhancing histone methylation reduced cocaine-seeking behavior. Nestler cautions that any treatments for human drug addicts are a long way off, but he says a better understanding of the genes altered by such epigenetic changes could eventually point to new treatments to break the hold of addiction.

    Epigenetic breakdown.

    Several epigenetic mechanisms alter gene activity in neurons, with potentially important effects on brain function and behavior. Histone acetylation tends to promote gene activity, whereas histone methylation and DNA methylation tend to inhibit it.


    The human story

    If the rodent research on epigenetics translates to humans, the implications could be far-reaching. The effects of adverse environments early in life are well documented and notoriously hard to shake. Childhood abuse, for example, elevates the lifelong risk of depression, anxiety, and suicide. Growing up in an impoverished environment also takes a lasting toll, affecting physical health as well as behavior. Could epigenetics be part of the reason?

    So far, the evidence in humans is scant. A major reason, and a huge obstacle to the field in general, is that human brain tissue is hard to come by. In one of the few studies to date, Meaney, Szyf, and postdoctoral fellow Patrick McGowan examined postmortem brains from 24 people who had committed suicide, half of whom had been abused as children. From their rodent work, the researchers hypothesized that those who had been abused might have more methyl groups on the glucocorticoid receptor gene than did those who had not been abused. That's indeed what they found, the researchers reported in the March 2009 issue of Nature Neuroscience.

    Lacking access to brain tissue, researchers have looked elsewhere in the body for signs of epigenetic alterations. In the March–April 2008 issue of Epigenetics, a team led by Tim Oberlander at UBC reported increased methylation of the glucocorticoid receptor gene in cells isolated from umbilical cord blood in 33 infants born to women who suffered symptoms of depression during their pregnancy. Those infants also had higher cortisol concentrations in their saliva when tested 3 months later, suggesting an increased susceptibility to stress.

    “Those data are very consistent with the rodent work, although there's some ambiguity about what's actually going on,” says UBC's Miller, who was not involved with the study. One issue, he and others point out, is that cord blood contains a mishmash of different cell types. Methylation patterns can vary significantly from one cell type to another, and the proportion of cell types in cord blood can vary significantly from one individual to another. As a result, Miller says, the findings in the Epigenetics paper could be accounted for by a difference in the combination of cell types sampled rather than a difference in methylation in any particular cell type. “That's a real problem for that study and others like it,” he says.

    In Vancouver, Miller and UBC colleagues Edith Chen and Michael Kobor head an ongoing study of gene expression in people raised in different socioeconomic conditions. Their work focuses primarily on white blood cells. Miller says he expected to find increased methylation of the glucocorticoid receptor gene studied by Meaney's group, reasoning that the lower socioeconomic environment of his subjects might be roughly analogous to the un-nurturing environment of Meaney's rat pups. So far they haven't found it. “We're not seeing anything in the way of DNA methylation in the glucocorticoid receptor [gene],” he says. One possible explanation, Miller says, is that blood and brain cells don't necessarily undergo the same epigenetic changes in response to a given life experience.

    Miller also suspected that he and his colleagues would find epigenetic alterations in genes related to physical health. Because of its long-lasting effects on gene activity, DNA methylation seemed like a potentially attractive explanation for why people who grow up in poor households have an increased lifetime risk of health problems. (A 2006 study of graduates of Johns Hopkins University School of Medicine, for example, showed that even in this well-educated, affluent population, those who'd grown up in poor households decades earlier had 2.4 times the risk of heart disease compared to those who'd grown up in wealthier households.)

    Socioeconomic status early in life does appear to alter gene expression: In a 25 August 2009 paper in the Proceedings of the National Academy of Sciences (PNAS), Miller and colleagues reported disparities in the activity of more than 100 genes related to immune system function in the white blood cells of men who lived in lower socioeconomic environments before the age of 5. The net result of these changes in gene activity would tend to increase inflammatory immune responses, a potential contributing factor to the documented increases in infectious and cardiovascular diseases related to poverty.

    But epigenetics does not seem to be the cause of these changes. “We spent the last few years trying to see if we could find evidence of epigenetic alterations in the immune system that are related to early life experience,” Miller says. “This work is still ongoing, so I think it would be premature to conclude anything definitively, but we've had less success than we'd hoped and imagined.”

    Others also feel that too much emphasis is being put on epigenetics as the link between environment and genes. “The big-picture story is that clearly social interactions can regulate gene expression, but they do so in different ways in different tissues,” says Steve Cole, a genomics researcher at the University of California (UC), Los Angeles, who collaborated on the PNAS study. Cole notes that epigenetics represents only one class of potential mechanisms for altering gene activity. He argues that changes in the activity of transcription factors can cause long-term changes in gene expression without help from DNA methylation or other epigenetic mechanisms.

    The excitement over the Meaney findings has led many researchers to look for epigenetic alterations, but Cole says he knows of several others who are coming up empty-handed: “Lots of people have spent lots of time and money and are now a little grumpy about this.”

    At UC Berkeley, one of Meaney's former students, Darlene Francis, says she has mixed feelings too. “These phenomena are really exciting from a public-health perspective,” says Francis, a neuroscientist and public-health researcher. Epigenetics provides a potential explanation for how social conditions can affect biology in ways that can contribute to poor health, Francis says: “This allows folks who are convinced that social forces are huge contributors to risk and vulnerability to make more effective arguments.” At the same time, Francis says too many researchers are embarking on “undirected” searches for epigenetic alterations in human populations without a solid rationale. “What some people take away [from the rodent work] is that methylation is now the cause and solution to a lot of life's problems,” she says. “I get frustrated with the overextrapolation of the animal findings, and some of it is my work so it's ironic,” she adds. “I don't know when I became the sensible one.”

  9. Epigenetics

    A Role for Epigenetics in Cognition

    1. Greg Miller

    The push to show that epigenetics can translate early life experiences into lasting changes in behavior has been accompanied by a parallel surge of interest in how chemical modifications to DNA can affect cognition.

    This surge of interest (see main text, p. 24) sprang from research in the late 1990s showing that abnormalities in DNA methylation are involved in developmental disorders that cause intellectual impairment, including Angelman syndrome and Rett syndrome, says J. David Sweatt, a neurobiologist at the University of Alabama, Birmingham. Sweatt's lab and others have since found evidence that epigenetic mechanisms play important roles in learning and memory in adult rodents. One recent study even suggests that these mechanisms may help explain why memory declines with age.

    Sweatt's team recently discovered that training mice to associate a certain location with a mild electric shock reduced DNA methylation in the hippocampus, a brain region crucial for memory formation. More specifically, the Bdnf gene was demethylated, boosting its activity, they reported in the 15 October 2008 issue of The Journal of Neuroscience. This gene encodes a growth factor that promotes new synaptic connections between neurons so that a memory is retained. Injecting into the hippocampus a drug that inhibits DNA demethylation prevented the increase in Bdnf activity after learning and weakened the mice's memory of the place where they had received a shock. The work is one of several clues that DNA methylation affects the formation and maintenance of memories.

    Other work has focused on histone acetylation, a chemical modification that unwinds DNA from protein spools called histones, thereby enabling gene activity. One of the most intriguing studies to date was led by Li-Huei Tsai and André Fischer at the Massachusetts Institute of Technology in Cambridge. In the 10 May 2007 issue of Nature, they reported that a drug that promotes histone acetylation improved learning and memory in a mouse model of Alzheimer's disease. When injected into the hippocampus, the drug even seemed to restore a forgotten memory of a location where the rodents had previously received a shock.

    Maintaining memories.

    Epigenetic mechanisms in the brain seem to play important roles in memory and may be an attractive target for drugs to stave off memory loss in old age.


    More recently, Fischer, who is now at the European Neuroscience Institute in Göttingen, Germany, has been investigating alterations in histone acetylation that occur naturally with age. In the 7 May issue of Science (p. 753), he and colleagues reported that adult mice, compared with juveniles, exhibit reduced histone acetylation and diminished activation of genes in the hippocampus related to learning and memory. As in the Alzheimer's mice, drugs that boosted histone acetylation improved the older mice's performance on tests of rodent cognition.

    Biotech and pharmaceutical companies are already exploring these drugs, called histone deacetylase inhibitors, for treating Alzheimer's disease, says Ottavio Arancio, a neuroscientist at Columbia University. Some of these drugs are already approved for treating cancer. However, because the drugs alter the activity of multiple genes, Arancio cautions that more work is needed to determine whether they can aid memory without causing serious side effects.

  10. Profile: Dolores Piperno

    In Archaeobotanist's Hands, Tiny Fossils Yield Big Answers

    1. Michael Balter

    Dolores Piperno's microscopic techniques overcame skeptics and revolutionized views of early agriculture in the Americas.

    Hypothesis tester.

    Dolores Piperno perfected microscopic methods to trace early agriculture.


    WASHINGTON, D.C.—Some scientific breakthroughs are the result of lucky breaks. And then there are those that come when researchers know exactly what they're looking for and where and how to find it.

    That's what happened last year, when archaeobotanist Dolores Piperno and her co-workers reported in the Proceedings of the National Academy of Sciences finding the earliest known maize (corn) deep in the Balsas River Valley of southern Mexico. Geneticists had traced the origins of maize to these lowlands and estimated that humans domesticated it about 9000 years ago. Yet their findings seemed to clash with the archaeological evidence: Radiocarbon dates put the earliest maize cobs, from the Mexican highlands, at only about 6200 years ago.

    But Piperno and her team weren't looking for cobs, which disintegrate in the tropical humidity. Instead, armed with techniques that Piperno had perfected over many years, they searched for stone tools to which traces of maize might have adhered. In a Balsas Valley rock shelter, they found just that. A trove of grinding stones, closely associated with charcoal radiocarbon dated to 8700 years ago, harbored microscopic fossils of maize. For many researchers, the findings reconciled the genetic and archaeological evidence and ended a long debate about whether maize had been domesticated in the highlands or the lowlands.

    “That's exactly how you're supposed to do science,” says archaeobotanist Deborah Pearsall of the University of Missouri, Columbia. “If you look at the corpus of Dolores's work, you see the power of a scientist who chooses her research topics on the basis of hypotheses she wants to test.”

    Along the way, Piperno, who has a joint appointment at the Smithsonian Institution's National Museum of Natural History here and its Tropical Research Institute (STRI) in Balboa, Panama, has persuaded many former skeptics to accept early dates for farming in the Americas. One such is archaeobiologist Bruce Smith, also of the Smithsonian. “I did have concerns regarding the dating” and identification of plant microfossils, Smith says. “But [microfossil] approaches have revolutionized archaeobotany, and Dolores is rewriting the early history of crop plants in the lower latitudes.”

    A key element behind Piperno's success, her admirers say, has been a determination to plant herself firmly on the scientific side of archaeology, a field that represents an often uneasy marriage between science and the humanities. “Her research in the Latin American tropics, more than that of anyone else, has put in place hard evidence where there had previously been mainly speculation” about agricultural origins, says archaeologist Dorian Fuller of University College London (UCL).

    Philly girl

    Piperno, 61, was born and raised in Philadelphia, the child of first-generation Italian immigrants. She first trained as a medical technologist, spending several years looking at blood samples under a microscope in a Philadelphia hospital. With diverse interests in botany and history, she started taking night courses in anthropology. Then, while visiting Jericho, one of the world's earliest known farming villages, as a tourist in 1973, Piperno had an epiphany: She suddenly decided to become an archaeologist. “I knew what I wanted to do,” she says.

    So in 1976, Piperno arrived at Anthony Ranere's archaeology lab at Temple University. Ranere says that at the time, his group's search for early agriculture in the Americas was stalled. He had found a number of prehistoric sites in Panama, some as early as 7000 years old, but little evidence for farming. “We weren't finding any macro plant remains, nor any pollen,” Ranere says. “I was kind of at a loss.” Piperno recalls saying to Ranere, “Tony, we need another way.”

    Then Ranere heard a talk by Pearsall, who had pioneered the use of plant microfossils called phytoliths to detect evidence of 5000-year-old maize cultivation in Ecuador. Phytoliths are formed when silica enters the plant from the soil and is deposited within its cells. Because they are mostly inorganic, they remain long after all other traces of plant tissues have vanished. They were well known to soil scientists but not to archaeologists. Pearsall contended that she could distinguish maize phytoliths from those of other plants, as well as from maize's presumed wild ancestor, teosinte, claims that were controversial at the time. “Dolores said, ‘This could be a tool for finding these things out,’” says Ranere, who still collaborates with Piperno.

    Piperno telephoned Pearsall to find out more about her pioneering techniques. “She called me up and said she was going to do my study all over again in Panama and see if it would work. … It's common in laboratory sciences to have someone replicate someone else's work but rare in archaeology,” says Pearsall.

    In 1979, Piperno traveled to STRI in Panama to do her doctoral work on phytoliths from nearby archaeological sites. She began a lifelong love affair with Panama, eventually landing a staff position at STRI, where she was in residence from 1988 to 2003 and still maintains an active lab today. “It's a wonderful place,” Piperno says. “I drive 2 hours to a dig site and then start working on the samples the next day in the lab.”

    In Panama, Piperno refined and expanded the phytolith technique, putting together a massive reference collection that today includes 2000 species and is used by archaeologists worldwide to classify phytoliths by species, genus, or family. “She defined much of the way we do phytolith research today,” says geoarchaeologist Arlene Rosen of UCL.

    At first, however, many archaeologists were reluctant to accept the early dates Piperno and Pearsall kept finding for maize domestication—at least 7000 years ago in Panama and 6000 years ago in Ecuador. “Dolores took a lot of flak,” Ranere says. “People said these dates couldn't be possible, it was contamination, things were filtering down into lower archaeological levels.”

    Some researchers say there was good reason to be skeptical. At first, Piperno and colleagues “would point to the occurrence of just a handful or less of these microfossils and then to a presumed associated radiocarbon assay and trumpet new antiquity” for domesticated plants, says archaeologist David Browman of Washington University in St. Louis. “I simply found this not to be adequate science.” Browman, along with anthropologist John Staller of the Field Museum in Chicago, Illinois, and others, says that plant roots, worms, and water can easily move the tiny microfossils up or down in archaeological layers. Staller also questions how reliably maize and its ancestor, teosinte, can be distinguished using microfossils. Lately, however, “Piperno and her associates have begun to be a bit more careful” and to put “appropriate controls in place,” Browman says.

    Macro and micro.

    Microfossils can distinguish maize (right hand and lower right) and its wild ancestor, teosinte (left hand and lower left).


    Piperno rejects such criticisms. “Soil scientists had [already] established that phytoliths don't migrate through the kinds of undisturbed soils and deposits that archaeobotanists rely on,” she says, adding that this was confirmed when improved radiocarbon techniques allowed her and others to date the tiny phytoliths directly (Science, 14 February 2003, p. 1054). And she insists that blind tests by numerous researchers have demonstrated that maize and teosinte can be told apart.

    For many researchers, the debate over maize origins is now over. “Microfossil evidence has demonstrated conclusively that maize domestication … took place nearly 9000 years ago and that it was a tropical lowland phenomenon,” asserts archaeologist Mary Pohl of Florida State University in Tallahassee.

    Telltale teeth

    Piperno's phytolith research had pushed back the earliest dates for domestication of maize and other plants such as squashes. But many root crops, such as manioc, sweet potato, and arrowroot, leave few identifiable phytoliths. By the late 1990s, Piperno realized that she needed a new tool to solve the entire riddle of agricultural origins in the Americas. Then she read a paper by Australian archaeologists who were identifying taro from its characteristic starch grains. Plants store energy in these microscopic particles, which take on characteristic shapes depending on the plant species that produces them. Remarkably, for reasons not entirely understood, they are preserved intact for thousands of years. Food scientists had been familiar with starch grains for decades, but archaeologists were only just beginning to discover them.

    To see if starch grains might help reveal the origins of root crops, Piperno and colleagues first looked at stone tools dated to as early as 7000 years ago from the Aguadulce rock shelter in Panama and found starch grains from manioc, yams, arrowroot, and maize. They confirmed these findings in 1997, when Piperno and Ranere re-excavated Aguadulce, unearthing more stone tools.

    “This was an entire group of plants that we would not know anything about if not for starch grains,” says Pearsall. “The majority of these finds were made by Dolores.”

    To date, Piperno has built a starch-grain reference collection of more than 400 plant species and used it to explore a variety of prehistoric diets. For example, she demonstrated that people 23,000 years ago at Ohalo II in Israel were grinding wild barley with a grindstone (Science, 29 June 2007, p. 1830). And working with archaeologist Tom Dillehay of Vanderbilt University in Nashville, Tennessee, she also found starch grains of cultivated squash, peanuts, and two genera of beans embedded in calculus on human teeth from northern Peru and dated as early as 9000 years ago.

    Recently, Piperno realized that this new method could help address another kind of question: How did prehistoric people prepare their plant foods? In a preliminary study, published last year in the Journal of Archaeological Science, Piperno and archaeologist Amanda Henry of George Washington University in Washington, D.C., cooked 10 domesticated crops, including wheat, rice, and lentils, using methods such as boiling, baking, parching, and fermenting, with or without grinding beforehand. They found that many combinations of plants and cooking methods resulted in characteristic starch grain shapes. They're now applying this work to starch grains from Neandertal teeth, among other projects.

    Piperno says she wouldn't be surprised if these new studies land her in the middle of controversies once again. “All I want to do is continue producing data,” she says. “Eventually, the data win out in the end.”

  11. History of Science

    Righting a 65-Year-Old Wrong

    1. Richard Stone

    The head of Indonesia's best known science institute was accused of contaminating vaccines and executed by Japanese forces in 1945; new evidence suggests he was innocent.

    Glory days.

    A bacteriology laboratory in the Eijkman Institute (inset), circa late 1930s.

    JAKARTA—In July 1944, at a labor camp on the outskirts of Jakarta, several hundred Indonesian forced laborers, or romusha, were injected with what was claimed to be a cholera-typhoid-dysentery vaccine produced at an institute run by the Japanese military, which then occupied Indonesia. Within a week, every last romusha was dead. A few months later the Kenpeitai, Japan's military police, arrested Achmad Mochtar, director of another research center—the Eijkman Institute—and most of his staff. “They were accused of deliberately contaminating the vaccines as an act of sabotage against the Japanese war effort,” says J. Kevin Baird, director of the Eijkman-Oxford Clinical Research Unit at the present-day Eijkman Institute of Molecular Biology here. On 3 July 1945, with the war in the Pacific nearing its end, Mochtar was beheaded at a Kenpeitai execution ground.

    But was Mochtar innocent? Over the past several months, Baird and Mochtar's successor at Eijkman, molecular biologist Sangkot Marzuki, have unearthed evidence that exculpates Mochtar, one of Indonesia's leading scientific lights in the early 20th century. They believe the deaths were the result of a medical experiment by the occupying forces. “It's now well documented that he was made a scapegoat,” says Iris Heidebrink, an archivist at the National Archive in The Hague, Netherlands. Baird and Marzuki plan to rehabilitate Mochtar, with a graveside memorial service on 3 July.

    Mochtar was an early star. Born in West Sumatra in 1892, he graduated from STOVIA medical school here in 1916 and took up an obligatory 2-year posting at a remote clinic in Panyabungan. There he crossed paths with W. A. P. Schüffner, who was carrying out what would become a classic study: the first longitudinal microscopic research on the malaria parasite. “Schüffner was Mochtar's mentor,” says Baird. Perhaps thanks to Schüffner's influence, the Dutch colonial administration in Indonesia—then known as the Netherlands East Indies—dispatched Mochtar to the University of Amsterdam for Ph.D. training.

    The seeds of Mochtar's demise may have been sown in Amsterdam. His thesis in 1927 largely disproved the widely held belief that leptospira cause yellow fever. The principal promoter of that view was Hideyo Noguchi, a Japanese microbiologist who in 1911 had shown that another kind of spirochete causes the neuropathy of tertiary syphilis. Mochtar returned to Indonesia and continued to work on leptospirosis before joining the country's top biomedical research facility, the Central Medical Laboratory here, in 1937.

    It was also here in the late 19th century that the lab's founding director, Christiaan Eijkman, put Indonesia on the scientific map. He won a Nobel Prize in medicine in 1929 for his revelation that a dietary deficiency causes beriberi, a finding that led to the discovery of vitamins. In 1938, the 50th anniversary of its founding, the lab was renamed the Eijkman Institute. Four years later, Japan invaded and immediately interned Eijkman's Dutch staff in camps. (The former Dutch director, W. K. Mertens, died of beriberi in confinement.) Mochtar was appointed director, and the Indonesian staff members carried on their work. Meanwhile, the Japanese Army commandeered the Pasteur Institute in Bandung, a regional center for vaccine production, and renamed it Boeki Kenkyujo.

    Some details of the romusha tragedy may never be known. It's unclear, for example, how many romusha were injected with the vaccine; estimates range as high as 1500. The laborers came down with symptoms consistent with tetanus, according to three doctors allowed into the camps. “The Japanese described it as a meningitis outbreak, but lumbar punctures ruled that out,” says Baird. The medical team, led by Bahder Djohan, rushed some 90 romusha who had not yet become extremely ill to Jakarta's central hospital for antitetanus therapy, but they all died. In the weeks that followed, Eijkman researchers analyzed postmortem tissue samples and concluded that the vaccine had been contaminated with tetanus toxin.

    In October 1944, the Kenpeitai arrested Mochtar and his colleagues, as well as central hospital doctors and staff members of the city health service who carried out the vaccinations at the behest of Boeki Kenkyujo. Prisoners later said they had been beaten, shocked, burned, and subjected to waterboarding. “It was brutal and sadistic,” says Baird, who with Marzuki interviewed one survivor and the families of several others, and tracked down written testimony. One doctor died while being tortured, and another died later in captivity.

    All the survivors from the Eijkman Institute were released in January 1945 except Mochtar. Later, three of the released colleagues independently reported that Mochtar had assured them of their release several days beforehand and indicated that he would remain imprisoned. “He may have struck a deal with his captors: He would confess to sabotaging the vaccine if his colleagues were released,” says Baird. Mochtar was beheaded on 3 July; according to a Japanese officer's diary, his body was then crushed by a steamroller and his remains were dumped into a mass grave. “Mochtar died a martyr, protecting his subordinates,” says Sjamsuhidajat Ronokusumo, chair of the Medical Commission of the Indonesian Academy of Sciences.

    Baird and Marzuki believe the Eijkman Institute had nothing to do with the vaccine. In a written account after the war, Djohan said he observed vials marked cholera-typhoid-dysentery vaccine produced by Boeki Kenkyujo—the preparation administered to the romusha. Boeki Kenkyujo was the only vaccine producer in Indonesia before, during, and immediately after the war. “The Eijkman Institute had no vaccine production capacity whatsoever,” says Baird.

    There may have been some payback as well for Mochtar having discredited Noguchi's conclusions about the cause of yellow fever. According to the book Tales of a Revolution by Abu Hanifah, an independence fighter and Mochtar's nephew, when the Kenpeitai searched Mochtar's home after his arrest, the only item they took was his thesis challenging Noguchi.

    Baird and Marzuki knew that a clearer picture of the circumstances surrounding the romusha vaccinations would be essential to clearing Mochtar's name. They had to uncover the motive. Conditions in the labor camps generally were deplorable; only one in five of the estimated several million romusha survived the experience. “Why vaccinate people you do not even bother to feed?” asks Marzuki.

    Searching for the truth.

    J. Kevin Baird (left) and Sangkot Marzuki contend that the romusha deaths were a sinister experiment.

    Heidebrink offers one possible explanation. The tragedy occurred at Klender, a romusha camp touted as a “model village.” “They were having trouble getting enough romusha, so to help with recruitment they built this village,” says Heidebrink. Families were relocated to Klender and after a few weeks, men were sent to work in the far reaches of the archipelago; wives and children would remain behind and, among other benefits, had access to health care, Heidebrink says. She believes that the romusha deaths were “a terrible mistake,” most likely stemming from contaminated vaccine. Indeed, Aukje Zuidema, a historian at the Dutch Institute for War Documentation in Amsterdam, notes in The Encyclopedia of Indonesia in the Pacific War (2010) that the wartime head of Jakarta's health service suspected that the typhus-cholera-dysentery vaccine contained “virulent bacilli” due to “less accurate production methods.”

    Marzuki and Baird are dubious. “Very simple guinea pig testing ensures that no contamination occurs in vaccine production runs, and a facility as experienced and sophisticated as Boeki Kenkyujo would not screw this up,” says Baird. They believe the tragedy at Klender was a medical experiment. “This was something more sinister than we expected when we first started to look into it,” Marzuki says. Getting to the truth is not easy: With the end of the war nigh, “the Japanese military destroyed all documents, as far as we know,” says Heidebrink. And many Dutch archives were destroyed during Indonesia's war for independence in the late 1940s.

    To rest in peace.

    A 3 July graveside service aims to rehabilitate Achmad Mochtar (inset), unjustly accused of killing hundreds of Indonesians during World War II.

    As circumstantial evidence, Marzuki points to another episode documented by an Australian war crimes tribunal in 1951. In January 1945, the fleet surgeon of the Japanese Navy's Second South Seas Expeditionary Fleet, Hirosato Nakamura, “was turning over every stone to get his hands on antitetanus plasma. He needed the stuff desperately,” Baird says. None could be spared by the Army facility in Bandung. An alternative would be tetanus vaccine, which uses inactivated toxin, or tetanus toxoid. “If you have tetanus toxoid, you don't need plasma; problem solved,” Baird says. Nakamura injected 17 condemned Indonesian prisoners with a tetanus vaccine whipped up by the military. The prisoners tolerated the vaccine well. To test efficacy, Nakamura injected them with pure tetanus toxin; 15 died, and the two survivors were executed. The Australian War Crimes Court convicted Nakamura of unlawful killing and sentenced him to 4 years in prison.

    That incident suggests a possible motive for the apparent vaccinations of the romusha: With antitetanus plasma and vaccine so scarce in the waning months of the war, the vaccinations could have been a challenge with tetanus toxin to gauge the efficacy of an experimental vaccine, Baird and Marzuki suggest. “The probability of accident may be considered so remote as to be virtually impossible,” Baird says.

    After the war, the Eijkman Institute lost its luster. “What happened to Mochtar and the subsequent trauma during the independence war affected the institute significantly,” Marzuki says. After a long decline with no effective leader, Eijkman was shuttered in 1965. Then in 1992, after watching how Singapore and other Asian neighbors were developing biotechnology industries, B. J. Habibie, the longtime science minister who later briefly served as the country's president, recruited Marzuki from Monash University in Melbourne to lead a revival of the Eijkman Institute of Molecular Biology. After a half-century hiatus, Mochtar had a successor.

    Indonesian scholars have long felt that Mochtar suffered an injustice. Marzuki says that for years he quietly compiled a dossier on his predecessor. “I knew it was important to clear his name,” he says. Baird was the catalyst. His interest in the case was sparked earlier this year, after he asked a student to write a history essay to help improve her English. The essay glossed over the war years. “She said she didn't know anything about it. I didn't either,” Baird says. He began reading up. Chatting with Marzuki, he learned about Mochtar. “I was hooked; I had to find out more,” Baird says.

    In addition to amassing a trove of documents and interview records, the duo early last month learned about the location of Mochtar's grave. That discovery inspired the memorial service, which will be a brief, low-key affair for a few dozen of Mochtar's family and Eijkman staff members. The quest for additional evidence to restore Mochtar's reputation—and reveal what really happened at Klender in 1944—will continue.
