News this Week

Science  25 Apr 2008:
Vol. 320, Issue 5875, pp. 432


    New Superconductors Propel Chinese Physicists to Forefront

    1. Adrian Cho

    Hai-Hu Wen went to work as soon as he heard the news. In late February, the 44-year-old physicist at the Institute of Physics (IOP) at the Chinese Academy of Sciences in Beijing learned from a colleague that researchers in Japan had discovered a new superconductor that carried electricity without resistance at a relatively balmy temperature of 26 kelvin. He immediately looked up the paper—via Google—and set his group to work. “We ordered the materials the same day,” Wen says. “Within 3 or 4 days, we had the first samples.”


    Planes of iron (red) and arsenic (gold) interleave with those of oxygen (white) and lanthanum or other elements (blue).

    CREDIT: Y. KAMIHARA ET AL., J. AM. CHEM. SOC. 130 (2008)

    Wen's group is one of several in China that, building on the discovery by materials scientist Hideo Hosono of the Tokyo Institute of Technology, has cranked out a new family of high-temperature superconductors, materials that conduct electricity without any resistance at inexplicably high temperatures. Physicists around the world are hailing the discovery of the new iron-and-arsenic compounds as a major advance; the only other known high-temperature superconductors are the copper-and-oxygen compounds, or cuprates, discovered in 1986. Those older materials netted a Nobel Prize and ignited a firestorm of research, but physicists still don't agree about how they work. High-temperature superconductivity remains the biggest mystery in condensed matter physics, and some researchers hope the new materials will help solve it.

    “It's possible that these materials will provide a cleaner system to work with, and suddenly [the physics of] the cuprates will become clearer,” Wen says. But Philip Anderson, a theorist at Princeton University, says the new superconductors would be more important if they don't work like the old ones: “If it's really a new mechanism, God knows where it will go.”

    The torrent of results from China also signals the country's emergence as a power in condensed matter physics, many say. “What surprises me—probably it shouldn't—is the number of good papers coming out of Beijing,” says Peter Hirschfeld, a theorist at the University of Florida, Gainesville. “They've really jumped on this.”

    Superconductivity is nature's best parlor trick. Electrons flowing in an ordinary metal lose energy as they ricochet off defects in crystalline material. In superconductors, the electrons experience no such drag. Below a certain temperature, they form pairs, and deflecting an electron then requires breaking a pair. At low temperatures, there isn't enough energy around to do that, so the duos waltz along unimpeded.

    What holds the negatively charged electrons together? In an ordinary superconductor, such as niobium chilled below 9.3 kelvin, the “glue” is supplied by vibrations rippling through the material. When one electron moves, it sets off a vibration that drags the second electron in its wake. Most physicists, however, think this cannot explain the cuprates, which work at temperatures as high as 138 kelvin. Each cuprate compound contains planes of oxygen and copper ions. Electrons hop from copper ion to copper ion and somehow pair, although physicists do not agree on how that happens.

    Like the cuprates, the new materials are layered, containing planes of iron and arsenic along which the electrons presumably glide (see figure, below). Between the planes lie elements such as lanthanum, cerium, or samarium mixed with oxygen and fluorine. On 23 February, Hosono and colleagues reported in the Journal of the American Chemical Society that lanthanum oxygen fluorine iron arsenide (LaO1-xFxFeAs) becomes a superconductor at 26 kelvin. That was a surprise, Hosono says, because iron is magnetic, and magnetism and superconductivity generally don't mix.

    Warming trend.

    Physicists have quickly bumped up the compounds' highest critical temperature. Some say things may get hotter still if researchers can find crystal structures that pack more iron-and-arsenic planes into a given volume.

    Then Chinese researchers jumped in. On 25 March, Xianhui Chen of the University of Science and Technology of China in Hefei reported on the arXiv preprint server that samarium oxygen fluorine iron arsenide (SmO1-xFxFeAs) superconducts at 43 kelvin. Four days later, Zhong-Xian Zhao of IOP reported on the server that praseodymium oxygen fluorine iron arsenide (PrO1-xFxFeAs) has a “critical temperature” of 52 kelvin. On 13 April, Zhao's team showed that the samarium compound becomes a superconductor at 55 kelvin if it is grown under pressure. Calculations suggest that vibrations provide too little pull to produce such high critical temperatures.

    At least four different Chinese groups, including three working independently at IOP, have synthesized new compounds and posted results on the arXiv. IOP's Nan Lin Wang says his team was already working on lanthanum oxygen iron arsenide when word came that adding fluorine was key. “We had the materials, the glove boxes, everything was ready,” he says. China also has a bumper crop of young researchers and is investing heavily in basic research, he says. IOP has 288 staff members in 50 research groups and a $25 million budget, says Director Yupeng Wang: “We get about 10 new members each year and will keep that pace in the near future.” However, he adds, “funding for fundamental research is still quite low compared to Western countries.”

    The tsunami of results from China has heightened American researchers' worries about the health of condensed matter physics in their country. “What is striking is not only that it's coming out of China but that it's not coming out of the United States,” says Steven Kivelson, a theorist at Stanford University in Palo Alto, California. He notes that a U.S. National Research Council report released in June warned that the country is in danger of losing its edge in condensed matter physics as funding stagnates.

    All agree it's too early to tell exactly how the new materials work. “The community has to accumulate more data on high-quality samples,” Nan Lin Wang says. Don't be surprised if the samples and data come from China first.


    When Hobbits (Slowly) Walked the Earth

    1. Elizabeth Culotta

    COLUMBUS, OHIO—Fans of J.R.R. Tolkien know that hobbits walked shoeless on large, hairy feet. Now anthropologists have gotten an eagerly awaited glimpse of the feet of their own hobbit, a meter-tall skeleton from the island of Flores in Indonesia, and the results are almost Middle-earthly. When the discovery was announced in 2004, the world was riveted by the hobbit's astonishingly small brain: 400 cubic centimeters, about the size of a chimp's. At the American Association of Physical Anthropologists meeting here (9–12 April), anatomist William Jungers of Stony Brook University in New York revealed that the most controversial member of the human family was strange right down to its soles.

    The partial skeleton of the hobbit, a specimen known as LB1 from Liang Bua Cave on Flores, had large, flat feet and a high-stepping gait unlike that of living people; it would have been a poor runner, Jungers said. He argued that the anatomy links the 18,000-year-old hobbit with 2-million- to 3-million-year-old human ancestors from Africa and may offer “a window into a primitive bipedal foot.”

    The data-rich presentation is part of “a continued drip of [hobbit] analyses done responsibly and carefully” that are illuminating the mystery of what the discovery team considers to be a new species, Homo floresiensis, said paleoanthropologist Bernard Wood of George Washington University (GWU) in Washington, D.C. But several hypotheses about the origin of the hobbit are still in play. Some researchers argue that the creature emerged by evolutionary dwarfing from a more recent ancestor. Others note that although discoverers say they have remains of about 12 individuals, so far most of the distinctive anatomy has been described only in LB1, leaving open the possibility that the specimen is a diseased H. sapiens. Whatever LB1 is, the new analyses are bringing its anatomy into sharper focus. “We've moved beyond the brain,” says Leslie Aiello, director of the Wenner-Gren Foundation for Anthropological Research in New York City.

    Not built for speed.

    Foot bones of the hobbit skeleton suggest that it walked differently from the way we do and was a poor runner.


    In Columbus, Jungers showed photos and measurements of the nearly complete left and partial right foot bones of LB1 to a ballroom overflowing with what Aiello calls “hobbit groupies,” who flocked to every talk on the subject. Jungers reported that LB1's foot was a whopping 70% as long as its very short femur; living people's feet are only about 55% of femoral length. The “big” toe was “incredibly short,” whereas the other toes were quite long, and the shape of the bones suggests that the foot was not arched.

    LB1 would have been a poor runner with a high-stepping gait, according to Jungers, rather like a living person wearing oversized shoes. “Don't bet on Homo floresiensis to win a marathon,” he said. But the big toe was also stiff, as it is in humans, and was aligned so that the hobbit could “toe-off” as we do when taking a step.

    Jungers and his co-authors also compared the shape of the hobbit foot bones with a large database of human and ape foot bones. LB1 sorted not with our species but with African hominins—ancient members of the human lineage—such as H. habilis and even the primitive Australopithecus afarensis, which is known from about 2 million to 3 million years ago. LB1's femur also resembles the femurs of early hominins when analyzed according to eight standard measurements, Brian Richmond of GWU reported in a separate talk. “It shows just how very primitive the morphology of H. floresiensis is,” he said.

    All this fits with previous data on other parts of the skeleton, Aiello says. Analyses of the jaw, shoulder, wrist (Science, 21 September 2007, p. 1743), and most recently, the cranium (ScienceNOW, 17 March) put LB1 with H. erectus or even earlier African hominins. “It's 18,000 years old, but it seems to correspond to a grade of hominin that ceased to walk on the Earth a few million years ago,” says Wood. “How it got there and managed to persist—that's clearly a challenge to explain.”

    One explanation is that the hobbit stems from a very ancient migration of a primitive hominin out of Africa. H. habilis, for example, had a short stature and a smaller brain than later hominins did, so it's easier to derive something hobbitlike from such an ancestor, Jungers said.

    But there's no other evidence anywhere in the world of such an early exodus out of Africa. Such a hypothesis requires “special pleading,” says paleoanthropologist Russell Ciochon of the University of Iowa, Iowa City. Australopithecine expert William Kimbel of the Institute of Human Origins at Arizona State University in Tempe agrees. “I'm not ready to go back to the Early Pleistocene of Africa for an ancestor,” he says. “That spans a lot of time and space, and I'm not comfortable with those gaps.” Ciochon thinks the evidence suggests that a more recent ancestor, perhaps H. erectus itself, shrank to hobbit size in an evolutionary dwarfing process.


    Researchers continue to puzzle over why the hobbit's cranium is so tiny.


    For other researchers, the whole debate is moot because they view LB1 as simply an aberrant H. sapiens. The foot “is such a mixture of characters, requiring very convoluted evolutionary and biomechanical explanations, that a developmental anomaly in a pathological human seems much more parsimonious,” says paleopathologist Maciej Henneberg of the University of Adelaide in Australia, a longtime hobbit critic.

    A few recent papers have expressed such skepticism. The authors of a controversial report on small-bodied H. sapiens skeletons on the island of Palau argue that LB1 could have been a small-bodied H. sapiens, too (ScienceNOW, 11 March). And several papers have attributed LB1's peculiarities to microcephaly, Laron syndrome, and most recently, cretinism, in a procession Dean Falk of Florida State University in Tallahassee calls “disease of the week.” In her talk, Falk added more data to her rebuttal of Laron syndrome (Science, 10 August 2007, p. 740); a cretinism rebuttal is in the works.

    At the Columbus meeting, the pathology scenario took a blow from an unexpected source, in a talk by mammalian tooth-development expert Jukka Jernvall of the University of Helsinki in Finland. Jernvall has been working for years on a model of how teeth grow and develop, finding that the first molar sets the template for the size of the second and third. This is true in living people and also for ancient hominins, although the precise relationship among the molars varies somewhat among species.

    If development is disrupted, as by an illness, the molar relationship falls apart, says Jernvall. For example, in pituitary dwarfs—one of the syndromes suggested for LB1—the second molar is small as predicted, but the third molar usually doesn't appear at all.

    And LB1? Although small overall, it retains the tooth proportions typical of larger bodied hominins, as does a second hobbit jaw, LB6, Jernvall says. “If you look at it from a tooth-development point of view, the drop in size looks like an evolutionary process, not a medical condition,” he says.

    Critics were unswayed, saying that even if one kind of pathology has been refuted, hundreds of others remain possible. And several experts would prefer not to discuss the whole issue, saying that they're still taking a wait-and-see approach. Given the wildly diverging opinions on the hobbit, “Somebody's going to take a big fall here,” says paleoanthropologist C. Owen Lovejoy of Kent State University in Ohio. He's waiting for DNA from LB1 or for a second skull. On that, at least, groupies and skeptics agree: All are hoping for another skull when the discovery team returns to dig at Liang Bua this summer.


    Two Geologic Clocks Finally Keeping the Same Time

    1. Richard A. Kerr

    First the bedroom clock reassures you that you're right on schedule. Moments later, the kitchen clock tells you that you're running minutes behind. If you find that annoying, pity the geochronologists. For decades, two of their workhorse timepieces—isotopic clocks ticking to the steady decay of two different radioactive elements—have been disagreeing by millions of years.

    Now geochronologists have recalibrated one of the clocks, bringing it into agreement with the other. They've tried it before, but this time it looks like the fix will stick. “This is a huge step forward,” says geochronologist Mike Villeneuve of the Geological Survey of Canada in Ottawa. “You'd like to see it reproduced, but it looks very solid to me.” The synchronization of clocks lends more support to a link between huge volcanic eruptions and mass extinctions.

    The clocks in question are argon-argon radiometric dating, based on the decay of potassium-40 to argon-40, and uranium-lead dating, based on the decay of uranium-238 to lead-206. Both techniques have been yielding increasingly precise ages, but argon-argon dating was giving slightly younger ages for the same rocks than the uranium-lead technique. Researchers suspected that they had gotten the decay rate of potassium-40 wrong, but they couldn't really say.
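    Both clocks rest on the same first-order decay law, so the mismatch described above can be illustrated with a short sketch (a hypothetical calculation for illustration, not the geochronologists' actual code): an age follows from t = ln(1 + D/P)/λ, where D/P is the daughter-to-parent isotope ratio and λ = ln(2)/half-life, and a small error in the assumed decay constant shifts every computed age by roughly the same fraction.

```python
import math

def isotopic_age(parent, daughter, half_life):
    """Age (same units as half_life) from a parent->daughter pair:
    t = ln(1 + D/P) / lambda, with lambda = ln(2) / half_life."""
    lam = math.log(2) / half_life
    return math.log(1 + daughter / parent) / lam

# Round-trip check using the U-238 half-life (~4.468 billion years):
half_life = 4.468e9
lam = math.log(2) / half_life
t_true = 252.5e6                      # an age in years, chosen for illustration
ratio = math.exp(lam * t_true) - 1    # the daughter/parent ratio this age implies
age = isotopic_age(1.0, ratio, half_life)

# A 0.65% error in the assumed half-life biases the computed age by ~0.65%,
# which is the size of the argon-argon/uranium-lead disagreement in the text:
age_biased = isotopic_age(1.0, ratio, half_life * 1.0065)
```

    Because the age is inversely proportional to the decay constant, the biased age comes out exactly 0.65% older, mirroring how a misjudged potassium-40 decay rate would make all argon-argon ages slightly too young.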

    So isotope geochronologists looked around for an absolute measure of passing time to which they could tie their argon-argon ages. They settled on orbital variations, the regular nodding and wobbling of Earth's rotation axis and the changing elongation of its orbit. On page 500, Klaudia Kuiper, now at Vrije Universiteit Amsterdam, and colleagues from Utrecht University, the Netherlands, and the Berkeley Geochronology Center in California report on the latest linking of astronomical variations and argon-argon dating.

    They found their chronological connection in 6-million- to 7-million-year-old layered rocks exposed in northern Morocco. Back then, the Melilla Basin was undersea. Orbitally induced climate variations translated Earth's rhythmic orbital variations into marine sediment layers of alternating mineral composition. Astrodynamicists had calculated the subtleties of orbital-variation timing over the ages. That made the Melilla layering a time scale readable with an accuracy of 10,000 years.

    At random intervals over the same time period, nearby volcanoes were peppering the sea with ash containing large grains of the mineral sanidine, ideal for high-precision argon-argon dating. The group dated the sanidine grains in a particular layer of ash by noting the layer's position relative to astronomically dated sediment layers. And they measured how far potassium-40 decay had gone in the layer's sanidine grains. Then they compared their measurements with analyses of sanidine in a rock known as the Fish Canyon Tuff, which has long been used as the standard for argon-argon dating.

    In effect, Kuiper and her colleagues linked the poorly dated Fish Canyon standard to the metronomic astronomical time scale via the volcanic ash of the Melilla Basin. By this astronomical recalibration, the Fish Canyon standard is 0.65% older than had been thought. Recalculate previous argon-argon ages using the standard's new age, and everything ever dated using the technique becomes 0.65% older.
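    To first order, the recalibration is a uniform 0.65% stretch applied to every published argon-argon age. A few lines make the arithmetic concrete (a hypothetical sketch; the team's actual recalculation also involves rounding and small age-dependent terms, which is why the published dates differ slightly from a bare multiplication):

```python
def recalibrate(ar_ar_age_ma, shift=0.0065):
    """Apply the ~0.65% upward shift to a published Ar-Ar age (in Ma)."""
    return ar_ar_age_ma * (1 + shift)

# The end-Cretaceous impact: 65.5 Ma stretches to ~65.93 Ma
# (the published recalculated date is 66.0 Ma).
impact = recalibrate(65.5)
# The Permian-Triassic extinction: 251.0 Ma stretches to ~252.63 Ma
# (the published recalculated date is 252.5 Ma).
permian = recalibrate(251.0)
```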

    The new calibration “gives us a much better hook to hang our ages on,” says Villeneuve. “It's a very nice piece of work,” agrees geochronologist Samuel Bowring of the Massachusetts Institute of Technology in Cambridge. “It brings us closer to agreement between argon-argon and uranium-lead, [although] we need to see a lot more of these” studies.

    Tick, tick, tick.

    The rhythmic layering of sediments in this seaside cliff at Zumaia, Spain, reflects precisely timed variations in Earth's orbit used to calibrate an isotopic time scale.


    Using their new calibration, Kuiper and her colleagues recalculate some key dates in geologic history. Going back in time, they move the great impact 65.5 million years ago and the accompanying extinction of the dinosaurs to 66.0 million years ago. That shift matters particularly to astronomical daters because they use the impact as a benchmark when working farther back in time. The argon-argon age of the mother of all mass extinctions—the Permian-Triassic—moves from 251.0 million years ago to 252.5 million years ago. The new date puts it precisely at the group's preferred uranium-lead age for the Siberian Traps eruptions, the mother of all volcanic outpourings. That supports the claim that the eruptions could have triggered the extinction (Science, 17 September 2004, p. 1705).

    Older argon-argon ages would likewise make another of the big-three mass extinctions—the Triassic-Jurassic—coincide precisely with the great volcanic outpourings of the central Atlantic magmatic province. That's according to a new uranium-lead age that places the extinction at 201.6 million years ago, as published by geochronologist Urs Schaltegger of the University of Geneva, Switzerland, and colleagues in the 1 March issue of Earth and Planetary Science Letters. So a fraction of a percent multiplied by geologic time can make a difference.


    Europe Takes Guesswork Out of Site Selection

    1. Daniel Clery

    Picking a home for a large international research facility is usually fraught with tension. Political alliances hold sway over technical considerations, and deals are often struck during whispered conversations in the corridors of government (Science, 19 October 2007, p. 380). But hoping to teach politicians how to make a decision that is best for the science, the three cities vying to host a €1 billion neutron beam research center called the European Spallation Source (ESS) will this week submit bids to a specially created, independent panel of “wise people.” “Rational criteria are better than the handshake of two powerful people,” says Colin Carlile of Lund University in Sweden, director of the ESS-Scandinavia consortium.

    The assessment panel won't choose a winner or rank the candidate sites, but it does represent the start of an effort for European science to avoid the horse-trading that takes place now whenever a proposed facility needs a home. “Europe as a whole should have a mechanism to choose sites,” says Peter Allenspach of Switzerland's Paul Scherrer Institute, chair of the European Neutron Scattering Association.

    Fifteen years ago, Europe was preeminent in the science of neutron scattering, with the world's top two neutron sources sited in France and the United Kingdom. Neutrons are subtle probes that penetrate to the heart of materials and reveal where atoms are and what they do. Neutron beams are used by, among others, physicists, materials scientists, crystallographers, and biologists. Producing them requires either a nuclear reactor or a particle accelerator to fire a beam of protons at a fixed target, knocking out neutrons—a process known as spallation.

    European neutron researchers had a plan to keep their lead: They would build a next-generation spallation source so big it would need to be an international facility. The design and cost estimate were finished by 2002, but European politicians never became convinced a new facility was needed. The project foundered and its central project office closed in 2003. Meanwhile, the United States built the Spallation Neutron Source in Tennessee, and Japan built a source as part of its nearly complete J-PARC facility at Tokai.

    ESS was given new impetus, however, by the European Strategy Forum on Research Infrastructures (ESFRI), a body tasked by the European Union with drawing up a list of planned large facilities that E.U. nations should work together on. ESS was one of 35 projects in the first ESFRI road map released in 2006 (Science, 27 October 2006, p. 580). Three cities were soon vying to host ESS—Lund in Sweden, Bilbao in Spain, and Debrecen in Hungary—and seeking allies. The Lund team is building an alliance of five Scandinavian nations, the three Baltic states, and Poland. Debrecen is working on its central European neighbors (including Poland) as well as Russia. And the Debrecen and Bilbao teams pledged to support each other should one of them have a face-off with Lund.


    After 15 years of planning, researchers are ready to build the European Spallation Source. The three candidate sites must gather support from other nonbidding governments to spread the cost of construction and operation.


    ESFRI, which last year was developing a general strategy for deciding where to site international facilities, saw a chance to help. None of the candidates was from one of Europe's science powerhouses, and none was gathering support from other countries fast enough. “It was a race, but no one had defined the distance or how long it should run. The winner would emerge when the other two were exhausted,” Carlile says.

    In February, ESFRI sent a 50-page questionnaire to the candidates, asking about issues such as nearby research centers, local infrastructure, economics of the bid, and political support within the host country and its neighbors. The bidders were due to submit their answers on 25 April. An ESFRI working group also drew up criteria for judging the sites and a long list of potential assessors—authoritative neutral figures with science policy backgrounds or experience constructing or building large user facilities. All three cities were asked to approve the criteria and assessors. Now that the questionnaires are in, the ESFRI group will select three to five of the evaluators to analyze the answers and visit the sites. “The wise people will express their opinions on how well the criteria are met,” says Carlo Rizzuto, president of Italy's Trieste Synchrotron and ESFRI chair.

    What happens after ESFRI receives those opinions in September is far from clear. The horse-trading between potential partner states, now armed with more detailed briefs, will likely begin again. “The report will give the criteria, but politicians will put their own weighting on those criteria,” says John Wood of Imperial College London, former ESFRI chair. Many involved hope the ESFRI assessment will nonetheless speed the process, allowing a site decision by an E.U. meeting on research infrastructures at the end of this year.

    Although ESS may be entering its endgame, other European facilities, including a high-powered laser called the Extreme Light Infrastructure, are just starting to look for a home. Will the ESS strategy help those contests? Most scientists agree that the ESFRI process will be beneficial but say that it would be better if governments ceded some authority to a pan-European body that plans out, and even funds, the region's research facilities. A few fields already have such a body, such as ESA for space science, says Rizzuto, but in others “there is not yet an institution capable of strategic planning.” Some hope that the newly formed European Research Council could take on that role.

    Neutron researchers are just looking forward to a new place to call home. Says Carlile: “This has been on the go for 15 years now. I want to refocus our energy onto the project itself.”


    Rebuilding the Injured Warrior

    1. Constance Holden

    In an initiative to speed treatments for wounded soldiers, the U.S. Department of Defense (DOD) is entering the fast-growing field of regenerative medicine. Over the next 5 years, at least $250 million will be funneled into two university-led consortia that compose the new Armed Forces Institute of Regenerative Medicine (AFIRM), DOD announced last week.

    Bridging the gap.

    Defective rat skull (top) shows bone formation 12 weeks after implant of scaffold with bone growth factors (bottom).


    AFIRM will focus on regrowing severed fingers, recreating shattered bones, reconstructing mutilated faces, and covering burn victims with genetically matched skin. “We hope to get products into patients within 5 years,” says tissue engineer Anthony Atala of Wake Forest University Baptist Medical Center in Winston-Salem, North Carolina, co-director of one consortium led by Wake Forest and the University of Pittsburgh in Pennsylvania.

    Last year, Atala reported isolating from amniotic fluid highly versatile stem cells (Science, 12 January 2007, p. 170), which are likely to figure prominently in the new technologies. Embryonic stem cells or their equivalents aren't in the mix here. Rather, says Atala, the focus is on getting rapidly to the clinic, using cells that can get quick Food and Drug Administration approval.

    DOD decided 2 years ago that it was time to make a major commitment to regenerative medicine treatments, largely at the instigation of dental researcher Robert Vandre, director of combat casualty care research at the U.S. Army Medical Research and Materiel Command at Fort Detrick, Maryland. Vandre says he originally managed to round up a commitment for $8.5 million a year, including $500,000 a year from the U.S. National Institutes of Health (NIH). Then after receiving competitive proposals for a single consortium, he got a call “out of the blue” from the White House, which ended up telling DOD to double the funding from $42.5 million to $85 million over 5 years. That made it possible to fund two consortia that had come in neck-and-neck in the competition. Vandre, who is AFIRM's DOD manager, says the 5-year total should top $265 million, including $80 million in public and private funds to match DOD's input and some $100 million in NIH grants already held by researchers in the consortia's 28 research groups.

    A top priority will be engineered skin that can be quickly grown to treat burn victims. Atala points out that at present there is “no real skin replacement” because skin grafts from cadavers are prone to rejection; supply is also short. One of the earliest fruits of the venture may be a method to grow a patient's own skin rapidly enough to use as a graft for life-threatening burns. Ultimately, says chemist Joachim Kohn of Rutgers University in New Brunswick, New Jersey, co-head of the other consortium, led by Rutgers and the Cleveland Clinic in Ohio, “you could take a skin sample from every soldier in danger zones and store it” so that the moment a soldier is injured, people back at the Army medical center in San Antonio, Texas, could start growing a graft.

    Another major focus is on “compartment syndrome”: internal muscle trauma from blast or other injuries that results in rapid swelling of arm or leg tissues so they compress nerves and blood vessels. If not treated swiftly, muscles die, and amputation is often necessary, says Kohn, who adds that so far the Iraq war has resulted in about 800 amputations. Other high priorities are wound healing, cranial-facial reconstruction, and regrowing severed fingers and toes.

    “I'm fighting the perception that we will regrow limbs and heads and arms,” says Kohn. Rather, “what we want to do is take our ability to grow 2 inches of bone and extend it into 6 inches of bone. … We are pushing the border of where limbs can be salvaged further and further out.”


    Bypassing Medicine to Treat Diabetes

    1. Jennifer Couzin

    By altering the gut's production of hormones, gastric bypass surgery may be able to eliminate type 2 diabetes. But scientists worry that this radical operation can also cause dangerously low blood sugar.


    In 1980, bariatric surgeon Walter Pories of East Carolina University School of Medicine in Greenville, North Carolina, performed his first gastric bypass surgery on an obese patient with type 2 diabetes, then a second, then a third. He noticed right away that the patients no longer needed insulin. Family doctors confirmed that what Pories had considered a transient phenomenon seemed like something more: Each person's diabetes had disappeared, even before they'd lost much weight. Pories was convinced that the doctors had erred. “I said, ‘You guys don't know how to work up diabetes. Diabetes is an incurable disease.’” After the fourth patient, Pories and an endocrinologist took matters into their own hands. “We marched right down to the lab, very self-righteous,” and accused the lab employees of incorrectly measuring blood sugar levels. (“If you're a doctor, you like to blame other people,” Pories explains.)

    As the number of patients with vanishing diabetes mounted, Pories recognized that the effect was real. Still, the concept that diabetes could be reversed surgically was so outlandish, he says, that “we didn't dare publish” the results. Instead, Pories began tracking his patients. In 1995, he reported in the Annals of Surgery that among 146 people with diabetes who had had the surgery in the past 14 years, 121, or 83%, had quickly become diabetes-free. The result was far superior to that achieved by any other treatment at the time—or now.

    “The surgical world noted that paper,” says endocrinologist David Cummings of the University of Washington, Seattle. But it took “another 10 years for the rest of us” to catch up, he says. Now, endocrinologists are beginning to pay close attention to the effects of gastric bypass surgery, which had long been a backwater of medicine, in part because obesity was not considered a genuine disease.

    As America and other countries confront surging rates of obesity, with few treatments that shrink the widest waistlines, the surgery's popularity is soaring. The most common form in the United States, Roux-en-Y gastric bypass, was performed on more than 120,000 people in 2007, according to estimates. That's almost double the number 5 years ago. Doctors often learn from their patients, and the hundreds of thousands of people who have had gastric bypass surgery are now prompting an overhaul in our understanding of metabolism and diabetes. Scientists are also going back to animals to figure out the impacts of the procedure. They are finding that the surgery's rerouting of the intestines and closing off of much of the stomach appears to have drastic effects on gut hormones and disease, independent of the weight loss that accompanies it.

    These effects can also have dire consequences. Beginning in 2000, F. John Service, an endocrinologist at the Mayo Clinic in Rochester, Minnesota, began seeing patients with some alarming symptoms: confusion, abnormal behavior, seizures, and unconsciousness. In each case, the culprit was a low level of blood sugar that struck after eating, precisely when blood sugar rises in healthy people. Every patient, it turned out, had undergone gastric bypass surgery months or years earlier. The Mayo Clinic now sees at least two new patients a month with this unusual hypoglycemia disorder, which was the topic of a meeting at the Joslin Diabetes Center in Boston earlier this month.* As a last resort, surgeons have removed part or even all of the pancreas, which churns out insulin, from many of these patients.

    Unintended effects.

    Roux-en-Y gastric bypass surgery reduces the stomach to a fraction of its original size and skips past part of the small intestine, which causes profound metabolic changes in the gut.


    How to decipher and harness the surgery's metabolic effects is prompting much debate. On the one hand, some surgeons are already operating on less obese people with diabetes as a treatment for that disease. But others would prefer to wait until the science catches up, especially because the surgery isn't harmless, with a death rate ranging from 0.1% to 2%, depending on where it's performed. “Surgeons have for too long acted in a vacuum. … Most of them aren't thinking about the mechanisms of what they're doing,” says John Dixon, an obesity researcher at Monash University in Melbourne, Australia. “But we need to dissect out” what's happening in these patients.

    Early clues

    Gastric bypass was inspired by similar intestinal operations employed for ulcers and gastric cancer that induced dramatic and enduring weight loss and were reported to reverse diabetes as far back as the 1950s. “As soon as we started doing the operation, we were aware of the fact that before the patients got out of the hospital, they no longer needed insulin,” says Edward Mason, a retired surgeon from the University of Iowa in Iowa City who developed the procedure for weight loss. Most current forms of gastric bypass, and Mason's original operation, have one element in common: A newly created exit from the stomach is reconnected to a piece of small intestine a few feet lower down, “bypassing” the upper portion of the small intestine. In addition, the stomach is drastically restricted, by about 95%. (Another weight-loss surgery, gastric banding, seals off most of the stomach but leaves the intestines intact and is not considered gastric bypass.) Today, most gastric-bypass patients shed 30% of their body weight and keep it off.

    Beyond fat.

    From the early days, doctors recognized that for many patients, diabetes vanished after gastric bypass.


    Mason, now 87 years old, recalls that he and others explained away the reversals of type 2 diabetes because their patients weren't eating right after surgery, which would lower blood glucose levels and, in turn, their need for insulin. (The surgery does not have the same effect on type 1 diabetes, in which afflicted individuals cannot produce insulin.) But Pories's study years later slowly began to convince people that something more fundamental was occurring.

    Almost a decade later, a second report strengthened the case. In 2003, Philip Schauer, a bariatric surgeon now at the Cleveland Clinic in Ohio, published follow-up data from 1160 obese people who in the preceding 5 years had undergone Roux-en-Y gastric bypass, which gets its name from a French surgeon who developed the technique. Of the 191 people with diabetes or impaired glucose metabolism who could be tracked down, 83%, precisely the figure reported by Pories, no longer had the problem.

    Impressive as these success rates are, it's not yet clear whether they will hold up in clinical trials. These are “typically the observations of a single surgeon or group of surgeons” and “very anecdotal,” says David D'Alessio, an endocrinologist at the University of Cincinnati in Ohio.

    Getting at biology

    After years on the sidelines, science is slowly making inroads into gastric bypass surgery. “The development of the field was not based on real research,” says Francesco Rubino, a bariatric surgeon at Weill Cornell Medical College in New York City. “That has tarnished the field somewhat.”

    Recently, however, a growing number of studies suggest that the surgery has a profound effect on gut hormones, which could explain its impact on appetite, diabetes, and the low blood sugar that's turning up. One of the first clues emerged in 2002, when Cummings looked into a well-recognized oddity. Gastric bypass restricts the stomach, forcing people to eat smaller meals. One might then expect “that people would be compelled to sip milkshakes all day long,” says Cummings. That's not what happens. Many move away from calorie-dense foods altogether.

    Curious, Cummings began examining levels of ghrelin, a hormone produced mainly by the stomach that stimulates appetite. Most people have peaks and valleys in ghrelin levels throughout the day as they consume meals and then become hungry again. In those who've had gastric bypass, Cummings found, ghrelin levels in blood were low and changed little all day, suggesting that something about the surgery dampens ghrelin production and hence appetite.

    The role this plays in diabetes resolution has not been firmed up, and researchers are now more closely examining how gastric bypass affects other hormones. Rubino's work, for example, has focused on the intestines, which produce a different suite of chemicals and hormones from those the stomach churns out. In 1999, Rubino turned to rats to examine whether the surgery's effects on diabetes were due to calorie restriction and weight loss alone. He tried to tease apart distinct features of his “patients”—the rats, in this case—and different features of surgery. When performed on lean animals with type 2 diabetes, gastric bypass had the same positive effect on diabetes as in obese animals, suggesting that weight loss was largely irrelevant. Furthermore, Rubino performed the intestinal bypass portion of the operation, skipping past the duodenum and the jejunum that link up to the stomach, but leaving the stomach intact. There was a “direct antidiabetic effect,” he says.

    Rubino's rat work dovetails with a popular theory: that a hormone produced by the intestines called glucagon-like peptide 1 (GLP-1) lies behind the vanishing diabetes in many gastric bypass patients—and may be linked to the hypoglycemia that later strikes others, most of whom did not have diabetes before the surgery. The GLP-1 theory is that the small intestine goes into overdrive making hormones in gastric bypass patients. Because of the surgical rerouting, food “empties directly into this part of the intestine that it normally wouldn't see at that stage” of digestion, says Mary-Elizabeth Patti, an endocrinologist at the Joslin Diabetes Center.

    In healthy people, GLP-1 has a variety of effects, including increasing insulin secretion, and a diabetes drug on the market, called Byetta, mimics the effects of GLP-1. Physiologist April Strader of Southern Illinois University in Carbondale is now performing an intestinal surgery in rats that leaves the stomach intact and prompts the animals to secrete more GLP-1. Strader is examining whether that in turn causes proliferation of insulin-producing cells in the pancreas.

    Insulin overload?

    An islet in the pancreas (left, red) appears larger in a gastric bypass patient (right) who suffers from dangerously low blood sugar, but scientists dispute whether the surgery changes that organ.


    Linking the good and bad

    GLP-1's impact on the pancreas may also explain the hypoglycemia originally seen by the Mayo Clinic. One sharp contrast between the disappearance of diabetes and the hypoglycemia stemming from the surgery is that the former occurs immediately or within weeks, whereas the latter takes several years to show up. At the Boston meeting, the 40 or so surgeons, endocrinologists, pathologists, and others gathered there admitted that they couldn't explain this but wondered whether changes to the pancreas over time generated the low-blood-sugar problems, whereas diabetes improvement might be due to other nonpancreatic effects.

    When Service and his Mayo colleagues examined the pancreatic tissue removed to help their hypoglycemic patients, they noted islets that appeared larger than normal. Joslin researchers have also reported an excess of insulin-producing cells in three hypoglycemic patients, two women and a man, who had a portion of their pancreases removed. The hypoglycemia is “diabetes reversal in people who don't have diabetes,” says Patti.

    D'Alessio is now trying to study GLP-1 in people who have had gastric bypass surgery and are suffering from hypoglycemia to determine whether the hormone might induce such pancreatic changes. But not everyone agrees that gastric bypass surgery alters the pancreas. Peter Butler, an endocrinologist at the University of California, Los Angeles, examined the pancreases from Mayo at Service's request and found that they looked like pancreases from obese individuals. He attributes this difference of opinion to his more extensive analysis, which did not identify an upsurge in insulin-producing cells.

    Butler did make one intriguing find, however. Obese people tend to produce more insulin over time to accommodate the growing amount of tissue that requires the hormone. Based on the appearance of the islet cells, Butler deduced that the gastric bypass pancreases hadn't made the adjustment to their host's new weight: They were still producing just as much insulin as before the surgery, effectively increasing the insulin available. That this occurs after meals would make sense, because this is when the pancreas normally releases extra insulin. In these patients, the insulin they secrete would far exceed what's needed.

    Stephen Bloom, an obesity researcher at Imperial College London, notes that it's far from clear whether GLP-1 has the same effect on human pancreases as it does on those of rodents. Furthermore, the small intestine secretes dozens of hormones, many of them poorly understood. “It's still too soon to rule out everything else,” says Strader. Rubino agrees that researchers need to think expansively. Rather than largely undigested food stimulating hormone secretion from the intestine it's dumped into, the key may be that in gastric bypass food never touches the walls of the duodenum, an absence that could have hormonal effects of its own. Rubino says his recent findings in animals suggest such antidiabetic effects.

    Surging popularity

    As research picks up pace, gastric bypass surgeries continue unabated, and some surgeons, particularly outside the United States and Europe, are beginning to operate on less obese patients with diabetes. Bariatric surgery is “kind of the Wild West,” says D'Alessio. There's “huge demand, no regulation, everybody's got their own operation, [and] patients are willing to do whatever it takes to get it.”

    Currently, U.S. National Institutes of Health guidelines recommend that gastric bypass surgery be considered only for people who have a body mass index (BMI) of at least 35. (A BMI of 18.5 to 25 is considered normal.) At a meeting in Rome last year, 78% of attendees supported lowering the limit to a BMI of 30 for those with diabetes. Should the number be even less? “We need more data to know if a lower bar is okay or if there should be any bar at all” when the goal is diabetes treatment, says Cummings.
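For readers unfamiliar with the measure, BMI is simple arithmetic: weight in kilograms divided by the square of height in meters. A minimal Python sketch of the thresholds quoted above (the function names are my own, for illustration only):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def surgery_threshold(b: float) -> str:
    """Classify a BMI against the thresholds discussed in the article."""
    if b >= 35:
        return "meets current NIH guideline for considering gastric bypass"
    if b >= 30:
        return "meets the lower limit proposed in Rome for diabetes patients"
    if 18.5 <= b < 25:
        return "normal"
    return "other"

# A 100 kg person who is 1.7 m tall has a BMI of about 34.6: under the
# current NIH guideline, but above the limit proposed for diabetes patients.
print(surgery_threshold(bmi(100, 1.7)))
```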

    But many still view gastric bypass as extreme therapy for diabetes. Some who undergo the operation have serious problems, such as infections, gallstones, and hernias, that can require additional surgery. And given the time lag between gastric bypass and the severe hypoglycemia that Service, Patti, and others are just now documenting, no one knows how prevalent the side effect will be nor how much such patients will affect the cost-benefit analysis. The death rate from gastric bypass surgery also scares many diabetes researchers. “We had a death in a 28-year-old recently; she had a complication but didn't want to come to the hospital,” says Bloom. “When you see that and have to go to the funeral, you don't think it's such a harmless procedure.”

    Yet type 2 diabetes isn't harmless, either, contributing to more than 1 million deaths worldwide each year. “There is a barrier we need to get over” in considering gastric bypass as a diabetes treatment, says Rubino. He points to a paper published last summer, concluding that the surgery reduces diabetes deaths by 92%. “It's the most profound effect in terms of mortality from diabetes ever reported,” Rubino says. “What is the price of that?”

    • *Hyperinsulinemic Hypoglycemia Following Gastric Bypass: Pathogenesis and Treatment Symposium, Boston, Massachusetts, 7 April.


    Japanese Experts Steal a Glance at Once-Taboo Royal Tomb

    1. Dennis Normile

    Japan's key-shaped burial mounds offer tantalizing glimpses into prehistory. Researchers have been given access for the first time to those built for the imperial family.


    NARA, JAPAN—From a nearby street, the wooded hill beyond the pond looks ordinary. But looks are deceiving at Gosashi Kofun. Centuries ago, earth was deliberately mounded into tiers, in a keyhole pattern, and surrounded by a moat to serve as the final resting place of a powerful person, perhaps the legendary 3rd century Japanese empress Jingu. But just who was buried there, and when, are among a host of questions that archaeologists and historians hope to resolve. Other unknowns include how the kofun (mounds) were constructed and what clues they give about society and religion in an era just before written records appeared in Japan.

    Gosashi Kofun and some 900 other sites presumably holding the remains of imperial family members promise a tantalizing peek into a formative period in Japan's early days as a nation. “The imperial tombs are a very important resource for understanding Japanese history,” says Koji Takahashi, an archaeologist at Toyama University. But for more than a century, the imperial mounds—the cream of Japan's 30,000 known kofun—have been off-limits to prying eyes. Last February, 16 researchers were for the first time permitted a direct look at Gosashi Kofun. At a symposium here earlier this month, they shared their impressions and recounted the outstanding questions.

    Burial mounds, or tumuli, are found throughout the world. Only in Japan do they come in a distinctive keyhole shape. A typical keyhole-shaped kofun has a high, circular, rounded mound at one end containing the burial site: in later mounds, a stone chamber entered through a passageway cut into the mound; in the earliest, a simple pit. The other end of the keyhole usually has a lower platform that may have been used for funerary rites. The largest kofun cover more area than Egypt's largest pyramids, though they are not as high. Often the mounds are studded with arrangements of terra cotta sculptures, or haniwa, that range from simple cylinders to figures including armored warriors, animals, boats, and household implements. The purpose of the haniwa is not fully understood. But their embedded cylindrical bases may have helped stabilize mound slopes.

    Kofun raiding.

    Koji Takahashi and colleagues got a first glimpse at a 2000-year-old imperial burial mound, Gosashi Kofun (obscured by trees).


    The appearance of kofun marked the emergence of an aristocratic state with considerable wealth and military power. Mounds were de rigueur for rulers and clan chieftains during the 300-plus years of Japan's Kofun Period, beginning in the middle of the 3rd century. In the absence of written histories, burial goods and haniwa are indispensable sources of information about societal structure, as well as contemporary weapons, tools, and clothing. The presence of exotic items such as bronze mirrors, armaments, and pottery from mainland Asia attests to brisk commercial, cultural, and even military ties between Japan's chiefdoms and kingdoms on the Korean peninsula and China.

    Although archaeologists have explored many kofun, the largest and most elaborate are the 900 associated with the imperial family. Since the 1970s, researchers have petitioned the Imperial Household Agency for access. But, aside from allowing a few researchers to occasionally accompany maintenance crews, the agency rejected the requests on the grounds of preserving the “peace and dignity” of the graves.

    Then something surprising happened: The imperial agency relented. Early last year, in response to a petition from 16 academic societies, the agency agreed to allow entry onto an unspecified number of mounds. After a year of negotiations, the first visit took place on 22 February, when one representative from each of the 16 societies was permitted to examine the lower part of Gosashi Kofun. The researchers were not allowed to ascend to the burial site or upper levels, do any digging, or collect artifacts. They were permitted to make drawings and take notes and photos, though they were asked not to make the photos available to the press.

    Most of the discussion at the symposium centered on where Gosashi fits into kofun evolution based upon minutiae such as the number of tiers and the precise shape of the corners of the lower platform. Fumiaki Imao, an archaeologist at the Nara Prefectural Archaeological Institute in Kashihara, says that better dating of kofun and the order in which they were built may yield clues as to the location of the seat of power at a given time, the dates when rulers reigned, and who is interred in which tomb. The unprecedented inspection of Gosashi added to the list of questions. The researchers stumbled upon the partially embedded remains of a line of cylindrical haniwa along the moat. “This is such an unusual location, I wonder why it was placed there,” says Imao. Haniwa have typically been found on top of mounds.

    Whether these haniwa are peculiar to Gosashi or a regular feature of imperial mounds of this period might be answered if researchers win more access to the kofun. Takahashi says the Imperial Household Agency has agreed in principle to allow more visits. The next could take place toward the end of this year. Which tomb, how many researchers, and the ground rules for the inspection are to be negotiated with the agency this summer.

    Experts at the symposium also discussed what kind of access to request. As a first step, most would like to see more accurate mapping of the mounds. For now, no one is talking about entering the hallowed imperial burial chambers. Some 2000-year-old secrets are not about to be revealed.


    Picking Up Evolution's Beat

    1. Michael Balter

    Pardis Sabeti mixes geek cool with hot science as she studies how human populations have evolved to resist malaria and Lassa fever.



    Pardis Sabeti—malaria researcher, role model for young scientists, and rock performer—keeps on the move.


    CAMBRIDGE, MASSACHUSETTS—Over the past 72 hours, Pardis Sabeti has managed only 2 hours of sleep each night, most of them inside a crumpled blue sleeping bag she keeps under a desk at the Broad Institute Center for Genome Research in Cambridge, Massachusetts. Sabeti, who burst on the scientific scene in 2002 with a novel test for natural selection in the human genome, has been racing to meet the submission deadline for a National Institutes of Health (NIH) grant to support her research on the evolution of resistance to malaria and Lassa fever. Also in her schedule this year: serving as a panelist at the World Economic Forum in Davos, Switzerland; a research trip to Africa; speaking to young women about careers in science; and writing songs and recording for her pop/rock band.

    To manage all this, Sabeti, 32, has been sleeping under desks for much of her relatively short career. The petite Iranian-American with a toothy smile has cut a wide swath through the research world, racking up awards and honors at a dizzying pace: a Rhodes scholarship at Oxford University, a L'Oreal Women in Science award, and summa cum laude honors at Harvard Medical School, to name a few. She's also made a name in the wider world: The London Daily Telegraph recently called her one of the “top 100 living geniuses” (she tied for 49th place with Henry Kissinger, Richard Branson, Stevie Wonder, and Meryl Streep), and CNN named her one of eight “geniuses who will change your life.”

    Sabeti also seems to have a genius for raising money. While still a postdoc, she won grants totaling more than $600,000, and she is currently a co-investigator on a $2 million Bill and Melinda Gates Foundation grant. She was recently hired as a Harvard assistant professor, turning down offers from several other leading universities.

    And then there's the band: She's the lead singer in the Boston-based alternative group Thousand Days, which plays gigs up and down the East Coast and has released three albums. Sabeti's singing voice is “sweet and sexy,” wrote one music reviewer, adding wryly that “it's nice to know she has a successful back-up career in case her attempts at winning the Nobel Prize don't pan out.”

    The band may boost Sabeti's visibility, but it's her scientific drive that elicits enthusiasm from her colleagues. Broad Institute geneticist David Reich, who has worked closely with Sabeti, sums her up this way: “She is a very cool person but also sort of a nerd.”

    Sabeti was born in Tehran, Iran, where her father was a high-ranking official in the Shah's government. He sought asylum in the United States shortly before the 1979 revolution, and Sabeti grew up in Orlando, Florida, with a large extended family.

    She traces her academic success to her early life in this close-knit clan. “My mother created a summer camp in our house, where she would teach the children and make us do book reports. And my sister, who is 2 years older than me, would teach me and my cousin what she had learned in school.” Sabeti says mathematics was her first love. Her high-energy personality, she adds, appeared in those early years. “I'm a hyper person,” she says. “My parents always told me to relax.”

    Reich, who met Sabeti when they were both grad students at Oxford, says she has always been “very driven.” Her habit of pulling all-nighters was well-established by then, recalls Hans Ackerman, a fellow Rhodes scholar who is now a medical fellow at NIH in Bethesda, Maryland. “I would come into the lab and find her asleep under her desk after a full night of doing PCRs.”

    Why does she work so hard? “I guess I just want to make my parents proud of me,” she says.

    Although Sabeti's workaholic ways have brought her scientific success, Broad Institute Director Eric Lander and others also note her charisma and her efforts to reach out to the community. For example, as an undergrad at the Massachusetts Institute of Technology, Sabeti founded a still-thriving program to help incoming freshmen develop leadership skills. She also worked with RNA pioneer David Bartel, who recalls only one glitch in their association: “She gave out the lab phone number as the contact” for the leadership program. So many students called, “we had to change the number,” says Bartel.

    Sabeti also took the lead at Harvard Medical School, producing a lighthearted orientation video for first-year students, featuring prancing, juggling, balloon-wielding students and faculty. She now presents this video in person to each entering class. In fact, she's still making videos. One will be shown during a NOVA profile of her to be aired in July, featuring appearances by researchers including Lander, as well as Sabeti's music.

    Sabeti's band, Thousand Days—which describes itself on MySpace as a “love child” between the rock group U2 and the pop band Mazzy Star—has been a regular presence on the New England music scene for several years. Sabeti writes her own songs, including one called “Coming Up” that seems a metaphor for her career. She says she “loves the creative spirit” in both music and science but is “more at home” in science. Given her scientific schedule, she hasn't found time to perform in recent months.

    She continues to focus on the research she began at Oxford: teasing out signs of selection in the human genome. At Oxford, Sabeti focused on genetic susceptibility to malaria, zeroing in on two alleles that conferred resistance to the parasite. Most researchers assumed that these genetic variants had been favored by selection, but there was little evidence to prove it.

    Over the previous 20 years, researchers had developed dozens of tests for detecting “signatures” of natural selection in the genome (Science, 16 June 2006, p. 1614), but they had very low power to detect more recent evolutionary changes, particularly during the last 10,000 years, when many of the diseases that afflict humankind, including malaria, arose.

    Working with Reich and with her doctoral adviser, Dominic Kwiatkowski of Oxford University in the U.K., Sabeti hit on a novel way of combining two types of genetic information to create a more powerful test: the frequency of a particular variation and the structure of the genome surrounding it. Normally, variants are shuffled in a random fashion across the genome. But if a particular variant is the target of recent natural selection, its rapid increase in frequency can create so-called haplotype blocks, groups of genes that have “hitchhiked” along for the Darwinian ride (see graphic, below). Some earlier selection tests looked at both variant frequencies and haplotypes in humans, but they weren't very sensitive. Sabeti used her math skills to devise a genetic “clock,” based on haplotype structure, that could reveal whether recent, high-frequency variants were due to selection or just chance—greatly strengthening the power to detect evolution's hand.
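The hitchhiking effect itself is easy to demonstrate in a toy model. The sketch below is my own illustration, not Sabeti's actual haplotype test: a tiny Wright-Fisher simulation in which a favored variant is completely linked (no recombination) to a few neutral markers, so the neutral markers sweep up in frequency alongside it.

```python
import random

def simulate_hitchhiking(pop_size=200, n_neutral=5, s=0.5,
                         initial_carriers=10, generations=60, seed=1):
    """Toy Wright-Fisher model: one positively selected variant completely
    linked (no recombination) to several neutral markers.  Returns the
    final frequency of the selected variant and of each neutral marker."""
    rng = random.Random(seed)
    # Haplotypes carrying the favored variant also carry all the neutral
    # markers; the rest of the population carries neither.
    carrier = (True, tuple([True] * n_neutral))
    non_carrier = (False, tuple([False] * n_neutral))
    pop = ([carrier] * initial_carriers
           + [non_carrier] * (pop_size - initial_carriers))
    for _ in range(generations):
        # Carriers of the favored variant have relative fitness 1 + s;
        # the next generation is sampled with replacement by fitness.
        weights = [1.0 + s if selected else 1.0 for selected, _ in pop]
        pop = rng.choices(pop, weights=weights, k=pop_size)
    selected_freq = sum(selected for selected, _ in pop) / pop_size
    neutral_freqs = [sum(markers[i] for _, markers in pop) / pop_size
                     for i in range(n_neutral)]
    return selected_freq, neutral_freqs

sel, neutral = simulate_hitchhiking()
# With complete linkage, the neutral markers track the selected variant exactly.
print(sel, neutral)
```

Real genomes do recombine, which is what a haplotype-based test exploits: recombination gradually breaks up the swept block, so an unusually long haplotype at unusually high frequency is a signature of recent selection.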


    When a genetic variant favored by selection (pink bar) spreads rapidly in a population, other variants linked to it come along for the ride.


    In collaboration with Kwiatkowski, Lander, Reich, and others, Sabeti then applied the new approach to the protective malaria variants. “We saw a whopping signal” of positive selection, Sabeti says. When these results were published in Nature in 2002, her scientific reputation was made. “This test is one of the most exciting developments in the field in the past few years,” says Chris Tyler-Smith, a genome researcher at the Wellcome Trust Sanger Institute in Hinxton, U.K. Evolutionary biologist Martin Kreitman of the University of Chicago, who had developed a similar test but was beaten into print by a few months, says he has “nothing but praise for her contributions.” He adds that Sabeti's most recent contribution, a genome-wide search for selected genes in collaboration with Lander, the International HapMap Project, and others, “is a beautiful piece of work.”

    Lander says Sabeti's test anticipated the detailed information that the HapMap would later make available. “Pardis has a very energetic imagination,” he says. “Not many people think about what they would do if they had data they don't yet have.”

    The genome-wide study, published in Nature last October, identified two genes called LARGE and DMD that are involved in Lassa fever infection and show strong signals of natural selection in West Africans. Despite striking about 300,000 people each year and killing 20,000 of them, Lassa fever has been neglected by public health experts. Sabeti hopes to use her test to identify variants protective against the disease, which could eventually help in the search for new therapies and a vaccine. Looking at Lassa's evolutionary history is “a very innovative approach” that “might breathe some new life into field research” into the disease, says Lassa fever expert Joseph McCormick of the University of Texas School of Public Health in Brownsville, who is a consultant to Sabeti on the project.

    At the moment, Sabeti seems to be flying high. But some colleagues are concerned that her fame could set her up for a big fall if her hyper pace slackens. “I worry that too many expectations are being put on Pardis,” says one researcher. But Nancy Oriol, Harvard Medical School's dean of students, isn't worried. “If you are motivated by serving others and doing good work, as is Pardis, you won't get burned out,” she says.

    Indeed, despite all the attention she attracts, Sabeti says she feels more at home with her inner nerd: “Even though I am gregarious, I interact more with [scientific] papers than with people. Deep down, I am just a math geek.”


    A Renowned Field Station Rises From the Ashes

    1. Jerry Guo*
    1. Jerry Guo is a writer in New Haven, Connecticut.

    After a frustrating hiatus, wildlife researchers are returning to a swamp in war-torn Aceh Province in Indonesia for a chance to study some of the world's smartest apes.


    Special skills.

    The orangutans of Suaq Balimbing are renowned for their distinctive behavior, including using tools to open fruit.


    SUAQ BALIMBING, INDONESIA—After an hourlong ride wending up the Lembang River through dense tropical rain forest, the speedboat sputters up to a rickety plywood dock. At the top of steps carved into the riverbank is a pair of shacks. Soaked clothes hang outside, drying. The Suaq Balimbing research station may look like hillbilly central, but inside it is buzzing with scientific life: 15 researchers and assistants jostling for space alongside generators, laptops, and cans of food.

    No one is complaining about the cramped quarters. Suaq shot to fame in the mid-1990s, when researchers discovered tool use among orangutans here in Sumatra's Gunung Leuser National Park. Apart from humans and chimpanzees, no other primates have demonstrated such abilities, and Suaq remains the only location where orangutans regularly use stick tools to crack fruits and hunt for termites and honey. More recently, a landmark paper in Science (3 January 2003, p. 102) demonstrated that orangutan populations possess distinct traditions, skills, and social quirks: their own cultures. These orangutans “do things that we haven't seen orangutans do in other sites,” explains Cheryl Knott, a primatologist at Harvard University. “[Suaq] helped change our views and understanding of these animals.”

    For the better part of a decade, this unique window into orangutan culture was shuttered. At the end of the 1990s, an upsurge of violence in Aceh Province's long-running civil war forced wildlife researchers to abandon their work in the province. The Suaq team, led by primatologist Carel van Schaik of the University of Zürich, Switzerland, pulled out in September 1999, after insurgents murdered the station's head assistant. When van Schaik and his colleagues returned for a brief survey in 2006, they found that the Indonesian army had razed the station's two sturdy buildings. “The rebels had been using our camp, so the military didn't burn it down for nothing,” he says. Boardwalks that traversed the research site—a 500-hectare swamp—had rotted away.

    After the warring parties signed a peace treaty in 2005, researchers began to trickle back to Aceh. “The bad times are over,” proclaims van Schaik. “People are getting back into the field.”

    Nowhere has the homecoming been more anticipated than at Suaq. Last year, van Schaik's crew and a Switzerland-based nonprofit, PanEco, built a modest replacement station and hacked a 46-kilometer trail grid for observing orangutans night and day. The collection of behavioral data resumed last September. Scientists couldn't be happier. “We were all waiting for this place to reopen,” says Andrea Gibson, a Ph.D. student at the University of Zürich who had to delay her fieldwork for 3 years because of the hiatus.

    Cultured apes

    During the rainy season, which lasts from October to March, trails at Suaq are waded, not walked. The knee-deep, pungent red mud has a powerful suction. “You can disappear in these waters,” says Ellen Meulman, another University of Zürich Ph.D. student. Leeches are ubiquitous, and king cobras and tigers lurk unseen. Among primatologists, says van Schaik, Suaq is known as “human hell” but “orangutan heaven.” The shaggy apes are undisturbed—the nearest village is dozens of kilometers away—and food is plentiful, with some 70 kinds of fruit for the picking.

    Meulman and her colleagues head out from the field station in the wee hours of the morning and slog for nearly 2 hours across perilous terrain to get to the orangutan nesting site before dawn. They don't get back to camp for their rations of rice and canned mackerel until after dark. In between, they track orangutan behavior in minute detail: Is a subject playing with a neighbor? Eating, and if so, what? Vocalizing? Using a tool?

    The orangutans have some remarkable skills. For example, they know how to fashion a stick to crack open the razor-sharp shell of Neesia fruit. Van Schaik hypothesizes that they learned this skill after using simpler tools to dig for honey, fish for termites, and scoop for water. But it's unknown how these skills are acquired and transferred. “There's still a lot of doubt in the literature on this,” he says. “We would really like to nail it.”

    One clue may be the friendliness of the lowland orangutans, who frequently gather in small groups known as “parties.” “There are a lot of opportunities for social learning,” says Meulman. “We're looking at what they're doing when they're together. We've seen teaching and cultural learning.” Curious adults, for instance, will observe neighbors making umbrellas or gloves out of leaves and imitate those behaviors. Juveniles learn to build mosquito-repellent nests out of terentang leaves by watching their mothers.

    Even simple nests, perhaps used for afternoon naps, suggest the presence of culture. “It was always assumed they randomly break sticks together and build nests the same way,” says Gibson. “But there are definite differences in the arrangement of branches [among groups].” For her research, Gibson has been scaling trees to examine old nests and plans to sleep in one of the 30-meter-high beds for a night. This has never been attempted, she says, apparently due to the possibility of attack from a reticulated python. But it's worth the risk, she says: The data could answer basic questions, such as why orangutans sleep in nests rather than just out on a sturdy branch. “There are some things you have to experience firsthand,” she says with a smile.

    With the return of primatologists to Aceh, research on orangutan culture is gathering momentum. Although Suaq has stolen the show, the research requires extensive collaboration with other study sites. Case in point: Van Schaik didn't make the jump from observing tool use at Suaq to concluding that orangutans hold a deep cultural repertoire until he compared notes with scientists at five other field stations. Likewise, a 1999 Nature paper that first presented evidence for chimp culture required data from seven sites across central Africa to document the full range of their behavioral repertoire. “You can't say anything about culture if it's just one site,” says Knott. “You need the comparative perspective.”

    Knott's orangutan field station at Gunung Palung in Borneo also resumed work last year. She had shuttered it in 2003, after the staff became concerned that hostile loggers might become violent. Knott is studying differences in diet, vocalization, and reproduction to determine whether these behaviors have cultural or ecological origins. Much of this research will be in collaboration with other sites, including Suaq and the Ketambe Research Center, the longest-running orangutan field site in Sumatra. Take, for instance, the differences in foraging behavior. “Is this variation purely due to optimal foraging or social learning?” says Ketambe manager Serge Wich, a researcher with the Great Ape Trust in Des Moines, Iowa. “We want to know which kinds of food prompted cultural innovation.”

    Losing time

    Because fieldwork stopped for several years across Aceh, it has been difficult to quantify the impact of the civil war on a biodiversity hot spot that is home to elephants, rhinoceroses, leopards, sun bears, tigers, and some 6500 orangutans. “Only now are many studies restarting,” says Wich. Although the fighting at Ketambe, in the island's interior, was not as fierce as on the west coast near Suaq, researchers had to evacuate in 2002 and only returned 2 years ago. After compiling a 37-year data set, says Wich, “it was sad to have the gap.”

    Up and running.

    The research center in Sumatra's Gunung Leuser National Park has been rebuilt after being occupied by rebels and destroyed by the army.


    Although the primatologists at Suaq lost much more time—8 years' worth of data—the 70 or so orangutans they study haven't missed a beat. The concentration of orangutans here is higher than anywhere else in the world: twice the density of other sites on Sumatra and four times that of Borneo, the only other place where these apes are found in the wild. A vibrant population appears to be the key factor enabling otherwise solitary creatures to “teach” each other skills such as tool use, making Suaq the ideal laboratory for studying the origins of human culture, says van Schaik.

    Ironically, the war may have given Leuser's orangutans a reprieve. When Indonesia's former President Suharto was ousted in 1998, illegal loggers were about to overrun the station. “The civil war stemmed that,” explains Ian Singleton, director of conservation for PanEco and a manager of Suaq. “The illegal loggers and poachers didn't want to risk being shot, so the civil war was extremely good for conservation.” During the past 5 years, when Suaq was offline, Singleton released 100 rehabilitated Leuser orangutans back into the Sumatran rain forest.

    But other threats loom large. Faced with expanding oil palm plantations, the resurgence of illegal logging and mining after the peace treaty, highway construction, and a booming pet trade, orangutans may become extinct in the next decade or two, says Singleton. A United Nations Environment Programme report published in February 2007 warned that 98% of the orangutan's habitat—tropical rain forests—would disappear by 2022. Based on satellite imagery, the report listed Leuser as one of the most vulnerable hot spots. Last year, Singleton says, developers began draining swamps and burning forests north of Suaq for oil palm plantations.

    Van Schaik knows that he and his colleagues can't afford to lose more time. They plan to build a six-room dormitory, install solar panels for a constant supply of electricity, and build three boardwalks to ease the trip to the research site. They hope to have the station restored to its former glory by fall. In the meantime, van Schaik is hiring assistants for an expanding research agenda. “We only scratched the surface before,” says Gibson. “We have the most intelligent and interesting orangutans. There are so many bubbling questions to be answered.”

  10. GM Crops: A World View

    Science maps where genetically modified crops are grown and imported, as well as which countries avoid them. An expanded version of the map is presented online.


    In 2007, farmers grew more than 114 million hectares of GM crops—mainly soy, maize, cotton, and canola. The Science staff has prepared a map showing who grows them, who imports them, and who avoids them, and highlighting the top eight countries that together produce more than 99% of the world's biotech plants.

    Online visitors can get the map in two forms: a special interactive version (part of this week's special Plant Genomes multimedia feature) providing information not included in the print map, and a PDF copy of the version that appeared in the print magazine.

  11. Tough Lessons From Golden Rice

    1. Martin Enserink

    It was supposed to prevent blindness and death from vitamin A deficiency in millions of children. But almost a decade after its invention, golden rice is still stuck in the lab.



    It's easy to recognize Ingo Potrykus at the train station in Basel, Switzerland. Quietly waiting while hurried travelers zip by, he is holding, as he promised, the framed and slightly yellowed cover of the 31 July 2000 issue of Time magazine. It features Potrykus's bearded face flanked by some bright green stalks and a bold headline: “This Rice Could Save A Million Kids A Year.”

    The story ran at a time when Potrykus, a German plant biotechnologist who has long lived in Switzerland, was on a roll. In 1999, just as he was about to retire, Potrykus and his colleagues had stunned plant scientists and biotechnology opponents alike by creating a rice variety that produced a group of molecules called pro-vitamin A in its seeds. The researchers thought this “golden rice”—named for the yellow hue imparted by the compounds—held a revolutionary promise to fight vitamin A deficiency, which blinds or kills thousands of children in developing countries every year.

    Almost a decade later, golden rice is still just that: a promise. Well-organized opposition and a thicket of regulations on transgenic crops have prevented the plant from appearing on Asian farms within 2 to 3 years, as Potrykus and his colleagues once predicted. In fact, the first field trial of golden rice in Asia started only this month. Its potential to prevent the ravages of vitamin A deficiency has yet to be tested, and even by the most optimistic projections, no farmer will plant the rice before 2011.

    The delays have made Potrykus, who lives in Magden, a small village in an idyllic valley near Basel, a frustrated man. For working on what he considers a philanthropic project, he has been ridiculed and vilified as an industry shill. Relating the golden rice saga at his dinner table while his wife serves croissants and strong coffee, he at times comes off as bitter.

    Fields of gold.

    The only field studies of golden rice to date took place in Louisiana in 2004 and 2005.


    There's more at stake than golden rice and personal vindication, he says. In his view, 2 decades of fear-mongering by organizations such as Greenpeace, his prime nemesis, have created a regulatory climate so burdensome that only big companies with deep pockets can afford to get any genetically modified (GM) product approved. As a result, it has become virtually impossible to use the technology in the service of the poor, Potrykus says.

    Not everybody is so gloomy. Potrykus's co-inventor and main partner, plant biochemist Peter Beyer of the University of Freiburg in Germany, agrees that it's been a difficult decade. But a more cheerful character by nature, Beyer believes rules are just something to be dealt with; complaining about them does little, he says. A handful of other researchers working on GM crops to fight malnutrition also feel confident that their work will eventually pay off.

    Many scientists agree with Potrykus, however, that GM technology has become so controversial that for now, there's little point in harnessing it for the world's poorest. HarvestPlus, a vast global program at public research institutes aimed at creating more nutritious staple crops, is forgoing GM technology almost entirely and using conventional breeding instead, despite its built-in limitations. GM products just might end up on the shelf, says HarvestPlus Director Howarth Bouis.

    Potrykus, now 75 years old, worries that he may not live to see his invention do any good. “It's difficult for me not to get upset about this situation,” he says.

    A dream takes root

    The idea for golden rice was born at an international agricultural meeting in the Philippines in 1984, says Gary Toenniessen of the Rockefeller Foundation, a philanthropy in New York City. It was the early days of genetic engineering, and over beers at a guesthouse one evening, Toenniessen asked a group of plant breeders how the technology of copying and pasting genes might benefit rice. “Yellow endosperm,” one of them said.

    That odd answer alluded to the fact that a quarter-billion children have poor diets lacking in vitamin A. This deficiency can damage the retina and cornea and increase susceptibility to measles and other infectious diseases. The World Health Organization (WHO) estimates that between 250,000 and 500,000 children go blind every year as a result, and that half of those die within 12 months. Vegetables such as carrots and tomatoes, as well as meat, butter, and milk, can provide the vitamin or its precursors, but many families in poor countries don't have access to them. A rice variety producing precursors to vitamin A in its endosperm, the main tissue in seeds, might provide a solution—and it would have yellow kernels.

    Time flies.

    The hope expressed in a 2000 Time cover story about Ingo Potrykus remains unfulfilled.


    Classical breeding cannot produce such a rice, however, because although pro-vitamin A is present in the green parts of the rice plant, no known strain makes it in its seeds. The only option is to tinker with rice's DNA to produce the desired effect. Throughout the 1980s, the Rockefeller Foundation funded several exploratory studies, but the plan didn't gel until a brainstorming meeting in New York City in 1992, at which scientists discussed the bold idea of reintroducing the biochemical pathway leading to beta carotene, the most important pro-vitamin A, into rice but putting it under control of a promoter that's specific to endosperm.

    Potrykus, then a pioneer in rice transgenics at the Swiss Federal Institute of Technology (ETH) in Zürich, attended, as did Beyer, who specialized in carotenoid biochemistry and molecular biology. The two met on the plane to New York and hit it off; their fields of expertise were complementary, and the fact that Zürich is less than 2 hours from Freiburg was helpful. They soon had a proposal written up.

    Beyer admits he barely believed in the idea himself, and the Rockefeller's scientific advisory board was equally skeptical. Introducing an entire genetic pathway into rice seemed like a stretch. Still, the foundation rolled the dice and supported the project.

    It took 7 years, but Potrykus and Beyer eventually succeeded in making golden rice by splicing two daffodil genes and a bacterial gene into the rice genome. The eureka moment arrived late one night in Freiburg, Beyer recalls. He was analyzing the molecular content of seeds produced in Potrykus's lab, as he often did, using a technique called high-performance liquid chromatography. This time, peaks showed up on the screen where they had never appeared before—the signals of carotenoids. When Beyer went back to look at the batch of seeds, he noticed something he had missed: The grains had a faint yellow hue. Golden rice had been born.

    The battle begins

    Potrykus says he always knew golden rice—a Thai businessman suggested the catchy name—would be controversial. As a professor in Switzerland, one of the most fiercely anti-GM countries in Europe, he had been confronted with angry students since the 1980s. To protect his plants, ETH spent several million dollars on a grenade-proof greenhouse. For Beyer, unofficial road signs declaring the Upper Rhine Valley a “GM technology-free region” are a twice-daily reminder that the climate in Germany isn't much better.

    But golden rice posed a special dilemma to GM crop opponents, admits Benedikt Haerlin, who coordinated Greenpeace's European campaign at the time and now works for the Foundation on Future Farming. Unlike the existing GM crops that primarily helped farmers and pesticide companies, it was the first crop designed to help poor consumers in developing countries. It might save lives. The decision whether to oppose it weighed heavily on him, Haerlin says, which is why he consulted with WHO experts on vitamin A and why he traveled to Zürich to spend a day at Potrykus's lab to talk. Potrykus, impressed by Haerlin's intelligence, hoped to convince his fellow countryman.

    He failed. Although Greenpeace pledged not to sabotage field trials, it did launch an aggressive campaign against golden rice. It argued that the crop was an industry PR ploy—seed company Syngenta was involved in the project, the group pointed out—designed to win over a skeptical public and open the door to other GM crops. Golden rice did not attack the underlying problem of poverty, Greenpeace said; besides, other, better solutions to vitamin A deficiency existed.

    Seeds of discontent.

    Although Potrykus has retired, Peter Beyer is still working on golden rice at the University of Freiburg.


    Perhaps Greenpeace's most effective argument, however, was that golden rice simply wouldn't work. The most successful strain created in 2000 produced 1.6 micrograms of pro-vitamin A per gram of rice. At that rate, an average 2-year-old would need to eat 3 kilos of golden rice a day to reach the recommended daily intake, Greenpeace said, and a breastfeeding mother more than 6 kilos. To drive the point home, an activist in the Philippines sat down behind a giant mound of golden rice during a press conference. “Fool's gold,” Greenpeace called it.

    A photo of the event, which quickly found its way around the world, still makes Haerlin chuckle—and it still makes Potrykus angry. Greenpeace assumed that children had to get all of their vitamin A from rice, which was unrealistic; it also ignored the fact, says Potrykus, that even half the recommended intake may prevent malnutrition. And Greenpeace assumed that the uptake of beta carotene by the human gut and its conversion into vitamin A were quite inefficient, resulting in one vitamin molecule for every 12 molecules of beta carotene. Nobody knew the true rate at the time, but a recent, soon-to-be-published study among healthy volunteers who ate cooked golden rice, led by Robert Russell of Tufts University in Boston, suggests that it's more like one for every three or four. “That's really quite good,” says Russell, who supports the golden rice project. (A similar study is planned among people with marginal vitamin A deficiency in Asia.)

    Haerlin says his calculations were based on the best data at the time. But even if they were correct, Potrykus says, the first golden rice was just a proof of principle. Greenpeace might as well have blamed the Wright brothers for not building a transatlantic airplane, he says.

    Industry gets in

    The low beta-carotene yield would eventually be tackled by Syngenta—even though Potrykus resented the way the company got involved. Between 1996 and 1999, Beyer's lab received funding through a European Commission contract that also included agrochemical giant Zeneca (called AstraZeneca after a merger in 1999). Under the program's rules, any benefits had to be shared by the signers. AstraZeneca had not worked on golden rice per se, Potrykus says, but the company claimed a share of that intellectual property anyway; it was interested in developing the technology commercially, for instance in health foods, says Potrykus, who was initially “furious” that a big corporation now had a say in his project.

    Gold fever.

    GR2 (left) contains more than 20 times more pro-vitamin A than GR1 (top right). Ordinary rice (bottom right) contains none.


    David Lawrence has a different take on those events: At the time, AstraZeneca primarily wanted to support the humanitarian development of golden rice, says the current head of research at Syngenta; the company didn't have any commercial plans. (AstraZeneca's agribusiness division merged with that of Novartis to form Syngenta in 2000.) But whoever's right, the move proved a blessing in disguise, Potrykus now says. At Syngenta, he found a new partner in Adrian Dubock, a bubbly, fast-talking Brit with experience in patents, product development, regulation, and marketing—subjects Potrykus and Beyer admit they were clueless about.

    Dubock helped work out a deal in which Syngenta could develop golden rice commercially, but farmers in developing countries who make less than $10,000 a year could get it for free. He also helped solve patent problems with several other companies. Dubock retired from Syngenta in 2007 but remains involved as a member of the Golden Rice Humanitarian Board, a group Potrykus chairs. “Without him, the project would have ended already,” Potrykus says.

    But perhaps most important, Syngenta scientists replaced a daffodil gene with a maize gene, thus creating a new version of golden rice, dubbed GR2, that produces up to 23 times more beta carotene in its seeds. Even with the one-in-12 conversion factor, that meant 72 grams of dry rice per day would suffice for a child, the company's scientists said in 2005. A 2006 paper by Alexander Stein of the University of Hohenheim in Stuttgart, Germany, estimated that the rice could have a major public health impact at a reasonable cost.

    Those results didn't convince the skeptics. Real-world studies are still lacking, says WHO malnutrition expert Francesco Branca, noting that it's unclear how many people will plant, buy, and eat golden rice. He says giving out supplements, fortifying existing foods with vitamin A, and teaching people to grow carrots or certain leafy vegetables are, for now, more promising ways to fight the problem.

    A golden future?

    Today, the debate about golden rice has quieted down, in part because its inventors are keeping a low profile. Syngenta stopped its research on golden rice and licensed the rights to GR2 to the humanitarian board on World Food Day in 2004; given consumers' distrust, there was no money in it, says Lawrence. Most golden rice work is now taking place at six labs in the Philippines, India, and Vietnam, the countries chosen as the best candidates for the crop's launch.

    There's a long way to go. Both the original golden rice, now called GR1, and GR2 were created with Japonica cultivars that are scientists' favorites but fare poorly in Asian fields. Researchers are now backcrossing seven GR1 and GR2 lines with the long-grained, non-sticky Indica varieties popular among Asia's farmers. In early April, researchers at the International Rice Research Institute in the Philippines finally started a field trial with a GR1 backcrossed into a widely used Indica variety called IR64—the first field trial ever in Asia. (The only other outdoor studies were two done in Louisiana in 2004 and 2005.) The new varieties must not only produce enough beta carotene but also pass muster in terms of yield, seed quality, and appearance.

    The project could have been much further along, Potrykus says, if there weren't so many rules governing GM crops that make little sense. Conventional breeders can bombard plant cells with chemicals and radiation to create useful mutants without having to check how this affects the plants' DNA; a GM insertion must be “clean”—that is, the extra genes must sit neatly in a row without disrupting other genes—which adds months or even years to the lab work. Because field trials take a long time to get approved, researchers have been confined to greenhouses, in which they have trouble growing the large numbers of plants required for breeding and feeding studies. These requirements have caused “year after year of delays,” Potrykus complains.

    Even if field trials are successful, there are no guarantees that golden rice will eventually be approved in the target countries. Use of other GM crops, such as Bt cotton, has exploded in Asia in recent years (see infographic). But GM rice has languished. In India and China, regulatory agencies have shied away from approving insect-resistant GM rice despite extensive testing. “The expectation is that they will [be approved] eventually,” says Toenniessen, “but it's a major decision for any Asian country.” Thailand, a major rice exporter, has decided to steer clear of GM rice altogether.

    Kavitha Kuruganti of the Centre for Sustainable Agriculture, an anti-GM group in Hyderabad, India, promises a major battle should golden rice head to the market in India. She thinks that the crop is unnecessary and probably unsafe to eat and that a massive switch would reduce diversity and threaten India's food security. “We will try to organize a broad public debate,” she says.

    Not worth funding?

    Whether justified or not, the turmoil over golden rice has shaped other efforts to improve the nutritional value of crops. Take HarvestPlus. With a $14 million annual budget that targets 12 crops, it aims to boost levels of three key nutrients: vitamin A, iron, and zinc. It relies almost entirely on conventional breeding—which has Greenpeace's blessing—because it wants to have an impact fast, says Bouis, the director. What little GM technology HarvestPlus supports is a “hedge,” in case the political and regulatory climates shift.

    Eat your dinner.

    Greenpeace said more than 3 kilos of golden rice daily were needed to meet dietary standards for vitamin A.


    But in plants that have little or no natural ability to produce a nutrient, breeders have nothing to work with. Thus, vitamin A-enriched non-GM rice and sorghum are essentially off the table, says Bouis, as is boosting zinc and iron in sweet potatoes and cassava. Iron in rice is a question mark.

    The uncertainty about the future of GM foods also tends to scare off the financial donors on which programs like HarvestPlus depend. Rockefeller, for instance, is frustrated that a GM rice whose field trials it helped pay for in China is stalled, says Toenniessen. “To avoid making the decision to approve it, the Chinese keep asking for more field trials,” he says. “In the end, that becomes a foolish use of our funds.”

    The only charity still investing massively in GM crops with enhanced nutritional value is the Bill and Melinda Gates Foundation. Through its Grand Challenges in Global Health initiative, it is spending more than $36 million to support not only golden rice but also GM cassava, sorghum, and bananas. The foundation declined to comment for this story. But the researchers it supports say that they are optimistic that their products will make it through the pipeline.

    James Dale of Queensland University of Technology in Brisbane, Australia, who heads a project to add iron, vitamin A, and vitamin E to bananas, says he has learned several lessons from golden rice, including the importance of local “ownership”—which is why he has teamed up with researchers in Kampala. “This will be a Ugandan banana made by Ugandans,” he says.

    Not that this mollifies opponents. Greenpeace will fight to keep GM bananas, cassava, and sorghum from poor countries' fields, just as it will keep opposing golden rice, says Janet Cotter of Greenpeace's Science Unit in London.

    Battle-scarred, Potrykus says he hasn't given up hope that the regulatory system can be overhauled so that GM technology can benefit the poor. He believes a massive, multimillion-dollar information campaign might help convert the public. He has tried in vain to contact Bill Gates in hopes of tapping his wealth for such a media blitz.

    He also wrote the late Pope John Paul II to ask for support for golden rice. “You know the definition of an optimist?” he jokes: “Someone who's asking the church for money.” His Holiness declined, but Potrykus was invited to join the Pontifical Academy of Sciences, where he hopes to convene a meeting on golden rice next year—the 10th anniversary of his tarnished invention.

  12. Papaya Takes on Ringspot Virus and Wins

    1. Erik Stokstad

    This GM fruit has a long track record and potential for developing countries, yet it is still running into acceptance problems.



    Engineered papaya trees withstand virus while surrounding trees succumb.


    If there is an example of a silver bullet among genetically modified (GM) crops, it would be virus-resistant papaya trees. They saved the papaya industry in Hawaii from devastation by the ringspot virus, a serious pathogen that deforms fruit and eventually kills conventional trees. “That's a fantastic accomplishment,” says George Bruening, a plant pathologist at the University of California, Davis.


    As the world's first GM fruit to be successfully commercialized, papaya has a 10-year track record of safety. Yet its future outside Hawaii is far from assured. Although the virus threatens papaya trees almost everywhere they grow, environmental groups are campaigning against its adoption in other countries. “It's really a tragedy,” says Sarah Davidson, a Cornell University Ph.D. student, whose analysis of GM papaya in Thailand will appear next month in Plant Physiology. “It should have been a model for technology transfer to developing countries.”

    The story starts in 1978, when Dennis Gonsalves, a young plant virologist at Cornell, returned to his native Hawaii for a visit. The ringspot virus was slowly spreading toward Puna, the main papaya growing region. Gonsalves decided to stand in its path of destruction. By 1991, he and his colleagues had shown that papaya carrying the gene for a viral protein could resist the virus in the greenhouse. Field trials began in April 1992.

    Later that spring, the virus started a rampage across Puna. Within 6 years, the papaya harvest had dropped by half, to 11.8 million kilograms. “We were devastated,” recalls Loren Mochida, a board member of the Hawaii Papaya Industry Association and manager of Tropical Hawaiian Products, an exporter that suffered two rounds of major layoffs. But the field trials were promising: Whereas conventional trees were infected and nearly barren, the GM trees were large and heavy with fruit.

    Richard Manshardt of the University of Hawaii, Manoa, then crossed the GM papaya with a high-yielding variety called Rainbow. Gonsalves says the team showed off the fruit to politicians, schoolchildren, anybody they could interest. “People were rooting us on,” he recalls. Today, virus-resistant trees account for about 80% of papaya acreage in Hawaii. “Dennis Gonsalves got there just in time,” says Bruening.

    The ringspot virus is a problem around the world, and scientists from many countries flocked to Gonsalves's lab to learn how to put the gene into their varieties of papaya. By 1999, Thai scientists had virus-resistant papaya in small field trials. GM papaya is being field-tested or studied in the Philippines, Vietnam, Taiwan, Malaysia, and elsewhere.

    GM papaya “is pretty close to an ideal pro-poor crop,” says Davidson. Unlike some GM crops, it doesn't require herbicides. And in Thailand, papaya is second only to bananas in importance, with 80% consumed domestically. Many poor farmers cannot afford to eat papaya unless they grow it themselves.

    Despite the success in Hawaii, criticism is fierce: In 2005, for example, the Global Justice Ecology Project, based in Hinesburg, Vermont, claimed that GM papaya in Hawaii had caused an “economic and ecological disaster for organic, conventional, and GM papaya farmers alike.” It and other opponents are concerned about allergic reactions to the virus coat protein and widespread genetic contamination of conventional papaya. They also assert that GM papaya has become more susceptible to a disease caused by a pathogen called phytophthora.

    Most researchers reject these concerns. There is no evidence that the GM plants are allergenic or more vulnerable to phytophthora, says Gonsalves, who now directs the U.S. Department of Agriculture Pacific Basin Agricultural Research Center in Hilo, Hawaii. However, GM papaya can impact conventional growers: In a study of conventional trees published last year, Manshardt and colleagues reported transgenes in 1% of seeds from uninfected, self-pollinating trees planted next to GM papaya. No transgenes were found in seeds from an orchard that was 400 meters downwind.

    Ultimately, the acceptance of GM papaya will rest on politics and economics, says Davidson. Countries such as Brazil will likely continue to put up with the virus as long as they can profitably export conventional papayas to Europe or Japan, which prohibit GM papaya. In contrast, Mexico may decide to permit the planting of GM papaya, because its major market is the United States, which allows the GM fruit. And although protests by Greenpeace caused the Thai government to ban field tests of GM papaya in 2004, it may eventually relent in the face of competition from neighbors such as China. Once China deregulates papaya and other GM vegetables, “you won't be able to stop [their spread],” predicts Frank Shotkoski, director of the Agricultural Biotechnology Support Project II at Cornell. “It won't be long before the rest of the world will see it as safe.”

  13. Is the Drought Over for Pharming?

    1. Jocelyn Kaiser

    Companies are plowing ahead, making drugs and other compounds in plants, despite technological, economic, and social issues.


    In the bag.

    These cultured carrot cells are engineered to make a human drug.


    Many a child has been told “carrots are good for you.” That advice could soon take on new meaning for people with Gaucher disease, an inherited metabolic disorder that leads to liver and bone problems. Patients must now be injected every 2 weeks with a manufactured enzyme that costs on average $200,000 a year, making it one of the most expensive drugs ever. If ongoing clinical trials go well, the 5000 Gaucher patients on the therapy could soon have a second option—a cheaper version of the enzyme that stays in the bloodstream longer and can be injected less often.

    If the U.S. Food and Drug Administration (FDA) approves recombinant glucocerebrosidase, it will be good news not only for medicine but also for a community far removed from the clinic: plant scientists. Protalix Biotherapeutics in Karmiel, Israel, produces this new version of the protein in giant plastic bags, not in the steel vats of mammalian cells in which most biologics are made. The bags are filled with transgenic carrot cells that are cultured and then processed to extract the drug. “If Protalix gets regulatory approval, that would [make it] the first plant-made pharmaceutical,” says plant scientist Charles Arntzen of Arizona State University in Tempe. “For people who work in this field, it will be a very exciting step forward.”

    Arntzen is chasing an elusive dream: using whole plants as factories to make drugs. Nearly 20 years ago, when researchers first showed that a tobacco plant could be engineered to crank out an antibody, they envisioned harvesting cheap supplies of therapeutic proteins, antibodies, and vaccines from vast fields of crops. For this approach, researchers isolate the target gene and usually insert it into a bacterium called Agrobacterium that readily infects the plants and passes on the gene. The gene becomes part of the plant and is passed from one generation to the next, producing foreign protein much as if it were one of the plant's own genes.

    However, technological hurdles and a lack of interest from drug companies have hamstrung “pharming,” as have worries that pharma crops will escape from their experimental plots and taint the food supply. As a result, many companies have abandoned this research or gone under. And no plant-made drugs for humans have made it to the pharmacy.

    But academic scientists and some companies have persisted, improving yields of plant-made drugs and developing innovative ways to keep pharming inside the lab or the greenhouse. Several plant-made pharmaceuticals (PMPs) are now in patient trials (see chart). Moreover, the European Union, the Bill and Melinda Gates Foundation, and the U.S. Department of Defense are fertilizing the field with new funding. “We're actually not doing too bad,” says Julian Ma, an immunologist at St. George's University of London in the U.K. “It's just that everyone is in a hurry.”

    Fields of dreams

    The excitement over plant-made pharmaceuticals began with a 1989 paper in Nature showing that monoclonal antibodies could be produced in tobacco. The paper “really captured the imagination,” says Ma. Monoclonal antibodies were being used to treat a growing number of diseases, from arthritis to cancer, but were expensive to make in mammalian cells. So-called plantibodies appeared to offer a cheaper production method—a kilogram might cost $100 rather than $3 million—and might be simpler to process because they would be free of animal pathogens.

    Temporary transgenic.

    Fluorescing protein shows tobacco leaf's pharming potential.


    Other discoveries followed. In 1995, for instance, Arntzen's group reported in Science that potatoes engineered to make a cholera protein worked as a vaccine when the spuds were fed to mice. Such “edible vaccines” could offer developing countries cheap oral vaccines that didn't require refrigeration, Arntzen suggested (Science, 5 May 1995, p. 658).

    A company called Large Scale Biology Corp. in Vacaville, California, came up with a shortcut. It didn't bother to create a new tobacco strain when it wanted to produce an antigen for a lymphoma vaccine. It simply sprayed tobacco plants with a tobacco mosaic virus carrying the appropriate gene. The leaves produced useful amounts of the vaccine protein within 14 days. The drug worked in mice, suggesting that vaccines tailored to lymphoma patients' tumors could be made in plants in just weeks. And because the plants carried the foreign gene only until they shed their leaves, they were potentially more acceptable than permanently modified crops.

    Steps along the way.

    No plant-made human drug has made it through final clinical trials, but several “pharmed” proteins are close to or on the market as supplements, a vaccine reagent, and a medical device.


    Scores of biotech companies sprang up to commercialize these discoveries, and some big agbiotech companies got involved as well. By the mid-1990s, more than 180 companies and organizations were working on pharming, according to the Biotechnology Industry Organization.

    The companies soon ran into technological snags, however. Biotechnologists couldn't always get plants to express enough protein and had trouble purifying the protein product. Efforts to make edible vaccines stalled after researchers realized that the amount of antigen fluctuated widely from plant to plant. Arntzen thinks that oral vaccines made from dried plant material could work for developing countries, but a vaccine without a strictly controlled dose “would never be approved” in the United States, he says.

    Another reality check: lukewarm interest from the big drug companies. They didn't much care that plant-made drugs would be cheaper to make because production is a small chunk of the cost of drug development; the big-ticket item is clinical trials. The companies were also leery of the regulatory hurdles, because both the drug and the new production process would have to clear FDA. “Most pharmaceutical companies aren't willing to take a chance on a drug produced in plants,” says Roger Beachy, president of the Donald Danforth Plant Science Center in St. Louis, Missouri.

    Also, like other genetically modified crops (see pp. 468, 472), pharma plants can be a public relations nightmare. In 2002, leftover corn plants engineered by ProdiGene Inc. to make a pig vaccine sprouted in a soybean field in Nebraska. For this and an Iowa mishap, the U.S. Department of Agriculture (USDA) fined the company $250,000 and made it pay $3 million to buy and destroy tainted soybeans. The incident stoked opposition from farmers and activists worried about “drugs in your cornflakes.”

    Other companies underestimated the public's concerns. A company called Ventria Bioscience that wanted to conduct field trials of rice containing two breast-milk proteins useful in combating diarrhea drew the ire of rice growers in California, then Missouri. It wound up in Kansas, where no other rice is grown.

    USDA tightened its rules for field trials of pharma plants in 2003 to prevent mistakes like the ProdiGene episode. But skeptics were not assuaged. Bill Freese of the Center for Food Safety in Washington, D.C., says enforcement is “horrendous.” As a result, “we don't think [drugs] should be in any food crops, indoors or outdoors,” he adds. Many ecologists and some plant scientists are also leery of using food crops for pharma. “It's too dangerous,” says Kenneth Palmer, former director of the vaccine program at Large Scale Biology.

    Flower power.

    This transgenic safflower makes seeds containing insulin.


    These concerns drove many companies away from using food crops such as corn for pharmaceuticals. A few big companies, such as Monsanto, dropped PMP research altogether. Stung by bad press and lack of interest from drug companies, many leading plant pharma companies have folded, including ProdiGene and Large Scale Biology. As Palmer puts it, “the field imploded.”

    Close to the clinic

    Despite the setbacks, a handful of companies in the United States and Europe haven't given up. A few have plowed ahead with food crops, grown outdoors, for their pharma products; others have focused on other plants or on unconventional growing schemes.

    Meristem Therapeutics in Clermont-Ferrand, France, plans to start final clinical trials for a corn-grown gastric lipase for cystic fibrosis patients by the end of the year. And the Canadian company SemBioSys Genetics Inc. uses transgenic safflower—“much less of a lightning rod than some other crops,” says CEO Andrew Baum—to produce insulin, which should be in clinical trials this year. Companies such as Protalix and Biolex Therapeutics sidestep the growing of crops altogether: the former with its carrot-cell culture to make a Gaucher disease enzyme, and the latter by producing interferon using duckweed, tiny clonal plants grown as a layer in clear plastic bags. “We are careful not to be associated with whole-plant transgenic technology,” says Protalix CEO David Aviezer.

    New technologies are attracting attention. To boost expression, the German biotech Icon Genetics relies on bacteria to get transgene-laden viruses into tobacco plants. The company dips the plants into a solution of Agrobacterium that carries the DNA for a deconstructed tobacco mosaic virus, which in turn contains the gene for the desired drug. The bacterial bath, followed by a few seconds in a vacuum, gets far more of the virus into plant-leaf tissue than conventional spraying.

    In a 2006 paper in the Proceedings of the National Academy of Sciences (PNAS), the company's researchers reported that this method, combined with other techniques, increases the amount of antibody by up to 100-fold, reducing the size of the crop needed and making it feasible to grow plants commercially indoors. Compared with making a transgenic plant, which takes a year or two to develop, this “magnifection” can go from gene to grams of protein in a couple of weeks. “It's incredibly promising technology,” says Ma, who, like other academic researchers, is trying out magnifection.

    With help from the drug giant Bayer, which bought the company in 2006, Icon Genetics will open a clinical-grade manufacturing plant in June. It expects to begin trials with a cancer vaccine tailored to individual patients in 2009, says CEO Yuri Gleba.

    Bayer's move is a healthy sign of regrowth for the pharming field, Ma and others say. And other new sources of support are helping too. Last month, Pharma-Planta, a €12 million, 5-year, European Union-funded project co-coordinated by Ma, described in PNAS an anti-HIV microbicide grown in corn or tobacco that could be ready for testing next year. The Defense Department and other U.S. government agencies have provided the Fraunhofer USA Center for Molecular Biotechnology in Newark, Delaware, nearly $14 million to use a technique like magnifection to make vaccines. It has tested anthrax and plague vaccines in nonhuman primates and a pandemic flu vaccine in ferrets. “[We] can do things much faster than any other technology,” says Executive Director Vidadi Yusibov, slashing in half the 6 months it now takes to make flu vaccine the traditional way, in chicken eggs. The organization also has $8 million from the Gates Foundation for plant-based vaccines for malaria, sleeping sickness, and flu.

    As visions of endless fields of pharma crops have faded, so have unrealistic expectations for pharming. Scientists say they now realize that they need to be smarter about the marketability of the drugs they develop in plants. They think the best bets—Protalix aside—may be high-volume biologics, such as microbicides, monoclonal antibodies, and vaccines, particularly for use in developing countries. Getting these first low-hanging fruits through clinical trials and FDA approval should allay concerns about safety and environmental risks. Says Palmer, now at the University of Louisville in Kentucky, “Once two or three products [win approval], the field should really take off.”

  14. Uncorking the Grape Genome

    1. John Travis

    Pour a glass of wine and read the tale of two teams racing to sequence the fruit worth billions to the world's economy.



    Among wine connoisseurs, opinions differ about whether 2007 will prove a good year for Pinot Noir. But among plant geneticists, it's the finest vintage ever: Last year, two European teams published high-quality drafts of two Pinot Noir-derived genomes.

    Plant biologists are toasting the genomic double-header. This is the first fleshy fruit and just the fourth flowering plant to have its genome decoded. And in economic terms, grapes top the world's fruit crops: We consume them fresh or dried, crush them into juice, and use them to make wine that can sell for many thousands of dollars a bottle. “The contributions of these sequencing efforts are enormous and historical,” says grape researcher Steven Lund of the University of British Columbia in Vancouver, Canada.

    Wine woes.

    Powdery mildew (left) and other fungal diseases can devastate vineyards.


    The story behind the grape genome is one in which a worldwide scientific community came together, then partially splintered into rival camps; money to support sequencing was hard to come by; and success has brought both new insights and delicious questions. The rivalry provided the drama of the story. For a while, a French-Italian grape genome alliance called Vigna/Vigne looked like it was going to be beaten by a disgruntled researcher who started his own genome effort. “Undoubtedly, competition was a driver here, perhaps in a microcosm of the human genome sequence drama of years past,” says Lund, referring to the bitter contest between public and private programs to decipher our genetic code. Recently, however, at a workshop* in Udine, Italy, the two grape genome groups began to put aside their rivalry. “I'm hopeful there will be more collaboration now,” says Vigna/Vigne member David Horner of the University of Milan in Italy. “It's cool there are two cultivars done. It allows more comparative work.”

    A key motivation for deciphering the grape genome is to prevent a repeat of the economic devastation that struck the European wine industry in the late 1800s. At that time, phylloxera, sap-sucking insects from North America, ravaged European grapevines. Today, winemakers and grape researchers are struggling to combat new threats, particularly downy and powdery mildew, diseases that have made their way to Europe from the United States over the past century. These fungi are an environmental as well as an economic nightmare: Although only about 5% of Europe's farmland is dedicated to wine vineyards, they account for about 70% of the region's fungicide use.

    The new genome information should speed the creation of hardier vines, which has been slow going. “The target now is clearly resistance genes,” says Vigna/Vigne member Michele Morgante from the Institute of Genomic Applications (IGA) in Udine. New insights into the locations of these genes can assist breeders as they try to develop better varieties, for example. And identifying genes in the few grapes that are resistant to drought or pests may pave the way for genetically modifying common wine grapes to have the same attributes.

    But as viticulturists enter the genomic age, many wonder whether the wine industry, particularly the conservative European sector, would dare bypass conventional breeding for genetically modified (GM) grapevines. Scientists have for years been experimenting with GM grapes—usually putting nongrape genes into the fruit's genome—but most winemakers have shied away from public association with such efforts. “For a lot of consumers of wine, especially high-end ones, history and tradition is a very important part of their experience. If you produce wine from a genetically engineered grape, you strike at the heart of that,” says Carole Meredith, a former grape geneticist who now runs a vineyard (see sidebar).

    Taking root

    Although there's a dazzling variety of wines produced around the world, the great majority flow from the juices of a single species, Vitis vinifera, commonly called vinifera. Indeed, in Europe, this grape is the only source of fine wines—other grapes are limited to fruit, juice, or so-called table wines. Thanks to centuries of breeding, 7000 cultivated varieties, or cultivars, of vinifera now provide an incredible diversity of flavors, from hearty reds to light whites.

    A vinifera genome project began to take root in the mid-1990s, largely through the instigation of Meredith, who was then conducting grape research at the University of California, Davis. Meredith and others wanted to use genetics to identify various grape cultivars—Chardonnay, Cabernet Sauvignon, and so on—that can be tricky to distinguish in other ways. In a business in which wines made from different cultivars can vary enormously in price, correct identification is critical.

    The researchers proposed to “fingerprint” grapes based on variable DNA sequences called microsatellite markers. Back then, however, identifying such markers “was expensive and laborious, and no one lab was ever going to develop enough to make their efforts worthwhile,” Meredith recalls. “So I approached all the people I knew and suggested working together.”

    From that suggestion arose an international consortium that amassed hundreds of such markers within several years. The researchers were thirsty for more. In 2001, many of them, including Meredith, formed the International Grape Genome Program, arguing for a full-fledged sequencing of a vinifera cultivar.

    But who would pay for it? It was well-known that grapes weren't on the U.S. genomics menu. In 2001, Enrico Pè of the Sant' Anna School of Advanced Studies in Pisa proposed a grape genome program in Italy, sponsored by the private sector. But bank foundations and winemakers declined.

    Finally, in 2005, the French research agencies INRA and Genoscope joined with various Italian groups, including IGA and universities in Verona and Udine, to form Vigna/Vigne, which means vine in French and vineyard in Italian, respectively. This time, researchers did not ask winemakers for money, only political support, says Pè, who now leads the Italian side of the collaboration. France ultimately contributed about €8.5 million, Italy, about €12 million.

    The race is on

    Except when discussing European football—their national teams are bitter rivals—the French and Italian groups meshed smoothly. Still, agreeing on what to sequence wasn't easy. Compared with many plants, vinifera grapevines are extremely heterozygous: The female and male versions of the plant's chromosomes differ significantly. This complicates the sequencing and assembly of an accurate genome. The Italian scientists first considered a Sangiovese grape, then a Pinot Noir. Their French counterparts lobbied for a Cabernet Sauvignon grape, but after studying its DNA further, the team decided it was too challenging as well.

    Finally, an almost-forgotten grapevine growing in a French greenhouse provided a solution. In the 1980s, a French viticulturist hoping to develop a better vine for winemakers began inbreeding a Pinot Noir. Several generations of selfings produced a few lines with simplified genomes but also stunted their growth and made them unappealing for wine production. Vigna/Vigne decided one of those, PN40024, would offer the best potential for sequencing, even if it was, as Pè jokes, “pathetic, hardly a grape.”

    Just months after Vigna/Vigne began its PN40024 sequencing effort, Pè and his colleagues were shocked to learn of a rival effort. In March 2006, Riccardo Velasco and his colleagues at the Istituto Agrario San Michele all' Adige (IASMA) in Trentino, Italy, announced that they were almost done sequencing the genome of a Pinot Noir grape used in many countries to make red and sparkling wines. Velasco had been involved in some of the original genome planning efforts, including Pè's Italian proposal, but he disagreed with the initial decision to switch from sequencing an outbred Pinot Noir to a Cabernet Sauvignon. His institute then lobbied the Trentino regional government for about €10 million to pursue its own grape genome. IASMA initially kept quiet about the true scope of the project, says Velasco, because “the possibility of failure was high.” Unlike Vigna/Vigne, which kept sequencing in-house, IASMA contracted with Myriad Genetics in Salt Lake City, Utah, to sequence and help assemble most of the genome.

    After the IASMA bombshell, Anne-Francoise Adam-Blondon of INRA says that the French-Italian effort ground to a halt for several months as members discussed whether to spend their governments' money on something that had apparently already been accomplished. But IASMA hadn't published its genome, so Vigna/Vigne decided to press on.

    The race ended with Vigna/Vigne winning by a nose. It published the PN40024 genome online 27 August in Nature; Velasco's Pinot Noir genome appeared online 19 December in PLoS One. Some Vigna/Vigne members felt that Velasco initially downplayed their feat, by, for example, reminding the media that PN40024 wasn't a real Pinot Noir or used for wine. But Velasco attended last month's workshop, which focused on Vigna/Vigne's results, and he was greeted warmly. “The two projects are fully complementary,” he says.

    Sweet finish.

    Riccardo Velasco samples wine from the grape he raced to sequence.


    Decanting the genome

    It's already clear that the two genomes vary significantly. Although both efforts predict that the vinifera genome contains about 30,000 genes, the Pinot Noir sequenced by Velasco's team has about 270 members of a gene family associated with disease resistance, whereas the inbred PN40024 strain has almost 360, Adam-Blondon reported at the workshop. The two grapes also differ in the number of the many hundreds of genes related to the production of polyphenols, flavonoids, and resveratrols—all of which contribute to a wine's color, aroma, and taste.

    Several of the talks in Udine centered on how genomic data could aid the breeding of disease-resistant grapevines. Gabriele Di Gaspero of IGA, for example, is trying to harness the powers of a Central Asian grape that is used for raisins. It is the first vinifera shown to stand up to powdery mildew, a discovery by a Hungarian group that electrified grape biologists a few years ago. Di Gaspero and colleagues have been crossbreeding the grape with a mildew-vulnerable vinifera and have used DNA markers identified through the genome projects to pinpoint the chromosomal location of a disease-resistance gene.

    In theory, researchers can now—even without identifying the exact gene—breed resistance into popular wine grapes such as Chardonnay and Pinot Noir. After crossing the Central Asian grape with their favorite wine grape, they can select and continue to breed just the seedlings whose DNA contains the markers bracketing the mystery resistance gene. Ultimately, such marker-assisted selection could result in a disease-resistant grape that retains most of the qualities needed to make a good wine, instead of raisins. Still, bringing a new grape to market can take decades—and scientists have to be sure the transferred resistance gene is stable in its new genetic surroundings. “When you plant a grape field, it's for 30 years, so you really need durable resistance,” notes Adam-Blondon.

    Grape researchers are also using the new genome data to probe the interplay of genes, environment, and wine flavors in a variety of cultivars. In Udine, Mario Pezzotti of the University of Verona detailed genetic studies of the unusual process that produces the Italian red wine Amarone. The grape involved typically produces a sweet wine, but decades ago winemakers realized that if those same grapes dry for several months after harvest, the withered fruit makes a more intense and bitter wine that has since become highly valued. Although some of his colleagues predicted that genetics had little to do with the withering process, Pezzotti revealed that large numbers of genes are active during this period, including many that influence a wine's taste and aroma. Amarone makers are now following the research with intense interest, he says.

    The Vigna/Vigne and the IASMA teams are now on more cordial terms, but they do have some scientific disagreements. Both have found evidence of whole-genome duplications in vinifera's past—a common feature of plant evolution in which new species arose after an ancestral plant accidentally duplicated its genome or hybridized with another to expand its gene set (see p. 481). Velasco and his colleagues argue that such an event happened relatively recently, after grapes had split off from the branches on which Arabidopsis and poplar belong. Vigna/Vigne, on the other hand, has concluded that the grape genome did not undergo any recent expansion. It instead suggests that vinifera derives from an ancient hybrid that once had six sets of chromosomes. Because each team used different strategies for discerning and dating duplications, says Velasco, “who knows who's right?”

    Indeed, Pè ended the Udine workshop with the reminder that having a grape genome—or two—in hand merely provides a foundation for future work. “Most of the data still have to be digested,” he notes. A glass or two of vinifera's valuable juices, perhaps a nicely aged Cognac, should speed the process.

    • *Tuning the Taste of Wine, 7 March 2008, Udine, Italy.

  15. A Life With Grapes

    1. John Travis

    After studying grape genetics for 3 decades, in 2003, Carole Meredith traded her lab bench for a life of winemaking.


    Carole Meredith spent 3 decades studying grape genetics, and she helped start the grape genome efforts that bore fruit last year. But in 2003, she traded her lab bench for a life of winemaking in California's Napa Valley.

    Before that career shift, Meredith had earned fame for using genetic fingerprinting to resolve the origins of some of the world's great wines, including Chardonnay and Cabernet Sauvignon (Science, 3 September 1999, p. 1562). She and her colleagues garnered the most attention by tracing the roots of the iconic American wine grape Zinfandel to an ancient Croatian grape called Crljenak kastelanski. Such research, Meredith says, “was very attractive to the popular press and wine-geek consumers.”

    It didn't pay the bills, however. Her grapevine sleuthing was never directly funded, and Meredith argues that there is a bias in the United States against supporting grape research despite the fruit's economic importance. “I think that, in large part, it's due to our history of Prohibition, and there's still the feeling in Washington that research related to the alcoholic beverages is not the best use of taxpayer money.”

    To Meredith, it's fitting that Europeans sequenced the grape genome. “In Europe, grapes for wine are a fundamental part of their agricultural heritage and modern economy,” she says.

    Burned out from the bureaucracy of research and the constant search for grants, Meredith retired and joined her husband, Stephen Lagier, who had been working for the Robert Mondavi Winery, to make their own wine using land they purchased in 1986. Doing almost everything themselves—vineyard work, winemaking, bottling, sales—the couple now produce a well-regarded Syrah.

    Although she may have helped pave the way for genetically modified (GM) vines, Meredith says she wouldn't plant them herself, for practical reasons. She has no concerns about the safety of GM grapes and believes vines engineered to resist disease could be useful. But Meredith predicts “tremendous consumer resistance” to GM wines. “I'm a realist,” she says. “The only thing that would convince me to switch to a genetically engineered grape … is if my alternative was a dead vineyard.”

    Meredith loved her research career, particularly the historical studies of wine grapes. But today, sitting in her house on a Napa mountainside watching birds fly over her vineyards, she's content making delicious wine with grapes from her own land. “It's a lovely existence,” she says.

  16. Sowing the Seeds for High-Energy Plants

    1. Eli Kintisch

    New crops and improved genetics could be key to successful biofuel agriculture.


    Carpet sale.

    Test plots assess corn, Miscanthus, and switchgrass (left) as biofuels.


    Once, plant breeders dreamed of plumper tomatoes, heartier soybeans, and juicier corn kernels. These days, visions of squat poplars and earless corn stalks are dancing in their heads. They are hoping these new-fangled crops will make cost-effective biofuels.

    The dominant method of making biofuels today, converting sugars from crops such as corn or sugarcane to ethanol, threatens the food supply and imposes environmental costs. Ultimately, processing cellulose from the cell walls of stems and leaves, which are generally discarded, would make better use of agricultural acreage, as would increasing the oil content of oil-producing crops. In the United States, both government-supported genomicists and privately funded plant scientists are expanding plant genomics research and field studies to figure out the species with the best biofuel potential and how to wring more energy out of each acre planted.

    Many researchers are looking at well-known species whose genomes have already been sequenced for clues to making other plants better energy crops: Arabidopsis, rice, poplar, and now, corn (Science, 7 March, p. 1333). Others plan to tackle sequencing projects for species few had cared about until a few years ago. These include perennials such as switchgrass (Panicum virgatum) and Miscanthus, both considered good candidates for energy crops because of their high cellulose content. And some scientists are breeding a wide variety of candidate crops around the world, hoping to find optimal varieties. “The spotlight is on this underdeveloped field,” says plant biochemist Kenneth Keegstra of Michigan State University in East Lansing, part of the Great Lakes Bioenergy Research Center.

    Humans have been growing food crops for 10,000 years, but the effort to produce fuel down on the farm is in its infancy. Studies of the genetic factors that control cell walls are just revving up, and key finds have often occurred serendipitously. “We're at such a basic stage,” says Great Lakes center biologist Richard Amasino. For example, in examining why certain varieties of maize showed sugar-rich yellow splotches—researchers dubbed them “tie-dyed” mutants—plant geneticist David Braun of Pennsylvania State University in State College found abnormal amounts of cellulose accumulating in the mutants' cell walls, a potentially useful feature in biofuel crops. Braun hypothesizes that the genes involved could be used to boost the cellulose content of grasses. And last year, Keegstra and colleagues described a gene in Arabidopsis that likely encodes the enzyme making one of the four hemicelluloses, which, along with cellulose and lignin, make up cell walls. Keegstra believes a cell wall engineered to include more hemicellulose might be more easily digested by biofuel-processing enzymes.


    Keegstra and Amasino's institute is one of three centers established by the U.S. Department of Energy (DOE), which is providing each with $135 million over 5 years to bring genomics to bear on biofuels. Another, the Joint BioEnergy Institute, led by Lawrence Berkeley National Laboratory, will try to find other relevant genes in Arabidopsis and rice. The BioEnergy Science Center at Oak Ridge National Laboratory in Tennessee will focus on poplar and switchgrass. (Each includes enzyme scientists and microbiologists chasing better techniques for breaking cellulose down into fermentable sugars.) In addition, the DOE Joint Genome Institute (JGI) in Walnut Creek, California, is training its sequencing firepower on energy crops.

    “There's nothing like having a genome,” says JGI Director Edward Rubin. Take poplar, the first tree sequenced (Science, 15 September 2006, p. 1596). Before its code was completed in 2006, researchers knew of only 23 genes for proteins called auxin response factors, which control growth. Scientists have now identified 40 in poplar and shown in experiments with mutants that they can manipulate the genes to grow tall, fat, or short varieties. Making shorter branches, for example, could allow foresters to plant trees closer together and maximize biomass density.

    Private companies are also chasing after biofuel genes. Synthetic Genomics in San Diego, California, co-founded by geneticist J. Craig Venter, is looking for genes to increase oil yield, drought tolerance, and disease resistance in oil palm, used to make biodiesel in Southeast Asia and Africa. BP's Energy Biosciences Institute at the University of California, Berkeley, a $500 million center started last year, is mapping the genes in the fast-growing Miscanthus, a 1.5-meter-high perennial that has been tested as a cellulosic ethanol crop in European trials for 15 years.

    The BP institute is also using traditional breeding to study Miscanthus in Illinois, sugarcane in Brazil, and sweet sorghum in China, among other projects. It's important to cast a wide net for biofuel crops, says Christopher Somerville, the institute's director. “There's still a lot of uncertainty about what the optimal species are.”