News this Week

Science  17 Oct 2008:
Vol. 322, Issue 5900, pp. 356


    Falsification Charge Highlights Image-Manipulation Standards

    1. Gretchen Vogel*
    1. With reporting by Rachel Zelkowitz.

    Controversy continues to plague work from the lab of prominent stem cell researcher Catherine Verfaillie. The University of Minnesota (UM) announced last week that an academic misconduct committee had concluded that Morayma Reyes, while a graduate student in Verfaillie's lab there, “falsified” four data images in figures in a 2001 stem cell article. The committee found that misconduct allegations against Verfaillie were unsubstantiated, but it did criticize her oversight and mentoring of lab personnel. The new charges come a year after questions were raised about the misuse of images in another key stem cell publication from the group (Science, 2 March 2007, p. 1207).

    Reyes, now an assistant professor of pathology at the University of Washington (UW), Seattle, and Verfaillie, who now heads the Stem Cell Institute at the Catholic University of Leuven in Belgium, both acknowledge that errors were made in the preparation of the 2001 paper. But Verfaillie defends her supervision, and Reyes says that for several of the disputed figures she merely made global adjustments to the brightness and contrast of data images, without any intent to deceive. “These errors were unintentional and were common and accepted practices at the time,” Reyes wrote in an e-mail to Science.

    The paper, published in Blood, claims that stem cells purified from human blood can form precursors of bone, fat, cartilage, and muscle cells, as well as the endothelial cells that line blood vessels. At the time, blood stem cells weren't thought to be that versatile. Verfaillie and Reyes say the figure errors do not alter the Blood paper's conclusions, but Verfaillie has asked the journal to retract the paper, calling it “the proper course in this situation.”

    The Blood paper relates to work that the group later published in Nature, reporting that cells from mouse bone marrow could become a wide variety of cell types. Several groups have reported trouble reproducing that paper's results (Science, 9 February 2007, p. 760). Then last year, Nature conducted a re-review of the paper when a journalist at New Scientist questioned whether some data shown were identical to those in another paper. A UM investigation concluded that any duplication was the result of honest error. Nature published several corrections but said that the paper's conclusions were still valid; Verfaillie continues to stand by the work.


    New Scientist also alerted the university to an apparent duplicated image in the Blood paper (Science, 30 March 2007, p. 1779). The university then convened a new committee, which submitted its final report on 5 September. The school last week stated that the committee found that in four of the seven figures in the Blood paper, “aspects of the figures were altered in such a way that the manipulation misrepresented experimental data and sufficiently altered the original research record to constitute falsification.” The committee cited “elimination of bands on blots, altered orientation of bands, introduction of lanes not included in the original figure, and covering objects or image density in certain lanes,” the statement says.

    The university has not released the full report, citing privacy laws, and experts in image analysis say it is hard to determine intentional fraud solely from the original paper. James Hayden, manager of the microscopy core facility at the Wistar Institute in Philadelphia, Pennsylvania, says that to make a clear point, scientists often alter images, sometimes more than they should. Good laboratory practice means all such adjustments should be noted in a paper and copies of the original image files kept, he says. Jerry Sedgewick, head of the Biomedical Image Processing Lab at UM and one of Reyes's mentors, says he is not convinced that she did anything wrong with the image adjustments she made. “This is done routinely and has been done since film and imaging began,” he says.

    During the investigation, Reyes asked George Reis, who heads the consulting firm Imaging Forensics in Fountain Valley, California, to assess whether changes made between the original image scans and the published images could be due to “global” adjustments, which would imply there was no intent to deceive. Reis told Science that he did determine that significant global adjustments could account for “most of the changes in most of the images.” But he says he did not examine the images specifically for signs of editing such as adding or deleting individual lanes.

    UM says it has forwarded the panel's report and supporting materials to the federal Office of Research Integrity in Rockville, Maryland. UW is waiting for more information from UM before deciding whether to discipline Reyes, according to a spokesperson.

    Both Verfaillie and Reyes say they have implemented much stricter rules for dealing with data images in their labs as a result of the case. “I have learned a hard lesson,” Reyes e-mailed Science. “Now that I am a mentor … I will make sure that my students will get the proper training, supervision and education.”


    DNA Test for Breast Cancer Risk Draws Criticism

    1. Jennifer Couzin

    It's been 8 years since the human genome was sequenced with the promise of revolutionizing medicine, and in that time, efforts to put DNA discoveries into the doctor's office have only grown more controversial. The latest tussle came last week after deCODE Genetics, an Icelandic company, released the first-ever breast cancer risk test designed to cover common forms of the disease. The rollout and reaction were predictable: deCODE hailed the test as offering women found to be at higher risk a chance to take advantage of more aggressive screening. Many oncologists and geneticists decried the $1625 test as premature because it includes just seven of the dozens or hundreds of genetic variants driving breast cancer that scientists expect to identify soon.

    Unlike some new genetic tests, this one is not in question over its science: The seven variants it uses, all single-nucleotide polymorphisms (SNPs) found in the last couple of years, have been linked to an increased risk of breast cancer in thousands of women, mainly of European descent. Five of the variants were identified by a group at the University of Cambridge, U.K., using genome-wide association. These SNPs are in the public domain, and companies can incorporate them into new products.

    The test does not check for mutations in BRCA1 and BRCA2, two genes that dramatically increase the risk of breast cancer. Those mutations are rare, accounting for only a small fraction of cases, and they have been patented by a company in Utah that holds exclusive rights to test for them in the United States.

    The quandary presented by deCODE's breast cancer test, its sixth genetic risk test for a common disease, reflects a broader puzzle in genetics. Each new disease-linked SNP scientists uncover confers only a slight increase in risk, often no more than 20%. That might boost someone's lifetime chance of a chronic disease from 8% to 10%, so small as to be of questionable use to an individual. Even having several of these SNPs isn't likely to increase risk more than 100%, which amounts to a doubling. For breast cancer, that's roughly equivalent to having one family member with the disease.
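    The arithmetic behind these figures is worth making explicit, since relative and absolute risk are easy to conflate. A minimal sketch (the function and numbers below are our illustration, using the example values quoted above, not a formula from any actual risk test):

```python
# Relative vs. absolute risk, using the article's example numbers:
# a 20% relative increase applied to an 8% baseline lifetime risk.

def absolute_risk(baseline, relative_factors):
    """Apply each relative-risk factor to a baseline probability.

    Multiplying factors together like this is a simplifying assumption
    for illustration only, not how any particular test computes risk.
    """
    risk = baseline
    for factor in relative_factors:
        risk *= factor
    return risk

baseline = 0.08  # 8% baseline lifetime risk

# One variant conferring a 20% relative increase: 8% -> 9.6%,
# roughly the "8% to 10%" shift described above.
one_variant = absolute_risk(baseline, [1.20])

# A 100% increase (a doubling), the upper end mentioned above: 8% -> 16%.
doubled = absolute_risk(baseline, [2.0])

print(round(one_variant, 3))  # 0.096
print(round(doubled, 3))      # 0.16
```

The point of the sketch is that even a doubling leaves the absolute risk modest, which is why single-variant findings translate into such small shifts for an individual.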

    Tipping point.

    A $1625 genetic test will inform women if they're at high enough risk to merit MRI screening for breast cancer, the manufacturer says.


    Whereas some argue that such information isn't robust enough for clinical use, others see no reason to hold off. “It goes against our tradition to say, 'Let's wait until we discover more,'” says Kári Stefánsson, a neurologist and the chief executive officer of deCODE. The risk his test uncovers, he argues, is meaningful: Published work suggests that about 5% of women who use the company's test will find that they have a 20% risk of breast cancer; average risk is just over 12%. At 20% risk, U.S. guidelines recommend additional screening using magnetic resonance imaging (MRI).

    “This is a suitable test for anyone,” says Owen Winsett, a surgeon and director of the Breast Center of Austin in Texas. Winsett contacted deCODE last spring after learning about its work, and the company quotes his favorable comments in its press release. Winsett, who says he has not received any compensation, is backing his words with action. In the last month, he has offered the test to about 25 patients worried about breast cancer and recommended regular MRIs for some based on the results. Winsett says he would support prescribing tamoxifen, a drug taken to prevent breast cancer, based on the deCODE test results, though he hasn't done that yet.

    Others are more wary. “Any test, even based on the best SNPs so far, will probably misclassify a substantial fraction of women,” says David Hunter, a genetic epidemiologist at Harvard School of Public Health in Boston. That's because many believe the genetic risk identified in DNA so far is only a few percent of what will eventually be discovered. “Women need to know that their risk estimates might actually change over time as more variants become available,” Hunter adds, noting that some labeled high-risk may later learn that the news isn't so bad, or vice versa.

    “What you're seeing is someone's risk based on a small subset of variants,” agrees Douglas Easton, a genetic epidemiologist at the University of Cambridge in the U.K. who led the team that identified five breast cancer SNPs last year. “You don't know what the whole hand is.”

    Mitchell Gail, a medical statistician at the National Cancer Institute in Bethesda, Maryland, who designed a commonly used breast cancer risk model, earlier this year analyzed how much risk predictions would be strengthened by testing for most of the SNPs in deCODE's test. (Predictions are now based on factors such as family history.) “I'm not seeing a lot of improvement,” says Gail, who published his analysis in July. He estimates that about 300 SNPs are needed to dramatically improve risk forecasts for breast cancer.

    Stefánsson finds such arguments infuriating. “They basically say we should wait until we have discovered everything about breast cancer,” he says of his colleagues. “That is somewhere between ridiculous and incredibly dangerous.”


    Hawaii Marine Lab Fights to Stay Afloat

    1. Elizabeth Pennisi

    Just a few kilometers down the coast from Waikiki Beach in Hawaii, the death knell is tolling for the University of Hawaii's (UH's) Kewalo Marine Laboratory. The university has agreed to give up its lease on the 35-year-old lab 17 years early and plans to move the lab's faculty to its other marine lab on Coconut Island, or to the main Manoa campus, or even to the Waikiki Aquarium. The state redevelopment agency, which owns the land, plans to tear down the waterfront lab to expand a park and set up a public arts space.

    Word of the lab's demise is spreading through blogs and listservs, setting off protests across the globe. Last week, biologist Paul “PZ” Myers of the University of Minnesota, Morris, called the decision “short-sighted” in his blog, Pharyngula. In a letter to UH officials, Alessandro Minelli of the University of Padua in Italy praised the lab's evolutionary and developmental research and warned: “Stopping this activity would be a disaster for biology.”

    But Gary Ostrander, UH Manoa vice chancellor for research and graduate education, insists that he has no alternatives. “I don't like the idea of closing a marine lab, but we are a university that's struggling with budgets, and I have needs that are more pressing right now,” he told Science.

    The lab was established in 1972 with a focus on using marine animals and molecular methods to study cell and developmental biology. Given Hawaii's location and the central role marine resources play in Hawaii's economy, it would be “tragic” if the lab closes, says its director, Mark Martindale. “We will have the same number of marine labs as Alabama and Ohio.”

    Waterfront property.

    The University of Hawaii's Kewalo Marine Laboratory may soon be demolished for a park.


    The closure will be “really creating a deficit” both for Hawaii and the country, says George Boehlert, director of the Oregon State University Hatfield Marine Science Center in Newport. “You have very limited resources in the United States to do work on coral reefs, so this would be a significant blow to the research capacity of the United States.”

    But Ostrander says the lab is falling apart, and for several years the landlord, the Hawaii Community Development Authority, has been pushing to get the land back before the lease is up in 2030. Ostrander says he's looked into moving the lab a few blocks back from the water but has been unable to raise the $30 million estimated to be needed to rebuild the facility in this new location.

    Places like the Kewalo lab “tend to have a fragile existence,” being small and off campus and therefore more vulnerable to being closed down, says James Sanders, president of the National Association of Marine Laboratories and director of the Skidaway Institute of Oceanography in Savannah, Georgia. Indeed, Kewalo has been overshadowed by UH's other marine lab, the 15-year-old Hawaii Institute of Marine Biology, which also has dorms and conference facilities. But Sanders says that Kewalo is “well-respected” and that he would like to see it protected somehow. “We tend to view marine labs as windows on the ocean,” he says. “I hate to see any of those windows shut.”


    Two Strikes and You're Out, Grant Applicants Learn

    1. Jocelyn Kaiser

    Taking some by surprise, the National Institutes of Health (NIH) announced last week that scientists applying for grants will get only one chance to resubmit a rejected proposal. The current policy, which allows two revisions, has bogged down the review process and forced investigators to wait in line for funding, NIH says. Giving applicants just one more try should get the best science funded sooner.

    The change is in response to an advisory panel that identified problems in peer review earlier this year. The panel found that because more researchers are applying for money at a time when NIH's budget has stopped growing, study sections are shying away from funding applications submitted for the first time. Instead, NIH data show, even investigators with very strong proposals must resubmit at least once. This has increased the workload for reviewers and applicants, and it means that many grantees wait up to 2 years for a decision. The advisory panel had a radical solution: Abolish revised proposals and consider all applications “new.”

    Some scientists, including the 80,000-member Federation of American Societies for Experimental Biology (FASEB), argued that was too harsh. In June, NIH officials said they planned to continue to permit more than one revision but would “rebalance” the system to lower the success rates for resubmitted proposals (Science, 13 June, p. 1404).

    Over the summer, NIH decided to scrap the rebalancing idea, says Anthony Scarpa, director of NIH's Center for Scientific Review. “This goes further and achieves the same thing,” he says. Beginning in January, only one amended application will be allowed. If that is rejected, the applicant “should substantially re-design the project,” states an 8 October notice.

    “It's a reasonable compromise,” says Princeton University geneticist David Botstein, a member of the peer-review advisory committee. “It will push study sections in the direction that we want them to go.” But “there has not been a lot of enthusiasm” among FASEB members, says Howard Garrison, the society's public-affairs director, who worries that “meritorious projects” will not get funded. Still, Garrison notes, there is little point in protesting: The new policy is final.


    Most Devastating Mass Extinction Followed Long Bout of Sea Sickness

    1. Richard A. Kerr

    Dying in a cesspool may not have the popular appeal of perishing in a giant asteroid impact, but fouled waters are looking more and more like the cause of the ocean's greatest mass extinction 252 million years ago. The witches' brew that may have wiped out 90% of marine species in a geologic moment “had been stewing for millions of years,” says paleontologist David Bottjer of the University of Southern California in Los Angeles. Geochemists and paleontologists have new evidence that fouled ocean water “was a prelude to the mass extinction,” Bottjer says, not just the immediate driving force behind it.

    Geochemically, traces of bad water are preserved in the chemical and isotopic composition of marine sediments laid down late in the Permian Period. At the upcoming fall meeting of the American Geophysical Union (AGU) this December, biogeochemist Changqun Cao of the Nanjing Institute of Geology and Paleontology of the Chinese Academy of Sciences and his colleagues will report their analyses of organic matter from a rock core drilled at Meishan in southern China. Throughout the core, which spans the extinction and the 3 million years before it, they found distinctive hydrocarbons produced only by green sulfur bacteria. These bacteria live only in sunlit (and thus shallow) waters that are oxygen-free and loaded with usually noxious hydrogen sulfide. Toxic levels of carbon dioxide are common under such conditions as well.

    Early losers.

    Brachiopods (shells) and spongelike Bryozoa suffered an early battering.


    Signs of similar circumstances inimical to higher life forms, much like those of the stagnant depths of the Black Sea today, are widespread in the geologic record of the Permian-Triassic mass extinction as well as the few million years before it, says geochemist Roger Summons of the Massachusetts Institute of Technology in Cambridge, a co-author on the AGU presentation. Judging by his work and that of others, the encroachment of these foul waters was “progressive and pervasive” around the late Permian world, Summons says.

    Bottjer and his colleagues found signs of the same progressive deterioration of environmental conditions in the fossil record of the late Permian. Reporting in the September issue of GSA Today, they summarize their published and in-press analyses of the fossil records of three late Permian animals: clamlike, stalked brachiopods; encrusting, coral-like bryozoans; and bottom-dwelling mollusks. They counted not just the number of species but also the abundance of individual fossils as a clue to ecological changes. They found that starting as much as 8 million years before the extinction, the abundance of brachiopods in deeper, offshore waters began to drop; bryozoans started becoming less diverse there, while mollusks proliferated. The more mobile mollusks would dominate the oceans after the extinction.

    “I certainly don't mean to say the mass extinction wasn't abrupt,” Bottjer says, but “it was brewing for a long time, and it was coming from deep water.” Perhaps all that the rising foul waters needed to precipitate the crisis of the extinction was a trigger, he says. The huge, climate-altering eruption of the Siberian Traps coincides with the extinction (Science, 17 September 2004, p. 1705) and could have sent foul waters shoreward.

    The new results from the late Permian “are starting to swing the pendulum back” toward a more protracted episode of change for life, says paleontologist Paul Wignall of the University of Leeds, U.K. The exact nature of that episode—its pace throughout and the identity of the killer agent or agents—remains to be determined, he says.


    Skewed Symmetries Net Honors for Particle Theorists

    1. Adrian Cho*
    1. With reporting by Dennis Normile in Tokyo.

    When Yoichiro Nambu won half of this year's Nobel Prize in physics he was surprised, ironically, because he had grown so used to others' saying that he deserved it. “I've been told that this is a possibility for many, many years,” says Nambu, 87, who emigrated from Japan in 1952 and has been at the University of Chicago in Illinois since 1954. “I was not expecting it.” Nambu's peers are happy to see him finally honored. “It's high time,” says theorist Jonathan Ellis of the European particle physics laboratory, CERN, near Geneva, Switzerland. “This prize could have been awarded when I was a graduate student in 1968.”

    Nambu is cited for applying a concept called spontaneous symmetry breaking to particle physics. The other half of the prize goes to Makoto Kobayashi, 64, of the High Energy Accelerator Research Organization (KEK) in Tsukuba and Toshihide Maskawa, 68, of Kyoto Sangyo University, both in Japan. While trying to explain an asymmetry between matter and antimatter, they correctly predicted the existence of new fundamental particles.

    Triumphant trio.

    Nambu, Maskawa, and Kobayashi teased deep insight out of “broken” symmetries in interactions of subatomic particles.


    Spontaneous symmetry breaking occurs whenever the forces within a system are in some way symmetric but the lowest energy “ground state” that the system nestles into is not. Balance a pencil on its tip, for example, and gravity will pull the pencil down but won't tug it in any particular horizontal direction. The symmetry “breaks” only when the pencil flops onto the table in some random direction.
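    The pencil analogy has a standard textbook counterpart (our illustration, not drawn from the article): a potential energy that is perfectly symmetric in a field φ, yet whose lowest-energy states are not at the symmetric point.

```latex
% The "Mexican hat" potential, symmetric under phase rotations
% \phi \to e^{i\alpha}\phi because it depends only on |\phi|:
V(\phi) = -\mu^2\,|\phi|^2 + \lambda\,|\phi|^4, \qquad \mu^2,\ \lambda > 0.
% Its minimum is not at the symmetric point \phi = 0 but on the circle
%   |\phi| = \mu/\sqrt{2\lambda},
% and the system must settle at one point on that circle, breaking the
% symmetry spontaneously, like the pencil toppling in one random direction.
```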

    In the early 1960s, Nambu applied this concept to the interactions of protons and neutrons, known collectively as nucleons. Physicists knew the strong force that binds nucleons into atomic nuclei is conveyed by particles called pions whizzing between them. Nambu assumed a subtle symmetry among nucleons involving the way the particles spin and the fact that, to the strong force, a proton is essentially indistinguishable from a neutron. He showed that if the symmetry were spontaneously broken, pions had to emerge with exactly the mass and other properties they were already known to have.

    Spontaneous symmetry breaking has since become a key tool in particle theory. Physicists think that spontaneous breaking of a different symmetry explains how fundamental particles obtain mass. That bit of conceptual calisthenics also predicts a new particle, the long-sought Higgs boson.

    Kobayashi and Maskawa made one of science's more inspired educated guesses. In 1972, physicists knew of three types of particles called quarks: the up quark and down quark that make up nucleons, and the strange quark found in fleeting particles called K mesons. Kobayashi and Maskawa predicted that at least six types of quarks had to exist.

    “It was very brave,” says Michael Gronau, a theorist at Technion-Israel Institute of Technology in Haifa. “Nobody took the model seriously when it came out, but it turned out to be the right one.” From ever-higher-energy particle collisions the charm quark emerged in 1974, the bottom quark in 1977, and the top quark in 1995.

    Of course, the two theorists didn't just toss out a number. They were trying to explain an asymmetry between matter and antimatter called charge-parity (CP) violation that had been observed 8 years earlier in the decays of K and anti-K mesons. Kobayashi and Maskawa found that if there were six quarks, then there would be enough theoretical wiggle room so that, in particle decays, effects conceptually akin to the interference between waves could create differences between matter and antimatter.

    Their scheme was confirmed to high precision when, starting in 1999, experimenters at KEK and the Stanford Linear Accelerator Center in Menlo Park, California, studied decays of B mesons, which contain bottom quarks and are the only other particles so far to exhibit CP violation. That was both a triumph and a disappointment. The Kobayashi-Maskawa “mechanism” produces too little CP violation to explain the bigger mystery of why the universe contains so much matter and so little antimatter. So, many physicists had hoped to see discrepancies that would lead to a more complete theory.

    Even Kobayashi says, “in the sense that a deviation is a clue to new physics, then, of course, it's desirable.” Nobels aside, science's most coveted prize is still a deeper understanding.


    Theorist Revolutionized Study of What Gets Made Where

    1. Adrian Cho

    A Nobel Prize often gives its winner a first taste of fame. The winner of this year's Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel was already well known. Paul Krugman, an economist at Princeton University, a columnist for The New York Times, and a best-selling author, won it for his analyses of international trade and economic geography.

    “The papers he writes are so simple and crystal clear that in hindsight, you might say, ‘I could have thought of that,’” says Samuel Kortum, an economist at the University of Chicago. “But nobody did” until Krugman did. In 1979, Krugman reshaped the study of international trade. Economists had generally argued that countries trade with one another in response to differences between them. So a country that makes cars might trade with one that produces cotton to their mutual benefit. In reality, a country that produces cars often ends up trading with another country that makes cars.

    Krugman explained that seemingly illogical “intra-industry trade” by taking into account “economies of scale”—the fact that the cost of producing an object decreases the more you make. If two countries trade, then each will effectively enlarge its economy, leading to a proliferation of goods and companies. However, to exploit economies of scale, each company will be based in one country or the other, even if that means Sweden sends Volvos to Germany as Germany sends BMWs to Sweden, Kortum explains.

    Back in the day.

    Perhaps best known now as a pundit, Krugman performed seminal analyses of trade and city formation.


    In 1991, Krugman applied similar thinking to the growth of cities. He showed that, because more economically developed areas provide a wider variety of goods, they will also attract more people, further fueling economic growth and speeding the process of agglomeration. “Krugman's insight was that the towns themselves are sources for the demand for products,” says Vernon Henderson, an economist at Brown University.

    A self-proclaimed liberal, Krugman has criticized the policies of President George W. Bush. But Henry Overman of the London School of Economics says Krugman won for his economic insights and not his politics—although he notes that Krugman's win is “not going to be very popular with Bush lovers.”


    Three Scientists Bask in Prize's Fluorescent Glow

    1. Robert F. Service

    Early in the summer of 1961, a young Japanese organic chemist named Osamu Shimomura wedged himself into a station wagon packed with lab equipment and three other passengers and drove across the United States looking for the secret of what makes certain jellyfish glow in the dark. The work was pure curiosity-driven basic research, Shimomura says. But his discovery—a luminescent protein known as green fluorescent protein (GFP)—blossomed into one of the most powerful imaging tools for molecular biology. Last week, it earned Shimomura and two Americans—Martin Chalfie of Columbia University in New York City and Roger Tsien of the University of California, San Diego—this year's Nobel Prize in chemistry.

    Shimomura, who retired in 2001 from the Marine Biological Laboratory in Woods Hole, Massachusetts, set out from Princeton for San Juan Island off the coast of Washington state to collect samples of the jellyfish Aequorea victoria, whose outer edge glows green when the jellyfish is agitated. Initially, Shimomura, working with Princeton University biologist Frank Johnson, isolated a blue luminescent protein they called aequorin. But almost in passing, their 1962 paper describing aequorin also mentioned that they had found another protein, later called GFP, that puts out a soft green glow. “I had no idea” GFP would go on to become such a major tool for biologists, says Shimomura, who continues to study bioluminescence in a lab in his home.

    Decades later, Douglas Prasher, a biologist then at the Woods Hole Oceanographic Institution in Massachusetts, did get an inkling of GFP's future utility. Unlike aequorin and other bioluminescent proteins, GFP doesn't need other proteins or cofactors to glow. That property raised the possibility that GFP's light-emitting capability could be transferred to other organisms by outfitting them with just a single extra gene. In a 1992 paper in Gene, Prasher reported sequencing and cloning the gene for GFP. Prasher and others later tried inserting the gene and expressing it in bacteria, without success.

    Bright idea.

    Shimomura, Chalfie, and Tsien discovered luminescent proteins that have cast a new light on cells and tissues (top).


    In 1994, Prasher teamed up with Chalfie and three other colleagues. The team reported in Science that they had successfully cloned the gene for GFP into Escherichia coli bacteria and the Caenorhabditis elegans worm and could use its luminescence to track the expression of neighboring genes (Science, 11 February 1994, p. 802). The result set off an avalanche of interest in using GFP as a marker to investigate everything from how cells develop to what makes cancer cells metastasize. The fact that Prasher didn't share in the Nobel, Chalfie says, “is a very bittersweet aspect to this award.”

    But Chalfie and others agree that Tsien also richly deserves the recognition. Over the past 2 decades, Tsien has created an extended family of GFP relatives that fluoresce in a palette of colors across the visible spectrum. These enable biologists today to track the expression of multiple genes and other molecules inside cells simultaneously. Tsien says he periodically considers moving on from making fluorescent proteins, but the steady progress in the field “isn't making it easy to give it up.”

    “The fluorescent proteins have revolutionized medical research,” says John Frangioni, an oncologist and imaging expert at Harvard Medical School in Boston. Last year, more than 12,000 papers reported using GFP and other fluorescent proteins, according to Marc Zimmer, a chemist at Connecticut College in New London whose book Glowing Genes recounts the discovery of fluorescent proteins. Today, GFP and other fluorescent proteins “are probably as important as the development of the microscope,” he says. That underscores the value of basic research: If Shimomura's pursuit of jellyfish fluorescence were funded today, Zimmer says, it would be more likely to earn scorn than anything else. It would be “a great candidate for the Ig Nobels.”

    Q&A: China's Scientist Premier

    1. Hao Xin,
    2. Richard Stone

    In a rare one-on-one interview, Premier Wen Jiabao spoke with Science about China's efforts to ground its economic and social development in sound science.


    Taking charge.

    Hours after the Sichuan earthquake struck, Wen was on the scene.


    BEIJING—2008 has been a roller-coaster ride for China and for Premier Wen Jiabao. Recent highs were the spectacular Olympics and the successful space walk late last month during the Shenzhou-7 mission, a key step toward China's aspirations of building a space station and sending astronauts to the moon. Lows included the Tibet riot, a devastating earthquake in Sichuan Province, and the tainted milk scandal.

    In 2003, early in his first term as head of China's government, Wen promoted measures to address the spread of AIDS and the emergence of SARS. His leadership qualities were tested again after the 12 May Wenchuan earthquake. Within hours, Wen was on the scene, rallying rescuers and comforting victims.

    Wen led the earthquake response with technical authority few politicians anywhere could match. The Tianjin native studied geological surveying as an undergraduate and geological structure as a graduate student at the Beijing Institute of Geology from 1960 to 1968, then spent the next 14 years with the Gansu Provincial Geological Bureau in western China. In the 1980s, Wen rose through the ranks of the Communist Party and became vice premier of the State Council, China's Cabinet, in 1998 and premier in 2003. Wen began a second 5-year term as premier last March.

    In a 2-hour conversation with Science Editor-in-Chief Bruce Alberts at the Zhongnanhai leadership compound in the heart of Beijing on 30 September, Wen, 66, spoke candidly and forcefully, without notes, on everything from social and economic development being the “wellspring” of science and technology to cultivating scientific ethics and reducing China's reliance on fossil fuels. Here are highlights edited for clarity and brevity; a more complete version is posted on the Science Web site.

    Bruce Alberts: You were famous all over the world for going to the site of the earthquake as a professional geologist immediately afterwards and having a great effect on China's response. Could you tell us more about your response to the earthquake and what you see in the future in the way of earthquake protection for China?

    Wen Jiabao: When the Wenchuan earthquake occurred on 12 May, I was sitting in my office. Beijing shook, too. My instinct told me it was an earthquake. I instantly knew this disaster would affect a large area and the devastation would be severe.

    I decided to go to the scene immediately. I understood clearly the importance of the [initial] 72 hours and especially the importance of the first day in saving people's lives. Simply put, the faster the better.

    Within 3 days, we mobilized a force of more than 100,000 people and rescued some 80,000 from underneath the rubble. Often it is the inattention to aftershocks that causes more severe damage than the main shock. This required us to mobilize the residents to leave their homes and to find shelter.

    More than 100 quake lakes were formed, the largest of which was Tangjiashan quake lake, which contained 300 million cubic meters of water. A possible bursting of the dammed-up quake lake would endanger large cities such as Mianyang and more than 10 million people along the path of the water. I went to the site of the quake lake many times and, together with engineers and experts, researched technical solutions and decided to solve the quake-lake problem quickly, safely, and efficiently. We dealt with perhaps the biggest quake lake in the world very successfully; not a single person was injured or died.

    We need to gradually restore life, production, and ecological function in the region. This is a very arduous task.

    B.A.: Will new buildings in this area be built in a special way to make them highly resistant to future earthquakes?

    W.J.: We must establish building codes according to the magnitude and intensity of possible earthquakes in this region. Especially for public buildings such as schools and hospitals, we need to apply even safer standards, to assure parents and make children feel at ease.

    B.A.: In the United States, we read every day about what you were doing and the earthquakes … but also how people came as volunteers from all over China to try to help.

    W.J.: We put into practice the principle of opening to the outside and announced news about the earthquake in real time to China and also to the world. The reason we did this is to tell people ways to avoid harm and help them properly settle [in shelters].

    B.A.: I assume that what you did in the earthquake is related to your new campaign to implement something you call “The Scientific Outlook on Development.” I think most of us don't understand exactly what that is. Could you explain what the plans are and how Chinese scientists are going to contribute?

    W.J.: The number-one principle is to put people first. The second is comprehensive development, the integration of economic development with social development, the integration of economic reform with political reform, the integration of an opening-up and inclusive approach with independent innovation, and the integration of advanced civilization with traditional Chinese culture. Thirdly, we need to resolve the disparities—rich-poor disparity, regional disparity, and urban-rural disparity—in our country's developmental process. Fourthly, sustainable development: That is, to meet the challenges of population, resources, and environmental protection faced by a population of 1.3 billion in its modernization process. We want to achieve sustainable development by adopting a resource-conserving and environment-friendly approach. These four goals cannot be achieved without science and technology or without innovations.

    B.A.: We just published a major article from China (Science, 19 September, p. 1676) that shows that your transgenic cotton, used in your country, has reduced the need for pesticides not only for the cotton but also on other crops in the vicinity.

    W.J.: You know, 10 years ago, we did not have this transgenic technology in cotton plants. Back then, the cotton bollworms would not die even when immersed in pesticides. Since we began transgenic engineering of cotton, the plants not only increased their ability to resist bollworms but also increased yield. Therefore, I strongly advocate making great efforts to pursue transgenic engineering. The recent food shortages around the world have further strengthened my belief [in developing such technologies].

    Meeting of the minds.

    Bruce Alberts and Wen Jiabao share a light moment during their 2-hour discussion of China's scientific challenges.


    B.A.: As you know, in Europe there's been a big reaction against transgenic crops, and this has affected the use of this important technology all across Africa as well.

    W.J.: Don't mix transgenic science with trade barriers. That would block the development of science.

    B.A.: May I turn to the issue of your attempts to create a more innovative system, which, of course, means you must attract innovative, talented people to China and train your own people to be innovative as well as smart. How is that going?

    W.J.: This has two aspects. One is that we need to cultivate large numbers of our own innovative talents. This needs to start with children, to develop independent thinking from a young age. After they enter secondary schools and universities, there needs to be a free environment to enable them to develop creative thinking and critical thinking. I often say that to raise a question or to discover a problem is more important than solving a problem. These are exactly the kinds of talent we need.

    Secondly, we also need to closely integrate science and technology with economic and social development, because science and technology finds its wellspring in economic and social development. That's why we strongly push for integration of production, academic study, and research.

    Thirdly, our scientists need to cultivate scientific ethics; most importantly, they need to uphold the truth, seek truth from facts, be bold in innovation and tolerant to failure. Only science and the spirit of seeking truth from facts can save China. I firmly believe in this.

    We hold fast to the policy of opening up to the outside world. To bring in the best brainpower and scientific and technological talents through opening up is most important.

    From this perspective, scientists can leap over barriers of ideology and national boundaries to serve all of humanity. I can assure you that we will certainly create a good environment for scientists from the outside to work in China. But I don't believe this is the main thing. They should feel that they have the right conditions to develop their careers in China, that they are respected by China, that the results of their work are respected by China. This will require us to protect their independent creative spirits and intellectual-property rights.

    B.A.: In the United States, we often talk about the fact that the real innovation, if we look backwards, comes from fundamental science, basic science, that was done 20 to 25 years earlier. When I visited the Ministry of Science and Technology, I was told that China's investment in what we call basic research has been fixed at 5% of total research investment. Do you think that is the right number?

    W.J.: Personally, I attach great importance to research in fundamental sciences because I believe that no applied or developmental research can do without basic research as the wellspring and driving force. But, in this world of ours, often because of material gains and immediate interests, it is easy to neglect basic research. This should be avoided. In recent years, we have continuously increased the level of support, but I think the [investment] ratio is still insufficient.

    B.A.: One of the things that I think is very impressive about China is the extent to which Chinese-Americans feel a great sense of belonging also to China. There's a very effective organization of Chinese scientists in the United States dedicated to helping China develop its own science. This is unusual—other countries do not have this kind of loyalty of their scientists to their homeland.

    W.J.: Our policy is to let them come and go freely. They can serve the motherland in different ways. We impose no restrictions on them and adopt a welcoming attitude.

    B.A.: As you probably know, the National Institutes of Health has put a very strong emphasis lately on supporting innovative young scientists. I met with many wonderful young scientists in China already, both students and young faculty, and having those kinds of opportunities would be very encouraging for them.

    W.J.: We should pay more attention to young scientists. I should say that we haven't done enough in this respect. In the future, we will definitely increase support for young scientists.

    B.A.: Your response to the milk crisis was very impressive, and it still needs, of course, a lot of attention. That terrible crisis awakens the need for more efforts in food safety, more broadly. Do you have new plans for food-safety protection in China?

    W.J.: We feel great sorrow about the milk incident. We feel that although problems occurred at the company, the government also has a responsibility. The important steps in making milk products—production of raw milk, collection, transportation, processing and making formula—all need to have clear standards and testing requirements and corresponding responsibilities, up to legal responsibilities. I once again solemnly emphasize that it is absolutely impermissible to sacrifice people's lives and health in exchange for temporary economic development. Food, all food, must meet international standards. Exported food must also meet the standards of importing countries. We have decided that the Ministry of Health will have main oversight responsibility over food safety.

    B.A.: There's another very important area in which scientists and engineers must collaborate all over the world, and that, of course, is in developing better ways to use and obtain energy. We have a world crisis with greenhouse gases and shortages of resources. What we do in China and the United States will be central with regard to how we treat this planet we're on and make sure that we don't destroy it. What are China's plans now for energy usage and development?

    W.J.: China is a main energy consumer and, therefore, is also a big greenhouse gas emitter. We must use energy resources rationally and must conserve. This requires us to adjust our economic structure and transform the mode of development, to make economic development more dependent on the progress of science and technology and the quality of the work force.

    We need to take strong measures, including economic, legal, and, when necessary, administrative measures, to restrict high-energy-consuming and heavily polluting enterprises and encourage the development of energy-conserving and environmentally friendly enterprises.

    Spilt milk.

    Wen, expressing sorrow, promises new food regulations after melamine-tainted milk poisoned thousands of babies.


    Now every year, China produces about 180 million tons of crude oil and imports about 170 million tons. China's coal production exceeds 2.5 billion tons a year. This kind of huge consumption of energy, especially nonrenewable fossil fuel, will not be sustainable.

    We have established a goal that our GDP [gross domestic product] growth every year must be accompanied by a 4% decrease in energy consumption and a 2% reduction in COD [chemical oxygen demand] and sulfur dioxide emissions every year. We will also adopt various measures to reduce the use of oil and coal in order to reduce the emission of greenhouse gases, including energy-conserving technologies and carbon-capture technologies.

    We have only been industrializing for several decades, while developed countries have been on this road for over 200 years. But we will now begin to shoulder our due responsibilities, namely, the common but differentiated responsibilities set forth in the United Nations Framework Convention on Climate Change and the Kyoto Protocol.

    B.A.: The U.S. and China have a special role to play by working together. I wonder if we could imagine a really large-scale joint effort on issues like carbon capture. I know we're working on that in the United States, you're working on it in China, but working closely together on some of these things might make progress more rapid. It would also be a great symbol, for the world, that we are seriously, both of us, taking this issue to heart and are really going to do something about it.

    W.J.: China and the United States have just signed an agreement on a 10-year collaboration in energy conservation and adapting to climate change. This is a new highlight in our bilateral cooperation.

    I agree to strengthen our cooperation. We can send a message to the world: We will make joint efforts to protect our common habitat.

    B.A.: Another area where I think we can be effective is using science for diplomacy. Scientists from all nations can work together effectively, even when their governments don't agree. I wonder if there is an enhanced role you might seek for cooperation of scientists, including China and the United States, with North Korean scientists who seem to be so isolated, and whether building new bridges to North Korea that way, through our scientific communities, might help the cause of world peace.

    W.J.: I believe that's entirely possible. Scientists from all over the world share the same desires and characteristics in their pursuit of scientific research, respect for science, and seeking truth through facts. Strengthening their collaboration and association will certainly make it easier to build consensus and mutual trust.

    Secondly, the work scientists do has become increasingly relevant to economic and social development and everyday life: for example, the Internet. Therefore, exchanges and collaborations between scientists can help promote exchange and cooperation in economic and social realms between countries. More scientific language and less diplomatic rhetoric may make this world even better.


    Paradoxical Effects of Tightly Controlled Blood Sugar

    1. Gary Taubes

    Researchers are puzzling over recent trials that had great success in lowering blood sugar in type 2 diabetics but no success in reducing deaths from cardiovascular disease.



    Last January, the steering committee overseeing a clinical trial on diabetes and heart disease received some disturbing news. One of the interventions being tested—intensive control of blood sugar in patients with type 2 (formerly called adult onset) diabetes—was anything but benign: The death rate among subjects undergoing the intensive therapy was higher than among those following the standard blood sugar-lowering strategy to which it was being compared. On 7 February, this part of the so-called ACCORD study (for Action to Control Cardiovascular Risk in Diabetes) came to an end, 17 months prematurely.

    By the time the ACCORD collaboration released detailed results in early June, it was one of three major trials—comprising in total nearly 23,000 type 2 diabetics—reporting that lowering blood sugar in diabetics to levels considered normal provides no benefit in preventing heart attacks and strokes. If the ACCORD trial was any indication, more intensive therapy might even do more harm than good.

    The three trials have intensified debate over what David Nathan, a diabetes specialist at Harvard Medical School (HMS) in Boston, calls “one of the most contentious issues in medicine”—the relationship between high blood sugar and the numerous long-term complications that beset diabetics. The results challenge a long-standing assumption that high blood sugar is the central villain, but they have left the field divided over where to cast the blame.

    All three trials—ACCORD, a 20-country study called ADVANCE, and the Veterans Affairs Diabetes Trial (VADT)—were designed to see whether older patients with type 2 diabetes could reduce the risk of heart attacks and strokes—and thereby prolong their lives—by maintaining their blood sugar at near-healthy levels. The clinical results were clear: The trials failed to demonstrate the hoped-for benefit. Why they failed, though, and why ACCORD saw significantly more deaths in the intensive-therapy arm of the study are matters of dispute. “You can reach a lot of conclusions” based on these studies, says Derek LeRoith, an endocrinologist at the Mount Sinai School of Medicine in New York City. “The question is whether anybody concludes anything correctly.”

    Trouble in the blood

    Type 1 diabetics, who lose their insulin-making cells and must take insulin to stay alive, are beset by a host of complications—from eye and nerve damage to extensive, accelerated atherosclerosis and a twofold to fivefold increased risk of dying from a heart attack. Since the 1940s, diabetes specialists have assumed that high blood sugar, or hyperglycemia, was at least partially responsible for these problems. It wasn't until the 1980s, when type 2 diabetes became a relatively common disorder and type 2 patients were living longer, says Nathan, that it became clear that these diabetics suffered many of the same complications.

    Diabetologists divide these complications into two categories: macrovascular, including stroke and cardiovascular disease, and microvascular, including kidney disease, blindness, and nerve damage. In 1993, a North American trial, the Diabetes Control and Complications Trial (DCCT), reported that microvascular complications in type 1 diabetics could be delayed and suppressed by keeping blood sugar levels low and under very tight control. In 2005, an observational follow-up to DCCT reported that heart attacks and strokes could also be partially prevented.

    The question since then has been whether the same cause and effect could be demonstrated in type 2 diabetics, who develop a resistance to the insulin they secrete. “There has always been a fair amount of evidence that chronic hyperglycemia in type 2 diabetics is associated with increased rates of cardiovascular events,” says William Duckworth, co-chair of the VADT study and a diabetes specialist at the Carl T. Hayden VA Medical Center in Phoenix, Arizona.

    A range of biological mechanisms point to hyperglycemia as a possible cause. “There are several ways that glucose can alter vascular structure and function,” says Duckworth. The accumulation of glucose molecules permanently bound to structural and functional molecules may contribute to stiffening of the arteries and the oxidation of low-density lipoprotein particles, an early and necessary step in the formation of atherosclerotic plaques. Hyperglycemia can also theoretically alter vascular function by altering oxygen delivery and nitric oxide in the cell, Duckworth adds. “There's also some suspicion it might cause turbulence in the vascular system, which can lead to increased clotting.”

    For decades, the notion that high blood sugar is a causal agent in cardiovascular disease remained a hypothesis; diabetologists wanted proof that lowering blood sugar in type 2 diabetics would prevent disease. “The only way to prove that halfway definitively is to take a group of people and lower their glucose and see if that decreases these macrovascular events,” says Duckworth. “These three studies are unfortunately unanimous in saying, 'No, it doesn't.'”

    What failed?

    Interpreting the negative results is complicated, however, partly because the methods of controlling blood sugar varied from study to study and even patient to patient. The best measure of blood sugar control is a variable called hemoglobin A1c, the percentage of hemoglobin molecules in the blood that are bound to a glucose molecule, or glycated. Measuring blood sugar directly provides an indication of current blood sugar levels; hemoglobin A1c reflects the levels over the course of a few months. In healthy individuals, less than 6% of hemoglobin is glycated. In untreated diabetics, the levels are typically above 9.5% and can be far higher. Levels of 7.5% to 8.5% are reachable with standard insulin and insulin-sensitizing therapy. The ACCORD, ADVANCE, and VADT trials all compared these standard blood-sugar-lowering strategies with intensive therapies that lowered hemoglobin A1c below 6.5%.
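    As a rough guide to what these percentages mean in terms of day-to-day blood sugar, hemoglobin A1c is often translated into an estimated average glucose using the linear regression published by the ADAG study group (eAG in mg/dL ≈ 28.7 × A1c − 46.7). The snippet below is a minimal illustration applying that published formula to the thresholds discussed above; it is not code from any of the trials:

    ```python
    def estimated_average_glucose(a1c_percent):
        """Estimated average glucose in mg/dL from hemoglobin A1c (%),
        using the ADAG regression eAG = 28.7 * A1c - 46.7."""
        return 28.7 * a1c_percent - 46.7

    # A1c levels discussed in the text: healthy (<6%), intensive-therapy
    # targets (6%-6.5%), standard therapy (7.5%-8.5%), untreated (>9.5%)
    for a1c in (6.0, 6.5, 7.5, 9.5):
        print(f"A1c {a1c}% ~ {estimated_average_glucose(a1c):.0f} mg/dL")
    ```

    An untreated A1c above 9.5% thus corresponds to average glucose well above 220 mg/dL, while the intensive-therapy targets sit near the 126 mg/dL mark.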

    Safe range.

    Diabetics monitor their blood to avoid high levels of blood sugar, but very low levels—below 70 mg/dL—can be dangerous as well. Clinical studies use a different, long-term measure of blood sugar, the percentage of glycated hemoglobin, or A1c, in circulation.


    The obvious explanation for why the three studies came up negative is that the hypothesis that high blood sugar causes macrovascular complications in type 2 diabetes is simply wrong. But these studies cannot absolutely rule it out; there are too many variables. Type 2 diabetes is associated with a spectrum of metabolic disorders, all of which are risk factors for cardiovascular disease: insulin resistance, obesity, hypertension, and cholesterol and lipid abnormalities known as diabetic dyslipidemia.

    In all three trials, the investigators aggressively treated hypertension and diabetic dyslipidemia in all the subjects. As a result, mortality rates were so low that the trials may have had insufficient statistical power to demonstrate a beneficial effect from lowering blood sugar alone.

    The ACCORD trial was planned with the expectation that 10% of the subjects would have a heart attack or stroke over 3.5 years. Instead, only 3% to 4% did. The same trend held true for ADVANCE. Its results were consistent with no benefit at all, says Anushka Patel, a co-chair of the ADVANCE trial and cardiologist at the George Institute for International Health in Sydney, Australia, but because of their statistical weakness, they were also “potentially consistent with a benefit on macrovascular disease.” Even the ACCORD trial, despite being halted prematurely, showed a mild but insignificant benefit on nonfatal heart attacks and strokes and all deaths from cardiovascular disease.
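    The power shortfall described above can be sketched with a standard normal-approximation power calculation for comparing two proportions. The per-arm sample size and the assumed 15% relative risk reduction below are illustrative assumptions, not figures from the trials:

    ```python
    import math

    def power_two_proportions(p1, p2, n_per_arm, z_alpha=1.96):
        """Approximate power of a two-sided two-proportion z-test
        (normal approximation, alpha = 0.05)."""
        p_bar = (p1 + p2) / 2
        se_null = math.sqrt(2 * p_bar * (1 - p_bar) / n_per_arm)
        se_alt = math.sqrt((p1 * (1 - p1) + p2 * (1 - p2)) / n_per_arm)
        z = (abs(p1 - p2) - z_alpha * se_null) / se_alt
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF

    # Hypothetical 5,000 subjects per arm, 15% relative risk reduction:
    print(power_two_proportions(0.10, 0.085, 5000))     # planned 10% event rate
    print(power_two_proportions(0.035, 0.02975, 5000))  # observed ~3.5% rate
    ```

    Under these assumptions, a trial planned around a 10% event rate has roughly 74% power to detect the reduction, but at the event rate actually observed the power drops to roughly a third, illustrating how a real benefit could easily go undetected.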

    One possibility is that, like the DCCT study of type 1 diabetics, these trials simply didn't run long enough to see a benefit. Just last week, British researchers published the latest results from the United Kingdom Prospective Diabetes Study (UKPDS) suggesting that this might be the case. UKPDS began in the late 1970s, comparing dietary therapy in type 2 diabetics with drug therapy that lowered hemoglobin A1c levels from 7.9% to 7%. The initial 5-year follow-up, published in 1998, reported no benefits on macrovascular complications. Now, after an average of 17 years of follow-up, slight reductions in heart attacks and mortality were seen. What caused this “legacy effect,” however, was still open to speculation because glycemic control in the two groups had been identical for all but the first year of the study.

    Lower may not be better

    The most intriguing question is why, even with a slightly decreased risk of nonfatal heart attacks and strokes, the intensive-therapy subjects in the ACCORD trial had an increased risk of death. Here, again, hypotheses are plentiful, but not so the evidence to nail them down.

    The goal of the ACCORD trial was to lower hemoglobin A1c levels in the intensive-therapy group below 6%, rather than the 6.5% targeted in the ADVANCE and VADT trials. To do so, as LeRoith says, “they threw the kitchen sink” at these subjects. They were given more drugs than those in the intensive-therapy arms of ADVANCE or the VADT trial, at higher doses, in more combinations, and with more insulin itself.

    As a result, one possible explanation for the excess deaths in ACCORD is that intensive therapy resulted in blood sugar so low—a condition known as hypoglycemia, defined as blood sugar below 70 milligrams/deciliter (mg/dL)—that it in turn caused deaths that would not necessarily have been identified as heart attacks or strokes. When blood sugar dips into the hypoglycemic range, it activates sympathetic nerves, triggering hormonal responses that include secretion of epinephrine, glucagon, growth hormone, and cortisol, all aimed at limiting glucose uptake and exit from the bloodstream. It's possible, says Graham McMahon, a diabetes specialist at HMS and Brigham and Women's Hospital in Boston, that in the older type 2 diabetics recruited by ACCORD—with longer-standing disease and poorer underlying vascular function—these compensatory responses, combined with nerve damage in the autonomic nervous system, triggered fatal cardiac arrhythmias.

    “We know when we drive people's blood sugar down this far,” says William Friedewald, a biostatistician at the Columbia University Mailman School of Public Health and chair of the ACCORD Steering Committee, “they have a much higher rate of hypoglycemia and that, in fact, is what we observed in ACCORD.” But when the ACCORD researchers searched for an association in their data between hypoglycemic events in the subjects and mortality, they came up empty.

    But hypoglycemia still can't be ruled out. The more intensive the blood-sugar control in diabetics, says Stephen Davis, a diabetes researcher at Vanderbilt University in Nashville, Tennessee, the more frequently a patient suffers a hypoglycemic event, even a mild one, and the more likely they will be unaware of future hypoglycemic events. “Even quite minor levels of hypoglycemia,” he says, “can down-regulate defenses against subsequent hypoglycemia. You get a vicious cycle whereby hypoglycemia begets more hypoglycemia.” In these cases, blood sugar can drop to 30 or 40 mg/dL, without evoking compensatory responses or noticeable symptoms. In this hypothesis, the researchers running the trial would be unaware of many hypoglycemic events and therefore unable to link them directly to the death. “The data we have in ACCORD shows no tie-in to hypoglycemia,” says Friedewald, “but it may still be a factor.”

    Is it the insulin?

    Another possible explanation for the excess deaths in ACCORD is that some drug or combination of drugs turned out to have fatal side effects. One prime suspect is a drug called rosiglitazone, an insulin-sensitizing drug marketed by GlaxoSmithKline as Avandia and used widely to treat type 2 diabetics. Last year, both The New England Journal of Medicine and the Journal of the American Medical Association published meta-analyses reporting that rosiglitazone use was associated with an increased risk of heart attacks and, perhaps, an increased risk of death. Neither ACCORD nor the VADT trial, however, could find such an association. The researchers in the ADVANCE trial, says Patel, are not planning to do that analysis because of the many ways it can be confounded by other factors.


    The most controversial hypothesis to explain the excess deaths in the intensive-therapy arm of ACCORD is that insulin therapy itself was responsible. Since the 1980s, when type 2 diabetes was widely accepted as a disease of insulin resistance rather than insulin deficit, diabetes specialists have debated whether macrovascular complications are caused primarily by high blood sugar or by chronically elevated insulin levels. This condition, known as hyperinsulinemia, arises when the beta cells of the pancreas compensate for insulin resistance elsewhere in the body by secreting ever-greater amounts of insulin. Many of the other metabolic disorders that accompany type 2 diabetes—obesity, hypertension, and diabetic dyslipidemia—may in turn be caused by hyperinsulinemia. “The idea has been around for a long time that insulin itself is a problem,” says Friedewald, “and certainly people in the intensive group in ACCORD had higher exogenous insulin levels than people in the standard group.”

    Ralph DeFronzo, chief of the Diabetes Division at the University of Texas Health Science Center in San Antonio, is a leading proponent of the idea that heart disease in type 2 diabetics is not caused by high blood sugar but by insulin resistance, hyperinsulinemia, and the metabolic disorders that follow. “When you're treating and trying to control type 2 diabetes,” says DeFronzo, “you have to understand that the basic defect is insulin resistance. You have to give enormous amounts of insulin to overcome that resistance, and when you do that, you activate growth-promoting inflammatory pathways.” These in turn have direct effects on blood vessels and so can potentially cause or exacerbate cardiovascular disease, leading to heart attacks and death.

    McMahon says that diabetologists don't like to think about this possibility because insulin is often lifesaving for type 2 diabetics. As these diabetics age, their beta cells eventually fail to maintain the excessive insulin secretion necessary to lower blood sugar in the face of insulin resistance. In those cases, says McMahon, insulin therapy “is the only rational treatment and the only treatment that works.”

    At the moment, the insulin hypothesis remains speculative. The ACCORD data revealed no link between mortality and insulin use, says Friedewald, but this could be a product of the various biases and confounding factors inherent in these kinds of studies. “The fundamental problem in this area of investigation,” says Harvard's Nathan, “is that the studies, almost by design, build in confounding elements. Whenever you're interpreting clinical trials, there's always other consequences of the intervention that affect the outcome. … The ideal study would have patients treated with exactly the same profile of medications, with the exception of one drug that you would ramp up in one group but not the controls. For example, if you put one group on a lower dose of insulin and one on a higher dose, that would be closer to an ideal trial.”

    Without such studies, the only thing that can be said with any confidence about the ACCORD, ADVANCE, and VADT results, as Friedewald says, is that they are more evidence against the hypothesis that high blood sugar is a fundamental cause of heart disease and stroke in type 2 diabetics.


    Trail of Mare's Milk Leads to First Tamed Horses

    1. John Travis

    Traces of ancient mares' milk may indicate that horse domestication began 1500 years earlier than previously believed, according to research presented at the Third International Symposium on Biomolecular Archaeology.


    Got milk?

    People in central Asia still milk mares (left), and residues on ancient potsherds (inset) show that this practice goes way back.


    Herds of horses still race across the steppes of Northern Kazakhstan, and the people in that harsh environment have long depended on the animals, riding them, eating their meat, and exploiting their skins for clothes. Indeed, the oldest accepted evidence for horse domestication—equine bones and chariots found together and dated to 2000 B.C.E.—comes from the region. Now, traces of ancient mares' milk may extend Northern Kazakhstan's equine roots another 1500 years.

    Locals today still consume a fermented drink called koumiss made from mare's milk. Koumiss tastes “horrible” to her Western palate, confesses chemistry Ph.D. student Natalie Stear of the University of Bristol in the U.K., but an ancient version may have yielded some appetizing data: Stear reported at the meeting that she found the isotopic signature of mare's milk on 5500-year-old pottery fragments from Kazakhstan. “It is the smoking gun for horse domestication, since no one would attempt to milk a wild mare,” says anthropologist Sandra Olsen of the Carnegie Museum of Natural History in Pittsburgh, Pennsylvania.

    Many researchers believe people began domesticating horses about 4000 B.C.E., but finding clear evidence of that has been difficult. The shards Stear analyzed were made by the Botai, who dwelled in Central Asia between 3700 and 3100 B.C.E. In the 1990s, Olsen's team excavated one of their villages, revealing tons of animal bones, 90% of them equine. Olsen and others digging at Botai sites have uncovered other suggestive evidence of horse domestication, including signs of corrals and bit wear marks on horse teeth. But the bit wear conclusions have been hotly disputed, and some argued that the Botai primarily hunted wild horses for meat.

    Stear's attempt to settle the issue evolved out of her work in Richard Evershed's group at Bristol, which has pioneered the technique of identifying milk residues on ancient pottery by carbon-isotope analysis. The varying amounts of such isotopes within lipids that permeate the vessels can sometimes reveal what species the fat came from and whether it was from meat or milk. In a paper that appeared online in Nature in August, for example, Evershed's team pushed the earliest dairy use back by 2000 years, to about 6050 B.C.E., after detecting milk fats in ancient Turkish pottery shards.

    Stear used carbon isotopes to confirm the presence of equine fats on about 50 Botai shards, but the method couldn't distinguish between lipids from milk or meat. So she tested local horse meat and koumiss and confirmed a hypothesis posed by Evershed and Alan Outram of the University of Exeter, U.K.: that horse meat and milk contain different amounts of the hydrogen isotope deuterium. For reasons related to the isotope's heavier weight, summer rains in the region contain much more deuterium than winter precipitation. Because mares are only milked after they foal in the spring, researchers theorized that the isotope would be concentrated in milk, whereas horse meat's deuterium signal would be averaged over the course of each year.

    Testing the ancient potsherds, Stear found that five had the horse-milk deuterium signature. “The way she did it was quite elegant,” says Oliver Craig, a biomolecular archaeologist at the University of York, U.K. Stear also notes that colleagues have found new signs of bit use on the Botai horse teeth, giving her greater confidence that the animals were domesticated.

    Still, some are reserving judgment. Marsha Levine of the McDonald Institute for Archaeological Research in Cambridge, U.K., has argued that the Botai primarily hunted wild horses, but she accepts that the deuterium evidence suggests that at least some horses there were domesticated. Levine cautions, however, that the new technique must be independently vetted before the Botai horses are definitively tamed.


    Old Bones Reveal New Signs of Scurvy

    1. John Travis

    At the Third International Symposium on Biomolecular Archaeology, researchers described a new way to detect more subtle signs of scurvy in ancient bones.


    By some estimates, scurvy killed or debilitated millions of early sailors before it was recognized that vitamin C can ward off the condition, which degrades connective tissue and results in spotty skin, gum disease, and bleeding throughout the body. Many researchers suspect that other ancient peoples were also vulnerable to scurvy, such as those facing famine or early farmers who ate little wild fruit as they transitioned to a cereal-based diet. But such ideas are hard to prove because only the most severe cases leave telltale lesions on the slow-growing bones of adults.

    At the meeting, however, postdoctoral researcher Hannah Koon of the University of York, U.K., described a new way to detect signs of scurvy in human skeletons. “Making a diagnosis with what you see on a skeleton is really problematic,” says Megan Brickley of the University of Birmingham, U.K., a forensic anthropologist who specializes in scurvy and skeletal diseases. “This technique could potentially have a huge impact.”

    Koon focuses on collagen, the most abundant protein in the human body and the molecule at the heart of scurvy. Collagen molecules consist of three long chains of about 1000 amino acids, which twist and bind together and then pack into fibers that are the building blocks of bones and connective tissue. Collagen chains contain numerous proline amino acids, and for the fibers to become stable, many prolines must have hydroxyl groups attached to them through the work of an enzyme. For this hydroxylation, explains Koon, “one of the cofactors needed is vitamin C.”

    The prevailing wisdom had been that certain prolines in a collagen molecule were always hydroxylated and others never were. But in studying different collagen molecules from the same samples of modern cow and human bones, Koon found that certain prolines are only hydroxylated sometimes. She hypothesized that scurvy would alter how frequently these variable prolines are hydroxylated. Tests on guinea pigs raised on a low-vitamin C diet supported the idea.

    Now Koon has begun to extend the work to humans, testing skeletons from a group of 17th to 18th century Dutch whalers excavated from the permafrost in the 1980s by George Maat of Leiden University in the Netherlands. Although they lack major bone lesions, most such sailors described scurvy symptoms in diaries or had scurvy-induced hemorrhages still evident in their well-preserved bodies. If the collagen assay “can't see scurvy in [the bones of] these guys, you won't see it anywhere,” says Matthew Collins, Koon's lab chief.

    Koon reported in York that collagen from the first two whalers does indeed reveal fewer hydroxylated prolines. “I'm cautiously optimistic” that the technique will work, she says. After more work on whalers, Koon plans to test bones of British sailors from before and after the turn of the 18th century, when the Royal and Merchant navies began to require sailors to drink lime juice daily to thwart scurvy—the practice that earned British sailors the nickname “Limeys.”


    Hope for the Rhone's Missing Sturgeon

    1. John Travis

    Ancient DNA research may help give one species, the European sea sturgeon (Acipenser sturio), a new future, according to research presented at the Third International Symposium on Biomolecular Archaeology.


    Ancient DNA research typically provides a glimpse into a species' past, but work presented in York showed that it may also help give one species, the European sea sturgeon (Acipenser sturio), a new future.

    Until the 19th century, this large fish thrived along the coast of Europe and spawned in most of the major rivers from the Baltic Sea to the Black Sea. Once a major source of food and caviar, it was fished almost to extinction. Today, only a few thousand remain, all breeding at a single site, the Gironde estuary, where the French rivers Dordogne and Garonne meet.

    Good old days.

    In 1850, the European sea sturgeon roamed widely (blue) and spawned in many rivers. Today, few survive, but captive-bred larvae (inset) may be reintroduced.


    Captive-breeding efforts using sturgeon from this site were begun in the 1990s, and last year 11,000 larvae were born from captive-bred fish for the first time. France's Cemagref (Research Institute for Agricultural and Environmental Engineering) released those hatchlings into the Gironde last summer.

    Now Cemagref and its partners want to reintroduce their hatchlings to other rivers in which the European sturgeon once swam. The Rhone is an obvious target—or is it? There's little doubt that some kind of sturgeon once densely populated the river, but no one has been sure whether the native Rhone fish was A. sturio or the Adriatic sturgeon, A. naccarii. The two fish are quite different ecologically—the European sturgeon spends much of its life at sea before briefly visiting rivers to spawn, and the Adriatic sturgeon is more of a freshwater homebody that struggles in sea migrations. Which one should be reintroduced to the Rhone?

    In York, Olivier Chassaing of the École Normale Supérieure in Lyon described how he and colleagues used ancient DNA to find out. Researchers had previously analyzed DNA from museum specimens labeled as Rhone sturgeons, but the results were inconclusive. So Chassaing, Patrick Berrebi of the University of Montpellier, and colleagues in the Lyon lab of Catherine Hänni turned to an unusual collection of thousands of sturgeon bones found several decades ago on the Rhone river bank near Arles. Dated to between the 6th and 2nd century B.C.E. and apparently left by people fishing, the bones look like those of modern A. sturio. The DNA extracted from them clinched the case, Chassaing reported: All 14 samples tested were from the European sea sturgeon. There was no evidence of its Adriatic relative, or hybrids between the two. “We're convinced there was A. sturio in the Rhone,” says Chassaing. “Our recommendation is to reintroduce A. sturio but not A. naccarii.”

    Although the ancient DNA work can't rule out that A. naccarii ever roamed the Rhone, the evidence is good enough for Cemagref fish biologist Mario Lepage, who has worked on the sturgeon recovery plan. Still, he cautions, it may be some time before the European sea sturgeon returns to its former home. He says researchers need to assess whether the modern Rhone is now suitable for young sturgeon and whether their reintroduction would harm species that took the sturgeon's ecological role decades ago.