News this Week

Science  07 Nov 2008:
Vol. 322, Issue 5903, pp. 834



    Zerhouni's Parting Message: Make Room for Young Scientists

    Jocelyn Kaiser

    An intractable problem faced Elias Zerhouni when he became director of the National Institutes of Health (NIH) 6 years ago: The agency's corps of more than 20,000 independent investigators was getting old. The average age at which researchers receive their first NIH research grant had been creeping up for decades. (It is now 42.) Zerhouni saw this as a crisis and tackled it head on. After probing the data, he launched an experiment. Instead of relying solely on peer review to apportion grants, he set a floor—a numerical quota—for the number of awards made to new investigators in 2007 and 2008.

    Last week on his final day as director, Zerhouni made this a formal NIH policy. He hopes his successors will keep it: “I think anybody who thinks this is not the number-one issue in American science probably doesn't understand the long-term issues,” he says. The notice states that NIH “intends to support new investigators at success rates comparable to those for established investigators submitting new applications.” In 2009, that will mean at least 1650 awards to new investigators for R01s, NIH's most common research grant.

    The quotas have meant pain for some institutes in a time when NIH's budget isn't growing. Many are trying to steer money to new grantees by setting funding cutoff points in peer-review scores at more generous levels for new investigators than for established ones. Although some scientists may see this as a kind of affirmative action, Zerhouni says it is not. To him, it is simply “leveling the playing field” by correcting peer reviewers' bias against the young.

    In 1980, the average age of a first-time NIH grant recipient was 37. The 5-year rise in average age since then, observers say, can be blamed on longer time spent in training, including in postdocs, and the older age at which faculty are first hired at medical schools, where they begin independent careers. In 2003, when NIH's budget stopped growing, the situation “collapsed,” Zerhouni says: The number of R01-like research grants (known as R01 equivalents) going to first-time investigators slipped to 1354 in 2006, the lowest level in 9 years.


    Departing NIH Director Elias Zerhouni says it is urgent to bring new blood into U.S. biomedical research.


    This is “detrimental for all sorts of reasons,” says Jeremy Berg, director of the National Institute of General Medical Sciences. One concern is that scientists are not getting enough support when they're young, during their most creative years. Another is that the well may run dry. When Zerhouni asked his staff to model the age distribution of NIH-funded scientists over time, the results were startling. If trends continue, by 2020 there will be more investigators over 68 than under 38 (see p. 848). “If we don't fund the pipeline now, we will pay for it 20 years from now,” Zerhouni says.

    A leg up.

    After NIH set a numerical target for grants to first-time investigators in 2007, the number of awardees grew. Their success rates matched those of established investigators seeking new grants.


    Zerhouni created special awards for young scientists but concluded that wasn't enough. In 2007, he set a target of funding 1500 new-investigator R01s, based on the previous 5 years' average. Some institutes struggled to reach their targets, NIH officials say. At the National Institute of Neurological Disorders and Stroke, for example, the shift to new grants meant that only 9% to 10% of established investigators with strong peer-review scores received funding, whereas 25% of comparable new investigators did, says NINDS Director Story Landis. She maintains, however, that “it's not as though a huge number of investigators lost out.”

    Some program directors grumbled at first, NIH officials say, but came on board when NIH noticed a change in behavior by peer reviewers. Told about the quotas, study sections began “punishing the young investigators with bad scores,” says Zerhouni. That is, the previously slight gap in review scores between new grant applications from first-time and from seasoned investigators widened in 2007 and 2008, Berg says. The widening gap revealed a bias against new investigators, Zerhouni says.

    The 2007 target had an immediate effect: For the first time since 1995, new investigators and established ones submitting new grant applications had nearly the same success rate, about 19%. (Investigators renewing existing grants still do much better, however.) From now on, NIH will set award targets designed to equalize new grant success rates for the two groups.

    NIH will also fine-tune its policy to tilt it in favor of early-career scientists. The goal is to adjust for the recently discovered fact that only about 55% of investigators who receive their first NIH grants are at an early stage of their career. The rest are scientists who had been funded by other agencies or came from NIH's intramural program or from Europe after being forced to retire there. “It was embarrassing” to realize, for example, that the new investigators included two department chairs with Veterans Administration funding, Landis says. The targets will favor “early stage investigators,” defined as researchers within 10 years of finishing their Ph.D. or residency.

    Those outside NIH are generally supportive of the new-investigator targets, which were also endorsed earlier this year by an advisory committee reviewing NIH's peer-review policies (Science, 29 February, p. 1169). But at the same time, some scientists may be uneasy about the cost, says Howard Garrison, spokesperson for the Federation of American Societies for Experimental Biology in Bethesda, Maryland: “Every time you give a leg up to a young investigator, you're pushing someone off the edge of the cliff.” Some observers say the real test will come when early stage investigators try to renew their grants: They may have trouble, and gains in creating a more youthful corps of investigators could be lost (Science, 26 September, p. 1776). NIH officials say they've looked at the data, and so far first-time investigators seem to do just as well as established investigators when renewing a grant.


    Rules for Ocean Fertilization Could Repel Companies

    Eli Kintisch

    An international body has for the first time placed restrictions on experiments designed to fertilize large swaths of the world's oceans with a view to combating global warming. Meeting last week in London, delegates from 85 nations noted that such experiments “may offer a potential strategy for removing carbon dioxide from the atmosphere” by producing algal blooms that would absorb CO2 and sink to the ocean floor. But they limited the experiments to “legitimate scientific research,” a phrase not yet defined that could complicate plans to commercialize the approach.

    Created in 1972 under the auspices of the United Nations' International Maritime Organization, the London Convention Treaty is supposed to regulate pollution in international waters. Members of the convention and the related London Protocol had been silent on ocean fertilization until several companies announced plans last year to carry out large-scale tests, setting off concern about environmental effects. The companies hope the technology will allow them to sell carbon credits on domestic and international markets.

    On 31 October, delegates agreed unanimously to set scientific guidelines for proposed fertilization experiments, taking into account their expected carbon flux, impacts on oxygen levels and food webs, and the possibility that they will promote the growth of toxic species. Scientific bodies affiliated with the treaty will meet in May to hash out details.

    “There was a widespread recognition among the delegates that there should not be a ban on legitimate research,” says Henrik Enevoldsen, who observed the 5-day negotiations as a scientific staff member of UNESCO's Intergovernmental Oceanographic Commission. But the reference to “legitimate” studies was intended by some nations to exclude for-profit fertilization efforts, he says. “Most countries are looking to oppose something that's commercial research with an eye toward obtaining carbon credits.”

    Not ironed out.

    New global guidelines are being drawn up to govern experiments like this 2002 release of iron in the Southern Ocean.


    The experiments dump elements such as iron or nitrogen into the open ocean to stimulate the growth of plankton blooms (Science, 30 November 2007, p. 1368). Up to 3 tons of iron at a time have been released in a dozen small-scale fertilization experiments since 1993, and prominent scientists believe the technique, if scaled up, could sequester up to 1 billion tons of carbon dioxide per year as the blooms grow and die. But there are no international rules to regulate the practice, and researchers have identified myriad possible side effects, including local disruption of marine ecosystems or emissions of nitrous oxide, a potent greenhouse gas.

    To quantify both the promise and perils of ocean fertilization, scientists want to launch experiments 10 to 30 times larger than earlier tests. Last week's vote implicitly supports such work, says geochemist Ken Buesseler of Woods Hole Oceanographic Institution in Massachusetts. But Buesseler worries that the upcoming rules, which will form the basis for permits to be issued by individual countries, “could preclude even legitimate science, if the [environmental] assessment needs to include measurement of all impacts on all time [and] space scales.”

    On the other hand, Greenpeace and other environmental activist groups are concerned about possible bias in reporting results among commercial companies looking to fight global warming by exploiting ocean fertilization for profit. But Dan Whaley, CEO of Climos, a San Francisco, California-based ocean fertilization start-up, defends his company's ethics and notes that the text of the resolution doesn't explicitly bar commercial projects. Climos still hopes to abide by the treaties and obtain permits for operations previously scheduled in 2010.


    Chinese Cave Speaks of a Fickle Sun Bringing Down Ancient Dynasties

    Richard A. Kerr

    A 1.2-meter-long chunk of stalagmite from a cave in northern China recorded the waning of Asian monsoon rains that helped bring down the Tang dynasty in 907 C.E., researchers report on page 940. A possible culprit, they conclude: a temporary weakening of the sun, which also seems to have contributed to the collapse of Maya civilization in Mesoamerica and the advance of glaciers in the Alps. “I think it's one of the coolest papers I've seen in a long time,” says paleoclimatologist Gerald Haug of the Swiss Federal Institute of Technology in Zurich. This latest cave record also points to the potentially devastating effects that climate change—even change that's mild when averaged around the globe—can have on vulnerable local populations.

    Although hardly the final word in such controversial fields, the cave record—which other researchers describe as “amazing,” “fabulous,” and “phenomenal”—provides the strongest evidence yet for a link among sun, climate, and culture. The key to obtaining it was “a really, really clean sample,” says paleoclimatologist Lawrence Edwards of the University of Minnesota (UM), Twin Cities. Paleoclimatologists Pingzhong Zhang of Lanzhou University in China, Hai Cheng of UM, Edwards, and colleagues collected a stalagmite (a mound composed mostly of calcium carbonate slowly precipitated from dripping groundwater) from Wanxiang Cave in northern China at the far reach of the rains of the summer Asian monsoon.

    Relatively high amounts of uranium and exceptionally low clay-borne thorium in this stalagmite enabled them to conduct uranium-thorium radiometric dating of the layered deposits to within an average of just 2.5 years. As a result, they could calculate precise dates for subtle variations in the stalagmite's oxygen isotope composition that reflect variations in rainfall near the cave. “They absolutely nailed the rainfall history of [northern] China over the past 1800 years,” says Haug.


    Comparing their rain record with Chinese historical records, Zhang and colleagues found that three of the five multicentury dynasties during that time—the Tang, the Yuan, and the Ming—ended after several decades of abruptly weaker and drier summer monsoons, possibly poor rice harvests, and social turmoil. In turn, decades that included the strongest, wettest monsoon of the past millennium coincided with the Northern Song Dynasty's golden age of rich harvests, exploding population, and social stability. “Our results really match the historical record,” says Edwards. “You can't figure it's all climate, but when you see these nice correlations, you see that climate probably played an important role.”

    The group then looked farther afield. Critical parts of their monsoon rainfall record—in particular the dryness of the late Tang dynasty—match neatly with a previously published climate record from a lake on the southern coast of China, with the advances and retreats of Swiss alpine glaciers, and with records from within and near Central America. Most striking is the correlation between the Asian monsoon and the collapse at the end of the Maya Classic period under severe drought duress around 900 C.E. (Science, 18 May 2001, p. 1293), near the end of the drought-stricken Tang dynasty.

    Good times.

    Monsoon rains were plentiful early in the Northern Song Dynasty of China, according to the isotopic record in a cave stalagmite (top). A cave-wall painting from the same province (above) recorded the bounty.


    Previous research had linked changes in both the Asian monsoon and Mesoamerican climate to variations in the brightness of the sun (Science, 6 May 2005, p. 787). Checking their record, the group found an 11-year cycle in rainfall—the length of the shortest cycle of solar variability. In their record, rain tracked centuries-long trends in solar activity as measured in records of carbon and beryllium isotopes. And a climate model driven in part by solar variations broadly tracked the monsoon trends. “Solar variation is a player, but the sun is not everything,” Edwards concludes. Internal jostlings of the climate system must also play a role, he says.

    Climate modeler David Rind of NASA's Goddard Institute for Space Studies in New York City agrees. In a modeling study in press in the Journal of Geophysical Research, Rind and colleagues found that “the solar influence on the monsoon was more like a 'weighting of the dice'—it influenced the net result, but did not dominate,” he writes in an e-mail.

    De'er Zhang, chief scientist of the National Climate Center in Beijing, stresses that both climate and culture are too complex to be reduced to a simple cause-and-effect relationship. A single spot cannot properly represent an area as vast as that covered by the monsoon, she writes in an e-mail, and numerous political factors influenced the Tang dynasty. “Climate might have played a role,” she writes, but it was “far from playing 'a key role' as stated by [Pingzhong] Zhang et al.” Sorting out what “key” or “important” meant a millennium ago could require a lot more spelunking.


    Number of Sequenced Human Genomes Doubles

    Elizabeth Pennisi

    Less than a decade ago, it took hundreds of millions of dollars and a large international community to sequence a single human genome. This week, three reports in the 6 November issue of Nature describe three more human genomes—the first African, the first Asian, and the first cancer patient to have their entire DNA deciphered. The sequences provide clues about genome variation and disease; they also demonstrate the potential of a relatively new sequencing technique to mass-produce human genomes. “The methods are extremely powerful,” says geneticist James Lupski of Baylor College of Medicine in Houston, Texas. “Reading these papers, I think the personal genomes field is moving even faster than I anticipated.”

    Until now, four human genomes have been published: the reference human genome, derived from sequencing DNA from several anonymous individuals; one by Celera Genomics; and those of genome stars J. Craig Venter and James Watson. Efforts to date to identify differences among individuals have relied not on entire genome sequences but on surveys of single-base changes called SNPs and of structural variations in duplicated pieces of DNA (Science, 21 December 2007, p. 1842).

    Even the broadest SNP surveys look at just a few million SNPs out of the 3 billion bases in the genome, leaving researchers in the dark about how much individual variation there is and how specific differences correlate with disease risks. Hence the push to drive down the cost of sequencing to $1000 per genome (Science, 17 March 2006, p. 1544). The newly published genomes came in with price tags of $250,000 to $500,000 each but would cost half that or less if done today. The three groups all used a technology developed by Solexa, now part of Illumina Inc. in San Diego, California, to speed and slash the cost of sequencing. It generates smaller pieces of sequence faster and cheaper than previous technologies. Such small pieces used to be difficult to stitch together, but this approach can work well now because the reference genome helps guide their assembly.

    New genome on the block.

    The first genome sequence from a Chinese was on display last year at a technology fair in Shenzhen, China.


    To explore the genetic underpinnings of cancer, Richard Wilson and colleagues at the Washington University School of Medicine in St. Louis, Missouri, sequenced genomes from both normal skin tissue and tumor tissue of a middle-aged woman who died of acute myelogenous leukemia (AML). They compared the DNA to determine what was different about the cancer cells. About 97% of the 2.65 million SNPs found in the tumor cells also existed in the normal skin cells, suggesting they were not critical to the cancer process. The researchers also eliminated SNPs that had been previously identified elsewhere as well as those that did not change the coding of a gene, ending up with 10 SNPs unique to the tumor cells. “I don't think we missed anything,” says Wilson.
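    The winnowing described above is essentially a sequence of set differences plus one filter. A minimal sketch in Python, using made-up SNP records and toy filter sets standing in for the study's actual variant data and databases:

```python
# Hypothetical sketch of the tumor-SNP filtering logic described above.
# SNPs are represented as (position, allele) tuples; the real analysis
# used full variant records and curated variant databases.

def candidate_tumor_snps(tumor_snps, normal_snps, known_snps, coding_changing):
    """Keep only SNPs private to the tumor, previously unreported,
    and predicted to alter a gene's coding sequence."""
    private = tumor_snps - normal_snps   # drop the ~97% shared with normal tissue
    novel = private - known_snps         # drop variants already catalogued elsewhere
    return {s for s in novel if s in coding_changing}  # keep coding changes only

# Toy data: four tumor SNPs, two shared with normal tissue, one already known.
tumor = {(101, "A"), (205, "T"), (307, "G"), (410, "C")}
normal = {(101, "A"), (205, "T")}
known = {(307, "G")}
coding = {(410, "C")}

print(candidate_tumor_snps(tumor, normal, known, coding))  # {(410, 'C')}
```

    In the actual study this pipeline reduced 2.65 million tumor SNPs to the 10 candidates unique to the cancer cells.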

    Two occurred in genes previously linked to this leukemia. Eight led the researchers to new candidate AML genes, including several tumor suppressor genes and genes possibly linked to cell immortality. By sequencing the whole cancer genome, “we capture what we don't know as well as what we do know [about cancer genes],” says Illumina's David Bentley. “That can really transform our ability to understand cancer.”

    Bentley and colleagues sequenced the genome of a Yoruba man from Nigeria whose DNA has already been extensively studied, enabling them to check the accuracy of their technology. In the third Nature paper, Jiang Wang of the Beijing Genomics Institute in Shenzhen, China, and colleagues sequenced the genome of a Han Chinese male. The Yoruba analysis uncovered almost 4 million SNPs, including 1 million novel ones. The Chinese genome had about 3 million, including 417,000 novel SNPs. As anticipated, the African genome had greater variation per kilobase than either the Chinese or sequenced Caucasian genomes, indicative of its ancestral status.
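    The “variation per kilobase” comparison above is simple density arithmetic. A sketch using the rounded SNP totals quoted in this article and an assumed genome length of roughly 3 billion bases:

```python
# SNP density per kilobase, using the approximate totals quoted above
# and an assumed ~3-gigabase genome length.

GENOME_KB = 3_000_000_000 / 1_000  # genome length in kilobases

def snps_per_kb(total_snps):
    """Average number of SNPs per kilobase of genome."""
    return total_snps / GENOME_KB

print(f"Yoruba genome:  {snps_per_kb(4_000_000):.2f} SNPs/kb")
print(f"Chinese genome: {snps_per_kb(3_000_000):.2f} SNPs/kb")
```

    The higher density in the Yoruba genome is what the article means by greater variation per kilobase in the African genome.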

    These new genomes were already significantly cheaper than their predecessors; next year, Illumina expects the cost to drop to about $10,000. Other companies are promising even lower prices per genome. Nonetheless, geneticist Aravinda Chakravarti of Johns Hopkins University School of Medicine in Baltimore, Maryland, is cautious about how quickly genome sequencing should enter the clinic: “We still don't know how to interpret [the data],” he notes. Bentley agrees: given the uncertain applicability and utility of sequence data, “and possibly ethical barriers,” he says, claiming the technology is poised to enter the clinic anytime soon is “pushing it.”


    The Touchy Subject of 'Race'

    Constance Holden

    Nothing makes scientists more nervous than the topic of “race,” so much so that they'd like to find a way not to talk about it at all. That was the core issue last week at a meeting* at the National Human Genome Research Institute (NHGRI) in Rockville, Maryland, where about 40 scientists and ethicists debated how to present the torrent of new findings from human gene sequencing studies to the public.

    In different parts of the world, different gene mutations become advantageous and spread quickly through a population, making some variants more prevalent in particular ancestral groups. Some are innocuous enough—such as the emergence of lactose tolerance in farming populations. But there's already much debate over the use in medicine of findings of racial differences in the prevalence of genes associated with certain diseases. Many scientists predict that it won't be long before they have solid leads on much more controversial genes: genes that influence behavior—possibly including intelligence.

    Everyone at the meeting agreed on the need for non-“fraught” terminology—“geographic ancestry,” for example, instead of “race.” But specifying such ancestries is also a minefield. “Amerindian,” for example, is offensive to Native Americans, according to one speaker. “Caucasian” is also unacceptable because it implies racial rather than geographic ancestry. Some speakers even advised that it is inappropriate to refer to a “European allele” for lactose tolerance, because it also occurs in other groups.

    Participants acknowledged that however they characterize their findings, they can't control what the public makes of them. “When translated into popular culture, society reads whatever term we pick as 'race,'” said Timothy Caulfield, a health law professor at the University of Alberta in Edmonton, Canada. Carlos Bustamante, a population geneticist at Cornell University, said that when his group published a study in Nature this year indicating that European-Americans had more deleterious gene mutations than African-Americans, some publications touted the report as suggesting that blacks are fitter than whites.

    Some tense moments came during a discussion of a paper on brain genes. In 2005, geneticist Bruce Lahn and colleagues at the University of Chicago in Illinois reported evidence for selection in mutations of two genes regulating brain development that are more common in Eurasians than in Africans (Science, 9 September 2005, pp. 1717 and 1720). They hypothesized that these mutations were related to the human cultural explosion some 40,000 years ago (Science, 22 December 2006, p. 1871). Celeste Condit, a professor of speech communication at the University of Georgia, Athens, criticized the way the papers were written, saying they could be seen as having a “political message embedded” in them: that the genes might contribute to racial differences in brain size and therefore perhaps to racial differences in IQ. Lahn denied any political message, telling her she was “putting words in [my] mouth.”

    Ancestry, not race.

    Researchers are grappling with how to communicate genetic data on differences among populations.


    Later, Lahn commented that some scientists “are almost like creationists” in their unwillingness to acknowledge that the brain is not exempt from selection pressures.

    At the end of the day, Allen Buchanan, a philosophy professor at Duke University in Durham, North Carolina, warned the group against going overboard. “A visible, concerted effort to change vocabulary for moral reasons is likely to trigger a backlash,” he said. There's “risk of … stifling freedom of expression in the name of political correctness,” he said, and losing credibility in the process.

    *Workshop on Ethical, Legal, and Social Issues in Natural Selection Research.


    Economic Woes Threaten to Deflate Plans for 2009

    Jennifer Couzin*
    With reporting by Jon Cohen.

    Uncertainty has become the new norm for economic forecasters. But scientists planning next year's experiments want to know how the stock market turmoil, a credit crunch, and a recession will affect their research. It's an urgent question, especially with the U.S. government facing a yawning deficit and a likely squeeze on domestic spending. Among the first to feel the slowdown are charitable foundations and other philanthropies, which provide billions of dollars in funding to scientists each year, including support for innovative, risky research that the government may be reluctant to back. Some are scaling back; some say they're holding steady. Others say they cannot plan far ahead—not even to predict what the next 2 months, normally flush fundraising time, will bring.

    “I've been in this business 30 years, and I've never seen an environment” like this, says Richard Mattingly, executive vice president and chief operating officer of the Cystic Fibrosis Foundation, which in 2008 gave out $199 million in research money. The foundation relies exclusively on fundraising. Mattingly, emerging from a board meeting last week, said he expects funding to drop next year, though he can't yet say how much.

    One of the hardest-hit organizations so far is the Dr. Miriam and Sheldon G. Adelson Medical Research Foundation in Needham, Massachusetts, which has delayed $65 million in research funding to dozens of investigators for the second half of 2008 and 2009. Projects in melanoma, lymphoma, neurodegenerative diseases, inflammatory bowel disease, and others were ready to go, says Bruce Dobkin, the group's executive director and a neurologist at the University of California, Los Angeles. “Now, we're going to have to wait and see what happens.” Dobkin says he doesn't know what, precisely, prompted such drastic action, and the foundation declined to comment further.

    The Nature Conservancy, with more than 600 scientists on staff, decided 3 weeks ago to cut its current research budget by 10% and laid off some scientists. Peter Kareiva, the group's chief scientist, says international programs will be especially hard hit. The group is drawing up contingency plans in case further cuts are necessary.

    Groups that rely mainly on contributions are nervously entering their peak fundraising season. Last week, the Multiple Myeloma Research Foundation (MMRF) in Norwalk, Connecticut, held its biggest gala of the year, at the Hyatt Regency Greenwich Hotel in Connecticut, with supermodel Cindy Crawford. Nine hundred people accepted; the event was expected to raise $1 million—impressive, but below the $1.5 million originally hoped for, says Scott Santarella, the group's chief operating officer. MMRF scaled back its research funding several months ago for 2008, cutting it from $17 million to $15 million, but expects to bring it back up to $17 million in 2009. The group, like many charities, typically raises 40% of its money in the last quarter of the year.

    Similarly, the Michael J. Fox Foundation for Parkinson's Research held its star-studded gala this week in economically battered New York City, with the English rock band The Who performing. Leaders hope to raise a bit over $4 million from that event, down from last year's $5 million, says co-founder Debi Brooks.


    For organizations that live off endowment income, the drop in their value can be dizzying: The Burroughs Wellcome Fund fell from nearly $700 million at the end of July to $540 million last Friday, and the Bill and Melinda Gates Foundation lost $3.6 billion between the beginning of this year and the end of September, to end with $35.1 billion. The Howard Hughes Medical Institute's (HHMI's) worth fell from $18.7 billion in August 2007 to $17.4 billion at the end of August this year—and that was before the big drop in the stock market. (Neither the Gates Foundation nor HHMI would release current figures.) “Sometimes you go into a meeting and by the time you come out the endowment's gained $10 million, and by the end of the day it's lost $20 million,” says Burroughs Wellcome spokesperson Russ Campbell, describing the wild gyrations in the market.

    HHMI is required by law to distribute 3.5% of its assets each year, and foundations like Burroughs Wellcome must give out 5%. This is normally not a problem because these groups offset the outflow with investment gains, keeping their principal intact. Not all may be able to manage that this year.
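    The payout rules above are straightforward percentage arithmetic. A sketch, applying the quoted rates to the asset figures cited in this article, and assuming for simplicity that the payout is computed on a single asset snapshot (the actual rules average asset values over time):

```python
# Minimum annual distribution under the payout rules described above.
# Simplifying assumption: the rate applies to one year-end asset figure;
# real regulations average assets across the year.

def required_payout(assets, rate):
    """Minimum amount that must be distributed, in the same units as assets."""
    return assets * rate

hhmi_assets = 17.4e9  # HHMI, end of August 2008 (figure from this article)
bwf_assets = 540e6    # Burroughs Wellcome Fund, late October 2008

print(f"HHMI: at least ${required_payout(hhmi_assets, 0.035) / 1e6:.0f} million")
print(f"Burroughs Wellcome: at least ${required_payout(bwf_assets, 0.05) / 1e6:.0f} million")
```

    The squeeze the article describes follows directly: the required outflow is fixed as a fraction of assets, so when investment gains no longer cover it, principal must be spent.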

    A mid-September survey by the Association of Small Foundations (ASF), whose members have an average endowment of $20 million and give away $1 million each year, found that 84% said their endowments had dropped this year. But, responding days after the investment bank Lehman Brothers collapsed, 64% said they plan to maintain or increase grant budgets in 2009. That said, “I do get e-mails that say, 'Oh my gosh, we're down 30%,'” admits Tim Walter, ASF's CEO.

    Fundraisers, meanwhile, are considering how to persuade donors to keep giving. Although many will continue to send checks, those checks may be smaller than before. To prevent that, and to stem the departure of donors altogether, many groups are redoubling their communication efforts. At the American Cancer Society, chief medical officer and oncologist Otis Brawley recently disseminated a list of 10 scientific discoveries funded with ACS dollars. “We have not made any plans right now to decrease our funding for cancer research,” says Brawley; in fact, he says research funding is up about 5% over last year in this fiscal year, which began in September. But although he doesn't work closely with ACS fundraisers, “I see those guys on the elevator, and they're not happy.”


    17q21.31: Not Your Average Genomic Address

    Elizabeth Pennisi*
    With reporting by Martin Enserink.

    This region of chromosome 17 has had a storied history, with changes in its DNA of import to human evolution and disease.


    Short on DNA.

    From birth, Anne Zandee's development lagged. A deleted piece of chromosome 17 is to blame.


    For most of us, 17q21.31 is a meaningless alphanumeric. For geneticists, it's a genomic postal code identifying a region of chromosome 17. But for Tjitske Dansen, a Dutch mother of three, it's an answer for which she waited 17 years. From birth, her oldest daughter, Anne Zandee, had trouble. “She kept lagging in many respects: walking, talking, growing,” Dansen recalls. For a while, the toddler had epileptic seizures. Weak jaw muscles cause Zandee to drool; weak back muscles may have contributed to her scoliosis. “But we never knew what was wrong with her.”

    Four years ago, an orthopedics doctor referred Zandee to Bert de Vries, a clinical geneticist at Radboud University Nijmegen Medical Centre (RUNMC) in the Netherlands, to try to find a genetic explanation for the scoliosis. “There had been so many studies, and they never found anything, so we didn't think anything would come of it,” says Dansen. “We didn't hear anything from De Vries for a year and a half, when suddenly he called us.” He told them Zandee was missing a piece of chromosome 17; to be exact, a piece of 17q21.31. Zandee is now one of 22 documented cases of a new genomic disorder. “We were happy to find out,” says Dansen. But, “of course we had never heard of 17q21.31 before.”

    Among geneticists, however, 17q21.31 has been gaining notoriety for almost 20 years. Its half-dozen genes include one controversially implicated in Alzheimer's disease and firmly tied to other dementias. More recently, researchers excavating this single chromosomal address, located about 19 million bases down the “q,” or longer arm, of chromosome 17, have uncovered a tumultuous past. Here, vulnerable DNA has gone astray to cause mental retardation, learning disabilities, and even cancer. Genes hide within genes, and variation in this sequence even suggests to a few researchers that our species interbred with Neandertals. “It's probably one of the most bizarre and fascinating regions of the human genome,” says Evan Eichler, a geneticist at the University of Washington, Seattle.

    DNA locator.

    Under a microscope, gene-rich and gene-poor regions stain differently, creating readily defined and labeled genomic addresses. 17q21.31 is in red.


    Zoom, zoom

    The most famous gene that lives at this address is MAPT (microtubule-associated protein tau). Tau first drew neuroscientists here because it is the protein that gets jumbled together to form neurofibrillary tangles in the brains of patients with Alzheimer's disease. But even though Athena Andreadis, now at the University of Massachusetts Medical School's Eunice Kennedy Shriver Center in Waltham, and her colleagues cloned MAPT in humans in 1992, neither they nor others have been able to find mutations that could explain Alzheimer's. Most considered further investigations a waste of time and lost interest in 17q21.31.

    But John van Swieten of Erasmus University Medical Center in Rotterdam, the Netherlands, was eyeing that gene region with another disorder in mind: Pick's disease, a neurodegenerative condition in which individuals become increasingly bored, listless, and incapable of relating to others. Personal hygiene fails, emotions falter, and symptoms become progressively worse until full-time care and supervision are required. In this disease, also called frontotemporal dementia (FTD), the frontal lobe of the brain shrinks and tangles form.

    In 1994, Kirk Wilhelmsen and Timothy Lynch of Columbia-Presbyterian Medical Center in New York City established a link between FTD and 17q21.31 by evaluating how the disease was inherited in one large family. Four years later, Van Swieten, Erasmus geneticist Peter Heutink, and Michael Hutton, now at Merck Research Laboratories in Boston, pinpointed mutations in MAPT responsible for 10% of the cases of this disease.

    Suddenly, the gene had sex appeal. “It provided a rationale for why tau was important in Alzheimer's,” and it became possible to develop mouse models to study tau's effects, recalls Hutton. As a result, “a lot of people moved to the field,” says Van Swieten.

    Hutton, who now devotes his career to looking for potential Alzheimer's disease therapies that target tau tangles, was investigating yet another tau-related disease. In 1999, while sequencing the MAPT gene from patients with progressive supranuclear palsy, he and his colleagues noticed something odd. “We got interested in it almost as a piece of DNA rather than its relationship with the tangles,” recalls Hutton's collaborator, John Hardy of University College London.

    Most chromosomes undergo recombination: During cell division, bits of one chromosome swap places with comparable bits of the matching chromosome, introducing small differences in the DNA sequence from one generation to the next and between one individual and another. The pattern of those differences is called the haplotype. What struck Hutton, Hardy, and their colleagues was that there seemed to be two very distinct haplotypes in the MAPT region. One, dubbed H1, seemed to be slightly variable, indicative of some recombination. But the sequence of the other, H2, was nearly identical across about 1.3 million bases in everyone with that haplotype, at least at all of the bases they examined. “It was inherited as one long lump of DNA,” Hardy explains.

    Hutton and Hardy realized there was something very odd about H2. “There were clearly unusual structures in or close to the boundary of the haplotype block,” Hutton recalls. Researchers in Iceland were coming to a similar realization, and eventually they scooped the British group in making a startling determination: Almost a million bases in H2 were pointed in the wrong direction.

    Kári Stefánsson and his colleagues at deCODE Genetics in Reykjavik, Iceland, had also noticed the lack of variability along 17q21.31 in some individuals. When Stefánsson's group took a close look at the reference human genome sequence at this location, they realized that the reference, a consensus assembled from the DNA of multiple individuals, contained bits of both H1 and H2. So, they went back to the drawing board to sort out the differences between the two.

    Flip and fall.

    The H2 version of this genomic region has genes (green) and duplicated regions (thick arrows) facing the wrong way compared to H1. This orientation predisposes H2 to losing a gene-rich section of DNA (Deletion).


    A comparison of separate H1 and H2 sequences revealed that H2 has a 900,000-base stretch of 17q21.31 that is inverted relative to H1, the deCODE group reported in 2005. The boundaries, or breakpoints, of the inverted region consist of low-copy repeats, blocks of DNA duplicated multiple times. Based on a comparison with chimpanzee DNA from the same region, the researchers concluded that the second haplotype emerged at least 2 million years ago.

    The haplotypes were not evenly distributed, however. Most people are H1. Stefánsson and, independently, Hardy's group found H2 almost exclusively in Europeans, at a frequency of about one in five. “Since the inversion is largely restricted to Caucasians, we all thought the ancestral state would be the H1 orientation,” says Eichler. Indeed, the deCODE data suggest that once the inversion occurred, H2 spread because it provided a reproductive edge, says Stefánsson: Women with H2 had more children than women with H1, they reported in 2005. They saw a similar, but less clear-cut, trend for men. These results implied that H2 was under positive selection and should be on the rise.

    There was a problem with this scenario, though: Some data indicated that H2 predated H1. In August, Eichler and his colleagues showed that was indeed the case. Eichler's group sequenced both H1 and H2 and compared the two human versions of the region in detail with the same stretch of DNA in chimpanzees, macaques, and orangutans. All three macaque species they examined and the Sumatran orangutan carried only the inverted version. The two chimp species carried a mix of inverted and noninverted copies, with the inverted version predominating, and the Bornean orangutan had both as well. H2 is the more ancient haplotype, Eichler and his colleagues concluded in a paper published online by Nature Genetics on 10 August. “It's an amazing result,” says Hutton.

    Unusual distribution.

    A survey of different populations around the world reveals that the inverted version of 17q21.31 (H2) is largely confined to Europeans.


    The sequence comparisons also reveal that independently in humans, chimps, and orangutans, this 900,000-base region has reoriented itself into the H1 orientation, which explains why Eichler found both orientations in these primates. “This bit of DNA has been flip-flopping up and down. There must be an evolutionary reason for that, but we don't know what it is,” says Hardy.

    Eichler suspects that when H1 appeared, it somehow provided a strong fitness bonus and became much more common over time at the expense of H2. In Africans, H2 almost disappeared, except in the relatively few people who migrated to Europe 50,000 to 100,000 years ago. Then, for as-yet-unknown reasons, H2 provided its own advantage in the European population—as Stefánsson's data show—and the pendulum has begun to swing in the other direction.

    Hardy and, to a lesser extent, Stefánsson give credence to a more extreme explanation for the distribution of H2. Hardy thinks that H2 had disappeared from the modern humans moving out of Africa to populate the Northern Hemisphere but not from Neandertals, who reintroduced the inversion into the European gene pool through interbreeding with Homo sapiens 28,000 to 40,000 years ago. This view is not supported by the genetic evidence emerging from sequencing Neandertal DNA, and “I realize it's an off-the-wall idea,” says Hardy. But he nonetheless thinks it's plausible.

    Whatever happened, about 50,000 years ago, H2 went haywire, with duplicated regions begetting ever more duplicated regions. Low-copy repeats can destabilize a chromosome by confusing the DNA recombination machinery and causing the repeat regions to be copied extra times. Repeats can also cause skips, resulting in DNA between repeats getting left out. This had happened in H2 but not as much in H1. H2 has 441,000 bases' worth of repeats at the boundaries of the inverted DNA, compared with 169,000 bases in H1. H2 also carries extra duplications within its boundaries and is organized in such a way as to predispose the sequence to further rearrangement.

    Which came first?

    Readily distinguishable red and green tags merge and look yellow in chromosome 17 containing inverted DNA. Labeled macaque, chimp, orangutan, and human chromosomes reveal that the inversion dates deeper in the primate tree than the noninverted version.


    Lost DNA

    As Dansen and her daughter Zandee know all too well, those extra duplications can spell trouble. They were also the tip-off that drew De Vries, along with other researchers trying to understand mental retardation, to the microdeletion responsible for Zandee's syndrome.

    In 2002, Eichler and his colleagues were cruising the genome in search of repeats, or segmental duplications—nearly identical stretches of genome at least 10,000 bases long, each separated by 50,000 to 10 million bases—thinking they might mark places where the genome was in disarray and causing disease. They came up with 130 possible problematic spots, then investigated those spots in 290 people with mental retardation. They found 16 rearrangements, including a missing piece of 17q21.31 in four people. Using microarrays, they figured out that when the missing DNA dropped out, it took a half-dozen genes with it, including MAPT. The breakpoints are two 38,000-base low-copy repeats flanking the DNA deleted in these individuals.

    In each case, the parents carry the inversion, which arranges the repeats in a way that makes deletions more likely. “Some think it's the inversion itself that's the culprit, but it's not,” says Eichler. It's the large number of repeats and their orientation that make 17q21.31 vulnerable.

    De Vries came to this microdeletion by a different route. Eager to help parents understand the basis of unexplained mental retardation in their children, De Vries and his colleagues initially screened 340 patients using a technique called microarray-based comparative genomic hybridization to detect genomic rearrangements too small to see by simply staining the chromosomes. In one, they detected a missing piece of 17q21.31.

    Because the missing piece was flanked by low-copy repeats that might give rise to deletions in other individuals, De Vries and his colleagues designed a probe to test for that missing piece and screened an additional 840 mentally retarded individuals. They found two more people with the same deletion—one was Zandee—and the same set of symptoms. “We went first to the genotype and then to the phenotype,” says De Vries. “This was something new, but it will become more common.”

    Nigel Carter of the Wellcome Trust Sanger Institute in Hinxton, U.K., independently came across this microdeletion syndrome through queries to a database called DECIPHER. Entries include comparative genomic hybridization results, along with clinical descriptions of the symptoms of the people tested. “Many of us as clinicians may see one of these kids in our lives, but [these researchers] got the same descriptions with the same array findings from three [entries],” says James Lupski, a geneticist at Baylor College of Medicine in Houston, Texas.

    The three teams published independent reports back to back in 2006 in Nature Genetics. Now they have joined forces to describe 22 patients in molecular and clinical detail in a paper published online 15 July by the Journal of Medical Genetics. They calculate the prevalence of this new genomic disorder to be 1 in 16,000 newborns, and it may account for up to 0.64% of unexplained mental retardation in Europeans. “This is the first novel microdeletion syndrome identified and one of the most frequent ones,” says collaborator Joris Veltman, a molecular geneticist at RUNMC.

    The deleted region contains six genes, and at this point, researchers don't know which gene's loss matters most. Even so, “we've gone from 2 years ago not even knowing the syndrome existed to having [dozens of] kids [diagnosed],” says Lupski. “We're going to see that more and more and more.” Last year, Danish clinical geneticists came across a patient with unexplained mental retardation whose abnormality was an extra copy of what was deleted in De Vries's patients.

    Common cause.

    These individuals share facial features indicative of a DNA deletion, visualized by the absence of a red tag in a patient's stained chromosomes.


    Still a puzzler

    Despite this quick success, 17q21.31 is still slow to give up its secrets. It is clear that both haplotypes have their pitfalls: H1 increases the risk of progressive supranuclear palsy and other neurodegenerative diseases, likely by increasing the production of MAPT, and H2 increases the chances that offspring will have mental retardation because of a microdeletion. But at every turn, this genomic address proves a little more complicated.

    Consider the elusive Alzheimer's connection, where no causative MAPT mutations have yet been found. Frustrated, Christopher Conrad, a neurogeneticist at Columbia University, began looking for other undiscovered genes in that region. In 2001, he found a very tiny one inside MAPT that bears no resemblance to any known gene. “It's one of the few examples of a gene within a gene,” says Conrad. He named it Saitohin, after his late adviser, who helped get him started on the project, and has spent the past several years trying to figure out its role. The gene seems to have appeared first in primates, and Andreadis, who is collaborating with Conrad, has determined that it interacts with a protein involved in antioxidation. It seems to lead to alternative splicing of MAPT, which may result in a version of tau that is more likely to aggregate into tangles. “Given the appearance only in primates, it's tempting to say the gene could have something to do with brain development,” says Conrad.

    Likewise, Stefánsson is frustrated by 17q21.31's enigmatic connection to psychiatric disorders. “We've done a lot of work to see what the risk [of the inversion] is to schizophrenia, but we have not succeeded yet,” says Stefánsson.

    De Vries is continuing to search for more individuals with the microdeletion syndrome and to characterize the disorder. Before comparative genomic hybridization, about half the cases of mental retardation went unexplained. Now, this new technology is making sense of about 10% of those enigmatic cases.

    That makes a big difference to the parents. Dansen says her family was just glad to have a name for their daughter's disorder and to see that there were others just like her with the same problems. For at least one mother, the diagnosis brought good news. Her toddler was still not walking, but De Vries could reassure her that the others had also been slow to walk but did so eventually. “When I explained that to the mother, she was very relieved,” says De Vries.

    Now almost 20, Zandee plays in a special band, works in a canteen, and paints. Eventually, she will move from her parents' home to a group house with others with mental handicaps. “I will keep following the research,” says her mother, although she's not sure what more it will tell her. But she knows that by tracking Zandee's progress, De Vries will learn a lot about adult diseases associated with the syndrome and even about life expectancy. “My son recently asked, 'How long can she live anyway?'” says Dansen. “We have no idea. Nobody does.”


    Engineering a Fix for Broken Nervous Systems

    1. Greg Miller

    A recent meeting on neural prosthetics provided an update on progress and some interesting digressions.


    PALO ALTO, CALIFORNIA— “I believe we're at the beginning of a new age of neurotechnology,” Brown University neuroscientist John Donoghue told researchers who gathered here recently to discuss the state of the art in neural prosthetics, surgically implanted devices designed to restore sight to the blind, hearing to the deaf, and movement to paralyzed people. The idea of engineering a fix for nervous systems that can't heal themselves continues to spur both hope and hype; the meeting here at Bio-X, Stanford University's interdisciplinary research center, provided a glimpse of where the technology really stands. It also prompted frank discussions of current challenges and some fascinating, if slightly tangential, dinner conversations.

    If the age of neurotech is indeed upon us, Donoghue is one of those ushering it in. In 2001, he co-founded a company (Cyberkinetics) to develop and commercialize brain-computer interfaces. He and colleagues made headlines with a 2006 Nature report describing their work with Matthew Nagle, a young man paralyzed by a knife attack that severed his spinal cord. Surgeons implanted a 4 × 4-millimeter chip studded with 100 hair-thin electrodes into the part of Nagle's motor cortex responsible for planning arm movements. Now, as Nagle imagines moving his arm, a computer infers his intentions from the neural chatter. Videos accompanying the paper showed Nagle moving a cursor to operate an e-mail program and moving the fingers of a prosthetic arm. Nature apparently bleeped out Nagle's candid reaction when he first saw the hand respond to his thoughts: “Whoa, holy shit!” he says in an uncensored version Donoghue played at the meeting.

    Three more patients, including one suffering from amyotrophic lateral sclerosis, have now received implants. All have been able to use the thought-controlled cursor without any training, Donoghue said. But video clips of their efforts showed that the cursor's movement is plodding and wobbly. When Nagle attempts to draw a circle onscreen, the result is subpar. “We're asking him to draw a circle with 24 neurons,” Donoghue explains. “When we do something like that, we're using millions.” Brown computer scientist Michael Black has developed algorithms to reduce the wobble—but so far the tradeoff is an even slower cursor.

    Other presenters described the potential for prosthetic devices for people deprived of hearing or sight. Stanford University Medical School otolaryngologist and surgeon Nikolas Blevins gave a brief history of research on cochlear implants, beginning with a seminal, if ill-advised, experiment by Italian physiologist Alessandro Volta circa 1800. Volta connected two metal rods to a battery and stuck them into his ear canals. Apparently unharmed, he reported hearing something like water boiling, thereby demonstrating that electrical stimulation could produce the sensation of sound.

    Think about it.

    Researchers are testing neural prosthetics that would enable paralyzed people like this man with ALS to control a cursor with their thoughts.


    Today's cochlear implants have restored hearing to tens of thousands of people but still have drawbacks. One of Blevins's patients, an articulate middle-aged woman with a trace of a British accent, said her implant “gave me back my life.” But she still struggles to follow a conversation in a noisy restaurant and can't appreciate music. “It's just terrible, like honky-tonk piano or just bass and no melody,” she said. The likely problem, Blevins said, is that individual nerve fibers in the cochlea normally respond to a narrow range of frequencies, but the electrodes in the implants stimulate many fibers at once.

    Even so, cochlear implants are far ahead of retinal prosthetics, neuro-ophthalmologist Joseph Rizzo of Harvard Medical School in Boston told the audience. So far, about 50 people have received retinal implants, which transmit signals from a tiny camera to an array of electrodes attached to the retina. Patients tolerate the implants well, but exactly what they're able to see is difficult to know because the companies making the implants have been reluctant to release their data.

    One of the most surprising exchanges occurred over dinner. Vilayanur Ramachandran of the University of California, San Diego, who had captivated the audience earlier with case studies of neurological curiosities, was about to tuck into his salad when he was interrupted by a tap on the shoulder. “Are you the guy who did that transgender study?” asked Stanford neurobiologist Ben Barres. He was. In a paper last year, Ramachandran hypothesized that transgendered people who have reassignment surgery might be immune to the “phantom penis” phenomenon. Just as many people who have had an arm amputated retain a vivid sense that the arm is still there, he explained, about 60% of men who have their penis amputated for cancer experience a phantom penis. He believes such sensations arise because the brain's representation of the body still has a place for the missing appendage. But does the brain's body representation include a penis for a woman born into a man's body? Ramachandran thought not, and a preliminary survey backed him up: Transgendered people were far less likely to report phantom penises (or breasts, in the case of female-to-male operations).

    “That fits my experience exactly,” said Barres, who is transgendered, adding that he'd heard similar stories from other transgendered people. Ramachandran seemed relieved. He said he'd gotten flak from some psychologist colleagues who didn't like his suggestion that some aspects of transsexuality could be explained by innate differences in the brain's body map. “I have a name for that,” he said. “I call it neuron envy.”


    The Graying of NIH Research

    1. Jocelyn Kaiser*
    1. With reporting by Rachel Zelkowitz.

    Many scientists who got their first grant in the 1950s or 1960s are still going strong. How do they view affirmative action for first-time grantees?


    Lifelong passion.

    Harold Scheraga, 87, Phillips Robbins, 78, and Roger Unger, 84, are active researchers with NIH funding.


    Roger Unger found himself drawn to research as a young internal medicine resident sometime around 1950, when he was treating diabetes patients in New York City. He had a controversial idea—that glucagon, a biomolecule then thought to be a contaminant in insulin made from ground-up beef and pork pancreases, might actually be a key hormone affecting blood sugar. Unger and colleagues in Texas had no direct evidence for this, but “we had the tools to answer the question, and we needed some money,” Unger says. So at age 32, Unger applied for and won a research grant from the U.S. National Institutes of Health (NIH).

    It didn't seem hard, “because I didn't know what I was doing back then,” says Unger, now at the University of Texas (UT) Southwestern Medical Center in Dallas. Several years later, Unger's group published a landmark paper pinning down glucagon's role as a counter to insulin in regulating blood glucose levels: Glucagon tells cells to make more glucose, whereas insulin brings excess amounts down.

    Today, at 84, Unger still runs a lab that enjoys NIH support. Now he's motivated by a new public health problem—the “meltdown” in Americans' health due to rising rates of obesity, he says. He's deep into exploring a concept his lab put forward: that a surfeit of lipids in obese people contributes to diabetes and heart disease. “I always decided I would retire when I ran out of ideas. But I didn't. The ideas got more exciting,” says Unger.

    That researchers such as Unger are still going strong in their 70s and 80s—and pulling down grants—would have been unheard of 3 decades ago. Because the biomedical enterprise was young and most universities had mandatory faculty retirement until 1994, there were few NIH-funded principal investigators older than 70 in 1980. But in 2007, there were at least 400 of them, according to NIH data. Indeed, NIH projections indicate that grantees over 68 could outnumber scientists under 38 by 2020 (see graph). The average age for obtaining a first NIH research grant is now 42. These data worry some research leaders, who have called on the community to reverse the trend. They have also contributed to a sense of crisis at NIH, which is taking steps to bolster the number of new investigators and slow the rising age of the average NIH-funded scientist, now 51 (see p. 834).

    NIH officials say they do not mean to discourage very senior investigators from continuing in research. “It's not young against old,” says NIH Director Elias Zerhouni. The number of investigators over 70 among those funded by NIH is a tiny fraction of the total, and some of them are incredibly productive into their later years—for example, Nobel laureates Eric Kandel and Paul Greengard are both around 80. Furthermore, peer review is supposed to winnow out any whose productivity has decreased. Scientists who have served on study sections generally say they haven't noticed a bias in favor of keeping older scientists' labs running, even if many of the reviewers are the applicants' former students and postdocs.

    At the same time, concerns about the aging biomedical work force have prompted NIH to deploy what amounts to an affirmative action plan, setting numerical targets at each institute for grants to newcomers. To sample the community's views of this plan, particularly among those who won't benefit from the initiative, Science interviewed a score of researchers 70 or older. Most were drawn from a list of NIH investigators who have had the same basic NIH research grant, known as an R01, for at least 35 years; nearly all of them are men. We asked: How does a very senior scientist decide when to shut down his or her lab? And does the current plight of young investigators influence their thinking? Most praised the idea of introducing fresh blood, but only about half said that they're ready to relinquish their own lab.

    No time to quit

    One strong theme—a sense that the review process was more interested in originality in the past—emerged in comments from this generation of scientists who applied for their first grants in the 1960s or earlier, often in their 20s or early 30s. It was a different game, they say. Not only did NIH have plenty of money to go around, but peer reviewers wanted ideas, not preliminary data. Microbiologist Samuel Kaplan, 74, of the University of Texas Medical School in Houston says he proposed studying a “newish” bacterium that he had never cultured. “If I submitted a proposal like that now, the study section couldn't stop laughing,” he says. Peter von Hippel, 77, who earned a Ph.D. from the Massachusetts Institute of Technology in Cambridge at age 24 and then moved on to a postdoc there, found his grant waiting for him when he joined the Dartmouth Medical School faculty at age 28. “There was less to learn, and if you got on to a good project, things moved along pretty fast,” says Von Hippel, now a professor emeritus and researcher at the University of Oregon, Eugene.

    Some of these scientists were part of a cadre who created the field of molecular biology. Others were pioneers in areas such as spectroscopy and protein chemistry. Forty or more years later, most have published hundreds of papers and trained scores of graduate students and postdocs. Many are members of the National Academy of Sciences. Some of those interviewed edit journals. (NIH intramural researcher Herbert Tabor, the editor of the Journal of Biological Chemistry, is nearly 90.) And many are still publishing in high-profile journals such as the Proceedings of the National Academy of Sciences and Science.

    Most of those over 75 said they have cut back their research in recent years and stopped taking graduate students, who might be left in the lurch if their mentor developed health problems. Some have retired and are now emeriti, so their university no longer pays their salaries. Most say they are sympathetic to the funding difficulties faced by young investigators and support NIH's plans to target more grants to this group. “I couldn't agree more that we have to bring down the age of investigators,” says Unger.

    Representative of the nonretiring group is Cornell University protein chemist Harold Scheraga, 87, who may be the oldest NIH investigator. Since 1947, he has published more than 1200 papers, 20 of them in 2008. Scheraga is winding down an NIH grant for experimental work that expires in March 2009, which will free up lab space for a new faculty member, he points out. But he plans to continue with 10 workers on another NIH grant, the one that funds his theoretical study of protein folding, which he's had for 52 years.

    “I'm very productive and making good progress,” Scheraga says. “I'll keep going as long as I'm sane and my health is holding up. Only when somebody—my peers or myself—says that my science is washed up will I quit.”

    Some say that, like Unger, they're motivated by finding new research directions. The University of Pennsylvania's Robin Hochstrasser, 77, has a 14-person lab that is using lasers to study how protein structures change with time. “These techniques were only created 8 years ago. Close to 100 people are using them, and they started here,” he says. But he praises NIH's new grants for young investigators and thinks setting targets for newcomer R01s is “reasonable … to ensure the future of medical research.” John Dietschy of UT Southwestern, 76, says he has no plans to give up his R01 of 44 years on cholesterol metabolism, which ranked in the top 1% of proposals when it was last reviewed. “We're ahead of everybody in our field at the moment,” says Dietschy. “As long as I'm having fun in the lab, we'll probably keep going.”

    The passion for doing research doesn't correlate with youth, some point out. “I think the people who are my age and continue to work in science have a certain amount of tenacity and they have a passion for it. I see the flame extinguishing in people in their 40s or 50s as much as people in their 70s,” says molecular biologist E. Peter Geiduschek, 80, of the University of California (UC), San Diego. He says he will “keep doing research until somebody stops me from doing it.” Like many others interviewed by Science, he says he can't imagine doing anything else.

    Biochemist Carl Frieden, 79, of Washington University in St. Louis, Missouri, also says he will let peer reviewers tell him when it's time to close his lab. He says that although he's sympathetic to the struggles of young scientists, funding should be based strictly on scientific merit, not age. “We're the only profession judged by our peers every 3 to 5 years. If older scientists can pass that trial, I'm comfortable with that,” he says.

    Graying work force.

    NIH investigators are aging, and those over 68 could outnumber those under 38 by 2020.


    Moving on

    Others have decided to wind down. For molecular geneticist Robert Wells of Texas A&M University in College Station, who is just 70 but gave up his NIH grant and has been closing his lab for the past 2 years, the difficulties young investigators face were foremost in his thoughts. “If I and other old birds continue to land the grants, the [young scientists] are not going to get them,” he says. He worries that the budget won't be able to support the 100 people “I've trained … to replace me.” He will stay involved in science through advocacy.

    Boston University biochemist Phillips Robbins, 78, has mixed feelings about his plans not to seek renewal of his grant when it ends in 3 years. He recently teamed up with a parasitologist to study glycosylation patterns in human parasites such as Giardia. “It's almost as though I've opened up a brand-new career,” he says. But it's time, he says. “I think the folks who want to go out the door feet first, that that mindset is wrong. Once I reach 81, 82, it would be a poor decision for myself, for my university, and for students.”

    “It depends on what you're working on,” suggests UC Berkeley biochemist Howard Schachman, 89, famous for fighting forced retirement at UC schools (Science, 14 September 1990, p. 1235). He says he let his last R01 lapse a few years ago only because his work studying a bacterial enzyme is out of date. “To what extent do you keep working and depriving young faculty of space in your department? I asked myself that question at 80 and decided I should keep going. But I couldn't do that today,” he says. But Schachman, now emeritus, teaches the main biomedical research ethics course at Berkeley. “It's something that was interesting and important to me,” he says.

    For those researchers who do decide to leave the lab, the transition should be easier, says Harvard University molecular biologist Richard Losick, 65. He wishes there were more recognition for teaching and mentoring junior faculty members. “I don't think the culture of science fosters a graceful transition for aging scientists,” says Losick, who plans to shift toward more teaching himself in a few years. Others support the idea of giving retired faculty a small lab and encouraging them to keep up other activities.

    Whatever their individual choices, the dilemma of how and when aging scientists should hang up their lab coats is only going to become more urgent. As Frieden points out, “It's rare to be as old as I, but there are going to be more of us.”

  10. Parsing the Genetics of Behavior

    1. Constance Holden

    It takes more than one gene, or even a few genes, to make a personality trait. But which ones?


    Some years ago, a scientist-educator told Science she would never be convinced of a biological basis for sex differences in math performance until someone showed her a “math gene.” The comment rests on a commonly held misconception: that simple one-to-one links exist between a gene and each facet of our personalities. Headlines such as “'Ruthlessness' Gene Discovered” or “'Divorce Gene' Linked to Relationship Troubles” feed that impression.

    For some of us, it's satisfying to attribute social awkwardness to anxiety genes or to think that the driver who cuts off other cars as he zips across lanes is pumped up by the “warrior” gene. Was it a bad dopamine receptor gene that made author Ernest Hemingway prone to depression? Can variations in a vasopressin receptor gene—a key to monogamy in voles—help explain adulterous behavior?

    But as scientists are discovering, nailing down the genes that underlie our unique personalities has proven exceedingly difficult. That genes strongly influence how we act is beyond question. Several decades of twin, family, and adoption studies have demonstrated that roughly half of the variation in most behavioral traits can be chalked up to genetics. But identifying the causal chain in single-gene disorders such as Huntington's disease is child's play compared with the challenges of tracking genes contributing to, say, verbal fluency, outgoingness, or spiritual leanings. In fact, says Wendy Johnson, a psychologist at the University of Edinburgh, U.K., understanding genetic mechanisms for personality traits “is one of the biggest mysteries facing the behavioral sciences.”

    All we really know so far is that behavioral genes are not solo players; it takes many to orchestrate each trait. Complicating matters further, any single gene may play a role in several seemingly disparate functions. For example, the same gene may influence propensities toward depression, overeating, and impulsive behavior, making it difficult to tease out underlying mechanisms.

    Each gene comes in a variety of flavors, or alleles, that differ in their DNA sequence. One allele might contribute to a winning personality whereas another may raise the risk of mental illness. Environment plays a strong hand, bringing out, neutralizing, or even negating a gene's influence. And genes interact with one another in unpredictable ways.

    Science took a look at a few genes that have been in the news, with an eye toward understanding just what we do—and can—know about genes behind individual variation in temperament and personality.

    Loves me, loves me not …

    A genetic screen for marital success? It sounds like a Saturday Night Live skit, but one Canadian company is actually offering just that sort of test. For $99, Genesis Biolabs in St. John's, Newfoundland, will examine your—or your partner's—vasopressin 1a receptor (AVPR1a) gene, which this year grabbed headlines once as the “ruthlessness gene” and again as a “divorce gene.” But is the test really any more predictive than pulling petals off a daisy?


    Vasopressin is a hormone involved in attachment to mates and offspring. Among voles, prairie voles are true to their mates. Meadow and montane voles prefer to play the field. Prairie voles have a few extra bases in the DNA in front of this gene, which influences how much and where vasopressin is released in the brain. This difference matters: Extra AVPR1a in the brain makes promiscuous meadow voles act more like monogamous prairie voles, spending more time with partners and grooming offspring (see p. 900).

    Subsequent research has disproved any simple relationship between this gene and animal mating patterns. Nonetheless, scientists have now observed hints that variation in the human AVPR1a gene may influence the far more complex arena of human behavior.

    A team led by Hasse Walum of the Karolinska Institute in Stockholm looked at the DNA preceding the AVPR1a gene in about 500 pairs of adult same-sex Swedish twins, all of them married or cohabiting for at least 5 years, and their partners. One short variant of a stretch of DNA in this region—there are several variants—was associated with less stable relationships. Answers to questions such as “How often do you kiss your mate?” and “How often are you and your partner involved in common interests outside the family?” reflected slightly lower feelings of attachment on the part of men with this variant, researchers reported in the 16 September issue of the Proceedings of the National Academy of Sciences. These men were less likely to be married and, among those in relationships, more likely to have experienced recent marital strife.

    A gene worth testing to be assured of marital bliss? Not quite. “This is a brand-new study for which replication has not been attempted,” Johnson points out.

    Another paper published last spring showed a different link between AVPR1a and how people treat others. Richard Ebstein and colleagues at Hebrew University in Jerusalem reported that the length of the variant predicted how human subjects would respond in the “dictator game,” a way to assess altruism. The researchers divided 200 volunteers into groups “A” and “B.” The “A's” received $14 each and were told to share as much as they wished with a “B” whom they had never met. About 18% kept all the money, and 6% gave it all away, with the rest somewhere in the middle. The people who behaved more selfishly—or, as the headlines proclaimed, more ruthlessly—had the same variant as the people with the less stable relationships in the study mentioned earlier. Ebstein speculates that in these people, vasopressin receptors were distributed in such a way that they provided less of a sense of reward from the act of giving (or loving). He and other scientists suspect that short variants of this gene will be implicated in autism and related disorders, because a core feature of autism is the inability to make connections with other people.

    Although such theories are intellectually appealing, there are few replicated studies to give them heft, notes psychologist Simon Easteal of Australian National University in Canberra. Too often, the subjects assessed are too different—How does one compare adolescents with married couples?—and the effect of environment too difficult to control for. So, getting reliable replications of studies involving behavior is, he says, “much harder than for studies of medical conditions.”


    A “bounce-back” gene

    Some people are like Woody Allen characters who melt down in the face of the smallest obstacles. Others seem to have a thick hide against life's slings and arrows. The roots of such resilience may lie in a gene for a protein that regulates serotonin, a brain messenger that has been associated with emotional ups and downs. The gene is called SERT for serotonin transporter.

    In a classic paper published in Science in 1996, Klaus-Peter Lesch of the University of Würzburg, Germany, and colleagues at the U.S. National Institutes of Health reported that the length of the regulatory DNA at the beginning of SERT affected human behavior. Lesch's team found that among 505 adults, those scoring high on various tests measuring “neuroticism”—depression and anxiety—tended to have one or two copies of a short variant whereas those who were more laid back had only the long form. The short version translates into more serotonin in the synapse, and too much serotonin leads to anxiety, in both animals and humans.

    The short version accounted for up to 4% of the variance in anxiety and negative emotions in this group. Four percent doesn't sound like much, but it's huge for any personality trait, says psychologist Turhan Canli of Stony Brook University in New York state. In fact, he says, scientists have been able to find “no gene in the intervening years that has accounted for that much variability.”

    In another landmark study published in 2003, researchers showed that the effect of the gene depends on life experiences. In Dunedin, New Zealand, researchers led by Avshalom Caspi of the Institute of Psychiatry in London tracked 847 people over more than 20 years from the age of 3. The researchers counted stressful life events occurring between the ages of 21 and 26 and asked subjects if they had been depressed in the past year.

    Among people who had not reported any major life stresses, the probability of depression was low regardless of their SERT alleles. But among people who had been through four or more stressful experiences, 43% of those with two short alleles reported a major depressive episode—more than double the proportion of subjects with two long alleles. The study also showed that almost two-thirds of people with a history of abuse as children experienced major depression as adults if they had two short alleles. But child abuse didn't raise the risk of adult depression in people with two long alleles.

    Unfortunately, however, the picture is still unclear. Psychologists at the University of Bristol in the United Kingdom published a meta-analysis of studies of this gene in July in Biological Psychiatry. They concluded that the published studies weren't based on large enough samples and that the interaction effect between the gene and stressful life events is probably “negligible.”

    The more researchers look into this gene, the more widespread its associations appear to be, adding to the confusion. “The serotonin transporter is implicated in everything from heart disease to sleep disorders and irritable bowel syndrome [as well as] schizophrenia, depression, attention deficit hyperactivity disorder, autism, and sensation-seeking, to name just a few,” says Johnson. With such a broad scope, its effects on behavior must be “extremely general,” she notes. So to call it a resilience gene doesn't really fit.


    Warrior gene

    In 2006, a New Zealand researcher, Rod Lea, stirred up a political storm when he reported that a variant of a gene for monoamine oxidase-A (MAO-A)—which breaks down neurotransmitters—could be behind risk-taking and aggressive behavior in Māori, the indigenous people of New Zealand. The Māori have a warlike heritage, and a large proportion of this ethnic group carry a version of the gene shown in animal studies to be connected to aggressive behavior. Lea, a genetic epidemiologist at the Institute of Environmental Science and Research in Wellington, suggested that the gene could help explain Māori social and health problems such as fighting, gambling, and addictions. Although it's true that 60% of Asians (including Māori) carry the “warrior” variant (compared with 40% of Caucasians), Lea's critics quickly pointed out that it was too big a leap to ascribe Māori social problems to a single gene.

    Yet brain-imaging studies “underscore the role of MAO-A [quite] specifically” in male aggressiveness, says neuroscientist Nelly Alia-Klein of Brookhaven National Laboratory in Upton, New York: Researchers have not detected connections between brain MAO-A and any other personality trait, she notes. In one study using functional magnetic resonance imaging, Andreas Meyer-Lindenberg and colleagues at the National Institute of Mental Health (NIMH) in Bethesda, Maryland, presented normal subjects with neutral or “emotionally aversive” images such as fearful faces. Monitoring activity in key brain regions such as the amygdala, the seat of fear, they found that the amygdalas of subjects with the “warrior” variant were hyper-responsive to such images. This sensitivity suggests that these individuals had problems regulating their emotions, which would also make them more likely to act on aggressive impulses, Meyer-Lindenberg reported.

    But the gene variant isn't all that matters. Caspi's Dunedin study has shown that the environment—in the form of traumatic life events—plays a critical role in how this gene is expressed. Caspi's group reported in 2002 that the warrior MAO-A variant is associated with violent and antisocial behavior but only in people with a history of abuse as children. These men were 2.8 times as likely as nonabused males with this genotype to develop behavioral problems that are often the precursor to a life of crime and drug abuse. Children with a different variant were less likely to develop antisocial problems in response to maltreatment (Science, 2 August 2002, p. 851).

    Earlier this year, researchers drew similar conclusions based on the University of North Carolina's (UNC's) long-running National Longitudinal Study of Adolescent Health (NLSAH). Guang Guo of UNC Chapel Hill and colleagues analyzed genetic and social data from 1100 males and found that the undesirable effects from the “warrior” allele were only manifested when “social controls”—the steadying influence of a healthy family and social environment—were absent. They reported these results in the August 2008 issue of the American Sociological Review.

    Testosterone seems to add toxicity to the mix. Rickard Sjöberg of Uppsala University in Sweden and David Goldman of the National Institute on Alcohol Abuse and Alcoholism (NIAAA) in Bethesda compared the genes and testosterone levels of 95 male alcoholics with criminal records with those of 45 nonalcoholic, law-abiding controls. They reported that the combination of low MAO-A and high testosterone spells antisocial behavior, as revealed by answers on an aggression scale. If these findings are replicated, Goldman says, they might help clear up the relationship of testosterone to aggression: Maybe the hormone causes trouble only in males who also have this gene variant, he says.

    The warrior gene as the root of social ills may be dead, but it still has a fighting chance as a gene important to behavior.

    Can't get no satisfaction

    What do Janis Joplin, Amy Winehouse, and Jimi Hendrix have in common? If you want to find examples of people whose brain reward circuits have gone haywire, the world of rock stars is probably a good place to look. But is a dopamine receptor gene at the heart of these musicians' addictions?

    Scientists have proposed that deficiencies in the brain messenger dopamine lead to various unhealthy forms of sensation-seeking to compensate for the failure to get a charge out of things that give most people pleasure. For years, they've been trying to nail down the role of dopamine receptors, in particular one called the D2 dopamine receptor, in addictions to alcohol, drugs, smoking, or gambling, as well as eating disorders and obesity.

    The A1 allele of this gene yields receptors that don't work as well, and that translates into less dopamine firing up the reward circuits. Some scientists think this can lead to a tendency to abuse drugs and engage in impulsive, sensation-seeking, or antisocial behavior—including problems forming relationships.

    Scientists led by anthropologist Dan Eisenberg of Northwestern University in Evanston, Illinois, reported last year in Evolutionary Psychology that in a group of 195 student subjects at Binghamton University in New York state, those with A1 alleles were more likely to engage in early sexual activity but were less inclined to develop steady relationships. This putative role in attachment has attracted the attention of political scientists looking for possible biological foundations for political behavior (see p. 912).

    James Fowler and colleagues at the University of California, San Diego, picked up on reports such as this, as well as on animal research suggesting a connection between low dopamine receptor binding and low social bonding. They hypothesized that people with more efficient receptors—that is, with one or more A2 alleles—would be more trusting and therefore more likely to join a political party. After delving into NLSAH, they reported that, indeed, people with two A2 alleles (and no A1) were 8% more likely to form political attachments. Fowler called it “the first gene ever associated with partisan attachment.”

    But that's only the latest in the long and contradiction-riddled history of research on the D2 dopamine receptor gene. Guo looked for a link between social behavior and this gene by assessing delinquency rates in teenagers. What he found was that boys with one A1 allele tended to have higher delinquency rates than those with two copies of the A2 allele. But the rates were also higher than in those boys with two A1 copies, suggesting that there is not a simple relationship between the amount of dopamine and behavior. Warns Goldman: “There is still more heat than light with this gene.”


    Titrating anxiety

    Scientists aren't doing much better at understanding the biological role of another player in the dopamine circuit. Dozens of studies have tried to figure out the gene for catechol O-methyltransferase (COMT), an enzyme that breaks down dopamine in the prefrontal cortex, the seat of higher cognitive functions such as planning and reasoning.

    The two major variants of the gene code for enzymes that differ by one amino acid: The substitution of a valine for a methionine revs up the protein's activity fourfold. Both the high- and low-activity versions of the gene have their costs and benefits. Mice with the high-activity COMT—meaning less dopamine in the synapses—have poor memories and reduced sensitivity to pain. With the gene knocked out, and thus higher dopamine activity, mice show increased startle and anxiety responses.

    In humans as well, different versions of the gene have been implicated in cognitive and emotional dysfunction, says Goldman. In several studies, people with two low-activity COMT genes have tested high for fear, anxiety, and negative thinking. A study at Yale University in 2005, for example, gave 497 undergraduates personality tests and found that those with low-activity COMT genes were more neurotic and less extraverted.

    In research getting closer to the interface between biology and behavior, published in the August issue of Behavioral Neuroscience, researchers reported a difference in a simple test that has come to be recognized as a reliable indicator for anxiety: the startle reflex, as manifested in involuntary eye blinking in response to a sudden noise or unpleasant pictures. Among 96 female psychology students, individuals with two copies of the low-activity COMT had the most exaggerated startle responses, says Christian Montag of the University of Bonn in Germany.

    Yet other work evaluating how well individuals organize their thoughts found low-activity COMT to be an asset. Psychiatrist Daniel Weinberger and colleagues at NIMH think they know why. Brain-imaging studies of 100 normal adults found that those with the low-activity COMT have denser nerve connections. Weinberger and others speculate that the elevated dopamine in the prefrontal cortex may bolster temporary connections, leading to better concentration but reduced ability to shift focus and more behavioral rigidity. As a result, a person may dwell excessively on stressful thoughts. So the gene seems to come with a tradeoff—better cognitive function but more anxiety—the scientists conclude.

    The trouble with a lot of research on COMT, however, is that some studies find significant linkages only in women, and others don't find any at all. “COMT leaves a trail of intriguing hints,” says Edinburgh's Johnson, “but nothing that solidly replicates.”

  11. Wanted: Math Gene

    1. Constance Holden

    Last year, researchers at Washington University in St. Louis, Missouri, reported in the journal Behavior Genetics that certain aspects of IQ seemed to be related to CHRM2 (cholinergic muscarinic 2 receptor), a gene whose protein is involved in pathways related to learning, memory, and problem-solving.


    There, a team led by psychiatric geneticist Danielle Dick analyzed DNA and IQ test results from members of 200 families, 2150 individuals in all, as part of the Collaborative Study on the Genetics of Alcoholism. The team found a modest correlation between spatial and logical reasoning skills and certain variations in this gene.

    But this study is one of very few to find any connection between genes and IQ—and it has yet to be replicated. This situation reflects a major paradox. Cognitive abilities are among the most genetically influenced of human behavioral traits: In studies over the years, scientists have estimated that between 40% and 80% of the variation in individual IQ scores in a given population is attributable to individual genetic differences. This is comparable to the genetic influence on height. Yet IQ genes have so far been impossible to nail down.

    Psychologist Robert Plomin of the Institute of Psychiatry in London has spent years scouring genomes for signs of loci associated with high IQ. The largest study yet was a genome-wide scan of DNA from 6000 children using 500,000 markers that could help pinpoint relevant DNA. The study compared groups of low-IQ children with groups of high-IQ children in hopes of teasing out markers linked to intelligence.

    A handful of markers had a significant association with the aspects of IQ deemed most heritable, such as verbal ability. But none accounted for more than 0.4% of the variance. In other words, if the IQs of the population in question ranged from 80 to 130 points, the biggest gene effect the researchers could find would account for less than one-quarter of an IQ point.

    It seems to be much easier to identify genes for disabilities than for abilities. “The only genes we have identified so far for cognitive ability are for mental retardation, and there are about 300 of them,” some of which have quite severe repercussions, says Wendy Johnson of the University of Edinburgh, U.K. Many are also associated with other types of disabilities. But corresponding genius-type alleles, particularly for specific skills such as math ability, don't seem to exist.