News this Week

Science  31 Oct 2014:
Vol. 346, Issue 6209, pp. 526


  1. This week's section

    Giving shapes to signaling proteins

    New consortium aims to find structures for 200 membrane proteins.


Some 40% of all approved drugs target the proteins called G protein-coupled receptors (GPCRs), which relay signals across the cell membrane. But we know the 3D shapes of just 22 of the estimated 826 human GPCRs. This week, a trio of U.S. and Chinese academic institutions announced that they'll join forces over the next 5 years with three pharmaceutical companies to determine the structures of 200 more. Obtaining such structures has been difficult. The GPCRs solved to date all required adding small druglike molecules in order to stabilize them, says Raymond Stevens, a structural biologist at the University of Southern California (USC) in Los Angeles, who will lead the new consortium. Drug companies are well placed to supply such stabilizers, Stevens adds. In exchange for sharing their molecules, each of the companies—Amgen, Sanofi, and Ono—will get to recommend five GPCR targets per year. In addition to USC, the iHuman Institute at ShanghaiTech University in China and the Shanghai Institute of Materia Medica will participate.

    Postcards from the edge of space

    Alan Eustace dangling beneath an unseen helium-filled weather balloon before his free fall.


Computer scientist Alan Eustace, 57, a senior vice president at Google, last week broke the records for the world's highest and fastest free fall jump. Eustace jumped from 41,419 meters above Earth's surface, 2374 meters higher than previous record-holder Felix Baumgartner, who made his leap in 2012. Eustace began his free fall at the upper edge of Earth's stratosphere. He plummeted for 4 minutes and 27 seconds and reached a top speed of 1321 kilometers per hour before deploying two parachutes. The stratosphere is low in oxygen, and its air temperature increases with altitude, so Eustace's pressurized life-support suit required careful engineering to keep him cool and breathing. Other technologies that accompanied the VP on his sound barrier–breaking descent included cameras and a simple two-way radio to communicate with ground control. “It was a wild, wild ride,” Eustace told The New York Times.

    Tummy in a dish

    This tissue from a lab-grown human stomach could help researchers test drugs.


    The colorful snakelike image here is fluorescently labeled tissue from a stomach smaller than a pea. But the organ is not from a very small animal; it's a mini human stomach grown in a dish. The digestive systems of mice, flies, and other model organisms differ from those of humans, making them of limited use for studying human gut diseases. So researchers have turned to pluripotent stem cells—cells derived from embryos or reprogrammed adult cells that can potentially turn into any cell type—to grow digestive organs in the lab. Last week in Nature Medicine, one group unveiled a small intestine created from human stem cells. This week, a different team reports in Nature that they've coaxed both types of stem cells to form small spheres with all the properties of a functional stomach. When the researchers exposed the tiny stomachs to the bacterium Helicobacter pylori, which contributes to stomach ulcers and cancers, they saw the same cellular changes known to occur in life-size stomachs.

    “The perfume … is quite strong.”

    Kathrin Altwegg, principal investigator for the Rosetta probe's spectrometer, on the “odor” given off by comet Churyumov-Gerasimenko, which emits hydrogen sulfide, ammonia, formaldehyde, and other pungent compounds.

    By the numbers

    40%—The minimum cut in greenhouse gas emissions, compared with 1990 levels, that Europe should reach by 2030, according to an agreement E.U. leaders announced on 23 October.

$65 million—The largest gift ever made to the University of California, Santa Barbara, given by billionaire investor Charles Munger to fund a residence for visiting scholars at the Kavli Institute for Theoretical Physics.

    35,000—Estimated number of African lions, compared with 76,000 in 1980, according to a U.S. Fish and Wildlife Service proposal to list the animal as a threatened species.

    Around the world


    Chinese science funding revamp

The Chinese government is readying a major shake-up of how it doles out science funding. The Ministry of Science and Technology may hand control of the lion's share of research spending to as-yet-unidentified “independent institutes,” reports the state-run People's Daily. In 2013, the ministry distributed $3.6 billion in R&D funding, primarily through its 863 high-tech development and 973 basic research programs. In an interview with China Radio, science minister Wan Gang welcomed the pending reform, saying that it's intended to “get rid of the shackles on technological innovation.” Observers say that the National Natural Science Foundation of China could serve as a model for what may be a new agency or agencies for managing R&D spending.

    Bethesda, Maryland

    Pause on virus studies protested

    Risky studies on the H5N1 bird flu virus are on hold.


    A U.S. government moratorium on certain risky virology studies has gone too far, researchers said at a 22 October meeting of the National Science Advisory Board for Biosecurity (NSABB). The 17 October pause halts new federal funding for so-called gain-of-function (GOF) studies that make influenza, MERS, or SARS more transmissible in mammals or more pathogenic. The idea is to provide a year for experts to work out a U.S. government-wide policy for reviewing the risks and benefits of GOF studies. But scientists told NSABB that the pause is sweeping up even routine influenza surveillance and vaccine work and halting efforts to develop a mouse model for MERS. White House officials said they are working with the National Institutes of Health to allow exceptions for research needed to protect public health.

    East Lansing, Michigan

    Creationist event stirs concerns

    A creationist conference set for Michigan State University (MSU) is creating unease among some of the school's students and faculty, which includes prominent evolutionary biologists. The 1 November Origin Summit is sponsored by Creation Summit, an Oklahoma-based Christian group founded to “challenge evolution and all such theories predicated on chance.” The event will include eight workshops, including one on why “the big bang is fake” and another targeting the work of MSU biologist Richard Lenski, who has led a major study of bacterial evolution. The organizers invited Lenski to participate in a debate, but he hasn't responded and is encouraging others to ignore the event. It “will be just another forgettable blip in the long history of antiscience, antievolution screeds,” he predicts.


    Pakistan's polio program blasted

    Pakistan's polio eradication program is “a disaster” and needs new leadership, concludes the Independent Monitoring Board of the Global Polio Eradication Initiative in an October report. So far in 2014, more than 200 Pakistani children have become paralyzed, accounting for 80% of the world's polio cases, and the country has exported the virus to Afghanistan and the Middle East. Killings of vaccinators and a Taliban ban on polio vaccination in North Waziristan pose enormous challenges. But “[t]he government … can reach its children if it wants to,” the report says, calling on the prime minister to immediately move the program from the ministry of health to the National Disaster Management Authority. Without decisive action, “Pakistan is very likely to be the polio virus' last home on earth,” it warns.

    Washington, D.C.

    New ethics guidance for trials

    The federal Office for Human Research Protections (OHRP) is standing by its position that a study that gave some 1300 premature infants various levels of oxygen did not adequately inform parents of risks. The SUPPORT study, which ran from 2005 to 2009 at 23 institutions, has been criticized as unethical by the advocacy group Public Citizen and some bioethicists but defended by others and its funder, the National Institutes of Health. OHRP's comments were part of proposed guidance released last week that lays out a new ethics blueprint for clinical trials that include existing standard-of-care treatments.

    Barcelona, Spain

    Looming diabetes and TB link

For reasons that remain murky, diabetes triples the risk of developing tuberculosis (TB), and a new report warns of this “looming co-epidemic.” Nearly 400 million people had diabetes in 2013, a number projected to jump by 50% in the next 2 decades, notes the analysis released at the 45th Union World Conference on Lung Health by two disease advocacy groups. Some 9 million people developed TB last year, and one-third of the world's population lives with a latent form of the mycobacterium that causes the disease. Diabetes impairs immune responses against that microbe, drugs for the two diseases can interfere with each other, and each disease can worsen the other. The report calls for TB testing in all people with diabetes and for more research to clarify the TB-diabetes link.

    Bethesda, Maryland

    NIH awards 12 diversity grants

    The National Institutes of Health (NIH) last week committed $240 million over 5 years to 12 projects that it hopes will eventually lead to more minority grant applicants. Previous NIH programs, with budgets one-tenth the size, have served a handful of minority undergraduates at individual institutions. In contrast, each of the new awards involves many institutions, targets hundreds rather than tens of students, and plans to extend beyond the campus into the public schools. NIH Director Francis Collins says previous efforts have left NIH “far short of where we need to be,” although some program directors say that tight funding has limited the impact of successful approaches to training more minorities.


    PubPeer comments draw lawsuit

    A scientist who claims that anonymous comments on the peer-review website PubPeer cost him a lucrative job has filed a libel lawsuit against the anonymous posters. Fazlul Sarkar, a cancer researcher at Wayne State University in Detroit, Michigan, on 9 October also subpoenaed PubPeer in a bid to force it to disclose the posters' identities. The lawsuit, filed in a local court in Wayne County, Michigan, claims that the negative comments about Sarkar's work prompted the University of Mississippi Cancer Institute to withdraw its offer of a tenured position paying $350,000 annually. PubPeer moderators have said that they will oppose efforts to reveal the identities of their users, and the American Civil Liberties Union has offered to defend the website.

  2. The Aging Brain

    Starting young

Emily Underwood

    Decades-old IQ test records from Scottish children have opened a unique window on how the brain ages.

    In April, psychologist Ian Deary (front, center) gathered surviving members of the Lothian Birth Cohorts to hear about how their brains are aging.


On 4 June 1947, just before being released for the summer holiday, 11-year-old Sheila McGowan sat down at her desk with a pencil and paper to take an intelligence test at a state-run school in Glasgow, Scotland. It began easily enough, with simple analogies: “A man is to skin as what—coat, animal, bird, skin or cloth—is to fur?” for example. The test quickly progressed to more difficult challenges: spatial puzzles, arithmetic, and decoding ciphers. There were 71 questions in all, with only 45 minutes to finish.

    More than 70,000 other 11-year-olds took the same test that day, as part of one of the first efforts to measure the intelligence of an age cohort across an entire nation. Called the Scottish Mental Surveys, the tests, including an earlier survey administered in 1932, were originally aimed at determining how many children were too “mentally defective” to benefit from schooling and to address fears that the average Scottish intelligence was dropping as professional families had fewer children.

McGowan remembers the test “very well,” because her mother was gravely ill when she took it. After her mother died the following April, McGowan lived alone with her father, who had to work night shifts at the local shipyard to make ends meet. She had scored high on the intelligence test, but like many poor Scottish teenagers at the time, McGowan did not finish school. She went to work at 16, as a Glasgow shop assistant.

    Decades passed. McGowan married and had two daughters, became a teacher at a college for the deaf, and earned a bachelor's degree in psychology. Then, in 2003, a leaflet arrived at her door asking if she had taken the 1947 mental test. The note was from cognitive psychologist Ian Deary and his colleagues at the University of Edinburgh in the United Kingdom. He wanted McGowan to take the same test again and participate in an ongoing study to determine whether she had maintained her girlhood sharpness or was showing signs of cognitive decline.

    “It took a little bit of courage” to agree to join the study, she says. Indeed, several of McGowan's old schoolmates opted not to participate because they didn't relish the idea of scientists tracking their mental downswing, she says.

But Deary and colleagues did persuade more than 1000 of McGowan's contemporaries in the Lothian region of the Scottish lowlands, as well as more than 500 participants from the 1932 survey, to sign up for what is now a decade's worth of follow-up studies. This April, Deary brought as many surviving participants as possible—the 1947 and 1932 cohorts are now 78 and 93 years old, respectively—to the Church of Scotland's Assembly Hall on the Mound in Edinburgh for a sneak preview of the most recent results. Roughly 400 elderly Scots attended, with one well-behaved golden retriever in tow.

    Standing before a podium under the Assembly Hall's vaulted ceilings, Deary regaled the group with the fruits of their participation in the study, a unique look at how childhood cognitive abilities fare across a lifetime. It has yielded more than 250 scientific publications, based on more than 20,000 hours of cognitive tests and brain scans, done at roughly 2- to 3-year intervals. And, most significantly, it offers the beginnings of an answer to a long-debated question: Why do some healthy people maintain their cognitive sharpness as they age, whereas others lose their edge?

    After studying the Lothian cohorts' test scores on dozens of cognitive tests, sampling their genes, scanning their brains, and documenting their lifestyles and health in painstaking detail for more than 10 years, Deary has found one factor that appears to predict late-life cognitive ability better than any other single measure. It's not exercise, education, or any other virtuous activity, but rather simply an individual's level of intelligence at age 11. As Deary likes to say about old age, quoting Fred Astaire, “To make a success of it, you've got to start young.”

LIKE MANY OTHER COUNTRIES, Scotland has a rapidly aging population, with the number of people over the age of 65 projected to increase by roughly 60% over the next 20 years. Dozens of aging studies worldwide are tracking seniors such as McGowan, looking for clues about how to stave off cognitive decline and dementia. But the Lothian Birth Cohort studies remain unique thanks to an unexpected windfall.


    In 1997, Deary and colleagues discovered records from the Scottish Mental Surveys stashed away in a University of Edinburgh basement. Boxes upon boxes of documents—containing information painstakingly analyzed in the predigital age by tabulation machines that relied on punch cards and needles—had piled up in government and university archives, collecting dust. As they sifted through the data, Deary and psychiatrist Lawrence Whalley realized they'd stumbled on a gold mine. “This will change our lives,” Whalley recalls Deary telling him at the time.

    Although a handful of longitudinal aging studies can look back to IQ tests or other records of cognitive ability from age 19 to 22 or so, “it's very rare that we have any information about the cognitive abilities of these people at earlier ages,” notes Timothy Salthouse, a psychologist at the University of Virginia in Charlottesville.

    Deary was well-suited to exploiting the trove of documents, having spent decades studying the cognitive and biological basis of differences in people's intelligence. He also had a personal connection to the study: His uncle, Richard Deary, had participated in the 1932 Scottish Mental Survey, but died in World War II at age 21, when his submarine struck a mine in the Mediterranean Sea.

After unearthing the national survey data, Deary and colleagues received U.K. charitable and government funding to launch their new study of the participants. Financial realities made it impractical to recontact all the survivors of the roughly 160,000 children who took the Scottish Mental Surveys in 1932 and 1947, so for a representative sample the researchers reached out to the nearly 5000 survivors who lived in the Lothian region near Edinburgh.

    As participants rolled in, ultimately numbering 1641 from the two cohorts, they were retested on the original assessment used in the Scottish Mental Survey—a measure of IQ that has proven to be reliable and well-validated in both childhood and older age, Deary says. The researchers also administered a range of other cognitive and physical tests and took DNA samples, hoping to detect genetic variations that would help explain differences in how the participants' mental abilities were changing with time. In 2005, Deary's research group “upped their game,” he says, after persuading the Age UK research charity to fund “The Disconnected Mind,” an effort to perform MRI scanning studies on 1000 of the Lothian participants. (Unlike many aging studies, the Edinburgh team has no guaranteed funding, so Deary says he must find new sources for each wave of data collection. “It takes up a lot of my time.”)

So far, the work has supported a clear conclusion: A large part of participants' differences in cognitive ability in their senior years, as measured in relation to their peers, depends on where they stood at 11. The participants' scores at age 11 can predict about 50% of the variance in their IQs at age 77, Deary and his colleagues estimate.
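The 50% figure has a simple statistical reading (a standard identity, assuming the prediction rests on a linear correlation; the study's exact correlation coefficient is not quoted here): if age-11 scores explain half the variance in age-77 scores, the correlation between the two is the square root of that proportion,

\[
r = \sqrt{r^2} = \sqrt{0.5} \approx 0.71,
\]

a strikingly strong association for measurements taken 66 years apart.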

    A few studies elsewhere have also demonstrated the importance of early cognitive ability to maintaining one's faculties with age, says Paul Thompson, a neuroscientist at the University of Southern California (USC) in Los Angeles. In the Nun Study of Aging and Alzheimer's Disease at the University of Minnesota, for example, researchers examined autobiographical essays written by Roman Catholic sisters at the time of induction at about age 22. They found that the linguistic complexity in the writing was a strong indicator of how the nuns would fare in later life. Compared with nuns whose essays “looked like Cicero, with wonderful, florid prose,” nuns whose writing was laconic and brief were substantially more likely to have poor cognitive function and Alzheimer's disease 58 years later, Thompson says.


By having an intelligence measure from even earlier in life, the Lothian studies are helping distinguish glitter from gold in the vast literature on factors correlated with cognition. A good recent example is Deary's analysis of the potential benefits of drinking, Thompson says. A smattering of correlational studies suggests that drinking small amounts of wine has positive effects on cognition late in life—indeed, Deary found a similar result when he first looked for a relationship between alcohol consumption and cognitive performance in the Lothian cohort. When he accounted for the participants' IQ scores on the Scottish Mental Survey, however, the apparent benefit dissolved. Rather than gaining cognitive benefit from drinking wine when they were older, “people who drank more were already likely to be smart,” Deary says.

    After taking an intelligence exam at age 11, Margaret Lawson is now among the 1641 Scots being studied for clues to how the brain ages.


    The Lothian cohort has similarly challenged other reported influences on cognition, such as diet, body mass index, and caffeine consumption. None of those factors seems to have any effect on cognitive skills in the Lothian cohort when childhood intelligence is accounted for, Deary says. Even the effects of social and intellectual activity disappeared when he took into account how bright children were at age 11, possibly because those children are more likely to end up being socially and intellectually engaged.

    Deary's work is “very elegant,” says Pamela Greenwood, a psychologist at George Mason University in Fairfax, Virginia, but she cautions that it does not mean mental function in old age is foreordained, with no hope for interventions that can help boost or preserve one's brainpower. Although it's still early days for research into cognitive training, Greenwood says there is growing evidence that activities that improve specific abilities, such as the ability to control attention, may have practical benefits for reasoning and problem-solving.

Indeed, although childhood IQ may be the largest factor in late-life intelligence, the Lothian study suggests it accounts for just half the variation—which means that other factors must account for the remaining 50%. Regardless of how smart they were as children, people in the Lothian cohort who did not smoke, were physically fit, were bilingual, or had more education enjoyed slightly better cognitive test scores in old age than their early life scores would have predicted, Deary says. And he is convinced that other factors, both genetic and environmental, must also play a role in explaining how some people whose intelligence ranked quite low as children made impressive strides as adults, while others “start out at a high level, and end up quite low.”

    THIS SPRING, WHEN DEARY SPOKE to the study participants under the Assembly Hall's vaulted ceilings, he began by teasing them. Gazing out over a sea of white hair and wool cardigans, he said, “I've decided to mix it up today, and pit the 1921s against the 1936s.” The two groups looked around the auditorium, trying to determine who belonged in which group. Few of the 550 people who took the Scottish Mental Survey in 1932 and enrolled in the study more than 60 years later were present—they are now 93 years old—but the attendance of 78-year-olds from the larger 1947 test cohort was more robust. “You can't tell who is who, can you?” Deary joked.

He shared the good news first. Over the past decade of testing, both cohorts had held up well on tests of memory and knowledge, such as remembering a paragraph-long story and pronouncing words, though likely in part because of their growing familiarity with the test, Deary says. In skills that required abstract problem-solving, fast thinking, and speedy reaction times, however, all groups showed some decline with age compared with results from previous years. A task that required a person to quickly discriminate between two lines of different lengths as they flashed on a screen was particularly challenging for all the participants and elicited a widespread groan when Deary named it. That all participants seem to be struggling at this task suggests age may take a particular toll on the ability to quickly and efficiently sample sensory information, says Stuart Ritchie, one of Deary's postdoctoral students. “A fascinating thing is that decline in this simple sensory speed measure tracks decline in complex thinking skills as the cohort ages,” Deary says.

    Maps of white matter connections in the Lothian cohorts, such as this one, of a 73-year-old participant, suggest that better neuronal wiring in old age is linked to higher cognitive function.


    Although the averages of the Lothian cohort reveal intriguing trends, Deary's true passion is for the study of individual divergence. He flipped to a slide that displayed how the cohorts' test scores on all the tests had fanned out as people hit their 70s and 80s, with many straying far from the average. “What I'm trying to do with my colleagues is study why the mean does not tell you the full story,” he told the elderly Scots.

Brain scans of these volunteers show that aging takes a vastly different toll on each person, notes Joanna Wardlaw, a neuroradiologist at the University of Edinburgh who collaborates with Deary. Using a technique called diffusion tensor imaging, which tracks how water molecules move throughout the brain's white matter tracts, Wardlaw and colleagues have found that roughly 10% of the differences in general cognitive function in the Lothian participants depend on the integrity of neuronal connections.

    Certain blotchy patches of abnormal white matter, called hyperintensities, are known to signal damage to blood vessels, surrounding cells, and the connections between neurons. The hyperintensities generally increase with age but can vary drastically from person to person, Wardlaw says. An important goal of the Lothian study is to determine how hyperintensities interfere with cognition, and why some people seem to be more susceptible to them, whereas for others they seem to represent harmless “wrinkles” on the brain—the neural equivalent of crow's-feet or frown lines, she adds. What causes hyperintensities is still poorly understood, but research from the Lothian cohort and other groups suggests that they are “pretty tightly linked” to high levels of cortisol, a hormone released in response to stress, USC's Thompson notes.

    Why the cerebral cortex tends to shrink in normal aging is another mystery that scans from the Lothian cohort may help unravel, says Sherif Karama, a psychiatrist at McGill University in Montreal, Canada. People with Alzheimer's disease have an undeniable thinning of the cerebral cortex, but this region, which is involved in nearly all the brain's higher level cognitive processes, such as thinking and decision-making, shrinks in “normal” aging, too, he explains. Historically, scientists have assumed that when healthy older people complained of cognitive deficits, their cortex shrinkage was to blame, or at least reflected a neurodegenerative process, he says. But earlier this year, Karama used Lothian data to show that the people with a relatively thinner cortex in old age also had a lower IQ both as adults and in childhood. That suggests, he says, that those who appear to be losing brain mass along with their cognitive abilities may simply have started out with less gray matter. Deary and colleagues have recently begun collecting the donated brains of Lothian participants after they die, in order to further explore structural and anatomical differences that might explain why some people age better than others.

    Toward the end of his presentation this April, Deary unfolded a slip of paper and read a question about aging submitted by the audience: “So, are you just lucky or unlucky with the brain you've got?” it asked.

    At least for now, “the short answer to that question appears to be ‘yes,’” says Nicholas Martin, a geneticist at QIMR Berghofer Medical Research Institute in Herston, Australia, who is not involved in the studies. He says the growing body of data from the Lothian Birth Cohort studies and other aging research supports a theory that some describe irreverently, and a little brutally, as the “water tank hypothesis”: The better put-together your brain is early on, thanks to good genes and, to some extent, a favorable early life environment, the more cognitive reserves you have to lose to neurodegeneration. In other words, Martin says, “the more you start out with in the tank, the longer it takes to draw down.”

    Pinpointing the genes that determine how full the tank is and how fast it empties will take studies much larger than Deary's. Neuroscientists and geneticists “have learned the hard way,” Thompson notes, that hundreds of thousands of DNA samples are required to make even minor inroads toward identifying how different genetic variants affect the brain and how such variants interact with environment to affect behavior.

    In 2009, the Lothian studies took a first step, joining a consortium of 70 separate institutions in 33 countries called ENIGMA, which seeks to ramp up the search for genetic variants that affect the brain. With access to brain scans and DNA from hundreds of thousands of people from regions as diverse as Brazil, the United States, Cambodia, and Siberia, researchers are already finding clusters of genes that play a role in notoriously complex disorders such as schizophrenia, says Thompson, who this month received an $11 million grant from the U.S. National Institutes of Health to establish a Center of Excellence for the ENIGMA project at USC. The same approach may also help untangle the genetic factors that affect cognitive aging—for example, why some people's brains age faster than others when exposed to high levels of stress hormones, he adds.

    Sheila McGowan, for her part, has taken Deary's study as a call to action to make the best of her aging brain. A longtime art lover, she has taken up painting again and hopes to exhibit her work one day. At age 75, after joining the Lothian cohort, she began taking online university courses focused on philosophy and art history. Confronting the reality of her own aging brain “catapulted me out of my lethargy—I'm trying to buck the change,” she says.