News this Week

Science  13 Mar 2009:
Vol. 323, Issue 5920, pp. 1412

    A First Step in Relaxing Restrictions on Stem Cell Research

    1. Constance Holden
    Crowd pleaser.

    Obama won a standing ovation for relaxing stem cell restrictions and promising to restore the integrity of science in decision-making.


    Scientists are breathing a huge sigh of relief now that President Barack Obama has put his signature on an executive order lifting the restrictions on stem cell research laid down by President George W. Bush on 9 August 2001.

    In the same chandeliered White House room where 2 years ago Bush announced his veto of a bill passed by Congress to override his policy, Obama announced “we will lift the ban on federal funding for promising embryonic stem cell research [and] will vigorously support scientists who pursue this research.”

    The signing ended with a standing ovation from the crowd, which included politicians, lobbyists, ethicists, stem cell researchers, and a goodly contingent of Nobelists—including Harold Varmus, co-chair of the President's Council of Advisors on Science and Technology. Even Kyoto University researcher Shinya Yamanaka—famous for developing induced pluripotent stem (iPS) cells—flew in for the event. Also in evidence was Robert Klein, maestro of the California Institute for Regenerative Medicine.

    Obama also used the occasion to tackle the alleged politicization of the science behind subjects ranging from stem cells to global warming during Bush's reign. The president announced that he has sent a memorandum to the Office of Science and Technology Policy directing it to develop “a strategy for restoring scientific integrity to government decision-making” within the next 120 days. Kurt Gottfried, chair of the Union of Concerned Scientists, says the memo signals “a sea change from the last Administration.”

    The signing of the new executive order on human embryonic stem (hES) cells is “a great outcome,” says University of Pennsylvania stem cell researcher John Gearhart. It “lifts a cloud in many areas. … It will allow more people to get involved [in hES cell research]—and it also sends a message internationally that [National Institutes of Health (NIH)-funded researchers] can collaborate with people.” The development of iPS cells, ES-like pluripotent cells that can be grown without the destruction of embryos, has removed some of the intense pressure for scientists to have access to ES cells, and Gearhart himself is using iPS cells. Nonetheless, he says he still needs ES cells to study the mechanisms of embryogenesis and as a standard against which to compare iPS cells.

    The executive order sweeps away a cumbersome level of bureaucracy that required researchers who receive both federal and private funds to keep separate accounting systems and use separate equipment depending on which cells they are working with. Harvard University researcher Kevin Eggan said earlier this year that the expected policy change “will have a huge immediate impact on my daily life.” Roughly half his graduate students have NIH training grants, which has meant they could not participate in any non-NIH-approved work. The change, he said, “will mean I don't have to spend 7 or 8 hours a week dealing with Harvard administration making sure that the costing allocations for my lab are appropriately followed.”

    Some other scientists see the executive order's value as primarily symbolic. Because so many scientists are focusing on iPS cells, “I think it is going to have minimal effect in the short term” on research, says Martin Grumet, director of the Rutgers Stem Cell Research Center in Piscataway, New Jersey.

    NIH has 120 days to finalize guidelines on research with the hundreds of hES cell lines that will soon be available to researchers. Such guidelines will likely draw heavily on existing ones on informed consent and other procedures that have been put out by the National Academy of Sciences and the International Society for Stem Cell Research. It's a “very exciting time at NIH right now,” says Story Landis, head of the NIH Stem Cell Task Force.

    NIH Acting Principal Deputy Director Lawrence Tabak said at a press conference that “our expectation is [that] some stimulus money will be available for use with the new guidelines.” That's good news for researchers hoping for a piece of the package, which, among other things, allocates $200 million at NIH to a 2-year grants program that covers stem cells, regenerative medicine, and a dozen other fields.

    Congress is already poised to pass legislation to codify the new order. The White House has been working with members of both the House and the Senate to ensure swift passage of the measure that was twice vetoed by Bush. Those bills (S. 487 and H.R. 873) specify that federally funded researchers can work only with ES cell lines derived from embryos created for fertility treatment that would otherwise be discarded.

    Political battles are not yet over, however. Representative Chris Smith (R-NJ), co-chair of the House Pro-Life Caucus, held a press conference accusing the Administration of “incentivizing the creation and destruction of human embryos.” Francis Collins, former head of NIH's National Human Genome Research Institute, told Science he is trying to help members of the religious community come to terms with the policy.

    Federally funded scientists will still not be allowed to derive new lines of hES cells because of the 13-year-old Dickey-Wicker Amendment. Added annually to the health appropriations bill, the amendment prohibits federally funded researchers from harming human embryos. Representative Diana DeGette (D-CO), a lead sponsor of the House bill, has indicated, however, that the time may be ripe to start asking legislators to reconsider their support for the measure.


    England Spreads Its Funds Widely, Sparking Debate

    1. Daniel Clery

    Competitions always produce winners and losers, along with appeals to the referee about perceived unfairness, but last week's announcement of how £1.57 billion in annual research funding will be distributed to English universities drew complaints from some unlikely sources: top research institutions. Science heavyweights, including Imperial College London, Cambridge University, and Southampton University, received cuts or below-inflation increases in their annual funding, and they're not happy about it, warning of possible layoffs. Nottingham University; Queen Mary, University of London; and others that won substantial increases are not complaining. Indeed, a large number of England's “new universities,” those created in the 1990s, are ecstatic, having earned money from this research pot for the first time. “This is a great encouragement,” says Les Ebdon, vice-chancellor of the University of Bedfordshire.

    Almost all of the U.K.'s universities are primarily state-funded, and to those conducting international-level research, the government allocates annual block grants to cover departmental overhead costs. But not every institution gets an equal share: The higher the quality of a university's research, the more it gets. Who gets what is decided by the Research Assessment Exercise (RAE), a huge peer-review competition held at irregular intervals and involving more than 1000 researchers serving on 15 subject panels and 67 subpanels. Two-thirds of the annual money is allocated according to the RAE results.

    English officials used the last RAE, carried out in 2001, to concentrate funding in departments doing top-level international research, leaving many with no funding despite doing good work. (Scottish, Welsh, and Northern Irish education officials also use RAE data but independently devise their allocation strategies.) The RAE in 2008 saw a slight change of methodology: Instead of giving a single quality score to whole departments, it noted what percentage of each department was doing work in each of four grades, from nationally recognized to world leading (Science, 2 January, p. 24).

    Shifting fortunes.

    New research quality data have led to shifts in research funding among English universities, with some earning such money for the first time.


    When the results were announced last December, the RAE revealed many “pockets of excellence” in institutions not normally lauded for their research, particularly those created after 1992 when a new law gave polytechnic colleges the right to become universities. In the allocations for England, revealed on 5 March, 25 institutions that previously received no RAE-allocated funding are now expecting a check in the mail, and others saw huge increases. The University of Lincoln, for example, sees its share of the research funding jump from £266,000 to £1.9 million. The new funding “will make it easier to hold on to world-class research teams,” says Ebdon.

    Changes in student demographics could have made the allocations even worse for England's traditional research universities. The number of university students in the United Kingdom has ballooned over the past decade, but most flocked to humanities and social science courses while the number enrolling in science has remained fairly steady. As a result, universities hired many more humanities and social science academics, who then won good ratings in the RAE. To prevent these newcomers from draining money from science departments, the government decided that traditional science subjects would receive the same proportion of the whole pot as in previous years. “The ringfencing was absolutely crucial,” says Hilary Leevers of the Campaign for Science and Engineering.

    Nonetheless, there were some dramatic changes of fortune. Nottingham University will receive an extra £9.7 million in 2009–10, a 23.6% increase, while Imperial will get £5 million less, a 5.1% cut. The 16 English members of the Russell Group, which represents the United Kingdom's top research-intensive universities, together receive 3.3% more than last year. The whole fund being distributed was increased by 7.7%, however, indicating that England is spreading its research wealth more widely. That's a mistake, the traditional powerhouses warn. “Britain can't sustain 101 internationally competitive universities. If it's going to compete it's got to concentrate its resources,” says epidemiologist Roy Anderson, rector of Imperial College.


    Humane Society Launches Offensive to Ban Invasive Chimp Research

    1. Jon Cohen

    The Humane Society of the United States (HSUS) last week stepped up its long-running campaign to end biomedical research with chimpanzees, orchestrating the broadcast of an “exclusive” story by ABC News that sharply criticized a leading primate research center, filing a 108-page complaint against the facility, accusing the U.S. National Institutes of Health (NIH) of violating its own moratorium on breeding chimpanzees, and working with Congress to draft and introduce new legislation that would ban “invasive” research on great apes. HSUS's efforts quickly grabbed the attention of researchers as well as government agencies that support or oversee the well-being of the 1200 “research” chimpanzees in the country.

    On 4 March, HSUS revealed that for much of 2008, an unidentified employee at the New Iberia Research Center, part of the University of Louisiana (UL), Lafayette, worked undercover for HSUS and secretly videotaped the care and handling of some of the facility's 6000 monkeys and 350 chimpanzees. The video shows a chimpanzee falling from a perch and smacking the floor after being darted by a tranquilizer gun, an anesthetized monkey rolling off a table, a baby monkey writhing while receiving a feeding tube, and other strong images of caged primates. “A major issue for us is the psychological deprivation and torment that these animals are enduring,” said HSUS President Wayne Pacelle at a press conference.

    Alleged violations.

    A Humane Society complaint to USDA charges that New Iberia does not handle its chimpanzees and monkeys with proper care.


    HSUS filed a complaint that day with the U.S. Department of Agriculture (USDA), alleging 338 “possible violations” of the Animal Welfare Act at New Iberia, one of four facilities in the United States that conduct biomedical research with chimpanzees. USDA Secretary Tom Vilsack immediately ordered a “thorough investigation.” In January, USDA had received a separate complaint of alleged violations at New Iberia from Stop Animal Exploitation NOW. A USDA investigation revealed no violations, nor did a routine USDA inspection in September 2008.

    HSUS also charged that the National Institute of Allergy and Infectious Diseases (NIAID) has violated NIH's own moratorium against breeding chimpanzees. The agency has had a $6.2 million contract with New Iberia to supply four to 12 infant chimpanzees each year between September 2002 and 2009 for research on several viral diseases.

    On 5 March, Representative Edolphus Towns (D-NY) introduced the Great Ape Protection Act of 2009, which prohibits “invasive” research on great apes in the United States and federal support for such research anywhere. “Invasive” is defined as “any research that may cause death, bodily injury, pain, distress, fear, injury, or trauma.” This would include drug testing, as well as anesthetizing or tranquilizing animals. The bill has 22 cosponsors and has been referred to the House Committee on Energy and Commerce. Kathleen Conlee, director of program management at HSUS, said her group helped draft the bill and that Towns “agreed to introduce the legislation on a timetable we asked for.” Towns introduced a similar bill last year; it was referred to three committees but never moved forward.

    UL Lafayette, NIH, and researchers who conduct invasive research on chimpanzees have challenged the HSUS media, regulatory, and legislative blitz on many fronts. UL Lafayette President E. Joseph Savoie said the images on the video “are based on interpretation and impression.” At a press conference, employees at New Iberia contended that many of the most disturbing images were distorted. They alleged, for example, that the person who shot the undercover video of the monkey falling from a table was a technician responsible for the care of anesthetized animals, who could have tended to the animal instead of filming it. Savoie contended that HSUS's allegations against New Iberia were calculated to help pass the new legislation.

    NIH told Science that its contract with New Iberia for the supply of infant chimpanzees does not violate its moratorium on breeding the animals. That moratorium applies only to chimpanzees owned or supported by NIH's National Center for Research Resources, which in 1995 abandoned a large chimpanzee-breeding program for fiscal reasons (Science, 26 January 2007, p. 450). “The moratorium was not intended for privately owned chimpanzees, or to apply to the other NIH Institutes,” an NIH statement read, further noting that NIAID does not directly pay for breeding of the infant chimps, which are returned to New Iberia for long-term care. John McGowan, deputy director of science management at NIAID, says researchers there who study respiratory syncytial virus and different hepatitis viruses need infant chimps because they have not been exposed to these pathogens. He adds that they have also been used in biodefense studies related to smallpox and anthrax but that the animals were not harmed.

    Several researchers who conduct studies on chimpanzees say the legislation is shortsighted. Geneticist John VandeBerg, the chief scientific officer at the Southwest Foundation for Biomedical Research in San Antonio, Texas, says researchers there use chimpanzees primarily for testing drugs and vaccines against hepatitis B and C, diseases that he notes affect nearly 500 million humans.

    Neuroscientist Todd Preuss of the Yerkes National Primate Research Center in Atlanta complains that the bill defines “invasive” too broadly. It would prohibit his and other groups from sedating chimpanzees to perform brain scans or drawing blood for behavioral experiments and endocrinology studies. He calls these interventions “minimally invasive.”

    Several scientific societies also oppose the ban. Alice Ra'anan, director of science policy at the American Physiological Society in Bethesda, Maryland, notes that existing government rules strictly regulate experiments on chimpanzees. “We can't afford to support an across-the-board ban,” says Ra'anan. “There are diseases that can only be studied in chimpanzees.”

    Ajit Varki, a glycobiologist at the University of California, San Diego, who studies chimpanzees and disease but does not do invasive research himself, has long tried to find a middle ground between opponents and proponents of this controversial animal model. He says no research should be done on chimps that we would not do on humans. “On the other hand, I would no more think of banning all research on chimpanzees than of banning all research on humans,” says Varki. “That would be a bad idea for the future of either species.”


    Report Puts NASA's Solar Program Under a Cloud

    1. Andrew Lawler

    A panel of space scientists has given NASA low grades on an ambitious 10-year plan to study the sun and its impact on Earth. A new report* from the National Academies' National Research Council (NRC) says rising project costs and an inadequate budget pose “a serious impediment” to the field of heliophysics. NASA officials disagree with the academies' harsh assessment of the agency's progress on a 10-year research plan adopted 5 years ago, citing a pipeline filled with major missions and increased funding for studying the results.

    The analysis, by a committee of the academies' Space Studies Board, is a midterm report card on the 2003 decadal plan that was requested by Congress. Although NASA receives average to good marks for its progress on many parts of the plan, it earns a D for making little progress on the Geospace Network, a program that would study the impact of solar variability on sensitive electronics used in satellites and ground-based power grids, and an F for failing to integrate its efforts with those of other disciplines and agencies, notably the Department of Energy.

    Richard Fisher, who directs the heliophysics program at agency headquarters, acknowledges that the report may be “useful” but says he's disappointed that it included neither a clear scientific assessment of the current situation nor recommendations on what NASA might do better. “We would have welcomed such an assessment so as to better address … the allocation of scarce resources and flight-project decisions,” he says.

    The saga of two major projects featured in the 2003 decadal study—the Solar Dynamics Observatory (SDO) and the Magnetospheric Multiscale (MMS) mission—illustrates the problems facing NASA and the field. Both launches have been delayed by some 4 years, and the cost of each has more than doubled. SDO, which will study the effect of the solar atmosphere on near-Earth space, is now set to be launched later this year at a cost of nearly $900 million—far more than the original $400 million. The cost of MMS, which will use four separate spacecraft to study the structure of Earth's magnetosphere, has grown from $350 million to $1 billion with a scheduled launch in 2014. “The resources … have not been used effectively,” says Daniel Baker, a space physicist at the University of Colorado, Boulder, who was on the original 2003 panel. “The result is disastrous.”

    Obscured view.

    NASA's Solar Dynamics Observatory has been plagued by rising costs and launch delays.


    The cost increases have deferred deployment of the Geospace Network until later in the next decade. And NASA's decision in 2004 and 2005 to trim smaller programs such as Explorer because of shrinking budgets has deprived solar scientists of expected opportunities to put their instruments into orbit. “There's just not enough money,” says Roderick Heelis, a University of Texas, Dallas, physicist who co-chaired the 11-member NRC panel with Stephen Fuselier of Lockheed Martin's Advanced Technology Center in Palo Alto, California.

    Fisher says the heliophysics program is in better shape than the academies' report suggests. SDO will be followed in 2012 by a series of spacecraft called the Radiation Belt Storm Probes, he notes, and NASA plans to launch the first satellite to orbit near the sun a year or so after MMS goes up. NASA's balloon and small-rocket programs are growing, he adds.

    Heelis says the panel wants future missions designed to make the best use of the money that's available. But the forecast for funding solar exploration is cloudy at best. Under a long-term projection from the Bush Administration, the budget for heliophysics would drop from $831 million in 2007 to $598 million in 2010, before climbing to $747 million in 2013. The Obama Administration has not signaled what it plans to do.

    Colorado's Baker is encouraged by the strong interest in climate change research shown by White House officials and their awareness of the importance of space weather, two areas that could bolster NASA's program. But solar scientists worry that their field may be eclipsed by overruns in the Mars Science Laboratory, the push to build a massive new spacecraft to orbit Jupiter, and the need to restart the flagging Earth science program. “There's a zero-sum game mentality, which makes it hard for our aspirations to be met,” says Baker.


    Paul Keim on His Life With the FBI During the Anthrax Investigation

    1. Yudhijit Bhattacharjee

    Hours after the first wave of the 2001 anthrax letters sickened a man in Florida—the first of five people to die in the attacks—geneticist Paul Keim got a phone call from the Federal Bureau of Investigation (FBI). Keim, of Northern Arizona University in Flagstaff, had 3 years earlier developed a genetic fingerprinting technique to distinguish between different types of Bacillus anthracis. The FBI wanted him to identify the anthrax strain the attacker had used. Keim and his lab continued to play a role in the investigation until last summer, when the FBI implicated U.S. Army researcher Bruce Ivins as the perpetrator of the attacks. In a recent interview with Science, Keim recounted some of the key moments from his involvement in the 7-year-long investigation. Some of his responses were edited for clarity.

    Q: When were you consulted about the case?

    P.K.: Shortly after Florida's Robert Stevens [the first victim] became ill, officials contacted me to find out whether he could have contracted anthrax from drinking water out of a stream in North Carolina, where he had been traveling the week before, or from exotic Oriental food stores. But in the wake of September 11, we were all suspicious that it was a follow-on terrorist attack.

    Q: How did your lab get involved?

    P.K.: The FBI wanted us to analyze the anthrax in Stevens's cerebrospinal fluid. An FBI scientist, Doug Beecher, called me on the afternoon of October 4 to let me know that the culture was already in the air: It was arriving in a business jet from Atlanta, where an identical sample was being analyzed at the Centers for Disease Control.

    My body went cold because I realized we only had a few hours to prepare for it. It was sundown when I drove to the commercial airport at Flagstaff. The airport officials let me drive out on the tarmac. I watched the jet land; an attractive blonde woman got off with a box containing the culture. It was a surreal experience. I felt like Humphrey Bogart in a scene from Casablanca. I put the box in my car and drove right back to the lab. The next morning, we called Atlanta to say that we'd determined the sample to be the Ames strain.

    Q: What came next?

    P.K.: Over the next few months, we analyzed anthrax from all the letters. After the FBI decided to pursue full genome sequencing of anthrax samples from labs all around the country and overseas to trace the source of the spores used in the attacks, we were contracted to prepare DNA from the samples. The DNA was shipped to The Institute for Genomic Research in Rockville, Maryland, for sequencing. Our lab also served as the repository for all of the samples.

    Q: What was it like to be collaborating with an investigation?

    P.K.: The conversion of an academic lab to a forensic lab was painful: We had to follow strict rules concerning the handling of evidence. Every little step in the process needed to have witnesses. But we took it very seriously; we were motivated by the fear of having a Johnnie Cochran cross-examining us in court.

    We were sent some 1500 samples. … The work pressure was very high. … At one point, people in the lab were starting to revolt. After we developed some real-time PCR assays to identify the samples, we were able to cruise through.

    Q: When did you learn that investigators were focusing on Bruce Ivins?

    P.K.: On 14 May 2008, when FBI agents and Justice Department officials revealed his name in the course of questioning me about the timeline for how the technology for fingerprinting anthrax had improved since the mid-'90s.

    Q: How did the interview unfold?

    P.K.: It was in a room at the Courtyard Marriott near the Washington Dulles airport, where I was attending a meeting of the FBI's Scientific Working Group on Microbial [Genetics and] Forensics. There were five FBI agents and officials from the U.S. Attorney's office. … I remember making a nervous joke. I said I've greased up my wrists just so I can slip out of my handcuffs when you throw me in the back of the van. …

    What they were trying to establish was how much Ivins would have known about the developments in fingerprinting [to distinguish between different strains]. They pulled out e-mails that Ivins and I had exchanged in 2001–2002 as part of ongoing discussions amongst anthrax researchers about the attacks. They wanted to know if I could tell, from those e-mails, if Ivins might have been attempting to cover his tracks.

    Q: What did you conclude?

    P.K.: I didn't see any smoking gun. I went back and looked at some other e-mails from him, and in one that he sent on 7 February 2002 to the group, he said, “The only place I know of that makes anthrax powder is the Dugway Proving Ground.”

    Q: Do you think Ivins was guilty?

    P.K.: I don't know.

    Q: Are all the samples still at your lab?

    P.K.: No, the FBI took them in June 2008. The agency flew a propeller plane from the East Coast to Flagstaff to transport them to another destination.

    Q: How can researchers learn more about the scientific work done on the case?

    P.K.: Scientists are committed to publishing all of the research. The goal is to package all of the papers into one journal so that the community can evaluate the quality of the science all in one place.

  CHINA

    Biologists Muscle Up With Major New Protein Facilities

    1. Richard Stone*
    * With reporting by Hao Xin in Shanghai

    BEIJING—When molecular biologist Xu Rui-Ming gave up a plum professorship at New York University to return to China last autumn, he knew the move would entail sacrifices: a pay cut, for starters, and the need to acclimate to Beijing, one of the fastest growing cities in the world. But the opportunity to help build a research empire was impossible to resist.

    This month, Xu and colleagues here at the Institute of Biophysics (IBP) of the Chinese Academy of Sciences (CAS) will begin recruiting researchers for a National Laboratory of Protein Science (NLPS). In the spirit of Janelia Farm, the biomedical research haven in Virginia run by the Howard Hughes Medical Institute, the national lab will give a few dozen top-notch biologists generous contracts, access to top equipment, and protection from the vitality-sapping chase for research funding.

    Last year, Premier Wen Jiabao called for the creation of several dozen national labs; NLPS is among the first in the biological sciences. “We are the guinea pigs,” says Xu. They will get a leg up from an allied effort to lift biology boats countrywide. Science managers here and in Shanghai are divvying up $160 million for a National Core Facility for Protein Sciences that will be open to all Chinese researchers, including those at NLPS, and eventually to foreigners as well. “It's the first time in history that the government has funded a national facility in life sciences,” says cell biologist He Fuchu, director of the Beijing Proteome Research Center.

    Quickest off the blocks should be NLPS. It's the brainchild of former IBP Director Rao Zihe, now rector of Nankai University in Tianjin, who floated the concept in 2003. CAS has spent $45 million over 5 years instrumenting IBP. Next, “we need high-caliber researchers,” says Xu, a specialist on epigenetics who intends to recruit a significant number of the lab's initial 60 principal investigators from overseas.

    To sweeten the appeal, the national lab will give researchers a salary and 5-year grants and let them loose in a setting akin to Cold Spring Harbor Laboratory, where Xu worked for 13 years. “This is an absolutely new concept in China,” he says. Two-thirds of the initial recruits will be stationed at IBP and the rest spread around the country, like Howard Hughes investigators at their home institutions; the plan is to ramp up to 100 principal investigators, with a rising percentage located outside IBP. NLPS's budget is roughly $60 million a year.

    An even grander plan is the National Core Facility for Protein Sciences. After months of discussions, the research center is taking shape. Four organizations are orchestrating the Beijing part of the venture: the Academy of Military Medical Sciences, Tsinghua University, Peking University, and CAS. Last month, the partners agreed on the broad outlines of how they will spend $80 million from the agency that bankrolls major infrastructure projects—the National Development and Reform Commission (NDRC)—and a nearly equivalent sum on top of that from the Beijing municipal government, the Ministry of Education, and the Department of Logistics of the People's Liberation Army.

    Beijing bio-glitterati.

    Xu Rui-Ming (top) will lead the National Laboratory of Protein Science, while He Fuchu is the top manager of PHOENIX, a planned proteomics facility.


    The Beijing facility, dubbed PHOENIX, will focus on proteomics, including high-throughput pipelines for protein expression profiling and protein-protein interactions; structure determination; proteome-wide functional analysis; large-scale protein and antibody production; and bioinformatics. User committees have begun drawing up wish lists of equipment. PHOENIX, says He, who will head it, will host more than 50 principal investigators. Outside researchers would be able to come for short periods to conduct experiments or tap into the complex remotely via high-speed data links. The core facility, backers say, will provide a platform for “small science” projects conceived by individual investigators. Groundbreaking is expected to take place around the end of the year.

    CAS's Shanghai Institutes of Biological Sciences (SIBS) will receive the other half of NDRC's largess to build a National Facility for Protein Science in Shanghai. Pending NDRC approval of an itemized budget, the facility will be located at a science and technology park that CAS and the Shanghai municipal government plan to build in Pudong district, just down the road from a third-generation synchrotron source set to come online this spring. The synchrotron will be a major resource of the Shanghai protein facility, which plans to construct five beamlines for solving protein structures, studying protein dynamics, and imaging molecules.

    As with PHOENIX, the intent is not “big science,” says SIBS vice president Wu Jiarui, chief scientist of the Shanghai component of the national core facility. “It may be considered a protein hospital,” says Wu, in that biologists, like doctors examining patients, will use a variety of approaches—including electron microscopy and mass spectrometry—to probe protein structure, function, and interactions.

    In China, biology has long been a poor cousin of the physical sciences, which until recently lured many of the country's finest minds and produced strategic advances such as a nuclear arsenal and human space flight. Now it is biology's turn to shine, says He. “We hope to make a great leap in understanding the functions of protein systems,” he says.


    In Dune Map, Titan's Winds Seem to Blow Backward

    1. Chelsea Wald*
    1. Chelsea Wald is a writer in New York City.

    Christopher Columbus rode the trade winds from Europe to America. But on Saturn's moon Titan, a wind-driven westward voyage might not be possible. A new map of Titan's dunes reveals that near-surface winds at the equator blow the wrong way: from west to east. That's the opposite of the predictions made by models of Titan's atmosphere. Researchers say there's no clear path to reconciling those differences.

    For the map, published in Geophysical Research Letters last month, planetary scientists Ralph Lorenz of Johns Hopkins University in Laurel, Maryland, and Jani Radebaugh of Brigham Young University in Provo, Utah, used images from the Synthetic Aperture Radar instrument on the Cassini spacecraft to map 16,000 individual dune segments, equivalent to 8% of Titan's surface. The dunes, all within 30° latitude of the equator, roughly align along an east-west axis. From the way the dunes divert around obstacles, the scientists inferred that the winds that formed them must have come from the west, Radebaugh says.

    Titan global circulation modelers have found this conclusion difficult to swallow. They expected Titan's atmosphere to follow a pattern like that on other terrestrial bodies that rotate in the same direction, including Earth. On these bodies, the surface transfers its angular momentum to the near-surface atmosphere, pulling the atmosphere around after it. The equator is the part of the surface with the most angular momentum, because it's the farthest away from the rotation axis. Therefore, the atmosphere near the equator's surface should rotate no faster—and probably slower—than the surface itself, producing gentle westward winds such as Earth's trade winds. The near-surface winds would move eastward only if they could draw extra angular momentum from somewhere else—an unlikely scenario, says Jonathan Mitchell, a planetary scientist at the Institute for Advanced Study in Princeton, New Jersey.
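    The angular-momentum argument can be checked with one line of arithmetic: the specific angular momentum of a point on the surface of a body rotating at rate Ω is Ω(R cos φ)², which is largest at the equator (φ = 0). A minimal sketch in Python, using rounded textbook values for Titan's rotation period and radius (assumptions for illustration, not figures from the study):

```python
import math

# Specific angular momentum of a point riding on a rotating body's surface.
# Titan values below are rounded approximations (~15.95-day rotation,
# ~2575 km radius), not numbers from the dune study.
OMEGA = 2 * math.pi / (15.95 * 86400)   # rotation rate, rad/s
R = 2.575e6                              # radius, m

def surface_ang_momentum(lat_deg):
    """Specific angular momentum (m^2/s) of the surface at a given latitude."""
    return OMEGA * (R * math.cos(math.radians(lat_deg))) ** 2
```

    Because cos φ falls off toward the poles, air spun up by the surface should, if anything, lag the surface at the equator rather than outrun it, which is why eastward equatorial winds are so puzzling.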

    Although unlikely, that's exactly what seems to be happening on Titan. “It almost seems to violate basic physical principles like conservation of angular momentum,” Mitchell says.

    Initially, the modelers questioned the results, says Claire Newman, a planetary scientist at the California Institute of Technology in Pasadena. “It's almost like you go, well, did they flip the images around the right way when they got them down from Cassini?” But now they concede that something must be missing from their models, which are adaptations of various general-circulation models of Earth or Mars. So far, the researchers have speculations but no answers.

    Dune world.

    Scientists used Cassini radar images (bottom) to map the direction of Titan's equatorial dunes (top). From that, they inferred the direction of the winds that formed the dunes. The result flew in the face of modeling predictions.


    For example, how the dunes form may be important, Newman says. The researchers have assumed that the dune formation requires average eastward winds. But average winds could be westward, as the models predict, if the rare eastward winds are somehow especially good at making the dunes—because they come at special times of the day or year, or at special speeds, such as storm speeds. Unfortunately, Newman says, nothing in her model stands out as special so far.

    Alternatively, the dunes may not be what they seem, says Scot Rafkin, a planetary meteorologist at the Southwest Research Institute in Boulder, Colorado. The researchers used similar dunes on Earth to draw conclusions about how Titan's dunes formed. But there could be “some process operating on Titan that doesn't operate on Earth to produce dunes that look similar,” he says. And although the dunes seem to be young, he says, they may be cemented in place, remnants of an atmospheric past.

    The heights of Titan's topographic features are still not well mapped, so unknown hills and valleys could still be the culprits, says Sébastien Lebonnois, a planetary climatologist at the Laboratoire de Météorologie Dynamique (CNRS, University Paris 6) in Paris. Tetsuya Tokano, a planetary meteorologist at the University of Cologne in Germany, has added speculative topography to his model, to no avail. He now guesses that the dunes themselves could be diverting the winds, but the models don't have a fine enough resolution to study the dunes.

    Mitchell plans to study the possible connection to Titan's atmospheric superrotation—fast-moving eastward winds high in the atmosphere over the equator. But how would that angular momentum move from high up to near the surface? “That's the missing piece,” Mitchell says. “I'm trying to come up with ways” to make that happen in the model.

    Lorenz, who is not an atmospheric modeler, draws a connection to another line of research, which indicates that the crust of Titan is separated from the core by a vast underground ocean. If that's the case, Titan's surface could spin at a variable rate over the course of Titan's year. Could that change the wind direction at the equator or create the dunes? “At this point, everything's on the table,” Mitchell says.

    The mystery may say less about Titan than about the state of Titan research: There's “too little data on one side and too primitive a state of modeling on the other side,” says Anthony Del Genio, a planetary meteorologist at the NASA Goddard Institute for Space Studies in New York City and a member of the Cassini imaging team. With Jupiter's moon Europa recently chosen for NASA's next flagship mission to the outer solar system, it's not clear when a new mission will go to Titan. So progress depends on Titan modelers getting access to “sufficient computational resources,” he says. “Then we'll have a better idea of whether this thing is really a mystery or whether it's something that is easily explainable.”


    Ice Age No Barrier to 'Peking Man'

    1. Ann Gibbons

    Ever since a Canadian anatomist discovered the skullcap of “Peking Man” by candlelight in China's Zhoukoudian Cave in 1929, the cave has been known as the richest Homo erectus site in the world. Just 50 kilometers southwest of Beijing, the cave was the resting place for more than 40 individuals dragged there by predators. But with no volcanic sediments for traditional radiometric dating, researchers have not known precisely when early humans lived near the cave. Now, Chinese and American researchers have redated bones and tools from the site with a new radiometric method.


    Homo erectus may have been the first human species to survive a near-glacial winter.


    They date the oldest human fossils to about 770,000 years ago, at least 200,000 years older than previously thought, in work published this week in Nature. The dates are not the oldest that have been claimed for Chinese H. erectus, but they suggest this species survived during a mild glacial period at Zhoukoudian. They provide the earliest evidence that H. erectus lived this far north (39.93° North latitude) during near-glacial conditions. “We always thought things were happening later in China,” says Harvard University paleoanthropologist G. Philip Rightmire. “These findings bring H. erectus in China more closely in sync with Africa.”

    Ancestral tomb.

    New dates put H. erectus in this ancient cave earlier than expected.


    To get the new dates, geochronologist Darryl Granger of Purdue University in West Lafayette, Indiana, applied a method called cosmogenic burial dating, which is used to date the retreat of glaciers and has also been used in caves (Science, 25 April 2003, p. 562). When cosmic rays bombard rocks on Earth's surface, they produce unstable isotopes of beryllium and aluminum. The older the rock, the more isotopes it accumulates. When the rock is buried, the bombardment stops and the isotopes decay at a known rate; the ratio of the two isotopes thus works as a clock to reveal when the rock was buried (Science, 11 January 2002, p. 256).
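    The isotope clock described above can be sketched in a few lines. The half-lives and the surface production ratio below are commonly cited values for the 26Al/10Be pair, not numbers taken from the Zhoukoudian study itself:

```python
import math

# Cosmogenic burial dating sketch: after burial, 26Al decays faster than
# 10Be, so the 26Al/10Be ratio falls at a known rate and acts as a clock.
# Constants are commonly cited values, not the study's own calibration.
T_HALF_BE10 = 1.387e6   # years, half-life of 10Be
T_HALF_AL26 = 0.705e6   # years, half-life of 26Al
SURFACE_RATIO = 6.75    # 26Al/10Be production ratio at the surface

def burial_age(measured_ratio):
    """Years since burial, inferred from the measured 26Al/10Be ratio."""
    lam_be = math.log(2) / T_HALF_BE10
    lam_al = math.log(2) / T_HALF_AL26
    # After burial the ratio decays as R(t) = R0 * exp(-(lam_al - lam_be) * t)
    return math.log(SURFACE_RATIO / measured_ratio) / (lam_al - lam_be)
```

    With these constants, quartz sealed away for roughly 770,000 years would show a ratio about 30% below the surface value.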

    Granger dated quartz from stone tools and the sedimentary layers where the oldest fossils were found and got a mean date of 770,000 years, with a fairly wide margin of error of 80,000 years. Some previous dates ranged from 200,000 to 500,000 years ago, although the new ages match less reliable uranium series and paleomagnetic dates, says lead author Guanjun Shen, a geochronologist at Nanjing Normal University in China.

    If the new dates are right, then H. erectus was at Zhoukoudian during a relatively mild glacial period about 750,000 years ago that brought icy winds and snow to the region—similar, perhaps, to the cold, dry climate today in southern Siberia. Studies of animal fossils, isotopes, and dust in the cave also suggest that H. erectus was there when the climate fluctuated between cold, dry glacial periods and wet, warm interglacials for 400,000 years. Surviving the cold implies they used fire, which is not surprising for these big-brained hominids but has been very difficult to prove.

    The new climate data and differences in anatomy suggest to some that H. erectus in the north was separate from H. erectus found on Java in Indonesia. As the species left its original home in Africa, there may have been “two migrations, one earlier to the south and one later to the north,” says paleoanthropologist Russell Ciochon of the University of Iowa in Iowa City.

    In any case, the new dates allow researchers to tie Zhoukoudian to other H. erectus sites in China, Africa, and Europe, providing “a more coherent picture,” says paleoanthropologist Susan Anton of New York University in New York City. “This site was always an outlier [in time].”


    From the Science Policy Blog

    The powerful chair of the House Judiciary Committee, John Conyers (D-MI), has explained why he believes the National Institutes of Health (NIH) should not require scientists it funds to make their research papers publicly available via the Internet. In an essay released last week, Conyers defended a bill he introduced that would reverse that open-access policy, which he sees as a threat to copyright law and, ultimately, peer review. Stanford University law professor Lawrence Lessig and open-access guru Michael Eisen of the University of California, Berkeley, had attacked Conyers as “shilling for special interests” in an essay.

    Nobel Prize-winning physicist and Energy Secretary Steven Chu stayed in the headlines this week over ways to make coal power more climate friendly. He announced a new “true engineering collaboration” with science ministries from China, the United Kingdom, and other European allies to demonstrate carbon capture and storage technologies.

    Patent reform came up in the form of a bipartisan bill introduced in both the House and the Senate. But judging from its reception, it's going to be another long fight to pass this always-contentious legislation. The initial response from the biomedical community was negative; a coalition of companies that favor strong patents blasted the bill as favoring “infringers over inventors.” Others, who feel that the current system gives patent holders unfair rights, were more favorable toward it.

    Harold Varmus, co-chair of the President's Council of Advisors on Science and Technology, pitched his new memoir on The Daily Show with host and comedian Jon Stewart. Although the former NIH director played it mostly straight, he did drop a nugget of news here and there, mentioning, for example, that he “was not” advising the president on his recent announcement to boost cancer research at NIH.

    For the full postings and more, go to ScienceInsider.


    How Much Coal Remains?

    1. Richard A. Kerr

    The planet's vast store of coal could fuel the world economy for centuries—and fiercely stoke global warming—but a few analysts are raising the prospect of an imminent shortfall.



    To a geologist, gauging how much coal the world has left to burn is a fairly straightforward, if daunting, business. Millions upon millions of drill holes have revealed where the coal is. So geologists can just evaluate each seam's quality and the cost of extraction. Add up all the coal worth mining and you've got lots and lots—within the United States, a century or even two of current consumption; globally, 150 years' worth.

    But there's another, emerging approach to assessing coal resources that yields more sobering results. Rather than go into the field, these analysts go to the record books to see how fast miners have been producing coal of late. By fitting curves to that production history, they come up with a number for the total amount of coal that will ever be mined and a date for the greatest production, the time of “peak coal,” after which production inevitably declines.

    Early results from this curve-fitting analysis of production history show much less coal being mined than geologists ever expected and a peak in coal production looming as early as a decade from now. Curve fitting “is a worthy competitor to a geological estimate” of remaining coal, says David Rutledge of the California Institute of Technology in Pasadena, a nongeologist who has produced such an estimate himself. Geologists beg to differ. “The whole notion of applying statistics to time series [of coal production] is fraught with danger,” says energy resource geologist Peter McCabe of the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in North Ryde, Australia. “I think what you see in Rutledge's presentation is a fundamental misunderstanding.”

    At stake are two central questions: How bad is greenhouse warming likely to get? And when must alternatives to coal come online? By Rutledge's calculations, burning all the coal, oil, and gas humans can get their hands on won't pump enough carbon dioxide into the atmosphere to drive global warming past 3°C, a far less intimidating ultimate warming than the 8°C or 10°C of some scenarios. But engineers developing energy alternatives would have only a few decades to get them in place.

    Ringing the bell

    The growing popularity of curve fitting to fossil fuel production got its start with geophysicist M. King Hubbert (1903–1989) and his bell-shaped curves. Hubbert—a prominent researcher at Shell Oil and then the U.S. Geological Survey—concluded that the rate at which oil was produced would follow the same bell-shaped curve over time as production from a single field does (see top figure, below).

    Up, down.

    Both U.S. oil production and U.K. coal production rose rapidly, peaked, and then declined.


    Production in Hubbert's scheme gradually accelerates from the bell's left side as drillers tap the most accessible, most easily extracted oil pools at an ever-faster rate. But eventually, no matter how much they drill, it gets harder and harder to suck out the oil because the remaining pools are fewer, smaller, deeper, and more difficult to drain. The earlier acceleration now slows across the crown of the bell until production stops growing, peaking at the bell's top. It then declines until, at the right side, all the oil that will ever be produced has been produced.

    In 1956, Hubbert published his curve-fitting prediction that U.S. oil production would peak in the early 1970s. He used the U.S. production history up to that point to set the shape of the curve and an estimate of the amount of oil that would ultimately be produced to set the area encompassed by a complete curve.

    U.S. production in fact peaked in 1970, inspiring a generation of oil “peakists” to apply Hubbert's approach to world oil (Science, 21 August 1998, p. 1128). As of 1998, they broadly agreed that world oil would reach its peak by now or at least by the middle of the next decade. In fact, production outside oil-rich OPEC (Organization of the Petroleum Exporting Countries) has not risen since 2004, despite years of encouragement by high oil prices. And total world production has gone nowhere since 2005.

    On to coal

    In the last couple of years, forecasting coal production by Hubbert's approach has come into vogue, partly because geologists seemed to be having trouble assessing how much minable coal was left. For example, “40% of the world's coal disappeared in 3 years,” recalls retired U.S. Geological Survey coal expert Harold Gluskoter. For the World Energy Council's triennial survey of coal resources in 1990, China cut its recoverable coal reserves—the amount of remaining coal geologists believe can be extracted with today's technology at today's prices—to one-sixth of what it had reported in 1987. The coal was mostly still there; the Chinese just decided they could extract only a smaller proportion of it.

    Less dramatically, in 2007 a committee of the U.S. National Research Council that Gluskoter served on could not support the long-standing estimate of about 267 billion short tons of recoverable reserves in the United States. Divided by current U.S. production, the old estimate gave the oft-quoted figure of a 250-year supply for the United States. “We probably have 100 years. We don't know how much after that,” says Gluskoter.

    Reserve estimates around the world were also coming down, and dividing estimates of minable coal by current production says nothing about when coal might peak. To get at both forecasts, analysts tried a variation on curve fitting à la Hubbert. No results have been published in the peer-reviewed literature, but Rutledge's effort has probably gotten the widest exposure.

    Rutledge, an electrical engineer, adapted a technique used by geologist Kenneth Deffeyes, professor emeritus at Princeton University, to predict a world oil peak in 2005. (Deffeyes sees his prediction holding up nicely.) In the technique, a production history that has yet to peak (see bottom graph, above) is replotted with annual production divided by cumulative production on one axis and cumulative production on the other; if production approximates an ideal bell-shaped curve, the points fall on a straight line. The straight line intercepts the x-axis at the ultimate production—all the coal that will ever be produced—and the year in which half of the ultimate production will be achieved is the year of peak production.
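    The replotting step can be sketched in a few lines. The production series below is synthetic (an ideal logistic curve with made-up parameters), purely to illustrate the linearization; it is not Rutledge's or Deffeyes's data:

```python
import numpy as np

# Hubbert-style linearization sketch on synthetic data. "U" (ultimate
# production) and "a" (growth rate) are invented for illustration.
years = np.arange(1900, 1961)
U, a = 100.0, 0.08
Q = U / (1 + np.exp(-a * (years - 1950)))   # cumulative production (logistic)
P = np.diff(Q)                               # annual production
Qmid = (Q[:-1] + Q[1:]) / 2                  # cumulative at mid-year

# Replotted this way, an ideal bell-shaped history is a straight line:
# P/Q = a * (1 - Q/U). Its x-intercept recovers the ultimate production.
slope, intercept = np.polyfit(Qmid, P / Qmid, 1)
ultimate = -intercept / slope                # all coal that will ever be mined
peak_year = years[np.argmin(np.abs(Q - ultimate / 2))]  # half-ultimate year
```

    On real, noisy histories the early points scatter widely and the line must be extrapolated far beyond the data, which is one reason critics distrust the method.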

    Rutledge tested the method on regions long past their coal peaks. The coal production of the United Kingdom—once the world's premier energy supplier—peaked in 1913 and is now at about 6% of its peak. A straight line makes “a beautiful fit” to the replotted history, Rutledge told audience members at last December's meeting of the American Geophysical Union. The projected ultimate production in the United Kingdom is about 28 billion metric tons of coal, Rutledge said, which should be reached in 8 years. Geologic estimates made in the 19th century had reserves near 200 billion tons. In fact, the World Energy Council's geologically based reserve estimates—provided by the United Kingdom—stayed at 19th century levels until the 1970s, when they collapsed toward Rutledge's number.

    Applied to 14 major coal-producing regions, Rutledge's method gives a world ultimate production of 660 billion metric tons. That's only one-quarter of geologic estimates of ultimate production, he says. And when combined with similar estimates of ultimate production of oil and gas, the total emissions of carbon as the greenhouse gas carbon dioxide through 2100 are smaller than in any of the 40 emissions scenarios that climate scientists have been working with for the past 10 years.

    As to when coal will peak, Rutledge declines to say, citing the way peak timing varied widely among regions already well past their peak. He will say, however, that in his projection the world will have produced a whopping 90% of its coal by 2069. Physicist Mikael Höök of Uppsala University in Sweden and his colleagues are willing to point to a peak. They have taken a similar approach to Rutledge's but with some reliance on estimated reserves. Still, they see world coal production topping out by 2020, entering a 30-year-long plateau, and then declining.

    Outside influences

    Geologists and resource economists aren't ready to give up on coal so soon. “The bell-shaped curve is nice to look at after the fact,” says Gluskoter, but “I'm not sure how predictive it is.” “You can put me in the skeptical camp” as well, says physicist and energy analyst Klaus Lackner of Columbia University. Rutledge “puts too much weight on a simple model,” he says. “The world is not a two-parameter curve.”

    U.K. coal “did not decline because it reached some magic percentage of coal depletion,” argues CSIRO's McCabe. “Gradually, [U.K.] demand disappeared. Coal was no longer used for power, steamships, railroads, domestic heating, or iron and steel production. When I was a kid in England many years ago, gas for the home was produced from coal. There are much cheaper alternatives in the U.K. for energy.” Natural gas from the North Sea replaced coal gas in the home, he notes. Relatively inexpensive—and cleaner-burning—oil became available.

    Does it matter why coal production behaves as it does? Rutledge responds that whether it's a physical lack of resources or a demand shift to cheaper, more attractive sources, a limit is a limit. Gluskoter and many colleagues are betting on a demand shift. "I believe in technology," he says. But first, technology could actually stretch coal resources. "The resource is there," he notes, "it's perfectly minable. It's just not economical right now." Improved technology will let miners get at thinner, less accessible seams, he says; uneconomic coal might even be turned into gas right in the ground.

    Despite any coal added by technology, “we'll stop using coal before we run out of it,” says Gluskoter. Just as cleaner, cheaper fuels displaced British coal, less-polluting, less-expensive energy sources will replace coal worldwide, he argues. His bet is on solar energy. Plenty of usable coal will be left in the ground, he adds, but “we're not going to be burning coal in a couple hundred years.”


    A Memorable Device

    1. Lucas Laursen*
    1. Lucas Laursen is a freelance writer based in Cambridge, U.K.

    Wearable cameras offer help to people with memory problems and provide a tool for studying how the brain creates and retrieves personal histories.


    It was over drinks at a local pub in the spring of 2006 that cognitive psychologist Martin Conway of the University of Leeds in the United Kingdom first told his colleague Chris Moulin about using a wearable camera for memory research. But it took more than a few pints of beer to convince Moulin that SenseCam, a camera that periodically takes still photos while worn on the user's chest, might be a game-changer in the study of what psychologists call autobiographical memory. Although skeptical of the small device's usefulness, Moulin did finally agree to take one for a test drive.

    Or rather, he took it on a test walk. Moulin regularly wore a SenseCam on a series of walks. When he reviewed the images 6 months later, to see how well his memories matched the camera's visual record, Moulin says he experienced an unexpected feeling of “mental time travel.” One of the images triggered the memory of the song—Thom Yorke's “Black Swan”—that was playing on his iPod when the image was taken.

    Conway says that many SenseCam users likewise report a sudden flood of memories of thoughts and sensations, what he calls “Proustian moments,” when they review images taken by the device. SenseCam's images “correspond to the nature of human memory—they're fragmentary, they're formed outside your conscious control, they're visual in nature, they're from your perspective. All these features are very like what we call episodic memory,” says Conway.

    That's why he, Moulin, and dozens of other researchers have begun to test whether the images can help resolve how the brain handles personal memories. Cognitive experiments, however, represent just one line of inquiry supported by Microsoft Research, the scientific arm of the software giant and the inventor of SenseCam. Medical researchers are also evaluating whether the device can help people with memory problems due to illness or injuries.

    In 2004, Narinder Kapur and Emma Berry, neuropsychologists at Addenbrooke's Hospital in Cambridge, U.K., were the first to use a SenseCam for memory rehabilitation work. They found that the device significantly helped Mrs. B, an elderly woman with memory problems due to brain damage from an infection. Mrs. B normally forgot events after 3 to 5 days, and even keeping a diary that she periodically reviewed helped her remember events for only about 2 weeks. But when she regularly reviewed SenseCam images of events, she could recall more details—and her memories persisted for months after she ceased reviewing the past images. Encouraged by those results, Kapur says he and Berry grew hopeful that “periodic, regular review of visual images of personal events … really does help long-term [memory] consolidation.”

    Shooting in the rain.

    The SenseCam (left) snaps dozens of wide-angle, low-resolution images from chest level on even a short walk.


    They and others are getting a chance to test that hypothesis. After the pair reported the results from Mrs. B, Microsoft Research decided to provide more than $550,000 in funding to seven research groups, most of them focusing on people with memory problems, and to loan hundreds of cameras to other scientists. SenseCam has “very obvious applications in a whole range of clinical disorders,” says one of the grant recipients, psychologist Philip Barnard of the University of Cambridge.

    Personal black boxes

    SenseCam grew out of a Microsoft Research project that aimed to create a “black box for the human body,” which would record data that doctors might find useful if a person were in an accident, says Ken Wood of Microsoft Research Cambridge. In 1999, computer scientist Lyndsay Williams, then at the same lab, suggested adding a camera to the device so it could double as a memory aid for mundane tasks such as finding lost keys.

    In 2002, Kapur heard then-Microsoft CEO Bill Gates mention the project in a talk. Because his hospital is just a few miles from Microsoft Research Cambridge, it was easy enough for him and Berry to suggest using SenseCam prototypes for patients with memory problems due to Alzheimer's or brain injuries.

    Clinicians who work with such people have typically focused on helping them with their prospective memory, i.e., remembering tasks to be completed in the future, such as keeping appointments. For this, the best aids are still simple tools such as checklists and alarm clocks. But for patients with difficulty recalling past events, clinicians have had little to offer beyond diary-keeping, a task many people, such as Mrs. B and her husband, complain is onerous.

    In contrast, SenseCam records images passively, permitting a person to go about their day without interruption. The latest version is about the size and weight of a clunky mobile phone and appears to observe the world through two unmatched eyeballs. One is a passive infrared sensor, tuned to trigger the camera whenever another person passes by. The other is a wide-angle camera lens, set to capture most of the user's field of view. The device is also equipped with an ambient light sensor that triggers the camera when its user moves from one room to another, or goes in or out of doors. The camera can also be set to snap an image if the sensors haven't triggered a photo after an arbitrary number of seconds. A typical wearer might come home with 2000 to 3000 fragmentary, artless images at the end of a day.
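    A rough illustration of that capture policy follows. The trigger conditions and the 30-second fallback are invented for the sketch; Microsoft has not published the device's firmware logic:

```python
# Hypothetical sketch of a SenseCam-style capture policy: fire on a
# passive-IR motion event or an ambient-light jump, otherwise fall back
# to a timer. All thresholds here are made up for illustration.
FALLBACK_SECONDS = 30  # snap anyway if no sensor fires for this long

def should_capture(ir_motion, light_change, seconds_since_last):
    """Decide, on one sensor poll, whether to take a photo."""
    if ir_motion:      # passive infrared: a person passed in front
        return True
    if light_change:   # ambient light jumped: new room, or in/out of doors
        return True
    return seconds_since_last >= FALLBACK_SECONDS  # timer fallback
```

    Polling this policy all day, with the timer as a floor on the capture rate, is what yields the steady drizzle of a few thousand images by evening.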

    It may be just those characteristics of the SenseCam images that make them so useful for memory rehabilitation and research, Kapur says. Like Conway, he suspects that the reason the images stimulate memory retrieval and possibly consolidation is because they mimic “some of the representations that we have” of past events in our brains.

    To move beyond the initial case study of Mrs. B, the Addenbrooke's team, under the direction of neuropsychologist Georgina Brown, has followed five additional people with memory problems over a nearly 3-year period, exploring the difference between the memory boost provided by visual and written diary-keeping. To establish a baseline of how fast these people lose their memories, the team asked each about an event every other day for 2 weeks after the event, and then again after 1 month and after 3 months. Then they asked the patients to keep a diary of a separate event and review it every other day during an initial 2-week assessment, but not during subsequent months. Finally, patients reviewed their SenseCam's images for 2 weeks following a third event.

    The preliminary results suggest that SenseCam use strengthened these patients' memories more than diary-keeping did. A full analysis of the data is in preparation, says Brown, whose team plans to submit it to the journal Memory for a special issue devoted to SenseCam research.

    In a recent, separate study, Mrs. B has repeated a version of her trial, this time incorporating a brain scanner. Researchers compared the activity in her brain as she tried to remember events she had either reviewed in her written diary or with personal images from her SenseCam. Mrs. B recognized about 50% of images taken at an event she had studied using a diary, but 90% if she had studied images instead. And brain regions associated with autobiographical memory were more active when she recalled events she had studied using SenseCam images than when she recalled the diary-studied event, Berry and colleagues report online on 13 March in the Journal of Neurology, Neurosurgery and Psychiatry.

    The Addenbrooke's work represents just a few patients with varying causes of memory loss, but Berry notes that worldwide there are about 30 ongoing SenseCam studies of memory patients. Adam Zeman of the University of Exeter in the United Kingdom leads one. “I think the main interest [in SenseCam] is that it gives you an opportunity to look at memory in what you might call a more ecological fashion than laboratory stimuli generally do,” he says, and “it gives an opportunity to support and rehabilitate memory.”

    Memory walks

    Normally, basic research precedes clinical studies, but the history of SenseCam has been the reverse. “The initial studies had a strong pragmatic aim,” says Kapur, “but certainly once we started to collect data, [psychologists] began to look at these things from a theoretical slant.” The question for cognitive scientists is whether SenseCam, or any similar wearable, point-of-view photographic device, can illuminate how healthy autobiographical memory works. Moulin, for example, has engaged volunteers to undertake memory walks in which they read a list of words while wearing the SenseCam. His student Katalin Pauly-Takacs has tested the participants' recall of the words on the day of their walks and then again 3 months later, with and without the help of SenseCam images. Their preliminary results suggest that volunteers remember more of the words from walks that they reviewed using SenseCam images.

    Moulin's experiment is a nod to decades of autobiographical memory research, in which volunteers were tested on their ability to recall standard images or word lists they had previously seen. Some researchers suggest that the more personal nature of SenseCam images will be key to better studying autobiographical memory storage and retrieval. “Using SenseCam we can first, have more interesting stimuli and second, test [memory] processes that can generalize more easily to real life,” explains Roberto Cabeza, a neuroscientist at Duke University in Durham, North Carolina, who is also working with the device.

    Despite SenseCam's more personal touch, there are no guarantees it will break new ground in memory research. “Whether or not it will tell us different principles or something novel is unclear,” says Larry Squire, a psychologist at the University of California, San Diego, who hasn't yet worked with the device.

    William Brewer of the University of Illinois, Urbana-Champaign, notes that nobody really knows how best to evaluate SenseCam as a memory-consolidation aid or a retrieval cue. He and his graduate student Jason Finley have tested different aspects of memory using SenseCam images as cues, asking individuals how certain they are that they've seen an image before, or inquiring what they did after a certain image was taken. Such baseline studies, says Brewer, should help identify the most appropriate memory tests.

    In addition to the seven Microsoft Research grants handed out in 2007, dozens of groups in cognitive psychology, clinical neuropsychology, education, and computer science are conducting research with borrowed SenseCams and independent funding. But there are no current plans to commercialize the hardware or the software from the SenseCam project, a fact that puzzles some fans of the device. Instead, to keep up with the growing demand for the devices, Microsoft would like to find another manufacturer willing to mass-produce the cameras, says Wood. Microsoft currently provides the cameras to only a limited number of patients under clinical supervision.

    Even though he lobbies colleagues such as Moulin to try the device, Conway remains cautious about overselling SenseCam. There is still at least a decade's work ahead before “we can maximize its use for research and its use as an intervention scheme in helping failing memories,” says the 56-year-old investigator. “By that time, I'll need to wear one permanently, myself.”


    Fathoming Matter's Heart Unbound

    1. Adrian Cho

    Going to extremes, physicists hunt for “unbound” nuclei that don't stick together at all

    New source.

    Japan's Radioactive Isotope Beam Factory should crank out more unbound nuclei.


    Ordinarily, the protons and neutrons in an atomic nucleus bind to one another with ferocious strength. The might of that binding explains why alchemists never found a way to change lead into gold: That would require prying apart the nucleus of one element to change it into another. Now, however, some physicists are eagerly creating odd nuclei that are so loosely built they are hardly nuclei at all. These rare beasts may provide a better understanding of the heart of matter.

    In recent decades, experimenters have used particle accelerators to produce ever-more-unstable and fleeting radioactive nuclei. The new work pushes this exploration to its logical extreme with the creation of “unbound nuclei”—puffs of protons and neutrons so loosely jumbled together that there is literally nothing to keep them intact, not even momentarily. Some unbound nuclei could yield insights into stellar explosions that forge many of the heavy elements we see on Earth today. Others may stretch current theories of nuclear structure until they snap and thus yield new insights.

    “In some sense, it's the most extreme test of our theories of nuclear structure,” says Nigel Orr of the Laboratory for Corpuscular Physics (LPC) in Caen, France, one of three dozen physicists who gathered recently for a workshop on the subject.* Sydney Gales, director of the National Heavy Ion Accelerator (GANIL) in Caen, says: “We are discovering that there is a whole new kind of loosely formed matter. The physics is completely new.”

    So far, experimenters have snared a handful of the oddities. But interest in them is growing, as new accelerators now powering up or in planning should cough out many more.

    Crossing the line

    The study of unbound nuclei crosses a conceptual frontier. Researchers map nuclei on a chart resembling a crossword puzzle, with the number of protons increasing from bottom to top and the number of neutrons increasing from left to right (see diagram). The 255 stable nuclei form a diagonal “valley of stability,” flanked by their unstable radioactive brethren: neutron-poor to the left and neutron-rich to the right. In pursuing unbound nuclei, physicists strive to make nuclei ever richer in neutrons and, ultimately, to push across the “neutron drip line,” beyond which binding is impossible. On the near side of this line, each nucleus can minimize its energy, at least temporarily, by forming a tight clump. On the far side, a nucleus can always reduce its energy by falling apart, so there is no energy barrier to hold the thing together.
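    The drip line can be stated more precisely in terms of the neutron separation energy, the energy required to remove one neutron from a nucleus. This standard definition is not spelled out in the article but underlies the description above:

    ```latex
    S_n(Z, N) = B(Z, N) - B(Z, N - 1)
    ```

    Here \(B(Z, N)\) is the total binding energy of a nucleus with \(Z\) protons and \(N\) neutrons. A nucleus lies beyond the neutron drip line when \(S_n < 0\): shedding a neutron then releases energy, so nothing holds the system together.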

    Interest in unbound nuclei builds on the study of other strange-but-bound nuclei, says Angela Bonaccorso, a theorist with the Italian National Institute of Nuclear Physics in Pisa. In the 1980s, scientists discovered that the bound nucleus lithium-11, which has three protons and eight neutrons, possesses an unusual structure in which two of its neutrons form a diffuse “halo” roughly 10 times the radius of the nucleus's core. Oddly, one halo neutron can't stick without the other: Remove one to form lithium-10 (three protons and seven neutrons) and the other flies out, too. Lithium-10 is unbound.

    This means that the drip line zigzags as unbound lithium-10 lies between bound lithium-9 and lithium-11. Similarly, unbound beryllium-13 lies between bound beryllium-12 and beryllium-14, and unbound helium-7 and helium-9 interleave with bound helium-6 and helium-8. These interlopers are barely unbound; if their lowest energy state were just slightly lower, they'd stick together.

    Such unbound nuclei challenge established theories of nuclear structure. Protons and neutrons cling to one another through the strong force, and most theories assume that each particle whizzes about in a static force field determined by the average distribution of all the others. Such “mean field” models predict the existence of energy “shells”—like those for the electrons in an atom—into which the protons and neutrons stack.

    In these barely unbound nuclei, the mean-field approach comes up short. That's because the precise energy of the entire system depends on the details of the continual jumbling of the protons and neutrons. In that case, the notion of a shell—which assumes that the energy can be calculated from an unchanging average distribution of the particles—is no longer strictly valid, says Horst Lenske, a theorist at the Justus Liebig University Giessen in Germany. In fact, Lenske says, whether a nucleus is bound may depend on the precise and exceedingly complicated dynamics of all the interacting protons and neutrons.

    If theorists can account for these dynamics, then they might better understand all nuclei, Bonaccorso says. The basic shell model has been embellished in various ways to help account for dynamical effects and deal with specific nuclei. Insights from unbound nuclei might tie these ad hoc fixes together more coherently. “We are constructing theories that are far more general,” Bonaccorso says.

    Depends on how you look at it

    Studying unbound nuclei is not easy, however. The experiments require intense beams of radioactive nuclei to make these rare beasts and sophisticated detection schemes to snare the fragments released as they fly apart in less than a trillionth of a nanosecond. An unbound nucleus also presents a challenge because its properties depend on how it is produced.

    For example, physicists expect that lithium-10 consists of a lithium-9 core with a halo neutron whizzing around it, and they want to know the exact “state” of that far-flung neutron. To determine that, LPC's Orr and his team fired beryllium-11 nuclei (with four protons and seven neutrons) through a carbon target in experiments at GANIL. A few of the collisions plucked one proton out of the beryllium nucleus to make a lithium-10 nucleus. That would instantly break into a lithium-9 nucleus and a neutron, and the experimenters would look for those pieces.

    By measuring the energy with which the neutron and lithium-9 sped apart, researchers could probe their interactions; any pushing or shoving between them should create a peak in the energy spectrum. That spectrum would thus reveal the original state of the lithium-10 nucleus. Looking at the energy spectrum, Orr and his team found a broad peak that suggested the ejected neutron began in a state in which it had no angular momentum—a so-called s-state.
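    To give a feel for how such a spectrum is read, here is a minimal, illustrative Python sketch of a Breit-Wigner lineshape, the standard form used to describe the peak an unbound resonance produces in a decay-energy spectrum. The resonance energy and width below are invented for illustration and are not values from the experiment:

    ```python
    import math

    def breit_wigner(E, E_r, gamma):
        """Relative intensity at decay energy E (MeV) for a resonance.

        E_r   : resonance (peak) energy in MeV
        gamma : full width at half maximum in MeV; a very short-lived
                unbound state has a large width (gamma ~ hbar / lifetime)
        """
        return (gamma / (2 * math.pi)) / ((E - E_r) ** 2 + (gamma / 2) ** 2)

    # Illustrative values only: a broad peak near 0.5 MeV, width 0.3 MeV.
    spectrum = [(e / 10, breit_wigner(e / 10, 0.5, 0.3)) for e in range(21)]

    # The sampled maximum sits at the resonance energy.
    peak_energy = max(spectrum, key=lambda point: point[1])[0]
    print(peak_energy)
    ```

    In a real analysis, experimenters fit one or more such lineshapes (folded with detector resolution) to the measured spectrum; the fitted peak positions and widths are what reveal the state's energy and angular momentum.
    
    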

    However, Haik Simon of the Helmholtz Center for Heavy Ion Research (GSI) in Darmstadt, Germany, and colleagues took a different approach to make lithium-10. They fired rare lithium-11 nuclei at a carbon target to try to knock one neutron out of the lithium nucleus. They then examined the resulting lithium-10 much as Orr did and observed an energy spectrum comprising three overlapping peaks. That suggests that the lithium-10 sometimes emerged in an s-state, sometimes in a p-state with one unit of angular momentum, and sometimes in a d-state with two units.

    In spite of the incongruous results, there is an underlying consensus, Simon says. Both experiments show that, in spite of its infinitesimally brief existence, lithium-10 has a structure with well-defined energy states. They both also show that the lowest energy ground state is the s-state, as theory predicted. “Here we're getting quite clear,” Simon says. “It has now been resolved.”

    But not all unbound nuclei are so straightforward. Takashi Nakamura of the Tokyo Institute of Technology and colleagues see signs of a more complicated situation in unbound beryllium-13 (four protons and nine neutrons). In experiments at the Institute of Physical and Chemical Research's (RIKEN's) Nishina Center for Accelerator-Based Science in Wako, Japan, Nakamura and colleagues produced beryllium-13 by shooting beryllium-14 nuclei through a liquid hydrogen target to chip one neutron out of the incoming nucleus.

    When the beryllium-13 fell apart, the researchers measured the energy spectrum of the rebounding pieces and observed a peak. But that peak seems to have the wrong energy and shape to be the expected s-state and may be a p-state, Nakamura says. That suggests that in beryllium-13, the quantum state of the beryllium-12 core is somehow altered, or “collapsed,” by the mere presence of the extra neutron, he says. Deciphering how the core is deformed is the sort of challenge theorists hope will lead to new insights.

    Prime quarry.

    Physicists are stalking barely unbound nuclei such as lithium-10, beryllium-13, and helium-7 that on the chart of nuclei lie between bound neighbors.


    Stellar explosions run backward

    A nucleus can also become unbound if it absorbs too much energy, and such overamped nuclei may play starring roles in stellar explosions. In a blast called a nova or in a more-powerful one called an x-ray burst, heavier nuclei form when lighter ones rapidly absorb protons, in the so-called “rp-process.” For example, a magnesium-22 nucleus can absorb an energetic proton to make an aluminum-23 nucleus, which quickly spits out a photon to shed its excess energy.

    Reproducing that interaction is difficult. However, RIKEN's Tohru Motobayashi and colleagues have found an easier way to run it backward. They fire aluminum-23 nuclei through a lead target. As an aluminum nucleus passes through a lead nucleus's electric field, it absorbs a “virtual” photon from the field. That unbinds the aluminum-23 and it splits into magnesium-22 and a proton.

    The researchers measured the energy with which the magnesium-22 and proton flew away from each other. They found three overlapping peaks in the energy spectrum, the lowest at about 500 kiloelectron volts. Such details suggest that the forward process requires relatively high temperatures and densities and may play a bigger part in more-energetic explosions. “Our result implies that this process does not contribute so much to the nova but that it is important for the x-ray burst,” Motobayashi says.

    The study of unbound nuclei is likely to grow, given the new facilities coming online or in planning, researchers say. Two years ago, the Nishina Center revved up the massive superconducting cyclotron that powers the lab's Radioactive Isotope Beam Factory (Science, 15 December 2006, p. 1678). When fully functional, it will provide beams more than 1000 times as intense as those at older facilities. GSI is building a synchrotron-based facility that will power up in the middle of the next decade (Science, 2 November 2007, p. 738), and in December 2008, the U.S. Department of Energy chose Michigan State University in East Lansing to host its proposed linear-accelerator-based facility (Science, 19 December 2008, p. 1777).

    With much more intense beams, researchers should be able to climb the drip line to make unbound nuclei of elements up to aluminum or silicon, says Tokyo Tech's Nakamura. Motobayashi says it should also be possible to study overenergized nuclei involved in an astrophysical progression called the r-process, in which lighter nuclei gobble up neutrons and which is thought to forge half the nuclei heavier than iron.

    Exactly what unbound nuclei will reveal remains to be seen. Of course, the allure of the unknown is also leading nuclear physicists to test the bounds of their field.

    • * Unbound Nuclei Workshop, University of Pisa, Italy, 3–5 November 2008.
