News this Week

Science  22 Jul 2005:
Vol. 309, Issue 5734, pp. 540

    NASA May Cut Shuttle Flights and Reduce Science on Station

    Andrew Lawler

    All eyes were on the Florida coast this week as NASA struggled to end a two-and-a-half year hiatus in human space flight by launching the space shuttle Discovery. But behind the scenes, NASA's space transportation system is facing an even bigger challenge. On the table is a plan that could mean as few as a dozen more shuttle flights, even less science on the international space station, and a reengineered shuttle system to carry humans and cargo to the moon by the end of the next decade.

    NASA chief Michael Griffin is betting that the plan, which has yet to be approved by the White House and made public, will square with the exploration goals set by U.S. President George W. Bush in 2004 without busting the agency's budget or raiding unrelated science programs. He's also hoping for support from politicians fiercely protective of shuttle-related jobs in their states. But he's constrained by the still-rising costs of returning the shuttle to orbit. And he knows that NASA's European and Japanese partners will almost certainly balk at any attempt to reduce the station's capabilities yet again. “There's going to be a lot of kicking and screaming” over the station's future, predicts one official involved in the discussions.

    The transportation report, due out later this month, is one of two internal studies that Griffin requested shortly after taking office in March (Science, 18 March, p. 1709). The other, due out late next month, will examine how to assemble the space station using as few shuttle flights as possible.

    At the heart of the transportation report, according to officials familiar with it, is a redesigned solid rocket booster that carries the orbiter into space. By adding an upper stage and a capsule, NASA could turn the booster into three distinct vehicles: one to carry a crew of three or so, another to orbit equipment requiring a pressurized cabin, and a third to carry cargo that could withstand the vacuum of space. This “single-stick” option could be ready in 2011, providing crew and cargo services to the space station, according to sources familiar with the study.

    The retirement of the shuttle no later than 2010 would shift attention to a heavy-lift vehicle capable of launching a whopping 100 tons—an order of magnitude more than the single stick. That design also would draw on the shuttle system, essentially replacing the orbiter with a cargo carrier. The unpiloted vehicle would be used later in the decade to launch the pieces of a lunar outpost.

    Infrequent flyers?

    NASA is weighing a plan that could mean as few as a dozen more shuttle trips to the space station.


    A shuttle-derived vehicle, rather than one based on an existing expendable launcher, has political as well as engineering advantages. Lawmakers in Texas, California, Alabama, and Florida—the site of thousands of shuttle-related jobs—have been reluctant to pull the plug on the shuttle. For them, the single-stick and heavy-lift options promise to keep assembly lines humming after the orbiters are retired. And although Pentagon officials prefer a new launch system based on the department's Atlas or Delta launchers, Griffin won them over by assuring that plenty of science missions would be launched on Delta rockets.

    The estimated cost of these new vehicles is $10 billion to $15 billion through 2015. Operating costs for the single-stick series would run about $3 billion a year—approximately $1 billion less than the shuttle cost before Columbia's failure. NASA hopes to pay the tab from its scheduled modest budget increases and savings from falling shuttle return-to-flight costs. But one official says that those return-to-flight costs will climb as high as $7 billion over 5 years—$2 billion more than previously estimated. That figure would leave little room for new ventures, the costs of which have traditionally been underestimated.

    That gloomy budget picture is forcing NASA to consider even more radical cuts to the number of flights needed to finish the space station. NASA had planned 28 more shuttle flights, but the team reexamining the station is officially working to find a way to finish up after 18 to 24. Sources close to the second study say that Griffin and the White House are pressing for as few as a dozen more flights. Last month, Griffin warned his European and Japanese counterparts that the agency may propose other ways to put their laboratory modules into space, such as using expendable launchers, on an extended schedule. “He is softening the beachhead by warning that there may be some deferral,” says one source. Japan and Europe have resisted any alternative plan to launch the labs, their primary contribution to the station, because that would force expensive modifications and delays. “The reaction was quite adamant,” the official adds.

    To honor pledges from the White House to meet its obligations to the station partners, the redesign team is looking at alternatives to reducing shuttle flights. One strong possibility is to minimize the science aboard the U.S. laboratory module. Griffin has already issued such a warning (Science, 29 April, p. 610), but fewer shuttle flights could lead to even more dramatic reductions in science equipment and racks. “There isn't a lot of science that could be done on the space station that can't be done later” or on the moon, explains another official familiar with the study.

    Not true, says Ian Pryke, a senior fellow at George Mason University in Fairfax, Virginia, and former head of the Washington, D.C., office of the European Space Agency. A centrifuge, he notes, could provide important data on the long-term effect of lunar—or Mars-style—gravity on mammals. Japan is building the centrifuge for NASA, but Griffin already has stated that it likely must be abandoned given space and budgetary constraints.

    The station itself seems safe for now. But Griffin's job over the next several months will be to satisfy a White House eager to move beyond the station, placate foreign partners frustrated by delays, and convince lawmakers that he isn't ignoring station science. “With a radically reduced [shuttle] flight rate, the change is going to be traumatic,” warns one official. “We're in a mess.” That mess may well prove more daunting than a successful return to flight aboard Discovery.


    New Virtual Center Aims to Speed AIDS Vaccine Progress

    Jon Cohen

    A star-studded team of AIDS researchers from four universities, led by Barton Haynes of Duke, has won a huge award to explore some of the deepest immunologic mysteries confronting the field—part of a bold new effort to speed the search for an HIV vaccine. Haynes will direct the so-called Center for HIV/AIDS Vaccine Immunology (CHAVI), which could receive more than $300 million over the next 7 years from the U.S. National Institute of Allergy and Infectious Diseases (NIAID). “It's big science in the way that the Human Genome Project was,” says Peggy Johnston, the top AIDS vaccine official at NIAID, which announced the award last week.

    The CHAVI award marks the start of the Global HIV/AIDS Vaccine Enterprise, an ambitious public-private effort spearheaded by the Bill and Melinda Gates Foundation that aims to remove roadblocks hindering the field. “We've all been frustrated by the slow tempo of progress and how difficult a bug this virus is,” says Haynes, an immunologist and former chair of Duke's medical school in Durham, North Carolina. “This means a change in the way we do business.” Although Haynes and his collaborators—who include Harvard University's Norman Letvin and Joseph Sodroski, Oxford University's Andrew McMichael, and George Shaw of the University of Alabama, Birmingham—beat out three other high-profile teams, at least one competitor doesn't expect the award to divide the field. Harvard immunologist Bruce Walker predicts that the process will have “a lot of collateral positivity.”

    Dream team.

    Duke's Barton Haynes formed a winning AIDS vaccine consortium, part of the ambitious new Global HIV/AIDS Vaccine Enterprise.


    The enterprise envisions different funders—including the Gates Foundation and wealthy countries—sponsoring several CHAVI-like consortia. The push for these consortia grows out of deep frustration with the limits of investigator-initiated research. The enterprise attempts to address those limits by hewing to a strategic plan to guide the field, standardizing assays so labs can easily compare results, and avoiding unnecessary duplication. CHAVI itself will intensively examine immune responses and the genetic factors that give some people an upper hand against the AIDS virus. In particular, CHAVI investigators will study people who are repeatedly exposed to HIV but remain uninfected, and they will try to unravel why newly infected people vary in their ability to keep the virus in check. Haynes and his collaborators will also explore why some HIV isolates transmit more readily, the structure of anti-HIV antibodies that work best, and why some vaccines work in monkey experiments.

    The intensely competitive CHAVI application process has been the talk of the field for months. “Everybody who's very active was on one of the applications,” says Walker. His group may attempt to fund the projects they proposed through other sources, and Walker's already planning to meet with other also-rans. “My sense is a lot of these groups will continue to pursue the goals that they outlined,” he says. Haynes stresses that as CHAVI expands, it might invite researchers from the other teams to join the virtual center. “Our group is just one group,” says Haynes. “We don't have all the ideas.”

    Because money for CHAVI comes solely from NIAID's budget, some basic researchers worry that the institute may cut back on investigator-initiated grants. Anthony Fauci, NIAID's director, says he “can't predict funding from one year to another” but notes that current CHAVI funding taps new money and that NIAID makes it “the highest priority” to protect investigator-initiated research funds. “The field was screaming for some bold new approach,” Fauci says.

    The CHAVI grant will pay the full amount allocated ($49 million per year) only if the researchers meet specific milestones and move their ideas from the lab to clinical trials. “One of the challenges is going to be how to keep everybody pulling in the same direction,” says NIAID's Johnston, who sees CHAVI itself as a grand experiment. “It will either succeed big or fail big, but at least we have tried.”


    Parliamentary Gadfly Loses His Post

    Eliot Marshall

    CAMBRIDGE, U.K.—One of the few scientists in Parliament and a blunt critic of the management of U.K. science has been bumped from a committee leadership post by his own Labour Party. Ian Gibson, former dean of biological science at the University of East Anglia, says the party last week gave up control of the Select Committee on Science and Technology, knocking him from the chairmanship, a post he has held for 4 years.

    After losing seats in the national election in May, Labour was required by parliamentary rules to give up control of at least one committee; it chose the science panel. Gibson says the committee will be headed by an adversary of Labour, Phil Willis, a Liberal Democrat from a constituency 320 kilometers north of London. A former schoolteacher, Willis has handled education issues for his party but is not known to have spoken in Parliament about science, observers say.


    Ian Gibson gives up the science committee chair.


    Gibson says he is “very disappointed,” particularly because he thinks the select committee has sharpened policy by advancing “open access” publishing schemes and probing government research funding. Making last week's decision worse, he says, was a “ridiculous” move by Labour tacticians to stifle dissent: He and other Labour committee members “were more or less blackmailed to accept the choice” of Willis as chair or else lose their own seats on the committee. For this reason, he says, he will not oppose the change, and he expects to remain on the committee. The episode, Gibson adds, “smacks of getting your own back.” Party leaders, he believes, punished him for failing to toe the line, for example when he led a revolt against increases in university tuition fees. (The campaign failed.)

    The science committee under Gibson did a “valuable” job, according to Peter Cotgreave, director of a London-based lobby group called the Campaign for Science and Engineering. “The committee did some excellent reports,” says Cotgreave, including a probe of the Medical Research Council that highlighted management controversies that had been accumulating over many years. He would like to see the committee continue such investigations, perhaps taking on topics such as how science funders should pay for the “full cost” of university research and how to improve links between university and industrial researchers. The Gibson panel “kept the government on its toes,” Cotgreave says, adding, “that's the whole point” of Parliament.


    Carlo Rubbia Dismissed From Energy Agency

    Susan Biggin*
    *Susan Biggin is a writer in Trieste, Italy.

    ROME—Carlo Rubbia, winner of the 1984 Nobel Prize for work in particle physics, has lost his position as president of Italy's nuclear and alternative energy agency (ENEA) in a battle over leadership. The government dismissed Rubbia last week hours after he published an open letter in La Repubblica criticizing the scientific competence of the agency's board.

    The government has already named a special commissioner to take over: Luigi Paganetto, economics faculty head at Rome University “Tor Vergata.” He will be flanked by two deputies, both former board members: Claudio Regis, a hydrogen engine engineer, and Corrado Clini, director general of the environment ministry and one of Rubbia's fiercest opponents.

    ENEA's life has brimmed with controversy since it was set up in 1982 to oversee the nuclear power program. Despite grand ambitions, its agenda has been stalled by boardroom clashes and frequent changes of management. Observers say the agency never recovered from a national referendum in 1987 that pulled the plug on nuclear power. And its niche under the ministry for industry, in collaboration with the environment and research ministries, is top-heavy with bureaucracy.

    Slated for dissolution in the 1995 budget, ENEA managed to survive when supporters proposed an overhaul. They called for shifting to new areas such as nuclear fusion and high-performance computing. Rubbia took over as president in 1999, but after clashing with the board, resigned in 2001. He subsequently became ENEA's special commissioner with a mandate to prepare a law governing the agency's future. Under this legislation, Rubbia again became ENEA's president in early 2004.

    Publish and perish.

    Hours after his letter appeared in print, Rubbia was out.


    But Rubbia didn't get far. Board members overruled him on his choice of director general and frequently on scientific matters as well. They requested Rubbia's removal and proffered their own resignations. Early this year, Rubbia took the board to court over its “irregular” procedures—and won the removal of the board's director in mid-June. However, the court suggested placing ENEA under a commissioner. In June, ENEA's 3000 researchers publicly called for an end to the fighting, saying their work was being paralyzed.

    Last week, Rubbia complained in his letter that although the law stipulates that ENEA be led by scientists of international repute and high merit, its seven board members were political choices of ENEA's three umbrella ministries and exhibited a “lack of scientific knowledge.”

    Clini denies that politics lies at the heart of these clashes. He says that, contrary to the board's wishes, Rubbia favored a nuclear waste disposal project that would have spent a large part of ENEA's $440 million budget on French and German researchers.

    As to the future, Paganetto wants ENEA to launch new collaborations with Italy's other research institutions and “move quickly to take advantage of European projects.” Clini adds that the agency should become the hub for energy and environment research projects related to controlling greenhouse gases under the Kyoto Protocol.


    Flawed Statistics in Murder Trial May Cost Expert His Medical License

    Eliot Marshall

    Mangling statistics is a common offense, but in the case of Roy Meadow—a renowned expert on child abuse and co-founder of London's Royal College of Paediatrics and Child Health—it has had uncommon repercussions. As an expert for the prosecution in a 1999 criminal trial, Meadow overstated the low odds of two infants in the same family dying suddenly for unexplained reasons, helping convict a mother of murdering her two sons. On 15 July, a professional panel ruled that Meadow, 72 and retired, should be “erased” from the register of physicians in Britain for his statistical blunder—a decision that some think will deter scientists from testifying as expert witnesses.

    The mother, attorney Sally Clark, spent 3 years in prison before her husband (also an attorney) and others organized an appeal that quashed the verdict in 2003. An appeals court found that medical details had been withheld and that the jury may have been swayed by Meadow's testimony. The reversal also prompted an investigation of Meadow by the physicians' governing body, the General Medical Council (GMC).

    A “fitness to practice” panel, headed by GMC lay member Mary Clark-Glass, a former law lecturer, read Meadow its conclusions last week: “You abused your position as a doctor by giving evidence that was misleading, albeit unintentionally, and … you were working outside the limits of your professional competence by straying into the area of statistics. …” It found Meadow “guilty of serious medical misconduct” and meted out the severest penalty. The reason, Clark-Glass explained, was that Meadow's “eminence and authority … carried such great weight,” and his errors were “compounded by repetition” in court testimony. Meadow is not commenting to the press at this time.


    Roy Meadow overstated the odds against two SIDS deaths in one family.


    Meadow's most egregious mistake, according to the inquiry, was to testify that the risk of two infants in the same family dying of unexplained natural causes—sudden infant death syndrome (SIDS)—was one in 73 million. He acknowledged that he got the high number by taking a figure from a draft report of the risk of a single SIDS death in a nonsmoking family like the Clarks (1 in 8543) and squaring it. But according to the GMC, the unpublished report he used was not about the recurrence but the occurrence of SIDS. And the panel found that he was wrong to compute the odds as independent risks. Indeed, the GMC said that information in the draft report showed that the odds went the other way: “There is an elevated risk of a second SIDS death in one family after there has been one such death.” The panel faulted Meadow for getting the numbers wrong and for using a bold metaphor (for which he later apologized). He suggested that the likelihood of two children in a family dying this way would be like picking the winning horse in the Grand National 4 years in a row.
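    The core of the blunder is worth spelling out. Below is a minimal arithmetic sketch of the two ways of combining the figures; the 1-in-8543 number is the one from the draft report cited in court, while the tenfold elevated risk of a second death is a purely hypothetical placeholder, since the true degree of dependence was exactly what Meadow's calculation ignored.

```python
# Illustrative only: contrasts Meadow's independence assumption with a
# dependent-risk calculation. The relative-risk factor below is hypothetical.

p_first = 1 / 8543            # risk of one SIDS death in a family like the Clarks

# Meadow's approach: treat the two deaths as independent and square the risk.
p_both_independent = p_first ** 2
print(f"Assuming independence: 1 in {1 / p_both_independent:,.0f}")   # ~1 in 73 million

# If shared genetic or environmental factors raise the risk of a second death,
# the relevant quantity is P(first) * P(second | first), which is far larger.
assumed_relative_risk = 10    # hypothetical: second death 10x the baseline risk
p_both_dependent = p_first * (assumed_relative_risk * p_first)
print(f"With correlated risk:  1 in {1 / p_both_dependent:,.0f}")     # ~1 in 7.3 million
```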

    Some think it was a mistake for the GMC to focus its ire on Meadow. Richard Horton, editor of The Lancet, argued in a 2 July editorial that Meadow had become a “scapegoat” for the failings of the legal system in a series of controversial child abuse cases in which convictions have recently been overturned. “Society needs to guard against any crude and oversimplistic settling of scores,” he wrote. He proposed a commission to examine how experts should be used in court.

    Meadow's friend and colleague Alan Craft, president of the College of Paediatrics, protests that Meadow “did not mean to mislead the jury” but acknowledges that he was “wrong in one small bit of evidence and in the way he presented the statistics.” For the GMC to jump from that to serious misconduct was “quite astonishing,” Craft says, and will make it “extraordinarily difficult to get experts involved in child protection cases.” He thinks the implications for all of medicine “are enormous.” Philip Newman, deputy chair of the 1500-strong Academy of Experts in London, says the decision is not necessarily bad: It's a strong reminder that experts must adhere to three I's—“independence, impartiality, and integrity.”

    Meadow has 28 days to appeal.


    U.S. Rules Could Muffle Scientific Voices

    Yudhijit Bhattacharjee

    The U.S. government has issued new rules on interactions between U.S. citizens and the United Nations Educational, Scientific, and Cultural Organization (UNESCO) that some scientific organizations fear could limit access to the international scientific and cultural body by U.S. experts. But U.S. officials say the changes are intended simply to keep the government in the loop.

    “It certainly has the power of acting as a filtering process,” says Christopher Keane. He represents the American Geological Institute on the U.S. National Commission to UNESCO, a 100-member body appointed by the U.S. government to coordinate communications between its citizens and UNESCO that was briefed on the directive last month at its first meeting. “But it's a little hard to hang them on it until there's evidence” that the U.S. government is preventing UNESCO from accessing the experts it needs, says Keane.

    The 5 May directive, from U.S. Ambassador Louise Oliver to UNESCO Director General Koichiro Matsuura, requires UNESCO to consult U.S. officials before partnering with organizations or citizens in the United States. It also asks UNESCO to check with the U.S. permanent delegation and the commission before planning any U.S. events. U.S. individuals and institutions, it adds, must channel all communications through the commission and avoid direct contact with the UNESCO secretariat in Paris.

    Veto power.

    New directive could restrict U.S. access to meetings such as this one at the International Centre for Theoretical Physics in Trieste, Italy.


    U.S. officials say the directive is meant to keep the U.S. government informed about UNESCO's dealings with nongovernmental organizations and is consistent with UNESCO's own regulations. The memo “absolutely does not impose a vetting mechanism,” says Andrew Koss, a State Department official who serves as the deputy chief of the U.S. mission. “Advance consultation simply means that if UNESCO comes to us with a list of potential partners, we might offer additional names to help them broaden their horizons.” The United States rejoined UNESCO 2 years ago after dropping out in 1984.

    But others say the directive goes far beyond the practices of most member states, which only expect UNESCO to inform their national commissions about a given activity after the details have been worked out. “The memo implies that UNESCO's decisions to engage U.S. scientists and engineers—even when they are being selected for their expertise and not as official U.S. representatives—need to be vetted by the U.S. government,” says Irving Lerch, chair-elect of the American Physical Society's Forum on International Physics. Lerch, who's also a trustee for Friends of UNESCO, says the procedure would allow the U.S. government to control the flow of scientific opinion from the research community to UNESCO.

    The memo has also sparked concern among some managers of UNESCO's scientific programs. “If we were to follow this literally, organizing routine scientific meetings could get very difficult for us,” says K. R. Sreenivasan, director of the International Centre for Theoretical Physics in Trieste, Italy, which is a part of UNESCO. “We'd like to invite U.S. scientists who are appropriate for us, not those who have been approved by the U.S. government.”


    Defense Rules Would Pinch Foreign-Born Scientists

    Yudhijit Bhattacharjee

    The U.S. Department of Defense (DOD) has proposed a rule that would make it harder for universities to involve foreign nationals in unclassified research projects funded by the agency. The additional security arrangements required by the rule are at odds with traditional practices, say university administrators. The result, they warn, will be fewer opportunities for many researchers born abroad.

    The rule, published in the 12 July Federal Register, is intended to beef up DOD's compliance with export-control regulations aimed at restricting the transfer of certain technologies to countries viewed as threats to national security. The Commerce Department earlier this year proposed modifying those regulations so that universities must obtain a license before engaging nationals from a list of countries that includes China, India, and Russia (Science, 13 May, p. 938). Universities have traditionally considered themselves exempt from this requirement under what is known as the fundamental research exemption.

    By not mentioning the fundamental research exemption, the DOD rule would apply to all DOD-sponsored research. To comply, universities and companies working on defense projects would not only need licenses to enable foreign nationals to participate in the research but would also need to protect export-controlled information through an “access control plan” that includes “unique badging requirements for foreign nationals” and “segregated work areas.” The requirements are in line with recommendations last year from DOD's Inspector General, who concluded that the agency did not have “adequate processes to identify unclassified export-controlled technology and to prevent unauthorized disclosure to foreign nationals” (Science, 23 April 2004, p. 500).

    University officials foresee “draconian clauses” in research contracts that would make it more difficult for them to involve foreign nationals in projects, says Toby Smith, senior federal relations officer for the Association of American Universities in Washington, D.C. Many universities would have to turn down such contracts either because of the cost of additional security or to avoid violating their own nondiscrimination policies, Smith says. “Walling off labs, making foreign graduate students wear badges—it's just not what we do at a university,” says Paul Powell, assistant director of the Office of Sponsored Programs at the Massachusetts Institute of Technology in Cambridge.

    The comment period closes 12 September.


    Bill Could Restructure Agency and Strengthen Director's Hand

    Jocelyn Kaiser

    An influential legislator wants to boost the budget authority of the director of the National Institutes of Health (NIH)—and impose a ceiling on the agency's overall growth.

    Those controversial suggestions are expected to be part of a bill to streamline management of the biomedical research behemoth. A draft has triggered a mixed reaction from research community leaders, who fear that giving the NIH director too much power could lead to unwise decisions about how to divide resources among NIH's 27 institutes and centers. Meanwhile, a Senate spending panel last week approved a $1.05 billion boost for NIH in 2006, to $29.4 billion. But that level is unlikely to be sustained in the final spending bill because it relies on accounting tricks that are unpopular in the House.

    The NIH bill, known as a reauthorization, will soon be introduced by Representative Joe Barton (R-TX), chair of the House Energy and Commerce Committee. NIH's programs were last reauthorized in 1993. A “discussion draft” of the new legislation, which was aired at a committee hearing this week, reflects advice from a 2003 Institute of Medicine (IOM) report on how to address concerns that NIH's sprawling structure makes it less agile and leads to duplicative research across the agency (Science, 1 August 2003, p. 574).

    Several provisions in the upcoming bill reflect IOM recommendations, such as boosting the 1% of NIH's overall budget that the director can now move from one institute to another or pool for common projects. The bill would also create a new division in the director's office to analyze NIH's overall portfolio and disburse grants for trans-NIH initiatives, and it would require NIH to give Congress detailed spending reports every 2 years. But another key provision came out of left field, observers say: lumping together the annual budgets for NIH's institutes into just two piles—one for 15 “mission-specific” institutes such as cancer and diabetes and a second for nine “science-enabling” institutes such as general medical science and genomics. Smaller piles would go to the director's office and its planning division.

    Capitol idea.

    Representative Joe Barton (R-TX) is proposing changes in how NIH manages its money.


    Although giving the NIH director more “flexibility” to move funds is a good idea, says David Moore, head of governmental relations for the Association of American Medical Colleges, Congress should be wary of sanctioning “huge reallocations” because that would override the careful planning that now goes into each institute's appropriation. Research lobbyists worry, too, that the science-enabling institutes could lose out because they don't have patient advocacy groups backing them. Patient groups, for their part, are concerned that eliminating institutes' individual appropriations will make it harder to advocate for funding for particular diseases.

    Research leaders are also unhappy that the bill would specify the maximum budget increase NIH could receive from 2007 to 2009, the period covered by the reauthorization. Legislators typically talk about approving “such sums as necessary” in reauthorization bills to give appropriators full discretion each year.

    Barton is believed to be concerned that the doubling of NIH's budget between 1999 and 2003 was not particularly well managed and wants to foreclose such rapid growth. But “the research community is very concerned about what the overall authorization levels will be. We're watching it very closely,” says Patrick White, a lobbyist for the 62-member Association of American Universities.

    Barton hopes that the House will pass his measure before the end of the year. There is as yet no equivalent bill in the Senate. And any bill would be vulnerable to members of Congress attaching amendments on controversial topics such as support for human embryonic stem cell research. In the meantime, biomedical research advocates are watching closely what happens in the House.


    Global Analyses Reveal Mammals Facing Risk of Extinction

    Erik Stokstad

    Two new studies are helping conservation biologists think big—in the case of one of the studies, as big as one-tenth of the continents.

    Conservationists typically set goals and priorities for relatively small regions. Although some have come up with priorities for the planet, these have often been wish lists rather than objectives drawn from rigorous analyses. Now a team of researchers, led by mammalogist Gerardo Ceballos of the National Autonomous University of Mexico, has conducted the first global analysis of the conservation status of all known land mammals. On page 603, they report that 25% of known mammal species are at risk of extinction. In order to decrease the risk to mammals worldwide, about 11% of Earth's land should be managed for conservation, the analysis finds.

    This is the first time such a global conservation estimate has been calculated for mammals, and although experts are not surprised by these results, they praise the study for its comprehensiveness and detail. “This sets a new standard for global priority-setting analyses,” says Peter Kareiva, lead scientist for The Nature Conservancy.

    A second conservation study, reported online by Science this week, finds that large mammals may be more threatened than their smaller relatives. A team led by Georgina Mace of the Zoological Society of London and Andy Purvis of Imperial College London reports that adult mammals that weigh more than 3 kilograms tend to have biological traits that hike their risk of extinction. “Both of these papers provide us with finer and more detailed insights into threat patterns and processes,” says Thomas Brooks of Conservation International in Washington, D.C.

    Big risk.

    Large size significantly ups the odds of extinction for mammals such as elephants and pandas.


    The two new analyses rely on massive data sets. Ceballos and his colleagues combed the literature and compiled geographic ranges for all 4795 known species of land mammals. After dividing the world's land into many thousands of cells, each 10,000 square kilometers, they plugged their range data into a conservation planning model, called MARXAN, that identified the least amount of area—all told, 17,020,000 km², or 1702 cells—that would conserve at least 10% of the range of each species. Various population models used by conservation biologists typically specify that threshold as the minimum amount of range needed to sustain a healthy population of a species.
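    The optimization described here is a form of the classic minimum-set coverage problem: pick as few cells as possible while meeting every species' 10% target. The sketch below is not MARXAN's own algorithm, just a toy greedy heuristic on invented data that illustrates the kind of problem being solved.

```python
# Toy greedy reserve selection: choose as few 10,000-km^2 cells as possible so
# that every species keeps at least 10% of its range cells. NOT MARXAN's code;
# species ranges here are invented for illustration.

ranges = {                      # hypothetical species -> set of cell IDs in its range
    "species_a": {1, 2, 3, 4, 5, 6, 7, 8, 9, 10},
    "species_b": {5, 6, 7},
    "species_c": {9, 10, 11, 12},
}
target_fraction = 0.10          # conserve >= 10% of each species' range

def cells_needed(range_cells):
    # cells required to meet the 10% target, at least one per species
    return max(1, round(target_fraction * len(range_cells)))

selected = set()
while True:
    # species whose target is not yet met by the selected cells
    unmet = {s: c for s, c in ranges.items() if len(c & selected) < cells_needed(c)}
    if not unmet:
        break
    # greedily add the cell that appears in the most still-unmet ranges
    candidates = set().union(*unmet.values())
    best = max(candidates, key=lambda cell: sum(cell in c for c in unmet.values()))
    selected.add(best)

print(f"Selected {len(selected)} cells: {sorted(selected)}")
```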

    This particular analysis won't be used in specific conservation efforts because the scale is much too coarse, but experts say it reveals important points. For example, the analysis shows that the collection of 1702 cells—11% of the total—would provide a resilient and flexible strategy, because almost any cell can be replaced by another cell without an overall loss of species diversity. But about 80% of these high-priority cells have already been affected by agriculture, which destroys natural habitat. “We simply are not going to be able to do conservation without making it compatible with some measure of agriculture,” notes Kareiva.

    The results from Ceballos's team are only for mammals, whose ranges may not overlap with those of other taxa. Adding birds, amphibians, and reptiles would increase the amount of land needed to be conserved. “We need to do much more,” says study author Paul Ehrlich, a population biologist at Stanford University. “If you want to add in most biodiversity, you're talking about [conservation of] 30% to 40% of Earth's surface,” he speculates. Ehrlich adds that the smaller population of a species that survives on just 10% of its former range won't be as effective at providing ecosystem goods and services, such as pollination or bush meat.

    Similar results about mammal ranges and conservation, not yet published, will come from John Gittleman, an evolutionary biologist at the University of Virginia, Charlottesville. His group spent 4 years collecting range maps and biological data for all known land mammals. “There's a nice convergence,” he says. “It's very reassuring.”

    The report by Mace, Purvis, and their colleagues relies on information from Gittleman's group as well as other data sets such as the World Conservation Union's Red List, which ranks mammals according to the extinction threats they face. Drawing on such information for 4000 mammal species, the authors determined what factors, such as small geographic ranges or large body size, put particular species at higher risks of extinction.

    The analysis found that for mammals smaller than 3 kilograms, the main risk factors were environmental, such as proximity to agriculture or human populations. Identifying and conserving habitat is likely to be enough to keep these species going, the scientists conclude. But larger animals, such as elephants and pandas, face threats magnified by intrinsic biological constraints, such as small litters and long gestation times. Conservation biologists had suspected that larger mammals face greater extinction risks, but the size of this data set puts the premise on a much stronger footing, Gittleman says.

    Mace, Purvis, and their colleagues conclude that the survival of large mammals will likely require a concerted effort tailored to the biology of each species.


    Forty-Four Researchers Broke NIH Consulting Rules

    Jocelyn Kaiser

    An internal review of 81 National Institutes of Health researchers who consulted for industry since 1999 has found that 44 did not follow NIH ethics rules for such activities. Nine cases are serious enough to be investigated for possible criminal misdeeds, according to the review.

    These results, released last week by the House Energy and Commerce Committee, are part of an examination of NIH ethics rules begun in late 2003 following media reports of large payments by drug and biotech companies to some NIH employees. The furor led NIH earlier this year to temporarily ban all consulting (Science, 11 February, p. 824).

    The violations show that “the ethical problems are more systemic and severe than previously known,” declared Representative Joe Barton (R-TX), chair of the panel that has been investigating NIH. Spokesperson John Burklow says NIH “has been aware of the issues and problems for some time” and is addressing them. Some NIH staffers and observers suggest that the report actually demonstrates how few of the agency's thousands of researchers committed serious violations. Still, “nine is too many,” says Howard Garrison, public affairs director of the Federation of American Societies for Experimental Biology in Bethesda, Maryland.

    The 81 names appeared on lists that 20 drug companies gave to the committee but not on NIH's own tally of staff consulting activities. Although 37 people were cleared, the rest didn't request approval for their consulting, did the work on company time, and/or did not report the income, according to an 8 July letter from NIH Director Elias Zerhouni to the committee. Eight have since left NIH. Officials have concluded that the consulting in some instances conflicted with the employee's official duties and in other cases traded on “the name of NIH as an affiliation.”

    Nine cases have been referred to the Department of Health and Human Services' inspector general (IG), the letter says. A few of those names have been reported in the press previously—such as Alzheimer's disease researcher Trey Sunderland, who is still at NIH, and cancer researcher Lance Liotta, who left this spring for George Mason University in Manassas, Virginia. A spokesperson in the IG's office said that former government employees may still be prosecuted.

    NIH is still reviewing the cases of 22 staffers. These scientists either admitted not reporting an activity or were named in stories by the Los Angeles Times that sparked the ethics overhaul.


    France Hatches 67 California Wannabes

    Martin Enserink

    PARIS—France may soon have its own Silicon Valley—or, more likely, 67 miniversions of that icon of American innovation. Last week, Prime Minister Dominique de Villepin announced a list of 67 regional partnerships across the country that his government hopes to nurture into cutting-edge science and technology engines designed to create new jobs and kick-start the economy. But the plan has already run into criticism: Some researchers say industrial strategy shouldn't drive research policy, while others argue that the funds available—some €1.5 billion for the next 3 years—are spread so thin they can't possibly have much impact.

    The decision to create “Competitiveness Clusters,” as the new regional hubs are called, was taken last year by the previous government, led by Jean-Pierre Raffarin. But it has been embraced by Villepin, who made fighting France's double-digit unemployment his number one priority when he took over last month. Flanked by four cabinet ministers and citing Silicon Valley as a “historic example,” Villepin called the plan a “choice for ambition” when he presented it last week.

    Spreading the wealth.

    Almost every region in France will be home to several of 67 new Competitiveness Clusters. (The number on this map is greater than 67 because interregional clusters are shown more than once.)


    The clusters—selected from 105 candidates by an interdepartmental panel—consist of a regional collaboration among research institutes, schools, universities, and businesses. Their focus ranges from nanotechnology and secure communications to sports equipment and—in “Cosmetic Valley,” a plan backed by companies such as Dior and Yves Saint Laurent—“the science of beauty and well-being.” The centers will benefit from tax breaks as well as specific support from funding agencies, including the new National Research Agency. They will also enjoy priority status when the government allocates the 3000 new research jobs it has promised for next year (Science, 27 May, p. 1243).

    But some fear that Villepin's version of Silicon Valley may be unattainable. The failure thus far to translate French research into new, profitable technologies stems from a variety of factors—including a less entrepreneurial spirit, timid venture capitalists, and discouraging bankruptcy laws—says Alain Trautmann, the spokesperson of Sauvons la Recherche (Let's Save Research), a protest movement. He doesn't think those problems can be fixed by scattering extra funds here and there. What's more, Trautmann says, U.S. high-tech hubs arise in areas with excellent basic research, which doesn't “take its orders from industry.”

    Others have criticized the large number of centers, suggesting that the plan is inspired more by behind-the-scenes lobbying and U.S.-style pork-barrel politics than by a desire to promote excellence. The resulting budget per center (some €7.5 million per year, often shared by dozens of partners) is bound to be ineffective, the opposition Socialist Party said in a statement last week.

    But Bruno Goud, a group leader at the Curie Institute—a partner in a health cluster in the Paris region that's on the list—says something is better than nothing. Although it may be “typically French” for the government, rather than market forces, to designate the hot spots of the future, he adds, that doesn't mean it won't work.


    Is It Time to Shoot for the Sun?

    Robert F. Service

    Officials at the U.S. Department of Energy are working to kindle support for a crash program to transform solar energy from a bit player into the world's leading power source

    Ask most Americans about their energy concerns, and you're likely to get an earful about gasoline prices. Ask Nate Lewis, and you'll hear about terawatts. Lewis, a chemist at the California Institute of Technology in Pasadena, is on a mission to get policymakers to face the need for sources of clean energy. He points out that humans today collectively consume the equivalent of a steady 13 terawatts (TW)—that's 13 trillion watts—of power. Eighty-five percent of that comes from fossil fuels that belch carbon dioxide, the primary greenhouse gas, into the atmosphere. Now, with CO2 levels at their highest point in 125,000 years, our planet is in the middle of a global experiment.

    To slow the buildup of those gases, people will have to replace most, if not all, of those 13 TW with carbon-free energy sources. And that's the easy part. Thanks to global population growth and economic development, most energy experts predict we will need somewhere around an additional 30 TW by 2050. Coming up with that power in a way that doesn't trigger catastrophic changes in Earth's climate, Lewis says, “is unarguably the greatest technological challenge this country will face in the next 50 years.”

    Clearly, there are no easy answers. But one question Lewis and plenty of other high-profile scientists are asking is whether it's time to launch a major research initiative on solar energy. In April, Lewis and physicist George Crabtree of Argonne National Laboratory in Illinois co-chaired a U.S. Department of Energy (DOE) workshop designed to explore the emerging potential for basic research in solar energy, from novel photovoltaics to systems for using sunlight to generate chemical fuels. Last week, the pair released their report on the Web, and the hard copy is due out soon.

    The report outlines research priorities for improving solar power. It doesn't say how much money is needed to reach those goals, but DOE officials have floated funding numbers of about $50 million a year. That's up from the $10 million to $13 million a year now being spent on basic solar energy research. But given the scale of the challenge in transforming the energy landscape, other researchers and politicians are calling for far more.

    It is too early to say whether the money or the political support will fall in line. But it is clear that support for a renewed push for solar energy research is building among scientists. Last month, Lewis previewed his upcoming report for members of DOE's Basic Energy Sciences Advisory Committee (BESAC), which regularly must weigh its support for facilities that include x-ray synchrotrons, neutron sources, nanoscience centers, and core research budgets. Despite a painfully lean budget outlook at DOE, support for a solar research program “is nearly unanimous,” says Samuel Stupp, a BESAC member and chemist at Northwestern University in Evanston, Illinois.

    Why? Terawatts. Even if a cheap, abundant, carbon-free energy source were to appear overnight, Lewis and others point out, it would still be a Herculean task to install the new systems fast enough just to keep up with rising energy demand—let alone to replace oil, natural gas, and coal. Generating 10 TW of energy—about 1/3 of the projected new demand by 2050—would require 10,000 nuclear power plants, each capable of churning out a gigawatt of power, enough to light a small city. “That means opening one nuclear reactor every other day for the next 50 years,” Lewis says. Mind you, no new nuclear plant has been ordered in the United States since 1973, and concerns about high up-front capital costs, waste disposal, corporate liability, nuclear proliferation, and terrorism make it unlikely that will change in any meaningful way soon.
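    Lewis's “one reactor every other day” line follows directly from the numbers above; here is a quick back-of-envelope check, using the 1-gigawatt plant size quoted in the text:

```python
# Back-of-envelope check of the build rate implied by the figures above.
new_supply_tw = 10                     # 10 TW of new carbon-free capacity
plant_gw = 1                           # one typical nuclear plant ~ 1 GW

plants_needed = new_supply_tw * 1000 / plant_gw   # 10,000 plants
days = 50 * 365                                   # 50 years of construction

print(f"{plants_needed:,.0f} plants over {days:,} days "
      f"= one every {days / plants_needed:.1f} days")   # roughly one every other day
```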

    Fields of gold.

    Solar power is the most promising renewable energy source.


    Other energy alternatives have their drawbacks as well. Fusion reactors have the theoretical potential to provide massive amounts of cheap power—but not soon. Last month, Japan, Europe, China, Russia, South Korea, and the United States agreed to build a new experimental fusion reactor in France at a projected cost of $5 billion (Science, 1 July, p. 28). But even if the facility meets proponents' grandest expectations, it will still provide a sustained fusion reaction for at most 500 seconds, a far cry from the continuous operation needed to yield large amounts of power. “Will it work? We don't know. But we think it's worth the investment,” says Ray Orbach, who directs DOE's Office of Science.

    There is, of course, a grab bag of renewable energy options as well. Chief among them is wind energy. The technology already produces electricity for $0.05 a kilowatt-hour, making it cheaper than all but natural gas and coal plants. Still, scale is a problem. If wind turbines were installed wherever wind is plentiful and the costs reasonable, they still would generate only 2 to 6 TW of power, according to recent estimates from the Intergovernmental Panel on Climate Change and the European Wind Energy Association. (A new estimate from researchers at Stanford University ups the figure to 72 TW, a much higher number based on wind potential at 80 meters off the ground—the height of modern wind turbine hubs—where wind speeds are typically stronger. But that estimate extrapolates global wind potential from point measurements, Lewis says.) In any case, it's clear that wind energy is a critical renewable resource that will be pursued. But if the earlier predictions of wind energy potential are correct, it's no panacea.

    Biomass, geothermal, and energy from ocean waves also have potential. But biomass's potential is limited by the need to use arable land to grow food; geothermal energy's potential is limited by high drilling costs; and ocean power has been stalled in part by high construction costs. Shunting CO2 from power plants underground before it can escape into the atmosphere holds vast promise (Science, 13 August 2004, p. 962). But large-scale demonstrations have only recently begun and haven't confirmed that CO2 will remain underground for hundreds to thousands of years without leaking out. “We absolutely need to be doing this. But it may not technically work,” Lewis says. Finally, conservation programs have the potential to squeeze a lot more mileage out of existing energy sources. But by themselves they don't solve the CO2 problem.

    So what is the world to do? Right now the de facto answer is more fossil fuel: The United States is currently opening natural gas plants at the rate of about one every 3.5 days. A stroll through Beijing makes it clear that China is pursuing coal just as fast. Fossil fuel use shows no signs of slowing (see figure, below).


    Handwringing geologists have been warning for years that worldwide oil production is likely to peak sometime between now and 2040, driving oil prices through the roof. The critical issue for climate, however, is not when production of a fossil fuel peaks, but how much of it remains to be burned. At the 1998 level of energy use, there is still at least an estimated half a century's worth of oil available, 2 centuries' worth of natural gas, and a whopping 2 millennia's worth of coal. The upshot is that we will run into serious climate problems long before we run out of fossil fuels.

    What's left? Solar. Photovoltaic panels currently turn sunlight into 3 gigawatts of electricity. The business is growing at 40% a year and is already a $7.5 billion industry. But impressive as it is, that's still a drop in the bucket of humanity's total energy use. “You have to use a logarithmic scale to see it” graphed next to fossil fuels, Lewis says.

    What solar does have going for it is, well, the sun. Our star puts out 3.8 × 10²³ kilowatt-hours of energy every hour. Of that, 170,000 TW strike Earth at any moment, nearly one-third of which is reflected back into space. The bottom line is that every hour, Earth's surface receives more energy from the sun than humans use in a year.
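    The arithmetic behind that comparison can be checked with the figures above: an hour of incident sunlight comfortably exceeds a year of human consumption, and even after subtracting the reflected third, the two quantities are of the same order. A minimal sketch:

```python
# Rough check of the hour-of-sunlight vs. year-of-human-use comparison,
# using only the figures quoted in the text.
incident_tw  = 170_000                     # solar power striking Earth (TW)
reflected    = 1 / 3                       # rough fraction bounced back to space
human_use_tw = 13                          # steady global consumption (TW)

sun_one_hour_twh   = incident_tw * 1                 # TWh striking Earth in one hour
absorbed_hour_twh  = incident_tw * (1 - reflected)   # TWh absorbed in that hour
human_one_year_twh = human_use_tw * 24 * 365         # TWh humans use in a year

print(f"Sunlight hitting Earth in 1 hour: {sun_one_hour_twh:,.0f} TWh")    # 170,000
print(f"Absorbed in that hour:            {absorbed_hour_twh:,.0f} TWh")   # ~113,000
print(f"Human use over a full year:       {human_one_year_twh:,.0f} TWh")  # ~114,000
```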

    Collecting even a tiny fraction of that energy won't be easy. To harvest 20 TW with solar panels that are 10% efficient at turning sunlight to electricity—a number well within the range of current technology—would require covering about 0.16% of Earth's land surface with solar panels. Covering all 70 million detached homes in the United States with solar panels would produce only 0.25 TW of electricity, just 1/10 of the electric power consumed in the country in the year 2000. That means land will need to be dedicated for solar farms, setting up land use battles that will likely raise environmental concerns, such as destroying habitat for species where the farms are sited.

    Solar energy advocates acknowledge that a global solar energy grid would face plenty of other challenges as well. Chief among them: transporting and storing the energy. If massive solar farms are plunked down in the middle of deserts and other sparsely populated areas, governments will have to build an electrical infrastructure to transport the power to urban centers. That is certainly doable, but expensive.

    A tougher knot is storing energy from the sun. Because electricity cannot be stored directly, it must be converted to some other form of potential energy for storage, such as the electrochemical energy of a battery or the kinetic energy of a flywheel. The massive scale of global electric use makes both of those forms of energy storage unlikely. Another possibility is using the electricity to pump water uphill to reservoirs, where it can later be released to regenerate electricity. Electricity can also be used to generate hydrogen gas or other chemical fuels, which can then be delivered via pipelines to where they are needed or used directly as transportation fuels. But that too requires building a new expensive infrastructure that isn't incorporated in solar energy's already high cost.

    The issue of cost may be solar energy's biggest hurdle. Even without the extra infrastructure, harvesting power from the sun remains one of the most expensive renewable technologies on the market and far more expensive than the competition. In his BESAC presentation last month, Lewis noted that electricity derived from photovoltaics typically costs $0.25 to $0.50 per kilowatt-hour. By contrast, wind power costs $0.05 to $0.07, natural gas costs $0.025 to $0.05, and coal $0.01 to $0.04. What is more, electricity makes up only about 10% of the world's energy use. Globally, most energy goes toward heating homes, which is usually done more cheaply by burning fossil fuels directly than by using electricity. As a result, says Lewis, “solar energy needs to be 50-fold lower in cost than fossil fuel electricity to make electric heat cheap enough to compete.”

    If all this has a familiar ring to it, that's because many of the same arguments and alternatives have been discussed before. In the wake of the oil shocks of the 1970s, the Carter Administration directed billions of dollars to alternative energy research. The big differences now are the threat of climate change and the current huge budget deficits in the United States. Some of the cost numbers have changed, but the gap between solar energy's potential and what is needed for it to be practical on a massive scale remains wide. The April DOE meeting explored many ideas to bridge that gap, including creating plastic solar cells and making use of advances in nanotechnology (see sidebar, p. 549).

    That wealth of potentially new technologies makes this “an excellent time to put a lot of emphasis on solar energy research,” says Walter Kohn, a BESAC member and chemist at the University of California, Santa Barbara. Some of these ideas do currently receive modest funding, enough to support a handful of individual investigator-driven labs. But Richard Smalley, a chemist at Rice University in Houston, Texas, who advocates renewed support for alternative-energy research, notes that unless research progresses far more rapidly to solve the current energy conundrum by 2020, there is essentially no way to have large amounts of clean-energy technology in place by 2050. “That means the basic enabling breakthroughs have to be made now,” Smalley says.

    Global need.

    This map shows the amount of land needed to generate 20 TW with 10% efficient solar cells.


    Of course a major sticking point is money. At the April meeting, DOE officials started talking about funding a new solar energy research initiative at about $50 million a year, according to Mary Gress, who manages DOE's photochemistry and radiation research. Lewis is reluctant to say how much money is needed but asks rhetorically whether $50 million a year is enough to transform the biggest industry in the world. Clearly, others don't think so. “I don't see any answer that will change it short of an Apollo-level program,” Smalley says.

    For the past few years, Smalley has been advocating a $0.05-a-gallon gasoline tax to fund $10 billion a year in alternative energy research, which encompasses more than just solar research. A few members of Congress have recently pushed for that level of funding for alternative energy R&D. But so far such measures have failed to win broad support. Even coming up with $50 million a year in new money will be difficult, given growing pressure to reduce the current $333-billion-a-year deficit. “With the budget outlook the way it is, it'll be pretty hard,” says Patricia Dehmer, associate director of science in DOE's Office of Basic Energy Sciences. Asked whether a solar energy research initiative has a shot at receiving backing by the Administration, Joel Parriott, who helps the White House Office of Management and Budget oversee the budget for DOE's Office of Science, says that “it's too early to tell.” He adds that the Administration has already set its energy policy priorities as increasing oil drilling in Alaska's Arctic National Wildlife Refuge, clean coal, and hydrogen. However, he says, “that doesn't mean there isn't room for new things.”

    With Congress close to passing an energy bill that focuses on tax breaks for oil exploration and hybrid cars, it doesn't look as if a big push on solar energy will be one of those “new things” anytime soon. But Dehmer notes that progress on energy issues happens slowly. “I'm trying to lay the groundwork for a commitment on the scale of a major scientific user facility,” she says.

    Many researchers expect a push on solar energy research to be a far easier sell than DOE's earlier push for progress in hydrogen technology. “With hydrogen it was a lot more controversial,” Stupp says. “There are scientific issues that are really serious [in getting hydrogen technology to work]. With solar, it's an idea that makes sense in a practical way and is a great source of discovery.” If that research and discovery doesn't happen, Lewis says he's worried about what the alternative will bring: “Is this something at which we can afford to fail?”


    Solar Report Sets the Agenda

    1. Robert F. Service

    If they are ever to supply a major part of the world's energy needs, solar cells must become both much cheaper and more efficient at converting sunlight to electricity. Meeting those somewhat contradictory goals will not be easy. But recent trends in the industry offer hope.

    In fact, the efficiency of solar cells has risen steadily over the past 4 decades. And as manufacturing levels have risen, the price of installed solar panels has dropped dramatically—particularly in Japan, where increasing sales slashed solar power prices an average of 7% a year between 1992 and 2003, according to the International Energy Agency. Still, prices must drop another 10- to 100-fold to make solar not just competitive with other sources of electricity but cheap enough to be used to produce transportation fuels and provide home heating. In hopes of bringing about those and related changes, the new Department of Energy report identifies 13 priorities for solar energy research. Among them:

    Revolutionary photovoltaic designs. Standard solar panels can turn at most one-third of the energy in the photons that strike them into electric current. Some of those photons have too little energy to excite electrons in the solar cells, and others have extra energy that just generates heat. Recent lab studies indicate that it may be possible to capture some of the high-energy strays using nano-sized lead-based particles that generate more than one electron from an incoming photon. But the technique has yet to be demonstrated in a working solar cell.
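    Some rough numbers (an illustration for a silicon cell, not figures from the report) show where that energy goes. A photon's energy depends on its wavelength, and only photons energetic enough to bridge silicon's roughly 1.1-electron-volt band gap can free an electron:

    \[
    E_{\gamma} = \frac{hc}{\lambda} \approx \frac{1240\ \mathrm{eV\cdot nm}}{\lambda},
    \qquad
    \lambda_{\mathrm{cutoff}} \approx \frac{1240\ \mathrm{eV\cdot nm}}{1.1\ \mathrm{eV}} \approx 1100\ \mathrm{nm}.
    \]

    Photons redder than about 1100 nanometers pass through unused, while a blue photon near 450 nanometers delivers about 2.8 electron volts but yields only the band-gap energy, the rest ending up as heat. Summed over the solar spectrum, those two losses account for most of the one-third ceiling that the multiple-electron schemes aim to breach.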

    “Plastic” cells. Solar cells made from organic materials, including cheap high-volume polymers, have the potential to drastically reduce the cost of solar electricity. But current versions suffer from low efficiency, as most convert less than 2% of solar energy into electricity. New materials and device designs could change that equation.

    Flex time.

    Reel-to-reel manufacturing could slash the cost of plastic cells.


    Nanotechnology. Although crystalline solar cells can reach efficiencies of about 30%, producing the crystalline silicon in the first place is energy-intensive and expensive. Solar cell makers have begun using cheap chemical manufacturing techniques to create nano-sized semiconductor crystals and to incorporate them into solar cells. These cells are typically far cheaper to make, but for now their efficiency is stuck at about 10% or less. Researchers might be able to boost that efficiency if they can find ways to organize the nanoparticles to ferry excited electrons out of the cells.

    From air and water to fuel. Sunlight can be used to split water molecules into hydrogen and oxygen; the hydrogen gas can be stored, transported through pipelines, and used either to fuel vehicles or to generate electricity. But here too efficiency is a problem. The catalysts used to split water absorb only a couple of percent of the energy in the sunlight that hits them, and in many cases they are unstable in practical settings. That could change if researchers could find new high-efficiency, stable catalysts to do the job. Equally promising would be high-efficiency catalysts capable of using solar energy to convert carbon dioxide from the air into energy-rich hydrocarbon fuels.
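    The thermodynamics help explain why (standard electrochemistry, not numbers from the report). Per mole of hydrogen produced, splitting water requires about 237 kilojoules of free energy, delivered through two electrons:

    \[
    2\,\mathrm{H_{2}O} \;\rightarrow\; 2\,\mathrm{H_{2}} + \mathrm{O_{2}},
    \qquad
    E^{\circ} = \frac{\Delta G^{\circ}}{nF} = \frac{237{,}000\ \mathrm{J\,mol^{-1}}}{2 \times 96{,}485\ \mathrm{C\,mol^{-1}}} \approx 1.23\ \mathrm{V}.
    \]

    Practical systems need extra driving voltage, or overpotential, on top of that 1.23 volts, which is one reason today's catalysts capture only a few percent of the energy in sunlight.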

    Solar concentrators. Large banks of reflectors that concentrate large amounts of sunlight onto a single photovoltaic cell already produce the lowest-cost solar electricity. Researchers are also looking at related designs to split water to create hydrogen gas, or to strip hydrogen gas from fossil fuels, while sequestering the carbon. To be most efficient, such reactors must concentrate enough sunlight to reach 2000 kelvin. But such high temperatures cause heat shocks that break down the ceramic materials in the chemical reactors. New heat-resistant ceramics could help lower the cost of sunlight-derived fuels.
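    A closing note on the cost gap cited at the top of this sidebar: taking the 7%-a-year price decline at face value (a rough extrapolation, not a projection from the report), compounding shows how long a 10- to 100-fold drop would take at the historical pace:

    \[
    n_{10\times} = \frac{\ln 10}{-\ln(1-0.07)} \approx 32\ \text{years},
    \qquad
    n_{100\times} = \frac{\ln 100}{-\ln(1-0.07)} \approx 63\ \text{years}.
    \]

    Roughly three decades for the low end of the range and more than six for the high end, which helps explain the report's emphasis on research that could change the underlying technology rather than waiting on incremental manufacturing gains.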


    A Powerful First KiSS-1

    1. Gretchen Vogel

    Puberty researchers are finding that the protein kisspeptin and its receptor are central to sexual maturation

    Both anticipated and dreaded, puberty is rarely fun. From swelling breasts and sprouting hair to cracking voices and unexpected urges, this transition is almost always awkward, especially if puberty comes earlier or later than normal. It is a rare teenager who has not wondered, “Why is this happening to me?”

    The body's awakening into sexual maturity is no less puzzling for developmental biologists and endocrinologists. And they have an equally straightforward question: How does the body know when, exactly, to unleash the cascade of hormones that change face, voice, height, bone structure, and sexual organs into those of a fertile adult? The emerging answer, it seems, could have come from a teenage romance novel: Puberty starts with a kind of kiss.

    Recent studies have shown that a protein called kisspeptin is a key trigger of the complex chain of physiological reactions that readies the body for sexual maturity. Without this signal, people, as well as mice and other mammals, stay in a preteen limbo and never fully grow up. Discovering the involvement of kisspeptin and its receptor, a protein called GPR54, in puberty “is a major breakthrough in reproductive physiology,” says Manuel Tena-Sempere of the University of Cordoba in Spain. Indeed, the duo was one of the most-discussed topics at a recent meeting on the control and onset of puberty.*

    Scientists hope the two proteins might help them solve long-standing puzzles about the start of puberty, such as how the body revives the hormone production that is prevalent in fetal and newborn development but then mysteriously disappears during childhood, and how puberty might be influenced by nutrition and other metabolic factors. Preliminary evidence suggests, moreover, that the protein pair may even play a lifelong role in regulating sex hormones and reproduction.

    The topic is more than academic. For some children, puberty doesn't happen at the right time: Girls who start to develop breasts and pubic hair as young as age 6, and boys who still sing soprano at 17, often end up at the pediatrician's office looking for answers. Although the physical consequences of being an early or late bloomer remain unclear, the social consequences can be significant. Boys who develop late may face brutal taunting because of their small stature and underdeveloped muscles. And early-developing girls “have higher rates of depression, substance abuse, and teenage pregnancies,” Pierre-André Michaud, a specialist in adolescent medicine at the University of Lausanne in Switzerland, said at the meeting. Consequently, physicians are eager to understand how puberty is controlled and whether they can, or should, safely delay or accelerate it in certain cases.

    Are you ready?

    A protein called kisspeptin helps trigger the flood of hormones that marks puberty.


    KiSS-1-ng partner

    It was GPR54, not kisspeptin, that appeared first as a player in puberty. The initial clue was a 20-year-old man in Paris who had undeveloped testes, sparse pubic hair, and the bone maturity of a 15-year-old; such lack of sexual development is called idiopathic hypogonadotropic hypogonadism (IHH). Doctors soon discovered that the man was not the only one in his family to fail to complete puberty: Three of his four brothers were similarly affected, and one of his two sisters had experienced only a single menstrual period in her life—at age 16. All had abnormally low levels of sex hormones.

    It turned out that the parents of this family were first cousins and, as a team led by Nicolas de Roux of INSERM in Paris reported in 2003, both mother and father carried a mutation in one copy of their GPR54 gene. The affected children had all inherited two mutated copies of the gene. Other researchers had shown that GPR54 acts as a receptor for kisspeptin, so de Roux and his colleagues suggested that the molecular embrace between the two proteins might be a player in the first steps of puberty.

    A month after de Roux's paper was published, that suggestion got a major boost. Stephanie Seminara, Yousef Bo-Abbas, and William Crowley of Harvard Medical School in Boston and their colleagues reported that six members of a large Saudi Arabian family, all diagnosed with IHH, also had mutations in their GPR54 genes. They also found that an unrelated patient with IHH carried mutations in both his copies of the gene. In the same paper, researchers from Paradigm Therapeutics in Cambridge, U.K., reported that mice lacking the GPR54 gene also failed to go through the rodent version of puberty.

    Scientists at the time knew very little about GPR54. They knew its gene was expressed in the brain and the placenta, and they knew the protein was a receptor for kisspeptin, which is encoded by a gene called KiSS-1. KiSS-1, on the other hand, was fairly well known, but not among endocrinologists. The gene was discovered by cancer researchers at Pennsylvania State College of Medicine in Hershey, Pennsylvania, who noticed that it played a role in the ability of tumor cells to move and metastasize. (The romantic connection to puberty is accidental: The researchers named the gene for the famous Hershey chocolate drops.)

    Because of KiSS-1's known role in cell motility, scientists initially thought that the kisspeptin-GPR54 pairing might influence puberty by directing so-called GnRH neurons to the correct part of the brain. GnRH neurons were identified more than 3 decades ago as the source of gonadotropin-releasing hormone (GnRH), a brain chemical that prompts the pituitary gland to produce follicle-stimulating hormone and luteinizing hormone (LH). Those signals in turn stimulate production of sex hormones such as estrogen and testosterone in the ovaries and testes.

    Kallmann syndrome, another condition in which patients fail to go through puberty, is caused by the improper migration of GnRH neurons during fetal development, so researchers wondered whether a similar problem affected IHH patients with GPR54 mutations. But subsequent studies have shown that GnRH neurons are present in the correct place and quantity in the GPR54-knockout mice.

    Instead, the mutations may prevent the release of GnRH; GnRH neurons express GPR54 receptors, and their activation by kisspeptin prompts the cells to release their hormonal signal. In cell-based assays, kisspeptin “is one of the most powerful activators of GnRH neurons ever seen,” says Robert Steiner of the University of Washington, Seattle. And in February, endocrinologist Tony Plant of the University of Pittsburgh in Pennsylvania reported in the Proceedings of the National Academy of Sciences that within 30 minutes of injecting juvenile male rhesus monkeys with kisspeptin, the animals' levels of LH increased 25-fold.

    Leading lights.

    The neurons that express the KiSS-1 gene (white dots) cluster in a region of the hypothalamus known to respond to sex hormones.


    Puberty's puzzles

    Those results solidify the fundamental role of kisspeptin and GPR54 in puberty's onset, but they are not the whole story. “I'm not sure this is the discovery of the Holy Grail for puberty,” Steiner says. “You need to have this circuit operating for sure, but the conclusion that this is the ultimate switch for puberty is probably premature.”

    A missing link, for example, is what turns the circuit on. Steiner and neuroendocrinologist Allan Herbison of the University of Otago in Dunedin, New Zealand, are studying the neurons that produce the protein to find out what signals influence them. One of the most intriguing ideas is that kisspeptin might be connected to the hormone leptin: Steiner said at the meeting that he has preliminary evidence that at least half of the neurons that express KiSS-1 also carry receptors for leptin.

    A few years ago, many scientists thought that leptin, which is produced by fat cells, was the key puberty trigger, providing a way for the body to delay sexual maturation until it has enough stored energy to support reproduction. Women who become too thin, for example, become infertile and stop having periods. And people and mice with mutations in the genes coding for leptin or its receptor are infertile, apparently because of a failure to go through puberty. But further research failed to turn up direct connections between leptin and GnRH neurons.

    There's early evidence that kisspeptin may help mediate such a connection. In the June issue of Endocrinology, Tena-Sempere and his colleagues report that rats kept on a restrictive diet produce less messenger RNA (mRNA) from KiSS-1, consistent with the idea that the gene responds to leptin and other hormones that signal the body's nutritional status. They also found that administering kisspeptin to underfed juvenile rats could jump-start their delayed puberty, perhaps bypassing the need for leptin to reach some puberty threshold.

    The KiSS-1 neurons, Steiner says, may integrate signals from a wide variety of body systems, responding to cues such as how much food is available, the time of day, and even the season of the year. The connection may sound surprising, but researchers have long known that GnRH and other reproductive hormones follow a daily rhythm and that the first hormone surges of puberty tend to occur at night. Steiner says he and his colleagues are looking for connections between KiSS-1 neurons and the brain's circadian clock to see if they might link the circadian and reproductive systems. But such work is still speculative. “The KiSS-1 neuron is far from characterized,” de Roux cautions.

    There is also evidence that the kisspeptin-GPR54 signal helps regulate reproduction long past the first stirrings of puberty. Steiner and his colleagues reported online in the 26 May issue of Endocrinology that KiSS-1 neurons in the mouse brain carry estrogen receptors and that levels of KiSS-1 mRNA in the brains of adult mice are modulated by injections of the hormone. And a group led by Keiichiro Maeda of Nagoya University in Japan reported online 23 June in Endocrinology that when they used antibodies to block the kisspeptin-GPR54 signal in adult female rats, the LH surge that triggers ovulation didn't occur. “It is not just a switch that is activated once,” Tena-Sempere says. It seems that, like the best kisses, KiSS-1 has long-lasting consequences.

    • *6th Puberty Conference, Evian, France, 26-28 May.


    Europe Joins Forces in Push for Monster Scope Project

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Amersfoort, the Netherlands.

    European astronomers want to leapfrog current technology to make a telescope 10 times as wide as today's largest. But do they have the know-how or the unity?

    DWINGELOO, THE NETHERLANDS—The European Union wants its scientific enterprise to be second to none. At a meeting* here earlier this month, E.U. officials joined a chorus of researchers who want Europe's disparate national astronomy communities to work together in continent-wide organizations. As an example of what could be gained, researchers reported on their grand vision: a gargantuan telescope sporting a mirror 50 to 100 meters across that could be gazing skyward in 10 years. At the meeting, researchers presented a study that lays out the scientific case for the European Extremely Large Telescope (ELT), an effort supported in part by the E.U., and over the next 3 years they will carry out a detailed design study, again with E.U. help.

    “Europe is already a world leader” in ground-based optical and infrared astronomy, says Gerry Gilmore of Cambridge University in the U.K. “We aim to stay there.” The continent owes its status partly to one pan-European success story: the European Southern Observatory (ESO), which operates several top-rank scopes in Chile. But some astronomers think the Europeans may be overreaching themselves with the ELT; U.S. astronomers will first move from today's 10-meter scopes to something around 20 or 30 meters. “Understandably, Europe doesn't want to be left behind,” says Richard Ellis, director of the Caltech Optical Observatories in Pasadena, “but they could avoid that by building a second 30-meter telescope instead.”

    Ever since Galileo observed the heavens through his home-built spyglass some 4 centuries ago, telescope sizes have doubled every 50 years or so. The current record holders are the two 10-meter Keck telescopes at Mauna Kea, Hawaii. But the ELT's Science Working Group, chaired by Isobel Hook of the University of Oxford, U.K., described plans to break the trend by making the most dramatic leap in the history of telescopic astronomy. If built, the ELT's segmented mirror will be larger than all previous professional telescope mirrors combined.

    Roberto Gilmozzi of ESO in Garching, Germany, who will coordinate the ELT design study, says detector technology has improved so rapidly over the past decade that “we now need bigger telescopes” to take full advantage of it. A 100-meter “Overwhelmingly Large Telescope,” with adaptive optics to compensate for atmospheric turbulence, could detect stars 5 trillion times fainter than the naked eye can see. It would also have a resolving power of about a milliarcsecond—enough to discern a dime 3500 kilometers away.
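    That resolving power is consistent with simple diffraction-limit arithmetic (an illustration, assuming visible light at 550 nanometers and adaptive optics good enough to reach the theoretical limit):

    \[
    \theta \approx 1.22\,\frac{\lambda}{D} = 1.22 \times \frac{550\times10^{-9}\ \mathrm{m}}{100\ \mathrm{m}} \approx 6.7\times10^{-9}\ \mathrm{rad} \approx 1.4\ \mathrm{milliarcseconds};
    \]

    a dime, about 18 millimeters across, viewed from 3500 kilometers subtends roughly 5 billionths of a radian, or about 1 milliarcsecond.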

    Giant leap.

    The proposed Overwhelmingly Large Telescope would boast a 100-meter mirror.


    According to Hook's 140-page report laying out the science behind the project, the ELT's main targets will be exoplanets, galactic evolution, and cosmology. The telescope could detect Earth-like planets circling other stars out to a distance of 75 light-years, says Gilmozzi. Spectroscopic studies of such planets might find possible signs of life such as atmospheric oxygen. Current telescopes can't distinguish individual stars in distant galaxies. But the ELT would be able to, and logging millions of stars across many galaxies would provide information on the origin and evolution of these vast stellar assemblies. The monster telescope should also be able to look far enough into space, and hence back in time, to learn more about the universe's first light and the mysterious dark energy that is accelerating cosmic expansion.

    U.S. astronomers and telescope builders are keeping a close eye on European plans. They themselves are designing and building several telescopes up to 30 meters wide. “Fifty to 100 meters is pretty gutsy and could lead to unfortunate technical choices because of lack of experience at intermediate size,” says Roger Angel of the University of Arizona's Mirror Laboratory in Tucson, where the first mirror is currently being cast for the 21-meter Giant Magellan Telescope. Ellis agrees. “It's a big leap,” he says.

    To orchestrate the effort, in 2004 the E.U. funded the creation of OPTICON (Optical Infrared Coordination Network for Astronomy). The network now consists of 47 groups in 19 countries. Representatives of several similar nascent pan-European collaborations also attended the Dwingeloo meeting. E.U. Research Commissioner Janez Potocnik told them that the new research infrastructures are key to Europe's research future. “They will bring us even closer to answering some of the most fundamental questions that mankind has ever asked,” he says. Potocnik acknowledges that the currently stalled negotiations over the E.U. budget mean that “Europe is in crisis” (Science, 10 June, p. 1530), but he stresses that building a knowledge society is essential for the future of the continent. “We need these decisions now,” he says. Even if a diminished budget scuppers the E.U.'s grand plans, astronomers seem confident that the ELT will be built. “ESO alone would be able to finance a 40- to 60-meter telescope,” says Gilmozzi.

    In the end, budget considerations may force astronomers on both sides of the Atlantic to work together. According to Ellis, the United States would be very interested in becoming a partner in a global effort to construct a 50- to 100-meter telescope, provided 30-meter instruments are built first. But right now, European bravado makes this scenario seem unlikely. “We'll certainly not go back to 30 meters,” says Gilmozzi.

    • * Astronomy Looks Into the Future—The Role of European Infrastructures, Dwingeloo, the Netherlands, 7 July 2005.
