News this Week

Science  11 Feb 2005:
Vol. 307, Issue 5711, pp. 203


    NIH Chief Clamps Down on Consulting and Stock Ownership

    1. Jocelyn Kaiser

    Under intense pressure from outside, National Institutes of Health (NIH) Director Elias Zerhouni last week issued unexpectedly strict ethics rules intended to “[preserve] the public's trust” in the agency. Congressional critics, who have been troubled by revelations about apparent conflicts of interest among some senior NIH scientists, praised the new rules, but many NIH staff members are outraged, calling them punitive and draconian. Under the rules, which took effect on 3 February, all NIH staff are barred from outside paid or unpaid consulting for drug and medical companies and even nonprofit organizations. All 17,500 employees will also have to sell or limit their stock in biotech and drug companies.

    As recently as December, Zerhouni said some industry consulting should be allowed. But pressures from Congress and the difficulty of forging workable rules led him to decide on a “clean break.” As he told employees on 2 February, “this issue was standing between the prestigious history of NIH and its future.” Leaders in the biomedical research community say the harsh steps were unavoidable as questions continued to arise about a handful of intramural researchers who apparently violated existing ethics rules. “The ground was cut out from under Zerhouni. His hand was forced because of the behaviors of a few,” says David Korn, a former medical school dean at Stanford University who is now a senior vice president at the Association of American Medical Colleges (AAMC).

    Clean slate.

    Zerhouni decided only a ban on consulting could resolve past problems.


    Zerhouni concedes that he had no choice. Ethics concerns first surfaced in 2003 when Congress inquired about cash awards received by then-National Cancer Institute (NCI) director Richard Klausner. NCI ethics officials had approved the awards, but a House subcommittee suggested that gifts from grantee institutions posed a conflict of interest. Then came a December 2003 article in the Los Angeles Times reporting that several top scientists had received hundreds of thousands of dollars in payments from industry and raising questions about conflicts of interest. (Former director Harold Varmus had loosened the rules on consulting in 1995 to make them more consistent with those of academia and help recruit talent to the intramural program.)

    As Congress began investigating, Zerhouni conferred with an outside panel and proposed new limits. But last summer more problems arose: According to data from drug companies, several dozen employees hadn't told NIH about their consulting activities. “It was like getting shot in the back by your own troops,” says Zerhouni.

    He then proposed a 1-year moratorium on consulting, but again, more concerns emerged in the press: Some researchers were apparently paid to endorse particular drugs. The final straw came when the Senate appropriations committee suggested that failure to take strong measures could become “a basis for a cut” in NIH's budget, Zerhouni says.

    The new rules, a 96-page Department of Health and Human Services (HHS) interim regulation, ban all compensated and uncompensated consulting or speaking for drug, biotech, and medical-device companies, health care providers, institutions with NIH grants, and even professional societies (see table). NIH scientists can still receive payments for teaching courses, editing and writing for peer-reviewed publications, and practicing medicine. NIH researchers can also continue some activities, such as serving on a society's board, if their supervisor approves it as official duty.


    The rules also prohibit senior staff members who file public or confidential financial disclosure reports—about 6000, including many researchers—and their spouses and minor children from owning biotech or drug company stock, a rule followed only by regulatory agencies such as the Food and Drug Administration and the Securities and Exchange Commission. Other NIH employees, such as secretaries and technicians, can keep no more than $15,000 in related stock. The rules restrict cash scientific awards to $200, except for employees below senior level who have no business dealings with the donor. (There is an exception for a few awards such as the Nobel Prize.) So far only about 100 awards have been deemed “bona fide.”

    Although HHS will collect comments on the “interim” regulation for 60 days and evaluate it after a year, the rules will stand unless they are clearly harming recruitment and retention, Zerhouni said. NIH employees will have just 30 to 90 days to end their outside activities and up to 150 days to divest stock.

    At an employee meeting last week, staff members reacted angrily to the rules, which one researcher described as “throwing the baby out with the bathwater.” The edict to sell off stock, in particular, has hit a nerve: With stock prices low, it could cause “potentially irreparable financial harm,” warned one lab chief who, like others, asked a reporter not to use his name. Others questioned the rationale. NIH Deputy Director Raynard Kington responded that although NIH, unlike FDA, does not regulate companies, its “influence [has become] substantial,” citing a drop in the market in December after two large NIH trials using COX-2 inhibitor painkillers were halted for safety reasons.

    Scientific groups outside NIH, such as AAMC, generally support the new rules—with caveats. “The nuances and consequences must be watched very, very carefully,” says Korn. The Federation of American Societies for Experimental Biology (FASEB) expressed concerns about recruiting as well as possible limits on participating in scientific societies. “It would be a serious loss if those activities were completely curtailed,” said FASEB president Paul Kincade.

    Thomas Cech, president of the Howard Hughes Medical Institute in Chevy Chase, Maryland, worries that the rules could undermine Zerhouni's goal of translating research into cures. “Medical uses require commercialization. It's not something to be ashamed about. The key thing is to manage to avoid conflict of interest,” Cech says.

    The new rules seem “like a heavy-handed solution,” says Varmus, now president of Memorial Sloan-Kettering Cancer Center in New York City. But thanks to other reforms in the 1990s, “the intramural program is strong, and it can survive,” he says. Top scientists will still be attracted to NIH, where they are protected from the vagaries of winning grants in a tight budget climate, he says. “The people who just want to do science will still come here,” agrees Robert Nussbaum, a branch chief at the genome institute. But exactly what NIH will look like under some of the most stringent ethics rules in the federal government may not become apparent for several years.


    NIH Wants Public Access to Papers 'As Soon As Possible'

    1. Jocelyn Kaiser

    Ending months of uncertainty, National Institutes of Health (NIH) Director Elias Zerhouni last week unveiled a policy aimed at making the results of research it funds more freely available. But the announcement has injected a new element of controversy into an already bitter debate. Zerhouni is asking NIH-funded researchers to send copies of manuscripts that have been accepted for publication to a free NIH archive. Researchers will specify when the archive can make them publicly available, but NIH wants that to be “as soon as possible (and within 12 months of the publisher's official date of final publication).” That language has stirred worries that NIH is putting authors on the spot by asking them to challenge publishers' own release dates.

    The “public access” policy emerges from a major battle last year. At the request of Congress, NIH in September asked for comment on a proposal to urge its grantees to submit copies of their research manuscripts for posting on NIH's PubMed Central archive 6 months after publication. NIH argued that this would increase public access to research and help it manage research programs. Supporting this plan were librarians, patient advocates, and some scientists who feel that journal prices are too high and that access to research articles should be free. In the other corner, publishers said that free access so soon after publication could bankrupt them and inflict damage on scientific societies dependent on journal income.

    After collecting more than 6000 comments from both sides, Zerhouni on 3 February issued a final policy* that states NIH will wait up to 1 year to post the papers, although it “strongly encourages” posting “as soon as possible.” This “flexibility” will help protect publishers who believe earlier posting will harm revenues, he says. Norka Ruiz Bravo, NIH deputy director for extramural research, expects that authors “will negotiate” the timing with the publisher rather than relying on the publisher's policy for when articles can be posted. NIH will not track compliance or make public access a condition of accepting an NIH grant, she says: “We have no plans to punish anybody who doesn't follow the policy.”

    The policy applies only to original research manuscripts, and authors will send in the final peer-reviewed version accepted for publication. If the author wishes, PubMed Central will incorporate subsequent copy-editing changes to avoid having two slightly different versions of the paper. Alternatively, publishers can have NIH replace the manuscript in PubMed Central with the final published paper.

    Authors vs. publishers?

    NIH's Ruiz Bravo urges authors to ask publishers to allow speedy free access to articles.


    NIH didn't attempt an economic analysis of the impact on journals, Ruiz Bravo says, because that “would be a major thing.” However, the agency argues that because NIH-funded papers make up only 10% of the biomedical research literature, the policy won't put journals out of business; NIH promises to track the impact of the policy through a new advisory group.

    Neither side seems satisfied. A group of nonprofit publishers called the D.C. Principles Coalition argues that the $2 million to $4 million per year that NIH estimates it will cost to post 60,000 papers is an unnecessary expense because most nonprofit journals already make papers publicly available in their own searchable archives after a year. “We're concerned about the waste of research dollars,” says Martin Frank, executive director of the American Physiological Society in Bethesda, Maryland. Frank also argues that the plan would infringe journals' copyright, and it might not stand up to a legal challenge.

    For their part, open-access advocates aren't happy about the “voluntary” aspect or the 12-month timeframe. Whether articles will become available any sooner than they are now “is a big ‘if,’” says Sharon Terry, president of the Genetic Alliance and an organizer of the Alliance for Taxpayer Access in Washington, D.C. The request that authors try to have their papers posted as soon as possible puts them “in the untenable position” of trying to please both NIH and their publishers, says the Alliance for Taxpayer Access.

    The only group that seems pleased with the wording is the Public Library of Science (PLoS) in San Francisco, California, which charges authors publication costs and then posts papers immediately upon publication. “We have influence here,” says PLoS co-founder Harold Varmus, president of Memorial Sloan-Kettering Cancer Center in New York City. “The journal may say 12 months, but the journal also wants [the] paper. Researchers are going to be voting with their feet.”

    But that assertion assumes researchers will feel strongly enough to raise the issue with publishers. Virologist Craig Cameron of Pennsylvania State University, University Park, says he will likely rely on the publisher's existing policy even if it's 12 months. “With everything I have to think about on a daily basis, it's not something I would spend a lot of time on,” he says. Authors will be asked to send their manuscripts to NIH starting 2 May.


    Ginseng Threatened by Bambi's Appetite

    1. Erik Stokstad

    With few natural predators left, deer are running rampant across much of eastern North America and Europe. In addition to damaging crops, raising the risk of Lyme disease, and smashing into cars, white-tailed deer are eating their way through forests. “This is a widespread conservation problem,” says Lee Frelich of the University of Minnesota, Twin Cities. Indeed, on page 920, a detailed, 5-year forest survey of ginseng reveals that deer, if not checked, will almost certainly drive the economically valuable medicinal plant to extinction in the wild.

    The survey was conducted by James McGraw, a plant ecologist at West Virginia University in Morgantown, and his graduate student Mary Ann Furedi. Ginseng is one of the most widely harvested medicinal plants in the United States; in 2003, 34,084 kilograms were exported, mainly to Asia, where wild ginseng root fetches a premium. Although the plant (Panax quinquefolius) ranges from Georgia to Quebec, it is slow-growing and scarce everywhere.

    To determine the population trends of ginseng, McGraw and Furedi began a census in West Virginia forests. For 5 years, they checked seven populations of wild ginseng every 3 weeks during the spring and summer. They quickly noticed that plants were disappearing. In some places, all of the largest, most fertile plants were gone by mid-August. At first they suspected ginseng harvesters, but the valuable roots were left. Cameras confirmed that deer were at work. The nibbled plants are less likely to reproduce, and after repeated grazing, they die. Indeed, during the study, populations declined by 2.7% per year on average.

    Oh deer.

    Deer are eating their way through too much ginseng (right).


    McGraw and Furedi then ran a ginseng population viability analysis. By plugging in the sizes of plants in various populations, mortality rates, and other factors, they learned that current ginseng populations must contain at least 800 plants in order to have a 95% chance of surviving for 100 years.

    That's bad news. A broader survey they conducted of 36 ginseng populations across eight states revealed that the median size was just 93 plants and the largest was only 406 plants. At the current rate of grazing, all of these populations “are fluctuating toward extinction,” McGraw concludes. Even the biggest population has only a 57% chance of surviving this century.
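A population viability analysis of this kind boils down to repeated stochastic projection: simulate many possible futures for a population and count how often it falls below a quasi-extinction threshold. The sketch below is a minimal toy version; the growth-rate variability, threshold, and trial counts are invented for illustration, not taken from McGraw and Furedi's fitted model, so its probabilities will not reproduce the paper's figures.

```python
import random

def extinction_prob(n0, years=100, mean_growth=-0.027, sd=0.05,
                    quasi_extinct=10, trials=2000, seed=1):
    """Toy Monte Carlo population viability analysis.

    Each year the population is multiplied by a random growth
    factor (mean decline of 2.7%/yr, the rate observed in the
    survey, plus invented year-to-year noise). A trial counts as
    extinct if the population ever drops below the threshold.
    """
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        n = n0
        for _ in range(years):
            n *= 1.0 + rng.gauss(mean_growth, sd)
            if n < quasi_extinct:
                extinct += 1
                break
    return extinct / trials

# Larger starting populations buffer against runs of bad years:
for n0 in (93, 406, 800):
    print(n0, "survival prob:", round(1 - extinction_prob(n0), 2))
```

The qualitative pattern matches the study's conclusion: survival probability rises steeply with starting population size, which is why the median wild population of 93 plants is in far more trouble than a hypothetical 800-plant one.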

    “This paper has high significance because it's one of the first demonstrations of the direct impact of deer browsing on understory plants,” says Daniel Gagnon of the University of Quebec, Montreal. And deer eat more than ginseng. “We could lose a lot of understory species in the next century if these browsing rates continue,” McGraw says. That in turn could affect birds, small mammals, and other wildlife that rely on these plants.

    McGraw and Furedi calculate that browsing rates must be cut in half to guarantee a 95% chance of survival for any of the 36 ginseng populations they surveyed. That has direct management implications, says Donald Waller of the University of Wisconsin, Madison. “We should be encouraging the recovery of large predators like wolves. It also suggests we should be increasing the effectiveness of human hunting” by emphasizing the killing of does rather than bucks, he adds. Such deer-control measures are controversial: Reintroduction of predators like wolves faces logistical as well as political hurdles, for example. Meanwhile, the deer keep munching.


    Millennium's Hottest Decade Retains Its Title, for Now

    1. Richard A. Kerr

    The scientific consensus that humans are warming the world stands on three legs, one of which has been getting a workover lately. For a decade, paleoclimatologists have combed through temperature records locked in everything from ancient tree rings to ice cores, yet they've failed to find a natural warming in the past 1000 years as big as that of the past century. That implied that humans and their greenhouse gases were behind the recent warming, as did computer studies of warming patterns and the trend of 20th century warming. But in a soon-to-be-published Geophysical Research Letters paper, two researchers attack the recent warming reflected in an iconic paleoclimate record as an artifact of a programming error.

    Even as greenhouse skeptics revel in what they presume is the downfall of one of global warming's most prominent supports, paleoclimatologists have come up with yet another analysis. In a paper published this week in Nature, Swedish and Russian researchers present their first entry in the millennial climate sweepstakes. They consider new sorts of measurements and apply a different analytical technique to the data. Their conclusion: Even the surprisingly dynamic climate system doesn't seem to have produced a natural warming as large as that of the past century. “The past couple of decades are still the warmest of the past 1000 years,” says climate researcher Philip Jones of the University of East Anglia in Norwich, U.K.

    The millennial climate debate has revolved around the “hockey stick” record published in Nature by statistical climatologist Michael Mann of the University of Virginia, Charlottesville, and his colleagues in 1998 and revised and extended in 1999. He and his colleagues started with 12 temperature records extracted from, among other things, the width of tree rings, the isotopic composition of ice cores, and the chemical composition of corals—so-called proxies standing in for actual measurements of temperature. They compiled the proxy records and calibrated them against temperatures measured by thermometers in the 20th century. The result was the “hockey stick” curve of Northern Hemisphere temperature over the past millennium. Temperature declined slowly during most of the millennium, creating the long, straight handle of the stick, before rising sharply beginning in the mid-19th century toward the heights of the 1990s, forming the tip of the upturned blade of the stick. Those temperatures handily exceed any temperature of the past millennium.
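The calibration step at the heart of such reconstructions can be illustrated with a toy single-proxy version (Mann's actual method combines many proxies with principal-component statistics; all numbers below are invented): fit the proxy to thermometer data where the two overlap, then run the fit backward in time.

```python
def linfit(x, y):
    """Ordinary least squares fit y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Synthetic proxy (say, a tree-ring width index) and the "true"
# temperature anomalies that generated it -- purely illustrative.
true_temp = [-0.3, -0.2, -0.25, -0.1, 0.0, 0.1, 0.25, 0.4]
proxy     = [2.0 * t + 1.0 for t in true_temp]  # linear response

# Calibrate against the "instrumental era" (last four points)...
a, b = linfit(proxy[4:], true_temp[4:])

# ...then reconstruct the pre-instrumental temperatures.
reconstruction = [a + b * p for p in proxy[:4]]
```

Because the toy proxy responds perfectly linearly, the reconstruction recovers the early temperatures exactly; the real statistical debate is over what happens when proxies respond noisily, nonlinearly, or (as charged for the bristlecone pines) to something other than temperature.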

    Still no equal.

    Temperature records recovered from tree rings and other proxies broadly agree that no time in the past millennium has been as warm as recent decades (black).


    Two researchers are now saying that the millennial curve doesn't resemble a hockey stick at all. In their latest paper, Stephen McIntyre of Toronto, Canada, a mineral-exploration consultant, and economist Ross McKitrick of the University of Guelph, Canada, make two charges. They claim that “what is almost certainly a computer programming error” in the statistical technique used by Mann and colleagues causes a single record—from ancient bristlecone pine trees of the western United States—to dominate all other records. And the bristlecone pines had a late growth spurt apparently unrelated to rising temperatures, they say. They also charge that Mann's techniques create the appearance of statistical significance in the first half of the millennium where none exists. When McIntyre and McKitrick kicked off a publicity campaign late last month, greenhouse contrarians were gleeful.

    Mann calls the McIntyre and McKitrick charges “false and specious.” He has been parrying their claims since they responded to his 1998 paper with what he says was an analysis of an inadvertently corrupted data set. The bottom line from the latest go-round, Mann says, is that the same hockey stick appears whether he uses his original technique, variations on it, or a completely different methodology. Observers have been slow to wade into such turbid statistical waters, citing instead the other half-dozen paleoclimate studies employing a variety of data analyzed using two different types of methodologies. McIntyre, however, sees far too much overlap among analysts and data sets and perceives far too many problems in analyses to be impressed.

    Now comes a joint Swedish-Russian effort that clearly breaks away from the pack. Climate researcher Anders Moberg of the University of Stockholm, Sweden, and his colleagues have not participated in previous millennia analyses. Tree rings don't preserve century-scale temperature variations very well, so they added 11 proxy records ranging from cave stalagmites in China to an ice core in northern Canada. They also used a wavelet transform technique for processing the data, a new approach in millennial studies.
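The idea behind the wavelet approach is to separate each record into frequency bands, so that slow-responding proxies supply the century-scale trend while tree rings supply the year-to-year detail. A one-level Haar split on invented numbers is a much-simplified sketch of that band-combining idea, not Moberg's actual multi-scale analysis:

```python
def haar_split(x):
    """One level of a Haar wavelet transform: pairwise averages
    (low-frequency band) and differences (high-frequency band)."""
    low  = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    high = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return low, high

def haar_merge(low, high):
    """Inverse transform: rebuild a series from the two bands."""
    out = []
    for l, h in zip(low, high):
        out += [l + h, l - h]
    return out

# Two toy series for the same years: tree rings capture the
# year-to-year wiggles, lake sediments the slow trend.
tree_rings = [0.10, -0.10, 0.30, 0.10, 0.60, 0.40, 0.90, 0.70]
sediments  = [0.05,  0.05, 0.15, 0.15, 0.45, 0.45, 0.85, 0.85]

low_t, high_t = haar_split(tree_rings)
low_s, high_s = haar_split(sediments)

# Combine the sediment trend with the tree-ring variability:
combined = haar_merge(low_s, high_t)
```

Splitting `combined` again returns the sediment low band and the tree-ring high band, which is the property that lets each proxy type contribute only at the timescales it records reliably.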

    Moberg and his colleagues found that temperatures around the hemisphere fell farther during the Little Ice Age of the 17th century than in Mann's reconstruction and rose higher in medieval times. The medieval warmth equaled that of most of the 20th century, but it still did not equal the warmth of 1990 and later.

    Moberg's result is only the latest to suggest that the handle of “the hockey stick is not flat,” says paleoclimatologist Thomas Crowley of Duke University in Durham, North Carolina. “It's more like a boomerang,” he notes. The near end still sticks up—albeit less dramatically—above all else of the past 1000 years.


    With a Stumble, Microsoft Launches European Research Project

    1. Gretchen Vogel

    The Microsoft Corp. is about to increase its research presence in Europe. On 2 February, company Chair Bill Gates told a meeting of government leaders in Prague that Microsoft plans to fund several research centers, graduate scholarships, and scientific meetings across Europe, focusing on the interface between computer science and biology, agriculture, and engineering. The venture has been widely welcomed, except for one problem: Its name, the EuroScience Initiative, is already taken.

    The initiative's first site will be the Center for Computational and Systems Biology in Trento, Italy. The center will receive up to €15 million over the next 5 years, 60% from national and local governments and 40% from Microsoft. Corrado Priami, a bioinformatics professor at the University of Trento who will head the center, says up to 30 researchers will focus on understanding complex systems such as the chemical communication within a cell and developing tools for biologists and computer designers. Priami says all research results will be made public, and intellectual property will remain with the university, although Microsoft will have an option to exclusively license products that result from the funded research.

    Microsoft is reportedly in discussions with universities in Germany, France, and the U.K. and plans to announce several more centers later this year.

    As for the name, the EuroScience Association, a group of more than 2000 European scientists founded in 1997, cried foul. The organization, which last year held a European-wide meeting called the EuroScience Open Forum (Science, 3 September 2004, p. 1387), also advises the European Union on policy issues, says spokesperson Jens Degett. “If suddenly there is no difference between EuroScience and Microsoft, it will be very damaging” to the group's credibility as an independent organization. In response, Microsoft said it would work with the group to eliminate any misunderstanding and is planning to rename the program.


    Inspector General Blasts EPA Mercury Analysis

    1. Erik Stokstad

    Power plants buying and selling the right to spew toxic mercury from their smokestacks—the mere prospect raises the hackles of environmentalists. But when the U.S. Environmental Protection Agency (EPA) proposed such a cap-and-trade system last year, it argued that it was the most effective way to cut back the 48 tons of mercury, a known neurotoxin, emitted nationwide each year. Last week, the agency came under fire anew—this time from its own Inspector General (IG), who accused EPA officials of deliberately skewing their analyses to burnish the cap-and-trade approach. EPA denies the charges, but environmentalists say the report* will give them a leg up in court if they sue over the final rule.

    Coal-fired power plants are responsible for about 40% of all mercury emissions in the United States, making them the largest single source. Perhaps as much as half spreads considerable distances, while the rest is deposited locally, creating so-called hot spots. The primary route of human exposure is fish consumption, because mercury bioaccumulates in aquatic food chains. Nearly every state has fish consumption advisories, especially for pregnant women, as fetuses are considered most vulnerable.

    No federal rules on mercury from power plants are in place yet, although EPA determined in 2000 that regulation was “appropriate and necessary.” Under existing law, there is only one way to regulate a hazardous air pollutant like mercury (as opposed to less dangerous pollutants). This so-called MACT (maximum achievable control technology) approach requires all polluters to meet an air standard based on the average emissions of the cleanest 12% of power plants.
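The MACT floor itself is simple arithmetic: rank the sources, take the cleanest 12%, and average their emissions. A sketch with a hypothetical fleet of plants (the function name and emission figures are invented for illustration):

```python
def mact_floor(emissions, fraction=0.12):
    """Average emission rate of the cleanest `fraction` of
    sources (at least one source), i.e. the MACT floor."""
    ranked = sorted(emissions)
    k = max(1, round(len(ranked) * fraction))
    return sum(ranked[:k]) / k

# Hypothetical emission rates (arbitrary units) for 25 plants:
plants = [5, 8, 2, 9, 7, 3, 1, 6, 4, 10,
          12, 11, 2, 3, 5, 7, 8, 9, 1, 4,
          6, 13, 2, 3, 5]

# The cleanest 12% of 25 plants is 3 plants; their average sets
# the standard every plant must then meet.
print(mact_floor(plants))  # → 1.3333333333333333
```

Because the floor depends on which plants count as the "cleanest 12%" and on what control technology is assumed available to them, small modeling choices move the resulting tonnage target, which is exactly what the IG's report disputes.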

    Up in smoke.

    Coal-fired power plants account for most mercury emissions in the United States.


    While calculating the MACT, EPA became enamored of pollution-trading approaches, allowed by law for so-called criteria or conventional air pollutants. For instance, the “Clear Skies” legislation, introduced in Congress in June 2002, included a pollution-trading scheme to reduce emissions of sulfur dioxide (SO2) and nitrogen oxides (NOx). That's relevant to the mercury debate because the same scrubber technology that can clean up these pollutants can also reduce mercury in some situations, yielding what's called a “cobenefit.”

    After that bill stalled, EPA proposed a rule in January 2004 that would regulate mercury under a similar cap-and-trade system. The agency claimed that this trading approach would cut emissions by 70% to 15 tons by 2018—apparently a much better bottom line than the MACT approach, which EPA said would lower annual emissions to only 34 tons by 2008. Industry likes this approach, because it gives power plants more flexibility in the technology they can employ and provides time to cope by slowly tightening the regulations.

    Environmentalists and state regulatory agencies were highly critical, charging that the trading system would allow the dirtiest power plants to buy the rights to continue polluting, and mercury would continue to accumulate in toxic hot spots. In April of last year, seven senators asked the IG to investigate.

    Now the IG has weighed in, charging in a 3 February report that EPA analyses were intentionally “biased” to make the MACT standard look less effective. Citing internal e-mails, the IG maintains that high-ranking officials had their fingers on the scale during this process: “EPA staff were instructed to develop a MACT standard that would result in national emissions of 34 tons per year” by 2008, the report found. Agency documents show that EPA took several stabs at running the model that produces the MACT standards, first yielding 29 tons, then 27, then finally 31. EPA then adjusted the results of the final run to hit the target.

    Why 34 tons? The IG notes that's the same reduction that would be achieved as a cobenefit by simply reducing SO2 and NOx under the cap-and-trade rule proposed earlier. Martha Keating of the Clean Air Task Force in Boston, Massachusetts, sees it as an attempt to save industry from any extra costs. She says, and state regulators agree, that power plants could achieve greater reductions under MACT if they were required to install new technology, called activated carbon injection. EPA says it didn't generally consider the effects of this technology for its MACT standard, arguing that it won't be commercially ready by 2008.

    The IG recommends that EPA rerun its analyses of the MACT standard and tighten its cap-and-trade proposal, but it can't force the agency to do so. EPA says that its final rule, expected 15 March, will include further details, analyses, and cost-benefit information. Spokesperson Cynthia Bergman maintains that the agency properly created the MACT standard and that the cap-and-trade rule is the better way to go. Meanwhile, Senator Jim Jeffords (I-VT), one of those who signed the request to the IG, called for “extensive oversight hearings into this important health issue and into the process by which this rule was crafted.”


    Hearing Highlights Dispute Over Hubble's Future

    1. Andrew Lawler*
    1. With reporting by Robert Irion.

    Scientists, engineers, and politicians are increasingly at odds over what to do with the Hubble Space Telescope. That much was clear at a contentious hearing last week before the House Science Committee, where participants disagreed over whether and how to service the aging spacecraft, what each option would cost, and how to pay for it.

    Sean O'Keefe, set to give up his job as NASA administrator, caused a stir last year when he canceled a mission to have astronauts upgrade Hubble's instruments and keep it running until the end of the decade, when the James Webb Space Telescope is slated for launch. After pressure from lawmakers, he suggested that a robotic mission would be a safer bet than sending humans. That proposal, however, was shot down in December by a panel of the National Academy of Sciences, which called the robotic option too complex and costly and urged O'Keefe to reconsider sending astronauts to do the job. The panel also noted that the telescope could fail by 2007, before the robot likely would be ready. This week President George W. Bush requested no funding for a servicing mission in NASA's 2006 budget, a step that seems certain to keep the debate raging.

    Representative Sherwood Boehlert (R-NY), who chairs the science committee, called himself an “agnostic” and pleaded with witnesses to “clarify what's at stake.” What emerged were the deep divisions among scientists—including those at the same institution.

    Follow on.

    Instead of fixing Hubble, some astronomers are advocating a new telescope, the Hubble Origins Probe.

    Astronomer Colin Norman of the Space Telescope Science Institute in Baltimore, Maryland, said the best option is to forgo fixing Hubble in favor of a $1 billion telescope, dubbed Hubble Origins Probe (HOP), that could examine dark energy, dark matter, and planets around other stars in addition to extending Hubble's mission. He noted that Japan has offered to help pay for HOP, which would be launched in 2010. “We must continue with the Hubble adventure,” Norman added. The institute's director, Steven Beckwith, also favors completing Hubble's mission. But he wants to do it “as soon as possible,” having it fixed by experienced astronauts aboard the shuttle rather than building and launching a new telescope. Other researchers expressed fear that any fix would come at the expense of other science projects.

    Joseph Taylor, a Princeton University astronomer who co-chaired the academy's 2000 astronomy study that set long-range priorities, says he opposes any servicing “if it requires major delays or reordering” of future missions. Neither a new telescope nor a servicing mission “should be a higher priority” than the Webb and Constellation-X, another planned NASA telescope, he stated.

    Although astronomers are loath to lose Hubble, they also want to protect projects in the decadal study. “We have been playing fast and loose with the process by ignoring our prioritizations,” says Alan Dressler of the Carnegie Institution of Washington in Pasadena, California, who did not testify at the hearing. Dressler wants the academy to find out which missions astronomers would be willing to sacrifice to save Hubble.

    Louis Lanzerotti, who led the academy's Hubble study, agrees that if science must pay the servicing tab, priorities must be assessed. “If $1 billion is going to come out of some other aspect of NASA's science program, then I would have serious questions” about another Hubble mission—be it a new telescope, a shuttle service, or a robotic effort. But both he and Taylor would back a servicing mission if the money came from elsewhere. Lanzerotti added that NASA's $1 billion estimate doesn't square with the $300 million to $400 million price tag of past shuttle missions: “There is some accounting here which doesn't compute.”

    For many scientists, NASA's robotic mission is the least attractive option. Lanzerotti, for one, said it would be using an important scientific instrument as “target practice” for new technologies. But Representative Dana Rohrabacher (R-CA) argued that NASA should “push the envelope” by taking the opportunity to develop technologies that could benefit from Bush's plans for space exploration.


    Caught in the Squeeze

    1. Jeffrey Mervis*
    1. With reporting by Amitabh Avasthi, Yudhijit Bhattacharjee, Jennifer Couzin, Marie Granmar, Jocelyn Kaiser, Eli Kintisch, Andrew Lawler, and Charles Seife.

    Many U.S. science agencies would have to make do with less under the president's 2006 budget request, which aims to cut the deficit, boost military and antiterrorism spending, and make tax cuts permanent

    President George W. Bush has proposed a flat budget for U.S. science next year. And the spinning has begun in earnest.

    John Marburger, director of the White House Office of Science and Technology Policy, calls it “a pretty good year” for research, given the Administration's priorities of fighting terrorism, defending the homeland, and reducing the federal deficit. He says that the proposed 1% decline in the $61 billion federal science and technology budget for 2006—which excludes the Pentagon's even larger weapons development budget—would have been much worse but for the fact that “the president really cares about science.”

    However, most science policy analysts are wringing their hands over the tiny increase sought for the National Institutes of Health (NIH), a small rebound for the National Science Foundation (NSF) after a cut in 2005, and reductions in the science budgets at NASA, the National Oceanic and Atmospheric Administration, and the departments of energy and defense. At a time when other countries are ramping up their scientific efforts, they say, the United States shouldn't be resting on its laurels.

    “The inadequate investments in research proposed by the Administration would erode the research and innovative capacity of our nation,” says Nils Hasselmo, president of the 62-member Association of American Universities. AAU and the Federation of American Societies for Experimental Biology both call the president's 2006 request “disappointing,” with FASEB adding that the proposed funding levels could “discourage our most talented young people from pursuing careers in biomedical research.”

    The 0.7% increase for the $28 billion NIH, coming 2 years after a succession of double-digit boosts that resulted in a 5-year budget doubling, prompted agricultural imagery from newly installed Department of Health and Human Services (HHS) Secretary Michael Leavitt: “We have planted. It's now time for us to harvest the fruit.” Even science-savvy legislators from the president's own party struggled to find a bright side. Representative Sherwood Boehlert (R-NY), chair of the House Science Committee, seized on an 8% boost to the $450 million intramural research budget at the National Institute of Standards and Technology (NIST) even as the president proposed eliminating a $137 million precompetitive technology research program the institute oversees. “Given an overall cut to nondefense domestic discretionary spending, science programs fared relatively well,” Boehlert noted. “I was especially pleased to see the significant increases for the NIST labs.”

    Taking the long view.

    Presidential science adviser John Marburger notes a sharp rise in science funding in Bush's first term.


    The 2006 budget request, following tradition, unfolded in a series of briefings by agency heads. Here are some highlights, brought to you by Science reporters who were there.

    NIH: The president's budget includes a 42% boost, to $333 million, for a set of cross-NIH initiatives to support translational research, known collectively as the Roadmap. Biodefense efforts would receive a 3.2% hike, to $1.8 billion, and another $26 million would be allocated for the Neuroscience Blueprint involving 15 institutes. NIH is also getting $97 million more to develop countermeasures for a radiological or chemical attack.

    Still, the overall news is grim, as most institutes would get increases averaging about 0.5%. NIH Director Elias Zerhouni says he hopes to protect the number of funded investigators by shifting money from some clinical and center grants that expire in 2006 into new and competing grants, which will rise for the first time since 2003. But the average grant size of $347,000 would remain flat, and the proportion of applications funded would continue to plummet, to a projected 21%. NIH is boosting postdoc stipends by 4% and increasing health benefits. But the result is a 2% drop in the number that would be supported. “We think it's the right choice,” Zerhouni says.

    NSF: A $113 million increase proposed for the agency's $4.2 billion research budget hides a $48 million transfer from the U.S. Coast Guard to take on the annual cost of breaking ice to keep the shipping lanes open in the Antarctic. The 2006 request includes funding for all five of the agency's major new facilities under construction, but it lacks two expected new starts in 2006: a network of ocean observatories and an Alaskan regional research vessel. NSF Director Arden Bement says he hopes to request money for them in 2007 if the budget climate warms up. The biggest hit comes in the agency's education programs (see sidebar).

    NASA: The news was good for missions that would support the president's vision for eventual lunar and martian exploration by humans. The lunar program, which would be focused on technology more than science, would nearly triple, from $52 million to $135 million, and Mars projects would jump from $681 million to $723 million. The largest increase would go to developing a rocket capable of taking humans beyond Earth orbit; the Constellation project would more than double, to $1.12 billion in 2006. The one exception to that rule is human research: Funding for studying the effects of space on astronauts would plummet from $1 billion to $807 million for 2006.

    The biggest losers would be missions to the outer planets and Earth-observing activities (see sidebar on p. 834).


    Energy: As part of the 4% cut for the Office of Science, Department of Energy officials want to pull the plug on a $140 million experiment at Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, to study the physics of particles that contain the bottom quark. Science chief Ray Orbach says the Large Hadron Collider being completed at CERN, the European particle physics lab near Geneva, Switzerland, would cover the same territory as BTeV, which was set to begin construction this year, and that the savings will go toward developing a future neutrino detector at Fermilab. “Maybe it's not that they're trying to drive science from the United States, but boy, they're sure making it look like they are,” says Sheldon Stone, a physicist at Syracuse University and BTeV co-spokesperson. Operations at the Relativistic Heavy Ion Collider, the primary accelerator at Brookhaven National Laboratory in Upton, New York, will be curtailed, with funding for only 1400 hours of experiments compared with a scheduled 3600 hours this year.

    FDA: The Food and Drug Administration wants $30 million more to expand a network of state labs that can handle threats to food safety, an area former HHS secretary Tommy Thompson says is vulnerable to terrorism. It also hopes to hire 25 more people to clear up a backlog of reports submitted on potential safety problems with drugs that are already on the market. “It's a step in the right direction,” says Jerry Avorn, a pharmacoepidemiologist at Harvard Medical School in Boston and the author of Powerful Medicines: The Benefits, Risks, and Costs of Prescription Drugs. But any changes, he says, also require a new “culture of openness.”

    Homeland Security: The department wants $227 million for a new Domestic Nuclear Detection Office to sniff out attempts to bring bombs into the country. Several federal agencies will contribute staffers to the new office, which President Bush mentioned in last month's State of the Union address.

    Defense: Although the Pentagon's basic research account would slump by 13%, officials hope to scale up a pilot scholarship program to attract more U.S. citizens into government defense jobs. The first 25 awards in the Science, Mathematics, and Research for Transformation program are due to be announced this spring, and the 2006 request would allow for up to 100 two-year undergraduate and graduate scholarships in 15 disciplines.

    Graduates must return the favor by working for the department. But Michael Corradini, a mechanical engineer at the University of Wisconsin, Madison, worries that the requirement could scare off potential applicants. He suggests instead that graduates should be required to do a summer internship in the department. “If students have a meaningful experience during the internships,” he says, “they might be inclined to pursue a DOD career.”

    The $2.5 trillion proposed budget now goes to Congress, which will tinker with the president's priorities and add in its own. That means the fate of these and other research programs, although traditionally nonpartisan, will be shaped by larger forces—from Social Security to tax policy—stirring the political waters.


    Science Education Takes a Hit at NSF

    1. Jeffrey Mervis

    The National Science Foundation's (NSF's) role in improving science and math education in the United States would shrink significantly under the president's 2006 budget request. Particularly hard hit are programs to improve the skills of elementary and secondary school science and mathematics teachers, develop new teaching materials, and evaluate whether those activities are working.

    “This is outrageous,” says Gerry Wheeler, executive director of the National Science Teachers Association. “Despite all the concern about how U.S. students perform on international math and science tests, the Administration has made it clear that K-12 science education is not a priority.”

    The request would trim the budget for NSF's Education and Human Resources (EHR) directorate by $104 million, to $737 million, a 12.4% drop that follows a similar reduction this year. By NSF's own estimate, its programs would reach 64,000 elementary and secondary school students and teachers in 2006, compared with 100,000 in 2004.
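    The reported figures are internally consistent; as a quick back-of-the-envelope check (Python used purely for illustration, with the dollar amounts and student counts taken from the article):

```python
# Sanity check of the reported NSF EHR cut.
# Article figures: a $104 million trim, to $737 million, called a 12.4% drop.
request_2006 = 737                 # $ millions, proposed 2006 EHR budget
cut = 104                          # $ millions, reduction from this year's level
level_2005 = request_2006 + cut    # implied current budget: $841 million

pct_drop = cut / level_2005 * 100
print(f"Implied 2005 EHR budget: ${level_2005} million")
print(f"Percentage drop: {pct_drop:.1f}%")   # matches the article's 12.4%

# The article also notes program reach falling from 100,000 people (2004)
# to 64,000 (2006).
reach_drop = 100_000 - 64_000
print(f"Fewer students and teachers reached: {reach_drop:,}")
```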

    Left behind.

    NSF's programs will reach 36,000 fewer students in 2006 than in 2004.


    The biggest blow would fall on the directorate's division of elementary, secondary, and informal education. A $60 million program begun last year to help teachers, from initial training to professional development, would be slashed by nearly half, to $33 million. A $28 million program to develop new classroom materials and focus on an increasingly diverse student population would be pared by one-third, and a university-based network of Centers for Learning and Teaching, with 16 sites, would make no new awards in 2006. In addition, the math and science partnerships program, begun in 2002 as a $200-million-a-year effort to link university science and engineering departments with their local school districts, would continue to wind down, with only enough money to fulfill existing commitments.

    The biggest percentage loser in the 2006 budget is the directorate's $59 million division of research evaluation, targeted for a 43% drop. NSF officials project that the president's request will mean no new awards next year for programs aimed at developing new ways to monitor the performance of students and teachers as well as evaluating the effectiveness of new methods and materials.

    NSF Director Arden Bement says that the EHR reductions give NSF the chance “to sharpen our focus on programs with a proven track record. … We have a lot of knowledge of what needs to be done. Now we have to do it.”


    Jupiter Is a Blue State, Mars Is Red

    1. Andrew Lawler

    The timing could not be more ironic. Just as a joint U.S.-European spacecraft is making exciting and front-page discoveries from distant Saturn, the White House proposes a budget that could scrub the agency's only major mission planned for the outer solar system. Another victim is an earth science flight to study aerosols, and several other longer-term projects, from planet finders to dark-energy seekers, would be put on hold.

    The strains placed on NASA by the Columbia failure and U.S. President George W. Bush's new exploration vision are evident in the request, which includes only half the increase the White House had promised last year for 2006. Outgoing NASA Administrator Sean O'Keefe says the request would have been far worse without the exploration plan Bush laid out last January: “It's rather remarkable, given the circumstances.”


    The Jupiter Icy Moons Orbiter would be delayed at least 6 years.


    The request would not cut any “ongoing” science programs, says science chief Al Diaz, whose budget would stay relatively flat. But a host of missions still in the early stages of planning would be delayed, some indefinitely. The most dramatic impact would be on the Jupiter Icy Moons Orbiter (JIMO), an elaborate and expensive mission that would harness nuclear electric technology to provide unprecedented access to Europa and the giant planet's other moons. The technology made JIMO “a high-risk venture,” says Craig Steidle, NASA exploration chief. Technology funding for the mission would be slashed from $432 million to $320 million, and JIMO would be delayed at least until 2018—6 years later than NASA officials had projected just a year ago.

    Instead, Diaz said NASA would reconsider a simpler mission to Europa that was canceled in 2002. Diaz says it may be included in a revamped science strategy this summer.

    The request contains bad news for scientists working in other fields. The launch date for the Kepler mission, designed to search for extrasolar planets, has slipped from 2007 to “to be determined,” according to NASA documents. The Dawn project, which would visit the asteroid belt, would be downsized and delayed. And the 2007 flight of the Glory satellite, which would measure atmospheric aerosols, would be abandoned, although some of its instruments might be used on other spacecraft. Technical challenges will delay the Space Interferometry Mission, planned for launch in the next decade to search for Earth-sized planets. And the Beyond Einstein program, which would launch a series of spacecraft to test Einstein's theories (p. 869), remains a dream.


    Ocean Research Budget Ebbs

    1. Elizabeth Pennisi

    Ocean policy is hot, but advocates say that President George W. Bush's proposed budget is tepid when it comes to addressing the needs of the nation's troubled waters. A 10% cut in the $580 million research budget for the National Oceanic and Atmospheric Administration (NOAA), the government's key ocean research and protection agency, “provides a rather distressing signal about the level of commitment [to the oceans],” says Ted Morton, federal policy director for a Washington, D.C.-based advocacy group called Oceana.

    Not so, says NOAA Deputy Administrator James Mahoney. The 2005 figure was inflated by legislative earmarks, he says. A more accurate measure of the Administration's commitment, he argues, is that the president requested 7% more for NOAA than he asked for last year.

    Full steam ahead.

    A new NOAA fisheries survey ship is in the works.


    Last fall a presidential commission urged the White House to devote more attention to the Great Lakes and coastal and marine resources and said $1.5 billion was needed to jump-start a successful national ocean program. Three months later, the president's U.S. Ocean Action Plan established a Cabinet-level, interagency task force on oceans (Science, 24 December 2004, p. 2171). The 2006 request is the next step, says Mahoney.

    While some NOAA programs are being squeezed, a few efforts tied to marine research are getting boosts. The agency has requested $38.5 million for a new fisheries survey vessel, $1.5 million more for the $25 million coral reef program, and $10 million for an expanded tsunami warning system (Science, 21 January, p. 331). In a surprise move, the White House submitted a level budget for Sea Grant, which supports marine and Great Lakes research and education in coastal states. The program historically has relied on Congress to keep it healthy.


    Lupus Drug Company Asks FDA for Second Chance

    1. Jennifer Couzin

    Biotech firm pleads for drug's approval so it can afford to prove that it works

    Doctors who treat the autoimmune disease lupus are on edge as the U.S. Food and Drug Administration (FDA) considers a rare plea: Approve a lupus drug that the agency has already rejected and that—even its maker acknowledges—has uncertain efficacy. The small California company that's developing the drug, La Jolla Pharmaceutical Company (LJPC), says it can't afford to complete the new clinical trial FDA has requested—the company's 14th—which it has already begun at a cost of $2.5 million a month. After meetings of lupus specialists, company executives, and FDA officials last fall, the agency is considering whether to approve the drug on the condition that LJPC conduct a postmarketing study to determine whether it works. It may rule this month.

    The lupus community is split over whether the drug, LJP 394, should become the first therapy approved for the condition in over 30 years. The drug has an outstanding safety record, and some of the more than 500 lupus patients who've tried the therapy suffered fewer kidney flares, a hallmark of serious disease. Still, the trials sponsored by LJPC so far have failed to show definitively that it works.

    “What you're left with is this terrible dilemma,” says David Wofsy, a lupus specialist at the University of California, San Francisco, who was not involved in developing LJP 394. “There's an important unanswered question here, and it should be answered. But it's different than saying the drug should be approved.”

    LJPC has already spent close to $300 million—90% to 95% of its expenditures—on LJP 394, according to the company's chair and CEO, Steven Engle. Last week it raised $16 million, enough to see it through this year—although not enough to complete additional testing. Approval, Engle hopes, could bring not only revenue but also a corporate partner.

    Tackling a butterfly.

    It's uncertain whether a new drug can erase the symptoms of lupus, often characterized by a “butterfly” rash.


    Doctors and patients are desperate for any new lupus drug because current therapy is so inadequate. Just three drugs—the steroid prednisone, the chemotherapy drug cyclophosphamide, and aspirin—are approved for the disease, which can attack nearly any organ.

    Fifteen years ago, LJPC set out to change that. The company had patented a technology that disables a narrow swath of immune cells: B cells sporting anti-DNA antibodies. Such antibodies are common, although not universal, in the blood of lupus patients. They also appear in the kidneys of those with lupus-induced kidney disease, which strikes about a fifth of sufferers. Furthermore, several studies showed that a rise in anti-DNA antibody levels presages a kidney flare.

    In the late 1990s, LJPC teamed up with the pharmaceutical giant Abbott Laboratories, based in Abbott Park, Illinois, on a phase II/III trial of LJP 394 with 200 volunteers. But in 1999, before the trial ended, Abbott pulled out. Abbott was concerned that the drug was ineffective, according to Engle, but an Abbott spokesperson says the company simply decided not to pursue treatments for lupus nephritis.

    The drug failed when all the trial's subjects were considered. But when LJPC took a closer look at the data, it found that roughly 90% of patients in the trial had “high affinity” antibodies, to which the drug was likelier to bind, and those patients seemed to fare better than the rest. LJPC then forged ahead on its own with a phase III study that focused primarily on how those with high-affinity antibodies responded to the drug.


    The results, announced in early 2003, were not what the drug's enthusiasts had hoped for. Kidney flares occurred in 12% of the treatment group, compared with 16% of those on placebo. In the earlier study with Abbott, 21% of patients with high-affinity antibodies on placebo experienced flares, compared with 8% on the drug. David Wallace, a rheumatologist at the University of California, Los Angeles, who participated in the trial, attributes the placebo difference to the immunosuppressant mycophenolate, which came on the market between the trials. Although approved only for patients with organ transplants, the drug was quickly adopted by doctors experimenting with it in lupus; 11% of those in the phase III trial were on it, he says.
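    The gap between the two trials is easier to see as relative reductions. A minimal arithmetic sketch, using only the percentages reported above (this is not a statistical analysis—no sample sizes or significance tests are involved):

```python
# Comparing kidney-flare rates across the two LJP 394 trials,
# using the percentages reported in the article.
def relative_risk_reduction(treated_pct, placebo_pct):
    """Fraction by which the flare rate fell relative to placebo."""
    return (placebo_pct - treated_pct) / placebo_pct

# Phase III trial: flares in 12% of the drug group vs. 16% on placebo.
rrr_phase3 = relative_risk_reduction(12, 16)
# Earlier Abbott-era trial, high-affinity subgroup: 8% vs. 21%.
rrr_abbott = relative_risk_reduction(8, 21)

print(f"Phase III relative reduction: {rrr_phase3:.0%}")      # 25%
print(f"Earlier trial relative reduction: {rrr_abbott:.0%}")  # ~62%
```

The much smaller relative effect in the later trial is what Wallace attributes to placebo patients' off-protocol use of mycophenolate.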

    Given the mixed results, LJPC agreed in August with FDA on the design of a larger phase IV postmarketing study, which would include about 600 people and seek to confirm the clinical benefit of the drug. Under a regulation designed to encourage development of drugs for life-threatening illnesses with few remedies, FDA could have approved LJP 394 if the company agreed to conduct the follow-up study. But on 14 October, the agency rejected the company's new drug application, saying another trial was needed.

    Since then, LJPC and lupus specialists have met with FDA a half-dozen times, lobbying the agency to reconsider. “What's the downside” of approving LJP 394? asks Jill Buyon, a lupus specialist at New York University Medical Center who until last fall was a paid consultant for LJPC. She and Engle say the company would pull the drug off the market if it failed in the postmarketing trial.

    But Wofsy, who says he doesn't necessarily oppose approval, worries that putting LJP 394 on the market could complicate testing of other therapies for lupus-induced kidney disease. “Would we deny the drug to anyone who wanted it?” he asks. Doctors would be hard pressed to do so, given that no one knows which lupus patients stand to benefit from it.

    After a decades-long drought, there are now roughly 10 drugs for lupus in early clinical trials, says Joan Merrill of the Oklahoma Medical Research Foundation in Oklahoma City, who has consulted for LJPC and is the medical director of the Lupus Foundation of America in Washington, D.C. So why such anxiety about abandoning LJP 394? The other drugs might falter, says Merrill, and LJP 394 is possibly “the safest one of all. … We want these things studied,” she adds, “until we know they don't work.”


    SUMO Wrestles Its Way to Prominence in the Cell

    1. Jean Marx

    The small protein SUMO is turning out to have as many roles in the cell as its better-known cousin, ubiquitin

    The protein known as SUMO comes from an illustrious family. Its cousin ubiquitin has long been a star in cell biology: Researchers have shown that it is a key regulator of numerous cellular activities, controlling everything from protein degradation and gene expression to the cell division cycle. Ubiquitin is so renowned, in fact, that its discoverers were awarded last year's Nobel Prize in chemistry (Science, 15 October 2004, p. 400). During ubiquitin's ascent, SUMO remained in the shadows. Recently, however, SUMO has begun making a name of its own.

    Over the past few years, researchers have implicated it in a range of activities rivaling those of ubiquitin itself. Although SUMO can operate throughout the cell, its actions seem to be concentrated in the nucleus. The molecule has left its fingerprints on many nuclear functions, including gene transcription, DNA repair, the transport of proteins and RNAs into and out of the nucleus, and the building of the mitotic spindle that draws sets of chromosomes to the opposite ends of a dividing cell.

    Physicians may one day be as intrigued with SUMO as cell biologists are now. The protein seems to help some viruses infect cells, making it a possible target for antiviral therapies. It may also be involved in neurodegenerative diseases such as Huntington's and Alzheimer's. “There is exciting biology coming out of SUMO research,” says Van Wilson of the Texas A&M University System Health Science Center in College Station, who leads one of the groups that have linked SUMO to viral infectivity.

    A late start

    Although ubiquitin was discovered more than 25 years ago, SUMO eluded detection until 1997, when two groups stumbled on it more or less simultaneously. Both teams, one including Frauke Melchior and Larry Gerace of the Scripps Research Institute in La Jolla, California, and the other including Michael Matunis, who was then working in Günter Blobel's lab at Rockefeller University in New York City, were studying a protein called RanGAP1 that had been implicated in both nuclear transport and the control of mitosis.

    The researchers found that cells contain two forms of RanGAP1, one of which weighs some 20 kilodaltons more than the other. Further analysis showed that the larger form carries an attachment—a 97-amino-acid protein that turned out to resemble ubiquitin in its shape and in the way it connects to RanGAP1. Both the newfound protein and ubiquitin attach through their carboxyl ends to the second amino group on the amino acid lysine in their target proteins. These similarities prompted Melchior and her colleagues to dub the new protein SUMO, which stands for small ubiquitin-like modifier.

    In the spotlight.

    SUMO's crystal structure was determined recently.


    Serendipity played a big role in SUMO's discovery. “We got really lucky,” says Melchior, who is currently moving her lab from the Max Planck Institute for Biochemistry in Martinsried, Germany, to the University of Göttingen. She explains that RanGAP1 is the only protein in which SUMO stays put when cells are broken apart for analysis. All others rapidly lose their SUMO tags. “I think that may be why [SUMO] was overlooked for so long,” she says.

    Once SUMO was identified, however, it opened the floodgates. The protein plus two nearly identical relatives found later—dubbed SUMO2 and SUMO3—have since turned up on numerous additional proteins, most of which are located in or around the nucleus. More often than not, researchers encountered a SUMO accidentally, while studying the regulation of some fundamental cell process, such as gene transcription or cell division. “SUMO is popping up in every place you look,” says J. Lawrence Marsh of the University of California (UC), Irvine, who is investigating a possible role for the protein in neurodegeneration.

    Proteomics studies performed in the last several months have expanded the roster of sumoylated proteins even further. Mark Hochstrasser, whose team at Yale University is one of several performing such analyses, says that the total in yeast, the preferred organism for the work so far, is now up to 150. “I'm sure the number is much higher in mammalian cells,” he predicts.

    What's it doing?

    Although identifying SUMO-adorned proteins is now easy, figuring out exactly what the modification does has proved to be more of a challenge. One thing for sure is that the SUMO tag does not do what ubiquitin addition to proteins often does: mark them for destruction by a cell structure called the proteasome. In fact, there are a few situations in which SUMO modification protects proteins from degradation by blocking addition of a ubiquitin tag.

    Researchers have been building a circumstantial case that SUMO is involved in directing protein movements in the cell, particularly the transport of proteins through the pores of the nuclear envelope. That idea emerged early with the discovery of SUMO1 on RanGAP1. Both the Matunis team, which is now at Johns Hopkins University in Baltimore, Maryland, and that of Melchior showed that unmodified RanGAP1, which is located primarily in the cell cytoplasm, moves when sumoylated to the tiny fibrils that project from the outer side of nuclear pores.

    Since then, components of the machinery that sumoylates proteins have also turned up at the pore. For example, Matunis and his colleagues have located a protein called Ubc9 at the pore. In the first step of the sumoylation reaction, which is similar to how ubiquitin is added to proteins, SUMO forms a high-energy bond to the so-called E1 activating protein. Then SUMO is transferred to an E2 conjugating protein—Ubc9—and from there it's joined to its ultimate protein target with the aid of an E3 ligase, which is needed for target recognition.

    Researchers were somewhat surprised to learn that the machinery includes an E3 ligase because in test tube experiments E1 and E2 seemed sufficient to pin SUMO on proteins. But in the past 4 years, researchers in several labs have identified a half-dozen or so E3s that do this job. As shown by Melchior's team, working with Anne Dejean and her colleagues at the Pasteur Institute in Paris, these include a protein called RanBP2/Nup358, which is located at the nuclear pore and is known to bind RanGAP1. The supposition is that RanBP2 is involved in sumoylating RanGAP1 and other proteins at the pore, although that has not been proven.

    In addition, Hochstrasser and others have identified protease enzymes that can remove SUMO from proteins. “These are reversible modifications,” Hochstrasser says. The situation, which parallels that for addition of other protein-regulating modifiers such as ubiquitin and phosphate, provides for dynamic control by the cell of the modified proteins. At least one of these SUMO-stripping proteases has also been located at nuclear pores by the Matunis team and by Mary Dasso and her colleagues at the National Institute of Child Health and Human Development (NICHD) in Bethesda, Maryland.

    The presence at such pores of the various enzymes involved in SUMO addition and removal raises the possibility that sumoylation serves as a kind of gatekeeper, regulating traffic into and out of the nucleus. This may be the case for a nuclear enzyme called histone deacetylase 4 (HDAC4), which removes acetyl groups from histone proteins in the chromatin. This action allows the DNA to condense and thus has the effect of repressing gene transcription.

    As the cell divides.

    Before mitosis begins (left panel), the sumoylated protein RanGAP1 (red) is located mostly at the nuclear pores. But as mitosis gets under way, it moves to the kinetochores and mitotic spindle (middle panels) and then redistributes to the pores of the new nuclei after the daughter cells separate.


    Three years ago, Dejean, Melchior, and their colleagues filled in some details suggesting how HDAC4 might work. They showed that it must be sumoylated to produce its full gene-suppressing activity and that the nuclear pore protein, RanBP2, promotes HDAC4 sumoylation. That suggests that HDAC4 picks up its SUMO tag as it moves into the nucleus. There's still room for uncertainty, however. Sumoylating enzymes are present both inside the nucleus and in cytoplasm, leaving open the possibility that HDAC4 picks up its SUMO tag elsewhere.

    Wherever sumoylation takes place, it can have important functional consequences, particularly in regulating gene activity. Perhaps 50% of the proteins altered by the tag are transcription factors that are involved in turning genes on or off.

    In most cases, adding SUMO results in lowered activity of the target genes. Researchers have shown this by, for example, mutating the sumoylation site on the transcription factors, thus preventing SUMO attachment. The result: increased gene expression. Again, though, things get somewhat murky when it comes to the mechanism by which this inhibition happens. “The problem is that SUMO can regulate so many functions of proteins,” says Kevin Sarge of the University of Kentucky in Lexington.

    Indeed, the protein's role in transcription is complex. SUMO-driven inhibition of gene expression may occur in several different ways—more than one of which may be operating at a time. Most transcription factors work with several protein partners, and sumoylation may interfere with their interactions. Or, as has been shown for several nuclear proteins including some transcription factors, sumoylation can direct proteins to so-called PML nuclear bodies, small particles located in the nucleus. This may take them out of action, perhaps simply by sequestering them away from the DNA.

    Although sumoylation of transcription factors usually results in decreased gene expression, occasionally the opposite occurs. Sarge and his colleagues provide some intriguing examples. They have been studying the activities of heat shock factors (HSFs), proteins that protect the cell against heat and other stresses by turning up the activity of a variety of protective genes.

    In work done a few years ago with Matunis and his colleagues, the Sarge team showed that HSF1 is sumoylated in response to stress and that this leads to activation of HSF1's target genes. In this case, adding SUMO may increase HSF1's binding to DNA.

    Metaphase close-up.

    As shown by the red staining, sumoylated RanGAP1 localizes to the kinetochores, which attach the chromosomes to the mitotic spindle.


    Something similar happens with HSF2, although here the sumoylation trigger is not stress but the cell cycle transition from the second growth phase to actual cell division. When cells are preparing to divide, they compact most of the DNA of their chromosomes with the aid of an enzyme called condensin. In order for cells to function, however, essential genes have to be kept open, and the new work indicates that SUMO plays a role in this “bookmarking” of critical points in DNA. The Kentucky team reported in the 21 January issue of Science (p. 421) that sumoylated HSF2 binds both to a target gene and to CAPG, a subunit of condensin, and then draws in an enzyme that inactivates condensin. As a result, the DNA stays open in the gene's vicinity.

    Further experiments showed the importance of HSF2 sumoylation for cell survival. When Sarge and his colleagues blocked the synthesis of HSF2 with an inhibitory RNA, they found that control cells could withstand an elevated temperature of 43°C, but that many of the cells carrying the inhibitory RNA died at that temperature.

    Protecting the genome

    SUMO's roles in the nucleus go far beyond regulating protein transport and gene transcription. The protein is also needed for normal separation of the chromosomes during mitosis and is involved in repairing damaged DNA. Researchers have found that mutations in SUMO genes themselves or in genes for enzymes involved in adding or removing the protein from its targets lead to abnormal cell division and increased susceptibility to DNA-damaging agents.

    During cell division, the duplicated daughter chromosomes are joined together at their central regions, the centromeres, before they ultimately separate. The evidence so far indicates that sumoylation may help signal the separation. Working with yeast, Stephen Elledge of Baylor College of Medicine in Houston, Texas, Nancy Kleckner of Harvard University in Cambridge, Massachusetts, and their colleagues discovered that mutating the gene for one of the SUMO-removing proteases results in premature chromosome separation. The researchers have evidence that this effect involves topoisomerase II (Top2), an enzyme known to regulate chromosome structure during mitosis.

    According to the model they proposed, SUMO is constantly being added to and removed from Top2 by the appropriate enzymes. The unsumoylated version is the one that helps maintain chromosome cohesion, perhaps through its effects on chromosome structure. But when tagged with SUMO, Top2 can no longer sustain the cohesion, allowing chromosome separation. Thus, when a mutation inactivates the protease that should remove SUMO from Top2, the sumoylated version will accumulate at the expense of the unsumoylated form. The result: Chromosomes separate prematurely.

    Consistent with this model, Dasso and her NICHD colleagues have recently found that Top2 is heavily sumoylated during mitosis in frog eggs. And as expected, preventing that sumoylation blocked chromosome separation.

    Sumoylation might also be involved in another critical event involving the centromere: formation of the kinetochore that attaches the chromosomes to the microtubule fibers of the mitotic spindle that draw the separating chromosomes to the opposite ends of the cell. Researchers have found that SUMO modifies several kinetochore and centromere proteins. And Dasso's team has found that in cultured human cells, addition of SUMO to RanGAP1 is what targets the protein to the kinetochore and mitotic spindle. The presence of RanGAP1, which activates one of the enzymes involved in spindle assembly, “may be needed for kinetochore integrity and microtubule attachment,” Dasso suggests.

    Further evidence for that idea comes from Brian Burke's team at the University of Florida, Gainesville. These researchers found that depletion of RanBP2, the SUMO E3 ligase, results in abnormalities in kinetochore structure and thus in mitosis. This might be because RanBP2 binds RanGAP1 at the kinetochore just as it does at the nuclear pore, or because it is needed for sumoylation of RanGAP1, or both.

    Getting it on.

    SUMO addition occurs in three steps. In the first, the protein is attached to the E1 activating enzyme with the aid of energy released from ATP hydrolysis. SUMO is then transferred to the E2 conjugating protein (Ubc9), and from there an E3 ligase directs it to its target protein.


    SUMO may also pitch in to help regulate DNA repair. Because DNA is constantly subject to damage, either through errors in replication or by exposure to chemicals or radiation, a cell needs to maintain an effective repair machinery. Researchers have found that sumoylation regulates the activities of several proteins involved in DNA repair. These include p53, sometimes called the “guardian of the genome” because of the key role it plays in DNA repair, and a protein called PCNA (proliferating cell nuclear antigen).

    Stefan Jentsch and his colleagues at the Max Planck Institute for Biochemistry in Martinsried, Germany, showed that ubiquitin addition to PCNA promotes its DNA-repairing activity. In contrast, sumoylation inhibits that activity, apparently because it goes on at the same site, thereby precluding ubiquitin addition. The Martinsried workers speculate that SUMO may direct PCNA to another function, perhaps in DNA replication.

    Disease links

    With sumoylation apparently involved in so many cellular activities, it's not surprising that it may be relevant to diseases as well. There are hints that it is involved in the pathology of neurodegenerative diseases, including Huntington's, which is caused by mutations in the Huntingtin protein (Htt). Last spring, UC Irvine's Marsh and his colleagues reported that sumoylation of Htt increases its neurotoxicity (Science, 2 April 2004, p. 100) both in cultured human neurons and in a fruit fly model of Huntington's disease.

    Even certain viruses, such as human papillomavirus and the herpesviruses, may utilize a cell's SUMO for their own nefarious purposes. In some cases, sumoylation of viral proteins targets them to the nucleus so that they can take over the cell's replication machinery, thus allowing viral reproduction. For example, Wilson and his colleagues found that blocking sumoylation of a human papillomavirus protein causes it to lose the ability to activate viral replication.

    Viruses may also aid their cause by interfering with the cell's sumoylation machinery. A team including Giulio Draetta and Susanna Chiocca of the European Institute of Oncology in Milan and Ronald Hay of the University of St. Andrews in the United Kingdom reported in the November issue of Molecular Cell that Gam1, a protein from an avian adenovirus, inactivates the SUMO E1 protein in cultured human cells, thus totally blocking sumoylation.

    Because SUMO addition to transcription factors tends to inhibit transcription, the result is an overall increase in gene expression, presumably including expression of the virus's own genes. “Viruses are going to affect host sumoylation with the goal of making an environment in the cell that is favorable for viral replication,” Wilson says.

    Although much remains to be learned about SUMO and its actions, it's already clear that its discovery has opened numerous lines of investigation. Researchers are learning that even a small protein is able to throw its weight around in the cell.


    Calorie Count Reveals Neandertals Out-Ate Hardiest Modern Hunters

    1. Elizabeth Culotta

    NEW YORK CITY—Top Neandertal experts gathered from 27 to 29 January for an invitation-only conference sponsored by New York University and the Max Planck Institute for Evolutionary Anthropology in Leipzig.

    Cartoons, B-movies, and anthropologists agree that Neandertals were a husky tribe. But how much fuel was needed to power those stocky, powerful frames? Scientists have speculated that supporting such massive bodies in the chill of glacial Europe required a hefty dose of calories and perhaps oxygen to burn them; the need for increased oxygen, in turn, might have spurred the evolution of Neandertals' large chests, which presumably enclosed capacious lungs. At the meeting, paleoanthropologist Steve Churchill of Duke University in Durham, North Carolina, unveiled numbers to test those ideas.

    For living humans, physiologists have developed equations that relate parameters such as size, skin surface area, and basal metabolic rate (BMR, or the number of calories burned to maintain body temperature at rest). To tailor the equations to short-limbed, big-muscled Neandertals, Churchill created a half-size Neandertal model, proportioned after a famous skeleton from La Ferrassie in France. He plastered the model's surface with a silicone rubber peel, digitized the peel, scaled up his results, and concluded that an 84-kilogram, 171-cm-tall male Neandertal was wrapped in 2.1 square meters of skin.
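For comparison, the classic Du Bois formula for modern-human body surface area gives a somewhat smaller figure for a body of the same size, consistent with Churchill's point that Neandertal proportions differ from ours. The formula is standard physiology; applying it to these particular numbers is our illustration, not part of Churchill's analysis.

```python
# Du Bois & Du Bois (1916) body-surface-area formula for modern humans:
# BSA (m^2) = 0.007184 * weight_kg**0.425 * height_cm**0.725
def dubois_bsa(weight_kg: float, height_cm: float) -> float:
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

# Churchill's model Neandertal: 84 kg, 171 cm tall
print(round(dubois_bsa(84, 171), 2))  # about 1.96 m^2, vs. 2.1 from the peel
```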

    Neandertal data in hand, Churchill used equations from modern human physiology to estimate a male Neandertal BMR of about 2000 kilocalories per day, about 25% more than the average for a modern American male. Then he assumed that Neandertals were about as active as modern people who hunt game near an ice front, namely living Inuit hunters, whose activity consumes about 2 to 2.5 times the calories spent in basal metabolism. Churchill concluded that a male Neandertal needed 4500 to 5040 kilocalories per day to survive; for comparison, Inuit people consume about 3000 to 4000.
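The arithmetic is simple to reproduce. On a 2000-kilocalorie BMR, the reported totals of 4500 to 5040 kilocalories imply activity multipliers of about 2.25 to 2.52, a touch above the quoted 2-to-2.5 range, presumably because the published figures are rounded; the back-calculated multipliers here are ours, not Churchill's.

```python
# Back-calculation of the daily energy budget from the reported numbers.
bmr_kcal = 2000.0                    # estimated male Neandertal BMR
low_mult, high_mult = 2.25, 2.52     # our inference from 4500-5040 kcal/day

low_total = bmr_kcal * low_mult      # ~4500 kcal/day
high_total = bmr_kcal * high_mult    # ~5040 kcal/day
print(low_total, high_total)
```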

    Action man.

    Half-size model helped compute Neandertals' turbocharged metabolism.


    Chemical isotopes in their bones indicate that Neandertals were the original, extreme Atkins dieters, dining almost exclusively on meat. That means that a male Neandertal would have needed to nosh one healthy caribou per month. “That's two kilos of caribou a day,” says Churchill. “That's a lot of meat.” A mixed band of 10 Neandertals might have needed two caribou a week, he said.
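Churchill's caribou figures hang together arithmetically if each animal yields on the order of 60 kilograms of edible meat. That yield is our assumption, inferred from the quoted numbers, not a figure from his talk.

```python
# Back-of-envelope check on the caribou arithmetic. The ~60 kg edible
# yield per animal is our assumption, inferred from "2 kg a day" and
# "one caribou per month"; Churchill's exact value may differ.
meat_per_day_kg = 2.0            # per adult male Neandertal
edible_per_caribou_kg = 60.0     # assumed edible yield per animal

days_per_caribou = edible_per_caribou_kg / meat_per_day_kg
print(days_per_caribou)          # 30.0, i.e., roughly one caribou a month

band = 10
band_daily_kg = band * meat_per_day_kg
caribou_per_week = band_daily_kg * 7 / edible_per_caribou_kg
print(round(caribou_per_week, 1))  # ~2.3, consistent with "two a week"
```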

    Supersized servings might also have led to beefy oxygen requirements and could help explain Neandertals' large chests, Churchill says. Using equations relating energy expenditure to oxygen intake, he concludes that Neandertals at rest respired at an average rate of 19 liters of air per minute—two or three times as much as modern people breathe at rest. So part of Neandertals' generous lung capacity may have gone simply to power resting respiration, says Churchill. Bursts of activity such as hunting might have required even more breathing power, he says.

    Researchers of diverse backgrounds praised Churchill's work. “That pop-up Neandertal is very direct and absolutely the right way to do it,” says paleoanthropologist Milford Wolpoff of the University of Michigan, Ann Arbor. “He's not saying Neandertals are Eskimos, but his estimates are compatible with real data from real people. To me that's exciting.”

    Churchill adds that his calorie calculations show that at times Neandertals may have come perilously close to the edge of survival, especially at the end of winter when food was scarce. “Their caloric budgets must have been tight,” he says. He notes that many Inuit undergo yearly fasts and that Neandertal teeth are “full of defects that indicate periodic starvation.” Periods of winter stress fit with other data on the Neandertals as well as studies of modern hunter-gatherers, agrees archaeologist Alison Brooks of George Washington University in Washington, D.C. “The hunter-gatherer life in the past was very precarious. Even modern hunter-gatherers often lose 10% of their weight in the bad season, and that can stop ovulation and reproduction.” The hunting-dependent Neandertals would have “had the fewest calories available at the coldest time of year; it must have been very stressful,” she says.

    All the same, Neandertals apparently thrived for 600,000 years in Europe's harsh glacial climes, Wolpoff points out: “You can't have a population close to the edge for that long.”

    “Their overall adaptive strategy was successful,” Churchill agrees. “But it was an energetically expensive adaptation.”


    Faces May Lie When Skulls Tell Tales

    1. Elizabeth Culotta

    For decades anthropologists comparing fossils have argued bitterly about whether similarities are due to family resemblance or to convergent evolution. For example, both living Europeans and Neandertals have high-bridged, projecting noses, and a few researchers have cited this as evidence of Neandertal ancestry. But others say big schnozzes may merely reflect independent adaptations to Europe's chilly weather.

    At the meeting, Katerina Harvati and Tim Weaver of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, presented a new way to sort out how genetics and environment affect three parts of the human skull: the face, the braincase or vault, and the temporal bone, which comprises the temple, the ear, and the upper jaw joint. “People have said, ‘This or that feature is best to track population history,’ but it's never really been tested,” said Harvati. With samples from individuals in 10 populations throughout the world, Harvati and Weaver compared three kinds of data: differences in skull morphology, or shape; genetic differences taken from Stanford University geneticist Luigi Cavalli-Sforza's published global database; and climatic differences, as represented by latitude and mean temperature.

    They found that morphological differences did indeed correlate with genetic ones in each part of the skull. But the shape of the face was also associated with climate. For example, Greenlanders and northern Europeans, although relatively distant genetically, both tend to have flat faces.

    Power points.

    Some parts of the skull give more evolutionary information than others.


    In contrast, the vault did not reflect climate but tracked genes closely. For example, Syrians, Italians, and Greeks “clustered together beautifully,” both genetically and in vault shape, revealing recent population history, Harvati says. The temporal bone tracked more ancient population history, she says. Only in this part of the skull were Africans distinct from all other populations, mapping the most ancient split seen in the genetic data. “So if you're looking deep into time, you probably want to use the temporal bone and avoid the face, because the face reflects a complex mix of genes and climate,” Harvati says. Their analysis of temporal bone shape shows that living and Upper Paleolithic modern humans cluster together but that Neandertals are quite distinct from both, suggesting that they are indeed different species.

    Many at the meeting praised what paleoanthropologist Steve Churchill of Duke University in Durham, North Carolina, called Harvati and Weaver's “right-headed approach.” “I'm full of admiration for [Harvati's] work,” said paleoanthropologist Chris Stringer of the Natural History Museum in London. Several researchers pointed out ways to improve the analysis, however, suggesting everything from better genetic data sets to more precise climatic data. And they noted that many anthropologists already rely on the temporal bone—and steer clear of the face—when sorting out evolutionary relationships. All the same, says paleoanthropologist Ian Tattersall of the American Museum of Natural History in New York City, “this is a very imaginative approach, and it's a harbinger for future advances.”


    The Question of Sex

    1. Elizabeth Culotta

    For 150 years, members of Homo sapiens have gazed on the bones of Neandertals and wondered, “Was this one of us?” At a recent meeting in New York, many paleoanthropologists—although not all—answered “No.” Yet even partisans committed to the notion that Neandertals and moderns were separate species agreed that when the groups met, at least a bit of what Ian Tattersall of the American Museum of Natural History in New York City calls “Pleistocene hanky-panky” probably took place.

    Carbon-14 dating of fossils and artifacts suggests that Neandertals and modern humans coexisted for several thousand years in Western Europe, after moderns swept in from Africa and before Neandertals vanished about 28,000 years ago. The minority of researchers who think Neandertals and moderns belonged to a single species have no doubt about what happened next: “What do we do when we encounter anyone? We trade mates and culture,” says Milford Wolpoff of the University of Michigan, Ann Arbor, who has long argued for a single interbreeding human population. “The archaeological record is clearly showing us that these groups are trading ideas, which almost certainly means they're trading mates.”

    Gene swappers?

    Most researchers say modern humans (right) and Neandertals got together—but not often.


    Indeed, the idea of thousands of years of chaste coexistence is too much of a stretch even for many experts who believe Neandertals were a separate species. “If you're counting on humans not to mate, you'll be very disappointed,” warned paleoanthropologist Trent Holliday of Tulane University in New Orleans, Louisiana. In his presentation, Holliday argued that any attempted gene-swapping could well have succeeded. By his count, about one-third of known mammalian hybrids are fertile. They include crosses between mule deer and white-tailed deer, lynx and bobcat, and many others. Primatologist Cliff Jolly of New York University, speaking from the audience, added a crucial primate example: olive and hamadryas baboons of Ethiopia, visibly distinct forms with different social structures. According to mitochondrial DNA (mtDNA), their ancestors diverged about 300,000 to 500,000 years ago, roughly the same time modern humans and Neandertals evolved in separate lineages. In the wild, the baboons freely interbreed within a narrow hybrid zone. “With them as a primate parallel, you'd expect that Neandertals and moderns would have been reproductively compatible,” says Jolly.

    Yet the ancient mtDNA so far gathered from a handful of Neandertals is distinct from that of both early and living modern humans. Jolly and others suggest that behavioral or cultural differences might have kept the gene pools of modern humans and Neandertals mostly distinct even in the face of some mating. Even so, they say, that doesn't mean abstinence worked perfectly. “Neandertals and moderns can be regarded as distinct species, but that does not mean that they were completely reproductively isolated,” says Chris Stringer of the Natural History Museum in London, a longtime advocate of the notion that modern humans replaced the Neandertal species. “The point that came out [at the meeting] is that you can have both: distinct species, and some reproduction.”

    The real question, said paleoanthropologist Jean-Jacques Hublin of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, is whether that reproduction affected later populations of Homo sapiens. “As for sex in the past: They [Neandertals and moderns] did it. I believe that. But does this have any biological relevance? No.” Hublin, Stringer, and others at the conference see no evidence, from fossils or ancient DNA, that Neandertals are part of modern humans' ancestry. Thus they argue that hybridization must have been quite limited.

    Jolly notes that a few genes that were highly adaptive for the local environment might have found their way from Neandertals to modern humans. Genes for pale skin color, for example, are advantageous in the sun-starved north. Wolpoff and a few others go further. They emphasize that even the geneticists admit that current mtDNA data cannot completely rule out a Neandertal contribution, and they cited a few Upper Paleolithic fossils that may show signs of Neandertal traits. Paleolithic hybrids may have bridged two species, but the question of their impact remains as divisive as ever.

    Special Relativity Reconsidered

    1. Adrian Cho

    Einstein's special theory of relativity reaches into every corner of modern physics. So why are so many trying so hard to prove it wrong?

    At an age when most boys would rather chase girls, Albert Einstein daydreamed of chasing light. When he was about 16 years old, Einstein later recalled, he realized that if he ran fast enough to catch up to it, light should appear to him as a wavy pattern of electric and magnetic fields frozen in time. “However,” Einstein observed, “something like that does not seem to exist!” Ten years later, that insight blossomed into the special theory of relativity, which forbade catching light, overturned ancient conceptions of time and space, and laid the cornerstone for modern physics. Now, however, some physicists wonder whether special relativity might be subtly—and perhaps beautifully—wrong.

    In 1905, physicists believed space was a grand stage on which the drama of the universe unfolded and time ticked away at the same rate for all actors. Special relativity denied all that. It replaced space and time as distinct entities with a single “spacetime” that, in mind-bending ways, looks different to observers moving relative to each other. But the theory's implications reach far beyond questions of when and where. Combined with quantum mechanics, it helps explain the stability of matter and even requires the existence of antimatter, says Steven Weinberg, a theoretical physicist at the University of Texas, Austin. “That's the only way nature can be if you're going to satisfy the requirements of both relativity and quantum mechanics,” Weinberg says.

    Yet a growing number of physicists are entertaining the possibility that special relativity is not quite correct. That may sound perverse, but researchers have good reason to hope Einstein's theory isn't the final word: Any deviation from special relativity could point physicists toward an elusive goal, a quantum theory of gravity. Candidate theories can be tested directly only with particle collisions a million billion times more energetic than any produced with a particle accelerator. On the other hand, testing special relativity provides a far more practical, albeit indirect, way of probing quantum gravity, says V. Alan Kostelecky, a theorist at Indiana University, Bloomington.

    Only a decade ago, questioning special relativity would have struck many as heretical, says Robert Bluhm, a theoretical physicist at Colby College in Waterville, Maine. “When I started working on it, I was kind of sheepish about it because I didn't want to be perceived as a crackpot,” Bluhm says. “It seems to really have gone mainstream in the past few years.”

    Physicists are now testing special relativity with everything from enormous particle accelerators, to tiny electromagnetic traps that can hold a single electron for months, to bobs of metal twisting on the ends of long fibers. They are even repeating the famed experiment by Albert Michelson and Edward Morley that in 1887 found no evidence for the “ether” that light was supposed to ripple through just as sound ripples through air. In spite of these efforts, special relativity remains inviolate—so far.

    Unbearable coincidences

    According to legend, Einstein invented special relativity to explain the Michelson-Morley experiment. In truth, he worried more about conceptual puzzles in the theory of electricity and magnetism, which had been hammered out in the 1860s by the Scottish physicist James Clerk Maxwell, says Michel Janssen, a historian of science at the University of Minnesota, Twin Cities.

    Consider the simplest electrical generator—a loop of wire and a magnet moving toward each other at constant speed. Current will flow through the wire. According to Maxwell's theory, different mechanisms drive the current depending on whether the wire is stationary in the presupposed ether and the magnet is moving or the other way around. Yet the current is the same in either case. That supposed coincidence was too fantastic for Einstein. “The idea that we would be dealing here with two fundamentally different situations was unbearable to me,” he later wrote. But Einstein found he could show that the two cases were different ways of looking at the same thing—if he abandoned the ether and familiar notions of space and time.

    A simple analogy captures the essence of Einstein's insight. Imagine two explorers, Alice and Bob, lost in a vast desert. From the top of a dune they spot an oasis. Alice pulls out her compass and range finder and determines that the oasis is 5 kilometers due north. Bob takes his own measurements and finds that it's 4 km north and 3 km east. What's gone wrong? The answer is simple: Alice and Bob disagree because their compasses don't line up. Each has a different notion of north, so what Alice takes to be a purely north-south distance, Bob takes to be a combination of north-south and east-west distances, and vice versa (see figure).
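The numbers in the analogy are not arbitrary: Bob's reading is exactly Alice's, rotated. A few lines of arithmetic (a toy check, using only the figures above) confirm that both observers agree on the one rotation-invariant quantity, the straight-line distance to the oasis.

```python
import math

# Alice reads the oasis as 5 km due north; Bob, whose compass is rotated,
# reads 4 km north and 3 km east. The split differs; the distance doesn't.
alice = (5.0, 0.0)   # (north, east) in Alice's frame
bob = (4.0, 3.0)     # the same oasis in Bob's frame

dist_alice = math.hypot(*alice)
dist_bob = math.hypot(*bob)
print(dist_alice, dist_bob)   # 5.0 5.0: the invariant both agree on

# The misalignment between the two compasses:
angle = math.degrees(math.atan2(3.0, 4.0))
print(round(angle, 1))        # about 36.9 degrees
```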


    In special relativity, traveling at a constant speed relative to another observer mixes time and space in much the same way. For example, imagine that instead of explorers, Alice and Bob are astronauts in deep space. Suppose, in a fit of foolishness, Bob holds up a firecracker in each of his outstretched hands. He sets the explosives off as Alice zooms past at half the speed of light. If Bob sees both firecrackers flash at the same time, Alice will see them flash at different times. So what Bob perceives as a purely spatial distance, Alice perceives as a spatial distance and a time interval. In essence, there is only one spacetime, which they perceive as different combinations of space and time.
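The firecracker example can be made concrete with the Lorentz transformation. This is a minimal sketch in units where the speed of light is 1; the 2-light-second separation is an arbitrary choice of ours for illustration.

```python
import math

c = 1.0                  # work in units where the speed of light is 1
v = 0.5                  # Alice's speed relative to Bob
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# In Bob's frame the two flashes are simultaneous (dt = 0), separated
# by dx; 2 light-seconds is an arbitrary illustrative value.
dt, dx = 0.0, 2.0

# Lorentz transformation of the separation into Alice's frame:
dt_alice = gamma * (dt - v * dx / c ** 2)
dx_alice = gamma * (dx - v * dt)
print(dt_alice < 0)      # True: for Alice, one flash precedes the other

# The invariant spacetime interval plays the role of the desert distance:
# both observers compute the same value from their own coordinates.
interval_bob = dx ** 2 - (c * dt) ** 2
interval_alice = dx_alice ** 2 - (c * dt_alice) ** 2
print(round(interval_bob, 9), round(interval_alice, 9))  # 4.0 4.0
```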

    In precisely the same way, in Einstein's analysis there is only one underlying “electromagnetic” field that requires no ether, and different observers slice it into different combinations of electric and magnetic fields. The underlying unity explains why the current in the simple generator is the same regardless of whether the magnet or the wire is moving. In fact, according to special relativity it's meaningless to say which is “really” moving.

    Outgrowing Einstein

    Einstein doggedly followed his theory to bizarre but unavoidable conclusions. A clock whizzing by at near light speed ticks slower than the watch on your wrist. A meter stick flying past looks shorter than one in your hands. Light travels at the same speed for all observers—so it cannot be caught.

    But special relativity packs even more punch when combined with quantum mechanics to form “relativistic quantum field theory.” That amalgam predicts the existence of antimatter and demands a kind of mirrorlike correspondence between matter and antimatter, which is known as CPT symmetry. It also forges a connection between how much particles spin like little tops and whether two or more of them can occupy the exact same quantum state. That “spin-statistics connection” explains why atoms and nuclei do not implode.

    Antimatter must exist because quantum mechanics blurs notions of before and after, at least for particles, says the University of Texas's Weinberg. Suppose Alice throws an electron and Bob catches it. Observers moving at different speeds will disagree on how long it takes the electron to make the trip, but, sans quantum mechanics, all will agree that Alice throws it before Bob catches it. Mix in the uncertainty principle, however, and some observers will see Bob receive the electron before Alice tosses it, Weinberg says. “And the way that relativistic field theory gets around that difficulty is by reinterpreting it as Bob emits an antielectron that Alice receives,” he says. However, the conceptual fix-up works only if the electron and antielectron have exactly the same mass and other properties—collectively, CPT symmetry.

    The spin-statistics connection is less intuitive. All particles behave like little tops and can have only certain amounts of spin. For example, the photons that make up light have exactly one quantum of spin, whereas the electrons, protons, and neutrons that make up atoms have half a quantum. The spin-statistics connection says that no two identical particles can occupy the same quantum state if they have spin 1/2 (or 3/2, 5/2, etc.). That means the electrons in an atom cannot collapse into a tiny knot. Instead they stack into shell-like orbitals, and this arrangement keeps the atom stable. And it's a consequence of special relativity, says O. W. Greenberg, a theorist at the University of Maryland, College Park. “Violating the spin-statistics connection in a relativistic theory is, so far as we know, just impossible,” Greenberg says.

    Ironically, Einstein disdained the marriage of special relativity and quantum mechanics. “I know from experience how difficult it was to discuss quantum field theory with him,” wrote his scientific biographer, Abraham Pais, who died in 2000. “Relativistic quantum field theory was repugnant to him.” But Greenberg says we should not disparage Einstein because he didn't fully appreciate the implications of his own discovery: “It sort of outgrew Einstein.”

    As the world turns

    Now, however, some physicists are hoping to reach beyond special relativity. Researchers generally agree that the ultimate theory of gravity cannot be a quantum field theory. Such theories assume that particles are infinitesimal points and spacetime is smooth. But according to the uncertainty principle, at minuscule scales spacetime ought to erupt into a chaotic froth that overwhelms any theory of point particles. On the other hand, calculations suggest that alternative theories—such as string theory, which assumes that every particle is really a tiny vibrating string—might not completely jibe with special relativity.

    Unfortunately, sketchy quantum gravity theories cannot tell experimenters precisely what signs to look for, says Indiana University's Kostelecky. So over the last 15 years, he and his colleagues have taken another tack. They start with the relativistic quantum field theory that explains all the particles seen so far, the so-called Standard Model. They add to it myriad “background fields” that lace empty space. These resemble an electromagnetic field in that each points in some direction. But whereas electromagnetic fields arise from charges and currents, the background fields are woven into the vacuum. Known as the Standard Model Extension (SME), this catch-all theory clashes with special relativity because each background field provides a universal benchmark with which to determine whether an object is moving or not, or at least which direction it's going.

    Experimenters are striving to glimpse the background fields, mainly by trying to detect Earth's motion through them. Because Earth spins, a lab will zoom through a background field at different angles at different times of day. So if an experiment bolted to the floor can feel the field, its output should oscillate in sync with Earth's rotation. Researchers expect the effects to be tiny. Spotting them will require, for example, measuring the frequencies of microwaves to one part in 1000 billion or better. But “even though these effects are very small,” Kostelecky says, “the current experimental capabilities are in the range that you would expect to see something” if the fields are there.
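    The logic of the search can be sketched numerically. In the toy model below (my numbers, not any experiment's: the 10-GHz carrier and the one-part-in-10^12 coupling are invented for illustration), a lab-fixed resonator coupled to a fixed background field would show a frequency that swings up and down once per sidereal day as Earth rotates:

```python
import math

# Illustrative sketch: a lab-fixed experiment coupled to a hypothetical
# background field sees its signal modulated at Earth's sidereal
# rotation frequency. All parameter values here are invented.

SIDEREAL_DAY_S = 86164.1      # Earth's rotation period relative to the stars
BASE_FREQ_HZ = 1.0e10         # a 10 GHz microwave resonator, say
FRACTIONAL_SHIFT = 1.0e-12    # hypothesized coupling: one part in 10^12

def resonator_freq(t_seconds, phase=0.0):
    """Resonator frequency at time t as the lab sweeps through the
    putative background field once per sidereal day."""
    modulation = FRACTIONAL_SHIFT * math.cos(
        2 * math.pi * t_seconds / SIDEREAL_DAY_S + phase)
    return BASE_FREQ_HZ * (1.0 + modulation)

# Peak-to-peak frequency swing over one sidereal day, sampled each minute:
samples = [resonator_freq(t) for t in range(0, 86165, 60)]
swing = max(samples) - min(samples)
print(f"peak-to-peak swing: {swing:.3f} Hz")  # ~0.02 Hz on a 10 GHz carrier
```

    A swing of hundredths of a hertz on a 10-gigahertz carrier is exactly the part-in-a-trillion regime the article describes, which is why the experiments demand such extreme frequency stability.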

    SME has limitations. It doesn't explain how the background fields arise. And each type of particle may interact with a different set of fields, leaving experimenters with dozens of measurements to make. Nevertheless, SME tells researchers which experiments should be most sensitive and enables them to compare seemingly disparate efforts, which is why it has sparked much of the interest in testing special relativity, says Blayne Heckel, an experimental physicist at the University of Washington, Seattle. “Once you have this community out there that appreciates what you're doing,” Heckel says, “you don't feel so bad measuring zero.”

    Feel the breeze.

    Experimenters hope to detect oscillating effects as Earth whizzes and spins through putative background fields.


    K0 mesons and clocks in space

    Heckel is hardly the only one to come up short in trying to prove Einstein wrong: Experimenters have found no evidence that special relativity isn't bang on the money. For decades particle physicists have tested CPT symmetry, which now can be analyzed in the context of SME, by comparing particles and antiparticles. Researchers working with the KTeV experiment at Fermi National Accelerator Laboratory in Batavia, Illinois, have shown that fleeting particles called K0 (pronounced kay-zero) mesons have the same mass as their antiparticles to one part in a billion billion. By “weighing” individual particles in devices called Penning traps, researchers at the European particle physics laboratory, CERN, near Geneva, Switzerland, have shown that protons and antiprotons have the same mass to a part in 10 billion.

    Others are probing for background fields by comparing extremely precise clocks. A background field may affect one clock differently from the other, in which case one will speed up and slow down relative to the other over the course of the day. In fact, the “clocks” can be two different frequencies of radiation emitted by the same atom. Using a device called a maser, Ronald Walsworth of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, and colleagues compared two frequencies emitted by hydrogen atoms, which allowed them to probe for fields that might affect the lone proton at the center of the hydrogen atom. No sign of the background fields emerged.

    Some have resorted to a centuries-old technique. In 1798 English physicist Henry Cavendish measured the strength of gravity by dangling a barbell-shaped bob on the end of a fiber and watching it twist as one end came close to a heavy object. Now, Heckel and colleagues at the University of Washington have employed a souped-up version of Cavendish's “torsion balance” to search for background fields that interact with an electron's spin. The bob is a symmetrical assemblage of pieces of magnets arranged so that a majority of electrons spin in one direction. Crucially, the bob has no net magnetism, so it won't interact with the inevitable stray magnetic fields. So far Heckel and colleagues have seen no unusual twists of their apparatus.

    Physicists have also repeated the Michelson-Morley experiment. Michelson and Morley reasoned that because Earth moves through the light-carrying ether, to an earthbound observer light should travel at different speeds depending on whether it is zipping north-south or east-west. SME's background fields could produce similar effects. Michelson and Morley used light beams and mirrors; today researchers employ “resonators” that ring with microwaves much as bells ring with sound. Two identical resonators are arranged perpendicularly and researchers compare their frequencies, achieving 10 million times better sensitivity than the original experiment, says John Lipa of Stanford University in California. Just as Michelson and Morley caught no whiff of the ether, modern experimenters have found no trace of the SME background fields.

    Researchers had planned to fly atomic clocks and resonators on the international space station, where they would have been far more sensitive than earthbound experiments. But NASA scuttled those plans last year when President George W. Bush set his sights on sending humans to Mars (Science, 30 January 2004, p. 615). “Basically, NASA shut down all the activities in physics on the space station,” Lipa says. “That's politics.”

    A subtler beauty

    Why are some physicists so keen to take on Einstein? Answers vary widely. Experimenters should test basic theoretical assumptions as rigorously as possible as a matter of principle, says Gerald Gabrielse of Harvard University, who led the efforts to compare protons and antiprotons. “If we were to find a violation,” Gabrielse says, “the consequences of that would ricochet through physics, affecting our understanding of the structure of the universe in every way.”

    On the other hand, some particle theorists may be drawn to the matter because “there's not that much else to do,” quips Roman Jackiw, a theoretical physicist at the Massachusetts Institute of Technology in Cambridge. Particle theorists have little fresh and challenging data to gnaw on, Jackiw says, although that should change when an accelerator known as the Large Hadron Collider powers up at CERN in 2007.

    To Kostelecky, the architect of SME, the allure is aesthetic. Special relativity states that spacetime possesses a kind of perfect symmetry, like an infinite plane so featureless that it's impossible to tell where you are and which direction you're facing. In special relativity, the symmetry extends to time, too, so that space and time mix together into a single seamless whole. That “Lorentz symmetry” is so elegant most physicists assume it's true. But “nature's beauty is more subtle than that perfect symmetry,” Kostelecky says. “For me it may make nature more beautiful if it is almost Lorentz symmetric.”

    That sentiment might have intrigued Einstein, who often drew inspiration from his own sense of the beauty of nature and of physical theories. Perhaps he would have followed the thought to deep new insights, just as he surfed an imaginary light wave to one of the most profound ideas ever conceived.

  18. Doubly Special, Twice as Controversial

    1. Adrian Cho

    Quantum gravity may bend, not break, special relativity, some theorists say. Special relativity says that nothing can travel faster than light. Quantum gravity effects might also limit an individual particle's energy, says Giovanni Amelino-Camelia of the University of Rome “La Sapienza.” That could lead to what he and others call “doubly special relativity.” The embryonic theory has no background fields, and just as in ordinary special relativity, it's impossible to tell whether an object is moving relative to the vacuum. But the rules for adding up momentum and energy change, leading to potentially observable astronomical effects.

    Photon photo finish.

    Gamma rays from humongous stellar explosions may reveal hypothesized variations in the speed of light.


    For example, doubly special relativity predicts that the speed of light could depend on its color and energy. Such an effect might be spotted by observing gargantuan stellar explosions known as gamma ray bursts, says Lee Smolin of the Perimeter Institute for Theoretical Physics in Waterloo, Canada. The gamma rays take billions of years to reach Earth, Smolin says, giving the faster ones time to pull measurably ahead of the slower ones.
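    The size of the effect can be estimated with back-of-the-envelope arithmetic. In many quantum-gravity scenarios the fractional speed shift scales as the photon's energy divided by the Planck energy, so the arrival delay over a distance D is roughly (E / E_Planck) × (D / c). A rough sketch (my numbers, chosen only to illustrate the scaling, not taken from the article):

```python
# Order-of-magnitude estimate of an energy-dependent photon delay,
# assuming a linear-in-energy speed shift: delta_t ~ (E/E_Planck)*(D/c).

E_PLANCK_GEV = 1.22e19        # Planck energy in GeV
SECONDS_PER_YEAR = 3.156e7

gamma_energy_gev = 10.0       # a 10 GeV gamma-ray photon, say
distance_light_years = 1.0e10 # a burst ~10 billion light years away

travel_time_s = distance_light_years * SECONDS_PER_YEAR  # D/c in seconds
delta_t = (gamma_energy_gev / E_PLANCK_GEV) * travel_time_s
print(f"arrival delay ~ {delta_t:.2f} s")  # a fraction of a second
```

    A delay of a fraction of a second is tiny, but gamma-ray bursts flicker on millisecond timescales, which is what makes the billions-of-years baseline such a sensitive lever on Planck-scale physics.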

    However, some theorists doubt that doubly special relativity can be made into a coherent theory. Amelino-Camelia says he sees no obvious reason why it can't. Still, he adds, “there are plenty of consistency checks to be made, and I offer no guarantees until they're done.”

  19. We're So Sorry, Uncle Albert

    1. Charles Seife,
    2. Andrew Lawler

    NASA's new focus on exploration closer to home may derail missions aimed at torture-testing Einstein's relativistic ideas

    Einstein is in trouble. A century after his “miraculous year,” astronomers and physicists across the globe have plotted an ambitious, multibillion-dollar challenge to Einstein's theory of relativity. Armadas of spacecraft launched over the next 2 decades will directly test some of the most dramatic assertions of relativity theory: that the entire fabric of space and time ripples with distortions, that there are regions in space where gravity is so strong that light cannot escape, and that the big bang and newly discovered “dark energy” leave a characteristic imprint upon the very distant and very ancient universe. Two great observatories, three smaller probes, and a pair of “vision missions”—which make up NASA's “Beyond Einstein” project—are the culmination of years of planning by astrophysicists.

    The problem is not the tests. Most physicists believe that Einstein's theories will pass them handily and emerge strengthened by the new data. But when and whether the flotilla will be launched is now in question. When President George W. Bush announced last January that NASA would focus on lunar and Mars exploration by robots and humans, Beyond Einstein, which doesn't fit into that vision, faltered. The Administration cut budgets for parts of the effort and put others on the back burner.

    “I sincerely hope [Beyond Einstein] will survive,” says Michael Garcia, an astrophysicist at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts. “I think it's taken a hit already.” Unfortunately, Einstein's trouble may be that he's no longer an important part of NASA's universe.

    Wrinkles and holes in time

    NASA's Beyond Einstein effort tied together two existing projects called the Laser Interferometer Space Antenna (LISA) and Constellation-X, envisioned a new series of probes designed to answer fundamental questions raised by Einstein's work, and proposed innovative future missions to study black holes and peer back to the big bang. Launched with fanfare in February 2003, the program won congressional backing and the full $59 million NASA requested for 2004. It was an auspicious start for an effort estimated to cost $765 million during the first 5 years.

    Beyond Einstein is aimed at addressing questions about Einstein's theory of relativity and about the nature of black holes and galaxies. LISA and Constellation-X, the two main observatories of the Beyond Einstein project, are both expensive flotillas of spacecraft, yet they attack those questions in very different ways.

    Triple threat.

    LISA, a trio of laser-ranging satellites designed to detect gravity waves, faces delays due to budget cuts.


    LISA comprises three spacecraft that will surf the swells of spacetime, flying in formation—a 5-million-kilometer-wide triangle, linked by laser beams. As gravitational waves rattle by, they will stretch and squish spacetime enough to change the relative positions of the three satellites by a fraction of a millimeter (Science, 16 August 2002, p. 1113). LISA will be able to see gravitational waves that earthbound observatories can't, in part because the vast distance between the satellites will make it much more sensitive.

    “The odds are in favor to see a substantial signal,” says Peter Bender, a gravitational physicist at the Joint Institute for Laboratory Astrophysics in Boulder, Colorado. LISA should be able to pick up gravitational waves from various energetic events: the coalescence of massive black holes at the centers of galaxies, the inspirals of black hole binaries in the last moments before they collide, and perhaps even the ripples in space caused by a supernova explosion.

    Detecting waves from these events will not only provide a ringing confirmation that Einstein's gravitational waves are real but will also tell astrophysicists about the nature of black holes and galaxy formation. “During the initial formation of galaxies, we think that they form hierarchically,” says Bender—that small galaxies merge to form larger ones. “Whenever galaxies coalesce, you're likely to have their black holes coalescing also.” So by listening to the ripples caused by crashing black holes, scientists would get a direct view of galaxy birth.

    Constellation-X will also look at black holes, by sensing high-energy light. Matter close to a black hole is extraordinarily hot and emits x-rays; as a hunk of matter falls in, the immense gravitational field of the black hole stretches those x-rays and makes them redder and redder. “You can watch [a black hole's x-ray emissions] change through time,” says Garcia. “You can watch it move from the blue to the red end of the spectrum.” To spot those changes, scientists plan to yoke together four x-ray telescopes—their size limited by the rockets that will launch them into orbit—to form a larger instrument powerful enough to help physicists map spacetime right near the edge of a black hole.

    Plans for Beyond Einstein also include three probes to complement the two large observatories. The Inflation Probe will detect gravitational waves unleashed in the moments right after the big bang, by measuring how they affected the microwave radiation that suffuses the universe. The Black Hole Finder Probe will search a large portion of the sky for hints of black holes, which, among other things, will help scientists pick targets for Constellation-X. The Dark Energy Probe—conceived of as a joint NASA/Department of Energy (DOE) mission—will survey the skies for supernovae. A large census of supernovae, which serve as cosmic yardsticks, will enable astrophysicists to home in on the properties of dark energy, the mysterious antigravity force that is causing the fabric of spacetime to expand faster and faster. “If you talk to anyone at any level at NASA or DOE, they still seem very excited about it,” says Saul Perlmutter, a supernova expert at Lawrence Berkeley National Laboratory in California who is working on one of the proposed designs for the dark energy mission. “I'm hoping you'll see all the important missions get a chance.”


    Constellation-X's fleet of planned black hole-detecting x-ray satellites will remain an artist's conception until 2016 or beyond.


    The dice game

    When the Beyond Einstein project was launched in 2003, the auspices were good. Prioritization studies by physicists and astrophysicists gave LISA, Constellation-X, and some of the probes a very high rating. As a result, Congress backed the project and gave NASA the money it asked for to start Beyond Einstein on its way.

    Then, a year ago, NASA slammed on the brakes. After Bush called for humans to return to the moon and eventually travel to Mars, White House and NASA managers diverted money from efforts like Beyond Einstein to get that program under way. NASA asked for only $40 million for the entire Beyond Einstein effort in 2005, delaying LISA and Constellation-X each by several years and indefinitely postponing the other missions (Science, 6 February 2004, p. 749). After receiving $25 million for LISA in 2004, NASA asked for only $19 million in 2005. Similarly, the proposed budget for Constellation-X dropped by nearly half, from $23.4 million to $12 million. And the agency slashed $1 million from the $10.5 million set aside in 2004 to start work on the other missions.

    The decision to retrench stunned scientists. “I really hope the situation is going to change and NASA will take another look at their overall priorities,” says Bender. “There's been a lot of work by the astrophysical community to determine their decadal prioritization, and the dropping of a substantial piece of that looks like a mistake.”

    Congress reluctantly complied with NASA's less enthusiastic plan in December, approving the Administration's request. A congressional aide says that many lawmakers were unhappy with NASA's decision to pull back but that the Beyond Einstein projects can't match the political clout of more mature projects. “They are not so entrenched yet, so that makes them vulnerable,” the aide says. He predicts that without strong congressional pressure, Beyond Einstein funding will continue to be squeezed by NASA managers.

    Worse may be yet to come. The agency still must allocate hundreds of millions of dollars in congressional earmarks, as well as space shuttle and Hubble Space Telescope costs, within its 2005 budget. And both NASA officials and outside scientists fear that younger efforts like Beyond Einstein will bear the brunt of those cuts, which likely will not be announced until late February, weeks after the 2006 White House budget request goes to Congress.

    One way around the financial squeeze might be to find allies with expertise and money. NASA and the European Space Agency (ESA) agreed last August to work together on the two separate missions that make up LISA, to the tune of $1 billion per agency. The first flight is a 2008 dress rehearsal known as LISA Pathfinder to test the advanced technologies to be used on the later LISA mission.

    But already the tremendous complexity of the technologies has led to cost overruns and schedule delays. Two of the most difficult engineering challenges are to keep the innards of the LISA satellites on the correct path to within a nanometer or so and to ensure that disturbances such as solar photons and the spacecraft's own electrical systems don't affect the measurements. The key is a gravitational reference system, which will be NASA's main contribution to LISA Pathfinder. But cost increases on that system triggered a cancellation review last fall—which it survived—and another will take place in March. Bryant Cramer, Beyond Einstein program manager at Goddard Space Flight Center in Greenbelt, Maryland, says he is confident the system will survive the next scrutiny as well. In the meantime, technical challenges have postponed the launch of LISA Pathfinder from 2007 until the summer of 2008.

    Once in space, LISA Pathfinder's results will immediately be fed into the design of LISA itself, in preparation for a 2013 launch of the three-satellite system. As a result, any delay to the first mission will ripple back, affecting not only LISA but possibly the other Beyond Einstein missions waiting in the queue.

    After LISA comes Constellation-X. But plans for a 2013 or 2014 launch have already been abandoned in the wake of the cuts. Paul Geithner, Beyond Einstein program manager at NASA headquarters, says that the mission won't get off the ground before 2016, and other agency managers predict it will be several years later than that. As with LISA, funding trouble makes a partnership with ESA more attractive to NASA. The two agencies now are in negotiations to combine their efforts.

    The three Einstein probes are next in line, although their fate is uncertain. “They haven't been dropped, merely put on hold until we understand the budget situation,” says Cramer. Coordination between NASA and DOE on the Joint Dark Energy Mission, for example, continues despite the space agency's decision not to fund the effort. “They've got the money, and we don't,” Cramer adds. Although legislative language in DOE's funding bill suggests DOE could take over the mission, “we still hope to partner with NASA,” says Robin Staffin, director of DOE's high-energy physics program. A few advanced concept studies are under way, to the tune of $100,000 each, but “the real money doesn't kick in for some time,” says Geithner.

    Geithner and Cramer are nevertheless confident that the scientific promise of Beyond Einstein will ultimately carry the day. “Sensible people will see that these projects offer a whole new window into the universe,” he says. “And Congress always resists” NASA's efforts to rob science to pay for space-flight missions, he notes. Cramer believes that LISA in particular has enough momentum to stay on track, because design work began in October with the approval of NASA science chief Al Diaz. “There is enough scientific cachet to do this mission, though it may be slower” than anticipated, he adds.

    And Geithner contends that despite the budget competition, his program is here to stay. “It's going to happen,” he insists. “Beyond Einstein is not going away.”