News this Week

Science  03 Oct 2003:
Vol. 302, Issue 5642, pp. 28


    Speeding Up Delivery: NIH Aims to Push for Clinical Results

    1. Jocelyn Kaiser*
    1. With reporting by Jennifer Couzin.

    Genomic research is pouring out of government-funded laboratories at a torrential rate. Yet medicine has lagged behind in converting basic discoveries into benefits for patients. That troubles Elias Zerhouni, director of the National Institutes of Health (NIH). He has spent the year since he took charge of NIH looking for ways to make the process more efficient. This week he unveiled a “roadmap” that he says can be used to transform the way the biomedical giant in Bethesda, Maryland, does business.

    The plan aims to give scientists more tools, encourage cross-disciplinary teams, and overhaul the infrastructure for clinical trials. “Think of it as synergizing areas that no institute either has the mission or resources to invest in,” Zerhouni says. Funding is relatively modest: $2.1 billion over 6 years, which, averaged per year, is just over 1% of NIH's $27.3 billion 2003 budget. But “it's a revolutionary process for NIH,” says Zerhouni, because all 27 institutes and centers agreed to chip in. “That has never been done before.”
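The budget figures quoted above can be sanity-checked with a quick calculation (a sketch only, using the article's numbers):

```python
total = 2.1e9       # roadmap funding over 6 years, in dollars
years = 6
nih_2003 = 27.3e9   # NIH's 2003 budget, in dollars

per_year = total / years               # $350 million per year
share = per_year / nih_2003 * 100      # share of the 2003 budget

print(f"${per_year / 1e6:.0f} million per year, {share:.2f}% of the NIH budget")
# → $350 million per year, 1.28% of the NIH budget
```

The result squares with Zerhouni's characterization of "just over 1%" of the agency's budget.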

    Some scientists who took part in the roadmap meetings, such as Stanford cardiovascular researcher Judy Swain, are delighted. “This is going to make the whole NIH stronger,” she says, adding that it will also respond to a public hungry for results. Zerhouni has “got to show a return on the investment” of NIH's 5-year budget doubling, says Swain. Former NIH Director Harold Varmus, now president of Memorial Sloan-Kettering Cancer Center in New York City, agrees that Zerhouni “has done a terrific job.” But, he says, “the question is whether consensus will persist if budgets tighten.”


    Elias Zerhouni consulted with more than 300 experts to come up with a “roadmap” to accelerate biomedical research.


    Zerhouni guarded the details of the final plan (summarized on p. 63) so closely that few outside NIH had seen them before last Tuesday. NIH officials acknowledge anxiety about how the plan will be received. One reason: Many people remember the massive strategic plan crafted 10 years ago by outgoing NIH Director Bernadine Healy, which went nowhere. Zerhouni insists that his roadmap is not a strategic plan but is “much more focused and narrowly directed.” Still, perhaps mindful of the Healy history, NIH released no glossy document; instead, it put the details on a Web site.

    To reach a consensus, Zerhouni met with more than 300 biomedical experts from outside and within NIH over the course of 14 months, brainstorming about roadblocks to research that no single institute could overcome. “I didn't want to impose any vision,” Zerhouni says.

    The result is a grab bag of initiatives (see box), from a new award that would give top researchers $500,000 a year for 5 years with no strings attached (Science, 15 August, p. 902) to “nanomedicine” centers to a national grid for biological computing. The plan also attempts to boost collaborations among different disciplines by funding centers and meetings.


    Most sweeping is the roadmap's scheme for “reengineering” and speeding up clinical research. The 10-year plan would attempt to address a major inefficiency: Each time a trial begins, researchers recruit patients and build a data-collection system from scratch. To save resources and allow data to be compared across trials, NIH wants to develop a national network for clinical trials that uses standard data protocols. Another piece of the plan would recruit family physicians to join in clinical research. The plan is “very ambitious,” says clinical researcher Steven Cummings of the University of California (UC), San Francisco, who helped craft it.

    Although the basic research initiatives hit themes already supported by NIH, Zerhouni says the details are new. Some would fill gaps, for example, by generating large quantities of membrane proteins, which are difficult to crystallize. “Nobody has had a combined effort around that issue,” Zerhouni says.

    Another new departure would involve all of NIH in drug discovery. The agency intends to set up a “molecular library” that includes at least 500,000 synthetic small molecules. Scientists could then screen potential drug targets against these compounds. Some might be candidates for further development—at which point private industry could pick them up. NIH wants to make its library freely available, but it is still sorting out how to do that and offer drug companies exclusive licensing rights, says Francis Collins, director of the National Human Genome Research Institute.

    The roadmap calls for many, many new centers. But NIH's investigator-initiated R01 grants won't be sacrificed to pay for them, Zerhouni argues, because the new plan takes “a very small percentage of the [overall] NIH budget.” NIH has allotted $128.3 million in 2004: $35 million in special funds from Congress, and most of the rest from institutes under existing transfer authority. The annual budget will roughly double in 2005, says Zerhouni.

    Computational scientist Larry Smarr of UC San Diego notes that the plan is “very much in line” with a recent report from the National Academies that calls for putting at least 5% of NIH's budget into transinstitute initiatives (Science, 1 August, p. 574). “I think there will be strong support from the field.”

    Others are more cautious. Steven Teitelbaum of Washington University in St. Louis, Missouri, past president of the Federation of American Societies for Experimental Biology, says he's “very positive” about the roadmap's potential to give researchers access to “big science” technologies. But Teitelbaum doesn't buy the argument that R01 funding won't be affected. “It's a zero-sum game,” he says. Some NIH officials, such as Collins, acknowledge that R01 success rates could decline if roadmap initiatives take off. “But … if we don't do this, we're going to slow down the translational process,” Collins says.

    Teitelbaum also has concerns about funding scientists using a model similar to that used by the Howard Hughes Medical Institute, which he thinks could end up giving the awards to “old boys” instead of fresh talent. “It has to be done carefully,” Teitelbaum says. “Careful” seems to be NIH's watchword. Each initiative will be monitored according to year-by-year goalposts and, if it falls short, will be shut down. “You don't want to unbalance the portfolio,” Zerhouni says.

    The corporate lingo, say NIH insiders, is part of Zerhouni's approach to promoting NIH to the public and to Congress. The idea, they say, is to make the plan so appealing to lawmakers that they add more funding.

    But the roadmap could meet resistance from patient advocacy groups because it doesn't mention any diseases. That's deliberate, say NIH officials, to stick to broad themes. The first public test of this approach was expected this week before a joint hearing of House and Senate oversight committees to review the National Academies report on NIH's structure.


    Experts Put Some Flesh on a Bare-Bones Proposal

    1. Gretchen Vogel

    BERLIN—European scientists have long dreamed of having their own National Institutes of Health or National Science Foundation. That vision took a step toward reality last week, when a panel set up to advise European Union (E.U.) research ministers laid out a detailed proposal for a $2.3-billion-a-year fund to support basic research across the continent.* The plan is not as radical as some earlier proposals, at best amounting to a miniature version of the U.S. grantmaking powerhouses. But it “is going in the right direction,” says Frank Gannon, executive director of the European Molecular Biology Organization in Heidelberg, Germany.

    Europeans already have their own national research agencies, as well as the E.U.'s Framework 6 program, which is slated to spend nearly $20 billion on research over the next 5 years. But Framework 6 has long been dogged by complaints that it is overly bureaucratic and skewed too heavily toward applied projects (Science, 8 November 2002, p. 1163). To compete with the United States and Japan, many science leaders argue, Europe needs a fund to support investigator-driven basic research across all disciplines; the notion of a European Research Council (ERC) began to take shape early last year.


    Federico Mayor's panel has recommended creating a European Research Council within the Framework program.


    The ERC Expert Group's interim report, issued on 25 September, calls on the E.U. to allot at least €2 billion per year to a new European Fund for Basic Research. The fund should be a line item in Framework 7, starting in 2007, but administered by an independent ERC led by “eminent researchers,” the report recommends. Framework's current basic research elements, such as its networks-of-excellence grants, could be transferred to the new fund.

    Early discussions had aimed to establish the ERC outside Framework—perhaps even outside the E.U. bureaucracy altogether—but Gannon says the proposal to work within the system is wise. “It makes political sense that the E.U. would be the owners of this,” he says, particularly because new money for an ERC is far more likely to come from the E.U. budget than from national science agencies. A 1% shift of funds from E.U. agricultural subsidies into research would mean an extra $1 billion per year, Gannon notes. Expert group chair Federico Mayor, a biochemist and former head of UNESCO, agrees. “We want the money to be en plus, not money from other science programs,” he says.

    The interim report is open for comment through the end of next month, with a final version to go to E.U. research ministers in December. They are expected to embrace the proposal, says Enric Banda, secretary-general of the European Science Foundation in Strasbourg, France. The more daunting challenge, he says, will be selling the plan to the finance ministers who hold the E.U.'s purse strings.


    SARS Experts Want Labs to Improve Safety Practices

    1. Dennis Normile*
    1. With reporting by Martin Enserink, Ding Yimin, and Bai Xu. Ding Yimin and Bai Xu are reporters with China Features in Beijing.

    Last week's confirmation that a young Singaporean was infected with severe acute respiratory syndrome (SARS) through a lab accident was both a relief—because SARS does not appear to be spreading—and an alarm bell. With untold numbers of SARS samples scattered across Asia, health officials are worried that sloppy handling, not a presumed animal reservoir or asymptomatic carriers, might pose the greatest risk for the reemergence of the deadly disease. “Checking and certifying [biosafety] level 3 and 4 facilities is an international problem,” says Anthony Della-Porta, an Australian biosafety consultant who led an investigative panel set up by the Singaporean government. “This is a beneficial wake-up call.”

    The idea that a lab accident could spark a new SARS crisis has long worried health officials. The issue will be at the top of the agenda at a meeting of SARS lab representatives in Geneva later this month, when the World Health Organization (WHO) will encourage countries to regulate their infectious diseases research and set up training and accreditation procedures. But in the end, each country will have to get its own house in order. “We can make recommendations, but we're not a policing organization,” says Klaus Stöhr, a WHO virologist who played a key role in coordinating the organization's response to the SARS crisis.


    A researcher works at Singapore's Environmental Health Institute, where another scientist was infected with SARS in a lab accident.


    The Singapore investigative panel concluded that the patient, now fully recovered, contracted SARS while working with a West Nile virus sample that had been contaminated with the SARS coronavirus in a biosafety level 3 (BSL-3) lab at the Environmental Health Institute (EHI) (Science, 26 September, p. 1824). The EHI lab was one of several that Singapore pressed into service at the height of the SARS crisis. “They didn't have the background to understand how to handle dangerous agents,” Della-Porta says about the 1-year-old lab, which primarily studies mosquito-borne diseases. Not only were there problems with the lab itself, including improper air circulation and poorly located autoclaves and freezers, but operating procedures, including training and record-keeping, were also found wanting.

    In addition to EHI, the panel investigated three other Singapore labs that handle the SARS virus. Although none had lapses as serious as those at EHI, each fell short of meeting WHO BSL-3 guidelines. The report recommended national legislation to mandate detailed safety codes covering the periodic certification of “both structure integrity and operating procedures” at biosafety laboratories and the introduction of a system to track the importation, exportation, and in-country transfer of infectious agents. Khaw Boon Wan, acting minister for health, said last week that the government would address the issue in the next few months.

    Only a handful of countries have adopted nationally enforced biosafety guidelines. Chen Xuli, an official at China's Ministry of Health, says that the mainland already follows a three-step certification process covering design, construction, and commissioning for biosafety labs. The ministry also issued guidelines last summer for the management of SARS tissue samples and viral strains. But WHO recommended that the ministry go a step further by keeping track of where SARS samples are stored and used. Tracking how each institution in China is handling SARS samples is a huge task, notes Alan Schnur, WHO's team leader for communicable disease control in Beijing. He believes that the ministry is following up on the recommendations. But, “we don't expect to have results by tomorrow morning.”


    Smashup Sends Alien Stars Streaming Near the Sun

    1. Robert Irion

    Stars from a foreign galaxy are invading our neighborhood, according to a new analysis of the fate of a small galaxy being shredded by the Milky Way. The discovery promises to give astronomers a rare close-up view of stars born in different physical conditions.

    Astronomers spotted the distorted galaxy, called the Sagittarius dwarf, in 1994 on the far side of the Milky Way spiral from our sun. Later observations showed that its stars escape into extended streams from the outskirts of the dwarf as it orbits within the Milky Way's intense gravitational field. A recently finished atlas called the Two Micron All Sky Survey (2MASS) allowed astronomers to trace the full extent of these “tidal tails” for the first time.

    Sorting through a half-billion objects in the 2MASS catalog, a team led by astronomer Steven Majewski of the University of Virginia in Charlottesville found several thousand M giants, a distinctive class of red giant stars common in the Sagittarius dwarf but rarely seen above or below the plane of our galaxy. The dwarf's tidal tails popped out dramatically, swooping in arcs more than 100,000 light-years from the Milky Way's center. According to the team's models, so many stars have been ripped from the Sagittarius dwarf in the last 2 billion years that the little galaxy—just one-10,000th as massive as the Milky Way—is on its last legs. “It's dissolving right before our eyes,” says Majewski. The team's report will appear in the 20 December issue of the Astrophysical Journal.


    Stars yanked from a dwarf galaxy (red) loop around the Milky Way and dive near our sun (yellow dot).


    By a quirk of timing, stars in one of the dwarf's tidal tails are raining down upon the sun's current position. “That's very important and very surprising,” says astronomer Heidi Newberg of the Rensselaer Polytechnic Institute in Troy, New York, because astronomers now can try to identify specific alien stars by scrutinizing their motions and compositions. Indeed, Majewski calculates that there should be at least one such star within 100 light-years of the sun. The proximity of the dwarf's remains also ought to be good news for physicists hunting for so-called dark matter, because many astronomers suspect that dwarf galaxies are especially rich in dark matter. But the Sagittarius dwarf may be an exception, Majewski notes: Its disintegration suggests that it contains precious little dark matter to hold it together.


    'Tragedy of the Commons' Author Dies

    1. Constance Holden

    Ecologist Garrett Hardin never minced words in presenting his unvarnished view of humanity's impact on the planet. And he was no less direct in planning his death. On 14 September he and his wife committed suicide at their home in Santa Barbara, California. Hardin was 88, and his wife, Jane, was 81. Both were in very poor health.

    Hardin is best known for his 1968 article in Science, “The Tragedy of the Commons” (13 December 1968, p. 1243). In it he argued that if everyone had free access to common property, the resource would be lost to all. But Hardin was immensely influential in a host of related causes, including environmentalism, population control, abortion rights, and restrictions on immigration. His hard-headed approach to the competition for resources won him notoriety as well as fame—as when he suggested that if rich people let poor people into their “lifeboat,” all will sink. “The human species viewed as a whole has been a disaster for the Earth,” he said in a 1996 interview.

    Double suicide.

    Ecologist Garrett Hardin and his wife, Jane, took their own lives last month.


    He “pushed very hard, was an innovative thinker, and is certainly somebody we're going to miss,” says Stanford University biologist Paul Ehrlich, whose 1968 book, The Population Bomb, also stoked the debate over population and the environment. Herman Daly, an economist at the University of Maryland, College Park, says Hardin showed a new breed of “ecological economists” the importance of “giving the welfare of future generations a weight in moral decisions.”

    Hardin received a Ph.D. in microbiology from Stanford University in 1941 after studying zoology at the University of Chicago. He taught at the University of California, Santa Barbara, until his retirement in 1978. He remained active, however, and in 1986 he and his wife helped found Californians for Population Stabilization. His output totaled 27 books and 350 articles.

    Friends say the Hardins practiced what they preached by collecting rainwater to drink, recycling, composting, and eschewing newspapers because they squander newsprint. They were members of the Hemlock Society, and their deaths occurred a week after their 62nd wedding anniversary.


    Princeton Study Strikes Sad But Familiar Chord

    1. Andrew Lawler

    Women scientists at Princeton University are far more dissatisfied with their jobs than men are, and nearly a quarter complain about inappropriate behavior by colleagues. Those findings are part of a new report by an 11-member faculty panel that parallels a groundbreaking 1999 study at the Massachusetts Institute of Technology (MIT) (Science, 12 November 1999, p. 1272) and provides more evidence of the academic barriers facing women scientists and engineers.

    Released this week, the Princeton study was commissioned 2 years ago by molecular biologist Shirley Tilghman shortly after she became president. Although Tilghman agreed with the report's suggestion to name a special assistant for gender-equity issues, she is balking at a proposal for $10 million to promote hiring and retention of women scientists and engineers because of budget constraints and concerns about its legality, according to university officials.

    The panel, led by molecular biologist Virginia Zakian, found both good and bad news in its examination of 14 departments of natural sciences and engineering. On the one hand, the percentage of tenured women has more than doubled in the past decade to 13%, and it tops 20% in two departments—ecology and evolutionary biology, and psychology. But progress has been wildly uneven, the report states, and “the overall percentages of women continue to be quite low.” How low is demonstrated by the fact that Tilghman's rise to the presidency—which removed her from the faculty rolls—was a noticeable factor in reducing the percentage of faculty women in molecular biology from 30% to 19%.


    The panel didn't find significant gender differences on important yardsticks such as tenure rates and salaries, and workloads and university-level assignments seem to be gender-neutral. Yet only 39% of women said they were very satisfied with their jobs, compared with 63% of the men. And whereas no men reported being very dissatisfied with their work, 7% of women said they were.

    Nearly a quarter of women said their colleagues engaged occasionally or frequently in “unprofessional” behavior and excluded women from professional activities. Zakian predicts that dissatisfaction and such behavior likely will decrease as more women are hired. “Many of these climate issues will change with more women faculty,” she says.

    However, the department is the place where change must come, says the panel, which included two men. There's a long way to go: Only two of 14 departments have had women chairs, and women are less involved in department work than their male colleagues. Even so, the panel stopped short of recommending specific targets, Zakian says, because “each one is a different story—there is no one-size-fits-all goal.”

    Tilghman praised the panel's work as “one of the most thorough analyses to date” on the topic and said she “intends to make resources available to meet the recommendations of the task force,” although no final decision has been made on the $10 million fund. But last week she privately told panel members that legal and budgetary issues make the fund problematic. “I'm disappointed,” says one. But she and other women faculty members said they are confident that the administration remains committed to hiring and retaining women.

    The president did name psychologist Joan Girgus as special assistant to the dean of faculty on gender issues. “I'm optimistic, but it's a very difficult situation,” says Girgus, who says she hopes to speak with every department chair. Nancy Hopkins, the MIT biologist who headed that university's study on women, called the Princeton report superb. “I suspect that most universities willing to be as honest as Princeton would make very similar findings,” she added. The presidents of nine major U.S. research universities hope to meet next spring at MIT to discuss the matter.


    Europe Embarks on Leisurely Lunar Odyssey

    1. Daniel Clery

    CAMBRIDGE, U.K.—The European Space Agency (ESA) on 27 September dispatched a spacecraft on a mission that will try to solve a 4-billion-year-old riddle—how the moon was formed—and look to the future by testing a next-generation type of propulsion.

    ESA has grand ambitions to explore the solar system, from Mercury to the asteroid belt. To prepare for such long-haul missions, agency managers want to get the technology right. That's what the $125 million SMART-1 mission, launched on an Ariane 5 rocket from French Guiana, is all about. At 367 kilograms, the bantamweight spacecraft fits into a cube 1 meter across. Its 14-meter solar panels provide power to ionize wisps of its 82-kilogram supply of xenon gas and shoot it out the back of the craft. This generates a thrust of 70 millinewtons, or about the weight of a postcard against the hand; because that force can be applied continually as long as sunlight falls on the arrays, the craft's speed will build slowly.
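The postcard comparison is easy to verify: dividing the thrust by Earth's gravitational acceleration gives the mass whose weight it matches (a rough back-of-the-envelope sketch, not from the article):

```python
g = 9.81             # gravitational acceleration, m/s^2
thrust = 0.070       # SMART-1's ion-drive thrust, newtons (70 mN)

mass_equiv_g = thrust / g * 1000   # equivalent mass in grams
print(f"{mass_equiv_g:.1f} g")
# → 7.1 g
```

A few grams is indeed the mass of a postcard, so the force on the craft really is that gentle; only its continuous application over months makes the trajectory work.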

    Ion drives have been used to keep orbiting satellites in position; NASA's Deep Space 1, launched in 1998, was the only spacecraft to date to use one as its main source of propulsion. ESA was keen to get in on the act because ion drives provide 10 times as much impulse per kilogram of propellant as chemical thrusters. It plans to employ an ion drive in its 2011 or 2012 mission to Mercury.

    Easy rider.

    Europe's SMART-1 will put ion-drive propulsion through its paces before hunting for lunar ice and probing the moon's origins.


    SMART-1, now in Earth orbit, will ever so gradually spiral out to an altitude of 200,000 kilometers before feeling a tug from the moon; it should be captured in March 2005. After completing its 18-month voyage (Apollo 11, for comparison, took 3 days), it will transform into a science mission. Its instruments include a miniature charge-coupled device camera, an infrared spectrometer, and an imaging x-ray spectrometer. Infrared maps of the moon have been compiled before, most recently by NASA's Clementine. But SMART-1's infrared eye will make more-detailed surveys of intriguing areas using 250 wavelength channels—a big jump from Clementine's five. “It's an enormous improvement in spectral resolution,” says co-investigator Sarah Dunkin of the U.K.'s Rutherford Appleton Laboratory in Chilton.

    The spectrometer will also hunt for water in the shadowy craters of the south pole. As no direct sunlight falls there, the detector must rely on light reflected off crater slopes then bouncing off the ice up to the spacecraft. “It sounds like a long shot,” admits Dunkin, but researchers are hoping that with regular overflights over 6 months they may gather enough light to make the first direct observation of lunar water.

    The big science question that SMART-1 hopes to shed light on is how the moon was born. The prevailing theory is that it coalesced from the debris of a titanic collision between Earth and a large body some 4.5 billion years ago. Rocks hauled back by Apollo missions suggest that the moon has constituents similar to those of Earth's mantle. But that was a limited sample, and “there is a desperate need for a global inventory of what the moon is made of,” says principal investigator Manuel Grande of Rutherford. X-rays in sunlight impinging on the moon cause surface atoms to fluoresce; these photons, with wavelengths that are characteristic for each element, can be picked up by SMART-1's x-ray spectrometer.

    SMART-1 project manager Giuseppe Racca cautions that because many of the craft's systems and instruments are experimental, they cannot be guaranteed to work as predicted. But if SMART-1 delivers, he says, “it will open a new era of lunar science.”


    Report Says Listings Draw on Sound Science

    1. Robert F. Service

    The science behind listing endangered species in the United States is solid, says the government's watchdog agency. But the process of preserving their habitat is a quagmire.

    Both supporters and opponents of the Endangered Species Act (ESA) have found something to like in a new report by the General Accounting Office (GAO). Environmentalists were heartened by GAO's conclusion that decisions governing which species should be protected and how much habitat should be set aside for their recovery “are generally based on the best available science.” But Republicans in Congress say that it backs their contention that legal snafus are undermining recovery efforts.

    “We have an affirmation that the species on the ESA listing are there for legitimate reasons. That is huge,” says John Kostyack, senior counsel for the National Wildlife Federation in Washington, D.C. GAO noted that interviews of peer reviewers of listings from 1999 through 2002 revealed “overwhelming” support for the agency's decisions.

    Still, the report wasn't all good news for the U.S. Fish and Wildlife Service (FWS), which oversees endangered species. GAO concluded that the process of designating the critical habitat that endangered populations need to recover has become dominated by litigation, sapping much of the agency's ESA funding. The legal gridlock has led to “a serious crisis,” it adds, including a backlog of more than 200 species waiting for ESA listing.

    “It confirms what we've been saying all along, that the ESA is broken,” says Nicole Andrews, press secretary for the House Committee on Resources, which requested the study. Court orders mean that FWS biologists sometimes must designate critical habitat even if they conclude it would provide little benefit, she notes. A sound-science approach should allow FWS to recommend no habitat designation, she says.

    One way to ease the gridlock, says GAO, is for FWS to issue clearer guidelines on when it is appropriate to designate critical habitat. But agency officials say that step could put them in hot water with the courts by diverting resources from other court-mandated activities.


    Physics Tries to Leave the Tunnel

    1. Charles Seife

    A decade after the demise of the Superconducting Super Collider, high-energy physicists are planning new experiments to recoup the insights it was supposed to yield. But many say their field will never be the same.

    BATAVIA, ILLINOIS—A hair-raising electronic screech fills the control room. The technicians look up at the loudspeakers in disbelief and groan. As the operators frantically flick switches underneath banks of red alarm lights, a calm female voice intones, “The Tevatron is off.”

    Scientists, engineers, and technicians here at the Fermi National Accelerator Laboratory (Fermilab) are struggling to make their finicky machine behave. It's no easy task. The Tevatron is the most powerful particle accelerator in the world, yet nagging technical problems have cast a shadow on its potential. As technicians pore over schematics of refrigeration units, trying to pinpoint where the failure occurred, accelerator operator John Sutherland leans into the huddle. “It's a fairly stressful job,” he says. “We've had a string of bad luck lately.”

    A lot of hopes are resting on the Tevatron's luck changing. The machine was upgraded 2 years ago in an effort to fill a small part of the void left when the U.S. Congress unceremoniously killed the Superconducting Super Collider (SSC) 10 years ago this month. Had the SSC been built, the most powerful machine in the world would now be a 20-trillion electron volt (TeV) collider in Texas, instead of a balky 1-TeV accelerator in Illinois. The monstrous accelerator would likely have turned on in the late 1990s, and by the turn of the century, the first results would have leaked out. Right now, we would likely be reading the first ironclad scientific papers about the discovery of new particles, such as the Higgs boson.

    “The state of the world would be very different,” says Stan Wojcicki, a physicist at the Stanford Linear Accelerator Center. “We would have known much more. We would have known if supersymmetry exists. We would have known if the Higgs exists.” Instead, 10 years after the cancellation of the SSC, physicists are still hoping to wring some interesting physics out of the Tevatron while they await the debut in 4 years' time of Europe's Large Hadron Collider (LHC), a machine that might answer some of the questions the SSC was to have tackled—if all goes according to plan. But the LHC might be the last of its kind. Physicists are being forced to contemplate the possibility that the high-energy frontier will close after the LHC finishes its run—in part, because the “big science” approach to particle physics suffered such a blow from the SSC's demise. “The SSC has cast a very long shadow” over high-energy physics and big science in general, says Fermilab physicist William John Womersley. “We're still dealing with the legacy.”

    The vision of the past

    The SSC was conceived in 1982, high in the Rocky Mountains. At a summer conference in Snowmass, Colorado, the U.S. high-energy physics community cobbled together a vision of the next big project: an enormous collider that would shoot beams of protons, each with 20 TeV of energy. Protons smashed together with such enormous energy would create particles out of the vacuum. The tremendous shower of mass and energy would likely reveal exotica that particle physicists had sought for decades.

    Buried hopes.

    The underground accelerator lab south of Dallas was supposed to unlock the secrets of mass and discover, or banish, the doppelgänger particles of supersymmetry.


    Interest in the project grew, and by the end of 1983 the Department of Energy (DOE) began to fund an initial design study. The design was bold—and expensive. The $3 billion machine (in fiscal year 1984 dollars) would require a circular tunnel at least 90 kilometers around, studded with superconducting magnets. These magnets would steer and focus two beams of protons, accelerated to more than 99% of the speed of light, before smashing them together inside detectors. The result would be a treasure trove for high-energy physicists.

    The SSC was designed to discover two key particles. The first is the Higgs boson, which is postulated to imbue particles with mass; the second, known as the lightest supersymmetric particle (LSP), would reveal a shadowy mirror universe that only subtly influences our own. Many scientists are convinced that these particles exist and that they are just out of reach. Together, their presence would fill two big holes in the Standard Model of particle physics: the problems of mass and unification.

    Taken literally, the Standard Model says that particles shouldn't have any mass at all. Until the 1960s, whenever scientists tried to insert mass terms into its equations, the model blew up and ceased making mathematical sense. Physicists solved the problem by positing the existence of one or more new force-carrying particles, now known as Higgs bosons, that could give the particles mass without straining the equations of the Standard Model beyond their breaking point.

    In a massless universe, it takes no effort to shove a particle and get it to move. Indeed, you can define an object's mass as its resistance to acceleration, so something with no mass offers no resistance. Higgs bosons are sticky particles that exist everywhere in the universe, grabbing onto passing objects. This ubiquitous stickiness makes an object resist acceleration. It imbues the object with mass. The problem is that nobody has been able to see a Higgs boson—yet.

    What the Higgs does for mass, the LSP would do for a theory known as supersymmetry. Supersymmetry addresses the second main problem of the Standard Model: unifying forces that cause particles to interact.

    As scientists probe forces on a smaller and smaller scale and at higher and higher energies, the electromagnetic force and weak force merge into a single “electroweak” force. However, the “strong” force, which cements the nucleus together, doesn't join smoothly with the other two forces. That incompatibility prevents scientists from understanding the physics that governs very tiny scales and high energies. Supersymmetry offers a solution that is both slight and radical. Extending the basic framework of the Standard Model, it gives each known particle a doppelgänger that hasn't yet been discovered. Each quark has a squark (supersymmetric quark) counterpart, the electron has a selectron, the photon has a photino, and so on. The mathematical simplicity of the model, coupled with the idea that the LSP is a beautiful candidate for the “exotic cold dark matter” that cosmologists believe makes up five-sixths of the mass in the universe, makes a compelling case for supersymmetry.

    Theory says that both the Higgs boson and the LSP must be relatively massive particles. The more massive the particle, the shorter its lifetime, the tinier the scale it lives on, and the harder it is to discover. Physicists need large, powerful atom smashers to find these massive and evanescent particles.
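    The inverse relations invoked here—heavier means shorter-lived, smaller, and harder to make—are standard physics heuristics rather than anything specific to the SSC. In rough textbook form (these formulas are added for illustration and are not drawn from the article):

    ```latex
    % A particle of mass m sets a length scale via its Compton wavelength,
    % which shrinks as the mass grows:
    \lambda_C = \frac{\hbar}{m c}
    % An unstable particle with decay width \Gamma survives only about
    \tau \sim \frac{\hbar}{\Gamma},
    % and heavier particles typically have larger widths, hence shorter lives.
    % Producing the particle at all requires a collision energy of at least
    E \gtrsim m c^2,
    % which is why more massive quarry demands a more powerful collider.
    ```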

    If supersymmetry or Higgs theory (or both) is correct, the SSC should have been powerful enough to produce massive numbers of Higgs and LSPs, enough that scientists could discover and weigh these particles—or discredit the theories behind them if the particles weren't found. “We certainly would know if there's a Higgs or several Higgs by now, and supersymmetry—lots would be known,” says Thomas Kirk, an associate director at Brookhaven National Laboratory in Upton, New York.

    The reality of the present

    By the mid-1980s, physicists finalized the design for the accelerator and, with the help of sympathetic officials within DOE, got the green light from then-President Ronald Reagan. But rifts within the community sapped momentum from the project, and the cost soared. Design changes and a slower construction schedule—9 years instead of the original 6—more than doubled the initial $4 billion budget. In October 1993, Congress killed the project, leaving a village of abandoned buildings and a catacomb of half-dug tunnels in its wake. As a result, Womersley says, “the key scientific questions for the next decade are the same questions I heard [SSC scientists] say were the key questions for the 1990s.”

    Heir apparent.

    The Large Hadron Collider should bag the Higgs boson (simulated event).


    In the mid-1990s, physicists hoped to keep the energy frontier open—and, with luck, discover evidence for supersymmetry and perhaps the Higgs—by giving Fermilab's Tevatron a $260 million makeover. Unfortunately, the Tevatron has been underperforming in the 2 years since its renovation (Science, 8 February 2002, p. 942). Although the machine will provide valuable insights into the top quark (which it discovered in 1995), the bottom quark, and massive particles such as the W and Z, physicists say the Tevatron has probably missed its chance to make a dramatic discovery.

    Most high-energy physicists have shifted their hopes to Europe's LHC, a 7-TeV accelerator scheduled to start operations in 2007. According to Gordon Kane, a physicist at the University of Michigan, Ann Arbor, experiments at CERN near Geneva and at other labs imply that even a 10- or 15-TeV machine—far less powerful than the planned SSC—would be able to bag the Higgs and the LSP. The LHC ought to spot the particles if their masses fall within the most likely range, although it's not quite powerful enough to make an ironclad promise. “But that no longer seems so important,” Kirk says; physicists are just thankful that the LHC is coming online before the end of the decade.

    The uncertainty of the future

    Meanwhile, high-energy physicists say, the cancellation of the SSC continues to cast a pall over their field. The troubles at the Tevatron are making matters worse, especially in the United States. Graduate students are “staying away in droves,” Kirk says, and many established researchers are migrating to other disciplines. Some, such as Wojcicki, have turned their attention to neutrinos: ubiquitous, nearly massless particles. Others are looking outward for inspiration. “Over the last decade or so, there has been an increased migration of capable talent into astrophysics,” says Lawrence Jones, a physicist at the University of Michigan, Ann Arbor. “I think the fact that [Nobel laureates] Sam Ting and Jim Cronin are heading up astrophysics projects is significant.”

    Some natural high-energy physics experiments—cosmic ray observatories—study particle smashups with hundreds of exaelectron volts of energy, more than a million times as much wallop as the SSC could have put into a proton. But scientists can't control the collisions or detect the progenitor cosmic rays themselves; instead, they see enormous showers as each generation of particles spawns hundreds of offspring. Although this is useful for astrophysicists, “no amount of cosmic ray data, no amount of those measurements can answer the basic questions for us: the Higgs boson, the origin of matter asymmetry, the nature of cold dark matter,” says Kane. “Accelerator experiments are crucial to take us to beyond where we are.”
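    The scale of that comparison can be checked with quick arithmetic. A minimal sketch; the 300-EeV figure below is an assumption standing in for “hundreds of exaelectron volts” (roughly the highest cosmic ray energies observed), not a number from the article:

    ```python
    # Energy units in electron volts
    TeV = 1e12
    EeV = 1e18

    cosmic_ray_energy = 300 * EeV   # assumed "hundreds of EeV" benchmark
    ssc_proton_energy = 20 * TeV    # SSC design energy per proton

    # Ratio of cosmic-ray energy to a single SSC proton's energy
    ratio = cosmic_ray_energy / ssc_proton_energy
    print(ratio)  # → 15000000.0, comfortably "more than a million times"
    ```

    Even a more conservative 100 EeV gives a factor of 5 million, so the article's "more than a million" holds across the plausible range.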

    Such experiments may not be in the offing. Although physicists are beginning to design the new International Linear Collider, the chances of success are considerably smaller than they were for the SSC (see next story). “The whole ethic of the field has to change,” says Kirk. “It has to downsize considerably,” relying on smaller experiments to unravel the fundamental laws of the universe. But others say that “small is beautiful” is not an option. Without a linear collider or another sort of accelerator on the energy frontier, the high-energy physics community will “largely self-destruct,” says Kane. “In the U.S., we'll range from doing very little to doing second-rate stuff. In the world, we're likely to have the LHC and an upgrade, but it will take incredible luck to go beyond that.” And luck, as the scientists at the Tevatron have learned, can be the most capricious particle of all.


    Lots of Reasons, But Few Lessons

    1. Jeffrey Mervis,
    2. Charles Seife

    Physicists and policymakers haven't forgotten about the death of the world's biggest and costliest science project. But what have they learned?

    The newest project on the frontier of particle physics was conceived 3 years ago atop the Rocky Mountains, at a summer conference in Snowmass, Colorado. The brainchild of the high-energy physics community, the International Linear Collider (ILC) is intended to be the next big machine in particle physics. It would smash electrons and positrons into each other with 500 billion electron volts of energy, yielding precise measurements of the Higgs boson and supersymmetric particles. Design work is already under way.

    To make the ILC a reality, however, scientists must avoid the fate of an even more powerful accelerator. Conceived 20 years ago at the same Colorado location, the Superconducting Super Collider (SSC) would have fed off collisions of two proton beams, each with 20 trillion electron volts of energy. Once up and running, the SSC would have dwarfed the Large Hadron Collider (LHC) now under construction at CERN, the European particle physics lab near Geneva, and rendered obsolete the Tevatron collider at Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois.

    But the SSC was never built. Ten years ago this month, Congress stunned the high-energy physics community by writing off $2 billion and killing the partially completed project. Instead of the scientific, economic, and cultural boom that would come with the world's most powerful accelerator, Texas officials ended up with 30 kilometers of tunnels and a cluster of unwanted buildings (see p. 40). And instead of having the inside track on discovering two elusive particles (see p. 36), physicists were left with bitter memories and sharp recriminations.

    Despite the passage of a decade, U.S. scientists and government officials still disagree about the lessons to be learned from what was arguably the world's largest and most costly scientific fiasco. Indeed, most of the major players in the SSC debate seem averse to even looking back, much less to making sure that the next multibillion-dollar machine avoids repeating the SSC's mistakes. “There's not much point dwelling in the past,” says physicist Roy Schwitters, who left Harvard University to become the first and only director of the SSC lab and who's now at the University of Texas, Austin. “We didn't do everything right, or we'd still be there, doing physics.”

    Admittedly, the lessons are not obvious. Most of the protagonists agree that half a dozen or so major factors greased the fall of the SSC, but they disagree on their relative importance. High on some lists is the inability of scientists to explain why the public should pay for a machine that smashes together tiny particles at unprecedented energies. But so are repeated budget increases; a dearth of international partners; a different management structure at the Department of Energy (DOE), which was overseeing the project, as well as a jarring change in the management of the accelerator; tensions within the physics community; congressional concern over a growing federal budget deficit; and the decision to build the SSC at a “greenfield” site outside Dallas, Texas.

    One man, many magnets.

    Superconducting magnets drove the SSC—and drove Director Roy Schwitters crazy.


    But although the SSC remains a touchstone for physicists—“It's impossible to have a discussion of the linear collider in the United States without, in 2 minutes, the SSC coming up,” says Fermilab physicist William John Womersley—its legacy is not clear. “Every morning I woke up asking myself what I could learn from this experience,” recalls presidential science adviser John Marburger, who chaired the nonprofit university consortium that managed the SSC lab for DOE from its inception to its final days. “On most days, the answer was nothing.”

    The blame game

    John Gibbons, President Bill Clinton's first science adviser, thinks that the SSC's fate was sealed soon after his boss took office and that neither the White House nor the project's advocates could have saved it. “I briefed Clinton the same day on both the SSC and the international space station,” recalls Gibbons, a physicist who served in the White House from 1993 to 1998. “He agreed to back the station, and we agreed that the SSC was a goner. The momentum was all going in the wrong direction.”

    Marburger, who performs the same duties today for President George W. Bush, also lets the SSC's scientific management off the hook. But he's not as kind to its political overseers. “DOE created a culture that made it difficult to develop the SSC in the same way it had used to build other accelerator labs,” he says. “It displaced the strong in-house capabilities with a new team, from a military background, that felt there was nothing unique about building a large collider. But that's not the way it works.”

    Schwitters points the finger directly at Congress. “The SSC became a symbol of congressional constraint,” he says. “They needed to show that they could hold down spending, and we were the biggest trophy.” At the same time, he admits to handing legislators a potent weapon: a decision to increase the aperture of the magnets that steer the proton beams around the 90-kilometer racetrack. That change added a couple of billion dollars to the SSC's bottom line, and many saw the increase as the final straw for the project. “I'd like to have that [budget] jump back,” he says. “But I don't know how I could have avoided it. It was the right thing to do, scientifically. It was the only way to ensure that we could reach the necessary energy levels [to find the Higgs boson].”

    An outside panel chaired by Stanford physicist Sidney Drell endorsed that decision at the time. But some prominent scientists, with the benefit of hindsight, now disagree. John Peoples, the former director of Fermilab, thinks that reducing the energy and capability of the collider might have kept the project alive with little or no impact on its experimental results. “If they had made a serious compromise on the energy level,” he says, “they might have kept the price closer to what they had previously announced.” Gordon Kane, a physicist at the University of Michigan, Ann Arbor, says that “insisting on maintaining the full energy and luminosity … was their big mistake.”

    One of the project's most visible opponents, Representative Sherwood Boehlert (R-NY), still bristles at the way the SSC's supporters made their case. “There was a lot of arrogance in how this project was presented to Congress,” says Boehlert, now chair of the House Science Committee. “They assumed that it couldn't be shut down.” The high-energy physics community, according to Peoples, also erroneously assumed that “promising to find the Higgs boson would be enough to get people to shower money on them.”

    Rising costs were definitely a sticking point for legislators, says Boehlert, as was the lack of significant contributions from other countries. “It was sold to us as an international project that would cost $4 billion,” he says. “But the last time I looked, we didn't have any significant partners and it was going to cost $9 billion or more. It was good science but not priority science. Scientists were also telling me that we shouldn't be putting all our resources in one basket.”

    One man, few votes.

    The Clinton Administration didn't push hard for the SSC, admits John Gibbons (far left).


    A longtime aide to one of the project's biggest backers, Senator J. Bennett Johnston (D-LA), dismisses part of Boehlert's analysis. “Only a handful of members had any notion of what the SSC was, so it was easy to demagogue the issue,” says Proctor Jones, who worked for Johnston until his 1997 Senate retirement and is now at the Washington, D.C., lobbying firm of Johnston and Associates. But he agrees that the internal divisions undermined the project's support on Capitol Hill. “It didn't help that parts of the community spoke out against the project because they thought they would get some of the money if Congress killed the SSC,” he says. “Of course, they were wrong; the money didn't go to science.”

    It also didn't help, say supporters, that the Clinton Administration was never sympathetic to the idea of a big physics project. “I was always worried about how many of these big commitments we could afford,” says Gibbons. “I recommended to the Senate that we build it, but I only testified once. And I didn't lie down in front of the train [of growing opposition].”

    Marburger acknowledges that Clinton's 1992 victory over Bush—a Texan who served as Reagan's vice president—was a big blow. “No subsequent administration is ever going to be as supportive to a project as the one that proposed it,” he says. “The SSC had only one parent, and he was a Republican.”

    Peoples, who managed the decommissioning of the lab after the final congressional vote in October 1993, says that there's plenty of blame to go around. The SSC's supporters “made a terrible mistake in making an enemy of Boehlert.” (Schwitters says that his efforts to meet personally with Boehlert to discuss the project were rebuffed.) DOE “failed to appreciate the magnitude of the challenge,” Peoples says, and fostered an unnecessary rivalry with Fermilab that undermined the project. Peoples and other scientists also believe that things would have gone more smoothly if Illinois had won the SSC site competition, which at one point included more than half the states, and built the new lab near Fermilab.

    The man whom many SSC supporters loved to hate—Joseph Cipriano, the veteran Navy official whom DOE recruited to manage the project—doesn't disagree with some of the Monday-morning quarterbacks. Putting the SSC at Fermilab, he says, “would have solved some of the [management] problems” that arose later. He also admits that DOE didn't do a very good job of knitting together the divergent academic and industrial cultures at the lab. “If I had it to do over, I'd try to increase the involvement of the scientific community,” says Cipriano, now a vice president at Lockheed Martin Information Technology in Bethesda, Maryland.

    Even so, Cipriano defends DOE's decision to impose “a manufacturing environment” on the lab construction because of the efficiencies needed to build 17,000 magnets. He insists that the project was “on time and on budget” until the very end. And he says that Congress's decision was a crushing blow. “I was depressed for 2 years,” he confesses.

    Foreign intrigue

    Perhaps the murkiest area for those following in the footsteps of the SSC is its impact on future international collaborations. On the surface, the lesson is clear: Line up enough backers so that the host country doesn't have to carry the entire financial load. “We didn't realize that, once you reach a certain level, the way you handle international partnerships has to change,” says Marburger. Cipriano puts it more directly: “It was the wrong order. We should have gotten the foreign commitments before we started the projects.”

    The multinational support promised for the proposed International Thermonuclear Experimental Reactor and in hand for the LHC, says Marburger, shows that it's possible to spread the burden among many countries. Backers of the International Linear Collider hope he's right. Project organizers say that Japan, Europe, and the United States must be full partners regardless of where the machine is built.

    DOE officials came to that view belatedly, after Congress imposed a limit on the U.S. contribution to the project. But their effort to get $1 billion from Japan was too little, too late. Both supporters and opponents of the SSC agree that the project was a child of the Cold War and that its technological spinoffs were expected to strengthen the country's political and economic hegemony. “The SSC was supposed to show the world that the U.S. was number 1,” says Gibbons. “Unfortunately, that Cold War mentality was no longer relevant in 1993.”

    Despite the budgetary advantages, stitching together international collaborations on multibillion-dollar projects will always be a hard sell politically, warns former Senate aide Jones. “The country that hosts the next big machine will be the one that puts up the bulk of the money,” he predicts. “It only makes sense. We don't want our young people to have to go to CERN to get their training [in accelerator physics]. And the same thing applies to every other country.”


    Scientists Are Long Gone, But Bitter Memories Remain

    1. Jeffrey Mervis

    The Superconducting Super Collider was supposed to put Waxahachie on the map. Instead, it left a hole in the heart of Texas

    New hope for cancer patients. An economic boom. An influx of well-educated professionals. A better science and math curriculum in the local schools. And most of all, an international reputation for a down-home Texas community.

    The Superconducting Super Collider (SSC) was going to do all that, and more, for the people of Waxahachie, Texas, surrounding Ellis County, and the greater Dallas area. Instead, the project provided the setting for a made-for-TV movie and launched the political career of a former lab official. A decade after Congress killed the project, those who helped win an international competition for the world's largest and most costly scientific facility are still bitter about seeing their dreams turn to dust so quickly.

    “I attended the first meeting for the SSC in Ellis County, back in 1987, and there was a lot of hype once they picked us as the site [in November 1988],” recalls Robert Sokoll, Waxahachie city manager. “But the project's been an albatross around our necks ever since [the lab was shut down].”

    Ten years after the lab was terminated, the detritus of the government's $2 billion investment is still visible on a vast stretch of scrubland south of Dallas. Now empty, the cluster of specially designed structures on the SSC's west campus once hosted more than 1000 people. (Lab employees also worked at another site that has been converted into a refrigeration manufacturing plant.) As deep as 80 meters underground, water is thought to be filling some 30 kilometers of tunnels that were chewed through the Texas chalk, marl, and shale.

    If buildings could talk, the SSC's massive Magnet Development Laboratory might relate a decade's worth of woe. A deal with a major pharmaceutical company to use it as a distribution center fell apart at the last minute, and plans to convert it into an antiterrorist training facility haven't gelled. Its closest brush with science since the SSC closed was its appearance in Universal Soldier II, a 1999 movie about a rogue band of robotic warriors commanded by a supercomputer that occupied the magnet building. That's where small-business owner Darren Downing of Ardmore, Oklahoma, found his 15 minutes of fame. “I was just a peon, one of the extras,” says Downing, who was trying to generate some free publicity for his fitness store, Muscles in Motion.

    Not a magnet.

    Waxahachie Chamber of Commerce executives say the SSC's demise slowed economic development.


    That use was a far cry from the building's intended purpose, to develop the thousands of superconducting magnets that would steer twin beams of protons around a 90-kilometer circular racetrack before they crashed into one another. But it's a step up from its current role as a warehouse for the Ellis County government. “It broke my heart to see it being used to store Styrofoam cups,” says physicist Roy Schwitters of the University of Texas, Austin, the first and only director of the SSC laboratory, recalling his last visit to the lab 3 years ago.

    County officials have tried to sell the seven properties. But so far there have been no takers for the specialized structures, which include a 180-meter-long, 8.5-meter-wide magnet testing building that mimics the curve of the tunnel beneath it. “It's not near a major highway or a railroad, so it's not a very convenient location,” admits Joe Grubbs, county and district attorney. “DOE [the Department of Energy] was going to improve the roads, if the money hadn't run out. But we keep hoping that somebody will find the right use for it.”

    Soon after Congress pulled the plug on the SSC lab, Texas tried to salvage something from the debacle by asking for suggestions on what to do with the wreckage. The carrot was a $68 million pot of money from the federal government, given in partial compensation for the state's $1 billion investment. One proposal was to tap into the SSC's linear accelerator to treat cancer patients. The idea of using the linac, the first stop for protons on their multistage journey to the highest energy levels, was compelling enough to lure radiation oncologist Eli Glatstein to the University of Texas Southwestern Medical Center in Dallas from the National Cancer Institute in Bethesda, Maryland. A proton therapy center, says Glatstein, would have allowed well-off cancer patients to be treated in Dallas rather than traveling to Houston.

    But the competition fizzled after then-Governor George W. Bush and the legislature spent the money to retire state bonds that had been issued to finance the lab's construction. “I had recruited faculty, and I was supposed to get new facilities, but the money was never approved,” Glatstein remembers. Disappointed, he pulled up stakes after only 4 years at Southwestern and took a position at the Hospital of the University of Pennsylvania in Philadelphia. But he hasn't abandoned the idea of proton therapy. “We hope to get a machine here,” he says.

    Local residents still talk about advanced cancer therapies as the big one that got away when asked about the SSC's demise. “People were going to be flying in from all over the place,” says Debra Wakeland, president of the Waxahachie Chamber of Commerce, recalling another proposal to use the SSC to produce short-lived isotopes for medical diagnoses using positron emission tomography scanners. “For anyone who has lost relatives to cancer, that was a big deal.”

    Al Cornelius isn't a cancer specialist or a high-energy physicist. But after spending 25 years in contracts and procurements at NASA's Johnson Space Center in Houston, he decided that the SSC posed a more interesting challenge. “I was just about to accept a position in California when I learned about an opportunity at the SSC lab,” he recalls. “We always knew that there was a chance the project would bite the dust, but it seemed worth the risk.”

    When the curtain came down on the SSC, Cornelius transformed himself into a politician. “As a federal employee, I never had the chance to run for political office,” he says. He promptly ran for the post of the county's chief executive officer—and won. Soon he was negotiating with the state for the rights to his former workplace. “A lot of property had been taken off the tax rolls and given to the lab, so we filed a $150,000 claim for lost revenue,” he says. “The state didn't want to pay out any cash, but they gave us 150 acres [61 hectares] of land and the [west campus] buildings.”

    Within a year Cornelius had sold the utility rights to the property for more than $1 million. But he wasn't as fortunate with the buildings. “We had one offer in the 5- to 8-million-dollar range. But it fell through,” he recalls. Cornelius left office in January, after serving two terms, and retired to New Mexico. But it still bothers him that the buildings are vacant. “I wish we could have gotten something back for the taxpayers,” he says.

    In the meantime, the loss of the SSC lab has left its scar on the body politic. “They took all this money and flushed it down the toilet,” says Sokoll. “It can't hardly help but leave a bad taste in the mouths of people here. There were scientists working here from all over the world, and it was sad how it affected their lives.”


    The Power of Words: The SSC in Literature

    1. Jeffrey Mervis

    It may be 10 years late, but the Superconducting Super Collider (SSC) has finally found a literary champion. It's Herman Wouk, the Pulitzer Prize-winning author of The Caine Mutiny and a master of historical fiction. Still productive at 88, Wouk has made the SSC the focus of his latest novel, A Hole in Texas, due out next spring.

    “My longtime friend, Glenn Seaborg, used to ask me when I was going to write a novel about a scientist,” recalls Wouk. “Well, A Hole in Texas is my shot at the target.”

    To prepare to write the book, his 12th novel, Wouk toured the SSC carcass in Waxahachie and met with scientists at Illinois's Fermilab, home of the world's most powerful accelerator. “For years I've thought that there is a great story in the interface of science and politics,” he says. “The SSC seemed like a good way to tell it.”


    The main character in A Hole—think Clint Eastwood for the movie version—is Guy Carpenter, an unsung but brilliant physicist who spent 5 years at the doomed SSC. He's called to Washington, D.C., by movie star-turned-congresswoman Myra Kadane (think Stockard Channing) to explain a bombshell announcement by Chinese scientists that they have discovered the Higgs boson, the elusive particle that was to have been the SSC's quarry. Congress and the media go on a witch-hunt to assign blame, which in turn shines a spotlight on the noble, selfless SSC scientists and their pursuit of truth.

    There's romance galore: Carpenter and the lead Chinese scientist on the Higgs paper, Wen Mei Li, were a hot item back in their student days at Cornell (think Lucy Liu for the flashbacks) and have continued to correspond, and sparks fly between Carpenter and Kadane. There's no sex, but Wouk takes plenty of potshots at the American political process.

    A Hole is actually the second novel in which the SSC plays a starring role. A 1997 “hard” science-fiction novel by physicist John G. Cramer of the University of Washington, Seattle, Einstein's Bridge, assumes that the SSC was built and that energy from its powerful collisions is detected by two other civilizations—one of which, benign, warns that the other is bent on cosmic domination. To escape destruction, humanity must alter history to prevent the SSC from being built. Two physicists take on the challenge and in the course of their time travel expose the seamy conduct of the politicians who cancelled the project.

    Cramer, 68, readily admits that the book gave him a chance to vent his scientific spleen about how U.S. politicians never appreciated the value of the SSC and how government bureaucrats fatally mismanaged the project. But those feelings haven't soured him on science; indeed, Cramer says that plans for a sequel to Einstein's Bridge have been waylaid by his own research at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory in Upton, New York. “Instead of working on a novel, I'm writing up a paper this fall for PRL [Physical Review Letters].”


    An End to Business as Usual?

    1. Yimin Ding,
    2. Xiong Lei*
    1. Ding Yimin and Xiong Lei write for China Features in Beijing.

    Beijing University plans to implement dramatic changes aimed at breaking the insular nature of faculty hiring and promotions

    BEIJING—Five years into its second century, Beijing University sits at the top of the country's academic food chain. But the way it hires and promotes faculty members may be more suitable to a small noodle shop than a great university.

    This month the university, nicknamed Beida, plans to change that, with a package of reforms intended to curb academic nepotism and to bring personnel practices in line with those at top institutions around the world. Its chief features include banning the hiring of Beida graduates immediately out of school, filling more slots with outside talent, and making tenure and promotion more dependent on research productivity. School officials hope the new policies will pave the way for similar reforms at other domestic institutions. But so far it's been a rocky road: Unusually strong criticism has already forced major revisions in the plan, and critics still charge that the changes may favor certain groups, damage morale, and leave untouched more fundamental problems affecting the quality of teaching and research at the university.

    “Many universities are run like a family business,” says the man behind the reforms, economist Zhang Weiying. Teachers frequently hire their former students, for example. “In most cases these students are obedient and easy to command,” says Zhang, assistant to Beida's president. “But they are not necessarily creative enough to do research on their own and are hardly competitive internationally.” To back up their case, Beida authorities have calculated that 20% of the university's 2000 faculty members are the source for 80% of the school's academic achievements, including publications, awards, and outside funding.

    The solution, to Zhang and others, is more outside competition and less academic “inbreeding.” A first draft of the reform plan, released in May, would have set screening gates at key points along an academic's career path. It decreed that one-third of lecturers, one-fourth of associate professors, and half of the professors hired should be from outside Beida. (There are about twice as many professors and associate professors as lecturers.) It said that at least one-third of the members of academic appraisal panels should be prominent scholars from overseas universities. It set a deadline—6 years for lecturers and 9 to 12 years for associate professors—to be promoted to the next level or face dismissal. And it declared that any faculty member, regardless of seniority, could be cut loose if judged to be lagging far behind his or her peers.

    Outside the family.

    Beijing University's Zhang Weiying wants to hire more faculty trained elsewhere.


    Those changes triggered an outcry among Beida faculty members who didn't see any reason to change the status quo. Their reaction prompted the university to come out, barely 1 month later, with a second, less radical, draft proposal. Under the new plan, senior faculty would be exempted from the outside hiring policy, and the quota system was scrapped. The prohibition on hiring freshly minted Beida graduates remained, however, as did the time limits for promotion.

    The revised version still disturbs some younger faculty members. They think it unfair that the plan exempts the school's 800 professors, to say nothing of some 4000 administrative employees, a group that includes not only Communist Party officials but also business and housing managers and all manner of logistics and support staff. “The up-and-out reforms should not target only teachers,” says Jiang Feifei, an associate professor of history.

    Zhang says that the university addressed the teaching and research faculty first because their impact on students is greater. But the best way to appraise the performance of faculty members is also a hot topic. Jiang and her colleagues worry that people who devote themselves to research but who do not spend time cultivating their superiors, what one calls “apple polishing,” will be at a disadvantage before tenure committees. “In many cases, the candidate's academic performance only counts for 10%, with the rest being seniority and one's personal relationship with officials and appraisal committee members,” says one associate professor. Zhang agrees that the current system can be manipulated but believes that adding outsiders to the committees will curb such abuses.

    Professors in the natural sciences say that the reforms ignore how hard it can be to lure Chinese scientists now working abroad. The big disparity between salaries in China and those in the West means that “Beida might not be able to attract talent from overseas,” says Ouyang Qi, a lab director at the university's Institute of Physics. He is also worried about losing good young talent because of pressure to hire outsiders, noting that it takes him “2 to 3 years” to train a new lab member to take on research and teaching duties.

    Although Ouyang says, “I support the reforms,” he believes that the university must be committed for the long haul. “The gap between the old system and the new system is so big, the reform must carry on for a very long time,” he says. “And supplementary measures are needed to get rid of the obstacles.”

    Zhang, who says the opposition to the initial draft “was not surprising” given its radical nature, agrees that any changes will take time. But he says that Beida, and China, have no alternative if they want to remain competitive globally. “It is essential that we employ the best possible faculty members, who must be creative and innovative,” he says. “We can wait no longer.”


    Studying the Well-Trained Mind

    1. Marcia Barinaga

    Buddhist monks and Western scientists are comparing notes on how the mind works and collaborating to test insights gleaned from meditation

    CAMBRIDGE, MASSACHUSETTS—Matthieu Ricard is no ordinary Buddhist monk. He earned his Ph.D. in molecular biology at the Pasteur Institute in Paris before deciding 30 years ago to devote his life to the practice of Tibetan Buddhism. Now Ricard, a member of the Shechen Monastery in Nepal, is involved in science again, as both a subject and a collaborator in a neuroscience project at the University of Wisconsin, Madison. There he and neuroscientist Richard Davidson hope to learn whether the study of trained meditators can provide insights into the mechanisms of brain function or new therapeutic approaches for psychology.

    This unusual collaboration and others like it were catalyzed by the Mind and Life Institute, created in the 1980s by businessman Adam Engle and the late neuroscientist Francisco Varela to foster a dialogue between Buddhist scholars and Western scientists. Initially the institute sponsored small, private meetings held at the Dalai Lama's headquarters in Dharamsala, India. But last month the meetings went public for the first time, with a conference called Investigating the Mind, held here at the Massachusetts Institute of Technology and co-sponsored by MIT's McGovern Institute for Brain Research.

    For 2 days, panels of neuroscientists and Buddhist scholars took the stage with the Dalai Lama before an audience of 1100 to discuss attention, mental imagery, and emotion—topics of interest to Buddhists and scientists. The atmosphere was casual; the Tibetan leader huddled with speakers over a laptop to follow their presentations and frequently interrupted with questions or comments.

    Mind and Life co-founder Varela, who was director of research at CNRS's Cognitive Neurosciences and Brain Imaging Laboratory in Paris, held a deep conviction that Buddhists, with their 2500-year history of introspective inquiry into the nature of the mind, had much to offer to neuroscientists. A handful of neuroscientists such as Davidson who were familiar with Buddhism agreed. Others have come to the meetings out of curiosity but with less certainty of what the Buddhists could contribute. “I have to confess that some of the scientists came to the table looking at the Buddhists almost as specimens,” says cognitive neuroscientist Jonathan Cohen of Princeton University, who participated in the MIT meeting. “It was like, ‘Here are these people who claim to be able to do unusual things. Let us get our electrodes on them.’ … It took a round [of discussion] for the scientists to come to respect that the Buddhists had some very interesting things to say.”

    Interdisciplinary research.

    Scientists (left) shared the stage at MIT with Buddhist scholars (right) and the Dalai Lama (fourth from right). They discussed attention, mental imagery, and emotion.


    A Buddhist science of the mind

    Some scientists made that transition through learning more about meditation. The practice is often viewed by Westerners as merely a form of relaxation whose benefits are limited to stress relief or lowered blood pressure. It is actually a rigorous system of mind training and observation of mental processes, what Buddhists consider to be their own “science of the mind.” “From its outset, [Buddhism] has had a very strong emphasis on refining the attention, enhancing attention skills, and developing very sophisticated means for investigating the nature of the mind from a first-person perspective,” says Buddhist scholar and former monk B. Alan Wallace, president of the Santa Barbara Institute for the Interdisciplinary Study of Consciousness.

    What's more, Wallace adds, the Buddha himself told his followers not to take his teachings on faith but to test them for themselves. That spirit of inquiry makes some Buddhist practitioners eager to participate in neuroscience studies.

    The time is ripe for Buddhists' input, says Clifford Saron, a researcher at the Center for Mind and Brain at the University of California (UC), Davis. The tools with which cognitive neuroscientists measure brain activity have grown so sensitive, Saron says, that scientists can observe differences in brain activity between individuals doing the same task or even between different trials with the same individual. There is information in that variation, but it requires the input of the subject to decipher it. “Most people have very little training to report how they did a task,” Saron says, but meditators who are trained to observe their own minds should be able to describe in detail whether their attention was more stable in one trial versus another, whether they prepared themselves in a slightly different way, or even what kinds of fleeting emotions or images might have passed through their mind.

    Buddhists say they hope the interaction will lead to several things—first of all, “a healthier world,” according to Buddhist monk and meeting participant Ajahn Amaro of the Abhayagiri Monastery in Redwood Valley, California. Beyond that, they want the opportunity to test their first-person insights with Western research techniques and understand better the mental states they achieve through meditation. There has been a fair amount of “schlock science” done toward this end, Davidson says, but the Mind and Life Institute has approached the issue “in a very different way, involving the very best people in their respective areas,” he adds.

    One hot topic at the MIT meeting was the role of introspection, or reporting personal mental experience, in science. Although introspection has formed the basis of the Buddhist investigation of the mind, Harvard University psychologist Daniel Gilbert notes that “a lot of scientists have a hard time getting their heads around the idea that introspection can be a form of data.” Indeed, Harvard psychologist Stephen Kosslyn spent part of his presentation illustrating ways in which subjects' reports on the mental strategy they used to solve a problem could be misleading.

    Despite such caveats, neuroscience has already begun using first-person insights to help frame questions, says neuroscientist G. Ron Mangun, director of UC Davis's Center for Mind and Brain. “When you are talking about something like human cognition, if you don't use introspection to guide you, it is a bit difficult to get anywhere,” Mangun says. “We use introspection all the time in our research. We are just trained to be very careful with how we use it.”


    Stephen Kosslyn explains a psychological test to the Dalai Lama.


    Some Buddhists' introspections directly challenge views held by neuroscientists. For instance, trained meditators claim to be able to hold their attention on a single object for hours, or to shift attention rapidly as many as 17 times in the span of a finger-snap. These claims contradict Western reports that attention cannot be held that long or switched that fast. Whether such claims prove to be precisely true or not, MIT neuroscientist Nancy Kanwisher is eager to see whether monks who have spent years training their attention are better on standard attention tests than the average person. “Training the attention has barely been touched by cognitive neuroscience,” says Kanwisher.

    Soon that may change. UC Davis's Mangun and Saron are planning a collaboration with the Santa Barbara Institute's Wallace to test the attention skills of trained meditators. They plan to enroll two dozen volunteers in a 3-month, intensive, full-time program in shamatha, a form of Buddhist contemplative training that is aimed at enhancing attention skills. In exchange for the training, the meditators will agree to be subjects for psychological and brain-imaging studies of attention.

    Because subjects will be tested before, during, and after their training, the study avoids certain pitfalls of working with established meditators, such as the possibility that any observed differences might reflect not training but the fact that subjects drawn to meditation may have had unusual brains to begin with.

    Hold that image

    Whereas some Buddhist practitioners specialize in attention, others devote themselves to the demanding practice of visual imagery, meditating on an image held in the mind as a means to purge the mind of value judgments. It may take decades for a monk to develop prowess in imagery. Some virtuosos claim to be able to hold in their minds a detailed image such as a complex mandala, a symbolic depiction of the universe, for many minutes or even hours.

    These claims are also contradicted by Western neuroscience. “Based on my understanding of how the brain works, that should not be possible,” says Harvard's Kosslyn, who studies mental imagery. Kosslyn has found that mental images are fleeting—necessarily so, he reasons, because mental imagery uses the same brain areas that serve vision, and visual images fade quickly from the brain to prevent the appearance of smearing as our eyes move.

    “Do you have plans to conduct experiments on monks?” the Dalai Lama asked Kosslyn. “If they are different, how will it change your theory?” Kosslyn replied that he is eager to test his theory with trained meditators, and he has already set up experiments to test the holding of an image over time, and the vividness of images. If he were to find differences with trained meditators, he said, he would scan their brains for unusual activity that might explain the disparity, using functional magnetic resonance imaging.

    There is an impediment to doing such experiments: Very few monks are truly accomplished in visual imagery, and, says Ricard, “they are contemplative hermits. None of them is ready to come to the lab.” Ricard says he hopes to find some moderately accomplished monks who are willing to travel.


    Richard Davidson (left) and Matthieu Ricard after a brain-imaging experiment.


    Accentuate the positive

    The collaboration between Buddhists and neuroscientists has borne the most fruit to date in the study of emotions. Buddhist meditation fosters “virtuous” mental states that are said to promote well-being, such as compassion, joy, and “loving-kindness.” UC Berkeley psychologist Dacher Keltner says this is a radically different approach from Western psychology, which focuses mainly on negative mental states such as anger, fear, or depression.

    A growing number of Western psychologists are investigating the potential of Buddhist meditation training to shift the brain into positive emotional states. Wisconsin's Davidson is collaborating with Ricard to study the brain activity associated with positive emotions in Buddhist monks. Davidson and his colleagues have demonstrated repeatedly that activity in the frontal region of the brain reflects a person's emotional state. A high ratio of activity in the left versus the right frontal areas marks either a fleeting positive mood or what Davidson calls a positive “affective style,” which is the quality of mood that persists over time. Subjects gripped in a negative mood, or with generally negative affective styles, rank lower on the left-to-right ratio. And when researchers trigger a negative emotion, for example by showing subjects a disturbing news photo, those negative emotions fade more quickly in people with more left-frontal brain activity.

    Using such techniques, Davidson and postdoc Antoine Lutz are studying Ricard and other monks with many years of meditation experience. Their first subject, while not meditating, showed a left-right brain activity ratio higher than that of any of the 150 non-Buddhist subjects the team had previously tested. The team has tested six monks so far. The data are still being analyzed, but Davidson reported at the MIT meeting that when the monks were instructed to meditate on compassion, they showed a greater shift toward left-frontal activation than control subjects who were not trained meditators but who were given instructions to meditate on compassion.

    Such a study can't rule out the possibility that the monks' brains were unusual even before they began their training. So Davidson's team took another approach. It recruited employees of Promega, a Madison-based biotech company, to go through 8 weeks of basic meditation training. Volunteers were randomly assigned to receive the training or not. The team recently reported in Psychosomatic Medicine that, compared with controls, those trained to meditate showed an increase in left-prefrontal activation both at rest and in response to an emotional challenge.

    “This was not in any shape or form a definitive study,” says Davidson. But it is not the only pilot project to have produced tantalizing preliminary results. A study called the Cultivating Emotional Balance project has also suggested that meditation training can promote emotional health in Westerners. UC San Francisco psychologist Paul Ekman conceived of the project after participating in one of the Mind and Life meetings in Dharamsala and developed it with Wallace and UCSF health psychologist Margaret Kemeny. For the pilot study, 15 school teachers underwent a 5-week intensive course in meditation that included meditation on compassion and loving-kindness, integrated with strategies and techniques selected from modern Western emotion research.

    The teachers performed a battery of psychological tests before and after the training. They were wired for physiological measures such as heart rate and blood pressure, and they were videotaped so that psychologists could monitor them for nonverbal reactions that show feelings such as contempt or acceptance. The subjects showed more positive emotional responses after training than before. Based on that result, the researchers are planning a larger study—this time with a control group.

    Kemeny notes that the Cultivating Emotional Balance project differs from other studies in that the focus of its training is on emotions such as compassion and empathy that generate a positive feeling toward others, and in its measurement of changes in the subjects' reactions to other people. “We want to understand the psychological effects” of such training, she says.

    That is a question for which the Dalai Lama has a ready answer. In his closing remarks, he repeated his faith in the power of science and encouraged collaboration between Buddhists and scientists, ending with a confident exhortation to his audience to “encourage positive emotions, discourage negative. Then you will be more happy.” If the Buddhists and neuroscientists can put their heads together and figure out how we can all do that, maybe Ajahn Amaro will get his wish for a healthier world.


    Molecular Scaffolding Helps Raise a Crop of Neurons

    1. Robert F. Service

    From 7 to 11 September, 14,000 chemists, physicists, and engineers gathered in New York City for ACS's 226th National Meeting. Among the highlights: nerve-healing nanofibers and carbohydrate microarrays.

    For decades biomedical researchers have dreamed of regrowing damaged nerve cells. Now chemists may be getting a handle on the trick by extending a technique first used to promote bone growth.

    At the ACS meeting, Samuel Stupp, a chemist at Northwestern University in Evanston, Illinois, reported that his group had designed molecules that assemble themselves into tiny rods that spur the growth of neural tissue in rats. If the strategy works equally well in humans, self-assembling molecules could offer new hope to victims of spinal cord injury and other types of nerve damage. Says Robert Grubbs, a chemist at the California Institute of Technology in Pasadena: “It's a very promising approach.”

    It's one that Stupp and colleagues have been honing for several years. Recently, the team designed two-part molecules called peptide-amphiphiles (PAs) that assemble themselves into rigid fibers. The peptides, or short protein fragments, that decorated the outside of the nanofibers contained amino acids that encouraged the growth of hydroxyapatite crystals, a basic constituent in bone (Science, 23 November 2001, p. 1635).

    For their new work, Stupp, postdoc Gabriel Silva, and graduate student Krista Niece changed the outermost peptide groups on the PAs in hopes of promoting the growth of neurons. Other researchers had shown that proteins called laminins bind to neurons and encourage the growth of neurites, arms that extend out from the central cell body. Laminins contain a five-amino acid sequence known as IKVAV (for the sequence isoleucine, lysine, valine, alanine, and valine). So the Northwestern researchers designed their PA molecules to end with the IKVAV sequence.

    Right track.

    Nanofibers (not shown) encourage neural progenitor cells to become neurons (green) instead of astrocytes.


    The two-part PA molecules contain oily hydrocarbon chains connected to the peptides. When placed in a watery solution, the hydrocarbon chains seek to crowd together to avoid the energetically costly association with water. The negatively charged peptides at the other end normally repel one another, keeping the molecules apart. But the researchers overcame that repulsion by adding the PAs to a cell culture medium that contained positive ions. The ions surrounded the negative charges and allowed the oily hydrocarbon tails to pack together into nanofibers, with the peptide groups facing outward.

    In addition to positive ions, the cell cultures also contained mouse neuronal progenitor cells. Harvested from mouse embryos, these cells can develop into any of several types of nerve cells, including neurons and astrocytes, cells involved in forming scar tissue at the site of a spinal cord injury. The Northwestern researchers found that, in the presence of IKVAV-topped nanofibers, 30% of the neural progenitor cells differentiated into neurons after just 1 day and 50% after 7 days. That's five times as many as in a laminin-coated cell culture without the nanofibers. The nanofibers also seemed to prevent progenitor cells from becoming scar-tissue-forming astrocytes.

    Early experiments with animals support the approach. Stupp reported that he had teamed up with Northwestern University neurologist John Kessler and his graduate student Catherine Czeisler to study the effect of the nanofibers on rats with spinal cord damage. The researchers found that rats with spinal cord damage showed better movement than control animals 40 days after being given an injection of PA molecules.

    Stupp cautions that more work needs to be done with living animals to quantify the effect. Meanwhile, the researchers are working to improve the technique by beefing up their IKVAV-bearing nanofibers with additional sequences known to promote neuronal binding.


    Arraymaker Speeds Analyses by Months

    1. Robert F. Service

    From 7 to 11 September, 14,000 chemists, physicists, and engineers gathered in New York City for ACS's 226th National Meeting. Among the highlights: nerve-healing nanofibers and carbohydrate microarrays.

    Molecular biologists owe a lot to their toolmakers. Gizmos such as automated DNA synthesizers and gene chips have made it possible to study genes by the thousands.

    Life is harder for researchers studying the biological role of sugars. Sugar molecules that decorate proteins and cells are chemically complex, made with numerous types of chemical bonds and branching structures. But things are looking up. Two years ago, for example, carbohydrate chemist Peter Seeberger of the Massachusetts Institute of Technology in Cambridge and colleagues developed an automated synthesizer to stitch together a wide variety of sugar chains, known as oligosaccharides (Science, 2 February 2001, p. 805).

    Now Seeberger's team has gone one step further. At the ACS meeting, Seeberger—now at the Swiss Federal Institute of Technology (ETH) in Zürich—reported how his team's automated oligosaccharide synthesizer could turn out carbohydrate microarrays, glass slides dotted with hundreds of spots containing different sugar chains. The new arrays now make it possible to study systematically how sugars bind to their targets. The knowledge will allow researchers to unravel the complex role of sugar chains in governing everything from protein folding to cell communication, and it could also speed the development of new disease-fighting drugs. “It's a powerful approach,” says carbohydrate chemist Laura Kiessling of the University of Wisconsin, Madison.

    Four teams reported making carbohydrate arrays last year, Kiessling notes. But each approach had important drawbacks. Most, for example, required researchers to harvest oligosaccharides from natural sources—tedious work that can result in a mixture of different oligosaccharides on individual spots on the array, making it difficult to determine which sugar chain is doing the binding. Working with automatically synthesized oligosaccharides “has the nice advantage that you make something that is homogeneous,” Kiessling says.

    Sweet spots.

    Oligosaccharide microarrays could speed the development of new sugar-targeting drugs and vaccines.


    To make their arrays, Seeberger's team fired up the group's oligosaccharide synthesizer to stitch together sugar chains containing one to nine separate sugar groups. A conventional DNA-array robot attached the sugars to a glass slide. In addition to spotting down the sugars, Seeberger's team also dotted its slides with multiple copies of standard glycoproteins—proteins with their usual sugar chains attached—as well as copies of the same proteins with the sugars stripped off.

    In one representative study, the group dotted slides with fully glycosylated and stripped-down versions of GP120, a protein that coats the outer surface of the human immunodeficiency virus (HIV). The researchers then used their GP120 chips to determine whether the protein, the sugars, or both held the key to binding various targets. They selected four separate proteins that were known to bind to GP120 and attached fluorescent tags to each. They spritzed separate chips with solutions containing the different proteins, waited for the proteins to bind to their targets, and washed away any unbound proteins. They then used a standard fluorescence detector to track where the target proteins bound.

    One of the targets the researchers checked was a bacterial HIV-blocking protein called cyanovirin-N. Painstaking work by several labs had shown that the protein binds to glycoproteins such as GP120 that contain large amounts of the sugar mannose. The arrays confirmed that result, showing that the protein bound to three-, six-, and nine-member sugar chains containing mannose but not to individual mannose groups. Those earlier studies “took 6 months to do by hand,” Seeberger says. “With the chips we could do the experiment in 1 day.”

    The group then tested the binding of a human HIV-blocking antibody known as 2G12, about which less was known. Previous work had shown that 2G12 also binds to mannose-rich oligosaccharides and had hinted that a particular chemical bond between sugar groups, called a mannose α-(1→2) mannose linkage, could be involved. The array results bore out the theory. Seeberger's team found that 2G12 binds to three-, four-, six-, and nine-member sugar chains containing the bond but not to other chains without the bond. The protein also bound to the whole GP120 glycoprotein itself as expected. “While the linkage had been suggested previously, we were able to show this was the case in a single slide,” Seeberger says.

    Armed with those results, Seeberger's team is now tracking how bacterial enzymes alter various oligosaccharides on their surfaces to become resistant to antibiotics. The answer could lead to the development of new antibacterial medications. Related research could also reveal novel sites for drugs to target pathogens, such as HIV and the bacterium that causes tuberculosis.