News this Week

Science  14 May 2004:
Vol. 304, Issue 5673, pp. 936
  1. NATIONAL INSTITUTES OF HEALTH

    Paid Consulting: Good for the Staff, Not for the Chiefs

    1. Jocelyn Kaiser

    For the past 5 months, critics in Congress and elsewhere have battered the National Institutes of Health (NIH) for allowing its scientists to accept hefty consulting payments from companies. Last week, a blue-ribbon panel appointed by NIH Director Elias Zerhouni offered a plan to restore public confidence: It recommends that top NIH officials and grant decision-makers be barred from industry consulting. But the panel “walked a fine line,” said co-chair Norman Augustine, by saying that in-house NIH scientists may continue to interact with industry, although within new limits.

    The response from the research community was positive: “This will increase transparency without doing harm to the enterprise,” said biologist David Burgess of Boston College at a meeting last week of the NIH director's advisory committee, on which he serves. Association of American Medical Colleges president Jordan Cohen stated that the report “will help sustain and strengthen the public's essential trust” in NIH. Although Zerhouni is still weighing the recommendations, he seemed to like the advice and asked the blue-ribbon panel to come back once more to “fine-tune” it. But it's not clear whether the plan will satisfy NIH's critics.

    Concerns about payments to NIH staff members erupted after the Los Angeles Times reported last December that some NIH scientists had received $300,000 or more over the past decade from drug and biotech companies and suggested that these deals posed conflicts of interest (Science, 19 December 2003, p. 2046). The rules on outside money had been loosened in 1995 by then-Director Harold Varmus, who was hoping to recruit more private-sector scientists and clinicians. Although Zerhouni has found no evidence that patients were harmed or decisions influenced by these deals, he asked for a review by a 10-person blue-ribbon panel co-chaired by Augustine, chair of the executive committee of Lockheed Martin Corp., and Bruce Alberts, president of the National Academy of Sciences.

    The panel's 109-page report* notes that few NIH employees do consulting work—just 118 out of 17,500 as of last month, down from 228 in January, Zerhouni said. The “simplest” solution might have been to “just ban” these deals, but “that would be a grave error,” Augustine said. Instead, the panel decided that industry consulting is beneficial because it enables NIH to share technologies and compete for talent.

    Course correction.

    A panel led by Bruce Alberts and Norman Augustine urges Elias Zerhouni (center) to revamp rules on outside income.

    CREDIT: MARTY KATZ

    Because they have broad responsibilities and can influence funding decisions, according to the panel, senior managers and grant officers should not consult for industry or academia. (Two NIH institute directors with outside deals named by the Los Angeles Times have ended them.) The panel also endorsed current law barring clinicians from accepting payments from a company with a financial interest in their research. But that rule should not be applied to most intramural scientists, the report says: Consulting that does not pose a conflict of interest “should be allowed.” The panel recommends limits, however: Employees should not spend more than 400 hours a year consulting or earn fees of more than 50% of their salary (for clinicians, 100%). And they should not be paid with stock.

    At the same time, the panel strongly supports nonindustry outside activities such as receiving awards that may include cash, writing textbooks, speaking, and editing. Those activities are “part of the tradition of science,” the report says. The panel also advised NIH to find ways to require that more senior officials file publicly available financial disclosure reports. And it said NIH should request authority to raise a current $200,000 cap on salaries (see sidebar), partly to offset the ban on consulting by senior people.

    The panel urged speedy implementation of its advice, which would require action by both the Administration and Congress. NIH staff members now suffer “confusion” about policies, and there is “a growing morale problem” at the agency, the report says.

    Researchers have expressed a few concerns about the report. Varmus, now president of Memorial Sloan-Kettering Cancer Center in New York City, opposes a blanket ban on consulting by clinical and scientific directors, because some do not oversee grant decisions. “It should be by function,” says Varmus—a view shared by some of Zerhouni's advisory committee members. Varmus and others also questioned the prohibition on stock options, noting that they are often the only payment a start-up company can offer.

    Other observers thought the panel was too gentle, including some lawmakers, who would like to see wider public disclosure of employee financial reports. Representative James Greenwood (R-PA), chair of the House Subcommittee on Oversight and Investigations, planned to probe further at a hearing this week.

  2. NATIONAL INSTITUTES OF HEALTH

    Are Scientists Worth More Than Senators?

    1. Jocelyn Kaiser

    The controversy over researchers' income centers not only on who pays, but how much. A member of Congress has questioned whether the National Institutes of Health (NIH) had acted illegally in hiring top-level officials as consultants, paying them more than they could earn as civil servants—or even as senators. But high pay is not a problem, according to a blue-ribbon panel that reported to NIH last week (see main text); on the contrary, it says some NIH salaries should be even higher.

    The issue involves a legal mechanism, part of Title 42 of the U.S. Code, that former NIH Director Harold Varmus began using in 1998 to raise institute directors' salaries and recruit intramural scientist-physicians above the top federal pay scale of $150,000 or $160,000. Under Title 42, NIH can pay as much as $200,000 (or more, with bonuses) to “special consultants” without approval from the Department of Health and Human Services (HHS). “It has made an enormous difference in our ability to recruit and retain an outstanding senior scientific staff,” says NIH intramural director Michael Gottesman.

    Recent NIH hires at the $200,000 salary level include, among others, five new institute directors; Dennis Charney, a leading neurobiologist who came to NIH's intramural program from Yale; and Anna Barker, an immunologist who headed a biotech firm and is now a deputy director of the cancer institute.

    A panel examining NIH's consulting policies concluded last week that although NIH's midrange salaries compare well with those in academia, $200,000 salaries for top leaders and clinicians are “far from competitive.” NIH needs to retain Title 42 or a similar authority and should ask HHS to raise the cap, the panel said.

    But Representative James Greenwood (R-PA), chair of the House Subcommittee on Oversight and Investigations, suggested in a 4 February letter to HHS that hiring senior officials as consultants could mean that they “lack the legal authority” to carry out their job duties. And using it to pay salaries higher than the vice president's is a major policy decision that should not be carried out without involving Congress, says a staffer.

  3. OCCUPATIONAL HEALTH

    Beset by Lawsuits, IBM Blocks a Study That Used Its Data

    1. Dan Ferber

    Workers who make the chips that form the brains and memories of computers and electronic devices use an array of nasty chemicals, some of which are known to cause cancer or are suspected of doing so. Whether on-the-job exposure has actually caused disease among chip workers is, however, a hotly contested issue. Good data are scarce, but some occupational health experts believe valuable clues may lie in a trove of records maintained by the computer giant IBM, which is now at the center of a legal dispute.

    Two researchers who gained access to IBM's records have produced an epidemiological study, paid for by attorneys suing IBM, concluding that former workers at IBM's computer-chip factories faced increased risks of dying from brain, kidney, blood, and skin cancers. But IBM recently moved to block publication of these findings, arguing that it had turned over the data only for use in a lawsuit and that a judge had ruled that the analysis could not be introduced as evidence in a trial because it was irrelevant and could be confusing to jurors. The company itself has since hired its own expert to conduct a new analysis of the data.

    The contested study—by epidemiologists Richard Clapp of Boston University and Rebecca Johnson of Epicenter in Circle Pines, Minnesota—analyzes a large set of mortality records on people who worked for IBM over a period of more than 3 decades. It was scheduled to appear in an upcoming special issue of Medical Clinics of North America, says guest editor Joseph LaDou, director of the University of California, San Francisco's International Center for Occupational Medicine. Four peer reviewers had read and approved the study, LaDou says. Then, on 30 March, Clapp withdrew it. IBM attorney Robert Weber confirms that a company lawyer warned Clapp he shouldn't publish. LaDou says, “I wanted this in the journal because it's the most definitive cancer study” to date on this industry. IBM's action, he adds, was a “serious disappointment to our scientific and academic freedom.”

    But Weber argues that academic freedom is not at issue. Clapp has no legal right to publish the mortality analysis because he is bound by the court's protective order that mandates he keep the data confidential, Weber says.

    A judge originally ordered IBM to open its files to Amanda Hawes and Richard Alexander of Alexander, Hawes and Audet in San Jose, California. They are attorneys for families suing the company over cancers in California and New York. The plaintiffs' attorneys then paid Clapp and Johnson to analyze the data. IBM attorneys say the court ruled that the data were available for use only in litigation and that Clapp had signed on to the protective order, which legally binds him to maintain confidentiality.

    Clean room?

    Workers employed in computer manufacturing in the 1970s and 1980s had an elevated risk of dying from specific cancers, according to a hotly contested study.

    CREDIT: ED KASHI/CORBIS

    Clapp declined to comment on the legal dispute, as did Hawes. Alexander and Johnson did not respond to telephone messages.

    The IBM lawsuits grew out of long-standing allegations that workers in semiconductor manufacturing were poorly protected from hazardous solvents and other chemicals that cause cancers and birth defects. IBM and other companies have denied this. But the health risks are worth investigating, says cancer epidemiologist John Bailar, a professor emeritus at the University of Chicago and current scholar-in-residence at the National Academy of Sciences in Washington, D.C. “Semiconductor workers in the past and at present suffer exposure to potent organic chemicals, including some known carcinogens,” says Bailar, who has no connection with the litigation. A handful of earlier studies had provided hints of elevated cancer risks among semiconductor workers, based on limited data.

    In 1997, former workers and their survivors began suing IBM. In discovery proceedings for one of the first of the approximately 200 ongoing lawsuits, a New York judge ordered IBM to turn over a database called the corporate mortality file, which tabulates the cause of death of more than 33,000 former IBM workers, as well as a database of work history on more than 18,000 deceased former workers. As an expert for the plaintiffs, Clapp had access to these files.

    In their study, which Science has obtained, Clapp and Johnson first conducted a preliminary analysis called a proportional mortality ratio. It spotlighted eight types of cancer in men and five in women that seemed to kill former IBM workers more frequently than would be expected in a group of average Americans. Then, they compared the percentage of cancer deaths from each type of cancer with the corresponding percentages in the general population—a measure called proportional cancer mortality ratio (PCMR). The PCMR analysis provided several statistically significant results: The 7697 IBM men who died of cancer were between 23% and 62% more likely to have died from cancers of the kidney, brain, blood, and skin. And the 1667 IBM women analyzed were 20% more likely to have died from kidney cancer. When the researchers focused on a subgroup more likely to have been exposed to chemicals—employees who worked at least a month at one of IBM's chip-manufacturing plants—PCMR results indicated that male workers were 62% to 79% more likely to have died from kidney, skin, or brain cancer, and female workers were 112% more likely to have died from kidney cancer.
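
    The arithmetic behind these ratios is simple enough to sketch. The snippet below is an illustration only: the function, the counts, and the 1.2% reference figure are hypothetical examples, not the Clapp and Johnson data or code.

    ```python
    # Illustrative sketch of a proportional cancer mortality ratio (PCMR).
    # All numbers here are hypothetical; this is not the study's code or data.

    def pcmr(deaths_from_cause, total_cancer_deaths, reference_fraction):
        """Share of cohort cancer deaths due to one cancer type, divided by
        the share that cancer type represents among cancer deaths nationally."""
        observed_fraction = deaths_from_cause / total_cancer_deaths
        return observed_fraction / reference_fraction

    # Hypothetical example: 150 kidney-cancer deaths among 7697 cancer deaths,
    # versus kidney cancer accounting for 1.2% of cancer deaths nationally.
    print(f"PCMR = {pcmr(150, 7697, 0.012):.2f}")  # 1.62, i.e. '62% more likely'
    ```

    A ratio above 1.0 means a cancer type accounts for a larger share of deaths in the cohort than in the comparison population; as the epidemiologists quoted below emphasize, such an excess is a clue about risk, not proof of a workplace cause.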

    Retreat.

    Under legal pressure, Clapp withdrew an epidemiological study from publication.

    CREDIT: ANDREW BRILLIANT/http://www.brilliantpictures.com/

    Bailar calls some of the numbers “elevated enough to make me worry.” The Clapp and Johnson analysis—the biggest cancer study in the electronics industry so far—is consistent with preliminary evidence from a small 2001 British-government study on a Scottish semiconductor factory that showed elevated cancer risk, he says. But the new results fall “short of proof.” Occupational epidemiologist Harris Pastides of the University of South Carolina, Columbia, concurs. Although the authors “reported very objectively,” he says, the study amounts to “an early part of the detective work to try to go down that road to causality.”

    IBM's lawyers reject the findings: “This is one of the clearest examples of what has been characterized as junk science,” Weber maintains. “It's a litigation-produced study in which lawyers supplied key data and gave direction on how the study was to be done.” Clapp says that he “totally disagrees” with that depiction. The corporate mortality file itself came from the plaintiffs' lawyers, he says, but he and Johnson did “everything after that,” including designing the study, choosing the statistical software, and analyzing the data.

    IBM's attorneys fought successfully to keep the analysis out of the first cancer lawsuit to come to trial, a closely watched case brought by two former IBM workers in San Jose, California. On 9 October, just before the trial started, Judge Robert Baines of California Superior Court barred jurors from seeing the Clapp and Johnson study because he said it did not document a link between the mortality data and workplace exposure to chemicals, making it “simply irrelevant and … highly prejudicial.” IBM won that case on 27 February.

    Bailar and other epidemiologists suggest that IBM could clear up lingering questions about whether semiconductor manufacturing work raises cancer risks by giving outside scientists access to records for the entire workforce rather than just for those who have died, which would allow scientists to look at overall rates of death. Researchers also would like to see IBM's data on workers' assignments and chemical exposures.

    In March, the Semiconductor Industry Association, a San Jose, California, trade group representing 95 of the largest semiconductor manufacturers, commissioned a retrospective study to see whether chip-manufacturing workers faced higher cancer risks. IBM's Weber says that the company has commissioned a separate study on possible workplace health hazards at IBM, led by Elizabeth Delzell of the University of Alabama, Birmingham. Delzell declined to discuss the details of the study but says she hopes to complete it this year, and she plans to publish it in a peer-reviewed journal.

  4. SPACE SCIENCE

    Hubble Alternative to the Rescue?

    1. Andrew Lawler

    With a robotic mission to extend the life of the Hubble Space Telescope a good bet for late 2007 or 2008, NASA is also quietly considering launching one of the replacement instruments on a free-flying telescope that will incorporate advanced optics and spy-satellite technologies.

    The robotic rescue could likely replace Hubble's dying batteries and gyroscopes and perhaps install a new wide-field camera. Replacing the Cosmic Origins Spectrograph (COS), however, may be beyond the reach of robotic systems. The $65 million instrument, which would gather data on early galactic evolution, requires complicated wiring best connected by human hands. So when NASA announced in January that it would not send humans to service Hubble again, the future of COS looked grim.

    But NASA chief scientist John Grunsfeld says one alternative is to launch COS aboard a spacecraft that makes use of the latest advances in mirror technology. Adaptive optics, now used on ground-based mirrors to counter thermal, gravitational, and atmospheric variations, could be used to lower both the cost and weight of a new space-based mirror and forge the way for future missions. “The angle is that we would partner with people who want to build lightweight mirrors,” says Grunsfeld, a former astronaut who flew on Hubble servicing flights. He adds that he has talked with Defense Department officials, as well as researchers at the University of Arizona in Tucson and industry heavyweights such as Spectrum Astro in Gilbert, Arizona. The target date is 2009.

    Ride wanted.

    The Cosmic Origins Spectrograph may be too hard for a robot to install on Hubble.

    CREDIT: BALL AEROSPACE & TECHNOLOGIES CORP.

    One official involved in the discussion envisions a 2-meter mirror nearly as capable as Hubble's but weighing only 200 kilograms, a fraction of the weight of Hubble's mirror. The spacecraft would weigh 2.5 tons, or one-fifth the weight of Hubble. The cost would be $200 million to $300 million, plus up to $100 million to launch the telescope into a high orbit aboard a Delta 2 expendable rocket. “It's Hubble Lite,” he said, adding that defense officials are eager to test advanced optics for use in their spy satellites.

    “It's potentially doable, and it would be nice to take steps in that direction,” says J. Roger P. Angel, a University of Arizona astronomer who has worked for decades on adaptive optics. But he warns that new technology usually takes more time and money than expected and that the nearly 1-ton size of COS poses a challenge.

    For now, however, Grunsfeld says NASA's focus must be on the robotic mission to Hubble. Before it even hears from a National Academy of Sciences panel examining Hubble's future, NASA plans to ask industry next month to propose ways to conduct the effort and award a contract by September. “It is clear to us that if we don't get moving, it's not going to happen,” says Grunsfeld.

  5. PLANETARY SCIENCE

    Endurance Has Its Rewards on Mars

    1. Richard A. Kerr

    It was a sight to warm the cockles of any geologist's heart: exposed rock, lots of it. Once the Mars rover Opportunity rolled up to the rim of the 130-meter-wide Endurance crater (below), the rover team saw “enormous outcrops of layered rock,” says science team leader Steven Squyres of Cornell University in Ithaca, New York.

    Opportunity had already spent weeks analyzing an outcrop at the 20-meter-wide Eagle crater, its landing site. The curb-high chunk of finely layered rock told of an acid, salty, shallow sea on early Mars. But with just one sliver of martian history to read, rover team scientists had no idea how long that water had stuck around on the surface of ancient Mars or what came before it.

    CREDIT: NASA/JPL/CORNELL

    Endurance, however, is “a spectacular impact crater that has exposed many meters of the history of Mars before the water revealed at Eagle crater,” says Squyres. If Opportunity's first distant look at the bare Endurance cliffs is any indication, Mars had a varied history before the time of the Eagle crater sea. Team member James Rice of Arizona State University in Tempe sees up to five rock layers lying below the light-toned layer also seen at Eagle crater. So far, remote spectral analysis of the older layers shows no trace of the salts detected in the Eagle outcrop, just the basaltic rock typical of Mars. Hints of cross-cutting layers suggest to Rice that some of the older beds may be sandstone laid down by ancient winds. “It may be a dune environment,” says Squyres, “it may be a beach. It isn't what we saw at Eagle crater.”

    Opportunity will be sizing up Endurance from its rim in coming weeks, but to get a reliable reading of martian history, Opportunity will have to edge down to rock exposures on the crater's steep inner walls. “If we go in,” says Squyres, “it will have enormous scientific potential, but there will be risk as well.” Before taking such risks, Opportunity will take care of unfinished business on the less exciting but safer plains.

  6. PALEONTOLOGY

    Evidence of Huge, Deadly Impact Found Off Australian Coast?

    1. Richard A. Kerr

    Seven geoscientists report online this week in Science (www.sciencemag.org/cgi/content/abstract/1093925) that they have found the scar of a large asteroid or comet impact just off the northwest coast of Australia that could have triggered the largest mass extinction ever, 250 million years ago. The proposed Bedout (pronounced “Bedoo”) impact could have triggered the Permian-Triassic (P-T) extinction, they say, the way the Chicxulub impact on the Yucatán Peninsula caused the death of the dinosaurs.

    Not so fast, say some researchers who specialize in deciphering signs of impact lingering in rock. “There's no convincing evidence for an impact origin” in the studied rocks, says impact petrographer Bevan French of the National Museum of Natural History in Washington, D.C. “Everything they're arguing was shocked [by impact] can have nonshock origins,” such as volcanic activity, he argues. Despite the variety of evidence presented in this and two earlier Science papers by the same principal authors (Science, 21 November 2003, pp. 1388 and 1392), impact-triggered extinction at the P-T has yet to meet broad acceptance.

    This search for a P-T impact crater started with oil exploration. On the basis of oil companies' seismic probing beneath the sea floor, oil explorationist John Gorter, now at ENI Australia Ltd. in West Perth, proposed in 1996 that the submerged Bedout High is the central peak of a large impact crater formed at the end of the Permian. Since that time, geochemist Luann Becker of the University of California, Santa Barbara, and colleagues have been trying to explain the apparent impact debris they were finding in Antarctica. They took a closer look at the oil exploration data and samples. A map of subtle gravity variations across the region reveals a ring reminiscent of Chicxulub's, they say. And radiometric dating of a Bedout High mineral grain recovered from the bottom of an oil exploration well puts its formation in the neighborhood of the 251-million-year age of the P-T.

    Impact debris?

    Microscopic details (left) from a sea-floor rock core (right) suggest the shock of a large impact.

    CREDIT: L. BECKER ET AL.

    But central to their argument are rocks from Bedout High. The shock waves racing away from a large impact are powerful enough to alter mineral crystals profoundly. Shock can rearrange the crystal structure of minerals into distinctive patterns, obliterate crystallinity entirely to produce the glassy product maskelynite, and even melt the mineral. Becker and her colleagues point to examples of maskelynite in their samples. They also cite many examples of melted crystals. In one case, a crystal encloses a melted core of the same chemical composition as the crystal. “It's nigh unto impossible to get that in a volcanic process,” says geochemist Robert Poreda of the University of Rochester in New York, who did much of the mineral analyses. “The only way you can do that is to shock melt it.”

    Some experts on shock effects on minerals are not persuaded. “I see nothing that would convince me there was an impact,” says Christian Köberl of the University of Vienna, Austria. At Chicxulub, the buried layer of jumbled and melted rock fragments always contains microscopic bits of minerals such as quartz riddled with distinctive banding due to shock, he says. “Where are all the shocked minerals?” he asks.

    Impact geologist Richard Grieve of the Canadian Geological Survey in Ottawa would also expect to see signs of flow frozen into the once-molten material. An impact's debris is deposited so violently that the molten rock should have been pulled like taffy as it suddenly cooled, he says. And he questions the paper's identification of shock-formed maskelynite. “I've never seen maskelynite look like this,” says Grieve. “I could be wrong, but I wouldn't add [Bedout] to the list” of proven impact craters he helps maintain. To make the list, say Grieve and the others, Becker and her colleagues need to apply some more powerful analytical tools, such as micro-Raman spectroscopy, to targets such as the putative maskelynite. Then perhaps Bedout can join the club.

  7. MOLECULAR BIOLOGY

    Consortium Tackles Mouse Regulome

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    OTTAWA, CANADA—Dozens of the world's leading molecular biologists have banded together to map out the biochemical instructions that allow organisms to make all the types of cells they need. The Canada-based effort, called the International Regulome Consortium, hopes to raise $100 million in public funding for what organizers are calling “the third generation of genomics.”

    The new consortium proposes to characterize and tag the 1486 known transcription factors in the mouse genome, as well as an estimated 600 coregulators that work together to control cellular and biological functions through networks called regulons. The entire suite of actors is known as the regulome. Understanding the processes by which a set of genes is regulated during development of an organism, or during its disease states, “will revolutionize our understanding of how cells function,” says Michael Rudnicki, senior scientist at the Ottawa Health Research Institute, who is chairing a steering committee that will manage the effort.

    Roughly 75 researchers from six nations (Canada, the United States, the United Kingdom, France, Italy, and Singapore) began sketching the parameters of their plan during a founding workshop held here 3 to 5 May. They focused on the technologies needed to purify the complexes, identify and tag the transcription factors, and eventually construct databases to store results in standardized formats. “The idea is to apply some very state-of-the-art genomic and proteomic technology to the biology of stem cells,” says Kevin Struhl, a professor of biological chemistry and molecular pharmacology at Harvard Medical School in Boston. They also hope to identify the complete set of DNA binding sites and corresponding target genes for the regulons in embryonic stem cells and a subset of the cells they differentiate into.

    Model for regulation.

    An ambitious plan to map the genome's regulatory elements will focus first on mouse embryonic stem cells.

    CREDIT: MICHAEL RUDNICKI/OTTAWA HEALTH RESEARCH INSTITUTE

    The mouse is the obvious choice to focus on, says William Skarnes, senior scientist at the Wellcome Trust Sanger Institute in Cambridge, U.K., given the vast amount of data already available on the animal and its similarity to humans. The group's initial work with mouse stem cells “is not going to tell us everything about all aspects of mammalian biology,” acknowledges University of Toronto professor Jack Greenblatt, but he and others hope it will yield important principles that would apply to other types of cells and processes.

    Still, the genetic homogeneity of largely inbred mouse strains may be misleading when it comes to understanding human stem cells, cautions Peter Andrews, professor of biomedical science and co-director of the Centre for Stem Cell Biology at the University of Sheffield, U.K. “In the human, every embryonic stem cell that we're working with that comes from a different person is genetically different,” says Andrews. “We don't know, at the moment, what significance that genetic heterogeneity will have. It may very well be that the behavior of different embryonic stem cells varies because of their different genotype.”

    Consortium members hope that governments will provide support to help the group get organized as well as for ongoing operations and research. The Canadian participants are looking to a combination of federal and provincial funding agencies, including Genome Canada, the Canadian Institutes of Health Research, and the Ontario Research & Development Challenge Fund, to contribute about half the total needed.

  8. BIOMEDICAL RESEARCH

    New NIH Training Grants Open to Foreign Students

    1. Jeffrey Mervis

    For the past 30 years, the National Institutes of Health (NIH) has awarded institutional training grants and fellowships that come with a major proviso: for U.S. citizens and permanent residents only. But that is about to change. NIH is quietly launching a training program for the 2004–05 academic year that will be open to all, regardless of citizenship. The agency is now reviewing the first set of proposals for the $6 million initiative and expects to select about a dozen institutions for 5-year awards of up to $600,000 a year.

    The impetus for the new program, called “Training for a New Interdisciplinary Research Workforce,” came from a strategic plan (the Roadmap) drawn up by NIH Director Elias Zerhouni after his arrival in Bethesda 2 years ago. “It grew out of the concepts in the Roadmap, which was to rethink everything NIH is doing and be as inclusive as possible,” says Wendy Liffers of the policy shop at the National Institute of Dental and Craniofacial Research. As its name implies, the program is aimed at increasing the number of scientists trained in interdisciplinary research. “If somebody came to the United States already steeped in this approach, then we want to give them a way to continue it here,” says Terry Bishop of the National Institute of Diabetes and Digestive and Kidney Diseases, which will run the competition.

    NIH grantees have always been able to support foreign students and postdocs through their research grants; indeed, foreign-born students are now a majority in some fields. But the agency's primary means of training undergraduates, graduate students, and postdocs—called the National Research Service Award program—is restricted to domestic students under a 1974 law. “Legislators generally feel that training dollars should stay at home,” explains one congressional aide familiar with the various NIH funding mechanisms.

    To avoid running afoul of that law, NIH will combine research and training in the new program (grants.nih.gov/grants/guide/rfa-files/RFA-RM-04-015.html). Foreign students will be supported by research funds and domestic students by training money, although both will receive the same kind of training. “Scientifically, it doesn't matter, of course,” says Bishop. NIH will, however, create a hybrid accounting system to track how many of each are being served. “Congress likes to know how many dollars we spend on training, and we thought that they might ask,” Bishop adds.

    NIH officials say they will be monitoring the new program carefully. “We are doing it as a pilot, and I don't know how long it will last,” says Walter Schaffer, head of NIH extramural training programs. “But everybody seems to think that it's an idea worth trying.”

  9. U.S. IMMIGRATION

    Groups Urge Easing of Restrictions on Visa Policies Affecting Scientists

    1. Yudhijit Bhattacharjee

    Sixteen academic and professional organizations this week asked the Bush Administration to take steps to ease the entry of foreign scientists and students into the United States without undermining national security. “We are confident that it is possible to have a visa system that provides for thorough reviews of applicants and still welcomes the brightest minds in the world,” say the signers, which include the National Academy of Sciences, the Association of American Universities, and AAAS (which publishes Science).

    The organizations cite six problems in the current visa system and offer ways to reduce their impact on scientific exchanges and the global flow of information (www.aaas.org/news/releases/2004/0512visa.shtml). The changes would include fast-tracking visa applications that have been pending for more than 30 days, extending the validity of security clearances from 1 year to the individual's duration of study or academic appointment, and improving the ability of consular officers to recognize when a more detailed review is required and when it is not needed. The signers also propose extending the duration of visas for international students and scientists by revising visa reciprocity agreements between the United States and countries that send large numbers of scientists, such as China and Russia. And they suggest establishing a mechanism for allowing students and scholars on F and J visas to initiate the visa renewal process before leaving the United States for professional or personal travel. “We now have a consensus that the country must facilitate access to legitimate visitors without compromising security,” says Victor Johnson, associate director of NAFSA: Association of International Educators, a co-signer of the statement. “But it is never easy to translate a policy consensus into desired bureaucratic behavior.”

    State Department officials say they are already moving in that direction. The 1-year rule, implemented last summer, replaced a process that subjected visa holders to a security review each time they sought to travel abroad. There's “a good chance” that the validity of clearances will be extended further, says a former senior department official. The department is also trying to reduce the number of cases reviewed by an interagency panel by training consular staff to better identify fields of study regarded as sensitive.

    The coalition argues that improving visa processing efficiency will benefit both national security and U.S. higher education and science. Implementing these measures, the signers say, will correct “the misperception that the United States does not welcome international students, scholars and scientists.”

  10. INTERNATIONAL COOPERATION

    Priorities for Rebuilding Civilian Iraqi Science

    1. Richard Stone

    CAMBRIDGE, U.K.—On a visit to Iraq's science ministry one morning last March, Abdalla Alnajjar, president of the Arab Science and Technology Foundation (ASTF), recalls how impressed he was at first with its apparent vitality. Throngs of Iraqi scientists filled the halls and courtyards of the ministry complex. Alnajjar soon realized, however, that the vast majority were biding their time, with nothing to do, until clocking out around noon. “I was shocked,” says Alnajjar, a physicist. “Most of the scientists are reputable and deserve to be working, not standing around in the sun.”

    Alnajjar is in a privileged position to guide many out of their torpor. ASTF, based in the United Arab Emirates, and Sandia National Laboratories in Albuquerque, New Mexico, are moving forward with what could become the biggest program by far for rebuilding civilian Iraqi science. In a draft report, the initiative's organizers spell out a number of priority areas—from better water to biotech (see table)—in which they hope to engage Iraqi scientists. The program aims to raise $50 million for its first 12 to 18 months of operations, Alnajjar says.

    The initiative comes at a critical moment for Iraq's scientific community. The U.S. State Department has crafted ambitious plans for redirecting Iraqi scientists once involved in weapons of mass destruction (WMD) programs (Science, 12 March, p. 1594). But the prison abuse scandal has complicated this effort. “It may be touch and go for a while,” admits a State Department official. An official at the Coalition Provisional Authority confirms that things are “pretty tense” on the ground. “Some of our [Iraqi] scientists feel threatened for cooperating,” he says. “But we are moving forward.”

    Because the ASTF-Sandia initiative involves Arab scientists, it “could arouse fewer suspicions among Iraqi scientists and encourage scientists with WMD knowledge to come forward,” says Michael Roston, an analyst tracking the Iraq initiatives for the Russian American Nuclear Security Advisory Council. And its “big tent” approach to all scientists, says the State Department official, “makes this a useful and potentially important complement to our WMD-focused effort.”

    Under the ASTF-Sandia banner, teams of Arab and Iraqi experts visited Iraq between January and March. They met with nearly 200 scientists and gathered some 450 proposals, which they later winnowed down to 170 for serious consideration.

    The experts chronicled the deterioration of the Iraqi scientific community since the first Gulf War in 1991. Most young scientists in Iraq have never spent time in Western labs—and it shows. “The result is a significant loss of expertise, including a marked scarcity of mathematicians,” the report notes.

    That jibes with impressions of U.S. officials. At a recent meeting in Baghdad to discuss the plight of Iraq's universities, one dean after another bragged about the many science Ph.D.s their faculties had produced over the past decade. At last, the sole woman among the Iraqis, Siham Afif Kandela of Al Nahrain University, had had enough. “She stood up and declared that the universities, even her own, shouldn't even be granting degrees,” recalls the State Department official. Kandela, an expert on atomic laser spectroscopy, explained that conditions had eroded so badly since 1991 that many Ph.D.s had graduated without ever having done a lab experiment.

    Devastated land.

    The ASTF-Sandia initiative gave high priority to restoration of drained marshes such as this one in Querna, southern Iraq.

    CREDIT: ELIZABETH DALZIEL/AP PHOTO

    The U.S. government is still mulling how much it will contribute to the ASTF-Sandia initiative; the lion's share of funding is expected to come from other governments and nonprofits, says Alnajjar, who is organizing a workshop to match funders and scientists in September.

    In the interim, the draft report advocates providing temporary safe havens in Arab and other institutions for select Iraqi scientists. “This is one way to safeguard highly valuable scientists from losing their expertise and living standards,” the report states. It's unclear, though, how many scientists might be offered such positions and how they would be chosen. For Iraqi scientists now spinning their wheels, time is of the essence. “We cannot allow them to fade away,” Alnajjar says.

  11. RESEARCH ETHICS

    South Korean Cloning Team Denies Improprieties

    1. Dennis Normile

    A team of South Korean researchers has found that being on the cutting edge of controversial research brings both plaudits and scrutiny. In March they garnered headlines for the first successful production of a human embryonic stem cell line from cloned human cells (Science, 12 March, p. 1669; published online 12 February). But in a news story in the 6 May issue of Nature, the group was blasted for possible ethics violations. Specifically, Nature reported that the group, led by veterinary cloning expert Woo Suk Hwang and gynecologist Shin Yong Moon of Seoul National University, may have improperly used one of the 15 co-authors as an egg donor. Hwang emphatically denies that charge. But since then, other questions have arisen, suggesting that scientists involved in such sensitive research are going to have to carefully toe the line to avoid gray areas of ethical impropriety.

    According to the paper and accompanying online material, 16 volunteers underwent hormone treatment to stimulate overproduction of maturing eggs, creating a total of 242 eggs. The authors say that the women donated specifically for this experiment, were not compensated, and were informed that they would not personally benefit from the research.

    Last week Nature reported that in an interview a member of the research team admitted being one of the egg donors, raising questions about whether she profited professionally by being a co-author. Nature quoted bioethicists as saying that, to avoid any hint of coercion, there should be an arms-length relationship between the research group and the donors.

    Hwang blames the language barrier for “a miscommunication.” He says the woman had tried to explain that, in the future, she would be willing to donate eggs for such research by other groups. Moon-il Park, a professor of obstetrics and gynecology at Hanyang University in Seoul and chair of the Institutional Review Board (IRB) at the university hospital that approved the research plan—the eggs were harvested at the hospital—wrote in an e-mail that no one from Hwang's team was among the 16 volunteers. “I confirmed this after being contacted by Professor Hwang” regarding the allegations, he wrote.

    Emphatically not.

    Woo Suk Hwang, shown here returning from a trip to the United States, says allegations are baseless.

    CREDIT: YUN SUK-BONG/EPA/AP PHOTO

    But other questions are being raised. In their paper, Hwang and Moon note that the study was partly supported by a grant from the Stem Cell Research Center, which Moon heads. The center's own IRB prohibits therapeutic cloning until national guidelines for such research are developed. Hwang says that despite the acknowledgment in the paper, the team did not rely on the Stem Cell Research Center funding to support the research but only used “technical assistance” from the center to culture the cloned embryos and stem cells after they were established.

    One member of the center's IRB, Young-Mo Koo, a professor of medical ethics at the University of Ulsan College of Medicine in Seoul and secretary of the Korean Bioethics Association, says he feels “deeply betrayed by our colleagues' research.” But not all the IRB members share those feelings. Un-Jong Pak, a lawyer on the faculty of Seoul National University, says that from the information she has seen so far, “they have not abused [established] procedures” in conducting their research.

    Koo says the bioethics association worries that these researchers have been skirting the edges of propriety when “such leading scholars should be exemplary [in their] bioethics.” Last December, the South Korean legislature approved a Bioethics and Biosafety Act that will allow strictly controlled therapeutic cloning beginning in January 2005, once a new National Bioethics Committee is in place. Koo says Hwang and Moon “failed to take into consideration the social consensus” that such research needs national oversight.

    Hwang says they have done nothing wrong. Still, he says they are suspending their therapeutic cloning research until the new law takes effect: “This research is not illegal, but I want to build a consensus about our research among nongovernmental and religious groups.” His recent experience, though, makes him pessimistic about achieving that consensus.

  12. METEOROLOGY

    Storm-in-a-Box Forecasting

    1. Richard A. Kerr

    Ever-cheaper computing is making the prediction of the most destructive weather a local affair

    The two tornadoes bearing down on Fort Worth, Texas, weren't a complete surprise: The National Weather Service (NWS) had alerted the public a few hours earlier that storms likely to spawn twisters might be on the way. But the tornadoes, rain, and baseball-size hail of 28 March 2000 still went on to kill five people. The alert helped, but more details and greater confidence could have helped more.

    Part of the problem was that a half-day earlier, NWS's weather forecasting models being run back East—the basis for NWS as well as commercial forecasts across the United States—didn't give a clue that North Texas was in for any untoward weather. Globe-spanning computer models run on central supercomputers are increasingly adept at predicting the broad weather picture (see sidebar). But they often miss violent, small-scale weather such as tornadic storms developing over Texas, thunderstorms appearing on the coast of Florida, or the details of a turbulent North Pacific storm slamming into the rugged mountains of Washington state.

    The answer to violent little surprises such as the storm that hit Fort Worth, an increasing number of meteorologists say, is weather forecasting models that focus on regional weather in unprecedented detail and are fine-tuned to local conditions. “Weather is local, with a lot of local influences,” says meteorologist Kelvin Droegemeier of the University of Oklahoma (OU), Norman. “Many of us feel the future of weather forecasting is regional weather models run locally.” That future could soon be here.

    Take the Advanced Regional Prediction System (ARPS) developed by Droegemeier and his OU colleagues. It can take in the broad picture of the weather around North Texas in the hours before the Fort Worth tornadoes, add in local details such as Doppler radar data, and then predict with striking verisimilitude the development of the intense storms that crossed Fort Worth in the following hours (see figure). And, fueled by faster, cheaper computing power, better predictions should be on the way. A new regional model—the Weather Research and Forecasting (WRF, pronounced “warf”) model—that incorporates and improves on the best features of current models will be supplanting the leading regional models later this year.

    Bingo.

    When forecasters included local weather observations in a high-resolution model that had been optimized to local conditions, they were able to forecast tornadic storms 2 hours before they hit Fort Worth, Texas (fuchsia star).

    CREDIT: CENTER FOR ANALYSIS AND PREDICTION OF STORMS, UNIVERSITY OF OKLAHOMA (XUE ET AL., 2003)

    How soon local modeling will become widespread, however, remains to be seen. A few scattered local centers are already making routine storm-scale—“mesoscale”—forecasts, and NWS's National Centers for Environmental Prediction (NCEP) in Camp Springs, Maryland, will be running the WRF over various U.S. regions. But the WRF has no immediate prospects of being tuned to local conditions and run routinely at regional NWS forecast offices.

    Getting up close

    Mesoscale forecasting is global modeling writ small, with an added dollop of local flavor. Because global models must run within finite computing power, they take a broad-brush approach. First they paint a picture of the current weather using observations from around the world. This snapshot is a fuzzy one, like a coarsely executed pointillist painting, because the model records the state of the weather only at widely separated points in the atmosphere. The points come at the intersections of the lines of checkerboard grids stretched across the surface and stacked up through the atmosphere. These grid points are separated by 40, 60, or more kilometers horizontally.

    A global model knows the weather only at widely spaced grid points because it couldn't handle any more detail at the next step in the forecasting process. After forming its initial picture of the weather, it must calculate how the atmosphere would evolve under the laws of physics at each of thousands of grid points, minute by minute into the future, for hours and days on end. Such global forecasting at even 40-kilometer resolution takes the biggest supercomputer that a rich nation's weather service can afford. Just doubling the resolution globally to 20 kilometers would require eight times more computer power. Such global model forecasts—in the United States, NCEP runs them daily—form the starting point for forecasts issued by NWS forecast offices as well as for the forecast maps put out by private forecasters, from Accuweather and The Weather Channel to the local TV meteorologist.
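
    That eightfold figure follows from simple scaling, sketched below. This is a back-of-the-envelope illustration under common assumptions (grid points grow with the square of horizontal resolution, and the time step shrinks in proportion to the grid spacing), not an account of how NCEP actually budgets its machines.

    ```python
    # Rough cost scaling for a global forecast model (illustrative assumption):
    # halving the horizontal grid spacing doubles the points in each horizontal
    # direction (4x more points) and roughly halves the stable time step
    # (2x more steps), so about 8x more computation overall.

    def relative_cost(spacing_km, baseline_km=40.0):
        refinement = baseline_km / spacing_km   # 40 km -> 20 km gives 2
        return refinement**2 * refinement       # (grid points) x (time steps)

    for spacing in (40, 20, 10):
        print(f"{spacing:>2} km grid: ~{relative_cost(spacing):.0f}x the cost of a 40-km run")
    # 40 km: ~1x, 20 km: ~8x, 10 km: ~64x
    ```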

    Mesoscale modelers get beyond the fuzzy global forecast by starting with the portion of the global model's initial weather picture that covers their region, increasing model resolution there, and plugging in local weather observations. Mesoscale Model 5 (MM5), the most widely used mesoscale model among researchers, now in its fifth generation, was developed at Pennsylvania State University, University Park, and the National Center for Atmospheric Research (NCAR) in Boulder, Colorado.

    Thirteen years after its debut, MM5's most ambitious operation is at the University of Washington (UW), Seattle, under the supervision of meteorologist Clifford Mass. Run within the 13-member Northwest Modeling Consortium, which includes UW and NWS, MM5 tackles forecasting over a checkerboard grid with just 4-kilometer spacing that covers only the U.S. Pacific Northwest. MM5 forecasts are distributed to consortium members, who use them in turn to forecast regional environmental conditions such as air quality and stream flow. The MM5 forecasts are also posted on the Web.

    The mesoscale edge

    Concentrating the forecasting effort in one small region has several advantages. One is that the forecaster can incorporate weather observations that never make it into a global model. In the case of the Pacific Northwest, additional observations that improve the first 6 hours of a forecast can come from the NorthwestNet—a compilation of more than two dozen regional networks of surface observations sent from ships, ferries, schools, buoys, and ground stations monitoring everything from farms to mountain avalanche areas. Regional observations can also come from commercial aircraft, NWS Doppler radar, and an upward-looking radar that profiles temperatures and winds with altitude. In the case of the Fort Worth tornado, the inclusion of Doppler radar data let ARPS accurately forecast the tornadic storms that developed just to the north of Fort Worth; without the Doppler data, the model would never have seen them coming.

    Another mesoscale-model advantage is its high resolution. MM5's 4-kilometer resolution, notes meteorologist-in-charge Christopher Hill of the NWS forecast office in Seattle, allows a more realistic simulation of storms and their winds interacting with the mountainous terrain of the region. The more detailed the mountains in the model, for example, the more likely moisture-laden model winds will drop the right amount of rain or snow as they rise over the model mountains.

    A third edge for mesoscale modeling—one not achieved in every forecasting operation—is the adaptation of the model to local conditions. “One-size-fits-all numerical weather prediction is not necessarily the best approach,” Mass has written. “What's appropriate in one place,” he says, “may not be appropriate in another.” NWS's mesoscale model Eta, for example, runs a large grid with 12-kilometer spacing that can be placed over the eastern, central, or western lower 48 states to sharpen forecasters' views of approaching storms. Nothing about the model is changed between one region and the next, however, leading to degraded performance in the rugged, high-standing West, Mass notes. Eta's grid in effect runs into mountains there rather than following the topography. That creates computational problems that cause unnatural features, such as too much blocked air flow.

    Out of a fix.

    A forecast model tailored to the Antarctic helped rescuers extract personnel from the supply ship Magdalena Oldendorff (left).

    CREDIT: COURTESY OF ARGENTINE NAVY, AVAILABLE VIA SANTIAGO L. AVERSA/FUERZASNAVALES.COM

    A good example of the need for local adaptation came when researchers at NCAR and Ohio State University in Columbus wanted to run MM5 over high southern latitudes to assist air and sea operations supplying the U.S. stations in Antarctica. Because the Antarctic environment is so different from that of the central United States, where MM5 was developed, they had to adjust a half-dozen different atmospheric processes that move model heat and radiation among ice, snow, clouds, and air. Indeed, they modified it so heavily that they gave it a new name: the Polar MM5.

    The heavy tuning of Polar MM5 seems to have paid off. In April 2001—early winter in Antarctica—Polar MM5 predicted a break in the blowing snow at the U.S. South Pole station, allowing an evacuation plane to land in darkness on an unlighted runway to evacuate an ailing station staff member. And in June 2002, Polar MM5 helped guide a rescue ship around a threatening storm to reach the German supply ship Magdalena Oldendorff trapped in Antarctic sea ice with 28 crewmembers and 79 Russian scientists. All were safely removed, the last of them during an accurately predicted window of favorable weather on 1 July.

    Detailed but cheap

    The advent of mesoscale-model forecasting—from the Pacific Northwest to Antarctica—has been a grassroots movement fueled by the plummeting cost of high-speed computing. “The world has changed,” says Mass. “Now you can have tremendous computing power locally.” The workhorse in the Northwest is a Linux cluster of 40 2.8-gigahertz processors with 6 terabytes of disk storage, which together cost $80,000. “I have as much computer power as NCEP had a few years ago,” he says, power that cost NWS millions per year.

    Patrick Welsh, science officer at the NWS forecast office in Jacksonville, Florida, agrees that “the revolution in computing has been phenomenal.” With a $25,000 Coastal Storms Initiative grant from the National Oceanic and Atmospheric Administration, parent agency of NWS, “we can build a cluster with the throughput of a supercomputer of the mid-'90s,” says Welsh. The resulting WRF forecasts for the greater North Florida area captured the local sea-breeze winds better than ever before, says Welsh. That matters in North Florida because it's the sea breeze pushing inland that often sets off the region's abundant thunderstorms. With locally run mesoscale modeling, Jacksonville forecasters now often forecast afternoon thunderstorms to within a few minutes of their occurrence. Similarly cost-effective operations using MM5 have been applied to forecasting how cold it will get on a U.S. Army test range in Alaska; how much snow would fall at the 2002 Winter Olympic Games in Salt Lake City, Utah; and what combat conditions might have been like in the fall of 2001 in Afghanistan.

    Although ever-cheaper computing was helping spread the local operation of mesoscale models (mainly MM5), the mesoscale community still had a problem. Researchers (mainly in the universities) and operational forecasting modelers (mainly at NCEP and in the military) “weren't using the same models,” says Mass, “so the research developments weren't going into the operational models. It was not healthy.”

    The solution to the community split is going to be the WRF model now in the last stages of development and testing. The product of a community collaboration involving NCAR and six government agencies including NWS, WRF builds on MM5 and Eta with new and improved versions of software, numerical calculation techniques, approximations of physical processes, and construction of an initial weather picture.

    Perhaps most promising is the ability of WRF users to plug in different model components as they are developed. “That has the benefit of greatly facilitating moving advances made in the research community into the operational configuration,” says Joseph Klemp of NCAR, who has coordinated the WRF project. This fall, WRF will replace the regional version of Eta, says Geoffrey DiMego, mesoscale branch chief at NCEP. Next fall, it will replace the Eta version whose grid encompasses all of North America and in 2006 the version used to forecast hurricanes. Most MM5 users outside NWS will be transitioning to WRF as further development of MM5 ends and NCAR training support for MM5 users is eliminated.

    Local is good

    Broad use of WRF will accelerate advances in mesoscale modeling, all agree, but many modelers would like to see more regional modeling being done locally. “NCEP is always going to be the center for modeling,” says NWS's Welsh. “But I believe there's a place in weather forecast offices for a localized, customized model for a particular part of the country. I don't know that every part of the country needs one; I'm convinced Florida does.”

    Mass agrees on the need for regionally based forecasting. “Each part of the country has different needs; the way you do the modeling is different,” he says. A local mesoscale-model forecaster is also more likely to find enough local observations to feed the model, he says. And a local forecaster will be close to those who use the forecasts: a state water agency predicting river flows, a state environmental agency predicting air quality, or a U.S. Forest Service office planning controlled burns. Those connections would also make it easier to raise the $200,000 to $300,000 needed from diverse sources each year for operational support of a forecasting system such as the Northwest Modeling Consortium.

    Such local funding will likely be vital to the continued expansion of mesoscale forecasting, at least for a few years. According to Nelson Seaman of NWS in Silver Spring, Maryland, NWS support for WRF at the regional level will consist of sequential trial runs at a half-dozen sites around the country during the next few years. So far, nothing is being promised beyond that to accelerate the devolution of forecasting power into a truly local affair.

  13. METEOROLOGY

    No End Yet to Forecast Advances

    1. Richard A. Kerr

    While weather forecasters have been sharpening their views of tomorrow's weather in their own backyards (see main text), other researchers have been keeping up their seemingly inexorable improvement in forecasting next week's big picture of the weather. “The harder you work on each aspect of the forecast system, the better the forecasts become,” says forecast model developer Anthony Hollingsworth of the European Centre for Medium-Range Weather Forecasts in Reading, England.

    In its nearly 25-year history, work on medium-range forecasting by computer models has extended the length of high-quality forecasts from about 2 days to about 4 days, says Hollingsworth. That improvement required that the biggest weather forecasting models be run on the most powerful supercomputers governments could afford. The forecasts are graded on how well they predict only the general weather patterns around the world: the position and intensity of fair-weather high-pressure systems and stormy lows. Lower quality but still useful forecasts have been extended from 5.5 days to almost 8 days. “I hope we'll see useful 10-day forecasts by the end of this decade, in the winter at least,” says Hollingsworth.

    Up, up, up.

    Forecast accuracy out to 7 days ahead is still rising. The gap between the north and south (color) has closed while overall accuracy has increased.

    CREDIT: ADRIAN SIMMONS/EUROPEAN CENTRE FOR MEDIUM-RANGE WEATHER FORECASTS

    The most dramatic improvement of the past decade came in the Southern Hemisphere. Any computer forecast must begin with a picture of the current weather; the more accurate the initial picture, the more accurate the forecast. But the predominance of ocean over land in the Southern Hemisphere has always meant a dearth of places from which to make weather observations. In the 1990s, the advent of sophisticated weather satellites and of new ways of assimilating their observations into forecasts accelerated improvements in the south, says Hollingsworth. In the past 3 or 4 years, the gap between forecast skill in the north and south has closed. Over both hemispheres, forecasting into next week also benefited from more detailed model simulations and more realistic representations of a model's physical processes, such as cloud formation, says Hollingsworth.

    All these improvements required ever-increasing computer power and new and more efficient ways to do the required numerical computations. But human forecasters are still staying ahead of their machines, says James Hoke of the National Weather Service's National Centers for Environmental Prediction in Camp Springs, Maryland. Knowing the shortcomings of the models, human forecasters are adding 10% to 15% to the skill of forecasts over that of the models alone, he says. But there's a theoretical limit to prediction—whether machine or human—somewhere around 14 days, when atmospheric chaos prevails. And as models continue to improve, Hoke says, the amount of room available for forecast improvement by humans will eventually shrink. Someday, the machines could take over.

  14. PLANETARY SCIENCE

    Skywatchers Await the Fleeting Shadow of Venus

    1. Robert Irion

    On 8 June, Venus will cross directly in front of the sun for the first time in 122 years

    Venus usually shines like a brilliant beacon in the morning or evening sky. But on 8 June, our sister planet will assume a darker guise: a circular blot, slowly crossing the sun's face in a dramatic 6-hour “transit.”

    No one alive has seen this mini-eclipse, which last occurred in 1882. Astronomers of that era launched lavish excursions to capture the event with newly invented cameras. This time, some researchers will use the transit as a dress rehearsal for studying extrasolar planets; others will probe the causes of an odd optical distortion. Space agencies also plan observing campaigns to educate the public about the workings of our clocklike solar system.

    Part of that clock is the sporadic timing of Venus transits. The planet rarely crosses a direct line between the sun and Earth, because its orbit tilts 3.4 degrees relative to the plane of Earth's path around the sun. When a transit does occur, a second one usually happens 8 years later. Those who miss the show in June will have another chance in 2012—the last alignment for 105 years.

    After the first sighting of a transit in 1639, each one grew in cultural impact. The 1874 and 1882 events were such phenomena that composer John Philip Sousa wrote a march called “Transit of Venus,” and a Harper's Magazine cover depicted Appalachian children watching the sun through a smoked pane of glass.

    Astronomers were captivated as well. “It was like a space race in the 19th century to make accurate measurements of the transits,” says NASA chief historian Steven Dick, formerly of the U.S. Naval Observatory in Washington, D.C. Indeed, the U.S. Congress funded eight expeditions in 1874 for a princely $177,000, and Russia fielded a whopping 26 teams. Their goal was the same: to measure the exact moments when the full circle of Venus entered and exited the sun's disk. Once they gauged those times at many places on Earth, astronomers could use surveying methods to calculate the Earth-Venus distance. Then, Johannes Kepler's orbital laws would yield the long-sought “astronomical unit” (AU)—the distance between Earth and the sun.
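    The arithmetic behind that method is compact. As an illustrative sketch (using standard modern orbital periods rather than any particular 19th-century data set), Kepler's third law fixes the relative sizes of the two orbits, and the transit parallax supplies the single absolute distance needed to scale them:

```latex
% Kepler's third law relates orbital periods to orbital radii:
\[
\left(\frac{a_V}{a_E}\right)^{3} = \left(\frac{T_V}{T_E}\right)^{2}
\;\Rightarrow\;
a_V \approx \left(\frac{224.7~\mathrm{d}}{365.25~\mathrm{d}}\right)^{2/3} a_E \approx 0.72\,a_E .
\]
% At mid-transit, Venus lies between Earth and the sun, so
\[
d_{EV} \approx a_E - a_V \approx 0.28\,a_E
\quad\Longrightarrow\quad
1~\mathrm{AU} = a_E \approx \frac{d_{EV}}{0.28} .
\]
```

    Triangulating the timed contacts from widely separated stations gives the Earth-Venus distance in kilometers; dividing by 0.28 then yields the astronomical unit of roughly 150 million kilometers.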

    The answers were close to what is now known to be the true value of about 150 million kilometers, but scientists were skeptical. The problem was the “black-drop effect”: a distortion that stretches the silhouette of Venus into the shape of a water drop when the transit begins and ends. “The black-drop effect makes it extraordinarily difficult to determine when the planet's edge actually touches the inner edge of the sun,” says astronomer Edward DeLuca of the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, Massachusetts. By the 1890s, other methods for measuring the AU were deemed far more accurate.

    Sic transit Venus.

    Skywatchers across much of the globe will see Venus drift across the sun's face on 8 June. These images, from Lick Observatory in California, recorded the last transit in 1882.

    CREDIT: UNIVERSITY OF CALIFORNIA OBSERVATORIES/LICK OBSERVATORY

    Many popular accounts blame the optical tricks on Venus's thick clouds, but astronomers agree that the planet's atmosphere can't bend light severely enough. Sharp observers in the 18th century first suggested another source: Earth's own blanket of air, a deduction confirmed in 2001 by astronomer Bradley Schaefer of Louisiana State University in Baton Rouge. Using computer models, Schaefer showed that smearing within Earth's atmosphere—which also makes stars twinkle—blurs Venus's disk in the telltale way during transits. Diffraction of light within a telescope adds more warping, he noted.

    But that's not the full story. When a satellite far above the atmosphere watched a transit by the planet Mercury in 1999, it also spotted a black-drop effect, according to a recent study in Icarus. Astronomers Glenn Schneider of the University of Arizona in Tucson; Jay Pasachoff of Williams College in Williamstown, Massachusetts; and Leon Golub of CfA concluded that the effect came from the spread of light within the satellite's camera and the dimmer appearance of the sun's edge, an effect called “limb darkening.” The same satellite—NASA's Transition Region and Coronal Explorer (TRACE)—will observe the Venus transit in June to reveal the relative impacts of each distortion once and for all. “It will solve the black-drop mystery totally,” says DeLuca, a TRACE scientist.

    Others on the ground also plan to watch. Astronomers Wolfgang Schmidt of the Kiepenheuer Institute for Solar Physics in Freiburg, Germany, and Timothy Brown of the National Center for Atmospheric Research in Boulder, Colorado, will use a 0.7-meter solar telescope in the Canary Islands to take detailed spectrographic images. They will try to measure wind speeds in the upper atmosphere of Venus by detecting Doppler shifts in the spectral lines of carbon dioxide gas, illuminated by the bright sun behind.

    “This is an unprecedented experiment,” Brown says. “No one knows how it will work.” Ultimately, astronomers might adopt a similar approach to study the atmospheres of transiting planets outside the solar system, he notes. Any such effort would have to be exquisitely sensitive to faint changes in the pattern of a distant star's light.

    Beyond the potential research, scientists expect a surge of public interest as the transit nears. Both NASA and the European Southern Observatory are sponsoring public-viewing campaigns* and live Webcasts. Participating students will record transit times and learn how to calculate an AU. Viewers in most of Europe, Africa, and Asia will get to watch the transit from start to finish, although those in the eastern half of the United States must settle for a shorter taste at sunrise. Other Americans will miss out—but a sunset view of the next transit awaits in 2012.

    Even grizzled scientists are eager for 8 June to arrive. “The romance and history of Venus transits are wonderful,” says Brown. “If nothing else, this will be a great time.”

  15. AMERICAN PHYSICAL SOCIETY MEETING

    Once Again, Dark Matter Eludes a Supersensitive Trap

    1. Charles Seife

    DENVER, COLORADO—APS's April meeting (held here 30 April to 4 May) brought together 1000 researchers in nuclear and high-energy physics, astrophysics, and related fields.

    Dark matter has just become a shade darker. At the APS meeting, physicists from the Minnesota-based Cryogenic Dark Matter Search (CDMS) reported that the first results from the most sensitive dark-matter detector ever built had failed to reveal the invisible particles that theorists believe make up most of the mass in the universe. The finding nails shut the coffin on a controversial claim to have spotted dark matter, but if the particles continue to be no-shows, that would spell trouble for scientists' understanding of our universe.

    Almost all astrophysicists are certain that dark matter exists. Several lines of evidence suggest that about 85% of the universe's mass is invisible. Stranger still, the observations imply that this mass is not the ordinary matter that makes up stars and planets and people. It must be made of an entirely different type of particle. The leading candidate by far is known as a weakly interacting massive particle (WIMP).

    Despite years of trying, scientists have yet to catch WIMPs. Since 1998, researchers in the Italian Dark Matter (DAMA) experiment have claimed to have seen their faint signature, but nobody else has confirmed DAMA's results—and other experiments seemed to belie them (Science, 7 June 2002, p. 1782).

    Now you see it.

    A dark-matter shroud surrounds a galaxy in this computer-generated image.

    CREDIT: UNIVERSITY OF WASHINGTON N-BODY SHOP AND THE ARCTIC REGION SUPERCOMPUTING CENTER

    CDMS also started hunting WIMPs in 1998, using silicon and germanium detectors to look for dark-matter particles traversing a tunnel at Stanford University in California. If a dark-matter particle bumps into an atom in the detector, it leaves behind some energy, which shows up as a signal. But cosmic rays and stray nuclear particles can give false readings and limit the detectors' sensitivity. So, in 2003, physicists running the second phase of the experiment, CDMS II, buried improved detectors deep in an iron mine in Soudan, Minnesota, where overlying rock and soil screen out most of the stray particles.

    CDMS II is four times more sensitive than any other dark-matter experiment, says team member Bernard Sadoulet, a physicist at the University of California, Berkeley. Nevertheless, CDMS II has not spotted a WIMP in 53 days of running. Says Stanford physicist and CDMS II team member Blas Cabrera: “If there had been WIMP events in the data set, we're quite convinced we would have seen them.”

    Although the results are disheartening so far, they at least refute the controversial DAMA claim. If the DAMA result were a genuine observation, says Sadoulet, “we would have observed something like 150 events.” At another talk at the meeting, physicist Lawrence Krauss of Case Western Reserve University in Cleveland, Ohio, declared that “DAMA is dead, as far as I can see.” But Rita Bernabei, a physicist with the DAMA collaboration, says that the CDMS results are “model dependent” and do not invalidate DAMA's direct measurements of dark matter.

    The CDMS II team plans to increase the instrument's sensitivity in the coming months by adding more detectors. Then, the experiment will run for several more years. If CDMS II hasn't shined a spotlight on a dark-matter particle by then, cosmologists will be in a dark mood indeed.

  16. AMERICAN PHYSICAL SOCIETY MEETING

    Solar Flares Reveal Surprising Recipe

    1. Charles Seife

    DENVER, COLORADO—APS's April meeting (held here 30 April to 4 May) brought together 1000 researchers in nuclear and high-energy physics, astrophysics, and related fields.

    When the sun belches, scientists listen—and observe. At the meeting, researchers presented a new picture of solar flares that is leaving solar physicists sunstruck, drawn from a satellite whose gamma ray eyes watch the eruptions in extraordinary detail.

    Most scientists believe that solar flares, huge explosions on the sun, occur when the sun's magnetic field lines snap and then reconnect. The magnetic fields accelerate charged particles—electrons and ions—in the solar atmosphere and slam them back into the surface of the sun. The process sends torrents of gamma rays and x-rays shooting out into space. The 2-year-old Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) satellite is designed to spot those gamma rays and x-rays, which reveal how charged particles in a flare behave.

    Glaring omission.

    Solar flares (bright spots on image) are more complex than thought.

    CREDIT: SOHO (ESA AND NASA)

    On 23 July 2002, RHESSI glimpsed a solar flare on the eastern part of the sun—the first gamma ray picture of a solar flare in action. What the satellite saw, though, was unexpected, says Robert Lin, principal investigator of the RHESSI mission. “It was thought that both ions and electrons were accelerated at the same time and should be in the same place,” he says, but the electrons' gamma rays and the ions' gamma rays came from spots on the sun thousands of kilometers apart. “That was a big surprise to us.” Enormous solar flares that erupted in October and November 2003 seem to show the same trend. “There's a suggestion that [ions and electrons] are not coincident,” Lin announced at the meeting. “The jury's really out; we don't know why they end up in different places.”

    Even though the reasons for the separation are still obscure, the observations are likely to help physicists unravel what's going on in a solar flare. “That's interesting,” says Gerald Share, a physicist at the Naval Research Laboratory in Washington, D.C. “Maybe the ions are traveling down a different magnetic loop than the electrons.” Share joined the RHESSI team 7 months ago to work on slightly different aspects of the gamma ray data—and he's extremely excited about what the satellite's observations will reveal about how solar flares work. Lin shares Share's enthusiasm. “We'd like to understand how the sun releases its energy,” he says. “We're just getting gamma ray measurements that bear on this.”

  17. AMERICAN PHYSICAL SOCIETY MEETING

    Gravity Withstands Close-Up Scrutiny

    1. Charles Seife

    DENVER, COLORADO—APS's April meeting (held here 30 April to 4 May) brought together 1000 researchers in nuclear and high-energy physics, astrophysics, and related fields.

    Legend says Galileo studied gravity by dropping balls off the Leaning Tower of Pisa. Nowadays, physicists drop neutrons. At the meeting, German researchers showed how these tiny particles are revealing the strength of gravity on the very tiniest scales. The neutron work is a “very interesting experiment,” says physicist Eric Adelberger of the University of Washington, Seattle. “Questions about gravity are at the heart of physics.”

    Since Newton's time, scientists have known that the force of gravity between two bodies falls off as the square of the distance between them increases. By observing planets moving around the sun, satellites orbiting Earth, and heavy masses attracting other nearby masses, physicists have confirmed that this so-called r-squared law holds from astronomical length scales down to a fraction of a meter. But recently, theorists have suggested that gravity might subtly deviate from the law on tiny scales. An extra dimension, for example, might mess up the gravitational force for lengths smaller than the diameter of a human hair.
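    For context, experimenters in this field commonly describe such hypothetical short-range departures with a Yukawa-type correction to the Newtonian potential; the parameterization below is the standard one in the literature, given here as background rather than as the specific form either group quotes:

```latex
\[
V(r) \;=\; -\,\frac{G\,m_{1} m_{2}}{r}\,\Bigl[\,1 + \alpha\, e^{-r/\lambda}\,\Bigr]
\]
```

    Here $\alpha$ sets the strength of any new interaction relative to gravity and $\lambda$ its range; an extra dimension curled up at size $\lambda$ would show up as a nonzero $\alpha$ once the separation between the masses drops below roughly $\lambda$.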

    Gravity is extremely hard to measure on those scales, because stronger forces, such as electrostatic repulsion, overwhelm its effects. At the University of Washington, physicists have used a very fine pendulum to show that the r-squared law holds down to scales of a tenth of a millimeter. At the meeting, physicist Stefan Baessler of the University of Mainz, Germany, described an experiment that tested the law on scales up to 100,000 times smaller still.

    Baessler and colleagues dropped very slow, very cold neutrons onto a surface. When a neutron hits, says Baessler, it bounces like a tennis ball. Because a neutron is a quantum object, however, it can rebound only in fixed steps. Just as there is a lowest energy for an electron bound by electric forces near a hydrogen nucleus, there is a minimum bounce height for a neutron bound by gravitational forces near a surface. Find out that minimum bounce height, and you find out, with great precision, the force of gravity.

    Even a tiny deviation from the r-squared law should make the minimum bounce height different from the height expected. When the physicists measured the minimum bounce height of the neutrons by lowering a neutron-absorbing ceiling toward the surface, they found that it was right where the r-squared law implied it should be. Baessler says they would have spotted significant deviations from that law even on the scale of a nanometer—and further refinements should make the technique even more sensitive.
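    A rough back-of-the-envelope calculation conveys the scales involved. The sketch below assumes only textbook quantum mechanics for a particle bouncing above a mirror in a linear gravitational potential, not the Mainz group's actual analysis; it puts the lowest allowed bounce height of a neutron near 14 micrometers, at an energy of about a picoelectronvolt.

```python
# Estimate a neutron's lowest quantum "bounce height" above a mirror.
# For the potential V(z) = m*g*z with a hard wall at z = 0, the bound states
# are Airy functions; the nth classical turning point (bounce height) is
#   h_n = z0 * |a_n|,  with  z0 = (hbar^2 / (2 * m^2 * g))**(1/3)
# and a_n the nth zero of the Airy function Ai.

HBAR = 1.054571817e-34    # reduced Planck constant, J*s
M_N = 1.67492749e-27      # neutron mass, kg
G = 9.81                  # gravitational acceleration, m/s^2
AIRY_ZERO_1 = 2.33810741  # magnitude of the first zero of Ai (tabulated value)

z0 = (HBAR**2 / (2 * M_N**2 * G)) ** (1.0 / 3.0)  # characteristic length, m
h1 = z0 * AIRY_ZERO_1                             # lowest bounce height, m
E1 = M_N * G * h1                                 # energy of the lowest state, J

print(f"characteristic length z0 = {z0 * 1e6:.2f} micrometers")       # ~5.9
print(f"lowest bounce height h1  = {h1 * 1e6:.2f} micrometers")       # ~13.7
print(f"lowest state energy E1   = {E1 / 1.602e-19 * 1e12:.2f} peV")  # ~1.4
```

    Lowering an absorbing ceiling below that height should sharply cut off the transmitted neutrons, so finding the cutoff where Newtonian gravity predicts it is what confirms the r-squared law at these scales.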

  18. HIGHER EDUCATION

    Reinventing Europe's Universities

    1. Martin Enserink

    European researchers have begun to wonder why their universities don't have the same research star status as America's Ivy League. Getting there will require serious reform

    LIÈGE, BELGIUM—Does Holland have a Harvard? How does the Sorbonne measure up against Stanford? And why is there no Euro-Ivy League? European researchers and policy makers are increasingly asking such questions, and they don't find the answers reassuring. Across the continent, there are fears that Europe's universities, once bastions of leading science, no longer rate as global players—a slump that could harm Europe's economic prosperity.

    At a recent meeting* here, scientists, administrators, and politicians grappled with a variety of ways to help universities punch their weight, from reforming education and improving graduate training to breaking down national barriers and what former UNESCO chief Federico Mayor called an “onslaught on the bureaucratic labyrinth” of European R&D funding. The goal: to create American-style research universities—and along with them, the vibrant technology sector that seems to flourish around them.

    However, creating an American-style competitive market will not be easy. It goes against the grain of European egalitarianism, which strives to provide a solid education to as many students as possible while refraining from rewarding exceptional talent. Besides, some are uneasy about the idea of refashioning universities as technological powerhouses, worrying that it will hurt the traditional role of campuses as seats of learning for an intellectual avant-garde. “Surely, a university is more than training people how to set up companies,” said Jean-Patrick Connerade, the president of Euroscience, a pressure group representing research scientists.

    The scale of the problem is vigorously debated. When the Institute of Higher Education at Jiao Tong University in Shanghai recently posted on its Web site a ranking of university research prowess—based on such measures as the number of papers in Science and Nature, Nobel prizes, and citations—only 10 European universities cracked the top 50, compared with 35 from the U.S. (see table, p. 953).

    The rankings, bandied about frequently at the Liège meeting, have touched a nerve. Skeptics of the value of such comparisons note that although the European research landscape has fewer peaks, it has fewer valleys as well, so such a ranking says little about overall research quality. But other indicators tell a similar story, says Roger Bouillon, vice president for research at the Katholieke Universiteit in Leuven, Belgium. European researchers may produce numbers of papers comparable to those of their U.S. counterparts, but the impact is less. European science also leads to far fewer patents, and there's a net brain drain to the United States. Numbers aside, “you just feel it,” says Bouillon. “Whenever you're at an important meeting, Americans dominate.”

    Europe's premium on equality is not the sole reason for the widening transatlantic gap. In the United States, federal agencies such as the National Institutes of Health and the National Science Foundation dole out huge sums of money in open competitions based on research quality. Hence a university that attracts top-tier scientists will win supersized slices of that pie, with the top 20 U.S. universities together raking in about a third of federal research dollars. Although systems vary across Europe, universities in many nations are awarded funding according to enrollment figures—and students often pick a university based more on location than on its reputation.

    A number of national policies hinder competition among universities. Some countries do not allow institutions to select the best students, some have fixed national tuition fees and others none at all, and academic salaries usually are governed by a national formula, constraining the ability of universities to hire top talent. Moreover, “management is often not superprofessional,” says Jeroen Huisman of the Centre for Higher Education Policy Studies in Enschede, the Netherlands. Often, the leadership of departments and even universities is not chosen on merit or even by vote, but rotates among faculty members, many of whom strive primarily to keep the peace.

    The European Commission wants to see something done about this policy patchwork. Four years ago it announced its intention to make Europe the “most competitive and dynamic knowledge-based economy in the world” by 2010, with universities leading the charge. Although the commission has no power over university policy—that lies with national governments—it does have one important tool at its disposal: money. The commission is about to throw its weight behind the creation of the European Research Council (ERC), an agency envisioned to spend billions of euros annually. The E.U.'s existing research program, Framework, focuses largely on applied R&D and emphasizes large international collaborations. In contrast, ERC will focus on basic science, fund individual teams, and have research quality as its sole criterion (Science, 2 January, p. 23).

    Dreaming spires.

    Should ancient universities such as Salamanca University emulate the Ivy League?

    CREDIT: UNIVERSITY OF SALAMANCA

    Like the commission, some countries are beginning to see competition as a way to strengthen their universities. The German government, for instance, has proposed a plan to select a handful of “elite universities” and channel extra money their way. A few states have launched their own reforms; Baden-Württemberg, for example, is introducing more professional management, autonomy, performance-based funding, and tuition fees—all anathema not so long ago. The Dutch government, too, is forging ahead with an experiment in letting universities set tuition fees and select students.

    But resistance to such changes is often fierce. In the U.K., for example, legislation allowing variable tuition fees nearly cost Prime Minister Tony Blair his political hide. Education ministries are loath to relinquish power, and university researchers fear losing out in a Darwinian struggle for survival. Introducing a merit-based funding system in Europe is “going to take quite a bit of fighting,” Bouillon predicts. “We have accepted that the best win in soccer, art, music, and business,” he says, “but not in basic research. That makes no sense.” Indeed, some Liège delegates decried the slow pace of change. “We're an airplane going around in circles on the runway but never getting off the ground,” complained Antoni Kuklinski, director of the Institute of Socio-Economic and Regional Geography at the University of Warsaw, at the meeting.

    Taking off is growing more urgent every day, says Geoffrey Boulton, a vice principal at the University of Edinburgh, because a whole new generation of Asian scientists is joining the global competition. “When I look at cozy Europe,” he said at the meeting, “I'm really quite terrified sometimes.”

    • *The Europe of Knowledge 2020. A vision for university-based research and innovation. Liège, 25–28 April.

  19. HIGHER EDUCATION

    Russian Universities Want Their Share of the Research Pie

    1. Andrey Allakhverdov,
    2. Vladimir Pokrovsky*
    1. Allakhverdov and Pokrovsky are writers in Moscow.

    MOSCOW—In the old days in Russia, everything was kept in its place: Institutes of the Russian Academy of Sciences (RAS) and government ministries conducted the vast majority of basic research, while universities were almost exclusively teaching institutions. University academics are even barred by law from applying for research grants. But pressure is growing to break down the distinction between research institute and university. At a meeting in Moscow last month, all the talk centered around one thing: American-style research universities. “This is a time when resources are not so plentiful and the old system which worked very well in the old days probably doesn't work so well now,” says Gerson Sher, president of the U.S.-backed Civilian Research and Development Foundation (CRDF), which sponsored the meeting.

    To be eligible to receive a grant, university scientists have to take convoluted steps to evade the ban on research grants going to universities; a common one is for the university to set up a semi-independent body known as a “research council.” Some universities fund their own research, but only on a small scale. “In our university only a paltry part if any of the money which students pay for their education goes for research,” says Vladimir Minkin, director of the Physical and Organic Chemistry Research Institute at Rostov University.

    Many at the meeting supported the idea that exposing students to research would not only improve their education but would likely benefit the research as well. Government officials are now working on amendments to current laws on science and higher education to allow universities to undertake research. But to boost university research without starving existing research centers will require new sources of funding. “To make a university really big and significant, strong financial support is needed,” says Minkin.

    To set the ball rolling, CRDF and Russia's V. Potanin Charity Fund launched a program last month to train Russian university administrators in research management. And two U.S. foundations, the John D. and Catherine T. MacArthur Foundation and the Carnegie Corporation of New York, have pledged to help universities move into research. Delegates to the Moscow meeting will work on a Russian model of a research university to put to the MacArthur Foundation in September. Sources in Moscow believe that the foundation may invest as much as $15 million in bringing that model to reality. “I think they will start with one pilot university, will work there for a year or two, see how it is perceived, where the funding goes, what are fundraising possibilities,” Minkin says.
