# News this Week

Science  14 May 2004:
Vol. 304, Issue 5673, pp. 936
1. NATIONAL INSTITUTES OF HEALTH

# Paid Consulting: Good for the Staff, Not for the Chiefs

1. Jocelyn Kaiser

For the past 5 months, critics in Congress and elsewhere have battered the National Institutes of Health (NIH) for allowing its scientists to accept hefty consulting payments from companies. Last week, a blue-ribbon panel appointed by NIH Director Elias Zerhouni offered a plan to restore public confidence: It recommends that top NIH officials and grant decision-makers be barred from industry consulting. But the panel “walked a fine line,” said co-chair Norman Augustine, by saying that in-house NIH scientists may continue to interact with industry, although within new limits.

The response from the research community was positive: “This will increase transparency without doing harm to the enterprise,” said biologist David Burgess of Boston College at a meeting last week of the NIH director's advisory committee, on which he serves. Association of American Medical Colleges president Jordan Cohen stated that the report “will help sustain and strengthen the public's essential trust” in NIH. Although Zerhouni is still weighing the recommendations, he seemed to like the advice and asked the blue-ribbon panel to come back once more to “fine-tune” it. But it's not clear whether the plan will satisfy NIH's critics.

Concerns about payments to NIH staff members erupted after the Los Angeles Times reported last December that some NIH scientists have received $300,000 or more over the past decade from drug and biotech companies and suggested that these deals posed conflicts of interest (Science, 19 December 2003, p. 2046). The rules on outside money had been loosened in 1995 by then-Director Harold Varmus, who was hoping to recruit more private-sector scientists and clinicians.

Although Zerhouni has found no evidence that patients were harmed or decisions influenced by these deals, he asked for a review by a 10-person blue-ribbon panel co-chaired by Augustine, chair of the executive committee of Lockheed Martin Corp., and Bruce Alberts, president of the National Academy of Sciences. The panel's 109-page report* notes that few NIH employees do consulting work—just 118 out of 17,500 as of last month, down from 228 in January, Zerhouni said. The “simplest” solution might have been to “just ban” these deals, but “that would be a grave error,” Augustine said. Instead, the panel decided that industry consulting is beneficial because it enables NIH to share technologies and compete for talent.

Because they have broad responsibilities and can influence funding decisions, according to the panel, senior managers and grant officers should not consult for industry or academia. (Two NIH institute directors with outside deals named by the Los Angeles Times have ended them.) The panel also endorsed current law barring clinicians from accepting payments from a company with a financial interest in their research. But that rule should not be applied to most intramural scientists, the report says: Consulting that does not pose a conflict of interest “should be allowed.” The panel recommends limits, however: Employees should not spend more than 400 hours a year consulting or earn fees of more than 50% of their salary (for clinicians, 100%). And they should not be paid with stock.

At the same time, the panel strongly supports nonindustry outside activities such as receiving awards that may include cash, writing textbooks, speaking, and editing. Those activities are “part of the tradition of science,” the report says. The panel also advised NIH to find ways to require that more senior officials file publicly available financial disclosure reports. And it said NIH should request authority to raise a current $200,000 cap on salaries (see sidebar), partly to offset the ban on consulting by senior people.

The panel urged speedy implementation of its advice, some of which would require action by both the Administration and Congress. NIH staff members now suffer “confusion” about policies, and there is “a growing morale problem” at the agency, the report says.

Researchers have expressed a few concerns about the report. Varmus, now president of Memorial Sloan-Kettering Cancer Center in New York City, opposes a blanket ban on consulting by clinical and scientific directors, because some do not oversee grant decisions. “It should be by function,” says Varmus—a view shared by some of Zerhouni's advisory committee members. Varmus and others also questioned the prohibition on stock options, noting that they are often the only payment a start-up company can offer.

Other observers thought the panel was too gentle, including some lawmakers, who would like to see wider public disclosure of employee financial reports. Representative James Greenwood (R-PA), chair of the House Subcommittee on Oversight and Investigations, planned to probe further at a hearing this week.

2. NATIONAL INSTITUTES OF HEALTH

# Are Scientists Worth More Than Senators?

1. Jocelyn Kaiser

The controversy over researchers' income centers not only on who pays, but how much. A member of Congress has questioned whether the National Institutes of Health (NIH) had acted illegally in hiring top-level officials as consultants, paying them more than they could earn as civil servants—or even as senators. But high pay is not a problem, according to a blue-ribbon panel that reported to NIH last week (see main text); on the contrary, it says some NIH salaries should be even higher.

The issue involves a legal mechanism, part of Title 42 of the U.S. Code, that former NIH Director Harold Varmus began using in 1998 to raise institute directors' salaries and recruit intramural scientist-physicians above the top federal pay scale of $150,000 or $160,000. Under Title 42, NIH can pay as much as $200,000 (or more, with bonuses) to “special consultants” without approval from the Department of Health and Human Services (HHS). “It has made an enormous difference in our ability to recruit and retain an outstanding senior scientific staff,” says NIH intramural director Michael Gottesman. Recent NIH hires at the $200,000 salary level include, among others, five new institute directors; Dennis Charney, a leading neurobiologist who came to NIH's intramural program from Yale; and Anna Barker, an immunologist who headed a biotech firm and is now a deputy director of the cancer institute.

A panel examining NIH's consulting policies concluded last week that although NIH's midrange salaries compare well with those in academia, $200,000 salaries for top leaders and clinicians are “far from competitive.” NIH needs to retain Title 42 or a similar authority and should ask HHS to raise the cap, the panel said. But Representative James Greenwood (R-PA), chair of the House Subcommittee on Oversight and Investigations, suggested in a 4 February letter to HHS that hiring senior officials as consultants could mean that they “lack the legal authority” to carry out their job duties. And using it to pay salaries higher than the vice president's is a major policy decision that should not be carried out without involving Congress, says a staffer.

3. OCCUPATIONAL HEALTH

# Beset by Lawsuits, IBM Blocks a Study That Used Its Data

1. Dan Ferber

Workers who make the chips that form the brains and memories of computers and electronic devices use an array of nasty chemicals, some of which are known to cause cancer or are suspected of doing so. Whether on-the-job exposure has actually caused disease among chip workers is, however, a hotly contested issue. Good data are scarce, but some occupational health experts believe valuable clues may lie in a trove of records maintained by the computer giant IBM, which is now at the center of a legal dispute.

Two researchers who gained access to IBM's records have produced an epidemiological study, paid for by attorneys suing IBM, concluding that former workers at IBM's computer-chip factories faced increased risks of dying from brain, kidney, blood, and skin cancers. But IBM recently moved to block publication of these findings, arguing that it had turned over the data only for use in a lawsuit and that a judge had ruled that the analysis could not be introduced as evidence in a trial because it was irrelevant and could be confusing to jurors. The company itself has since hired its own expert to conduct a new analysis of the data.

The contested study—by epidemiologists Richard Clapp of Boston University and Rebecca Johnson of Epicenter in Circle Pines, Minnesota—analyzes a large set of mortality records on people who worked for IBM over a period of more than 3 decades. It was scheduled to appear in an upcoming special issue of Medical Clinics of North America, says guest editor Joseph LaDou, director of the University of California, San Francisco's International Center for Occupational Medicine. Four peer reviewers had read and approved the study, LaDou says. Then, on 30 March, Clapp withdrew it. IBM attorney Robert Weber confirms that an IBM attorney warned Clapp he shouldn't publish.

LaDou says, “I wanted this in the journal because it's the most definitive cancer study” to date on this industry. IBM's action was a “serious disappointment to our scientific and academic freedom.” But Weber argues that academic freedom is not at issue. Clapp has no legal right to publish the mortality analysis because he is bound by the court's protective order that mandates he keep the data confidential, Weber says.

A judge originally ordered IBM to open its files to Amanda Hawes and Richard Alexander of Alexander, Hawes and Audet in San Jose, California. They are attorneys for families suing the company over cancers in California and New York. The plaintiffs' attorneys then paid Clapp and Johnson to analyze the data. IBM attorneys say the court ruled that the data were available for use only in litigation and that Clapp had signed on to the protective order, which legally binds him to maintain confidentiality. Clapp declined to comment on the legal dispute, as did Hawes. Alexander and Johnson did not respond to telephone messages.

The IBM lawsuits grew out of long-standing allegations that workers in semiconductor manufacturing were poorly protected from hazardous solvents and other chemicals that cause cancers and birth defects. IBM and other companies have denied this.

But the health risks are worth investigating, says cancer epidemiologist John Bailar, a professor emeritus at the University of Chicago and current scholar-in-residence at the National Academy of Sciences in Washington, D.C. “Semiconductor workers in the past and at present suffer exposure to potent organic chemicals, including some known carcinogens,” says Bailar, who has no connection with the litigation.

A handful of earlier studies had provided hints of elevated cancer risks among semiconductor workers, based on limited data. In 1997, former workers and their survivors began suing IBM. In discovery proceedings for one of the first of the approximately 200 ongoing lawsuits, a New York judge ordered IBM to turn over a database called the corporate mortality file, which tabulates the cause of death of more than 33,000 former IBM workers, as well as a database of work history on more than 18,000 deceased former workers. As an expert for the plaintiffs, Clapp had access to these files.

In their study, which Science has obtained, Clapp and Johnson first conducted a preliminary analysis called a proportional mortality ratio. It spotlighted eight types of cancer in men and five in women that seemed to kill former IBM workers more frequently than would be expected in a group of average Americans. Then, they compared the percentage of cancer deaths from each type of cancer with the corresponding percentages in the general population—a measure called the proportional cancer mortality ratio (PCMR). The PCMR analysis provided several statistically significant results: The 7697 IBM men who died of cancer were between 23% and 62% more likely to have died from cancers of the kidney, brain, blood, and skin. And the 1667 IBM women analyzed were 20% more likely to have died from kidney cancer.

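
In rough terms, a PCMR compares the share of a cohort's cancer deaths due to one cancer type with the same share in a reference population. A minimal sketch of the calculation (the numbers below are invented for illustration; they are not from the Clapp and Johnson study):

```python
def pcmr(cohort_deaths_x, cohort_cancer_deaths, ref_deaths_x, ref_cancer_deaths):
    """Proportional cancer mortality ratio: the fraction of the cohort's
    cancer deaths caused by cancer X, divided by the corresponding
    fraction in the reference population. Values above 1.0 suggest a
    proportional excess of that cancer in the cohort."""
    cohort_share = cohort_deaths_x / cohort_cancer_deaths
    ref_share = ref_deaths_x / ref_cancer_deaths
    return cohort_share / ref_share

# Hypothetical example: 120 kidney-cancer deaths among 7697 cancer deaths
# in a cohort, versus a 1.2% kidney-cancer share in the reference group.
print(f"PCMR = {pcmr(120, 7697, 12, 1000):.2f}")  # prints PCMR = 1.30
```

A ratio of 1.30 would correspond to the kind of “30% more likely” figures quoted from such analyses, though, as the epidemiologists caution, a proportional excess alone falls short of demonstrating causation.
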
When the researchers focused on a subgroup more likely to have been exposed to chemicals—employees who worked at least a month at one of IBM's chip-manufacturing plants—PCMR results indicated that male workers were 62% to 79% more likely to have died from kidney, skin, or brain cancer, and female workers were 112% more likely to have died from kidney cancer. Bailar calls some of the numbers “elevated enough to make me worry.”

The Clapp and Johnson analysis—the biggest cancer study in the electronics industry so far—is consistent with preliminary evidence from a small 2001 British-government study on a Scottish semiconductor factory that showed elevated cancer risk, he says. But the new results fall “short of proof.” Occupational epidemiologist Harris Pastides of the University of South Carolina, Columbia, concurs. Although the authors “reported very objectively,” he says, the study amounts to “an early part of the detective work to try to go down that road to causality.”

IBM's lawyers reject the findings: “This is one of the clearest examples of what has been characterized as junk science,” Weber maintains. “It's a litigation-produced study in which lawyers supplied key data and gave direction on how the study was to be done.” Clapp says that he “totally disagrees” with that depiction. The corporate mortality file itself came from the plaintiffs' lawyers, he says, but he and Johnson did “everything after that,” including designing the study, choosing the statistical software, and analyzing the data.

IBM's attorneys fought successfully to keep the analysis out of the first cancer lawsuit to come to trial, a closely watched case brought by two former IBM workers in San Jose, California.

On 9 October, just before the trial started, Judge Robert Baines of California Superior Court barred jurors from seeing the Clapp and Johnson study because he said it did not document a link between the mortality data and workplace exposure to chemicals, making it “simply irrelevant and … highly prejudicial.” IBM won that case on 27 February.

Bailar and other epidemiologists suggest that IBM could clear up lingering questions about whether semiconductor manufacturing work raises cancer risks by giving outside scientists access to records for the entire workforce rather than just for those who have died, which would allow scientists to look at overall rates of death. Researchers also would like to see IBM's data on workers' assignments and chemical exposures.

In March, the Semiconductor Industry Association, a San Jose, California, trade group representing 95 of the largest semiconductor manufacturers, commissioned a retrospective study to see whether chip-manufacturing workers faced higher cancer risks. IBM's Weber says that the company has commissioned a separate study on possible workplace health hazards at IBM, led by Elizabeth Delzell of the University of Alabama, Birmingham. Delzell declined to discuss the details of the study but says she hopes to complete it this year, and she plans to publish it in a peer-reviewed journal.

4. SPACE SCIENCE

# Hubble Alternative to the Rescue?

1. Andrew Lawler

With a robotic mission to extend the life of the Hubble Space Telescope a good bet for late 2007 or 2008, NASA is also quietly considering launching one of the replacement instruments on a free-flying telescope that would incorporate advanced optics and spy-satellite technologies. The robotic rescue likely could replace Hubble's dying batteries and gyroscopes and perhaps install a new wide-field camera. Replacing the Cosmic Origins Spectrograph (COS), however, may be beyond the reach of robotic systems.
The $65 million instrument, which would gather data on early galactic evolution, requires complicated wiring best connected by human hands. So when NASA announced in January that it would not send humans to service Hubble again, the future of COS looked grim.

But NASA chief scientist John Grunsfeld says one alternative is to launch COS aboard a spacecraft that makes use of the latest advances in mirror technology. Adaptive optics, now used on ground-based mirrors to counter thermal, gravitational, and atmospheric variations, could be used to lower both the cost and weight of a new space-based mirror and forge the way for future missions. “The angle is that we would partner with people who want to build lightweight mirrors,” says Grunsfeld, a former astronaut who flew on Hubble servicing flights. He adds that he has talked with Defense Department officials, as well as researchers at the University of Arizona in Tucson and industry heavyweights such as Spectrum Astro in Gilbert, Arizona. The target date is 2009.

One official involved in the discussion envisions a 2-meter mirror nearly as capable as Hubble's but weighing only 200 kilograms, a fraction of the weight of Hubble's mirror. The spacecraft would weigh 2.5 tons, or one-fifth the weight of Hubble. The cost would be $200 million to $300 million, plus up to $100 million to launch the telescope into a high orbit aboard a Delta 2 expendable rocket. “It's Hubble Lite,” he said, adding that defense officials are eager to test advanced optics for use in their spy satellites.

“It's potentially doable, and it would be nice to take steps in that direction,” says J. Roger P. Angel, a University of Arizona astronomer who has worked for decades on adaptive optics. But he warns that new technology usually takes more time and money than expected and that the nearly 1-ton size of COS poses a challenge.

For now, however, Grunsfeld says NASA's focus must be on the robotic mission to Hubble. Before it even hears from a National Academy of Sciences panel examining Hubble's future, NASA plans to ask industry next month to propose ways to conduct the effort and award a contract by September. “It is clear to us that if we don't get moving, it's not going to happen,” says Grunsfeld.

5. PLANETARY SCIENCE

# Endurance Has Its Rewards on Mars

1. Richard A. Kerr

It was a sight to warm the cockles of any geologist's heart: exposed rock, lots of it. Once the Mars rover Opportunity rolled up to the rim of the 130-meter-wide Endurance crater (below), the rover team saw “enormous outcrops of layered rock,” says science team leader Steven Squyres of Cornell University in Ithaca, New York. Opportunity had already spent weeks analyzing an outcrop at the 20-meter-wide Eagle crater, its landing site. The curb-high chunk of finely layered rock told of an acid, salty, shallow sea on early Mars.

But with just one sliver of martian history to read, rover team scientists had no idea how long that water had stuck around on the surface of ancient Mars or what came before it. Endurance, however, is “a spectacular impact crater that has exposed many meters of the history of Mars before the water revealed at Eagle crater,” says Squyres.

If Opportunity's first distant look at the bare Endurance cliffs is any indication, Mars had a varied history before the time of the Eagle crater sea. Team member James Rice of Arizona State University in Tempe sees up to five rock layers lying below the light-toned layer also seen at Eagle crater. So far, remote spectral analysis of the older layers shows no trace of the salts detected in the Eagle outcrop, just the basaltic rock typical of Mars. Hints of cross-cutting layers suggest to Rice that some of the older beds may be sandstone laid down by ancient winds. “It may be a dune environment,” says Squyres, “it may be a beach. It isn't what we saw at Eagle crater.”

Opportunity will be sizing up Endurance from its rim in coming weeks, but to get a reliable reading of martian history, it will have to edge down to rock exposures on the crater's steep inner walls. “If we go in,” says Squyres, “it will have enormous scientific potential, but there will be risk as well.” Before taking such risks, Opportunity will take care of unfinished business on the less exciting but safer plains.

6. PALEONTOLOGY

# Evidence of Huge, Deadly Impact Found Off Australian Coast?

1. Richard A. Kerr

Seven geoscientists report online this week in Science (www.sciencemag.org/cgi/content/abstract/1093925) that they have found the scar of a large asteroid or comet impact just off the northwest coast of Australia that could have triggered the largest mass extinction ever, 250 million years ago.

The proposed Bedout (pronounced “Bedoo”) impact could have triggered the Permian-Triassic (P-T) extinction, they say, the way the Chicxulub impact on the Yucatán Peninsula caused the death of the dinosaurs. Not so fast, say some researchers who specialize in deciphering signs of impact lingering in rock. “There's no convincing evidence for an impact origin” in the studied rocks, says impact petrographer Bevan French of the National Museum of Natural History in Washington, D.C. “Everything they're arguing was shocked [by impact] can have nonshock origins,” such as volcanic activity, he argues. Despite the variety of evidence presented in this and two earlier Science papers by the same principal authors (Science, 21 November 2003, pp. 1388 and 1392), impact-triggered extinction at the P-T has yet to meet broad acceptance.

This search for a P-T impact crater started with oil exploration. On the basis of oil companies' seismic probing beneath the sea floor, oil explorationist John Gorter, now at ENI Australia Ltd. in West Perth, proposed in 1996 that the submerged Bedout High is the central peak of a large impact crater formed at the end of the Permian. Since that time, geochemist Luann Becker of the University of California, Santa Barbara, and colleagues have been trying to explain the apparent impact debris they were finding in Antarctica. They took a closer look at the oil exploration data and samples. A map of subtle gravity variations across the region reveals a ring reminiscent of Chicxulub's, they say. And radiometric dating of a Bedout High mineral grain recovered from the bottom of an oil exploration well puts its formation in the neighborhood of the 251-million-year age of the P-T.

But central to their argument are rocks from Bedout High. The shock waves racing away from a large impact are powerful enough to alter mineral crystals profoundly.
Shock can rearrange the crystal structure of minerals into distinctive patterns, obliterate crystallinity entirely to produce the glassy product maskelynite, and even melt the mineral. Becker and her colleagues point to examples of maskelynite in their samples. They also cite many examples of melted crystals. In one case, a crystal encloses a melted core of the same chemical composition as the crystal. “It's nigh unto impossible to get that in a volcanic process,” says geochemist Robert Poreda of the University of Rochester in New York, who did much of the mineral analysis. “The only way you can do that is to shock melt it.”

Some experts on shock effects on minerals are not persuaded. “I see nothing that would convince me there was an impact,” says Christian Köberl of the University of Vienna, Austria. At Chicxulub, the buried layer of jumbled and melted rock fragments always contains microscopic bits of minerals such as quartz riddled with distinctive banding due to shock, he says. “Where are all the shocked minerals?” he asks.

Impact geologist Richard Grieve of the Canadian Geological Survey in Ottawa would also expect to see signs of flow frozen into the once-molten material. An impact's debris is deposited so violently that the molten rock should have been pulled like taffy as it suddenly cooled, he says. And he questions the paper's identification of shock-formed maskelynite. “I've never seen maskelynite look like this,” says Grieve. “I could be wrong, but I wouldn't add [Bedout] to the list” of proven impact craters he helps maintain.

To make the list, say Grieve and the others, Becker and her colleagues need to apply some more powerful analytical tools, such as micro-Raman spectroscopy, to targets such as the putative maskelynite. Then perhaps Bedout can join the club.

7. MOLECULAR BIOLOGY

# Consortium Tackles Mouse Regulome

1. Wayne Kondro*

* Wayne Kondro writes from Ottawa.

OTTAWA, CANADA—Dozens of the world's leading molecular biologists have banded together to map out the biochemical instructions that allow organisms to make all the types of cells they need. The Canada-based effort, called the International Regulome Consortium, hopes to raise $100 million in public funding for what organizers are calling “the third generation of genomics.”

The new consortium proposes to characterize and tag the 1486 known transcription factors in the mouse genome, as well as an estimated 600 coregulators that work together to control cellular and biological functions through networks called regulons. The entire suite of actors is known as the regulome. Understanding the processes by which a set of genes is regulated during development of an organism, or during its disease states, “will revolutionize our understanding of how cells function,” says Michael Rudnicki, senior scientist at the Ottawa Health Research Institute, who is chairing a steering committee that will manage the effort.

Roughly 75 researchers from six nations (Canada, the United States, the United Kingdom, France, Italy, and Singapore) began sketching the parameters of their plan during a founding workshop held here 3 to 5 May. They focused on the technologies needed to purify the complexes, identify and tag the transcription factors, and eventually construct databases to store results in standardized formats. “The idea is to apply some very state-of-the-art genomic and proteomic technology to the biology of stem cells,” says Kevin Struhl, a professor of biological chemistry and molecular pharmacology at Harvard Medical School in Boston. They also hope to identify the complete set of DNA binding sites and corresponding target genes for the regulons in embryonic stem cells and a subset of the cells they differentiate into.

The mouse is the obvious choice to focus on, says William Skarnes, senior scientist at the Wellcome Trust Sanger Institute in Cambridge, U.K., given the vast amount of data already available on the animal and its similarity to humans. The group's initial work with mouse stem cells “is not going to tell us everything about all aspects of mammalian biology,” acknowledges University of Toronto professor Jack Greenblatt, but he and others hope it will yield important principles that would apply to other types of cells and processes.

Still, the genetic homogeneity of largely inbred mouse strains may be misleading when it comes to understanding human stem cells, cautions Peter Andrews, professor of biomedical science and co-director of the Centre for Stem Cell Biology at the University of Sheffield, U.K. “In the human, every embryonic stem cell that we're working with that comes from a different person is genetically different,” says Andrews. “We don't know, at the moment, what significance that genetic heterogeneity will have. It may very well be that the behavior of different embryonic stem cells varies because of their different genotype.”

Consortium members hope that governments will provide support to help the group get organized as well as for ongoing operations and research. The Canadian participants are looking to a combination of federal and provincial funding agencies, including Genome Canada, the Canadian Institutes of Health Research, and the Ontario Research & Development Challenge Fund, to contribute about half the total needed.

8. BIOMEDICAL RESEARCH

# New NIH Training Grants Open to Foreign Students

1. Jeffrey Mervis

For the past 30 years, the National Institutes of Health (NIH) has awarded institutional training grants and fellowships that come with a major proviso: for U.S. citizens and permanent residents only. But that is about to change. NIH is quietly launching a training program for the 2004–05 academic year that will be open to all, regardless of citizenship. The agency is now reviewing the first set of proposals for the $6 million initiative and expects to select about a dozen institutions for 5-year awards of up to $600,000 a year.

The impetus for the new program, called “Training for a New Interdisciplinary Research Workforce,” came from a strategic plan (the Roadmap) drawn up by NIH Director Elias Zerhouni after his arrival in Bethesda 2 years ago. “It grew out of the concepts in the Roadmap, which was to rethink everything NIH is doing and be as inclusive as possible,” says Wendy Liffers of the policy shop at the National Institute of Dental and Craniofacial Research. As its name implies, the program is aimed at increasing the number of scientists trained in interdisciplinary research. “If somebody came to the United States already steeped in this approach, then we want to give them a way to continue it here,” says Terry Bishop of the National Institute of Diabetes and Digestive and Kidney Diseases, which will run the competition.

NIH grantees have always been able to support foreign students and postdocs through their research grants; indeed, foreign-born students are now a majority in some fields. But the agency's primary means of training undergraduates, graduate students, and postdocs—called the National Research Service Award program—is restricted to domestic students under a 1974 law. “Legislators generally feel that training dollars should stay at home,” explains one congressional aide familiar with the various NIH funding mechanisms.

To avoid running afoul of that law, NIH will combine research and training in the new program (grants.nih.gov/grants/guide/rfa-files/RFA-RM-04-015.html). Foreign students will be supported by research funds and domestic students by training money, although both will receive the same kind of training. “Scientifically, it doesn't matter, of course,” says Bishop. NIH will, however, create a hybrid accounting system to track how many of each are being served. “Congress likes to know how many dollars we spend on training, and we thought that they might ask,” Bishop adds.

NIH officials say they will be monitoring the new program carefully. “We are doing it as a pilot, and I don't know how long it will last,” says Walter Schaffer, head of NIH extramural training programs. “But everybody seems to think that it's an idea worth trying.”

9. U.S. IMMIGRATION

# Groups Urge Easing of Restrictions on Visa Policies Affecting Scientists

1. Yudhijit Bhattacharjee

Sixteen academic and professional organizations this week asked the Bush Administration to take steps to ease the entry of foreign scientists and students into the United States without undermining national security. “We are confident that it is possible to have a visa system that provides for thorough reviews of applicants and still welcomes the brightest minds in the world,” say the signers, which include the National Academy of Sciences, the Association of American Universities, and AAAS (which publishes Science).

The organizations cite six problems in the current visa system and offer ways to reduce their impact on scientific exchanges and the global flow of information (www.aaas.org/news/releases/2004/0512visa.shtml). The changes would include fast-tracking visa applications that have been pending for more than 30 days, extending the validity of security clearances from 1 year to the individual's duration of study or academic appointment, and improving the ability of consular officers to recognize when a more detailed review is required and when it is not. The signers also propose extending the duration of visas for international students and scientists by revising visa reciprocity agreements between the United States and countries, such as China and Russia, that send large numbers of scientists. And they suggest establishing a mechanism for allowing visitors on F and J visas—student and exchange-visitor visas—to initiate the visa renewal process before leaving the United States. “We now have a consensus that the country must facilitate access to legitimate visitors without compromising security,” says Victor Johnson, associate director of NAFSA: Association of International Educators, a co-signer of the statement. “But it is never easy to translate a policy consensus into desired bureaucratic behavior.”

State Department officials say they are already moving in that direction. The 1-year rule, implemented last summer, replaced a process that subjected visa holders to a security review each time they sought to travel abroad. There's “a good chance” that the validity of clearances will be extended further, says a former senior department official. The department is also trying to reduce the number of cases reviewed by an interagency panel by training consular staff to better identify fields of study regarded as sensitive.

The coalition argues that improving visa processing efficiency will benefit both national security and U.S. higher education and science. Implementing these measures, the signers say, will correct “the misperception that the United States does not welcome international students, scholars and scientists.”

10. INTERNATIONAL COOPERATION

# Priorities for Rebuilding Civilian Iraqi Science

1. Richard Stone

CAMBRIDGE, U.K.—On a visit to Iraq's science ministry one morning last March, Abdalla Alnajjar, president of the Arab Science and Technology Foundation (ASTF), recalls how impressed he was at first with its apparent vitality. Throngs of Iraqi scientists filled the halls and courtyards of the ministry complex. Alnajjar soon realized, however, that the vast majority were biding their time, with nothing to do, until clocking out around noon. “I was shocked,” says Alnajjar, a physicist. “Most of the scientists are reputable and deserve to be working, not standing around in the sun.”

Patrick Welsh, science officer at the NWS forecast office in Jacksonville, Florida, agrees that “the revolution in computing has been phenomenal.” With a $25,000 Coastal Storms Initiative grant from the National Oceanic and Atmospheric Administration, parent agency of NWS, “we can build a cluster with the throughput of a supercomputer of the mid-'90s,” says Welsh. The resulting WRF forecasts for the greater North Florida area captured the local sea-breeze winds better than ever before, says Welsh. That matters in North Florida because it's the sea breeze pushing inland that often sets off the region's abundant thunderstorms. With locally run mesoscale modeling, Jacksonville forecasters now often forecast afternoon thunderstorms to within a few minutes of their occurrence.

Similarly cost-effective operations using MM5 have been applied to forecasting how cold it will get on a U.S. Army test range in Alaska; how much snow would fall at the 2002 Winter Olympic Games in Salt Lake City, Utah; and what combat conditions might have been like in the fall of 2001 in Afghanistan.

Although ever-cheaper computing was helping spread the local operation of mesoscale models (mainly MM5), the mesoscale community still had a problem. Researchers (mainly in the universities) and operational forecasting modelers (mainly at NCEP and in the military) “weren't using the same models,” says Mass, “so the research developments weren't going into the operational models. It was not healthy.” The solution to the community split is going to be the WRF model now in the last stages of development and testing. The product of a community collaboration involving NCAR and six government agencies including NWS, WRF builds on MM5 and Eta with new and improved versions of software, numerical calculation techniques, approximations of physical processes, and construction of an initial weather picture.
Perhaps most promising is the ability of WRF users to plug in different model components as they are developed. “That has the benefit of greatly facilitating moving advances made in the research community into the operational configuration,” says Joseph Klemp of NCAR, who has coordinated the WRF project. This fall, WRF will replace the regional version of Eta, says Geoffrey DiMego, mesoscale branch chief at NCEP. Next fall, it will replace the Eta version whose grid encompasses all of North America, and in 2006 the version used to forecast hurricanes. Most MM5 users outside NWS will be transitioning to WRF as further development of MM5 ends and NCAR training support for MM5 users is eliminated.

## Local is good

Broad use of WRF will accelerate advances in mesoscale modeling, all agree, but many modelers would like to see more regional modeling being done locally. “NCEP is always going to be the center for modeling,” says NWS's Welsh. “But I believe there's a place in weather forecast offices for a localized, customized model for a particular part of the country. I don't know that every part of the country needs one; I'm convinced Florida does.”

Mass agrees on the need for regionally based forecasting. “Each part of the country has different needs; the way you do the modeling is different,” he says. A local mesoscale-model forecaster is also more likely to find enough local observations to feed the model, he says. And a local forecaster will be close to those who use the forecasts: a state water agency predicting river flows, a state environmental agency predicting air quality, or a U.S. Forest Service office planning controlled burns. Those connections would also make it easier to raise the $200,000 to $300,000 needed from diverse sources each year for operational support of a forecasting system such as the Northwest Modeling Consortium. Such local funding will likely be vital to the continued expansion of mesoscale forecasting, at least for a few years.
According to Nelson Seaman of NWS in Silver Spring, Maryland, NWS support for WRF at the regional level will consist of sequential trial runs at a half-dozen sites around the country during the next few years. So far, nothing is being promised beyond that to accelerate the devolution of forecasting power into a truly local affair.

13. METEOROLOGY

# No End Yet to Forecast Advances

1. Richard A. Kerr

While weather forecasters have been sharpening their views of tomorrow's weather in their own backyards (see main text), other researchers have been keeping up their seemingly inexorable improvement in forecasting next week's big picture of the weather. “The harder you work on each aspect of the forecast system, the better the forecasts become,” says forecast model developer Anthony Hollingsworth of the European Centre for Medium-Range Weather Forecasts in Reading, England.

In its nearly 25-year history, work on medium-range forecasting by computer models has extended the length of high-quality forecasts from about 2 days to about 4 days, says Hollingsworth. That improvement required that the biggest weather forecasting models be run on the most powerful supercomputers governments could afford. The forecasts are graded on how well they predict only the general weather patterns around the world: the position and intensity of fair-weather high-pressure systems and stormy lows. Lower quality but still useful forecasts have been extended from 5.5 days to almost 8 days. “I hope we'll see useful 10-day forecasts by the end of this decade, in the winter at least,” says Hollingsworth.

The most dramatic improvement of the past decade came in the Southern Hemisphere. Any computer forecast must begin with a picture of the current weather; the more accurate the initial picture, the more accurate the forecast. But the predominance of ocean over land in the Southern Hemisphere has always meant a dearth of places from which to make weather observations.
In the 1990s, the advent of sophisticated weather satellites and of new ways of assimilating their observations into forecasts accelerated improvements in the south, says Hollingsworth. In the past 3 or 4 years, the gap between forecast skill in the north and south has closed. Over both hemispheres, forecasting into next week also benefited from more detailed model simulations and more realistic representations of a model's physical processes, such as cloud formation, says Hollingsworth. All these improvements required ever-increasing computer power and new and more efficient ways to do the required numerical computations.

But human forecasters are still staying ahead of their machines, says James Hoke of the National Weather Service's National Centers for Environmental Prediction in Camp Springs, Maryland. Knowing the shortcomings of the models, human forecasters are adding 10% to 15% to the skill of forecasts over that of the models alone, he says. But there's a theoretical limit to prediction—whether machine or human—somewhere around 14 days, when atmospheric chaos prevails. And as models continue to improve, Hoke says, the amount of room available for forecast improvement by humans will eventually shrink. Someday, the machines could take over.

14. PLANETARY SCIENCE

# Skywatchers Await the Fleeting Shadow of Venus

1. Robert Irion

On 8 June, Venus will cross directly in front of the sun for the first time in 122 years

Venus usually shines like a brilliant beacon in the morning or evening sky. But on 8 June, our sister planet will assume a darker guise: a circular blot, slowly crossing the sun's face in a dramatic 6-hour “transit.” No one alive has seen this mini-eclipse, which last occurred in 1882. Astronomers of that era launched lavish excursions to capture the event with newly invented cameras. This time, some researchers will use the transit as a dress rehearsal for studying extrasolar planets; others will probe the causes of an odd optical distortion.
Space agencies also plan observing campaigns to educate the public about the workings of our clocklike solar system.

Part of that clock is the sporadic timing of Venus transits. The planet rarely crosses a direct line between the sun and Earth, because its orbit tilts 3.4 degrees relative to the plane of Earth's path around the sun. When a transit does occur, a second one usually happens 8 years later. Those who miss the show in June will have another chance in 2012—the last alignment for 105 years.

After the first sighting of a transit in 1639, each one grew in cultural impact. The 1874 and 1882 events were such phenomena that composer John Philip Sousa wrote a march called “Transit of Venus,” and a Harper's Magazine cover depicted Appalachian children watching the sun through a smoked pane of glass. Astronomers were captivated as well. “It was like a space race in the 19th century to make accurate measurements of the transits,” says NASA chief historian Steven Dick, formerly of the U.S. Naval Observatory in Washington, D.C. Indeed, the U.S. Congress funded eight expeditions in 1874 for a princely $177,000, and Russia fielded a whopping 26 teams.

Their goal was the same: to measure the exact moments when the full circle of Venus entered and exited the sun's disk. Once they gauged those times at many places on Earth, astronomers could use surveying methods to calculate the Earth-Venus distance. Then, Johannes Kepler's orbital laws would yield the long-sought “astronomical unit” (AU)—the distance between Earth and the sun.
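
That last step relied only on relative geometry: Kepler's third law fixes every orbital radius in units of the AU, so a single absolute Earth-Venus distance calibrates the whole system. A minimal sketch of that arithmetic (periods only; the transit-parallax surveying itself is omitted, and the period values are standard figures, not from this article):

```python
# Kepler's third law for bodies orbiting the sun: (a / AU)^3 = (P / yr)^2,
# so orbital radii in AU follow from orbital periods alone.
P_venus_days = 224.70   # sidereal period of Venus
P_earth_days = 365.25

a_venus = (P_venus_days / P_earth_days) ** (2.0 / 3.0)  # radius in AU
d_transit = 1.0 - a_venus  # Earth-Venus separation at transit, in AU

print(f"Venus orbital radius: {a_venus:.3f} AU")            # ~0.723 AU
print(f"Earth-Venus distance at transit: {d_transit:.3f} AU")  # ~0.277 AU

# Measuring that ~0.28 AU gap in kilometers, via the parallax of Venus's
# track across the sun seen from widely separated stations, converts
# every distance in the solar system into kilometers.
```

With the modern AU of about 150 million kilometers, that 0.277 AU works out to roughly 41 million kilometers, which is why even a small timing error from the black-drop effect ruined the surveys.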

The answers were close to what is now known to be the true value of about 150 million kilometers, but scientists were skeptical. The problem was the “black-drop effect”: a distortion that stretches the silhouette of Venus into the shape of a water drop when the transit begins and ends. “The black-drop effect makes it extraordinarily difficult to determine when the planet's edge actually touches the inner edge of the sun,” says astronomer Edward DeLuca of the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, Massachusetts. By the 1890s, other methods for measuring the AU were deemed far more accurate.

Many popular accounts blame the optical tricks on Venus's thick clouds, but astronomers agree that the planet's atmosphere can't bend light severely enough. Sharp observers in the 18th century first suggested another source: Earth's own blanket of air, a deduction confirmed in 2001 by astronomer Bradley Schaefer of Louisiana State University in Baton Rouge. Using computer models, Schaefer showed that smearing within Earth's atmosphere—which also makes stars twinkle—blurs Venus's disk in the telltale way during transits. Diffraction of light within a telescope adds more warping, he noted.

But that's not the full story. When a satellite far above the atmosphere watched a transit by the planet Mercury in 1999, it also spotted a black-drop effect, according to a recent study in Icarus. Astronomers Glenn Schneider of the University of Arizona in Tucson; Jay Pasachoff of Williams College in Williamstown, Massachusetts; and Leon Golub of CfA concluded that the effect came from the spread of light within the satellite's camera and the dimmer appearance of the sun's edge, an effect called “limb darkening.” The same satellite—NASA's Transition Region and Coronal Explorer (TRACE)—will observe the Venus transit in June to reveal the relative impacts of each distortion once and for all. “It will solve the black-drop mystery totally,” says DeLuca, a TRACE scientist.

Others on the ground also plan to watch. Astronomers Wolfgang Schmidt of the Kiepenheuer Institute for Solar Physics in Freiburg, Germany, and Timothy Brown of the National Center for Atmospheric Research in Boulder, Colorado, will use a 0.7-meter solar telescope in the Canary Islands to take detailed spectrographic images. They will try to measure wind speeds in the upper atmosphere of Venus by detecting Doppler shifts in the spectral lines of carbon dioxide gas, illuminated by the bright sun behind.

“This is an unprecedented experiment,” Brown says. “No one knows how it will work.” Ultimately, astronomers might adopt a similar approach to study the atmospheres of transiting planets outside the solar system, he notes. Any such effort would have to be exquisitely sensitive to faint changes in the pattern of a distant star's light.

Beyond the potential research, scientists expect a surge of public interest as the transit nears. Both NASA and the European Southern Observatory are sponsoring public-viewing campaigns* and live Webcasts. Participating students will record transit times and learn how to calculate an AU. Viewers in most of Europe, Africa, and Asia will get to watch the transit from start to finish, although those in the eastern half of the United States must settle for a shorter taste at sunrise. Other Americans will miss out—but a sunset view of the next transit awaits in 2012.

Even grizzled scientists are eager for 8 June to arrive. “The romance and history of Venus transits are wonderful,” says Brown. “If nothing else, this will be a great time.”

15. AMERICAN PHYSICAL SOCIETY MEETING

# Once Again, Dark Matter Eludes a Supersensitive Trap

1. Charles Seife

DENVER, COLORADO—APS's April meeting (held here 30 April to 4 May) brought together 1000 researchers in nuclear and high-energy physics, astrophysics, and related fields.

Dark matter has just become a shade darker. At the APS meeting, physicists from the Minnesota-based Cryogenic Dark Matter Search (CDMS) reported that the first results from the most sensitive dark-matter detector ever built had failed to reveal the invisible particles that theorists believe make up most of the mass in the universe. The finding nails shut the coffin on a controversial claim to have spotted dark matter, but if the particles continue to be no-shows, that would spell trouble for scientists' understanding of our universe.

Almost all astrophysicists are certain that dark matter exists. Several lines of evidence suggest that about 85% of the universe's mass is invisible. Stranger still, the observations imply that this mass is not the ordinary matter that makes up stars and planets and people. It must be made of an entirely different type of particle. The leading candidate by far is known as a weakly interacting massive particle (WIMP).

Despite years of trying, scientists have yet to catch WIMPs. Since 1998, researchers in the Italian Dark Matter (DAMA) experiment have claimed to have seen their faint signature, but nobody else has confirmed DAMA's results—and other experiments seemed to belie them (Science, 7 June 2002, p. 1782).

CDMS also started hunting WIMPs in 1998, using silicon and germanium detectors to look for dark-matter particles traversing a tunnel at Stanford University in California. If a dark-matter particle bumps into an atom in the detector, it leaves behind some energy, which shows up as a signal. But cosmic rays and stray nuclear particles can give false readings and limit the detectors' sensitivity. So, in 2003, physicists running the second phase of the experiment, CDMS II, buried improved detectors deep in an iron mine in Soudan, Minnesota, where overlying rock and soil screen out most of the stray particles.

CDMS II is four times more sensitive than any other experiment, says team member Bernard Sadoulet, a physicist at the University of California, Berkeley. Nevertheless, CDMS II has not spotted a WIMP in 53 days of running. Says Stanford physicist and CDMS II team member Blas Cabrera: “If there had been WIMP events in the data set, we're quite convinced we would have seen them.”

Although the results are disheartening so far, they at least refute the controversial DAMA claim. If the DAMA result were a genuine observation, says Sadoulet, “we would have observed something like 150 events.” At another talk at the meeting, physicist Lawrence Krauss of Case Western Reserve University in Cleveland, Ohio, declared that “DAMA is dead, as far as I can see.” But Rita Bernabei, a physicist with the DAMA collaboration, says that the CDMS results are “model dependent” and do not invalidate DAMA's direct measurements of dark matter.

The CDMS II team plans to increase the instrument's sensitivity in the coming months by adding more detectors. Then, the experiment will run for several more years. If CDMS II hasn't shined a spotlight on a dark-matter particle by then, cosmologists will be in a dark mood indeed.

16. AMERICAN PHYSICAL SOCIETY MEETING

# Solar Flares Reveal Surprising Recipe

1. Charles Seife

DENVER, COLORADO—APS's April meeting (held here 30 April to 4 May) brought together 1000 researchers in nuclear and high-energy physics, astrophysics, and related fields.

When the sun belches, scientists listen—and observe. At the meeting, researchers presented a new picture of flares, drawn from a satellite whose gamma ray eyes watch them erupt in extraordinary detail, that is leaving solar physicists sunstruck.

Most scientists believe that solar flares, huge explosions on the sun, occur when the sun's magnetic field lines snap and then reconnect. The magnetic fields accelerate charged particles—electrons and ions—in the solar atmosphere and slam them back into the surface of the sun. The process sends torrents of gamma rays and x-rays shooting out into space. The 2-year-old Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) satellite is designed to spot those gamma and x-rays, which reveal how charged particles in a flare behave.

On 23 July 2002, RHESSI glimpsed a solar flare on the east part of the sun—the first gamma ray picture of a solar flare in action. What the satellite saw, though, was unexpected, says Robert Lin, principal investigator of the RHESSI mission. “It was thought that both ions and electrons were accelerated at the same time and should be in the same place,” he says, but the electrons' gamma rays and the ions' gamma rays came from spots on the sun thousands of kilometers apart. “That was a big surprise to us.” Enormous solar flares that erupted in October and November 2003 seem to show the same trend. “There's a suggestion that [ions and electrons] are not coincident,” Lin announced at the meeting. “The jury's really out; we don't know why they end up in different places.”

Even though the reasons for the separation are still obscure, the observations are likely to help physicists unravel what's going on in a solar flare. “That's interesting,” says Gerald Share, a physicist at the Naval Research Laboratory in Washington, D.C. “Maybe the ions are traveling down a different magnetic loop than the electrons.” Share joined the RHESSI team 7 months ago to work on slightly different aspects of the gamma ray data—and he's extremely excited about what the satellite's observations will reveal about how solar flares work. Lin shares Share's enthusiasm. “We'd like to understand how the sun releases its energy,” he says. “We're just getting gamma ray measurements that bear on this.”

17. AMERICAN PHYSICAL SOCIETY MEETING

# Gravity Withstands Close-Up Scrutiny

1. Charles Seife

DENVER, COLORADO—APS's April meeting (held here 30 April to 4 May) brought together 1000 researchers in nuclear and high-energy physics, astrophysics, and related fields.

Legend says Galileo studied gravity by dropping balls off the Leaning Tower of Pisa. Nowadays, physicists drop neutrons. At the meeting, German researchers showed how these tiny particles are revealing the strength of gravity on the very tiniest scales. The neutron work is a “very interesting experiment,” says physicist Eric Adelberger of the University of Washington, Seattle. “Questions about gravity are at the heart of physics.”

Since Newton's time, scientists have known that the force of gravity between two bodies falls off as the square of the distance between them increases. By observing planets moving around the sun, satellites orbiting Earth, and heavy masses attracting other nearby masses, physicists have confirmed that this so-called r-squared law holds from astronomical length scales down to fractions of a meter. But recently, theorists have suggested that gravity might subtly deviate from the law on tiny scales. An extra dimension, for example, might mess up the gravitational force for lengths smaller than the diameter of a human hair.
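
Such short-range deviations are conventionally parametrized as a Yukawa correction to the Newtonian potential, with a relative strength α and a range λ; that form is the field's standard convention, not something stated in this article. A minimal sketch:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def newton_potential(r, m1, m2):
    """Plain inverse-square-law potential energy (joules)."""
    return -G * m1 * m2 / r

def yukawa_potential(r, m1, m2, alpha, lam):
    """Newtonian potential with a Yukawa correction of relative
    strength alpha and range lam (meters)."""
    return -G * m1 * m2 / r * (1.0 + alpha * math.exp(-r / lam))

# With alpha = 1 and a range of 0.1 mm, the correction is invisible at
# separations of a meter but nearly doubles the potential at 10 microns,
# which is why only sub-millimeter experiments can constrain it.
m1 = m2 = 1.0  # kg; the masses cancel in the ratio
frac_far = yukawa_potential(1.0, m1, m2, 1.0, 1e-4) / newton_potential(1.0, m1, m2)
frac_near = yukawa_potential(1e-5, m1, m2, 1.0, 1e-4) / newton_potential(1e-5, m1, m2)
print(f"r = 1 m:     V/V_newton = {frac_far:.6f}")   # ~1.000000
print(f"r = 10 um:   V/V_newton = {frac_near:.3f}")  # ~1.905
```

Astronomical tests, which probe separations enormously larger than any plausible λ, thus say nothing about such a term; that is the gap the pendulum and neutron experiments below the millimeter scale are designed to close.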

Gravity is extremely hard to measure on those scales, because stronger forces, such as electrostatic repulsion, overwhelm its effects. At the University of Washington, physicists have used a very fine pendulum to show that the r-squared law holds down to scales of a tenth of a millimeter. At the meeting, physicist Stefan Baessler of the University of Mainz, Germany, described an experiment that tested the law on scales up to 100,000 times smaller still.

Baessler and colleagues dropped very slow, very cold neutrons onto a surface. When a neutron hits, says Baessler, it bounces like a tennis ball. Because a neutron is a quantum object, however, it can rebound only in fixed steps. Just as there is a lowest energy for an electron bound by electric forces near a hydrogen nucleus, there is a minimum bounce height for a neutron bound by gravitational forces near a surface. Find out that minimum bounce height, and you find out, with great precision, the force of gravity.
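
The scale of that minimum bounce height can be estimated from the Schrödinger equation in the linear potential V(z) = mgz, whose bound states are Airy functions. The numbers below are a back-of-envelope sketch under that textbook treatment, not figures reported by the experiment:

```python
# Ground state of a neutron bouncing on a mirror in Earth's gravity.
# For V(z) = m*g*z above a hard floor, the energy levels are
# E_n = a_n * (m * g**2 * hbar**2 / 2)**(1/3), where a_n are the
# (negated) zeros of the Airy function Ai; a_1 ~ 2.3381.
hbar = 1.0545718e-34   # reduced Planck constant, J s
m_n = 1.674927e-27     # neutron mass, kg
g = 9.81               # gravitational acceleration, m s^-2
a1 = 2.3381            # first Airy zero (known constant)

E1 = a1 * (m_n * g**2 * hbar**2 / 2.0) ** (1.0 / 3.0)  # joules
h1 = E1 / (m_n * g)    # classical turning point = minimum bounce height

print(f"ground-state energy: {E1 / 1.602e-19 * 1e12:.2f} peV")
print(f"minimum bounce height: {h1 * 1e6:.1f} micrometers")
```

The estimate lands around a picoelectronvolt and roughly ten micrometers, which is why the experiment can detect the quantization at all: lowering an absorbing ceiling below that height wipes out the neutron flux.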

Even a tiny deviation from the r-squared law should make the minimum bounce height different from the height expected. When the physicists measured the minimum bounce height of the neutrons by lowering a neutron-absorbing ceiling toward the surface, they found that it was right where the r-squared law implied it should be. Baessler says they would have spotted significant deviations from that law even on the scale of a nanometer—and further refinements should make the technique even more sensitive.

18. HIGHER EDUCATION

# Reinventing Europe's Universities

1. Martin Enserink

European researchers have begun to wonder why their universities don't have the same research star status as America's Ivy League. Getting there will require serious reform

LIÈGE, BELGIUM—Does Holland have a Harvard? How does the Sorbonne measure up against Stanford? And why is there no Euro-Ivy League? European researchers and policy makers are increasingly asking such questions, and they don't find the answers reassuring. Across the continent, there are fears that Europe's universities, once bastions of leading science, no longer rate as global players—a slump that could harm Europe's economic prosperity.

At a recent meeting* here, scientists, administrators, and politicians grappled with a variety of ways to help universities punch their weight, from reforming education and improving graduate training to breaking down national barriers and what former UNESCO chief Federico Mayor called an “onslaught on the bureaucratic labyrinth” of European R&D funding. The goal: to create American-style research universities—and along with them, the vibrant technology sector that seems to flourish around them.

However, creating an American-style competitive market will not be easy. It goes against the grain of European egalitarianism, which strives to provide a solid education to as many students as possible while refraining from rewarding exceptional talent. Besides, some are uneasy about the idea of refashioning universities as technological powerhouses, worrying that it will hurt the traditional role of campuses as seats of learning for an intellectual avant-garde. “Surely, a university is more than training people how to set up companies,” said Jean-Patrick Connerade, the president of Euroscience, a pressure group representing research scientists.

The scale of the problem is vigorously debated. When the Institute of Higher Education at Jiao Tong University in Shanghai recently posted on its Web site a ranking of university research prowess—based on such measures as the number of papers in Science and Nature, Nobel prizes, and citations—only 10 European universities cracked the top 50, compared with 35 from the U.S. (see table, p. 953).

The rankings, bandied frequently at the Liège meeting, have touched a nerve. Skeptics of the value of such comparisons note that although the European research landscape has fewer peaks, it has fewer valleys as well, so such a ranking says little about overall research quality. But other indicators tell a similar story, says Roger Bouillon, vice president for research at the Katholieke Universiteit in Leuven, Belgium. European researchers may produce comparable numbers of papers to their U.S. counterparts, but the impact is less. European science also leads to far fewer patents, and there's a net brain drain to the United States. Numbers aside, “you just feel it,” says Bouillon. “Whenever you're at an important meeting, Americans dominate.”

Europe's premium on equality is not the sole reason for the widening transatlantic gap. In the United States, federal agencies such as the National Institutes of Health and the National Science Foundation dole out huge sums of money in open competitions based on research quality. Hence a university that attracts top-tier scientists will win supersized slices of that pie, with the top 20 U.S. universities together raking in about a third of federal research dollars. Although systems vary across Europe, universities in many nations are awarded funding according to enrollment figures—and students often pick a university based more on location than on its reputation.

A number of national policies hinder competition among universities. Some countries do not allow institutions to select the best students, some have fixed national tuition fees and others none at all, and academic salaries usually are governed by a national formula, constraining the ability of universities to hire top talent. Moreover, “management is often not superprofessional,” says Jeroen Huisman of the Centre for Higher Education Policy Studies in Enschede, the Netherlands. Often, the leadership of departments and even universities is not chosen on merit or even by vote, but rotates among faculty members, many of whom strive primarily to keep the peace.

The European Commission wants to see something done about this policy patchwork. Four years ago it announced its intention to make Europe the “most competitive and dynamic knowledge-based economy in the world” by 2010, with universities leading the charge. Although the commission has no power over university policy—that lies with national governments—it does have one important tool at its disposal: money. The commission is about to throw its weight behind the creation of the European Research Council (ERC), an agency envisioned to spend billions of euros annually. The E.U.'s existing research program, Framework, focuses largely on applied R&D and emphasizes large international collaborations. In contrast, ERC will focus on basic science, fund individual teams, and have research quality as its sole criterion (Science, 2 January, p. 23).

Like the commission, some countries are beginning to see competition as a way to strengthen their universities. The German government, for instance, has proposed a plan to select a handful of “elite universities” and channel extra money their way. A few states have launched their own reforms; Baden-Württemberg, for example, is introducing more professional management, autonomy, performance-based funding, and tuition fees—all anathema not so long ago. The Dutch government, too, is forging ahead with an experiment in letting universities set tuition fees and select students.

But resistance to such changes is often fierce. In the U.K., for example, legislation allowing variable tuition fees nearly cost Prime Minister Tony Blair his political hide. Education ministries are loath to relinquish power, and university researchers fear losing out in a Darwinian struggle for survival. Introducing a merit-based funding system in Europe is “going to take quite a bit of fighting,” Bouillon predicts. “We have accepted that the best win in soccer, art, music, and business,” he says, “but not in basic research. That makes no sense.” Indeed, some Liège delegates decried the slow pace of change. “We're an airplane going around in circles on the runway but never getting off the ground,” complained Antoni Kuklinski, director of the Institute of Socio-Economic and Regional Geography at the University of Warsaw, at the meeting.

Taking off is growing more urgent every day, says Geoffrey Boulton, a vice principal at the University of Edinburgh, because a whole new generation of Asian scientists is joining the global competition. “When I look at cozy Europe,” he said at the meeting, “I'm really quite terrified sometimes.”

• *The Europe of Knowledge 2020. A vision for university-based research and innovation. Liège, 25–28 April.

19. HIGHER EDUCATION

# Russian Universities Want Their Share of the Research Pie

1. Andrey Allakhverdov,