# News this Week

Science  16 May 2003:
Vol. 300, Issue 5622, pp. 1062
1. SARS OUTBREAK

# Flood of Sequence Data Yields Clues But Few Answers

1. Gretchen Vogel

The unprecedented computer and brain power focused on the severe acute respiratory syndrome (SARS) outbreak is beginning to yield hints about how to fight the disease. Within 3 weeks of fingering the culprit, scientists in several labs had sequenced the novel 30,000-base coronavirus. Last week, researchers from Singapore described their comparison of sequences drawn from 14 patients around the world, providing insights on the mutation rate of the virus and tools to track its spread. And on 13 May, researchers from Germany used the sequence to work out a probable structure for one of the key proteins involved in the virus's replication, a development that could provide a possible drug target.

The progress has been impressive. In the last month, labs have produced “more coronavirus sequences than I've seen in my 18-year career,” says coronavirus expert Mark Denison of Vanderbilt University in Nashville, Tennessee. But the most effective tool against the disease is still the same one used to fight 19th century cholera outbreaks: quickly tracing and isolating the contacts of stricken patients before they spread the virus further.

Hoping to learn how fast the virus is changing, Edison Liu and his colleagues at the Genome Institute of Singapore compared isolates of SARS coronavirus from five patients in Singapore. They then compared these isolates to those sequenced by teams in Canada, the United States, and China. In a paper published online on 9 May by The Lancet, the team reported that overall, the virus is relatively stable compared to many other RNA viruses. Some scientists had hoped for signs that the virus was becoming less virulent, but Denison says such a systematic change could easily take years or even decades instead of weeks.

Of the 127 sequence changes the team identified among the isolates, 16 were present in more than one isolate—a sign that the mutation was present in the original virus cultured from the patient and was not a lab artifact. The distinctive variations may offer a tool with which to trace the spread of the virus. Sequences isolated from patients whose infections can be traced to a sick doctor at the Metropole Hotel in Hong Kong have a pattern distinct from those in Beijing and Guangdong.
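The logic of that screen—treating a variant seen in several independent isolates as likely real, and a singleton as a possible culture or sequencing artifact—can be sketched in a few lines. Everything below is invented for illustration (the toy reference and isolate sequences, the two-isolate threshold); it is not the Lancet paper's actual pipeline:

```python
# Hypothetical sketch: flag variants shared by two or more isolates,
# since a change that recurs in independent samples is unlikely to be
# a lab artifact. Sequences are invented toy data, not real SARS genomes.
from collections import Counter

def shared_variants(reference, isolates):
    """Return {(position, base): count} for variants seen in 2+ isolates."""
    counts = Counter()
    for seq in isolates:
        for pos, (ref_base, base) in enumerate(zip(reference, seq)):
            if base != ref_base:
                counts[(pos, base)] += 1
    return {variant: n for variant, n in counts.items() if n >= 2}

reference = "ACGTACGTAC"
isolates = [
    "ACGTACGTAC",  # identical to reference
    "ACGAACGTAC",  # T->A at position 3
    "ACGAACGTAC",  # same T->A in a second isolate: likely real
    "ACGTACGTGC",  # A->G at position 8: singleton, possible artifact
]
print(shared_variants(reference, isolates))  # {(3, 'A'): 2}
```

The shared-variant fingerprints, not the singletons, are what make the isolates traceable to a common source such as the Metropole Hotel cluster.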

In animal coronaviruses, scientists have documented several examples in which single nucleotide changes can dramatically alter the virus's behavior. But until the different symptoms and severity of SARS are better understood, it will be difficult to tell what—if anything—the genetic variations mean for the behavior of this virus, says Earl Brown of the University of Ottawa.

Meanwhile, different teams are using the new sequence data to search for the virus's potential weak spots. On 13 May, Science published a paper online (www.sciencemag.org/cgi/content/abstract/1085658) describing the crystal structures of a key protein from two coronaviruses: one that causes mild cold symptoms in humans and one that infects pigs. Rolf Hilgenfeld of the University of Lübeck, Germany, and colleagues then used the known structures and the sequence of the protein from the SARS virus to build a probable model of the new virus's protein. The protein is a crucial piece of the virus's replication cycle, notes Denison, and compounds that block it could make an effective treatment.

So far, old-fashioned tracking and isolation of patient contacts does seem to be paying off. On 28 April, Vietnam became the first country to be declared free of SARS. The outbreaks in Toronto and Singapore are also tapering off. In Hong Kong, progress is slower. On 12 May, officials reported just five new cases in the city, according to the World Health Organization (WHO), down from more than 100 a day at the epidemic's peak. But the city is not out of the woods: WHO officials said last week they could not identify the source of infection for nearly 9% of probable cases. They also agreed that the virus is more deadly than it originally appeared. When they factored in patients who are hospitalized for weeks before succumbing to SARS, the death rate climbed to nearly 15%. For patients over 65, the death rate is 50%.

A sensitive diagnostic test to quickly identify people infected with the SARS coronavirus is at the top of many public health officials' wish lists. Christian Drosten of the Bernhard Nocht Institute for Tropical Medicine in Hamburg, Germany, says the relative stability of the virus should help scientists develop just such a test. If all goes well, he says, it could be ready within a few weeks.

2. SARS OUTBREAK

# Hong Kong to Beef Up Monitoring

1. Dennis Normile

TOKYO—Until this year, the Hong Kong government didn't see an urgent need to replace its patchwork system for combating flu outbreaks with a coordinated effort to spot emerging diseases. Now, thanks to a $64 million gift from the Hong Kong Jockey Club, the government is setting up an institution with the necessary resources to monitor and respond aggressively to the severe acute respiratory syndrome (SARS) virus and other infectious agents.

“There was no great need of such an organization before,” says Lap-Chee Tsui, vice chancellor of the University of Hong Kong and a noted geneticist. “The [1997] bird flu outbreak was stopped effectively. But the SARS outbreak just identified the need.”

On 5 May, Chief Executive Tung Chee Hwa announced that Hong Kong plans to create a center modeled on the U.S. Centers for Disease Control and Prevention in Atlanta. “Since the SARS virus came to Hong Kong, we have seen that there is a possibility that there might be other infectious diseases in the future,” says Tung. Kennedy Shortridge, a virologist and professor emeritus at the University of Hong Kong who has long warned that south China could be the crucible of disease pandemics, hails the new center as “an excellent idea that stands to benefit the global community.”

The jockey club steers proceeds from Hong Kong's lucrative horseracing operations into various philanthropic efforts, including a recent campaign to combat SARS by disinfecting schools. Lo Su Vui, head of the research office of the Health, Welfare, and Food Bureau, says the new center is expected to build on and centralize existing governmental disease efforts. “At the moment, [disease surveillance and control] functions are undertaken by various departments,” Lo says. “What we need to do is to pool these resources.” One of the first steps, he says, will be to train additional personnel to watch for emerging diseases and respond quickly when they appear.
A key point in making early surveillance work, Lo adds, will be “not working in isolation but linking up with similar structures in the region.” He notes that officials from his department and their counterparts from neighboring Guangdong Province, for example, recently agreed to strengthen mutual reporting mechanisms. The center is also expected to carry out public education and support research. But Lo says the mechanism for interacting with existing university research programs is still being worked out.

3. PHYSICS

# Is Berkeley Past Its Prime?

1. Robert F. Service

U.S. News and World Report ranks its Ph.D. program third in the country. So why have so many rising young stars left the physics department at the University of California, Berkeley, for greener academic pastures? The answer, according to a new report by a blue-ribbon panel, is simple: resources.

Not so long ago, physics was at the center of Berkeley's universe, and Berkeley was at the center of the physics world. In 1930, a 29-year-old whiz kid named Ernest Lawrence invented the atom-smashing cyclotron there. In the 1940s, faculty member Robert Oppenheimer led the Manhattan Project, which produced the world's first nuclear weapons. But the department has suffered a “significant decline” since those glory days, says a blunt report by outside reviewers. And what's needed—tens of millions of dollars to renovate dilapidated offices and antiquated laboratory space—won't be easy to find given California's current budget crunch and the slumping national economy.

The lack of adequate lab space “was one of the most important reasons I left,” says Séamus Davis, a nanotechnology expert who decamped for Cornell University in Ithaca, New York, 13 months ago. “I think it's a superb institution. I loved being there,” he says.
“But the amount of support physics was getting [from the administration] was too low.” Lars Bildsten, a theoretical physicist who was wooed away 4 years ago by the University of California, Santa Barbara, adds that faculty members rarely want to leave Berkeley but that poor facilities make them vulnerable to headhunters. “It's the reason why, when the phone rings [with an offer from another institution], you don't say ‘No, thank you, I'm not interested, good-bye,’” Bildsten says.

Over the past 17 years, the department has lost nine recently tenured professors to other universities; at least seven have left in just the past 4 years. The chemistry department, by contrast, has lost only two junior professors in 10 years. “Unless the current situation is changed, further plucking of the best young faculty by stronger institutions is inevitable,” the report warns. Details of the report were first made public by the San Francisco Chronicle. The review, a periodic departmental assessment commissioned by the university, was carried out by a six-person panel that included Nobel Prize winners Horst Stormer of Columbia University in New York City and Steven Chu of Stanford University.

According to the team's report, the department's troubles stem from its outdated infrastructure. Berkeley physicists occupy three buildings on campus and share facilities at the neighboring Lawrence Berkeley National Laboratory. But the on-campus buildings, the newest of which was built in the 1960s, are “in urgent need of overhaul” and lack state-of-the-art equipment needed to shield experiments from vibrations and other disturbances. To make matters worse, this summer one of the physics buildings, LeConte Hall, will close for 18 months for earthquake-proofing, forcing many research groups into makeshift lab space. Many departmental researchers say the report's conclusions came as little surprise, although its blunt language raised some eyebrows.
“In some cases I feel the report is overly pessimistic,” says Berkeley's physics department chair, Christopher McKee. “By its nature, it focused on things that need fixing, not on what works well.” Clearly, McKee and others say, Berkeley's physics department still has plenty going for it: Besides its lofty magazine ranking, which placed it behind only Harvard and Princeton, the department features 12 members of the U.S. National Academy of Sciences. One of them, theoretical physicist Marvin Cohen, says he wouldn't hesitate for a moment to recommend the department to a close relative. “I'm one of those people who is still very high on the department,” he says.

To stop the defections, the report urges department and administration officials to raise funds for new lab space, renovate LeConte Hall while the building is closed for seismic repairs, and recruit an “exceptionally strong” chairperson to create a long-term strategic plan. Department and administration officials say they are already addressing the problems. Berkeley's executive vice chancellor, Paul Gray, says the school has nearly locked up $65 million in state funds to help pay for a $108 million integrated physical sciences building. Construction is set to begin in 2006, with completion 3 years later. Physicists will also be given lab space in a multidisciplinary building, Stanley Hall, that's due for groundbreaking later this month and set to open in 2006. Finally, Gray says, the physics department is getting $10 million to $12 million in discretionary funds that could be used to help renovate LeConte Hall during its seismic retrofit. Over time these upgrades “are going to make a huge difference,” Gray says.

Berkeley's faculty has heard promises before. “There have been talks for the last 15 to 20 years for getting new buildings for the physics department. But we could never get past this point to make it the highest priority within the university,” says theoretical physicist Steven Louie.
Still, Louie and others say they are cautiously optimistic. “I think we are on the right path,” says experimental physicist Alex Zettl. “These are good solutions to the immediate problems. But we need to look many steps ahead. In a decade, we don't want to be in the same situation again.”

4. ANTHRAX VACCINE

# NIAID's $233 Million Problem Put on Hold

1. Jon Cohen

Two influential U.S. senators have ridden to the rescue of the National Institute of Allergy and Infectious Diseases (NIAID), which is faced with a $233 million budget problem. NIAID officials have been scrambling to comply with a surprise directive from the White House Office of Management and Budget (OMB) to commit $233 million of its fiscal year 2003 budget to “procure” millions of doses of a new-generation anthrax vaccine. Because the funds had not been included in NIAID's budget, officials were reluctantly preparing to rob other programs to pay for the vaccines. On 9 May, following a story in that day's issue of Science (p. 877), Senators Arlen Specter (R-PA) and Tom Harkin (D-IA)—chair and ranking member, respectively, of the Senate's subcommittee on labor, health and human services, and education—wrote the director of OMB a stern letter blocking the expenditure, at least temporarily.

The senators stressed that their subcommittee, which writes a report detailing how NIAID and other branches of the National Institutes of Health should spend their money, “did not intend that NIH research dollars be used for this purpose.” They suggested that instead of “diverting funds appropriated for SARS, AIDS, and other infectious disease research,” a new bill in Congress for the so-called Project BioShield—a collection of programs to protect against bioterrorism—could include the anthrax money. They also questioned whether a new-generation anthrax vaccine will actually be ready for purchase by the end of the fiscal year. NIAID has funded two biotechnology companies to develop the vaccine, but they expect to complete only preliminary animal and human tests in that time frame.

Specter and Harkin asked OMB to answer their questions by 16 May and said that NIAID should not spend any of its money to buy anthrax vaccines until then.

5. TOXICOLOGY

# Arsenic Victims to Take British Science Body to Court

1. Daniel Bachtold

CAMBRIDGE, U.K.—The British Geological Survey (BGS) is girding for a court battle over claims that it could have averted a wave of arsenic poisoning in Bangladesh over the past decade. A High Court judge in London last week gave the go-ahead for a trial pitting two Bangladeshi residents against BGS's parent body, the Natural Environment Research Council (NERC). The plaintiffs, both of whom are said to suffer from arsenic poisoning, are claiming unspecified damages.

The arsenic poisonings are a tragic and unforeseen consequence of good intentions. In the late 1970s, UNICEF and other relief organizations began drilling drinking water wells to bypass sewage-tainted surface waters blamed for deadly outbreaks of cholera and other bacterial diseases. They were not aware that the groundwater in some parts of Bangladesh contains high levels of arsenic. Over the past few years, researchers have linked escalating rates of particular cancers—including bladder and lung—and skin lesions in Bangladesh to high levels of arsenic in drinking water, prompting the government in 1998 to alert communities to the hazard and to take emergency steps to provide alternate water supplies to the worst-hit regions.

The court case hinges on whether BGS should have tested for arsenic when it was commissioned by the British Overseas Development Administration in 1991 to help improve wells used mainly for irrigation and fish farming in Bangladesh's floodplains. As part of that study, BGS sampled for trace elements in 150 wells to find out “how the groundwater was flowing, particularly around the boreholes,” says hydrogeologist David Holmes, director of environment and hazards at BGS. “It wasn't aimed at testing the [water] supply for drinking quality.” In its 1992 report, BGS listed levels of 36 elements and compounds, including aluminum, iron, manganese, phosphorus, and silica. Arsenic was not tested, Holmes says, because at the time it was not known to be abundant in floodplains.

The lawsuit alleges that BGS was negligent not to probe for arsenic and that it should have expected that its report would be seen as a pronouncement on the safety of drinking water. “People were relying on the report … [as] a test of water quality,” asserts Bozena Michalowska, a lawyer for the plaintiffs. She contends that “they should have tested for [arsenic],” because BGS found high levels of iron in compounds relatively poor in oxygen, which could be explained by arsenic leaching into the water. Three years before the work in Bangladesh, BGS “had identified this to be an issue in British groundwater,” says Michalowska, and had subsequently surveyed for arsenic there.

Holmes acknowledges that BGS found iron in reducing conditions in Bangladesh in 1992 but says it is unfair to contrast those findings with the earlier U.K. study. “We didn't test for [arsenic] because we didn't expect it, whereas in the U.K. we would have expected it to be there,” he says. He adds that BGS scientists first heard about arsenic in groundwater in Bangladesh at a conference in 1995.

NERC says it will appeal the judge's decision to hear the complex case. “We are confident that there is no case to answer,” says NERC spokesperson Marion O'Sullivan. But if the claims do go to trial, Michalowska says that a successful outcome will pave the way for a class-action suit that could include hundreds of victims of arsenic poisoning.

6. NUMBER THEORY

# Prime-Number Proof's Leap Falls Short

1. Dana Mackenzie*
1. Dana Mackenzie is a writer in Santa Cruz, California.

All math students know the feeling of dismay when the teacher finds a mistake in their “right” answer. Now it's happened in one of the most exciting recent discoveries in number theory.

In March, Dan Goldston of San Jose State University in California announced that he and Cem Yildirim of Boğaziçi University in Istanbul, Turkey, had proved that prime numbers—integers that can be evenly divided only by themselves and 1—cluster on the number line in tighter clumps than was previously known (Science, 4 April, p. 32). Their estimate beat all previous results on “small gaps” by a stunning margin. It also marked the biggest step in decades toward proving one of the oldest and most famous hypotheses in number theory: the Twin Prime Conjecture, which posits that there are infinitely many pairs of prime numbers, such as 3 and 5 or 1,000,000,007 and 1,000,000,009, that are just two numbers apart. Mathematicians agreed, however, that a proof of the conjecture was still a long way off.
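The objects the conjecture is about are easy to exhibit. A minimal sketch—this is just an enumeration of twin prime pairs below a limit, not the Goldston-Yildirim small-gaps argument, which is analytic:

```python
# List twin prime pairs (p, p+2) up to a limit by trial division.
def is_prime(n):
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def twin_primes(limit):
    """Twin prime pairs (p, p+2) with p + 2 <= limit."""
    return [(p, p + 2) for p in range(2, limit - 1)
            if is_prime(p) and is_prime(p + 2)]

print(twin_primes(50))
# [(3, 5), (5, 7), (11, 13), (17, 19), (29, 31), (41, 43)]
```

The conjecture says this list never ends, however large the limit; the article's example pair 1,000,000,007 and 1,000,000,009 checks out under the same test.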

In mid-April, number theorists Kannan Soundararajan of the University of Michigan, Ann Arbor, and Andrew Granville of the University of Montreal in Canada tweaked Goldston and Yildirim's technique to show that there are infinitely many pairs of primes at most 12 numbers apart. The result was heartbreakingly close to the Twin Prime Conjecture. Suspecting that it was too good to be true, they decided to double-check Goldston and Yildirim's work.

After two restless nights, Soundararajan and Granville found an error in a routine computation. “It was a little bit surprising, because arguments of this kind are frequently used,” says Soundararajan. “You get comfortable with it, and you don't realize that this case is different.” Goldston spent days trying to patch the hole in the proof, in vain. “It occurs in just about any approximation of this type, and therefore there is no quick fix,” he says.

Goldston believes he can still top previous estimates on prime clumping, but by nowhere near as dramatic a margin as before. “[The original claim] was like Bob Beamon's long jump in the 1968 Olympics, when he beat the world record by 2 feet,” says Brian Conrey, a number theorist at the American Institute of Mathematics in Palo Alto, California. “This would be like finding out he scratched on that jump and beat the record by only an inch.”

7. PALEONTOLOGY

# Life's Diversity May Truly Have Leaped Since the Dinosaurs

1. Richard A. Kerr

To judge by the haul of fossil clams, snails, and sea urchins that paleontologists have retrieved from ancient sea-floor sediments around the world, life has only gotten more and more varied during the past quarter-billion years. But those same paleontologists have become increasingly concerned that their less-than-systematic search for trophy fossils may have misled them about life's inevitably rising diversity (Science, 25 May 2001, p. 1481).

On page 1133 of this issue of Science, a group of paleontologists largely lays those worries to rest. Simply cleaning up the existing fossil record, they report, eliminates their reservations about the reality of rising diversity, at least in the case of the doubling of diversity since the death of the dinosaurs. “I'm convinced this demonstrates there has been a major diversity increase in the” past 65 million years, says paleontologist Richard Bambach of Harvard University, who is not a member of the group. “It did happen.” Now paleontologists can focus on what might have increased diversity lately and on fixing the more numerous problems of the older fossil record.

The potential problem with the younger fossil record has been the so-called Pull of the Recent. The thorough knowledge of animals in the Recent epoch of the last 10,000 years—essentially, living animals—can inflate the trend in diversity by improving the completeness of the younger fossil record far more than that of the older record. The vagaries of fossilization mean that there will always be a time difference between the last known fossil of a species and its actual demise. There can be a missing section in a species' fossil record even if the species is still living. In the case of the coelacanth—the “living fossil” discovered swimming deep off South Africa in 1938—the gap is 65 million years.

Of course, knowing that the coelacanth lives, paleontologists fill in that 65-million-year gap since its last fossil. Thus, the living animal can “pull” its kind forward in the history of life, which increases diversity in the gap between its last fossil and the Recent. But the ancient coelacanth notwithstanding, the Recent's reach back is limited; it is most likely to pull species forward over the younger part of the record, increasing diversity more there than earlier and thus producing an inflated rise in diversity.
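A toy simulation makes the mechanism concrete. The taxon ranges below are invented, not the paper's data; the point is only that extending each living taxon's range from its last fossil to the present raises counted diversity in the youngest intervals and nowhere else:

```python
# Toy "Pull of the Recent": diversity per time interval, counted from
# fossil ranges alone vs. after extending living taxa to the present.
def diversity(ranges, n_intervals):
    """Count taxa whose (first, last) interval range covers each interval."""
    return [sum(first <= t <= last for first, last in ranges)
            for t in range(n_intervals)]

# (first fossil, last fossil) in 5 time intervals; interval 4 = Recent.
fossil_ranges = [(0, 4), (0, 2), (1, 2), (2, 3), (3, 3)]
extant = [True, False, False, True, True]  # known to be alive today

raw = diversity(fossil_ranges, 5)
pulled = diversity(
    [(first, 4 if alive else last)          # fill the gap to the Recent
     for (first, last), alive in zip(fossil_ranges, extant)], 5)

print(raw)     # [2, 3, 4, 3, 1]
print(pulled)  # [2, 3, 4, 3, 3]
```

Only the youngest interval's count changes, which is exactly the inflation paleontologists feared; Jablonski's contribution was to show that, after cleaning up the taxonomy, few real taxa needed pulling at all.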

The obvious test for the Pull of the Recent is to drop the Recent and its living animals from an analysis of diversity trends. But paleontologist David Jablonski of the University of Chicago and his colleagues decided that before they did such an analysis, they needed to correct some mismatches between the way paleontologists have been classifying fossils over the decades and the way biologists classify living animals today.

Taking the class Bivalvia with its oysters, clams, and mussels, Jablonski and colleagues updated the records of 958 genera of living marine bivalves. All the selected living genera had fossils in at least one 5-million-year interval of the record, either the latest 5-million-year interval or earlier ones. They found that many genera present today that had seemed to be missing from the fossil record of the past 5 million years and had been pulled forward by the Recent were actually there all along. Paleontologists and biologists had simply put many species in two different genera. Other species had been lumped into broadly defined genera by paleontologists working over many decades; Jablonski and colleagues moved them into more modern, narrowly defined genera.

Rationalizing fossil classification had dramatic results. When they calculated the diversity trend from the updated record without the Recent, Jablonski and his colleagues found that the Pull of the Recent had been affecting just 5% of taxa. Because the Pull of the Recent operates on so few taxa, it can't be biasing diversity much at all. When the group applied the same approach to the 5-million-year interval just before the end of the dinosaurs, the “Pull of the Danian Stage” (the interval just after) affected more taxa, as might be expected, but still only 13%. “I think we've really cracked the problem,” says Jablonski. “All we really did was vet the data. It's very reassuring.” Given that the bivalves are representative of most of the marine fossil record in their ease of preservation, Jablonski believes that the steep rise in diversity over the past 65 million years is real.

Paleontologist Michael Foote, a colleague of Jablonski's at Chicago, isn't so sure. “They found that the Pull of the Recent for bivalves is substantially less than had been thought,” he says, but “the jury's still out on whether that says the record is OK” at other times and for other animals. Bambach, however, sides with Jablonski, as does paleontologist Douglas Erwin of the Smithsonian Institution's National Museum of Natural History in Washington, D.C. “This puts the Pull of the Recent to bed,” he says. And other problems that plague the older record don't seem to apply here. “You have to believe the [past 65 million years of] diversity increase is probably real,” he concludes.

What could have been driving such a jump in diversity remains unclear. Speculations are numerous. Nutrients washed off rising mountain ranges may have fueled an increase in biological productivity that drove diversification. Or the subdivision of the world as continents split, opening new oceans dammed off from the others, could have created new places for different fauna to appear. The growing chill of the past 100 million years would also have increased the number of climate niches between the poles and the tropics for different animals to live in. Whatever the driver or drivers were, most paleontologists now have more confidence that searching for them is worthwhile.

8. TOXICOLOGY

# E.U. Shifts Endocrine Disrupter Research Into Overdrive

1. Sonja Lorenz*
1. Sonja Lorenz has just completed an internship in Science's Cambridge, U.K., office.

CAMBRIDGE, U.K.—The European Union is embarking on a massive new effort to pinpoint the harmful effects of hormone-mimicking chemicals. Last month, the European Commission launched a collaboration involving 60 labs across the continent to investigate the threat that these substances, primarily pollutants, pose to humans and wildlife. The intent is both to give the E.U. the information it needs to ensure that chemicals are tested adequately for endocrine effects before reaching the market and to flag effects in compounds already out there.

Concern over so-called endocrine disrupters arose in the early 1990s, when studies tentatively linked rising levels of pollutants to declining sperm counts and cancer of the testicles, prostate, and breast in people and to genital malformations in wildlife. However, many of the studies have been controversial. Establishing a cause-and-effect relationship has been a “hot potato, politically and scientifically,” says toxicologist Andreas Kortenkamp of the University of London School of Pharmacy, who coordinates the E.U.'s new Cluster of Research on Endocrine Disruption in Europe (CREDO).

The 4-year-long, $23 million program is meant to complement a substantial amount of research already under way around the world. For instance, many labs are probing the effects of chemicals that mimic or block estrogens, female sex hormones. One thrust of CREDO will be to look hard at compounds that block or behave like androgens such as testosterone, the main male sex hormone. Thus CREDO will “act as a counterbalance” to the stack of findings on estrogen disrupters, says Ulrike Schulte-Oehlmann, an ecotoxicologist at the University of Frankfurt, Germany. Her 13-lab consortium hopes to zero in on invertebrates, perhaps sea urchins or snails, that might serve as “sentries” in polluted environments and as standard test systems for detecting potential effects in higher species.

Another gap in understanding that CREDO will try to fill is the risk posed by bromine-containing flame retardants, used widely in polymers and textiles. These high-production chemicals, some of which bear striking toxicological similarities to known endocrine disrupters such as polychlorinated biphenyls, have been accumulating in aquatic food chains for decades. “This is a warning: We should be concerned about them,” says toxicologist Joseph Vos of the National Institute of Public Health and the Environment in Bilthoven, the Netherlands, who is coordinating this part of CREDO.

The E.U. initiative will enter one controversial area: how hormone-mimicking chemicals interact with each other. Gauging the risks of individual chemicals in the milieu encountered in nature is “a nightmare scenario” for risk assessors, says Kortenkamp. Much work needs to be done to develop proper test methods and assessment strategies that can untangle these risks, he says, particularly at low doses. Once an endocrine disrupter enters the body, in principle it can target any organ having hormone receptors with which it can interact.
“We have to check from top to toe,” says Wolfgang Wuttke, a biomedical researcher at the University of Göttingen, Germany. His lab consortium will focus on known estrogenlike rogues, including pesticides, ultraviolet absorbers in sunscreens, and phytoestrogens used in hormone replacement therapy. The researchers' goal is to reveal to what extent these compounds influence gene expression in nonreproductive organs.

Observers predict that the initiative's megacollaboration credo will bear fruit. “They have the critical mass to advance the field and see what is really important,” says Tuomo Karjalainen, a scientific officer at the European Commission in Brussels.

9. SCIENTIFIC WORKFORCE

# Down for the Count?

1. Jeffrey Mervis

U.S. students are avoiding science degrees, industry is worried about filling high-tech jobs, and graduate programs are overflowing with foreigners. That's the accepted wisdom. But how true is it?

When 250 scientists descended on Washington, D.C., last month to urge federal legislators to boost funding for the physical sciences and engineering, they also made a pitch for training more scientists. They described “a diminishing number of Americans” entering the profession and an inadequate supply to satisfy demand for their services, an “unhealthy trend” according to the coalition of scientific organizations that sponsored their visits. “If there's one killer stat, it's that we've produced a declining number of [bachelor's degree] engineers for the last 10 years,” explains David Peyton, vice president for technology at the National Association of Manufacturers and vice chair of the Alliance for Science and Technology Research in America, which cosponsored the Hill visits.

It wasn't the first time such arguments had been made. The idea that U.S. students are avoiding science has been repeated so often in recent years, by so many scientific organizations, that it has become accepted wisdom.
Most groups also lament the low participation rates of women and minorities in an increasingly diverse society and the country's overreliance on foreign talent. And they want the government to do something about it—and fast. “We face a looming crisis based on the country's inability to train enough U.S.-born talent,” asserts George Langford, a biologist at Dartmouth College in Hanover, New Hampshire, and head of the education committee of the National Science Board, whose task force on the scientific workforce is finishing up a report that is expected to call for a major boost in government training funds (see sidebar). Shirley Ann Jackson, president of Rensselaer Polytechnic Institute in Troy, New York, and the author of two widely cited reports warning of looming holes in the scientific workforce, sees a “gap [that] represents a shortfall in our national scientific and technical capabilities.”

The message is compelling. But how true is it? Statistics collected by the National Science Foundation (NSF) and other authoritative bodies paint a much more nuanced picture of the emerging U.S. scientific workforce. Some of the numbers and trends about enrollments and degrees are at odds with the conventional wisdom, whereas others show a cyclical pattern with both slumps and spurts.

The decline in engineering degrees that Peyton cites, for example, isn't visible in the statistics collected by the Engineering Workforce Commission of the American Association of Engineering Societies. A report issued in March shows that the number of bachelor's degrees in engineering stood at 65,001 in 1993 and 68,648 in 2002—and that the number has changed very little over the decade. Moreover, first-year enrollments jumped by 14% between 1999 and 2001, suggesting that the number of degrees is likely to rise in the next few years.

Langford's “looming crisis” and Jackson's “gap” are more difficult to pin down.
Neither person says when it will hit nor spells out what would be “enough” degrees. The trends upon which their forecasts depend are also far from clear-cut. To pick one broad marker, the number of bachelor's degrees awarded in science and engineering (S&E) rose by 31% between 1980 and 2000, according to federal data compiled by the Commission on Professionals in Science and Technology (CPST). For the natural sciences (i.e., excluding the social sciences and psychology) and mathematics, it's up 15%. Even in computer science and engineering the number of degrees awarded is up 38% since 1980, although it's been on a roller coaster ride during those 20 years (see sidebar, p. 1071). And those students are overwhelmingly U.S. citizens. Moving up the academic ladder, NSF figures analyzed by the RAND Corp. show that the number of doctoral degrees in science, engineering, and health awarded to U.S. citizens has risen by 22% over the same 2 decades, including an 11% rise over the past 10 years.

To be sure, there is consensus that precollege training for prospective scientists and engineers needs to be improved and that African Americans, Hispanics, and Native Americans are seriously underrepresented among those pursuing S&E degrees. And some of the enrollment numbers suggest worrisome shifts in the composition of the scientific workforce being trained by U.S. universities. To cite an extreme example, a new NSF report on graduate enrollments shows that the total pool of U.S. citizens and permanent residents pursuing S&E degrees has dropped by 10% since peaking in 1993, and the number of white U.S. graduate students in S&E has dropped by 20%. The share of graduate students in computer science and engineering with temporary visas has risen by 66% in the past half-dozen years, notes the same report, with foreign-born students now making up almost half the graduate population in those fields. But are these enduring trends?
Moving the baseline back by only 3 years wipes out the decline among all U.S. citizens pursuing graduate degrees and cuts in half the size of the shrinkage among males. And the attractiveness of U.S. higher education to foreign-born graduate students is notoriously volatile, subject to external events—such as Tiananmen Square or 11 September—that have nothing to do with the scientific workforce. In addition, trends vary by discipline—patterns for the biological sciences differ markedly from those for the physical sciences, mathematics, and engineering—and are easily influenced by business cycles.

It is noteworthy that the much-cited data on student enrollment and degrees, the supply side of the equation, say nothing about the demand for scientific talent. Nor do they constitute evidence of a gap between supply and demand, notes John Marburger, the president's science adviser. “I've been hearing a lot about labor shortages since I came to Washington [in the fall of 2000],” says Marburger. “But I don't think that there's a good model that can tell us what's going on in the workforce and what the government can do to help.” Most of the current debate, he adds, “is based not on data but on anecdotes, or in response to pleading by individual sectors.”

## By the numbers

What do we know for certain about the U.S.-born-and-bred workforce? For starters, NSF figures show that the total number of S&E Ph.D.s awarded to U.S. citizens held fairly steady throughout the 1980s and rose gradually through the 1990s before leveling off at the end of the decade. The 2000 figure stood at 16,390, according to an analysis by RAND, compared with 13,438 in 1980. The data are also encouraging for those concerned about participation rates by underrepresented groups. The number of white women (all U.S.
citizens) earning doctoral degrees has almost doubled over that span, and ethnic and racial minorities, a group that consists of African Americans, Hispanics, and Native Americans, have done even better, nearly tripling their presence. Admittedly, the numbers for minorities are still small, only 1540 in the year 2000. But their share of the pot has risen from one in 25 to one in 10. (Including Asian Americans boosts that to one in six.)

The overall story doesn't change as one moves down the academic food chain. Graduate enrollments in S&E for U.S. citizens and permanent residents have risen by 15% since 1980, according to the latest figures through 2001, just released by NSF. Graduate enrollment in computer sciences, a field of special concern to many industrial leaders, rose for the fifth straight year, standing almost 20% higher than in 1990 and some 130% greater than in 1980. Likewise, the number of bachelor's degrees in mathematics and computer science has more than doubled since 1980, although the figure has cycled up, down, and back up. And undergraduates aren't forsaking science for other fields: The overall proportion earning bachelor's degrees in all S&E fields has held steady, at about 32%, for the past 35 years.

The gender gap is also narrowing. In 2000, women for the first time earned more bachelor's degrees than men in all fields of S&E. That's almost true within the natural sciences and mathematics as well, where the female-male ratio is 47:53. However, when engineering is added, the ratio drops to 39:61. The same pattern can be seen at the graduate level: In 2001, women occupied 48% of all graduate slots in the sciences (natural, behavioral, and social sciences), up from a 34% share 2 decades ago. Even within engineering, their presence has grown from less than 9% in 1980 to 20% in 2001. And the absolute number of women engineering grad students is up 50% since 1990 and has nearly quadrupled since 1980.
The data do seem to reinforce the standard wisdom on some points. The increased presence of foreigners in U.S. graduate departments, for example, is real. NSF figures show that noncitizens have increased their share of science, engineering, and health Ph.D.s from 22% to 37% since 1980. And the latest figures on graduate enrollments show a rise of 10% a year for the past 3 years. But that sudden rise comes after a decline for much of the 1990s, leaving the number of Ph.D. degrees awarded noncitizens in 2000 roughly the same as it was in 1991.

It's also not clear how long these trends might continue. Many observers speculate that the tightened visa procedures in place since September 2001, combined with the recent war in Iraq, could turn off many prospective graduate students from abroad. An abrupt falloff could cause strains within some university departments. On the other hand, some observers believe that such a decline might create opportunities for U.S.-born students.

It is true that the number of Americans pursuing advanced degrees is dropping in some fields. The number of Ph.D.s in the physical sciences—astronomy, chemistry, and physics—has dropped by 14% since 1993. Then again, that's also true for noncitizens, who experienced a 10% drop over the same period. Some of the statistical gains by women are due to a shrinking male population: The number of white men who earned science, engineering, and health Ph.D.s in 2000 was 15% lower than 20 years ago. The rise in the number of engineering students also masks a slight drop in the proportion of women (but not the absolute numbers) in the classroom, as computer engineering—the fastest growing specialty—attracts a heavily male enrollment.

So far, however, there is no evidence that these trends have created a shortage of scientific workers, says economist William Butz, the former head of social and behavioral science research at NSF, now at RAND's Science and Technology Policy Institute.
In a talk last fall entitled “Is There a Shortage of Scientists and Engineers? How Would We Know?” Butz applied five definitions of “low” production of a commodity and found that only one—when production doesn't satisfy market demand—has a self-correcting mechanism. The mechanism, according to traditional economic theory, is rising wages and declining unemployment rates. After looking at the available data, Butz concluded that neither phenomenon is occurring in the S&E workforce. “The data at hand give no indication of the kind of earnings premiums for scientists and engineers that would signal the existence of a shortage,” he said about a comparison of lifetime earnings for S&E Ph.D.s and those with professional degrees. And the fact that unemployment rates rose for S&E graduates during a period when the overall U.S. unemployment rate was falling, he pointed out, “is a strong indicator of developing surpluses of workers, not shortages.”

## Shortage redux

Cries of a scientific shortage are not new. Sputnik forced U.S. leaders to ask, perhaps for the first time, whether the country had a sufficient supply of technically adept workers. The response, the National Defense Education Act of 1958, launched a generation of scientists who found employment in the baby-boomer-propelled expansion of academic and industrial research of the 1960s and early 1970s. Then came a decade of economic malaise and a tough labor market, which elbowed workforce issues to the sidelines.

Former NSF Director Erich Bloch revived the idea of a looming shortage after taking office in 1984. The number of bachelor's degrees awarded in the physical and biological sciences, math, and engineering was near a peak, driven largely by demographics. Bloch and other influential voices in the community projected a drop in degree production against rising demand and came up with a massive predicted shortfall over the next decade and beyond. They believed that the federal government needed to address it.
The conclusion dovetailed with Bloch's campaign to boost the agency's budget, then around $1.5 billion.

Although NSF's budget rose in the late 1980s and early 1990s, the idea that the nation was facing a shortage of scientific talent collapsed after a 1992 congressional inquiry pointed out its weak underpinnings. “The NSF models did not anticipate a deep economic recession [or] the end of the Cold War,” notes a National Academy of Sciences report based on a 1998 workshop on scientific labor forecasting. “They also failed to account for the market mechanisms that operate to bring supply and demand in balance.”

The latest drive to boost the workforce springs from two sources. One is a 1996 report by the Department of Commerce that projected the nation would need 2 million more information technology workers over the next decade than the number of IT graduates coming out of U.S. universities. This report, in turn, reflects concerns raised by the Information Technology Association of America (ITAA) about a pending labor shortage.

But the Commerce Department results have since been largely discredited by workforce analysts. “We got hammered by the General Accounting Office, and rightly so,” says a department official who helped with the report. “Our approach was way too simple.” Demographer Michael Teitelbaum, who oversees several workforce initiatives for the Alfred P. Sloan Foundation, not only criticizes the survey's methodology but also scolds lobbyists from research universities for climbing aboard the ITAA bandwagon in hopes of winning more federal spending in IT-related fields.

The second impetus is a very real, and widening, disparity between federal funding for the biological and the physical sciences. This year's $27 billion budget for the National Institutes of Health (NIH), already the biggest kid on the academic-research funding block, signaled the end of a 5-year doubling. Although science groups were glad to see Congress doling out more research dollars to NIH, they argued that because all fields are interdependent, such a growing imbalance is bad for all of science. To remedy the situation, advocates settled on a common vehicle—a proposal to double NSF's budget.

As pro-science groups joined forces with leading high-tech industrial firms, they broadened their message to include workforce issues as well as funding. “Companies care about federal support for research,” says Peyton, “but from a business standpoint the personnel issue is much more pressing. After all, these are the people you'll be hiring in the next 1 to 3 years.”

The campaign has been successful—to a point. Late last year, President George W. Bush signed into law a bill that spells out a 5-year doubling path for NSF, but it contains only guidance, not money. And the president's current budget request for NSF calls for only a 3.5% increase in 2004, a far cry from the 15% annual rises that would be needed to double NSF's budget by 2008. To keep the issue alive, vigilant lobbyists are peppering meetings, congressional hearings, and other forums with anguished cries about the state of the scientific workforce. Indeed, workforce initiatives are popping up like spring flowers in Washington science policy circles.
A federally funded coalition of companies, government officials, and professional societies called Building Engineering and Science Talent (BEST) last fall warned of a “quiet crisis” based on the fact that “American colleges and universities are not graduating enough scientific and technical talent to step into research labs, software centers, refineries, defense installations, science policy offices, manufacturing shop floors, and high-tech start-ups.” Jackson, the author of the BEST report, issued a similar warning in March under the auspices of the Government-University-Industry Research Roundtable, an unusual hybrid body staffed by the U.S. National Academies. The report, Envisioning a 21st Century Science and Engineering Workforce for the United States, describes “a shrinking workforce [and] an unprecedented labor shortage.”

Over the next several months, top government advisory groups will offer their analyses. A task force of the National Science Board will shortly deliver a report asserting that the U.S. government has the responsibility to ensure an adequate supply of technically trained workers and suggesting several ways to meet that need (see sidebar, p. 1073). And the President's Council of Advisors on Science and Technology (PCAST) has just established a task force, headed by former Microsoft COO Robert Herbold, to see if sufficient numbers of U.S. students are being trained for scientific careers and if universities have the resources to meet the challenge. The White House Office of Science and Technology Policy has created an interagency group to gather up the numbers.

In the meantime, Teitelbaum doesn't expect the community to stop beating the drums for increased federal support to reverse the alleged dearth of scientists and engineers.
“It's a hardy Washington perennial,” he says, “even if there's no credible, quantitative evidence to back it up.” In the end, its staying power may derive from the fact that the debate rests not so much upon numbers as upon an abiding faith in the value of a robust scientific enterprise. And that faith translates into the need for scientists to be fruitful and multiply regardless of current market conditions. “Doesn't everybody agree that everybody should be trained as an engineer?” quipped PCAST co-chair Floyd Kvamme at a recent meeting.

Jackson summarized that imperative last month at a 2-day science and technology colloquium sponsored by the American Association for the Advancement of Science (which publishes Science). “This country needs more scientists in the pipeline,” she said. “And we also need to create the right climate to generate that [demand for] scientists.”

10. SCIENTIFIC WORKFORCE

# Viewed From 1986, It's Mostly Downhill

1. Jeffrey Mervis

For most of the world, 1986 evokes a year of science-related tragedies. In January the space shuttle Challenger disintegrated off the Florida coast, and in April the Chornobyl nuclear power plant exploded. But 1986 was also a peak year for the production of bachelor's degrees in many fields within the natural sciences and engineering in the United States. It was the high point of a spike that interrupts what otherwise has been a steady progression toward a doubling of science and engineering degrees over the past 35 years (see graphs below).

The peak is a mystery to demographers, who note that it occurred in the midst of a 15-year slide in the number of college-age students that followed the end of the postwar baby boom. Some point to the arrival of the personal computer, which triggered interest in computer science. Whatever the reason, advocacy groups claim—correctly—that graduation rates for some scientific disciplines have declined since 1986.
In making its case for more research funding, for example, the Alliance for Science and Technology Research in America cites a 29% drop from 1986 to 2000 in the number of bachelor's degrees in mathematics, a 22% drop in engineering, and a 15% decline in the physical sciences. For computer science, the drop is 12%.

But those numbers represent a distorted snapshot of the scientific pipeline. For the broader category of the natural sciences—which includes biology as well as the physical sciences—and mathematics, bachelor's degree production in 2000 is up 28% over the 1986 figure. The rate of decline or increase depends on where you peg its starting point. Simply shifting the baseline back 5 years for the same set of degrees produces a dramatically different picture, thanks in part to a 180% increase in computer science degrees between 1981 and 1986. In computer science and engineering, for example, the 20-year trend shows a 23% increase. For mathematics, starting 5 years earlier would show similar levels in 1981 and 2000. Only in the physical sciences, which award one-fifth the number of bachelor's degrees as computer science and engineering, has the 2000 enrollment stayed below 1981 levels.

11. SCIENTIFIC WORKFORCE

# Report to Argue for More Training Funds

1. Jeffrey Mervis

Although many groups have tried to make the case that the supply of scientists is not keeping up with demand (see main text), the members of the National Science Board's Task Force on National Workforce Policies have taken a different approach to the issue. Indeed, they canceled a study to examine industrial demand for scientists in the course of their 2½-year study. Instead, they have been propelled by a conviction that the country needs more U.S. citizens going into science and engineering because those graduates will benefit the nation in countless ways.
“There aren't any new data presented, but we have a wealth of anecdotal evidence” pointing to a shortage of domestic scientific talent, says Joseph Miller, Corning's chief technology officer and chair of the task force, which has held a series of open meetings to discuss a long-awaited report that it plans to present to the board next week. “A healthy scientific enterprise leads to a strong economy,” adds Miller, a polymer engineer who spent 35 years at DuPont before joining Corning 2 years ago. “And a strong science and technology workforce is essential to that enterprise.”

The report is expected to recommend that the federal government offer “substantial new support” to make science and engineering degrees “more attractive to students.” Its suggestions will include more scholarships and higher paying stipends for students and more institutional funding for a range of programs aimed at attracting and retaining U.S.-born-and-bred students. The goal is to produce “more individuals with competencies that industry needs,” says George Langford, a biologist at Dartmouth College in Hanover, New Hampshire, who heads the board's education committee.

The task force originally planned to commission a study of industrial demand for scientists and engineers. But Miller says the idea was abandoned because “assessing the demand side is always very difficult … it can change dramatically in a relatively short time.” That left the task force, formed in the fall of 2000, free to focus on what the government should do to lure more U.S. citizens into the field.

12. TAIWAN-CHINA COLLABORATION

# A Bridge Over Troubled Waters

1. Dennis Normile

Researchers from Taiwan and the mainland have hit scientific pay dirt with the first—and so far the only—collaboration between two institutions across the Taiwan Strait.

TOKYO—A hot campaign issue in Taiwan's presidential election in March 1996 was whether the island should drop its long-held objective of reuniting with the mainland and formally declare its independence. As a warning to what it regards as a renegade province, China staged military exercises along the Taiwan Strait and fired test missiles into nearby waters.

It was hardly fortuitous timing for physicists planning the first-ever institutional-level scientific collaboration across the strait. But it didn't deter Chang Chung-Yun, a physicist at the University of Maryland, College Park, then on sabbatical at the Institute of Physics of Taiwan's Academia Sinica. “I was scared because the missiles were being launched when I flew from Taipei,” admits Chang, who was born on the mainland but is now a U.S. citizen. It was his idea to get Taiwanese scientists together with researchers at the Chinese Academy of Sciences' Institute of High Energy Physics (IHEP). That month, the two institutions signed a memorandum of understanding to work together on the Taiwan Experiment on Neutrinos (TEXONO).

Seven years later, political relations between Taiwan and the mainland still oscillate between tepid and boiling over. But scientific ties are warming nicely, thanks partly to TEXONO, which just published its first result, on the neutrino magnetic moment, in Physical Review Letters. “The science is modest,” admits Henry Tsz-king Wong, who heads the collaboration for Taiwan. “But we've built bridges for future collaborations.” Indeed, other groups are preparing to cross those bridges. And researchers hope good science will promote friendlier politics.
## Mutual benefits, barriers

Wong says geographic proximity, a common language and culture, and practical reasons mean “it really makes sense for us to work together.” He says Taiwan's smallish scientific community needs the mainland's numerous researchers, whereas Taiwan has a cadre of experienced, mid-career scientists missing in China because of the Cultural Revolution. Maryland's Chang adds that pairing Taiwan's world-leading electronics know-how with the mainland's rapidly advancing industrial capabilities should be “mutually beneficial” in developing experimental devices and instrumentation. The Chinese Academy of Sciences and Academia Sinica even trace their roots to the same institution formed in 1928 on the mainland.

But political barriers have hindered scientific ties. Li Jin, an IHEP physicist who heads the mainland side of the TEXONO collaboration, recalls that during a 1980 stint at Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, researchers from Taiwan and the mainland “did not even want their pictures taken together.” But by the late 1980s, unofficial scientist-to-scientist collaborations were producing joint publications—and discovering pitfalls. A 1998 Science paper on sponge fossils co-authored by Li Chia-Wei, a biologist at Taiwan's National Tsing Hua University in Hsinchu, and Chen Jun-Yuan, a paleontologist at the mainland's Nanjing Institute of Geology and Paleontology, triggered denunciations on the floor of the Taiwan legislature and calls for Li's funding to be terminated. His crime? His institutional affiliation included the address “Taiwan, China,” instead of the official “Republic of China.” “I was put in a very difficult situation,” says Li, now director of the National Museum of Natural Science in Taichung, Taiwan.

Lee Shih-Chang, head of the high-energy physics group at Academia Sinica's Institute of Physics, started nurturing ties to IHEP in the early 1990s to strengthen his institute's experimental program.
With funding from Taiwan's National Science Council, Lee brought senior mainland experimentalists to Taiwan to help develop experiments for accelerators at Fermilab and at CERN in Europe.

But it was Maryland's Chang who saw the opportunity to take the visiting-scientist cooperation to the next level: formal ties between institutions. He proposed that IHEP and the Institute of Physics form a joint team to develop a detector that would be set near one of Taiwan's nuclear power plants for studying neutrinos. The investment would be modest but large enough to require institutional backing on both sides. A carefully designed experiment could make an important contribution in one of the hottest areas of particle physics. “The project offered both scientific and sentimental benefits,” Chang says.

Taiwan agreed to fund the project in 1996. But the mainland's National Natural Science Foundation balked over how the institutes would be identified. A compromise (identifying Academia Sinica as being in Taiwan, instead of the “Republic of China,” and IHEP as being in China, instead of the “People's Republic of China”) led to mainland financial support. Later, several other institutions on both sides of the strait joined the project.

But there were more hurdles. Mainland scientists visiting Taiwan need to get an exit permit from the mainland as well as a visa from Taiwan, a process that takes at least 4 months and is subject to unexpected delays. The Taiwan side provided 90% of the $700,000 to build the detector, yet many of the critical components were produced on the mainland. This meant more approvals to move both funds and high-tech materials across the strait. In several cases, Wong says, the only way to convince “paper handlers” was to visit their offices and demand to be shown the legal basis for denial of a permit.

But perseverance paid off. In June 2001, the team finally started taking data at a detector set up 28 meters from the core of a reactor at the Kuosheng Nuclear Power Station 30 kilometers north of Taipei. Occasionally, an ephemeral neutrino produced by the reactor hits an electron in a 1-kilogram germanium crystal at the heart of the detector and produces an electronic signal. By analyzing those signals, the team produced the best upper limit yet on the neutrino magnetic moment, an indicator of the particle's inherent magnetism.

Petr Vogel, a neutrino physicist at the California Institute of Technology in Pasadena, says the result “is not earth-shattering, but it [is] the best limit on an important quantity.” Given the budgetary constraints and the fact that this is the first particle physics experiment in Taiwan, the TEXONO collaboration “performed admirably,” he says.

Although no other collaboration matches the magnitude of TEXONO yet, other joint efforts are on the way. More and more Taiwan institutions are hosting visitors from the mainland, notes Lee, who predicts that more collaborative research will follow. A physics group at Taiwan's National Central University in Jung-Li is building a detector that will be used in experiments at IHEP's electron positron collider in Beijing. And James Shen, director of Academia Sinica's Institute of Molecular Biology, hints at a couple of TEXONO-style collaborations under discussion in the life sciences, although he doesn't want to disclose details prematurely.

For those who aspire to follow in TEXONO's footsteps, Wong advises, “Don't get frustrated, and keep a sense of humor.” And although IHEP's Li doesn't see the political tensions between Taiwan and the mainland dissolving anytime soon, he thinks that the bureaucratic obstacles “are getting smaller and smaller.” For both men, a chance to do good science is reason enough to reach across the strait.

13. PARTICLE PHYSICS

# Calculating the Incalculable

1. Adrian Cho is a freelance writer in Grosse Pointe Park, Michigan.

After decades of struggle, physicists may soon be able to predict the properties of matter made of quarks. The advance could change the face of particle physics

Particle physicists know plenty about frustration. For 30 years they have known the exact form and origin of the force that binds fundamental particles called quarks into protons, neutrons, and myriad other particles. Yet the theory of that “strong force” is so complicated that researchers have struggled to perform many basic calculations, such as precisely accounting for the mass of the proton. For theoretical physicists, it's as if they have a theory that proves 5 plus 5 equals an even number, but no way of calculating whether that number is 10 or 12.

But that may be changing. Thanks to several key insights and the help of powerful computers, some theoretical physicists say they are on the verge of cracking the theory of the strong force, which is known as quantum chromodynamics (QCD). They have already performed a first set of high-precision calculations, and within a few years they hope to make a slew of detailed predictions that will open new avenues of research, says Peter Lepage of Cornell University in Ithaca, New York. “Anyone who can take the theory errors, which are typically 20%, and shrink them to 2% is going to have a huge impact,” Lepage says.

Such calculations should help particle physicists battle an even greater source of frustration: the entire theory of fundamental particles. The so-called Standard Model combines the theory of the quark-binding strong force with theories of the electromagnetic force, which ties electrons to nuclei in atoms, and the weak force, which causes a type of radioactive decay. The model neatly accounts for every particle interaction studied so far with huge particle-smashing accelerators. Yet it also leaves basic questions unanswered, such as why the fundamental particles have the masses that they do.

The ability to make high-precision strong-force calculations could help experimenters find what lies beyond the Standard Model, says Persis Drell, an experimentalist at the Stanford Linear Accelerator Center (SLAC) in Menlo Park, California. “It would change how we target our measurements,” Drell says. “It would be phenomenal.”

## Forces strong and weak

According to the Standard Model, ordinary matter consists of electrons and two types of quarks, the up quark and the down quark, which combine to form the protons and neutrons in the nuclei of atoms. A wispy particle called the neutrino emerges from the nucleus during a particular type of radioactive decay. Nature copies this collection of four particles—the up and down quarks, the electron, and the neutrino—twice over in heavier particles that generally flit into existence only in high-energy collisions at accelerator laboratories. The heavier quarks are known as strange and charm, top and bottom.

These particles interact by swapping yet other particles that convey the various forces. For instance, two up quarks and a down quark embrace one another to form a proton by exchanging particles called gluons. Massive particles called W and Z bosons convey the weak force, which lies at the heart of the Standard Model, as it governs how a heavier quark decays into a lighter one. For example, a bottom quark transmogrifies into a charm quark by emitting a W boson, which may then turn into an electron and a neutrino.

To probe the limits of the Standard Model, physicists tabulate the rates at which heavier quarks decay into lighter ones in an array of numbers known as the CKM matrix. If the Standard Model describes all the particles there are, certain combinations of these rates must add up to 100%. So researchers are striving to carefully measure these numbers in experiments at SLAC; Cornell University; the High Energy Accelerator Research Organization (KEK) in Tsukuba, Japan; and the Fermi National Accelerator Laboratory in Batavia, Illinois. If the numbers don't add up as predicted, then new particles may loom just over the high-energy horizon.
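The “add up to 100%” requirement can be written down explicitly. As one well-known example (not spelled out in the article), the first row of the CKM matrix must satisfy a unitarity condition: the probabilities for an up quark to couple to each of the three down-type quarks must sum to one:

```latex
% First-row unitarity of the CKM matrix: if the Standard Model is
% complete, the squared magnitudes of the up quark's three couplings
% sum to exactly 1.
\[
  |V_{ud}|^{2} + |V_{us}|^{2} + |V_{ub}|^{2} = 1
\]
```

A measured sum that came out significantly different from 1 would be exactly the kind of discrepancy that points to particles beyond the Standard Model.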

In the best of all universes, physicists could measure the CKM numbers by watching individual quarks decay into lighter ones. Unfortunately, the strong force is so strong that it's impossible to isolate a single quark. Every quark must be bound by a tangle of gluons to other quarks or to an antimatter antiquark. Experimenters can only measure the decays of the resulting composite particles and then try to extract the CKM numbers by using theoretical calculations to filter out the effects of the extra quarks and gluons. And by its very nature, the strong force makes such calculations exceedingly difficult.

## Brown muck

Consider a particle called the B meson, which consists of a heavy bottom quark and a light up or down antiquark. Crudely speaking, the quark and antiquark are bound by gluons like two bricks held together with a little mortar. In reality, however, the meson is far more complex. The gluons themselves exchange gluons to form a roiling tangle. And thanks to the Heisenberg Uncertainty Principle, quark-antiquark pairs constantly pop into and out of existence, adding to the complexity of the mess surrounding the bottom quark, a mess that goes by the technical term “brown muck.”

To measure the rate at which bottom quarks decay into charm quarks, physicists need to understand the brown muck. Ordinarily, they would do that by calculating the properties of more and more complicated arrangements of quarks and gluons, using a process known as perturbation theory. But that approach won't work for QCD, says Mike Creutz of Brookhaven National Laboratory in Upton, New York. “Every time you add a term, it gets bigger and bigger,” Creutz says, so the approximations never home in on a solution.
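Creutz's point can be seen in a toy numerical sketch (schematic only: all expansion coefficients are set to 1, which real QCD does not do). A perturbation series is a sum of powers of the coupling g; when g is small the terms shrink and the partial sums settle down, but when g is large each new term is bigger than the last:

```python
def partial_sums(g, n_terms):
    """Partial sums of the schematic series 1 + g + g**2 + ... (coefficients set to 1)."""
    total, sums = 0.0, []
    for n in range(n_terms):
        total += g**n
        sums.append(total)
    return sums

weak = partial_sums(0.1, 6)    # small coupling: corrections shrink, sums converge
strong = partial_sums(1.5, 6)  # strong coupling: each term is bigger, sums run away
print(weak[-1], strong[-1])
```

For the weak coupling the last correction is tiny; for the strong one it is the largest yet, so adding more terms never homes in on an answer.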

To get around this problem, theorists use powerful computers to simulate the most probable arrangements of gluons and quarks inside the particle and then use these configurations to determine the particle's properties. But even the most powerful computer bogs down if it tries to keep track of the quarks and gluons everywhere within the particle. So theorists simplify things by chopping up the continuous space and time in which the particles exist into a four-dimensional grid of discrete points called a lattice. Just as chess pieces must stand on definite squares, not just anywhere on a chessboard, the quarks and gluons in the computer model may reside only on the points of the lattice. That restriction reduces an intractable mess of an infinite number of variables to an extremely hard problem containing only tens of millions of variables. The approach is known as lattice QCD, and it is almost as old as QCD itself.
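The "tens of millions of variables" figure is easy to reproduce with a rough count. Assuming a 32⁴ lattice (a plausible size for such simulations, chosen here for illustration) and counting only the gluon field:

```python
# Rough count of gluon-field variables on a lattice, showing how discretization
# turns an infinite problem into a huge but finite one.
L = 32                  # lattice points per spacetime dimension (assumed size)
sites = L**4            # points in the 4D grid
links_per_site = 4      # one gluon "link" per direction at each site
params_per_link = 8     # an SU(3) matrix carries 8 real parameters

n_variables = sites * links_per_site * params_per_link
print(n_variables)  # roughly 33.5 million variables
```

Quark fields and finer lattices push the count higher still, which is why lattice QCD keeps supercomputer builders busy (see sidebar).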

Lattice QCD has yielded many insights into the nature of the strong force (see sidebar). Yet even the lattice computations have remained so difficult that for decades theorists have struggled to perform realistic calculations of particle properties. Now, however, they have achieved that elusive goal—or so say some physicists.

## Cleverer quarks

Thanks to conceptual advances made in the 1990s, it's now possible to perform lattice QCD calculations that are accurate and precise to within a few percent, more than two dozen theorists from four different collaborations argue. To prove it, they teamed up to calculate nine well-known particle properties and showed that their results agree with measured values far better than previous calculations did. The researchers posted their results on the Web last month (www.arxiv.org/abs/hep-lat/0304004) in a paper they have submitted to Physical Review Letters.

To achieve such precision, researchers had to overcome several sticky problems. Most important, the researchers included all the quark-antiquark pairs popping into and out of existence in the brown muck, something that for decades theorists had simply left out of the calculations, says Doug Toussaint of the University of Arizona in Tucson. Including such quarks is tough because, for mathematical reasons, moving quarks from real space into chopped-up lattice space makes the number of quarks double for each dimension of the lattice. That means the simplest lattice QCD theory contains 16 times as many quarks as researchers would like. Theorists have concocted several methods for weeding out redundant quarks, but they generally require setting the lightest quark masses many times too high, or they gobble up too much computing power to be practical.

To avoid those problems, the researchers in the collaboration souped up an older method of thinning the throngs of quarks. First, they spread each quark over four neighboring lattice sites—an old trick that reduces the number of quarks by a factor of 4. They compensated for the remaining redundancy by prying open their calculations and replacing a few key mathematical quantities with their fourth roots, says Christine Davies of the University of Glasgow, U.K. “People who have committed a lot of effort to these other formalisms are kind of horrified when they find out that it could be this easy,” Davies says.
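The bookkeeping behind this fix can be sketched in a few lines (numbers only; in the actual simulations the fourth root is taken of the quark determinant, not of a simple count):

```python
# Naive lattice fermions: the quark count doubles once per spacetime dimension.
dimensions = 4
doublers = 2 ** dimensions       # 16 copies of every quark

# Spreading each quark over four neighboring sites (the old "staggered" trick)
# cuts the redundancy by a factor of 4.
after_staggering = doublers // 4  # 4 copies remain

# The remaining factor of 4 is what the fourth root is meant to remove:
# the quark determinant det(D) is replaced by det(D)**(1/4) -- the step
# that critics such as Luscher find suspect.
print(doublers, after_staggering)
```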

## But is it right?

Some theorists argue, however, that the various fixes are too good to be true. “On the fundamental level, there is some doubt whether this formulation is valid at all,” says Martin Lüscher of CERN, the European particle physics laboratory near Geneva. In particular, injecting the fourth roots plays havoc with the quarks, Lüscher says, because it lets widely separated quarks instantaneously affect each other. That would violate the principle that no physical influence can travel faster than light, he says. Brookhaven's Creutz notes that to make their simulations work, the researchers still have to assume that the up and down quarks are a few times more massive than they really are. They then extrapolate to the right masses after the computer has done its part, and that extrapolation can be very tricky, Creutz says.

Others have adopted a wait-and-see attitude. Despite the new method's unresolved problems and the uncertainties they cause, says Steve Sharpe of the University of Washington, Seattle, it's still worth pressing ahead with the calculations. “You make progress by proceeding even though you don't have everything completely under control,” Sharpe says.

For their part, the researchers who made the calculations see their work as a first step. Next, they plan to calculate the properties of D mesons, which contain a heavy charm quark and which experimenters will study in detail at Cornell in the next few years. They then hope to tackle B mesons, which are being studied at SLAC and KEK and may offer the best chance to spot a hole in the Standard Model. If all goes well, their calculations could point the way to new particles and a more complete theory of matter. For theorists, the satisfaction would be incalculable.

14. PARTICLE PHYSICS

# Monster Machines

1. Adrian Cho is a freelance writer in Grosse Pointe Park, Michigan.

Lattice quantum chromodynamics (QCD) has explained why it's impossible to isolate a quark, revealed the deep symmetries of the strong force, and predicted the temperature at which protons and neutrons melt. Such advances required both keen physical insight and ever better computers, says Norman Christ of Columbia University in New York City.

Researchers from Columbia University; Brookhaven National Laboratory in Upton, New York; the Japanese Institute of Physical and Chemical Research (RIKEN); and the United Kingdom are developing computer chips tailored for lattice QCD. Within a year, they plan to build a machine at Columbia University that can perform 1.5 trillion calculations per second, more than 1000 times faster than a high-end PC. Still faster machines will follow at Brookhaven and the University of Edinburgh, U.K. Huge clusters of smaller computers may provide nearly as much power at Fermi National Accelerator Laboratory in Batavia, Illinois, and Thomas Jefferson National Accelerator Facility in Newport News, Virginia. In the meantime, researchers at the University of Rome and the German Electron Synchrotron (DESY) in Zeuthen are developing competing machines. And Japanese researchers at the University of Tsukuba and the nearby High Energy Accelerator Research Organization (KEK) have supercomputers dedicated to lattice QCD.

15. COGNITION

# How the Mind Reads Other Minds

1. Carl Zimmer*
1. Carl Zimmer is the author of Evolution: The Triumph of an Idea.

Understanding what others are thinking is a human exclusive. Now researchers are tracking how the brain performs this feat and speculating about how it evolved

Imagine a boy sitting on a couch about to unwrap a chocolate bar. His mother announces that she's taking him to soccer practice. He tucks the chocolate under the couch for safekeeping and leaves. A few minutes later, his sister comes into the room in search of her teddy bear. When she looks under the couch she is surprised to find an unopened chocolate bar, which she then hides behind a bookshelf. When her brother comes home, drooling for chocolate, where will he look?

This may not seem like a difficult question: It's glaringly obvious that the boy will look under the couch. But to get the right answer, you have to perform an extraordinary mental feat: understand the boy's intentions and beliefs—regardless of their accuracy—and use that information to predict his action. And the skill doesn't come easily: Until the age of 5 or so, children answer that the boy will look behind the bookshelf, where they know the chocolate bar to be.

Of all the species on Earth, only humans possess what researchers call a “theory of mind”—the ability to infer what others are thinking. “It is extremely critical, because it underlies teaching, deception, propaganda, all these sorts of things,” says Christopher Frith of University College London. “To get an idea from one brain into another, that's a deeply mysterious thing we do.”

It has been 25 years since scientists first began to seriously investigate this ability to read minds. Today psychologists can chart its development, and they link it to children's acquisition of language and appropriate social behavior. Neuroscientists have tried for a decade to pinpoint the regions of the brain required for a theory of mind, and their results are now converging on a distinct network.

This work may ultimately reveal how the brain's social wiring evolved. On a more practical level, it is also allowing researchers to pinpoint some of the neurological signs of autism, in which a breakdown in theory of mind makes it difficult to understand other people's emotions and motives. Researchers hope that they will be able to translate these discoveries into more accurate diagnoses of autism and point to more effective treatments.

The psychologists David Premack and Guy Woodruff, who coined the term “theory of mind,” believed that chimpanzees and perhaps other primates could read intentions. Subsequent research has shown that primates are remarkably sophisticated in their relationships: They can deceive, form alliances, and bear grudges for days. Chimpanzees can even tell what another chimpanzee can and cannot see. But after decades of studies, no one has found indisputable signs that chimps or other nonhuman primates have a theory of mind.

Humans aren't born with a full-blown theory of mind. Children from cultures around the world acquire the skill at roughly the same age, and after reaching the same developmental milestones, which has led many psychologists to conclude that the theory of mind is a specific adaptation separate from general-purpose intelligence. “There might be a dedicated brain system,” says Frith. Researchers are still working out how theory of mind develops in the brain. “The exciting thing that's going to happen in the next few years is scanning children doing [theory-of-mind] tasks at different stages,” says Frith.

Autism provides more evidence that theory of mind is somewhat independent of other skills. Autistic people do poorly on problems like the chocolate bar test. One powerful demonstration uses animations of geometrical shapes in motion. Nonautistic people easily recognize when these shapes are acting like people—when two triangles coax a circle out of a square, for example. But even highly intelligent autistic people can't distinguish intentional movements from random collisions. Autism, some researchers theorize, is a selective deficit of theory of mind.

Specialized circuitry for reading people's minds hints that this ability played an important role in human evolution. According to Paul Bloom, a psychologist at Yale University in New Haven, Connecticut, the evolution of language would have been impossible without it. “Our species has come to possess this powerful theory of mind, and once you possess that, you are then capable of understanding people referring to things,” says Bloom. His research suggests that theory of mind makes learning words a fast, efficient process because children can quickly recognize what their parents are trying to teach them.

## Minds in the brain

In the mid-1990s, researchers using brain-imaging technology began searching for the biological basis of theory of mind. The premise was simple: Scan people while they perform mental tasks differing only in that one demands the use of theory of mind. In an early study, Frith and his colleagues had subjects read stories. The subjects then answered questions about the stories while in a brain scanner. Only some of the stories demanded that the subjects think about the beliefs and intentions of the characters.

Many other teams conducted similar experiments, and each revealed a constellation of active brain regions. Unfortunately, the constellations were not identical from one experiment to the next. Part of the trouble was that theory of mind is hard to isolate from other mental tasks. Another problem was that the experiments didn't demand mind reading of real minds—only of pictures or stories.

More recent experiments have tried to overcome these limits. Working with Frith and other researchers at University College London, Helen Gallagher put together an experiment based on the game of “rock, scissors, paper.” In each round, two players simultaneously choose one object. Rock beats scissors, scissors beat paper, and paper beats rock. Gallagher's subjects lay in a brain scanner and played the game on a computer screen. In some cases, they were told they were playing against a computer; in other cases, they thought their opponent was a person. In fact, in both cases the opponent's moves were a random sequence generated by the researchers. The only difference lay in the attitude of the subjects: As the researchers confirmed in interviews after the study, when subjects thought they were playing against a person, they tried to figure out their opponent's strategy.
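A minimal sketch of the game's logic (with hypothetical function and variable names; the actual experiment's software is not described in detail) makes the design clear: the opponent's move is drawn at random no matter what the subject believes, so only the subject's attitude differs between conditions:

```python
import random

CHOICES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def play_round(subject_choice, rng=random):
    """One round against the 'opponent', whose move is random in both conditions."""
    opponent_choice = rng.choice(CHOICES)  # identical whether the subject
                                           # believes it is a computer or a person
    if subject_choice == opponent_choice:
        return "draw"
    return "win" if BEATS[subject_choice] == opponent_choice else "lose"
```

Because the stimulus stream is statistically identical in both conditions, any difference in the brain scans can be pinned on the subject's belief about the opponent rather than on the game itself.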

In the brain of the participant, the chief difference between playing against a computer and playing against a supposed human lies in one small region, a patch of neurons above the eyes known as the anterior paracingulate cortex, the researchers reported last year in Neuroimage. The paracingulate turned up in older studies of theory of mind as well. It may be responsible for the central task: separating your own mind from someone else's. “It's being able to recognize that someone has a different perspective than you do,” says Gallagher, who is now at Glasgow Caledonian University. She notes that the adjacent region of the brain acts as a conflict monitor, sensing when the brain's predictions about how the world works don't match up with reality. The paracingulate may make a similar distinction between our own beliefs and intentions and those of other people.

As important as the paracingulate may be, however, it can't create a theory of mind by itself. “You have to draw on other abilities and skills in other parts of your brain,” says Gallagher. Two such regions tend to turn up in theory-of-mind experiments. One is the temporal pole, located near the ear. It's crucial for recalling memories, suggesting that subjects need to dredge up past experiences to help figure out what other people must be thinking.

The other region, known as the superior temporal sulcus, lies high up on each side of the brain. Like the temporal pole, it turns up in many experiments not focused on theory of mind. It is, essentially, a sensor of biological motion. The sight of a moving car won't activate its neurons, but a moving hand will. It is particularly sensitive to the movement of eyes and lips. It's possible that the paracingulate depends on interpretations about body language made by the superior temporal sulcus in order to work out what's going on inside another person's head.

Other parts of the brain may eventually be initiated into this inner circle of the theory of mind, but not without debate. Simon Baron-Cohen of the University of Cambridge, for example, believes that an almond-shaped structure known as the amygdala should be recognized as part of the network. The amygdala has long been recognized as a key player in the emotional life of the brain. Baron-Cohen has found that the amygdala becomes active when subjects look at pictures of eyes to answer questions about the depicted person's feelings and intentions. Theory of mind comes into play, Baron-Cohen says, when “you're trying to find out if [a stranger is] friendly or aggressive.”

A few other studies have revealed the amygdala at work as well. Robert Schultz of Yale University and his colleagues showed people animations of “intentional” geometrical objects, as they reported in the 28 February issue of Philosophical Transactions of the Royal Society of London. They found that only the shapes that were moving as though they were pursuing goals triggered a theory-of-mind network—one that included the amygdala. In the brains of autistic people, none of the films activated this network.

Gallagher plays down the role of the amygdala, however, pointing out that it fails to become active in most tests of theory of mind. She suspects it plays an indirect role, primarily during development. As children mature, she suggests, the amygdala guides their theory of mind to situations that are socially relevant.

However, earlier this year in Neuropsychologia, Baron-Cohen and his colleagues reported that people who sustain damage to the amygdala in middle age—long after their theory-of-mind network has been wired together—do worse on theory-of-mind tests than do people with undamaged brains. In other words, it seems adults still rely on the amygdala to read minds.

## An evolving theory

Determining the neural building blocks of the theory of mind may help researchers figure out how it evolved in our ancestors. The brain regions identified so far all have counterparts in the brains of other primates. It seems that the pieces of the mind-reading network were probably all in place in the brains of early hominids. Perhaps all that was required to create a theory of mind was to network them in the right way. “My personal view is that it all comes from trying to predict what some other animal is going to do next,” says Frith. “It just happens that the best way to predict what people are going to do next is understanding mental states.”

A theory of mind may even have evolved before people could understand their own minds. That at least is the opinion of Francesca Happé, who works with Frith at University College London. She points out that the brain networks underlying self-awareness and an awareness of others are intermingled. “People have struggled to find an evolutionary explanation for self-awareness, whereas the evolutionary value of theory of mind is clear,” says Happé. Perhaps a human ancestor “could track the intentions of others but wasn't at all self-reflective.” However it evolved, theories about theory of mind will likely keep researchers' minds occupied for years.

16. WEATHER FORECASTING

# Huge Pacific Waves Trigger Wild Weather Half a World Away

1. Richard A. Kerr

A week or two's warning of extreme weather may be possible by forecasting atmospheric waves snaking eastward to North America and Europe

From deadly tornadoes in the east-central United States, to the storm that sank an oil tanker off Spain, to epic flooding around the Alps, November 2002 was disastrous. Just a string of bad luck? No, say meteorologists. These and other extreme weather events all sprang from huge atmospheric undulations barreling eastward from halfway around the world. Because the disturbances, known as Rossby waves, take over a week to reach North America and Europe from their birthplace in the waters near Indonesia, meteorologists now see promise of forecasting extreme weather—at least in general outline—many days or even weeks in advance.

Currently, state-of-the-art forecasts provide only temperature and precipitation averaged over the period of 8 to 14 days ahead. If researchers can get a better understanding of wide-ranging Rossby waves, “we have some chance of extending [more detailed] forecast skill into the 1- to 2-week period,” says meteorologist Mitchell Moncrieff of the National Center for Atmospheric Research in Boulder, Colorado. “There are a lot of people excited about it.” Long-range tropical effects transmitted to mid-latitudes by Rossby waves could give researchers some handle on an otherwise chaotic atmosphere. “I think there's something there,” says Klaus Weickmann of the National Oceanic and Atmospheric Administration's (NOAA's) Climate Diagnostics Center (CDC) in Boulder. “We just have to quantify it.”

Last month meteorologists described recent cases in which far-traveling Rossby waves had triggered extreme weather around the globe.* Although not statistically significant, the evidence was striking. The researchers traced last November's disasters back westward from mid-latitude North America and Europe, along a great arc across the Pacific Ocean, over a mighty storm off Japan, to the pool of exceptionally warm waters in the western tropical Pacific always centered on Indonesia. It was over the western warm pool that a surge of atmospheric convection—the heat-driven churning embodied in towering thunderstorms—started it all off, says Moncrieff.

Such surges in convection tend to recur every 30 to 60 days as part of the Madden-Julian Oscillation, or MJO (Science, 7 September 1984, p. 1010). Researchers can follow the MJO's development as a broad, thick patch of clouds marching eastward out of the Indian Ocean. As the MJO passes over the warmer water and mountains of the Indonesian archipelago, the convection and its heat-releasing rains intensify before subsiding and passing on to the east.

The heat released by that week or two of extra-heavy rains can send an atmospheric signal on to North America and beyond, says meteorologist Prashant Sardeshmukh of NOAA's CDC. Last November, the added energy temporarily altered atmospheric flow as far north as offshore Japan, says Moncrieff, where it stoked the production of Rossby waves from a major storm. The waves show up on weather maps as wiggles of the snaking jet streams. The wiggles carry energy eastward while nurturing storms nestled in their smaller loops and steering the storms north or south across the latitudes.
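How wave energy can race eastward while individual wiggles lag behind is captured by the standard barotropic Rossby wave dispersion relation (a textbook result, not given in the article). For a wave with zonal and meridional wavenumbers $k$ and $l$ riding on a background westerly flow $U$,

$$
c_x = U - \frac{\beta}{k^2 + l^2}, \qquad
c_{g,x} = U + \frac{\beta\,(k^2 - l^2)}{(k^2 + l^2)^2},
$$

where $\beta$ is the northward gradient of the Coriolis parameter. The phase speed $c_x$ is shifted westward relative to the flow, but for sufficiently short waves ($k > l$) the zonal group velocity $c_{g,x}$ exceeds $U$, so packets of wave energy outrun the individual troughs and ridges, consistent with extreme weather riding the leading edge of the packets.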

Last November, the storm off Japan sent four “packets” of Rossby waves eastward, triggering strings of extreme weather almost 20,000 kilometers downstream, according to Melvyn Shapiro of NOAA's Office of Weather and Air Quality in Boulder. In one series, Rossby waves dispersing from the Pacific kicked off a tornado outbreak across the central United States on 11 November that cost 36 lives, a storm off Spain that sank the tanker Prestige and blew its oil onshore, Alpine floods on 16 and 17 November, and a devastating windstorm in Austria and Germany on the 17th. “The extreme weather seems to ride on the leading edge of Rossby wave packets,” says Shapiro. Another November packet set off a U.S.-Canadian snow and ice storm, delayed the launch of the space shuttle Endeavour, and flooded Morocco. And in the first 2 weeks of last August, Shapiro says, a packet of Rossby waves circled all the way around the hemisphere, setting off flooding centered on Germany and the Czech Republic that caused $28 billion in damage (see p. 1099).

Clearly, Rossby waves can pack a formidable wallop. But are they common enough and predictable enough to help forecast disasters to come? “A lot of these storm systems—not all of them, but a substantial number—are related to Rossby wave dispersion,” says Rossby wave theorist Edmund Chang of the State University of New York, Stony Brook. But some experts fear that the waves may be too erratic to make good forecasting tools. “Rossby wave propagation is reasonably well understood,” says meteorologist Michael Blackburn of the University of Reading, U.K., but “it's not clear it's going to help.” For western Pacific Rossby waves to make it to North America, much less Europe, he notes, broad atmospheric conditions must be just right for keeping the waves together and focused. And the end result can depend sensitively on a wave's interaction with the atmospheric conditions at its target. “You're asking quite a lot” of any forecasting system, says Blackburn.

Current forecasting systems haven't proved themselves up to that task yet, but there are encouraging signs. “We noticed something severe was going to happen over Europe” last August, says meteorologist Federico Grazzini of the European Centre for Medium-Range Weather Forecasts in Reading, “but the location remained very uncertain until 1 to 2 days before.” A European Centre experimental program employing computer model forecasts out to a month did correctly predict the generation of Rossby waves in the Pacific and their propagation eastward, says Grazzini, “but the time of arrival over Europe was wrong.” An international effort now in the works, dubbed THORPEX, will search for better ways to observe the parts of the atmosphere crucial to forecasting extreme weather more than a week ahead. But it's too soon to predict success.

• *Joint Assembly of the European Geophysical Society, American Geophysical Union, and European Union of Geosciences, 6 to 11 April, Nice, France.