# News this Week

Science  27 Aug 2004:
Vol. 305, Issue 5688, pp. 1222
1. PHARMACOGENOMICS

# Cancer Sharpshooters Rely on DNA Tests for a Better Aim

1. Jennifer Couzin

Without fanfare, two diagnostic labs have launched a genetic test to guide doctors treating a common and deadly form of lung cancer. Despite lingering questions about whether the test is comprehensive, physicians think this approach could herald a new generation of gene-based methods of tailoring cancer treatment.

Designed to pinpoint patients who might be helped by the drug Iressa, the new test hunts for mutations in a gene called epidermal growth factor receptor (EGFR), whose protein Iressa targets. People who test positive may be more likely to benefit from this therapy, which has an impressive record in treating non-small cell lung cancer—but only in a small fraction of cases. If screening takes off, it could significantly affect the roughly 140,000 U.S. patients diagnosed each year with this type of cancer.

This month, a Harvard-affiliated diagnostics lab rolled out its version of the Iressa test, following a similar decision in July by the City of Hope hospital in Duarte, California. Both offer similar tests to lung cancer patients (at a cost of $500 to $2000), screening for mutations in DNA isolated from tumors.

Approved by the U.S. Food and Drug Administration in May 2003, Iressa initially baffled doctors with variable results: Tumors shrank in only about 10% of patients, but in that group the response was dramatic. Researchers concluded that the drug worked best in those with EGFR-dependent tumors, but there was no way to identify such patients. That became possible last spring, when two independent teams of scientists at Massachusetts General Hospital (MGH) and the Dana-Farber Cancer Institute, both in Boston, reported that Iressa responders have mutations in a specific stretch of the EGFR gene (Science, 30 April, p. 658).

“Hundreds of patients have contacted us” to learn their EGFR status, says Thomas Lynch, who directs the center for thoracic oncology at the MGH cancer center and was a lead author on one of the spring papers. Adds Matthew Meyerson, a pathologist at Dana-Farber and an author of the second paper: “Our goal, basically, is to get the test into the widest and fastest possible use.”

But the details must be ironed out. For one, the research groups are not equipped to handle the hundreds of thousands of samples that could flood in. (So far, each has tested fewer than 20.) “We're hoping there will be a commercial test,” says Lynch, adding that MGH and Dana-Farber have applied for patents and are discussing this with “more than one company.” The current goal, says Daniel Haber, head of the cancer center at MGH, is to sign on a company willing to distribute the genetic test to hospitals that want to screen their own patients. “We are not looking at the model Myriad has,” he says, referring to Myriad Genetics, the Salt Lake City, Utah, company whose monopoly over two breast cancer gene tests has spurred controversy.

In addition, new biological complexities are appearing: Preliminary studies have identified patients who respond to Iressa but who don't have EGFR mutations in the DNA swath tested. Vincent Miller, a thoracic oncologist at Memorial Sloan-Kettering Cancer Center in New York City, is concerned that some patients who could benefit from Iressa might not receive it after testing negative.

One possibility is that relevant mutations may be hiding elsewhere in the EGFR gene. Based on that hypothesis, the City of Hope has just launched a second test that screens the entire EGFR gene, says Steve Sommer, chief of its clinical molecular diagnostic lab. That's four times as much DNA as the Boston test and the original City of Hope test cover.

Meanwhile, several hospitals, led by MGH, are planning a clinical trial for October to better correlate mutations with drug responses. The trial will enroll 30 newly diagnosed lung cancer patients with EGFR mutations and offer them Iressa up front.

Physicians are already beginning to extend findings from Iressa studies to a related drug, Tarceva, which also targets EGFR. Early studies show that the same mutations may help determine the success of Tarceva therapy.

2. U.S. VISA POLICY

# Foreign Scholars to Get Longer Clearance

1. Yudhijit Bhattacharjee

The United States plans to extend the validity of security clearances for foreign students and scientists beyond the current 1-year duration. The new policy, which government officials say could be implemented as early as this fall, will reduce delays for U.S.-based international scholars seeking to reenter the country.

“We've heard loud and clear from the university and scientific communities that the image of this country as a venue for research and scholarship has been suffering,” says C. Stewart Verdery Jr., assistant secretary for border and transportation security policy at the Department of Homeland Security (DHS). “And we want to change that.”

Foreign students and researchers who work in sensitive fields of science and technology currently must undergo a security review to obtain a reentry visa if their last clearance was granted more than 12 months ago. Under the new policy, which has yet to be finalized, the clearance could be valid for as long as the duration of their study or academic appointment. DHS officials say the extension is a result of improved measures to monitor individuals entering and leaving the country. Through the Student and Exchange Visitor Information System, for instance, “we can know when an international student majoring in English has switched to nuclear engineering,” says Verdery. “And if the system shows that a scholar is returning for the same activity that he or she was pursuing prior to leaving the U.S., it makes sense not to repeat a security check.”

The administration is also planning to revise the list of sensitive technologies used to determine whether a visa applicant needs to undergo an elaborate interagency review. DHS officials say that the department will consult with scientists to review the list, which they acknowledge is “too broad.”

The scientific community sees the proposed changes as the latest in a series of positive steps. “They've already made some serious efforts to minimize visa delays,” says Mark Frankel of AAAS, publisher of Science, which this spring helped draft a set of visa policy recommendations (Science, 14 May, p. 943).

3. NEXT LINEAR COLLIDER

# Physicists Pick a Cold Road for Accelerator Project

1. Charles Seife

Particle physicists are hot to trot with a cold linear collider. Although money and politics may prevent it from ever being built, the next big machine to explore the fundamental forces and particles in the universe should use “cold” superconducting technology rather than “warm” traditional conductors, scientists decided last week. “It's a very important point,” says Jonathan Dorfan, director of the Stanford Linear Accelerator Center (SLAC). “We will all come together now, enthusiastically, to come to a design.”

A more powerful linear collider is the next logical step in a 75-year sequence of building particle accelerators. In 2007 or 2008, the Large Hadron Collider (LHC) at Europe's CERN lab near Geneva, Switzerland, will begin to search for new particles. Most particle physicists have high hopes that the LHC will discover important exotica such as the Higgs boson and “supersymmetric partners” of known particles. But the LHC, which smashes complicated protons together, won't have the finesse to analyze those discoveries in detail. A linear collider, which smashes simple electrons and antielectrons together, can be used to figure out the properties of the new particles with greater precision.

At a Colorado summit in 2001, particle physicists across the United States agreed to pursue a next-generation linear collider (Science, 27 July 2001, p. 582), but they split over how to accelerate the electrons and antielectrons to smashing speed. Scientists at Japan's KEK laboratory in Tsukuba and at SLAC favored using copper cavities to pump an extremely large amount of energy into the accelerating particles in a relatively small space. The European DESY lab in Hamburg, Germany, meanwhile, championed a plan to use superconducting niobium cavities to accelerate the electrons and antielectrons in a more leisurely—but more efficient—manner. “Warm technology supports a higher gradient, so you can get a physically smaller, shorter machine,” says Stephen Holmes, associate director for accelerators at the Fermi National Accelerator Laboratory in Batavia, Illinois. “Cold technology uses less power, so it's cheaper to operate.”

Most scientists agreed that either technology would have done the job well at about the same cost. Paul Grannis, a particle physicist at the State University of New York, Stony Brook, and a member of the panel that made the choice, says that several factors played crucial parts in the decision. For example, the lower- frequency operation of the cold technology makes it somewhat less sensitive to problems such as ground motion, Grannis says. The technology is also similar to that which DESY's Tera-electron-volt Energy Superconducting Linear Accelerator (TESLA) collaboration developed for the lab's planned X-FEL free-electron laser project, which will help pave the way for the superconducting collider (Science, 10 May 2002, p. 1008).

Physicists' consensus boosts the accelerator's prospects, says DESY's project leader for linear collider research, Rolf-Dieter Heuer: “This is what politicians want—a clear view of how to proceed. It brings us to a very strong position.”

The next step is to come up with a conceptual design for the machine, a task that should take 2 years or so. “I don't have a good answer” for costs, says Dorfan. “But it will be many billions of dollars.”

4. CHEMISTRY

# Fuel Cell Draws Power From Poison

1. Robert F. Service

Scientists working on automotive fuel cells have come up with a way to turn a molecular adversary into a friend.

Low-temperature fuel cells use platinum catalysts to extract electricity from hydrogen gas. But when that gas is produced from fossil fuels—the most common source—it's invariably contaminated with carbon monoxide (CO), which poisons the catalysts. On page 1280, however, researchers led by chemical engineer James Dumesic of the University of Wisconsin, Madison, report that they've solved the problem and, for good measure, created another source of fuel.

“It's pretty novel and interesting,” says Matthew Neurock, a catalysis expert at the University of Virginia, Charlottesville. Neurock and others say the work might help make fuel cells cheaper by scrapping part of the high-temperature apparatus currently required to eliminate CO from hydrogen fuel. It could also be welcome news for those who advocate generating hydrogen fuel from renewable fuels such as agricultural waste, because producing hydrogen from “biomass” also produces large amounts of CO. “This opens the door to using renewable energy resources,” Neurock says.

Makers of low-temperature fuel cells—called polymer electrolyte membrane (PEM) fuel cells—currently fight their molecular enemy by sending their fuel through an initial chamber where CO reacts with vaporized water at temperatures of 500°C or more. At this high temperature, CO molecules grab oxygen atoms from water molecules to make carbon dioxide (CO2), an inert gas that's vented to the air. The leftover hydrogen joins the rest of the hydrogen gas that's fed to the fuel cell.

The process cleans up fuel effectively. But it's costly and inefficient to create the high temperatures needed for the reaction and then cool the exhaust gases below 100°C, as most low-temperature fuel cells require, says Robert Hockaday, founder of Energy Related Devices, a fuel cell maker in Los Alamos, New Mexico.

Earlier this year, Dumesic's team found a cool alternative: a membrane coated with gold nanotubes and nanoparticles. On the nanoscale, Dumesic explains, normally unreactive gold becomes so active that it catalyzes reactions swiftly even at low temperatures. For their current study, Dumesic, postdoctoral assistant Won Bae Kim, and students Tobias Voitl and Gabrielle Rodríguez-Rivera used their nanogold catalyst to react CO and liquid water to create CO2, hydrogen ions (H+), and electrons (e−). Instead of letting the energy in the electrons fizzle away, they captured it with electron-ferrying “redox” compounds known as polyoxometalates (POMs) dissolved in the water surrounding the membrane. POMs carry a strong positive charge that makes them hungry for electrons. When those electrons bind to the POMs, they turn the solution from a bright yellow to a vivid deep blue.

To recover the energy from the electron-toting POMs, Dumesic's team piped the solution, mingled with hydrogen ions from the CO reaction, to the front end of a PEM fuel cell. There, a positively charged electrode—the anode—stripped off the electrons and turned them into usable current. The oxidized POMs were then recycled to the gold-nanotube reactor to convert more CO. The rest of the process—combining the hydrogen ions, electrons, and oxygen at the cathode to create water—was standard fuel-cell chemistry.

The novel scheme for befriending CO is getting mixed reviews. Shimshon Gottesfeld, chief technology officer at MTI Micro Fuel Cells in Albany, New York, notes that fuel cells that ferry electrons by means of chemicals such as POMs are typically less efficient than devices that move electrons using electrodes. But Hockaday likes the way the system cleans up hydrogen fuel, and he and others predict that industry will be interested. “I would use it,” he says.

5. GENETICS

# Patient Advocate Named Co-Inventor On Patent for the PXE Disease Gene

1. Eliot Marshall

In an apparent first, the lay leader of an advocacy group has been recognized as a co-inventor with four scientists on a gene patent. This is evidence, says Francis Collins, director of the National Human Genome Research Institute, of the increasing role patient groups are playing in research.

The work deals with a transporter gene, known as either MRP6 or ABCC6, that causes a rare connective tissue disease called PXE (pseudoxanthoma elasticum). Sharon Terry, mother of two children with PXE and executive director of the group PXE International in Washington, D.C., is one of five inventors on a patent issued on 24 August by the U.S. Patent and Trademark Office. A diagnostic test should be available “by the end of the year,” says Terry, and PXE International expects to offer it to its members by then.

PXE is not usually lethal, but the calcium buildup it causes in certain cells can have devastating effects, such as vision loss, gastrointestinal bleeding, and heart disease. Harmful PXE gene mutations are thought to occur in 1 of about 50,000 people in the United States; there is no proven treatment. The four academic scientists listed as inventors of the PXE patent are members of a research collaboration led by Charles Boyd of the University of Hawaii, Honolulu. Although others had laid the groundwork for their studies, Boyd's group was first in a four-way race to publish 4 years ago (Science, 2 June 2000, p. 1565).

Sharon and her husband Patrick Terry founded PXE International 8 years ago and quickly helped mobilize support for scientific studies on an international scale. Genetic researchers often get help from families affected by rare diseases, for instance, in obtaining tissue samples and collecting family data. But Sharon Terry says she did much more: “I extracted DNA, ran gels, read the gels,” and helped write the paper announcing the gene's discovery.

Collins says that Terry's direct contribution to the scientific work earned her a place on the inventor's list—a point subjected to “careful evaluation” by patent examiners. He views PXE International's involvement as a positive “example of how parents and lay organizations can play a catalytic role in research on rare diseases.” Like many, he contrasts the PXE gene patent with the patent for the gene causing Canavan's disease; in that case, patient advocates sued a university and its scientist to regain control of a gene patent because they wanted to control testing costs and availability (Science, 10 November 2000, p. 1062). The lawsuit failed.

The success of PXE International, says Collins, has encouraged others. Leslie Gordon, mother of a child with a rapid-aging syndrome called progeria, worked with Collins's lab to identify a causative gene. A Ph.D.-pediatrician, Gordon is a co-author of the 2003 paper describing the progeria gene and one of the inventors listed on the patent application. She is also a co-founder and medical director of the Progeria Research Foundation in Peabody, Massachusetts.

Terry thinks PXE International may have to subsidize the price of gene testing, a typical use of which might be as a replacement for a painful skin biopsy to learn whether a younger sibling of a PXE patient is also at risk. It's a complex gene, difficult to analyze, she notes. After looking at DNA from 260 people, her group has found that six mutations account for 45% of the affected individuals. Capturing more mutations, and thus more of the affected population, may require additional DNA sequencing, which would be expensive.

“We're not sure how much it will cost” to test an individual, Terry says, but in some cases it could run to $3000 if it becomes necessary to sequence the entire gene. But whatever happens, Terry says, it's comforting to know that PXE International is now “driving the boat.”

6. NUCLEAR WEAPONS POLICY

# Showdown Expected in Congress

1. David Malakoff

Take shelter: a political nuclear war is about to resume. An increasingly fiery debate over proposed new funding for U.S. nuclear weapons research and testing is expected to heat up again next month when members of Congress return from their summer recess.

At issue are three relatively small proposals that the Bush Administration included in its spending plan for the 2005 fiscal year, which begins on 1 October. One asks Congress to provide $27.6 million to develop an earth-penetrating nuclear weapon capable of destroying buried bunkers. The White House also wants $9 million to study “advanced concepts” for low-yield nuclear weapons and $30 million to shorten the time needed to prepare a site in Nevada if the United States were to resume underground testing of a nuclear weapon.

The White House argues that the moves are needed to enhance the country's ability to deter potential enemies and keep weapons scientists sharp (Science, 4 July 2003, p. 32). Administration officials insist that they have no plans to actually build or test new weapons, adding that Congress would have to approve those steps. But a bipartisan group of critics is skeptical, saying the proposals threaten to spark an expensive new arms race—even as the United States is seeking to prevent Iran, North Korea, and other nations from developing atomic weapons. “Each side gets to stress themes that resonate with its [political] base,” says Jonathan Medalia, a national defense specialist at the Congressional Research Service in Washington, D.C.

Last year, the two sides fought to a draw, with Congress giving the White House approval to move ahead with the research but approving only about half of the requested funds. This year, the critics have already won the first round. In June, the full House of Representatives approved a Department of Energy (DOE) spending bill that eliminates funding for the three programs, even though the programs were authorized in separate measures approved by each house this summer.

In a harshly worded report that accompanied the spending bill, the House appropriations panel that oversees DOE said it was “unconvinced by [DOE's] superficial assurances” that the earth penetrator—which the White House says would cost nearly $500 million to develop—“is only a study and that advanced concepts is only a skills exercise for weapons designers.” A leaked DOE memo, the panel charged, “left little doubt that the objective of the program was to advance the most extreme new nuclear weapon goals.”

Representative David Hobson (R-OH), who heads the House spending panel, boasted in a recent public forum at the National Academy of Sciences on nuclear nonproliferation that he would “beat 'em again” if the White House tried to force another House vote on the issue. “I think they can count [votes],” he said.

The action now turns to the Senate, which has traditionally been friendlier to the three programs and has already defeated several efforts to eliminate them. Its version of the DOE spending bill could come to a preliminary vote as early as next month. If it approves funds for the programs, a House-Senate conference committee would have to settle the issue. Campaign politics could delay any final decision until after the November elections.

7. PRIMATE STUDIES

# Politics Derail European Chimp Home

1. Martin Enserink

A battle over where to build a permanent retirement home for Europe's last remaining research chimpanzee colony is intensifying. Plans for a facility in Spain were derailed this spring, when the mayor of the tiny Spanish mountain town slated to host it declared his opposition to the project. Now, the Dutch charity that plans to build it has launched an international campaign to salvage the project.

The search for a final home began in 2002, when the Netherlands banned the use of chimps in research. Under pressure from animal-rights groups, the Dutch government agreed to take 63 remaining chimps away from the Biomedical Primate Research Center in Rijswijk by mid-2005 and hand them over to Foundation AAP, which runs a private primate shelter. At its Almere headquarters, AAP plans to build a permanent home for 30 chimps infected with hepatitis C and the simian cousin of HIV. For 33 uninfected chimps, however, it bought a 45-hectare estate in Relleu, near Alicante on Spain's eastern coast, where they can live more comfortably and cheaply than in the Netherlands. The facility would also house other abandoned and confiscated primates from all over Europe.

In 2002, Relleu's elected mayor, Santiago Cantó, signed a letter supporting the facility, saying it would benefit the local economy with minimal environmental impact. But last March, Cantó reversed himself and asked the regional government not to issue a “declaration of public interest,” a key bureaucratic hurdle. Any economic benefits were irrelevant, Cantó wrote, given the “social unrest” that the plans had caused. He cited the risk of noise, odors, and zoonotic diseases and said the facility would hurt tourist development at nearby properties. Although it has supported the plans, the regional government is unlikely to overrule the mayor's opinion, says zoologist Vicente Urios of the University of Alicante, who has followed the affair closely.

Jack Drenthe, AAP's representative in Spain, suggests that Cantó's change of heart is primarily inspired by a complaint filed by an Alicante businessman and developer who owns property adjacent to the site. But so far, the facility's backers have been unable to change Cantó's mind. Last month, famed U.K. primatologist Jane Goodall visited Relleu to show her support, but if anything, the visit “may have hardened the opposition,” she says. A tumultuous town meeting on 30 July was dominated by the mayor and other opponents of the plan, says Urios, who chaired the event: “It's no longer a rational discussion.”

Now, AAP is urging supporters to e-mail Cantó to show their support; it is also about to send letters signed by Goodall to members of the European Parliament and Spanish ambassadors across Europe. The Dutch government hasn't decided what to do if AAP misses its 2005 deadline, a spokesperson for the science and education ministry says. Cantó could not be reached for comment, but Urios says he's unlikely to change his mind again. Luckily, he adds, other towns in the region are interested in providing the Dutch chimps with a tranquil, sunny old age.

8. BIOSECURITY

# Up in the Air

1. Kathryn Brown

The government is pouring money into sensors to detect bioweapons, but skeptics question whether they can really protect the public from the array of potential threats.

Pentagon employees couldn't see the gas seeping into their building. They couldn't taste or smell it. But strategically placed sensors immediately picked up the problem, precisely tracking the wafting gas. Everyone was safe.

This was not reality. This was Pentagon Shield, a Department of Defense exercise last spring that simulated a biological or chemical attack. Research teams released sulfur hexafluoride—a harmless gas used in airflow testing—outside the Pentagon intermittently over several days. Standard gas analyzers traced its movement around and into the building, while other sensors recorded weather conditions. With those data, scientists are refining a computer model of aerosolized weapon movement.

In a real attack, however, unlike a neatly defined exercise, it's unclear how well actual sensors would perform. The Department of Homeland Security (DHS) spends more than $60 million annually on environmental detectors that monitor outdoor air for bioweapons, but many scientists argue that those detectors are ineffective. Now, DHS plans to spend at least $32 million more, over the next 18 months, to develop next-generation sensor technology. “This research has tremendous promise,” says Penrose Albright, assistant secretary for science and technology at DHS. But scientists remain skeptical that government contractors really can design sensors that quickly, cheaply, and accurately detect one of the dozens of bacteria, viruses, or toxins that could become aerosolized bioweapons (see table).

## Hazardous history

Bioagents instill fear because just a little can pack a big punch. “Infectious biological agents are on the order of 1000 to 1 million times more hazardous than chemical [agents],” says Edward Stuebing, head of aerosol sciences at the U.S. Army Edgewood Chemical Biological Center in Edgewood, Maryland. For decades, these worries were the quiet domain of U.S. military and national weapons labs, funded by the Department of Energy or the Defense Advanced Research Projects Agency. Researchers at Los Alamos National Laboratory (LANL) in New Mexico and Lawrence Livermore National Laboratory (LLNL) in California collaborated on an early biodetection network, dubbed BASIS. That eventually led to the sole environmental bioweapon sensor deployed nationwide today: BioWatch, an aerosol system that works like a vacuum cleaner, sucking air over filter paper that traps aerosol particles.

Although earlier BASIS sensors were designed only to detect bioweapons during specific events, such as the Olympics, DHS has deployed BioWatch sensors to continually monitor air in more than 30 major cities. Despite DHS claims of a perfect record, scientists privy to classified assays suggest that the sensors may experience false positives—mistaking normal environmental toxins for bioweapons. Others complain that because the assay results are classified, they have not been evaluated by outside scientists.

DHS's Albright characterizes BioWatch as a starting point, a relatively cheap system that can be upgraded with new technology. Much of the cost of BioWatch—roughly $60 million annually, or $2 million per city—is labor, he says: “Today, we collect the BioWatch filter, take it to the lab, treat the sample, do an initial screen, and then, if we get a hit, take it through an extensive battery of tests.” DHS wants a faster, sleeker system—one that continuously sniffs for bioweapons and can be sampled frequently with little maintenance, Albright says: “We want high sensitivity, minimal false alarms, and low cost, so we could deploy it nationally in large quantities and expect it to be maintained by, say, volunteer firefighters.”

That's a big jump from today's BioWatch. But DHS's external funding arm, the Homeland Security Advanced Research Projects Agency (HSARPA), thinks it can make the leap. The agency recently launched its first research push, allocating more than $32 million to 14 outside teams.*

DHS is funding six teams to develop high-priority “detect-to-treat” systems. These would be deployed outdoors like BioWatch but would identify a bioweapon within just 3 hours, enabling doctors to treat exposed civilians. The remaining eight teams are doing feasibility studies for “detect-to-protect” systems, for use inside critical buildings and in specific outdoor spots, to detect a bioweapon within 2 minutes, in time to warn civilians and trigger responses in, say, ventilation systems.

“We are asking everybody to work as fast as they can,” says Jane Alexander, deputy director of HSARPA. “In some cases, we have told bidders, ‘We know we're asking for the sun, the moon, and four planets. If you can only give us two planets, go ahead.’” With DHS investment, several sensor prototypes probably could be deployed within months, says J. Patrick Fitch, head of chemical and biological national security at LLNL.

## Fine-tuning

To build next-generation sensors, DHS hopes to tweak existing prototypes with the latest technology. Some sensors will run simultaneous assays on microchips, for instance, or tap new genomic markers for more definitive pathogen signatures.

All biosensors share two basic tasks: to sample air particles and to identify any pathogens. For sampling air particles 1 to 10 micrometers in size, a sensor includes one (or more) of several technologies. A vacuum, for instance, sucks air over filter paper to trap particles, as in the BioWatch sensor. Alternatively, a wetted cyclone draws air down a tube injected with water, which moves with centrifugal force to capture particles. A third variety, called a virtual impactor, uses tiny jets to push air particles down a tube at high speed, concentrating them while diverting excess air. Each differs in cost, sensitivity, speed, and complexity.

For the second task—isolating and identifying bacterial, viral, or toxic particles trapped in the sample—sensor systems typically run immunoassays, polymerase chain reactions (PCR), or mass spectrometry screens. Again, there are tradeoffs. Detect-to-protect technologies are relatively fast and cheap but often carry higher rates of false positives. “If I go from wanting an answer in an hour to wanting one in 2 minutes, I have eliminated all kinds of technologies, like PCR,” says Fitch.

Although slower, the detect-to-treat sensors often use PCR to glean greater detail about a pathogen's identity, activity, and susceptibility to various treatment options. Among the DHS-funded teams, at least two detect-to-treat prototypes are already being field-tested. One is TIGER—for Triangulation Identification for Genetic Evaluation of Risk—developed by Science Applications International Corp. in San Diego, California, and Ibis, a division of Isis Pharmaceuticals in Carlsbad, California. TIGER works by sampling the air, extracting nucleic acids, and amplifying those acids with broad-based PCR primers that capture all biological agents in the sample. TIGER electrosprays the PCR products into a mass spectrometer that produces each agent's mass and DNA base composition. Scientists compare an organism's DNA signature with those in a broad database, confirming its identity—or, in the case of an unknown organism, using phylogenetics to characterize it. This process takes up to a day.
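The final matching step amounts to comparing a measured DNA signature against a table of known ones. As a purely illustrative sketch, assuming a toy reference table and tolerance that are hypothetical rather than TIGER's actual data or software, the lookup might look like:

```python
# Toy illustration of base-composition matching, as described for TIGER-style
# sensors. Organisms, counts, and tolerance below are made up for illustration.
REFERENCE = {
    # organism: (A, G, C, T) base counts for one hypothetical PCR amplicon
    "agent_A": (31, 22, 19, 28),
    "agent_B": (27, 25, 23, 25),
    "agent_C": (33, 20, 18, 29),
}

def identify(measured, tolerance=1):
    """Return organisms whose base composition matches within a per-base tolerance."""
    hits = []
    for organism, ref in REFERENCE.items():
        # A signature matches only if every base count is within the tolerance.
        if all(abs(m - r) <= tolerance for m, r in zip(measured, ref)):
            hits.append(organism)
    return hits

print(identify((31, 22, 19, 28)))  # exact match against agent_A's signature
```

A measured composition that matches nothing in the table would, as the article notes, fall back to phylogenetic characterization of the unknown organism.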

A similar sensor, the Autonomous Pathogen Detection System (APDS), has already been field-tested in the Washington, D.C., Metro transit system and at the San Francisco and Albuquerque airports. LLNL developed this sensor and licensed the technology to MicroFluidic Systems, which leads one of the DHS-funded research teams.

This sensor works by screening air particles with immunoassays or PCR analysis. By multiplexing—running multiple tests simultaneously—an APDS unit can screen for more than 100 different bacteria or viruses in about an hour. Networked sensors communicate data to a remote console, often via a wireless connection, so scientists get monitoring updates from afar. APDS can identify a known bioweapon in 30 minutes to 1.5 hours, Fitch says.

Faster detect-to-protect sensor prototypes are also emerging. One DHS-funded team leader, Johns Hopkins University's Applied Physics Laboratory (APL) in Laurel, Maryland, is developing a time-of-flight mass spectrometer that can, within minutes, identify a biological agent based on its proteins or peptides. APL's sensor automatically sucks in aerosol samples, mixes them with an ultraviolet light-absorbing chemical, and pulses the samples with UV light in a mass spectrometer. Based on light scattering and molecular weight, the system identifies key proteins, such as those found in biotoxins. Such a system could instantly warn that bioagents may be present—and possibly trigger changes in ventilation systems or sound alarms. But the system offers less detail on pathogens than slower varieties do.

## Wrong track

Still, skeptics question whether DHS's push for environmental detection is misguided. Microbiologist Paul Jackson of LANL argues that biosensor research is a costly diversion that will provide, at best, a false sense of security. “Everybody has aerosols on the brain,” he says. “Frankly, I don't know that environmental monitoring of aerosols at random—or even in important places—is necessarily the best approach.”

Jackson and others argue that more biodefense funds and government guidance should go to hospitals nationwide for “syndromic surveillance” or for the use of simple, reliable blood tests and other diagnostics to detect bioweapons. “The best sentries we have are patients who come into [emergency rooms] with suspicious symptoms,” Jackson says. If an initial wave of bioterror victims was diagnosed quickly, he adds, many might be saved—and a nationwide alert could immediately be launched.

The federal government has already promised more than $2 billion in biodefense funds to local public health leaders, and the Centers for Disease Control and Prevention has urged those leaders to invest in syndromic surveillance. But local efforts are patchy—and, many say, poorly coordinated. DHS also encourages syndromic surveillance. But its detection efforts begin in the environment, where questions first emerge. Did an attack actually happen? Can it be stopped? How can patients be treated? Can buildings be decontaminated? Tradeoffs are likely to continue. Future bioterror weapons, scientists say, could include genetically engineered pathogens, prions, and bioregulators. All demand new sensors—and new questions.

9. SOCIETY FOR CONSERVATION BIOLOGY MEETING

# Loss of Dung Beetles Puts Ecosystems in Deep Doo-Doo

1. Erik Stokstad

NEW YORK CITY—Some 1500 conservation biologists gathered at Columbia University from 30 July to 2 August to discuss humanity's growing impact on the natural world. Among the findings were new twists on how fragmenting forests can hurt dung beetles, monkeys, and other creatures.

Like an overengineered airplane, ecosystems are thought to have redundant functions that should prevent a single extinction from triggering more serious consequences. Many animal species disperse seeds, for example. So when one such species disappears, others face less competition and ought to become more abundant, taking up any slack.

New research suggests that may not always be true. The study examined the fate of dung beetles, which collect dung, bury it, snack on it, and lay their eggs in it. Burying the seed-laden dung also enriches the soil and helps plants regenerate. Trond Larsen, a graduate student at Princeton University, found that the beetle species best at burying dung were the first to disappear from forest fragments. Alarmingly, related species did not become more abundant. Much dung then went unburied.
“It tells us that the level of resilience in ecosystems to damage or biodiversity loss could be much less than we thought,” says Richard Ostfeld of the Institute of Ecosystem Studies in Millbrook, New York.

Larsen studied 42 species of dung beetles in eastern Venezuela, where a hydroelectric dam completed in 1986 flooded 4300 square kilometers of tropical forest and created more than 100 forest islands. He found that smaller islands had fewer species of beetles and that the larger beetles were most frequently missing.

The main cause of the beetles' decline was a bad sense of direction. Most dung beetles are used to flying in contiguous forest, where they don't need to be expert navigators. By marking some 15,000 beetles and recapturing as many as possible, Larsen showed that beetles couldn't find their way back if they flew off the island. “Once they hit open water, they're done for,” he says. Big beetles fly faster and farther than small beetles, he discovered, and are more likely to go AWOL. The problem is worse on smaller islands, where the perimeter is larger relative to the area. To retain a viable population, three of the largest dung beetle species needed at least 85 hectares—a surprisingly large amount of habitat for an insect, Larsen says.

When beetle diversity declined, much less dung was buried. The remaining species of dung beetle on the smaller islands didn't become more abundant and dig into the surplus dung, Larsen found. The reason, he suspects, is that they too are accidentally leaving the islands, although at a lower rate. With fewer seeds being buried, forest diversity ultimately will decline.
The worrisome conclusion is that species diversity is less of a safeguard against ecosystem collapse than had been assumed, Larsen says: “Even the loss of just one or two species may have a much greater impact than we previously thought.” Like top carnivores, the large dung beetles appear to be the most sensitive to extinction and extremely important for ecosystem integrity, he adds. Moreover, it's surprisingly hard for others to fill their shoes, Ostfeld says: “I wouldn't have expected to see this effect with a dung beetle.”

Larsen's discovery that the beetles' larger body size and flying behavior make them more vulnerable to decline is an important contribution, says Ostfeld. “Finding a clear mechanism makes it more likely that ecologists can predict the systems that should behave similarly,” Ostfeld says. “That's a big deal for environmental managers and policy specialists.”

10. SOCIETY FOR CONSERVATION BIOLOGY MEETING

# Forest Loss Makes Monkeys Sick

1. Erik Stokstad

NEW YORK CITY—Some 1500 conservation biologists gathered at Columbia University from 30 July to 2 August to discuss humanity's growing impact on the natural world. Among the findings were new twists on how fragmenting forests can hurt dung beetles, monkeys, and other creatures.

It's bad news for endangered animals when their habitats are fragmented. Populations become isolated, food supplies diminish, and hunters become more of a threat. Now add to that list a higher risk of illness. Although it's known that disturbed habitat can help transmit diseases between wildlife and humans, a new study shows for the first time that fragmentation of forests by humans can hasten the decline of a primate population by making common parasites more abundant and introducing new ones. “It's a potentially devastating effect,” says Peter Daszak, director of the Consortium for Conservation Medicine in Palisades, New York. Deforestation threatens many populations of forest-dwelling primates in Africa.
Thomas Gillespie, now a postdoc at the University of Illinois, Urbana-Champaign, and his Ph.D. adviser, Colin Chapman of the University of Florida, Gainesville, studied two species of leaf-eating monkeys to understand how habitat change might affect their health. They compared groups living in undisturbed forest within Kibale National Park in western Uganda with those living in surrounding forest fragments. In the park, overall populations of both the Red Colobus monkey (Piliocolobus tephrosceles) and the Black-and-White Colobus (Colobus guereza) have remained stable. But in 22 nearby patches of forest, the scientists found that the total Red Colobus population fell by 20% between 1999 and 2003. In contrast, the number of Black-and-White Colobus in the same fragments rose by 4%.

Suspecting that parasites might be to blame for the decline in the Red Colobus, Gillespie and his team first looked for evidence of them in both fragmented and intact forest. Densities of primate parasites were higher in forest fragments, they found. For example, the larvae of the nodule worm Oesophagostomum, which causes the most debilitating symptoms of all the pathogens, were more than five times more abundant in the fragments. “It's very clear that there was a higher risk of infection in disturbed forest,” says Gillespie. He suspects that people and livestock are introducing pathogens; indeed, four of the five parasites found only in the fragments also infect humans and livestock.

To measure the levels of infection, Gillespie examined 1151 monkey feces samples for parasites. Ten parasite species were present in the Red Colobus samples, and feces from fragmented habitat had significantly higher levels of most parasites than feces from the virgin forest. By contrast, the Black-and-White Colobus samples contained just seven parasites. For five of those parasites, there was essentially no difference in their prevalence between dung samples from fragmented and intact forest dwellers.
That could help explain why the Black-and-White Colobus are doing better, although it's not clear why they would carry fewer parasites than do the Red Colobus. “This work suggests a really strong role for disease” in the decline of the Red Colobus, says Nick Isaac, an evolutionary biologist at the Zoological Society of London. Although probably not fatal, parasites can affect a population indirectly, Isaac explains, by making monkeys less able to feed or conceive. And stress makes the animals more vulnerable to infection by parasites, which makes a grim situation even grimmer.

11. JOHN SCHAEFER PROFILE

# Shooting for the Stars

1. Jeffrey Mervis

John Schaefer has driven Research Corporation to new heights in astronomy. But critics wonder if he'll ever relinquish the helm and whether something's been lost along the way

This fall, on a mountaintop in southeastern Arizona, astronomers from around the globe will celebrate first light at the world's most powerful optical telescope. They will also toast John Schaefer, the longtime head of Research Corporation (RC), the oldest scientific foundation in the United States. It was Schaefer who, in 1992, applied RC's weight—and eventually $12 million of its money—to pull what became the Large Binocular Telescope (LBT) from a mire of problems that was threatening to engulf it (Science, 22 June 1990, p. 1479).

Schaefer's rescue of the LBT was one of a series of bold moves that have changed the $150 million foundation since his arrival in 1982—not all of them successful, according to his critics.

The LBT ceremony will also mark a rite of passage for Schaefer. The former organic chemist turns 70 next month and will step down at the end of the year as president and CEO of the atypical Tucson, Arizona-based charity. But he's not really leaving. Instead, he'll embark on what may be his most ambitious challenge yet: raising $200 million from both the public and private sectors to build yet another world-class observatory, the Large Synoptic Survey Telescope (LSST) (see sidebar). RC has committed $10 million to the venture as one of four founding partners and is providing office space for Schaefer and the LSST staff.

“He's very much a visionary, and he's been on the mark most of the time,” says G. King Walters, a professor emeritus of physics at Rice University in Houston, Texas, and an RC board member since 1977. Patrick Osmer, a new board member and chair of the astronomy department at Ohio State University, which rejoined LBT in 1996, calls Schaefer “one of the smartest people I know and a remarkable leader.”

But Schaefer's assertive leadership has also created a backlash. Two years ago, Schaefer suppressed what he viewed as a near-revolt within the organization by firing his designated successor, chemist Michael Doyle, and replacing four of the nine members of the foundation's board of directors. That insurrection was triggered by mounting concern over what Laurel Wilkening, a former chancellor of the University of California (UC), Irvine, and a former board member, calls his “highhanded and autocratic” style of leadership. It's a style that has produced results, but his opponents say it is ill suited to a post-Enron era of greater corporate responsibility. “John's the kind of guy that you'd call in a crisis because he doesn't worry about consultation,” says Wilkening.
“He would take charge, and then tell us afterward what he had done. But times have changed, and he hasn't changed with the times.”

## A visionary in a hurry

Time is a relative term for Schaefer. “Right now I feel like I'm 25,” he explained during an interview last month, surprised that anyone would question his decision to tackle a long-term project like the LSST when his peers are spending their days at the bridge table or on the golf course.

Not that Schaefer has ever spent much time relaxing. His résumé includes becoming president of the University of Arizona in Tucson at the tender age of 36 and founding the university's acclaimed Center for Creative Photography, becoming sufficiently accomplished in the medium to co-author a popular textbook with Ansel Adams. In 2002, when RC marked its 90th anniversary, Schaefer says he “took a couple of weeks” to pen a history of the foundation (http://www.rescorp.org/).

By all accounts, Schaefer expects from others the same crisp efficiency that he demands from himself. “As department chair, I never met with John for more than 10 minutes,” says University of Arizona astronomer Peter Strittmatter about his former boss—and current colleague on LBT's corporate board, which Schaefer chairs. “But I always left with an answer, even if it wasn't what I wanted to hear.”

Schaefer's career is a classic immigrant's success story. The son of poor German parents who arrived just before the Great Depression lacking a formal education, he excelled in New York City public schools and worked as a carpenter to help pay his way through Brooklyn's Polytechnic Institute. In 1958, he received a Ph.D. in chemistry from the University of Illinois, Urbana-Champaign. Two years later Carl Marvel, a former president of the American Chemical Society and professor emeritus at Illinois, was headhunted by Arizona's Richard Harvill, who was trying to expand the school's regional reputation by recruiting world-class researchers.
Marvel suggested that Schaefer join him in Tucson. “I took to the city immediately,” Schaefer recalls. And the university reciprocated, rocketing him through its ranks. In 1968, he was named department chair and in 1970 dean of the college of arts and sciences. Eighteen months later, he succeeded Harvill as president. Schaefer calls his promotions “a series of fortunate accidents.” But he admits to a bit of ambition, too. “When Harvill announced his retirement, I thought, ‘I'm liking this [dean's] job and enjoying having a greater degree of control over things. Why not go for the top job?’”

Schaefer took immediate advantage of his new authority to raise the university's research profile. He claimed control of all vacant faculty positions—typically the prerogative of individual departments—and held a campuswide competition to fill them. “It was a hunting license for department chairs to go after the best talent,” he recalls. He also pooled overhead payments from federally sponsored research to create a discretionary fund that financed start-up packages and innovative research proposals. “We didn't use a committee because that isn't always the best way to make decisions,” recalls Richard Kassander, an atmospheric scientist who was part of Schaefer's small, inner circle of senior administrators.

Old-timers say his most important step was getting Arizona (and its archrival, Arizona State) into an athletic conference that included prestigious schools such as Stanford and UC Berkeley. “Those relationships go beyond what happens on the athletic field,” Schaefer says. “I wanted us to be associated in the public mind with top-notch schools.” By all accounts, those efforts enabled his successor, Henry Koeffler, to make a successful bid to join the elite Association of American Universities, signaling the school's arrival as a major research institution.

## Remaking RC

After 11 years at Arizona, Schaefer says he “had done what I wanted to do” and was ready for a change.
It didn't help that the school's football team was facing a 2-year probation by the National Collegiate Athletic Association stemming from financial improprieties by its coach. “At my first press conference,” says Koeffler, who took over in 1982, “every question was about athletics, not academics.”

Meanwhile, RC was entering its eighth decade as an unsung, blue-chip charity and was looking for a new president with some pizzazz. Schaefer, who had joined the RC board in 1974, says that running a foundation “appealed to me.” It also represented a golden opportunity to apply his management philosophy of “trying to make a difference” with whatever resources were available.

Research Corporation is not your typical foundation. It was founded in 1912 by Caltech chemist Frederick G. Cottrell as a way for the public to benefit from his invention of the electrostatic precipitator and other patents donated by university scientists. And it worked hard for its endowment: In its early years, RC operated as a business, making and selling precipitators; for decades it also helped commercialize university inventions, taking a share of the royalties. RC is best known for making small, early-career awards to promising physical scientists, including 30 Nobelists, using a process that Brian Andreen, a chemist who joined RC in 1964 and serves as its unofficial archivist, says was later copied by the National Science Foundation (NSF).

Schaefer's most dramatic move at RC was to create a separate nonprofit but tax-paying organization, Research Corporation Technologies (RCT), to handle RC's technology transfer activities. Schaefer says that the Internal Revenue Service had been warning RC for years that its dual functions of generating income from patent royalties and handing out grants could jeopardize its tax-exempt status. So Schaefer and a former Arizona colleague now at RC, Gary Munsinger, successfully lobbied Congress for a 1986 law that allowed RC to separate the two functions.
Over the years, RCT has parlayed its initial $35 million “loan”—half of RC's endowment at the time—into some $300 million in working capital, with a current focus on seeding promising high-tech start-ups. In 1999, RCT even created its own scientific charity, the Frederick G. Cottrell Foundation.

Although Schaefer says RC had no choice but to divest its tech-transfer business, others aren't sure. Joan Valentine, a professor of chemistry at UC Los Angeles who this spring was ousted from the RC board, calls it “the biggest thing he did wrong. We were told there was no alternative, but I wonder.” Walters, who also served on RCT's board, says he was “very disappointed that it came to pass.” But he believes the RC board exercised “due diligence” in exploring its options.

Although RC and RCT are legally separate, Schaefer ran both organizations and chaired both corporate boards for several years. For good measure, he also headed the RC board's executive and nominating committees, giving him control over his corporate overseers as well as over the decision-making process. That's far too many hats for one person to wear, says Valentine.

A self-proclaimed political “naif,” Valentine was part of a minority bloc on the board that had grown increasingly concerned about what it saw as Schaefer's “dictatorial” authority and the board's willingness to bend to his will. “The foundation, under its current president, is not interested in debate, dissent, or intellectual inquiry from those with different views,” she wrote in an April 2003 memo to the board. “Given the demands and control of the president, this board has emasculated itself to the point of being irrelevant.”

In particular, Valentine, Wilkening, and Robert Gavin Jr., a physical chemist and former president of Macalester College in St. Paul, Minnesota, who also resigned from the board in 2003, feared that RC was abandoning its historical mission.
Instead of advancing the natural sciences through early-career awards and small grants that foster innovative research and teaching, RC was beginning to adopt what Wilkening called a “project-of-the-month” philosophy that relied upon Schaefer's particular preferences rather than rigorous scientific review. “I kept saying that we needed a long-range plan,” says Gavin. “But all we got from John were ideas, one after another.”

## What went wrong

Three recent actions by Schaefer fueled those fears. The first was his bid in 2001 to run the U.S. government's preeminent network of ground-based optical telescopes, the National Optical Astronomy Observatories (NOAO). After turning around LBT, Schaefer thought the foundation was ready to become a national powerhouse in astronomy. But winning the contract would have imposed enormous strains on an operation tailored to making small academic grants, says Wilkening. “The proposal was put together in a hurry,” she says, “and it always seemed like a stretch.” In May 2002, NSF rejected RC's proposal and retained NOAO's longtime manager, Associated Universities for Research in Astronomy.

The second move was the abrupt dismissal of Doyle, Schaefer's handpicked successor. Doyle, who joined RC in 1997, had been named president in January 2002, although he continued to report to Schaefer rather than to the board. Six months later, however, Schaefer removed Doyle and reclaimed his old post. “It was a big blow and a shock to all of us,” says Walters. “I have the greatest respect for Mike, and I did anticipate that Mike would succeed John.” Doyle accepted a financial settlement from the board and is now chair of the chemistry department at the University of Maryland. Neither he nor Schaefer will discuss the events that precipitated the separation, but some board members say the timing—following the loss of the astronomy proposal—is key.
“I don't think that John ever cared very much about the grants programs; his passion was astronomy,” says Valentine. “So the idea was that Mike would run the shop and John would take over all astronomy activities. Then NSF rejected our bid to run NOAO.” With LBT going smoothly and LSST still only an idea, she speculates, “perhaps John felt RC was all he had.”

The third controversy involves an environmentally friendly project to grow food along the Red Sea, to which RC made $4.6 million in loans in 2000. The Eritrea project, known variously as Seawater Farms and Seaphire, was the brainchild of Carl Hodges, then head of the University of Arizona's Environmental Research Lab. “The idea was to grow shrimp and use the effluent to grow halophytes [salt-loving plants],” Schaefer says. “And by not dumping the spent water back into the Red Sea, you could make it a closed system that would preserve the environment. It was a concept that the world needed,” he says, and very much in line with having RC make a difference.

That's not how it looked to Valentine, who as head of the board's scientific advisory committee would normally have reviewed any project of this scale. “Carl's presentation was an embarrassment,” she says. “It never actually came to a vote, however, because John decided to call it a special investment rather than a grant. That meant it could be handled by the finance committee,” of which she was not a member.

Unfortunately, the investment was a bust. Current board members tend to blame political instability in the region, including the arrest of the government minister who supported the effort and a MiG attack by rebels on one of the project's power plants. “Nobody on the board was a plant biologist or environmental scientist, but the pieces had been shown to work,” says Stuart Crampton, a physicist at Williams College in Williamstown, Massachusetts, and immediate past chair of RC's board of directors. Schaefer says that the project has been “mothballed” and that Hodges's recent request for support for a similar project in Mexico is a matter for his successor.

That would be biologist James Gentile, dean of the Natural Science Division of Hope College, an undergraduate institution in Holland, Michigan. Gentile, who says he expects to inherit Schaefer's titles as president and CEO when he takes the reins, acknowledges that Schaefer's tenure at RC will be a hard act to follow. Schaefer, he says, “casts a long shadow” on the foundation.

Valentine worries that it could turn into a dark cloud. “John is an irresistible force,” she says. “So what will happen, for example, if the LSST comes up short? Will the board be pressured into giving more [than its $10 million pledge]?” More important, she says, “Will John let the new guy run the show?”

12. JOHN SCHAEFER PROFILE

# The Desire to Go Faint, Fast

1. Jeffrey Mervis

Going “wide, fast, and deep” is the best way to explore the universe, according to astrophysicist J. Anthony (Tony) Tyson. For John Schaefer, it's another way for Research Corporation to make a difference.

The object of both men's desires is the Large Synoptic Survey Telescope (LSST), a $200 million instrument that would search for everything from the mysterious dark energy at the edges of the universe to asteroids that could threaten life on Earth. The three-mirrored optical telescope, sometimes called the Dark Matter Telescope, would peer much more rapidly and deeply into a wider swath of the heavens than any existing instrument. It would also deliver vast amounts of data to a global community of users.

Schaefer, the outgoing president and CEO of Research Corporation (see main text), was seduced by the telescope's goal of addressing “one of the most fundamental questions of our time—will the universe collapse or fly apart?” It's the kind of bold venture that Schaefer says RC must pursue to remain relevant as a science charity. RC has already pledged $10 million, part of a $70 million pot that Schaefer has promised to raise from the private sector (lsst.org). In turn, Schaefer's vast network of contacts impressed Tyson, who had collected endorsements for a weak gravitational lensing telescope from three separate panels of the National Academies but nary a dime to design and build it.

The result is a partnership involving RC, the universities of Arizona and Washington, and the National Optical Astronomy Observatory. The LSST Corporation, which Schaefer chairs, is also seeking significant support from the Department of Energy and the National Science Foundation. “Everybody has their assignments,” says Tyson, director of the project, who recently left Bell Labs for the University of California, Davis. “John's role is to know enough about the project to explain it [to] a lay audience and to lead the fundraising effort.” Next month, the University of Washington, one of the partners, will host a scientific workshop on the project.

LSST's three mirrors—an 8.4-m primary mirror, a 3.4-m secondary mirror, and a 5.2-m tertiary mirror—will funnel light onto a 3.2-gigapixel camera in a design that creates a 10-square-degree field of view. That combination results in an optical throughput (the product of the telescope's light-collecting area and sky coverage, called etendue) of 300, some 60 times greater than those of existing telescopes such as the one used by the Sloan Digital Sky Survey. “The LSST will be the world's largest imager,” Tyson says about the 1000-kg camera, which will capture objects as faint as 24th magnitude in 10-second exposures while surveying the entire visible sky three times a month. “Wide, fast, and deep are not words that usually go together in astronomy.”
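The etendue figure quoted above is just the product of collecting area and field of view, which can be checked with a quick back-of-the-envelope calculation. The effective collecting area below is an assumed illustrative value, not a number from the article: an 8.4-m aperture has a geometric area of about 55 square meters, but obscuration by the secondary and tertiary mirrors reduces the light-collecting area substantially.

```python
# Back-of-the-envelope etendue sketch (illustrative values).
# Etendue = effective light-collecting area x field of view (m^2 deg^2).

effective_area_m2 = 30.0    # assumed effective area after obscuration;
                            # the raw 8.4-m aperture would give ~55 m^2
field_of_view_deg2 = 10.0   # the 10-square-degree field quoted above

etendue = effective_area_m2 * field_of_view_deg2
print(etendue)  # 300.0, the figure quoted in the article

# The article's "60 times greater" comparison implies earlier survey
# instruments sit near an etendue of about 5 m^2 deg^2.
print(etendue / 60)  # 5.0
```

The same arithmetic explains why a wide field matters as much as a big mirror: doubling the field of view raises the etendue just as much as doubling the collecting area.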

Tyson says LSST will have an unprecedented ability to detect change, from swiftly moving near-Earth asteroids to dark matter as old as half the age of the universe. Schaefer says RC's initial contribution will allow the LSST Corporation to let a contract to design and construct the mirrors, showing other potential donors and scientists that the project is real. “If you build a mirror and a telescope, they will come,” he quips. If all goes according to plan, the telescope would see first light in 2012.

On a typical night, the LSST will collect 30 terabytes of data on faint, rare, or transient objects. Over a decade, that's 30 petabytes of digital information—a prodigious output that will turn the instrument into a user-friendly facility. “The traditional model in astronomy relies on a committee that reviews requests for viewing time and controls access to the telescope, which can only do one project at a time,” says Harvard astrophysicist and LSST system scientist Christopher Stubbs. “With the LSST, we can do multiple projects at the same time, and the data will be freely available, without any proprietary delays, to anyone who wants it—from the best scientists to a high school student doing a science project.”

That approach will require huge advances in software and distribution technologies, however. And that's not the only obstacle facing Schaefer and Tyson. The LSST must also prove its mettle against another project, already under way, that is designed to tackle many of the same scientific challenges despite having an etendue one-fifth the magnitude of the LSST's. The $60 million project will use a handful of smaller, cheaper telescopes to be built over the next 5 years. “The upside is that there are existing vendors who can build these smaller [1.4-m] telescopes quickly,” says University of Hawaii, Honolulu, astrophysicist Nick Kaiser, PI for the Pan-STARRS project (pan-starrs.ifa.hawaii.edu). “The downside is that you need multiple detectors [cameras] and the software to connect them. But we think we can hold down the detector costs by applying our experience with similar detectors.” The U.S. Air Force, which cares not at all about dark matter but a great deal about the technology for detecting threats to the planet, has already committed $20 million and is expected to foot the entire bill for construction.

Kaiser and his team hope to begin operating a prototype, atop Hawaii's Mount Haleakala, by early 2006. That gives Pan-STARRS quite a jump on the LSST, whose designs are still on the computer. “We think we've got a better approach, and Tony thinks he has a better approach,” says Kaiser. “The community will decide whose approach—one massive instrument or many smaller ones—works better.”

13. ECOLOGY

# Sportfishers on the Hook for Dwindling U.S. Fish Stocks

1. David Grimm

New findings are likely to fuel debate over proposals to bar recreational anglers from some coastal waters

Call it the mystery of the disappearing fish.

Despite decades of tighter restrictions on commercial fishing, the populations of many U.S. fish stocks have continued to decline. The puzzle intrigued marine ecologist Felicia Coleman of Florida State University in Tallahassee nearly a decade ago, when she served on a government panel that helps set regional catch limits. Coleman noticed that recreational fishers were hunting many of the at-risk species the council was trying to protect. While commercial fishers were on the regulatory hook, were sport anglers the ones that got away?

The notion that hobby anglers pose a major threat to marine fish is controversial. Many U.S. sportfishing groups, for instance, have opposed restrictions on their pastime by claiming that the sport accounts for just 2% of overall fish landings—despite estimates that 50 million Americans participate in it. These low-catch claims have been politically persuasive, says Andrew Rosenberg, a marine biologist at the University of New Hampshire in Durham and a former deputy director of the National Marine Fisheries Service. “It's hard to convince people that one guy on a boat could be causing a problem,” he says.

That may be about to change, however, thanks to Coleman. In an extensive analysis of fisheries data published online this week by Science (www.sciencemag.org/cgi/content/abstract/1100397), her research team concludes that sportfishers are having a much bigger impact on marine populations than had been thought—and that they represent the major human threat for some species. Sportfishers are responsible for the vast majority of the landings of some at-risk species, according to the study, and have landed about 5% of the average annual catch over the last 2 decades.

Such numbers highlight the need for new restrictions on sportfishing, say marine conservationists, including barring anglers from new “no-take” reserves in coastal waters. Sportfishing groups, however, say the statistics don't necessarily support that solution. “You don't need to stop people from enjoying the outdoors” to protect fish, says Michael Nussman, president of the American Sportfishing Association (ASA) in Alexandria, Virginia.

To obtain the new numbers, Coleman's group cast a wide net, collecting 22 years' worth of landings data from state and federal agencies. Overall, they found that recreational landings accounted for 4% of the 4 million metric tons of marine finfish brought back from U.S. waters in 2002 (the most recent year for which statistics are available). But sport anglers had a much bigger impact on some species and in some regions. When the researchers focused on several dozen overfished species such as red snapper and red drum, they found that recreational fishers accounted for about one-quarter of the landings. Sport anglers take one-third of the catch of at-risk species in the South Atlantic and two-thirds in the Gulf of Mexico.

The study also questions another bit of conventional wisdom—that sport fishers do less harm to marine ecosystems than commercial fleets. Not so, report the researchers, because sport anglers often hunt top predators, causing ripple effects throughout the ecosystem. “It doesn't matter whose hook is in the water,” Coleman says.

“This is by far the best assembly of landings data” to date, says Ray Hilborn, a fisheries scientist at the University of Washington, Seattle. He says it shows that “the recreational fishing industry is a much bigger problem than it would like to think it is.” Rosenberg predicts that the findings will have political ramifications by bolstering opposition to “freedom to fish” bills that have been introduced in Congress (S. 2244 and H.R. 2890) and in a dozen coastal states. The bills seek to counter growing efforts to establish no-fishing zones by forcing government officials to show that alternative approaches won't help threatened species.

Recreational fishers, meanwhile, counter that the landings data underpinning the study are notoriously unreliable. And even if the numbers are accurate, they argue that no-take zones should be a last resort. “We have a good track record of conservation,” says ASA's Nussman, noting that traditional restrictions—such as catch limits and seasonal closures—have helped restore some threatened populations, such as striped bass along the Atlantic coast. “We'll do what we need to do to fix the problem.”

Marine researchers, however, aren't convinced that traditional approaches will be enough to protect dwindling stocks. Even bag limits, Coleman notes, only restrict the number of fish that can be caught by an individual fisher, not the total number caught by all sport anglers. “Right now, it's open access for recreational fishers,” she says. “We need to fix that.”

Commercial fishers, meanwhile, are happy to be out of the spotlight. Studies like Coleman's support what commercial captains have been saying for years, says Robert Jones, executive director of the Southeastern Fisheries Association in Tallahassee, Florida: “We're not the only ones causing the problem.” Still, Jones is skeptical that the new data will produce policy change. “The recreational fishing industry has very strong political connections,” he says.

The strength of those connections will be tested early next year. That's when several state legislatures are expected to consider freedom-to-fish proposals. The next Congress also plans to resume work on a major overhaul of federal fisheries regulation.