# News this Week

Science  26 Feb 2010:
Vol. 327, Issue 5969, pp. 1066
1. Drug Safety

# New Network to Track Drugs and Vaccines in Pregnancy

1. Jennifer Couzin-Frankel

Pregnant women worry about—and avoid—exposure to virtually everything that might be risky, from tap water to soft cheeses. Many also jettison drugs they need, for fear of harming their baby.

Unfortunately, existing data are fuzzy about the dangers of using—or going without—key medications. “We can tell you what happens in a rat or a rabbit,” says Christina Chambers, an epidemiologist at the University of California, San Diego. But a pregnant woman? For most medications, “we are flying by the seat of our pants,” says Chambers, a situation she calls “appalling and frustrating.” As a result, both doctors and patients are jittery about whether to continue or drop potentially risky treatments during pregnancy.

A new effort to bring risks into focus is being launched this week with $12.5 million from two U.S. agencies. It will start by examining asthma medications called short-acting beta agonists, as well as flu vaccines and antivirals for influenza. Called VAMPSS (the Vaccines and Medications in Pregnancy Surveillance System), the program will be funded for 5 years by the Agency for Healthcare Research and Quality and for 2 years by the Biomedical Advanced Research and Development Authority, and will be coordinated by the American Academy of Allergy, Asthma, and Immunology. An advisory committee that includes members from pediatric and obstetric groups and the Centers for Disease Control and Prevention will guide VAMPSS's research.

This push for data began 8 years ago. Chambers and two of her colleagues—asthma specialist Michael Schatz of the Kaiser Permanente Medical Center in San Diego, California, and Allen Mitchell, who directs the Slone Epidemiology Center at Boston University—had spent years researching the issue. But their studies were hampered by too few volunteers and potentially imprecise data from mothers asked to remember every pill they'd taken.

The new program aims to get more robust results by bringing together two long-standing efforts. The first, led by Mitchell, has collected information over the years on 37,000 babies, most of them with congenital malformations, and their mothers. Mitchell plans to recruit at least 2000 more babies in each of the next 2 years for VAMPSS.

Chambers, meanwhile, is one of the leaders of the Organization of Teratology Information Specialists (OTIS). It counsels between 70,000 and 100,000 pregnant women and health-care providers each year in the United States and Canada about drug and other exposures in pregnancy and lactation. It also invites some callers to enroll in research studies in which they and their babies are followed over time.
OTIS will recruit thousands of these women for the VAMPSS studies on asthma and flu treatments and flu vaccines. OTIS's approach improves the quality of the data because it works with women before their babies are born, but its cohorts are often too small to link a specific medication with a specific birth defect. On the flip side, the project headed by Mitchell has the statistical power to focus on one birth defect at a time, but it relies on mothers to recall exposures during pregnancy. Because the two studies will run in sync on the same treatment or vaccine, there's “no question” that VAMPSS will be superior to existing efforts and far more systematic, says Gideon Koren, who directs the Motherisk Program at the Hospital for Sick Children in Toronto, Canada, which is part of OTIS's North American network.

The government support helps fill a serious gap. “It's not a secret that most drug companies … don't want anything to do with pregnancy,” says Koren. Drug companies so far have declined to help fund VAMPSS. To survive long-term and branch out to other drugs and vaccines, as its leaders hope it will, the program needs industry money.

VAMPSS is coming together now partly because of the H1N1 flu. H1N1 was “a situation that seemed to be uniquely affecting pregnant women” who were at high risk for complications if they contracted it, says Schatz, a past president of the allergy academy. Meanwhile, the U.S. Food and Drug Administration (FDA) is asking companies to focus more on drug safety in pregnancy after a drug is approved. In December, FDA announced it was setting up the Medication Exposure in Pregnancy Risk Evaluation Program, which relies on insurance company databases to look for signals.

VAMPSS is focused as much on demonstrating safety as on finding hazards. “In some ways there's more benefit” to showing safety than risk, says Chambers, because women and their babies can be harmed by a poorly controlled disease.
Studies of pregnant women with asthma have found that those who have asthma attacks are more likely to give birth to babies with low birth weight and, in one study, with birth defects. But just how reassuring can any study be? “It's been really difficult” to prove that drugs or vaccines are safe in pregnancy, says Allison McGeer, an infectious disease specialist at Mount Sinai Hospital in Toronto, who is studying flu vaccines in pregnancy. Although McGeer believes flu vaccines are safe, she hesitates to prescribe antiviral drugs to pregnant women who are mildly ill or as a preventive treatment. “Those of us who don't deal routinely with pregnant women are very afraid to do anything,” she says.

One area not addressed by VAMPSS and most other studies is whether medications taken during pregnancy can cause effects in children years later, such as learning difficulties in school. “We need to focus more on the long-term effects,” says Lars Pedersen, an epidemiologist and obstetrician at Aarhus University in Denmark, who has studied antidepressants and other drugs in pregnancy. But that is not easy to do. It's not so much that “drugs are out there causing problems,” says Schatz, although some probably are. The bigger challenge, he believes, is the uncertainty: Which drugs are dangerous to a fetus, and which are not?

2. Physics

# Century-Long Debate Over Momentum of Light Resolved?

1. Adrian Cho

What is the formula for the momentum of light zipping through a transparent material? That may sound like a question on a high-school physics quiz, but physicists have been debating the matter ever since two different formulas were proposed more than 100 years ago. Now Stephen Barnett, a theorist at the University of Strathclyde in Glasgow, U.K., says he has resolved the famed “Abraham-Minkowski dilemma.” Both formulas are correct, he says, but they denote different things and apply in different contexts.
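For readers who want the rival formulas side by side, here is a minimal numerical sketch (the wavelength and refractive index below are illustrative values of ours, not from the article): the vacuum momentum of a photon is Planck's constant divided by the wavelength, Minkowski's formula multiplies that by the index of refraction n, and Abraham's divides by it.

```python
# Photon momentum in vacuum versus the two historical formulas for a medium.
# Illustrative values: green light in ordinary window glass (n ~ 1.5).
h = 6.62607015e-34        # Planck's constant, J*s
wavelength = 500e-9       # wavelength in vacuum, m
n = 1.5                   # index of refraction of the medium

p_vacuum = h / wavelength             # momentum in empty space
p_minkowski = n * h / wavelength      # Minkowski (1908): n times the vacuum value
p_abraham = h / (n * wavelength)      # Abraham (1909): vacuum value divided by n

print(f"vacuum:    {p_vacuum:.3e} kg m/s")
print(f"Minkowski: {p_minkowski:.3e} kg m/s")
print(f"Abraham:   {p_abraham:.3e} kg m/s")
# The two formulas bracket the vacuum momentum:
# Abraham's is smaller by a factor n, Minkowski's larger by the same factor.
```

In Barnett's resolution, described below, the smaller (Abraham) value is the mechanical "kinetic momentum" and the larger (Minkowski) value is the wave-related "canonical momentum."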
Others had suggested that each formula might be correct in its own way, but Barnett spells out precisely when each is relevant, says Robert Boyd of the University of Rochester in New York state. “Steve tells you how to apply them correctly,” Boyd says. “I think [the work] has a good chance of being definitive.”

Everyone agrees that the momentum of a photon zinging through empty space is given by a fundamental constant divided by the light's wavelength. When the light enters a medium such as glass or a gas, however, it slows down, which is why a lens bends light. What then happens to the light's momentum? Key to this question is the material's “index of refraction,” the ratio of light's speed in a vacuum to its speed in the material, a number typically larger than one. In 1908, German mathematician Hermann Minkowski argued that the momentum of light in a material equals its momentum in the vacuum multiplied by the index of refraction, making it greater than the vacuum momentum. A year later, his compatriot, physicist Max Abraham, argued that the momentum of light in a material equals the vacuum momentum divided by the index, making it smaller than the vacuum momentum.

Thought experiments and real-world data can be found to support each formula. For example, imagine a photon speeding toward a block of glass (see diagram). Together, the glass and the photon possess a total mass and energy that flows in the same direction as the photon. According to Newton's laws of motion, that flow should continue unabated as the photon passes through the glass. But within the glass, the photon slows down. So to maintain constant energy flow, the glass has to recoil in the same direction. From this premise, a little algebra leads to Abraham's formula for the photon's momentum in the glass.

On the other hand, imagine firing a photon at an atom in a gas. Suppose the atom can absorb light of a wavelength slightly longer than that of the approaching photon.
Then to soak up the photon, the atom must speed away from the light source so that from its perspective the light wavelength stretches—just as a siren's pitch dips if you're in a car rushing away from the siren. The size of that “Doppler shift” is proportional to the gas's index of refraction. Starting from that premise, a little math yields Minkowski's formula.

Actually, Barnett argues in the 19 February issue of Physical Review Letters, the two cases describe different kinds of momentum. Abraham's formula gives the “kinetic momentum”—essentially the mechanical punch the photon packs as it hits the glass. Any experiment to measure such a punch will agree with Abraham's formula. Minkowski's formula gives the subtler “canonical momentum”—which, loosely speaking, is tied to the wave nature of light and is higher in a material than in vacuum because the light's wavelength is shorter in the material. Any experiment to probe wave effects will jibe with Minkowski's formula.

More technically, the canonical momentum is a mathematical quantity connected to movements in space. A theorist can write down a quantum “wave function” describing an atom sitting in an electromagnetic field. To move the atom to another spot, the theorist must change the wave function by performing a specific mathematical operation that involves the canonical momentum. That's why in the thought experiment with the moving atom, it's the canonical momentum that counts.

Given the debate's long history, few expect the work to win immediate acceptance. “Various people have taken rather strong views, you might say verging on religious beliefs,” says Paul Lett, a physicist at the U.S. National Institute of Standards and Technology in Gaithersburg, Maryland. Barnett says he's game to take on the naysayers, however: “If somebody exposes some flaw, then I suppose I shall have to—Oh, they won't!”

3. Psychiatry

# Experts Map the Terrain of Mood Disorders

1. Constance Holden

There's been a lot of debate over efforts to revise psychiatry's Diagnostic and Statistical Manual of Mental Disorders (DSM), and one of the issues generating extensive discussion is the connection between depression and anxiety. Anxiety isn't on the list of symptoms for major depression, but “most cases of depression are anxious depression,” notes David Goldberg of London's Institute of Psychiatry. This is part of a broader conversation about how the American Psychiatric Association's (APA's) teams assembling a new DSM-V edition can “deconstruct” psychiatric illnesses, recognizing that few exist in their pure form; rather, comorbidity and cross-cutting features are the norm (Science, 12 February, p. 770).

Depression is a prime example. It can coexist with practically any other psychiatric condition. And when it's the primary complaint, many other factors can shape its course. “There are a lot of [comorbid] symptoms that categorical diagnoses don't reflect … that really affect outcome,” says psychiatrist Jan Fawcett of the University of New Mexico School of Medicine in Albuquerque, chair of the DSM-V work group on mood disorders. Substance abuse and anxiety are two of the most important. Indeed, DSM-V authors are debating whether the relationship between anxiety and depression is so close that they should be subsumed into a supercategory of human hopelessness, fear, and existential angst.

DSM-IV, currently in use, gives a menu of nine symptoms for “major depression,” a diagnosis that afflicts about 17% of the population at some point in life, according to the U.S. National Comorbidity Survey. (Bipolar illness—depression alternating with mania—affects another 1%.) Missing from the list is anxiety; yet, says Fawcett, anxious depressives are at greater risk for suicide, and there's a “staggering” fivefold difference in response to antidepressants, with nonanxious depressives doing much better.
Fawcett's group is therefore recommending that “mixed anxiety depression,” a condition that has been residing in the Appendix of DSM-IV, be promoted to a freestanding diagnosis. But giving anxiety a higher status within depression raises other categorical questions. The “anxiety disorders” are currently a separate category that includes generalized anxiety disorder, phobias, and panic, as well as obsessive-compulsive disorders and post-traumatic stress disorder. The symptoms defining the latter two are far more varied than those associated with depression and anxiety.

The mood work group spent a lot of time agonizing over the relationship between anxiety and depression at a conference on the subject at the Institute of Psychiatry in 2007. Much new data have revealed a close relationship between the two, but some of the findings conflict. Both family studies and whole-genome surveys show that the two disorders share some of the same genes. On the other hand, imaging and neurochemistry data—including drug responses—suggest important differences. The anxious brain “doesn't look the same” as the depressed one, says APA President Alan Schatzberg, a psychiatrist at Stanford University in Palo Alto, California. Twin studies have shown “a common, underlying genetic vulnerability,” he says, but environmental factors seem to determine whether a twin becomes anxious or depressed.

At another border area of depression's bleak realm, major depression is sometimes hard to distinguish from bipolar illness, in which depression alternates with mania—characterized by grandiosity, hyperactivity, racing thoughts, and wild schemes. “There's a raft of reports of major depressives with [only] one or two manic symptoms,” says Fawcett. Because such individuals are at risk of full-blown bipolar illness—which is often not diagnosed for years—and also because some antidepressants can trigger a manic episode in the vulnerable, his group decided to slightly lower the bar for a bipolar diagnosis.
Depression also has a murky border with psychosis, which involves delusions (often paranoid) and hallucinations. Bipolar illness and schizophrenia have traditionally been thought of as separate disorders. But genetic and brain-imaging studies over the past decade have undermined this separation, says APA psychiatrist Darrell Regier, co-chair of the DSM-V effort. Both bipolar and schizophrenic patients exhibit common biological abnormalities, including anomalous brain waves and eye movements, notes Goldberg. And in bipolar I, the more severe form, psychosis is not unusual. Genetic studies have shown that vulnerability to psychosis appears as a “common denominator” in some families with both schizophrenia and bipolar illness, Schatzberg says. Some psychiatrists see the two as ends of a continuum, with “schizoaffective disorder”—a utilitarian diagnosis for people showing symptoms of depression and schizophrenia but not qualifying for either diagnosis—somewhere in the middle.

All involved with the DSM-V process emphasize that no one yet knows what the final product will look like. On the Web site where proposed revisions are posted (www.dsm5.org), diagnoses are currently grouped for convenience under DSM-IV categories, Regier says. But, he adds, “we are very seriously considering a reorganization” in which anxiety and depression are combined under a supercategory of “internalizing disorders.” Still, says Regier, it's all very much a work in progress. The final organization will not be determined until changes have been tested in field trials.

4. Psychiatry

# Suicide Scale

1. Constance Holden

Because severity of depression is not a good indicator of whether a patient is suicidal, the mood group is proposing two “suicide assessment scales,” one for youths and one for adults, based on a review of the literature on completed suicides, to help in the process of diagnosing any mental disorder.
Proposed risk factors include history of suicide attempt, living alone, “angry impulsivity,” drug or alcohol abuse, chronic pain, and a suicide plan.

5. STEM Education

# DOE Reworks Student Initiative to Prepare Energy Researchers

1. Jeffrey Mervis

Having learned that bigger isn't always better, the U.S. Department of Energy (DOE) has downsized a proposal to train more scientists and engineers to work in a low-carbon economy. Last year, Energy Secretary Steven Chu won President Barack Obama's backing for a $115 million education initiative dubbed RE-ENERGYSE (REgaining our ENERGY Science and Engineering Edge). It was designed to support activities along the entire training spectrum, from postdoctoral fellowships to hands-on learning for elementary students, with a special focus on preparing technicians for jobs in the burgeoning clean-energy sector.

Federal lawmakers weren't very impressed (Science, 10 July 2009, p. 130). They worried that RE-ENERGYSE overlapped with other federal activities to improve science, technology, engineering, and mathematics (STEM) education, notably those at the National Science Foundation (NSF) and the Department of Education, as well as at the Workforce Development for Teachers and Scientists program within DOE's Office of Science. In the end, Congress did not give DOE any money for RE-ENERGYSE in its 2010 budget.

But this month, the president's 2011 budget request contains a slimmed-down RE-ENERGYSE, with a price tag of $50 million—$35 million for higher education and $15 million for technical training and precollege outreach. The need for more skilled workers hasn't gone away, says Kristina Johnson, under secretary for energy, who helped to craft the program and would oversee it. But DOE has learned its lesson, she says: “We've scaled back. We're going to start small, and we'll try to build it up once we see what's working.”

For example, Johnson says that DOE's workforce development program, overseen by Science Under Secretary Steven Koonin, is “taking the lead” on graduate research fellowships, with a request for 170 new fellows in 2011 to join a class of 160 funded this year. In comparison, the new RE-ENERGYSE would support only 60 fellows in 2011, instead of the 150 to 200 requested last year. However, RE-ENERGYSE's fellows would be encouraged to work on clean-energy topics, while the Office of Science fellows would be spread across most basic-science disciplines.

Johnson has also trimmed plans for another key element in the initiative: the creation of master's degree programs in interdisciplinary energy studies. DOE is requesting money for two rather than four such collaborations across departmental lines. The idea is modeled on an engineering management program that Johnson expanded while dean at Duke University in Durham, North Carolina.

RE-ENERGYSE's activities at the precollege level “haven't been defined yet,” Johnson admits. But there, too, the goal is to produce more clean-energy scientists and engineers. “This year, about 6% of entering college students say they are interested in engineering, and about 4.4% of them graduate with an engineering degree,” she explains. “It's pretty hard to compete globally when you're relying on such a small segment of the population to provide the innovations we need.
We know from studies NSF has done that we lose kids in grades three to five. And if they don't maintain an interest in science and math, they won't take enough courses to be able to become an engineer.”

A congressional aide on the spending panel that controls DOE's budget says that members were taken aback last year by the size of the initiative as well as its scope. “We're still concerned about whether RE-ENERGYSE complements or overlaps with existing STEM education programs,” says the aide. But “it sounds like they heard us.”

6. ScienceInsider

# From the Science Policy Blog

The U.S. National Institutes of Health (NIH) has widened its definition of what constitutes a human embryonic stem cell to keep up with what's happening in the lab. The new definition would include lines derived from embryos before the blastocyst stage.

The U.S. government has formally closed its investigation of Bruce Ivins. A 92-page report summarizes why it believes the Army researcher who killed himself in July 2008 was the perpetrator of the 2001 anthrax letter attacks.

A veteran science policymaker is the new head of the European Research Council's scientific council. Helga Nowotny is replacing founding president Fotis Kafatos.

This week's launch of an instrument to measure the extent and thickness of polar ice has been delayed after the European Space Agency became concerned about the Russian rocket that will place the CryoSat-2 spacecraft into its highly inclined orbit.

New York state officials have accused a former researcher from the University at Buffalo of hiring three actors to give false testimony that helped clear him of research misconduct charges in a 2004 case. William Fals-Stewart, an expert on addiction, was arrested and charged with attempted grand larceny, perjury, identity theft, and other crimes.

The man in charge of the recent Copenhagen climate conference, Yvo de Boer, has resigned his United Nations post and will join the consulting giant KPMG to work on climate and sustainability issues.
Raynard Kington, who ran NIH during the transition from Elias Zerhouni to Francis Collins, is leaving as deputy NIH director to become president of Grinnell College in Iowa.

For the full postings and more, go to blogs.sciencemag.org/scienceinsider.

7. Meeting Briefs

# Scientists Grapple With 'Completely Out of Hand' Attacks on Climate Science

1. Eli Kintisch

At the annual meeting of the American Association for the Advancement of Science (the publisher of Science), a symposium organized at the last minute by two of the world's most prominent scientific organizations addressed recent attacks on an increasingly beleaguered climate science community. The panel met in the uncertain aftermath of the stolen e-mails affair and critiques of the Intergovernmental Panel on Climate Change (IPCC) (Science, 12 February, p. 768).

The symposium was convened by U.S. National Academy of Sciences President Ralph Cicerone, in conjunction with AAAS, at a time when flaws in the latest IPCC report, and even the legitimacy of climate science, have made headlines. E-mails uncovered late last year revealed instances of scientists on the panel discussing withholding data and documents from those with opposing views, conspiring to keep contradictory papers out of influential reports, and encouraging colleagues to delete e-mails.

Despite a drumbeat of studies that corroborate the conclusion that the planet is warming and human activities are largely responsible, these recent skirmishes “have really shaken the confidence of the public in the conduct of science [overall],” said Cicerone, citing a number of recent polls on the public perception of science. “The situation is completely out of hand,” said climate scientist Gerald North of Texas A&M University in College Station, who has served as an IPCC reviewer.
“One guy e-mailed me to say I'm a ‘whore for the global warming crowd.’” His PowerPoint presentation at the meeting included a slide quoting conservative talk show host Glenn Beck, who suggested that scientists commit “hara-kiri” to atone. “Scientists cannot use the same tone and rhetorical style as commentators and bloggers,” North said.

Although much of the session at the meeting, titled “Ensuring the Transparency and Integrity of Scientific Research,” focused on what Harvard University oceanographer and former AAAS head James McCarthy called the “abominable” press coverage, scientists owned up to their share of the blame. Small errors in the 2007 report were “careless,” said McCarthy, but IPCC should have done a full and public examination to describe how they had come about: “The names of the authors, who was on the review, what happened—it all should have been up there, and it wasn't done. And I think that the institution was hurt as a result,” he said.

The community allowed “the situation to get out of control,” said Sheila Jasanoff of Harvard University. She said that, in general, scientists had to connect better with the public. “There is a kind of arrogance—we are scientists and we know best,” Jasanoff said. “That needs to change.”

8. Meeting Briefs

# The Latest on Geoengineering

1. Eli Kintisch

Preliminary findings presented here suggest that some proposed techniques to cool the planet deliberately may face fewer engineering obstacles than previously thought, though many technical and societal barriers remain. Even before they got to the sessions, the scientists had to contend with a smattering of activists with drums, cameras, and a megaphone alleging that the government is already performing geoengineering through the spraying of particles, in so-called chemtrails.

Physicist David Keith of the University of Calgary in Canada addressed the concept of spreading aerosol droplets in the stratosphere, where they could block a small fraction of the sun's rays.
A paper published last year in Environmental Research Letters suggested that the leading proposal, spraying sulfur dioxide gas, wouldn't work. Sulfur dioxide is converted in the atmosphere into droplets of sulfuric acid, which would clump and fall out of the sky before they could have much cooling effect. To get around this problem, Keith and colleagues have proposed using airplanes to spray droplets of the acid itself, rather than sulfur dioxide. In unpublished data, the team found that injecting only “a few megatons per year” of sulfuric acid could be more than twice as effective at blocking radiation as starting with sulfur dioxide.

While scientists are finding ways to overcome the engineering challenges, the environmental effects of planet-hacking techniques remain uncertain. One challenge in geoengineering a warmed planet is restoring temperatures while minimizing disruption of precipitation. (Stratospheric particles lower the total amount of energy striking Earth, the driver of precipitation.) In previous modeling efforts, adding sun-blocking particles uniformly across the globe has tended to undercool the poles while overcooling the equator.

So Kenneth Caldeira, a climate scientist at the Carnegie Institution for Science in Stanford, California, modeled various approaches to try to counteract a severe warming—the result of a doubling of preindustrial CO2 concentration. In work yet to be published, he distributed the particles unevenly to try to minimize those effects, for example, by putting more at the poles than at the equator. (Global warming is greatest in the Arctic.) In models, that strategy helped fix the undercooling/overcooling problem, but it worsened the effects on precipitation. “There's a complex problem of how do you balance the damage that you do against the benefit,” said Caldeira.
That said, simulating either geoengineering approach to counteract global warming—distributing particles globally or focusing on the poles—suggests a cooler world with less disruption of rain patterns than one in which warming continues unabated. “In a high-global-warming world, more people would be better off with geoengineering, but some people would be worse off,” he said.

9. Meeting Briefs

# Is a Dolphin a Person?

1. David Grimm*

Are dolphins as smart as people? And if so, shouldn't we be treating them a bit better? Those were the questions scientists and philosophers debated at a session here on Sunday.

Dolphins, it turns out, are pretty darn smart. Panelist Lori Marino, an expert on cetacean neuroanatomy at Emory University in Atlanta, said they may be Earth's second smartest creature, after humans, of course. Bottlenose dolphins have bigger brains than humans (1600 grams versus 1300 grams), and they have a brain-to-body-weight ratio greater than that of great apes (but smaller than that of humans), said Marino. “They are the second most encephalized beings on the planet.”

But it's not just size that matters. Dolphins also have a very complex neocortex, the part of the brain responsible for problem solving, self-awareness, and various other traits we associate with human intelligence. And researchers have found that dolphin brains contain spindle cells called von Economo neurons, which in humans and apes have been linked to emotions, social cognition, and even theory of mind: the ability to sense what others are thinking. Overall, said Marino, “dolphin brains stack up quite well to human brains.”

What dolphins do with their brains is also impressive. Cognitive psychologist Diana Reiss of Hunter College of the City University of New York has been working with dolphins in aquariums for most of her career, and she said their social intelligence rivals that of the great apes. Dolphins can recognize themselves in a mirror, a sign of self-awareness.
They can understand complex gesture “sentences” from humans. And they can learn to poke an underwater keyboard to request toys. “Much of their learning is similar to what we see with young children,” said Reiss.

So if dolphins are so similar to people, shouldn't we be treating them more like people? “The very traits that make dolphins interesting to study,” said Marino, “make confining them in captivity unethical.” She noted, for example, that, in the wild, dolphins have a home range of about 100 square kilometers. In captivity, they roam one 10-thousandth of one percent of this area. Far worse, Reiss said, is the massive dolphin culling ongoing in some parts of the world, which she documented with a graphic video of dolphins being drowned and stabbed in places like the Japanese town of Taiji.

Thomas White, a philosopher at Loyola Marymount University in Redondo Beach, California, suggested that dolphins aren't merely like people—they may actually be people, or at least, “nonhuman persons.” Defining exactly what it means to be a person is difficult, White said, but dolphins seem to fit the checklist many philosophers agree on. There are the obvious ones: They're alive, aware of their environment, and have emotions; but they also seem to have personalities, exhibit self-control, and treat others appropriately, even ethically. When it comes to what defines a person, said White, “dolphins fit the bill.”

Still, experts caution that the scientific case for dolphin intelligence is based on relatively little data. “It's a pretty story, but it's very speculative,” says Jacopo Annese, a neuroanatomist at the University of California, San Diego. Despite a long history of research, scientists still don't agree on the roots of intelligence in the human brain, he says. “We don't know, even in humans, the relationship between brain structure and function, let alone intelligence.” And, Annese says, far less is known about dolphins.

* With reporting by Greg Miller.

10. Meeting Briefs

# More Highlights From AAAS 2010

Science reporters posted more than two dozen blog entries and podcasts from the meeting. Here is a sample. For full coverage, see www.sciencenow.org. And to see what our guest bloggers had to say, see news.sciencemag.org/sciencebloggers.

## A Sexy Treatment for Traumatic Brain Injury

The hormone progesterone is best known for its work in the female reproductive system, where it plays various roles in supporting pregnancy. But starting next month, it will be the focus of a phase III clinical trial for traumatic brain injury. Researchers hope an infusion of progesterone given within a few hours of a car accident or other trauma will help prevent brain damage.

## The Mathematics of Clumpy Crime

Even in a sprawling city like Los Angeles, California, crimes still clump together. Mathematical models presented at the meeting show that such crime hot spots form when previous crimes attract more criminals to a neighborhood. By understanding how these blobs form, researchers hope to help police break them up.

## Are 'Test Tube Babies' Healthy?

When Louise Brown was born on 25 July 1978, she kicked off an era. The first “test tube baby” is a mother herself now, and she's been joined by millions of others born with the help of in vitro fertilization (IVF). But are babies born via IVF the same as those born naturally? Researchers have discovered some subtle genetic differences.

## Drive Green, Make Money

Widespread adoption of plug-in electric vehicles could dramatically cut greenhouse gas pollution and reduce U.S. dependence on foreign oil. And results of a new electric-car pilot project provide added incentive to go electric: Car owners could return unused electricity to the grid and make real money doing so.

## Science Is Kryptonite for Superheroes

Hollywood has a message for scientists: If you want something that's 100% accurate, go watch a documentary.
A panel of screenwriters for superhero-driven movies and TV shows like Watchmen and Heroes said that their job is to get the characters right, not the science.

11. Medicine

# Cancer's Circulation Problem

1. Jocelyn Kaiser

Researchers are counting and examining the rare cells shed by a primary tumor that circulate in the blood, but will these studies help patients?

It gives surgeon Stefanie Jeffrey great satisfaction that most of the women whose breast tumors she removes will go on to lead healthy lives, their cancer gone for good. But in about 25% of her patients, the cancer will reappear, having spread to their bones or other organs. Yet physicians have no easy way to know whether a woman's breast tumor has metastasized or whether treatments are keeping a cancer in check. Often they and their patient find out only when a new tumor is large enough to cause symptoms or show up with imaging technologies. By then, “it's often too late” for a cure, says Jeffrey. “It's very frustrating not knowing which patients will develop metastases or how best to treat those who do.”

A solution to Jeffrey's frustration may come at her lab bench at Stanford University in Palo Alto, California, rather than in her operating room. She and other researchers are exploring a new window for tracking the spread of cancer: the vanishingly few tumor cells that circulate in a patient's blood.

It's been recognized for decades that cancers metastasize because primary tumors shed cells into the blood, which carries them to other organs where they seed new tumors. It's these metastases, not the primary tumors, that cause 90% of cancer deaths. Only in the past 10 years, however, have researchers figured out how to efficiently capture circulating tumor cells (CTCs) from a blood sample. Clinical researchers are now counting CTCs every few weeks in patients with several types of metastatic cancer, a crude but potentially useful measure for gauging whether a treatment is working.
Researchers have also begun to analyze CTCs for certain gene variants or proteins that indicate a patient's tumor is susceptible to a particular drug. This kind of “liquid biopsy,” which may allow physicians to follow cancer changes over time and tailor treatment, has spurred at least two dozen academic groups and companies to come up with new CTC detection devices. The growing ability to detect and analyze CTCs could also galvanize the development of drugs designed to block metastasis, says Joan Massagué, a metastasis researcher at Memorial Sloan-Kettering Cancer Center in New York City. Compared with waiting for secondary tumors to appear, monitoring CTC counts may give companies a shortcut for measuring whether an antimetastasis drug works.

At the same time, CTC research faces hurdles. Any new CTC detection technology is considered a disease-monitoring device by U.S. regulators and must be validated in clinical trials—a slow, costly process. And because CTCs are so rare and hard to capture—there may be as few as one cancer cell in a billion blood cells—most separation strategies are thought to miss some of the cancer cells. Moreover, researchers don't yet have a good handle on whether the cells they're collecting from people's blood are the ones that can seed new tumors. Further analysis of CTCs could answer that question and confirm that the picture of metastasis developed over the past decade in animal studies is the same in people. “In some way, the big missing piece has been access to these cells,” says Daniel Haber, director of the Massachusetts General Hospital (MGH) Cancer Center in Boston.

## Rare, but important

The first report of blood-borne cells shed by a solid tumor came in 1869 from an Australian physician named Thomas Ashworth. Using a microscope, he examined blood from a patient who died of metastatic cancer, spotting cells that looked identical to the cells in the patient's tumors.
Such cells “may tend to throw some light upon the mode of origin of multiple tumours existing in the same person,” Ashworth presciently wrote.

But only in the 1990s did clinicians fully realize the potential value of CTCs. The inspiration came in part from work on rare primary tumor cells found lodged in a cancer patient's bone marrow long before metastasis was evident. Studies in Europe suggested that patients with these so-called disseminated tumor cells in their bone marrow had a poorer prognosis. Frequent bone marrow biopsies aren't practical, however, so researchers began to look to CTCs in blood samples, says Klaus Pantel, a medical oncologist at the University Medical Center Hamburg-Eppendorf in Germany.

Also motivating the interest in CTCs has been the recent development of molecularly targeted cancer therapies that work best on patients whose tumors have a particular mutation. This spurred a push to develop devices to efficiently capture and analyze CTCs, which could potentially serve as a surrogate for the tumor itself, Pantel says.

One of the first widespread CTC detectors, called CellSearch, works for cancers that arise in the epithelial tissue lining organs such as the breast and colon. The device traps CTCs using magnetic beads coated with an antibody that sticks to a protein called epithelial cell adhesion molecule (EpCAM), which is found on the tumor cells but not on blood cells. But before technicians can use a microscope to count the CTCs, they must stain the trapped cells with other antibodies to distinguish the tumor-derived ones from white blood cells that linger as contamination.

CellSearch demonstrated its potential in 2004, when a study of 177 breast cancer patients showed that women with at least five CTCs per 7.5 milliliters of blood had a poorer prognosis than those with fewer or no CTCs. Based on these results, that year the U.S. Food and Drug Administration approved CellSearch as a device for monitoring the progression of metastatic breast cancer. Manufacturer Veridex later won approval for using CellSearch to monitor metastatic colon and prostate cancers. One CellSearch test on a blood sample costs about $600, far less than the $1500 or more for a PET or CT scan that may spot only sizable tumors, notes Massimo Cristofanilli of the Fox Chase Cancer Center in Philadelphia, Pennsylvania, who led the breast cancer trial. CellSearch “will be extremely useful for oncologists,” he says.

Just how useful remains uncertain. When the American Society of Clinical Oncology issued its most recent guidelines for treating breast cancer in 2007, it cautioned against making treatment decisions based on CTCs just yet. To confirm the utility of CTC monitoring, experts want a prospective study to show that switching treatments when a patient's CTC levels rise, rather than waiting until an imaging scan shows progression, extends lives. A randomized trial now enrolling 500 U.S. women with metastatic breast cancer aims to provide this evidence within the next few years.

## More than a number

Counting cells barely scratches the surface of how oncologists want to use CTCs. “My problem with counting is, say the cell count is up and the drug isn't working. What drug do you switch to?” says Jeffrey. She and others want to analyze CTCs for molecular changes in a person's cancer that would point toward particular treatments. For example, only women with breast cancer whose tumor cells express the receptor HER2 respond to the drug Herceptin. Researchers reported 6 years ago that some women whose primary tumors were HER2-negative later had CTCs that were positive for HER2, suggesting that their cancer had mutated. Monitoring CTCs might therefore identify women who were initially ineligible for Herceptin but who would later qualify for the drug.
To detect such molecular changes routinely, researchers say they need to improve on CellSearch, which often finds only a few, if any, CTCs in a cancer patient and yields impure cells that can't be analyzed in much depth. One newer device is a silicon chip developed by Haber and MGH biomedical engineer Mehmet Toner. Called the CTC Chip, it has 78,000 microscopic posts coated with EpCAM antibodies that let blood cells pass by but trap live tumor cells; as with CellSearch, these cells are then dyed with markers and detected with a microscope. In a 2007 Nature paper on the CTC Chip, the MGH team reported that 67 of 68 mostly metastatic cancer patients had CTCs, while controls had none.

The MGH group later used the chip to capture CTCs from lung cancer patients for genetic analysis. In a 2008 New England Journal of Medicine article, the group reported extracting DNA from these CTCs and detecting key mutations in the gene for a cell surface protein called EGFR—including a genetic change indicating that some patients' cancer had become resistant to the potent drug they were receiving. Haber's team has a $15 million, 3-year grant from Stand Up To Cancer, a U.S. telethon to raise money for cancer research, to further improve the test—it's slow, requiring about 6 hours per sample—and incorporate it into clinical trials at several other cancer centers.

Devices like the CTC Chip and CellSearch that use EpCAM to fish out cancer cells in blood have a potentially significant drawback: They miss CTCs that lack EpCAM, possibly because the cells have gone through the so-called epithelial-mesenchymal transition (EMT), a process in which some tumor cells become less sticky as they're breaking free and entering the blood. Nor can EpCAM methods detect non-epithelial cancers, such as sarcomas.

To combat these problems, some labs are using cocktails of antibodies to try to pick up more CTCs. Others are experimenting with so-called negative filtration, which uses antibodies to remove blood cells from a sample and leave behind tumor cells. Still others are working on CTC trapping methods that don't rely on antibodies. A size-based membrane filter—CTCs tend to be larger and less dense than blood cells—developed by a team led by pathologist Richard Cote of the University of Miami in Florida wins the record for speed: It's essentially a big syringe, and in just 90 seconds it squeezes a blood sample through a filter, leaving behind tumor cells. Yet another approach is to smear whole blood on a slide, stain the cells with various fluorescent markers, then use a laser to rapidly count the cells.

A particular device “may be better depending on what you're trying to do. It's up to you to prove it,” says oncologist Howard Scher of Sloan-Kettering. CTC detectors may also soon face a challenge from growing efforts to analyze blood samples for free-floating DNA shed by tumor cells (see sidebar, p. 1074).

One reason researchers are hotly pursuing better CTC detectors is the prospect of a test that is sensitive enough to detect a person's initial tumor early in its development. “We're not ready for early detection, but the principle is there,” says Haber. Bioengineers are also working on ways to physically filter CTCs out of the blood to prevent metastasis. Some oncologists seem skeptical that you could get rid of every last cancer cell without causing harmful side effects, but the National Cancer Institute is funding research on the topic.

## Dangerous seeds?

Whatever the technique, CTC researchers are still struggling with the question of whether the cells they're capturing are the seeds for new tumors. Teams are chipping away at the problem by probing CTCs for the expression of genes thought to be involved in the EMT process or for stem cell markers. The latter is a more recent development as more cancer researchers have begun to believe that the cells initiating tumors have stem cell properties (Science, 24 August 2007, p. 1029). Several teams, including Cristofanilli's and Jeffrey's, have been finding CTCs that express EMT or stem cell genes.

A few labs are going a step further, trying to directly show that human CTCs can cause new tumors. “If anybody can prove that the few CTCs in human samples grow in mice, it would be almost a proof of their stem cell nature,” says Pantel.

This, however, requires isolating sufficient numbers of living CTCs, which only a few labs have managed to do. Jeffrey's group implanted cells cultured from a human cancer cell line into mouse mammary tissue, waited for tumors to grow, and then isolated live CTCs from these mice using a new device that her team invented. When these CTCs were cultured and implanted in another set of mice, the resulting tumors grew larger and spread faster than tumors from the original cell line, her team reported online in January in the British Journal of Cancer. “These cells definitely were involved in metastasis,” Jeffrey says. She eventually wants to culture CTCs from patients and implant them in mice to see if tumors form.

Cancer biologists are paying more and more attention to such experiments. “I'm thinking of CTCs all the time,” says Massagué. If the details of these cells can reveal how human cancers spread, they may offer new ways of stopping it in its tracks.

12. Medicine

# Keeping Tabs on Tumor DNA

1. Jocelyn Kaiser

A group at Johns Hopkins University has been monitoring free-floating tumor DNA in blood to track cancer. A new study being published in Science Translational Medicine this week suggests that advances in DNA sequencing may make this approach more viable.

While many researchers are working on using tumor cells in blood to track cancer (see main text), Bert Vogelstein's group at Johns Hopkins University has been doggedly pursuing a related idea: monitoring free-floating tumor DNA in blood. A new study suggests that advances in DNA sequencing may make this approach a strong alternative.

Detecting naked DNA shed by a tumor isn't quite as daunting as capturing circulating cancer cells, which requires physically separating multiple kinds of cells. But it isn't easy, either. For several years, Vogelstein's team has tried to detect tumor DNA in a cancer patient's blood by searching for DNA with subtle mutations in known cancer genes. But not all patients' tumors carry such mutations. And the polymerase chain reaction (PCR) technique used to amplify and detect these single-DNA-base changes can generate false positives.

While cataloging mutations in cancers, the Johns Hopkins group hit upon a potentially better way to fish out tumor DNA. They noticed that all solid tumors had large chromosomal rearrangements that were unique to that patient's tumor. Rapid advances in DNA-sequencing technology have now made it practical to systematically look for such large-scale changes. In Science Translational Medicine this week, the Johns Hopkins team reports taking biopsies of breast or colon tumors from six people and sequencing the entire genome in each tumor's cells. In each case, the researchers identified rejiggered chromosomes specific to the tumor but not seen in a person's normal DNA.

The researchers then showed they could use PCR to pick out scant amounts of this distinctive tumor DNA from normal DNA, even if it constituted as little as 0.001% of the overall DNA sample. They next analyzed the blood of two people who had colon tumors that had been biopsied and genome-sequenced but not yet been removed. The blood of both tested positive for their specific cancer biomarker, whereas blood from healthy people tested negative. Finally, the researchers used the genetic fingerprint of one colon tumor to track that patient's response to various treatments (see graph). The amount of tumor DNA in the person's plasma declined in the hours after surgery, rose during the next few weeks, and then dropped again after chemotherapy and surgery for a secondary tumor in the liver.

Compared with single-base mutations, the chromosomal biomarkers are “extraordinarily specific. The chance of getting a false positive is essentially zero,” says Johns Hopkins's Victor Velculescu, who led the pilot study. They could also be used to detect tumor DNA in other fluids and tissues.

Klaus Pantel of the University Medical Center Hamburg-Eppendorf in Germany, who studies both circulating whole tumor cells and free tumor DNA, enviously admits that his group has thought about the same genome-sequencing strategy. “It is a great approach,” he says. But as Pantel and the study authors themselves note, the costs—$5000 per patient just to find the unique chromosome changes—must still come down significantly before the strategy could be available for routine use. And Howard Scher of the Memorial Sloan-Kettering Cancer Center in New York City cautions that tests for whole tumor cells and free-floating tumor DNA need to be compared head to head in clinical trials to know which is more useful for a specific purpose.

13. Planetary Science

# Iceball Mars Proving a Tough Place to Find Liquid Water

1. Richard A. Kerr

Although Mars was once awash in flowing water, all the water found by orbiting radars has long been frozen, cycling through recurring ice ages.

“The running joke is that somebody's about to announce the discovery of water on Mars, again,” says planetary scientist Robert Grimm of the Southwest Research Institute (SwRI) in Boulder, Colorado. Actually, it was more than 30 years ago that researchers first discovered water on Mars, more than a million cubic kilometers of it frozen in the north polar ice cap.

More recently, water has been turning up far and wide on Mars, but—like the two polar caps—all of it is quite solidly frozen. “On a global scale, Mars has been cold, frozen, and dominated by ice for a long, long time,” says planetary scientist Michael Mellon of the University of Colorado, Boulder. This frozen (or sometimes vaporous) water is looking increasingly dynamic, dancing across the planet between recurring ice ages. But radar's deep probing is pushing the search for liquid water—and possible oases of life—deeper and deeper into the planet.

Planetary scientists reported the latest water-ice findings at December's American Geophysical Union (AGU) meeting in San Francisco, California. Jeffrey Plaut of the Jet Propulsion Laboratory (JPL) in Pasadena, California, and Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS) team members reported detecting ice-rich deposits as deep as 1 kilometer beneath the 3-million-square-kilometer Dorsa Argentea Formation near the south polar cap. From its perch on the European Space Agency's Mars Express orbiter, MARSIS beams microwave pulses at the surface. A part of each pulse may reflect back from buried geologic strata. Compared with most rock, ice is relatively transparent to microwaves, so when microwaves easily penetrate deep into the subsurface, that implies the area is rich in ice.

Planetary geologists such as James Head of Brown University aren't surprised that roughly equal amounts of ice and rock underlie Dorsa Argentea. They had already recognized the sinuous ridges and channels of the region as lingering markers of a now-vanished ice cap. When its base was melting a couple of billion years ago, that cap was far larger than today's.

Where that ice cap's water went remains a mystery. So does the current whereabouts of the far more copious water that soaked Mars during its “warm and wet” era more than 4 billion years ago. But radar is on water's trail. At next week's Lunar and Planetary Science Conference (LPSC) in Houston, Texas, Plaut and team members on SHARAD (SHAllow RADar)—the radar flying on NASA's Mars Reconnaissance Orbiter—will report that ice is indeed abundant across the 1000-kilometer span of the Deuteronilus Mensae area on the edge of the great northern lowlands.

Geologists had inferred the presence of buried ice, but with repeated SHARAD passes, it can now be seen that there are hundreds of meters of nearly pure ice—covered by a few meters of protective rocky debris—in almost every valley and piled against almost every mesa across nearly all of Deuteronilus Mensae. Head and colleagues have shown from geological studies that this ice is the residue of glaciers that flowed off high plateaus hundreds of millions of years ago, fed by ice sheets up to 800 meters thick. Climate modeling has shown that when Mars nods far over on its axis—as it does on time scales of tens of thousands to millions of years—the solar energy added to the polar regions can drive water from the ice caps to snow out in mid-latitudes and form ice sheets (Science, 11 April 2003, p. 234).

Another residue from cyclic martian ice ages may still pervade high latitudes just centimeters beneath the surface. Analysis of cosmic-ray–induced gamma-ray and neutron emissions had pointed to water ice in the top meter of martian soil down to a latitude of 40°; the Phoenix lander touched some of that ice. But at the AGU meeting, both the SHARAD and the MARSIS teams reported signs of abundant high-latitude ice down as deep as a few tens of meters in both hemispheres. That's “exactly what we'd predict,” says Head; Phoenix ice is a residue of past snow-fed glaciation. On the other hand, the Phoenix team and especially Mellon have argued that Phoenix ice is all just atmospheric water vapor that formed a dense frost between soil grains. That debate continues, although in an abstract for LPSC, Phoenix team member Michael Hecht of JPL jumps ship and argues for a glacial origin.

Ironically enough, the Mars radars were designed to hunt not for ice but for the missing liquid water from the planet's early days. Planetary scientists thought they would find liquid water in aquifers as radar probed as much as 5 kilometers beneath the surface. “We had expectations of a shallow aquifer, but we don't have any detections,” Plaut said at the AGU meeting. “There are few if any large, shallow aquifers on Mars,” he now says. Aquifers “are probably there,” he adds, “just too deep to see.”

Unfortunately, the radars are not probing as deeply as even the most pessimistic predictions anticipated, says planetary geophysicist Roger Phillips of SwRI, Boulder, a member of both radar teams. The problem may be radar-blocking clays. And radar has detected a rocky crust unbowed by the weight of the overlying north polar cap. That means less crust-weakening—and ice-melting—heat is flowing up from the interior than geophysicists had assumed, so any subterranean water would be frozen well beyond the reach of radar. So while geologists continue sorting out the climate history of Mars as frozen into the planet's ice, geophysicists will likely have to come up with another, more penetrating probe of the Red Planet.

14. Animal Research

# Dog Dealers' Days May Be Numbered

1. David Grimm

Legislators want to shut down the pipeline of “random source” dogs and cats to laboratories, but some researchers worry about the impact on science.

In the summer of 2005, a 1-year-old Labrador mix with brindle markings arrived on a truck at the University of Minnesota, Minneapolis. The dog, one of a handful of ostensibly unwanted canines rounded up by an animal dealer from local pounds, was to be implanted with an experimental heart device and eventually euthanized. But this dog was hardly unwanted. When research technicians passed a handheld scanner over his shoulder blades, they detected a microchip that they traced back to a man, three states away, desperately searching for his pet, Echo.

Cases like Echo's demonstrate what can happen when the so-called Class B dealer system breaks down. For more than 4 decades, individuals licensed by the U.S. Department of Agriculture (USDA) have collected dogs and cats from shelters, breeders, and other sources and sold them to research facilities. Proponents say these dealers provide genetically diverse breeds of various sizes and ages that can't be obtained from traditional laboratory animal suppliers and that are essential in some types of research. But detractors point to a history of misconduct, from stolen pets to animal cruelty, and have been trying for years to shut down the system. “By using these animals, we risk losing our credibility with the public,” says Robert Whitney, who oversaw animal resources programs at the National Institutes of Health (NIH) for more than 20 years. “It's an Achilles' heel for research.”

Last year, the National Academy of Sciences (NAS) released a report that backed up what Whitney and fellow thinkers have been saying. “Class B dealers are not necessary for supplying dogs and cats for NIH-funded research,” it said, and recommended ways to phase out the system. The report is also giving fuel to a congressional bill that would ban these dealers outright.

But many in the research community are fighting back, even those who don't use Class B dealers. “These actions are premature,” says Alice Ra'anan, director of Government Relations and Science Policy at the American Physiological Society (APS), which represents more than 10,000 scientists, doctors, and veterinarians. Any such ban, she says, would delay important research projects and could shut down others entirely. “It would be enormously disruptive.”

## The rise of animal welfare

Ironically, it was a case much like Echo's that helped create the Class B dealer system. In 1965, a Dalmatian named Pepper was stolen from a farm in Pennsylvania and sold to a research hospital in New York, where she died in a cardiac pacemaker experiment before her owners could locate her. The following year, Life magazine published “Concentration Camps for Dogs,” a photo exposé of emaciated dogs, cats crammed into chicken crates, and other abuses at the property of a Maryland dealer who sold animals to research facilities.

The stories galvanized the public, and in 1966 President Lyndon Johnson signed the Laboratory Animal Welfare Act into law. The legislation mandated the humane treatment of dogs, cats, and other laboratory animals. It also created two types of licenses—Class A and Class B—for selling animals to research. Class A dealers could sell only animals that they had raised themselves, while Class B dealers could sell animals they had acquired from “random sources,” such as pounds, breeders, and even other dealers. Class A facilities tended to be large corporate entities that bred animals on site, while Class B dealers often ran smaller, “mom-and-pop” operations.

By the 1970s, the Class B dealer system was thriving. NAS estimates that there were about 200 dealers supplying thousands of dogs and cats to U.S. laboratories.

These animals proved critical to advances in science and medicine. Large-chested Dalmatians like Pepper helped doctors develop some of the first artificial-heart devices and lung-transplant procedures. And cats and dogs gathered from the general population harbored a variety of genetic diseases and infections that led to insights into everything from sleep apnea to AIDS.

Yet, despite USDA regulation, stolen and abused animals continued to show up at research institutions. So, in 1990, Congress toughened the Animal Welfare Act. Shelters now had to hold animals for 5 days before selling them to Class B dealers, and—as part of a new USDA “traceback” program—the dealers had to provide extensive documentation about where they got their animals, often detailing multiple sources over several states. Some shelters began refusing to sell cats and dogs to Class B dealers entirely.

The biggest blow to the Class B system came in 2003, when a member of a humane organization—Last Chance for Animals—infiltrated the Arkansas facility of a Class B dealer named C. C. Baird and went public with accounts of sick, abused, and dying animals, many of which appeared to be former pets. The case became fodder for an HBO documentary and resulted in the largest investigation of animal abuse in U.S. history. USDA, blamed for not properly enforcing the Animal Welfare Act, intensified its traceback program and began unannounced quarterly inspections of Class B facilities.

The intense regulation took its toll. Today, only 11 Class B dealers sell dogs and cats to research facilities (hundreds of others sell nonhuman primates, pigs, and other animals), and more than half of these are under intense USDA scrutiny. Together, they supply about 3000 dogs and cats—about 3% of the 90,000 or so used in U.S. research.

Yet critics have been unable to shut down the system entirely. In 1996, federal legislators first introduced the Pet Safety and Protection Act, which would have outlawed the sale of cats and dogs to researchers from Class B dealers. But APS and other research groups opposed it, and it has failed to pass every year it has been proposed. That may change with the release of last year's NAS report.

## The leash tightens

The report seems damning in its conclusions. Commissioned by Congress in response to Echo, C. C. Baird, and other incidents, it found that although Class B dog and cat dealers had provided a vital service to biomedicine, the system was now obsolete and even potentially damaging to research. “There is a minority of dealers that are totally legitimate and doing the job well,” says Stephen Barthold, the chair of the report committee and director of the Center for Comparative Medicine at the University of California, Davis. “But others have sullied the reputation and are taking down the whole thing.”

Class B dealers, the report found, were no longer providing the valued diversity they had in the past. Shut out of shelters and forced to rely on breeders and private owners, the dealers were selling researchers primarily young hounds and beagles—essentially the same type of dogs Class A dealers were providing. “We could not find any compelling evidence that these animals were unique,” says Barthold.

The committee also concluded that, because of limited resources, USDA could not properly regulate the Class B system. “USDA is supposed to ensure compliance,” says Barthold, “but they've done a bad job.” And that meant stolen and abused animals could still end up in U.S. research labs. “It's a very negative public stigma that, personally, I don't think NIH needs,” Barthold says. USDA disputes those claims: “The record over the years shows that we've enforced the system very well,” counters Robert Gibbens, who oversees USDA regulation of Class A and B dealers in the western United States.

The NAS committee recommended several ways to phase out the Class B dog and cat system. It suggested that researchers get their animals directly from pounds and shelters. It advised paying Class A dealers to provide older and more genetically diverse animals. And it proposed that universities or NIH set up consortia to share dogs and cats, as has been done for primates and rodents. “There are so many possible sources for these animals,” says Cathy Liss, president of the Washington, D.C.–based Animal Welfare Institute (AWI), which has tried to find a middle ground between groups like APS and those who want to eliminate cat and dog research entirely. “It's about trying to ensure integrity in the supply.”

But these ideas have not sat well with scientists who still rely on the Class B system. “All of the possibilities … wouldn't work as far as I'm concerned,” says a cardiovascular researcher who asked to remain anonymous so as not to draw attention to his university. For more than 30 years, he has used large and old random-source dogs from Class B dealers to study cardiovascular diseases and develop medical devices. Class A dealers don't stock these dogs, he says, because it's more economical for them to sell puppies. Nor can he get them from shelters, because most no longer sell to researchers. And he says he doesn't understand why NIH or Class A dealers should breed extra dogs and cats for terminal research, when millions of shelter animals are euthanized every year.

“There may not be a lot of groups in America still using Class B dogs for research,” he says, “but the numbers do not reflect the importance of the research being done.”

## End of the pipeline?

Still, the end seems near for Class B dog and cat dealers. Last fall, Representative Mike Doyle (D–PA) and Senator Daniel Akaka (D–HI) reintroduced identical versions of the Pet Safety and Protection Act (H.R. 3907 and S. 1834, respectively). With the National Academies' report, “we're in a better position to pass this bill than we've ever been,” says Doyle. NIH's response to the report, which is expected this spring, could include halting future funding for research that uses Class B dogs and cats.

Even APS seems to acknowledge that the system is on its way out. The society has endorsed the NAS report, and Ra'anan says it wants to work with NIH to develop viable alternatives. She's arguing for a 5-year transition period, especially for labs that have ongoing projects with random-source animals. “This is not something that can be done overnight,” she says, “but we need to get the ball rolling.”

Some universities have already started. Duke, Yale, and MIT, for example, discourage their researchers from obtaining cats and dogs from Class B dealers. Says AWI's Liss: “Institutions need to step up to the plate.”

At least one dealer says it is planning to shut down on its own. “I don't see how the system can continue to survive like this,” says Janice Hodgins, who has run a Class B facility in Howell, Michigan, with her husband since 1960. At one time, the operation housed more than 300 dogs and cats, used in everything from hip-replacement studies to mental health research. Today, they have just nine. “There's been a lot of things learned through random-source animals,” she says, “but I feel like we're on the losing end of this now.”