News this Week

Science  25 Feb 2011:
Vol. 331, Issue 6020, pp. 992

  1. Around the World

    1 - Urbana-Champaign, Illinois
    First Cowpox Case in the U.S.
    2 - New Delhi, India
    India May Join U.S. MoonRise Mission
    3 - Tokyo, Japan
    Whaling Season Cut Short
    4 - Armonk, New York
    Dr. Watson, We Presume
    5 - Washington, D.C.
    Train Wreck, Anyone?

    Urbana-Champaign, Illinois

    First Cowpox Case in the U.S.

    A student lab worker at the University of Illinois is the first person in the United States to catch cowpox, a less dangerous relative of smallpox. Researchers from the U.S. Centers for Disease Control and Prevention (CDC) traced the infection to a genetically modified cowpox virus strain stored in the lab's freezer. The student had never worked with the virus, which the lab had not studied for 5 years. After she came down with the disease in July 2010, however, tests found DNA from the strain (although no live virus) in various parts of the lab.

    Cowpox is endemic in Europe and Asia, where veterinarians and zoo workers often catch it from animals. In the United States, however, it exists only in laboratories. CDC recommends vaccinations for anyone working with cowpox or similar viruses, but the student declined because she did not expect to handle it.

    New Delhi, India

    India May Join U.S. MoonRise Mission


    India's Space Commission has given the go-ahead for work to begin on a possible contribution to a U.S. sample return mission to the moon. The mission, called MoonRise, would land a probe in the South Pole–Aitken Basin (pictured) on the far side of the moon, scoop up 1 kilogram of material, and return it to Earth. Launch is currently planned for 2016. The Indian Space Research Organisation (ISRO) hopes to provide an orbiter that would circle the moon for a few years and aid in communication and imaging.

    If India does join MoonRise, it would underline a change in Indo-U.S. security relations. Until recently, U.S. labs and companies were prohibited from exchanging technologies with ISRO in an attempt to limit their use for military purposes. But Indian Prime Minister Manmohan Singh and U.S. President Barack Obama met in New Delhi in November and agreed to become strategic partners. The countries may be ready to join hands on a major space mission.

    Tokyo, Japan

    Whaling Season Cut Short

    Citing harassment by the activist group Sea Shepherd Conservation Society, Japan last week called an early halt to this year's research whaling expedition to Antarctic waters, having captured and killed just 172 of a planned 900 whales. Sea Shepherd declared it “Victory in the Southern Ocean Day” for whales.

    An international moratorium on commercial whaling took effect in 1985, but Japan relies on a provision allowing research whaling to catch hundreds of minke and smaller numbers of other species each year. Taken whales are examined for age, stomach contents, the amount of heavy metals accumulated in tissue, and other data; the meat is then sold, with proceeds subsidizing the research whaling expeditions.

    Critics contend that the data could be collected through nonlethal means and that Japan's cetacean research is thinly veiled commercial whaling.

    Armonk, New York

    Dr. Watson, We Presume

    Fresh from mopping the floor with two human opponents in Jeopardy!, IBM's factoid-spewing supercomputer Watson is turning its talents to medicine. On 17 February, IBM announced that it was teaming up with software company Nuance Communications Inc. and two universities to produce a computerized “physician's assistant” designed to fetch up-to-date medical information on command.

    IBM's Watson supercomputer challenges two former Jeopardy! champions.


    Nuance, which PC users know as the maker of Dragon speech recognition software, already has a foothold in health care technology. Researchers and clinicians at Columbia University Medical Center and the University of Maryland School of Medicine in Baltimore will contribute medical expertise to the project. And IBM, of course, is supplying Watson: a suite of algorithms working in concert to parse natural language, process information, and retrieve data.

    IBM says computer P.A.s will help physicians and nurses make faster diagnoses and prescribe up-to-date treatments. The company expects the first products to come on the market within 18 to 24 months.

    Washington, D.C.

    Train Wreck, Anyone?

    The worst fears of U.S. scientists were realized last week when the House of Representatives approved a budget that would trim roughly $5 billion from current federal spending on research. The so-called continuing resolution for the last 7 months of the 2011 fiscal year would lower overall discretionary spending by $61 billion, taking billion-dollar bites out of the National Institutes of Health, National Science Foundation, and Department of Energy's Office of Science (see p. 997). House Republicans were especially keen to derail presidential initiatives in education, energy, and climate research. But the 67 successful amendments—out of a pile of nearly 600 that were drafted by members of both parties—took aim at the entire federal budget, with the exception of mandatory programs like Medicare and Medicaid.

    Next week, the spending measure will be taken up by the Senate, which has promised to make its own reductions to President Barack Obama's request for 2011 that the previous Congress failed to act upon. Its Democratic leaders say they won't be using the House bill as a template, however, and Obama has already threatened a veto. The current agreement to extend 2010 spending levels into 2011 expires on 4 March. If the White House and legislators don't resolve the deadlock by then, the next step could be a government shutdown.

  2. Random Sample


    A company in Oxfordshire, U.K., hopes to cash in on fusion with a design for a supercompact fusion reactor, or tokamak, that it plans to sell as a neutron generator for industry or research. Tokamak Solutions says it could build the most basic version of the machine—producing just a hot plasma for research purposes—in a year at a cost of around $1 million.

    Science Snaps Win Prizes


    From his home in Stafford, U.K., retired schoolteacher Spike Walker snapped four of the 20 images that won the 2010 Wellcome Image Awards, announced this week. The pictures, created by stacking as many as 44 different frames captured through a microscope, reveal surprising details about four different insects in vivid color. These include the hooks on a caterpillar's belly, a mosquito's feathery antennae, the hairs on a coiled ruby-tailed wasp, and the suckers on the foreleg of a great diving beetle, with which it grasps females during mating (pictured). “The beetle was one I found in a large collection of Victorian slides a friend was wanting to get rid of,” Walker says. “The wasp just flew into the kitchen one day.” His four photomicrographs, along with the 16 other winners, which included a scan of a patient's aneurysm and an image of a 3-day-old mouse blastocyst undergoing its first cell division, are on display at the Wellcome Collection in London.

    By the Numbers

    28% — Percentage of U.S. adults in 2008 who had enough scientific knowledge to read the Tuesday science section in The New York Times, according to a survey by the University of Michigan's Jon Miller. That's up from 10% in 1988.

    $3.2 billion — Amount of global funding in 2009 for research into neglected diseases, such as tuberculosis and dengue fever, according to a new report. That's up 8% from 2008.

    4 — Number of species found by a 5-month mission that searched 21 countries for living members of 100 amphibian species thought to be extinct. Among the still-missing: the golden toad of Costa Rica, last spotted in 1989.

    Elementary Mathematics


    An international group of mathematicians hopes to do for math what Dmitri Mendeleev's periodic table did for chemistry by identifying the shapes in three, four, and five dimensions that cannot be divided into other shapes—the elemental “atoms” of geometry. Borrowing techniques from theoretical physics, they plan to sift through some 470 million 4D shapes in search of a few thousand fundamental building blocks. “We're using physics as a lens to view the mathematics,” says team leader Alessio Corti of Imperial College London (ICL).

    Mathematicians have already identified all the 2D and 3D basic shapes—called Fano varieties—but need new methods for higher dimensions. So the researchers turned to string theory, a branch of physics which posits that, in addition to the familiar dimensions, the universe contains other dimensions curled up so small that their effects are hard to detect. Tools developed by string theorists to study such curled-up dimensions can tell the team whether higher-dimensional shapes, slices of which are shown here, are Fano varieties.

    The researchers—who are from the United Kingdom, Russia, Japan, and Australia—communicate via a blog and Twitter, so anyone can see how they're getting on. Knowing the basic building blocks of geometry, they hope, will be useful for mathematicians, string theorists, and engineers. Team member Tom Coates of ICL says it should take roughly 3 years to work through the 4D shapes. And the 5D ones? “We simply don't know.”

  3. Meeting Brief

    AAAS Meeting

    More than 8000 people attended the AAAS annual meeting in Washington, D.C., from 18 to 21 February. Here are some snapshots from the meeting. Extensive coverage, including stories, podcasts, and live chats, is available online.

    Infants Watch What You Say

    Your baby's language skills may surprise you. Before they even crawl, infants can distinguish between two languages they've never heard before just by looking at the face of a speaker. And this ability is enhanced if they're raised in a bilingual household.


    Developmental psychologist Janet Werker of the University of British Columbia in Canada described at the AAAS meeting tests that she and Núria Sebastián of Pompeu Fabra University in Barcelona conducted on 8-month-old Spanish babies. Some were raised in homes where only Spanish was spoken, some in homes where only Catalan was spoken, and some in bilingual homes. Werker and her colleague showed the babies a soundless video of three women who were bilingual speakers of French and English—languages the babies didn't know. Each was shown in turn speaking sentences in one of the languages. Eventually the babies got used to this and stopped watching. Then the language changed. Babies raised in bilingual homes looked at the video again. The monolingual babies showed little reaction, although other studies have shown that they can make the distinction until they are about 6 months old.

    Werker speculates that the babies may be focusing on differences in lip shapes as the languages are spoken or on “the whole ensemble of muscle movements in the face.”

    Methane From Oil Spill Migrating Undigested?

    The blowout of BP's Macondo well didn't just spew some 5 million barrels of oil into the Gulf of Mexico last year. Lots of methane also whooshed out, but what happened to it is a matter of debate. Some researchers have concluded that almost all the methane was eaten by an enormous bloom of bacteria. But microbial geochemist Samantha Joye of the University of Georgia, Athens, reported at the AAAS meeting that the picture is far more complicated.

    Joye and her colleagues have estimated that 500,000 tons of methane and other gases escaped from the busted well. She reported at the meeting that the breakdown of the dissolved gas dropped sharply 6 weeks after the blowout began, even though there was still plenty of methane in the water. Methane oxidation by bacteria, which had been 60,000 times higher than normal to the southwest of the well, fell to 300 times the background rate, according to her unpublished data.

    Joye speculated that the microbes ran out of another nutrient, which would have prevented them from metabolizing more methane. She also reported that her team detected far more methane than expected to the northeast of the well in late summer, after it had been capped. “It looks like there's a significant amount of gas in the ecosystem,” and it's spread across a larger area, she said.

    Seaweed: Malaria's Nemesis?

    Julia Kubanek with Callophycus serratus.


    In the war against malaria, researchers may have recruited an unlikely ally: a seaweed found in Fiji. In 2005, Julia Kubanek, a chemical ecologist at the Georgia Institute of Technology in Atlanta, and her colleagues discovered that the seaweed, a red alga called Callophycus serratus, contains unusual ring-shaped compounds called bromophycolides that are particularly effective at killing certain fungi. In 2009, they found one that also kills the malarial parasite in red blood cells.

    Now, Kubanek reported at the AAAS meeting, her group has discovered the mechanism. Malarial parasites infect red blood cells, where they thrive on hemoglobin, the body's oxygen-carrying molecules. As the parasites break hemoglobin down, they release heme, a pigment that is toxic to them. To protect themselves, the parasites crystallize the heme and store it in a separate chamber. Kubanek reported that the bromophycolide prevents this crystallization, causing heme to accumulate and poison the parasite. Next, she and her colleagues will test the compound in mice infected with the parasite.

    Trading Tuna for Sardines

    Overfishing has not just decimated populations of tasty fish such as tuna and cod; it's also drastically altered the balance of biomass in the world's oceans, according to a new study reported at the AAAS meeting.

    A team led by Villy Christensen of the University of British Columbia in Canada analyzed models depicting more than 200 marine food webs around the world at various time periods from 1880 to 2007. Christensen's team then estimated the distribution of biomass in these ecosystems and extrapolated the results to cover all of the oceans. The result: The total biomass of predatory fish plummeted by about two-thirds over the past 100 years—54% in the past 40 years alone—while the biomass of the fish they prey on, such as sardines and anchovies, rose by 130%.

    “It's a very different ocean,” Christensen says, adding that the shift in the balance of the food web isn't healthy or sustainable.

    They Said It

    Climate change denialism by evangelicals is a “heresy committed against all of creation, nothing less than a monstrous wrong.”

    —Richard Cizik, a former evangelical preacher and founder of The New Evangelical Partnership for the Common Good, speaking at an AAAS session titled Evangelicals, Science, and Policy: Toward a Constructive Engagement.

  4. Radiology

    Second Thoughts About CT Imaging

    1. Lauren Schenkman

    Concern that CT scan radiation is causing cancer has focused public scrutiny on radiologists and medical physicists—and riled up controversy among them. Can they find a solution?


    Biophysicist David Brenner was in his 40s when an older colleague in New York City, a prominent pediatric radiologist, mentioned in conversation something Brenner couldn't shake: Far too many children, the pediatrician felt, were getting computed tomography (CT) scans for ailments, such as suspected appendicitis, that used to be diagnosed easily by ultrasound or even observation. Cells in children, already more vulnerable because they divide faster than those of adults, have more time to turn cancerous after the initial damage from radiation. And a single CT scan delivers a lot of it, the equivalent of dozens to a few hundred chest x-rays. A concern about the CT boom, planted 11 years ago, began to grow.

    Brenner now directs Columbia University's Center for Radiological Research, where he focuses on exactly how radiation damage leads to cancer. He seems an unlikely troublemaker: a passionate Beatles fan who speaks so quietly he's sometimes hard to hear. Yet he has become one of the most insistent voices in an imbroglio that is roiling radiologists, medical physicists, and the general public over the rising and largely unregulated use of CT scans, and whether the technology can, in some cases, cause more harm than good.

    Brenner hails from Liverpool, U.K., and his Upper West Side apartment shows it: A large black-and-white poster of a young and pensive George Harrison hangs above the sofa, and a plastic John Lennon figurine from the movie Yellow Submarine crouches next to the stereo. Settled on a couch in his living room, he explains how he moved to the center of the CT storm.

    “What I thought I could contribute to this discussion was to provide some quantitative estimates of what the risks [of CT scans] actually were,” he says. These risks are surprisingly unclear, given how old and how commonly used the technology is. Brenner found that each CT scan gives a patient a very small chance of developing cancer. With hundreds of thousands of children getting CT scans every year, that small individual risk balloons into a pressing public health concern, Brenner concluded.


    Many radiologists and medical physicists strongly disagree. For a single CT scan, they say, there's no hard evidence of any raised cancer risk. Nor do they think that thousands of small, potential risks add up to a large public health problem. Furthermore, they say, the dearth of evidence makes it difficult to assess how low is low enough. But even the skeptics favor managing potential CT risks, if for no other reason than to reassure patients. That's one of the aims behind a summit co-sponsored by the U.S. National Institute of Biomedical Imaging and Bioengineering in Bethesda, Maryland, that ends this week. Researchers there discussed standardizing protocols, estimating doses, and dose-saving technologies.

    As the debate rages, the number of CT scans administered continues to soar. In 1980, 3 million scans were given in the United States. In 2006, the number was about 67 million. It shows no sign of slowing down.

    How small a risk?

    CT scans deliver the same type of radiation as an x-ray machine, but much more of it. As the patient moves through the scanner, x-ray beams and detectors revolve around the bed. The body's tissues absorb radiation to varying degrees, and what gets through creates slice-by-slice, incredibly detailed images of the body part being scanned. Crafting this picture comes with potential risks. The chromosomes of a healthy cell are tangled in a “spaghetti-like formation,” Brenner explains. If radiation breaks up the chromosomes, the strands can usually repair themselves. Sometimes, though, the wrong ends meet up, scrambling the genetic information and leaving behind a premalignant cell that can bloom into cancer.

    Eleven years ago, when Brenner first began considering children's CT risks, he needed two pieces of information: the radiation delivered by a single CT scan, and the probability that a dose to a given organ would produce a fatal cancer there. For the first, he used a 1989 British survey of CT use in adults to estimate the dose children experience. For the second, Brenner turned to “the only quantitative tool we had and still have,” he says: risk calculations of radiation-induced cancer in survivors of the atomic bombings of Hiroshima and Nagasaki. The most recent were published in a report by the National Research Council in 1990 and updated in 2006.

    Because data on atomic bomb survivors who had died of cancer were more robust, Brenner focused on fatal cancers. The chance of a child someday dying of cancer from one CT scan was small—on the order of one in 1000, Brenner found. But given that about 600,000 abdominal and head CT scans were performed yearly on children at the time that Brenner did his research, he and his colleagues estimated that 500 of those children might end up dying later on in life of cancer caused by the scan. Meanwhile, the total number of children who would get cancer from CT scans would be about twice that, he estimated. Brenner published his results in early 2001 in the American Journal of Roentgenology.
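    The back-of-envelope arithmetic above can be reproduced in a few lines. This is an illustrative sketch only; Brenner's published figures came from organ- and age-specific risk models, which is why his fatal-cancer estimate (500) sits somewhat below the naive product.

```python
# Naive population-level sketch of the risk arithmetic described above.
# The 1-in-1000 figure is "on the order of"; Brenner's actual estimates
# were organ- and age-specific, so his published number (500) is lower.

per_scan_fatal_risk = 1 / 1000   # rough fatal-cancer risk per pediatric scan
scans_per_year = 600_000         # abdominal and head CT scans on U.S. children

expected_fatal_cancers = per_scan_fatal_risk * scans_per_year
expected_total_cancers = 2 * expected_fatal_cancers  # total cancers ~ twice fatal

print(expected_fatal_cancers, expected_total_cancers)
```

    The point is not the exact product but the shape of the argument: a tiny individual risk, multiplied across hundreds of thousands of scans a year, yields hundreds of projected cases.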

    The paper was a sensation. “CT scans in children linked to cancer,” blared a headline in USA Today. Brenner's use of the atomic bomb data raised “the ire of lots of people” in the medical physics and radiology communities, he says. They criticized him for extrapolating from the risks associated with extraordinarily high doses from an atomic bomb.

    But Brenner says he didn't need to extrapolate. In Hiroshima, “as you go further and further away [from the epicenter], doses get less and less, so eventually you get to a region where the doses are actually comparable to a CT scan,” he says. Brenner based his risk estimates on a cadre of about 30,000 survivors whose doses were in the range of 5 to 100 millisieverts, equivalent to one or two CT scans, he says.

    Keith Strauss, a medical physicist at Children's Hospital Boston, clearly remembers the day in 2001 when Brenner's research was described in USA Today. The article “ruined my life for 2 weeks,” Strauss said in an interview in Philadelphia last July, at the annual meeting of the American Association of Physicists in Medicine (AAPM). His patients' parents were suddenly full of worries and questions about their children's scans.


    Biophysicist David Brenner, shown with a mannequin that estimates CT doses, worries about cancer risks from scans.


    By 2010, concerns about CT scans and cancer were getting plenty of attention; AAPM sessions on dose reduction were packed. Brenner had first sounded the alarm, but now others were raising concerns, too. In March 2009, the National Council on Radiation Protection and Measurements (NCRP) reported that, in 2006, CT radiation alone contributed 24% of the U.S. population's radiation dose. In 1980, that number was 0.4%.

    “The NCRP report was really the crucial one [that showed] the dramatic increase in the level of exposure,” says Amy Berrington de González, a radiation epidemiologist at the National Cancer Institute in Bethesda, who was not part of the NCRP panel. In late 2009, Berrington de González and her colleagues published a paper in Archives of Internal Medicine estimating that the approximately 70 million scans performed in the United States in 2007 would lead to about 29,000 new cancers.

    There's still a great deal of controversy, however, about how dangerous CT scans really are. Cynthia McCollough, a medical physicist at the Mayo Clinic in Rochester, Minnesota, is skeptical that there's any evidence for the risks Brenner and Berrington de González have reported. To her, the research on doses in the CT range does not show any meaningful biological effect. Some studies, she says, suggest that low doses can even be protective against cancer, similar to how the weakened virus in a flu vaccine helps the body fight off that year's flu. As for whether patients getting multiple scans should worry more, cells can repair themselves between scans, she says, so the damage shouldn't be cumulative. And like others, she's leery of looking at survivors of an atomic bomb blast.

    At last November's meeting of the Radiological Society of North America (RSNA) in Chicago, Illinois, McCollough and Brenner engaged in a public debate over whether cancer risks should be taken into account when ordering CT scans. At the end, the audience was polled and was evenly split. Brenner has his estimates, but “no one,” says James Brink, the chair of diagnostic radiology at Yale University School of Medicine, “has conclusively shown that medical radiation has caused cancer.”


    Dialing down the dose

    Although professionals clash over how hazardous CT scans are, they have converged to find ways to minimize a scan's radiation dose while preserving its accuracy. Despite disagreeing with Brenner, McCollough calls herself a “dose cop.” Before buying a new machine, she doesn't always rely on the manufacturer's specs. Instead she has hauled her case of plastic torsos to a hospital that had the same model in order to test the doses herself.

    She is also exploring how to reduce the dose from a CT scan. Accomplishing this is easier said than done, for both technological and practical reasons. Many radiologists and medical physicists expect that as CT technology improves, doses will drop. A CT angiogram, for example, was a high-dose procedure when it first came on the scene in the 1980s; the scanner would fire x-rays during the heart's entire cycle. Then, in 2006, researchers developed a new technique that used a predictive algorithm to synch up the scan to the heartbeat. X-rays only fire during the relevant phase of the heartbeat, reducing the dose by 83%, according to one paper in Radiology. At the July AAPM meeting in Philadelphia, other dose-saving techniques were highlighted, including algorithms that can scrape the same amount of information out of lower-dose scans.
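    The dose-saving trick behind ECG-synched scanning can be sketched abstractly. This is a hypothetical simplification, not a real scanner algorithm: it predicts the next R-R interval from recent beats and opens an exposure window only around a target cardiac phase.

```python
# Hypothetical sketch of prospective cardiac gating: predict the next
# R-R interval from recent beats and enable the x-ray tube only during
# a target phase window (here, mid-diastole, ~65-75% of the cycle).

def predict_rr(recent_rr_ms):
    """Predict the next R-R interval as the mean of recent intervals."""
    return sum(recent_rr_ms) / len(recent_rr_ms)

def exposure_window(last_r_peak_ms, recent_rr_ms, phase=(0.65, 0.75)):
    """Return (start_ms, end_ms) during which x-rays should fire."""
    rr = predict_rr(recent_rr_ms)
    return (last_r_peak_ms + phase[0] * rr,
            last_r_peak_ms + phase[1] * rr)

# With a steady ~60 bpm rhythm, the tube is on for only ~10% of each
# cycle instead of continuously, which is where the dose saving comes from.
start, end = exposure_window(last_r_peak_ms=5000, recent_rr_ms=[1000, 990, 1010])
```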

    These high-tech methods work only if they're used clinically, and CT scanners are loosely regulated. The U.S. Food and Drug Administration (FDA) is limited to overseeing the machines themselves, not how they're applied or the training of the people who administer the scans.

    Given this, most efforts at dose-reduction have been voluntary. Strauss helps direct a pediatric initiative by four radiological societies called Image Gently to remind radiologists to “child-size” the dose they give children. In November, RSNA announced the launch of a parallel campaign for adults called Image Wisely. Strauss notes that higher doses give crisper images, “so there's no automatic incentive to cause [a radiologist] to reduce the technique to an appropriate level. It has got to be a conscious thing.”

    It's tough to tell whether the campaigns are having much effect, says Brenner. Berrington de González agrees: “I would really like to see some data to show the evidence of the effects of these campaigns.” Image Gently's Web site allows radiologists to fill out an online form pledging to review the initiative's guidelines and adjust their practice. The site has racked up 6506 pledges.

    What those pledges translate to, practically speaking, isn't clear. In 2008, Rebecca Smith-Bindman, a cancer epidemiologist at the University of California, San Francisco, who is also a clinical radiologist, visited four Bay Area hospitals and collected the dose that had been given to 1119 adult patients earlier that year during various types of CT exams. The median dose for abdominal and pelvic scans was 66% higher than the number usually quoted at the time. The dose from a “multiphase” version, which goes over the same area multiple times, was 400% higher. She also found that two patients could get wildly different doses, even with a scan on the same part of the body with the same machine at the same institution. Instead of following well-defined guidelines, “every doctor reinvents the wheel” in order to get a readable scan, says Smith-Bindman.


    There's growing pressure to better regulate CT machines, something FDA is considering. Sean Boyd, who heads the agency's diagnostic devices branch, says that FDA might soon require new machines to have safety features, such as software that warns operators or even locks down the machine when the dose they're requesting is too high. But Boyd says the agency doesn't yet have a timeline for introducing those standards. Beginning in summer of 2012, California will require facilities to record the radiation dose of every CT study, legislation introduced after Cedars-Sinai Medical Center in Los Angeles admitted to overdosing 269 patients.

    The lackluster response has Smith-Bindman and others frustrated. She wants FDA to take a more active role and cites the 1992 Mammography Quality Standards Act, which allows FDA to tightly regulate mammogram equipment, technicians, and procedures, as an example of what's possible. “Someone has to stand up and take responsibility for this,” she says.

    Then there's the problem of excessive CT scanning, which many say is only getting worse. For someone whose symptoms necessitate a CT scan, the immediate benefits outweigh the small individual risk, says Brenner. That's not true for those who don't need the scan in the first place. Hedvig Hricak, the chair of radiology at Memorial Sloan-Kettering Cancer Center in New York City, sees a pattern of use in emergency departments “when the CT would be done before you even examine the patient.” There is also, Smith-Bindman says, “a financial cost motivation. It's extremely profitable to do these tests.” Brenner adds that doctors may do the scans to help insulate themselves from malpractice suits. “There's no doubt that people are doing CTs for defensive medicine,” he says. A 2010 study in the Journal of the American College of Radiology of 284 outpatient CT scans found that 27% “were not considered appropriate.”

    Guidelines do exist: The American College of Radiology has issued what it calls “appropriateness criteria” to help doctors decide when a CT is the best course of action, but several studies have suggested that they're not widely used. Where they are, the number of CT scans being ordered has decreased dramatically (see sidebar, p. 1003).

    Meanwhile, Brenner continues to build his case. In an October paper in the Journal of the National Cancer Institute, he suggested that cancer risks for adults, while lower than those for children, don't decrease dramatically with age, as thought. That's because radiation is not only an inducer, turning healthy cells into premalignant ones, but also a promoter, pushing those cells toward a cancerous state, he argues. While older bodies have cells that divide more slowly and have less time to develop cancer, they also have more premalignant cells, meaning “risks stay pretty constant through … the 30s, 40s, and 50s,” he says—right when most people begin to have health problems and go in for a CT scan.

  5. Radiology


    1. Lauren Schenkman

    In 2005, Massachusetts General Hospital (MGH) created and implemented a program that scores the appropriateness of a CT scan every time a doctor orders one and compares its worth to that of other imaging techniques given the patient's symptoms. As a result, the quarterly growth rate in MGH's use of CT scans has dropped from 3% to 0.25%.

    James Thrall, chief radiologist at Massachusetts General Hospital (MGH) in Boston, has discovered in the past 6 years just how much doctors rely on computed tomography (CT) scans. The appropriateness-scoring software shares its score with doctors and offers them a chance to change their mind. “Physicians really like that feature,” says Thrall, who spearheaded the effort, which cost between $600,000 and $700,000 to develop. The software also helps to shield doctors from malpractice suits.


    It took a lot of work; the American College of Radiology has appropriateness criteria for hundreds of scenarios. But to make the criteria work for MGH, Thrall had to recruit doctors from every specialty to come up with about 15,000 potential links between imaging studies, such as a CT of the head, and clinical scenarios, such as a patient complaining of a headache. In its presoftware days, MGH's CT use was growing by about 3% each quarter. Afterward, growth dropped to 0.25%, Thrall and colleagues reported in a 2009 paper in Radiology.
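    The scoring logic can be illustrated with a toy lookup. Everything below is invented for illustration (the real MGH system links roughly 15,000 study-scenario pairs); the sketch only shows the shape of the idea: score the ordered study against the clinical scenario and surface any higher-scoring alternative.

```python
# Toy sketch of order-entry decision support in the spirit of the MGH
# system. Scores (1-9 scale), studies, and scenarios are all invented.

APPROPRIATENESS = {
    ("ct_head", "new_headache_no_red_flags"): 3,   # low: often not indicated
    ("ct_head", "head_trauma_loss_of_consciousness"): 8,
    ("mri_head", "new_headache_no_red_flags"): 5,
}

def score_order(study, scenario):
    """Return the appropriateness score and the best higher-scoring alternative."""
    score = APPROPRIATENESS.get((study, scenario))
    alternatives = [(s, v) for (s, sc), v in APPROPRIATENESS.items()
                    if sc == scenario and s != study and v > (score or 0)]
    return score, max(alternatives, key=lambda t: t[1], default=None)

score, better = score_order("ct_head", "new_headache_no_red_flags")
# A low score paired with a better alternative is the "chance to change
# their mind" that the article describes.
```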

    MGH was driven not simply by a desire to reduce unnecessary procedures. Local insurance companies, concerned about paying for unneeded scans, were demanding that doctors phone up so-called radiology benefit management companies—Thrall calls the strategy “1-800-may-I-do-a-scan”—to get approval for every imaging study. The hospital negotiated with the insurance companies to use decision support software instead. The software is now licensed to the Burlington, Massachusetts–based software corporation Nuance; MGH will earn royalties each time others use it. Insurance companies in Minnesota are paying for the software instead of relying on a third party for authorization.

    David Brenner, a radiation biophysicist and director of Columbia University's Center for Radiological Research, applauds the approach. It “jar[s] physicians' memories each time,” he says, and makes them reconsider whether their plan matches what they've been trained to do. “It did quite markedly change CT prescribing patterns”—at least, at MGH.

  6. Human Genome 10th Anniversary

    Beyond Human: New Faces, Fields Exploit Genomics

    1. Elizabeth Pennisi

    Fast new genomics technology is not just for human geneticists and biomedical researchers anymore.

    Twenty years ago, the proposal to sequence the entire human genome was met with skepticism, even derision. Ten years later, the completion of the first human genome sequence was a source of awe worthy of presidential recognition. Today, it's a paragraph in a 3-year grant proposal.

    The Human Genome Project drove a technological revolution in DNA sequencing that has continued since the full draft sequences were published in Science and Nature in 2001. And as sequencing DNA has become faster and cheaper, opportunities have grown for fields outside human biology.

    Analyzing the genome of an organism once required extensive mapping of its chromosomes and the development of genetic tools specific to that species—a time-consuming and usually prohibitively expensive process. And decoding just one genome per species was the norm. Now researchers can skip those steps and even think about generating genomes or partial genomes for many members of a species. The new technology also lets researchers track gene activity on an unprecedented scale. “There's been a quantum change in what can be done and the number of organisms that can be studied,” says evolutionary biologist William Cresko of the University of Oregon, Eugene.

    That's why, in the last of our news features commemorating the 10th anniversary of the human genome, we look beyond human biology (although we peer inside the human gut in one case) to profile five research teams that have embraced genomic-scale science to tackle questions they could not have easily addressed before.

  7. Human Genome 10th Anniversary

    Tracing the Tree of Life

    1. Elizabeth Pennisi

    With the help of next-generation sequencing, a team of evolutionary biologists is shining a scientific spotlight on little-studied organisms in order to refine the much-debated animal tree of life.

    On a quest.

    Casey Dunn went to Greenland in search of rare animals for his phylogenetic studies.


    In 1994, researchers came across a completely new animal in a spring on a large island off the west coast of Greenland. Looking a bit like a worm, this microscopic creature had complex jaws and was so unusual that the biologists could not assign it to a known phylum. Instead, they put it in a group of its own, called Micrognathozoa. The oddball, officially named Limnognathia maerski, has been little studied since. Indeed, researchers in Greenland have documented only a few hundred specimens.

    Yet with the help of next-generation sequencing, evolutionary biologist Casey Dunn and his colleagues plan to shine a scientific spotlight on micrognathozoans. During a recent field trip to Greenland, they collected specimens of the tiny invertebrate and have stored them in a freezer in Dunn's lab at Brown University. In a project that would have been unthinkably expensive for a single lab just a few years ago, they will now decipher much of the creature's genome and identify its genes. And that's just the beginning. Dunn and his colleague Gonzalo Giribet, a Harvard University invertebrate biologist, have freezers full of other unusual organisms whose genomes they plan to sequence so that they can refine the much-debated animal tree of life.

    Molecular systematists have long used individual genes to assess relationships among organisms, identifying differences within equivalent pieces of DNA in various species. Controversy often ensued when those early results didn't agree with more traditional classification schemes, such as those based on fossils or morphology. To add clarity, researchers have, over time, increased the amount of genetic material they compare, creating a field called phylogenomics in which many hundreds of genes are evaluated in each analysis (Science, 27 June 2008, p. 1716).

    A few years ago, it cost about $12,000 per animal to sequence 1000 or so genes, says Dunn. Now, a few thousand dollars delivers many more genes. “We can do now what we couldn't do before,” Dunn says. That includes sequencing little-studied organisms, such as micrognathozoans, so accurately that scientists may resolve the relationships of invertebrates whose lineages split off from a common ancestor 500 million years ago.

    Odd creatures.

    Siphonophores (top) and micrognathozoans may clarify animal evolution.


    There are often challenges to sequencing unusual organisms. Sometimes researchers don't have enough DNA to work with; other times the organism has odd ratios of DNA's four bases that make decoding samples difficult. But Giribet has already sequenced and analyzed 20 of these animals, including a whip scorpion, a ribbon worm, and several mollusks. Dunn is excited about the prospect of resolving the animal tree as never before: “It's clear we are going to be able to base our tree on lots of data from lots of species.”

    Next-generation sequencing technologies are also allowing Dunn to explore the evolution of animals by documenting differences in gene expression patterns across closely related species. The goal is to find out how these changes influence shifts in traits and behaviors across the tree of life. To do this, Dunn and his colleagues have turned to a new technique, known as RNA-Seq, that can gauge genetic activity in a sample by sequencing the complementary DNAs (cDNAs) that represent specific genes. The busier a gene is in a sample, the more times its cDNA will be sequenced.
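The counting logic behind RNA-Seq is simple: map each sequenced cDNA read back to a gene, and use the tally per gene as a proxy for that gene's activity. The sketch below assumes the alignment step is already done (the read-to-gene mapping is a stand-in for a real aligner, and all names are invented).

```python
from collections import Counter

# Minimal sketch of the RNA-Seq counting idea: the busier a gene is,
# the more of its cDNA reads show up, so read counts per gene serve
# as an expression estimate. Reads and gene names are illustrative.
def expression_counts(reads, read_to_gene):
    counts = Counter()
    for read in reads:
        gene = read_to_gene.get(read)
        if gene is not None:  # reads that map to no known gene are dropped
            counts[gene] += 1
    return counts

reads = ["r1", "r2", "r3", "r4", "r5"]
read_to_gene = {"r1": "wnt", "r2": "wnt", "r3": "hox", "r4": "wnt"}
counts = expression_counts(reads, read_to_gene)
```

In a real study, comparing such counts between two samples—say, a siphonophore's swimming and feeding forms—is what flags genes whose activity differs between them.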

    Dunn and Stefan Siebert, one of his postdocs, have already compared the gene activity of the swimming and feeding forms of a siphonophore, a marine colonial organism. That analysis yielded thousands of genes potentially responsible for the differences in the animal's two structures. By repeating this experiment with multiple related siphonophore species, Dunn hopes to home in on those key to, say, the swimmer's development. “This will allow us to identify which genes have changes in expression that are associated with evolutionary changes,” he says.

  8. Human Genome 10th Anniversary

    Using DNA to Reveal a Mosquito's History

    1. Elizabeth Pennisi

    Evolutionary geneticists are applying next-generation DNA sequencing tools to probe further details of the evolutionary history of the mosquito Wyeomyia smithii, which became a poster child for climate change when it was found to have migrated north in response to global warming.

    Ten years ago, the mosquito Wyeomyia smithii lived a largely anonymous life inside the “pitchers” of the purple pitcher plant common in bogs along the eastern United States, the Great Lakes, and southeastern Canada. Unlike some of its nastier relatives, the insect isn't known to transmit diseases to people or livestock. Larvae feast on microbes and detritus inside the pitcher plant, and adults sip on nectar, not blood, for the most part. Then in 2001, husband-and-wife evolutionary geneticists Christina Holzapfel and William Bradshaw of the University of Oregon (UO), Eugene, made the mosquito a poster child for climate change when they demonstrated for the first time that an animal had evolved in response to global warming.

    Mosquito hunters.

    Christina Holzapfel and William Bradshaw embraced next-generation sequencing last year.


    Now the same researchers are applying next-generation DNA sequencing tools to probe further details of this species' evolutionary history—tools that have become so cheap and widely available that they can be applied to other poorly studied organisms as well. It's a “transformative technology,” says Mark Blaxter of the University of Edinburgh in the United Kingdom.

    Holzapfel and Bradshaw began studying W. smithii 30 years ago, curious about how the mosquito had made its way so far north, because its relatives tend to reside in the tropics. In the course of their studies, they found that from 1972 to 1996, the mosquito's larvae in Maine had gradually delayed the start of hibernation by a week. Mosquitoes from farther north had postponed hibernation even later, whereas those in Florida had stuck to the same schedule as 25 years earlier. The pair concluded that the change in this genetically controlled trait was triggered by the longer growing season that resulted from gradual warming in the northern United States (Science, 23 November 2001, p. 1649).

    Although the finding drew headlines, it still didn't explain how the mosquitoes had ended up in the north. To address that, Holzapfel and Bradshaw wanted to know where the mosquitoes were in the past, particularly following a glacial period 20,000 years ago, when a warming trend had allowed them to migrate to new habitats. And to trace the migratory history of the species, the couple needed to establish the relatedness of populations from across the mosquito's range.

    For years, they had tried to do this, but existing techniques were not able to resolve the differences between populations clearly enough. The mosquitoes from the various populations look too much alike to be distinguished morphologically, for example. In the 1990s, they tried in vain to reconstruct the biogeographical record by comparing proteins called allozymes among populations. Later, they fruitlessly looked at population differences in the insect's mitochondrial DNA. Even microsatellites, short stretches of DNA used in constructing genetic fingerprints, weren't up to the task. “We needed a better tagging or sorting system,” Holzapfel recalls.

    In 2009, they found one down the hall. UO colleague William Cresko had just teamed up with UO molecular biologist Eric Johnson to study the evolution of sticklebacks. They had genetically characterized populations of this fish by developing a catalog of single-nucleotide polymorphisms (SNPs), individual bases that vary frequently within a species. That work was made possible because a year earlier, Johnson's and Cresko's labs had developed a shortcut SNP-discovery method known as restriction-site-associated DNA sequencing (RADSeq).

    This approach takes advantage of the speed and low cost of next-generation sequencing to quickly generate thousands of SNPs that distinguish populations and individuals. Researchers start by taking animals from multiple populations of a species and using so-called restriction enzymes to chop each animal's genome into short fragments at specific DNA sequences. Each animal's DNA fragments are then joined to a unique “bar code,” a synthetic five-base strand of DNA whose sequence reveals which animal the non-bar-code DNA came from. All the fragments are then pooled together for mass processing by a next-generation sequencing machine. Because the bar codes allow the resulting sequences to be associated with specific animals, researchers aided by bioinformatics software can quickly identify genetic differences among individuals or populations.
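The demultiplexing step described above—sorting pooled reads back to their source animals by their five-base bar codes—might look like the following sketch. The bar codes and read sequences are invented for illustration; this is the general idea, not the labs' actual pipeline.

```python
# Sketch of RADSeq-style demultiplexing: each pooled read begins with a
# five-base bar code identifying the source animal. Strip the bar code,
# look it up, and bin the remaining fragment by animal. Unknown bar
# codes (sequencing errors, contamination) are simply discarded here.
BARCODES = {"ACGTA": "animal_1", "TGCAT": "animal_2"}  # illustrative

def demultiplex(pooled_reads, barcodes=BARCODES, barcode_len=5):
    bins = {animal: [] for animal in barcodes.values()}
    for read in pooled_reads:
        tag, fragment = read[:barcode_len], read[barcode_len:]
        animal = barcodes.get(tag)
        if animal is not None:
            bins[animal].append(fragment)
    return bins

pooled = ["ACGTAGGCCTT", "TGCATAACCGG", "ACGTATTTAAA"]
bins = demultiplex(pooled)
```

Once reads are binned per animal, SNP-calling software can compare the fragments across individuals to find the variable bases.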

    Test case.

    Researchers didn't need a sequenced genome to make a dense genetic map of the pitcher plant mosquito.


    For the mosquitoes, the researchers found 13,000 SNPs, 3700 of which helped to finally determine the relatedness of various populations of W. smithii. “This gave us the resolution to discriminate between postglacial populations,” says Bradshaw. Based on that information, the researchers deduced that after glaciation, a remnant population of the pitcher plant mosquitoes gradually expanded out of the mountains of North Carolina—not out of the Gulf Coast, as some had presumed. The expansion proceeded gradually northward, then westward, they reported online 26 August 2010 in the Proceedings of the National Academy of Sciences.

    When Cresko and Johnson's team tested RADSeq on the stickleback, they were able to match the fish's already sequenced genome to the newly generated sequence to help look for differences. No one had the resources to sequence the genome of W. smithii, and yet RADSeq still worked effectively on the mosquito, demonstrating that the technique could be useful for a variety of organisms, even those whose genetics are largely unknown. “This tagging system is definitely the wave of the future,” says Holzapfel.

    Furthermore, the cost for the entire mosquito study—examining all 23 populations of W. smithii—was just $3000. “The RADSeq method is cheaper, faster, and delivers thousands of markers,” says Blaxter. He and his collaborators now have 18 RADSeq projects under way in snails, moths, nematodes, butterflies, salmon, ryegrass, sturgeon, beavers, beetles, oaks, elms, and spruce. Already for the diamondback moth, a crop pest, they have used newfound DNA markers to help pinpoint a gene that makes this moth resistant to a certain insecticide. Says Bradshaw, “This is an awesome technique.”

  9. Human Genome 10th Anniversary

    Tackling the Mystery of the Disappearing Frogs

    1. Elizabeth Pennisi

    The chytrid fungus, Batrachochytrium dendrobatidis, has wiped out amphibians around the globe. The results of new sequencing technologies that directly decipher all the active genes of a species suggest that in susceptible frogs, the immune system doesn't go on the defensive; the fungus somehow evades the body's defenses.

    For more than a decade, Roland Knapp has watched and agonized as the mountain yellow-legged frog, which normally thrives in high-altitude lakes and ponds too cold for other amphibians, disappears from the Sierra Nevada. In 1997, Knapp counted 10,000 tadpoles in a single mountain lake—the frogs seemed to “occupy every possible bit of water,” he recently recalled on his blog. This past summer there were almost none. Surveys of 15,000 sites by Knapp, a field ecologist at the Sierra Nevada Aquatic Research Laboratory in Mammoth Lakes, California, and others have shown that this frog—which is actually two species—is now missing from more than 90% of its former habitat.

    Going, going.

    The mountain yellow-legged frog has disappeared from 90% of its Sierra Nevada habitat.


    There are multiple explanations for the frog's disappearing act, but a key one is the chytrid fungus, Batrachochytrium dendrobatidis, which has wiped out amphibians around the globe, including many populations of the mountain yellow-legged frogs. Yet every so often, some of these frogs survive the fungus, and Knapp has been unable to discern whether the amphibian's immune response or some environmental factor made the difference. “It's been pretty clear that our field experiments and observations only take us so far,” he explains. “We needed to go to an entire new level of investigation.”

    So he was thrilled when Erica Bree Rosenblum, an evolutionary biologist now at the University of Idaho, Moscow, approached his team about collaborating on the endangered amphibian. In the past, Rosenblum, who studies the genetic basis of animal traits such as color or limb length, had been limited to what she calls “spearfishing”: sequencing specific genes already suspected of influencing the trait. But about 5 years ago, she realized that new sequencing technologies would make it affordable to directly decipher all the active genes of a species without doing the extensive, and expensive, presequencing legwork required in the past. Thus, she could try “net-fishing,” casting a net that could ensnare more than just suspected genes.


    Roland Knapp's genomic studies may help explain the mountain yellow-legged frog's die-off.


    Rosenblum, Knapp, Cherie Briggs of the University of California, Santa Barbara, and ecologist Vance Vredenburg of San Francisco State University are now using this approach on wild populations of the frogs, comparing ones that persist despite exposure to the fungus to nonexposed ones that ultimately prove susceptible to it. The key step, which next-generation sequencing greatly facilitated, was elaborating the frog's transcriptome, its full repertoire of expressed genes, by sequencing the so-called complementary DNAs (cDNAs) that represent each gene. With these cDNAs in hand, the researchers could construct a device known as a microarray to assess which genes were active in various organs of exposed and unexposed frogs. Results so far suggest that in susceptible frogs, the immune system doesn't go on the defensive, says Rosenblum; the fungus somehow evades the body's defenses.

    The researchers are also using the same microarray to evaluate gene activity in the amphibian's skin to understand why it degrades during infection. And by sequencing microbial DNA swabbed from frog skin, they are examining whether resistant frogs have an unusual repertoire of surface bacteria, as some microbes have been found to make an effective antifungal compound. Such genomic insights are much welcomed, says Vredenburg; the sequencing projects have “affected my work immensely.”

  10. Human Genome 10th Anniversary

    Digging Deep Into the Microbiome

    1. Elizabeth Pennisi

    A new analysis of the gut microbes of 146 people, made possible by the lower cost and higher efficiency of DNA sequencing, is providing researchers a new way to evaluate which genes are "must-haves" for the microbes.

    It isn't only animal studies that have benefited from the explosion in genomics tools. Next-generation DNA sequencing has transformed microbial ecology studies as well. The past decade has seen the growth of metagenomics, in which researchers sequence DNA from a soil sample, the gut, even a computer keyboard, to learn what bacteria live there. With the new technologies, “you can sequence at a level deep enough that you can understand what's going on in the community,” says Rob Knight, a microbiologist at the University of Colorado, Boulder.

    The microbial makeup of our gut is a case in point. In the past decade, scientists have come to realize that animal intestines naturally harbor diverse microbial communities that help provide nutrients and sustain good health. A landmark 2005 study by Stanford University's David Relman and colleagues (Science, 10 June 2005, p. 1635) concluded that the bacterial communities in the human gut vary tremendously from one individual to the next. But that work looked at the guts of just three people, using traditional sequencing technology to probe for different variants of ribosomal RNA genes, each of which represented a different microbe.

    Bug hunt.

    Rob Knight studies the microbiomes of humans, dogs, and other animals.


    A new analysis of 146 people, made possible by the lower cost and higher efficiency of DNA sequencing, is now telling a much more detailed story. Junjie Qin of BGI Shenzhen in China and colleagues recently collected fecal samples from 124 Europeans, some healthy, some obese, and a few with inflammatory bowel disease. They not only identified and sequenced all available ribosomal RNA genes in the samples but also deciphered more than 3 million other genes from the bacteria in the people's guts. (The 576.7 gigabases of DNA sequence data was almost 200 times the amount generated in any previous study.)

    This more comprehensive analysis revealed limits to how much the common gut microbiome varies among people. There is a core set of gut bacteria, indicating that the “prevalent human microbiome is of a finite and not overly large size,” Qin's team wrote in a 4 March 2010 Nature report. Certain bacterial gene sets and species in the gut also correlated with obesity, Knight adds. “When you look at a lot more people, you see systematic patterns of variation” in the gut microbiome, he says.
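The "core set" notion above comes down to asking which genes show up in essentially every individual's sample. A toy version of that computation, with invented gene names and a handful of samples standing in for the study's 124 subjects, might look like this:

```python
from collections import Counter

# Sketch of the core-microbiome idea: genes present in at least some
# fraction of all individuals' gut samples form the shared core.
# Samples and gene names here are toy data, not the study's.
def core_genes(samples, min_fraction=1.0):
    """Return the genes present in at least min_fraction of samples."""
    presence = Counter(g for sample in samples for g in set(sample))
    cutoff = min_fraction * len(samples)
    return {g for g, n in presence.items() if n >= cutoff}

samples = [
    {"geneA", "geneB", "geneC"},
    {"geneA", "geneB", "geneD"},
    {"geneA", "geneB", "geneE"},
]
core = core_genes(samples)
```

Lowering `min_fraction` relaxes the definition from "in everyone" to "in most people," which is closer to how a prevalent core would be defined across 124 real subjects.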

    A twist on next-generation sequencing is also providing a new way to evaluate which genes are “must-haves” for the microbes. To find these genes, Knight and his postdoc Cathy Lozupone are working with Andrew Goodman and Jeffrey Gordon of Washington University in St. Louis, Missouri, who have developed a technique called insertion sequencing. This involves using mobile DNA elements called transposons to introduce mutations into tens of thousands of bacteria. Before adding the transposons to the bacteria, the researchers tag each transposon with an identifiable DNA “bar code” that allows each mutant bacterium to be tracked—and the gene disrupted by the transposon to be characterized. With the new sequencing technology, researchers can follow mixed populations of these mutant strains on various growth media or in different environments. The relative number of copies of a bar-coded DNA sequence will reflect the success of a particular mutant; bacteria carrying mutations in genes required for a specific environment or medium wind up poorly represented.
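The readout of such an experiment is essentially a before-and-after comparison of bar-code counts: a mutant that crashes in a given environment implicates its disrupted gene as needed there, while a mutant that booms suggests the disruption helps. The sketch below shows that comparison as a log ratio; the counts, gene names, and pseudocount smoothing are illustrative assumptions, not the Goodman and Gordon pipeline.

```python
import math

# Sketch of the insertion-sequencing readout: compare each mutant's
# bar-code count after growth in an environment with its count in the
# input pool. A strongly negative log2 ratio means the mutant was
# depleted (its disrupted gene may be required there); a positive
# ratio means it was enriched. All numbers are invented.
def fitness_scores(input_counts, output_counts, pseudocount=1):
    scores = {}
    for gene, before in input_counts.items():
        after = output_counts.get(gene, 0)
        # Pseudocount avoids log(0) for mutants that vanish entirely.
        scores[gene] = math.log2((after + pseudocount) / (before + pseudocount))
    return scores

input_counts = {"geneX": 100, "geneY": 100, "geneZ": 100}
output_counts = {"geneX": 3, "geneY": 95, "geneZ": 400}
scores = fitness_scores(input_counts, output_counts)
```

Applied across tens of thousands of bar-coded mutants and several host environments, this kind of tally is what let the team flag genes a bacterium needs in specific microbial community contexts.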

    The researchers first tried the insertion-sequencing technique on a human gut bacterium, Bacteroides thetaiotaomicron, introducing transposon-mutated strains into the guts of various kinds of mice. Some mice were normal and had their own gut microbes; others had various immune defects, were germ-free, carried human gut microbes, or had a combination of those characteristics. Two weeks later, the scientists took stock of how these bacteria fared in their different rodent hosts.

    Gut life.

    Researchers have pinpointed the must-have genes in these capsule-shaped intestinal microbes.

    CREDIT: J. L. SONNENBURG ET AL., SCIENCE 307, 5717 (25 MARCH 2005)

    In the germ-free mice, they saw a decrease in the abundance of 280 distinct bacterial strains, suggesting that the gene mutated in each had been essential to staying alive in the gut. Defects in about 90 genes seemed to provide a competitive advantage, as the corresponding mutant strains colonized the germ-free mice better than other strains did. Whether the mice carried other human gut microbes made a difference to the survival of various strains of B. thetaiotaomicron, as a different subset of mutants disappeared in those mice, the researchers reported in the 17 September 2009 issue of Cell Host & Microbe. “We were able to find genes that determine the ability of a bacterium to thrive in the mammalian gut in specific microbial community contexts,” says Goodman, now at Yale University.

    Goodman calls the insertion-sequencing approach “exciting.” Others agree. Several have begun to use it to characterize key genes for various microbes in different organisms and tissues.

  11. Human Genome 10th Anniversary

    Probing Pronghorn Mating Preferences

    1. Elizabeth Pennisi

    Animal behaviorists suspect that female pronghorns choose mates with the lowest burden of so-called deleterious mutations. Thanks to the growing availability of next-generation DNA sequencing, they may finally have a chance to prove this theory.

    Pronghorns, the American antelope, are the fastest animals on the North American continent, yet coyotes still kill many of the fawns, catching them before they develop the quickness to run away. Animal behaviorist John Byers of the University of Idaho, Moscow, has shown, however, that if a female pronghorn picks the right male, her fawns will grow faster than normal and have a much better chance of surviving. Since 1981, he and colleagues have tracked six pronghorn generations at the National Bison Range in Montana.


    Byers suspects that female pronghorns, which he found favor males best able to fight off other males, are actually choosing mates with the lowest burden of so-called deleterious mutations. Byers hasn't had a good way to prove his theory, but thanks to the growing availability of next-generation DNA sequencing, he may finally have a chance. Over the years, he and his colleagues have collected tissue samples from 835 pronghorns across the generations, and they now plan to genetically profile each animal to determine whether female pronghorns do indeed pick genetic studs. “I think it's going to be ultracool,” Byers says.