News this Week

Science  08 Sep 2006:
Vol. 313, Issue 5792, pp. 1370

    First Pass at Cancer Genome Reveals Complex Landscape

    1. Jocelyn Kaiser

    Scientists have long known that the sparks that kindle cancer are mutations in a cell's genes. But most cancer-causing mutations have been discovered by looking in obvious places, such as in the genes that control cell division. Now it seems these efforts have barely glimpsed the big picture.

    As reported online this week in Science, researchers have shined a searchlight across the genomes of breast and colorectal cancer cells, looking for mutations in more than half of all known human genes. And what they've uncovered is a much larger and richer set of cancer genes than expected.

    The findings, hailed as a tour de force by other cancer scientists, should speed the race for new drugs, diagnostics, and a better understanding of tumor development. “It will take a long time to unravel all of this, but this is what cancer is,” says Bert Vogelstein of Johns Hopkins Kimmel Cancer Center in Baltimore, Maryland, a co-leader of the sequencing effort.

    The results also appear to bolster The Cancer Genome Atlas, an ambitious $1.5 billion federal project to systematically search for genes mutated in dozens of cancer types (Science, 29 July 2005, p. 693). “I see this as a big shot in the arm for the argument that this strategy is going to work,” says Francis Collins, director of the National Human Genome Research Institute (NHGRI) in Bethesda, Maryland, which together with the National Cancer Institute will soon announce details of a $100 million, 3-year pilot effort for the atlas. Adds Eric Lander, director of the Broad Institute in Cambridge, Massachusetts, who first proposed sequencing the cancer genome, “This is a beautiful demonstration that if you turn over every rock, there is a lot more to be found.”

    Yet even supporters of the atlas say this first, quick pass at describing all cancer mutations reveals daunting complexity. And not everyone has been convinced of the larger project's value. Geneticist Stephen Elledge of Harvard Medical School in Boston, while predicting that the new study will become a “classic paper,” says that a costly sequencing project will give short shrift to functional genomics studies and take money away from investigators working on equally important cancer efforts. “I still believe we need a more balanced approach,” says Elledge, who first expressed those concerns last year (Science, 21 October 2005, p. 439).

    To conduct this mini-cancer-genome project, a 29-person team, headed by Vogelstein and Hopkins colleagues Kenneth Kinzler and Victor Velculescu, began with a database of 13,023 genes that are considered the best-studied and annotated of the 21,000 known genes in the human genome. Led by postdoc Tobias Sjöblom, the team resequenced the protein-coding regions of the genes in 11 breast cancer samples and 11 colon cancer samples, yielding 800,000-plus possible mutations. The team then winnowed out more than 99% of the mutations by removing errors, normal variants, and changes that didn't alter a protein.
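The winnowing described above is, at heart, a three-stage filter over candidate mutation calls. As a purely illustrative sketch (hypothetical field names and toy data; not the Hopkins team's actual pipeline), the logic might look like this:

```python
# Illustrative only: winnow candidate mutation calls the way the text
# describes -- discard sequencing errors, known normal variants, and
# silent changes that do not alter the protein. All field names and
# example data are hypothetical.

def winnow(mutations, known_variants):
    """Return only the calls that survive all three filters."""
    kept = []
    for m in mutations:
        if m["is_sequencing_error"]:
            continue  # call failed re-verification
        if (m["gene"], m["change"]) in known_variants:
            continue  # normal polymorphism, not tumor-specific
        if m["protein_effect"] == "silent":
            continue  # does not alter the encoded protein
        kept.append(m)
    return kept

# Toy data: three candidate calls; only the first survives the filters.
calls = [
    {"gene": "TP53", "change": "c.524G>A", "is_sequencing_error": False,
     "protein_effect": "missense"},
    {"gene": "APC",  "change": "c.100C>T", "is_sequencing_error": True,
     "protein_effect": "missense"},
    {"gene": "KRAS", "change": "c.33A>G",  "is_sequencing_error": False,
     "protein_effect": "silent"},
]
survivors = winnow(calls, known_variants=set())
print([m["gene"] for m in survivors])  # -> ['TP53']
```

In the actual study, filters of this kind removed more than 99% of the 800,000-plus initial calls.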

    Genetic bounty.

    Breast (top) and colorectal (bottom) cancer cells contain many mutated genes.


    They ultimately found that the average breast or colon tumor has 93 mutated genes, and at least 11 are thought to be cancer-promoting. This yielded a total of 189 “candidate” cancer genes. Although some are familiar—the tumor-suppressor gene p53, for example—most had never been found mutated in cancer before. And the abundance of certain types of genes, such as those involved in cell adhesion and transcription, suggested that these processes play a huge role in cancer. The results, says Ronald DePinho of the Dana-Farber Cancer Institute in Boston, are a “treasure trove.”

    Verifying that each candidate gene is important to cancer won't be simple. Not only did the cancer genes differ between colon and breast cancers, but each tumor had a different pattern of mutations. The number of genes suggests that there may be more steps to cancer than thought. “It's a much more complex picture than we had anticipated,” Vogelstein says.

    At least two other pilot cancer-genome projects—one funded by NHGRI and one led by Michael Stratton and P. Andrew Futreal of the Sanger Institute in Hinxton, U.K.—are yielding similar results. The Sanger effort is looking at 500 genes in a larger number of tumor samples and cancer types and, according to an e-mail from Stratton and Futreal, has also found a “tremendous diversity of mutation number and pattern between cancers.”

    DePinho says the mutation differences from tumor to tumor could help explain why 90% of drugs fail in patients. Elledge, for his part, says the relatively small number of new genes common to the tumors reinforces his concerns about The Cancer Genome Atlas. He suggests that some of the government's money would be better spent on more direct studies, such as screens for lethal genes in cancer cells. The cost of the Hopkins study alone—Vogelstein says it took about $5 million, mostly from private funding sources—could fund five National Institutes of Health (NIH) grants on such topics, Elledge notes.

    Despite such doubts, the atlas project gets under way next week. NIH will announce the three cancers to be studied in the pilot phase and a set of repositories that will supply tissue samples for sequencing. Centers that will characterize the genes will be announced in early October. The project is on an “extremely aggressive timeline,” says DePinho, who co-chairs its advisory committee.


    Basic Science Agency Gets a Tag-Team Leadership

    1. Gretchen Vogel

    BERLIN—In a surprise decision, Europe has selected two leaders as successive heads of its new basic science agency, the European Research Council (ERC). The governing council announced last week that it has chosen biochemist Ernst-Ludwig Winnacker, current president of the German funding agency DFG, to be secretary general of ERC, which will make its first awards next year. But in July 2009, halfway through the 5-year term, Winnacker will be succeeded by Spanish economist Andreu Mas-Colell, who will serve through 2011.

    Twice the talent.

    European Research Council picks Winnacker (left) and Mas-Colell.


    Members of ERC's board said they created the unusual arrangement to recruit executives with different skills, not because either candidate requested a short appointment. “We couldn't pass up these two exceptional people who are very complementary,” says scientific council chair Fotis Kafatos, a molecular geneticist at Imperial College London. “Either one would have been great; having both will be even greater.”

    ERC is designed to be a sort of National Science Foundation (NSF) for all of Europe, and its $9.6 billion budget over 7 years is expected to fund cutting-edge research. But as the European Union-backed initiative gets off the ground, it faces a legacy of red tape in European science funding. Researchers have high hopes that it will prove much more user-friendly than the previous R&D efforts, called “Framework” programs, roundly criticized for the mountains of paperwork they generate. Kafatos says one early triumph is ERC's ability to make awards as research grants instead of the complicated contracts that other E.U. funding schemes require.

    The ERC Scientific Council, made up of 22 leading scientists from across Europe, sets ERC's rules and scientific guidelines. The secretary general will be ERC's chief executive, serving as a liaison between the Scientific Council and the European Commission, which will handle day-to-day operations.

    Both Winnacker and Mas-Colell say they were surprised to learn that they would serve truncated terms, news they received along with the job offer, but both say they were honored to be chosen. Scientific Council vice-chair Helga Nowotny of the Vienna Centre for Urban Knowledge Management says the arrangement is intended to take advantage of the strengths of both men. In its start-up phase, she says, ERC needs someone with extensive experience overseeing a large granting organization. That's what Winnacker has done at DFG. But it will also need someone to stump for increased funding and to deal with politicians who may be unhappy with grants awarded on the basis of excellence without regard to geographic distribution. Mas-Colell's credentials as an economist and former state research minister will help him make a persuasive case, Nowotny says: “I think we will make good use of both of them, and we need both of them.”

    Winnacker, 65, had already announced plans to step down as DFG president at the end of 2006. This is “a solid appointment of someone who knows how to manage science at the highest level,” says Frank Gannon of the European Molecular Biology Organization in Heidelberg, Germany. Winnacker's experience at the semiautonomous DFG makes him well positioned to fight for ERC's independence if challenged by the E.U. Parliament or member country politicians, Gannon says: “He will not be pushed around.”

    Mas-Colell, 62, is a professor at Pompeu Fabra University in Barcelona, president of the European Economic Association, and was commissioner for universities and research for Catalonia from 1999 to 2003. He is credited with fostering science investment in the region, which led to the development of several new institutes in Barcelona (Science, 2 June, p. 1295). Mas-Colell spent 26 years at the University of California, Berkeley, and Harvard University before returning to Spain in 1995. Last year, he told a meeting of economists to judge ERC's success on how closely it emulated the U.S. NSF. He says now that he was thinking especially of NSF's widely praised peer-review system.

    The scientific council's first call for applications will target young scientists, with 5-year awards of €100,000 to €400,000 per year. It hopes to award 200 such grants annually. A second program will target “advanced investigators” in a program intended to overcome both the limited size of awards given by national councils and the E.U.'s requirement that large projects be divided among many countries.

    “One of the weaknesses of the European system is that most of the national [funding] councils are too small to fund their excellent scientists adequately,” Winnacker says. But until now, large collaborative projects typically have required investigators from multiple countries. Winnacker says ERC's freedom from such geographical constraints will be “a big step forward. … No one would require someone from Massachusetts to collaborate with someone from South Dakota.”

  FDA

    Proposed Guidelines for Emergency Research Aim to Quell Confusion

    1. Jennifer Couzin

    Doing research in the emergency room would be difficult even if the rules were clear, but many clinicians say they aren't. Last week, the U.S. Food and Drug Administration (FDA) suggested revisions to its regulation over an ethically fraught but critical area: studies conducted in emergency situations, when subjects may be unconscious and unable to give consent. The current 10-year-old FDA rule permits emergency research under narrow circumstances—in life-threatening medical conditions in which available treatments are unsatisfactory.

    Under review.

    Research in emergency situations, which raises tough ethical questions, is receiving FDA scrutiny.


    Hoping to clarify the responsibilities of investigators, institutional review boards (IRBs), and others involved in emergency research, FDA has released draft guidelines that spell out each group's responsibilities. The agency is now accepting comments on the document and will hold an 11 October public meeting on the subject. One concern for FDA is that some terms that guide emergency research, such as “life-threatening,” may be defined differently by different people. In its proposal, the agency explains that “life-threatening” includes nonfatal risks, noting that emergency research on, say, victims of stroke or head injury could explore a treatment's ability to prevent disability as well as death.

    Emergency research came under scrutiny earlier this year after The Wall Street Journal described a blood-substitute trial in trauma patients unable to consent, in which some suffered heart attacks. FDA officials said in a conference call last week that its review had already been under way and was unrelated to the blood-substitute flap. “It's taken time for us to develop and gather a sizable body of data on how this regulation has actually worked,” said Sara Goldkind, an FDA bioethicist. The agency, she notes, has received roughly 60 applications for emergency research that allows for exceptions to informed consent and so far has approved about 20.

    Physicians who perform such trials agree that the existing rules can be bewildering. “There's been a lot of anxiety and some confusion … about these regulations and how to apply them,” says Lynne Richardson, an emergency-medicine specialist at Mount Sinai School of Medicine in New York City. For example, the dozens of IRBs overseeing a nationwide defibrillator study in which Richardson was involved required wildly different levels of community consultation.

    Graham Nichol, who directs the University of Washington Harborview Center for Prehospital Emergency Care in Seattle, believes that confusion over the current rules has discouraged appropriate emergency research and, by making it difficult to follow up with subjects after treatment, sometimes failed to protect patients. The number of published cardiac-arrest trials in the United States has decreased since the rules were implemented while the number of non-U.S. studies grew, he found.

    Will the new draft guidelines help? “I'm not sure they're any better,” says Nichol, calling them still “too full of nuance.” But, says Richardson, the new guidelines are clearly “an attempt to make sure that all of the research that actually qualifies in FDA's view” can go forward.


    Scientists Object to Massachusetts Rules

    1. Constance Holden

    Massachusetts stem cell researchers thought they were home free last year when the state legislature, overriding a veto by Republican Governor Mitt Romney, sanctioned research using human embryonic stem (hES) cells. But newly adopted final regulations to implement that legislation would cut off what some argue is an important potential avenue of stem cell research.

    In May 2005, state lawmakers passed a measure that explicitly permits scientists to do things that federally funded researchers cannot—derive new lines of hES cells, including disease-specific lines produced using somatic cell nuclear transfer (SCNT), otherwise known as research cloning. The law allows ES cell lines to be produced from spare embryos left over after in vitro fertilization but prohibits the “donation” of embryos created just for research via IVF. Violating that provision, added to satisfy those who worry about “embryo farms,” is punishable by up to 5 years in jail and a $100,000 fine. But the wording does not forbid scientists from working with such embryos if they weren't made in Massachusetts.

    Romney tried unsuccessfully to amend the bill so that “use” of any such embryos in research would also be illegal. After the Democrat-controlled legislature overrode his veto, the state Department of Public Health trumped the lawmakers by inserting the wording Romney wanted into the regulations. “The prohibition on the creation of embryos [by fertilization] solely for use in research is implicit in the language” of the law, contends the Public Health Council, the nine-member body that makes the regulations. “[W]here the primary purpose is research, only the asexual creation of an embryo is permitted.”

    When the proposed regulation was presented in May, eight Boston medical institutions argued that it would “give the force of law to a provision the legislature specifically rejected.” Scientists from those institutions reiterated their concerns last week when the final rules appeared. Harvard stem cell researcher Kevin Eggan says the regulation would prevent Massachusetts scientists from using cell lines derived in other states if they came from embryos created for research purposes. He stresses that it's important to preserve this option as an alternative to SCNT—which has not yet been proven—for creating disease-specific cell lines.

    More options.

    Harvard's Kevin Eggan says purpose-bred embryos may be needed if nuclear transfer doesn't work for creating disease-specific cell lines.


    But some scientists question the rule's impact on research. “I don't see it as a problem,” says stem cell researcher Evan Snyder of the Burnham Institute in San Diego, California. “Most scientists agree that you don't want to make embryos specifically for research,” he says, because it appears to be “ethically dicey.”

    The lawmakers are prepared to reassert their authority, starting with a hearing later this month. The leading gubernatorial candidates in the fall election (Romney is not running for reelection) support stem cell research, suggesting that the political winds are also favorable for a revision.


    Germany Launches a High-Tech Initiative

    1. Gretchen Vogel

    BERLIN—If Bill Gates had tried to start Microsoft from his father's garage in Germany, it never would have worked, says Holger Frommann of the German Venture Capital Association in Berlin. Among other things, he says, the government would have said that the garage didn't have enough windows to be a proper working environment. And whereas high-tech start-ups need less than a week to register in the U.S. or the U.K., he says, in Germany it can take much longer to complete the paperwork.

    The German government says it wants to make it easier for a German Bill Gates to translate research discoveries into products; to this end, it is increasing support for programs that help spin scientific findings into commercial ventures. In a wide-ranging “high-tech strategy” announced last week, the government says it will spend €14.6 billion ($19 billion) in the next 3 years to boost technology-based research and enterprises, including about €6 billion in new funding.

    Lowering barriers.

    The German government wants to make it easier to turn research results into profits.


    The government wants to “ignite ideas,” with a combination of new programs, funding schemes, and legislation, according to a multiagency strategy that Chancellor Angela Merkel and Research Minister Annette Schavan announced on 30 August. Researchers who collaborate with small- and midsized companies, for example, will qualify for a 25% funding premium from the government, up to €100,000. The government says it wants to change the tax law to encourage venture capitalists to invest in start-up companies. And the agriculture ministry has promised a new law governing genetically modified plants that should clear the way for more field trials.

    The plan also includes several new funding schemes. Some €80 million would back technologies for preventing terrorist attacks and for disaster prevention and response, and €800 million would foster health and medical technologies, including new support for clinical research and teaching hospitals. The two largest investments are €3.65 billion for aerospace research, including satellite communication and navigation systems, and €2 billion for energy technologies, including biofuels and nuclear energy.

    Tax breaks for start-up companies could be especially important, says Hans-Jürgen Klockner of the German Association of Biotechnology in Frankfurt. German scientists and industry leaders have long sought venture capital tax laws more in line with those of France and the United Kingdom. The details will be ironed out this fall in talks with the finance ministry.


    Academic Earmarks: The Money Schools Love to Hate

    1. Jeffrey Mervis

    An unusual query from a “pork-busting” U.S. senator has revealed an uneasy ambivalence among university presidents toward academic earmarks. Their answers suggest that, like it or not, such directed spending on research is now part of the fabric of higher education.

    On 27 July, Senator Tom Coburn (R-OK) asked 110 U.S. universities to describe any federal research dollars obtained in the past 6 years through the good graces of their congressional delegations rather than via a competitive review. He also wanted to know which universities have hired lobbyists to help obtain earmarks and the impact of the found money on their campuses and on science.

    Coburn, who chairs a Senate financial management subcommittee, calls research earmarks, which have grown into a multibillion-dollar-a-year phenomenon (see graphic), “a gateway drug to overspending.” His six-question letter set off a month-long frenzy of meetings and conference calls among vice presidents for sponsored research, directors of federal relations, professional associations, and lobbyists to figure out how, and whether, to respond. Only 14 schools met Coburn's 1 September deadline, although a few told him they needed more time.

    Research a la carte.

    Congress has become increasingly fond of larding agency budgets with university research projects based in their districts.

    SOURCE: AAAS 2006

    Respondents, which included major research universities and leading recipients of federal earmarks, offered varying views of earmarking. But even those who said they abhor the practice acknowledged occasional dalliances. Cornell University President David Skorton, for example, cited “a long-standing and well-documented policy of not pursuing or accepting earmarks from federal agencies that award funds on a competitive basis” before acknowledging, two paragraphs later, that “Cornell makes two exceptions to this policy.” The biggest is earmarked funds from the Department of Agriculture's cooperative research and extension service, which provides about 1.5% of the university's $381 million federal research budget. “They've worked on the basis of earmarks since 1865,” explains Robert Richardson, Cornell's vice provost for research, about a program he says is essential to fulfilling Cornell's role as a land-grant college.

    The University of Michigan shares Cornell's distaste for pork, says Stephen Forrest, vice president for research, although his reply to Coburn notes that Michigan last year received three earmarks totaling $5.3 million. In fact, the university has adopted a formal application process—much like a grant proposal in its length and complexity—for faculty members who think their idea deserves to be one of the school's “rare exceptions.”

    Some universities see earmarks as a way to simultaneously move up the academic food chain and strengthen the local economy. “The direct appropriations that the Kentucky delegation works hard to acquire for the university are an important part of UK's federal funded projects,” writes Lee Todd Jr., president of the University of Kentucky, who notes that his school has received “over 100 [since 2000] worth a total of $120 million.” Wendy Baldwin, UK vice president for research and the former head of extramural research at the National Institutes of Health, explains that earmarks “can help us to get into the top 20” recipients of federally funded research by public universities. The university closely monitors how the money is spent, she says, adding that “we expect people to advance based on this boost.”

    Not every institution is as comfortable as Kentucky is in speaking openly of its appetite for earmarks. University of Missouri President Elson Floyd, for example, provided the same answer to two of Coburn's questions, saying curtly that “all specific objectives and goals [for the research funded by the earmark] are outlined by the granting Federal agency… and specific measures of success are determined by [those] specific goals and objectives.” And Floyd gave one-word answers—no, yes, and yes—when asked whether Missouri has a policy on earmarks, hires lobbyists to snare them, and thinks they are beneficial to the school. (In an increasingly common practice among universities, Missouri retains a Washington lobbyist, Julie Dammann, former chief of staff to Missouri's senior senator, Republican Kit Bond, well-known for his earmarking prowess.)

    John Hart, Coburn's communications director, says his boss blames his legislative colleagues more than the academic community for what is happening. “The earmark process doesn't help universities so much as it helps lobbyists and Congress,” says Hart, who notes that Coburn has held dozens of hearings on all manner of federal spending practices. “Because every time they get an earmark, the politicians can hold a press conference to claim credit.”

    Not surprisingly, Coburn's aggressive campaign has angered influential senators who are also heavyweight porkers. Senator Ted Stevens (R-AK), chair of the Senate Appropriations Committee and author of the notorious $225 million “bridge to nowhere” earmark for his state, has so far blocked Coburn's bid to create a publicly accessible database of Senate earmarks. And many legislators are said to be incensed that Coburn went over their heads in asking universities how they obtained specific earmarks.

    Those tensions are a big reason that universities found Coburn's letter so troublesome. “The last thing you want to do,” explains one university lobbyist, “is to get caught in the middle of a fight between two powerful senators.”


    U.S. Supreme Court Gets Arguments for EPA to Regulate CO2

    1. Eli Kintisch

    Can a 36-year-old U.S. law intended to reduce air pollution keep up with science? The U.S. Supreme Court will address the question this term in a case about whether greenhouse gases such as carbon dioxide should be regulated as pollutants. Several prominent climate researchers hope the court will also correct what they see as a distortion by a lower court and the federal government of the current state of the science.

    Courting disaster.

    Coastal flooding is a likely impact of climate change, researchers tell the U.S. Supreme Court.


    First passed in 1970, the landmark Clean Air Act gave the new Environmental Protection Agency (EPA) the ability to tackle new pollutants as researchers discovered them. The law requires EPA to set vehicular emission standards for substances that could “reasonably be anticipated to endanger public health or welfare.” But although the statute defines effects on “welfare” to include impacts on climate as well as on soils and water, the agency has used the act to regulate smog and other pollution from cars—not greenhouse emissions.

    In 1999, as scientific evidence of climate change impacts accumulated, a Washington, D.C., nonprofit organization petitioned EPA to change its mind. EPA declined, and in 2003 a number of states and nonprofit groups sued. That case, Massachusetts v. EPA, is now before the Supreme Court, and last week 12 states and a number of cities and nonprofit groups filed their arguments.

    The filing coincides with new state limits for industrial emissions passed by the California legislature last week. “We cannot do the job alone,” said Ross C. “Rocky” Anderson, mayor of Salt Lake City, Utah, in a press briefing last week. EPA says it won't touch the issue because, among other things, “numerous areas of scientific uncertainty” surround climate change. Because greenhouse gases aren't pollutants, EPA officials assert, the agency doesn't have the authority to regulate them.

    What's especially galling to a number of prominent climate scientists is the agency's use of a 2001 White House-requested report from the National Academies' National Research Council (NRC). It stressed the scientific consensus on climate change but noted that the “health consequences … are poorly understood.” The report also cites the challenge of differentiating between anthropogenic climate change and “natural variability.” Massachusetts and its allies believe that the appeals court erred in its July 2005 ruling that gave EPA broad discretion to avoid a rigorous scientific analysis of the harmful effects of carbon dioxide.

    In a friend-of-the-court brief filed last week, a group of researchers says that the scientific evidence “is clearly sufficient” to support a “reasonable anticipation” of the risks of greenhouse gases. Both EPA and the appeals court “mischaracterized” the 2001 report by quoting from it selectively, they add. “We have the responsibility to correct when science is misrepresented,” says Inez Fung, a University of California, Berkeley, climate researcher and one of six members of the 2001 climate panel who signed onto the brief. Panel chair Ralph Cicerone, now president of the National Academy of Sciences, declined to join the effort, a spokesperson said, because NRC reports “can and must stand on their own.”

    EPA, with allied states and industries, will file its arguments next month. Jay Austin, an attorney with the Environmental Law Institute in Washington, D.C., says that Massachusetts's reliance on the text of the 1970 law could play well with a majority of the justices, who are expected to rule before their term ends in June.


    A Better View of Brain Disorders

    1. Greg Miller

    As imaging methods such as fMRI and PET make their way from lab to clinic, neurologists hope to make earlier and more accurate diagnoses of brain disorders

    Inside look.

    Long used in research, PET brain imaging is gaining a foothold in neurological practice.


    It wasn't so long ago that turning a patient upside down was the state of the art in clinical brain imaging. The technique, called pneumoencephalography, involved injecting air bubbles into the fluid surrounding the spinal cord and strapping the patient into a rotatable chair. As the chair swiveled, the bubbles floated upward and moved along the surface of the brain, allowing a series of x-ray images to better distinguish its contours. “You put the x-ray images together in your mind's eye, and you'd get a picture of the brain,” recalls Marcus Raichle, a neurologist at Washington University in St. Louis, Missouri, who learned the method in the late 1960s. Pneumoencephalography helped neurologists find tumors and diagnose other problems that altered the gross anatomy of the brain. But the films were hard to interpret, Raichle says, and the procedure gave patients a nasty headache.

    The advent of x-ray computed tomography scans in the early 1970s made pneumoencephalography obsolete almost overnight. When magnetic resonance imaging (MRI) came into clinical use in the early 1980s, it gave neurologists even more detailed snapshots of the brain's structure. But these techniques have shortcomings as well. Unlike, say, a femur, the fitness of the brain is hard to assess from still pictures.

    Slowly but surely, a new generation of brain-imaging methods is finding its way from research labs into the clinic—and these techniques are offering physicians a much more dynamic look into the brain. Functional MRI (fMRI), a method used since the early 1990s to infer brain activity in studies of human cognition, now helps neurosurgeons map patients' brains before surgery, and a report on page 1402 raises the possibility of using fMRI to determine whether a patient in a vegetative state has conscious thought.

    Old school.

    Pneumoencephalography was unpleasant for patients and produced fuzzy x-ray images of the brain (inset).


    Positron emission tomography (PET), another standard tool of cognitive neuroscientists, also has medical promise. Clinicians already use PET to distinguish Alzheimer's disease from other types of dementia, and they are investigating ways to use PET to diagnose Alzheimer's and other diseases before symptoms appear—and before substantial structural damage to the brain has occurred. Some scientists even envision a day when real-time images of a patient's neural activity will provide a treatment for chronic pain or guide therapy sessions for psychiatric disorders.

    Obstacles remain, even for developing routine diagnostic applications, but many experts say clinical uses of these brain-research tools are long overdue. “There's no question it's the future of my field,” says John Ulmer, a radiologist at the Medical College of Wisconsin in Milwaukee and president of the American Society of Functional Neuroradiology (ASFNR), a group founded in 2004 to promote clinical applications of brain-imaging tools such as fMRI and PET. “It's not going to revolutionize the treatment of brain diseases with one broad stroke, but it's entering the clinical realm gradually, and it's going to continue to grow.”

    A “spectacular result”

    The case study reported in this week's issue of Science (see related Perspective on p. 1395) hints at how measures of neural activity can provide a dramatically different picture of the brain than that gleaned from now-routine structural MRI scans. Adrian Owen, a neuroscientist at the Medical Research Council Cognition and Brain Sciences Unit in Cambridge, U.K., and his team used fMRI to examine brain function in a young woman who sustained severe head injuries last year in a traffic accident. Five months after the accident, she was unresponsive and unable to communicate, and she met the clinical criteria for a vegetative state.

    However, fMRI scans showed that language-processing regions of her brain became active when words were spoken to her but not when she was exposed to nonspeech sounds. Sentences containing ambiguous words such as “creek/creak” activated additional language regions, as they do in healthy people. These findings indicated that she retained some ability to process language, Owen says.

    In another test, the researchers instructed the woman to picture herself playing tennis or walking through her house. In healthy people, imagining each activity activates a different set of brain areas involved in planning movements. The patient's fMRI scans showed an identical pattern—clear evidence, Owen and colleagues say, that she made a conscious decision to follow their instructions.

    Although some researchers aren't convinced Owen's team has cinched the case for consciousness in this woman, most agree that the fMRI scans reveal evidence of cognition that could not have been anticipated from standard MRI scans. “It's a spectacular result,” says Nicholas Schiff, a neurologist at Weill Medical College of Cornell University in New York City.

    Owen hopes to build on this work to develop a battery of fMRI tests for measuring cognitive functions in brain-damaged patients who are unable to communicate. He says this approach might someday be used to customize a patient's rehabilitation. For instance, if a patient's fMRI scans revealed an incapacitated visual system but a working auditory system, therapists could employ speech and sound. It's a wonderful idea, says Schiff, but a “staggering” amount of work is needed to make it happen.

    Yet fMRI has already made some clinical inroads, most notably in presurgical planning. For example, patients with tumors in the left frontal lobe of the brain present an especially tricky challenge for neurosurgeons trying to remove the cancer without destroying nearby brain tissue that controls speech and movement. Ulmer and his colleagues have been using fMRI to map out the brain regions responsible for these functions in presurgical patients, and they've recently added an MRI method called diffusion tensor imaging (DTI) to map the tracts of axons conveying information from one brain region to another. Surgeons use this road map to determine how to reach a tumor and how much tissue to remove, Ulmer says. “We've seen a fivefold decrease in neurological complications with [combined fMRI and DTI] mapping for left frontal lobe tumors at our institution,” he says.

    Researchers and clinicians are still experimenting with DTI, and most hospitals don't have the equipment and expertise to use it. More physicians have already embraced fMRI. In 2004, 30% of neuroradiologists responding to an ASFNR survey reported that their institutions used fMRI for presurgical planning, and nearly double that number expected to be using it in the near future.

    Signs of trouble.

    In PET scans, PIB lights up regions of β-amyloid accumulation (red-yellow) in an Alzheimer's patient (left) but not in a healthy control (right).


    Scientists are also excited about using fMRI in the early diagnosis of Alzheimer's disease. Although there are currently no drugs capable of slowing the disease's rampage through the brain, early diagnosis will be key if such drugs are found. Otherwise, any intervention may be too late to reverse the damage done.

    In 2004, Michael Greicius, a neurologist at Stanford University School of Medicine in Palo Alto, California, and colleagues reported in the Proceedings of the National Academy of Sciences (PNAS) that they'd used fMRI to distinguish people with mild Alzheimer's disease from healthy elderly people. Alzheimer's patients at rest had less activity in a “default network” of brain regions, first identified by Raichle and colleagues, that includes certain regions of the cerebral cortex and the hippocampus, a crucial memory region. Such changes probably reflect a long-term decline in cellular metabolism caused by the disease, Greicius says. Although other researchers have argued that using fMRI to monitor brain activity in subjects engaged in memory tests should be the most sensitive way to pick up early signs of Alzheimer's disease, Greicius fears that smaller hospitals may not have the expertise to do task-activated fMRI. His approach—if it proves its merit in larger trials—would be far easier to use. “It's the sort of thing that could be done at a community hospital, where a technician presses a button and says, ‘Keep your eyes closed,’ and the software does the rest.”

    Scott Small, a neurologist at Columbia University, is taking what he thinks is a more targeted approach to picking up early signs of Alzheimer's disease. Like Greicius, he's using fMRI to look for long-term changes in brain metabolism rather than for short-term changes in brain activity evoked by a task. But instead of using BOLD fMRI, which measures blood oxygenation and is widely used by researchers to infer neural activity, Small has been working to refine a variant of fMRI that measures a different indicator of metabolic activity, blood volume.

    There's an emerging consensus that Alzheimer's disease strikes the hippocampus first and afflicts some parts of the structure before others, Small says. The blood-volume method provides better spatial resolution—enough to distinguish hippocampal subregions—and is easier to interpret than BOLD fMRI, Small says. His studies on animal models of Alzheimer's disease and preliminary work with people suggest that the earliest detectable sign of the disease is reduced metabolism in the entorhinal cortex, a region closely connected to the hippocampus. Small and colleagues at Columbia now have a grant from the National Institute on Aging to evaluate the diagnostic potential of the method in up to 1000 elderly people.

    Neurologists' PET

    In the Alzheimer's arena, fMRI is a step or two behind PET. So-called FDG-PET, which measures glucose uptake in the brain (another metabolic indicator and proxy for neural activity), has been used in recent years to distinguish Alzheimer's disease (which reduces metabolism in temporal lobe structures such as the hippocampus) from frontotemporal dementia (which reduces metabolism in the frontal lobes) in people with signs of dementia. It's become more popular since Medicare began reimbursing doctors for the procedure in 2004.

    FDG-PET has also shown promise for detecting Alzheimer's disease before symptoms appear. In a study reported in PNAS in 2001, a team led by Mony de Leon, a neurologist at New York University, used FDG-PET to monitor glucose metabolism in the brains of 48 healthy elderly volunteers. Three years after those initial scans, 11 of the volunteers had developed moderate cognitive impairments and one had developed Alzheimer's disease. Reduced metabolism in the entorhinal cortex during the initial scanning session was the measure that best predicted which people experienced a subsequent decline, de Leon and colleagues reported.

    His team has recently completed a study of a larger group of elderly people followed for longer periods of time. “With FDG-PET, we can pick up changes [in the brain] 9 years before the onset of symptoms,” says de Leon. He adds that work from his group and others suggests that maximizing the sensitivity and accuracy of diagnostic tests will require combining FDG-PET with other biomarkers, such as levels of Alzheimer's-related compounds like β amyloid and tau in the cerebrospinal fluid. A more comprehensive evaluation of FDG-PET's diagnostic promise should come from the 5-year, $60 million Alzheimer's Disease Neuroimaging Initiative (ADNI), a federally funded longitudinal study of 800 elderly people, half of whom will receive an FDG-PET scan.

    Burning pain.

    As they seek to minimize computer-generated flames, chronic pain patients in an fMRI machine are actually trying to quell neural activity in pain-processing regions of their brains (right).


    ADNI will also investigate another potential use of PET: imaging β amyloid, the main ingredient in the β-amyloid plaques that are a defining characteristic of Alzheimer's disease. In 2002, researchers hailed the long-awaited discovery of a radioactive compound that makes it possible to see β amyloid in the brains of living people (Science, 2 August 2002, p. 752). Several pharmaceutical companies are already using this compound, called PIB, in clinical trials to monitor the effectiveness of candidate Alzheimer's drugs aimed at reducing β-amyloid buildup in the brain, says PIB co-inventor William Klunk, a neurologist at the University of Pittsburgh in Pennsylvania. In July, the Alzheimer's Association announced a $2.1 million grant that will enable ADNI-funded researchers to incorporate PIB PET scans into their studies to evaluate the method as a diagnostic test for Alzheimer's disease.

    Blood loss.

    Less blood volume (cooler colors) in the entorhinal cortex distinguishes a patient with early Alzheimer's disease (left) from a healthy elderly person (right).


    The original version of PIB utilizes a radioactive isotope—carbon-11—with a half-life of just 20 minutes, limiting its use to hospitals with easy access to a cyclotron. Klunk, in partnership with GE Healthcare, has recently developed a version of PIB based on fluorine-18, which has a far more convenient 110-minute half-life. The first research studies with F-18 PIB in humans should be under way by the end of this year, Klunk says.

    Several other PET-compatible β-amyloid-imaging compounds are under investigation around the country. “These are coming fast and furious,” says Kenneth Marek, a neurologist and president of the Institute for Neurodegenerative Disorders, a nonprofit research institute in New Haven, Connecticut.

    PET markers are also in the works for Parkinson's disease—and one is already in clinical use in Europe. A marker called DaTSCAN, also developed by GE Healthcare, uses radioactive iodine to label dopamine transporters, proteins in nerve terminals that recycle the neurotransmitter dopamine after it's released into the synapse. Such methods provide a general indicator of whether the dopamine system, which breaks down in Parkinson's patients, is working properly, Marek says, and in principle they should be able to spot trouble before a clinician can. “By the time you've developed symptoms, you've probably lost 50% of these dopamine transporters,” he says.

    Marek and colleagues have investigated another compound that labels dopamine transporters, β-CIT. In pilot studies using single-photon-emission computed tomography, a method similar to PET, it showed promise for distinguishing Parkinson's disease from other movement disorders. In a group of 35 suspected Parkinson's patients referred by a community neurologist to a movement-disorders specialist, the imaging results with β-CIT agreed with the patients' ultimate diagnosis more than 90% of the time—an improvement over the 75% accuracy of the initial diagnosis made by the referring doctors, the researchers reported in 2004 in the Archives of Neurology.

    Extinguishing pain's flame

    Some researchers argue that the clinical uses of PET and fMRI won't be limited to diagnosing brain disorders. In the 20 December 2005 PNAS, neuroscientists reported using fMRI to teach people with chronic pain to monitor and control their own brain activity—a high-tech version of biofeedback. The research team included scientists from Stanford and the Massachusetts Institute of Technology and was led by Christopher deCharms, a neuroscientist and president of Omneuron, a start-up company in Menlo Park, California.

    Each patient slid into an fMRI scanner and watched a computer-generated flame flickering on a monitor. The intensity of the flame reflected, with a few seconds' delay, neural activity picked up by the scanner in the patient's rostral anterior cingulate cortex, a region implicated in pain perception. The patients who best learned to minimize the flame reported the greatest reduction of pain symptoms immediately after the session. Another group of patients whose flames were fed by neural activity in their posterior cingulate cortex, an area not associated with pain processing, showed no such reduction.

    “I thought this was enormously clever,” says Raichle. Biofeedback has been tried previously for chronic pain, he says, but this is the first attempt to specifically target the brain regions that process pain. DeCharms's team is now doing a larger trial with weekly neurofeedback sessions for pain patients and following up to see how long the effect lasts.

    Omneuron is also experimenting with real-time fMRI to assist psychotherapy. The firm's preliminary work has been in people with obsessive-compulsive disorder (OCD). Last year, at the annual meeting of the Organization for Human Brain Mapping, deCharms and colleagues described the method. Patients with OCD lie in the scanner, where they see the computer-generated flame, as well as a video link to their therapist, who sits in the control booth and also keeps an eye on the flame.

    It's far too early to say whether the method will work. One of the central challenges, deCharms says, is determining the best brain areas to fuel the flames. Fortunately, he adds, functional neuroimaging methods such as fMRI have already provided many clues about what regions are involved in many psychiatric disorders. “The big question for us is, ‘How can we take this nearly 20 years of research and turn it into clinical applications?'”


    A Threatened Nature Reserve Breaks Down Asian Borders

    1. Richard Stone

    Chinese and Koreans share a love of Changbai Mountain, which straddles their border. Now that the area is under threat, the two sides may join hands to save it


    Changbai's stunning vistas are drawing increasing numbers of tourists—and increasing pressures on the landscape.


    CHANGBAISHAN NATURE RESERVE, CHINA—To many Chinese, Changbai Mountain, whose jagged volcanic summit cups a crater lake on the border of North Korea, is the fatherland of Manchurian emperors who rose to power during the Qing Dynasty 4 centuries ago. Koreans, meanwhile, revere the iconic peak, which they call Paektu, as the birthplace of their culture and the nerve center of resistance to Japanese colonial rule in the 1930s and '40s. For scientists, Changbai is precious for another reason: It's a unique set of ecosystems under siege. Now, a new Chinese initiative aims to save it.

    Changbaishan Nature Reserve, the largest protected temperate forest in the world, is home to endangered Siberian tigers and the last stands of virgin Korean pine-mixed hardwood on the planet. It's “one of the most spectacular and relatively undisturbed ranges in China,” says Burton Barnes, a forest ecologist at the University of Michigan, Ann Arbor, who conducted research here in the 1980s and early '90s. But aggressive logging along the reserve's Chinese edge, and conversion to croplands on the Korean side, threaten to turn Changbai into “an oasis in a sea of clearcutting,” says Wang Shaoxian, director of the Jilin Changbai Mountain Academy of Sciences (JCMAS).

    The reserve, roughly half the size of New York's Long Island, is also under increasing pressure from the inside. Chinese hot-spring resorts and Korean revolutionary museums on Changbai's flanks—the rugged, isolated terrain provided cover for the resistance—have transformed the reserve into a tourist mecca.

    Hoping to counter these threats to the fragile ecosystems, the Chinese government this year designated Changbaishan, or “Perpetually White Mountain,” as a major research initiative in its latest 5-year plan. It's pouring money into new facilities and projects, including a biodiversity survey and a study of how to better manage the Changbai ecosystems. The venerated mountain may also become a symbol of science transcending boundaries. Chinese and North Korean forest ecologists, who have had scant contact in recent years, are discussing the potential for collaborations at Changbai. From the vantage of local authorities, such cooperation “would be incredibly possible,” says Ding Zhihui, deputy director of the Jilin Changbaishan Protection, Development, and Management Committee.

    A research stint at Changbai has long been a rite of passage for many Chinese researchers. Biologists, volcanologists, and meteorologists would winter at a cliff-hugging station with stunning views of Heaven Lake (in Korean, Lake Chon). “That time of year, it's like the North Pole here,” says Dai Limin of the Institute of Applied Ecology in Shenyang. Temperatures can plunge below −40°C, and heavy snowfalls make the winding road up the peak impassable for months. Only in 2001 did the hardy winter crews finally yield to automated stations. Year-round observations, especially volcanic monitoring, are critical, says Wang. Changbai has been quiet since minor eruptions in 1597, 1688, and 1702. “It's due,” Wang says. Chinese spas are deemed within striking distance of future lava flows.

    Bald spots.

    Logging and clear-cutting for crops have broadened the mountain's bare patches, indicated in pink on this Landsat map.


    Once the snow melts, the highlands teem with researchers. The Chinese Academy of Sciences (CAS) runs Changbai like a scientific boot camp, deploying an army of grad students and young researchers each summer. The ringlike ecological zones that change with altitude are a top draw. From the sky, the demarcation of forest types appears like a target, with the 2700-meter summit as the bull's-eye. “It's very unusual to have distinct ecological zones so easily observable in one area,” says Wang. Outside the reserve, he notes, one would have to hopscotch thousands of kilometers to see all the forest zones on display at Changbai.

    Barnes and other U.S. ecologists have made scientific pilgrimages to Changbai. “I was very impressed with the beauty and diversity of the area,” says Mark Harmon of Oregon State University, Corvallis. “The buzz of the bees in the basswood trees was just amazing.” With CAS colleagues, Hank Shugart of the University of Virginia, Charlottesville, is using Changbaishan as a test bed for modeling vegetation response to climate change across Eurasia.

    But scientific affection has not translated into robust protection. “Although no tree is allowed to be logged within the reserve, biodiversity has been degraded due to other human activities,” says Guofan Shao of Purdue University in West Lafayette, Indiana, who has mapped forest zones at Changbai. The most severe disturbances stem from the harvesting of two valuable commodities: ginseng roots and pine nuts. Wild ginseng is disappearing, so forest plots are cleared for ginseng plantations, causing erosion. And the removal of pine nuts impairs regeneration and forces animals such as the gray squirrel or the spotted nutcracker that feed on the nuts to find other food sources or die out. Local authorities, for the first time, have banned the collection of pine nuts in the reserve this year. As a result, says Shao, “they basically have to send people to guard the forest” during the summer months.

    Reaching out.

    Wang Shaoxian hopes to work with North Korean scientists to restore Changbai's embattled ecosystems.


    Jilin authorities created JCMAS earlier this year to strengthen and coordinate research in the reserve. Although Changbai boasts a panoply of life, including more than 2000 plant species, “there has never been a systematic survey,” says Wang. Just such an initiative started last December and should be completed this autumn, he says. JCMAS plans to work with universities and CAS institutes to compile a DNA library of the reserve's flora and fauna. And Barnes says a comparison of Changbaishan's ecosystems with similar regions in Japan and eastern North America, “before further development renders them fragmented and domesticated, is of the highest international priority.”

    Such work would undergird an ambitious attempt to “balance the competing interests of tourism and environmental protection,” Wang says. Down the road, he says, saving Changbai may mean extending the reserve's boundaries, which could require resettlement of villagers. Support for such a drastic measure might get a boost if UNESCO declares Changbai a World Heritage Site as expected in 2008, prompting a management and research policy vetted by international experts.

    Chinese officials hope to kick off cooperation with North Korea in advance of World Heritage designation. “We're very interested in working with them to restore the ecosystems,” says Wang. Since spring, he explains, the Chinese government has been providing “much more encouragement” for contacts with North Korean researchers. “The quality of their scientists is high,” says Dai, who in 2002 visited North Korea's lakeshore research station, at the bottom of a zigzagging staircase hundreds of meters long that's visible from the Chinese side. And exploratory talks have begun on involving U.S. researchers in projects with North Korea and China. Barnes, for one, is eager. North Korea's forests “are one of the least well known to Western ecologists of any in the temperate zone,” he says.

    Wang should be in a position to host collaborations in autumn 2007, when JCMAS expects to complete construction of a new research building. In the meantime, he and his colleagues are happy to see a treasure of two cultures finally getting the scientific attention it deserves.


    Sex and the Single Killifish

    1. Elizabeth Pennisi

    Males seem to be superfluous in one fish species but may come in handy when genetic diversity is needed

    Males—who needs them? Not the mangrove killifish. Made up primarily of hermaphrodites, the species reproduces just fine without the masculine touch. Yet male killifish do exist and can play a role in the species' survival, says John Avise, an evolutionary geneticist at the University of California (UC), Irvine. He and his colleagues have now shown that mangrove killifish are part of a select group of animals that use this unusual reproductive strategy, known as androdioecy.

    This particular killifish “is the single species of any vertebrate that is doing this,” says Stephen Weeks, an evolutionary ecologist at the University of Akron, Ohio.

    Among androdioecious species, which include certain clam shrimp, barnacles, and nematodes, most individuals have a single gonad that produces both eggs and sperm, which meet internally before the eggs leave the body. But in each of these species, a few diehard males exist.

    Until recently, evolutionary biologists considered androdioecy to be a transitory phase that occurs while a species, depending on its need for either genetic diversity or reproductive self-sufficiency, switches from two separate sexes to hermaphroditism, or vice versa. One reason is that “it's a high evolutionary hurdle” for males to persist among hermaphrodites, explains Loren Rieseberg, an ecologist at the University of British Columbia, Vancouver. “Males need twice the fertility of hermaphrodites.”

    Weeks has found that clam shrimp have no trouble jumping this hurdle, suggesting that for at least some species androdioecy is a viable, long-term solution. He recently added nine new species of clam shrimp to the list of androdioecious shrimp, for a total of 13. Moreover, the phylogeny and biogeography of these species indicate that this male-hermaphrodite strategy has lasted between 24 million and 180 million years, Weeks and his colleagues reported online in the 6 December 2005 Proceedings of the Royal Society B.

    Avise is just beginning to piece together the story of the mangrove killifish. It lives in the muck around the roots of mangroves in the Caribbean and along the coasts of South Florida and northern South America, hanging out in crab burrows and dead logs. Self-fertilization by the hermaphrodites yields offspring that are virtual clones of the parent, which is why researchers once expected to see little genetic diversity among killifish at any particular location.

    Going it alone.

    Neither the mangrove killifish (bottom) nor the clam shrimp (top) needs a male to reproduce.


    But 15 years ago, ichthyologist Bruce Turner of Virginia Polytechnic Institute and State University in Blacksburg discovered that certain populations had unexpectedly high levels of genetic diversity. He proposed that these fish might have unusually high mutation rates or that fish immigrating from other populations were the source of this variation. “Turner had it wrong,” says Avise.

    Working with colleagues, including Mark Mackiewicz of the University of Georgia, Athens, Andrey Tatarenkov of UC Irvine, and Turner himself, Avise collected killifish from along the Florida coast and analyzed their DNA. The group focused on 35 markers, DNA sequences called microsatellites, along the genome. In each population, the researchers found some individuals whose microsatellites were virtually identical. But, as they reported online 5 July in the Proceedings of the Royal Society B, some samples contained a few individuals whose DNA differed at so many markers that it raised suspicions that there was a second parent somewhere in the picture.

    As far back as the 1960s, ichthyologists had demonstrated that they could, in the lab, produce male mangrove killifish by keeping self-fertilized eggs cool, for instance, or by growing immature hermaphrodites at high temperature. But little was known about what conditions produced males in the wild.

    Avise and his team found very few males among the killifish collected in Florida or the Bahamas. But when they repeated the study with fish from Belize, 10% to 20% of the catches were male. And DNA analyses revealed dramatic differences in diversity among killifish from the various locations. Those from any one spot in the Bahamas or Florida were genetically similar, whereas members of Belize populations varied in their genetic makeup about as much as would be expected had they been following the typical male-female reproductive strategy, the researchers reported in the 27 June Proceedings of the National Academy of Sciences. More recently, Avise's group has confirmed in lab experiments that these males mate with the hermaphrodites and produce viable young that spice up the genetic diversity. They will report these results in an upcoming issue of the Journal of Heredity.

    The existence of androdioecy in species as different as killifish and shrimp indicates that “there must be underlying biological commonalities in the kinds of selection pressures … and the evolutionary responses involved,” says Avise. Weeks and other researchers think this strategy has worked so well—and for so long—in clam shrimp because they live in ephemeral pools and often find themselves trapped in new places sans partners. The widespread distribution of killifish suggests that it, too, is a good colonizer and that hermaphroditism may facilitate that skill, Avise adds.

    But David Bechler, an ichthyologist at Valdosta State University in Georgia, suspects that hermaphrodites won't always have the upper hand among these killifish. Both he and Avise agree that mangrove killifish were once a two-sex species. And although conditions now favor hermaphrodites, the high proportion of males in Belize suggests that the low genetic diversity is becoming a handicap. “What we are seeing is male evolution reoccurring,” Bechler suggests.


    Artificial Arrays Could Help Submarines Make Like a Fish

    1. Briahna Gray

    An interdisciplinary team has developed nanostructures that mimic how marine animals hunt, evade predators, and stay in the swim of things

    Listen. As you read, tiny hair cells in your inner ear amplify and convert sound waves into electrical signals that can alert you to the output of your iPod or the approach of a subway train. Similar structures on other animals, such as seal whiskers and the hairs on spider legs, help those organisms to track prey and evade predators. Now, engineers and biologists have developed the world's first functional artificial hair cell to mimic one of nature's most widespread and versatile data-collecting systems: the lateral lines of fish.


    Researchers modeled flow sensors on tiny hair cells found on fish such as this mottled sculpin.


    In a paper published in an August issue of EURASIP Journal on Applied Signal Processing, engineer Chang Liu of the University of Illinois, Urbana-Champaign, describes how biologically inspired microstructures enable a model fish to locate and track a dipole source. Real fish use a linear swath of hair cells on their sides, known as the lateral line, to coordinate group movements, avoid predators, and otherwise navigate. “I'm thrilled to see this,” says Jeannette Yen, director of the Center for Biologically Inspired Design at the Georgia Institute of Technology in Atlanta. “It shows that we do understand the biological system well enough to make a mimic that works in a similar way.”

    Morley Stone, a former program manager at the U.S. Defense Advanced Research Projects Agency (DARPA), which funds Liu under a project called BioSenSE (Biological Sensory Structure Emulation), hopes that artificial hair cells might someday be used to navigate crewless underwater vehicles too small to be equipped with cameras. The hair cells would greatly expand underwater imaging capacities beyond those now generated by sonar or cameras, he notes. “When you look through a soda straw, it's hard to get an idea of what your world looks like,” says Stone.

    Like their analogs in real fish, Liu's hair cells work by measuring the movement of nearby water. Most commercial flow sensors measure the change in electrical resistance when flowing water cools a heated metal wire; Liu has also developed lateral-line arrays using this more conventional “hot wire” technology, but his hair sensors are instead activated by force. They are made using a standard microfabrication technique called photolithography to carve polymers into long, flexible, narrow strands about 500 to 700 micrometers long and 80 micrometers in diameter. The strands are rooted in a silicon base called a pedal, creating a minuscule lever. When the hairs are bent, the strain on the pedal causes a change in electrical potential that correlates with flow velocity.
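    The transduction chain described above—hair deflection produces strain at the base, strain produces an electrical signal, and the signal is mapped back to a flow velocity—can be caricatured in a few lines. All of the constants and function names below are illustrative assumptions, not values or code from Liu's paper; real devices are calibrated empirically.

```python
# Hypothetical sketch of a hair-cell flow sensor's transduction chain.
# Every constant here is invented for illustration.

def strain_from_deflection(deflection_um, hair_length_um=600.0):
    """Bending strain at the hair's base, taken as proportional to tip
    deflection for small angles (simple cantilever approximation)."""
    return deflection_um / hair_length_um

def voltage_from_strain(strain, gauge_factor=2.0, bias_v=1.0):
    """Strain-gauge output: fractional resistance change times bias voltage."""
    return bias_v * gauge_factor * strain

def flow_velocity_from_voltage(delta_v, calib_mm_s_per_v=500.0):
    """Invert an assumed linear calibration to recover flow velocity."""
    return delta_v * calib_mm_s_per_v

# A 3-micrometer tip deflection on a 600-micrometer hair:
strain = strain_from_deflection(3.0)          # 0.005
dv = voltage_from_strain(strain)              # 0.01 V
velocity = flow_velocity_from_voltage(dv)     # 5.0 mm/s
print(velocity)
```

    In a real array, each hair would be calibrated individually, and the response is only approximately linear over a limited range of deflections.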

    Liu tested his lateral-line array by installing it in an artificial fish. The model was attached via a rod to an agile motion stage, which repositioned the fish according to the signals its sensors picked up from a wriggling dipole source. Although Liu's array used only 16 hairs rather than the 100 usually found on real fish, the artificial fish was able to target and track the moving source.
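One way to see how an array of flow readings can localize a source: a dipole's near field falls off steeply with distance, so the hair with the strongest response sits closest to the source. The sketch below assumes an idealized 16-sensor array and a made-up field model; real tracking algorithms process the whole array's response pattern, not just its peak.

```python
# Hypothetical sketch: localize a dipole along a 16-hair array by finding
# the sensor with the largest flow amplitude. The 1/r^3 falloff and the
# 1-mm sensor spacing are illustrative assumptions, not the experiment's.

def locate_source(sensor_x, amplitudes):
    """Return the along-array position of the strongest response."""
    peak = max(range(len(amplitudes)), key=lambda i: amplitudes[i])
    return sensor_x[peak]

# 16 sensors spaced 1 mm apart; a source at x = 7.2 mm excites the
# nearest hair (at 7 mm) most strongly.
xs = [i * 1e-3 for i in range(16)]
readings = [1.0 / (abs(x - 0.0072) + 1e-4) ** 3 for x in xs]
nearest = locate_source(xs, readings)  # the hair at about 0.007 m
```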

    The BioSenSE team includes biologists, neurologists, engineers, and mathematical modelers, all working to reverse-engineer nature's blueprint. “This is one of the largest international groups we've been able to pull together,” says Stone. For example, Sheryl Coombs, a neurobiologist at Bowling Green State University in Ohio, has collected data on the spatial distribution of pressure along the lateral line of real fish to develop algorithms sensitive enough to process the wealth of information gleaned by Liu's sensors. That information was then validated by numerical simulations carried out by biologist-engineer Joseph Humphrey of the University of Virginia, Charlottesville, and applied to the programming efforts of Douglas Jones, an engineer at the University of Illinois, Urbana-Champaign. “It illustrates the best of this new set of collaborations between biologists and engineers,” says Steven Vogel of Duke University in Durham, North Carolina, who studies biomechanics.

    Coombs's experiments show that even blinded fish still orient themselves toward movement via a “map of touch” created by their sensory system. Abroad, zoologists Horst Bleckmann of the University of Bonn in Germany and Friedrich Barth of the University of Vienna in Austria are studying seals and spiders, respectively, with an eye toward applications in underwater tracking and airborne drones.

    At Iowa State University in Ames, engineer Vladimir Tsukruk and his team used a synthetic hydrogel to mimic the soft cupula tissue that surrounds fish hair cells and helps relay information. The gel both protects the hairs against corrosion and makes them 10 times more sensitive. Liu's hair sensors can detect flows slower than 1 millimeter per second, half the rate of conventional sensors. However, increasing the sensitivity of the sensor is a double-edged sword, says Liu, because of the added burden of filtering out unwanted noise. Scientists are using fish biology as a guide to tackle that problem as well, mimicking the structural alignment of hair cells that allows real fish to weed out background noise.
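A common engineering trick for this problem, loosely analogous to the opposed orientations of hair cells along a fish's lateral line, is differential sensing: two sensors see the same ambient noise but respond to a local stimulus with opposite sign, so subtracting their outputs cancels the shared noise. The sketch below is a generic illustration of that principle, not Liu's actual circuit.

```python
# Generic differential-sensing sketch: paired sensors share the same
# ambient noise but pick up the local stimulus with opposite polarity,
# so half their difference recovers the stimulus and cancels the noise.

def differential(signal_a, signal_b):
    """Return (a - b) / 2 sample by sample for two paired sensor traces."""
    return [(a - b) / 2 for a, b in zip(signal_a, signal_b)]

noise = [0.5, -0.3, 0.8]                      # ambient flow, seen by both
stimulus = [0.1, 0.2, 0.1]                    # local signal of interest
sensor_a = [n + s for n, s in zip(noise, stimulus)]
sensor_b = [n - s for n, s in zip(noise, stimulus)]
recovered = differential(sensor_a, sensor_b)  # close to the stimulus
```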

    Although the sensors were developed primarily to help guide small, robotic vehicles, Liu suggests that they could also assist submarines. For example, submarines now employ passive sonar to avoid giving away their position. But because passive sonar only listens for sounds generated by other objects, it cannot detect a stationary submarine or the subtle vortices shed by large rocks. In addition, active sonar requires the emitted “ping” to travel away from the ship so that the feedback can be analyzed. That constraint creates a blind zone around the craft that makes subs vulnerable to sabotage by bomb-carrying divers, says Liu.

    The right bent.

    The artificial hair cells are only 500 to 700 micrometers long and can be adapted to function as both vibration and tactile sensors.


    Liu says that his array can eliminate that problem by detecting movement within a radius of about three to four times the length of the vessel, 200 meters or less for a full-sized submarine. Liu's hair cells are sensitive enough to detect both divers and large, unmoving bodies such as rock faces that are normally invisible in dark or murky conditions. Hair-cell sensors also have shown the potential to track other submarines based on wakes created minutes before, just as seals use their whiskers to track their prey. To turn those applications into reality, however, the artificial hair cells must be robust enough to withstand a marine environment.

    Scientists can also imagine nonmilitary applications for the sensors. Changing the shape of the hair, Liu speculates, could yield vibration or tactile sensors in addition to flow sensors. Scaling up production could lower the cost of semiconductor sensors from $12 to $1 per unit, opening up markets as diverse as sneakers, MP3 players, and stress gauges in buildings in earthquake-prone areas.

    Despite the many challenges, Stone predicts that DARPA will pick up the project for a second term beginning this fall. And if all goes well, someday hair cells might alert your iPod as well as your ear to the rumbling of an approaching subway train.


    Sea Animals Get Tagged for Double-Duty Research

    1. Christopher Pala*
    1. Christopher Pala is a writer in Honolulu, Hawaii.

    Elephant seals and other deep-diving species are providing an unexpected boost to a global oceanographic database

    Eight years ago, Dan Costa tagged nine elephant seals to learn how the sea mammals would respond to an expected El Niño event, a shift in a cold-water current in the Pacific. Sensors glued to the seals' backs were designed to record the depth at which they dived and the temperature of the water, while transmitters glued to their heads gave out their position. Once tagged, the giant pinnipeds lumbered out from their rookery on Año Nuevo Island near Santa Cruz, California. Some went to the Aleutian Islands, others to the Gulf of Alaska, and a third group headed straight west into the central Pacific.

    After one season, the seals returned to Año Nuevo toting detailed records of 75,000 dives in the North Pacific. Costa, a biologist at the University of California, Santa Cruz, learned that the seals dive more frequently and deeper than previously thought—some 60 times a day, routinely as far down as 600 meters, and sometimes as deep as 2000 meters. In the last decade, tagging of this kind has given researchers increasingly sophisticated data from fishes, turtles, seals, and whales, revolutionizing our understanding of how they behave under the surface (Science, 11 August, p. 775).

    In depth.

    The frequent, deep dives of California elephant seals provide a wealth of information about the ocean.


    But in addition to the bounty of information on the animals' movements, their dives also pointed to a new method for scooping up hard-to-get information about the ocean that's useful for climate research. The method promises a wealth of physical data from the deep that will soon dwarf the amount gathered by ships and research buoys. And whereas the first wave of tagged elephant seals could only record depth and temperature, today's more sophisticated tags also capture salinity. “Different water masses have unique temperature and salinity signatures, and these can be used to trace the origin of the oceanic water in a given region,” says Costa.
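Costa's point about temperature-salinity signatures can be made concrete with a toy classifier: match a measured T-S pair to the nearest of a set of reference signatures. The water masses and values below are invented for illustration; real analyses use full T-S profiles and established climatologies.

```python
# Toy water-mass identification: pick the reference signature closest to
# a measured (temperature, salinity) pair in T-S space. The names and
# values below are illustrative, not real oceanographic signatures.

REFERENCE = {
    "subarctic Pacific": (4.0, 33.0),   # (deg C, practical salinity units)
    "central Pacific":   (15.0, 34.5),
    "equatorial":        (25.0, 35.0),
}

def nearest_water_mass(temp, salinity):
    """Return the reference water mass closest in T-S space."""
    def dist(name):
        t, s = REFERENCE[name]
        return (temp - t) ** 2 + (salinity - s) ** 2
    return min(REFERENCE, key=dist)

origin = nearest_water_mass(5.2, 33.1)  # "subarctic Pacific"
```

In practice temperature and salinity span very different numeric ranges, so a real comparison would normalize the two axes before measuring distance.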

    Researchers want to learn about temperatures and water density in the polar regions, for example, because they affect circulation and climate. James Hansen, chief of NASA's Goddard Institute for Space Studies in New York City, says that although researchers have collected data from the upper layers of most of the oceans, the polar regions are poorly covered. With support from ocean scientists, Costa and others are now tagging animals in these less explored areas, taking advantage of their ability to reach places where no machines can go.

    Seals as lab assistants

    Looking over the collection of 75,000 depth profiles from elephant seals, Costa and his team thought the results might interest oceanographers. “But we had no idea what to do with the data, who to give it to, or how to prepare it,” he recalls. That summer, Costa presented the findings on El Niño's effects on elephant seals (surprisingly slight) at a meeting at the Scripps Institution of Oceanography in San Diego, California.

    In the audience sat George Boehlert, then a lab chief at the U.S. National Oceanic and Atmospheric Administration (NOAA). “This was incredible data,” recalls Boehlert, now head of Oregon State University's Hatfield Marine Science Center in Newport. “I was really surprised at the frequency of the dives and how deep these seals go.” After the presentation, Boehlert told Costa he knew how to check the figures against existing data and, if they were accurate, how to enter them into a massive repository called the World Ocean Database (WOD).

    Big picture.

    Dan Costa's team has been tagging elephant seals for 10 years at Año Nuevo Island near Santa Cruz.


    NOAA had funded the database to hold records from ships and submarines. Later, it added data from its 2500 “Argo” buoys, which drift around the world at about 1000 meters below the ocean's surface, rising every 10 days to transmit temperature profiles. According to Sydney Levitus, the NOAA scientist who manages the database, each year Argo buoys provide 100,000 depth profiles, whereas other buoys, ships, and submarines provide about 140,000.

    Back in 1998, Boehlert recalls, few oceanographers knew about animal electronic tags, and “among those who knew, there was a great deal of skepticism about the quality of the data.” But the data proved reliable after being checked against profiles obtained by ships and satellites. So the 75,000 profiles from elephant seals were added to the ocean database. Boehlert, Costa, and Levitus also published a proof-of-concept paper in 2001 in the Journal of Atmospheric and Oceanic Technology. “You can't understand a climate system without knowing what's going on at depth,” Levitus says. “So we want all the data we can get.”

    But the flow quickly dried up. What happened? After the California elephant seal study, Costa says, “we stumbled around trying to get funds to get tags, but we got nothing for years. We reused the tags we had,” he adds, but “we had no money to pay someone to process the data.” Although he and Levitus had shown the utility of the data for oceanography, that community has been slow to recognize its value—and to seek funding from the relevant federal agencies.

    But that situation is changing, as interest in using tagging data for ocean research is on the rise. Since 2000, the Tagging of Pacific Pelagics (TOPP) program, funded mostly by private foundations, has been tagging 23 species in the Pacific Ocean. Seven of those species—the air-breathing ones that carry location transmitters—now produce about 1 million depth/temperature profiles a year. And TOPP hopes to format the data and deposit it in WOD within a year.

    Under the ice

    Two years ago, Costa and a team from Old Dominion University in Virginia won a 3-year, $800,000 grant from the National Science Foundation to join colleagues from France, the United Kingdom, and Australia in a program called Southern Elephant Seals as Oceanographic Sensors. The group is tagging 70 southern elephant seals, who then spend much of their time diving and feeding under the Antarctic pack ice. As they go about their business, the seals are gathering more than 10,000 profiles a year.

    Antarctic data are critical for the study of ocean circulation, says Steve Rintoul, a U.S. oceanographer based at the Antarctic Climate and Ecosystems Cooperative Research Center in Hobart, Australia. Surface waters cool and become denser in the polar regions, sinking several kilometers to the ocean bottom. Warm water then flows in, creating the so-called thermohaline circulation. This process controls how much heat and carbon dioxide is stored by the ocean, influencing the rate of climate change. Climate models suggest that warming at the poles could slow down the circulation, driving further warming. But there are “almost no measurements,” he says, because “subs aren't allowed … in this blind spot” and the Argo buoys can't transmit through the ice.

    Meanwhile, Costa has turned over more than 1 million profiles—a decade of California elephant seal data—to Steven Bograd, an oceanographer with NOAA's Pacific Fisheries Environmental Laboratory in Pacific Grove, California. Bograd, another co-principal investigator for TOPP, is harmonizing and calibrating the data before comparing them with climatic events in the past decade, including two El Niño events. The goal, says Bograd, is to “better understand the mechanisms by which these climate signals impact the ecosystem.”

    So far, the most recent data from animal tags haven't gone into the ocean database, Costa says. “The reason it takes time is that we're coming up with much more precise and reliable methods of defining where the profiles were taken than we were in 1999,” he says. “Five years ago, anything was valuable, but now it's compared to the Argo buoys, which are very precise.”

    How soon might these profiles be ready for the database? “We're working on it,” Costa says. “I think we'll be able to turn over 2 years' worth of data, which is about 25,000 depth profiles, within 6 months.” Oceanographers and climate researchers await the promised deluge.
