News this Week

Science  19 Dec 2008:
Vol. 322, Issue 5909, pp. 1766

    Reprogramming Cells

    1. Gretchen Vogel

    By inserting genes that turn back a cell's developmental clock, researchers are gaining insights into disease and the biology of how a cell decides its fate.


    This year, scientists achieved a long-sought feat of cellular alchemy. They took skin cells from patients suffering from a variety of diseases and reprogrammed them into stem cells. The transformed cells grow and divide in the laboratory, giving researchers new tools to study the cellular processes that underlie the patients' diseases. The achievement could also be an important step on a long path to treating diseases with a patient's own cells.


    The feat rests on a genetic trick, first developed in mice and described 2 years ago, in which scientists wipe out a cell's developmental “memory,” causing it to return to its pristine embryonic state and then regrow into something else. In 2008, researchers achieved another milestone in cell reprogramming. In an elegant study in live mice, they prompted cells to make the leap directly from one mature cell type to another—flouting the usual rule that cellular development is a one-way street. These and other advances in tweaking cells to assume new identities add up to make the now flourishing field of cellular reprogramming Science's Breakthrough of the Year.


    See Web links on reprogramming cells

    This year's breakthroughs have done much to wipe out memories of a major scandal that erupted 3 years ago, after scientists in South Korea fraudulently claimed to have used somatic cell nuclear transfer—the technique used to clone Dolly the sheep—to generate stem cells from patients suffering from type 1 diabetes, spinal cord injury, and a congenital immune disease. The debacle dealt the field a huge setback; patient-specific stem cells seemed like a distant prospect.

    The new developments build on two previous breakthroughs. Ten years ago last month, scientists in Wisconsin announced that they had cultured human embryonic stem (hES) cells—cells with the potential to form any cell type in the body. That power, known as pluripotency, opened up a world of possibilities in developmental biology and medical research, but it came with baggage: Because isolating the cells typically destroys the embryo, the research sparked fierce debates over bioethics. In many countries, including the United States, political decisions limited the work scientists could do with hES cells.

    In 2006, Japanese researchers reported that they had found a possible way around the practical and ethical questions surrounding hES cells. By introducing just four genes into mouse tail cells growing in a lab dish, they could produce cells that looked and acted very much like ES cells. They called these cells induced pluripotent stem (iPS) cells. Last year, in a development recognized as the first runner-up in Science's 2007 Breakthrough of the Year issue, the same team and two others in the United States extended the reprogramming technique to human cells. That result opened the floodgates to new research.

    Cells, made to order

    For nearly a decade, stem cell biologists have sought a way to make long-lived cell lines from patients suffering from hard-to-study diseases. (Most adult cells do not survive culture conditions in the lab, so taking cells of interest directly from patients doesn't work.) This year, two groups achieved that goal. One team derived iPS cell lines from the skin cells of an 82-year-old woman suffering from amyotrophic lateral sclerosis (Lou Gehrig's disease), a degenerative disease that attacks the motor neurons, causing gradual paralysis. The scientists then directed the cells to form neurons and glia, the cells that are most affected by the disease (photos, below).

    Just a week later, another group reported making patient-specific iPS cell lines for 10 different diseases (see table), among them muscular dystrophy, type 1 diabetes, and Down syndrome. Many of these diseases are difficult or impossible to study in animal models; the reprogrammed cells give scientists a new tool for studying the molecular underpinnings of disease. They may also prove useful in screens for potential drugs. Eventually, such techniques might allow scientists to correct genetic defects in the lab dish and then treat patients with their own repaired cells.

    Another paper published this year suggests that the reprogramming exit ramp does not have to lead back to an embryonic state but can take a cell directly to a new mature fate. American researchers, working in mice, reprogrammed mature pancreas cells called exocrine cells into beta cells, the cells in the pancreas that produce insulin and are destroyed by type 1 diabetes. The team injected a cocktail of three viruses into the pancreases of adult mice. The viruses primarily infected the exocrine cells, and each one carried a different gene known to play a role in beta cell development. Within days, the treated mice formed insulin-producing cells that looked and acted like bona fide beta cells.

    The results are surprising because in living creatures, specialized cells almost never change course by turning, say, from a muscle cell into a lung cell. Such direct reprogramming, however, might be simpler and safer than using pluripotent cells to treat some diseases. The technique might also enable scientists to speed up the lab production of desired cell types, using defined factors to change one type of cultured cell directly into another.

    Wanted: more breakthroughs

    Although researchers made impressive progress in 2008, several more breakthroughs are needed before cellular reprogramming yields its first cure for disease. For reprogramming to be safe enough to use in cell therapy, researchers must find an efficient, reliable way to trigger it. They also want to understand exactly how the process works. Although dozens of labs have used the technique, what is happening inside the reprogrammed cell remains a mystery, and a combination of chance events seems to determine which rare cells end up being reprogrammed. A leading theory is that some of the reprogramming factors first help to loosen up the DNA in a cell's nucleus, making it easier to reactivate turned-off genes. Then the other factors help to set off a cascade of protein signals that give a cell its new identity (see the Review by Gurdon and Melton on p. 1811).

    The original reprogramming recipe relies on viruses to insert the reprogramming genes into the infected cell's genome, altering the DNA permanently. Scientists are wary of that approach for a couple of reasons. First, the inserted DNA could interrupt existing genes—for example, those that guard against cancer, leaving the cells likely to form tumors. And although the inserted genes seem to turn off after reprogramming is finished, allowing the cell's own genes to take over, scientists worry that the inserted genes could be reactivated or could have other subtle effects on the cell.

    For that reason, labs around the world are working on other ways to trigger reprogramming. This year, they made rapid progress. Several groups found that they could substitute chemicals for some of the inserted genes. Another found that adenoviruses could also do the trick, at least in mouse cells. Adenoviruses, which cause the common cold, do not insert themselves into the genome. The viruses express their genes long enough to reprogram the cells, but as the cells divide, the viruses are diluted down to undetectable levels, leaving reprogrammed cells with their original genomes unchanged. Researchers in Japan showed that rings of DNA called plasmids could also carry the required genes into the cell. The alternatives are much less efficient than the original recipe, however, and most have not yet worked in human cells, which are harder to reprogram than mouse cells.


    To be useful, reprogramming also needs to become much more efficient. Most experiments have managed to reprogram fewer than one in 10,000 cells. In what seems to be a lucky break for the field, however, two groups showed this year that the skin cells called keratinocytes are particularly easy to reprogram. Researchers can reprogram roughly 1% of the keratinocytes they treat, and the process takes only 10 days instead of the several weeks that other cells require. Hair follicles (photo, above) are a rich source of keratinocytes, and researchers in California and Spain showed that they could efficiently derive personalized cell lines from cells taken from a single human hair plucked from the scalp—an even easier source of cells than cutting out a piece of skin.


    Finally, reprogramming needs better quality control. This year, an American group took a major step in that direction by making cells in which the reprogramming genes could be turned on by the addition of the antibiotic doxycycline. They then used the reprogrammed cells to generate “second generation” iPS cells that are genetically identical—each contains the same viral inserts. These cells will allow scientists to study the process of reprogramming for the first time under standardized conditions and should help to reveal the biochemical processes that enable an adult cell to take an exit ramp from its one-way path of development.

    A thorough understanding of reprogramming is not enough, however. Ten years after the discovery of human ES cells, scientists are still working on standardizing procedures for coaxing pluripotent cells to become mature tissue. It's a critical problem: Stray pluripotent cells used in therapies could trigger dangerous tumors. And even though scientists can easily prompt pluripotent cells to become beating heart cells in a lab dish, no one has yet perfected a way to get such cells to integrate into the body's tissues to replace or repair their diseased counterparts. But researchers are moving faster down the highway of discovery than many had expected or dared to hope.

    See Web links on reprogramming cells


    The Runners-Up

    The first direct detections of exoplanets topped the list of this year's runners-up for Breakthrough of the Year. Other notable discoveries included cancer genes, new high-temperature superconductors, and a new water-splitting catalyst.

    2 Seeing Exoplanets

    See Web links on exoplanets

    Seeing might be believing, but for scientists belief rarely depends on seeing. The right squiggles coming out of an instrument are usually enough to confirm that they have caught their quarry, however infinitesimal, insubstantial, or bizarre. Astronomers searching for planets circling other stars, however, may have been getting just a tad impatient with their progress toward their ultimate goal: recognizing a habitable, even an inhabited, planet beyond our own solar system. For that, they'll need to see their target. But all exoplanet detections had been of the squiggly variety.

    Now, astronomers have seen exoplanets for the first time—a half-dozen candidates have been announced in the past few months. To some, the new observations may simply have replaced squiggles with dots. But the faint pinpricks of light from far-off worlds have captured the public's imagination and will give astronomers new clues to what those distant planets are made of and how they were formed. Key to these direct detections have been big telescopes and the latest technology to pick out a vanishingly faint planet from its host star's overwhelming glare.

    Previous, indirect detections of more than 300 exoplanets had provided breakthroughs of their own. For 13 years, astronomers have been finding exoplanets using ground-based telescopes to monitor the subtle wobble a planet gravitationally induces in its star. This workhorse radial-velocity technique is especially useful for finding massive “hot Jupiters” searingly close to their star. No light is seen from the planet, however. Another method, called microlensing—in which a planet's gravity momentarily brightens a background star by bending its passing light—is particularly good for detecting planets more distant from their stars and in principle could spot lightweights with masses down to that of Earth. But microlensing is a one-off event; once the fleeting alignment with the star is over, no sign of the planet will ever be seen again.
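
    To put rough numbers on that “subtle wobble” (an illustrative aside, not a calculation from the original article), the standard radial-velocity relation gives the star's reflex speed as approximately

    \[ K \approx 28.4\ \mathrm{m\,s^{-1}} \left(\frac{P}{1\ \mathrm{yr}}\right)^{-1/3} \left(\frac{M_p \sin i}{M_{\mathrm{Jup}}}\right) \left(\frac{M_*}{M_\odot}\right)^{-2/3}, \]

    where \(P\) is the orbital period, \(M_p \sin i\) the planet's minimum mass, and \(M_*\) the star's mass. A Jupiter-mass planet in a 3-day orbit swings a sunlike star by well over 100 meters per second; Jupiter itself moves the sun by only about 12.5 meters per second, and Earth by less than 10 centimeters per second, which is why close-in giants dominate the radial-velocity catalog.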


    If a planet happens to orbit across the face of its star as viewed from Earth, however, the repeated tiny dimming of the total light of the star plus the planet can reveal the presence of the planet. At the same time, starlight passing through the outer planetary atmosphere can reveal clues about composition. Already, water, methane, and—just last month—carbon dioxide have been detected in transiting exoplanets. Those compounds, plus molecular oxygen, are the key markers of an inhabited planet. But only hot Jupiters—unlikely abodes of life—are liable to transit their stars and be detected using current technology.

    That leaves direct detection. The chore is simple enough: Separate the light from a planet from the light of its nearby star. The hitch is that the star is millions of times brighter than any planet, and Earth's turbulent atmosphere churns the light of star and planet together. To solve the latter problem, astronomers can move their telescopes above the atmosphere to Earth orbit. Or they can correct the incoming telescopic image using so-called adaptive optics, in which precisely controlled warping of a mirror many times a second straightens out distorted light. Coping with the vast difference in brightness between planet and star requires a coronagraph in the telescope to physically block out the star or “virtual coronagraph” software to remove starlight from the image. It also helps to search for very young and therefore still hot planets at infrared wavelengths, in which case the star-planet contrast will be much smaller.

    With more than 5 years of observations using the latest technology, astronomers are suddenly busting down the doors to announce candidates for directly detected exoplanets. Published last month, the most secure—and surely the most stunning—are three objects orbiting a star called HR 8799, 128 light-years from Earth. Judged to have five to 10 times the mass of Jupiter, they orbit at least 24 to 68 times farther from their star than Earth orbits from the sun. That makes them among the most massive exoplanets discovered and by far the most distant from their star. New detection techniques typically start by finding such oddballs. These are giving theorists fits; they don't see how planets could have formed that far out.
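
    A quick back-of-the-envelope figure (our illustration, not a number from the discovery papers) shows why such wide orbits lend themselves to imaging. At HR 8799's distance of about 128 light-years, roughly 39 parsecs, a planet orbiting \(a\) astronomical units from its star appears at an angular separation of

    \[ \theta \approx \frac{a\ [\mathrm{AU}]}{d\ [\mathrm{pc}]}\ \mathrm{arcseconds} \approx \frac{24\ \text{to}\ 68}{39} \approx 0.6\ \text{to}\ 1.7\ \mathrm{arcseconds}, \]

    comfortably outside the innermost region of stellar glare that adaptive optics and coronagraphs struggle hardest to suppress.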

    Other direct detections came one per star. Last month, another group also reported detecting a planet of roughly three Jupiter masses orbiting the star Fomalhaut, one of the brightest stars in the sky. A third group announced a single candidate exoplanet last September but must await confirmation that it is orbiting the star rather than just passing through. And a fourth group announced late last month what would be—at eight times the sun-Earth distance from its star—the imaged planet closest to its star.

    Astronomers are already starting to analyze the light of some of the new finds for clues to their physical and chemical nature. That should keep planetary formation theorists busy. The chance to directly study potentially inhabited planets is further off. Imaging Earth-like exoplanets in Earth-like orbits is probably still decades and certainly billions of dollars away.

    See Web links on exoplanets

    3 Cancer Genes

    See Web links on cancer genes

    Researchers this year turned a searchlight on the errant DNA that leads tumor cells to grow out of control. These studies are revealing the entire genetic landscape of specific human cancers, providing new avenues for diagnosis and treatment.

    Tumor cells are typically riddled with genetic mistakes that disrupt key cell pathways, removing the brakes on cell division. Thanks to the completion of the human genome and cheaper sequencing, researchers can now systematically survey many genes in cancer cells for changes that earlier methods missed. Results from the first of these so-called cancer genome projects came out 2 years ago, and the output ramped up in 2008.

    Leading the list were reports on pancreatic cancer and glioblastoma, the deadliest cancers. By sequencing hundreds or thousands of genes, researchers fingered dozens of mutations, both known and new. For example, a new cancer gene called IDH1 appeared in a sizable 12% of samples from glioma brain tumors. A separate glioma study revealed hints as to why some patients' tumors develop drug resistance. Other studies winnowed out abnormal DNA in lung adenocarcinoma tumors and acute myeloid leukemia.


    The expanding catalog of cancer genes reveals an exciting but sobering complexity, suggesting that treatments that target biological pathways are a better bet than “silver bullet” drugs aimed at a single gene. Genome projects for at least 10 more cancers are in the works.

    See Web links on cancer genes

    4 New High-Temperature Superconductors

    Physicists discovered a second family of high-temperature superconductors, materials that carry electricity without resistance at temperatures inexplicably far above absolute zero. The advance deepened the biggest mystery in condensed-matter physics.

    In February, a group in Japan reported the first material, fluorine-doped lanthanum iron arsenic oxide (LaFeAsO(1-x)Fx), which is superconducting up to a “critical temperature” of 26 kelvin. Within 3 months, four groups in China had replaced the lanthanum with elements such as praseodymium and samarium and driven the temperature for resistance-free flow up to 55 kelvin. Others have since found compounds with different crystal structures and have bumped the critical temperature up to 56 kelvin.


    For a critical temperature, that's not so hot. The record is 138 kelvin for members of the other family of high-temperature superconductors, the copper-and-oxygen, or “cuprate,” compounds discovered in 1986. Still, the iron-based materials have created a stir, in part because they might help solve the enduring mystery of how the cuprates work. The $64,000 question is whether the two families work the same way. So far, evidence points in both directions.

    5 Watching Proteins at Work

    See Web links on watching proteins

    After studying proteins for more than a century, biochemists pushed the boundaries of watching the molecules in action—and received surprises at every turn.

    Scientists have long debated how proteins bind to their targets. Most think the shape of a target molecule forces a protein to wiggle into a complementary profile. But it's also possible that proteins in solution wiggle among many slightly different conformations until one finds its target. Computational biologists in Germany and the United States offered bold new support for that upstart idea when they crunched extensive experimental data and showed how one long-studied protein seems to dance among dozens of conformations. In another surprise, a U.S. team tracked individual proteins and found that a single random molecular event can switch a bacterial cell from one metabolic state to another.


    Zooming out to the large scale, proteomics researchers in Germany simultaneously monitored the abundance of up to 6000 proteins in yeast cells and quantified how the expression of individual proteins differed between two different cell types. Their technique could lead to new insights into development and disease. Finally, proteomics researchers in Sweden revealed that different tissues in the body likely get their unique characteristics by controlling not which proteins are expressed but how much of each gets made.

    See Web links on watching proteins

    6 Water to Burn

    See Web links on water-splitting catalyst

    Renewable energy sources, such as wind and solar power, have plenty going for them. They're abundant and carbon-free, and their prices are dropping. But they're part-timers. Even when the sun is shining and the wind is blowing, there is no good way to store excess electricity on an industrial scale. Researchers in the United States reported this year that they've developed a new catalyst that could begin to change that picture.

    The catalyst, a mixture of cobalt and phosphate, uses electricity to split water into hydrogen and oxygen. Hydrogen can then be burned or fed to fuel cells that recombine it with oxygen to produce electricity. Researchers have known for decades that precious metals such as platinum will split water. But platinum's rarity and high cost make it impractical for large-scale use. The cobalt version isn't all the way there yet, either—it still works too slowly for industrial use—but just getting a cheap and abundant metal to do the job is a key step. Now, if researchers can speed it up, on-again-off-again renewables could have a future as fuels that can be used anywhere at any time.

    See Web links on water-splitting catalyst

    7 The Video Embryo

    See Web links on imaging of development

    The dance of cells as a fertilized egg becomes an organism is at the center of developmental biology. But most microscopes allow only partial glimpses of the process. This year, scientists observed the ballet in unprecedented detail, recording and analyzing movies that traced the movements of the roughly 16,000 cells that make up the zebrafish embryo by the end of its first day of development.

    Researchers in Germany made the movies with a new microscope they designed. It uses a laser beam to scan through a living specimen, capturing real-time images and avoiding the bleaching and light damage that have usually limited such videos to just a few hours. The researchers then used massive computing power to analyze and visualize the recorded movements. They also ran the movies backward to trace the origin of cells that form specific tissues, such as the retina. A movie of a well-known mutant strain of fish revealed for the first time exactly what goes wrong as the embryo develops.


    The zebrafish movies are freely available on the Internet, and the developers say they hope the Web site will develop into a full-blown virtual embryo—a sort of developmental biology YouTube with contributions from labs around the world.

    See Web links on imaging of development

    8 Fat of a Different Color

    See Web links on brown fat

    This year, researchers finally uncovered the mysterious roots of so-called brown fat. Hardly blubber, the energy-using tissue turns out to be one step away from muscle.

    Anatomists first noted the distinction between our two fat types more than 400 years ago. White fat is the energy-caching padding that vexes doctors and dieters. If white fat is a quilt, brown fat is an electric blanket. Thanks to plentiful mitochondria, it burns fat molecules to generate heat that warms the body.


    Scientists long assumed that both fat varieties developed from the same kind of progenitor cell. Then a team led by U.S. scientists discovered that they could morph brown fat into muscle and vice versa. The researchers knew that the gene PRDM16 spurs specialization of brown fat. So when they turned down PRDM16 in brown-fat precursor cells, they expected white fat cells to result.

    Instead, the cells stretched out into tube-shaped muscle cells that could even twitch. Reflecting their altered identity, the cells switched off a raft of genes characteristic of brown fat and switched on genes typical of muscle. Coercing cells that had already begun differentiating into muscle to fashion PRDM16 triggered the reverse transformation, yielding brown fat. Using a technique called lineage tracing, the researchers identified the descendants of the muscle cell clan in mice. They included muscle and brown fat cells but not white fat cells.

    The discoveries could mark a step toward antiobesity treatments that melt away bad white fat, either by firing up existing fat-burning brown cells in the body or by transplanting new ones.

    See Web links on brown fat

    9 Proton's Mass ‘Predicted’

    Starting from a theoretical description of its innards, physicists precisely calculated the mass of the proton and other particles made of quarks and gluons. The numbers aren't new; experimenters have been able to weigh the proton for nearly a century. But the new results show that physicists can at last make accurate calculations of the ultracomplex strong force that binds quarks.

    In simplest terms, the proton comprises three quarks with gluons zipping between them to convey the strong force. Thanks to the uncertainties of quantum mechanics, however, myriad gluons and quark-antiquark pairs flit into and out of existence within a proton in a frenzy that's nearly impossible to analyze but that produces 95% of the particle's mass.


    To simplify matters, theorists from France, Germany, and Hungary took an approach known as “lattice quantum chromodynamics.” They modeled continuous space and time as a four-dimensional array of points—the lattice—and confined the quarks to the points and the gluons to the links between them. Using supercomputers, they reckoned the masses of the proton and other particles to a precision of about 2%—a tenth of the uncertainties a decade ago—as they reported in November.
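
    For a sense of scale (an illustrative aside, not a figure from the paper itself): the proton's measured mass is about \(938\ \mathrm{MeV}/c^2\), so a 2% calculation pins it down to within roughly

    \[ 0.02 \times 938\ \mathrm{MeV}/c^2 \approx 19\ \mathrm{MeV}/c^2, \]

    even though the intrinsic masses of the proton's three quarks add up to only a few percent of its total mass; via \(E = mc^2\), nearly all the rest comes from the energy of the gluon field and the churning sea of virtual particles.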

    In 2003, others reported equally precise calculations of more-esoteric quantities. But by calculating the familiar proton mass, the new work signals more broadly that physicists finally have a handle on the strong force.

    10 Sequencing Bonanza

    New genome-sequencing technologies that are much faster and cheaper than the approach used to decipher the first human genome are driving a boom in sequencing.


    See Web links on gene sequencing

    This year, using “sequencing by synthesis” technology from 454 Life Sciences, which “grows” DNA on microscopic beads and registers each added base as a flash of light, researchers produced the mitochondrial genomes of extinct cave bears and of a Neandertal, and 70% of the genome of a woolly mammoth. A preliminary draft of the full Neandertal genome is in the works. Another new technology, developed by Solexa (now part of Illumina), made its debut in the scientific literature with the descriptions of the first genomes of an Asian, an African, and a cancer patient, shedding new light on early human migrations and candidate genes that may underlie malignancies. Illumina's technology sequences DNA in massively parallel reactions on glass plates. A proof-of-concept paper by Pacific Biosciences, a company that sequences single DNA molecules, provided an exciting glimpse of even faster sequencing. Now the goal is to make it more accurate.

    Costs continue to drop; at least one company boasts that genomes for $5000 are in reach.

    See Web links on gene sequencing


    European Big Science

    1. Daniel Clery

    By most objective measures, U.S. research still leads the world, but in their ability to pool resources in the pursuit of "big science," European nations are showing increasing ambition and success.

    See Web links on European big science

    In September, when the first beams circulated through the Large Hadron Collider (LHC), Europe's giant particle accelerator near Geneva, Switzerland, media outlets were quick to name a winner. “Europe leaps ahead on physics frontier” ran the headline on one news site, and a blog trumpeted “LHC a sure sign that Europe is the center of physics.” The electrical fault that put the LHC out of action just days after its inauguration didn't change the overall picture.

    That success was bittersweet for U.S. particle physicists, whose own machine, the Superconducting Super Collider, was canceled in 1993. By most objective measures, U.S. research still leads the world, but in their ability to pool resources in the pursuit of “big science,” European nations are showing increasing ambition and success.

    CERN is the model of a pan-European laboratory. Formed in 1954 to help rebuild postwar European science and encourage international cooperation, the facility became a guiding light for European particle physics and spurred other fields to follow suit. The next few decades saw the creation of the Institut Laue-Langevin neutron source, the European Molecular Biology Laboratory, the European Space Agency, the European Southern Observatory, the Joint European Torus, and the European Synchrotron Radiation Facility (ESRF). But after the agreement to build ESRF in 1984, the enthusiasm for joint European ventures faded.

    That situation has changed this decade, however. First off, the European Union (E.U.) decided that it wanted to host ITER, the worldwide reactor project that aims to prove nuclear fusion as a viable power source. During much of 2004 and 2005, the E.U. was locked in a staring match with Japan over whose site should take the honor. Determined shuttle diplomacy and a face-saving formula put together by E.U. officials finally paid off, and ITER is now under construction at Cadarache in southern France. Such is Europe's confidence in the project that when Congress zeroed out the U.S. contribution to ITER from its 2008 budget, managers in Cadarache barely broke step.

    The E.U. didn't stop there. In 2002, it created the European Strategy Forum on Research Infrastructures (ESFRI), which set about drawing up a list of projects worthy of E.U. support. The ESFRI Roadmap, published in October 2006, lists 35 projects, which include a database on population aging and a neutrino observatory on the Mediterranean seabed. The E.U. didn't have money to build the projects. But it did have money to support design studies and asked all the Roadmap's nominated projects to apply—and nearly all of them did. The aim of the cash is to “get the projects to a point where a political decision can be made” on whether to build, says materials scientist John Wood of Imperial College London, who was chair of ESFRI at the time.


    The ESFRI Roadmap and E.U. infrastructure funding have given a number of projects a major push toward becoming reality. This year, the European XFEL, an x-ray light source, and the Facility for Antiproton and Ion Research, both in Germany, have enlisted international partners for construction, and both expect to sign conventions by early next year. The European Spallation Source, proposed in the early 1990s, now has three sites vying to host it, and a decision—in part brokered by ESFRI—is expected this month. A final design for the Aurora Borealis, a ground-breaking polar research ship, was also released this month. And this autumn, groups of European astronomers and astroparticle physicists have published their own road maps, listing potentially world-leading instruments such as the European Extremely Large Telescope and the Cherenkov Telescope Array. “I'm absolutely staggered at how influential [the Roadmap] has been,” says Wood.

    As this issue went to press, ESFRI was about to release a revised road map, updating its original effort and including some fields that were omitted before. European scientists are eager to see where it leads.

    See Web links on European big science



    The Large Hadron Collider came on smoothly in just a few hours, in keeping with last year's prediction; unfortunately, our warning that a mishap would take it out of action for months also came true. Last year's other predictions were a mixed bag.

    Rating last year's Areas to Watch

    A smashing start


    The Large Hadron Collider came on smoothly in just a few hours, in keeping with Science's observation that the European particle physics lab, CERN, has a knack for getting new machines running quickly. Nine days later, the enormous particle smasher suffered a breakdown so severe that it will be out of commission until summer, fulfilling Science's warning that a mishap could take the machine out of action for months.



    MicroRNA work surged in 2008, as efforts to use the molecules to understand and modify disease edged forward. The first successful microRNA manipulation in primates lowered cholesterol in African green monkeys, and the molecules slowed virus replication in ailing mice. Companies are rushing to develop microRNA-based therapies—but coaxing microRNAs to combat disease is slow going, and safety concerns remain.

    See Web links on microRNAs

    Cell to order


    Despite high hopes, human-made microbes are not yet within reach. Researchers did customize cell-signaling circuits in live cells and are exploring new ways of building genomes from scratch. One research group synthesized an entire bacterial genome but has yet to incorporate it into a cell. And designing microbes to make biofuels remains a pipe dream.

    See Web links on designing microbes



    It was a scramble to get enough sequence done, but a very rough genome of the Neandertal is almost in hand. Along the way, the sequencing team has obtained the complete sequence of Neandertal mitochondrial DNA, finding a few key differences between us and them. Two groups unraveled the mitochondrial genome of extinct cave bears. And sequencing 70% of the woolly mammoth genome prompted speculation about cloning this beast to bring it back to life.

    See Web links on paleogenomics



    Multiple electronic, magnetic, and structural behaviors give multiferroic materials the potential to carry out both logic and memory functions, now handled separately by semiconductors and metals. Researchers reported steady improvements in performance. Novel multiferroics can change their stripes near room temperature and in low magnetic fields, both important developments for real-world applications. But progress remains muted in turning these materials into complex circuitry.



    Metagenomics is in full swing, with several key surveys of microbial and viral diversity completed this year in environments as varied as microbial mats, subsurface ecosystems, and the mammalian gut. In addition, DNA sequences from nearly 200 genomes of bacteria associated with humans are finished, and hundreds more are in the pipeline. In October, groups from around the world formed the International Human Microbiome Consortium to study the role the human microbiome plays in health and disease.

    See Web links on megamicrobes

    New light on neural circuits


    This year's Nobel Prize in chemistry honored scientists who turned a luminescent protein from jellyfish into a powerful tool for imaging cells. Building on that work, neuroscientists can now tag neurons with myriad colors to study their connections. And light-sensitive proteins from algae have made it possible to control neural firing with laser pulses. Such methods have great potential for unraveling the function of neural circuits. This year saw steady progress, and the bigger breakthroughs we predicted can't be far off.


    Financial Meltdown

    1. Eliot Marshall

    Luckily, scientific research did not take a direct hit from this fall's global economic crisis, but scientists are feeling the consequences like everyone else, and research budgets could get caught in the fallout next year.

    See Web links on financial meltdown

    The sales slump that became a credit crunch and then a global financial crisis this fall will leave a big smudge in the economics record for 2008. Panic hit the stock market in October, engulfed investment companies, and even threatened to pull down the giant General Motors Co. The full scope of the breakdown isn't clear yet.

    Luckily, scientific research did not take a direct hit, but scientists are feeling the consequences like everyone else, and research budgets could get caught in the fallout next year. In the United States, the drop in stock values deflated private endowments—some by 15% to 25% over a few months. Retirement accounts withered. Meanwhile, programs funded by endowments began to cut back (Science, 7 November, p. 841). The Smithsonian Institution acknowledged in its first public board meeting in November that its endowment was down 21% over a 4-month period and that it would need to tighten its $1 billion budget. Like many, it has delayed announcing hard cuts.


    Companies that need capital to advance new technologies will be pinched, and some will go under. New energy projects seem likely to be delayed. In the biomedical area, a recent report by the Biotechnology Industry Organization in Washington, D.C., noted that 38% of its smaller public companies are on track to burn through their cash reserves in a year.

    State-funded hospitals and universities are cutting employees and putting off new facilities as state revenues decline. The California state university system, responding to the governor's budget, has threatened to cut student enrollment by 10,000, or 2.1%, next year. Private schools are being affected, too. President Drew Faust announced a 22% drop in Harvard's endowment, along with potential delays in the new Allston campus and research area. The Massachusetts Institute of Technology plans to reduce spending by 10% to 15%, said MIT President Susan Hockfield. Federal spending on research has not changed, but President-elect Barack Obama and Congress have not yet tackled the 2009 budget.

    In this murky landscape, there is at least one fixed point: the official starting date of the crisis. According to Federal Reserve Board Chairman Ben Bernanke, the cascade began in August 2007 with a general price collapse in U.S. real estate. Houses went unsold; owners walked away from mortgages; companies holding the mortgages began to default on obligations; a huge firm that insured against such defaults, AIG, ran out of funds and was saved by the government. Five high-flying U.S. investment banks with mortgage-related investments quit the business: two were absorbed by commercial banks, two converted themselves into bank holding companies, and one, Lehman Brothers, went bankrupt. Governments in North America, Europe, and Asia are now pumping hundreds of billions of dollars into private companies in an attempt to restore the economy's pulse.

    What caused the crash? Bernanke and other Fed economists describe it as the natural end of an “asset bubble,” an irrational run-up in values. Whether it's labeled as optimism or greed, the appetite for growth got out of hand, and the financial models that underpinned some investment strategies broke down. Some say the remedy is to increase controls on finance and enable more scrutiny of private funds. One school of economists argues that the solution is to create models that steer investors away from bad risks by relying less on “rational” economic principles and more on observed human behavior (Science, 12 December, p. 1624). The debates are just warming up and will occupy analysts of the 2008 crash for years to come.

    See Web links on financial meltdown


    Areas to Watch

    In 2009, Science's editors will be watching plant genomics, ocean acidification, neuroscience in court, the next international climate summit, dark-matter annihilations, "speciation genes," and the Tevatron.

    Plant genomics. Maize got the U.S. government behind deciphering plant DNA. In 2009, expect to see the analysis of its genome published, along with a bumper harvest of DNA sequences from other plants: crops such as soybean and foxtail millet; biofuels plants; monkey flower, much studied by ecologists; and a primitive plant called a lycopod. Several fruits are in the works, and other projects are gaining momentum. And to understand genetic variation, hundreds of strains of the model plant Arabidopsis are being sequenced.

    See Web links on plant genomics

    Ocean fizz. Acidification of the oceans driven by rising atmospheric carbon dioxide continues apace. The falling pH is bad news for sea creatures, from coral reefs to microscopic plankton. But the looming threat has yet to gain a poster child the likes of global warming's polar bear. Look for a rising tide of studies confirming the pervasive detrimental effects of ocean acidification, although whether more science will grab the public's attention remains an open question.

    See Web links on ocean acidification

    Neuroscience in court. Scientists and legal scholars cringed this year when an Indian court convicted a woman of murdering her fiancé, citing electroencephalograms that purportedly revealed neural activity indicating “experiential knowledge” of the crime. Although images of anatomical abnormalities in the brain have previously been introduced as mitigating evidence during sentencing, the use of methods that measure brain activity is far more controversial. Even so, at least two companies now offer lie-detection services based on functional magnetic resonance imaging. Ready or not, neuroscience appears poised to have its day in court.

    Road to Copenhagen. A 12-day international climate summit in December 2009 marks the deadline for countries to set a successor for the Kyoto treaty, which expires in 2012. Can the United States, China, and India agree to binding targets tough enough to mitigate global warming? Will the agreement include funding for developing nations to adopt Western energy technologies and adapt to a warming world? President-elect Barack Obama has pledged that the United States will take a leading role in the talks and will push for a mandatory system. But with the world economy reeling and oil prices low, he'll face a tough crowd in the U.S. Senate, where lawmakers from coal and car states will want to block any deal that doesn't provide maximum leeway.


    Darkness visible. Are particles of exotic dark matter annihilating each other in the heavens to produce high-energy cosmic rays? This year, the orbiting PAMELA particle detector and the ATIC balloon experiment reported possible signs of such annihilations. Next year, PAMELA should test the consistency of its result and ATIC's, and NASA's Fermi Gamma-ray Space Telescope, launched this June, will look for photons from dark-matter annihilations. Still, don't expect the stuff to be lured into the light by next December.

    Defining species. In the year of the 200th anniversary of Darwin's birth and the 150th of his On the Origin of Species, expect more clues about the genes that split one species into two. In 2008, researchers discovered several sources of genetic incompatibilities that prevent successful reproduction in animals as varied as nematodes and mice. Thanks to advances in genetics, gene sequencing, and protein surveying, they expect to find more such “speciation genes” in coming months.

    Tevatron's triumph. Researchers in Switzerland will be scrambling to get the gargantuan Large Hadron Collider up and smashing particles. But the real drama should unfold at the Fermi National Accelerator Laboratory in Batavia, Illinois, where next year the Tevatron Collider should have produced just enough data to reveal signs of the Higgs boson—if its mass is as low as indirect evidence suggests. Don't be surprised to hear shouts of “Eureka!” if not next year then in 2010, when all of next year's data are analyzed.


    Nobelist Gets Energy Portfolio, Raising Hopes and Expectations

    1. Eli Kintisch

    Standing alongside President-elect Barack Obama this week in Chicago, a visibly nervous Steven Chu might have appeared to be a nerdy scientist out of place in the political spotlight. But make no mistake: Chu has a clear vision of where he wants to go and the determination to get there. His commitment to excellence underpinned the work on cooling and trapping atoms with laser light that led to the 1997 Nobel Prize in physics. It also drove him to abandon a comfortable academic career and embrace the challenge of reducing the world's carbon footprint as director of Lawrence Berkeley National Laboratory (LBNL). But it may be on the tennis court where the work ethic of the new secretary of energy nominee is most visible.

    While colleagues at a 1998 optics conference in Hawaii partook of the luxury accommodations, the slight, trim physicist, then 50, spent hours testing various rackets with the hotel's tennis pro and practicing his serve before his first match. “He was demanding, like 'Get up, Bambi,'” laughs Mark Cardillo, his assigned partner. “Once he goes after something, nothing is going to stop him,” says Galina Khitrova of the University of Arizona, Tucson.

    So far, little has. If approved by the Senate, which has given no indication that it would do otherwise, Chu would become the first career scientist to run the $24 billion agency. He'll be carrying on his back the hopes of U.S. researchers to jump-start stagnating science budgets at the Department of Energy (DOE) and retain U.S. leadership in the face of rising overseas competition. The curmudgeonly commentator for the American Physical Society, Robert Park, called the choice a “perfect call.”

    Out of many, one.

    One big challenge for Steven Chu will be to unite the disparate activities of the Department of Energy.


    DOE is a mission agency with four distinct portfolios (see graph); Chu has been tapped to beef up its role in science and energy. “In Steve Chu, we have a Nobel Prize-winning scientist who understands that technology and innovation are the cornerstones of our climate solutions,” said Vice President-elect Joe Biden when introducing him this week. Obama went even further in explaining the significance of Chu's nomination. “His appointment should send a signal to all that my Administration will value science, we will make decisions based on the facts, and we understand that the facts demand bold action.”

    It's a challenge commensurate with Chu's demanding standards. The nation desperately needs new low-cost technologies like solar power, better transmission lines for wind power, and successful large-scale demonstration projects for carbon capture from coal combustion or underground CO2 storage. Although Congress has indicated a willingness to provide basic physical science with double-digit annual funding increases, lawmakers have repeatedly failed to seal the deal when faced with competing budgetary priorities.

    Raised by Chinese immigrants in Garden City, New York, Chu says he was more free-thinking than studious until his undergraduate days at the University of Rochester in New York state. In graduate school, he bounced from theoretical astrophysics to whimsical tests to find the frequency sensitivity of the human ear. Once he wedded his creativity with his ambition, however, his career took off, first at AT&T Bell Labs and then at Stanford University.


    But being a successful academic scientist wasn't enough. Chu says it was the “sobering” scale of the climate challenge that drew him to accept an offer in 2004 to direct the 4000-person, $650 million Berkeley lab. There he led efforts to partner with BP on a $500 million energy biosciences institute (Science, 9 February 2007, pp. 747 and 784) and win a competition for an interdisciplinary $135 million bioenergy research center funded by DOE (Science, 25 April 2008, p. 478). Chemist Nathan Lewis of the California Institute of Technology in Pasadena, a partner under Chu's umbrella renewable-energy research program, called Helios, says Chu's ability to understand and cooperate with researchers across a variety of fields will serve him well as he seeks to coordinate research across the sprawling DOE system. “He knows he doesn't have to do it all,” says Lewis.

    Chu's commitment to interdisciplinary research was underscored at Stanford, where he joined three other professors to found the school's Bio-X program, which supports biological research directed at health, energy, and environmental needs. A major challenge for the next energy secretary will be demonstrating to industry that large-scale carbon sequestration facilities can work. Earlier this year, DOE canceled plans to build a $1.8 billion demonstration facility called FutureGen, opting for an approach that involves smaller test facilities. Obama's team has so far signaled that it likes the more modest approach. Although Chu has not made his preferences clear, he wrote in a report last year by the world's science academies that demonstration projects often get “insufficient attention from those who are or have been engaged in funding the R&D phase.”

    Obama transition team member Elgie Holstein has said that Obama wants to focus “more activities in the basic sciences on the energy problem.” Chu has a track record at LBNL of inspiring scientists to do that; Lewis and chemist Paul Alivisatos are among several top scientists who have followed Chu's lead and shifted their research into areas directly applicable to renewable energy. Chu is also the intellectual father of an idea to make DOE science more innovative and nimble through a miniagency called the Advanced Research Projects Agency-Energy (ARPA-E). Congress authorized spending $300 million last year on ARPA-E and “such sums as necessary in 2009 and 2010” in legislation championed by Representative Bart Gordon (D-TN), chair of the House science committee. But so far Congress has not appropriated any money for it, and the Bush Administration views the new agency as needless bureaucracy, an issue on which Chu clashed with DOE headquarters behind the scenes. (Indeed, presidential science adviser John Marburger may have had such encounters in mind when he told Science that Chu will need to hire assistants with “strong personalities to interact productively with a highly self-confident Nobel laureate.”)

    Meanwhile, physical scientists are hoping Chu's impeccable scientific pedigree will heal a wounded U.S. physics enterprise, which has fallen behind European colleagues in particle physics and could face thousands of layoffs across DOE's 21 national laboratories unless the budget picture brightens. “Having been a basic researcher for most of his life,” says Lewis, Chu knows “you never know where the big discoveries are going to come from.”

    But with money tight in Washington, Chu will need to show considerable political prowess to revitalize DOE. Although a regular visitor to Washington over the past 4 years, he lacks the vast network and insider knowledge possessed by Carol Browner, a former head of the Environmental Protection Agency under President Clinton, whom Obama has named as his White House czar for energy, climate, and the environment. Her relationship with Chu, and her influence over DOE research, have yet to be defined.

    This time, it seems, it's going to take guile as well as sweat for Chu to walk off the court a champion.


    Obama's Choice to Direct EPA Is Applauded

    1. Lila Guterman*
    *Lila Guterman is a science writer in Washington, D.C.

    President-elect Barack Obama's pick to head the Environmental Protection Agency (EPA), Lisa Jackson, has spent 20 years as an environmental officer at the state and national levels. She'll need every bit of that experience to revive an agency demoralized by the actions of Bush Administration appointees, say scientists and environmental activists who welcomed this week's announcement.

    A 16-year veteran of EPA's Superfund site remediation program before taking the top environmental job for the state of New Jersey, Jackson holds a master's degree in chemical engineering. “She will be an outstanding administrator, committed to defending the integrity of the science on which EPA regulations must be based,” says David Michaels, a research professor of environmental and occupational health at George Washington University (GWU) in Washington, D.C.

    That combination of skills and ethics is badly needed at EPA, say Michaels and other scientists. Kathryn Mahaffey, who left EPA this summer for GWU after 15 years of studying the risk to humans from exposure to pollutants, says that she was instructed in 2005 by a political appointee to “go back and recalculate” her results on blood mercury levels among U.S. women. Political interference has grown so serious, she says, that outside scientists “aren't sure what scientific publications coming out of EPA they really should have confidence in.”

    Familiar environment.

    Lisa Jackson has been nominated to lead EPA, an agency where she spent 16 years as a regulator.


    One issue awaiting the next EPA administrator is whether the agency will regulate carbon emissions under the Clean Air Act. Although the U.S. Supreme Court told EPA in 2007 to reexamine its opposition to doing so, agency Administrator Stephen Johnson said this summer that “the Clean Air Act is the wrong tool for the job” (Science, 18 July, p. 324). An aide to Obama said during the campaign that Obama would instruct EPA to regulate carbon under the act if Congress didn't adopt a cap-and-trade system in the next 18 months. Another Bush Administration policy opposed by many environmentalists—to deny California and other states a waiver to tighten auto emission standards—could be reversed by the new EPA administrator.

    As head of New Jersey's Department of Environmental Protection, Jackson developed a plan to slash the state's carbon emissions and worked with other Northeast states on a regional program to do the same. Dena Mottola Jaborska, executive director of Environment New Jersey, an advocacy group, credits Jackson with making the state “a leader on global warming.” At the same time, some groups have criticized Jackson for making inadequate progress on cleaning up toxic waste sites. This month, she became chief of staff to Governor Jon Corzine. If confirmed by the Senate, Jackson, 46, would become the first African-American to lead EPA.


    Signs of Drug Resistance Rattle Experts, Trigger Bold Plan

    1. Martin Enserink

    NEW ORLEANS, LOUISIANA—“A catastrophic scenario,” one researcher calls it. “A global disaster,” predicts another, contemplating what could happen if malaria parasites worldwide developed resistance against the new artemisinin-based combination therapies (ACTs) that have become the gold standard. Large parts of the world would have no drugs to fall back on, and malaria cases and deaths could soar, erasing hope that the world might be on the eve of a huge reduction in the disease. Yet resistance against ACTs is precisely what now seems to be developing in western Cambodia, along the Thai border, according to several studies presented here last week at the annual meeting of the American Society of Tropical Medicine and Hygiene (ASTMH).

    The data have given new urgency to an audacious proposal hatched last year to eliminate malaria entirely from the areas where resistance seems to arise. Experts are gathering this week in Phnom Penh to discuss the plan's implementation, which will be coordinated by the World Health Organization (WHO). The Bill and Melinda Gates Foundation plans to bankroll the effort, says WHO malaria expert Pascal Ringwald.

    Scientists still don't fully understand the extent and nature of the problem, stresses Nicholas White of Mahidol University in Bangkok, who has a study about it coming out soon. The main phenomenon researchers have documented so far is a delay in clearing the parasites from the blood of some patients on artemisinin drugs. Most researchers prefer to say that the parasite is now “tolerant” rather than resistant to the drug. Still, says White, the data are worrisome.

    Cambodia's western border has long been the cradle of antimalarial drug resistance: Chloroquine, sulphadoxine-pyrimethamine, and mefloquine all met their match there before becoming useless elsewhere in the world. Scientists believe this may have to do with the misuse of drugs there and the widespread availability of underdosed and counterfeit therapies. From Cambodia, gem miners and other migrants have carried resistant parasites to other Southeast Asian countries.

    In the case of ACTs, Cambodia has another problem. These combination treatments rely on an artemisinin derivative for their powerful punch and contain a second drug to make it more difficult for the parasite to become resistant. In Cambodia, however, many artemisinin monotherapies are on the market.

    Nailing down resistance is harder than it might seem. Treatment failures in individual patients—of which there have been several reports in the past 10 years—don't always signal resistance. Sometimes patients don't get better because their blood levels of the drug are too low, for instance. Testing parasites' sensitivity by exposing them to drugs in the test tube is possible, but the results are hard to interpret. Scientists have not yet found unequivocal genetic markers of resistance to artemisinin derivatives, either.

    Partly as a result of such problems, malaria scientists have been skeptical of early reports about resistance. When Harald Noedl, then at the U.S. Army Medical Component of the Armed Forces Research Institute of the Medical Sciences in Bangkok, presented resistance data at an ASTMH meeting in Atlanta 2 years ago, he was “attacked,” says Ringwald, who had trouble publishing data on the topic himself. “People didn't want to believe it,” he says.

    Two years later, new data have accumulated and the skepticism has largely dissipated, says Dyann Wirth of Harvard University. In a paper published in last week's issue of The New England Journal of Medicine, for instance, Noedl—who is now at the Medical University of Vienna—reports that out of 60 patients from western Cambodia treated with artesunate, two had delayed parasite clearance, with times of 133 and 95 hours, compared with the average of 59 hours. Both had adequate drug levels in their blood.

    On the border.

    To prevent resistance to ACTs, a massive malaria-elimination campaign is planned along the Thai-Cambodian border (map), where this picture of a malaria patient and his wife was taken.


    The new containment plan calls for eliminating malaria from the areas where tolerance has been found and greatly reducing transmission in a large surrounding area (see map). The plan, to be carried out by national malaria-control agencies in Cambodia and Thailand with support from various research institutes, includes rapid and widespread treatment with ACTs, improved mosquito control, the distribution of long-lasting insecticide-impregnated bed nets, a ban on monotherapies in Cambodia (they are already rare in Thailand), and an information campaign. Whether the plan can succeed is unclear, but “it's worth the investment,” says Wirth.

    Another type of response is also in the works. At the ASTMH meeting, scientists launched the Worldwide Antimalarial Resistance Network (WWARN), a global database that will collect information on resistance in vivo and in vitro, drug levels in patients' blood, and molecular markers. Based at Oxford University in the United Kingdom, the network will be led by Philippe Guérin, an epidemiologist currently working for Doctors Without Borders. WWARN is in the late stages of negotiating a Gates Foundation grant.

    Resistance data tend to languish on desks and in drawers for years while they await publication, says Guérin. In exchange for rapid reporting, WWARN will offer scientists statistical help in analyzing the numbers and perhaps even tools for producing a standard manuscript. WWARN also hopes to bring harmony to the myriad ways to test for resistance. “We really need data shared in real time,” says Philip Rosenthal of the University of California, San Francisco, who predicts scientists will cooperate. But, he adds, standardizing methods to test for resistance will be a challenge.


    DOE Picks Michigan State Lab for Rare-Isotope Accelerator

    1. Adrian Cho

    A relatively small university lab has beaten out a much larger national lab in the competition to host a $550 million accelerator facility for nuclear physics. The U.S. Department of Energy (DOE) announced last week that it would build the Facility for Rare Isotope Beams (FRIB) at the National Superconducting Cyclotron Laboratory (NSCL) at Michigan State University (MSU) in East Lansing instead of at its own Argonne National Laboratory in Illinois. Many had expected Argonne's superior infrastructure and $530 million budget to give it a decisive edge.

    The MSU proposal “started out as a long shot,” admits C. Konrad Gelbke, director of NSCL, whose $20-million-a-year budget is provided by the U.S. National Science Foundation. “But I always felt very optimistic that if you presented the case in a very open and honest way, it would all level out in the end.” Argonne officials said in a statement that they were “disappointed” and noted that “much of the science for FRIB was developed here at the laboratory.”

    FRIB will serve as a source of exotic and fleeting radioactive nuclei. The heart of the machine will consist of a 400-meter-long high-intensity linear accelerator that can accelerate a nucleus of any weight from hydrogen to uranium. Those nuclei will smash into and through targets to make beams of exotic isotopes, which will then be used to refine existing theories of nuclear structure, probe fundamental symmetries of nature, and help unravel the processes in stellar explosions that presumably produce half the elements heavier than iron. FRIB would pump out beams at least 1000 times more intense than those currently produced at NSCL.

    The site selection “was not the easiest of decisions,” says Eugene Henry, DOE's acting associate director for nuclear physics. Henry declined to discuss the details of the two proposals but noted that MSU had offered to pay some of the construction costs. The university has committed to chipping in $94 million, says MSU spokesperson Terry Denbow. He did not say how the money would be raised.

    New foundation.

    Housed in a tunnel (gray), the new accelerator would feed experiments in the existing lab.


    DOE also had more confidence in MSU's spending plan, says Donald Levy, vice president for research and for national laboratories at the University of Chicago in Illinois, which contracts with DOE to run Argonne. “The budget part of their proposal was deemed to be more reliable than ours,” Levy says, in part because MSU's budget included more “contingency” money to cover possible cost overruns.

    Some observers have misgivings about building a large facility at such a small lab. “[T]he scale of the project was more appropriate for one of the existing DOE laboratories,” says Burton Richter, a particle physicist and former director of DOE's SLAC National Accelerator Laboratory in Menlo Park, California. But Richard Casten, a nuclear physicist at Yale University, says he's sure that NSCL is up to the task: “I have no worries about that at all.”

    Many scientists are just glad that the project, originally proposed in the late 1990s, is making progress. “We are happy that the decision has been made and that the excellent team at Michigan State has been chosen,” says Witold Nazarewicz, a nuclear theorist at the University of Tennessee, Knoxville. Researchers hope to start detailed design work next year and to have the machine built and running in about a decade.


    Rangers Assess Toll of Congo Conflict on Threatened Mountain Gorillas

    1. Robert Koenig
    Collateral risk.

    About 180 mountain gorillas, including this silverback called Karateka, live in an embattled national park.


    In the wake of severe fighting in Virunga National Park in the Democratic Republic of the Congo (D.R.C.), worried rangers began a painstaking census late last month of the park's highly endangered mountain gorillas, nearly a third of the world's known population. “It is imperative to find out the gorillas' status,” says the park's director, Emmanuel de Merode, who is leading the census.

    Violence in eastern D.R.C. worsened in October when government troops and allied militias clashed with rebels inside the 7800-square-kilometer park, which is Africa's oldest. About 400 park rangers and their families fled, becoming temporary refugees in a camp near the town of Goma.

    After an agreement was reached to allow most rangers to return to the park in late November, de Merode—a Ph.D. anthropologist who became the park's director in August—organized a census to find out whether any gorillas had succumbed. Last year, he says, nine of the park's approximately 180 gorillas were killed: one for its meat, one in a botched trafficking attempt, and seven in “vindictive killings,” possibly by a corrupt army faction involved in illegal logging.

    The census, expected to be completed by year's end, is focused on the park's 72 or so human-“habituated” gorillas, which rangers can track and recognize. “We identify the habituated gorillas primarily by unique wrinkles on their noses, which can be recorded as signatures,” de Merode told Science. Each morning, trackers follow forest trails to find a gorilla group; survey teams then try to identify all its members. Tragically, the habituated animals are “the most vulnerable,” de Merode says, partly because they “are closest to the forest's edge” and are less fearful of humans.

    The nose-count approach has some technical weaknesses: It accurately counts only the habituated animals, about one-third of mountain gorillas in the park, and it depends on the trained eyes of certain rangers. “The fate of the unhabituated groups is unknown” in such nose-counts, says primatologist Martha M. Robbins of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. “Once the area is stable, it will be important to do a census of the entire region so we can get an estimate on the entire population.”

    The densely populated and ethnically diverse eastern D.R.C. has been volatile for decades. Famed gorilla researcher Dian Fossey fled violence in the D.R.C. (later renamed Zaire) in 1967 to set up her Karisoke Research Center across the border in Rwanda; ever since then, researchers have tended to shun the D.R.C. in favor of outposts in adjacent countries. Although long-term gorilla research is ongoing at Karisoke and in Uganda, there has been a dearth of scientific studies on the D.R.C. side of the border. “The region has been too insecure” in the past decade, says Robbins, who worked at Karisoke before establishing the first long-term field study of mountain gorillas in Bwindi Impenetrable National Park in Uganda.

    Hot zone.

    Rebels occupy the gorilla sector of the Virunga National Park; other militias have entered from Rwanda.

    The danger and lack of resources have so far impeded a more accurate count using techniques such as a “genetic census.” It involves a simultaneous sweep of the entire habitat by trackers who find gorilla tracks, identify nest sites, and collect fecal samples from which DNA is later analyzed to genotype every individual gorilla. So far, scientists have conducted a full genetic census of mountain gorillas only at the Bwindi park. In 2006, that census found evidence of 302 animals, 34 fewer than estimated using traditional methods.

    Max Planck anthropologist Damien Caillaud, who studies the influence of habitat characteristics on the Bwindi mountain gorillas' social system, says the genetic census there “allows us not only to count the individuals but also to know their sex, the relatedness among individuals, … and the dispersal patterns.”

    A 2003 census of the entire three-country Virunga Volcanoes area, conducted with traditional methods and funded by seven gorilla research and conservation groups, set the region's total population at 380 gorillas, including the 180 to 200 on the D.R.C. side. Together with the 300 or so mountain gorillas in nearby but separate Bwindi, that puts the total known population of the primates at nearly 700.

    Although the D.R.C. rangers have now reentered Virunga park, the conflict continues, with some armed rebels entering from Rwanda. Whereas the major armed groups have outposts within the park's borders, a rebel group controls the gorilla area (Mikeno sector) in the park's southern region. As of last week, park rangers had fully surveyed two of the six habituated gorilla groups and found that new infants had increased their numbers. “The results so far have been pretty positive,” says de Merode, “but that can change.”

    Robbins adds that the rangers “have risked their lives and worked through extremely difficult conditions to help protect the gorillas.” Researchers and conservationists alike hope that the wild inhabitants of Virunga park will make it safely to 2009, which has been declared the “Year of the Gorilla.”


    Report Faults U.S. Strategy for Nanotoxicology Research

    1. Robert F. Service

    The U.S. government lacks an effective plan for ensuring the safety of nanotechnology, a new report by the National Research Council (NRC) concludes. The report, released last week, finds that the current plan for coordinating federal research on environmental, health, and safety (EHS) risks of nanotechnology amounts to an ad hoc collection of research priorities from the 25 federal agencies that make up the U.S. National Nanotechnology Initiative (NNI), which coordinates federal nanotech programs. What's needed, it argues, is an overarching vision, a plan for achieving it, and the money to carry it out.

    “The current plan catalogs nano-risk research across several federal agencies, but it does not present an overarching research strategy needed to gain public acceptance and realize the promise of nanotechnology,” says David Eaton, an environmental and occupational health scientist at the University of Washington, Seattle, who chaired the committee that wrote the report.

    The NRC report marks new movement in what has been a long-running tug of war between the Bush Administration and its critics in Congress, academia, and nongovernmental organizations over how best to ensure nanotechnology's safety. Administration officials have maintained that the agencies funding the research—such as the National Institutes of Health and the Environmental Protection Agency—are best qualified to set their priorities and budgets to ensure nanotech safety. Critics counter that a point person is needed to ensure that coordination takes place. The House Science and Technology Committee passed a bill earlier this year reauthorizing NNI and pushing much of the critics' agenda. But the Administration opposed installing a nano overseer, among other things. The bill was shunted aside by the election and the economic chaos this fall and never came to a vote.

    Steady climb.

    The number of new products containing nanomaterials continues to rise, despite unknowns about their safety.


    The new report is something of a vindication, says Andrew Maynard, chief scientist at the Woodrow Wilson International Center for Scholars' Project on Emerging Nanotechnologies in Washington, D.C. “It shows we haven't been out on a limb for the last few years,” says Maynard, who has long criticized coordination of EHS research under NNI and who served on the NRC panel that wrote the report. “Now the government needs to decide who needs to do what risk research and where the money is going to come from.”

    The current report does not assess the potential toxicity of nanomaterials, which are now found in more than 800 commercial products. Rather, it responds to a request made last year by the U.S. National Nanotechnology Coordination Office (NNCO), which oversees day-to-day coordination of nanotechnology programs among U.S. federal agencies, to evaluate the current EHS research strategy. A statement from NNCO called the development of a broader national strategy “a worthy goal.” However, it says the report's call for reengineering how the federal government oversees NNI “was not within the scope of the NRC panel review, and would require extensive review and analysis and Congressional oversight.”

    The back and forth will resume next year. Bart Gordon (D-TN), who chairs the House Science and Technology Committee, said last week that he will reintroduce the bill to reauthorize NNI, which will again put these issues front and center. The key will be whether the new Obama Administration sides with its predecessors or with their critics.


    From the Science Policy Blog

    Last month, Science launched a policy blog, ScienceInsider, providing news and analysis on science policy around the world. Postings include breaking news covered in more depth in the magazine as well as news that doesn't appear in print:

    Museum layoffs prompt backlash. Archaeologists around the world are condemning the University of Pennsylvania Museum of Archaeology and Anthropology for planning to lay off 18 researchers, in particular one of the world's leading archaeobotanists, Naomi Miller, who has been in the field for 30 years. News of the layoffs, announced late last month, has ricocheted through the global archaeology community. Director Richard Hodges says the museum will find money to retain the scholars …

    Scientists seeking stimulus. A collection of U.S. research universities is making the case for science to be included in legislation aimed at reviving the moribund economy. In a letter to President-elect Barack Obama, the 62-member Association of American Universities (AAU) proposes $2.7 billion in immediate spending on academic buildings, scientific equipment, and young researchers. AAU joins a long line of interest groups hoping to tap into an economic stimulus package …

    Bioethics guidance from Rome. The Vatican has issued a new document addressing the morality of various developments in biotechnology, including in vitro fertilization, germ-line gene therapy, and so-called altered nuclear transfer (ANT). Dignitas Personae, issued at a Vatican press conference 12 December, is mainly a clarification of previously known positions. It does take a cautious line on ANT, which at least one Catholic bishop had endorsed. The technique was developed to find a way to produce stem cells from cloning without ever producing a viable embryo. Scientists have attempted to inactivate certain genes required for embryo development so that instead of producing an embryo, they produce disorganized cells—which nevertheless can be used to make stem cell lines. The document, however, takes a dim view of the effort …

    For the full postings and more, go to ScienceInsider.


    Europa vs. Titan

    1. Richard A. Kerr

    Planetary scientists are in the final stretch of a first-time competition designed to get the most science for the buck from the next big planetary mission while avoiding the fiscal debacles of the past.


    A frigid mystery.

    Titan (false color) somehow mimics Earth's complexity despite extreme cold.


    TEMPE, ARIZONA—Frances Bagenal wants to get back to the outer solar system, but that will take some doing. It will be at least 20 years before an instrument-laden robot can slip into orbit around Saturn's Titan, a deceptively Earth-like moon lashed by liquefied natural gas storms, or around Jupiter's Europa with its potentially inhabited, ice-covered ocean. And the trip will likely cost a good $3 billion, money that is only getting scarcer at NASA.

    So Bagenal and her colleagues in the community of outer planets scientists are submitting themselves to a unique, two-stage selection process for the next major mission to the outer planets: a winner-takes-all competition to deliver the best science at the best price.

    “Usually, a committee of graybeards meets and decides” which multibillion-dollar mission NASA will fly next, says Bagenal, a space physicist at the University of Colorado, Boulder, and the outgoing chair of NASA's Outer Planets Assessment Group. In the past, the broader planetary community was pretty much left out of the final decision-making, and there was nothing competitive about it. Now, for the first time, independent study teams of scientists and engineers have taken their best shots at designing missions to the Jupiter or Saturn system and pitted them head-to-head.


    One proposed joint NASA-ESA mission would send an orbiter, a splashdown lander, and this nuclear-fired hot-air balloon to Titan, land of natural-gas lakes and icy ground dirtied with organic goo.


    “I think that's healthy,” says Bagenal. “Each team has had to hone its arguments [even though] it means not everyone gets their favorite instrument on board.” In the process, scientists are having “to stand up and be responsible for the costs so we don't get a shock down the road,” she says. The novel approach will culminate late next month in the selection of a single mission.


    NASA's foray into community-based competition of the biggest missions came after discoveries in the outer solar system made the latest formal prioritization of future missions outdated. The so-called decadal survey from a committee of the National Research Council in 2002 (Science, 19 July 2002, p. 317) had just two missions to the outer planets among its 13 priorities for the decade 2003 to 2013: the relatively modest Juno mission to Jupiter itself to be flown in 2011 and a so-called flagship mission to Jupiter's Europa. Flagship missions are NASA's most expensive, running well above $1 billion apiece. The Europa mission gained that status after the Jupiter-orbiting Galileo found signs in 1997 of an ocean beneath kilometers of ice on Europa, raising the prospect of life. And the prospect of life, the decadal survey made plain, was the mission's prime driver.

    But astrobiology soon beckoned from other quarters around Jupiter and Saturn. In 2005, NASA's Cassini spacecraft, the outer planets flagship following Galileo, discovered a huge plume of ice-laden water vapor spewing from an overheated south pole region of Saturn's little moon Enceladus. Again, liquid water might lurk beneath the icy surface. And in 2005, beneath the haze of Titan, Cassini and its European Space Agency (ESA) Huygens lander found a profoundly frigid but eerily Earth-like world. Earth-like, that is, if you substitute liquefied natural gas for liquid water, water ice for rock, and dark organic gunk for soil. The astrobiology angle came with the gunk. Given its complex origins in Titan's upper atmosphere, it should bear a strong resemblance to the primordial organic matter that came together to begin life on Earth.

    Enticing new targets were cropping up, but the next decadal survey was not due out for years, and the prospect of getting to Europa was fading fast. Engineers had long struggled with designing a practical mission. “Each and every one of [the missions studied] was unimplementable,” says James Green, director of the planetary science division at NASA Headquarters in Washington, D.C. Jupiter's intense radiation belts threatened to fry the electronics of any spacecraft that lingered in Europa's vicinity for more than a few weeks.

    So NASA decided to go to the outer planets community for more possibilities. In 2007, in a process reminiscent of open competitions for missions a fifth the size (Science, 23 July 2004, p. 467), four groups of about a dozen scientists and engineers each were given half a year to work up missions to orbit one of four outer planet satellites: Titan, Europa, Enceladus, and Jupiter's moon Ganymede.

    Two missions eventually survived this head-to-head competition, with NASA Headquarters judging. The latest radiation-hardened electronics made Europa look more practical. And Titan came through, in part, because it “offers a deeper and broader science yield than Enceladus would,” suggests outer planets astronomer Heidi Hammel of the Space Science Institute in Boulder, Colorado. What Titan lacks as a potential abode for life, she says, it makes up for with both prelife organic chemistry and fascinating atmospheric and geologic processes.

    Enceladus lost out. It “was at a disadvantage,” explains planetary scientist Amy Simon-Miller of NASA's Goddard Space Flight Center in Greenbelt, Maryland. “We had to start from scratch,” as no one had ever studied going into orbit around Enceladus. The moon has so little gravity and orbits so deep in Saturn's powerful “gravity well” that getting into orbit around it with any affordable propulsion system seemed impractical. Ganymede, on the other hand, was accessible but lacked both astrobiological interest and Titan's bevy of active geological, geophysical, and hydrologic processes, notes outer planets researcher Torrence Johnson of NASA's Jet Propulsion Laboratory in Pasadena, California. In addition, “things have shifted from a total astrobiology focus to more of the wonders of exploration,” he says, making Titan's geocomplexity all the more attractive.

    Neck and neck

    The four candidate missions were eventually winnowed to two, and those two were about to come up against some fiscal rigor. First, S. Alan Stern—NASA associate administrator for space science at the time—imposed a skimpy $2.1 billion limit on the next outer planets flagship mission. Stern's dollar ceiling “was very healthy,” says Bagenal. “It made people prioritize, be really ruthless.”

    Meanwhile, money problems began to loom with NASA's Mars Science Laboratory (MSL), the behemoth rover scheduled to launch in October 2009. MSL's increasingly public problems finally forced a 2-year launch delay earlier this month (Science, 12 December, p. 1618). The cost overrun totaled as much as $800 million over the $1.6 billion set in 2006 when NASA committed to the mission. “We have to be careful we don't go down that path,” Bagenal said at a meeting of her assessment group here last month.

    As it happened, the path for the next outer planets flagship mission took a sharp turn when Stern left NASA this past March and was replaced by Edward Weiler (Science, 4 April, p. 31). Parallel studies of missions to Titan and to Europa had been under way since February. NASA study teams had merged with ESA teams to design joint, multicraft missions, but the NASA teams had stuck to Stern's price cap. Weiler wanted alternatives. He did ask for a core option costing $2.1 billion or less, but he added a “full decadal science” option delivering all the science in the decadal survey at whatever cost and an option hitting the “sweet spot” in between, where the science return on the dollar would be maximized. He gave the teams several more months to come up with their proposals.

    Looking deep.

    Another proposed joint NASA-ESA mission would probe Europa's ice shell (white in diagram) overlying an ocean (blue) to see whether the ice (above, enhanced color) has ever been breached.


    The flagship-on-the-cheap NASA core missions did not fare well. At the Tempe meeting, after final reports were submitted, NASA outer planets program manager Curt Niebur confirmed that the core option “just wasn't compelling enough to invest $2 billion.”

    Sweet spots, on the other hand, were more palatable. The Europa orbiter team started with the half-dozen instruments onboard in their $2.1 billion core option and added instruments one by one. The total cost rose slowly until they reached the 13th instrument, the mass of which required a bigger, much more expensive rocket. The sweet spot of $3.0 billion had just been passed. The Titan orbiter team took a somewhat different approach to calculating their sweet spot, but it came out to $2.5 billion. The Cadillac missions fell well out of the running.

    Ready, set, …

    At the Tempe meeting, Ronald Greeley of Arizona State University, Tempe, NASA co-chair of the joint science definition team, declared that “the Europa Jupiter System Mission is essentially ready to go. Of course, the key driver is exobiology.” And the theme of the proposed joint mission to the Jupiter system is the emergence of possibly habitable worlds. Both the NASA and the ESA spacecraft—to be launched in 2020 on separate rockets carrying a dozen instruments each—would first orbit Jupiter, probing its magnetosphere and flying by all four moons. NASA's would then enter Europa orbit to probe the ice shell with radar, size up the interior by gauging the ice's tidal flexing, and survey the geology and chemistry of the surface for signs that ocean water ever makes it to the exterior. Meanwhile, the ESA spacecraft would orbit Ganymede.

    The Titan team's pitch is broader. “There's something on Titan for virtually all aspects of planetary science,” was how Jonathan Lunine of the University of Arizona, Tucson, NASA co-chair of the Titan Saturn System Mission (TSSM) science team, described it. The mission would investigate how Titan functions as a system, determine how far prebiotic chemistry has developed, and continue exploring Enceladus with a series of flybys. A single rocket would deliver three craft: the NASA orbiter; ESA's lander targeted for a splashdown in one of Titan's larger liquid methane-ethane lakes; and an ESA nuclear-fired hot-air balloon that would circle the moon near its equator. “The technologies are ambitious, novel, and imaginative,” said Athena Coustenis of the Paris Observatory in Meudon, France, and the TSSM team. “This is the way we need to go.”

    That will be up to NASA's Weiler and ESA's science head David Southwood. By the end of January, they will jointly select one mission to proceed. They are likely to be balancing the technological ambition of the Titan balloon and the 9-year travel time to Saturn against a science focus on Europa. After that, though, both NASA and ESA have years of maneuvering for funds ahead. NASA will need to navigate around the fallout from MSL; ESA will have to compete its outer planets mission against two astrophysics projects, and both must overcome budgetary woes from the financial crisis. As Green said, an outer planets flagship mission “is not a done deal.”


    An Artist Develops a New Image--With Aid of Bacteria

    1. Mitch Leslie

    After dropping out of high school, Zack Booth Simpson became a video game programmer. Now he's at a university working with cutting-edge synthetic biology labs.


    Nearly 5 years ago, molecular biologist Edward Marcotte recalls, a high school dropout walked into his office at the University of Texas (UT), Austin, to talk shop. Despite the visitor's unconventional background, which included a stint as a video game programmer, Marcotte says that Zack Booth Simpson “won me over instantaneously. He was so clearly intelligent.” They ended up talking for hours on topics such as Marcotte's use of data mining to extract information about the protein networks that control cellular functions.

    That was just the beginning. Simpson now has a part-time paid position as a fellow at the university's Center for Systems and Synthetic Biology, where he's shared his expertise and ideas with several labs. His publication record, which includes co-authoring a paper in Nature and a chapter in a newly released book on synthetic cells, would make some postdocs envious. “He's jumped right into the top level of research,” says Marcotte, who contributes some of his lab funds to Simpson's salary. “It wouldn't be exaggerating to say Zack changed some research directions in my lab—for example, stimulating my interests in synthetic biology and cell-to-cell variability.” He can “span a variety of disciplines with relative ease, and he brings fresh and interdisciplinary perspectives to each field,” adds Andrew Ellington, another UT Austin molecular biologist who's worked with Simpson for almost as long. Not bad for someone who was once held back by dyslexia and considers science his hobby.

    More than 20 years after dropping out of high school during his junior year, Simpson, now 38, says his only regret was “not leaving earlier.” He was bored, and teachers weren't helping him overcome his dyslexia. Without intending to, he managed on his own to surmount the reading disability by “geeking out” on computer manuals. Meticulously deciphering something he found interesting did what dull reading assignments couldn't, he says. His parents had divorced by the time he decided to quit school, and his mother, a landscape architect, backed his decision. “She thought I'd figure out my own way in the world.”

    She was right. After starting out as a “junior programmer” at a database company, by age 23 he'd worked his way up to director of technology at the Austin-based video game maker Origin Systems, which crafted big-sellers such as Wing Commander and Ultima. Then, like almost everyone else in the 1990s, Simpson and some friends started their own company. However, the inauspiciously named Titanic Entertainment went under after releasing only one game, NetStorm. It reportedly sold only 13,500 copies, although one Internet forum later tabbed it as “The Best Game of All Time that Nobody Bought.” The lesson from the company's failure, he says, was that “I liked learning things more than I liked doing things.”


    It wasn't long before he was doing something else, collaborating with artists, engineers, and computer scientists on a series of interactive exhibits called Mine Control. The art installations have appeared everywhere from science museums to department stores, and in such far-flung locations as Norway, Mexico, and Ecuador.

    Even children can fool around with Mine Control without alarming museum staff because what the children “handle”—light—can't break. Many of the exhibits use detectors Simpson developed to track a viewer's shadow, helping create the illusion of manipulating images that come from computer-controlled projectors. In one of the simplest exhibits, you stand in front of a screen on which multicolored sand appears to tumble from above. Hold out your hand, and the sand piles up in your palm's shadow on the screen. Many of the pieces have scientific themes, letting you tug and bend a Slinky-like RNA molecule, for example. Simpson's favorite, called Moderation, meshes the visual style of Japanese anime with an ecological message. How fast you walk around a pool projected onto the floor determines whether the virtual plants and other life that sprout in your footsteps thrive. Walk too fast, and the virtual ecosystem dies out.

    Bacterial snapshots

    Although Simpson has lived just a few blocks from the UT campus in Austin for most of his adult life, he didn't try to forge a formal connection with the school until his girlfriend, after one too many conversations about the finer points of thermodynamics, recommended he find more nerd buddies. That inspired a short-lived plan to enter graduate school, which Simpson scuttled after learning that he'd first have to complete a high school equivalency program and an undergrad degree. Instead, he paid a visit to Marcotte.

    Diverse Interests.

    Zack Booth Simpson has helped design interactive art installations (left, photo illustration), develop a “camera” that produces pictures using light-sensitive bacteria (center), and create a Gaudi-inspired house.


    Simpson arrived in time to help students working with Marcotte, Ellington, and colleagues devise a showstopping demonstration for the 2004 international Genetically Engineered Machine (iGEM) competition, a synthetic biology extravaganza in Cambridge, Massachusetts. The goal of this first-of-its-kind meeting was for teams of undergraduates to construct biological machines capable of performing some interesting task. Because he understood computational problems so well, Simpson's role was to urge everyone in the Austin labs to think big, recalls Ellington: “He was putting forward all these grand ideas for what molecules could do, and we were saying, no, no, maybe, no.”

    One of Simpson's big ideas was to use bacteria as an edge detector. Finding the boundary between objects is a standard task in image analysis and something software can achieve. But duplicating the feat biologically wasn't feasible, given the rapidly approaching iGEM deadline. So the team decided to focus on the project's first step: bacterial photography, in which microbes act like the light-capturing pixels on the sensor of a digital camera.

    Simpson provided the concept, but the UT Austin team still needed tricked-up bacteria. There, they got lucky, says Marcotte. They learned that chemical engineer Christopher Voigt of the University of California, San Francisco (UCSF), and colleagues had created the perfect biopixel: a genetically modified bacterium that fashions black pigment in the dark but not in the light. A layer of the bacteria can take a picture with pretty good resolution—they are clearer than some photos from cell phone cameras—though the exposure times required are measured in hours rather than in fractions of a second, as they are for most conventional photographs. The UCSF and UT Austin teams joined forces and showed off their shutterbugs at the competition and in a 2005 Nature paper of which Simpson was a co-author.

    The biological edge detector was next. Simpson explains that this project required engineering more astute bacteria than those needed for photography. In an edge detector, a microbe not only has to determine whether it's in the light or dark but also has to know the status of its neighbors (a logic sketched in software below). Jeffrey Tabor, who was part of the biofilm team and is now a postdoc at UCSF, and colleagues have now built this device and plan to submit a paper describing it. Bacteria won't be replacing electronics anytime soon, Simpson says. But the two projects provide simple and striking examples of computation with cells rather than microchips. In the future, he says, researchers might build on the similarities between computers and life to create machines with the capabilities of living things—repairing themselves if they break, for example, or growing from simpler structures.
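    For readers who want a concrete feel for the computation being mimicked, the short Python sketch below shows the same neighbor-comparison logic in software: each cell records whether it is lit and flags itself as an edge whenever an adjacent cell is in the opposite state. The 0/1 “image” is made up, and the sketch is only an analogy for illustration, not a description of the genetic circuit Tabor and colleagues built.

        # Minimal illustration of edge detection by neighbor comparison: a cell marks
        # itself as an "edge" if its own light/dark state differs from a neighbor's.
        image = [
            [0, 0, 1, 1],
            [0, 0, 1, 1],
            [0, 0, 1, 1],
        ]  # 1 = lit, 0 = dark (hypothetical pattern projected onto the bacterial lawn)

        rows, cols = len(image), len(image[0])
        edges = [[0] * cols for _ in range(rows)]

        for r in range(rows):
            for c in range(cols):
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and image[nr][nc] != image[r][c]:
                        edges[r][c] = 1  # a neighbor has the opposite state: a boundary runs here
                        break

        for row in edges:
            print("".join("#" if cell else "." for cell in row))

    Run as written, the sketch prints a stripe of “#” marks along the light-dark boundary, the same outline the engineered bacteria are meant to draw in pigment.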

    Emulating Gaudi

    Another scientist to benefit from Simpson's computational smarts is UT Austin biochemist Kenneth Johnson. Simpson helped program enzyme kinetics software now sold by the chemical instrument company Johnson founded. Simpson's contribution, the biochemist says, was applying his background in art and computer gaming to provide the software with intuitive controls and easy-to-understand output. Simpson didn't write every line of code, but “he was the brains behind what we did,” Johnson says.

    Simpson splits his time about 50-50 between his art—which brings in most of his income—and his scientific work. Those dual interests intersect in his fascination with “how processes beget shape.” That fascination shows in the house he recently built in Austin, parts of which reveal the influence of the Spanish architect Antonio Gaudi. Although he wasn't a scientist, Gaudi scrutinized natural forms—not to copy them but to understand “the processes that made those shapes and use that as inspiration,” Simpson says. For example, Gaudi designed pillars that branch like a tree or stand at an angle. A year in Barcelona, Spain, home of many of the architect's famous buildings, fed Simpson's love for that style, and Gaudiesque touches in his Austin home include the undulating eaves over the porch and the staircase, which was inspired by a dead, hollowed-out tree Simpson once saw. It's the kind of house that strangers stop by to photograph, says John Davis, an electrical engineering professor at UT Austin who has collaborated with Simpson scientifically and artistically.

    Cynics might dismiss Simpson as a science dilettante. And he admits that he lacks the deep knowledge of someone who's been immersed for years in the biological literature. But Simpson believes that he compensates with a breadth of knowledge, from fields as diverse as economics and ecology, that allows him to see new ways to analyze a problem.

    Ellington adds that Simpson's success suggests that there's room for more people in science who follow their own paths. “People who take charge of their education are unique, and that is unfortunate,” he says. “We need more people like Zack who learn for the fun of it.”


    Shortfalls in Electron Production Dim Hopes for MEG Solar Cells

    1. Robert F. Service

    At the Materials Research Society Fall Meeting, several teams reported setbacks in efforts to sharply increase the electrical output of future solar cells using light-absorbing nanoparticles that can generate more than one electron for every photon of light they absorb.


    Four years ago, researchers were delighted to discover that light-absorbing nanoparticles could readily generate more than one electron for every photon of light they absorb. Those extra charges, they hoped, would sharply increase the electrical output of future solar cells. But at the meeting, several teams reported setbacks in reaching that goal.

    In typical solar cells, when a semiconductor such as silicon absorbs a photon of light with the right amount of energy, it generates an exciton: an electron paired to a positively charged electron vacancy called a hole. The solar cell then separates those opposite charges and collects them at the electrodes. In 2004, researchers led by Victor Klimov of the Los Alamos National Laboratory in New Mexico reported that when lead sulfide (PbS) nanocrystals were hit with high-energy photons from a laser, they could generate up to seven excitons. Other groups jumped in and found a similar multiple exciton generation (MEG) effect in a variety of other nanocrystals, including cadmium selenide (CdSe) and silicon.

    Whither MEG?

    When semiconductors absorb high-energy photons, they typically create an excited electron (left). In nanocrystals (top), MEG uses leftover energy to excite more electrons (right).


    Then doubts began to creep in. Last year, Moungi Bawendi, a chemist at the Massachusetts Institute of Technology in Cambridge, and his Ph.D. student Gautham Nair reported that when they used a different technique, they spotted only a negligible MEG effect in CdSe nanocrystals, a result they later extended to PbS and lead selenide (PbSe). This year, a Dutch group that had previously reported a sharp increase in MEG in indium arsenide nanocrystals reported it couldn't reproduce the result. “The more results that came in, the more controversy there was,” Klimov says.

    At the meeting, John McGuire, a postdoctoral assistant from Klimov's group, reported new evidence that MEG in nanocrystals is far weaker than originally thought. In contrast to the 700% initially reported, the new Los Alamos results suggest the increase is likely about 40%, only slightly higher than the 25% increase seen by Bawendi's group. The upshot, both Bawendi and Klimov agree, is bad news. “These numbers at this point are not of practical use for solar energy,” Bawendi says.
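    One way to make those percentages concrete is to translate them into quantum yields, assuming they denote the average number of extra excitons produced per absorbed high-energy photon (a common way of quoting MEG results, though the definition is not spelled out above):

        quantum yield ≈ 1 + 0.40 = 1.4 excitons per photon (revised Los Alamos estimate)
        quantum yield ≈ 1 + 0.25 = 1.25 excitons per photon (Bawendi group's measurement)

    Either way, the average yield falls far short of the up-to-seven excitons per photon of the original reports.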

    So what changed? Klimov says that for their current experiment, the results of which were also published online 12 November in Accounts of Chemical Research, the Los Alamos team stirred the samples to keep the nanoparticles from absorbing more than one photon at a time—a potential source of false-positive results.

    Hope for a strong MEG effect isn't entirely lost, Klimov says. In some samples, the MEG effect was more than three times as high as in other samples. Synthetic differences between samples may have left some with surfaces that enhance the effect, he says—and if so, researchers may learn to engineer particles to optimize it.

    Even if a large MEG turns out to be real, however, two separate teams found that getting those charges out of the nanocrystals won't be easy. Randy Ellingson of the National Renewable Energy Laboratory (NREL) in Golden, Colorado, reported that his group had made simple solar cells containing a layer of PbSe nanocrystals, with electrodes above and below. Creating the nanocrystals leaves them decorated with organic groups around the outside. When they are put straight into the films, the crystallites are too far apart to pass charges to the electrodes, where they can be sent through a circuit to do work. So in their current study, Ellingson and his team treated their nanocrystals with hydrazine, which shortened the organic groups and allowed the nanocrystals to sit closer to one another. The solar cells worked. But spectroscopic studies suggested that the hydrazine treatment killed the MEG effect.

    Meanwhile, Byung-Ryool Hyun, a graduate student in Frank Wise's group at Cornell University, reported another challenge in getting charges out of PbS. In this case, the nanocrystals were linked to titanium dioxide (TiO2) nanoparticles. When electrons are generated in the nanoparticles, they should move readily to the TiO2. But Hyun reported that electrons moved so slowly that the charges typically recombined with holes and gave up their energy before the TiO2 could snag them.

    So is this the end of the road for MEG? Arthur Nozik, who has helped lead the MEG effort at NREL, says he hopes not. “It's kind of a messy situation,” he says. He's hopeful that further research will reveal ways to produce a large MEG effect. For now, however, hopes are dimming for MEG solar cells.


    Protein Chip Promises Cheaper Diagnostics

    1. Robert F. Service

    At the Materials Research Society Fall Meeting, chemists reported making inexpensive, easy-to-manufacture microfluidic chips that can simultaneously detect and quantify levels of a dozen different proteins in a single drop of blood, potentially lowering lab costs to just pennies per test.


    Ever since recent advances made it possible to study thousands of genes and proteins at once, researchers have dreamed of nipping diseases in the bud by spotting telltale proteins with simple blood tests. So far, that vision remains a long way off. Few individual proteins in blood and tissues have proven to be conclusive indicators of disease. Even if they were, clinical lab tests that measure proteins don't come cheap. Standard diagnostic panels can cost $50 each or more. At that price, scanning millions of patients for dozens or more proteins would cost a fortune. But new glass and plastic microfluidic chips could begin to change that equation.

    At the meeting, James Heath, a chemist at the California Institute of Technology (Caltech) in Pasadena, reported that his team has made microfluidic chips that can detect and quantify levels of a dozen different proteins in blood plasma simultaneously. What is more, the test needs only a single drop of blood, which it can analyze in less than 10 minutes. Because such chips are cheap and easy to manufacture, they could drop the lab costs to just pennies per test, Heath says.

    “If we can develop simple devices [for finding disease markers in plasma], that could be a big development for global health,” says Samir Hanash, a proteomics expert at the Fred Hutchinson Cancer Research Center in Seattle, Washington. Still, even though this new device makes strides toward that goal, he cautions that proving the device can work reliably under a wide range of conditions remains a distant prospect.

    Quick scan.

    A new biochip analyzes a single drop of blood plasma for up to a dozen different protein indicators of disease. Because the new chips are cheap and fast, they could revolutionize medical diagnostics.


    In coming up with their new chip, Heath's team combined recent innovations in microfluidic chips and DNA arrays, two technologies that have been advancing rapidly. The Caltech researchers used standard microfluidic chip-patterning techniques to carve a series of large and small channels in a thin polymer film. Then they bonded the film to a glass slide patterned with 12 strips of antibodies to specific proteins. The resulting device separates blood plasma from whole blood cells, then steers the plasma and the proteins it contains over the antibody arrays for analysis. Fluorescence analysis then reveals any proteins bound to the array, creating a bar-code readout of which proteins are present in each blood sample. When different concentrations of the same antibodies are placed on different strips, the chips can also determine the abundance of the target proteins.
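    To give a feel for the kind of readout such a chip produces, the toy Python sketch below turns per-strip fluorescence intensities into a presence-and-abundance “bar code.” The protein names, intensities, calibration values, and threshold are all invented for the example; the article does not describe the Caltech team's actual calibration scheme.

        # Toy bar-code readout for an antibody-strip array. All names and numbers are
        # hypothetical; the sketch only illustrates turning per-strip fluorescence
        # into a crude presence/abundance readout.
        intensities = {"protein_A": 812.0, "protein_B": 45.0, "protein_C": 1290.0}

        # hypothetical signal for each protein at a known 1 ng/mL reference concentration
        reference_signal = {"protein_A": 400.0, "protein_B": 380.0, "protein_C": 420.0}
        reference_conc_ng_per_ml = 1.0

        background_cutoff = 100.0  # arbitrary detection threshold

        for protein, signal in intensities.items():
            detected = signal > background_cutoff
            # crude estimate, assuming signal scales roughly linearly with concentration
            conc = reference_conc_ng_per_ml * signal / reference_signal[protein] if detected else 0.0
            mark = "#" if detected else "."
            print(f"{protein:>10} {mark}  ~{conc:.1f} ng/mL")

    In the real device, quantification comes from strips carrying different antibody concentrations rather than a single reference value, as the article notes.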

    Using their new chip, the members of the Caltech team showed that they could sort 22 cancer patients into different groups based on which of 12 different proteins associated with cancer were in their blood. Heath says he and his colleagues have formed a company in hopes of commercializing the technology.


    Graphene Recipe Yields Carbon Cornucopia

    1. Robert F. Service

    Chemists at the Materials Research Society Fall Meeting reported on a cheap, easy way to grow high-quality sheets of carbon just a single atom thick and then transfer them wherever they want, opening the door both to better ways of exploring the new physics of atomically thin materials and to potential applications.


    The hottest material in physics these days is graphene, sheets of carbon just a single atom thick. Graphene is flexible yet harder than diamond. It conducts electricity faster at room temperature than anything else. And it's nearly transparent, a handy property for devices such as solar cells and displays that need to let light through. The only trouble is that people have been able to make only small flakes of the stuff—until now.

    At the meeting, Alfonso Reina Cecco, a graduate student in chemist Jing Kong's lab at the Massachusetts Institute of Technology (MIT) in Cambridge, reported that he and several colleagues have come up with a cheap, easy way to grow high-quality graphene films and then transfer them wherever they want. “That's a big deal,” says Andre Geim, a physicist at the University of Manchester, U.K., who first reported making graphene (Science, 22 October 2004, p. 666). “It promises wafers [of graphene]. That changes everything.” It opens the door both to better ways of exploring the new physics of atomically thin materials and to potential applications.

    To create the first graphene sheets in 2004, Geim peeled single layers of graphene off chunks of graphite with clear tape. But that low-tech approach would be hard to scale up for industrial use. Researchers at the Georgia Institute of Technology in Atlanta came closer in 2004 by growing graphene films atop a substrate made of silicon carbide. But silicon carbide is expensive and must be processed in an ultrahigh vacuum, which also raises the cost.

    Film star.

    Graphene is prized for its electrical properties. Now, researchers can make sheets of the atomically thin material and pattern them for devices.


    At the meeting, Cecco reported that the MIT team had done away with the silicon carbide. Instead, they deposited a film of nickel atop a standard silicon wafer. They then used a conventional film-growing technique known as chemical vapor deposition to add graphene in either a single sheet or a stack of a few sheets.

    To transfer their graphene sheets to another surface, the MIT team coated them with a polymer known as PMMA, then etched away the silicon and the nickel, leaving only the graphene on the polymer film. Finally, they covered the newly reexposed graphene surface with glass and then dissolved away the PMMA. By initially patterning the nickel layer, Cecco and his MIT colleagues also showed that they could make graphene films in arbitrary patterns, such as those typically used to make electronic devices. The same day Cecco gave his talk, a paper on the topic was published online in Nano Letters.

    This ability to pattern and place graphene wherever it's needed, Geim says, will only increase the amount of research done with the material, ensuring that it will stay among the hottest materials in physics.


    Does 'Junk Food' Threaten Marine Predators in Northern Seas?

    1. John Whitfield*
    1. John Whitfield is a London science writer.

    Some fish-eating birds and mammals have full bellies but poor diets, say biologists puzzling over declines among these high-latitude marine predators.


    Low-fat sprat.

    As their prey becomes leaner, North Sea guillemots may go hungry.


    In 2004, ecologist Sarah Wanless was observing a colony of guillemots on the Isle of May off the coast of southeast Scotland. These diving seabirds were having a terrible breeding season in the United Kingdom, and some colonies hatched no chicks at all. But Wanless could see that parent birds were catching as many fish as ever, if not more. “We couldn't work out what was going wrong,” she says. The light dawned when she and her colleagues measured the fat and protein in the fish being caught, mostly sprat, a member of the herring family. Compared with previous years, the amount of energy a hungry guillemot received from a 10-centimeter sprat plunged in 2004, dropping from 55 kilojoules to 12 kilojoules. “They were largely water,” Wanless says.

    Wanless concludes that the guillemots (Uria aalge) were suffering from a diet of what some ecologists have called marine “junk food.” They hypothesize that in some cases, marine predators' prey is being replaced by less nutritious species or, like the sprat, becoming leaner. Human junk food is fatty fare, but for these high-latitude birds and mammals it is the opposite—food without enough fat and energy to sustain them.

    The highest-profile possible victim of marine junk food is the endangered Steller sea lion, which has seen a massive decline in its Alaskan population. But some researchers say the sea lion data point to suspects such as overfishing rather than junk food (Science, 4 April, p. 44). Among northern seabirds, however, researchers are finding multiple examples of struggling populations eating low-quality diets. The details vary among species, but a growing consensus holds that this is one result of climate-driven changes to food webs, which are disrupted as northern seas warm. “Until recently, we were preoccupied with how much food there was, and where. But what the food is can also be crucial,” says Wanless, who works at the Centre for Ecology and Hydrology in Edinburgh, U.K.

    Birth of an idea

    The junk-food hypothesis, as it's become known to the chagrin of some of the researchers working on it, was born in the early 1990s. John Piatt, a seabird ecologist at the U.S. Geological Survey Alaska Science Center in Anchorage, noticed that some colonies of common murres—the American name for guillemots—in the Gulf of Alaska had not recovered as expected from the 1989 Exxon Valdez oil spill. Piatt knew that murres and other seabirds were eating mainly juvenile walleye pollock. But gut content data from the early 1970s showed that they had been eating mostly capelin, an oily fish in the smelt family. A gram of capelin can contain up to twice as many calories as a gram of pollock, a lean white fish related to cod. In experiments using captive chicks of a variety of seabird species, those fed oily fish gained weight much more quickly than those fed pollock. Piatt thought this dietary shift might explain the murre's decline.

    Down, down, down.

    In Scotland, the total abundance of 13 species of breeding seabirds has been dropping since the 1980s.


    But what drove that shift? In the mid-1970s, before the spill, capelin numbers plunged while pollock's surged, as reflected in the catch brought up by research trawlers and the growth of the pollock fishery. This also coincided with a flip in a climate cycle called the Pacific Decadal Oscillation (PDO), which warmed the ocean surface off western Alaska by about 1°C in the space of a few months in late 1976. Except for a cool spell between 1998 and 2002, and another that began in late 2007 and which is as yet too brief to call a reversal, that region of water has been relatively warm ever since. The mid-1970s flip is also around the time when numbers of Steller sea lions in western Alaska began to plummet; in the early 1990s, marine biologist Dayton Alverson of Natural Resources Consultants Inc. in Seattle, Washington, independently wondered whether a switch from eating oily fish to pollock was the cause. In 1991, bird and mammal researchers, including both Alverson and Piatt, met to review seabird declines. At the meeting, seabird ecologist Scott Hatch, also of the Alaska Science Center, says he coined the phrase “junk-food hypothesis.”

    To test the idea, in 2000, marine-mammal researchers Andrew Trites and David Rosen of the University of British Columbia in Vancouver, Canada, fed captive sea lions either pollock or herring. Adults could survive on pollock, they found, but yearlings could not eat enough to sustain themselves. The animals needed to eat more than 20% of their own body weight in pollock each day, but their stomach capacity only allowed them to consume 17% to 18%. “We gave animals as much pollock as they would eat, and they were losing weight,” Trites says. The fact that the Steller sea lion population has actually increased in southeast Alaska, where the PDO shows the opposite pattern, also tends to confirm the junk-food/climate hypothesis.

    But some sea lion experts aren't convinced. “Lab work suggests the juveniles are most vulnerable to low-quality prey,” says Lowell Fritz of the National Marine Fisheries Service in Seattle. “But we're not seeing that result currently in the field. We should have seen starving animals, but we didn't—we just saw them not there.” Fritz thinks the sea lions were more likely hammered by deliberate shooting and by-catch in fisheries. He also thinks the PDO's effects on fish have been overstated. “I'm not convinced you can see any links [to ocean conditions] beyond the phytoplankton. I'm not seeing a direct link to fish.”

    Trites says that juvenile sea lions may instead be scarce because adult females adapted to a low-fat diet by spacing out their breeding. Examining the condition of adult females might settle the question, but those studies have not been permitted because of animal-welfare concerns. Although both hope to do more work, Trites and Fritz agree that the cause of the sea lions' original decline may never be known.

    The ultimate junk food

    As researchers puzzle over Alaska's sea lions, other scientists report that the Baltic Sea has also shifted ecological regimes, at least in part because of humans. In the late 1980s, the sprat population boomed, triggered by a combination of warming seas and heavy fishing of the sprat's main predator, cod. This abundance of small, oily fish ought to have been good news for the sea's guillemots, but chick weights instead dropped through the 1990s, recovering only as cod stocks regrew, says marine ecologist Henrik Österblom of the Stockholm Resilience Centre. “We thought that because sprat increased, that chick condition would improve. Instead we found the opposite,” he says. The North Sea also experienced large ecological changes in the late 1980s, leading to the bad breeding season Wanless observed in 2004, the worst on record according to the U.K. government's Seabird Monitoring Programme.

    The changes to Baltic sprat are some of the best field evidence for the junk-food hypothesis: As the number of sprat went up, the nutritional worth of each fish in the crowded population went down, say Österblom and his colleagues. This matters to guillemots, because they deliver one fish at a time to their chick, “so what that fish contains is incredibly important,” says Österblom.

    In the past 5 years, the menu for North Sea birds has become even less promising. For reasons that aren't understood, large numbers of snake pipefish, a relative of seahorses, have appeared in these predators' diets. The pipefish have tough skins and are virtually fat-free: “It's the ultimate in junk food,” says Wanless. British birders have spotted starving chicks of kittiwakes—marine gulls that usually eat oily sand eels—surrounded by the corpses of uneaten pipefish.

    A junk-food diet can increase mortality in more roundabout ways. In 2004, for the first time, Wanless saw a pair of guillemot parents fishing at the same time, leaving their chick unattended. This atypical behavior has increased year-on-year, and last September she and her colleagues reported its consequences in Biology Letters: A growing number of chicks are being killed by their adult neighbors in the colony. Malnourishment has more subtle effects, too: A team including Piatt found that kittiwakes reared on low-fat fish in captivity showed higher levels of stress hormones and cognitive deficits. “We measure things that we can, such as growth rate, but hidden behind this could be physiological stresses that put birds at a disadvantage,” he says.

    At risk. Researchers still debate whether “junk food” led to the decline of the endangered Steller sea lion or whether overfishing is to blame.


    In the North Sea, bad breeding seasons for birds are becoming more common (see graph). This summer, the U.K.'s Royal Society for the Protection of Birds reported that “virtually no” chicks fledged from some kittiwake and tern colonies. The main cause is thought to be changes to fish populations driven by fishing and by the effects of global warming on the fishes' planktonic food; the North Sea is now 1.5°C warmer than it was 40 years ago. That warming has brought a 70% drop in the population of a tiny crustacean called Calanus finmarchicus, thought to be the main food of sprat and sand eels, as well as a rise in the numbers of a warm-water relative, C. helgolandicus, which contains much less fat and so is junk food for the fish. Together, these changes leave the sprat and sand eels both fewer in number and leaner.

    In another twist, C. finmarchicus itself may have become junk food for other seabirds: Its range has shifted north by about 1600 kilometers, and it is now displacing the even larger and fattier Calanus species that support vast colonies of the little auk, a crustacean-eating relative of the guillemot, in Greenland and Norway. In the long run, little auk numbers might decline and those of fish-eating birds such as the guillemot might start to increase in these areas, says seabird ecologist Morten Frederiksen of the University of Aarhus in Denmark. “We might be seeing the whole system shifting north, with junk food-related problems developing at the northern and southern boundaries of the climate zones.”

    These studies show that “junk food” takes different forms and has different consequences for various species. So does it make sense to group the effects on North Sea guillemots with those on North Pacific sea lions? Seabird ecologist Robert Furness of the University of Glasgow in the U.K. has his doubts. “The junk-food hypothesis is an attractive term but a bit misleading,” he says. “Each ecosystem is unique.” In his view, simple issues of food quantity—such as the current low sand eel populations in the North Sea—will most often be the critical determinant of predators' health. Frederiksen adds that researchers aren't yet sure how to separate the effects of quantity from those of quality: “Working out how much the problems are a junk-food issue and how much is lack of food is difficult.”

    But Trites argues that there is an overarching message. “People have to realize that not all fish were created equal,” he says. Should that message hit home, the only thing left to fix will be the name. Trites thinks a better analogy for predators' woes is a diet of celery. Piatt agrees: “Junk food is very fatty,” he says. For animals, “it's the lean cuisine that's the problem.”
