News this Week

Science  05 Dec 1997:
Vol. 278, Issue 5344, pp. 1700

    Japan Centers In on Genome Work

    Dennis Normile


    Tokyo: After years of lobbying for increased support, Japanese genome scientists stand on the threshold of a major increase in funding. Despite frantic efforts to tighten its belt, the government has proposed a doubling of spending on genome-related research in 1998. The increases are spread across agencies and fields, and include one program that researchers see as breaking new ground in genome-related work.


    The centerpiece of this new program is a proposed large-scale nuclear magnetic resonance (NMR) center to study protein structure, which scientists both here and abroad believe will be the next important step in genome work. "It will be on a grander scale [than] any NMR centers that we know of," says Kurt Wüthrich, chair of the department of biology at the Swiss Federal Institute of Technology (ETH) in Zurich. Adds Gaetano Montelione of Rutgers University's Center for Advanced Biotechnology and Medicine in Piscataway, New Jersey: "The structural analysis component [of Japan's plans] gives them a chance to leapfrog to the forefront of genome efforts." Such a major effort will be crucial to complement other methods of determining gene and protein function, says Montelione, who is also studying how to scale up NMR techniques for structural biology.

    The center's promoters plan to open the facility to research groups from around the world. "It's one way Japan can make an international contribution to genome efforts," says biophysicist Akiyoshi Wada, director of the private Sagami Chemical Research Center, who was formerly at the University of Tokyo.

    Even that contribution, however, may not be enough to allay concerns that Japan is not doing enough to advance human genome sequencing efforts around the world. Although Japan is setting the pace in sequencing the rice genome (see sidebar on p. 1702), its human sequencing efforts are modest despite government plans to add a sequencing program to ongoing efforts. Molecular biologist Yusuke Nakamura, director of the University of Tokyo's Human Genome Center, finds it odd that the government would fund the NMR center while slighting the sequencing work he thinks should come first. "This research has to be done step by step," he says. "But in Japan, some steps are being omitted."

    Structural solutions

    The NMR center would be one of three components of a new Genetics Frontier Research Center at the Institute of Physical and Chemical Research (RIKEN) near Tokyo. Sponsored by the Science and Technology Agency (STA), the new center is expected to get $33.3 million, or more than one-fifth of the $149 million in genetics-related research spending proposed for next year. An enhanced human genome sequencing center will complement the NMR work. "Japan doesn't yet have a high-throughput [sequencing] center," says Yoshiyuki Sakaki, a molecular biologist at the University of Tokyo's Human Genome Center, who will move to RIKEN to head this part of the program.

    Another part of the frontier program aims to sequence mouse cDNA completely, both for the mouse's value as an experimental animal and for studying the function of genes common to both mice and humans. The work will be led by Yoshihide Hayashizaki, chief scientist of RIKEN's Genome Science Laboratory, who is developing new sequencing reagents and techniques to increase sequencing speed and automation dramatically. His techniques also allow cloning of full-length cDNA (the DNA expressed to make a protein) instead of the fragments that typically result from other methods. The cDNA would be used to synthesize proteins, and the protein structure would then be determined by NMR spectroscopy or x-ray crystallography.

    The NMR center would be the key to breaking new ground in genome studies. Wüthrich, who is so intrigued with the project that he has signed on as an adviser, explains that sequence data can provide only limited information. In some cases, protein structure and function can be predicted based on similarities with known gene sequence data. But the human genome program, he says, will lead to the discovery of a very large number of unknown proteins, and to make sense of this information the three-dimensional structure will have to be studied. Structural biology groups are already lined up to use Japan's new SPring-8 synchrotron in Harima, west of Osaka, for crystallography.

    Shigeyuki Yokoyama, a biophysicist who holds posts at both RIKEN's Cellular Signaling Laboratory and at the University of Tokyo, says that the scale of the proposed center makes it special. Most labs have only a few of the expensive NMR machines, he says, hardly enough to tackle the 100,000 human proteins arranged in some 1000 structural families. The RIKEN concept is to gather enough machines, 10 in the first stage alone and more later, and a cadre of engineers and technicians to run them.

    In addition to its core research, Yokoyama says the center would focus on the development of a new generation of NMR spectrometers using high-temperature superconducting magnets for greater sensitivity. With a large number of advanced NMR spectrometers, Yokoyama says the facility would usher in a new era in genome research: "We can combine the structural biology as a part of genome research."

    Gold diggers or good science?

    The novel approach has drawn some internal criticism as well as international praise. Tokyo's Nakamura and others would like to redirect the resources for the cDNA and NMR efforts into human genome sequencing. Nakamura says that even with the sequencing component of the RIKEN plan, Japan's contribution to the international sequencing effort will be "embarrassingly small."

    Nakamura's assessment gets some support from overseas colleagues. "My take [on the plans] is that they are looking toward the future without contributing significantly to the initial sequence data collection phase," says Bruce Roe, who heads a sequencing effort at the University of Oklahoma, Norman. "This smacks of what most [countries] want to do, which is to mine the data," he adds.

    Whether or not the RIKEN program works as planned, Nakamura believes that it will not strengthen the infrastructure needed to pursue the medically important research that builds on the sequencing data. "I am a little afraid about the future of medical research in Japan," he says.

    In addition to questioning governmental priorities, Nakamura also thinks scarce resources are being used inefficiently. He notes that most of the additional $6.8 million headed for the university's Human Genome Center next year will go for supercomputing. Meanwhile, a tight budget lets his group run only a few of its 15 sequencers.

    For Nakamura, the heart of the problem is the lack of a national strategy for the human genome project. He would prefer to see the government support one or two large-scale sequencing centers, perhaps like Britain's Sanger Centre, that would focus on human genome sequencing and work in cooperation with researchers hunting for disease genes. He feels that the move to structural biology, while important, would benefit by waiting for less expensive, more sensitive NMR equipment.

    The issues raised by Nakamura and others caught the attention of Koji Omi, a Diet member and director-general of the Economic Planning Agency. Omi was one of the driving forces behind the 1995 passage of a Basic Law for Science and Technology, which set the stage for a subsequent plan to boost government spending on research significantly. This fall, Omi reportedly questioned officials of both RIKEN and STA about the feasibility of Hayashizaki's new sequencing technologies and the emphasis on NMR facilities at the expense of sequencing. Omi was not available for comment, but an aide said he was simply trying to understand the program better. In any event, RIKEN and STA officials are confident their plans will be adopted.

    Many of those pushing the RIKEN plan agree that the government should also be putting more resources into the university-based genome efforts. But thickets of regulations and red tape make it difficult for universities to hire technicians or assemble teams of scientists. Even Nakamura agrees that Japan's universities are not well-suited for large-scale sequencing efforts.

    For political reasons, STA officials say that a grander vision was needed to sell the project to government officials. "It would have been difficult to get the money for a center for sequencing only," says Yasuhiro Itakura, deputy director of STA's life sciences division. At the same time, RIKEN officials acknowledge that the institute can't afford another costly failure after a previous scheme to automate genome sequencing was scrapped in the early 1990s. Yokoyama thinks the key to continued government support for genome work is the reaction of the international community. "If this effort is appreciated by other countries, I think [the Japanese government] will expand its support," he says. But if the project is judged solely on its contribution to sequencing, he says, Japan's genome efforts could be in trouble.

    Despite the risks, Sagami's Wada is glad the country has decided to chart its own course. And he says he's very confident that Japanese researchers will contribute to the global quest to decipher the human genome.


    How Genome Research Rose to the Top

    Dennis Normile

    Japan's national finances are in dire shape. The government has vowed to cut overall spending next year, even though the overall research budget will inch up by 1.4%. So how did genome-related work get earmarked for a doubling? The big increase reflects a confluence of factors, notably a growing public understanding of the value of genome work and a feeling that, after a long wait, it was genome researchers' turn.

    "Over recent years, each ministry has devoted more and more of its budget to genome projects as a consensus on the importance of genome work has strengthened," says Kazuo Katao, director of the biochemical industry bureau at the Ministry of International Trade and Industry. Kiyoaki Maruyama, director of research and development for the Ministry of Agriculture, Forestry, and Fisheries, recalls that the small band of scientists and officials in the late 1980s who wanted $2.5 million a year to start mapping the rice genome faced an uphill battle convincing colleagues of the value of the work. But now, a $13.2 million request to move on to large-scale sequencing "wins applause because the value of the work is well understood," he says.

    For other programs, it was a matter of waiting their turn. Two of the three projects bundled into the Institute of Physical and Chemical Research's (RIKEN's) proposed Genetics Frontier Research Center (a program to sequence mouse cDNA and a new nuclear magnetic resonance spectroscopy center) had been proposed as separate projects a year ago. But the priority last year at the Science and Technology Agency (STA) and at RIKEN was getting a new neuroscience center off the ground (Science, 14 March, p. 1562). This year, RIKEN officials bundled the two programs, added a human genome sequencing center (something the STA had long wanted), and secured backing for the package. Also, in STA's case, significant cuts in nuclear power research freed up funds.

    Another critical factor was an official stamp of approval from the Council for Science and Technology. Chaired by the prime minister, the council is the nation's highest science advisory body. In mid-1996, the council decided to draw up a Basic Plan for Research and Development on Life Science, for no particular reason other than that the council had never considered life sciences before, says council member Wataru Mori, a pathologist and former president of the University of Tokyo. The council produced a report last August emphasizing the importance of basic research on the functions and structures of genes and proteins as the foundation for progress in all of life science.

    Mori says the council's deliberations were not part of a scheme to pave the way for a dramatic increase in funding for life science research. But there is a telling precedent. In mid-1996, the council produced a report on the importance of neuroscience, and the 1997 budget included funding for a new Brain Science Institute. And government officials admit that the genome report helped their case. "If our requests are in line with a report like this, the Ministry of Finance can't be too critical," says Maruyama.


    Rice Genome Races Ahead

    Dennis Normile

    While government officials are struggling to reshape Japan's human genome effort, the country's Rice Genome Research Program (RGP) is moving forward with alacrity. Program scientists have put together highly detailed genetic and physical maps, and a large library of cDNA clones, reaching the major goals of the first 5-year phase of the program on schedule and within budget. The work, which includes 2400 DNA markers on a genetic map, has earned plaudits from plant geneticists around the world.

    "The [RGP] project has been acclaimed, and rightly so, for its technical contributions," says Susan McCouch, a Cornell University geneticist who works on rice. The program is also getting a vote of confidence from the Japanese government in the form of a proposed 429% increase in funding. The growth could help offset a drop in contributions from the Japanese Racing Association (a quasi-public entity that runs betting operations at the country's horse-racing tracks and is required to donate a portion of its proceeds to agricultural research) that once matched what the government spent but which have been hurt by a sluggish economy.

    Whatever the total, the money will support efforts to sequence all 450 million base pairs on the plant's 12 chromosomes and to expand efforts to identify genetic markers common to other cereals. Such a concerted effort has helped the country overcome a fast start by McCouch and her colleagues at Cornell, who published a basic genetic map in 1988. "The U.S. agricultural research establishment is deep; Japan's is shallow," says Naoki Katsura, head of research planning for the National Institute of Agrobiological Resources (NIAR). "So we recognized the merits of concentration."

    The logical choice was rice, not only because it has the smallest genome of any of the major cereals, but also because of its importance as a crop and as a cultural icon for Asia. The ministry and its researchers also concentrated resources in one location within NIAR's Tsukuba campus, assembling a team that has grown to 50 scientists.

    A similar strategy for the second phase of the project (doubling the number of scientists and hiring more technicians) will also extend collaborations beyond current efforts with the John Innes Centre in Norwich, U.K., to identify markers common to rice and wheat, and scientific exchanges with the International Rice Research Institute in Los Baños in the Philippines. Katsura says he would also welcome help on the large-scale sequencing to shorten what otherwise might be a decade-long effort.


    SUNY-Battelle Team to Run Brookhaven

    Andrew Lawler

    Six months of uncertainty at Brookhaven National Laboratory ended last week, when Department of Energy (DOE) managers chose a new contractor to operate the troubled facility in Upton, New York. The winning team, which takes over in January, is made up of the State University of New York (SUNY), Stony Brook, and Battelle Memorial Institute. It now faces the challenge of mending fences with the lab's neighbors and resolving the fate of a major research reactor that local groups and New York politicians want to close.

    The new director, former Stony Brook President John Marburger, promises to make major changes to clean up environmental problems and ease the concerns of the surrounding Long Island population. "Job one is to establish contact with our community," he said on 25 November. He also pledged to make major reassignments within the 3200-member staff to ensure that Brookhaven abides by environmental regulations while producing quality science. Marburger, a physicist, is a former chair of Universities Research Association Inc., a consortium that operates DOE's Fermi National Accelerator Laboratory, the country's most powerful accelerator.

    DOE Secretary Federico Peña fired Brookhaven's current contractor, Associated Universities Inc. (AUI), in May, following revelations of a long-standing tritium leak at the High-Flux Beam Reactor (HFBR), which is shut down for repairs (Science, 9 May, p. 890). In the competition that followed, the Stony Brook- and Columbus, Ohio-based Battelle beat out another group led by Westinghouse and IIT Research Institute of Chicago for the $2 billion, 5-year contract to operate the $400-million-a-year lab. The winning factors, DOE sources say, were Stony Brook's proximity to the lab and Battelle's experience in managing large organizations like Pacific Northwest Laboratory in Richland, Washington, next door to the polluted Hanford site used to store waste from nuclear weapons production. DOE says the new contract's higher annual fee ($1 million more than the $4.2 million paid to AUI) reflects the added cost of ensuring environmental safety and restoring public confidence in the lab.

    The new team, called Brookhaven Science Associates, will have a 16-member board, including representatives from the six northeastern universities that make up AUI. But the majority of members will be Stony Brook and Battelle managers, Marburger said. Although DOE told bidders to rule out any layoffs or dismissals, Marburger says there will be "a lot of change in the functions people perform."

    One of the most pressing issues for the new operator is the fate of the HFBR. Many neutron scientists are eager for it to come back online, but Senator Al D'Amato (R-NY) and Representative Michael Forbes (R-NY) say it's an environmental hazard that should be closed permanently. Peña is expected to decide by early spring, but even if he sides with the scientists, an environmental impact review will keep the HFBR shut until 1999. That is also the starting date for Brookhaven's newest facility, the Relativistic Heavy-Ion Collider.

    Given those problems, Marburger says he's prepared for some long days on the job. "I don't have any illusions," he told reporters. "The next year will probably be a difficult one."


    Indonesia Opens the Door for Global Climate Studies

    Jeffrey Mervis

    The forest fires that raged out of control in Sumatra and Kalimantan for several months this year have left at least one promising scientific development in their acrid wake. Reacting to international criticism that his country has been slow to take action against the fires and other environmental problems, Indonesia's longtime leader, President Suharto, last month invited the world's scientific community to conduct research on global climate variability in Indonesia. He also pledged financial support for international efforts in climate prediction and mitigation, including a fledgling U.S.-based effort at Columbia University.

    "We are pleased to open ourselves and offer the uniqueness of our territory, whether at sea, on land, or in the atmosphere, for scientific studies of climate, weather, and the environment," Suharto told an international conference on global environmental change held 10 to 12 November in Jakarta. Last week, he repeated the promise to world leaders gathered at the Asia Pacific Economic Cooperation summit in Vancouver, Canada.

    Suharto's statements herald a new policy, confirms geophysicist M. T. Zen, senior adviser to State Minister for Research and Technology B. J. Habibie. Indonesia's military has at times been reluctant to allow international research projects among its far-flung islands and within its territorial waters (Science, 5 January 1996, p. 23). But Zen says, "The severity of the drought and the extent of the forest fires make it clear that we can no longer look at climate change as a problem that only the wealthy nations can afford to think about. As a maritime continent, Indonesia has a special role to play in improving our understanding of global climate."

    Researchers were quick to welcome Suharto's invitation, noting that it opens up a world of possibilities. Indonesia sits at the center of some of the most pressing issues in climate research, including understanding the variability of El Niño, the interaction between the Pacific and Indian oceans, and the conveyor-belt model of ocean circulation. In addition, its geography (some 30,000 islands covering an area the size of the continental United States) makes oceanography a matter of national importance. "It does seem to be a significant change in attitude, and very encouraging," says Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in Boulder, Colorado. "It should make researchers very happy," adds Columbia University physical oceanographer Arnold Gordon, head of a 10-year project begun in 1993 to study the interaction of the Pacific and Indian oceans in Indonesia and its impact on global climate. "It's very important that it's coming straight from the top," says Gordon, whose Arlindo project relies on Indonesian vessels and scientists to deploy and reclaim instruments.

    As a first step toward international cooperation, Suharto has proposed Indonesia for core membership in the new International Research Institute for Climate Prediction (IRI), a U.S.-backed effort based at Columbia's Lamont-Doherty Earth Observatory. The institute, begun last spring with a 3-year, $18 million award from the National Oceanic and Atmospheric Administration (NOAA), hopes that governments around the world will eventually pick up the tab for its activities, which include a large modeling team at the Scripps Institution of Oceanography in La Jolla, California.

    "NOAA told us it will cost $3 million over 3 years to become a core member of IRI and $1.5 million a year after that," says Zen, "and Habibie has said that's not a problem." The institute, which will break ground in February for a $10 million building on the Columbia campus, is negotiating a similar arrangement with Taiwan, Australia, and Brazil, says IRI director Antonio Moura, a former head of Brazil's weather service. Over the next 2 years, IRI also hopes to ink bilateral agreements with agencies in several countries that will help it, in Moura's words, "go beyond the models to real applications," including demonstration projects that help governments and people deal with the impact of climate change.

    In his speech to the Jakarta conference, Habibie also staked out a claim to IRI's first regional center, which Zen said will be called the Indonesia Research Institute for Climate, Environment, and Society. Over the next 6 months, a group of 20 scientists from Indonesia's technology agency, BPPT, will be assembled to form the core of the institute, which will later expand to include researchers at other Indonesian agencies as well as scientists in other countries. Its goal will be to anticipate future El Niño events and recommend ways to mitigate their impact.

    "Whether we like it or not, El Niño will be with us on a recurring basis, and it's our job to keep reminding people of what must be done," says Zen. "If we know when it's coming, we can advise people what and when and where to plant, as well as exercise more caution in widespread biomass burning."

    Researchers also hope to extend an international program on climate variability and predictability, called Clivar, into the region. The project, which uses an array of million-dollar buoys to measure the interaction of ocean and atmosphere over a time scale ranging from months to decades, is being deployed in the Pacific and Indian oceans. Permission to work in Indonesian waters would plug an important gap in coverage, says Trenberth, co-chair of Clivar's scientific steering committee. A proposed international experiment to measure the effect of air-sea interactions on the South Asian monsoon would also benefit from Indonesia's input, says co-organizer Peter Webster, an atmospheric scientist at the University of Colorado, Boulder. "India has committed buoys and Australia has pledged a ship," he says about the proposal to work in an area west of Sumatra in the Indian Ocean, "but we'd also like Indonesia to be involved."

    The wave of concern about climate variability may, of course, peak (and opposition to global collaborations could return) once the forest fires die out, the drought ends, and El Niño recedes. But "there's something to be said for striking while the iron is hot," says Trenberth. "The main thing is to assure Indonesia and other countries that they have more to gain than lose in generating and sharing information on the forces shaping our climate."


    Drug Companies Decline to Collaborate

    Nigel Williams

    A group of leading pharmaceutical companies has shot down a proposal for a $180 million joint project to develop new treatments for the world's most threatening tropical diseases, particularly malaria. A group of public and private organizations put the proposal to the drug companies last month, and earlier confidential discussions were thought to be going well. But industry leaders who met in Switzerland decided that there are too many uncertainties and potential commercial conflicts to go ahead. "The rejection is particularly disappointing," says Bob Howells, head of international programs at Britain's Wellcome Trust, one of the proposers of the initiative.

    Along with the Wellcome Trust, the proposal was put together by the World Health Organization (WHO), the World Bank, the Association of the British Pharmaceutical Industry, and the drug companies Roche and Glaxo Wellcome. The plan was driven by growing concern about the threats posed by some diseases (malaria was the pilot disease targeted in the proposal) in which widespread resistance to the cheapest drugs is developing rapidly. Researchers have perceived a lack of interest among pharmaceutical companies in developing new inexpensive treatments. "We are worried that most companies are only interested in treatments for Western travelers," says WHO's Win Gutteridge, who helped put the proposal together.

    The proposal involved setting up a not-for-profit company that would screen and develop new treatments over an initial 7-year term. Industry would stump up the lion's share of the cost, and any profits would go back into the company to help meet running costs so that it could eventually become self-financing. "It's quite a new idea to develop collaborations between companies," says Gutteridge. The novelty may be part of the proposal's problem. The proposed company would have worked in parallel with the new Multilateral Initiative on Malaria (MIM), an attempt to coordinate the research efforts of several public and private funding bodies, including the U.S. National Institutes of Health, WHO, the Wellcome Trust, and France's Pasteur Institute (Science, 21 November, p. 1393).

    Following the Geneva meeting, Simon Sargent of Glaxo Wellcome, one of the proposers, sent a brief fax to Gutteridge. He reported that the industry leaders concluded: "The underlying issues are difficult and complex and need further examination." They also pointed out that there are drugs and vaccines for malaria already in development within the pharmaceutical industry, and that there is a need to explore further what work is being done. For example, Glaxo Wellcome is planning trials of a new drug, Malarone, in Africa ahead of a potential donation of 1 million treatment doses per year in the region.

    Malaria researcher Brian Greenwood of the London School of Hygiene and Tropical Medicine says researchers were very disappointed by the companies' decision: "We really thought industry was wanting to help." Harvey Bale, director of the International Federation of Pharmaceutical Manufacturers' Associations in Geneva, says there were very serious obstacles to moving ahead with the proposal. "There are a number of problems at the business level, and we've some very substantial questions," he says. "And the cost of the project is high for some companies."

    The decision is a blow to the newly formed MIM, which had hoped to work alongside the proposed public-private partnership. Gutteridge says MIM will still go ahead, but it has not been helped by the companies' decision. He is, however, still optimistic that a solution can be thrashed out: "I'm not surprised they want to examine the underlying issues further. There's a lot of research going on, and maybe there are things we don't know about." The proposers plan to discuss the decision with company representatives to look for a way forward. "Our bottom-line hope is that the proposal isn't dead," says Gutteridge. "We had such a good initial response."


    Spiked Coffee Prompts Police Inquiry

    Jocelyn Kaiser

    It's enough to make a scientist swear off coffee. Last month, six researchers at the University of California, San Diego (UCSD), Medical Center fell ill after swigging coffee at a lab meeting. The cause: Their drinks were laced with acrylamide, a neurotoxic chemical used in molecular biology labs. All six were treated in local hospital emergency rooms and returned to work the same day. But that wasn't the end of the story: When tests revealed that the amount of chemical in the coffee seemed too high to be accidental, campus police began a criminal investigation 2 weeks ago.

    The poisoning occurred on the morning of 5 November in a conference room of the Medical Center's Cellular and Molecular Medicine East building. According to several sources, the researchers who drank the tainted coffee worked in the lab of cell biologist Don Cleveland, although Medical Center spokesperson Nancy Stringer said members of several labs were at the meeting. Cleveland declined to comment. The victims (two graduate students and four staff members) experienced light-headedness and blurry vision, the center said in a statement.

    Tests of coffee from the meeting room revealed that it contained acrylamide, a white, water-soluble crystal that's used to make gels for protein analysis. Although trace contamination might have occurred by mistake, say, when someone cleaned a coffee pot with a towel that had been used to wipe up acrylamide solution, that seems unlikely. "The levels of acrylamide were far above trace amounts and could not be explained by an accident," says UCSD police detective Robert Jones.

    Acrylamide can cause the body's potassium levels to drop and permanently damage the nervous system. The amount in the coffee was estimated at about one-tenth of a lethal dose, however, Jones says. And although the long-term effects from a single exposure to acrylamide haven't been well studied, neurotoxicologist Jean Harry of the National Institute of Environmental Health Sciences says that because the victims ingested the poison, they likely became nauseous before they could consume enough to suffer serious harm.

    The university had reported no suspects earlier this week and was conducting interviews and collecting evidence, according to Stringer. The poisoning thus remains a mystery, the most recent in a string of suspected poisonings in research laboratories, many of them still unsolved. Two years ago, for example, several scientists at the National Institutes of Health and the Massachusetts Institute of Technology were found to have ingested the radioactive isotope P-32, and in 1994, 15 researchers at Rockefeller University in New York City got sick after drinking coffee laced with sodium fluoride. About 15 years ago, several people at the San Diego biotech company Quidel Corp. (including CEO David Katz) drank coffee that had been doped with acrylamide. A chemist at Quidel was convicted of the crime.

    "For the next 2 years, there will be jokes about drinking coffee, but the sad side is, it's a really vicious way of taking revenge," says UCSD immunologist Maurizio Zanetti, who as an employee at Quidel helped identify the poison in that case. Zanetti suggests that anxiety over funding may have prompted someone to act irrationally. The UCSD police, however, say it is too soon to draw any conclusions. Like a good investigator, Assistant Police Chief Jay Dyer says, "All possibilities have to be looked at."

  8. CHINA

    New Robotic Vessel Extends Deep-Ocean Exploration

    1. Li Hui
    1. Li Hui writes for China Features in Beijing.

    Beijing: China has deployed a new, untethered robotic submarine that extends the reach of researchers and others interested in learning more about the deep ocean. In a series of dives earlier this year in the central Pacific, the autonomous underwater vehicle (AUV) CR-01 went down 6000 meters, matching the deepest levels reached by other submersibles. It can remain underwater for an extended period of time, maneuver to avoid obstacles, and take both still and video pictures.

    The achievement puts China in the front ranks of countries (notably France, Russia, the United States, and Japan) with such a capability. And it may provide a boost for the entire field, which in the post-Cold War era has had to scramble for funding. "It's a significant accomplishment," says James Bales of the AUV lab at the Massachusetts Institute of Technology, which has a fleet of five smaller AUVs. "The fact that China is putting major resources into autonomous vehicles also validates what the rest of us have been doing."

    Exploring the deep-ocean floor is a formidable technological challenge. Besides the great pressure and the difficulty in communicating through thousands of meters of seawater, navigating through unknown and irregular terrain is a treacherous business. But the potential rewards, both economic and scientific, are compelling. The vehicle can be used to get information on the topography and hydrology of the seabed, to explore mineral resources underwater, and to search for sunken objects. It is currently being used to explore deep-sea mineral deposits and is available for contract work. "It gives China the technological means for tapping ocean mineral resources," says the robot's designer, automatic control expert Feng Xisheng of the Shenyang Institute of Automation.

    The torpedo-shaped AUV is 4.4 meters long, 0.8 meters in diameter, and weighs 1300 kilograms, making it significantly bigger and heavier than other deep-diving robots. Its size allows for larger batteries, which means more staying power, up to 23 hours, and reduces the need for miniaturized sensors and other specialized equipment. Powered by a group of silver-zinc batteries with a total capacity of 4.8 kilowatt-hours, the vessel has a cruising speed of 2 knots, with bursts up to 2.7 knots.
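    The quoted battery capacity, endurance, and speed are mutually consistent, as a quick back-of-envelope check shows (all numbers come from the article; the only added assumption is the standard conversion 1 knot = 1.852 km/h):

```python
# Back-of-envelope check of CR-01's quoted figures.
capacity_kwh = 4.8      # silver-zinc battery capacity from the article
endurance_h = 23.0      # maximum quoted time underwater
cruise_knots = 2.0      # quoted cruising speed

avg_power_w = capacity_kwh * 1000 / endurance_h  # average electrical draw
range_km = cruise_knots * 1.852 * endurance_h    # distance covered at cruise

print(f"average draw ~{avg_power_w:.0f} W, cruise range ~{range_km:.0f} km")
# prints "average draw ~209 W, cruise range ~85 km"
```

    An average draw of roughly 200 watts for propulsion plus instruments, sustained over a full 23-hour dive, is what the article's figures imply.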

    To avoid collisions, the AUV uses five ranging sonars with a 60-meter range. It also uses one long-baseline acoustic positioning system for navigation. The system consists of three beacons placed on the seabed and separated by a distance of 3.6 kilometers, along with one subsystem installed on the support ship and one on the AUV. Each part of the system can send and receive signals, allowing controllers to pinpoint the robot relative to its support ship and to send commands for recovering the vehicle. This long-baseline system can cover an area of about 40 square kilometers; within this area, the system's positioning accuracy is 10 meters. In case of a breakdown, Feng says, the robot is capable of automatically resurfacing and raising the alarm signal by extending an emergency wireless antenna and flashing its lights.
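    The positioning principle the article describes, ranging to beacons at known seabed positions, can be sketched as simple trilateration. The beacon layout, vehicle position, and flat 2-D geometry below are illustrative assumptions, not details of the CR-01 system, which must also resolve depth, sound-speed variation, and timing errors:

```python
import math

def trilaterate(beacons, ranges):
    """Solve for (x, y) from three beacon positions and measured ranges.
    Subtracting the first range equation from the other two cancels the
    quadratic terms, leaving a 2x2 linear system solved by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Beacons roughly 3.6 km apart, as in the article; a made-up vehicle
# position generates the ranges, and trilateration recovers it.
beacons = [(0.0, 0.0), (3600.0, 0.0), (1800.0, 3100.0)]
truth = (1000.0, 1500.0)
ranges = [math.hypot(truth[0] - bx, truth[1] - by) for bx, by in beacons]
print(trilaterate(beacons, ranges))  # recovers (1000.0, 1500.0)
```

    In practice each range comes from an acoustic round-trip time, which is why every part of the real system must both send and receive signals.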

    The robot has no external devices to collect samples. To get information on metal-rich nodules that litter parts of the Pacific sea floor, explains sonar expert Zhu Weiqing, from the Beijing-based Institute of Acoustics under the Chinese Academy of Sciences, the robot takes photos (up to 3000 exposures) and video recordings with a camera that holds 4 hours of tape. It has two side-scanning sonars that can operate within a range of 350 meters on each side and one chirp sub-bottom profiler that can penetrate 50 meters into the seabed. Signals emitted by the side sonars help researchers analyze an area for any mineral deposits; sounds from the profiler provide information on the depth of those deposits.

    The vehicle was put through its paces during a Chinese expedition to a region some 1350 kilometers southeast of Honolulu. CR-01 spent 8 hours on each of five dives, conducted between 21 May and 27 June, exploring an area the size of a large golf course from an altitude of between 3 and 5 meters above the sea floor.

    The project, funded by the State Science and Technology Commission, began as an effort to design an AUV capable of diving to 1000 meters as part of China's high-level Project 863, begun in 1985. In 1992, the Chinese turned to Russia for help. "The Vladivostok-based Institute of Maritime Technology Problems had some technologies we badly needed, such as the long-baseline acoustic system and the airtight sealing technology," says Xu Fengan, an automatic control scientist and CR-01's deputy general designer. "More importantly, they have valuable operating experience under the water with a variety of vessels," most recently the MT-88, also rated at 6000 meters. That knowledge, Xu says, was blended with Chinese expertise in automatic control technologies, propulsion and energy systems, and vehicle-recovery methods.

    Mikhail Ageev, director of the Vladivostok institute, says he anticipates additional joint projects with China. The institute was also hired to help design and build a similar AUV for South Korea.

    CR-01's first, and so far only, client is the China Ocean Mineral Resources R&D Association. Supervised by seven state departments, the association's main task is to explore and exploit mineral resources in an area of 75,000 square kilometers under the Pacific designated by the Deep-Seabed Authority, an international organization that oversees ocean-floor explorations.

    CR-01 appears ready to play an important role in that exploration, and mineral association officials say they hope to acquire at least one more AUV, paid for by the state commission. The Shenyang Institute is also looking for outside support from other governments and private companies, including those in France, Sweden, Italy, and the United States.


    Putative Martian Microbes Called Microscopy Artifacts

    1. Richard A. Kerr

    Ever since they made their public debut 15 months ago, the squiggly little objects that might, or might not, be microscopic fossils in a meteorite from Mars have been at the center of a heated debate. Most scientists have focused on arcane arguments about mineral isotopic compositions and formation temperatures (Science, 4 April, p. 30), but for many, the proposed microfossils provoked a more visceral reaction: they simply look lifelike. Looks can be deceptive, however, and a group of three meteoriticists now argues that there's nothing lifelike about the martian bugs. Rather, they're simply a trick of the eye abetted by the peculiarities of the powerful microscopes used to image them.

    In a short paper in this week's Nature, John Bradley of MVA Inc. in Norcross, Georgia, Ralph Harvey of Case Western Reserve University in Cleveland, and Harry McSween of the University of Tennessee, Knoxville, present their own nanometer-scale images of martian meteorite ALH84001. They argue that most of the putative microfossils are nothing more than narrow ledges of mineral protruding from the underlying rock that, under certain viewing conditions, can masquerade as fossil bacteria. "There are regions that are absolutely teeming with these emergent [mineral] structures, whose shapes appear to be indistinguishable from some of the proposed nanofossils," says Bradley.

    The originators of the nanofossils-from-Mars idea, led by David McKay of NASA's Johnson Space Center in Houston, defend themselves in a response accompanying the Bradley paper. "You have to be very careful," says team member Kathie Thomas-Keprta of Lockheed Martin in Houston. "We know that the [mineral structures] are there, but that's not what we're calling the martian bugs." The exchange leaves other researchers wondering what kind of evidence might end the debate. "Few if any rocks have been studied as intensively as 84001, [yet] there is room for multiple interpretations," says microscopist Peter Buseck of Arizona State University in Tempe. The problem, says interplanetary dust specialist Donald Brownlee of the University of Washington, Seattle, is that a structure's shape is very weak evidence for past life. "I don't know how you can ever resolve this with microscopy alone," he says.

    To make their original claims last year, the McKay group used a field-emission scanning electron microscope (FE-SEM) to image swarms of possible microfossils as well as The Worm, the lone, segmented structure that seems a dead ringer for a worm. In an FE-SEM, the beam of electrons that scans back and forth across the sample is so narrow that it creates an image with nanometer-scale resolution, far better than a standard SEM can achieve. Bradley and colleagues now use that same advanced technology to arrive at an interpretation of ALH84001's structures as being lifeless. Given the new level of detail seen by an FE-SEM, "a lot of things we see on surfaces are new to us," says Bradley, and they may not be what they appear. On the surfaces of fractures inside ALH84001, he and his colleagues found what they are calling emergent lamellae, crystalline lips or ledges where the natural layering of the minerals pyroxene and carbonate has been accentuated the way individual sheets stand out on the edge of a loose sheaf of papers.

    Viewing those mineral edges under an FE-SEM can give the impression of tiny, elongated bacteria, says Bradley, for two reasons. SEM generally requires that the sample be coated with a thin film of metal so that the SEM's electrons do not charge the sample and fuzz up the view. But coating samples with such metals, including the gold-palladium mixture usually used by the McKay group, can change texture, round off shapes, and even create segmentation like the worm's, says Bradley. In addition, the perceived "wormness" of the layers depends on the viewing angle, he says, so that ledges obviously rooted in the underlying mineral in one perspective can, with a dollop of metal and a twist of the viewing angle, transform into a bug or even a worm. And when these layers appear highly aligned, they are reminiscent of the herds of proposed microfossils in roughly parallel poses presented at the 1996 press conference announcing the McKay group's findings, says Bradley. "What we are reporting is a whole population of elongated forms that bear an uncanny resemblance to the proposed nanofossils. I can't see the difference," he says.

    McKay and company say they can. "You really cannot mistake [mineral layers] for the features that we are calling possible microfossils," says Thomas-Keprta, "because [layers] are so well aligned and they range over a huge area." Possible microfossils, on the other hand, are not so orderly. They can overlap each other or stand in relative isolation; some even have an S shape. What's more, the Houston researchers have tried to account for the subtle effects of metal coatings by imaging a variety of surfaces, both coated and uncoated, and they say that the effects don't produce structures like the microfossils.

    In addition, says Thomas-Keprta, the emergent lamellae are in general too small to be mistaken for microfossils. The McKay group is not abandoning its claim that structures as small as 0.02 micrometer could be microfossils, despite complaints from biologists that no living thing could be so small, but "we are concentrating on some of the larger features in part because they are less controversial," says Thomas-Keprta. And some of the largest microfossils are 0.75 micrometer long, while most of Bradley's layers in his Nature images are 0.1 micrometer and smaller, she says.

    Finally, just because Bradley failed to find any structures resembling microfossils doesn't mean they are not there, says Thomas-Keprta. The McKay group has seen layers in some parts of the meteorite all along, but hadn't discussed them because they weren't sure what they were; they suspect that lamellae are clays starting to weather out of the pyroxene. But to maximize the chances of finding possible microfossils, they look most closely on the rims of the so-called carbonate rosettes, Thomas-Keprta says, and there they find no layers.

    Bradley has a counter-rebuttal for each of these defenses. He says that some lamellae do indeed mimic the putative microfossils, appearing jumbled in lifelike poses, ranging up to a micrometer in length, and even exhibiting a distinct, wormlike S shape. And he and his colleagues say they find layers on both pyroxene and carbonate. In the rims of rosettes, Bradley agrees that there are no lamellae. But there, he argues that the impostors may be nonbiological magnetite whiskers or grains, as he and his colleagues have suggested before. So, in the view of Bradley and his colleagues, although some of the elongated forms of ALH84001 "could conceivably be martian nanofossils," the majority appear to be "either emergent substrate lamellae or magnetite [grains]."

    How can all this be resolved? "Bradley's results show that there are definitely nonbiological processes that can produce these buglike morphologies," says Brownlee. But Bradley can't prove whether the particular structures imaged by the McKay group are microfossils or artifacts. "Who knows?" says Washington's Brownlee. "I'm not nearly as hopeful as when I first saw the McKay paper." If the shapes of structures can't settle the issue, perhaps the McKay group's planned dissection of a microfossil will help. But the claim of life on Mars may have to stand or fall on other evidence. "It may not be possible to prove they are microfossils from Mars," says Brownlee.


    Dust Disks May Point Way to Exoplanets

    1. Govert Schilling

    Utrecht, The Netherlands: "Dust is not always a sign of failure," says Carsten Dominik of Leiden Observatory in the Netherlands. Astronomers have traditionally believed that the formation of planets would leave little or no dust around a star because all the dust would end up in planets; a disk of dust, they thought, was a sign that no planets had formed in that particular system. But Dominik and his colleagues have now found a dust disk around 55 Cancri, a dim, sunlike star in the constellation Cancer that is thought to be accompanied by one or possibly two massive planets (Science, 26 July 1996, p. 429). Apparently, dust disks and planets are not mutually exclusive after all.

    Dusty surroundings. Dust disks like this one around Beta Pictoris can coexist with planets.


    Observing the star using the German ISOPHOT camera on board Europe's Infrared Space Observatory (ISO), Dominik's team found that it is too bright in the infrared. In a report soon to be published in Astronomy & Astrophysics Letters, they conclude that the excess radiation can only be explained by a disk of cool dust particles. More than a decade ago, IRAS, a U.S.-Dutch infrared satellite, observed such disks around other stars, but until now they have never been seen around stars suspected of hosting planets. The new discovery leads Dominik to suggest that a sunlike star surrounded by a dust disk might be a promising place to hunt for exoplanets.

    The team found the bulk of the infrared excess around the wavelength of 60 micrometers, implying that most of the dust particles have temperatures between 40 and 100 kelvins. This puts the dust some 9 billion kilometers from the star, roughly the same distance as from our own sun to the Kuiper belt, a region beyond the orbit of Neptune that contains dust and numerous ice dwarfs: cometlike bodies tens of kilometers across. "This disk is comparable to the Kuiper belt in our own solar system," says Dominik, "although it contains a lot more dust. We wouldn't be able to see the Kuiper belt from afar [with ISO]." Dominik adds that the disk around 55 Cancri is not the protoplanetary disk from which planets are believed to form; the star is much too old to show the remains of this primordial disk.
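    The step from grain temperature to distance follows from radiative equilibrium. A rough sketch, assuming ideal blackbody grains around a star of exactly solar luminosity (both are simplifying assumptions; real dust grains run hotter or cooler than a blackbody, and 55 Cancri is dimmer than the sun):

```python
# A blackbody in our solar system equilibrates at about 278 K at 1 AU,
# with temperature falling as 1/sqrt(distance).
AU_KM = 1.496e8  # kilometers per astronomical unit

def distance_au(temp_k):
    """Distance at which a blackbody grain sits at the given temperature,
    for a star of solar luminosity."""
    return (278.0 / temp_k) ** 2

for temp in (100, 40):
    d = distance_au(temp)
    print(f"{temp} K -> {d:.0f} AU (~{d * AU_KM / 1e9:.1f} billion km)")
```

    The cool, 40-kelvin end of the range lands near 48 AU, about 7 billion kilometers, consistent with the quoted distance of some 9 billion kilometers and with the Kuiper belt comparison.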

    Supporting the Kuiper belt analogy is the second putative planet around 55 Cancri, which orbits at a large distance from the star. In our own solar system, the inner edge of the Kuiper belt is swept out by the gravity of Neptune, the outermost massive planet. In the same way, the second planet of 55 Cancri might define the inner edge of the dust disk, says Dominik. Just how the disk persists is something of a puzzle for astronomers, however. Microscopic dust particles should spiral down into the central star within 20 million years, so Dominik's team believes the disk must be continuously replenished in some way, presumably by erosion of larger objects, such as a large number of ice dwarfs similar to those in the Kuiper belt.

    No one yet knows whether the ISO discovery implies that other dusty stars are promising places to look for exoplanets. According to theoretician Alan Boss of the Carnegie Institution of Washington, "we don't know for certain if we should expect to find a dust disk around stars with planets or if [dust] disks generally imply the absence of planets." Dominik admits that the ISO observations of 55 Cancri do not decide this question either way, as other stars with planets have no observable dust disks, while many dusty stars do not show evidence of planetary companions.

    "It would be nice to know if the orbital plane of the planet coincides with that of the dust disk," says Boss. "If so, then there would be a good argument that the planet and dust disk both owe their origin to a common protoplanetary disk." Dominik thinks the planets and disk must lie in the same plane, because they would be hard to explain unless they have a common origin.


    Seeking a Source of Potent Cosmic Rays

    1. David Ehrenstein

    College Park, Maryland: Every so often, Earth's outer atmosphere is blasted by subatomic particles packing so much energy that they defy explanation. These so-called ultrahigh-energy cosmic rays (UHECRs) pose a conundrum: No known source in our cosmic neighborhood has enough power to generate them, yet the particles must come from close by, because if they traveled far, they would lose energy to the ubiquitous microwave background radiation. And their mystery is heightened by their rarity. Ground-based detectors built to monitor a wide spectrum of cosmic rays, which constantly rain down on Earth, have spotted only a handful of these superenergetic particles. "We have just enough to know they exist, and that's the tantalizing part," says physicist James Cronin of the University of Chicago.

    With so little to go on, researchers have few clues to the composition and potential sources of UHECRs. But physicists are planning ways to collect a lot more information on them over the next 2 decades. At a NASA-organized conference here at the University of Maryland last month, researchers backed a proposal to fly twin satellites, called the Orbiting Wide-Angle Light collector (OWL), to keep watch for the flashes of light generated when energetic particles, including UHECRs, slam into the atmosphere, creating showers of secondary particles.

    Astrophysicists have been scratching their heads for decades over UHECRs, defined as particles with 10^20 electron volts (eV) or more of energy, 100 million times the energy of any particle created on Earth. In 1966, 3 years after the first 10^20 eV cosmic ray was detected, physicists pointed out that a particle with that much energy would travel no more than about 20 million light-years, on average, before being transformed into lower energy particles by interactions with the newly discovered cosmic background radiation, the leftover glow from the big bang. The energy limit, about 5 × 10^19 eV, has dogged astrophysicists ever since, as they have tried to explain observations of particles with higher energies.
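    For a sense of scale, converting 10^20 eV to everyday units (using the standard conversion 1 eV = 1.602 × 10^-19 J) shows why these particles defy explanation:

```python
# Energy of a single ultrahigh-energy cosmic ray in macroscopic units.
E_eV = 1e20               # defining energy of a UHECR, from the article
E_J = E_eV * 1.602e-19    # standard eV-to-joule conversion

print(f"{E_J:.0f} J")     # prints "16 J": roughly the kinetic energy
                          # of a thrown ball, carried by one particle
```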

    At the NASA meeting, scientists discussed several leading theories about these enigmatic particles. It's possible, some said, that UHECRs might be accelerated to tremendous energies by the supermassive black holes thought to lie at the centers of some galaxies, or by powerful gamma ray bursters that might signify gigantic explosions of coalescing neutron stars. But none of these acceleration mechanisms has been seen close enough to the Milky Way to account for UHECRs.

    Another idea, proposed by University of Chicago physicist David Schramm and his colleagues, is that topological defects formed shortly after the big bang, trapping huge amounts of energy in hot spots. Schramm suggests that these defects decayed into particles with much more energy than UHECRs, but interactions with the cosmic background radiation cooled them before they reached Earth. "Every explanation you come to leads to something that's very exotic and very exciting," says Schramm. "You know you're onto something interesting when the dullest [explanation] that's proposed is involving black holes."

    Researchers are looking to the proposed new detectors to help them sort out these mysteries. Jonathan Ormes of the NASA Goddard Space Flight Center in Greenbelt, Maryland, the principal investigator for the proposed OWL mission, organized the meeting to build support for the project, which he hopes can be launched in 2010. Each of the two OWL satellites would contain about 10 square meters of photodetectors for observing the tracks of ultraviolet fluorescence generated by cosmic rays streaming through the atmosphere. They would provide a stereoscopic view of about 1 million square km of the atmosphere at a time and observe perhaps 500 to 1000 UHECR showers per year, according to Ormes. This will be a huge improvement over current ground-based facilities, which all combined observe roughly one UHECR shower per year. The data should help researchers determine the composition of UHECRs and pinpoint the directions from which they arrive.

    But ground-based observation is also expected to advance dramatically with the multinational Pierre Auger project (Science, 1 September 1995, p. 1221), which is scheduled to begin construction in 1999. If all the money can be raised, the Auger collaboration, named for the discoverer of cosmic ray air showers, will cover 3000 square km with detectors at sites in Utah and Argentina. The detectors should be able to spot 50 to 100 UHECR showers per year.

    At about the same time as Auger starts catching rays, a second international collaboration based in Italy hopes to fly a more modest space-based detector known as the Airwatch From Space. John Linsley of the University of New Mexico, Albuquerque, who detected the first UHECR and also has worked on the OWL project, describes Airwatch as less ambitious, just a first step in a series of planned satellites. Livio Scarsi, a physicist at the University of Palermo in Italy and the spokesperson for the team, says the project has already passed initial reviews for Italian Space Agency funding.

    OWL's backers strongly support Auger and other, smaller ground arrays as steppingstones that should provide important data and motivation for OWL. "We have to do everything possible on the ground first," Ormes says. And if these steppingstones lead to an answer, physicists would be delighted. Says Cronin: "The prospect of really fundamental advances in physics or astrophysics is almost certain."


    Photonic Crystal Made to Work at an Optical Wavelength

    1. Gary Taubes

    One of the great dreams of the telecommunications industry has been to use light, rather than electrons, to carry information. The reason: The speed of information communicated between and within computers would increase almost unimaginably. The industry has already taken the first step toward that goal in the form of fiber-optic cables that transmit information (telephone calls, for instance, or the frenetic buzz of the Internet) from one location to another. But a major bottleneck remains. Much of the work of getting information into and out of the fiber is still done with traditional electronics. Recent developments, however, could soon change that.

    To bring about this photonic revolution, researchers have to devise a key component: materials in which photons of light will behave the same way electrons do in semiconductors. The problem is that the features of these materials, known as photonic crystals, have to be several orders of magnitude smaller than those found in today's integrated circuits, and no one had been able to build such devices. Until now, that is. Aided by x-ray lithography, a team of physicists and electrical engineers at the Massachusetts Institute of Technology (MIT) has created the first photonic crystal that works at an optical wavelength. The results were published in the 13 November issue of Nature.

    The team, led by MIT physicist John Joannopoulos, accomplished this feat by using the lithography technique to drill a series of tiny holes in a silicon strip. By spacing the holes at critical distances, they were able to produce a microcavity half a micrometer across, just the size to trap light of the infrared wavelength used by the telecommunications industry in fiber optics. "For the first time, the opportunity exists to make structures that are significantly smaller than the wavelength of light," says physicist Axel Scherer of the California Institute of Technology in Pasadena.

    That means, he says, that light can now be made to jump through the same kind of hoops as electrons do in integrated circuits. It can be forced, for example, to move through a crystal, even around bends, or only under certain conditions. And with that capability comes the possibility that researchers will be able to build lasers and light-emitting diodes (LEDs) small enough to pack onto integrated circuits, where they can be used to transmit and manipulate information.

    For a decade, researchers have been trying to build photonic crystals that will do for light what semiconductors do for electrons. The beauty of a semiconductor is that the flow of electricity can be controlled because the semiconductor's crystalline structure forbids the passage of electrons in a well-defined energy range, known as the band gap. But by doping the semiconductor with another materialone that adds electrons, for instanceit can be made to conduct electricity as desired. It is this property that makes solid state transistorsand today's computerspossible.

    To build a material with a band gap for light requires creating a photonic crystal, which has a unique periodic structure that will reflect and refract light of specific wavelengths. In the late 1980s, Eli Yablonovitch, a physicist at the University of California, Los Angeles, suggested that such a crystal could be made by creating regular arrays of materials with different dielectric properties and thus different abilities to bend light at their boundaries. "You make a three-dimensional dielectric structure that resembles a crystal," Yablonovitch says. As light waves try to propagate through the structure, they reflect off the interfaces between the different dielectrics and become out of phase to the point where they cancel completely. The result, depending on how the periodic interfaces are designed, can be either no back reflection at all, or total back reflection. What Yablonovitch was after, and what he eventually got, was a perfect three-dimensional mirror: a material that reflected light waves back no matter what direction they were moving.

    In Yablonovitch's early work, he took a simple approach to producing the periodicity. Guided by a theoretical model produced by a group led by physicist Costas Soukoulis at Iowa State University in Ames, he drilled a diamondlike pattern of holes into a piece of silicon. Because the holes were separated by centimeters, they interfered with light at centimeter wavelengths (microwaves) in such a way that light at these wavelengths bounced off the crystal, while light at other wavelengths passed through it.

    But creating a band gap for light is only the first step. Like semiconductors, photonic crystals don't come alive as a useful technology until impurities or defects are added that can allow light of a particular wavelength to pass. If the periodicity of the crystal is broken by removing one of the air holes, for instance, light at that point no longer sees a perfect crystal in all directions, and a single forbidden frequency of light can exist in the neighborhood of the defect. The result is a microcavity that acts as a filter: It blocks a range of frequencies, but traps one frequency in that range temporarily and then passes that frequency forward out the other end.

    This ability to trap light means, Joannopoulos says, that "you can do neat new things with light. You can confine it to very, very small dimensions and make it go wherever you want it to go." If the defect is pointlike, he explains, light trying to move away from the defect will see a perfect crystal, which pushes it back into the defect. And if the defect is a line, the photon will be forced to follow that line through the crystal. Even if it makes a perfect 90-degree turn, Joannopoulos says, "the light has to make that turn."

    Since Yablonovitch's early work, researchers have perfected the models needed to predict the properties of photonic crystals with different kinds of defects. The catch has been that to interfere with light in any way, a device has to have features that are roughly half the wavelength of the light of interest (1.5 micrometers in the case of the infrared light used for telecommunications). Accurately creating features on that small a scale requires either x-ray or electron beam lithography, technologies that aren't widely available at university research labs. Joannopoulos and his MIT colleagues, however, were fortunate enough to have a setup at team member Henry Smith's lab that they could use.

    Still, it took the MIT researchers 3 years to fabricate and test the design they worked out for trapping infrared light. The design is what Joannopoulos calls a hybrid. Part of it is traditional technology, in particular the silicon waveguide down which light will travel in a straight line, and part is entirely new: a series of periodic holes that create the photonic crystal and the necessary band gap.

    Using x-ray lithography, Jim Foresi, who was then a graduate student at MIT, drilled a series of four air holes, spaced 0.22 micrometer apart, into the waveguide, followed by a gap of 0.42 micrometer, and then another series of four holes. The regularly spaced holes provide periodicity in one dimension along the waveguide, while the gap in the middle is the defect that creates the microcavity, allowing only light at a wavelength of 1.5 micrometers to pass. "You send in a pulse of light with a whole bunch of frequencies," says Joannopoulos. "When it hits that configuration of four plus four plus the defect, it can only go through if it has the frequency of that defect state." The MIT team then tested the device by sending in a pulse from a semiconductor laser and analyzing the light that made it through to the other end. The result, says team member Henry Ippen, was surprisingly "right on the money."
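    The 0.22-micrometer hole spacing is consistent with half-wavelength (Bragg) reflection at 1.5 micrometers once silicon's high refractive index is taken into account. A sketch of that consistency check, assuming a bulk silicon index of about 3.5 (the patterned waveguide actually has a somewhat lower effective index, so this is an order-of-magnitude check, not the team's design calculation):

```python
# Why 0.22-micrometer holes interact with 1.5-micrometer light: inside
# silicon the wavelength shrinks by the refractive index, and strong
# reflection occurs for features spaced at half that shortened wavelength.
wavelength_um = 1.5   # telecom infrared wavelength in vacuum
n_silicon = 3.5       # approximate refractive index of silicon (assumed)

lambda_in_si = wavelength_um / n_silicon  # ~0.43 um inside the silicon
bragg_spacing_um = lambda_in_si / 2       # half-wave spacing for reflection

print(f"expected spacing ~{bragg_spacing_um:.2f} um")  # vs 0.22 um used
```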

    To take the next step in the photonic revolution, however, researchers must solve another problem: creating microcavities that actively emit light, rather than just allowing a single wavelength to pass through. Although virtually everyone in the field is trying to accomplish this, they are all reluctant to talk about their progress until they have a publication in the works. The idea, though, is to put a classical semiconductor laser or an LED in the middle of the microcavity, which requires making considerably more complex structures on the submicrometer distance scales of the MIT device. These would be designed to emit a wavelength of light identical to the wavelength the microcavity allows to exist. If such devices can be achieved, given the degree of miniaturization required, they would be perfect for generating the light that would be used to transmit information on integrated circuits that combined both electronics and photonics.

    Because only a single wavelength of light would be allowed to exist within the cavity, the efficiency of these optoelectronic devices would be much higher than that of today's strictly electronic devices, and they would need considerably less energy to run. Indeed, researchers predict they may someday make LEDs that convert energy into light moving in a single direction with an efficiency of 90%, compared to the 30% of present devices.

    A microcavity laser would be what is known as a "zero-threshold" laser. In traditional lasers, it takes time and energy to reach the point where the light is coherent and the laser is lasing. This would not be the case with a microcavity laser. "The very first photon that gets emitted goes straight into your laser beam," says MIT team member Pierre Villeneuve. "With all other lasers, until that threshold is reached, you're wasting energy."

    Considering that LEDs are already ubiquitous, and that someday photonic devices may be packed as densely on chips as present semiconductor devices, it becomes obvious why such savings in energy and heat loss make photonic researchers believe they are onto something big. But they have some challenges to overcome first. One is to devise ways of building photonic microcavities with technologies that are easier to come by than x-ray lithography. Villeneuve predicts, however, that this should be possible within a few years. If so, it should launch the photonic revolution out of the realm of fantasy.

    Until then, however, Joannopoulos has one caveat. "We're still theorists," he says, "and so we have to learn how to play the game in the real world. It's fine to design something, but then someone has to make it, mass-produce it, and integrate it with what exists already. These are all difficult issues."


    Herpesvirus Linked to Multiple Sclerosis

    1. Phil Berardelli
    1. Phil Berardelli is a free-lance science writer based in northern Virginia.

    A new study has yielded evidence linking a strain of herpesvirus to multiple sclerosis (MS). More than 70% of the patients in the study with the most common form of MS showed signs of active infection with human herpesvirus-6 (HHV-6). The finding, reported in the December issue of Nature Medicine, is not yet conclusive proof, however, and some researchers question whether the apparent association is a symptom rather than a possible cause of MS.

    In multiple sclerosis, immune cells attack and inflame the myelin, the fatlike sheaths surrounding neurons in the central nervous system. Symptoms can vary widely, but MS is generally characterized by muscle weakness and neurological impairments, and most patients see their condition wax and wane, with new symptoms appearing or old ones becoming more severe, alternating with periods of remission. Eventually, however, it can lead to disability and paralysis. HHV-6, which infects young children, causing a condition called roseola marked by high fever and rashes, also inflames myelin. It is present in about 90% of the U.S. population and can become reactivated when the immune system is under stress from factors such as secondary infections or emotional strain. Virologist Steven Jacobson and his colleagues at the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland, wondered whether there might be a link between HHV-6 infection and MS.

    The researchers looked for signatures of HHV-6 in the serum of 36 MS patients and 66 controls in a blind test. As expected, nearly all had long-term antibodies, known as IgGs, that react with HHV-6 antigens. But 73% of the MS patients also had IgM, an early antibody response to HHV-6 antigens and a potential marker of active virus replication. Only 18% of the control group showed IgMs directed against HHV-6. DNA from the virus was also found in more than one-third of the MS patients, but in none of the controls. Moreover, magnetic resonance imaging detected numerous lesions in the myelin in the brain of a recently deceased MS patient, and an autopsy revealed HHV-6 in the lesions, but not in the adjoining normal tissues.

    Jacobson, who has been looking for a viral cause of MS for more than 20 years, says, "Now, we think we're a little closer." But some other experts are not yet convinced. Patricia O'Looney, a biochemist with the National Multiple Sclerosis Society in New York City, points out that not all the MS patients showed indications of active HHV-6 infection. She also notes that MS breaks down the blood-brain barrier, which may allow the virus to migrate into the central nervous system. In that case, HHV-6 infection could simply be a symptom of MS.

    Jacobson cautions that even if an association between HHV-6 and MS is strengthened with further studies, it will not lead directly to a treatment for MS because nothing is yet available that can kill the virus without aggravating the breakdown of the myelin sheath. But it could open up new avenues for research.
