News this Week

Science  07 Dec 2001:
Vol. 294, Issue 5549, pp. 2066


  1. 2003 U.S. BUDGET

    NSF to Gain Funds From Smithsonian, Other Agencies?

    By Elizabeth Pennisi, with reporting by Jeffrey Mervis.

    Funding for three Smithsonian research centers would be transferred next year to the National Science Foundation (NSF) under a White House budget strategy aimed at rewarding agencies for their management prowess. Science has learned that the move is part of a proposed shift of roughly $120 million from several agencies to NSF. Parts of the $30 million water resources program at the U.S. Geological Survey and the $60 million university-based Sea Grant program run by the National Oceanic and Atmospheric Administration would also be moved.

    Last week, the White House Office of Management and Budget (OMB) wrote Smithsonian secretary Lawrence Small that it intended to take $35 million away from his agency's 2003 budget and give it to NSF. The proposal, still under wraps by the Bush Administration, came as a surprise to Smithsonian officials wrestling with their own controversial plan to restructure science at the institution's 16 museums, National Zoo, and half-dozen research centers. Scientists at the affected institutes—the Harvard-Smithsonian Center for Astrophysics, the Smithsonian Environmental Research Center (SERC), and the Smithsonian Tropical Research Institute—would be free to compete for funding under NSF's regular programs.

    NSF director Rita Colwell declined to comment on the proposed transfer, and an OMB official said the agency doesn't comment on ongoing negotiations leading to the president's 2003 budget proposal to Congress in February. But last week OMB director Mitch Daniels foreshadowed the move during a Washington, D.C., speech in which he singled out NSF for praise and warned other agencies to shape up or suffer the consequences. “Programs [like NSF's] that perform well, that are accountable to you as taxpayers for reaching real results, and measuring and attaining those results, deserve to be singled out, fortified, and strengthened,” Daniels said. “Conversely, programs that make no such attempt or fail to deliver really need to be scrutinized and the money we are now investing in them redeployed to higher purposes.”

    Storming the castle.

    OMB director Mitch Daniels wants to transfer money from three Smithsonian research centers.


    The OMB directive, in response to the Smithsonian's 2003 budget submission, came as a shock to Smithsonian leaders and the research community. “To say that I was taken aback is an understatement,” says Jeremy Sabloff, director of the University of Pennsylvania's Museum of Archaeology and Anthropology and head of a commission evaluating the future of Smithsonian science.

    Research at the Smithsonian has been squeezed for the past 20 years as the institution has struggled with ever-expanding needs for renovations and new construction. The situation came to a head last spring when Small proposed closing two research centers and rearranging scientific research throughout the institution (Science, 13 July, p. 194). Although Congress stepped in to protect those research centers, Sabloff and more than a dozen other experts were tapped to advise the Smithsonian on what it should do.

    The OMB plan would force scientists at the three centers to compete with academic researchers for the majority of their funding, although it provides for a 1-year transition in 2003. The research centers already support some of their work through outside grants: SERC, for example, has $18 million in peer-reviewed grants and contracts, including about $1 million from NSF, says SERC director Ross Simons. But ending appropriated federal support “would be disastrous,” says one Smithsonian scientist. Adds Sabloff, “It would be very unfortunate if [the proposed transfer] came to be.”

    The Smithsonian has asked OMB to reconsider its proposal in the next round of budget negotiations. Meanwhile, OMB has requested that Smithsonian and NSF leaders map out a plan by mid-January to implement these changes.


    Caltech Aims for Big Jump in Women Faculty

    By Andrew Lawler

    The California Institute of Technology (Caltech) hopes to more than double the number of women faculty members over the next decade to help rectify a glaring gender disparity at the elite science- and technology-oriented school. The goal would mean adding a net of four women faculty members a year, as women currently make up a mere 31 of Caltech's 284 faculty members. The target is included in a new report that examined the status of women faculty members at the university.

    “Female faculty are markedly more dissatisfied than their male peers” with life at Caltech, says the report, which was commissioned 2 years ago in the wake of a similar report by the Massachusetts Institute of Technology (MIT) (Science, 12 November 1999, p. 1272). Although the Caltech committee found no conclusive evidence that women suffer in terms of salary or space, panel members say that the paucity of women made it difficult to carry out a meaningful statistical analysis or provide the necessary anonymity. The panel was chaired by astronomer Anneila Sargent.

    Women wanted.

    Sargent panel sets high target.


    The most sweeping recommendation in the report is to increase the proportion of female faculty members from the current 11% to 25% within a decade. “The worst thing about Caltech has been the low numbers [of women],” says faculty chair Marianne Bronner-Fraser, a biologist who was chosen this summer as the first woman to hold that elected position. The share of women in each Caltech division (department) outside the humanities ranges from 18% in biology down to 5% in engineering.

    Caltech president David Baltimore cautiously embraced the target, which he estimates will require some 40% of new hires over the next decade to be women. “It's not an unattainable goal, but it will be very difficult to achieve,” he says. With faculty growth unlikely, he says that the shift will have to come primarily through replacements. The committee also called for a fund-raising campaign to bolster the number of women faculty members and to attract more women students.
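    The arithmetic behind these targets can be checked in a few lines. The sketch below uses only figures from the article (31 of 284 faculty, a 25% goal in a decade); the turnover figure of roughly 100 openings per decade is a hypothetical assumption introduced for illustration, not a number from the report.

    ```python
    # Back-of-envelope check of the Caltech hiring targets reported above.
    # Assumption (not from the article): total faculty stays fixed at 284.

    current_women = 31
    total_faculty = 284
    target_share = 0.25

    current_share = current_women / total_faculty       # ~11%, as reported
    target_women = round(target_share * total_faculty)  # 71 women needed for 25%
    net_gain = target_women - current_women             # 40 more over the decade
    per_year = net_gain / 10                            # the "net of four a year"

    # Hypothetical turnover: if ~100 positions open up in the decade and few
    # incumbent women depart, filling about 40% of openings with women yields
    # the needed gain, consistent with Baltimore's estimate.
    openings = 100
    women_hire_share = net_gain / openings              # 0.40

    print(round(100 * current_share), target_women, net_gain, per_year, women_hire_share)
    ```

    Under these assumptions the numbers in the story are mutually consistent: an 11% starting share, a net gain of four women a year, and roughly 40% of new hires.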

    The survey, which included all women and a sample of men, found that more than half the women say they have encountered gender bias, and 30% recalled “adverse interactions” with their chairs over gender issues. Women are three times as likely as men to be dissatisfied with their visibility at Caltech, and less than half expressed satisfaction with their jobs, compared with 73% of men. Tenure decisions are another sticking point: “As many as 70% of women who have successfully attained tenure have at least reservations about the process,” the report notes, compared with just 19% of men.

    The small numbers of women made it hard to determine the reason for a disparity in salaries between men and women, says Sargent. It also forced the committee to abandon attempts to investigate differences in lab space—a key metric in the MIT report.

    The report also urges Caltech to hire women as senior administrators, and Baltimore says he is committed to making changes in the male-dominated upper tier as positions come open. None of the six current division chairs is a woman, although last month biologist Barbara Wold—who also served on the Sargent panel—was named director of the Beckman Institute, a biology and chemistry research facility on campus.


    Men Still Have Edge in U.S. Science Careers

    By Jeffrey Mervis

    Having children improves a man's chances of becoming a full professor but hinders a woman's progress in academia. That's one of many provocative findings from a National Research Council (NRC) panel that has been exploring gender differences in the careers of U.S. scientists and engineers.

    Issued last month, the panel's 340-page report* eschews the usual analysis of existing studies with policy recommendations. Instead, the panel did its own research on gender differences in the scientific workforce, mining four versions of two ongoing federal surveys. Its conclusion—that men retain an edge that cannot be explained by any objective criteria—may be disturbing to those who think that discrimination is a thing of the past.

    The family effect.

    Married women with children are less likely to be full professors than those without. The opposite is now true for men.


    “There's clear evidence that women have been treated unfairly,” says panel chair J. Scott Long, a sociologist at Indiana University, Bloomington, and a scholar in the field of women's studies. “It's also clear that marriage and family issues are major factors that need to be addressed.” Although the five-member panel was not asked to make recommendations, its report suggests that employers consider policies to help “promising employees with young families.” It also calls on top research universities to revise graduate school admissions practices to attract and retain more women. “I think every university should do the type of review” carried out by the Massachusetts Institute of Technology and Caltech (see previous story), says Long, “to see if there are current policies that are discriminatory or past practices that need to be addressed.”

    Among the panel's findings:

    • Tenure is becoming more elusive for women than for men. Comparing data in the 1995 and 1999 surveys, the panel discovered that the share of academics in tenure-track positions dropped from 70% to 55% for women and from 82% to 72% for men.

    • Male graduate students are more likely than women to get jobs as research assistants; the difference ranges up to 9% in mathematics, although the gap is narrowing for all disciplines except those in the physical sciences.

    • The salary gender gap is widening among more senior academics. Tenure-track men who earned their Ph.D.s in 1979 earned 10% more than women from that class, compared with a 6% difference for those with degrees from 1975.

    “Women certainly represent a growing percentage of the scientific workforce,” Long notes—from 7% in 1973 to 22% in 1999. “But they're finding a tougher job market, especially in academia.”


    Government Shoots Down GM Plant Trials

    By Giselle Weiss, a writer in Allschwil.

    ALLSCHWIL, SWITZERLAND— In a blow to Swiss biotechnology, the government has rejected a high-profile application to conduct field trials of genetically modified (GM) wheat. The decision, now being appealed, has caused widespread consternation among Swiss scientists, who argue that it amounts to a de facto moratorium on field tests of any transgenic plant. Five members of the federal biosafety commission have resigned in protest, including its president, Riccardo Wittek. “If I were working in plants,” Wittek says, “I would leave the country.”

    In November 2000, Christof Sautter of the Institute for Plant Sciences at the Swiss Federal Institute of Technology (ETH) in Zürich sought permission to sow, on a small outdoor plot, wheat seeds engineered to resist the stinking smut fungus. Smuts and bunts—a related pest—devastated European wheat in the 18th century and continue to plague crops in many developing countries. The diseases are hard to detect and are spread mainly through planting infected seeds.

    Sautter modified two Swiss spring wheat lines to express a viral gene, KP4, that encodes a protein that inhibits fungal growth. In greenhouse experiments, the transgenic plants proved 30% less susceptible than controls to infection with stinking smut. In 1998, Sautter was ready to take the next step: petition the Swiss Agency for the Environment, Forests, and Landscape (BUWAL) to grow the transgenic plants on a plot “twice the size of a double bed,” he says.

    But Sautter hesitated, worried about the outcome of a national referendum that would ban transgenic research (Science, 12 June 1998, p. 1685). The referendum was defeated, but the climate remained uncertain as parliament launched a debate—which is still going on—about how to legislate gene technology. According to Wilhelm Gruissem, director of ETH's plant biotechnology laboratory, BUWAL representatives requested an “informal” meeting at the Bern train station in December 2000 to discourage him and Sautter from submitting their field trial petition. BUWAL by then had already rejected two applications from other teams and appeared to be tipping its hand to the ETH duo: Gruissem claims they were told that their experiment would be “politically inopportune.” BUWAL spokesperson Andreas Stuber confirms that the meeting took place but insists that its purpose was constructive.

    Waiting for Godot?

    Christof Sautter displays his dormant 8-square-meter plot with safety measures, including a tent to prevent pollen from escaping.

    Sautter and Gruissem went ahead with their application on 19 January, after which BUWAL requested additional greenhouse tests. They got a boost on 5 September when the biosafety commission ruled that the experiment posed no “appreciable” risk to people or the environment. But at a press conference on 20 November, Philippe Roch, director of BUWAL and former head of the Swiss World Wildlife Fund, announced that the department had rejected the application. Roch argued that it was impossible to assess the experiment's risks because too little is known about the KP4 protein and because the transgenic wheat contains a foreign antibiotic resistance gene. Although this gene is dormant and not known to pose a risk, Swiss legislators are moving to outlaw trials of plants that contain it anyway.

    Gruissem rejects BUWAL's rationale, arguing that the field trial would have been “the perfect risk-assessment experiment.” The proposal included such restrictive safety measures—wire mesh to keep out field mice, for example, and a tent cover to prevent pollen from escaping—that members of the biosafety commission, Wittek recalls, joked whether it could still be called an open field trial.

    ETH announced on 29 November that it will appeal the ruling to the Department of Environment, Transport, Energy, and Communications. In the meantime, Sautter's continued funding from the Swiss National Science Foundation stipulates that he must obtain approval by February for field trials of his wheat. Failing that, he says, he could pack up and go to the United States, although he says he would prefer to remain in Switzerland to argue the case for GM field trials.

    Beat Keller, a plant biologist at the University of Zürich who coordinates the Swiss National Science Foundation program on wheat, sees the decision as a culmination of nonscientific approaches to the regulation of GM plants. “It is so obviously wrong,” he says. And it is not likely to be righted anytime soon: Wittek says there are no other field-trial applications pending or in sight.


    Pathogen Researchers Get Help From TIGR

    By David Malakoff

    Immunologist Pam Baker is getting the backup she needs. As a professor at Bates College, a small undergraduate institution in Lewiston, Maine, Baker doesn't have easy access to the advanced gene research tools that could help her understand how the bacterium Porphyromonas gingivalis helps spark gum disease. So she was pleased when The Institute for Genomic Research (TIGR) in Rockville, Maryland, recently offered to provide her with the specialized glass microarrays that can document how P. gingivalis's genes and proteins behave during infection. “We've got just basic equipment, [so to be able] to use microarrays is a big step up,” she says.

    Candidate list.

    A new center will pick three organisms to start from a pool of pathogens for focused work, including microarray preparation.


    Many other disease researchers soon will be joining Baker in benefiting from TIGR's expertise. Last week the institute announced that it has signed a 5-year, $25 million contract with the National Institute of Allergy and Infectious Diseases (NIAID) to help scientists expose the inner workings of at least 10 human pathogens whose genomes have been sequenced. The new Pathogen Functional Genomics Resource Center will exploit the growing sequence archive by “making some essential tools more easily available to microbial researchers,” says TIGR's Robert Fleischmann, one of the center's leaders.

    Scientists have sequenced the genomes of more than two dozen pathogens over the last 7 years, including killers such as cholera and syphilis, with more pending. But putting all that information to use in understanding infections or developing drugs is difficult. It takes expertise and money to make the specialized reagents, gene clones, and microarrays—chemically treated glass slides or silicon wafers that can detect the activity of hundreds of genes at a time—that researchers need. To avoid funding duplicate requests, NIAID officials 2 years ago began looking at ways to centralize some toolmaking and training activities, and last year they announced a competition to select a host for the new center.

    As the winner, TIGR is moving quickly to outfit labs and recruit a staff of 25 and a 10-member advisory committee; it hopes to have the center humming by spring. A first task will be to select three target pathogens from a short list of hot candidates (see table), with at least another seven coming by 2004. Then TIGR can begin making and distributing materials, processing samples, and analyzing data for needy labs.


    Aspergillus fumigatus; Bacillus anthracis; Borrelia burgdorferi; Yersinia pestis; Burkholderia mallei; Chlamydia pneumoniae; Entamoeba histolytica; Enterococcus faecalis; Group B Streptococcus; Mycobacterium smegmatis; Mycobacterium tuberculosis; Neisseria meningitidis; Plasmodium falciparum; Pseudomonas aeruginosa; Rickettsia prowazekii, R. conorii, R. typhi; Salmonella typhimurium; Staphylococcus aureus; Streptococcus pneumoniae; Vibrio cholerae

    Some offerings, however, will be rationed because of their high cost. Microarrays, for instance, may initially be available to just 10 selected labs per pathogen, says Fleischmann, with each lab getting about 150 of the glass slides. The center also has to work out data-sharing and patenting policies. In both cases, Fleischmann says the intent is to share information and materials as widely as possible, particularly with labs at smaller institutions such as Bates.

    Eventually, organizers hope TIGR will help the research community solve two long-standing problems: training talent and establishing workable, accepted standards for various lab techniques and data-storage methods. Says Fleischmann: “We want to be more than a factory for pumping out reagents.”


    Researchers Say Rules Are Too Restrictive

    By Jocelyn Kaiser

    A coalition of biomedical societies and research universities is mounting a major assault on a new rule covering the privacy of health records, arguing that the regulation will stifle research. However, patient rights groups say that scientists are overreacting to needed reforms.

    The Privacy Rule sets out new procedures for handling patient records, including a requirement that certain information be stripped from records that researchers can use without prior permission. It also gives patients the right to see their records and to find out if they have been made available to a public health or law enforcement agency, or for research.

    The 32-page rule was hammered out by the Department of Health and Human Services (HHS) in response to a 1996 health insurance law; it was published in final form in December 2000 as one of the Clinton Administration's final acts and goes into effect in April 2003. But the Bush Administration decided to review portions of the rule after concerns poured into HHS over how it will work. Health care organizations and researchers have taken advantage of that opportunity to make their case.

    The new rule “will seriously impair our ability to conduct clinical trials” as well as pathological, epidemiological, and genetic studies, says a 20 November letter to HHS Secretary Tommy Thompson, signed by more than 60 professional societies and 110 universities. David Korn, senior vice president for biomedical and health sciences research at the Association of American Medical Colleges (AAMC), which has spearheaded the campaign, says that changes the biomedical groups have proposed would not weaken patient privacy.

    Eyes only.

    Access to patient data sparks renewed debate.


    The letter urges HHS to pare down the amount of data cleansed from patient records before they are made available to researchers. Removing even data such as zip codes and birth dates, says Korn, makes the data “useless” for research that requires such “identifiers.” Researchers who want to use identified data without a patient's permission can apply for a waiver from an ethics board. But the rule lays out fuzzy review criteria, such as weighing whether a privacy risk is “reasonable.” The research community urges HHS instead to leave the decision in the hands of the ethics review panel that assesses the original study. Other recommendations include easing restrictions on access to existing archives.

    The researchers have found receptive ears for some of this: In a 21 November letter to Thompson, an advisory committee to HHS that has been tracking the rule says it “detected a high level of anxiety” from researchers in recent public hearings and recommends reconsidering a few sections, including the rules for stripping identifiers. But other complaints result from a “misunderstanding” of the rule, says panel member Mark Rothstein, a bioethicist at the University of Louisville School of Medicine in Kentucky. Angela Choy of the consumer-oriented Health Privacy Project at Georgetown University in Washington, D.C., says the rule offers researchers “lots of ways to get” the information they need.

    Biomedical groups worry that some hospitals may decline to share any data to avoid the cost of compliance and to steer clear of criminal penalties for any violations. Some hospitals in Minnesota, which passed a law 5 years ago that imposes strict rules for releasing records, have banned external researchers—those not on hospital staff—from using their databases, Korn says. Having more health care systems put their databases off-limits, the AAMC letter warns, could “paralyze vital public health research.”


    Pot-Bellied Mice Point to Obesity Enzyme

    By Trisha Gura, a science writer in Cleveland, Ohio.

    Words linking fruit and the human anatomy have long sweetened sonnets and love letters. But lately the term “apple-shaped” has gained renown on the pages of medical texts. People who carry excess fat around their waists—the so-called apple-shaped body type—are more prone to obesity-related maladies than their equally overweight but pear-shaped counterparts, who pack weight around their hips. Physicians have observed the connection for decades, but no one could explain it, let alone search for a therapy to right the scales.

    Now on page 2166, researchers at Beth Israel Deaconess Medical Center in Boston suggest a reason for the disease-body type relationship, and a possible new target for treatment. The culprit is an obscure enzyme that works to recycle a steroid stress hormone called cortisol. Through delicate genetic engineering, endocrinologist Jeffrey Flier and his colleagues overexpressed the gene for this enzyme solely in the fat of mice. These rodents look and act a lot like overweight apple-shaped people: They eat more than normal mice and gain fat disproportionately around their middles. As adulthood sets in, the animals develop the early biochemical symptoms of heart disease and diabetes. Blocking the enzyme in people, the researchers suggest, might thwart obesity-related illnesses.

    Belt loosener.

    Activating an enzyme in fat gives mice a syndrome seen in apple-shaped people.


    “This was really the first proof that manipulating steroid conversion in fat alone is enough to lead to all these abnormalities,” says endocrinologist Stephen O'Rahilly of Addenbrooke's Hospital in Cambridge, U.K., who studies the genetics of obesity and diabetes. “I wish I'd done the experiment myself.”

    Inspiration for the study came indirectly from a rare illness called Cushing syndrome. Its sufferers have too much cortisol coursing through their bloodstreams and become diabetic and severely obese. For decades, endocrinologists hypothesized that common forms of obesity may represent very mild cases of Cushing syndrome. If so, most obese people should have higher than normal blood levels of cortisol—but researchers found that they don't and discounted the hypothesis.

    The theory was resurrected by Paul Stewart of the University of Birmingham in Edgbaston, U.K., whose group found that people have pockets of high cortisol activity. The team compared stress hormone production in two types of fat in 16 patients undergoing surgery, most of whom were of normal weight. One sample came from underneath the skin, the other from adipose tissue in the abdomen. In the belly fat, the researchers found higher activity of an enzyme called 11β hydroxysteroid dehydrogenase type 1 (11β HSD-1), which regenerates active cortisol from its inactive form, cortisone.

    Flier read a 1997 paper in The Lancet on the research and thought, “If we could make a mouse that overexpresses the enzyme only in fat, we could ask the question, ‘Will that mouse get the apple-shaped body type and all its ill effects?’” he recalls. Visiting scientist Hiroaki Masuzaki engineered the mice; he linked a rat 11β HSD-1 gene to a promoter that turns on only in fat. The mice had 2.4 times more enzyme activity in their belly fat than did normal mice. Stress hormone levels in stomach fat tissue rose by 15% to 30%, but, as in most obese humans, bloodstream levels of the hormone were normal. As adulthood set in, the transgenic mice ate more, got fatter than normal mice, and carried the fat in their abdomens. Even when fed low-fat diets, the transgenics carried a spare tire that accounted for 37.9% of their total body fat compared with 27.5% in normal mice. The mice showed the hallmarks of early diabetes and hypertension: insulin resistance, renegade blood glucose levels, and other biochemical abnormalities. And a high-fat diet accelerated the pot-bellied rodents' downward spiral.

    “It is really the whole picture of what we refer to as the metabolic syndrome,” says Flier, citing a term now in vogue in endocrinology circles to describe the growing population of obese people at risk for diabetes and heart disease.

    But O'Rahilly points out that no one can yet pin down 11β HSD-1 as the cause of the millions of cases of diabetes and heart disease. “You have to find out whether the level of metabolic disturbance in people correlates with the activity of this enzyme,” O'Rahilly says.

    Meanwhile, two recent clinical observations support the team's results: In April, Joel Berger's group at Merck Research Laboratories in Rahway, New Jersey, showed that a class of antidiabetic drugs now on the market suppresses 11β HSD-1 levels in fat cells. And Eva Rask of Umeå University Hospital in Sweden and Brian Walker of the University of Edinburgh, U.K., report that obese men express higher levels of 11β HSD-1 activity in fat tissue than do lean males, which begins to address O'Rahilly's concerns.

    Flier and O'Rahilly both say they are aware of drug companies that have in hand, or are scrambling to come up with, potent inhibitors of the enzyme. Such compounds might be used to treat obesity by altering stress hormone levels in belly fat. “We have wanted to know for some time what properties of fat inside the abdomen make it different from fat outside the abdomen,” says O'Rahilly. “If this enzyme explains it, that would be interesting indeed.”


    Finding the Holes in the Magnetosphere

    By Andrew Watson, a writer in Norwich, U.K.

    Just outside the protective cocoon of our atmosphere, a battle rages in space. A gas of electrically charged particles—the solar wind—traveling at hundreds of kilometers per second streams at us from the sun. All we have to guard us is Earth's magnetic field, but this shield is not impregnable. Every so often, particles and energy burst through, by means of a process called magnetic reconnection, causing displays such as the aurora borealis as well as magnetic storms that disrupt satellites, power lines, and communications. Researchers have puzzled for decades over how and where reconnection happens. Now physicists from the British Antarctic Survey (BAS) in Cambridge have developed a way to pick between two competing views of where reconnection occurs.

    The work “seems to provide strong evidence for one [model] rather than the other,” says space plasma physicist Stan Cowley of the University of Leicester, U.K. Finding a recipe for picking between the two models is an “important step,” agrees physicist Ray Greenwald of Johns Hopkins University's Applied Physics Laboratory in Laurel, Maryland.

    The solar wind is no steady breeze. Violent events in and around the sun, such as flares and coronal mass ejections, can whip up the wind to gale force. And because it is made up of charged particles, the solar wind carries the sun's magnetic field with it. As it nears Earth, our magnetosphere diverts the solar wind around our planet like river water around a bridge pier. But sometimes the two magnetic fields don't just rub together: They hook up, creating an entry point for the particles and energy to pour into the magnetosphere.

    Looking up.

    The British Antarctic Survey's SHARE radar scans the skies over Halley research station in Antarctica.


    Researchers still don't understand reconnection events well enough to predict when and where they will happen. Theoretical models have divided them into two principal camps. Supporters of the “subsolar” theory hold that the action takes place at the point closest to the sun, the “nose” where the magnetosphere bears the full brunt of the solar wind. The rival “antiparallel” camp, meanwhile, believes that any point where the sun's and Earth's fields are in direct opposition—typically well away from the “nose”—is fair game for reconnection. “It is debated at every meeting,” says Greenwald.

    A team from BAS decided to settle the matter. A key difference between the two theories is that, under particular seasonal and solar wind conditions, the antiparallel model predicts that two reconnection points will always be created, whereas the subsolar theory produces only one. Finding reconnection events, which may be just a few thousand kilometers wide and last only a few minutes, is hard for the handful of spacecraft currently surveying the vast magnetosphere. But Richard Horne and his BAS team realized they had just the tool for the job: ground-based radar.

    Horne and his colleagues have spent years monitoring Earth's ionosphere, the plasma layer that forms the uppermost tier of the atmosphere. Because reconnection events cause disturbances in the ionosphere, the BAS researchers realized that the radar data they had collected might contain “footprints” of past reconnections. The team searched back through years of data from radar stations close to the poles, the best places to monitor the ionosphere. Data collected from Goose Bay in Newfoundland and Stokkseyri in Iceland on a December day in 1997 showed two distinct ionospheric disturbances signaling reconnection events. Neither took place close to the spot nearest the sun favored by the subsolar model. The results, to appear this month in the Journal of Geophysical Research (Vol. 106, p. 28995), show “clear evidence in favor of the antiparallel theory,” Horne says.

    Researchers caution that one example doesn't clinch the case. Reconnection events may appear “all over the place,” Cowley says, perhaps with one model dominating the other. Greenwald agrees that more observations are needed. The BAS team has since identified three more double events and has submitted a second paper to the Journal of Geophysical Research. In time, these results should help solve what Cowley calls “the fiendishly difficult problem” of understanding in full how magnetic reconnection works.


    Paring Down the Big Five Mass Extinctions

    1. Richard A. Kerr

    BOSTON— The five largest extinctions of the past half-billion years seemed immutable milestones on the path to modern life. Ever since researchers fingered a huge impact to explain the most recent of them, the one that ended the age of the dinosaurs 65 million years ago, the rest have also borne the tinge of doom. But now a pair of paleontologists say that two of the Big Five just don't measure up. Instead, Richard Bambach and Andrew Knoll of Harvard University argue, the losers should be demoted to “mass depletions”: plunges in diversity caused by still-mysterious failures to produce enough new species.

    Doubts about the legitimacy of the Big Five—those that came late in the Ordovician and Devonian periods and at the ends of the Permian, Triassic, and Cretaceous—began with the same sort of data first used to identify them. As they reported last month at the annual meeting of the Geological Society of America (GSA) here, Bambach and Knoll started their analysis with a listing of fossil marine genera compiled by John Sepkoski, who died in 1999 at age 50. It was Sepkoski and paleontologist David Raup, retired from the University of Chicago, who in the early 1980s drew attention to the Big Five as the largest extinction events since the Cambrian explosion of life 540 million years ago.

    Bambach and Knoll used Sepkoski's last compilation—genera of marine fossils arranged by their first and last appearances in the fossil record—and crunched numbers to see whether the extinctions were indeed large, sudden, and unusual enough to qualify as distinctly different from the multitude of lesser extinctions that mark the fossil record. To start, they dropped the entire Cambrian period and the early part of the subsequent Ordovician from their analysis. Extinction rates were high and varied wildly in those early days, prompting exclusion of the whole 60 million years from the analysis as too atypical.


    Reports of extinctions of blastoids were greatly exaggerated.


    With the early days dropped, four intervals of extinction stood out as exceptionally intense. The extinction in the late Devonian 364 million years ago, however, did not. “It fails the first criterion for a mass extinction interval: It isn't unusual,” says Bambach. Several intervals in a row, including the interval in question, have higher-than-normal extinction rates, but none of them is high enough to be called “big.”

    The next to go was the end Triassic extinction of 200 million years ago. When Bambach and Knoll compared its extinction rate with those of intervals coming before and after, the end Triassic did not stand out as bigger than its neighbors. “It's not an outlier,” says Bambach. “All of the Triassic has high extinction rates.” What's more, Bambach says, although all five events are marked by large losses in the diversity of genera, the end Triassic and late Devonian intervals lose more of their diversity through a failure to produce new genera than through extinction. “Normal extinction was high,” says Bambach, “and there wasn't much origination” of new genera to replace losses due to extinction. He calls these two events “mass depletions” rather than mass extinctions.
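    The outlier test Bambach describes can be illustrated with a toy calculation. The sketch below, in Python with entirely made-up extinction rates (not Sepkoski's actual compilation), flags an interval as a candidate mass extinction only if its rate stands out from the surrounding intervals, the criterion that the late Devonian and end Triassic fail:

```python
# Illustrative sketch, with hypothetical numbers: an interval counts as
# a candidate mass extinction only if its extinction rate is an outlier
# relative to neighboring intervals, not merely high.

def is_outlier(rates, i, threshold=2.0):
    """Return True if interval i's extinction rate exceeds the mean of
    the other intervals by more than `threshold` standard deviations."""
    others = rates[:i] + rates[i + 1:]
    mean = sum(others) / len(others)
    var = sum((r - mean) ** 2 for r in others) / len(others)
    std = var ** 0.5
    return rates[i] > mean + threshold * std

# Hypothetical per-interval extinction rates (fraction of genera lost).
spike = [0.15, 0.18, 0.55, 0.16, 0.14]      # one sharp spike
print(is_outlier(spike, 2))                  # True: it stands out

uniform = [0.30, 0.35, 0.33, 0.36, 0.31]     # uniformly high, Triassic-style
print(is_outlier(uniform, 1))                # False: no single outlier
```

    On this criterion a uniformly elevated stretch of intervals, like the Triassic, produces no outlier even though every interval in it has a high rate.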

    Paleontologists who specialize in the demoted intervals are taking the losses well. “I wouldn't argue too strongly” with the end Triassic's being dropped, says Anthony Hallam of the University of Birmingham, U.K. “I've been challenging the idea [that] there was a catastrophic event at the time. It was more gradual. Its magnitude was certainly less than [Bambach's] three big ones.”

    The late Devonian doesn't have many adamant defenders, either. “We agree with Bambach and Knoll,” paleontologist Johnny A. Waters of the State University of West Georgia in Carrollton told the GSA meeting. “We believe the late Devonian ‘mass extinction’ should go away.” Waters and his colleagues argue that the late Devonian has been overblown in much the way the lesser extinction at the Cenomanian-Turonian boundary has been (Science, 10 August, p. 1037). In the case of the late Devonian, says Waters, paleontologists have tended to collect fossils close to their labs in Western Europe and North America. But during the late Devonian, sediment-laden waters flushed those areas, altering the local marine ecology and skewing the fossil counts. As paleontologists look farther afield, as Waters and colleagues have done in northwest China, more species turn up, lessening the apparent magnitude of the extinction.

    Other paleontologists are taking the demotions in stride because they believe there are better ways to gauge evolutionary events. “We've gone as far as we can playing number games with taxonomic diversity,” says paleontologist George McGhee of Rutgers University in New Brunswick, New Jersey. “We need to look now at analyzing the ecological impact of the big events.”

    By the reckoning of McGhee and his colleagues, the end Permian extinction retains its position as the number one crisis in the history of life, followed by the end Cretaceous extinction that led to the replacement of dinosaur-dominated ecosystems by mammal-dominated ones. But in McGhee's ecological ranking, the late Devonian overtakes the late Ordovician extinction of 450 million years ago—the third of extinction's Big Three. Although plenty of new creatures appeared after the Ordovician extinction, he says, ecosystems worked much the way they had before; in contrast, after the Devonian extinction, reef communities did not fully recover for a couple hundred million years. A mass depletion may never have the cachet of a mass extinction, but perhaps it can trigger crises in the history of life just as well.


    Web Site Aims to Bridge North-South Divide

    1. Ben Shouse

    The Internet has been billed as the great democratizer, providing cheap and easy access to information for all. In many developing countries, however, the reality is very different: Computers and decent phone lines are scarce, and subscriber-only Web sites bar people from the best data. Now a new Web site, officially launched in London this week, aims to bridge the gap with scientific news and information relevant to developing nations (see Editorial on p. 2053). The site is also intended to help foster scientific cooperation.

    Journalists, scientists, and development agencies conceived the site, known as SciDev.Net, 3 years ago and drummed up funding from international bodies. With a budget of $2 million over the first two and a half years, a staff of six journalists and several foreign correspondents will provide daily news, in-depth features by scientists and officials, and a selection of articles from Science and Nature. The site also features a database of scientific organizations, the first stage of a regional network of scientists designed to promote “North-South and South-South collaboration,” says Mohamed Hassan, executive director of the Third World Academy of Sciences (TWAS) in Trieste, Italy, which helped conceptualize SciDev.Net.

    Part of SciDev.Net's mission will be to separate the wheat from the chaff: It is often hard to find accurate and useful information on the Web when many appealing sites are promoting questionable assertions, such as the claim that HIV does not cause AIDS. “Some of these sites that are trying to undermine scientific ideas are really very user-friendly,” says David Dickson, SciDev.Net's director and Nature's former news editor.

    The site's success will depend on access, which is severely lacking in some developing countries. Only the best Indian universities, for example, have reliable access to the Web. “The Internet has the potential to transform research in the developing countries. But the potential will remain just that if we do not take care of several other factors,” says Subbiah Arunachalam, an information consultant in Chennai, India, and a former officer of the Indian Academy of Sciences.

    Several agencies are already hard at work providing access, some sponsoring telecenters in poorer countries, and some, such as TWAS, developing networks of research academies and science ministries across the North-South divide. SciDev.Net complements these efforts, Hassan says: “This is really confidence building for scientists in the developing countries.”


    High-Speed Biologists Search for Gold in Proteins

    1. Robert F. Service

    Proteomics aims to chart the ebb and flow of tens of thousands of proteins at once to produce snapshots of life inside cells. For now, the technology isn't there. But this young field is growing up fast

    The scene from the picture windows in GeneProt's third-floor conference room looks downright leisurely: vineyards atop rolling hills with the Swiss Alps beyond. But inside this Geneva-based biotech upstart, it's all about speed.

    In a labyrinth of rooms in GeneProt's first- and second-floor laboratories, four different kinds of bench-top robots—24 machines in all—steadily work together in silence. The robots, some weighing as much as 150 kilograms with arms that whir in all directions, carefully isolate a mix of proteins from a tissue sample, separate them into clumps of identical proteins, chop members of each clump into fragments, and place them into an array of wells on tiny metal plates. Technicians feed these plates into a series of 51 mass spectrometers worth over $150,000 each; every second, each of these refrigerator-sized machines spits out a fingerprint of a protein fragment based on its mass. A supercomputer then compares each fingerprint to a database to identify the amino acids it contains. Then, within minutes, it reassembles the jumble of fragments to identify the proteins from which they came. The result: a list of thousands of proteins present in the starting sample. A few years ago, identifying just one of these proteins often took years. Today it takes hours.
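    The database-matching step in this pipeline can be sketched in a few lines. The code below is a deliberately simplified illustration: the protein names and fragment masses are invented, and real systems match measured peptide masses against in-silico digests of full sequence databases with far more sophisticated scoring.

```python
# Toy sketch of peptide-mass matching: score each database protein by
# how many measured fragment masses fall within tolerance of one of its
# expected fragment masses, and report the best-scoring protein.

# Hypothetical database: protein name -> expected fragment masses (Da).
DATABASE = {
    "protein_A": {1024.5, 1305.7, 2210.1},
    "protein_B": {988.2, 1305.7, 1799.4, 2450.8},
}

def identify(measured, tolerance=0.5):
    """Return the database protein whose expected fragment masses best
    match the measured masses."""
    def score(expected):
        return sum(
            any(abs(m - e) <= tolerance for e in expected) for m in measured
        )
    return max(DATABASE, key=lambda name: score(DATABASE[name]))

print(identify([1024.3, 1305.9, 2210.0]))  # best match: protein_A
```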

    GeneProt execs are betting that by comparing such lists from diseased and normal tissues, they will be able to identify which proteins are the most important in various diseases—and therefore make the best targets for new medicines. Although GeneProt started building this futuristic lab only last year, company officials say they've already fingered six proteins that could serve as drugs themselves or as targets for other compounds.

    GeneProt's lab is just one of many converging on biology's biggest boom industry: proteomics. The goal of this new “-omic” is no less than to catalog the identity and function of all the proteins in living organisms. In terms of complexity, proteomics makes genomics look like child's play. Instead of an estimated 30,000 to 40,000 genes, protein experts think that humans have somewhere between 200,000 and 2 million proteins. What's more, whereas genes remain essentially unchanged through life, proteins are constantly changing, depending on the tissues they're in, a person's age, and even what someone ate for breakfast.

    Researchers are hell-bent on tracking down proteins, says GeneProt co-founder and chief scientist Keith Rose, because proteins, not genes, are where the action is. Whereas genomics offers a look at the blueprints for life, proteomics reveals the nuts and bolts. Defective proteins are responsible for the chemistry that leads to a range of diseases from cancer to Alzheimer's. And blocking or boosting these proteins offers the straightest shot to finding the next blockbuster drug, says Sam Hanash, a proteomics expert at the University of Michigan, Ann Arbor: “The proteome can bring a lot of the fruit that the genome could not.”

    Both money and hype are flowing fast and furious. Dozens of new companies have sprung up in the past few years to either search for proteins en masse or sell tools to the protein prospectors. Most pharmaceutical giants, such as GlaxoSmithKline and Pfizer, have launched their own proteomics efforts as well; all are racing to find and patent proteins. In a time of tight markets and wary investors, proteomics companies have attracted more than $530 million in venture capital funds in the past 22 months. Stock offerings have raised hundreds of millions more.

    Eye on the prize.

    Fast protein analysis offers hope for finding blockbuster drugs.


    “You're talking about an absolute explosion of interest in proteomics in industry,” says Raj Parekh, who directs proteomics research at Oxford GlycoSciences (OGS) in the United Kingdom. “Proteomics, a word almost no one discussed two years ago, has become the new darling of the investment community,” life sciences market watcher G. Steven Burrill, CEO of Burrill & Co. in San Francisco, wrote recently.

    But despite the deep pockets, “there is still a bit of snake oil in this field,” says Phil Andrews, a proteomics expert at the University of Michigan, Ann Arbor. In private, few researchers deny that identifying all the body's proteins might be a lot harder to achieve than the industry's public relations suggests. The technology to pull it off simply doesn't exist yet, and competition is stiff for those proteins that can be nabbed with current technology. And if proteomics does turn up new targets, who's to say they will be any easier to develop into drugs than the targets already out there? But for now, the allure is so compelling that few want to dwell on the gritty underside.

    Grab bag

    Just what is a proteome? Ask a dozen experts, and you will get a dozen different answers. Most commonly, it means an organism's complete set of proteins in every form they assume. But with proteins winking in and out of existence, what you see depends on when you look and what tools you use. “In genomics, the end point is well defined: the full sequence of an organism's DNA. With proteomics that's completely different. It's an attempt to capture the dynamics of a living system,” says Ruedi Aebersold, a proteomics expert and co-founder of the Institute for Systems Biology in Seattle, Washington. That makes it all but impossible to define a single proteome that can be tackled in a large-scale, systematic way akin to the genome project—although there is a nascent attempt at international collaboration to do something along those lines.

    Nor is there likely to be a single technology that dominates the field—as robotic gene sequencers did for genomics—or a single corporate juggernaut like Celera Genomics of Rockville, Maryland. That's because unlike genes, proteins vary widely in their chemical behaviors, making it difficult to come up with one technique that works equally well on all proteins.

    The result is a balkanized landscape in which different groups—mostly companies at this point—are chipping away at different pieces of the puzzle, all with the hope of finding the next blockbuster drug. Some want to know what proteins are expressed in diseased versus normal tissue. Others have set their sights on how proteins interact and what they do. Still others are determining the three-dimensional structure and function of proteins. The field is so vast, the goal so expansive, that there is room for everyone, says Denis Hochstrasser, a proteomics pioneer at the University of Geneva in Switzerland and another co-founder of GeneProt.

    X-ray Crystallography

    This tried-and-true technique maps the atomic structure of proteins. Researchers overexpress a protein, purify it, and coax the individual proteins to line up in crystals. They then use a beam of x-rays to produce a diffraction pattern (top), which helps determine the final structure.


    The new frontier

    No one could even consider studying proteins en masse—instead of one by one—until the mid-1970s and 1980s, when technologies for separating mixtures of proteins and tracking which proteins bind to one another first made their debut. But what truly got the field going, says GeneProt's Rose, was the mass of genomics data churned out by the Human Genome Project and huge advances in computing power in the late 1990s. Suddenly, researchers could identify almost any protein they fished out of a tissue sample. All they needed to do was translate a fragment of the protein's amino acid sequence into DNA's code of A's, T's, G's, and C's; this information could then be used to search a computer database for the gene that encodes it, along with the identity of the complete protein. Backed with the right robotics and supercomputers, researchers can now analyze hundreds of thousands of proteins in a tissue sample in a few months.

    These newfound capabilities promise a major expansion in the number of potential drug targets. “Today the whole industry operates on 500 protein [drug] targets. There are thought to be between 10,000 and 20,000 protein targets. So the whole race [in proteomics]—and it is a race as all the technologies pile on—is how are we going to find those,” says OGS chief Michael Kranda. Even so, Kranda readily acknowledges that the real roadblock in the pharmaceutical industry is not a lack of novel targets but the difficulty and expense of turning them into marketable drugs. Matthias Mann, chief scientist of MDS Proteomics in Toronto, Canada, agrees, adding that companies such as GeneProt that are relying on heavy firepower up front might find themselves disappointed in the end. “I don't think the race is going to be won by the number of machines a company is using,” Mann says.

    All this has created a land-grab mentality, similar to that among genomics companies in the 1990s as they raced to patent genes. “The driving force is to crank through as many proteins as possible to patent them” and claim as much intellectual-property real estate as possible, says Ian Taylor, who heads proteomics efforts at PerkinElmer Life Sciences in Cambridge, U.K. OGS, for example, expects to file some 4000 patents by year's end, each of which will represent a protein whose function is known and linked to disease, says Kranda. At least some of the protein patents seem likely to duplicate existing gene patents, setting the stage for court battles as genomics and proteomics companies try to protect their turf (see p. 2082).

    Searching for answers

    Researchers are pursuing the race using three fundamental approaches. Two are basically goosed-up versions of long-known technologies for separating mixtures of proteins and watching their interactions. The third is x-ray crystallography, an even older technique for mapping the structure of proteins in atomic detail.

    For now, the dominant technology for separating proteins—and the workhorse for companies such as GeneProt, OGS, and Large Scale Biology Corp. of Vacaville, California—is two-dimensional (2D) gel electrophoresis (see figure below). The technique uses electric fields to pull proteins through a slab of gelatin to separate them from one another by their molecular weight and charge. Proteins of interest are then cut from the gels, chopped into fragments, and handed off to a mass spectrometer and computer for identification. Companies use 2D gels to try to detect differences in protein expression between tissues—comparing, say, cancerous and healthy liver tissue.

    2D Gel Electrophoresis

    After a mixture of proteins is placed in a gel, the proteins are separated in one direction by their charges and in the perpendicular direction by their molecular weights. Proteins of interest are then cut from the gels, purified, and broken into fragments. These fragments are sent to a mass spectrometer, which measures their atomic masses. Masses from the fragments are then used to identify the protein.


    Among the recent deals in this red-hot sector, last November GeneProt and Novartis Pharmaceuticals announced the largest proteomics agreement to date. For an up-front $43 million and the promise of another $41 million over 4 years, GeneProt agreed to run three studies for Novartis to identify drug targets. According to company literature, Large Scale Biology, meanwhile, is using its version of the technology to assemble a database of proteins called the Human Protein Index, “the protein equivalent of the Human Genome Project.”

    Other companies are using a technology called the yeast-two-hybrid method to map protein-protein interactions (see figure below). This approach uses known “bait” proteins to bind unknown “prey” and thereby reveal which proteins interact; this information provides insight into the function of the unknown captives. By repeating such experiments en masse, investigators can work out the tangled protein interaction networks in cells. And because these networks reveal the chain of communication cells use to survive and thrive, asserts Sudhir Sahasrabudhe, chief of research at Myriad Genetics of Salt Lake City, Utah, the technique offers the fastest way to home in on potential drug targets.


    Yeast Two-Hybrid

    Researchers insert a gene in yeast for a “bait” protein alongside DNA for half of an “activator” protein. The other half of the activator DNA is then inserted alongside DNA for random “prey” proteins. The yeast cells are then grown up and the proteins are allowed to interact. If bait and prey proteins bind, the two halves of the activator protein will be close enough to work together to turn on another yeast gene that turns the cell blue, signaling a match.


    In April, Myriad trumpeted a $185 million collaboration with Hitachi and Oracle as an effort to “map the human proteome” in 3 years. “This project represents a bold leap toward the future of drug development,” Myriad president and CEO Peter Meldrum declared at the time.

    A third set of research groups, meanwhile, is rushing to automate high-speed x-ray crystallography and related approaches to map the atomic landscapes of proteins. The techniques, collectively known as structural genomics, aren't as widespread as other approaches to proteomics. But a more moderately paced version of the technique is widely used as a key step in designing drugs to interact at specific sites in proteins. Over the last 40 years, those efforts have generated some 2000 unique protein structures in public databases. Tim Harris, who heads Structural GenomiX in San Diego, California, claims his company alone will more than double that number in just 5 years.

    Here, too, the dealmaking has been brisk. According to Raymond Stevens of the Scripps Research Institute in La Jolla, California, structural genomics efforts have raised more than $500 million since 1999. About half of that, he says, comes from publicly financed programs in the United States, Japan, and Europe. The rest is private money raised to back start-up companies such as Syrrx and Structural GenomiX in the United States and Astex Technology in the U.K.

    Speed bumps

    Upon closer inspection, the lofty goals and broad statements that have accompanied announcements of these deals come with important caveats. Myriad, for example, says it won't actually find all the protein-protein interactions in cells. Rather, the company will track down all it can with the yeast-two-hybrid and mass spec approaches. Likewise, Large Scale Biology's human proteome database will survey only tissues the company deems relevant to finding new drugs and diagnostic markers. One reason that the reality is likely to be less impressive than the hype is that all three frontline proteomics technologies suffer from serious limitations.

    The 2D gels, in particular, face big drawbacks. The gels typically do a poor job of separating proteins that are either small or very large. They also stumble when it comes to separating out proteins normally embedded in cell membranes, often the best drug targets. And despite improvements, mass spectrometers still have a hard time seeing proteins that are expressed only in minute quantities, many of which could be key protein markers for cancer or other diseases. The 2D gel/mass spec approach “is not based on tomorrow's technology,” says MDS's Mann.

    The yeast-two-hybrid method is also limited in what it can accomplish. For one, the technique detects proteins that interact when placed inside yeast cells, outside their natural environment. And although many of the interactions might prove interesting from a basic science perspective, few are likely to be related to disease. Moreover, “interactions and abundance [of particular proteins] change over time,” says the Institute for Systems Biology's Aebersold. So although mapping the links between proteins is “attractive,” Aebersold says, “it is by no means sufficient to explain the biology of a cell.”

    For structural studies of proteins, the biggest drawback is speed. Even with cutting-edge robotics, individual companies or academic collaborations can hope to resolve only several hundred protein crystal structures a year. And although this might prove useful to drug designers, it's impossible to imagine how researchers might survey an entire proteome with the technology.

    Finally, all three techniques falter when it comes to the chemical tags that proteins receive after they emerge from the ribosome where they are built. These “posttranslational modifications,” such as the addition of phosphate or sulfate groups, can have profound effects on a protein's function, says OGS's Parekh. But deciphering these chemical tags can be painstaking work. “We don't have good tools for looking at these in a high-throughput way,” says David Hachey, a proteomics specialist at Vanderbilt University in Nashville, Tennessee.

    GeneProt's Rose and others claim that there are technological fixes for some of the problems. For example, another long-used separation technology called liquid chromatography does a better job of scanning for small proteins. Although several companies are adding the technique to their arsenals, Celera, for one, is so convinced of its superiority that it is relying exclusively on it for separating proteins. Still, no one denies that all these technologies will overlook many, if not most, proteins. “But we will be able to detail very, very many,” adds Rose. And for now, he and others think that they'll generate plenty of moneymaking discoveries. And new tools are on the way, they say, as academic and corporate researchers continue to push the envelope on characterizing proteins (see sidebar). “The technology is getting 10 times better every year,” says Mann.

    Whither academics

    With much of proteomics dominated by biotech and drug companies, many researchers wonder whether the field is already beyond the reach of academics. “All the really big studies are being done in the private sector,” says Aebersold. Mann adds that “it's difficult for a university group to mount a sustained effort in proteomics.”

    But Hanash argues that the breadth of proteomics and the inability of one technology to answer all the questions will ultimately play to the strength of academia. “It's the exact opposite of the genome project,” where Celera was able to match the public gene-sequencing effort with a single centralized sequencing lab, says Hanash. Large-scale commercial proteomics efforts “are very limited in what they can accomplish,” he adds. “It will be a plurality of efforts that brings the payoff in proteomics.”

    One academic collaboration is already under way, the Alliance for Cellular Signaling (ACS). Launched last year by Alfred G. Gilman at the University of Texas Southwestern Medical Center in Dallas, this consortium is made up of more than 50 experts in 21 institutions. It aims to track all the proteins that carry out communication in two types of cells, antibody-producing B lymphocytes and heart muscle cells. That's not technically a full-scale proteomics effort, because the alliance isn't tracking all the proteins in those cells. However, says Michigan's Andrews, “it is a doable project that makes sense for academics.”

    Many researchers are already beginning to look to ACS as a model for a publicly funded proteome effort, in part to ensure that not all proteomics data are locked up by companies. At a meeting in Washington, D.C., in October, academic leaders in proteomics huddled with representatives from the National Institutes of Health and other government funding agencies, as well as from proteomics companies, to discuss launching a coordinated initiative. The group decided to proceed with caution, recommending pilot projects in three areas: profiling protein expression in selected tissues, detailing proteins' functions, and creating new bioinformatics tools to handle the deluge of data. The projects, says Michigan's Hanash, will likely follow ACS's lead and focus on specific research topics, such as looking at cytoskeletal proteins or those involved in key organelles such as the energy-producing mitochondria. “We want to build the encyclopedia one chapter at a time,” he says.

    If there's any lesson from the early work in this field, it's that the proteomics encyclopedia isn't likely to come together quickly. And when it does, its pages will be scattered in databases around the world. It won't hold all the answers for understanding life inside the cell. And it won't remove most obstacles to turning drugs into products. But in the modern world of high-speed biology, being first to discover something means opportunity: for glory, for profits, and for the right to lead the field in new directions. For all its limitations, proteomics has just become biology's latest wellspring of opportunity.


    Proteomics 2.0: The View Ahead

    1. Robert F. Service
    Isotope-Coded Affinity Tags

    Pioneered by Ruedi Aebersold at the Institute for Systems Biology in Seattle, Washington, this technique enables researchers to chemically tag specific proteins in two separate samples with distinct heavy and light isotopes. By then tracking their relative abundance with a mass spectrometer, researchers get a quantitative measure of how protein expression changes with disease.
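    The quantitation idea can be sketched as follows; the mass shift and peak values below are illustrative placeholders, not the actual reagent chemistry or real spectra.

```python
# Sketch of isotope-coded quantitation: each tagged peptide appears as a
# heavy/light pair of peaks separated by a fixed mass shift, and the
# intensity ratio of the pair estimates relative expression.

HEAVY_LIGHT_MASS_SHIFT = 8.0  # hypothetical fixed mass difference, Da

def expression_ratios(peaks):
    """peaks: list of (mass, intensity). Pair each light peak with the
    peak one mass shift heavier and return heavy/light intensity ratios
    keyed by the light peak's mass."""
    by_mass = dict(peaks)
    ratios = {}
    for mass, light_intensity in peaks:
        heavy = by_mass.get(mass + HEAVY_LIGHT_MASS_SHIFT)
        if heavy is not None:
            ratios[mass] = heavy / light_intensity
    return ratios

peaks = [(1000.0, 200.0), (1008.0, 600.0), (1500.0, 300.0), (1508.0, 150.0)]
print(expression_ratios(peaks))  # {1000.0: 3.0, 1500.0: 0.5}
```

    A ratio of 3.0 would mean the protein is three times as abundant in the heavy-labeled sample; 0.5 would mean half as abundant.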

    Protein Chips

    On these chips researchers lay down a checkerboard-like grid of molecules designed to capture specific proteins at specific sites. They then use fluorescent probes or other means to detect where proteins bind on the grid. Because the identity of the capture molecule at each spot on the grid is known, a signal there reveals which protein has been captured. Although protein chips have been slow to develop, researchers expect that in time they could become a fast way to scan samples for hundreds or thousands of different proteins (see p. 2080).


    Lab-on-a-Chip
    Researchers at several universities and companies have created silicon, glass, and plastic chips, engineered to harbor networks of sample holders, channels, and reaction chambers to carry out the complex sequence of steps needed to prepare protein samples for analysis by mass spectrometers and other analytical equipment. Because the chips are fast and can work with very small samples, they have the potential to dramatically improve the speed and sensitivity of proteomic analyses.

    Differential In Gel Electrophoresis

    This technique, commercialized by Amersham Pharmacia Biotech, offers researchers a global view of how protein expression changes between two samples. Proteins from one sample are all tagged with a single fluorescent compound, while those from another sample are tagged with a differently colored fluorescent dye. The two samples are then mixed together and the individual proteins are separated on a single two-dimensional gel; this separates proteins in one direction by their charges and in the perpendicular direction by their molecular weights. A quick look at the gel reveals whether separate spots show both colors—meaning that a protein is expressed in both samples—or just a single color that shows which sample harbors the protein.


    A Proteomics Upstart Tries to Outrun the Competition

    1. Robert F. Service

    By assembling an arsenal of technology, GeneProt aims to identify proteins associated with disease

    GENEVA, SWITZERLAND— “Proteomics is something you either do in a big way or putter around in a corner,” says Keith Rose. He should know. Last year, along with fellow proteomics pioneers Denis Hochstrasser, Amos Bairoch, Ron Appel, and Robin Offord, Rose launched GeneProt, the biggest proteomics test-bed to date in this young field. Armed with 51 mass spectrometers—protein sequencing machines that can cost over $150,000 each—and a massive supercomputer, the company is already a proteomics powerhouse.

    And that's just for starters. In addition to the research lab here in the outskirts of Geneva, the company is constructing an even bigger facility in North Brunswick, New Jersey, slated to open next year. Moreover, it's considering plans to open a third site in Japan. All this for a deceptively simple goal: finding proteins involved in disease.

    The company is banking that these proteins will produce a string of lucrative drugs aimed at top killers such as cancer and heart disease. GeneProt will use the proteins either as drugs themselves—like Amgen's top-selling protein therapeutic erythropoietin—or as targets for making their own small-molecule drugs.

    GeneProt is sprinting out of the blocks, says Rose, because drugs, patents, and profits will go to those who are first with discoveries or can fish out the most important disease-related proteins. And GeneProt execs are convinced that the technology for analyzing proteins en masse has finally come of age. “You can either drop a line in or drain the whole area and see what is on the bottom,” Rose says. The company, obviously, is taking the second tack.

    Despite GeneProt's fast start, fellow proteomics firms Oxford GlycoSciences (OGS) in the United Kingdom and Large Scale Biology Corp. (LSBC) in Vacaville, California, are already in hot pursuit. OGS has protein-hunting collaborations with Pfizer, Bayer, GlaxoSmithKline, and the agricultural firm Pioneer Hi-Bred; LSBC has teamed up with Procter & Gamble and Dow AgroSciences, among others. Celera's footsteps are also getting louder. This genomics powerhouse in Rockville, Maryland, raised nearly $1 billion in a March 2000 stock offering, much of it slated for proteomics. The company is now completing a separate proteomics research center. And it recently formed a joint venture with equipment dynamo Applied Biosystems—which helped Celera sequence the human genome—giving it early access to the latest mass spectrometers, among other tools. To stay ahead, concedes GeneProt president Cédric Loiret-Bernal, the company will have to cut several deals with firms willing to have GeneProt be their primary proteomics source. He says the company is in discussions with 11 possible partners, six of which are “very interested.”


    GeneProt execs including Keith Rose (middle) and Denis Hochstrasser are betting that banks of high-speed mass spec machines and supercomputing power will put them ahead of the pack.


    GeneProt's moves have inspired both awe and skepticism among other proteomics researchers. “They are going to industrialize the process,” says David Hachey, a proteomics expert at Vanderbilt University in Nashville, Tennessee. “I think it will work.” Ruedi Aebersold of the Institute for Systems Biology in Seattle, Washington, gives them a shot as well. The expertise of GeneProt's founders in proteomics is “very, very good,” he says.

    But John Yates, a proteomics researcher at the Scripps Research Institute in La Jolla, California, wonders whether the company is starting too fast with talk of three research centers: “It seems like a lot given that proteomics is still unknown as a business strategy.” Hochstrasser's response: “If you want to place a satellite in orbit, you have to have a rocket of a certain size.”

    Hochstrasser concedes, however, that “there are not that many customers” for his business—perhaps 40 pharmaceutical companies that would be willing to strike deals. But he thinks they will, because to maintain their recent run of strong returns to investors, they must discover an ever larger number of blockbuster drugs: “The drug industry is ready because they want to go faster and can't do everything internally.”

    GeneProt has already lured one big customer. Last October, while its labs were still under construction, the company struck an $84 million deal with Novartis Pharmaceuticals. Backed by these resources, the company plans to work its way through a series of “twin proteome” studies, which can take up to 6 months apiece. Each will carefully analyze a pair of tissues—such as normal lung tissue and tissue from a lung cancer—and look for changes in expressed proteins that correlate with disease. For its investment, Novartis will receive three twin proteome studies, on diseases the company declines to specify.

    GeneProt plans to keep its lead, says Rose, by producing a better product faster. He claims that GeneProt can analyze several hundred thousand proteins a year. That's on par with figures cited by competitors such as OGS. But when GeneProt's new U.S. facility comes on line next year, its output will double, the company claims. At the Geneva site alone, this stream of proteins is expected to generate a torrent of some 40 terabytes, or 40 trillion bytes, of data per year. To handle that flood, the company has teamed up with Compaq to create one of the largest civilian supercomputers in the world, comprising 1400 separate processors capable of carrying out 2 trillion operations per second.
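    A back-of-envelope calculation puts the figures quoted above in perspective. The numbers (40 TB/year, 1400 processors, 2 trillion operations per second) come from the article; the conversions below assume decimal terabytes, matching the article's "40 trillion bytes."

```python
# Sanity-check arithmetic on the article's quoted figures.

TB_PER_YEAR = 40
BYTES_PER_TB = 10**12              # decimal TB, per "40 trillion bytes"
SECONDS_PER_YEAR = 365 * 24 * 3600

# 40 TB/year is a modest sustained data rate by modern standards...
bytes_per_second = TB_PER_YEAR * BYTES_PER_TB / SECONDS_PER_YEAR
print(f"{bytes_per_second / 10**6:.2f} MB/s sustained")  # → 1.27 MB/s sustained

# ...and 2 trillion ops/s across 1400 processors is about 1.4 Gops each.
OPS_PER_SECOND = 2e12
PROCESSORS = 1400
print(f"{OPS_PER_SECOND / PROCESSORS / 1e9:.2f} Gops per processor")  # → 1.43 Gops per processor
```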

    And GeneProt will offer its partners something its competitors don't: synthesized proteins. Not only will this strategy help jump-start drugmaking efforts, asserts Rose, but it may also help GeneProt researchers dodge patent disputes. Companies such as Incyte Genomics in Palo Alto, California, and Human Genome Sciences in Rockville, Maryland, claim rights to certain genes that can be used to make proteins in bacteria. By synthesizing proteins directly, Rose argues, GeneProt can navigate around those claims. “The genomics companies thought they would stake out acres of virgin land,” he says. “I'm not sure that will cover chemical protein synthesis.”

    If GeneProt's technology is as powerful as its executives claim, the first drug targets, and even drug candidates, should show up over the next year. “Expectations are very high,” says Loiret-Bernal. “People are really looking at us to see if we are going to be successful or fail.” Whatever the outcome, it's likely to serve as a bellwether for other firms looking to cash in on industrial protein analysis.


    Searching for Recipes for Protein Chips

    1. Robert F. Service

    Protein arrays could be the basis for new diagnostics and research tools, but the technology has been slow to develop

    When medical visionaries talk about the future, many offer up the image of a computer chip or CD-ROM that stores your complete DNA sequence. Interested in your odds of getting Huntington's disease or breast cancer? Just have your doctor scan your DNA.

    In most cases, however, we want to know what we've got right now, not what we might face in 30 years. DNA and genes won't always provide immediate answers, but looking at proteins just might. That's because proteins reflect the chemistry taking place inside cells, chemistry that is altered in potentially diagnostic ways by different diseases. The problem is that such diagnoses depend on technology that does not exist today: chips that can spot hundreds or thousands of distinct proteins at a time from a sample, say, of blood or urine.

    Both academic and commercial labs around the globe are furiously competing to perfect such next-generation biochips, postage stamp-sized devices that would track many proteins in a single step. Rudimentary versions that spot a handful of proteins are already on the market. But making more complex versions is vastly more complicated than creating DNA chips, popular research tools for analyzing suites of genes involved in everything from cancer to normal cell development.

    Despite the difficulty, a handful of academic groups and adventurous companies, from small start-ups to research powerhouses, are pursuing the technology. “Everyone is working on this so aggressively because it's so potentially useful,” says Larry Gold, who recently stepped away from a decade of mixed success chasing biotech drugs to launch a protein biochip start-up called SomaLogic in Boulder, Colorado.

    So far there hasn't been much to show for these efforts. But two recent studies offer hope that protein arrays will succeed. “I think it's virtually a sure thing,” says Pat Brown, a Stanford University biochemist and pioneer of both DNA and protein chips. “But what will be the best technology, and how soon, remains to be seen.”

    If and when protein chips hit the market, they stand to make a big impact. “We believe there is a pent-up demand for these things. People are anxiously awaiting the technology,” says Felicia Gentile, president of BioInsights, a market research firm in Redwood City, California. BioInsights predicts that the market for protein chips will grow to $500 million by 2005; other market watchers put that number as much as 10 times higher.

    Much of the allure surrounds the diagnostic tests these chips might make possible. Proteomics companies are working overtime to find novel protein and peptide biomarkers whose expression correlates with particular diseases. If they succeed, a single scan of a drop of a patient's blood or urine could reveal whether the person is making proteins linked to cancer, arthritis, or heart disease. DNA chips are limited as diagnostic tests, in part because most diseases don't have a distinctive genetic signature.

    Array of possibilities.

    Advanced protein chips promise ultrafast detection—if and when they make it to market.


    Beyond the doctor's office, protein chips might also help reveal the web of protein-protein interactions in different cell types, thereby enabling researchers to work out the complex chains of chemical communication inside cells. Versions of the technology might illuminate how much of a given protein is expressed at a given place and time, offering insights into, say, cellular development or aging. And drug screening could be thrown into overdrive if researchers are able to quickly test whether new compounds bind to particular proteins immobilized on chips. “Protein chips will be orders of magnitude more useful than DNA chips, and DNA chips are very useful,” says Michael Snyder, a biochemist and protein chip developer at Yale University in New Haven, Connecticut.

    Second wave

    Protein chips are made in much the same way as DNA microarrays. Researchers dot a glass or plastic surface with an array of molecules designed to grab specific proteins; the grabbers can be other proteins such as antibodies or even snippets of DNA. Then fluorescent markers or other detection schemes reveal which spots have snagged their prey. Because researchers keep track of the identity of each protein-grabbing molecule as it's laid down in the grid, when they see that a particular spot on the grid lights up, they know which protein has been captured.
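    The bookkeeping described above can be illustrated with a simple lookup table. Everything here is hypothetical (the layout, the antibody targets, the coordinates): the point is only that the identity of the capture molecule at each grid position is recorded when the chip is printed, so a fluorescent spot maps straight back to the protein it captured.

```python
# Toy model of the grid bookkeeping (hypothetical layout and targets).

# Recorded at print time: (row, column) -> protein the spot's capture
# molecule (e.g., an antibody) is designed to grab.
probe_targets = {
    (0, 0): "TNF-alpha",
    (0, 1): "IL-6",
    (1, 0): "p53",
    (1, 1): "insulin",
}

def captured_proteins(lit_spots):
    """lit_spots: grid positions where fluorescence was detected."""
    return [probe_targets[pos] for pos in lit_spots if pos in probe_targets]

# Two spots light up; the layout tells us which proteins were snagged.
print(captured_proteins([(0, 1), (1, 1)]))  # → ['IL-6', 'insulin']
```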

    It sounds simple enough. But getting all the elements to work is far more difficult than with DNA arrays. “Measuring nucleic acids [in an array] is a simple and streamlined system,” says Brown. That's because nucleic acids have the distinct advantage of complementary binding, in which one strand of DNA or RNA binds specifically to another with the corresponding sequence of bases. To make an array, you simply synthesize strands of nucleic acids complementary to the sequences you are looking for and attach them in a grid pattern to a substrate. Companies such as Affymetrix of Santa Clara, California, now sell DNA chips that screen for as many as 60,000 genes and gene fragments at once.

    Proteins, in contrast, bind to their targets based on the three-dimensional shape of each, as well as a myriad of chemical interactions. Thus, for each spot on an array, researchers must come up with a unique and specific molecule to capture a desired protein target. “To measure a protein is a new problem every time,” Brown laments. Moreover, the biochemistry of proteins varies widely. Soluble proteins found in blood, for example, typically have water-friendly hydrophilic groups near their surface, whereas proteins embedded in cell membranes are often coated in fatty hydrophobic groups. A biochip surface that is chemically treated to bind hydrophilic proteins won't usually work well with hydrophobic ones.
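    The hydrophilic/hydrophobic contrast above is routinely quantified with the published Kyte-Doolittle hydropathy scale; the GRAVY score below simply averages it over a sequence. The scale values are the standard published ones, but the example sequences are invented, and this is a sketch, not any company's surface-chemistry screen.

```python
# Grand average of hydropathy (GRAVY) using the Kyte-Doolittle scale.
# Positive scores suggest a hydrophobic (membrane-like) stretch;
# negative scores suggest a hydrophilic (soluble) one.

KYTE_DOOLITTLE = {
    "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
    "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
    "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
    "Y": -1.3, "V": 4.2,
}

def gravy(sequence):
    """Average hydropathy over a one-letter amino acid sequence."""
    return sum(KYTE_DOOLITTLE[aa] for aa in sequence) / len(sequence)

print(gravy("LIVFA") > 0)  # hydrophobic, membrane-like stretch → True
print(gravy("DEKRN") < 0)  # charged, water-friendly residues → True
```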

    Even so, researchers are starting to tame protein arrays. In one recent report (Science, 14 September, p. 2101), Snyder's group at Yale and colleagues at North Carolina State University in Raleigh created a protein chip that, when presented with copies of a particular yeast protein, highlights almost all of the other yeast proteins to which it binds. This feat required the researchers to clone as many yeast genes as possible—they succeeded with 5800 out of about 6200—by inserting the genes into other yeast cells, coaxing the bugs to overexpress the proteins, and then laboriously purifying and collecting them. They used a now-standard DNA array robot to dab tiny samples of each yeast protein in more than 200 rows atop a glass microscope slide. To find out what these yeast proteins bind to, the team spritzed the slide with solutions containing various test proteins and labeled the spots where they bound.

    Snyder and his colleagues reported that they rapidly identified the proteins that interacted among the thousands of arrayed yeast proteins. That revealed a wealth of details about the network of communication channels yeast use to survive. For example, the team discovered 33 new proteins that bind calmodulin—a widespread protein involved in calcium sensing—and 52 proteins that bind phosphatidylinositides, cell membrane lipids involved in growth, differentiation, and cytoskeletal rearrangements. “This is a biochemist's dream, to be able to look for any activity over the entire proteome,” says Eric Phizicky, a biochemist at the University of Rochester Medical Center in New York state.

    The work is a coup, says Phizicky, because Snyder's team managed to get so many different proteins to stick to a surface and remain active. The team accomplished this by engineering each of the proteins to contain a nickel-binding group, coating the microscope slide in nickel, and dabbing the proteins on top. Snyder has launched a company, called Protometrix, to commercialize the technology.

    But as valuable as it is for spotting protein-protein interactions, the Snyder team's chip won't help much with diagnostic tests. That's because these tests must be able to fish out particular proteins present in low concentrations in fluids chock-full of other proteins, some of which can cross-react with the chip's sensors to give off false signals.

    At Stanford, Brown's team is hotly pursuing a technique for diagnostic chips. Instead of using an array of everyday proteins to capture other proteins, the team is developing arrays that use antibodies, which capture specific proteins even at low concentrations. Researchers have decades of experience dotting such antibodies on surfaces for one-at-a-time assays. But Brown's team has more ambitious goals. The researchers arrayed hundreds of antibodies atop microscope slides that had been specially treated with a polymer called poly-L-lysine and other compounds to promote the binding and stability of the antibodies. They then tested how well they could detect samples of protein. The results were far from perfect: Only 20% of the arrayed antibodies could provide accurate measurements of proteins at low concentrations, the team reported in the February issue of Genome Biology. But Brown insists that the study marks an important first step toward making useful antibody arrays. Snyder agrees, calling the work “a good start.”

    Complex diagnostic arrays, however, could be years away. As Brown and others have found, working with antibodies is tough. They are large, weighing about 150,000 daltons compared to just a few thousand for typical probes that capture DNA. Separate probes therefore must be placed farther apart, limiting the number that can fit into an array. And even though antibodies harbor small active sites that are more specific in their binding than those of many other proteins, they contain large protein-based supporting structures that can cross-react with proteins other than those to which they are designed to bind, confounding results.

    Race to market

    Brown and Snyder are leading the academic race for protein chips. But plenty of industrial competitors are hot on their heels: More than a dozen companies are working to bring protein chips to market. The biggest battle is over which molecules to lay down in the array to best capture proteins of interest. Like Brown, some companies are developing arrays that use antibodies to identify specific proteins. For example, Zyomyx of Hayward, California, is gearing up to begin precommercial tests with a relatively small antibody chip designed to screen for 30 cytokines, proteins known to play a key role in inflammatory diseases such as arthritis and heart disease. Still, other protein-grabbing strategies abound.

    Cambridge Antibody Technology (CAT) in Cambridge, U.K., and Dyax in Cambridge, Massachusetts, for example, are making smaller versions of antibodies using an established technique known as phage display. Smaller proteins potentially mean less trouble, because they minimize the chance that a nontarget protein will interact with the antibody's supporting structures, says Larry Cohen, CEO of Zyomyx, which is also working with both CAT and Dyax to develop protein chips using antibody fragments. Antibody fragments can also be produced far more quickly than full-sized antibodies by growing them on the surface of viruses that infect bacteria and make more copies. With this technique, CAT can screen about 20,000 different antibody fragments per month. The downside, however, is that in some cases these molecules don't bind as tightly to their protein targets as do full-sized antibodies.

    Phylos, a biotech company in Lexington, Massachusetts, has its own twist on the technology. Its founders developed a system to create libraries of small antibody-mimic proteins. These mimics are as easy to produce as antibody fragments made by phage display, and they are more stable, says Albert Collinson, who heads the company's business development. Phylos also has a scheme for arraying the capture proteins in high density and with a common orientation. Because of these advantages, “Phylos appears to have the most sophisticated protein capture technology,” according to the market research firm BioInsights' most recent review of the protein-chip field. Collinson says the company hopes to begin testing its chips for diagnostics and other uses early next year.

    View this table:

    But using a protein, antibody or otherwise, to capture another protein has its drawbacks. This approach makes it tricky to detect where target proteins bind on a chip: Both the capture molecules and the targets are proteins, so a simple protein-staining technique would light up each spot. That forces many companies to use more complex assays, such as creating fluorescent compounds that have to bind to target proteins to light up. SomaLogic's Gold says a better solution is changing the probe molecules laid down on the grid to “aptamers,” short stretches of nucleotides that can twist, fold, and bind to target molecules much like proteins do. A key advantage, Gold says, is that once an aptamer binds to a protein, researchers can forge a tight covalent bond by hitting it with ultraviolet light, allowing them to wash excess protein off the chip surface and scan for the tight binders that remain. SomaLogic, Gold says, doesn't plan to make chips itself but is in discussions with about 10 other companies that might market aptamer-based chips.

    For now, all of these approaches are having trouble getting out of first gear to make products that compete with rudimentary protein chips already on the market. In 1990, Biacore in Uppsala, Sweden, began introducing sensor chips that use a technique known as surface plasmon resonance to investigate which proteins interact and to monitor the speed of such reactions. Ciphergen Biosystems of Fremont, California, sells a chip that screens samples for the presence of up to eight different proteins. But with both chips, researchers can look at no more than a few different proteins at one time. Ciphergen president Bill Rich is quick to admit that most researchers want more and that these chips are just the earliest examples of what is to come.

    Which technology will prevail is unclear. But Zyomyx's Cohen says it's safe to assume that the nascent field will go through a shake-out in the next couple of years. Even with some success, protein chips will not match the complexity of DNA chips anytime soon, says Ruedi Aebersold, a proteomics expert at the Institute for Systems Biology in Seattle, Washington. He thinks companies will start with a limited approach, making chips to test for the presence of just tens to hundreds of proteins. Still, Aebersold and others believe even such modest gains could make the chips useful diagnostic tools. If so, protein chips could take an opposite course from that of DNA chips and be useful in the clinic long before they make a big impact in the research lab.


    Gene and Protein Patents Get Ready to Go Head to Head

    1. Robert F. Service

    Genomics companies thought they had genetic medicine to themselves. Now proteomics firms are staking a claim

    When dueling teams unveiled the near-complete human genome last February, among those cheering the loudest were companies racing to patent proteins.

    Humans, the sequencers told us, may have only 30,000 to 40,000 genes, far fewer than the previous estimate of 100,000. But with proteins, the more they look, the more they find: Researchers now believe that we have as many as 2 million. Not only does this finding demolish the dogma that each gene encodes a single protein, it also throws a wrench in the business strategy of many firms that have spent the past decade furiously locking up patents on key genes involved in disease. Those patents cover what were thought to be the single proteins those genes encode—which means that any other proteins the genes give rise to may be ripe for patent lawyers' pickings. “The patent game isn't closed by any means,” says Raj Parekh, chief scientist at Oxford GlycoSciences, a proteomics firm in the United Kingdom. That may be good news for protein-hunting companies like Oxford GlycoSciences, but it's likely to produce a confusing landscape of competing gene and protein patent claims, perhaps setting the stage for legal battles for control over the future of genetic medicine.

    Genomics powerhouses such as Human Genome Sciences (HGS) in Rockville, Maryland, and Incyte Genomics in Palo Alto, California, have collectively filed more than 25,000 DNA-based patent applications (a number that includes both full-length genes and gene fragments). If any pharmaceutical company wants to use a patented gene and protein to develop new drugs, the reasoning goes, it has to pay royalties. This strategy makes sense as long as one gene produces one messenger RNA (mRNA) that in turn codes for one protein, as the textbooks say. But genes clearly don't tell the whole story.

    Recent studies have revealed that cells often splice mRNAs together in a variety of ways to make different versions of a protein. These “splice variants” can perform separate functions in the body. One mRNA variant, for example, makes calcitonin—a hormone that increases calcium uptake in bones—whereas another creates calcitonin gene-related polypeptide, which prompts blood vessels to dilate. Furthermore, once these proteins are produced, cells can also tag them with small chemical groups that aren't coded for by genes. These small changes can also have big effects on a protein's function.
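    The splice-variant point above can be shown with a toy model. The exon sequences below are invented, and real calcitonin splicing involves tissue-specific machinery far beyond this sketch; the illustration is only that one gene's exons, joined in different combinations, yield distinct coding sequences and hence distinct proteins.

```python
# Toy alternative-splicing model (invented exon sequences).

exons = {1: "ATGGCT", 2: "GGTTCA", 3: "CCAGAA", 4: "TAA"}

def splice(exon_order):
    """Join the chosen exons into one mRNA-like coding sequence."""
    return "".join(exons[i] for i in exon_order)

# Two variants from the same gene, each skipping a different exon.
variant_a = splice([1, 2, 4])  # skips exon 3
variant_b = splice([1, 3, 4])  # skips exon 2
print(variant_a)  # → ATGGCTGGTTCATAA
print(variant_b)  # → ATGGCTCCAGAATAA
```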

    That means that a patent on a specific DNA sequence and the protein it produces may not cover some biologically important variants. “If you find a splice variant that is different at the protein level, you can patent that variant,” says Scott Brown, chief patent counsel at Millennium Pharmaceuticals in Cambridge, Massachusetts. John Doll, who heads biotechnology patents for the U.S. Patent and Trademark Office in Arlington, Virginia, says that the same holds true for patents on proteins modified by chemical groups. As long as these changes lead to proteins with new and unclaimed functions and uses, researchers can stake separate patent claims on them, he says.

    So far, genomics firms say they aren't too concerned that their gene patents will wind up being worthless. One reason is that “most of these splice variants don't have very different activity from the main protein,” says James Davis, general counsel for HGS. And if some variants do turn out to have critical functions, several genomics firms plan to be the first to find and patent them. HGS, Incyte, and Celera of Rockville, Maryland, are all building their own proteomics facilities to ensure that they find the most important protein variants linked to disease-related genes.

    Still, showdowns may be inevitable. Some companies will undoubtedly find novel protein variants that correlate better with disease than those another company claimed earlier in gene patents, leading to competing claims over very similar molecules.

    If that happens, “I think in the vast majority of cases, people will work out a deal” to cross-license each other's patents, says Davis, who notes that that's how microelectronics companies typically deal with competing claims. “Nobody likes litigation,” agrees Parekh. “Cross-licensing is far cheaper than going to court.”

    But Davis and Brown admit that gene and protein patents may well prove different. Microelectronics researchers can often engineer their way around using particular inventions. But that's not so easy for drugmakers, who target specific proteins. That gives pharmaceutical companies little choice but to use those proteins—and the genes that make them—in searching for new medicines. That may make gene and protein patent holders a little less willing to back away from a legal battle.


    Rockefeller's Star Lured to San Diego Company

    1. Eliot Marshall

    A crystallographer who leads a public consortium, Stephen Burley surprised colleagues by taking a private-sector job—and taking NIH funds with him

    Stephen Burley doesn't look like someone getting ready to leap into the jungle. His bow tie, polished manners, and British accent (a blend from Australia, Canada, and Oxford University) speak of prudence and deliberation. His record as a structural biologist—21 years devoted to measuring the precise shape of protein molecules—doesn't suggest risk-taking, either. But Burley has decided to plunge into a new career. In January, he will quit an endowed professorship at Rockefeller University in New York City, resign his appointment as a Howard Hughes Medical Institute investigator, and begin directing research at a small company in San Diego called Structural GenomiX (SGX). He's stepping into a biotech melee, helping a young company analyze proteins rapidly for drug development—and possibly for a profit.

    Many biologists have trodden the path to industry, but Burley's route is a little different. Unlike other university stars, Burley will not be joining the gray ranks of a pharmaceutical company. He is leaving the pinnacle of his field for a firm that's still scrambling to prove itself. And his switch from academia to industry raises questions about the propriety of mixing public and private funds and ways to ensure public access to key biological data.

    As one of Burley's colleagues says, he's heading into “a kind of East Coast-versus-West Coast battle” that's broken out in San Diego, pitting the cream of New York's crystallographers against California's. SGX, just 2 years old, is competing against several talented rivals, including one down the road called Syrrx. SGX was founded by top structural biologists Wayne Hendrickson and Barry Honig of Columbia University in New York City. Syrrx, also founded in 1999, includes among its partners and leaders structural biologist Ian Wilson of the Scripps Research Institute in La Jolla, California, and company co-founders Raymond Stevens of Scripps and Peter Schultz, formerly at the University of California, Berkeley, and now director of the Genomics Institute of the Novartis Research Foundation in San Diego. Both companies are specializing in automated, rapid determination of protein structures by x-ray crystallography.

    In transition.

    Stephen Burley is leaving Rockefeller after 11 years to direct research at Structural GenomiX.


    Academic peers say they're not surprised that Burley wants to work in industry; after all, companies can throw money and talent at problems to solve them in a hurry, whereas academics are limited by the grant system and university fiefdoms. But they are amazed that he will become an officer at a start-up company. “We were all surprised,” says Helen Berman, a structural biologist who runs the Protein Data Bank at Rutgers University in New Brunswick, New Jersey. (Burley chairs her advisory committee.) “Steve is one of the shining stars in structural biology,” she notes, marveling at how this will “change his whole life and career.” Lawrence Shapiro, a structural biologist at Mount Sinai School of Medicine in New York City who also consults for SGX, says: “Before this, we were betting that he would become the president of Rockefeller or director of the National Institutes of Health [NIH].”

    In addition to being a top biologist—known for his work on RNA transcription factors—Burley has also been a community leader, says protein modeler Tom Terwilliger of Los Alamos National Laboratory in New Mexico. “Steve was one of the people who got involved early” in an NIH plan to fund pilot projects in high-throughput protein crystallography (Science, 29 September 2000, p. 2254). The program, run by the National Institute of General Medical Sciences (NIGMS), funds nine teams, each of which will get on average $4.5 million per year for up to 5 years. The teams must make data public. One of the grantees is the New York Structural Genomics Consortium, which Burley helped form and for which he is the principal investigator (PI).

    That may be another reason why Burley's decision to move to San Diego startled people. The champion of high-volume analysis and public data release will now be answerable to investors who may not be enthusiastic about giving away protein structure information. Burley is trying to straddle the fence.

    X-ray vision.

    Burley stands behind a generator for x-ray crystallography.


    Although Burley will be an officer of SGX, he also plans to remain a PI on the grant to the New York consortium. “The scenario which I presented to NIGMS,” Burley said in a recent interview at Rockefeller, “is to do target selection within the academic enterprise in New York. The cloning, protein analysis, protein purification, crystallization, and x-ray measurements would all be done in the company using the robotic platform that already exists” in San Diego. When the x-ray data files are complete, they would be sent back to New York, according to Burley, where academic labs “would actually complete the structure determination and take responsibility for deposition of the atomic coordinates in the Protein Data Bank.” NIH would pay for the robotic work at SGX, but the “final stage of structure determination where the information becomes most sensitive” would remain in academia—and academia would control the intellectual property. The company would “work on its own targets” and “not on publicly funded targets,” Burley said. SGX president Tim Harris says this plan “doesn't faze me at all … I welcome it.”

    The deal has raised an eyebrow or two among Burley's academic peers, however. “It poses a challenge to the definition of conflict of interest,” says structural biologist Gaetano Montelione of Rutgers. He notes that other people seeking grants from NIGMS have had to comply with rules that didn't allow them to be company officers. Eaton Lattman of Johns Hopkins University in Baltimore, Maryland, a member of NIGMS's structural genomics advisory panel, says that in public-private ventures, “you want to be sure that the company doesn't eat the results.” He'd like to see a written version of the data-sharing plan. NIGMS staffers note that there is a precedent for company involvement: Syrrx is a “subcomponent” of a NIGMS grant for which Wilson is PI, under terms approved before the award was made. NIGMS director Marvin Cassman isn't commenting.

    Burley argues that the new arrangement should help the field because it will speed up protein-structure determination. The public will benefit by using the company's excess capacity. “Not only do I want this to be right, in terms of U.S. law,” Burley said, “I want this to look right.” He's waiting for Rockefeller to spell out the intellectual-property terms before sending the written plan to NIGMS.

    Given the resources already at his fingertips at Rockefeller, why would Burley want to entangle himself in such controversies—or in any industrial concerns, for that matter? Salary was not a major incentive: According to a New York City executive recruiter, Burley opted for equity holdings as his main remuneration and a salary of $200,000 to $400,000. The basic reason, Burley says, is that he wants to use structural biology to solve medical problems—especially to create new antibiotics and other drugs. Although such work is being done in academic labs, he says, “the scale is the problem.” He wants to move faster.

    “The aspect of structural genomics that most interests me,” Burley explains, is “trying to find small molecules that you can target to particular protein families and try to restrict binding as much as possible. This can only really be done in an industrial context,” because so many different resources must be brought to bear. The challenge, Burley says, is to find a compound that “binds to a target of interest and doesn't bind to everything else.” To do this—and not take a quarter-century—one needs lots of talent and computing power focused on candidate screening.

    SGX, Burley claims, has bought a computer platform that “rivals the one that Celera has.” It has also built a set of four beamline stations for crystal analysis at the Advanced Photon Source at Argonne National Laboratory in Illinois, the first of which will start operating this month. The company had already invested in protein structure modeling experts and software. The chance to coordinate all of this was an “unparalleled scientific opportunity,” Burley said.

    It is “a very good time for structural biologists,” says Rockefeller colleague Andrej Sali, who understands how friends like Burley can be lured into companies. But he still doesn't understand how they will make a profit. That's an entirely new puzzle that Burley and others will be trying to solve.


    A Physicist-Turned-Biologist

    Eliot Marshall

    Stephen Burley's switch from Rockefeller University to a start-up company (see main text) is not the first strategic shift in his career. After beginning in science as an undergraduate student of theoretical physics at the University of Western Ontario, he saw that biology held better prospects, and “crystallography looked like the obvious place to go.” He went to Oxford as a Rhodes Scholar, earning a D.Phil. in molecular biophysics under David Phillips in 1983. Then he enrolled in a Harvard-Massachusetts Institute of Technology health sciences and technology program, earning a Harvard M.D. in 1987. He quit the clinic but not his interest in medicine, returning to crystallography as a postdoc at Harvard until 1990, when Rockefeller hired him.

    TATA box-binding protein. CREDIT: COURTESY OF BURLEY LAB

    His best-known work may be the research with Robert G. Roeder and other colleagues at Rockefeller determining the structure of a molecule critical in DNA transcription, the TATA box-binding protein (see image). Lawrence Shapiro of Mount Sinai School of Medicine also credits Burley with pioneering two widely used techniques for analyzing proteins that are hard to crystallize.

    Recently, Burley has worked with computational biologist Andrej Sali at Rockefeller to apply modeling techniques to protein structures. Sali has developed a program called ModBase that predicts the structure of any protein based on sequence data; he makes it available free on his Web site. With Burley, he formed a company 2 years ago to exploit the technology; in 2001 it merged with Structural GenomiX.