News this Week

Science  19 May 2006:
Vol. 312, Issue 5776, pp. 980

    Prosecutors Allege Elaborate Deception and Missing Funds

    1. D. Yvette Wohn*,
    2. Dennis Normile
    * D. Yvette Wohn is a reporter in Seoul.

    SEOUL—Once-famed, now-disgraced stem cell pioneer Woo Suk Hwang was indicted on 12 May on charges of fraud, embezzlement, and violations of a bioethics law. Five other members of his team have also been indicted, three on fraud charges, one on a bioethics law violation, and one for destroying evidence and obstructing business operations. Hwang claims that he has been falsely accused on several points, according to Geon Haeng Lee, one of Hwang's seven lawyers.

    Hwang, formerly a professor at Seoul National University (SNU), had claimed in a 2004 Science paper (12 March 2004, p. 1669) to have made a breakthrough in so-called therapeutic cloning by creating a stem cell line from a cloned human blastocyst. He followed that up a year later with a second Science paper claiming to have created 11 stem cell lines derived from tissue contributed by patients suffering from spinal cord injury, diabetes, or an immune disorder (17 June 2005, p. 1777). Together, these papers seemed to pave the way toward creating replacement cells and tissues for these and other diseases that would be genetically matched to individual patients. Hwang was feted by scientists around the world and became a national hero in South Korea, which hoped to ride his achievements to worldwide prominence in stem cell research.

    Sweeping charges.

    In Gyu Lee of the Korean public prosecutor's office released a long-awaited report on the cloning scandal, indicting Woo Suk Hwang and five others on charges including fraud and embezzlement.


    The claims started unraveling last fall. First, bioethical lapses in collecting oocytes were alleged, then problems with manipulated photos and other supporting data were identified (Science, 23 December 2005, p. 1886). In January 2006, SNU announced that an investigating committee had concluded that no cloned stem cell lines existed. Hwang and his co-authors retracted both papers, and Seoul public prosecutors launched an investigation (Science, 6 January, p. 22).

    The prosecutors' conclusions are documented in a 150-page report that fills in some of the remaining holes in the Hwang saga. According to the prosecutors, Hwang and his team apparently believed that the “number 1” stem cell line that formed the basis for the 2004 Science paper was truly derived from a cloned blastocyst. Two separate investigations by SNU, however, concluded that the blastocyst most likely resulted from parthenogenesis, a form of asexual reproduction. The prosecutors' report leaves it up to academics to sort out whether the blastocyst was the result of cloning or parthenogenesis.

    However, the report says Hwang's team did not keep proper records and did not have evidence to support any scientific claims about stem cell line number 1. So, the prosecutors allege, Hwang ordered associates Jong Hyuk Park and Sun Jong Kim to fabricate photos, DNA test results, and other supporting data for the 2004 Science paper.

    For the June 2005 paper claiming the creation of 11 patient-specific cell lines, the report says that Kim, a member of the team from MizMedi Hospital in Seoul, was in charge of deriving stem cells from cloned blastocysts that had been created at the SNU lab. He was unable to do so. But, the report says, feeling pressure to perform and wanting to make a name for himself, he took stem cells derived from fertilized embryos in MizMedi's collection and mixed them with material from Hwang's lab. He reportedly told other researchers that light was “not good for the cells” and did most of the work in semidarkness. Prosecutors concluded that no one else in the lab, including Hwang, realized what had been done until suspicions were raised after the paper was published, when DNA fingerprinting tests in December 2005 showed that the customized stem cell lines were identical to the fertilized-embryo-derived lines from MizMedi.

    The report alleges that Kim created two lines, and Hwang, believing they were real, ordered him to fabricate data to make it look as though they had made 11. Kim was indicted for obstructing research work at SNU, as well as for destroying evidence. The prosecutors allege that, in addition to deleting related computer files from his laptop and computers at MizMedi, Kim told MizMedi researchers to hide the fact that he was removing stem cells from its labs.

    Although Kim allegedly deceived Hwang, the prosecutors say that Hwang was ultimately responsible for ordering subordinates to fabricate data. The prosecutors did not file any charges against Hwang for publishing fraudulent research reports, however, saying it would be a complicated procedure that would have to involve Science.

    The prosecutors confirmed earlier reports that Hwang had used many more oocytes than the several hundred he acknowledged, collecting 2236 eggs from 122 women, 71 of whom were compensated. Paying for oocytes continued even after a bioethics law banned the practice in January 2005, the prosecutors' report states.

    Meanwhile, in addition to research misconduct, the prosecutors claim Hwang misappropriated $2.99 million in state funds and private donations. Their report outlines an elaborate scheme in which Hwang withdrew large amounts of cash and carried it in bags to other banks to avoid a paper trail of bank transfers. The prosecutors say he had 63 accounts under different names, including those of junior researchers and relatives. To cover up some of the alleged embezzlement, he wrote false tax statements claiming to have bought pigs and cows for research purposes. Hwang faces up to 3 years in prison for violating the bioethics law and up to 10 years for the misuse of state funds.

    The prosecutors also indicted two of Hwang's colleagues at SNU, professors Byeong Chun Lee and Sung Keun Kang, for fraud. The report says the two provided false evidence in order to receive government grants and then misappropriated the money. SNU has begun taking steps to fire the two professors.

    Sang Sik Chang, head of the Hanna Women's Clinic in Seoul, which provided Hwang with eggs in 2005, was charged with violations of the bioethics law in connection with egg procurement. Hyun Soo Yoon, a professor of medicine at Hanyang University in Seoul, was indicted for creating false receipts and embezzling research funds approved for a joint research project to create stem cells at MizMedi.

    Sung Il Roh, director of MizMedi, who also gave oocytes to Hwang, was not indicted; prosecutors say Roh did not pay for any oocytes after the bioethics law went into effect. Shin Yong Moon, a stem cell specialist at SNU who was co-lead author with Hwang on the 2004 Science paper, was cleared of wrongdoing by the prosecutors.

    Hwang's lawyer, Lee, says Hwang maintains that he did not order junior researchers to fabricate data for the 2004 article and that he believed a member of his team had created the number 1 stem cell line from a blastocyst resulting from somatic cell nuclear transfer, not parthenogenesis. “Prosecutors based their conclusion on testimonies from Jong Hyuk Park and Sun Jong Kim and did not take into consideration Hwang's statements that he did not order them to fabricate data,” Lee says.

    Hwang's lawyer also denied that Hwang embezzled funds, saying that the scientist had made huge profits from lectures and publications, which amounted to about $840,000. That money was put into the same bank accounts as his grants, but items such as his wife's car were bought with those private earnings, he contends. He says Hwang's lawyers will fight the charges in court. The first trial is scheduled for 20 June.

    Meanwhile, the South Korean government says that it will try to retrieve the grant money given to Hwang and his lab at SNU. The Ministry of Science and Technology says, however, that about $3.2 million has already been spent on design and construction of a new research facility that was being built adjacent to the College of Veterinary Medicine; those funds will be considered losses. SNU has not yet decided what to do with the unfinished building.

    Hwang's supporters continue to urge Hwang to restart his research and the South Korean government to acquire a patent on the first stem cell line. “Hwang may have rushed to publish the 2005 article, but he should be acknowledged for creating the first stem cell line and cloning Snuppy” the dog, one supporter says. “We have to obtain a patent for the country's sake, not Hwang's.”

    Last weekend, hundreds of Hwang's supporters gathered in front of the prosecutors' office, protesting Hwang's indictment. Police sealed off access to the rooftops of nearby buildings to prevent the kind of suicide attempts that have occurred in the past. Before the indictment, the Venerable Seol, a Buddhist monk, announced on 8 May that three individuals had pledged to contribute $65 million to help Hwang, a fellow Buddhist, restart his research. After the prosecution's announcement on 12 May, several monks began a 24-hour relay bowing ritual next to Jogye Temple in central Seoul in support of Hwang.


    Looking beyond individual culpability, senior prosecutor In Gyu Lee said at a press briefing that he placed partial blame for the scandal on “the strict Korean lab culture,” which leaves junior researchers powerless to refuse unethical demands by lab heads. He added that although the scandal demonstrated that “a lot of scientists lacked ethics,” the fraud had also damaged many junior researchers and collaborators who had no idea what Hwang and his close associates were up to.

    South Korea's research community seems to be taking the lesson to heart, says Kye Seong Kim, a stem cell researcher at Hanyang University College of Medicine. He believes universities will now set up offices of research integrity. “That's one good thing that might come out of this tragedy,” he says. Others think reforms must go further. Duck Hwan Lee, a chemistry professor at Sogang University in Seoul, places partial blame on the government for pouring so much money into Hwang's project without sufficient information. “[The government] should create a system that enables more transparent research funding. Scientists should be able to compete for grants fairly instead of relying on lobbying or personal ties,” he says.


    Well-Balanced Panel to Tackle Algebra Reform

    1. Jeffrey Mervis

    Calling himself an “honest broker,” former University of Texas president and chemist Larry Faulkner has been named to chair a new presidentially appointed panel that will tackle the long-running debate over reforming U.S. mathematics education.

    The 17-member National Mathematics Advisory Panel is part of a proposed $250 million mathematics initiative by the Bush Administration.* The Math Now initiative, aimed at giving elementary school students a strong foundation in math and boosting the abilities of middle school students who have fallen behind (Science, 10 February, p. 762), puts special emphasis on algebra as the key to educational success. “The president wants the best advice on promoting student readiness for algebra and higher-level courses,” says Faulkner, who now heads the $1.6 billion Houston Endowment, a private philanthropy. “Algebra is a tremendously important gateway course, but our success rates are not very good.”

    Faulkner jokes that he was chosen “as someone with credentials in education and with the ability to massage egos.” The panel, which will begin meeting next week, includes several prominent players in the ongoing debate about what teachers and students need to know and whether those needs are met by the recent curricular reforms.

    Math mediator.

    Larry Faulkner hopes to reconcile the various views of panelists.


    The two professional mathematicians on the panel—Harvard University's Wilfried Schmid and Hung-Hsi Wu of the University of California, Berkeley—have been vocal critics of those reforms and have argued for more rigorous instruction on basic skills. Panelist Francis “Skip” Fennell is president of the National Council of Teachers of Mathematics, the nation's leading math education organization, which has championed many of those reforms, as has math educator Deborah Loewenberg Ball of the University of Michigan, Ann Arbor. But Ball and Schmid are also members of a group that has pushed to find common ground between the reformers and their critics (see p. 988). The panel's vice chair is Camilla Benbow, an educational psychologist at Vanderbilt University in Nashville, Tennessee, who co-directs a longitudinal study of gifted math students.

    Education Secretary Margaret Spellings says she hopes the panel's initial recommendations, due to her in January 2007, will help U.S. teachers “know what's most effective in the classroom.” The commission also has the authority to order research on related topics before submitting its final report in February 2008. Although Faulkner doesn't rule out that possibility, he says “I think quite a lot of work has already been done.”


    PTO Wants to Tap Experts to Help Patent Examiners

    1. Eli Kintisch

    Think someone's trying to patent an old idea? The U.S. Patent and Trademark Office (PTO) may want you to chime in.

    The patent office is weighing an online pilot project to solicit public input on patent applications. Speaking last week at an open forum, officials said that tapping into the expertise of outside scientists, lawyers, and laypeople would improve the quality of patents—and might also reduce a backlog that this month topped 1 million applications. “Instead of one examiner, what if you have thousands of examiners reading an application?” says Beth Simone Noveck of New York University Law School, who is an independent advocate of the idea.


    The peer initiative focuses on so-called prior art, the scientific papers and previous patents that could render claims invalid. Although applicants often flood PTO with supporting material, PTO's 4500 examiners are prohibited from consulting with outsiders about its relevance. (The law does allow outsiders to pay $180 to submit up to 10 pieces of prior art, but comments are barred to avoid the appearance of meddling.) IBM is a firm supporter of the pilot system, and PTO officials hint that software and microchip patents will be one area of focus. Former examiner Leon Radomsky says outside experts would “definitely help” those areas given the dearth of outside prior-art resources, although supporters feel that the pilot could also benefit biotechnology and the chemical sector.

    Although the pilot is tentatively set to begin in December, details remain sketchy. The idea is for volunteers to be alerted about new patent applications—applications become public after 18 months—and invited to submit prior art. The community would then rank one another's suggestions, à la the geek-news site Slashdot. Theoretically, says PTO official Jay Lucas, the process would generate a list of, say, 10 pieces of prior art that the examiner would do well to consult. Outsiders might also help examiners with another element of their job, namely, ruling on the tricky question of whether a proposed invention is obvious.
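    In outline, what Lucas describes is a submit-vote-shortlist pipeline. The sketch below is purely illustrative; the class name, voting mechanics, and shortlist size are assumptions, since PTO has published no specification:

        from collections import defaultdict

        class PriorArtQueue:
            """Hypothetical sketch of the proposed community review:
            volunteers submit prior-art references for a published
            application, peers vote on them, and the best-supported
            few are forwarded to the examiner."""

            def __init__(self, shortlist_size=10):
                self.shortlist_size = shortlist_size
                self.votes = defaultdict(int)  # reference -> net community vote

            def submit(self, reference):
                # Register a newly suggested reference with zero votes.
                self.votes.setdefault(reference, 0)

            def vote(self, reference, up=True):
                self.votes[reference] += 1 if up else -1

            def shortlist(self):
                # Hand the examiner the N best-supported references.
                ranked = sorted(self.votes.items(), key=lambda kv: kv[1], reverse=True)
                return [ref for ref, _ in ranked[:self.shortlist_size]]

    A deployed system would of course need reviewer identities, audit trails, and defenses against strategic voting on top of this core.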

    Some observers worry that the system will simply add to an already heavy workload for examiners. Others speculate that a competitor, assuming that an applicant would be awarded a patent, might try to game the system by not introducing some prior art until it could be used for maximum leverage as part of a later challenge to the patent. And some think PTO's problems lie elsewhere. Former patent examiner Charles Wieland III, an attorney with Buchanan Ingersoll PC, says PTO should “just let examiners develop their expertise.” Inexperience is the “real problem” at PTO, he adds.

    A decision on launching the project is expected this summer.


    How the Hobbit Shrugged: Tiny Hominid's Story Takes New Turn

    1. Elizabeth Culotta

    SAN JUAN, PUERTO RICO—The strangest ancient humans may be Indonesia's “hobbits,” the 1-meter-tall people who made stone tools and hunted dwarf elephants 18,000 years ago. When announced 2 years ago, the fossils from the island of Flores seemed almost too bizarre for fiction. Now, close-up looks at some of the bones have given the hobbits' saga even more odd twists.

    At a recent meeting here,* two anatomists presented analyses suggesting that the original hobbit skeleton may not be female, as first described, and that its shoulders differ from those of modern people and hark back to an ancient human ancestor, Homo erectus. That detail and others bolster the notion that an H. erectus population on the island evolved into the dwarf form of H. floresiensis, anatomist Susan Larson of Stony Brook University in New York said in her talk at the meeting.

    Other researchers' opinions about almost every aspect of the hobbits, however, continue to run the gamut. Many are impressed with Larson's analysis. “I support Larson's observations … [and see] evidence of a faint phylogenetic signal” connecting the finds with H. erectus, says paleoanthropologist Russell Ciochon of the University of Iowa in Iowa City, who calls the skeleton from Flores “a very important link to our past.” But a few researchers still find the whole tale too tall to swallow. In a Technical Comment published online this week by Science, paleoanthropologist Robert D. Martin of the Field Museum in Chicago, Illinois, and colleagues argue that the single skull is that of a modern human suffering from microcephaly (see sidebar). And even some researchers who are reasonably convinced that the fossils do not represent diseased modern people caution that the sample size for the shoulder bones is one. “It's always nicer to have more than one individual” to hang a hypothesis on, says Eric Delson of Lehman College, City University of New York.


    Details of the Homo floresiensis skeleton suggest that it may be descended from H. erectus.


    At the meeting, a packed room listened intently as Larson described her work on the upper arm bone, or humerus, of the original skeleton, labeled LB1 as the first human from Liang Bua cave. The LB1 humerus is peculiar—or, rather, it lacks a peculiarity shared by living people.

    In modern humans, the top or head of the humerus is twisted with respect to the elbow joint by about 145 to 165 degrees. As a result, when you stand straight, the insides of your elbows face slightly forward, allowing you to bend your elbows and work with your hands in front of your body.

    But in H. floresiensis, the humerus appeared only slightly twisted. Last fall, Michael Morwood of the University of New England in Armidale, Australia, co-discoverer of the Flores bones, asked Larson, known for her work on the upper arm, how this could work in a toolmaking hominid. “I told him I didn't know,” says Larson. “It wouldn't work.”

    So at the invitation of Morwood and Tony Djubiantono of the Indonesian Centre for Archaeology in Jakarta, Larson flew to Jakarta last fall to study the bones with her Stony Brook colleague William Jungers, who was to work on the lower limbs. The pair are among the handful of researchers who have studied the original specimens.

    Larson found that the LB1 humeral head was in fact rotated only about 110 degrees. (No rotation would be expressed as 90 degrees.) Curious, she examined LB1's broken collarbone plus a shoulder blade from another individual.

    Larson concluded that the upper arm and shoulder were oriented slightly differently in H. floresiensis than in living people. The shoulder blade was shrugged slightly forward, changing its articulation with the humerus and allowing the small humans to bend their elbows and work with their hands as we do. This slightly hunched posture would not have hampered the little people, except when it came to making long overhand throws: They would have been bad baseball pitchers, says Larson.

    When Larson looked at other human fossils for comparison, she found another surprise: The only H. erectus skeleton known, the 1.55-million-year-old “Nariokotome boy” from Kenya, also has a relatively untwisted humerus, a feature not previously noted. Larson concluded that the evolution of the modern shoulder was a two-stage process and that H. erectus and H. floresiensis preserved the first step.

    H. erectus expert G. Philip Rightmire of Binghamton University in New York, who works on fossils from Dmanisi, Georgia, supports this view. Larson's and Jungers's analyses “make it clearer and clearer that Homo floresiensis is not some sort of dwarf modern human. This is a different species from us,” he says.

    In a separate talk, Jungers reported more unexpected findings. He was able to reconstruct the pelvis, which had been broken when the bones were moved to a competing lab in Indonesia (Science, 25 March 2005, p. 1848). Although previous publications had described the pelvis as similar to those of the much more primitive australopithecines, Jungers found that the orientation of the pelvic blades is modern. The observation adds weight to the notion that hobbits had H. erectus, rather than australopithecine, ancestry.

    The skeleton was first described as female, although the competing Indonesian-Australian team described it as male in press accounts. Now Jungers says he is “agnostic” about its sex. He notes that limb bones from other individuals from Liang Bua are even smaller—“they make LB1 look like the Hulk,” he says—raising the possibility that males and females differed in size, with LB1 in the role of big male.

    More surprises are still to come. Jungers said in his talk that LB1 includes an essentially complete foot, something not identified previously, and hinted that the foot is extremely large. Indonesia's hobbits, like J. R. R. Tolkien's fictional creatures, may have trekked about on big hairy feet.

    • * Paleoanthropology Society, 24–26 April.


    But Is It Pathological?

    1. Elizabeth Culotta

    Even as some researchers draw inferences about the ancestry of Homo floresiensis (see main text), others remain convinced that the bizarre bones from the Indonesian island of Flores are nothing more than diseased modern humans. In a Technical Comment published online by Science this week, paleoanthropologist Robert D. Martin of the Field Museum of Natural History in Chicago, Illinois, and colleagues make that case.

    Martin gathered scaling data on the brains and bodies of other mammals, including data on the proportions of elephants as they evolved into dwarf forms on islands. Using several possible scaling models, he argues that shrinking a H. erectus brain to roughly the size of the Liang Bua skull would yield a body size no greater than 11 kilograms—the size of a small monkey.
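    The arithmetic behind that figure is ordinary allometric scaling. In a hedged, round-number version (the exponent and masses below are chosen for illustration, not Martin's fitted values), brain mass E and body mass M are tied by a power law, so the dwarf's body mass follows from the brain-size reduction:

        E = c\,M^{\alpha} \quad\Longrightarrow\quad \frac{M_{\mathrm{dwarf}}}{M_{\mathrm{ancestor}}} = \left(\frac{E_{\mathrm{dwarf}}}{E_{\mathrm{ancestor}}}\right)^{1/\alpha}

    Taking an ancestral H. erectus with a roughly 1000-cubic-centimeter brain and a 55-kilogram body, a Liang Bua-sized brain of about 400 cubic centimeters, and a dwarfing exponent of α = 0.5 gives (0.4)² = 0.16, or a body of about 9 kilograms, the same order as Martin's 11-kilogram ceiling. Shallower exponents push the predicted body size lower still.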

    If the Liang Bua bones aren't a new species of human, what are they? Martin argues that the single tiny skull may be a modern human with microcephaly, or a pathologically small head. A previous Science paper by Dean Falk of Florida State University in Tallahassee and her colleagues argued that the Liang Bua skull did not show the extreme pathology seen in a microcephalic brain. But Martin counters that some microcephalic brains exhibit much less pathology, including one from a 32-year-old woman reported to have had the body size of a 12-year-old child. “I'm not saying I'm 100% certain it's microcephaly,” says Martin. “I'm saying that that brain size is simply too small” to be normal.

    Jean-Jacques Hublin of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, who has seen the original specimens, finds the scaling arguments “quite convincing.” But Martin's arguments are provoking a sharp response. Falk calls Martin's claims “unsubstantiated assertions” and adds that her team is surveying microcephalics to learn more. And bones from several small individuals have now been recovered from Flores, notes William Jungers of Stony Brook University in New York. He says that Martin's explanation implies that the island was home to “a village of microcephalic idiots.” He adds that “there are precious few ‘scaling laws’ out there” and that examples of unusual scaling are not unexpected.

    Paleoanthropologist Ralph Holloway of Columbia University, who is also studying microcephalic brains, says that so far he sees some differences between the Liang Bua skull and what's called primary microcephaly. But he warns that it will take a substantial survey to be sure. “I am coming around to believing that it isn't primary microcephaly,” he says. But “I certainly would not rule out pathology just yet.”


    U.K. Embryos May Be Screened for Cancer Risk

    1. Laura Blackburn*
    * With reporting by Jocelyn Kaiser in Washington, D.C.

    CAMBRIDGE, U.K.—In vitro fertilization patients will be able to use genetic testing to avoid having children with mutations in genes such as BRCA1 and BRCA2 that raise cancer risks, the U.K. Human Fertilisation and Embryology Authority (HFEA) ruled last week. The decision, which follows a public consultation, breaks new ground because it permits screening for genes that are worrisome but not necessarily lethal or likely to produce trauma in childhood. The medical community is generally supportive, but critics are concerned that the decision could lead to screening for less risky traits in the future.

    New frontier.

    Fertility clinics will be allowed to test embryos before implantation for mutations in genes such as BRCA1 and BRCA2 and to reject those that carry them.


    Ten clinics in the United Kingdom are currently licensed to carry out preimplantation genetic diagnosis (PGD), in which one or two cells are removed from the embryo at the eight-cell stage and tested for lethal genetic conditions such as cystic fibrosis or Huntington's disease. HFEA chair Suzi Leather said on 10 May in a prepared statement that the authority's decision is “not about opening the door to wholesale genetic testing.” Rather, genetic tests would be available to the minority of people with a clear history of cancer in the family. HFEA will consider applications for testing on a case-by-case basis, she says, weighing factors such as family medical history and whether the condition is treatable.

    Like many others in the medical community, Simon Fishel, managing director of CARE Nottingham, a U.K. clinic licensed to perform PGD, described the decision as “ethically sound.” He predicts that only a very small proportion of clients will elect to use the tests. Cost will also limit take-up: Depending on how much the government contributes, patients could be left with a bill of $10,000.

    But for some, the U.K. decision raises troubling questions. “I'm not entirely comfortable because of the concerns about the whole spectrum, from very severe diseases to what are essentially traits,” says Francis Collins, director of the U.S. National Human Genome Research Institute in Bethesda, Maryland. “There is no bright line along that spectrum.” What is most worrying, he says, is that embryo screening is not regulated in the United States, and no one is sure how widespread testing is.

    Some U.K. lobby groups and disability campaigners oppose the policy outright, however, saying it smacks of eugenics. “We are concerned that people are eliminating embryos, whether they have cancer or not,” says Josephine Quintavalle of the U.K. lobby group Comment on Reproductive Ethics. Quintavalle argues that research efforts should be concentrated on cancer cures, not destroying affected embryos. “We are concerned that people will view PGD as a cure for cancer,” she says.


    Genomes Throw Kinks in Timing of Chimp-Human Split

    1. Elizabeth Pennisi*
    * With reporting by Ann Gibbons.

    A new genomic analysis has added a provocative twist to the history of humans. After comparing the genomes of five primate species, researchers have concluded that the ancestors of chimps and humans went their separate ways about 6 million years ago—at least a million years later than fossils suggest. But that's not even the most controversial claim: Early hominids interbred with their chimp cousins, says David Reich, a geneticist at Harvard Medical School in Boston. This hybridization helped make the human genome a mosaic of DNA with varying degrees of similarity to the chimp genome, he and his colleagues report in a paper published online on 17 May by Nature.

    Human roots.

    New DNA studies challenge the hominid status of the 7-million-year-old Toumaï fossil (bottom) by suggesting that humans (top) and chimps (middle) diverged much more recently.


    Researchers are impressed by the huge amount of data Reich, Nick Patterson of the Broad Institute in Cambridge, Massachusetts, and their colleagues incorporated into their study. “The paper showed that the comparative genomic approach is very powerful,” says geneticist Hideki Innan of the University of Texas Health Science Center in Houston. But some, particularly paleontologists whose fossils suddenly might become too old to be hominids, are more critical. Martin Pickford of the Collège de France in Paris predicts that the work will be “of passing significance.”

    For decades, anthropologists have argued about the timing of the chimp-human split, with estimates ranging from 10 million to 5 million years ago. The oldest fossil put forth as a human ancestor is a spectacular skull unearthed in Chad in 2002, nicknamed Toumaï. It dates back 7 million years, says co-discoverer Michel Brunet of the University of Poitiers, France. Two other hominid species were alive in Kenya and Ethiopia 5.8 million to 6 million years ago, according to other fossils.

    This fossil record doesn't neatly fit with the new findings by Reich's team. They matched up DNA sequences from the human, chimp, orangutan, macaque, and gorilla genomes and documented the differences. Having DNA from the orangutan, and from an even less related species, the macaque, allowed the group to confirm that mutations accumulated at about the same rate in different lineages of apes and humans. This meant that the number of differences in each lineage could be compared directly and used reliably to calculate how long the branches between apes and humans on the tree should be.
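    The underlying arithmetic is the standard molecular clock. As a rough illustration with textbook round numbers (not the paper's calibrated values):

        T \approx \frac{d}{2\mu}

    where d is the fraction of sites that differ between two species and μ is the substitution rate per site per year; the factor of 2 counts mutations accumulating along both branches since the split. Plugging in the oft-quoted roughly 1.2% human-chimp difference (d = 0.012) and μ = 10⁻⁹ gives T ≈ 6 million years, in line with the team's estimate.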

    The sequence comparisons provided relative “genetic” ages of the five species, and based on the ages of fossils of the ancestors of orangutans and macaques, the investigators concluded that the human lineage split from chimps no more than 6.3 million years ago and perhaps even more recently than 5.4 million years ago. That timing roughly agrees with another genetic analysis, reported in December 2005, by Blair Hedges, an evolutionary biologist at Pennsylvania State University in State College. “Together, they make a strong argument against the claims of older divergence times by paleontologists and other molecular evolutionists,” says Hedges.

    Brunet counters that it's too early to rewrite human history based on the DNA data. “Their explanation is just a hypothesis, while Toumaï is a true fossil,” he says. Also, the difference between the dates from the molecular analyses and the age of the Chad fossil may not be significant. “There are broad confidence limits on genetic data,” says Montgomery Slatkin, a population geneticist at the University of California, Berkeley.

    But no matter when hominid speciation occurred, the genetic analysis revealed that the transition wasn't very smooth. By comparing discrete sections of the primate genomes, Reich's team was able to calculate at least a 4-million-year difference in the ages of the oldest and youngest parts of the human genome. The X chromosome's age was most surprising. Chimp and human X chromosomes are much more similar than are the rest of their chromosomes, says Reich. Based on this congruency, he and his colleagues calculate that the X chromosomes became species-specific 1.2 million years after the rest of the genomes.

    To explain this oddity, Reich proposes that after evolving their separate ways for an unknown length of time, the earliest hominids and chimps hybridized. To be fertile, the hybrids had to have compatible X chromosomes, and thus there was intense selection to weed out any differences on that chromosome. Only after hybridization ceased did the X chromosome evolve into two different ones again.

    Innan's analysis of just human and chimp DNA, published earlier this month in Molecular Biology and Evolution, supports the idea of hybridization between chimp and human ancestors. Still, Reich's theory is getting a tough reception. “I don't buy these hybrids,” says Harvard anthropologist David Pilbeam, arguing that the ancestors of hominid and chimp were too different, morphologically and developmentally, to produce fertile offspring.

    As more primate genomes are sequenced, the history of the X chromosome should become clearer, says Reich. Whether chimp ancestors interbred with human ancestors or not, notes Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, comparative genomics “tells us … things that paleontology can't.”


    RU-486-Linked Deaths Open Debate About Risky Bacteria

    1. Jennifer Couzin

    ATLANTA, GEORGIA—Government officials and scientists convened last week to address troubling questions about two deadly types of bacterial infections that may be growing more common. One pathogen, Clostridium sordellii, has drawn intense political and scientific interest after being linked to deaths in young women following medical abortions, most with the abortion pill RU-486. The other, its cousin Clostridium difficile, is a growing scourge in hospitals.

    The meeting, held at the Centers for Disease Control and Prevention (CDC) here, was called three months ago largely because of the abortion-associated deaths, which then stood at five and are now thought to number seven. But it turned into a much broader, handwringing discussion over how much remains to be learned about both types of Clostridium.


    Clostridium difficile bacteria (green-white), attached here to human intestinal tissue, are making more and more people sick.


    C. difficile, which ravages the colon, has killed hundreds of hospital patients since 2000 and is increasingly showing up in healthy people and in animals. That's led to some concern about transmission through the food chain. Indeed, seven C. difficile patients appear to harbor animal strains of the bacterium, Clifford McDonald, a medical epidemiologist at CDC, announced. “We are a little disturbed” by that, he says.

    Rates of C. difficile infections have soared recently, doubling in U.S. hospitals between 2000 and 2003 and jumping another 25% in 2004. In the United Kingdom, the disease rate leapt from 1 in 100,000 people to 22 in 100,000 over 10 years. The mortality rate also appears to be increasing, from about 1% to almost 7% in some cases, such as an epidemic in Quebec hospitals in Canada 2 years ago.

    Typically, C. difficile sickens hospital patients who have taken antibiotics, although how the drugs predispose patients to the germ is “pretty much a black box,” says Ciarán Kelly, a gastroenterologist at Harvard's Beth Israel Deaconess Medical Center in Boston. The bacterium is showing up more and more outside hospitals and among people with no recent antibiotic exposure. Analyzing the responsible strains, says McDonald, could sort out whether C. difficile has become food-borne. It could also determine whether the bacterium has mutated. In December, scientists described in the New England Journal of Medicine a novel strain of C. difficile that may churn out more toxin.

    A better grasp of the bacterium's basic biology could offer clues to preventing and treating infections, but working with the microbe is a challenge. C. difficile, like C. sordellii, is difficult to manipulate genetically. And some work suggests that the bacterium behaves differently in humans than in animals, implying that animal models may be misleading, says Kelly.

    C. sordellii is even less well understood. It's not clear what predisposes people to an infection, and unlike C. difficile, which may respond to antibiotics, C. sordellii infections are rarely treatable. When the U.S. Food and Drug Administration (FDA) approved RU-486 in 2000, it urged that possible adverse events be reported to the agency, and it was those reports, of a handful of women succumbing within hours or days to a terrifying infection, that first alerted health officials. The overall risk of death from these infections has been estimated at about 1 in 100,000.

    Some have speculated that vaginal rather than oral administration of misoprostol—a drug that acts with RU-486 to induce an abortion—was a factor in the deaths. But the meeting underscored that “this is a far more complex medical and epidemiologic situation than originally appeared to be the case,” says Sandra Kweder, deputy director of FDA's Office of New Drugs.

    CDC's Marc Fischer detailed 10 fatal cases of C. sordellii genital tract infection from 1977 to 2001, which the agency found by combing through old records. Eight cases occurred after women gave birth; one followed a miscarriage; and the last was not associated with pregnancy. McDonald presented four additional cases CDC is investigating, three of which are thought to have followed nonsurgical abortions and a fourth following a miscarriage. To make matters more confusing, two of those new cases involve not C. sordellii but a third member of the clostridium family, C. perfringens. “If you look at the presentation of these illnesses, they always come after delivery, after miscarriage, after the passage of abortion,” says David Soper, an obstetrician-gynecologist at the Medical University of South Carolina in Charleston. “Does pregnancy hold the key?”

    Esther Sternberg of the National Institute of Mental Health in Bethesda, Maryland, has found that C. sordellii toxins disrupt hormone receptors for glucocorticoids, which may predispose women to an excessive inflammatory response in the presence of the bacteria. Fischer noted that CDC is limited in its ability to track down old C. sordellii cases. It is asking physicians to report suspicious deaths following pregnancy or miscarriage.

    Companies are now developing vaccines, as well as drugs that bind to C. difficile's toxins. A National Institutes of Health (NIH) official at the meeting urged attendees to submit Clostridium research proposals. Meanwhile, CDC, FDA, and NIH plan to identify research priorities in the field. So far, FDA has given no indication that it will change how RU-486 is marketed.


    Invention of China's Homegrown DSP Chip Dismissed as a Hoax

    1. Hao Xin

    In a major embarrassment for China's national electronics R&D program, an inventor's claim to have created a series of homegrown computer chips has been declared a fraud. After a months-long investigation, Shanghai Jiao Tong University (SJTU) announced on 12 May that it found “serious falsification and deception in the research and development of the Hanxin series of chips led by [SJTU dean] Chen Jin.” The university announced that Chen had been dismissed. Chen did not respond to telephone or e-mail messages.

    Chen won national acclaim in February 2003 when he unveiled what he described as the first digital signal processor (DSP) chip designed and manufactured in China, called Hanxin-1 or “Chinese chip.” He quickly followed with two improved designs and promised a fourth and fifth generation with both a DSP and a central processing unit. The 37-year-old inventor built his career on the ambition, as he told a reporter, “to put the label ‘Made in China’ on high-end computer chips.”

    With a 1998 Ph.D. in computer engineering from the University of Texas, Austin, Chen spent a short stint as a test engineer at Motorola Semiconductor Product Sector in Texas, now called Freescale Semiconductor, and returned to China in 2000. At SJTU, Chen set out to reclaim China's DSP market. In less than 2 years, he managed to set up an integrated-circuit design lab and had a product ready.

    Government and academic leaders embraced the inventions. Chen was appointed dean of SJTU's newly established School of Microelectronics; he founded the company SJTU HISYS Technology Ltd. and became its CEO. More than $7 million in public R&D funds poured in. The Shanghai government named Chen CEO of Shanghai Silicon Intellectual Property Exchange, a platform established in 2003 with $3.75 million in municipal funds for trading semiconductor rights.

    But on 17 January, an anonymous posting on a Chinese Web site presented evidence alleging that the project was a fraud. The tipster claimed that Chen had purchased 10 Motorola DSP chips in August 2002 and had the original logo sanded off and replaced with HISYS and SJTU labels. According to the allegations, Chen promoted the chips as his Hanxin-1 design and later passed off other derivative products as his own inventions.

    HISYS Technology issued a statement on 21 January calling the allegations “pure fabrication.” However, 5 days later, SJTU issued a statement expressing concern over the alleged fraud and announcing that the university had asked national ministries and the Shanghai government to help investigate.

    The investigation was organized by China's Ministry of Science and Technology (MOST)—a major investor in the project—the Ministry of Education, and the Shanghai government. An expert team interviewed Chen, the still-anonymous Internet tipster or tipsters, and others. It inspected and compared technical documents on site and checked the design and process specifications of Hanxin chips 1 through 4.

    Before the fall.

    Microelectronics wizard Chen Jin with his chip at a press conference in February 2003.


    Last week, SJTU released the team's findings: The device Chen had displayed as Hanxin-1 at a press conference in 2003 was not the one that had been submitted for evaluation; instead, Chen substituted another chip that his lab did not design. SJTU's report also said that Chen did not own the “core technology” of other chips that he claimed. Chen, the report said, “used false results to cheat evaluation experts, Shanghai Jiao Tong University, his research team, local government, ministries of the central government, as well as the media and the public,” but the report does not say how the evaluation experts were cheated. MOST terminated Chen's ministry-funded projects and asked him to return the research funds.

    Chen appears to be moving on to other ventures. At a low-key news conference last month, he announced that HISYS Technology—now severed from the university—is forming an alliance with Skyworks Shanghai to develop products for the mobile phone market.


    Finding Common Ground in the U.S. Math Wars

    1. Jeffrey Mervis

    For years, mathematicians and math educators have blamed one another for the inadequacies of U.S. mathematics education. But both sides may finally be headed toward agreement on how to fix the system.


    Like a sheriff summoned to restore order to a lawless town in the Wild West, Richard Schaar knew that taking on the Math Wars would be a rough assignment. An applied mathematician and former president of the calculator division at Texas Instruments (TI), Schaar was part of an industry-led panel trying to improve U.S. science and math education a few years back when he realized that a huge schism in the community would likely block any effort to reform elementary and secondary school mathematics.

    “I hate labels, but in general the professional mathematicians were on one side, and the math educators were on the other,” says Schaar, describing a debate, triggered by a huge backlash to a 1990s reform movement, that has persisted despite mounting concern about how poorly U.S. students fare in international comparisons. “The argument over direct instruction versus discovery learning, as the two sides are commonly described, was pulling the field apart. The mutual respect had gone away. And in that climate, any attempt to improve math standards at the state level would have been doomed to failure.”

    The solution seemed obvious to him: Bring together a handful of top guns from each side and hope for harmony rather than bloodshed. And that's exactly what Schaar has done, in the Common Ground initiative. The six-member group has made modest but impressive progress over the past 18 months in finding agreement on issues that for the last decade have led mathematicians and math educators, in the words of one mathematics society executive, “to sit on the sidelines and lob bombs at each other.” (To be fair, both sides claim to be appalled by the analogy to warfare. But they use combat imagery repeatedly in conversations as a shorthand to describe their experiences.)

    The Common Ground initiative is one of several hopeful signs that the two sides may be ready to call a truce and work together to improve U.S. mathematics education. Last month, the country's largest group of mathematics educators, the National Council of Teachers of Mathematics (NCTM), endorsed a short list of math skills, by grade, that every elementary and middle school student needs to master. These skills, called Curriculum Focal Points, are an attempt to correct what math educators decry as “mile-wide, inch-deep” curricula in most U.S. schools that leave many students unprepared for high school and, ultimately, preclude them from pursuing careers in science and engineering. This week, the Department of Education named mathematicians, educators, and community leaders to a presidential panel that will review the state of mathematics education (see p. 982). Observers are hopeful that the easing of tensions will improve the quality of the panel's recommendations on bread-and-butter issues such as student instruction, teacher training, and the additional research needed to enhance each area, not to mention make those recommendations easier to sell.

    “I think Common Ground is a historic and groundbreaking exercise,” says Francis “Skip” Fennell, a mathematics education professor at McDaniel College in Westminster, Maryland, and NCTM president. “I worked in the education directorate at NSF [National Science Foundation] in the late 1990s, and I was blown away by the anger in the community. This is exactly what we need to get things moving forward.”

    All for algorithms

    Professional mathematicians blame themselves for some of those angry words. They were heavily involved in a major reform of the U.S. mathematics curriculum in the 1960s, after Sputnik, that was widely criticized as too difficult for the average student. In response, mathematicians largely withdrew from the fray and were silent when math educators promulgated the next round of reforms in response to a 1983 report that said low student achievement in reading and math was putting the country at risk. “There's been a divide between education and subject matter fields for a long time, but it's had its worst consequences in math,” notes Roger Howe, a Yale University mathematician who has thought hard about the mathematical foundations of elementary principles such as place value. And when the mathematicians belatedly discovered aspects of the new courses that they didn't like, they unleashed their wrath upon federal officials and math educators, castigating them at every opportunity for demanding too little of students and watering down their discipline.

    Given the rancorous tone of the debate, Schaar knew that he needed to sign up leading figures from both sides. He spent a year picking his team: two mathematics professors who have been sharp, public critics of the reform curricula (R. James Milgram of Stanford University in Palo Alto, California, and Harvard University's Wilfried Schmid) and three math educators in the forefront of those reforms (Deborah Loewenberg Ball of the University of Michigan, Ann Arbor; Joan Ferrini-Mundy of Michigan State University in East Lansing; and Jeremy Kilpatrick of the University of Georgia, Athens). In December 2004, the same month he retired from TI, Schaar convened the first meeting of the Common Ground initiative, with himself as facilitator.

    Six months and six meetings later, the group issued a three-page document describing a handful of principles that should guide math education from kindergarten through high school. The principles include the automatic recall of basic facts, the importance of abstract reasoning, the need to acquire a mastery of key algorithms, and the judicious use of calculators and real-world problems. Two months ago, an expanded group met for a weekend to tackle the topics in greater detail, and last week, initial working papers from that meeting were posted. The core group met again last weekend to plot its next steps, as well as to clarify its earlier statement about setting high expectations for students—one that's been misinterpreted as an argument for making calculus a required course in high school.

    The document doesn't say when or how any of the concepts should be taught. Common Ground is not a curriculum, Schaar points out. The most its participants can hope to achieve is to influence the process by which states develop standards, adopt textbooks, and develop the assessment tools to measure what students should be learning. Even so, their carefully worded statements on selected topics reflect hard-fought compromises on core issues that have roiled the community for more than a decade and that, once resolved, could pave the way for continued progress.

    “There will always be differences,” says Milgram, who in 2000 testified before Congress that “the sad state of U.S. mathematics education” is the result of “a constructivist philosophy” promoted by NCTM standards and endorsed by NSF and the Department of Education, the two leading federal sources of support for teaching mathematics. “But if we can agree on the essential content that students need to know, then the other fights become manageable. And I'd say that there has been far more agreement than disagreement.”

    Ball, who has done pioneering work on what math teachers need to know to do their jobs well (i.e., not just how to teach long division but also to understand why Susie's method is incorrect), believes that the process has been just as important as the product. “Our goal was to provide leadership to the field, to say to everybody: ‘If we can do it, then the rest of you can, too.’ And I think we've shown that it's possible to come together on many of the flash points.”


    One major flash point is the use of algorithms—how to do long division, for example—and the memorization of the facts upon which they are based. Many mathematicians maintain that current state standards and instructional materials downplay the use of such time-tested algorithms or allow students to bypass them entirely by using calculators. So when Common Ground asserts that “students should be able to use the basic algorithms of whole number arithmetic fluently, and they should understand how and why the algorithms work,” the participants are trying to stitch up a vast rift in the community.

    “Of course kids have to know how to compute and know their basic facts. But they also have to make sense of what they are being taught and explore the ideas with open-ended problems,” says Sybilla Beckmann Kazez, a mathematician at the University of Georgia, Athens, who is well respected by both camps. “If you put it that way, everybody would agree.” Schaar concurs that the initiative has only scratched the surface on this contentious subject: The question of algorithms “is an incredibly challenging area that will require additional exploration.”

    Getting to the (focal) point

    NCTM's new curriculum focal points, covering prekindergarten through grade eight, are also just beginning their long journey through the educational system. (The document won't even be released publicly until fall, officials say, although drafts have circulated and the council's executive board approved the latest version last month at the organization's annual meeting in St. Louis, Missouri.) With three per grade, the focal points address what math educators decry as overly broad and shallow curricula in most U.S. schools that hinder mastery and prepare students poorly for college-level work.

    NCTM President Fennell says the focal points are intended to provide “curricular relief” to elementary and middle school teachers whose school districts expect them to achieve as many as 100 objectives in mathematics. Many of those objectives span several grades, with teachers expected to tailor them to the maturing child. But there's no urgency because teachers know that their students will get another bite of the apple the following year.

    “While lots of things are important, we're saying to teachers that here are three things you need to zero in on,” says Fennell. “For example, we'll teach some probability in the fourth grade. But it's not as important as multiplication,” which takes center stage alongside fractions and decimals and the concept of area. Second graders should concentrate on addition and subtraction, place value, and linear measurement, says NCTM, even if their teachers also touch upon other topics.

    Although focal points must first be woven into state and district guidelines to have any real effect, the council's action already represents a significant move toward common ground: Professional mathematicians love to attack the 1989 and 2000 NCTM standards, and they see focal points as a tacit admission that some of their criticisms were on the mark. They also welcome the message that, for most students, less is more.

    “The idea of coming up with a few topics that should be addressed in K through 8 is a very needed step,” says Richard Askey, a professor emeritus of mathematics at the University of Wisconsin, Madison, and an outspoken critic of earlier NCTM standards and curricula based on them. “I think that publishers, who now have to deal with all [different] state standards, will also like the idea” of a limited number of key objectives for each grade.

    Jane Schielack, a mathematician and math educator at Texas A&M University in College Station who led the NCTM task force that assembled the focal points, agrees that they are very much a product of the times. “This is something we couldn't have done 4 or 5 years ago,” she says. In addition to the greater emphasis on accountability spawned by the 2001 federal No Child Left Behind law, Schielack cites the growing recognition that some countries, notably Singapore and China, excel on international student comparisons because of a national curriculum that focuses on a small number of topics and policies that give teachers the necessary training and resources to get the job done. “That's the biggest difference between the United States and the top-achieving nations,” agrees Milgram. “Having NCTM come out with a statement to this effect should make an enormous difference on what we expect kids to learn.”

    Even so, nobody expects Common Ground and focal points, by themselves, to usher in a golden age of quality mathematics education. There's too much that remains to be done. “It's a long, long journey,” says Hung-Hsi Wu, a mathematician at the University of California, Berkeley, who runs summer institutes for classroom teachers whose grasp of basic mathematics is often poor or nonexistent. “Better mathematics education in the United States won't take place in the next 10 years. I think it will take 30 years.”

    At the age of 60, Schaar doesn't plan on staying in the line of fire for quite that long. But he's not ready to saddle up and ride out of Dodge. Schaar believes that Common Ground, funded by NSF and TI and staffed by the Mathematical Association of America, has restored a measure of civility to the debate. And this month, after a coalition of 16 leading mathematical societies applauded his 2-hour presentation and told him to keep up the good work, he said that kind of support is exactly what's needed.

    “I'm not looking for an endorsement,” he says. “I'm looking for help in getting more people involved.” A bigger “in” crowd means fewer outcasts. And that's good news for a sheriff.


    After a Tough Year, ALMA's Star Begins to Rise at Last

    1. Daniel Clery

    Cost hikes, scarce labor, and management changes have buffeted the first global telescope array, but new funding agreements may augur smoother sailing ahead

    All together now.

    For different observing jobs, ALMA's 50 antennas can be rearranged with a giant purpose-built truck.


    The world's largest ground-based astronomy project, the Atacama Large Millimeter Array (ALMA), is back on track after a tumultuous couple of years that have seen costs balloon by about 40% and the capability of the enormous microwave telescope scaled back.

    ALMA, with an overall budget now in the region of $1 billion, is a collaboration between the United States, the European Southern Observatory (ESO), and Japan, plus minor partners Canada and Spain. As a result of skyrocketing prices of the commodities needed to build its antennas and huge hikes in labor costs in Chile, where ALMA is being built, astronomers have had to go cap in hand to their funders for more money. ESO agreed to swallow its share of the increases last autumn, but it was not until last week that the U.S. National Science Foundation (NSF) won agreement from its governing board. “It's been a fairly intense 18 months,” says astronomer Christine Wilson of McMaster University in Hamilton, Canada, chair of ALMA's scientific advisory committee.

    “I'm told that most big projects go through something like this,” Wilson says. “Cost increases are a given.” But for researchers waiting to see whether funders would keep faith with the project, the process has been nerve-wracking. “We were holding our breath back in the summer and fall for ESO,” Wilson says. “It's been a very stressful situation for everyone in the project.” U.S. team members had to await the outcome of a series of cost reviews, but in a meeting on 10 May, the National Science Board gave NSF permission to increase U.S. spending on ALMA from $344 million to $499 million, subject to the approval of Congress. According to ESO's Thomas Wilson, European project scientist on ALMA, during these discussions there was an unspoken warning from the funders: “This is it. Don't come back and ask for more.”

    ALMA, the first truly global effort in ground-based astronomy, grew out of three separate projects. U.S. astronomers started discussing a Millimeter Array in the mid-1980s; European plans for a Large Southern Array took shape about a decade later. ESO and the U.S. National Radio Astronomy Observatory (NRAO) in Socorro, New Mexico, began discussions on merging the two projects in 1997 and in June 1999 agreed to build a joint instrument comprising 64 12-meter antennas spread over an area up to 12 kilometers across. The array took its new name from Chile's Atacama desert, where researchers had found a wide plateau, the Llano de Chajnantor, which at 5000 meters altitude is high enough and dry enough to avoid most of the atmospheric water vapor that blocks signals at the wavelengths ALMA is designed to receive.

    The push for such an instrument came because advances in receivers, fast digital electronics, and antenna design were improving the capabilities of millimeter-wave telescopes. Astronomers calculated that a large number of receivers arranged as an interferometer could rival the resolution of the best optical instruments, such as Hubble and ESO's Very Large Telescope in Chile. At millimeter and submillimeter wavelengths, astronomers can study the lowest-energy emissions from simple molecules. With ALMA, they hope to peer into star-forming galaxies when the universe was young to see whether stars formed in a burst early on or more steadily over a long period. Closer to home, they can see whether disks of dust and gas around young stars—places where planets could form—are commonplace or rare.
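    A rough diffraction estimate shows why such an array can compete with optical telescopes; the numbers below are illustrative assumptions, not project figures. An interferometer's angular resolution is set by its observing wavelength $\lambda$ and its longest baseline $B$:

    \[ \theta \approx \frac{\lambda}{B} \approx \frac{10^{-3}\ \mathrm{m}}{1.2\times 10^{4}\ \mathrm{m}} \approx 8\times 10^{-8}\ \mathrm{rad} \approx 0.02\ \mathrm{arcsec}, \]

    which, for a 1-millimeter wavelength over a 12-kilometer baseline, is comparable to the roughly 0.05-arcsecond optical resolution of Hubble.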

    Japan, which had been developing its own Large Millimeter and Submillimeter Array, joined the club in 2001. The plan is for Japan to construct a parallel instrument, the Atacama Compact Array (ACA), made up of four 12-meter antennas and twelve 7-meter antennas. Sited next to the main array, ACA will be better able to image extended diffuse objects. In addition, Japan is providing receivers to cover three extra wavebands for antennas in both ACA and the main array.

    At first, everything moved along according to plan. Prototype antennas for the main array were ordered from two suppliers, one in Europe and one in the United States. Work crews began preparing the site at Llano de Chajnantor in late 2003. Once delivered, the prototype antennas were put through a series of tests at a specially built facility in Socorro, home of the Very Large Array radio telescope. Testing was completed in April 2004 with a view to awarding the antenna contracts—the biggest items on the ALMA shopping list—later that year.

    ALMA researchers, however, were not happy. “The first round of tests were not conclusive,” says Thijs De Graauw of SRON, the Netherlands Institute for Space Research, and chair of ALMA's management advisory committee. “There were valid concerns,” adds astronomer Lee Mundy of the University of Maryland, College Park. “They were asking for a very precise antenna and wanted to make sure it could accomplish the science.”

    New tests were ordered, but the delay proved costly. At the time, the prices of commodities essential for the antennas' construction, such as steel, were going through the roof. And as the extra tests dragged on into 2005, ALMA managers had to ask the manufacturers to resubmit their bids for building the production antennas. The bids came in much higher than managers had expected and threw the project into crisis. Asked whether ALMA could make do with fewer antennas, the scientific advisory committee concluded that the array could achieve its primary science goals with 50 rather than 64 dishes, but observations would take longer and would be more prone to systematic errors. An array of fewer than 50 instruments would still be “a superb instrument,” the advisers said, but its goals would be compromised.
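    A back-of-the-envelope scaling, not an official ALMA estimate, suggests what the cut means in practice. An array of $N$ antennas provides $N(N-1)/2$ simultaneous baselines, so going from 64 to 50 dishes drops the count from 2016 to 1225 and thins the instantaneous image coverage. Point-source sensitivity grows roughly linearly with $N$, so the integration time needed to reach a fixed sensitivity scales as

    \[ \frac{t_{50}}{t_{64}} \approx \left(\frac{64}{50}\right)^{2} \approx 1.6, \]

    that is, on the order of 60% longer on a given target.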

    “We decided to reduce the number of antennas so the cost increase would not be too large,” says ESO Director General Catherine Cesarsky. The North American team went ahead in July 2005 and placed an order for 25 antennas, with an option to buy another seven. ESO was poised to follow suit, but then it hit another snag. Under its rules, it had to take the lowest bid that met specifications. ESO had planned to buy from the same company NRAO had ordered from, VertexRSI of Kilgore, Texas. But the European consortium led by French-Italian company Alcatel Alenia Space submitted a cheaper revised bid. Cesarsky says that before signing on the dotted line, ESO waited for a cost review of the whole ALMA project, completed in October, and reviewed all of its own programs to see whether enough economies could be made to cover the extra costs.

    High and dry.

    With a Chilean construction boom in progress, ALMA managers are having trouble finding people to work in the thin desert air 5000 meters above sea level.


    Concerns remained even after ESO ordered its 25 antennas from Alcatel last December. Some researchers worried that having two sets of antennas from different suppliers would increase costs down the line because it would require double the number of technicians and spare parts. But in January, a “delta” review concluded that the extra cost was unlikely to exceed 1% of ALMA's total budget. Meanwhile, other costs were also draining ALMA's coffers. Chile's economy has been booming, and the consequent boost to the construction industry has made labor hard to find and more expensive. In addition, copper prices are at an all-time high, and northern Chile has extensive copper deposits. Chilean workers, it turned out, would rather mine copper than work in the cold airlessness of 5000 meters.

    Labor troubles have exacerbated another hurdle ALMA is working to overcome: learning to manage a global engineering project. “Astronomers are not used to this scale of project,” Mundy says. “It's taking astronomy into the big league.” Some have charged that managers' cost estimates at the start of the project were unrealistic and that ESO based its estimated construction costs on the other observatories it had built in Chile, which were all at lower altitudes. “Assumptions were optimistic,” says De Graauw. “Errors came from not knowing in enough detail what was to be built.” Says Mundy: “In a project of this scale, managers and management systems are needed. These were not components of the original pricing.”

    Cesarsky acknowledges that running the project with two management teams separated by the Atlantic has been difficult: “It was not clear who should make decisions. A strong central management was needed.” More control has now been put in the hands of the Joint ALMA Office in Santiago, Chile's capital, Cesarsky says.

    The flurry of reviews that have assessed the project from within and from outside has now given it a clean bill of health. “I think things are going along very well,” says Al Wootten, ALMA's North America project scientist. But for researchers, the need to cut back the number of antennas to 50 rankles. “People are unhappy about it still,” says ESO's Wilson. Cesarsky thinks there's still a possibility that the array can be built at full strength, “if we're lucky and have not spent our contingency.” Not everyone is so positive. “Do we skimp and endanger the whole instrument? Surely it's better to do it right once,” argues Mundy. “I haven't heard any way to get there, but the door is still open.”


    Waiting for ITER, Fusion Jocks Look EAST

    1. Dennis Normile*
    1. With reporting by Gong Yidong.

    China is breaking new ground with a fusion test bed that will tide researchers over until the ITER megaproject comes online

    Speed matters.

    It has taken just over 5 years and $37 million to complete China's new tokamak, according to the Institute of Plasma Physics.


    HEFEI, CHINA—The official launch of the International Thermonuclear Experimental Reactor (ITER) project next week will mark a coming of age for fusion research in Asia. When the $11 billion effort was initiated in 1985, ITER's four original backers—the United States, the European Union, Japan, and the Soviet Union—accounted for nearly all worldwide research into harnessing fusion, the process that powers the sun, to produce energy. But now the three newest ITER partners, China, South Korea, and India, are showing that they didn't just buy their way into one of the biggest physics experiments since the Manhattan Project: They are contributing crucial expertise as well.

    The first new Asian fusion tiger out of the gate is the Institute of Plasma Physics (IPP) of the Chinese Academy of Sciences, which in March completed testing a machine that has never been built before: a fully superconducting tokamak. This toroidal vessel isn't the largest or most powerful device for containing the superhot plasma in which hydrogen isotopes fuse and release energy. But until India and South Korea bring similar machines online (see sidebar, p. 993), it will be the only tokamak capable of confining a plasma for up to 1000 seconds, instead of the tens of seconds that machines elsewhere can muster. ITER, expected to be completed in Cadarache, France, in 2016, will have to sustain plasmas far longer to demonstrate fusion as a viable energy source. But researchers from China and around the world will be able to use IPP's Experimental Advanced Superconducting Tokamak (EAST) to get a head start on learning to tame plasmas for extended periods. “This will make a big contribution for the future of fusion reactors,” declares Wan Yuanxi, a plasma physicist who heads EAST.

    Fire when ready.

    EAST will fill a crucial gap for fusion researchers until ITER is built, says Director Wan Yuanxi.


    Fusion research over the next decade will be probing the physics of steady-state plasmas like those promised by ITER, says Ronald Stambaugh, vice president for the Magnetic Fusion Energy Program at General Atomics in San Diego, California. “EAST will play a big role in that,” he says. Others credit IPP for building its advanced tokamak fast, in just over 5 years, on a shoestring $37 million budget. That's a fraction of what it would have cost in the United States, says Kenneth Gentle, a plasma physicist and director of the Fusion Research Center at the University of Texas, Austin. “That they did this in spite of the financial constraints is an enormous testimony to their will and creativity,” adds Richard Hawryluk, deputy director of the Princeton Plasma Physics Laboratory.

    IPP adroitly fills a generational gap. Fusion power will rely on heating hydrogen isotopes to more than 100 million degrees Celsius, until they fuse into heavier nuclei. The leading design for containing this fireball is the tokamak, a doughnut-shaped vacuum chamber in which a spiraling magnetic field confines the plasma. Ringlike metal coils spaced around the doughnut—toroidal field coils—and a current in the plasma produce this spiraling field. Additional coils in the center of the doughnut and along its circumference—poloidal field coils—induce the current in the plasma and control its shape and position.

    Early tokamaks had circular cross sections and copper coils, which can operate at peak power only in brief pulses before overheating. ITER will be far more sophisticated. It will have a D-shaped cross section, designed to create a denser plasma that can generate its own current to supplement the induced current, reducing energy input. And its coils will be superconducting. (No major tokamak has had superconducting poloidal field coils.) At temperatures approaching absolute zero, superconductors carry current without electrical resistance, allowing more powerful magnetic fields that can be maintained longer.

    Researchers want to try out a D-shaped, fully superconducting test bed before scaling up to ITER, which will be two to three times the size of current tokamaks. The Princeton Plasma Physics Laboratory had planned to build such a device. But a cost-conscious U.S. Congress killed their $750 million Tokamak Physics Experiment in 1995. EAST and the two other Asian tokamaks under construction intend to fill this gap.

    “We recognized this was an opportunity for us to make a contribution for fusion research,” Wan says. For support, he tapped into China's worries about its growing demand for energy. “There is no way we can rely entirely on fossil fuels,” he says. China's government approved EAST in 1998.

    IPP faced an enormous challenge. The institute, founded in 1978, had built a few tiny tokamaks in the 1980s and got a hand-me-down, partially superconducting tokamak from Russia's Kurchatov Institute in 1991. EAST would be a totally different beast. “We didn't have any experience in the design, fabrication, or assembly of these kinds of magnets,” Wan admits. Neither did Chinese manufacturers.

    Industrial partners supplied parts of the tokamak, including the vacuum vessel. But the superconducting coils and many other high-tech components would have been too expensive to import. “We had to do [these] ourselves,” says the tokamak's chief engineer, Wu Songtao. So Wu's team bought precision milling machines, fabricated their own coil winders, and built a facility to test materials and components at cryogenic temperatures. “They literally built a whole manufacturing facility on site,” says Hawryluk.

    IPP physicists and engineers passed a major milestone earlier this year, when they tested the entire assembled device, cooling the 200 tons of coils to the operating temperature, 4.5 kelvin. They discovered only minor, fixable glitches, Wan says, and are now undertaking the necessary tweaks and installing shielding materials and diagnostic devices. In August, they plan to inject hydrogen and fire up EAST's first plasma.

    With the tokamak passing its cool-down test, Wan says the team was “finally able to get a good night's sleep.” They are now planning experiments to explore how to control D-shaped plasmas. Tugging a plasma into a specific shape can create instabilities, Gentle says. Control is all the more difficult because superconducting coils respond poorly to current fluctuations. IPP will probe these issues. “That's where the science is going to be extremely valuable,” says Hawryluk.

    EAST has limitations. The most significant is that, unlike ITER, it will not attempt a burning plasma, in which at least half the energy needed to drive the fusion reaction is generated internally. ITER will use a combination of deuterium and tritium (hydrogen isotopes with, respectively, one and two neutrons in the nucleus), which fuse at a lower temperature than other gases, to achieve a burn. Because radioactive tritium requires specialized and expensive handling systems and shielding, EAST will use only hydrogen or deuterium.
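    The “half the energy” criterion implies a minimum fusion gain under standard deuterium-tritium energetics (a reading of the definition above, not an EAST or ITER specification). Each D-T reaction releases 17.6 MeV, of which the 3.5 MeV alpha particle, one-fifth of the total, stays in the plasma and heats it; the rest escapes with the neutron. Self-heating supplying at least half the drive therefore means

    \[ P_{\alpha} = \frac{P_{\mathrm{fus}}}{5} \ge P_{\mathrm{ext}} \quad\Longrightarrow\quad Q \equiv \frac{P_{\mathrm{fus}}}{P_{\mathrm{ext}}} \ge 5, \]

    which is why a burning plasma is often equated with a fusion gain of at least 5.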

    That limitation is hardly dampening enthusiasm for the hot new kids on the block. IPP researchers, says Hawryluk, “have already put themselves on the fusion community map.”


    Asian Fusion

    1. Dennis Normile

    India, Korea, and possibly Japan are joining China in building next-generation tokamaks. These machines seek to fill a research gap on the road to the International Thermonuclear Experimental Reactor (ITER) by employing all-superconducting coils to study the physics of confining plasmas for long durations, which current tokamaks can't do.

    • India's Institute for Plasma Research is now commissioning its Steady State Superconducting Tokamak. An engineering test at cryogenic temperatures turned up problems that are now being addressed. Institute plasma physicist Y. C. Saxena says they are hoping to try a second engineering test later this month. If that goes well, they will attempt their first plasma in the summer. The $45 million project, launched in 1994, is the smallest of the new tokamaks. But Saxena says they believe they can help unravel the physics of long-lasting plasmas.

    • The most ambitious machine is KSTAR, the Korea Superconducting Tokamak Advanced Research device, being built by the National Fusion Research Center in Daejeon. KSTAR relies on superconductors made from the more advanced niobium-tin alloy that ITER will employ. The $330 million project was delayed because of South Korea's late-1990s economic crisis. Project Director Lee Gyung-su says they are now aiming for first plasma in early 2008.

    • For several years, Japan's Atomic Energy Agency has been studying the possibility of upgrading its JT-60 tokamak to be fully superconducting. Japan may get funding for the upgrade from the European Union as compensation for its assent on the agreement to build ITER in France. An agency spokesperson says key decisions are under negotiation.


    Should Academics Self-Censor Their Findings on Terrorism?

    1. Yudhijit Bhattacharjee

    Some government-funded researchers believe their papers require special handling. But others say that creating such a gray area undermines academic freedom

    Last year, after Detlof von Winterfeldt and his colleagues at the University of Southern California (USC) in Los Angeles finished a study on the likelihood and impact of a dirty bomb attack by terrorists on the Los Angeles harbor, they omitted some important details from a paper they posted on the Internet. Although the team had used no classified material, von Winterfeldt felt that self-censorship was prudent given the subject matter. It's also in line with draft guidelines being considered by the U.S. Department of Homeland Security (DHS), which funds the Center for Risk and Economic Analysis of Terrorism Events that he directs. “We were still able to present the methodology behind the analysis fully and effectively,” he says. “It made perfect sense to make those changes.”

    But some scientists say that stance conflicts with academic freedom, and that the public deserves access to anything not explicitly classified. They worry that the actions of the USC researchers could serve as a model for restricting the conduct and dissemination of university research. Their concerns are tied to an ongoing effort by the Bush Administration to draw up common standards across federal agencies for withholding information under the rubric of sensitive but unclassified (SBU) material.

    “The only appropriate mechanism for controlling information is classification,” says Steven Aftergood, who runs the Project on Government Secrecy for the Federation of American Scientists. “If we want to gain the benefits of university research on problems of national security, we need to conduct it openly. Imposing restrictions short of classification is a slippery slope that will ultimately paralyze the academic process.”

    Universities have traditionally drawn a sharp line between classified and unclassified information, refusing to accept the ill-defined SBU category. Yet, in a 28 March meeting at the U.S. National Academies, DHS officials and directors of the six university centers funded by the agency discussed draft guidelines to control the dissemination of sensitive information generated by their research. The guidelines were developed by the center directors in collaboration with DHS officials. The academies agreed to be host because of their ongoing interest in the topic.

    Besides recommending the scrubbing of papers before publication, the guidelines would have center directors decide whether proposed research projects are likely to produce sensitive information—loosely defined as information not easily available from public sources and/or of potential use to terrorists. Projects that fit that description would be subject to additional scrutiny. The results, says the document, could include “producing different version(s) of the findings for ‘For Official Use Only’ and for public dissemination, declin[ing] the proposed work, or mov[ing] it to a classified environment.”

    The guidelines simply acknowledge “the reality of a changing world,” says Melvin Bernstein, acting director of DHS's Office of Research and Development, which helped set up the university centers with 3-year renewable grants. “There's an increasing recognition in the university community that there could be circumstances when researchers need to be careful about what can be disseminated.”

    Although Bernstein says it's too early to know whether the guidelines will become official policy, they appear consistent with a presidential directive issued last December ordering common standards across the government by the end of 2006 for handling SBU information. After talking with DHS officials, the center directors decided that writing some of the rules themselves would be better than having the government impose them. “We knew we had no choice. This thing was coming our way sooner or later,” says Gary LaFree, co-director of the Study of Terrorism and Responses to Terrorism at the University of Maryland, College Park.

    One reason that universities have resisted the SBU concept is its vagueness, which some academics fear could lead to federal agencies trying to set arbitrary restrictions on campus research. The executive branch itself seems confused about what information should be withheld from the public and why: The Government Accountability Office reported in March that agencies use 56 different SBU categories in deciding how to control information. Last week, Thomas E. “Ted” McNamara, an official in the Office of the Director of National Intelligence who is leading a federal effort to sort out the confusion, told a congressional panel that some of the government's procedures for handling SBU information “are not only inconsistent but are contradictory.” McNamara expects to submit his recommendations next month on standardizing SBU procedures.

    But a clearer definition of SBU is unlikely to end the debate. LaFree says the guidelines discussed at the academies meeting could have serious implications for research at the DHS centers. “They could lead to restrictions on the involvement of foreign students and researchers in certain projects,” he says, adding that not all center directors are comfortable with the guidelines, despite their role in writing them. “That would be simply unacceptable.”

    Playing it safe.

    USC researchers removed some details from their paper on the risk and impact of a dirty bomb attack on Los Angeles harbor (left) to avoid helping terrorists. Inset shows a model of how radiation might spread.


    LaFree's concern is not unfounded. In fact, the USC center has been developing procedures—not included in the draft guidelines—that would require foreign nationals to agree to certain conditions before being given access to sensitive information. (Von Winterfeldt won't say what those conditions might be.) Such procedures, critics say, could encourage principal investigators to drop foreigners from sensitive projects. That's already happened in some cases: Yacov Haimes of the University of Virginia in Charlottesville says he deliberately avoided including any foreign nationals when his research team did an unclassified study for the federal government 2 years ago on the risk of a high-altitude electromagnetic pulse attack on the United States.

    That approach could backfire on universities, warns Robert Hardy of the nonprofit Council on Governmental Relations in Washington, D.C. By placing restrictions on publishing, he says, the centers could risk losing the privileges that universities enjoy because they do fundamental research—defined as work whose results are “published and shared broadly within the scientific community.” One important privilege is being able to involve foreign nationals in any research project without obtaining a government license.

    Randolph Hall, vice president for research advancement at USC and a researcher at the USC center, disagrees with Hardy's interpretation of what qualifies as open publishing. Taking some information out of a paper is not the same as preventing a researcher from publishing, he says, and shouldn't have any bearing on the exemption given to institutions. “It's not unusual for reports at any institution to go through editing, even if some of the changes might be purely grammatical,” Hall says. “Similarly, editing out sensitive data is more of a revision than a restriction.”

    Shaun Kennedy, a chemical engineer and deputy director of the National Center for Food Protection and Defense at the University of Minnesota, Twin Cities, says the proposed guidelines bump up against state laws meant to ensure public access to information. “If I have a For Official Use Only version of a paper in a folder, shredding it would be a violation of the Minnesota Data Practices Act,” says Kennedy, adding that the center decided not to start a proposed project analyzing chinks in the nation's food supply chain partly because of that provision. (Instead, the Food and Drug Administration is doing the research internally.)

    Some scientists say that there's a more fundamental issue at stake, namely, whether a limit on what goes into the open literature might actually weaken the nation's security. “If you don't publish the information, it might reduce the chances of an attack. But just as likely it could reduce the chances of another researcher coming up with a solution. If the risks are so great, then why shouldn't the research be classified?” asks Toby Smith of the Association of American Universities.

    LaFree thinks the argument makes sense. What universities bring to the table, he says, “is the best minds to look at the data that we pass around. If we end up putting a lot of fences around information, that'll defeat the purpose of doing this type of research in an academic environment.”

    Von Winterfeldt doesn't believe that a little secrecy will doom research, but he does agree that universities should set and implement policies to protect SBU information. Panels similar to Institutional Review Boards could be set up to do the job, he suggests. And he acknowledges that the panels will have to wrestle with some tough questions. Asked why a sentence in his team's paper on using a helicopter to disperse a dirty bomb didn't qualify as sensitive information, von Winterfeldt said, “It's in the gray zone. I'll discuss it at my next meeting with the author and our staff.”
