Final Report on Stapel Also Blames Field As a Whole


Science  07 Dec 2012:
Vol. 338, Issue 6112, pp. 1270-1271
DOI: 10.1126/science.338.6112.1270

This article has a correction.

Off the tracks. A pirated copy of Stapel's autobiography appeared on the Internet 2 days after it was published.


AMSTERDAM—Just when it seemed time to close the book on Diederik Stapel, a new one was opened, quite literally, by Stapel himself. On 28 November, the day on which three investigative panels presented their exhaustive final report on the disgraced social psychologist, Stapel made his first media appearance in more than a year. In a video statement taped at his home by Dutch public television, he said he was deeply sorry and announced he had written an autobiography to explain how it all happened.

The 2-minute, 20-second clip and the appearance of the book the next day caused a new media storm in a country already obsessed with the “Lying Dutchman,” as a blogger with The Washington Post called him. Many felt that Stapel, fired from Tilburg University in September 2011, was hogging the limelight once again. “Making the announcement on the same day … he's still an ego-tripper,” says psychologist Pieter Drenth of the Free University Amsterdam, who chaired one of the committees. Others dispute Stapel's right to earn money from his tale.

But the key message in the joint report—there was one committee for each of the universities where Stapel worked—was a different one: It's not just about Stapel. Colleagues, co-authors, reviewers, and editors at even the most prestigious journals deserve some degree of blame for letting Stapel get away with “blatant” fraud, says the report, which concludes that “the critical function of science has failed on all levels.”

The final tally concludes that Stapel made up data in 55 of his 137 papers and in the Ph.D. theses of 10 students he supervised. In another 10 papers, scientific misconduct could not be established beyond a reasonable doubt, either because the original data were missing or because Stapel did not confess to fraud; still, statistical analyses of the published papers showed “evidence of fraud,” says Drenth, who chaired the Amsterdam committee.

These analyses were based partly on the work of psychologist and fraud detective Uri Simonsohn of the Wharton School at the University of Pennsylvania, who sent shockwaves through the field earlier this year with a new method for finding suspicious statistical patterns in published papers (Science, 6 July, p. 21). Simonsohn has raised questions about the work of several social psychologists, including Dirk Smeesters of Erasmus University Rotterdam and Lawrence Sanna of the University of Michigan, Ann Arbor. Amsterdam panel member and statistician Chris Klaassen adapted the method to further reduce the chances of a false alarm, Drenth says, adding that the technique “could be used far more broadly when there are suspicions.”
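The article does not spell out the mechanics of Simonsohn's method, but its core intuition is that honestly collected data are noisy: if the condition means reported across experiments are far more similar to one another than sampling error would predict, that is a red flag. The sketch below is a hypothetical simplification of that idea, not Simonsohn's or Klaassen's actual procedure; the function name `similarity_pvalue` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def similarity_pvalue(means, sds, ns, n_sims=10_000, seed=0):
    """Illustrative sketch of an excessive-similarity check.

    Given the reported mean, standard deviation, and sample size of
    each condition, simulate many honest replications and ask: how
    often would independent samples yield condition means at least as
    tightly clustered (i.e., with variance as low) as those reported?
    A very small result suggests the reported means are suspiciously
    similar given the sampling noise they should contain.
    """
    rng = np.random.default_rng(seed)
    observed_var = np.var(means)  # spread of the reported means
    hits = 0
    for _ in range(n_sims):
        # Draw an honest sample for each condition and take its mean.
        sim_means = [rng.normal(m, s, n).mean()
                     for m, s, n in zip(means, sds, ns)]
        if np.var(sim_means) <= observed_var:
            hits += 1
    return hits / n_sims

# Three conditions with nearly identical means despite small samples
# and large spread: honest noise should have separated them more.
suspicious = similarity_pvalue([5.00, 5.01, 4.99],
                               [2.0, 2.0, 2.0], [15, 15, 15])

# Clearly separated means are unremarkable under the same check.
ordinary = similarity_pvalue([4.0, 5.0, 6.0],
                             [2.0, 2.0, 2.0], [15, 15, 15])
```

The design choice worth noting is that the test runs entirely on published summary statistics, which is what made the approach usable on papers whose raw data had vanished; the panel's adaptation, per the report, aimed to lower the false-alarm rate before flagging a paper.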

In what the report describes as “bycatch,” it says that Stapel and his co-authors committed a whole range of smaller sins in papers that weren't outright fraudulent. (It even coins a new Dutch word for sloppy science, “slodderwetenschap,” to describe them.) For example, research subjects were struck from the data on dubious grounds, certain parts of experiments weren't reported, or data from experiments were merged, all to reach statistical significance. There were numerous statistical errors in the studies as well as discrepancies between the actual experiments and published papers.

The implications go beyond Stapel, says psycholinguist Willem Levelt, who chaired the Tilburg committee and coordinated the entire investigation. “We're not saying all of social psychology is sloppy science,” he says. “But the fact that this could happen shows that the review process has failed from the bottom to the top.” Levelt believes the field is taking the message to heart, however. The report praises the “reproducibility project,” a large collaborative effort to replicate psychology studies set up by Brian Nosek of the University of Virginia in Charlottesville (Science, 30 March, p. 1558), as well as the November issue of Perspectives on Psychological Science, which is devoted to analyses of what ails the field and proposals to cure it. “I was impressed by many of the papers,” Levelt says. “I have faith that they're cleaning up their shop.”

Nosek says the Stapel inquiry is highly unusual in its transparency, its examination of Stapel's entire corpus of work, and its deep look into the surrounding circumstances. In the case of Harvard University evolutionary biologist Marc Hauser, for instance, a federal investigation found misconduct in six studies, but the university has not released its own report, which reportedly described eight instances of misconduct (Science, 14 September, p. 1283). Sanna resigned in May and several of his papers have been retracted, but neither the University of Michigan nor his previous employer, the University of North Carolina, has revealed further information about the case. “We have no idea which of his papers we can trust,” Nosek says. “I see the Stapel investigation as a paradigmatic example of how to do it right.”

In the wake of the Stapel report, Erasmus University announced that it will take an in-depth look at all of Smeesters's papers, instead of the three that a previous committee studied. Erasmus MC, the university's hospital, will investigate whether the same is possible for Don Poldermans, a cardiologist who was fired in November 2011 after an investigation found “violations of scientific integrity,” including fictitious data, in five studies. The job would be huge, because Poldermans has authored about 600 papers.

Stapel's book, meanwhile, poses a problem for curious fellow psychologists and the public at large: Buy it or not? “I probably will, but I have very mixed feelings about it,” Drenth says. “It would be less of a problem if he shared the proceeds with the Ph.D. students he deceived.” Yet there may be an alternative. A 324-page PDF of the book showed up on a file-sharing Web site 2 days after publication; the URL was tweeted and retweeted extensively, often with undisguised schadenfreude. The file was removed 2 days later, only to pop up on a new site.

In the book, entitled Ontsporing (Derailment), Stapel does not deny his fraud, but says that once he started, he did not know how to stop. “I had become addicted to the rhythm of digging, discovering, testing, publishing, scoring, and applause,” he writes. The book includes details on how he designed studies with Ph.D. students and colleagues, and then insisted on doing the actual data collection himself, usually at a high school or university where he said he had good contacts. The carefully prepared questionnaires weren't used and ended up in a dumpster; Stapel grew fat, he writes, from eating small bags of M&M's that supposedly were the rewards for his research subjects.

The collapse of his house of cards devastated his family and led him to consider suicide. “I was too weak even for that,” he writes. “I hated myself.”
