Last week, I wrote about the case of Dirk Smeesters, a social psychologist who had resigned from Erasmus University Rotterdam after an investigation uncovered problems with the data in two of his papers. His case follows the scandal of Diederik Stapel, another psychologist from a Dutch university who was found guilty of research fraud last year.
As I noted in my post, the Smeesters case is unique. While earlier cases of fraud in psychology, including Stapel, Karen Ruggiero and Marc Hauser, were uncovered by internal whistleblowers using inside information, Smeesters was found out by an external source using statistical sleuthing. At the time of writing, that source was anonymous, but on Thursday he was revealed to be Uri Simonsohn, a social psychologist at the University of Pennsylvania.
I’ve now interviewed Simonsohn for Nature News, about how he started his investigation, his motives, the fallout, and more. Go and read that first. What follows is bonus material, including a few specific points I wanted to highlight, and some quotes that were cut for length:
Not a witch hunt. Several people have speculated that this might have been a grudge match, and that the then-anonymous whistleblower was someone familiar with Smeesters who was out to hurt him. According to Simonsohn, that is not the case. His investigation started “by chance” because a colleague sent him a paper by Smeesters and he thought the data looked too good to be true.
Technique will be published soon. Simonsohn’s statistical tool has not yet been described publicly, but he will be submitting it for publication soon. He says that it is simple to use, could be broadly applied to other sciences, and works particularly well for small sample sizes. He doesn’t want to comment on the existing attempts to reverse-engineer his method from the Erasmus University committee report.
More misconduct afoot. There could be other cases of misconduct that we do not yet know about. Simonsohn identified four. One was Stapel (after the fact). The second was Smeesters. A third has apparently been investigated but not yet been made official. No one is doing anything about the fourth case. Simonsohn is convinced that data have been fabricated, but the suspect’s co-authors have not been willing to help him, and he doesn’t have the time to pursue it himself. There’s a fifth case that’s more ambiguous. “I wouldn’t bet money the papers are true but I’m not sufficiently convinced to do something about it,” says Simonsohn.
What did Smeesters actually do? Smeesters claims that he was only doing things that are common practice among psychological researchers, including leaving out outliers or people who did not read the instructions carefully. Simonsohn says that uncovering what Smeesters did was a job for the university, but adds, “His data aren’t consistent with dropping outliers, or dropping people who don’t understand instructions. [And] when I contacted Smeesters, he never mentioned the possibility that he deleted outliers. He said that he might have incorrectly entered data and agreed to retract his papers and re-run the study.”
How common is this? Simonsohn says that “it’s really hard to have an informed opinion” of how common such practices are in psychology or any other science. However, he is concerned with how easily he came across his cases. “I wasn’t looking. They landed on my desk without any explicit plan to find them.”
Why bother? Simonsohn does worry about how these activities will be perceived. “Whenever someone gets notoriety, people make inferences about their motives. It’s not hard to come up with bad motives for what I’m doing,” he says. I asked him what his motive was. “Simply that it was wrong to look the other way,” he said.
Could this trap the innocent? Simonsohn is aware of, and worried about, the possibility of pointing the finger at an innocent party. If he finds one dodgy paper, he always looks for at least two more before contacting someone. “I also proceed with extreme caution. I first came across Smeesters’ paper nine months ago, and I had cordial correspondence with him for months. The empirical analyses are only the first step.”
And finally, for some context, here’s my feature for Nature about psychology’s problems with replication, and what people are trying to do about it.