When former Harvard psychology professor Marc Hauser was found solely responsible for six cases of scientific misconduct in 2012, he distanced himself from the problems, portraying them as an unfortunate consequence of his heavy workload. He said he took responsibility, “whether or not I was directly involved.”
But an internal Harvard report, released to the Globe under the Freedom of Information Act, now paints a vivid picture of what actually happened in the Hauser lab and suggests it was not mere negligence that led to the problems.
The 85-page report details instances in which Hauser changed data so that it would show a desired effect. It shows that he more than once rebuffed or downplayed questions and concerns from people in his laboratory about how a result was obtained. The report also describes “a disturbing pattern of misrepresentation of results and shading of truth” and a “reckless disregard for basic scientific standards.”
A three-member Harvard committee reviewed 40 internal and external hard drives, interviewed 10 people, and examined original video and paper files before concluding that Hauser had manipulated and falsified data.
Their report was sent to the federal Office of Research Integrity in 2010, but it was not released to the Globe by the agency until this week.
The documents offer a rare window into one of the highest-profile cases of scientific research fraud in recent years.
Hauser, a widely published and charismatic professor with a prominent public profile, had made waves with his research showing that animals had cognitive abilities often thought to be uniquely human.
After the Globe first reported that Harvard was investigating his underlying research for potential misconduct, “Hausergate” became a window into the mechanics of research fraud and the tense internal politics of a university that was investigating one of its best-connected figures.
Much has been redacted from the report, including the identities of those who did the painstaking investigation and those who brought the problems to light.
Hauser, reached by phone Thursday, said he is focused on his work with at-risk youth on Cape Cod and declined to comment on the report.
Still, the report details the lengths to which the committee went to check Hauser’s defenses. They reviewed his written responses, read seven letters of support from scientific colleagues, and met with him and his lawyer for nine hours. When Hauser suggested that someone had doctored a videotape of raw data showing monkeys responding to sounds, for example, two external firms were commissioned to do a forensic examination. They found no signs of tampering. When possible, the committee consulted original videotape of monkeys responding to cues to figure out how the problems arose and who was responsible.
“We did not find evidence that [professor] Hauser has been inventing findings out of whole cloth,” the committee wrote.
“. . . Hauser’s shortcomings in respect to research integrity have in the main consisted instead of repeated instances of cutting corners, of pushing analyses of data further in the direction of significance than the actual findings warranted, and of reporting results as he may have wished them to have been, rather than as they actually were.”
Although many of the broad problems with the work have been previously detailed, the report gives an inside view of how the investigation unfolded and what was found:
■ In a 2002 paper published in the journal Cognition that has since been retracted, videotape of monkeys being exposed to two different patterns of syllables never showed the animals being exposed to one of the specific syllable patterns reported in the paper. Hauser suggested a number of alternative explanations for the problem, including the possibility that the tape had been doctored, all of which the committee carefully considered and rejected.
■ In 2005, Hauser and colleagues did a statistical analysis of an experiment in which monkeys responded to two artificial languages. In a later statistical analysis, an unnamed individual using the raw data got very different results.
The committee painstakingly reconstructed the process of data analysis and determined that Hauser had changed values, making the result statistically significant, a key criterion indicating that a finding is unlikely to be due to chance.
For example, after the data from one experiment were analyzed in 2005, the results initially were not statistically significant. After Hauser informed a member of his lab of this by e-mail, he wrote a second e-mail: “Hold the horses. I think I [expletive] something up on the coding. Let me get back to you.”
After correcting for that problem, he concluded that the result was statistically significant.
According to the Harvard report, five data points had been changed from the original file, and four of the five changes were in the direction of making the result statistically significant.
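The mechanics at issue are simple enough to illustrate. Below is a minimal, purely hypothetical Python sketch, using invented numbers rather than anything from the report, showing how altering a handful of values in a small sample can push a one-sample t-test across the conventional p < 0.05 significance threshold:

```python
# Purely illustrative: invented numbers, not the data from the Hauser case.
# Demonstrates how changing five values in a small sample (four of them in
# the "favorable" direction) can move a result across the p < 0.05 cutoff.
from scipy import stats

# Hypothetical per-subject difference scores; mean near zero.
original = [2, -1, 1, 0, -2, 1, -1, 2, 0, -1, 1, 0]

# Same list with five entries changed (indices 1, 3, 6, 9, 10);
# four of the five changes push the mean upward.
altered = [2, 1, 1, 2, -2, 1, 1, 2, 0, 1, 0, 0]

for label, data in [("original", original), ("altered", altered)]:
    t, p = stats.ttest_1samp(data, 0.0)  # test whether the mean differs from zero
    print(f"{label}: t = {t:.2f}, p = {p:.3f}")
```

With these invented numbers, the unaltered sample is nowhere near significance, while the altered one falls just under 0.05; the point is only how sensitive small-sample significance tests are to a few changed values.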
In a second, related experiment, a collaborator asked to be walked through the analysis because he or she had obtained very different results when analyzing the raw data. Hauser sent back a spreadsheet that he said was simply a reformatted version, but then his collaborator made a spreadsheet highlighting which values had apparently been altered.
Hauser then wrote an e-mail suggesting the entire experiment needed to be recoded from scratch. “Well, at this point I give up. There have been so many errors, I don’t know what to say. . . . I have never seen so many errors, and this is really disappointing,” he wrote.
In defending himself during the investigation, Hauser quoted from that e-mail, suggesting it was evidence that he was not trying to alter data.
The committee disagreed.
“These may not be the words of someone trying to alter data, but they could certainly be the words of someone who had previously altered data: having been confronted with a red highlighted spreadsheet showing previous alterations, it made more sense to proclaim disappointment about ‘errors’ and suggest recoding everything than, for example, sitting down to compare data sets to see how the ‘errors’ occurred,” the report states.
■ In 2007, a member of the laboratory wanted to recode an experiment involving rhesus monkey behavior, due to “inconsistencies” in the coding.
“I am getting a bit pissed here. There were no inconsistencies!” Hauser responded, explaining how an analysis was done.
Later that day, the person resigned from the lab. “It has been increasingly clear for a long time now that my interests have been diverging sharply from what the lab does, and it seems like an increasingly inappropriate and uncomfortable place for me,” the person wrote.
The committee said it carefully considered Hauser’s allegation that people in his laboratory conspired against him, due to academic rivalry and disgruntlement, but did not find evidence to support the idea.
The committee also acknowledged that many of Hauser’s overall findings about the cognitive abilities of animals may stand. His results that showed that animals may have some of the same cognitive abilities as people have been important for the field. But science depends on the data.
“Skepticism above all toward the veracity of one’s own hypotheses is, of course, an essential virtue for scientists,” the committee wrote, “and one that must be modeled for the benefit of trainees.”