Don’t believe everything you read, right? That’s what The New York Times learned last week when they covered a group of researchers re-evaluating some of the most cited psychological studies, some of which the newspaper covered when originally released.
The Reproducibility Project at the Center for Open Science in Charlottesville, Virginia, has spent a year testing 100 psychological studies published in three of the area’s top journals — Psychological Science, the Journal of Personality and Social Psychology and the Journal of Experimental Psychology: Learning, Memory and Cognition.
Get ready to cringe: More than 60 of the studies have not held up when reproduced. These are studies that have shaped decisions made by therapists, researchers and educators — including learning leaders.
I talked to Frank Bosco, a member of the team and professor of management at Virginia Commonwealth University School of Business, who explained the reason behind so many faulty studies — a lot of it had to do with the publication process.
Researchers are often professors who want tenure, and to get it they have to show they have been published in some of the top journals. But these publications tend to favor studies with positive results — for example, that temperature affects whether someone prefers Coke over Pepsi — over null results showing there’s no correlation between being sweaty and wanting one brown soda over the other.
Bosco said some journals are taking steps to avoid this bias by reviewing studies without looking at the results, but that doesn’t help the possibly false research that’s already been published.
Human resources might not have as much to fear, however, because they tend to look at organizational research rather than experimental research.
“In organizational research, we do a much better job than they do in experimental social psychology research,” Bosco said. “We have millions of findings, and so in a sense, there’s a built-in assessment in organizational research.”
He gave six tips for learning leaders who want to apply organizational research to their work:
1. Never trust a single study. Learn to look at meta-analyses, which compile multiple sets of data and contextualize them.
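For readers curious about what “compiling multiple sets of data” actually involves, here is a minimal sketch of the simplest form of meta-analysis, fixed-effect inverse-variance pooling. The study effect sizes and variances below are invented for illustration; they do not come from Bosco or the Reproducibility Project.

```python
# Minimal fixed-effect meta-analysis sketch. The three "studies" below
# are hypothetical: each reports an effect size and its variance.

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean effect and its standard error."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total
    se = (1.0 / total) ** 0.5  # standard error of the pooled estimate
    return mean, se

effects = [0.30, 0.10, 0.45]     # hypothetical effect sizes
variances = [0.02, 0.01, 0.05]   # hypothetical sampling variances

mean, se = pooled_effect(effects, variances)
print(f"pooled effect = {mean:.3f}, SE = {se:.3f}")
```

The intuition matches Bosco’s point: more precise studies (smaller variance) get more weight, so one noisy, attention-grabbing result can’t dominate the pooled estimate.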
2. Be reluctant to accept the “newest” or “hottest” studies. Many consultants who conduct studies can’t make money using public-domain questions that measure factors like engagement, retention or turnover. Instead, they create their own surveys, which might not be reliable, and give their measures new, catchy names to increase marketability. That marketability often comes at the cost of a rigorous scientific foundation and needlessly complicates the terminology landscape.
3. Conduct your own studies. If your organization is a large enough sample — 1,000 employees or more — use it to your advantage: leverage your existing survey, performance and turnover records. Conduct analyses to answer targeted questions, such as “Which factors are driving performance?” and “Which factors are driving turnover?”
“Everyone talks about Google,” Bosco said. “They hire social scientists who know how to run things, and they conduct their research in-house.”
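As one hedged illustration of what the start of such an in-house analysis might look like, here is a sketch comparing engagement scores between employees who stayed and those who left. Every record, score and name below is hypothetical; it is not how Google or any named company works.

```python
# Hypothetical in-house analysis: all data below is invented.
# Each record pairs an employee's engagement survey score (1-5)
# with whether that employee left within the following year.

records = [
    (4.2, False), (3.8, False), (2.1, True), (4.5, False),
    (2.7, True), (3.9, False), (1.9, True), (4.1, False),
    (3.2, False), (2.4, True),
]

def mean(values):
    return sum(values) / len(values)

stayer_scores = [score for score, departed in records if not departed]
leaver_scores = [score for score, departed in records if departed]

# A simple descriptive comparison; a real study would use a proper
# statistical model (e.g., logistic regression) and far more employees.
print(f"stayers: {mean(stayer_scores):.2f}, leavers: {mean(leaver_scores):.2f}")
```

Even a descriptive cut like this answers a targeted question with the organization’s own records rather than a consultant’s proprietary survey.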
4. Network with HR professors. Consider it a mutual relationship. Professors want to conduct research using non-student subjects, and learning leaders can help them access their organization’s employees. Then CLOs can use the results to their advantage.
5. Know your research publications. You don’t have to read every journal out there, but look for those specifically designed to translate scientific findings into practices.
6. Leverage research tools. Look for those that locate measurements and existing findings, such as the Inter-Nomological Network and metaBUS.org.
Bosco said the Reproducibility Project, like any solid experiment, didn’t aim for any particular outcome regarding the validity of these 100 studies. The goal was to provide an honest, open assessment.
“Open science in general — the goal is to show the power of it,” he said. “Instead of hiding behind the data, which is sort of the standard right now, there’s a power that comes from providing the data to the community.”