Does Psychology Have a Reproducibility Problem? Study Says Yes

The hallmark of sound scientific research is replicability—the ability of subsequent researchers to reproduce a study’s results. When research results can’t be replicated, it suggests that the results might have been a fluke, that researcher bias might have colored them, or that the study was poorly designed. A new analysis of 100 psychology studies suggests that the field may have a replicability problem. An international team of researchers was disappointed to find they could reproduce the results of only 39 of the studies.

Scientists Fail to Reproduce Results of Psychological Studies

The effort was launched in response to growing concerns about fraud, researcher bias, and erroneous data analysis. A team of 270 researchers pulled 100 studies published in 2008 in three major psychology journals—the Journal of Personality and Social Psychology, Psychological Science, and the Journal of Experimental Psychology: Learning, Memory, and Cognition. Relying on highly structured protocols, the researchers set out to determine whether they could replicate the results of the studies they selected.

Only 39 studies were successfully repeated with the same results as the original research. An additional 24 produced “moderately similar” results but did not fully reproduce the original findings. Fourteen produced results that bore no resemblance to the originals, and the rest showed some similarities, though not enough to meet scientific standards of replicability.

Does This Study Undermine Psychology’s Credibility?

The studies were all published in peer-reviewed journals, suggesting that neither the studies’ authors nor the scientists who reviewed the research noticed important flaws. This certainly hints at a problem with psychological research, but it may reflect a larger trend across the sciences, not just psychology. For instance, in 2014, Springer and the Institute of Electrical and Electronics Engineers removed more than 120 papers from their databases after it was revealed that the papers were computer-generated gibberish.

Psychology, like all sciences, is home to an increasingly cutthroat political and academic culture. The well-known mandate to “publish or perish” may force academics to rush through their research or to publish results that aren’t nearly as strong or compelling as they first appear. Exciting results may make for flashy headlines, only to be retracted later—or to fade into obscurity when other researchers can’t reproduce the original findings. For armchair psychologists, therapists, and mental health advocates who are interested in or affected by psychological research, this study serves as an important reminder that published scientific studies don’t always “prove” what they claim to.

References:

  1. Do normative scientific practices and incentive structures produce a biased body of research evidence? (2015, April 30). Retrieved from https://osf.io/ezcuj/wiki/home/
  2. First results from psychology’s largest reproducibility test. (2015, April 30). Retrieved from http://www.nature.com/news/first-results-from-psychology-s-largest-reproducibility-test-1.17433
  3. Van Noorden, R. (2014, February 24). Publishers withdraw more than 120 gibberish papers. Retrieved from http://www.nature.com/news/publishers-withdraw-more-than-120-gibberish-papers-1.14763
  4. Yong, E. (2012, May 16). Replication studies: Bad copy. Retrieved from http://www.nature.com/news/replication-studies-bad-copy-1.10634
