The 7 Deadly Sins of Psychology

You may already have dipped into other books addressing misconduct in research, but your journey through this critical landscape will be far from monotonous with Chris Chambers’ insightful offering. He casts a fresh light on the subject by focusing specifically on the challenges encountered within the field of psychology. Chambers meticulously delineates these issues, elucidating their mechanics and exploring how they emerge in psychological research. And he does not stop at identifying the problems: at the end of each chapter, and again at the close of the book, he posits thought-provoking yet pragmatic solutions to rectify them.

This book is a call to action, challenging its readers to engage in a comprehensive re-evaluation and transformation of their approach to psychological research.

Summary

The Sin of Bias

  • People are biased toward estimating the probability of the data if a particular hypothesis is true, rather than the opposite: the probability of the data if that hypothesis is false.
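
To make this bullet concrete, here is a toy Bayes’-rule calculation in Python. The scenario and every number in it are illustrative assumptions of mine, not figures from the book; the point is simply that the probability of the data under a hypothesis, taken on its own, says little about the hypothesis, because you also need to know how probable the same data are when the hypothesis is false.

```python
# Toy Bayes'-rule calculation; all numbers are illustrative assumptions, not from the book.
p_true = 0.10                 # prior: 1 in 10 tested hypotheses is actually true
p_false = 1 - p_true
p_data_if_true = 0.80         # probability of a "significant" result if the hypothesis is true
p_data_if_false = 0.05        # probability of the same result if the hypothesis is false

# What we usually want to know: probability the hypothesis is true, given the observed result
p_true_given_data = (p_data_if_true * p_true) / (
    p_data_if_true * p_true + p_data_if_false * p_false
)

print(f"P(data | hypothesis true)  = {p_data_if_true:.2f}")
print(f"P(data | hypothesis false) = {p_data_if_false:.2f}")
print(f"P(hypothesis true | data)  = {p_true_given_data:.2f}")  # ~0.64 with these numbers
```

With these made-up numbers, a result that is sixteen times more likely when the hypothesis is true than when it is false still leaves roughly a one-in-three chance that the hypothesis is wrong, which is exactly the asymmetry the point above is warning about.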

Confirmation Bias

  • Confirmation bias contrives a measure of scientific reproducibility in which it is possible to replicate but never falsify previous findings, and it encourages altering the hypotheses of experiments after the fact to “predict” unexpected outcomes.

As psychologists Hugo Mercier and Dan Sperber have argued, confirmation bias is perfectly rational in a society where winning arguments is more important than establishing truths.

  • We can never completely eliminate confirmation bias—in Nietzsche’s words we are human, all too human → rather than waging a fruitless war on our own nature, we would do better to accept imperfection and implement measures that protect the outcome of science as much as possible from our inherent flaws as human practitioners. Such protection against bias: preregistration.
    • The essence: the study rationale, hypotheses, experimental methods, and analysis plan are stated publicly in advance of collecting data (a minimal sketch of such a plan follows this list).
    • Prevents publication bias by ensuring that whether results are positive or negative, novel or familiar, groundbreaking or incremental, is irrelevant to whether the science will be published.
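
As a rough illustration only, the commitments such a plan locks in could be captured in a small record like the one below. The field names and the example study are hypothetical, invented for this sketch rather than taken from the book or from any real registry.

```python
# Hypothetical sketch of the elements a preregistration fixes before data collection.
# Field names and the example study are illustrative, not drawn from the book.
from dataclasses import dataclass


@dataclass
class Preregistration:
    rationale: str        # why the study is worth running
    hypotheses: list      # directional predictions, frozen before any data exist
    methods: str          # design, sample size, and stopping rule
    analysis_plan: str    # exact tests, exclusion criteria, and corrections


plan = Preregistration(
    rationale="Test whether retrieval practice improves recall more than rereading.",
    hypotheses=["Recall is higher after retrieval practice than after rereading."],
    methods="Between-subjects design; N = 120 fixed in advance; no optional stopping.",
    analysis_plan="Two-sided independent-samples t-test, alpha = 0.05; "
                  "exclude participants who fail the attention check.",
)

print(plan)
```

Because each of these choices is recorded before any data exist, readers can later check the published paper against the plan, which removes the wiggle room described in the next chapter as hidden flexibility.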

The Sin of Hidden Flexibility

Hidden flexibility

Hidden flexibility means analyzing complex data in many different ways and reporting only the most interesting and statistically significant outcomes. Doing so deceives the audience into believing that such outcomes are credible, rather than existing within an ocean of unreported negative or inconclusive findings. There are all kinds of reasons why researchers peek at data before data collection is complete, but one central motivation is efficiency. In an environment with limited resources, it can often seem sensible to stop data collection as soon as the all-important statistical significance is either obtained or seems out of reach; the simulation sketched below shows the price of this habit.
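
The sketch below is mine, not something from the book: it assumes NumPy and SciPy are available, simulates many studies in which there is no true effect, runs a t-test after every few participants, and stops at the first p < .05. That stopping rule alone pushes the false-positive rate well above the nominal 5 percent.

```python
# Illustrative simulation of optional stopping under a true null effect.
# All parameters below are arbitrary choices for demonstration purposes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies = 2000     # simulated studies, none with a real effect
max_n = 100          # maximum participants per group
batch = 10           # peek at the data after every 10 participants per group


def false_positive_rate(peek: bool) -> float:
    """Fraction of null studies that end up 'significant' at p < .05."""
    hits = 0
    for _ in range(n_studies):
        a = rng.normal(size=max_n)   # group A scores (no true difference)
        b = rng.normal(size=max_n)   # group B scores
        checkpoints = range(batch, max_n + 1, batch) if peek else [max_n]
        for n in checkpoints:
            if stats.ttest_ind(a[:n], b[:n]).pvalue < 0.05:
                hits += 1            # stop early and "publish"
                break
    return hits / n_studies


print(f"Fixed sample size:   {false_positive_rate(peek=False):.3f}")  # close to 0.05
print(f"Peek and stop early: {false_positive_rate(peek=True):.3f}")   # noticeably higher
```

Fixing the sample size or the stopping rule in advance, as preregistration requires, is precisely what closes off this degree of freedom.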

The Sin of Unreliability

  • Science can never escape the risk of human error, but it can and must ensure that it self-corrects.
  • Replication is the immune system of science, identifying false discoveries by testing whether other scientists can repeat them. Unfortunately, the process of replication—so intrinsic to the scientific method—is largely ignored or distorted in psychology.

The Sin of Data Hoarding

  • It is standard practice for psychology researchers to withhold data unless motivated to share out of self-interest or (rarely) when required to share by a higher authority.
  • One reason why psychologists might refuse to share data is for fear of their results being disputed owing to questionable or sloppy research practices.
  • Only by scrutinizing raw data can the scale of such practices be uncovered, and only by introducing a culture where data sharing is the norm can such cases be efficiently detected and purged from the scientific record.

The Sin of Corruptibility

  • Falsifying data offers a low-risk, high-reward career strategy for scientists who, for whatever reason, lose their moral compass and sense of purpose. Like many other sciences, psychology is based on an honesty system that is poorly equipped to prevent or detect fraud. Worst of all, when exposed, fraudulent researchers and their institutions are often minimally accountable while whistle-blowers face vilification.
  • Deliberate exploitation of questionable practices may provide a gateway to fraud, which in turn raises the tricky question of whether conscious exploitation of questionable practices should also be treated as fraudulent.
  • The ultimate method for weeding out all scientific error, including fraud: direct replication by independent researchers.

The Sin of Internment

  • Barrier-based publishing: allows academics, but not the public, to read each other’s articles, although, as Chambers notes, not all universities can even afford to furnish their academics with subscriptions.
  • For failing to embrace the culture of open access that the public and nonacademic users require, psychology is thus guilty of our sixth major transgression: the sin of internment.

Why do psychologists support barrier-based publishing?

  • In part, it stems from the low reliability of psychological research.
  • Many continue to reinforce the status quo, not only to maintain their academic influence but also to protect the careers of their junior protégés.

The Sin of Bean Counting

  • Metrics such as the “impact factor” of the journals in which we publish, the quantity and monetary value of the grants we receive, the number of papers we publish, our author position on those papers, and the number of citations we receive have all become so-called key performance indicators. And because academics, and the bureaucrats who oversee us, view these indicators as synonymous with high-quality science, the measures themselves have become career targets.
  • There are reasons to be skeptical about reducing the quality of science to numbers. Goodhart’s law of economics warns us that when a measure becomes a target, it ceases to be a good measure.

Roads to nowhere

  • We reduce the skills and knowledge of individual scientists to citation metrics that we call “impact”.
  • We reward academics disproportionately for winning research grants, summing scientific inputs (funding) and outputs (discoveries) rather than weighing them against each other, and in the process favoring more expensive research.
  • We judge scientists by their authorship order on research papers while clinging to an antiquated system of attribution that muddies their individual contributions → we have become obsessed with the price of everything and the value of nothing.

Roads to somewhere

  • Metrics like journal impact factor are meaningless as indicators of scientific quality, but citations themselves at the level of individual articles, while not indicators of quality, are suggestive of influence and interest-value.
  • As with citation metrics, however, grant income should never be used to assess the quality or potential of researchers because a grant is a mission plan, not a completed mission.

Read-alikes

If you are interested in reading more about similar topics, but not particularly in the field of psychology, these books might be for you:


Author: Chris Chambers

Publication date: 25 April 2017

Number of pages: 288

