“Just when I thought you couldn’t get any dumber, you go and do something like this… and totally redeem yourself!”
– Harry in Dumb & Dumber
On January 25th, my inbox began filling with emails from friends and fellow researchers around the globe. “Have you seen the article in the Guardian?” they asked. “What do you make of it?” others inquired, “Have you read the study the authors are talking about? Is it true?!” A few of the messages were snarkier, even gloating, “Scott, research has finally proven the Dodo verdict is wrong!”
The article the emails referred to was titled, “Are all psychological therapies equally effective? Don’t ask the dodo.” The subtitle boldly announced, “The claim that all forms of psychotherapy are winners has been dealt a blow.”
Honestly, my first thought on reading the headline was, “Why is an obscure topic like the ‘Dodo verdict’ the subject of an article in a major newspaper?” Who in their right mind–outside of researchers and a small cadre of psychotherapists–would care? What possible interest would a lengthy dissertation on the subject–including references to psychologist Saul Rosenzweig (who first coined the expression in the 1930s) and researcher allegiance effects–hold for the average Joe or Jane reader of The Guardian? At a minimum, it struck me as odd.
And odd it stayed, until I glanced down to see who had written the piece. The authors were psychologist Daniel Freeman–a strong proponent of empirically supported treatments (ESTs)–and his journalist brother, Jason.
Briefly, advocates of ESTs hold that certain therapies are better than others in the treatment of specific disorders. Lists of such treatments are created–for example, the NICE Guidelines–dictating which of the therapies are deemed “best.” Far from innocuous, such lists are, in turn, used to direct public policy, including both the types of treatment offered and the reimbursement given.
Interestingly, in the article, Freeman and Freeman base their conclusion that “the dodo was wrong” on a single study. Sure enough, that one study, comparing CBT to psychoanalysis, found that CBT resulted in superior effects in the treatment of bulimia. No other studies were mentioned to bolster this bold claim–an assertion that would effectively overturn nearly 50 years of robust research findings documenting no difference in outcome among competing treatment approaches.
In contrast to what is popularly believed, extraordinary findings from single studies are fairly common in science. As a result, scientists have learned to require replication, by multiple investigators, working in different settings.
The media? They’re another story. They love such studies. The controversy generates interest, capturing readers’ attention. Remember cold fusion? In 1989, researchers Stanley Pons and Martin Fleischmann–then two of the world’s leading electrochemists–claimed that they had produced a nuclear reaction at room temperature–a finding that would, if true, not only overturn decades of prior research and theory but, more importantly, revolutionize energy production.
The media went nuts. TV and print couldn’t get enough of it. The hope for a cheap, clean, and abundant source of energy was simply too much to ignore. The only problem was that, in the time that followed, no one could replicate Pons and Fleischmann’s results. No one. While the media ran off in search of other, more tantalizing findings to report, cold fusion quietly disappeared, becoming a footnote in history.
Back to The Guardian. Curiously, Freeman and Freeman did not mention another, truly massive study published in Clinical Psychology Review—a study available in print at the time their article appeared. In it, the researchers used the statistically rigorous method of meta-analysis to review results from 53 studies of psychological treatments for eating disorders. Fifty-three! Their finding? Confirming mountains of prior evidence: no difference in effect between competing therapeutic approaches. NONE!
Obviously, however, such results are not likely to attract much attention.
Sadly, the same day that the article appeared in The Guardian, John R. Huizenga passed away. Huizenga is perhaps best known as one of the physicists who helped build the atomic bomb. Importantly, however, he was also among the first to debunk the claims about cold fusion made by Pons and Fleischmann. His real-world experience, and decades of research, made clear that the reports were a case of dumb (cold fusion) being followed by dumber (media reports about cold fusion).
“How ironic this stalwart of science died on this day,” I thought, “and how inspiring his example is of ‘good science.'”
I spent the rest of the day replying to my emails, including the link to the study in Clinical Psychology Review (Smart). “Don’t believe the hype,” I advised, “stick to the data” (and smarter)!