The subject line of a recent email immediately caught my attention: What is the future of psychotherapy?
I couldn’t help myself. I clicked and began reading.
“The future is incredibly bright,” the two well-known writers and clinicians asserted. Don’t worry about “rising rates of trauma, anxiety, bullying, depression, or the precipitous decline in people seeking help from therapists,” they counseled: “There are things emerging that will take us beyond any paradigm that we grew up in!”
Bold and hopeful claims, to be sure. The truth is, our field is founded on hope. Fundamental to the work is the belief that people can change, that the past is not the future, and that genes and environment are not destiny. A happier, healthier, more successful life awaits. We can help.
And yet, the fact remains that the effectiveness of psychotherapy has not improved in more than 40 years. Moreover, despite widespread belief to the contrary, individual clinicians do not get better with time, training, or experience in the field. If anything, the evidence shows the opposite: effectiveness declines!
So, what stands in the way of getting better results? Could it be, as the email in my inbox suggested, that the field suffers from a paucity of appropriate methods for helping people? It’s a tempting idea. And yet, on reflection, it’s hard to believe. The field (and its practitioners) have, at their disposal, literally hundreds of “evidence-based” treatment approaches.
What about a lack of will? Perhaps practitioners are just too lazy or complacent to improve? I’ve heard this claim more than once from various sources. Here again, however, it’s hard to believe. Research documents that therapists, as a group, overwhelmingly want to get better. Indeed, continuous improvement is central to their identity, an antidote to compassion fatigue and burnout.
So what is it? With so many good ideas and intentions, why haven’t outcomes of the field improved?
In a word, the answer is: Implementation.
Although new ideas are plentiful, and literally thousands of research articles are published each month, the support necessary for effective execution of clinical innovations in daily practice is sorely lacking. Despite their best intentions, many practitioners end up paralyzed by what Dr. Sarah Boon has called, “21st century science overload.” Others, faced with demands from regulators and payers, do whatever is necessary to comply with new “minimum” standards. All too often, the result is merely an increased administrative burden with little or no actual payoff in terms of results.
Feedback-Informed Treatment (FIT) is one of those “new ideas.” A large and growing number of randomized controlled trials document significantly improved quality, retention, and effectiveness of behavioral health services when standardized measures are used to solicit feedback from consumers of behavioral health services.
Each week, hundreds of therapists sign up for a free license to use the Outcome and Session Rating Scales in their work. I answer scores of emails from agency managers about using the tools to meet new standards from SAMHSA and the Joint Commission for “measurement-based care.”
If improving outcomes via FIT were a simple matter of combining evidence with a desire to improve, success would be guaranteed. Unfortunately, experience proves otherwise. The majority of clinicians who download the ORS and SRS never use them; the small number who do, stop within a short period of time. The same fate befalls a surprisingly high number of agencies that invest their meager resources in acquiring licenses or software and sending staff to two-day training events.
Last week, I interviewed psychologist Heidi Brattland about her new study on implementing FIT in a hospital-based psychiatric clinic. As in previous investigations, soliciting feedback via the ORS and SRS resulted in significantly better results–indeed, therapies informed by ongoing feedback were two and a half times more likely to prove beneficial. Interestingly, however, the results weren’t immediate. In fact, in the first year, FIT made no difference whatsoever. Instead, confirming what research from the field of implementation science has long indicated, success took time–the best results were obtained after four years.
Click on the video below to learn what was essential to sustaining the agency’s effort along the way (you can read the complete research report here). At the upcoming ICCE Implementation Intensive, we’ll help you translate these findings into a step-by-step, evidence-based plan. It’s the only such training available–a primary reason SAMHSA gave the ICCE perfect scores for implementation and support resources.
Until next time,
Scott D. Miller, Ph.D.
Director, International Center for Clinical Excellence