SCOTT D Miller - For the latest and greatest information on Feedback Informed Treatment

scottdmiller@talkingcure.com +1.773.454.8511

Intake: A Mistake

September 4, 2015 By scottdm 1 Comment


Available evidence leaves little doubt.  As I’ve blogged about previously, separating intake from treatment results in:

• Higher dropout rates;
• Poorer outcomes;
• Longer treatment duration; and
• Higher costs

And yet, in many public behavioral health agencies, the practice is commonplace. What else can we expect?

Chronically underfunded, and perpetually overwhelmed by mindless paperwork and regulation, agencies and practitioners are left with few options to meet the ever-rising number of people in need of help. Between 2009 and 2012, for example, the number of people receiving mental health services increased by 10%. During the same period, funding to state agencies decreased by $4.35 billion. Not long ago, in my own hometown of Chicago, the city shuttered half of its mental health clinics, forcing the remaining, already burdened, agencies to absorb an additional 5,000 people in need of care.

Simply put, the practice of separating intake from treatment is little more than a form of “crowd management”–and an ineffective one at that.


Adding to the growing body of evidence is a new study investigating the impact of computerized intake on the consumer’s experience of the therapeutic relationship and continuation in care. Not only did researchers find that therapist use of a computer had a negative impact on the quality of the working relationship—one of the best predictors of outcome–but clients were between 62 and 97% less likely to continue in care!


It’s not hard to see how these well-intentioned—some would argue, absolutely necessary—solutions actually end up exacerbating the problem. Money is wasted when the paperwork is completed but people don’t come back; money that would be better spent providing treatment. Those who do not return don’t disappear, they simply access services in other ways (e.g., the E.R., police and social services, etc.)—after all, they need help! The ones who do continue after intake experience poorer outcomes and stay longer in care, a cost to both the consumer and the system.

What to do?


In addition to pushing back against the mindless regulation and paperwork, there are several steps practitioners and agency managers can take:

  • Stop separating intake from treatment

The practice does not save time and actually increases costs. Consider having consumers complete as much of the paperwork as possible before the session begins. The first visit is critical. It determines whether people continue or drop out. Listen first. At the end of the visit, review the paperwork, filling in missing data and completing any remaining forms.

  • Begin monitoring outcome

Research to date shows that routinely monitoring progress reduces dropout rates and the length of time spent in treatment while simultaneously improving outcome. Combined, such results work to alleviate the bottleneck at the entry point of services.

  • Begin monitoring the quality of the therapeutic relationship

Engagement and outcomes are improved when problems in the relationship are identified and openly discussed. Even when intake is separated from treatment, feedback should be sought. Data to date indicate that the most effective clinicians seek and more often receive negative feedback, a skill that enables them to better meet the needs of those they serve.

Getting started is not difficult. Indeed, there’s an entire community of professionals just a click away who are working with and learning from one another. The International Center for Clinical Excellence is the largest web-based community of mental health professionals in the world. It’s ad-free and costs nothing to join.

Sign up for the ICCE Fall Webinar. You will learn:

  • The Empirical Basis for Feedback Informed Treatment
  • Basics of Outcome and Alliance Measurement
  • Integrating Feedback into Practice & Creating a Culture of Feedback
  • Understanding Outcome and Alliance Data

Register online at: https://www.eventbrite.ie/e/fall-2015-feedback-informed-treatment-webinar-series-tickets-17502143382. CEs are available.

Finally, join colleagues and friends from around the world for the Advanced and FIT Supervision courses, held in March in Chicago. We work and play hard. You will leave with a thorough grounding in feedback-informed principles and practice. Registration is limited, and the courses tend to sell out several months in advance.

Until then,

Scott

Scott D. Miller, Ph.D. Director, International Center for Clinical Excellence


Filed Under: Behavioral Health, evidence-based practice, Feedback, Feedback Informed Treatment - FIT, ICCE

What’s happening to CBT? And why all the hoopla misses the point

July 29, 2015 By scottdm 8 Comments

Previously, I’ve blogged about results from a Swedish study examining the impact of psychotherapy’s “favorite son”–cognitive behavioral therapy–on the outcome of people disabled by depression and anxiety.  Like many other Western countries, the percentage of people in Sweden disabled by mental health problems was growing dramatically.  Costs were skyrocketing.  Even with treatment, far too many left the workforce permanently.

Sweden embraced “evidence-based practice”–most popularly construed as the application of specific treatments to specific disorders–as a potential solution. Socialstyrelsen, the country’s National Board of Health and Welfare, developed and disseminated a set of guidelines (“riktlinjer”) specific to mental health practice. Topping the list? CBT.

A billion crowns were spent training clinicians in the method; another billion using it to treat people with diagnoses of depression and anxiety. As I reported, the State’s “return on investment” was … zilch. Said another way, the widespread adoption of the method had no effect whatsoever on outcome (see Socionomen, Holmquist Interview). Not only that, but many who were not disabled at the time they were treated with CBT became disabled along the way, bringing the total price tag, when combined with the 25% who dropped out of treatment, to a staggering 3.5 billion!

And now, a new study–this time from Norway, Sweden’s neighbor to the west.  Norwegian researchers looked at how the effectiveness of CBT has fared over time.  Examining data from 70 randomized clinical trials, study authors Johnsen and Friborg found the approach to be roughly half as effective as it was four decades ago.  Mind you, not 10 or 20 percent.  Not 30 or 40.  Fifty percent less effective!  Cause for concern, to be sure.

So, what’s happening to CBT?  Is the “favored son” losing its effectiveness?

Naturally, the results published by the Norwegian researchers generated a great deal of activity in social media.  Critics were gleeful (see the comments at the end of the article).  Proponents, of course, questioned the results.

If the findings are confirmed in subsequent studies, CBT will be in remarkably good company.  Across a variety of disciplines–pharmacology, medicine, zoology, ecology, physics–promising findings often “lose their luster,” with many fading away completely over time (Lehrer, 2010; Yong, 2012).  Alas, even in science, the truth occasionally wears off.  In psychiatry and psychology, this phenomenon, known as the “decline effect,” is particularly vexing.

That said, while the study and commentary have managed to generate a modest amount of heat, they’ve shed precious little light on the question of how to improve the outcome of psychotherapy.  After all, that’s what led Sweden to invest so heavily in CBT in the first place–doing so, it was believed, would improve the effectiveness of care.  So today, I called Rolf Holmqvist.

Rolf is a professor in the Department of Behavioral Science and Learning at Linköping University.  He’s also the author of the Swedish study mentioned above.  I wanted to catch up, find out what, if anything, had happened since he published his results.

“Some changes were made in the guidelines some time ago.  In the case of depression, for example, the guidelines have become a little more open, a little broader.  CBT is always on top, along with IPT, but psychodynamic therapy is now included … although it’s further down on the list.”

Sounded like progress, until Rolf continued, “They are broadening a bit.  Still the fact is that if you look at the research, for example, with mild and moderate depression, almost any method works if it’s done systematically.”

Said another way, despite the lack of evidence for the differential effectiveness of psychotherapeutic approaches–in this case, CBT for depression–the mindset guiding the creation of lists of “specific treatments for specific disorders” remains.

Rolf’s sentiments are echoed by uber-researchers Wampold and Imel (2015), who point out, “Given the evidence that treatments are about equally effective, that treatments delivered in clinical settings are effective (and as effective as that provided in clinical trials), that the manner in which treatments are provided is much more important than which treatment is provided, mandating particular treatments seems illogical. In addition, given the expense involved in ‘rolling out’ evidence-based treatments in private practices, agencies, and in systems of care, it seems unwise to mandate any particular treatment.”

Right now, in Sweden, an authority within the Federal government (Riksrevisorn) is conducting an investigation evaluating the appropriateness of funds spent on training and delivery of CBT. In an article on the subject in one of the country’s largest newspapers, Rolf Holmqvist argues, “Billions spent–without any proven results.”

Returning to the original question: what can be done to improve the outcome of psychotherapy?

“We need transparent evaluation systems,” Rolf quickly answered, “that provide feedback at each session about the progress of treatment.  This way, therapists can begin to look at individual treatment episodes, and be able to see when, where, and with whom they are and are not successful.”

“Is that on the agenda?” I asked, hopefully.

“Well,” he laughed, “here, we need to have realistic expectations. The idea of recommending that you should employ a clinician because they are effective and a good person, rather than because they can do a certain method, is hard for regulatory agencies like Socialstyrelsen. They think of clinicians as learning a method, then applying that method, and that it’s the method that makes the process work…”

“Right,” I thought, “mindset.”

“… and that will take time,” Rolf said, “but I am hopeful.”

But you don’t have to wait. You can begin tracking the quality and outcome of your work right now. It’s easy and free. Click here to access two simple scales–the ORS and SRS. The first measures progress; the second, the quality of the working relationship.

Next, read our latest article on how the field’s most effective practitioners use the measures to, as Rolf advised, “identify when, where, and with whom” they are and are not successful, and what steps they take to improve their effectiveness.

Finally, why not get some training in “Feedback Informed Treatment?”

Filed Under: Feedback Informed Treatment - FIT

Love, Mercy, & Adverse Events in Psychotherapy

July 9, 2015 By scottdm 10 Comments

Just over a year ago, I blogged about an article that appeared in one of the U.K.’s largest daily newspapers, The Guardian.  Below a picture of an attractive, yet dejected looking woman (reclined on a couch), the caption read, “Major new study reveals incorrect … care can do more harm than good.”

I was interested.

As I often do in such cases, I wrote directly to the researcher cited in the article asking for a reprint or pre-publication copy of the study.  No reply.  One month later, I wrote again.  Still, no reply. Two months after my original email, I received a brief note thanking me for my interest in the study and offering to share any results once they became available.

“Wait a minute,” I immediately thought, “The results of this ‘major new study’ about the harmful effects of psychotherapy had already been announced in a leading newspaper.  How could they not be available?”  Then I wondered, “If there are no actual results to share, what exactly was the article in The Guardian based on?”

So-called “adverse events” are a hot topic at the moment. That some people deteriorate while in care is not in question. Research dating back several decades puts the figure at about 10%, on average (Lambert, 2010). When those being treated are adolescents or children, the rates can be twice as high (Warren et al., 2009).

Putting this in context, compared to medical procedures with effect sizes similar to psychotherapy (e.g., coronary artery bypass surgery, stages II and III breast cancer, stroke), the rate is remarkably low. Nonetheless, it is a matter of concern–especially given research showing that therapists are not particularly adept at recognizing when those they serve deteriorate in their care (Hannan et al., 2005).

The question, of course, is: what’s the cause?

To date, whenever the question of adverse events is raised, two “usual suspects” are trotted out: (1) the method of treatment used; and (2) the therapist.  Let’s take a closer look at each.

In an October 2014 article published in World Psychiatry, Linden and Schermuly-Haupt wrote about estimates of side effects associated with specific methods of treatment that had been reported in an earlier study by Swiss researchers. The numbers were shocking. Patient-reported “burdens caused by therapy” were 19.7% with CBT, 20.4% with systemically oriented treatments, 64.8% with humanistic approaches, and a staggering 94.1% with psychodynamic psychotherapy.

Based on such results, one could only conclude that anyone seeking anything other than CBT should have their head examined.

There is only one problem. The figures reported were wrong. Completely and utterly wrong. Linden and Schermuly-Haupt made an arithmetic error and, as a result, totally misinterpreted the Swiss findings. Read the study for yourself. When it comes to adverse events in psychotherapy, CBT–the fair-haired child of the evidence-based practice movement–is no better. Indeed, as the study clearly shows, people treated with humanistic and systemic approaches suffered fewer “burdens” than expected, while those in CBT reported a slightly higher, though not statistically significant, level. What’s more, the observed percentage of people in care who perceived the quality of the therapeutic relationship–the single most potent predictor of engagement and outcome–as poor was significantly higher than expected in CBT and lower for both humanistic and systemic approaches.

How could the researchers have gotten it so wrong?

As I pointed out in my blog over a year ago, despite claims to the contrary (e.g., Lilienfeld, 2007), no psychotherapy approach tested in a clinical trial has ever been shown to reliably lead to or increase the chances of deterioration. NONE. Scary stories about dangerous psychological treatments are limited to a handful of fringe therapies–approaches that have never been vetted scientifically and which all but a few practitioners avoid. In short, it’s not about the method.

(By the way, over a month ago, I wrote to the lead author of the paper that appeared in World Psychiatry via the ResearchGate portal–a site where scholars meet and share their publications–providing a detailed breakdown of the statistical errors in the publication. No response thus far.)

With only one suspect left, attention naturally turns to the therapist–you know, the “bad apple” in the bunch.  Here’s what we know.  That some practitioners do more harm than others is not exactly news. Have you seen the new biopic Love & Mercy, about the life of Beach Boy Brian Wilson?  You should.  The acting is superb.

Wilson’s therapist, psychologist Eugene Landy (chillingly recreated by actor Paul Giamatti), is a prime example of an adverse event. See the film and you’ll most certainly wonder how the guy kept his license to practice so long. And yet, as I also pointed out in my blog last year, there are too few such practitioners to account for the total number of clients who worsen. Consider this unsettling fact: beyond the 10% who deteriorate in psychotherapy, an additional 30 to 50% experience no benefit whatsoever!

Where does this leave us when it comes to adverse events in psychotherapy?

Whatever the cause, lack of progress and risk of deterioration are issues for all clinicians and clients.   The key to addressing these problems is tracking progress from visit to visit so that those not improving, or getting worse, can be identified and offered alternatives.  It’s that simple.

Right now, practitioners can access two simple, easy-to-use scales for free at www.scottdmiller.com.  Both have been tested in multiple, randomized, clinical trials and deemed evidence-based by the Federal Substance Abuse and Mental Health Services Administration (SAMHSA).

Learning to use the tools isn’t difficult.  It costs nothing to subscribe to the International Center for Clinical Excellence and begin interacting with professionals around the world who are using the measures to improve the quality and outcome of behavioral health services.

P.S.: On the one-year anniversary of my original email to the researcher cited in The Guardian, I sent another. That was over a month ago. So far, no reply. By contrast, the reporter who broke the story, Sarah Boseley, wrote back within a half hour! She’s following up with her sources. I’ll let you know if she gets a response.

Filed Under: Behavioral Health, Conferences and Training, Feedback Informed Treatment - FIT, Top Performance

