SCOTT D Miller - For the latest and greatest information on Feedback Informed Treatment


Do you know who said, "Sometimes the magic works, sometimes it doesn’t"?

April 30, 2014 By scottdm Leave a Comment

Well, do you?

It was Chief Dan George playing the role of Old Lodge Skins in the 1970 movie, “Little Big Man.”  Whether or not you’ve seen or remember the film, if you’re a practicing therapist, you know the wisdom contained in that quote.  No matter how skilled the clinician or devoted the client, “sometimes therapy works, sometimes it doesn’t.”

Evidence from randomized clinical trials indicates that, on average, clinicians achieve a reliable change–that is, a difference not attributable to chance, maturation, or measurement error–with approximately 50% of people treated.  For the most effective therapists, it’s about 70%.  Said another way, all of us fail between 30 and 50% of the time.
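For readers curious how "reliable change" is actually computed, the standard approach is the Jacobson–Truax Reliable Change Index.  Here is a minimal sketch in Python; the numbers below are purely illustrative and are not drawn from the trials cited above:

```python
# Jacobson-Truax Reliable Change Index (RCI): change is "reliable" when
# it exceeds what measurement error alone could plausibly produce.
import math

def reliable_change_index(pre, post, sd_baseline, reliability):
    """RCI = (post - pre) / S_diff, where S_diff is the standard error
    of the difference between two administrations of the measure."""
    se_measurement = sd_baseline * math.sqrt(1 - reliability)
    s_diff = math.sqrt(2) * se_measurement
    return (post - pre) / s_diff

# Hypothetical example: a 10-point gain on a scale with a baseline
# SD of 8 and test-retest reliability of .80.
rci = reliable_change_index(pre=15, post=25, sd_baseline=8, reliability=0.80)
print(abs(rci) > 1.96)  # exceeds the 1.96 cutoff, so the change is reliable
```

An |RCI| greater than 1.96 means the change would occur by measurement error alone less than 5% of the time.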

Of greater concern, however, is the finding that we don’t see the failure coming.  Hannan and colleagues (2005) found, for example, that therapists correctly predicted deterioration in only 1 of 550 people treated, despite having been told beforehand the likely percentage of their clients that would worsen and knowing they were participating in a study on the subject!

It’s one thing when “the magic doesn’t work”–nothing is 100%–but it’s an entirely different matter when we go on believing that something is working when it’s not.  Put bluntly, we are a terminally and forever hopeful group of professionals!

What to do?  Hannan et al. (2005) found that simple measures of progress in therapy correctly identified 90% of clients “at risk” for a negative outcome or dropout.  Other studies have found that routinely soliciting feedback from people in treatment regarding progress and their experience of the therapeutic relationship as much as doubles effectiveness while simultaneously reducing dropout and deterioration rates.

You can get two simple, evidence-based measures for free here.  Get started by connecting with and learning from colleagues at The International Center for Clinical Excellence.  It’s free and signing up takes only a minute or two.

Finally, read the Feedback Informed Treatment and Training Manual, containing step-by-step instructions for using the scales to guide and improve the services you offer.

Here’s to knowing when our “magic” is working, and when it’s not!

Filed Under: Feedback Informed Treatment - FIT Tagged With: icce, international center for clinical excellence, magic, outcome measurement, randomized clinical trial, therapy

Do you do psychotherapy?

September 26, 2013 By scottdm 1 Comment

You know psychotherapy works. Forty years of research evidence backs up your faith in the process. And yet, between 1998 and 2007, psychotherapy use decreased by 35%.  People still sought help, they just went elsewhere to get it.  For instance, use of psychotropic drugs is up 40% over the last decade.

A recent article in Popular Science traced the decline and outlined 3 provocative steps for saving the field. If you provide psychotherapy, it’s worth a read. The article is dead serious when recommending:

1. It’s time to GO BIG;

2. Getting a cute commercial; and

3. Dropping the biology jargon.

You’ve got to admit that the field’s fascination with biology is curious. A mountain of evidence points instead to the relationship between the provider and recipient of care. Other research shows that psychotherapy promotes more lasting change, at less cost and with fewer side effects than medication.

How to get the message out?

Many people and organizations are making a valiant effort. Ryan Howe almost single-handedly established September 25 as National Psychotherapy Day.  The American Psychological Association published a rare, formal resolution on the efficacy of psychotherapy.

Frankly though, the best commercial for psychotherapy is our results. Consider the approach taken by the Colorado Center for Clinical Excellence. They don’t merely cite studies supporting psychotherapy in general, they report their actual results!

You can begin doing the same by downloading two free, simple to use measures here.

Then, learn how to use the scales by reading the latest edition of the FIT Treatment and Training Manual.  In it, you’ll also learn how to use the data to improve both the quality and outcome of your services.

Filed Under: behavioral health, Conferences and Training, Feedback Informed Treatment - FIT Tagged With: American Psychological Association, NREPP, Popular Science, psychotherapy, SAMHSA

NIMH Dumps the DSM-5: The No News Big News

May 10, 2013 By scottdm 1 Comment

Some time ago, I blogged about results from field trials of the soon-to-be-released, fifth edition of the Diagnostic and Statistical Manual of Mental Disorders.  Turns out, many of the diagnoses in the “new and improved” version were simply unreliable.  In fact, the likelihood of two clinicians, applying the same criteria to assess the same person for the two most common mental health conditions—anxiety and depression—and agreeing, was worse than it was with DSM-IV, the ICD-10, or the DSM-III!

The question of validity, that is how well the diagnoses relate to real world phenomena, has never been addressed empirically in any edition.  Essentially, DSM is a collection of symptom clusters, not too dissimilar from categorizing people according to the four humours—and, it turns out, about as helpful in determining the appropriate or likely outcome of any treatment provided.

Despite these serious shortcomings, the volume exerted tremendous power and influence over research and practice for the last three decades.  Nearly all graduate programs teach it, research is organized around its content, and insurance companies and payers (including the Federal government) demand it for reimbursement.  In short, everyone acted “as if” it were true—that is, until last week when NIMH Director, Thomas Insel, announced the organization was abandoning the DSM.  As if having woken up from a thirty-year nap, the reason given was the volume’s lack of validity!  Really?

The day the announcement was made, I received a bunch of emails.   Most of the writers were elated.  They knew I’d been critical of the volume for many years.  “Finally,” one said, “a return to sanity.”  My response?  Not so fast.

To begin, DSM is not going away any time soon.  Sorry, but if you want to be paid, keep your trusty copy nearby.

More troubling— if you read the fine print—NIMH is promising a better system, based on “a new idea everyone should welcome.”   Just what is that idea?   Mental health problems are biological in origin.  To achieve better outcomes, NIMH-funded researchers need to map the “cognitive, circuit, and genetic aspects of mental disorders” so as to identify “new and better targets for treatment.”  Insel calls it, “precision medicine.”

Now, I don’t know about you, but the new idea sounds a heck of a lot like the old one to me!  Psychiatry’s biological bandwagon blew into town last century and has been playing the same tune ever since.  Remember the “dexamethasone suppression test” for differentiating endogenous from non-endogenous depression?  How about the claims made about Xanax in the treatment of panic or the “new” anti-psychotics?   There’s always prefrontal lobotomy, which, like the DSM, proponents continued to use and promote long after its lack of efficacy and brain-disabling side effects were known.  Heck, the originator won a Nobel Prize!

As far as the promise of something better is concerned, history should chasten any hope one might feel.  Honestly, when was the last time the field failed to claim significant progress was being made?  Each new treatment approach is pitched as a vast improvement over “old ideas.”  CBT is better than psychodynamic, specific is better than eclectic, evidence-based treatments are better than routine clinical practice, and so on—except none of these widely promulgated notions holds empirical water.

If “news” = new + different, then the NIMH announcement, like so much of what you find on TV and other social media, is definitely not news.  It’s more of the same.  Precision medicine in mental health is: 90% promise + 10% hyperbole, or marketing.

Here are a few newsworthy facts with immediate implications for mental health policy, practice, and research:

  1. Treatment works.  Evidence gathered over the last four decades documents that people who receive therapy are better off than 80% of those with the same problem or concern who go without the benefit of treatment.
  2. A majority of potential consumers (78%) cite “lack of confidence” in the outcome of treatment as a barrier to seeking help from a mental health professional.
  3. Tracking a consumer’s engagement and progress during treatment enables clinicians to tailor services to the individual, resulting in lower costs, fewer drop outs, and as much as three times the effects!

Just a thought—if we really want to step into the future, rather than to geneticists, neurologists, and radiologists, perhaps the field could start by listening to consumers.  That’s exactly the point Ernesto Sirolli made in a recent TED talk.  If you haven’t seen it, here it is:

Filed Under: Feedback Informed Treatment - FIT Tagged With: CBT, DSM, ICD-10, NIMH, psychiatry

What to Pay Attention to in Therapy?

March 15, 2013 By scottdm Leave a Comment

A week or so ago, I received an email from my friend, colleague, and mentor Joe Yeager.  He runs a small listserve that sends out interesting and often provocative information.  The email contained pictures from a new and, dare I say, ingenious advertising campaign for Colgate brand dental floss.  Before I give you any further details, however, take a look at the images yourself:

All right.  So what caught your attention?  If you’re like most people–including me–you probably found yourself staring at the food stuck in the teeth of the men in all three images.  If so, the ad achieved its purpose.  Take a look at the pictures one more time.  In the first, the woman has one too many fingers on her left hand.  The second image has a “phantom arm” around the man’s shoulder.  Can you see the issue in the third?

The anomalies in the photos are far from minor!  And yet, most of us, captured by what initially catches our eye, miss them.

Looking beyond the obvious is what Feedback Informed Treatment (FIT) is all about.  Truth is, much of the time therapy works.  What we do pay attention to gets results–except when it doesn’t!  At those times, two things must happen: (1) we have to know when what we usually do isn’t working with a given person; and (2) we have to look beyond the obvious and see a bigger picture.  Doing this takes effort and support.  What can you do?

1. Download two free, brief, simple to use tools for tracking outcome and engagement in care (the ORS and SRS) and begin using them in your work;

2. Join the International Center for Clinical Excellence, a free, online, non-denominational organization of behavioral health professionals;

3. Read the FIT Treatment and Training manual

Filed Under: Feedback Informed Treatment - FIT Tagged With: accountability, Alliance, behavioral health, deliberate practice, evidence based practice, feedback, NREPP, SAMHSA

SAMHSA designates Feedback-Informed Treatment an "Evidence-based Practice"

February 2, 2013 By scottdm Leave a Comment

(This post is included for historical purposes.  Following the 2016 election, the NREPP registry was decommissioned)

February 2, 2013
Chicago, Illinois USA

I am honored to announce that Feedback-Informed Treatment (FIT) has been added to SAMHSA’s official database of evidence-based practices (EBP) known as NREPP (the National Registry of Evidence-based Programs and Practices).  Briefly, NREPP is a searchable online registry of behavioral health interventions that have been reviewed and rated by independent reviewers.  The purpose of the registry is to assist the public, payers, and practitioners in identifying approaches that have both empirical support and materials available to facilitate implementation.

The Institute of Medicine and American Psychological Association define EBP as “the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences” (see American Psychologist, May 2006).  The principles and practices of feedback-informed treatment (FIT) are not only consistent with but provide practitioners with a simple and practical method for operationalizing EBP in their daily work.  To wit, routinely and formally soliciting feedback from consumers regarding the therapeutic alliance and outcome of care and using the resulting information to inform and tailor service delivery.  Multiple, carefully-controlled, randomized clinical trials document that FIT improves outcomes while simultaneously decreasing the risk of drop out and deterioration in care.

Scientific evidence is one matter; being able to support practitioners, agencies, and systems of care in implementing an EBP is another.  On this subject, I am proud to say that FIT received perfect ratings.  Unlike other similar approaches, “no weaknesses” were identified by reviewers.  Instead, the summary noted, “ICCE…has an array of comprehensive, well-organized, and high-quality materials to support…implementation…The steps for successful implementation are clear and accompanied by tools and guidance to support the entire process, from the determination of organization’s readiness through evaluation.”

Such high marks would not have been possible without the contribution of ICCE Senior Associates who worked tirelessly to create the materials and complete the application.   A big thanks to Jason Seidel, Psy.D., Bob Bertolino, Ph.D., Susanne Bargmann, Cynthia Maeschalck, Rox Axsen, Bill Robinson, Robbie Babbins-Wagner, Ph.D., and Julie Tilsen, Ph.D.

The formal recognition of FIT as an EBP is a watershed moment in the history of the International Center for Clinical Excellence, further enabling the organization to achieve its mission of improving the quality and outcome of behavioral health services.

Filed Under: Feedback Informed Treatment - FIT Tagged With: NREPP, SAMHSA

Curing Clinician Overconfidence: Try Darting and Frowning

January 10, 2013 By scottdm Leave a Comment

Overconfidence.  It’s a problem that leads to systematic errors in judgment.   Long thought to arise out of hubris or the corrupting effects of emotion, overconfidence, the evidence actually shows, is built into humans’ evolved cognitive machinery.  Existimo ergo certus sum (I think, therefore I am…certain).

Behavioral health professionals are not immune.  A recently published study by Walfish, McAlister, O’Donnell, and Lambert (2012) asked clinicians how their effectiveness rates compared to other professionals.  Turns out, clinicians, on average, believed their results were better than 80% of their peers.  Not a single practitioner surveyed viewed themselves as below average and a full quarter (25%) thought they fell at the 90th percentile or higher in skill level and effectiveness!

It’s true that we are not alone in this tendency.  As indicated above, it’s how our brains work.  The typical driver, for example, believes themselves to be better than 80% of others on the road.  University professors, it appears, suffer from the most inflated levels of self-esteem, ranking themselves at the 94th percentile on average.

When it comes to learning, the consequences are significant.  Why change, after all, if you’re already pretty darn good and if the real problem is obviously elsewhere: other drivers, poor students, difficult life circumstances, or the complex nature of some mental disorders?

Researchers have discovered a relatively simple solution to overconfidence: frowning.  That’s right.  Turning that smile upside down short circuits our reptilian wiring, making us more analytical and vigilant in our thinking, in the process enabling us to “question stories that we would otherwise unreflectively accept as true because they are facile and coherent” (Holt, 2011).

What else can clinicians do?  Do something to gain perspective.  Take on another, divergent point of view, for example.  Practically speaking, scan rather than fix your gaze.  Literally, move your eyes.

Everyone has heard of “tunnel vision.”  Turns out, despite pledges to remain open and flexible, avoiding it ain’t so easy.  If you don’t agree, try a little experiment.  Fix your eyes on the flashing red and/or green dot at the center of the graphic and notice what happens to the surrounding yellow ones.  Be patient if the image hasn’t loaded.  It can take a minute or two.

They either blinked on and off or disappeared completely.  Interesting enough, but here’s what’s really strange: the yellow dots actually never disappear.  They are always there despite what you see.  And no, the computer did not scan your visual field and cause the yellow dots to blink.  Neither is this an optical illusion.  Once again, it’s the way we are wired.  We think we are seeing everything…but we are not.  The result: overconfidence.  It’s why, following an automobile accident, people will say, “the other driver came out of nowhere.”  It’s why surgeons leave sponges inside their patients or miss seeing bleeds or small nicks of the scalpel.  It’s also why behavioral health practitioners routinely fail to detect deterioration and people at risk for dropping out of services (Hannan, et al. 2005).

Now, look again.  This time, however, shift your eyes about while watching the flashing dot in the center.  In other words, don’t fix your gaze.  If that doesn’t change what you see, then step back from the image and view it from a distance.  There, see!  The yellow dots are present the entire time.

Helping busy practitioners step back, shift their gaze, and otherwise improve their critical faculties and skills is the mission of ICCE.  Members connect, learn from, and share with the largest online community of mental health professionals in the world.  Thousands of members, hundreds of discussion forums, a massive and ever-growing library of research and other supportive documents, and how-to videos are available for free 24-7-365.

Many of the members and associates will be meeting in Amsterdam, Holland for the Achieving Clinical Excellence conference on May 16-18th.  Conference coordinator, Liz Pluut, has organized a line-up of international speakers, researchers, and practitioners that is guaranteed to push your clinical performance to the next level!  Participants are coming from all over Europe, the US, Canada, Asia, Australia, and more.  Don’t wait to register.  Space is limited and the response has been amazing.

OK, here’s something fun.  Take a look at the video below.  Oh yeah, make sure you smile and keep your eyes fixed on my hands!

Filed Under: Conferences and Training, Feedback Informed Treatment - FIT Tagged With: behavioral health, icce

Believing is Seeing: How Wishing Makes Things So

January 3, 2013 By scottdm Leave a Comment

Yesterday evening, my family and I were watching a bit of T.V.  My son, Michael, commented on all the ads for nutritional supplements, juicing machines, weight loss programs and devices.  “Oh yeah,” I thought, then explained to him, “It’s the start of a new year.”  Following “spending more time with family,” available evidence shows exercise and weight loss top the bill of resolutions.  Other research shows that a whopping 80% eventually break these well-intentioned commitments.  Fully a third won’t even make it to the end of the month!  Most attribute the failure to being too busy, others to a lack of motivation.  Whatever the cause, it’s clear that, when it comes to change, hope and belief will only take you so far.

What can help?  More on that in a moment.

In the meantime, consider a recent study on the role of hope and belief in research on psychotherapy.  Beginning in the 1970s, study after study, and studies of studies, have found a substantial association between the effectiveness of particular treatment models and the beliefs of the researchers who conduct the specific investigations.  In the literature, the findings are referred to under the generic label, “research allegiance” or R.A.  Basically, psychotherapy outcome researchers tend to find in favor of the approach they champion, believe in, and have an affinity towards.  Unlike New Year’s resolutions, it seems, the impact of hope and belief in psychotherapy outcome research is not limited; indeed, it carries investigators all the way to success–albeit a result that is completely “in the eye of the beholder.”  That is, if one believes the research.  Some don’t.

Hang with me now as I review the controversy about this finding.  As robust as the results on researcher allegiance appear, an argument can be made that the phenomenon is a reflection rather than a cause of differences in treatment effectiveness.  The argument goes: researcher allegiance is caused by the same factors that lead to differences in outcome between approaches: real differences in outcome between approaches.  In short, researchers’ beliefs do not cause the effects, as much as the superior effects of the methods cause researchers to believe.   Makes sense, right?  And the matter has largely languished there, unresolved for decades.

That is, until recently.  Turns out, believing is seeing.  Using a sample of studies in which treatments with equivalent efficacy were directly compared within the same study, researchers Munder, Fluckiger, Gerger, Wampold, and Barth (2012) found that a researcher’s allegiance to a particular method systematically biases their results in favor of their chosen approach.  The specific methods included in this study were all treatments designated as “Trauma-focused” and deemed “equally effective” by panels of experts such as the U.K.’s National Institute for Clinical Excellence.  Since the trauma-focused approaches are equivalent in outcome, researcher allegiance should not have been predictive of outcome.  Yet, it was–accounting for an incredible 12% of the variance.  When it comes to psychotherapy outcome research, wishing makes it so.

What’s the “take-away” for practitioners?  Belief is powerful stuff: it can either help you see possibilities or blind you to important realities.  Moreover, you cannot check your beliefs at the door of the consulting room, nor would you want to.  Every day, therapists encourage people to take the first steps toward a happier, more meaningful life by rekindling hope.  However, if researchers, bound by adherence to protocol and subject to peer review, can be fooled, so can therapists.  The potentially significant consequences of unchecked belief become apparent when one considers a recently published study by Walfish et al. (2012) which found that therapists on average overestimate their effectiveness by 65%.

When it comes to keeping New Year’s resolutions, experts recommend avoiding broad promises and grand commitments and instead advise setting small, concrete, measurable objectives.  Belief, it seems, is most helpful when its aims are clear and effects routinely verified.  One simple way to implement this sage counsel in psychotherapy is to routinely solicit feedback from consumers about the process and outcome of the services offered.  Doing so, research clearly shows, improves both retention and effectiveness.

You can get two simple, easy-to-use scales for free by registering at: http://scottdmiller.com/srs-ors-license/  A worldwide community of behavioral health professionals is available to support your efforts at: www.centerforclinicalexcellence.com.

You can also join us in Chicago for four days of intensive training.  We promise to both challenge your beliefs and provide you with the skills and tools necessary for pushing your clinical performance to the next level of effectiveness.

Filed Under: Feedback Informed Treatment - FIT Tagged With: NICE, ors, outcome rating scale, psychotherapy, session rating scale, srs, wampold

Feedback in Groups: New Tools, New Evidence

December 29, 2012 By scottdm Leave a Comment

Groups are an increasingly popular mode for delivering behavioral health services.  Few would deny that using the same hour to treat multiple people is more cost-effective.  A large body of research shows it to be as effective in general as individually delivered treatments.

Now clinicians can incorporate feedback into group therapy using a brief, scientifically validated measurement scale: the Group Session Rating Scale.  The measure is part of the packet of FIT tools available in 20+ languages on both my personal and the International Center for Clinical Excellence websites.   Since the alliance is one of the most robust predictors of outcome, the GSRS provides yet another method for helping therapists obtain feedback from consumers of behavioral health services.  As readers of this blog know, over a dozen randomized clinical trials document the positive impact of routinely assessing consumers’ experience of progress and the alliance on both retention and outcome of treatment.

The most up-to-date information about incorporating the GSRS into group therapy is covered in Manual 5: Feedback Informed Clinical Work: Specific Populations and Service Settings written together with ICCE Senior Associates Julie Tilsen, Cynthia Maeschalck, Jason Seidel, and Bill Robinson.

Manual 5 is one of six state-of-the-art, how-to volumes on Feedback-Informed Treatment.  The series covers every aspect of FIT, from supporting research to implementation in agencies and larger systems of care.  They were developed and submitted in partial support of ICCE’s application to SAMHSA for designation as an evidence-based practice.

These popular e-books are being used in agencies and by practitioners around the world.  Right now, they are also available on a limited-edition, searchable CD at 50% off the regular price.  As always, individual clinicians can download the GSRS and begin using it in their work for free.

Advanced FIT Training - March 2013

Using the GSRS to inform and improve the effectiveness of group therapy will also be a focus of the ICCE Advanced Intensive training scheduled for March 18th-21st in Chicago, Illinois (USA).  Registration is simple and easy.  Click here to get started.  Participants from all over the United States, Canada, Europe and elsewhere are already registered to attend.

Click on the link below to read the validation article on the GSRS:

The Group Session Rating Scale (Quirk, Miller, Duncan, Owen, 2013)

Filed Under: Feedback Informed Treatment - FIT Tagged With: behavioral health, feedback informed treatment, ors, outcome rating scale, session rating scale, srs

Dealing with Scientific Objections to the Outcome and Session Rating Scales: Real and Bogus

December 15, 2012 By scottdm Leave a Comment

The available evidence is clear: seeking formal feedback from consumers of behavioral health services decreases drop out and deterioration while simultaneously improving effectiveness.  When practitioners are taught how to use the ORS and SRS to elicit feedback regarding progress and the therapeutic relationship, three common and important concerns are raised:

  1. How can such simple and brief scales provide meaningful information?
  2. Are consumers going to be honest?
  3. Aren’t these measures merely assessing satisfaction rather than anything meaningful?

Recently, I was discussing these concerns with ICCE Associate and Certified Trainer, Dan Buccino.

Briefly, Dan is a clinical supervisor and student coordinator in the Adult Outpatient Community Psychiatry program at Johns Hopkins.  He’d not only encountered the concerns noted above but several additional objections.  As he said in his email, “they were at once baffling and yet exciting, because they were so unusual and rigorous.”

“It’s a sign of the times,” I replied.  “As FIT (feedback informed treatment) becomes more widespread, the supporting evidence will be scrutinized more carefully.  It’s a good sign.”

Together with Psychologist and ICCE Senior Associate and Trainer, Jason Seidel, Dan crafted a detailed response.  When I told them that I believed the ICCE community would value having access to the document they created, both agreed to let me publish it on the Top Performance blog.  So…here it is.  Please read and feel free to pass it along to others.


Filed Under: Feedback Informed Treatment - FIT Tagged With: accountability, behavioral health, Certified Trainers, evidence based practice, feedback, interviews, mental health, ors, practice-based evidence, psychometrics, research, srs

The Importance of "Whoops" in Improving Treatment Outcome

December 2, 2012 By scottdm Leave a Comment

“Ring the bells that still can ring,
Forget your perfect offering
There is a crack in everything,
That’s how the light gets in.”

Leonard Cohen, Anthem

Making mistakes.  We all do it, in both our personal and professional lives.  “To err is human…,” the old saying goes.  And most of us say, if asked, that we agree wholeheartedly with the adage–especially when it refers to someone else!  When the principle becomes personal, however, it is much more difficult to be so broad-minded.

Think about it for a minute: can you name five things you are wrong about?  Three?  How about the last mistake you made in your clinical work?  What was it?  Did you share it with the person you were working with?  With your colleagues?

Research shows there are surprising benefits to being wrong, especially when the maker views such errors differently.  As author Alina Tugend points out in her fabulous book, Better by Mistake, custom wrongly defines a mistake as “the failure of a planned sequence of mental or physical activities to achieve its intended outcome.”  When you forget a client’s name during a session or push a door instead of pull, that counts as a slip or lapse.  A mistake, by contrast, is when “the plan itself is inadequate to achieve its objectives” (p. 11).  Knowing the difference, she continues, “can be very helpful in avoiding mistakes in the future” because it leads exploration away from assigning blame and toward the systems, processes, and conditions that either cause mistakes or thwart their detection.

Last week, I was working with a talented and energetic group of helping professionals in New Bedford, Massachusetts.  The topic was, “Achieving Excellence: Pushing One’s Clinical Performance to the Next Level of Effectiveness.”  As part of my presentation, I talked about becoming more “error-centric” in our work; specifically, using ongoing measurement of the alliance to identify opportunities for improving our connection with consumers of behavioral health services.  As an example of the benefits of making mistakes the focus of professional development efforts, I showed a brief video of Rachel Hsu and Roger Chen, two talented musicians who performed at the last Achieving Clinical Excellence (ACE) conference.  Rachel plays a piece by Liszt, Roger one by Mozart.  Both compositions are extremely challenging to play.  You tell me how they did (by the way, Rachel is 8 years old, Roger, 9):

Following her performance, I asked Rachel if she’d made any mistakes during her performance.  She laughed, and then said, “Yes, a lot!”  When I asked her what she did about that, she replied, “Well, it’s impossible to learn from my mistakes while I’m playing.  So I note them and then later practice those small bits, over and over, slow at first, then speeding up, until I get them right.”

After showing the video in New Bedford, a member of the audience raised his hand, "I get it, but that whole idea makes me a bit nervous."  I knew exactly what he was thinking.  Highlighting one's mistakes in public is risky business.  Studies documenting that the most effective clinicians experience more self-doubt and are more willing to admit making mistakes are simply not convincing when one's professional self-esteem or job may be on the line.  Neither is research showing that health care professionals who admit making mistakes and apologize to consumers are significantly less likely to be sued.  Becoming error-centric requires a change in culture, one that not only invites disclosure but connects it with the kind of support and structure that leads to superior results.

Creating a “whoops-friendly” culture will be a focus of the next Achieving Clinical Excellence conference, scheduled for May 16-18th, 2013 in Amsterdam, Holland.  Researchers and clinicians from around the world will gather to share their data and experience at this unique event.  I promise you don’t want to miss it.  Here’s a short clip of highlights from the last one:

My colleague Susanne Bargmann and I will also be teaching the latest research and evidence-based methods for transforming mistakes into improved clinical performance at the upcoming FIT Advanced Intensive training in Chicago, Illinois.  I look forward to meeting you at one of these upcoming events.  In the meantime, here's a fun, brief, but informative video from the TED talks series on mistakes:

By the way, the house pictured above is real.  My family and I visited it while vacationing in Niagara Falls, Canada in October.  It’s a tourist attraction actually.  Mistakes, it seems, can be profitable.

Filed Under: Feedback Informed Treatment - FIT Tagged With: accountability, Alliance, behavioral health, cdoi, conferences, continuing education, deliberate practice, evidence based practice, feedback, mental health, Therapist Effects, top performance

What is the Real Source of Effectiveness in Smoking Cessation Treatment? New Research on Feedback Informed Treatment

November 24, 2012 By scottdm Leave a Comment

When it rains, it pours!  So much news to relay regarding recent research on Feedback Informed Treatment (FIT).  Just this week, ICCE Associate Stephen Michaels sent word that research using the ORS and SRS in smoking cessation treatment is in print!  A few days prior, Kelley Quirk sent a copy of our long-awaited article on the validity and reliability of the Group Session Rating Scale.  On that very same day, the editors of the journal Psychotherapy sent proofs of an article written by me, Mark Hubble, Daryl Chow, and Jason Seidel for the 50th anniversary issue of the publication.

Let’s start with the validity and reliability study.  Many clinicians have already downloaded and been using the Group Session Rating Scale.  The measure is part of the packet of FIT tools available in 20+ languages on both my personal and the International Center for Clinical Excellence websites.  The article presents the first research on the validity and reliability of the measure.  The data for the study were gathered at two sites I’ve worked with for many years.  Thanks to Kelley Quirk and Jesse Owen for crunching the numbers and writing up the results!  Since the alliance is one of the most robust predictors of outcome, the GSRS provides yet another method for helping therapists obtain feedback from consumers of behavioral health services.

Moving on, if there were a Nobel Prize for patience and persistence, it would have to go to Stephen Michaels, the lead author of the study, "Assessing Counsellor Effects on Quit Rates and Life Satisfaction Scores at a Tobacco Quitline" (Michaels, Seltzer, Miller, and Wampold, 2012).  Over the last four years, Stephen has trained Quitline staff in FIT, implemented the ORS and SRS in Quitline tobacco cessation services, gathered outcome and alliance data on nearly 3,000 Quitline users, completed an in-depth review of the available smoking cessation literature, and finally, organized, analyzed, and written up the results.

What did he find?  Statistically significant differences in quit rates attributable to counselor effects.  In other words, as I’ve been saying for some time, some helpers are more helpful than others–even when the treatment provided is highly manualized and structured.  In short, it’s not the method that matters (including the use of the ORS and SRS), it’s the therapist.

What is responsible for the difference in effectiveness among therapists?  The answer to that question is the subject of the article, “The Outcome of Psychotherapy: Yesterday, Today, and Tomorrow,” slated to appear in the 50th anniversary issue of Psychotherapy.  In it, we review controversies surrounding the question, “What makes therapy work?” and preview findings from another, soon-to-be-published empirical analysis of top-performing clinicians.  Stay tuned.

Filed Under: Feedback Informed Treatment - FIT Tagged With: addiction, behavioral health, cdoi, Certified Trainers, evidence based practice, excellence, feedback, healthcare, icce, Smoking cessation, Therapist Effects

Psychotherapy Training: Is it Worth the Bother?

October 29, 2012 By scottdm 2 Comments

Big bucks.  That’s what training in psychotherapy costs.  Take graduate school in psychology as an example.  According to the US Department of Education’s National Center for Education Statistics (NCES), a typical doctoral program takes five years to complete and costs between US$240,000 and $300,000.

Who has that kind of money lying around after completing four years of college?  The solution?  Why, borrow the money, of course!  And students do.  In 2009, the average debt of those doctoral students in psychology who borrowed was a whopping US$88,000, nearly double that of the prior decade.  Well, the training must be pretty darn good to warrant such expenditures, especially when one considers that entry-level salaries are not terribly high to start with and are on the decline!

Oh well, so much for high hopes.

Here are the facts, as recounted in a recent, concisely written summary of the evidence by John Malouff:

1. Studies comparing treatments delivered by professionals and paraprofessionals either show that paraprofessionals have better outcomes or that there is no difference between the two groups;

2. There is virtually no evidence that supervision of students by professionals leads to better client outcomes (you should have guessed this after reading the first point);

3. There is no evidence that required coursework in graduate programs leads to better client outcomes.

If you are hoping that postdoctoral experience will make up for the shortcomings of professional training, well, keep hoping.  In truth, professional experience rarely correlates significantly with client outcomes.

What can you do?  As Malouff points out, “For accrediting agencies to operate in the realm of principles of evidence-based practice, they must produce evidence…and this evidence needs to show that…training…contribute(s) to psychotherapy outcomes…[and] has positive benefits for future clients of the students” (p. 31).

In my workshops, I often advise therapists to forgo additional training until they determine just how effective they are right now.  Doing otherwise risks perceiving progress where, in fact, none exists.  What golfer would buy new clubs or pursue expensive lessons without first knowing their current handicap?  How will you know whether the training you attend is “worth the bother” if you can’t accurately measure its impact on your performance?

Determining one’s baseline rate of effectiveness is not as hard as it might seem.  Simply download the Outcome Rating Scale and begin using it with your clients.  It’s free.  You can then aggregate and analyze the data yourself or use one of the existing web-based systems (www.fit-outcomes.com or www.myoutcomes.com) to get data regarding your effectiveness in real time.
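For clinicians who want to crunch the numbers themselves, the basic arithmetic is simple.  Here is a minimal sketch in Python; the five-point reliable-change threshold and the sample scores are illustrative assumptions for this sketch, not the official ORS scoring rules:

```python
# Sketch: estimating a baseline effectiveness rate from ORS scores.
# NOTE: the 5-point "reliable change" threshold is an illustrative
# assumption, not the official ORS scoring protocol.
RELIABLE_CHANGE = 5  # points on the 0-40 ORS scale (assumed)

def effectiveness_rate(cases):
    """Proportion of cases whose last ORS score improved on the first
    by at least the reliable-change threshold.

    cases: list of (first_score, last_score) pairs.
    """
    improved = sum(last - first >= RELIABLE_CHANGE for first, last in cases)
    return improved / len(cases)

# Five hypothetical clients' (first session, last session) ORS scores
cases = [(18, 26), (22, 23), (15, 28), (30, 29), (12, 19)]
print(effectiveness_rate(cases))  # 0.6
```

Web-based systems automate exactly this kind of aggregation, but nothing stops you from starting with a spreadsheet or a few lines of code.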

After that, join your colleagues at the upcoming Advanced Intensive Training in Feedback Informed Treatment.   This is an “evidence-based” training event.  You learn:

• How to use outcome management tools (e.g., the ORS) to inform and improve the treatment services you provide;

• Specific skills for determining your overall clinical success rate;

• How to develop an individualized, evidence-based professional development plan for improving your outcome and retention rate.

There’s a special “early bird” rate available for a few more weeks.  Last year, the event filled up several months ahead of time, so don’t wait.

On another note, I just received the schedule for the 2013 Evolution of Psychotherapy conference.  I’m very excited to have been invited once again to this prestigious event and will be bringing the latest information and research on achieving excellence as a behavioral health practitioner.  On that note, the German artist and psychologist Andreas Steiner has created a really cool poster and card game for the event, featuring all of the various presenters.  Here’s the poster.  Next to it is the “Three of Hearts.”  I’m pictured there with two of my colleagues, mentors, and friends, Michael Yapko and Stephen Gilligan:

Filed Under: Conferences and Training, Feedback Informed Treatment - FIT, Top Performance Tagged With: Andreas Steiner, evidence based medicine, evidence based practice, Evolution of Psychotherapy conference, john malouff, Michael Yapko, ors, outcome management, outcome measurement, outcome rating scale, paraprofessionals, psychology, psychotherapy, session rating scale, srs, Stephen Gilligan, therapy, Training, US Department of Education's National Center (NCES)

Looking for Results in All the Wrong Places: What Makes Feedback Work?

September 16, 2012 By scottdm Leave a Comment

As anyone who reads this blog or has been to one of my workshops knows, I am a fan of feedback.  Back in the mid-1990s, I began using Lynn Johnson’s 10-item Session Rating Scale in my clinical work.  His book, Psychotherapy in the Age of Accountability, and our long relationship convinced me that I needed to check in regularly with my clients.  At the same time, I started using the Outcome Questionnaire (OQ-45).  The developer, Michael Lambert, a professor and mentor, was finding that routinely measuring outcome helped clinicians catch and prevent deterioration in treatment.  In time, I worked with colleagues to develop a set of tools brief enough to make asking for and receiving feedback about the relationship and outcome of care feasible.

Initial research on the measures and feedback process was promising.  Formally and routinely asking for feedback was associated with improved outcomes, decreased dropout rates, and cost savings in service delivery!  As I warned in my blog post last February, however, such results, while important, were merely “first steps” in a scientific journey.  Most importantly, the research to date said nothing about why use of the measures improved outcomes.  Given the history of our field, it would be easy to begin thinking of the measures as an “intervention” that, if faithfully adopted and used, would result in better outcomes.  Not surprisingly, this is exactly what has happened, with some claiming that the measures improve outcomes more than anything since the beginning of psychotherapy.  Sadly, such claims rarely live up to their initial promise.  For decades, the quest for the holy grail has locked the field into a vicious cycle of hope and despair, one that ultimately eclipses the opportunity to conduct the very research needed to understand the complex processes at work in any intervention.

In February, I wrote about several indirect, but empirically robust, avenues of evidence indicating that another variable might be responsible for the effect found in the initial feedback research.  Now, before I go on, let me remind you that I’m a fan of feedback, a big fan.  At the same time, it’s important to understand why it works and, specifically, what factors are responsible for the effect.  Doing otherwise risks mistaking method for cause, what we believe for reality.  Yes, it could be the measures.  But the type of research conducted at the time did not make it possible to reach that conclusion.  Plus, it seemed to me, other data pointed elsewhere; namely, to the therapist.  Consider, for example, the following findings: (1) therapists did not appear to learn from the feedback provided by measures of the alliance and outcome; and (2) therapists did not become more effective over time as a result of being exposed to feedback.  In other words, as with every other “intervention” in the history of psychotherapy, the effect of routinely monitoring the alliance and outcome seems to vary by therapist.

Such results, if true, would have significant implications for the feedback movement (and the field of behavioral health in general).  Instead of focusing on methods and interventions, efforts to improve the outcome of behavioral health practice should focus on those providing the service.  And guess what?  This is precisely what the latest research on routine outcome measurement (ROM) has now found.  Hot off the press, in the latest issue of the journal Psychotherapy Research, Dutch investigators de Jong, van Sluis, Nugter, Heiser, and Spinhoven (2012) found that feedback was not effective under all circumstances.  What variable was responsible for the difference?  You guessed it: the therapist; in particular, their interest in receiving feedback, sense of self-efficacy, commitment to using the tools, and their gender (with women being more willing to use the measures).  Consistent with ICCE’s emphasis on supporting organizations with implementation, other research points to the significant role setting and structure play in success.  Simon, Simon, Harris and Lambert (2011), Riemer and Bickman (2012), and de Jong (2012) have all found that organizational and administrative issues loom large in mediating the use and impact of feedback in care.

My colleagues and I are currently investigating both the individual therapist and contextual variables that enable clinicians to benefit from feedback.  The results are enticing.  The first will be presented at the upcoming Achieving Clinical Excellence conference in Holland, May 16-18th.  Others will be reported in the 50th anniversary issue of the journal Psychotherapy, to which we’ve been asked to contribute.  Stay tuned.

Filed Under: Feedback Informed Treatment - FIT Tagged With: cdoi, continuing education, holland, icce, Michael Lambert, post traumatic stress

Obesity Redux: The RFL Results and the Complex Nature of Truth and Science

August 28, 2012 By scottdm 2 Comments

Back in April, I blogged about research published by Ryan Sorrell on the use of feedback-informed treatment in a telephonically-delivered weight management program.  The study, which appeared in the journal Disease Management*, found that the program and feedback led not only to weight loss, but also to significant improvements in distress, healthy eating behaviors (70%), exercise (65%), and presenteeism on the job (64%), the latter being critical to the employers paying for the service.

Despite these results, the post garnered no attention until four months later during the first week of August when three clinicians posted comments on the very same day–that’s the beauty of the web, a long memory and an even longer reach.

What can I say?  I’m having to eat my hat (or, the bird on my shoulder is…).  I learned a great deal from the feedback:

  • Despite having sourced the figure from the American Academy of Child and Adolescent Psychiatry, the claim that weight gain due to poor diet and a lack of exercise was responsible for 300,000 deaths was false.  According to the comments, the figure is closer to 26,000, less than a tenth of the number claimed!
  • The same was true regarding the reported annual cost of obesity.  The 100 billion dollar figure reported on the AACAP website is, I was told, “grossly inflated” and worse, missed the point.  By focusing on BMI, the writer counseled, “we will have wasted money spent on the 51% of the healthy people who are deemed ‘unhealthy’ based on weight and the 18% unhealthy ones who are overlooked because their weight looks fine (see Wildman et al., 2008).”

Solid points both.  Thankfully, one of the writers noted what was supposed to have been the main point of the post; namely, the importance of “practice-based evidence” in guiding service delivery, “making clear that finding the ‘right’ or ‘evidence-based’ approach for obesity (or any problem for that matter) is less important than finding out ‘what works’ for each person in need of help.”

I want to make sure readers have access to the results of the study because they are an impressive demonstration of what’s possible when feedback is sought from and used to guide services to people “in care.”  Weight loss aside, Ryan also reported significant improvements in distress, healthy eating behaviors (70%), exercise (65%), and presenteeism on the job (64%).  All this by using two simple, 4-question scales.

*Sorrell, R. (September, 2007).  Application of an Outcome-Directed Behavioral Modification Model for Obesity on a Telephonic, Web-based Platform.  Disease Management, 10(Suppl 1), 23-26.

PS: An AP article that came out this last weekend and was discussed on NPR suggests the truth about the “weight of the nation” may be more complicated than either I or those who commented on my blog realize.  Among the many changes that have occurred over the last decades, the piece declares, “Who are we?  Fatter.  The average woman has gained 18 pounds since 1990, to 160 pounds; the average man is up 16 pounds, to 196.”  Hmm.

Filed Under: Feedback Informed Treatment - FIT, obesity Tagged With: American Academy of Child and Adolescent Psychiatry, Chronic Disease, cognitive-behavioral therapy, disease management, evidence based practice, icce, Weight Management

Feedback Informed Treatment: Update

August 16, 2012 By scottdm Leave a Comment

Chicago, IL (USA)

The last two weeks have been a whirlwind of activity here in Chicago.  First, the “Advanced Intensive.”  Next came the annual “Training of Trainers.”  Each week, the room was filled to capacity with practitioners, researchers, supervisors, and agency directors from around the globe receiving in-depth training in feedback-informed practice.  It was a phenomenal experience.  As the video below shows, we worked and played hard!

Already, people are signing up for the next “Advanced Intensive” scheduled for the third week of March 2013 and the new three-day intensive training on FIT supervision scheduled for the 6-9th of August 2013.   Both events follow and are designed to complement the newly released ICCE FIT Treatment and Training Manuals.  In fact, all participants receive copies of the 6 manuals, covering every detail of FIT practice, from the empirical evidence to implementation.  The manuals were developed and submitted to support ICCE’s submission of FIT to the National Registry of Evidence Based Practices (NREPP).  As I blogged about last March, ICCE trainings fill up early.  Register today and get the early bird discount.

Filed Under: CDOI, Conferences and Training, evidence-based practice, Feedback Informed Treatment - FIT, FIT Tagged With: cdoi, icce

The DSM 5: Mental Health’s "Disappointingly Sorry Manual" (Fifth Edition)

June 11, 2012 By scottdm 2 Comments

Have you seen the results from the field trials for the fifth edition of the Diagnostic and Statistical Manual?  The purpose of the research was to test the reliability of the diagnoses contained in the new edition.  Reliable (ri-lahy-uh-buhl), meaning “trustworthy, dependable, consistent.”

Before looking at the data, consider the following question: what are the two most common mental health problems in the United States (and, for that matter, most of the Western world)?  If you answered depression and anxiety, you are right.  The problem is that the degree of agreement between experts trained to use the criteria is unacceptably low.

Briefly, reliability is estimated using what statisticians call the kappa (κ) coefficient, a measure of inter-rater agreement.  Kappa is considered more robust than simple percent agreement because it takes into account the likelihood of raters agreeing by chance.

The results?  The likelihood of two clinicians agreeing when applying the same criteria to the same person was poor for both depression and anxiety.  Although there is no set standard, experts generally agree that kappa coefficients below .40 can be considered poor; .41-.60, fair; .61-.75, good; and .76 and above, excellent.  Look at the numbers below and judge for yourself:

Diagnosis                        DSM-5   DSM-IV   ICD-10   DSM-III
Major Depressive Disorder         .32     .59      .53      .80
Generalized Anxiety Disorder      .20     .65      .30      .72

Now, is it me or do you notice a trend?  The reliability for the two most commonly diagnosed and treated “mental health disorders” has actually worsened over time!  The same was found for a number of other disorders, including schizophrenia (.46, .76, .81), alcohol use disorder (.40, .71, .80), and oppositional defiant disorder (.46, .51, .66).  Antisocial and Obsessive-Compulsive Personality Disorders were so variable as to be deemed unreliable.
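For readers curious about the arithmetic behind these figures, kappa is easy to compute by hand.  Here is a minimal sketch in Python (the two raters' diagnoses are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: inter-rater agreement corrected for chance."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Agreement expected by chance, from each rater's marginal frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical clinicians diagnose the same 10 people
r1 = ["MDD", "MDD", "GAD", "MDD", "GAD", "MDD", "GAD", "GAD", "MDD", "MDD"]
r2 = ["MDD", "GAD", "GAD", "MDD", "MDD", "MDD", "GAD", "MDD", "MDD", "MDD"]
print(round(cohens_kappa(r1, r2), 2))  # 0.35
```

Note that the two raters here agree on 7 of 10 cases (70%), yet kappa lands at .35, in the “poor” range, precisely because much of that raw agreement would be expected by chance alone.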

Creating a manual of “all known mental health problems” is a monumental (and difficult) task, to be sure.  Plus, not all the news was bad.  A number of diagnoses demonstrated good reliability, including autism spectrum disorder, posttraumatic stress disorder (PTSD), and attention-deficit/hyperactivity disorder (ADHD) in children (.69, .67, and .61, respectively).  Still, the overall picture is more than a bit disconcerting, especially when one considers that the question of the manual’s validity has never been addressed.  Validity (vuh-lid-i-tee), meaning literally, “having some foundation; based on truth.”  Given the lack of any understanding of or agreement on the pathogenesis or etiology of the 350+ diagnoses contained in the manual, the volume ends up being, at best, a list of symptom clusters, not unlike categorizing people according to the four humours (e.g., phlegmatic, choleric, melancholic, sanguine).

Personally, I’ve always been puzzled by the emphasis placed on psychiatric diagnoses, given the lack of evidence for diagnosis-specific treatment effects in psychotherapy outcome research.  Additionally, an increasing number of randomized clinical trials have provided solid evidence that simply monitoring the alliance and progress during care significantly improves both the quality and outcome of the services delivered.  Here’s the latest summary of feedback-related research.

Filed Under: Feedback Informed Treatment - FIT Tagged With: continuing education, DSM

Feedback Informed Treatment as Evidence-Based Practice

May 23, 2012 By scottdm Leave a Comment

Back in November, I blogged about the ICCE application to SAMHSA’s National Registry for consideration of FIT as an official evidence-based practice (EBP).  Given the definitions of EBP offered by the Institute of Medicine and the American Psychological Association, Feedback Informed Treatment seems a perfect, well, FIT.  According to the IOM and APA, evidence-based practice means using the best evidence and tailoring services to the client, their preferences, culture, and circumstances.  Additionally, to be evidence-based, clinicians must monitor “patient progress (and of changes in the patient’s circumstances—e.g., job loss, major illness) that may suggest the need to adjust the treatment. If progress is not proceeding adequately, the psychologist alters or addresses problematic aspects of the treatment (e.g., problems in the therapeutic relationship or in the implementation of the goals of the treatment) as appropriate.”

In late summer 2011, ICCE submitted thousands of pages of supporting documents and research studies, as well as video, in support of the application.  This week, we heard that FIT passed the “Quality of Research” phase of the review.  Now the committee is looking at the “Readiness for Dissemination” materials, including the six detailed treatment and implementation manuals on Feedback Informed Treatment.  Keep your fingers crossed.  We’ve been told that the entire process should be completed sometime in late fall.

In the meantime, we are preparing for this summer’s Advanced Intensive and Training of Trainer workshops.  Once again, clinicians, educators, and researchers from around the world will be coming together for cutting edge training.  Only a few spots remain, so register now.

Filed Under: Feedback Informed Treatment - FIT Tagged With: American Psychological Association, evidence based medicine, evidence based practice, feedback informed treatment, FIT, icce, Institute of Medicine, NREPP, practice-based evidence, SAMHSA, Training

Revolution in Swedish Mental Health Care: Brief Update

May 14, 2012 By scottdm 1 Comment

In April 2010, I blogged about Jan Larsson, a Swedish clinician who works with people on the margins of the mental health system.  Jan was dedicated to seeking feedback, using the ORS and SRS to tailor services to the individuals he met.  It wasn’t easy.  Unlike most, he did not meet his clients in an office or agency setting.  Rather, he met them where they were: in the park, on the streets, and in their one-room apartments.  Critically, wherever they met, Jan had them complete the two measures, “just to be sure,” he said.  No computer.  No iPhone app.  No sophisticated web-based administration system.  With a pair of scissors, he simply trimmed copies of the measures to fit in his pocket-sized appointment book!  I’ve been following his creative application of the scales ever since.

Not surprisingly, Jan was on top of the story I blogged about yesterday regarding changes in the guidelines governing Swedish mental health care practice.  He emailed me as I was writing my post, including the link to the Swedish Radio program about the changes.  Today, he emailed again, sending along links to stories appearing in two Swedish newspapers: Dagens Nyheter and Goteborg Posten.

Thanks Jan!

And to everyone else, please continue to send any new links, videos, and comments.

Filed Under: behavioral health, excellence, Feedback Informed Treatment - FIT, Top Performance Tagged With: continuing education, Dagens Nyheter, evidence based practice, Goteborg Posten, icce, ors, outcome rating scale, session rating scale, srs, sweden

Revolution in Swedish Mental Health Practice: The Cognitive Behavioral Therapy Monopoly Gives Way

May 13, 2012 By scottdm 34 Comments

Sunday, May 13th, 2012
Arlanda Airport, Sweden

Over the last decade, Sweden, like most Western countries, embraced the call for “evidence-based practice.”  Socialstyrelsen, the country’s National Board of Health and Welfare, developed and disseminated a set of guidelines (“riktlinjer”) for mental health practice.  Topping the list of methods was, not surprisingly, cognitive-behavioral therapy.

The Swedish State took the list seriously, restricting payment for training of clinicians and treatment of clients to cognitive behavioral methods.  In the last three years, a billion Swedish crowns were spent on training clinicians in CBT.  Another billion was spent on providing CBT to people with diagnoses of depression and anxiety.  No funding was provided for training or treatment in other methods. 

The State’s motives were pure: use the best methods to decrease the number of people who become disabled as a result of depression and anxiety.  As in other countries, the percentage of people in Sweden who exit the work force and draw disability pensions has increased dramatically.  As a result, costs skyrocketed.  Even more troubling, far too many became permanently disabled.

The solution?  Identify methods which have scientific support, or what some called, “evidence-based practice.” The result?  Despite substantial evidence that all methods work equally well, CBT became the treatment of choice throughout the country.  In point of fact, CBT became the only choice.

As noted above, Sweden is not alone in embracing practice guidelines.  The U.K. and U.S. have charted similar paths, as have many professional organizations.  Indeed, the American Psychological Association has now resurrected its plan to develop and disseminate a series of guidelines advocating specific treatments for specific disorders.  Earlier efforts by Division 12 (“Clinical Psychology”) met with resistance from the general membership as well as scientists who pointed to the lack of evidence for differential effectiveness among treatment approaches. 

Perhaps the APA and other countries can learn from Sweden’s experience.  The latest issue of Socionomen, the official journal of Swedish social workers, reported the results of the government’s two-billion-crown investment in CBT.  The widespread adoption of the method has had no effect whatsoever on the outcomes of people disabled by depression and anxiety.  Moreover, a significant number of people who were not disabled at the time they were treated with CBT subsequently became disabled, costing the government an additional one billion crowns.  Finally, nearly a quarter of those who started treatment dropped out, costing an additional 340 million!

In sum: billions spent training therapists in, and treating clients with, CBT, to little or no effect.

Since the publication of Escape from Babel in 1995, my colleagues and I at the International Center for Clinical Excellence have gathered, summarized, published, and taught about research documenting little or no difference in outcome between treatment approaches.  All approaches worked about equally well, we argued, suggesting that efforts to identify specific approaches for specific psychiatric diagnoses were a waste of precious time and resources.  We made the same argument, citing volumes of research in two editions of The Heart and Soul of Change.

Yesterday, I presented at Psykoterapi Mässan, the country’s largest free-standing mental health conference.  As I have on previous visits, I talked about “what works” in behavioral health, highlighting data documenting that the focus of care should shift away from treatment model and technique, focusing instead on tailoring services to the individual client via ongoing measurement and feedback.  My colleague and co-author, Bruce Wampold had been in the country a month or so before singing the same tune.

One thing about Sweden:  the country takes data seriously.  As I sat down this morning to eat breakfast at the home of my long-time Swedish friend, Gunnar Lindfeldt, the newscaster announced on the radio that Socialstyrelsen had officially decided to end the CBT monopoly (listen here).  The experiment had failed.  To be helped, people must have a choice. 

“What have we learned?” Rolf Holmqvist asks in Socionomen, “Treatment works…at the same time, we have the possibility of exploring…new perspectives.  First, getting feedback during treatment…taking direction from the patient at every session while also tracking progress and the development of the therapeutic relationship!”

“Precis,” (exactly) my friend Gunnar said. 

And, as readers of my blog know, using the best evidence, informed by clients’ preferences and ongoing monitoring of progress and the alliance, is evidence-based practice.  How the concept ever got translated into creating lists of preferred treatments is anyone’s guess and, now, unimportant.  Time to move forward.  The challenge ahead is helping practitioners learn to integrate client feedback into care, and here, Sweden is leading the way.

“Skål Sverige!”

Filed Under: Behavioral Health, evidence-based practice, Feedback Informed Treatment - FIT, Practice Based Evidence Tagged With: CBG, continuing education, evidence based practice, icce, Socialstyrelsen, sweden

Mental Health Practice in a Global Economy

April 17, 2012 By scottdm 2 Comments

Did you feel it?  The seismic shift that occurred in the field of mental health just a little over a month ago?  No?  Nothing?  Well, in truth, it wasn’t so much a rip in the space-time continuum as a run.  That “run,” however, promises to forever alter the fabric of clinical practice–in particular how clinicians earn and maintain a certain standard of living.

For decades, licensing statutes have protected behavioral health professionals from competing with providers outside their state and local jurisdiction.  In order to bill or receive reimbursement, mental health professionals needed to be licensed in the state in which treatment services were offered.  Over the years, the various professional organizations have worked to make it easier for professionals to become licensed when they move from one state to another.  Still, it ain’t easy and, some practitioners and professional groups would argue, for good reason.  Such laws, to some extent, ensure that fees charged for services are commensurate with the cost of living in the place where therapists live and work.  The cost of therapy in Manhattan varies considerably, for example, depending on whether one is talking about the city in the state of New York or the one in Kansas.

As far as outcomes are concerned, however, there is no evidence that people who pay more necessarily get better results.  Indeed, as reviewed here on this blog, available evidence indicates little or no difference in outcome between highly trained (and expensive) clinicians and minimally trained (and less expensive) para-professionals and students.  If the traditional geographic (licensing) barriers were reduced or eliminated, consumers would, with few exceptions, gravitate to the best value for their money.  In the 1980s and 90s, for example, consumers deserted small, Main Street retailers when big box stores opened on the outskirts of town offering the same merchandise at a lower price.  Now, big box retailers are closing en masse as consumers shift their purchases to less expensive, web-based outlets.

And that’s precisely the shift that began a little over a month ago in the field of mental health.  The U.S. Military eliminated the requirement that civilian providers be licensed in the same jurisdiction or state in which treatment is offered.  The new law allows care to be provided wherever the recipient of services lives, regardless of where the provider is licensed.  Public announcements argued that the change was needed to make services available to service members and veterans living in isolated or rural areas where few providers may be available.  Whatever the reason, the implications are profound: in the future, clinicians, like Main Street retailers, will be competing with geographically distant providers.

Just one week prior to the announcement by the U.S. Military, I posted a blog post highlighting a recent New York Times column by author and trend watcher, Thomas Friedman.  In it, I argued that “Globalization and advances in information technology were…challenging the status quo…access. At one time, being average enabled one to live an average life, live in an average neighborhood and, most importantly, earn an average living.  Not so anymore.  Average is now plentiful, easily accessible, and cheap. What technology can’t do in either an average or better way, a younger, less-trained but equally effective provider can do for less. A variety of computer programs and web-based systems provide both psychological advice and treatment.”

Truth is, the change is likely to be a boon to consumers of mental health services: easier access to services at a better price.  What can clinicians do?  First, begin measuring outcome.  Without evidence of their effectiveness, individual providers will lose out to the least expensive provider.  No matter how much people complain about “big box and internet retailers,” most use them.  The savings are too great to ignore.

What else can clinicians do?  The advice of Friedman, which I quoted in my recent blog post, applies: “everyone needs to find their extra–their unique value contribution that makes them stand out in whatever is their field.” Measuring outcome and finding that “something special” is what the International Center for Clinical Excellence is all about.  If you are not a member, please join the thousands of other professionals online today.  After that, why not spend time with peers and cutting-edge instructors at the upcoming “advanced intensive” or “training of trainers” workshops this summer?

Filed Under: Behavioral Health, Feedback Informed Treatment - FIT, ICCE Tagged With: behavioral health, brief therapy, cdoi, evidence based practice, mental health, Thomas Friedman

The Outcome and Session Rating Scales: Support Tools

March 30, 2012 By scottdm 6 Comments

Japan, Sweden, Norway, Denmark, Germany, France, Israel, Poland, Chile, Guam, Finland, Hungary, Mexico, Australia, China, the United States…and many, many more.  What do all these countries have in common?  In each, clinicians and agencies are using the ORS and SRS to inform and improve behavioral health services.  Some are using web-based systems for administration, scoring, interpretation, and data aggregation (e.g., myoutcomes.com and fit-outcomes), while many others access the free paper-and-pencil versions of the measures and administer and score them by hand.

Even if one is not using a web-based system to compare individual client progress to cutting-edge norms, practitioners can still determine simply and easily whether reliable change is being made by using the “Reliable Change Chart” below.  Recall, a change on the ORS is considered reliable when the difference in scores exceeds the contribution attributable to chance, maturation, and measurement error. Feel free to print out the chart and use it in your practice.
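For those who prefer a formula to the chart, the conventional way to make this determination is the Jacobson-Truax Reliable Change Index (RCI): the pre-post difference divided by the standard error of the difference, with an absolute value above 1.96 marking reliable change at the 95% confidence level. The sketch below is illustrative only; the standard deviation and reliability values are hypothetical placeholders, not the published ORS norms:

```python
import math

def reliable_change_index(pre: float, post: float,
                          sd: float, reliability: float) -> float:
    """Jacobson-Truax Reliable Change Index.

    A change is considered reliable (unlikely to be due to
    measurement error alone) when abs(RCI) > 1.96.
    """
    sem = sd * math.sqrt(1.0 - reliability)   # standard error of measurement
    s_diff = math.sqrt(2.0 * sem ** 2)        # standard error of the difference
    return (post - pre) / s_diff

# Example with hypothetical values (NOT the published ORS norms):
rci = reliable_change_index(pre=18.0, post=25.0, sd=7.0, reliability=0.90)
print(abs(rci) > 1.96)  # True: a 7-point gain exceeds chance and measurement error here
```

With a lower instrument reliability (say, 0.85), the same 7-point gain would fall short of the 1.96 cutoff, which is why the threshold depends on the psychometrics of the specific measure.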

To learn how to get the most out of the measures, be sure to download the six FIT Treatment and Training Manuals.  The six manuals cover every aspect of feedback-informed practice, including: empirical foundations, basic and advanced applications (including FIT in groups, couples, and with special populations), supervision, data analysis, and agency implementation. Each manual is written in clear, step-by-step, non-technical language and is specifically designed to help practitioners and agencies integrate FIT into routine clinical practice. Indeed, the manuals were submitted as part of ICCE’s application to have FIT considered an “evidence-based practice” by the National Registry of Evidence-Based Programs and Practices (NREPP).

ORS Reliable Change Chart

Filed Under: Behavioral Health, excellence, Feedback Informed Treatment - FIT Tagged With: cdoi, Hypertension, icce, NREPP, ors, outcome rating scale, SAMHSA, session rating scale, srs

NEWSFLASH: The Advanced Intensive and Training of Trainers in Feedback Informed Therapy (FIT)

March 17, 2012 By scottdm Leave a Comment

Dateline: March 17th, 2012, Chicago, Illinois USA

Barely a month ago, I announced the addition of a second “Advanced Intensive” (AI) course in Feedback Informed Treatment (FIT).  The original March training filled really early this year and a long waiting list formed.  Now the second Advanced Intensive training in FIT scheduled for July 30th through August 1st is nearly full.  Register now and you can still receive the early bird price.  Additionally, we’re offering a super special discount for people attending both the AI and the ICCE Training of Trainers.  Don’t wait though, only a handful of spaces remain.  If you would like to attend both courses, drop me an email straight away and I’ll send you the special registration code.

We look forward to meeting everyone attending the AI this week.  Stay tuned for tweets and video from the training.


Filed Under: Conferences and Training, Feedback Informed Treatment - FIT

The Achieving Clinical Excellence Conference CALL FOR PAPERS

March 13, 2012 By scottdm Leave a Comment

In October 2010, the first annual “Achieving Clinical Excellence” conference was held in Kansas City, Missouri.  A capacity crowd joined leading experts on the subject of top performance for three days’ worth of training and inspiration.  K. Anders Ericsson reviewed his groundbreaking research, popularized by Malcolm Gladwell and others.  ICCE Director, Scott D. Miller translated the research into specific steps for improving clinical performance.  Finally, classical pianists David Helfgott, Rachel Hsu, and Roger Chen demonstrated what can be accomplished when such evidence-based strategies are applied to the process of learning specific skills.

The ICCE is proud to announce the 2nd “ACE” conference to be held May 16th-18th, 2013 in Amsterdam, Holland.  Join us for three educational, inspiring, and fun-filled days.  Register today and receive a significant “Early Bird” discount.  The ACE conference committee is also issuing an international “Call for Papers.”  If you, your agency, or your practice is committed to excellence, uses outcomes to inform practice, or has published research on the subject, please visit the conference website to submit a proposal.

Here’s what attendees said about the last event:

Filed Under: Behavioral Health, Conferences and Training, evidence-based practice, excellence, Feedback Informed Treatment - FIT Tagged With: cdoi, holland, Therapist Effects

Implementation Science, FIT, and the Training of Trainers

March 8, 2012 By scottdm Leave a Comment

The International Center for Clinical Excellence (ICCE) is pleased to announce the 6th annual Training of Trainers event to be held in Chicago, Illinois August 6th-10th, 2012.  As always, the ICCE TOT prepares participants to provide training, consultation, and supervision to therapists, agencies, and healthcare systems in Feedback-Informed Treatment (FIT).  Attendees leave the intensive, hands-on training with detailed knowledge and skills for:

  1. Training clinicians in the Core Competencies of Feedback Informed Treatment (FIT/CDOI);
  2. Using FIT in supervision;
  3. Methods and practices for implementing FIT in agencies, group practices, and healthcare settings;
  4. Conducting top training sessions, learning and mastery exercises, and transformational presentations.

Multiple randomized clinical trials document that implementing FIT leads to improved outcomes and retention rates while simultaneously decreasing the cost of services.

This year’s “state of the art” faculty include: ICCE Director, Scott D. Miller, Ph.D., ICCE Training Director, Julie Tilsen, Ph.D., and special guest lecturer and ICCE Coordinator of Professional Development, Cynthia Maeschalck, M.A.

Scott Miller (Evolution 2014)

Join colleagues from around the world who are working to improve the quality and outcome of behavioral healthcare via the use of ongoing feedback. Space is limited.  Click here to register online today.  Last year, one participant said the training was “truly masterful.  Seeing the connection between everything that has been orchestrated leaves me amazed at the thought, preparation, and talent that has gone into this training.”  Here’s what others had to say:


Filed Under: Behavioral Health, Conferences and Training, evidence-based practice, excellence, Feedback Informed Treatment - FIT Tagged With: addiction, Carl Rogers, cdoi, magic, psychometrics

Is the Research on Feedback too Good to be True? Better to UPOD than OPUD!

February 29, 2012 By scottdm 6 Comments


It is a standard maxim of good business practice: Under Promise, Over Deliver (or UPOD).  As my father used to say, “Do your best, and then a little better.”  Sadly, history shows that the field of behavioral health has followed a different course: Over Promise, Under Deliver.  The result?  OPUDs.

The most gripping account of the field’s failed promises is Robert Whitaker’s Mad in America: Bad Science, Bad Medicine, and the Enduring Mistreatment of the Mentally Ill. In fact, Whitaker’s book inspired me to write what became my most popular article, downloaded from my website more often than any other: Losing Faith.  In it, I document how, each year, new models, methods, and diagnoses appear, promising to revolutionize mental health care, only later to be shown ineffective, wrong, and, in some instances, harmful.  Remember Multiple Personality Disorder?  Satanic Ritual Abuse?  Xanax for panic disorder?  Johnsonian-style Interventions for Addiction?  Co-dependence?  Thought Field Therapy?  Rebirthing?  How about SSRIs?  Weren’t they supposed to be much better than those, you know, old-fashioned tricyclics?  The list is endless.

“Not to worry,” current leaders and pundits advise, “We’ve made progress.  We have a new idea.  A much better idea than the old one. We promise!”

However, when it comes to claims about advances in the field of behavioral health, history indicates that caution is warranted.  That includes, by the way, claims about the use of feedback tools in therapy.  As readers of this blog know, I have, for several years, been championing the use of simple checklists for guiding and improving the quality and outcome of treatment. Several studies document–as reviewed here on this blog–improved outcomes and decreased dropout and deterioration rates.  These studies are important first steps in the scientific process.  I’ve been warning, however, that these studies are only first steps.  Why?

Studies to date, while important, suffer from the same allegiance effects and unfair comparisons found in other RCTs.  With regard to the latter, no study has compared feedback with an active control condition.  Rather, all comparisons have been to “treatment as usual.”  Such research, as a result, says nothing about why use of the measures improves outcomes.  At the same time, several indirect, but empirically robust, avenues of evidence indicate that another variable may be responsible for the effect!  Consider, for example, the following findings: (1) therapists do not learn from the feedback provided by measures of the alliance and outcome; (2) therapists do not become more effective over time as a result of being exposed to feedback.  Such research indicates that a focus on the measures and outcome may be misguided–or at least a “dead end.”

Such shortcomings are why researchers and clinicians at ICCE are focused on the literature regarding expertise and expert performance.  Focusing on measures misses the point.  Already, there is talk about methods for ensuring fidelity to a particular way of using feedback tools.  Instead, the research on expertise indicates that we need to help clinicians develop practices that enable them to learn from the feedback they receive.

Several studies are in progress.  In Trondheim, Norway, the first-ever study to include an active control comparison for feedback is underway.  I fully expect the control to be as effective as the simple use of checklists in treatment.  In a joint research project being conducted at agencies in the US, UK, Canada, and Australia, research is underway investigating how top performing therapists use feedback to learn and improve compared to average and below average clinicians.  Such studies are the necessary second step to ensure that we understand the elements responsible for the effective use of feedback.  Inch by inch, centimeter by centimeter, the results of such studies will advance our understanding and effectiveness.  The gains, I’m sure, will be modest at best–and that’s just fine.  In fact, the latest feedback research using the ORS and SRS found small, largely insignificant effects! (I’m still waiting for permission to publish the entire article on this blog.  Until then, interested readers can find a summary here.)

Such findings can be disturbing to those who have heard others claim that “feedback is the most effective method ever invented in the history of the field!”  OPUD is dangerous.  It keeps the field stuck in a vicious cycle of hope and despair, one that ultimately eclipses the opportunity to conduct the very research needed to facilitate understanding of the complex processes at work in any intervention.  People lose faith until the “next best thing” comes along.

I’m excited about the research that is in process.  Stay tuned for updates. Until then, let’s agree to UPOD.


Filed Under: Feedback Informed Treatment - FIT

Goodbye Mr. & Ms. Know-it-All: Redefining Competence in the Era of Increasing Complexity

February 12, 2012 By scottdm 3 Comments

Every day behavioral health professionals make hundreds of decisions.  As experts in the field, they meet and work successfully with diverse clients presenting an array of different difficulties.  Available evidence indicates that the average person who receives care is better off than 80% of those with similar problems who do not.  Outcomes in mental health are on par with or better than most medical treatments and, crucially, have far fewer side effects!  Psychotherapy, for example, is equal in effect to coronary artery bypass surgery and three times more effective than fluoride for cavities.
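That “80%” figure is the standard translation of a standardized effect size of roughly d = 0.8 into a percentile: assuming normal distributions with equal variance, the average treated person scores above about 79% of an untreated comparison group. A minimal sketch of the arithmetic (the d value here is the commonly cited benchmark, used for illustration rather than drawn from a specific study):

```python
import math

def d_to_percentile(d: float) -> float:
    """Proportion of the untreated (control) distribution falling below
    the mean of the treated distribution, assuming both are normal with
    equal variance and a standardized mean difference of d.

    This is simply the standard normal CDF evaluated at d.
    """
    return 0.5 * (1.0 + math.erf(d / math.sqrt(2.0)))

print(round(d_to_percentile(0.8) * 100))  # 79 -> "better off than ~80%"
```

The same conversion works in reverse: any claim that treated clients do better than X% of untreated ones implies a particular effect size, which makes such statements easy to check against the published meta-analyses.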

Not all the news is good, however.  Dropout rates run around 25% or higher.  Said another way, clinicians do great work with the people who stay.  Unfortunately, many do not, resulting in increased costs and lost opportunities.  Another problem is that therapists, the data indicate, are not particularly adept at identifying clients at risk for dropping out or deterioration.  For decades, research has shown that approximately 10% of people worsen while in treatment.  Practitioners, despite what they may believe, are none the wiser.  Finally, it turns out that a small percentage (between 10-20%) of people in care account for the lion’s share of expenses in behavioral health service delivery (in case you are wondering, roughly the same figures apply in the field of medicine).  Such people continue in care for long periods, often receiving an escalating and complicated array of services, without relief.  At the same time, clinician caseloads and agency waiting lists grow.

What can be done?

At one time, being a professional meant that one possessed the knowledge, training, and skills to deliver the right services to the right people for the right problem in a consistent, correct, and safe manner.  To that end, training requirements–including schooling, certification, and continuing professional development–expanded, exponentially so.  Today’s behavioral health professionals spend more time training and are more highly specialized than ever before.  And yet, the above noted problems persist.

Some call for more training, others for increasing standardization of treatment approaches, many for more rigorous licensing and accreditation standards.  The emphasis on “empirically supported treatments”–specific methods for specific diagnoses–typifies this approach.  However, relying as these solutions do on an antiquated view of professional knowledge and behavior, each is doomed to fail.

In an earlier era, professionals were “masters of their domain.”  Trained and skillful, the clinician diagnosed, developed a plan for treatment, then executed, evaluated, and tailored services to maximize the benefit to the individual client.  Such a view assumes that problems are either simple or complicated, puzzles that are solvable if the process is broken down into a series of steps.  Unfortunately, the shortcomings in behavioral health outcomes noted above (dropout rates, failure to identify deterioration and lack of progress) appear to be problems that are not so much simple or complicated but complex in nature.  In such instances, outcomes remain uncertain throughout the process.  Getting things right is less about following the formula than continually making adjustments, as “what works” with one person or situation may not easily transfer to another time or place.  Managing such complexity requires a change of heart and direction, a new professional identity: one in which the playing field between providers and clients is leveled, where power is moved to the center of the dyad and shared, and where ongoing client feedback takes precedence over theory and protocol.

In his delightful and engaging book, The Checklist Manifesto, physician and surgeon Atul Gawande provides numerous examples in medicine, air travel, computer programming, and construction where simple feedback tools have resulted in dramatic improvements in efficiency, effectiveness, and safety.  The dramatic decrease in airplane-related disasters over the last three decades is one example among many–all due to the introduction of simple feedback tools.  Research in the field of behavioral health documents similar improvements.  Multiple studies document that routinely soliciting feedback regarding progress and the alliance results in significantly improved effectiveness, lower dropout rates, and less client deterioration–and all this while decreasing the cost of service delivery.  The research and tools are described in detail in a new series of treatment manuals produced by the members and associates of the International Center for Clinical Excellence–six simple, straightforward, how-to guidebooks covering everything from the empirical foundations, administration, and interpretation of feedback tools to implementation in diverse practice settings.  Importantly, the ICCE Manuals on Feedback Informed Treatment (FIT) are not a recipe or cookbook.  They will not teach you how to do treatment.  You will learn, however, skills for managing the increasingly complex nature of modern behavioral health practice.

In the meantime, here’s a fantastic video of Dr. Gawande on the subject.  Use the cursor to skip ahead to the 2:18 mark:

Filed Under: Behavioral Health, Feedback Informed Treatment - FIT, Practice Based Evidence Tagged With: Atul Gawande, behavioral health, feedback informed treatment, icce, The Checklist Manifesto

Getting FIT: Another Opportunity

February 4, 2012 By scottdm Leave a Comment

The March Advanced Intensive in Feedback Informed Treatment is full!  Not a single space left.  For several weeks, we put folks on a waiting list.  When that reached nearly 20, we told most they’d probably have to wait until next year to attend.

Wait no more!

The ICCE is pleased to announce a second “Advanced Intensive” training scheduled for July 30th through August 2nd, 2012 in Chicago, IL, USA.  If you’ve read the books, attended a one- or two-day introductory workshop, and want to delve deeper into your understanding and use of the principles and practices of FIT, this is the training for you!  Multiple randomized clinical trials document that FIT improves outcomes and retention rates while decreasing the costs of behavioral health.

Four intensive days focused on skill development. Participants will receive a thorough grounding in:

  • The empirical foundations of FIT (i.e., research supporting the common factors, outcome and alliance measures, and feedback)
  • Alliance building skills that cut across different therapeutic orientations and diverse client populations
  • How to use outcome management tools (including one or more of the following: the ORS, SRS, CORE, and OQ-45) to inform and improve treatment
  • How to determine your overall clinical success rates
  • How to significantly improve your outcome and retention rate via feedback and deliberate practice
  • How to use technology for support and improvement of the services you offer clients and payers
  • How to implement FIT in your setting or agency

The training venue is situated along Chicago’s beautiful “Magnificent Mile,” near Northwestern hospital, atop a tall building just steps from the best retail therapy and jazz clubs in the city. As always, the conference features continental breakfast every morning, a night of Blues at one of Scott’s favorite haunts, and dinner at arguably the best Italian restaurant in Chicago.

Unlike any other training, the ICCE “Advanced Intensive” offers both pre- and post-attendance support to enhance learning and retention.  All participants are provided with memberships to the ICCE Trainers Forum, where they can interact with the course instructors and participants, download course readings, view “how-to” videos, and reach out to and learn from the thousands of other member-clinicians around the world.

Don’t wait.  Register today here.

If you are interested in hanging out in Chicago a few extra days, why not register for both the “Advanced Intensive” and the 2012 “Training of Trainers” workshop?  Thanks to the demand, for the first time ever, the two events are being held back to back. Sign up for both events by May 31st and receive 25% off for the trainings!  To obtain your discount code for both events, email: events@centerforclinicalexcellence.com today.

Filed Under: Conferences and Training, Feedback Informed Treatment - FIT Tagged With: cdoi, feedback informed treatment

Excellence "Front and Center" at the Psychotherapy Networker Conference

January 30, 2012 By scottdm Leave a Comment

This year, the Psychotherapy Networker is celebrating its 35th anniversary.  I’m not going to let on how long I’ve been a reader and subscriber, but I can say that I eagerly anticipate each issue.  Rich Simon and his incredibly dedicated and talented crew always seem to have their fingers on the pulse of the profession.

It is no accident that our most recent work on achieving excellence in behavioral health appeared in the pages of the Networker–in 2007, our study of top performing clinicians, “Supershrinks,” and then last year, “The Road to Mastery,” which laid out the most recent findings as well as identified the resources necessary for the development of therapeutic expertise.

I was deeply honored when Rich Simon asked me to give one of the plenary addresses at this year’s Networker Symposium, March 22-25th, 2012.  The theme of this year’s event is, “Creating a New Wisdom: The Art and Science of Optimal Well Being” and I’ll be delivering Friday’s luncheon address on applying the science of expertise to the world of clinical practice.

Click here to register online and join me for 3 fantastic days at this historic meeting.

Filed Under: Conferences and Training, excellence, Feedback Informed Treatment - FIT Tagged With: brief therapy

Looking Back, Looking Forward

January 6, 2012 By scottdm Leave a Comment

Bidding goodbye to last year and welcoming the new always puts me in a reflective frame of mind.  How did my life, work, and relationships go?  What are my hopes for the future?

Just two short years ago, my colleagues from around the world and I launched the International Center for Clinical Excellence (ICCE).  Today, the ICCE is the largest global, web-based community of providers, educators, researchers, and policy makers dedicated to improving the quality and outcome of behavioral health services.  Clinicians can choose to participate in any of the 100-plus forums, create their own discussion group, immerse themselves in a library of documents and how-to videos, and consult directly with peers. Membership costs nothing and the site is free of advertising.  With just a few clicks, practitioners are able to plug into a group of like-minded clinicians whose sole reason for being on the site is to raise everyone’s performance level.  I have many people to thank for the success of ICCE: senior associates and trainers, our community manager Susanne Bargmann, director of training Julie Tilsen, and our tech wizard Enda Madden.

As membership in ICCE has grown from a few hundred to well over 3000, many in the community have worked together to translate research on excellence into standards for improving clinical practice.  Routine outcome monitoring (ROM) has grown in popularity around the world.  As a result, new measures and trainings have proliferated.  In order to ensure quality and consistency, a task force was convened within ICCE in 2010 to develop a list of “Core Competencies”—a document establishing the empirical and practice foundations for outcome-informed clinical work.  In 2011, the ICCE Core Competencies were used to develop and standardize the curricula for the “Advanced Intensive” and “Training-of-Trainers” workshops as well as the exam all attendees must pass to achieve certification as an ICCE Trainer.  As if these accomplishments were not enough, a small cadre of ICCE associates banded together to compose the Feedback Informed Treatment and Training Manuals—six practical, “how-to” volumes covering everything from empirical foundations to implementation.  None of this would have been possible without the tireless contributions of Bob Bertolino, Jason Seidel, Cynthia Maeschalck, Rob Axsen, Susanne Bargmann, Bill Robinson, Robbie Wagner, and Julie Tilsen.

Looking back, I feel tremendous gratitude–both for the members, associates, and trainers of ICCE as well as the many people who have supported my professional journey.  This year, two of those mentors passed away: Dick Fisch and James Hillman.  During my graduate school years, I read James Hillman’s book, Suicide and the Soul.  Many years later, I had the opportunity to present alongside him at the “Evolution of Psychotherapy” conference.  Dick, together with his colleagues from MRI, had a great influence on my work, especially during the early years when I was in Milwaukee with Insoo Berg and Steve de Shazer doing research and writing about brief therapy.  Thinking about Dick reminded me of two other teachers and mentors from that period in my life; namely, John Weakland and Jay Haley.


Looking forward, I am filled with hope and high expectations.  The “Advanced Intensive” training scheduled for March 19-22nd is booked to capacity—not a single spot left.  Registrations for this summer’s “Training of Trainers” course are coming in at a record pace (don’t wait if you are thinking about joining me, Cynthia, and Rob).  Currently, I am awaiting word from the National Registry of Evidence-Based Programs and Practices (NREPP) formally recognizing “Feedback Informed Treatment” (FIT) as an evidence-based approach.  The application process has been both rigorous and time-consuming.  It’s worth it though.  Approval by this department within the federal government would instantly raise awareness of FIT as well as increase access to funding for implementing it.  Keep your fingers crossed!

There’s so much more:

  • Professor Jan Blomqvist, a researcher at the Center for Alcohol and Drug Research at Stockholm University (SoRAD), launched what will be the largest independent evaluation of feedback-informed treatment to date, involving 80+ clinicians and hundreds of clients located throughout Sweden.  I provided the initial training to clinicians in October of last year.  ICCE Certified Trainers Gunnar Lindfeldt and Magnus Johansson are providing ongoing logistic and supervisory support.
  • The most sophisticated and empirically robust interpretive algorithms for the Outcome Rating Scale (based on a sample of 427,744 administrations of the ORS, in 95,478 unique episodes of care, provided by 2,354 different clinicians) have been developed and are now available for integration into software and web-based applications.  Unlike the prior formulas–which plotted the average progress of all consumers, successful and not–the new equations provide benchmarks for comparing individual consumer progress to both successful and unsuccessful treatment episodes.
  • The keynote speakers and venue for the Second Achieving Clinical Excellence Conference have been secured.  We’ll be meeting at one of the nicest hotels in Amsterdam, Holland, May 16th-18th, 2013.  Thanks go to the planning committee: Bill Andrews, Susanne Bargmann, Liz Plutt, Rick Plutt, Tony Jordan, and Bogdan Ion.  Please visit the conference website and submit a proposal for a workshop or presentation.
  • Finally, I’ve been asked to deliver the lunchtime keynote at the upcoming Psychotherapy Networker Conference scheduled for March 23, 2012.  The topic?  Achieving excellence as a behavioral health practitioner.  Last year, my colleague Mark Hubble and I published the lead article in the May-June issue of the magazine, describing the latest research on top-performing clinicians.  I’m deeply honored by the opportunity to speak at this prestigious event.

More coming in the weeks ahead.  Until then, I look forward to connecting with you on ICCE.

Filed Under: Behavioral Health, Conferences and Training, excellence, Feedback Informed Treatment - FIT, ICCE, PCOMS Tagged With: cdoi, feedback informed treatment, HHS, Insoo Berg, NREPP, ors, outcome rating scale, session rating scale, srs, Steve de Shazer

What’s disturbing Mental Health? Opportunities Lost

November 29, 2011 By scottdm Leave a Comment

In a word, paperwork.  Take a look at the book pictured above.  That massive tome on the left is the 2011 edition of “Laws and Regulations” governing mental health practice in the state of California.  Talk about red tape!  Hundreds and hundreds of pages of statutes informing, guiding, restricting, and regulating the “talking cure.”  Now, layer federal and third-party payer policies and paperwork on top of that, and you end up with…lost opportunities.  Many lost opportunities.  Indeed, as pointed out in our recent article, The Road to Mastery, as much as 30% of clinicians’ time is spent completing paperwork required by various funding bodies and regulatory agencies.  THIRTY PERCENT.  Time and money that could be spent far more productively serving people with mental health needs.  Time and money that could be spent on improving treatment facilities and training behavioral health professionals.

In the latest edition of our book, The Heart and Soul of Change, authors Bob Bohanske and Michael Franczak describe their struggle to bring sanity to the paperwork required in public mental health service settings in the state of Arizona.  “The forms needed to obtain a marriage certificate, buy a new home, lease an automobile, apply for a passport, open a bank account, and die of natural causes were assembled,” they wrote, “…and altogether weighed 1.4 ounces.  By contrast, the paperwork required for enrolling a single mother in counseling to talk about difficulties her child was experiencing at school came in at 1.25 pounds” (p. 300).  What gives?

The time has come to confront the unpleasant reality and say it out loud: regulation has lost touch with reality.  Ostensibly, the goal of paperwork and oversight procedures is to improve accountability.  In these evidence-based times, that leads me to say, “show me the data.”  Consider the widespread practice–a mandate, in most instances–of treatment planning.  Simply put, it is less science than science fiction.  Perhaps this practice improves outcomes in a galaxy far, far away, but on planet Earth, supporting evidence is sparse to non-existent (see the review in The Heart and Soul of Change, 2nd Edition).

No amount of medication will resolve this craziness.  Perhaps a hefty dose of CBT might do some good, identifying and correcting the distorted thinking that has led to the current state of affairs.  Whatever happens, the field needs an alternative.  What practice not only ensures accountability but simultaneously improves the quality and outcome of behavioral health services?  Routine outcome measurement and feedback (ROMFb).  As I’ve blogged about several times, numerous RCTs document increased effectiveness and efficiency, along with decreased costs and rates of deterioration.  Simply put, as the slide below summarizes, everybody wins.  Clinicians.  Consumers.  Payers.
Everybody wins

Learn about or deepen your knowledge of feedback-informed treatment (FIT) by attending the upcoming “Advanced Intensive” workshop in March 2012; specifically, the 19th-22nd.  We will have four magical days together.  Space is filling rapidly, so register now.  Then, at the end of the last day of the training, fly to Washington, D.C. to finish off the week by attending the Psychotherapy Networker conference.  Excellence is front and center at the event, and I’ve been asked to deliver the keynote on the subject on the first day!

Filed Under: Behavioral Health, Conferences and Training, Feedback Informed Treatment - FIT Tagged With: bob bohanske, counselling, mental health, michael franczak, The Heart and Soul of Change

