SCOTT D Miller - For the latest and greatest information on Feedback Informed Treatment


Research on the Outcome Rating Scale, Session Rating Scale & Feedback

January 7, 2010 By scottdm Leave a Comment

“How valid and reliable are the ORS and SRS?”  “What do the data say about the impact of routine measurement and feedback on outcome and retention in behavioral health?”  “Are the ORS and SRS ‘evidence-based’?”

These and other questions regarding the evidence supporting the ORS, SRS, and feedback are becoming increasingly common in the workshops I’m teaching in the U.S. and abroad.

As indicated in my December 24th blogpost, routine outcome monitoring (PROMS) has even been endorsed by “specific treatments for specific disorders” proponent David Barlow, Ph.D., who stated unequivocally that “all therapists would soon be required to measure and monitor the outcome of their clinical work.”  Clearly, the time has come for all behavioral health practitioners to be aware of the research regarding measurement and feedback.

Over the holidays, I updated a summary of the data to date that has long been available to trainers and associates of the International Center for Clinical Excellence.  The PDF reviews all of the research on the psychometric properties of the Outcome and Session Rating Scales, as well as the studies using these and other formal measures of progress and the therapeutic relationship to improve outcome and retention in behavioral health services.  The topic is so important that I’ve decided to make the document available to everyone.  Feel free to distribute the file to any and all colleagues interested in staying up to date on this emerging mega-trend in clinical practice.

Measures And Feedback from Scott Miller

Filed Under: evidence-based practice, Feedback Informed Treatment - FIT, Practice Based Evidence Tagged With: behavioral health, continuing education, david barlow, evidence based medicine, evidence based practice, feedback, Hypertension, icce, medicine, ors, outcome measurement, outcome rating scale, post traumatic stress, practice-based evidence, proms, randomized clinical trial, session rating scale, srs, Training

The Study of Excellence: A Radically New Approach to Understanding "What Works" in Behavioral Health

December 24, 2009 By scottdm 2 Comments

“What works” in therapy?  Believe it or not, that question–as simple as it is–has sparked, and continues to spark, considerable debate.  For decades, the field has been divided.  On one side are those who argue that the efficacy of psychological treatments is due to specific factors (e.g., changing negative thinking patterns) inherent in the model of treatment (e.g., cognitive behavioral therapy) remedial to the problem being treated (i.e., depression); on the other is a smaller but no less committed group of researchers and writers who posit that the general efficacy of behavioral treatments is due to a group of factors common to all approaches (e.g., relationship, hope, expectancy, client factors).

While the overall effectiveness of psychological treatment is now well established–studies show that people who receive care are better off than 80% of those who do not, regardless of the approach or the problem treated–one fact cannot be avoided: outcomes have not improved appreciably over the last 30 years!  Said another way, the common versus specific factor battle, while generating a great deal of heat, has not shed much light on how to improve the outcome of behavioral health services.  Despite the incessant talk about and promotion of “evidence-based” practice, there is no evidence that adopting “specific methods for specific disorders” improves outcome.  At the same time, as I’ve pointed out in prior blogposts, the common factors, while accounting for why psychological therapies work, do not and cannot tell us how to work.  After all, if the effectiveness of the various and competing treatment approaches is due to a shared set of common factors, and all models work equally well, why learn about the common factors?  More to the point, there simply is no evidence that adopting a “common factors” approach leads to better performance.

The problem with the specific and common factor positions is that both–and hang onto your seat here–have the same objective at heart; namely, contextlessness.  Each hopes to identify a set of principles and/or practices that are applicable across people, places, and situations.  Thus, specific factor proponents argue that particular “evidence-based” (EBP) approaches are applicable for a given problem regardless of the people or places involved (it’s amazing, really, when you consider that various approaches are being marketed to different countries and cultures as “evidence-based” when there is no evidence that these methods work beyond their very limited and unrepresentative samples).  The common factors camp, on the other hand, proffers in place of techniques an invariant set of, well, generic factors.  Little wonder that outcomes have stagnated.  It’s a bit like trying to learn a language either by memorizing a phrase book–in the case of EBP–or studying the parts of speech–in the case of the common factors.

What to do?  For me, clues for resolving the impasse began to appear when, in 1994, I followed the advice of my friend and long-time mentor, Lynn Johnson, and began formally and routinely monitoring the outcome and alliance of the clinical work I was doing.  Crucially, feedback provided a way to contextualize therapeutic services–to fit the work to the people and places involved–that neither a specific- nor a common-factors-informed approach could.

Numerous studies (21 RCTs, including 4 studies using the ORS and SRS) now document the impact of using outcome and alliance feedback to inform service delivery.  One study, for example, showed a 65% improvement over baseline performance rates with the addition of routine alliance and outcome feedback.  Another, more recent study of couples therapy found that divorce/separation rates in the feedback condition were half (50%) those of the no-feedback condition!

Such results have, not surprisingly, led the practice of “routine outcome monitoring” (PROMS) to be deemed “evidence-based.”  At the recent Evolution of Psychotherapy conference, I was on a panel with David Barlow, Ph.D.–a long-time proponent of “specific treatments for specific disorders” (EBP)–who, in response to my brief remarks about the benefits of feedback, stated unequivocally that all therapists would soon be required to measure and monitor the outcome of their clinical work.  Given that my work has focused almost exclusively on seeking and using feedback for the last 15 years, you would think I’d be happy.  And while gratifying on some level, I must admit to being both surprised and frightened by his pronouncement.

My fear?  Focusing on measurement and feedback misses the point.  Simply put: it’s not seeking feedback that is important.  Rather, it’s what feedback potentially engenders in the user that is critical.  Consider the following: while the results of trials to date clearly document the benefit of PROMS to those seeking therapy, there is currently no evidence that the practice has a lasting impact on those providing the service.  “The question is,” as researcher Michael Lambert notes, “have therapists learned anything from having gotten feedback? Or, do the gains disappear when feedback disappears? About the same question. We found that there is little improvement from year to year…” (quoted in Miller et al. [2004]).

Research on expertise in a wide range of domains (including chess, medicine, physics, computer programming, and psychotherapy) indicates that in order to have a lasting effect, feedback must increase a performer’s “domain-specific knowledge.”  Feedback must result in the performer knowing more about his or her area–and how and when to apply that knowledge to specific situations–than others.  Master-level chess players, for example, have been shown to possess 10 to 100 times more chess knowledge than “club-level” players.  Not surprisingly, master players’ vast information about the game is consolidated and organized differently than that of their less successful peers; namely, in a way that allows them to access, sort, and apply potential moves to the specific situation on the board.  In other words, their immense knowledge is context specific.

A mere handful of studies document similar findings among superior-performing therapists: not only do they know more, they know how, when, and with whom to apply that knowledge.  I noted these and highlighted a few others in the research pipeline during my workshop on “Achieving Clinical Excellence” at the Evolution of Psychotherapy conference.  I also reviewed what 30 years of research on expertise and expert performance has taught us about how feedback must be used in order to ensure that learning actually takes place.  Many of those in attendance stopped by the ICCE booth following the presentation to talk with our CEO, Brendan Madden, or one of our Associates and Trainers (see the video below).

Such research, I believe, holds the key to moving beyond the common versus specific factor stalemate that has long held the field in check–providing therapists with the means for developing, organizing, and contextualizing clinical knowledge in a manner that leads to real and lasting improvements in performance.

Filed Under: Behavioral Health, excellence, Feedback, Top Performance Tagged With: brendan madden, cdoi, cognitive behavioral therapy, common factors, continuing education, david barlow, evidence based medicine, evidence based practice, Evolution of Psychotherapy, feedback, icce, michael lambert, ors, outcome rating scale, proms, session rating scale, srs, therapist, therapists, therapy

Five Incredible Days in Anaheim

December 15, 2009 By scottdm 2 Comments

From December 9-13th, eight thousand five hundred mental health practitioners, from countries around the globe, gathered in Anaheim, California to attend the “Evolution of Psychotherapy” conference.  Held every five years since 1985, the conference started big and has grown only larger.  “Only a few places in the US can accommodate such a large gathering,” says Jeffrey K. Zeig, Ph.D., who has organized the conference since the first.

The event brings together 40 of the field’s leading researchers, practitioners, trend setters, and educators to deliver keynote addresses and workshops, host discussion panels, and offer clinical demonstrations on every conceivable subject related to clinical practice.  Naturally, I spoke about my current work on “Achieving Clinical Excellence,” as well as served on several topical panels, including “evidence based practice” (with Don Meichenbaum), “Research on Psychotherapy” (with Steven Hayes and David Barlow), and “Severe and Persistent Mental Illness” (with Marsha Linehan and Jeff Zeig).

Most exciting of all, the Evolution of Psychotherapy conference also served as the official launching point for the International Center for Clinical Excellence.  Here I am pictured with long-time colleague and friend, Jeff Zeig, and psychologist and ICCE CEO, Brendan Madden, in front of the ICCE display in the convention center hall.

Over the five days, literally hundreds of visitors stopped by booth #128 to chat with me, Brendan, and Senior ICCE Associates and Trainers Rob Axsen, Jim Walt, Cynthia Maeschalck, Jason Seidel, Bill Andrews, Gunnar Lindfeldt, and Wendy Amey.  Among other things, a cool M&M dispenser passed out goodies to folks (if they pressed the right combination of buttons), we talked about and handed out leaflets advertising the upcoming “Achieving Clinical Excellence” conference, and people watched a brief video introducing the ICCE community.  Take a look for yourself:


More to come from the week in Anaheim….

Filed Under: Behavioral Health, Conferences and Training, excellence, ICCE Tagged With: Achieving Clinical Excellence, brendan madden, david barlow, Don Meichenbaum, evidence based practice, Evolution of Psychotherapy, icce, Jeff Zeig, jeffrey K. zeig, Marsha Linehan, mental health, psychotherapy, Steve Hayes

Evolution of Psychotherapy and the International Center for Clinical Excellence

December 9, 2009 By scottdm Leave a Comment


Dateline: Chicago, Illinois
December 7, 2009

I’ve just finished packing my bags and am heading for the airport.  Tomorrow the “Evolution of Psychotherapy” begins.  Nearly 25 years after volunteering at the first “Evolution” conference, I’m back a second time, now to present.  Tomorrow, I’ll be talking about “Achieving Clinical Excellence.”  On the days that follow, I’m on panels with my friend Don Meichenbaum, as well as David Barlow, Marsha Linehan, and others.  I’m really looking forward to the four days in Anaheim.

Of everything going on in sunny southern California, I have to say that I’m most excited about the launch of the International Center for Clinical Excellence.  We have a booth (#128) in the exhibitor hall where folks can stop by, talk, and peruse our new website.  As promised, it is a true web 2.0 experience, enabling clinicians, researchers, and educators around the world to connect, share, and learn from each other.

We’ll be streaming video to Facebook and Twitter.  Stay tuned to my blog and Twitter accounts for updates, videos, and pictures from the conference.

Filed Under: Conferences and Training, excellence, ICCE Tagged With: achieving clinical excellence, david barlow, Don Meichenbaum, Evolution of Psychotherapy, Marsha Linehan, psychotherapy
