Scott D. Miller – For the latest and greatest information on Feedback Informed Treatment


Top Resources for Top Performers

September 28, 2009 By scottdm 1 Comment

Since the 1960s, over 10,000 “how-to” books on psychotherapy have been published.  I joke about this fact at my workshops, stating, “Any field that needs ten thousand books to describe what it’s doing…surely doesn’t know what it’s doing!” I continue, pointing out that, “There aren’t 10,000-plus books on ‘human anatomy,’ for example.  There are a handful!  And the content of each is remarkably similar.”  The mere existence of so many divergent points of view makes it difficult for any practitioner to sort the proverbial wheat from the chaff.

Over the last 100 years or so, the field has employed three solutions to deal with the existence of so many competing theories and approaches.  First, ignore the differences and continue with “business as usual”–this, in fact, is the approach that’s been used for most of the history of the field.  Second, force a consolidation or reduction by fiat–this, in my opinion, is what is being attempted with much of the current evidence-based practice (“specific treatments for specific disorders”) movement.  Third, and finally, respect the field’s diverse nature and approaches while attempting to understand the “DNA” common to all–said another way, identify and train clinicians in the factors common to all approaches so that they can tailor their work to their clients.

Let’s face it: option one is no longer viable.  Changes in both policy and funding make clear that ignoring the problem will result in further erosion of clinical autonomy.  For anyone choosing option two–either enthusiastically or by inaction–I will blog later this week about developments in the United States and the U.K. on the “evidence-based practice” front that I’m sure will give you pause.  Finally, for those interested in moving beyond the rival factions and delivering the best clinical service to clients, I want to recommend two resources.  First, Derek Truscott’s Becoming an Effective Psychotherapist.  The title says it all.  Whether you are new to the field or an experienced clinician, this book will help you sort through the various and competing psychotherapy approaches and find a style that works for you and the people you work with.  The second volume is Mick Cooper’s Essential Research Findings in Counselling and Psychotherapy.  What can I say about this book?  It is a gem.  Thorough, yet readable.  Empirical in nature, but clinically relevant.  When I’m out and about teaching around the globe and people ask me what to read in order to understand the empirical literature on psychotherapy, I recommend this book.

OK, enough for now.  Stay tuned for further updates this week. In the meantime, I did manage to find a new technique making the rounds on the workshop circuit.  Click on the video below.

Filed Under: Behavioral Health, Practice Based Evidence Tagged With: common factors, counselling, Derek Truscott, evidence based practice, icce, Mick Cooper, psychotherapy, randomized clinical trial

History doesn’t repeat itself

September 20, 2009 By scottdm 2 Comments

[Image: Mark Twain photo portrait, via Wikipedia]

“History doesn’t repeat itself,” the celebrated American author Mark Twain once observed, “but it does rhyme.” There is no better example of Twain’s wry comment than the recurring claims made for specific therapeutic approaches. As any clinician knows, every year witnesses the introduction of new treatment models.  Invariably, the developers and proponents claim superior effectiveness for the approach over existing treatments.  In the last decade or so, such claims, together with the publication of randomized clinical trials, have enabled some to assume the designation of an “evidence-based practice” or “empirically supported treatment.”  Training, continuing education, funding, and policy changes follow.

Without exception, within a few short years, other research appears showing the once widely heralded “advance” to be no more effective than what existed at the time.  Few notice, however, as professional attention is once again captured by a “newer” and “more improved” treatment model.  Studies conducted by my colleagues and me (downloadable from the “scholarly publications” area of my website) document this pattern with treatments for kids, alcohol abuse and dependence, and PTSD over the last 30-plus years.

As folks who’ve attended my recent workshops know, I’ve been using DBT as an example of approaches that have garnered significant professional attention (and funding) despite a relatively small number of studies (and participants) and no evidence of differential effectiveness.  In any event, the American Journal of Psychiatry will soon publish, “A Randomized Trial of Dialectical Behavior Therapy versus General Psychiatric Management for Borderline Personality Disorder.”

As described by the authors, this study is “the largest clinical trial comparing dialectical behavior therapy and an active high-standard, coherent, and principled approach derived from APA guidelines and delivered by clinicians with expertise in treating borderline personality disorder.”

And what did these researchers find?

“Dialectical behavior therapy was not superior to general psychiatric management with both intent-to-treat and per-protocol analyses; the two were equally effective across a range of outcomes.”  Interested readers can request a copy of the paper from the lead investigator, Shelley McMain at: Shelley_McMain@camh.net.

Below, readers can also find a set of slides summarizing and critiquing the current research on DBT. In reviewing the slides, ask yourself, “How could an approach based on such a limited and narrow sample of clients, and with no evidence of differential effectiveness, have achieved worldwide prominence?”

Of course, the results summarized here do not mean that there is nothing of value in the ideas and skills associated with DBT.  Rather, they suggest that the field, including clinicians, researchers, and policy makers, needs to adopt a different approach when attempting to improve the process and outcome of behavioral health practices.  Rather than continuously searching for the “specific treatment” for a “specific diagnosis,” research showing the general equivalence of competing therapeutic approaches indicates that emphasis needs to be placed on: (1) studying factors shared by all approaches that account for success; and (2) developing methods for helping clinicians identify what works for individual clients. This is, in fact, the mission of the International Center for Clinical Excellence: identifying the empirical evidence most likely to lead to superior outcomes in behavioral health.

Dbt Handouts 2009 from Scott Miller

Filed Under: Behavioral Health, Dodo Verdict, Practice Based Evidence Tagged With: alcohol abuse, American Psychological Association, American Journal of Psychiatry, APA, behavioral health, CEU, continuing education, CPD, evidence based medicine, evidence based practice, mental health, psychiatry, PTSD, randomized control trial, Training

Practice-Based Evidence Goes Mainstream

September 5, 2009 By scottdm 3 Comments

For years, my colleagues and I have been using the phrase “practice-based evidence” to refer to clinicians’ use of real-time feedback to develop, guide, and evaluate behavioral health services. Against a tidal wave of support from professional and regulatory bodies, we argued that “evidence-based practice”–the notion that certain treatments work best for certain diagnoses–was not supported by the evidence.

Along the way, my colleagues and I published several meta-analytic studies showing that all therapies work about equally well (click here to access recent studies on children, alcohol abuse and dependence, and post-traumatic stress disorder). The challenge, it seemed to me, was not finding what worked for a particular disorder or diagnosis, but rather what worked for a particular individual–and that required ongoing monitoring and feedback.  In 2006, following years of controversy and wrangling, the American Psychological Association finally revised its official definition of evidence-based practice to be consistent with “practice-based evidence.” You can read the definition in the May-June issue of the American Psychologist, volume 61, pages 271-285.

Now, a recent report on the Medscape Journal of Medicine channel provides further evidence that practice-based evidence is going mainstream. I think you’ll find the commentary interesting, as it provides compelling evidence that an alternative to the dominant paradigm currently guiding professional discourse is taking hold.  Watch it here.

Filed Under: Behavioral Health, evidence-based practice, Practice Based Evidence Tagged With: behavioral health, conference, deliberate practice, evidence based medicine, evidence based practice, mental health, Therapist Effects

The Evolution of Psychotherapy: Twenty-Five Years On

September 1, 2009 By scottdm Leave a Comment

In 1985, I was starting my second year as a doctoral student at the University of Utah.  Like thousands of other graduate students, I’d watched the “Gloria” films.  Carl Rogers, Albert Ellis, and Fritz Perls were all impressive, if not confusing, given their radically different styles.  I also knew that I would soon have the opportunity to meet each one live and in person.  Thanks to Jeffrey K. Zeig, Ph.D. and the dedicated staff at the Milton H. Erickson Foundation, nearly every well-known therapist, guru, and psychotherapy cult-leader would gather for the first mega-conference ever held, the field’s Woodstock: The Evolution of Psychotherapy.

Having zero resources at my disposal, I wrote to Jeff asking if I could volunteer for the event in exchange for the price of admission.  Soon after completing the multiple-page application, I received notice that I had been chosen to work at the event.  I was ecstatic.  When December finally came around, I loaded up my old car with food and a sleeping bag and, together with a long-time friend, Paul Finch, drove from Salt Lake City to Phoenix.   What can I say?  It was alternately inspiring and confusing.  I learned so very much and also felt challenged to make sense of the disparate theories and approaches.

At that time, I had no idea that some twenty years later, I’d receive a call from Jeff Zeig asking me to participate as one of the “State of the Art” faculty for the 2005 Evolution Conference.  Actually, I can remember where I was when my cell phone rang: driving on Highway 12 in southwest Michigan toward Indian Lake, where my family has a small cottage.  In any event, I’m looking forward to attending and presenting at the 2009 conference.  I encourage all of the readers of my blog to attend.  Registration information can be found at the conference website: www.evolutionofpsychotherapy.com.  The highlight of the event for me is a debate/discussion I’ll be having with my friend and colleague, Don Meichenbaum, Ph.D. on the subject of “evidence-based practice.”

One more thing.  To get a feel for the event, I included a clip of a panel discussion from the first Evolution conference featuring Carl Rogers.  Not trying to be hyperbolic, but listening to Rogers speak changed my life.  I won’t bore you with the details but the night following his presentation, I had a dream…(more later)…

Filed Under: Behavioral Health, Conferences and Training, Dodo Verdict, evidence-based practice, excellence Tagged With: albert ellis, carl rogers, Don Meichenbaum, erickson, evidence based practice, Evolution of Psychotherapy, fritz perls, jeffrey k. zeig, psychotherapy

The Debate of the Century

August 27, 2009 By scottdm

What causes change in psychotherapy?  Specific treatments applied to specific disorders?  Those in the “evidence-based” camp say so, and they have had a huge influence on behavioral healthcare policy and reimbursement.  Over the last 10 years, my colleagues and I have written extensively and traveled the world offering a different perspective: by and large, the effectiveness of care is due to a shared group of factors common to all treatment approaches.

In place of “evidence-based” practice, we’ve argued for “practice-based” evidence.  Said another way, what really matters in the debate is whether clients benefit–not the particular treatment approach.  Here on my website, clinicians can download absolutely free measures that can be used to monitor and improve outcome and retention (click Performance Metrics).


Anyway, the message is finally getting through.  Recently, uber-statistician and all around good guy Bruce Wampold, Ph.D. debated prominent EBP proponent Steve Hollon.  Following the exchange, a vote was taken.  Bruce won handily: more than 15:1.

Scroll down to “Closing Debate” (Thursday)

Filed Under: Behavioral Health, Practice Based Evidence Tagged With: bruce wampold, cdoi, evidence based medicine, evidence based practice, ors, outcome rating scale, PCOMS, performance metrics, practice-based evidence, psychotherapy, session rating scale, srs, steve hollon

Superior Performance as a Psychotherapist: First Steps

April 1, 2009 By scottdm Leave a Comment

So what is the first step to improving your performance?  Simply put, knowing your baseline.  Whatever the endeavor, you have to keep score.  All great performers do.  As a result, performance in most fields has improved steadily over the last 100 years.

Consider, for instance, the Olympics. Over the last century, the best performance for every event has improved–in some cases by 50%!  The Gold Medal winning time for the marathon in the 1896 Olympics was just one minute faster than the entry time currently required just to participate in the Chicago and Boston marathons.

By contrast, the effectiveness of psychological therapies has not improved a single percentage point over the last 30 years.  How, you may wonder, could that be?  During the same time period: (1) more than 10,000 how-to books on psychotherapy have been published; (2) the number of treatment approaches has mushroomed from 60 to 400; and (3) there are presently 145 officially approved, evidence-based, manualized treatments for 51 of the 397 possible DSM-IV diagnostic groups.  Certainly, given such “growth,” we therapists must be more effective with more people than ever before.  Unfortunately, however, instead of advancing, we’ve stagnated, mistaking our feverish pedaling for real progress in the Tour de Therapy.

Truth is, no one has been keeping score, least of all us individual practitioners. True, volumes of research now prove beyond any doubt that psychotherapy works.  Relying on such evidence to substantiate the effectiveness of one’s own work, however, is a bit like Tiger Woods telling you the par for a particular hole rather than how many strokes it took him to sink the ball.  The result, research indicates, is that effectiveness rates plateau very early in most therapists’ careers while confidence levels continue to grow.

In one study, for example, when clinicians were asked to rate their job performance from A+ to F, fully two-thirds considered themselves A or better. No one, not a single person in the lot, rated him or herself as below average. As researchers Sapyta, Riemer, and Bickman (2005) conclude, “most clinicians believe that they produce patient outcomes that are well above average” (p. 146). In another study, Deirdre Hiatt and George Hargrave used peer and provider ratings, as well as a standardized outcome measure, to assess the success rates of therapists in a sample of mental health professionals. As one would expect, providers were found to vary significantly in their effectiveness. What was disturbing is that the least effective therapists in the sample rated themselves on par with the most effective!

The reason for stagnant success rates in psychotherapy should be clear to all: why try to improve when you already think you’re the best or, barring that, at least above average?

Here again, expanding our search for excellence beyond the narrow field of psychotherapy to the subject of expertise and expert performance in general can provide some helpful insights. In virtually every profession, from carpentry to police work, medicine to mathematics, average performers overestimate their abilities, confidently assigning themselves to the top tier. Therapists are simply doing what everyone else does. Alas, they are average among the average.

Our own work and research shows that clinicians can break away from the crowd of average achievers by using a couple of simple, valid, and reliable tools for assessing outcome. As hard as it may be to believe, the empirical evidence indicates that performance increases by between 65% and 300% (click here to read the studies). Next time, I’ll review these simple tools as well as a few basic methods for determining exactly how effective you are. Subscribe now so you’ll be the first to know.

One more note, after posting last time, I heard from several readers who had difficulty subscribing. After doing some research, we learned that you must use IE 7 or Firefox 3.0.7 or later for the subscribe function to work properly.  Look forward to hearing from you!

In the meantime, the transcript below is of a recent interview I did for Shrinkrap radio.  It’s focused on our current work:

Supershrinks: An Interview with Scott Miller about What Clinicians can Learn from the Field’s Most Effective Practitioners from Scott Miller

 

Filed Under: Behavioral Health, excellence, Top Performance Tagged With: cdoi, evidence based practice, excellence, mental health, outcome measures, psychology, psychotherapy, srs, supershrinks
