Scott D. Miller - For the latest and greatest information on Feedback Informed Treatment


Leading Outcomes in Vermont: The Brattleboro Retreat and PrimariLink Project

November 8, 2009 By scottdm 4 Comments

For the last 7 years, I’ve been traveling to the small, picturesque village of Brattleboro, Vermont to work with clinicians, agency managers, and various state officials on integrating outcomes into behavioral health services.  Peter Albert, the director of Governmental Affairs and PrimariLink at the Brattleboro Retreat, has tirelessly crisscrossed the state, promoting outcome-informed clinical work and organizing the trainings and ongoing consultations.   Over time, I’ve done workshops on the common factors, “what works” in therapy, using outcome to inform treatment, working with challenging clinical problems and situations and, most recently, the qualities and practices of super effective therapists.  In truth, outcome-informed clinical work both grew up and “came of age” in Vermont.  Indeed, Peter Albert was the first to bulk-purchase the ASIST program and distribute it for free to any provider interested in tracking and improving the effectiveness of their clinical work.

If you’ve never been to the Brattleboro area, I can state without reservation that it is one of the most beautiful areas I’ve visited in the U.S.–particularly during the Fall, when the leaves are changing color. If you are looking for a place to stay for a few days, the Crosby House is my first and only choice. The campus of the Retreat is also worth visiting. It’s no accident that the trainings are held there, as it has been a place for cutting-edge services since its founding in 1834. The radical idea at that time? Treat people with respect and dignity. The short film below gives a brief history of the Retreat and a glimpse of the serene setting.

Anyway, this last week I spent an entire day with a select group of therapists dedicated to improving outcomes and delivering superior service to their clients. Briefly, these clinicians have been volunteering their time to participate in a project to implement outcome-informed work in their clinical settings. We met in the boardroom at the Retreat, discussing the principles and practices of outcome-informed work as well as reviewing graphs of their individual and aggregate ORS and SRS data.

It has been and continues to be an honor to work with each and every one of the clinicians in the PrimariLink project. Together, they are making a real difference in the lives of those they work with and in the field of behavioral health in Vermont. If you are a clinician located in Vermont or provide services to people covered by MVP or PrimariLink and would like to participate in the project, please email Peter Albert. At the same time, if you are a person in need of behavioral health services and looking for a referral, you could do no better than contacting one of the providers in the project!

Filed Under: Behavioral Health, Feedback Informed Treatment - FIT, FIT Software Tools, Practice Based Evidence Tagged With: behavioral health, common factors, consultation, ors, outcome rating scale, session rating scale, srs, supershrinks, therapy, Training

Common versus Specific Factors and the Future of Psychotherapy: A Response to Siev and Chambless

October 31, 2009 By scottdm 4 Comments

Early last summer, I received an email from my longtime friend and colleague Don Meichenbaum alerting me to an article published in the April 2009 edition of the Behavior Therapist–the official “newsletter” of the Association for Behavioral and Cognitive Therapies–that was critical of the work I and others have done on the common factors.

Briefly, the article, written by two proponents of the “specific treatments for specific disorders” approach to “evidence-based practice” in psychology, argued that the common factors position–the idea that the efficacy of psychotherapy is largely due to shared rather than unique or model-specific factors–was growing in popularity despite being based on “fallacious reasoning” and a misinterpretation of the research.

Although the article claimed to provide an update on research bearing directly on the validity of the “dodo verdict”–the idea that all treatment approaches work equally well–it simply repeated old criticisms and ignored contradictory, and at times vast, evidence. Said another way, rather than seizing the opportunity they were given to educate clinicians and address the complex issues involved in questions surrounding evidence-based practice, Siev and Chambless instead wrote to “shore up the faithful.” “Do not doubt,” they counseled their adherents, “science is on our side.”

That differences and tensions exist in the interpretation of the evidence is clear and important. At the same time, more should be expected from those who lead the field. Read the articles and decide for yourself. The issues at stake are critical to the future of psychotherapy. As I will blog about next week, there are forces at work in the United States and abroad that are currently working to limit the types of approaches clinicians can employ when working with clients. Though well-intentioned, these efforts are, the available evidence indicates, horribly misguided. Once again, the question clinicians and consumers face is not “which treatment is best for that problem?” but rather “which approach fits with, engages, and helps this particular consumer at this moment in time?”

Behavior Therapist (April 2009) from Scott Miller

Dissemination of EST’s (November 2009) from Scott Miller

Filed Under: Dodo Verdict, evidence-based practice, Practice Based Evidence Tagged With: Association for Behavioral and Cognitive Therapies, behavior therapist, Don Meichenbaum, evidence based medicine, evidence based practice, psychology, psychotherapy

Whoa Nellie! A 25 Million Dollar Study of Treatments for PTSD

October 27, 2009 By scottdm 1 Comment

I have in my hand a frayed and yellowed copy of observations once made by a well-known trainer of horses. The trainer’s simple message for leading a productive and successful professional life was, “If the horse you’re riding dies, get off.”

You would think the advice straightforward enough for anyone to understand and benefit from. And yet, the trainer pointed out, “many professionals don’t always follow it.” Instead, they choose from an array of alternatives, including:

  1. Buying a strong whip
  2. Switching riders
  3. Moving the dead horse to a new location
  4. Riding the dead horse for longer periods of time
  5. Saying things like, “This is the way we’ve always ridden the horse.”
  6. Appointing a committee to study the horse
  7. Arranging to visit other sites where they ride dead horses more efficiently
  8. Increasing the standards for riding dead horses
  9. Creating a test for measuring our riding ability
  10. Complaining about the state of the horse these days
  11. Coming up with new styles of riding
  12. Blaming the horse’s parents, as the problem is often in the breeding.

When it comes to the treatment of post-traumatic stress disorder, it appears the Department of Defense is applying all of the above. Recently, the DoD awarded the largest grant ever given to “discover the best treatments for combat-related post-traumatic stress disorder” (APA Monitor). Beneficiaries of the award were naturally ecstatic, stating, “The DoD has never put this amount of money to this before.”

Missing from the announcements was any mention of research which clearly shows no difference in outcome between approaches intended to be therapeutic—including the two approaches chosen for comparison in the DoD study! In June 2008, researchers Benish, Imel, and Wampold conducted a meta-analysis of all studies in which two or more treatments for PTSD were directly compared. The authors concluded, “Given the lack of differential efficacy between treatments, it seems scientifically questionable to recommend one particular treatment over others that appear to be of comparable effectiveness. . . . keeping patients in treatment would appear to be more important in achieving desired outcomes than would prescribing a particular type of psychotherapy” (p. 755).

Ah yes, the horse is dead, but proponents of “specific treatments for specific disorders” ride on. You can hear their rallying cry: “we will find a more efficient and effective way to ride this dead horse!” My advice? Simple: let’s get off this dead horse. There are any number of effective treatments for PTSD. The challenge is decidedly not figuring out which one is best for all but rather “what works” for the individual. In these recessionary times, I can think of far better ways to spend $25 million than on another “horse race” between competing therapeutic approaches. Evidence-based methods exist for assessing and adjusting both the “fit and effect” of clinical services—the methods described, for instance, in the scholarly publications section of my website. Such methods have been found to improve both outcome and retention by as much as 65%. What will happen? Though I’m hopeful, I must say that the temptation to stay on the horse you chose at the outset of the race is a strong one.

Filed Under: Behavioral Health, Feedback Informed Treatment - FIT, Practice Based Evidence, PTSD Tagged With: behavioral health, continuing education, evidence based medicine, evidence based practice, icce, meta-analysis, ptsd, reimbursement

The Field, the Future, and Feedback

October 2, 2009 By scottdm 1 Comment

There is an old (but in many ways sad) joke about two clinicians–actually, the way I first heard the story, it was two psychiatrists. The point of the story is the same regardless of the discipline of the provider. Anyway, two therapists meet in the hallway after a long day spent meeting with clients. One, the younger of the two, is tired and bedraggled. The other, older and more experienced, looks the same as s/he did at the start of the day: eyes bright and attentive, hair perfectly groomed, clothes and appearance immaculate. Taken aback by the composure of the more experienced colleague, the younger therapist asks, “How do you do it? How do you listen to the trials and tribulations, the problems and complaints, the dire lives and circumstances of your clients, minute upon minute, hour upon hour…and yet emerge at the end of the day in such good shape?” Slowly shaking his head from left to right, the older and more experienced clinician reached out, tapped the less experienced colleague gently on the shoulder, and then, after removing the thick plugs stuffed into both of his ears, said, “Excuse me, what did you say?”

Let’s face it: healthcare is in trouble. Behavioral healthcare in particular is in even worse shape. And while solutions from politicians, pundits, industry insiders, and professionals are circulating in Washington with all the sound and fury of a hurricane, the voice of consumers is largely absent. Why? Of course, many of the barriers between providers and consumers are systemic in nature and, as such, out of the control of average clinicians and consumers. Others, however, are local and could be addressed in an instant with a modicum of interest and attention on the part of professionals.

Chief among the steps practitioners could take to bridge the chasm between them and consumers is the adoption of routine, ongoing feedback. Seeking and utilizing real-time feedback from consumers has the added advantage of significantly boosting outcomes and increasing retention in services (several studies documenting the impact of feedback are available in the “Scholarly Publications and Handouts” section of my website). Healthcare providers can download two well-validated and easy-to-use scales right now for free by clicking on the Performance Metrics tab to the left.

So far, however, few in healthcare seem interested, and others are downright hostile to the idea of asking consumers for input. Consider the following story by reporter Lindsey Tanner entitled, “Take two, call me in the morning…and keep it quiet.” Tanner discovered that some in healthcare are demanding that people (patients, clients, consumers) sign “gag orders” prior to being treated–agreeing, in effect, not to post comments about the provider (negative or otherwise) to online sites such as Zagats.com, Angieslist.com, and RateMds.com. According to the article, a Greensboro, N.C. company, ironically called “Medical Justice,” is, for a fee, now providing physicians with standardized waiver agreements and advising all doctors to have patients sign on the dotted line. And if the patient refuses? Simple: find another doctor.

Can you imagine a hotel chain or restaurant asking you to sign a legally binding agreement not to disclose your experience prior to booking your room or handing you the menu? Anyone who has traveled lately knows the value of the information contained on consumer-driven websites such as TripAdvisor.com. It’s outlandish, really–except in healthcare.

To be sure, there is at least one important difference between healthcare and other service industries.  Specifically, healthcare providers, unlike business owners and service managers, are prevented from responding to online complaints by existing privacy laws.  However, even if this problem were insurmountable–which it is not–how then can one explain the continuing reluctance on the part of professionals to give people access to their own healthcare records?  And this despite federal regulations under the Health Insurance Portability and Accountability Act (HIPAA) permitting complete and unfettered access (click here to read the recent NPR story on this subject).  Clearly, the problem is not legal but rather cultural in nature.  Remember when Elaine from Seinfeld asked to see her chart?

Earlier this summer, my family and I were vacationing in Southwest Michigan. One day, after visiting the beach and poking around the shops in the lakeside town of South Haven, we happened on a small Italian bistro named Tello. Being from a big city famous for its good eats, I’ll admit I wasn’t expecting much. The food was delicious. More surprising still was the service. Not only were the staff welcoming and attentive, but at the end of the meal, when I thought the time had come to pay the bill, the folder I was given contained a small PDA rather than the check. I was being asked for my feedback.

Answering the questions took less than a minute, and the manager, Mike Sheedy, appeared at our table within moments of my hitting the “send” button. He seemed genuinely surprised when I asked if he felt uncomfortable seeking feedback so directly. “Have you learned anything useful?” I then inquired. “Of course,” he answered immediately, “just last week a customer told us that it would be nice to have a children’s menu posted in the window alongside the standard one.” I was dumbstruck, as one of the main reasons we had decided to go into this restaurant rather than others was that the children’s menu was prominently displayed in the front window!

Filed Under: excellence, Feedback, Feedback Informed Treatment - FIT, Practice Based Evidence Tagged With: behavioral health, holland, randomized clinical trial

Top Resources for Top Performers

September 28, 2009 By scottdm 1 Comment

Since the 1960s, over 10,000 “how-to” books on psychotherapy have been published. I joke about this fact at my workshops, stating, “Any field that needs ten thousand books to describe what it’s doing…surely doesn’t know what it’s doing!” I continue, pointing out that, “There aren’t 10,000-plus books on ‘human anatomy,’ for example. There are a handful! And the content of each is remarkably similar.” The mere existence of so many divergent points of view makes it difficult for any practitioner to sort the proverbial “wheat from the chaff.”

Over the last 100 years or so, the field has employed three solutions to deal with the existence of so many competing theories and approaches. First, ignore the differences and continue with “business as usual”–this, in fact, is the approach that’s been used for most of the history of the field. Second, force a consolidation or reduction by fiat–this, in my opinion, is what is being attempted with much of the current evidence-based practice (“specific treatments for specific disorders”) movement. Third, and finally, respect the field’s diverse nature and approaches while attempting to understand the “DNA” common to all–said another way, identify and train clinicians in the factors common to all approaches so that they can tailor their work to their clients.

Let’s face it: option one is no longer viable. Changes in both policy and funding make clear that ignoring the problem will result in further erosion of clinical autonomy. For anyone choosing option two–either enthusiastically or by inaction–I will blog later this week about developments in the United States and U.K. on the “evidence-based practice” front that I’m sure will give you pause. Finally, for those interested in moving beyond the rival factions and delivering the best clinical service to clients, I want to recommend two resources. The first is Derek Truscott’s Becoming an Effective Psychotherapist. The title says it all. Whether you are new to the field or an experienced clinician, this book will help you sort through the various and competing psychotherapy approaches and find a style that works for you and the people you work with. The second volume is Mick Cooper’s Essential Research Findings in Counselling and Psychotherapy. What can I say about this book? It is a gem. Thorough, yet readable. Empirical in nature, but clinically relevant. When I’m out and about teaching around the globe and people ask me what to read in order to understand the empirical literature on psychotherapy, I recommend this book.

OK, enough for now.  Stay tuned for further updates this week. In the meantime, I did manage to find a new technique making the rounds on the workshop circuit.  Click on the video below.

Filed Under: Behavioral Health, Practice Based Evidence Tagged With: common factors, counselling, Derek Truscott, evidence based practice, icce, Mick Cooper, psychotherapy, randomized clinical trial

History doesn’t repeat itself,

September 20, 2009 By scottdm 2 Comments

[Image: Mark Twain photo portrait, via Wikipedia]

“History doesn’t repeat itself,” the celebrated American author Mark Twain once observed, “but it does rhyme.” There is no better example of Twain’s wry comment than the recurring claims made about specific therapeutic approaches. As any clinician knows, every year witnesses the introduction of new treatment models. Invariably, the developers and proponents claim superior effectiveness of the approach over existing treatments. In the last decade or so, such claims, together with the publication of randomized clinical trials, have enabled some to assume the designation of an “evidence-based practice” or “empirically supported treatment.” Training, continuing education, funding, and policy changes follow.

Without exception, in a few short years other research appears showing the once widely heralded “advance” to be no more effective than what existed at the time. Few notice, however, as professional attention is once again captured by a “newer” and “more improved” treatment model. Studies conducted by my colleagues and me (downloadable from the “scholarly publications” area of my website) document this pattern with treatments for kids, alcohol abuse and dependence, and PTSD over the last 30-plus years.

As folks who’ve attended my recent workshops know, I’ve been using DBT as an example of an approach that has garnered significant professional attention (and funding) despite a relatively small number of studies (and participants) and no evidence of differential effectiveness. In any event, the American Journal of Psychiatry will soon publish, “A Randomized Trial of Dialectical Behavior Therapy versus General Psychiatric Management for Borderline Personality Disorder.”

As described by the authors, this study is “the largest clinical trial comparing dialectical behavior therapy and an active high-standard, coherent, and principled approach derived from APA guidelines and delivered by clinicians with expertise in treating borderline personality disorder.”

And what did these researchers find?

“Dialectical behavior therapy was not superior to general psychiatric management with both intent-to-treat and per-protocol analyses; the two were equally effective across a range of outcomes.”  Interested readers can request a copy of the paper from the lead investigator, Shelley McMain at: Shelley_McMain@camh.net.

Below, readers can also find a set of slides summarizing and critiquing the current research on DBT. In reviewing the slides, ask yourself, “How could an approach based on such a limited and narrow sample of clients, and with no evidence of differential effectiveness, have achieved worldwide prominence?”

Of course, the results summarized here do not mean that there is nothing of value in the ideas and skills associated with DBT. Rather, they suggest that the field, including clinicians, researchers, and policy makers, needs to adopt a different approach when attempting to improve the process and outcome of behavioral health practices. Rather than continuously searching for the “specific treatment” for a “specific diagnosis,” research showing the general equivalence of competing therapeutic approaches indicates that emphasis needs to be placed on: (1) studying the factors shared by all approaches that account for success; and (2) developing methods for helping clinicians identify what works for individual clients. This is, in fact, the mission of the International Center for Clinical Excellence: identifying the empirical evidence most likely to lead to superior outcomes in behavioral health.

DBT Handouts 2009 from Scott Miller

Filed Under: Behavioral Health, Dodo Verdict, Practice Based Evidence Tagged With: alcohol abuse, American Psychological Association, American Journal of Psychiatry, APA, behavioral health, CEU, continuing education, CPD, evidence based medicine, evidence based practice, mental health, psychiatry, PTSD, randomized control trial, Training

Practice-Based Evidence Goes Mainstream

September 5, 2009 By scottdm 4 Comments

For years, my colleagues and I have been using the phrase “practice-based evidence” to refer to clinicians’ use of real-time feedback to develop, guide, and evaluate behavioral health services. Against a tidal wave of support from professional and regulatory bodies, we argued that “evidence-based practice”–the notion that certain treatments work best for certain diagnoses–was not supported by the evidence.

Along the way, my colleagues and I published several meta-analytic studies showing that all therapies worked about equally well (click here to access recent studies on children, alcohol abuse and dependence, and post-traumatic stress disorder). The challenge, it seemed to me, was not finding what worked for a particular disorder or diagnosis, but rather what worked for a particular individual–and that required ongoing monitoring and feedback. In 2006, following years of controversy and wrangling, the American Psychological Association finally revised its official definition of evidence-based practice to be consistent with “practice-based evidence.” You can read the definition in the May-June issue of the American Psychologist, volume 61, pages 271-285.

Now, a recent report on the Medscape Journal of Medicine channel provides further evidence that practice-based evidence is going mainstream. I think you’ll find the commentary interesting, as it provides compelling evidence that an alternative to the dominant paradigm currently guiding professional discourse is taking hold. Watch it here.

Filed Under: Behavioral Health, evidence-based practice, Practice Based Evidence Tagged With: behavioral health, conference, deliberate practice, evidence based medicine, evidence based practice, mental health, Therapist Effects

The Debate of the Century

August 27, 2009 By scottdm

What causes change in psychotherapy? Specific treatments applied to specific disorders? Those in the “evidence-based” camp say so, and they have had a huge influence on behavioral healthcare policy and reimbursement. Over the last 10 years, my colleagues and I have written extensively and traveled the world offering a different perspective: by and large, the effectiveness of care is due to a shared group of factors common to all treatment approaches.

In place of “evidence-based” practice, we’ve argued for “practice-based” evidence. Said another way, what really matters in the debate is whether clients benefit–not the particular treatment approach. Here on my website, clinicians can download, absolutely free, measures that can be used to monitor and improve outcome and retention (click Performance Metrics).


Anyway, the message is finally getting through. Recently, uber-statistician and all-around good guy Bruce Wampold, Ph.D., debated prominent EBP proponent Steve Hollon. Following the exchange, a vote was taken. Bruce won handily: more than 15:1.

Scroll down to “Closing Debate” (Thursday)

Filed Under: Behavioral Health, Practice Based Evidence Tagged With: bruce wampold, cdoi, evidence based medicine, evidence based practice, ors, outcome rating scale, PCOMS, performance metrics, practice-based evidence, psychotherapy, session rating scale, srs, steve hollon
