Scott D. Miller - For the latest and greatest information on Feedback Informed Treatment


Clinical Practice Guidelines: Beneficial Development or Bad Therapy?

December 4, 2017 By scottdm 15 Comments

A couple of weeks ago, the American Psychological Association (APA) released clinical practice guidelines for the treatment of people diagnosed with post-traumatic stress disorder (PTSD).  “Developed over four years using a rigorous process,” according to an article in the APA Monitor, they are the first of many recommendations of specific treatment methods for particular psychiatric diagnoses the organization plans to publish.

Almost immediately, controversy broke out.  On the Psychology Today blog, Clinical Associate Professor Jonathan Shedler advised practitioners and patients to ignore the new guidelines, labeling them “bad therapy.”  Within a week, Professors Dean McKay and Scott Lilienfeld responded, lauding the guidelines as a “significant advance for psychotherapy practice,” while repeatedly accusing Shedler of committing logical fallacies and misrepresenting the evidence.

One thing I know for sure: coming in at just over 700 pages, few if any practitioners will ever read the complete guideline and supportive appendices.  Beyond length, the way the information is presented, especially the lack of hyperlinks for cross-referencing the studies cited, seriously compromises any straightforward effort to review and verify evidentiary claims.

If, as the old saying goes, “the devil is in the details,” the level of mind-numbing minutiae contained in the official documents ensures he’ll remain well hidden, tempting all but the most compulsive to accept the headlines on faith.

Consider the question of whether certain treatment approaches are more effective than others.  Page 1 of the Executive Summary identifies differential efficacy as a “key question” to be addressed by the Guideline.  Ultimately, four specific approaches are strongly recommended, being deemed more effective than…wait for it… “relaxation.”

My first thought was, “OK, curious comparison.”  Nevertheless, I read on.

Only by digging deep into the report, tracing the claim to the specific citations, and then using PsycNET and another subscription service to access the actual studies, is it possible to discover that in the vast majority of published trials reviewed, the four “strongly recommended” approaches were actually compared to nothing.  That’s right, nothing.

In the few studies that did include relaxation, the structure of that particular “treatment” precluded sufferers from talking directly about their traumatic experiences.  At this point, my curiosity gave way to chagrin.  Is it any wonder the four recommended approaches proved more helpful?  What real-world practitioner would limit their work with someone suffering from PTSD to recording “a relaxation script” and telling their client to “listen to it for an hour each day”?

(By the way, it took me several hours to distill the information noted above from the official documentation, and I’m someone with a background in research, access to several online databases, a certain facility with search engines, and connections with a community of fellow researchers with whom I can consult.)

On the subject of what research shows works best in the treatment of PTSD, meta-analyses of studies in which two or more approaches intended to be therapeutic are directly compared consistently find no difference in outcome between methods; importantly, this holds whether the treatments are designated “trauma-focused” or not.  Meanwhile, another highly specialized type of research, known as dismantling studies, fails to provide any evidence for the belief that specialized treatments contain ingredients specifically remedial to the diagnosis!  And yes, that includes the ingredient most believe essential to therapeutic success in the treatment of PTSD: exposure (1, 2).

So, if the data I cite above are accurate, and freely available, how could the committee that created the Guideline come to such dramatically different conclusions?  In particular, why go to such great lengths to recommend particular approaches to the exclusion of others?

Be forewarned, you may find my next statement confusing.  The summary of studies contained in the Guideline and supportive appendices is absolutely accurate.  It is the interpretation of that body of research, however, that is in question.

More than anything else, the difference between the recommendations contained in the Guideline and the evidence I cite above is attributable to a deep and longstanding rift in the body politic of the APA.  How else can one reconcile advocating the use of particular approaches with APA’s own official policy on psychotherapy, which recognizes that “different forms . . . typically produce relatively similar outcomes”?

Seeking to place the profession “on a comparable plane” with medicine, some within the organization, in particular the leaders and membership of Division 12 (Clinical Psychology), have long sought to create a psychological formulary.  In part, their argument goes, “Since medicine creates lists of recommended treatments and procedures, why not psychology?”

Here, the answer is simple and straightforward: because psychotherapy does not work like medicine.  As Jerome Frank observed long before the weight of evidence supported his view, effective psychological care comprises:

  • An emotionally-charged, confiding relationship with a helping person (e.g., a therapist);
  • A healing context or setting (e.g., clinic);
  • A rational, conceptual scheme, or myth that is congruent with the sufferer’s worldview and provides a plausible explanation for their difficulties (e.g., psychotherapy theories); and
  • Rituals and/or procedures consistent with the explanation (e.g., techniques).

These four attributes not only fit the evidence but explain why virtually all psychological approaches tested over the last 40 years work, even those labelled pseudoscience (e.g., EMDR) by Lilienfeld and other advocates of guidelines comprised of “approved therapies.”

That the profession could benefit from good guidelines goes without saying.  Healing the division within APA would be a good place to start.  Until then, encouraging practitioners to follow the organization’s own definition of evidence-based practice would suffice.  To wit, “Evidence based practice is the integration of the best available research with clinical expertise in the context of patient (sic) characteristics, culture, and preferences.”  Note the absence of any mention of specific treatment approaches.  Instead, consistent with Frank’s observations, and the preponderance of research findings, emphasis is placed on fitting care to the person.

How to do this?  The official statement continues, encouraging the “monitoring of patient (sic) progress . . . that may suggest the need to adjust the treatment.”  Over the last decade, multiple systems have been developed for tracking engagement and progress in real time.  Our own system, known as Feedback Informed Treatment (FIT), is being applied by thousands of therapists around the world, with literally millions of clients.  It is listed on the National Registry of Evidence-based Programs and Practices.  More, when engagement and progress are tracked together with clients in real time, the data to date document improvements in both retention and outcome of mental health services, regardless of the treatment method being used.
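For readers who want the logic of “tracking progress in real time” spelled out, here is a minimal sketch in Python, purely for illustration.  The cutoff and reliable-change values, the function name, and the example scores are assumptions chosen for the sketch, not FIT’s official parameters or software; substitute the published values for whatever measure you actually use.

    # Illustrative sketch of session-by-session outcome monitoring.
    # The numbers below are example values, not official FIT parameters.

    CLINICAL_CUTOFF = 25.0   # score separating "clinical" from "non-clinical" range
    RELIABLE_CHANGE = 5.0    # smallest change considered larger than measurement error


    def progress_status(scores):
        """Classify a client's trajectory from a list of per-session outcome scores."""
        if len(scores) < 2:
            return "baseline only"
        change = scores[-1] - scores[0]
        if change >= RELIABLE_CHANGE and scores[-1] >= CLINICAL_CUTOFF:
            return "recovered"        # reliable improvement and above the cutoff
        if change >= RELIABLE_CHANGE:
            return "improved"         # reliable improvement, still below the cutoff
        if change <= -RELIABLE_CHANGE:
            return "deteriorated"     # reliable worsening: discuss and adjust care
        return "no reliable change"   # flat trajectory: review the fit of the approach


    # Example: intake 18, then 19, then 17 -> "no reliable change" after three
    # sessions, a cue to talk with the client about adjusting the treatment.
    print(progress_status([18, 19, 17]))

The point of the sketch is simply that the signal guiding care comes from the client’s own session-by-session data, not from which brand of therapy is being delivered.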

Until  next time,

Scott

Scott D. Miller, Ph.D.
Director, International Center for Clinical Excellence

 

Filed Under: evidence-based practice, Practice Based Evidence, PTSD

More Deliberate Practice Resources…

May 30, 2017 By scottdm 1 Comment

Last week, I blogged about a free, online resource aimed at helping therapists improve their outcomes via deliberate practice.  Because the web-based system doubles as a randomized controlled trial (RCT), participants would not only be accessing a cutting-edge, evidence-based protocol but also contributing to the field’s growing knowledge in this area.

To say interest was high doesn’t even come close.  Within 45 minutes of the first social media blast, every available spot was filled, including those on the waiting list!  Lead researchers Daryl Chow and Sharon Lu managed to open a few additional spots, and yet demand still far exceeded supply.

I soon started getting emails.  Their content was strikingly similar, like this one from Kathy Hardie-Williams, an MFT from Forest Grove, Oregon: “I’m interested in deliberate practice!  Are there other materials, measures, tools that I can access and start using in my practice?”

The answer is, “YES!”  Here they are:


Resource #1: The Cycle of Excellence

Written for practicing therapists, supervisors, and supervisees, this volume brings together leading researchers and supervisors to teach practical methods for using deliberate practice to improve the effectiveness of psychotherapy.

Its twelve chapters are split into four sections covering: (1) the science of expertise and professional development; (2) practical, evidence-based methods for tracking individual performance; (3) step-by-step applications for integrating deliberate practice into clinical practice and supervision; and (4) recommendations for making psychotherapist expertise development routine and expected.

“This book offers a challenge and a roadmap for addressing a fundamental issue in mental health: How can therapists improve and become experts?  Our goal,” the editors of this new volume state, “is to bring the science of expertise to the field of mental health.  We do this by proposing a model for using the ‘Cycle of Excellence’ throughout therapists’ careers, from supervised training to independent practice.”

The book is due out June 1st.  Order today by clicking here: The Cycle of Excellence: Using Deliberate Practice to Improve Supervision and Training

Resource #2: The MyOutcomes E-Learning Platform

The folks at MyOutcomes have just added a new module on deliberate practice to their already extensive e-learning platform.  The information is cutting edge, and the production values are simply fantastic.  More, MyOutcomes is offering free access to the system to the first 25 people who email support@myoutcomes.com.  Put the words “Responding to Scott’s Blogpost” in the subject line.  Meanwhile, here’s a taste of the course:

Resource #3: The FIT Professional Development Intensive

Last but not least, the FIT Professional Development Intensive.  There simply is no better way to learn about deliberate practice than to attend the upcoming intensive in Chicago.  It’s the only such training available.  Together with my colleague Tony Rousmaniere, author of the new book Deliberate Practice for Psychotherapists: A Guide to Improving Clinical Effectiveness, we will help you develop an individualized plan for improving your effectiveness based on the latest scientific evidence on expert performance.

We’ve got a few spaces left.  Those already registered are coming from spots all around the globe, so you’ll be in good company.  Click here to register today!

OK, that’s it for now.  Wishing you all the best for the Summer,

Scott D. Miller, Ph.D.

 

Filed Under: Behavioral Health, deliberate practice, evidence-based practice, excellence, Feedback, Feedback Informed Treatment - FIT, Practice Based Evidence

The Asch Effect: The Impact of Conformity, Rebelliousness, and Ignorance in Research on Psychology and Psychotherapy

December 3, 2016 By scottdm 5 Comments

[Photo: Solomon Asch (right) with participants in his 1951 conformity experiment at Swarthmore College]
Consider the photo above.  If you ever took Psych 101, it should be familiar.  The year is 1951.  The balding man on the right is psychologist Solomon Asch.  Gathered around the table are a bunch of undergraduates at Swarthmore College participating in a vision test.

Briefly, the procedure began with a cardboard printout displaying three lines of varying length.  A second card containing a single line was then produced, and participants were asked to state out loud which of the three lines it best matched.  Try it for yourself:
[Figure: the Asch line-judgment card]
Well, if you guessed “C,” you would have been the only one to do so, as all the other participants taking the test on that day chose “B.”  As you may recall, Asch was not really assessing vision.  He was investigating conformity.  All the participants save one were in on the experiment, instructed to choose an obviously incorrect answer in twelve out of eighteen total trials.

The results?

On average, a third of the people in the experiment went along with the majority, with seventy-five percent conforming in at least one trial.

Today, practitioners face similar pressures—to go along with the assertion that some treatment approaches are more effective than others.

Regulatory bodies, including the Substance Abuse and Mental Health Services Administration in the United States and the National Institute for Health and Care Excellence in the United Kingdom, are actually restricting services and limiting funding to approaches deemed “evidence based.”  The impact on publicly funded mental health and substance abuse treatment is massive.

So, in the spirit of Solomon Asch, consider the lines below: which treatment is most effective?

[Figure: outcomes of competing therapeutic approaches, shown as lines of nearly equal length]
If your eyes tell you that the outcomes between competing therapeutic approaches appear similar, you are right.  Indeed, one of the most robust findings in the research literature over the last 40 years is the lack of difference in outcome between psychotherapeutic approaches.

The key to changing matters is speaking up!  In the original Asch experiments, for example, the addition of even one dissenting vote reduced conformity by 80%!  And no, you don’t have to be a researcher to have an impact.  On this score, when, in a later study, a single dissenter wearing thick glasses (strongly suggestive of poor visual acuity) was added to the group, the likelihood of going along with the crowd was cut in half.

That said, knowing and understanding science does help.  In the 1980s, two researchers found that engineering, mathematics, and chemistry students conformed with the errant majority in only 1 out of 396 trials!

What does the research actually say about the effectiveness of competing treatment approaches?

You can find the most current review of the research in the latest issue of Psychotherapy Research, the premier outlet for studies about psychotherapy.  It’s just out, and I’m pleased and honored to have been part of a dedicated and esteemed group of scientists who are speaking up.  In it, we review and redo several recent meta-analyses purporting to show that one particular method is more effective than all others.  Can you guess which one?

The stakes are high, the consequences serious.  Existing guidelines and lists of approved therapies do not accurately represent existing research about “what works” in treatment.  More, as I’ve blogged about before, they limit choice and effectiveness without improving outcomes and, in certain cases, lead to poorer results.  As official definitions make clear, “evidence-based practice” is NOT about applying particular approaches to specific diagnoses, but rather “integrating the best available research with clinical expertise in the context of patient characteristics, culture, and preferences” (p. 273, APA, 2006).

Read it and speak up!

Until next time,

Scott

Scott D. Miller, Ph.D.
Director, International Center for Clinical Excellence

Filed Under: Dodo Verdict, evidence-based practice, Feedback Informed Treatment - FIT, Practice Based Evidence
