SCOTT D Miller - For the latest and greatest information on Feedback Informed Treatment


Improving Individual Therapist Effectiveness

January 1, 2017 By scottdm 23 Comments

Take a look at the figure to the right.

[Figure: individual therapist outcomes declining over time]

It’s data taken from the largest study conducted in the history of psychotherapy research examining the relationship between experience and effectiveness.

Each of the smaller lines represents the outcomes of an individual practitioner followed, in some cases, over a 17-year period.  The single, thicker dashed line plots the average across all 170 practitioners, a group that measured the results of their work at every visit, with every client.

The data confirm what a host of correlational studies have hinted at since the mid-1980s: in general, clinician effectiveness declines with time and experience.

It’s not a steep slope, to be sure.  It is slow and gradual, like a leak in a bicycle tire.  Most problematic, other research shows, the decline is imperceptible to clinicians themselves.  Indeed, as experience in the field grows, clinician confidence increases, leading most to see themselves as more effective than their results actually indicate.

Now, consider the second figure.  Once again, outcomes of individual practitioners are plotted.

[Figure: individual therapist outcomes improving over time]

Here, however, the slope is positive.  In other words, therapists are becoming more and more effective over time.  As before, the change is slow and gradual.  Said another way, there are no shortcuts to improved outcomes.  Slow and steady wins the race.  More, unlike the prior results, therapists are both aware of their growth and, curiously, less confident about the effectiveness of their work.

Importantly, the data from the latter are not drawn from the application of some hypothetical, yet-to-be-hoped-for treatment model or training scenario.  Indeed, they are from the only published study to date documenting the factors that actually influence the development of individual therapist effectiveness.  When these factors were employed purposefully and mindfully, clinicians improved at a rate three times greater than the rate of decline documented in the prior study.

What are these factors?

As simple as it sounds: (1) measure your results; and (2) focus on your mistakes.

There is no way around this basic fact: any effort aimed at improving effectiveness begins with a valid and reliable estimate of one’s current outcomes.  Without that, there is no way of knowing whether progress is being made.

On the subject of mistakes, several studies now confirm that “healthy self-criticism,” or professional self-doubt (PSD), is a strong predictor of both alliance and outcome in psychotherapy (Nissen-Lie et al., 2015).  Not surprisingly, therapist effectiveness improves in practice environments that provide: (1) ample opportunity and a safe place for discussing cases that are not making progress (or deteriorating); and (2) concrete suggestions for improvement that are tailored to the individual therapist.

Here are a couple of evidence-based resources you can tap in your efforts to improve your effectiveness in 2017:

  • Begin measuring your results using two simple scales that have been tested in diverse settings and with a wide range of treatment populations.  In 2013, they were approved by the Substance Abuse and Mental Health Services Administration (SAMHSA) and listed on the National Registry of Evidence-based Programs and Practices.  Get them by registering for a free license here.
  • If you work alone or need an error-friendly practice community to discuss your work, join the International Center for Clinical Excellence.  It’s our free, online community.  There, you can link up with other like-minded clinicians, share your work, discuss your outcomes, watch “how-to” videos, and help and be helped by practitioners around the world.

Filed Under: Feedback Informed Treatment - FIT

The Asch Effect: The Impact of Conformity, Rebelliousness, and Ignorance in Research on Psychology and Psychotherapy

December 3, 2016 By scottdm 5 Comments

[Photo: Solomon Asch’s conformity experiment, 1951]

Consider the photo above.  If you ever took Psych 101, it should be familiar.  The year is 1951.  The balding man on the right is psychologist Solomon Asch.  Gathered around the table are a bunch of undergraduates at Swarthmore College participating in a vision test.

Briefly, the procedure began with a cardboard printout displaying three lines of varying length.  A second card containing a single line was then produced, and participants were asked to state aloud which of the three it best matched.  Try it for yourself:

[Figure: the Asch line-judgment cards]
Well, if you guessed “C,” you would have been the only one to do so, as all the other participants taking the test on that day chose “B.”  As you may recall, Asch was not really assessing vision.  He was investigating conformity.  All the participants save one were in on the experiment, instructed to choose an obviously incorrect answer in twelve out of eighteen total trials.

The results?

On average, a third of the people in the experiment went along with the majority, with seventy-five percent conforming in at least one trial.

Today, practitioners face similar pressures—to go along with the assertion that some treatment approaches are more effective than others.

Regulatory bodies, including the Substance Abuse and Mental Health Services Administration in the United States and the National Institute for Health and Care Excellence in the United Kingdom, are actually restricting services and limiting funding to approaches deemed “evidence based.”  The impact on publicly funded mental health and substance abuse treatment is massive.

So, in the spirit of Solomon Asch, consider the lines below and indicate which treatment is most effective:

[Figure: outcome “lines” for competing treatment approaches]
If your eyes tell you that the outcomes between competing therapeutic approaches appear similar, you are right.  Indeed, one of the most robust findings in the research literature over the last 40 years is the lack of difference in outcome between psychotherapeutic approaches.

The key to changing matters is speaking up!  In the original Asch experiments, for example, the addition of even one dissenting voice reduced conformity by 80%!  And no, you don’t have to be a researcher to have an impact.  On this score, when a single dissenter wearing thick glasses (strongly suggestive of poor visual acuity) was added to the group in a later study, the likelihood of going along with the crowd was cut in half.

That said, knowing and understanding science does help.  In the 1980s, two researchers found that engineering, mathematics, and chemistry students conformed with the errant majority in only 1 out of 396 trials!

What does the research actually say about the effectiveness of competing treatment approaches?

You can find the answer in the most current review of the research in Psychotherapy Research, the premier outlet for studies about psychotherapy.  I’m pleased and honored to have been part of a dedicated and esteemed group of scientists who are speaking up.  In it, we review and redo several recent meta-analyses purporting to show that one particular method is more effective than all others.  Can you guess which one?

The stakes are high; the consequences, serious.  Existing guidelines and lists of approved therapies do not correctly represent existing research about “what works” in treatment.  More, as I’ve blogged about before, they limit choice without improving outcomes, and in certain cases lead to poorer results.  As official definitions make clear, “evidence-based practice” is NOT about applying particular approaches to specific diagnoses, but rather about “integrating the best available research with clinical expertise in the context of patient characteristics, culture, and preferences” (p. 273, APA, 2006).

Filed Under: Dodo Verdict, evidence-based practice, Feedback Informed Treatment - FIT, Practice Based Evidence

The Replication Crisis in Psychology: What is and is NOT being talked about

November 7, 2016 By scottdm 9 Comments

Psychology has been in the headlines a fair bit of late—and the news is not positive.  I blogged about this last year, when a study appeared documenting that the effectiveness of CBT had declined by 50% over the last four decades.

The problem is serious.  Between 2012 and 2014, for example, a team of researchers working together in their free time tried to replicate 100 published psychology experiments and succeeded only a third of the time!  As one might expect, such findings sent shock waves through academia.

Now, this week, The British Psychological Society’s Research Digest piled on, reviewing 10 “famous” findings that researchers have been unable to replicate—despite the popularity and common sense appeal of each.  Among others, these include:

  • Power posing does not make you more powerful;
  • Smiling does not make you happier;
  • Exposing you to words (known as “priming”) related to ageing does not cause you to walk like an old person;
  • Having a mental image of a college professor in mind does not make you perform more intelligently (another priming study);
  • Being primed to think of money will not make you act more selfishly; and
  • Despite being reported in nearly every basic psychology text, babies are not born with the power to imitate.

Clearly, replication is a problem.  The bottom line?  Much of psychology’s evidence-base is built on a foundation of sand.

Amidst all the controversy, I couldn’t help thinking of psychotherapy.  In this area, I believe, the problem with the available research is not so much the failure to replicate, but rather an unwillingness to accept what has been replicated repeatedly.  Contrary to hope and popular belief, one of the most, if not the most, replicated findings is the lack of difference in outcome between psychotherapeutic approaches.

It’s not for lack of trying.  Massive amounts of time and resources have been spent comparing treatment methods.  With few exceptions, either no or inconsequential differences are found.

Consider, for example, that the U.S. government spent $33 million studying different approaches to problem drinking only to find what we already knew: all worked equally well.  A decade later, British officials spent millions of pounds on the same subject, with similar results.

Just this week, a study was released comparing the hugely popular method called DBT to usual care in the treatment of “high risk suicidal veterans.”   Need I tell you what they found?

As the Groundhog-Day-like quest continues, another often replicated finding is ignored.  One of the best predictors of the outcome of psychotherapy is the quality of the therapeutic relationship between the provider and recipient of care.  That was one of the chief findings, for example, in both of the studies on alcohol treatment cited above (1, 2).  Put simply, better relationship = improved engagement and effectiveness.

Sadly, but not surprisingly, research, writing, and educational opportunities focused on the alliance lag far behind those devoted to models and techniques.  Consider this: slightly more than 55,000 books are in print on the latter subject, compared to a paltry 193 on the former.  It’s mind-boggling, really.  How could one of the most robust and replicated findings in psychotherapy be so widely ignored?

My colleague Daryl Chow is working hard to get beyond the “lip service” frequently paid to the therapeutic relationship.  In an ongoing series of studies, he is helping clinicians improve their ability to engage, retain, and help people in psychotherapy by targeting training to the individual practitioner’s strengths and weaknesses.  Not surprisingly, the results show slow and steady improvement in connecting with a broader, more diverse, and more challenging range of clinical scenarios!

Filed Under: Conferences and Training, deliberate practice, Dodo Verdict, Therapeutic Relationship

