Scott D. Miller - For the latest and greatest information on Feedback Informed Treatment


Three Common Misunderstandings about Deliberate Practice for Therapists

April 13, 2021 By scottdm

Deliberate Practice is hot.  Judging from the rising number of research studies, workshops, and social media posts, it’s hard to believe the term did not appear in the psychotherapy literature until 2007.

The interest is understandable.  Among the various approaches to professional development — supervision, continuing education, personal therapy — the evidence shows deliberate practice is the only one to result in improved effectiveness at the individual therapist level.

Devoting time to rehearsing what one wants to improve is hardly a novel idea.  Any parent knows it to be true and has said as much to their kids.  Truth is, references to enhancing one’s skills and abilities through focused effort date back more than two millennia.   And here is where confusion and misunderstanding begin.

  • Clinical practice is not deliberate practice.  If doing therapy with clients on a daily basis were the same as engaging in deliberate practice, therapists would improve in effectiveness over the course of their careers.  Research shows they do not.  Instead, confidence improves.  Let that sink in.  Outcomes remain flat but confidence in our abilities continuously increases.  It’s a phenomenon researchers term “automaticity” — the feeling most of us associate with having “learned” to do something, where actions are carried out without much conscious effort.  One could go so far as to say clinical practice is incompatible with deliberate practice, as the latter, to be effective, must force us to question what we do without thinking.
  • Deliberate practice is not a special set of techniques.  The field of psychotherapy has a long history of selling formulaic approaches. Gift-wrapped in books, manuals, workshops, and webinars, the promise is: do this — whatever the “this” is — and you will be more effective.  Decades of research have shown these claims to be empty.  By contrast, deliberate practice is not a formula to be followed, but a form.  As such, the particulars will vary from person to person depending on what each needs to learn.  Bottom line: beware pre-packaged content.
  • Deliberate practice is not a matter of mastering specific treatment models or techniques.  Consider a recent study out of the United Kingdom (1).  There, like elsewhere, massive amounts of money have been spent training clinicians to use cognitive behavioral therapy (CBT).  The expenditure is part of a well-intentioned government program aimed at improving access to effective mental health services (2).  Anyway, in the study, clinicians participated in a high-intensity course that included more than 300 hours of training, supervision, and practice.  Competence in delivering CBT was assessed at regular intervals and shown to improve significantly throughout the training.  That said, despite the time, money, and resources devoted to mastering the approach, clinician effectiveness did not improve.  Why?  Contrary to common belief, competence in delivering specific treatment protocols contributes a negligible amount to the outcome of psychotherapy.  As commonsensical as it likely sounds, to have an impact, whatever we practice must target factors that have leverage on outcome.

My colleague Daryl Chow and I have developed a tool to help practitioners create an effective deliberate practice plan.  Known as the “Taxonomy of Deliberate Practice Activities” (TDPA), it helps you identify aspects of your clinical performance likely to have the most impact on improving your effectiveness.  Step-by-step instructions walk you through the process of assessing your work, setting small, individualized learning objectives, developing practice activities, and monitoring your progress.  As coaching is central to effective deliberate practice, a version of the tool is available for your supervisor or coach to complete.  Did I mention it’s free?  Click here to download the TDPA, contained in the same packet as the Outcome and Session Rating Scales.  While you are at it, join our private, online discussion group where hundreds of clinicians around the world meet, support one another, and share experiences and ideas.

Filed Under: Feedback Informed Treatment - FIT

Feedback Informed Treatment in Statutory Services (Child Protection, Court Mandated)

March 17, 2021 By scottdm

“We don’t do ‘treatment,’ can we use FIT?”

It’s a question that comes up with increasing frequency as use of the Outcome and Session Rating Scales in the helping professions spreads around the globe and across diverse service settings.

When I answer an unequivocal, “yes,” the asker often responds as though I’d not heard what they said.

Speaking slowly and enunciating, they repeat, “But Scott, we don’t do ‘t r e a t m e n t.’”  Invariably, they then clarify: “We do child protection,” or “We’re not therapists, we are case managers,” or they name any of a large number of supportive, criminal justice, or other statutory social services.

How “treatment” became synonymous with psychotherapy (and other medical procedures) is a mystery to me.   The word, as Merriam-Webster defines it, is merely the way we conduct ourselves — our specific manner, actions and behaviors — towards others.

With this definition in mind, working “feedback-informed” simply means interacting with people as though their experience of the service is both primary and consequential.  The challenge, I suppose, is how to do this when lives may be at risk (e.g., child protection, probation and parole), or when rules and regulations prescribe (or proscribe) provider and agency actions irrespective of how service users feel or what they prefer.

Over the last decade, many governmental and non-governmental organizations have succeeded in making statutory services feedback-informed — and the results are impressive.  For recipients, more engagement and better outcomes.  For providers, less burnout, lower job turnover, and fewer sick days.

I had the opportunity to speak with the members and managers of one social service agency — Gladsaxe Kommune in Denmark — this past week.  They described the ups, downs, and challenges they faced — including retraining staff and seeking variances to existing laws from authorities — while working to transform agency practice and culture.  If you work in this sector, I know you’ll find their experience both inspiring and practical.  You can find the video below.  Another governmental agency has created a step-by-step guide (in English) for implementing feedback informed treatment (FIT) in statutory service settings.  It’s amazingly detailed and comprehensive.  It’s also free.  To access, click here.

The CliffsNotes version of the results of implementing FIT in statutory services?

  • 50% fewer kids placed outside the home
  • 100% decrease in complaints filed by families against social service agencies and staff
  • 100% decrease in staff turnover and sick days

OK, that’s it for now.  Please leave a comment.  If you or your agency is considering implementing FIT, please join us for the two-day intensive training in August.  This time around, you can participate without leaving home, as the entire workshop will be held online.  For more information, click on the icon below.

Filed Under: Feedback Informed Treatment - FIT

Do We Learn from Our Clients? Yes, No, Maybe So …

March 2, 2021 By scottdm

When it comes to professional development, we therapists are remarkably consistent in our opinions about what matters.  Regardless of experience level, theoretical preference, professional discipline, or gender identity, large, longitudinal studies show “learning from clients” is considered the most important and influential contributor (1, 2).  Said another way, we believe clinical experience leads to better, increasingly effective performance in the consulting room.

As difficult as it may be to accept, the evidence shows we are wrong.  Confidence, proficiency, even knowledge about clinical practice, may improve with time and experience, but not our outcomes.  Indeed, the largest study ever published on the topic — 6,500 clients treated by 170 practitioners whose results were tracked for up to 17 years — found the longer therapists were “in practice,” the less effective they became (3)!  Importantly, this result remained unchanged even after researchers controlled for several patient, caseload, and therapist-level characteristics known to have an impact on effectiveness.

Only two interpretations are possible, neither of them particularly reassuring.  Either we are not learning from our clients, or what we claim to be learning doesn’t improve our ability to help them.  Just to be clear, the problem is not a lack of will.   Therapists, research shows, devote considerable time, effort, and resources to professional development efforts (4).  Rather, it appears the way we’ve approached the subject is suspect.

Consider the following provocative, but evidence-based idea.  Most of the time, there simply is nothing to learn from a particular client about how to improve our craft.  Why?  Because so much of what affects the outcome of individual clients at any given moment in care is random — that is, either outside of our direct control or not part of a recurring pattern of therapist errors.  Extratherapeutic factors, as such influences are termed, contribute a whopping 87% to the outcome of treatment (5, 6).  Let that sink in.

The temptation to draw connections between our actions and particular therapeutic results is both strong and understandable.  We want to improve.  To that end, the first step we take — just as we counsel clients — is to examine our own thoughts and actions in an attempt to extract lessons for the future.  That’s fine, unless no causal connection exists between what we think and do, and the outcomes that follow … then, we might as well add “rubbing a rabbit’s foot” to our professional development plans.

So, what can we do?  Once more, the answer is as provocative as it is evidence-based.  Recognizing the large role randomness plays in the outcome of clinical work, therapists can achieve better results by improving their ability to respond in-the-moment to the individual and their unique and unpredictable set of circumstances.  Indeed, uber-researchers Stiles and Horvath note that research indicates, “Certain therapists are more effective than others … because [they are] appropriately responsive … providing each client with a different, individually tailored treatment” (7, p. 71).

What does improving responsiveness look like in real-world clinical practice?  In a word, “feedback.”  A clever study by Jeb Brown and Chris Cazauvielh found, for example, that average therapists who were more engaged with the feedback their clients provided — as measured by the number of times they logged into a computerized data gathering program to view their results — in time became more effective than their less engaged peers (8).  How much more effective, you ask?  Close to 30% — not a bad “return on investment” for asking clients to answer a handful of simple questions and then responding to the information they provide!

If you haven’t already done so, click here to access and begin using two free, standardized tools for gathering feedback from clients.  Next, join our free, online community to get the support and inspiration you need to act effectively and creatively on the feedback your clients provide — hundreds and hundreds of dedicated therapists working in diverse settings around the world support each other daily on the forum and are available regardless of time zone.

And here’s a bonus.  Collecting feedback, in time, provides the very data therapists need to be able to sort the random from the non-random in their clinical work, to reliably identify when they need to respond and when a true opportunity for learning exists.  Have you heard or read anything about “deliberate practice”?  Since we first introduced the term to the field in our 2007 article, Supershrinks, it has become a hot topic among researchers and trainers.  If you haven’t yet, chances are you will soon be seeing books and videos offering to teach you how to use deliberate practice to master any number of treatment methods.  The promise, of course, is better outcomes.  Critically, however, if training is not targeted directly at patterns of action or inaction that reliably impact the effectiveness of your individual clinical performance in negative ways, such efforts will, like clinical experience in general, make little difference.

If you are already using standardized tools to gather feedback from clients, you might be interested in joining me and my colleague Dr. Daryl Chow for an upcoming, web-based workshop.  In weekly, bite-sized installments, we’ll not only help you use your data to identify your specific learning edge, but also work with you to develop an individualized deliberate practice plan.  You go at your own pace, as the course and all training materials are available to you forever.  Interested?  Click here to read more or sign up.

Filed Under: Behavioral Health, deliberate practice, evidence-based practice, Feedback Informed Treatment - FIT, FIT
