SCOTT D Miller - For the latest and greatest information on Feedback Informed Treatment


Is Professional Training a Waste of Time?

March 18, 2010 By scottdm 6 Comments

Every year, thousands of students graduate from professional programs with degrees enabling them to work in the field of behavioral health. Many more who have already graduated and are working as social workers, psychologists, counselors, or marriage and family therapists attend—often by legal mandate—continuing education events. The costs of such training in terms of time and money are not insignificant.

Most graduates enter the professional world in significant debt, taking years to pay back student loans and recoup income that was lost during the years they were out of the job market attending school. Continuing professional education is also costly for agencies and individuals in practice, having to arrange time off from work and pay for training.

To most, the need for training seems self-evident. And yet, in the field of behavioral health the evidence is at best discouraging. While traveling in New Zealand this week, my long-time colleague and friend, Dr. Bob Bertolino, forwarded an article on the subject appearing in the latest issue of the Journal of Counseling and Development (volume 88, number 2, pages 204-209). In it, researchers Nyman and Nafziger reported results of their study on the relationship between therapist effectiveness and level of training.

First, the good news: “clients who obtained services…experienced moderate symptom relief over the course of six sessions.” Now the bad news: it didn’t matter if the client was “seen by a licensed doctoral-level counselor, a pre-doctoral intern, or a practicum student” (p. 206, emphasis added). The authors conclude, “It may be that researchers are loathe to face the possibility that the extensive efforts involved in educating graduate students to become licensed professionals result in no observable differences in client outcome” (p. 208, emphasis added).

In case you were wondering, such findings are not an anomaly.  Not long ago, Atkins and Christensen (2001) reviewed the available evidence in an article published in the Australian Psychologist and concluded much the same (volume 36, pages 122-130); to wit, professional training has little if any impact on outcome.  As for continuing professional education, you know if you’ve been reading my blog that there is not a single supportive study in the literature.

“How,” you may wonder, “could this be?” The answer is: content and methods. First of all, training at both the graduate and professional level continues to focus on the weakest link in the outcome chain—that is, model and technique. Recall, available evidence indicates that the approach used accounts for 1% or less of the variance in treatment outcome (see Wampold’s chapter in the latest edition of The Heart and Soul of Change). As just one example, consider workshops being conducted around the United States using precious resources to train clinicians in the methods studied in the “Cannabis Youth Treatment” (CYT) project—a study which found that the treatment methods used contributed zero to the variance in treatment outcome. Let me just say, where I come from zero is really close to nothing!

Second, and even more important, traditional methods of training (i.e., classroom lecture, reading, attending conferences) simply do not work. And sadly, behavioral health is one of the few professions that continue to rely on such outdated and ineffective training methods.

The literature on expertise and expert performance provides clear, compelling, and evidence-based guidelines about the qualities of effective training. I’ve highlighted such data in a number of recent blogposts. The information has already had a profound impact on the way the ICCE organizes and conducts trainings. Thanks to Cynthia Maeschalck, Rob Axsen, and Bob, the entire curriculum and methods used for the annual “Training of Trainers” event have been revamped. Suffice it to say, agencies and individuals who invest precious time and resources attending the training will not only learn but be able to document the impact of the training on performance. More later.

Filed Under: Top Performance Tagged With: behavioral health, Carl Rogers, cdoi, continuing professional education, healthcare, holland, icce, Journal of Counseling and Development, psychometrics

Excellence on a Shoestring: The “Home for Good” Program

March 17, 2010 By scottdm Leave a Comment

Today I’m teaching in Christchurch, New Zealand. For the last two days, I’ve been in Nelson, a picturesque coastal town opposite Abel Tasman, working with the local DHB (District Health Board). If you’ve never visited, make a point of adding the country to your list of top travel destinations. The landscape and the people are second to none. (In Nelson, be sure and visit The Swedish Bakery. My 8-year old son, Michael, unequivocally states it has the best hot chocolate in the world—and, believe me, he’s an expert).

I’ve been traveling to New Zealand at least once a year for the last several years to provide training on using outcomes to inform behavioral healthcare. Interest is keen and providers and managers are working hard to deliver top-notch services. However, like many other places around the globe, economic factors are taking a toll.   On the day I arrived, one of the lead stories in the local paper (The Nelson Mail) focused on the economic crisis in healthcare.   “Complaints about money, shortages, overwork, stress and unsympathetic management…in the always-stretched hospital service,” the story began, “[indicate] a rapidly worsening situation” (p. 5, News Extra). Today, the headline of an article in section A5 of The Press Christchurch warns, “Health Ministry staff brace for job losses.”

A little over two weeks ago, I was in Richmond, Virginia working with managers and providers of public behavioral health agencies. There too, economic problems loom large. Over the last two years, for example, agencies have had to absorb across-the-board, double-digit cuts in funding. The result, in many instances, has been layoffs and the elimination of services and programs—with a few prominent exceptions.

On March 5th, I blogged about the crew at Chesterfield CSB in Virginia who were serving 70% more people than they did in 2007, despite no increase in available staff resources in the intervening period, while at the same time decreasing clinician caseloads by nearly 30%. In January, I posted text and video about agencies in Ohio that had managed to improve outcome, retention, and productivity at the same time that cutbacks had forced the furlough of staff! The common denominator in both instances is outcomes; that is, measuring the “fit and effect” of treatment on an ongoing basis and then using the data in consultation with consumers to improve service delivery.

If you’re not yet convinced, I have one more example to add to the mix: the “Home for Good” program. Vision, commitment, and drive are words that best capture the management and staff who work at this Richmond, Virginia-based in-home behavioral health services program. Some might question the wisdom of starting a private, primarily Medicaid-funded treatment program in the worst economic climate since the Great Depression. A commitment to helping families keep their children at home—preventing placement in residential treatment centers, foster care, and detention—is what drove founder and director Kathy Levenston to take up the challenge. The key to their success, says Kathy, is that “we take responsibility for the results.” As in Ohio and Chesterfield, Kathy and her crew routinely monitor the alliance and results of the work they do and then use the data to enhance retention and outcome. Listen to Kathy as she describes the “Home for Good” program. I’m sure her story will inspire you to push for excellence whatever the “shoestring” budget you may be surviving on at the moment.

Filed Under: Behavioral Health, Top Performance Tagged With: cdoi, Home for Good, New Zealand

New Year’s Resolutions: Progress Report and Future Plans

January 1, 2010 By scottdm Leave a Comment

One year ago today, I blogged about my New Year’s resolution to “take up the study of expertise and expert performance.”  The promise marked a significant departure from my work up to that point in time and was not without controversy:

“Was I no longer interested in psychotherapy?”

“Had I given up on the common factors?”

“What about the ORS and SRS?”

“Was I abandoning the field to pursue magic as a profession?”

Seriously.

The answer to all of the questions was, of course, an emphatic “NO!” At the same time, I recognized that I’d reached an empirical precipice—or, stated more accurately, a dead end. The common factors, while explaining why therapy works, did not and could never tell us how to work. And while seeking and obtaining ongoing feedback (via the ORS and SRS) had proven successful in boosting treatment outcomes, there was no evidence that the practice had a lasting impact on the professionals providing the service.

Understanding how to improve my performance as a clinician has, as is true of many therapists, been a goal and passion from the earliest days of my career.  The vast literature on expertise and expert performance appeared to provide the answers I’d long sought.   In fields as diverse as music and medicine, researchers had identified specific principles and methods associated with superior performance.  On January 2nd, 2009, I vowed to apply what I was learning to, “a subject I know nothing about…put[ting] into practice the insights gleaned from the study of expertise and expert performance.”

The subject? Magic (and the ukulele).

How have I done? Definitely better than average, I can say. In a column written by Barbara Brotman in today’s Chicago Tribune, psychologist Janine Gauthier notes that while 45% of people make New Year’s resolutions, only 8% actually keep them! I’m a solid 50%. I am still studying and learning magic—as attendees at the 2009 “Training of Trainers” and my other workshops can testify. The uke is another story, however. To paraphrase 1988 Democratic vice-presidential candidate Lloyd Bentsen, “I know great ukulele players, and Scott, you are no Jake Shimabukuro.”

I first saw Jake Shimabukuro play the ukulele at a concert in Hawaii.  I was in the islands working with behavioral health professionals in the military (Watch the video below and tell me if it doesn’t sound like more than one instrument is playing even though Jake is the only one pictured).

Interestingly, the reasons for my success with one and failure with the other are as simple and straightforward as the principles and practices that researchers say account for superior (and inferior) performance. I promise to lay out these findings, along with my experiences, over the next several weeks. If you are about to make a New Year’s resolution, let me give you step numero uno: make sure your goal/resolution is realistic. I know, I know…how mundane. And yet, while I’ve lectured extensively about the relationship between goal-setting and successful psychotherapy for over 15 years, my reading about expert performance, combined with my attempts to master two novel skills, has made me aware of aspects I never knew about or considered before.

Anyway, stay tuned for more.  In the meantime, just for fun, take a look at the video below from master magician Bill Malone.  The effect he is performing is called, “Sam the Bellhop.”  I’ve been practicing this routine since early summer, using what I’ve learned from my study of the literature on expertise to master the effect (Ask me to perform it for you on break if you happen to be in attendance at one of my upcoming workshops).

Filed Under: Behavioral Health, deliberate practice, excellence, Top Performance Tagged With: Alliance, cdoi, ors, outcome rating scale, psychotherapy, session rating scale, srs, Therapist Effects, training of trainers

The Study of Excellence: A Radically New Approach to Understanding "What Works" in Behavioral Health

December 24, 2009 By scottdm 2 Comments

“What works” in therapy? Believe it or not, that question—as simple as it is—has sparked and continues to spark considerable debate. For decades, the field has been divided. On one side are those who argue that the efficacy of psychological treatments is due to specific factors (e.g., changing negative thinking patterns) inherent in the model of treatment (e.g., cognitive behavioral therapy) remedial to the problem being treated (i.e., depression); on the other is a smaller but no less committed group of researchers and writers who posit that the general efficacy of behavioral treatments is due to a group of factors common to all approaches (e.g., relationship, hope, expectancy, client factors).

While the overall effectiveness of psychological treatment is now well established—studies show that people who receive care are better off than 80% of those who do not, regardless of the approach or the problem treated—one fact cannot be avoided: outcomes have not improved appreciably over the last 30 years! Said another way, the common versus specific factor battle, while generating a great deal of heat, has not shed much light on how to improve the outcome of behavioral health services. Despite the incessant talk about and promotion of “evidence-based” practice, there is no evidence that adopting “specific methods for specific disorders” improves outcome. At the same time, as I’ve pointed out in prior blogposts, the common factors, while accounting for why psychological therapies work, do not and cannot tell us how to work. After all, if the effectiveness of the various and competing treatment approaches is due to a shared set of common factors, and yet all models work equally well, why learn about the common factors? More to the point, there simply is no evidence that adopting a “common factors” approach leads to better performance.
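The “better off than 80%” figure is the standard percentile translation of an effect size: under a normal-distribution model, a Cohen’s d of roughly 0.8 (a value commonly cited for psychotherapy, though the post does not state it explicitly) places the average treated client above about 79% of untreated controls. A minimal sketch of that conversion, using only the standard library:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Assumed effect size: Cohen's d of 0.8 for treatment vs. no treatment.
# The average treated client then outscores this fraction of controls:
d = 0.8
print(round(phi(d), 3))  # → 0.788, i.e., roughly the "80%" cited
```

This is why the same “80%” summary can hold across approaches and problems: it describes the size of the average treatment effect, not anything about a particular model.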

The problem with the specific and common factor positions is that both—and hang onto your seat here—have the same objective at heart; namely, contextlessness. Each hopes to identify a set of principles and/or practices that are applicable across people, places, and situations. Thus, specific factor proponents argue that particular “evidence-based” (EBP) approaches are applicable for a given problem regardless of the people or places involved (it’s amazing, really, when you consider that various approaches are being marketed to different countries and cultures as “evidence-based” when there is no evidence that these methods work beyond their very limited and unrepresentative samples). On the other hand, the common factors camp, in place of techniques, proffers an invariant set of, well, generic factors. Little wonder that outcomes have stagnated. It’s a bit like trying to learn a language either by memorizing a phrase book—in the case of EBP—or studying the parts of speech—in the case of the common factors.

What to do? For me, clues for resolving the impasse began to appear when, in 1994, I followed the advice of my friend and long-time mentor, Lynn Johnson, and began formally and routinely monitoring the outcome and alliance of the clinical work I was doing. Crucially, feedback provided a way to contextualize therapeutic services—to fit the work to the people and places involved—that neither a specific nor a common factors-informed approach could.

Numerous studies (21 RCTs, including 4 studies using the ORS and SRS) now document the impact of using outcome and alliance feedback to inform service delivery. One study, for example, showed a 65% improvement over baseline performance rates with the addition of routine alliance and outcome feedback. Another, more recent study of couples therapy found that divorce/separation rates were half (50%) those of the no-feedback condition!

Such results have, not surprisingly, led the practice of “routine outcome monitoring” (PROMS) to be deemed “evidence-based.” At the recent Evolution of Psychotherapy conference I was on a panel with David Barlow, Ph.D.—a long-time proponent of “specific treatments for specific disorders” (EBP)—who, in response to my brief remarks about the benefits of feedback, stated unequivocally that all therapists would soon be required to measure and monitor the outcome of their clinical work. Given that my work has focused almost exclusively on seeking and using feedback for the last 15 years, you would think I’d be happy. And while gratifying on some level, I must admit to being both surprised and frightened by his pronouncement.

My fear? Focusing on measurement and feedback misses the point. Simply put: it’s not seeking feedback that is important. Rather, it’s what feedback potentially engenders in the user that is critical. Consider the following: while the results of trials to date clearly document the benefit of PROMS to those seeking therapy, there is currently no evidence that the practice has a lasting impact on those providing the service. “The question is,” as researcher Michael Lambert notes, “have therapists learned anything from having gotten feedback? Or, do the gains disappear when feedback disappears? About the same question. We found that there is little improvement from year to year…” (quoted in Miller et al. [2004]).

Research on expertise in a wide range of domains (including chess, medicine, physics, computer programming, and psychotherapy) indicates that in order to have a lasting effect, feedback must increase a performer’s “domain-specific knowledge.” Feedback must result in the performer knowing more about his or her area, and how and when to apply that knowledge to specific situations, than others. Master-level chess players, for example, have been shown to possess 10 to 100 times more chess knowledge than “club-level” players. Not surprisingly, master players’ vast information about the game is consolidated and organized differently than that of their less successful peers; namely, in a way that allows them to access, sort, and apply potential moves to the specific situation on the board. In other words, their immense knowledge is context specific.

A mere handful of studies document similar findings among superior-performing therapists: not only do they know more, they know how, when, and with whom to apply that knowledge. I noted these and highlighted a few others in the research pipeline during my workshop on “Achieving Clinical Excellence” at the Evolution of Psychotherapy conference. I also reviewed what 30 years of research on expertise and expert performance has taught us about how feedback must be used in order to ensure that learning actually takes place. Many of those in attendance stopped by the ICCE booth following the presentation to talk with our CEO, Brendan Madden, or one of our Associates and Trainers (see the video below).

Such research, I believe, holds the key to moving beyond the common versus specific factor stalemate that has long held the field in check–providing therapists with the means for developing, organizing, and contextualizing clinical knowledge in a manner that leads to real and lasting improvements in performance.

Filed Under: Behavioral Health, excellence, Feedback, Top Performance Tagged With: brendan madden, cdoi, cognitive behavioral therapy, common factors, continuing education, david barlow, evidence based medicine, evidence based practice, Evolution of Psychotherapy, feedback, icce, michael lambert, ors, outcome rating scale, proms, session rating scale, srs, therapist, therapists, therapy

Outcomes in Oz II

November 25, 2009 By scottdm 4 Comments

Sitting in my hotel room in Brisbane, Australia.  It’s beautiful here: white, sandy beaches and temperatures hovering around 80 degrees.  Can’t say that I’ll be enjoying the sunny weather much.  Tomorrow I’ll be speaking to a group of 135+ practitioners about “Supershrinks.”  I leave for home on Saturday.  While it’s cold and overcast in Chicago, I’m really looking forward to seeing my family after nearly two weeks on the road.

I spent the morning talking to practitioners in New Zealand via satellite for a conference sponsored by Te Pou.  It was a completely new and exciting experience for me, seated in an empty television studio and talking to a camera.  Anyway, organizers of the conference are determined to avoid mistakes made in the U.S., Europe, and elsewhere with the adoption of “evidence-based practice.”  As a result, they organized the event around the therapeutic alliance–the most neglected, yet evidence-based concept in the treatment literature!  More later, including a link to the hour-long presentation.

On Friday and Saturday of this last week, I was in the classic Victorian city of Melbourne, Australia doing two days’ worth of training at the request of WorkSafe and the Traffic Accident Commission. The mission of WorkSafe is, “Working with the community to deliver outstanding workplace safety, together with quality care and insurance protection to workers and employers.” 100+ clinicians dedicated to helping Australians recover from work and traffic-related injuries were present for the first day of training, which focused on using formal client feedback to improve retention and outcome of psychological services. On day 2, a smaller group met for an intensive day of training and consultation. Thanks go to the sponsors and attendees for an exciting two days. Learn more about how outcomes are being used to inform service delivery by watching the video below with Daniel Claire and Claire Amies from the Health Services Group.


Filed Under: Behavioral Health, Top Performance Tagged With: australia, evidence based medicine, evidence based practice, New Zealand, supershrinks

Where is Scott Miller going? The Continuing Evolution

November 16, 2009 By scottdm 2 Comments

I’ve just returned from a week in Denmark providing training for two important groups.  On Wednesday and Thursday, I worked with close to 100 mental health professionals presenting the latest information on “What Works” in Therapy at the Kulturkuset in downtown Copenhagen.  On Friday, I worked with a small group of select clinicians working on implementing feedback-informed treatment (FIT) in agencies around Denmark.  The day was organized by Toftemosegaard and held at the beautiful and comfortable Imperial Hotel.

In any event, while I was away, I received a letter from my colleague and friend, M. Duncan Stanton.  For many years, “Duke,” as he’s known, has been sending me press clippings and articles both helping me stay “up to date” and, on occasion, giving me a good laugh.  Enclosed in the envelope was the picture posted above, along with a post-it note asking me, “Are you going into a new business?!”

As readers of my blog know, while I’m not going into the hair-styling and spa business, there’s a grain of truth in Duke’s question.  My work is indeed evolving.  For most of the last decade, my writing, research, and training focused on factors common to all therapeutic approaches. The logic guiding these efforts was simple and straightforward. The proven effectiveness of psychotherapy, combined with the failure to find differences between competing approaches, meant that elements shared by all approaches accounted for the success of therapy (e.g., the therapeutic alliance, placebo/hope/expectancy, structure and techniques, extratherapeutic factors).  As first spelled out in Escape from Babel: Toward a Unifying Language for Psychotherapy Practice, the idea was that effectiveness could be enhanced by practitioners purposefully working to enhance the contribution of these pantheoretical ingredients.  Ultimately though, I realized the ideas my colleagues and I were proposing came dangerously close to a new model of therapy.  More importantly, there was (and is) no evidence that teaching clinicians a “common factors” perspective led to improved outcomes–which, by the way, had been my goal from the outset.

The measurable improvements in outcome and retention–following my introduction of the Outcome and Session Rating Scales to the work being done by me and my colleagues at the Institute for the Study of Therapeutic Change–provided the first clues to the coming evolution.  Something happened when formal feedback from consumers was provided to clinicians on an ongoing basis–something beyond either the common or specific factors–a process I believed held the potential for clarifying how therapists could improve their clinical knowledge and skills.  As I began exploring, I discovered an entire literature of which I’d previously been unaware; that is, the extensive research on experts and expert performance.  I wrote about our preliminary thoughts and findings together with my colleagues Mark Hubble and Barry Duncan in an article entitled, “Supershrinks” that appeared in the Psychotherapy Networker.

Since then, I’ve been fortunate to be joined by an internationally renowned group of researchers, educators, and clinicians in the formation of the International Center for Clinical Excellence (ICCE). Briefly, the ICCE is a web-based community where participants can connect, learn from, and share with each other. It has been specifically designed using the latest web 2.0 technology to help behavioral health practitioners reach their personal best. If you haven’t already done so, please visit the website at www.iccexcellence.com to register to become a member (it’s free and you’ll be notified the minute the entire site is live)!

As I’ve said before, I am very excited by this opportunity to interact with behavioral health professionals all over the world in this way.  Stay tuned, after months of hard work and testing by the dedicated trainers, associates, and “top performers” of ICCE, the site is nearly ready to launch.

Filed Under: excellence, Feedback Informed Treatment - FIT, Top Performance Tagged With: denmark, icce, Institute for the Study of Therapeutic Change, international center for clinical excellence, istc, mental health, ors, outcome rating scale, psychotherapy, psychotherapy networker, session rating scale, srs, supershrinks, therapy

On the Path of the Supershrinks: An Article by Bill Robinson

September 24, 2009 By scottdm 1 Comment

Not too long ago, my colleagues and I published some preliminary thoughts and findings from our research into “Supershrinks.”

That differences in effectiveness exist between clinicians is neither surprising or new.  Indeed, “therapist effects”–as they are referred to in the research literature–have been documented for decades and rival the contribution of factors long known to influence successful psychotherapy (e.g., the therapeutic alliance, hope and expectancy, etc.).  Personally, I believe that studying these super-effective clinicians will help practitioners improve the outcome of their clinical work.

Aside from research documenting the existence of “supershrinks,” and our own articles on the subject, little additional information exists documenting how superior performing clinicians achieve the results they do.

Enter Bill Robinson: manager, counselor, and senior supervisor with Relationships Australia, based in Mandurah, Western Australia. I’m also proud to say that Bill is one of a highly select group of clinicians who have completed the necessary training to be designated an ICCE Certified Trainer.

In any event, in the last issue of Psychotherapy in Australia–a treasure of a publication that every clinician dedicated to improving their work should subscribe to–Bill explores the topic of therapist effects, suggesting possible links between effectiveness and clinicians’ abilities to connect with the phenomenological worlds of the people they work with.  Trust me, this peer reviewed article is worth reading.  Don’t forget to post a comment, by the way, once you’ve finished!

Robinson from Scott Miller


Filed Under: Top Performance Tagged With: addiction, australia, brief therapy, conferences, ors, outcome rating scale, session rating scale, srs, supershrinks, therapeutic alliance

Expertise and Excellence: What it Takes to Improve Therapeutic Effectiveness

April 2, 2009 By scottdm 1 Comment

If you’ve been following my website and the Top Performance Blog you know that my professional interests over the last couple of years have been shifting, away from psychotherapy, the common factors, and feedback, and toward the study of expertise and excellence.

Studying this literature (click here for an interesting summary), makes clear that the factors responsible for superior performance are the same regardless of the specific endeavor one sets out to master. The chief principle will come as no surprise: You have to work harder than everyone else at whatever you want to be best at.

In other words, you have to practice.

Hard work is not enough, however.  Research shows that few attain international status as superior performers without access to high levels of support and detailed instruction from exceptional teachers over sustained periods of time. In the massive “Cambridge Handbook of Expertise and Expert Performance,” Feltovich et al. note, “Research on what enabled some individuals to reach expert performance, rather than mediocre achievement, revealed that expert and elite performers seek out teachers and engage in specifically designed training activities…that provide feedback on performance, as well as opportunities for repetition and gradual refinement” (p. 61).

What makes for a “good” teacher? Well, in essence, that is what the “Top Performance” blog is all about. I’m going on a journey, a quest really. I’ve decided to take up two hobbies—activities I’ve always had an interest in but never had the time to study seriously—magic and the ukulele.

Practicing is already proving challenging. Indeed, the process reminds me a lot of when I started out in the field of psychology. In a word, it’s daunting. There are literally thousands of “tricks” and “songs” (just as there are hundreds of treatment models), millions of how-to books, videos, and other instructional media (just as in the therapy world), as well as experts (who, similar to the field of psychotherapy, offer a wide and bewildering array of different and oftentimes contradictory opinions).

By starting completely over with subjects I know nothing about, I hope to put into practice the insights gleaned from our study of expertise and expert performance, along the way reporting the challenges, triumphs, and failures associated with learning to master new skills. I’ll review performances, instructional media (live, printed, DVD, etc.), and the teachers I meet. Stay tuned.

Filed Under: Behavioral Health, deliberate practice, excellence, Top Performance Tagged With: Feltovich, ors, outcome rating scale, session rating scale, srs

Superior Performance as a Psychotherapist: First Steps

April 1, 2009 By scottdm Leave a Comment

So what is the first step to improving your performance?  Simply put, knowing your baseline.  Whatever the endeavor, you have to keep score.  All great performers do.  As a result, the performance in most fields has been improving steadily over the last 100 years.

Consider, for instance, the Olympics. Over the last century, the best performance for every event has improved–in some cases by 50%!  The Gold Medal winning time for the marathon in the 1896 Olympics was just one minute faster than the entry time currently required just to participate in the Chicago and Boston marathons.

By contrast, the effectiveness of psychological therapies has not improved by a single percentage point over the last 30 years.  How, you may wonder, could that be?  During the same time period: (1) more than 10,000 how-to books on psychotherapy have been published; (2) the number of treatment approaches has mushroomed from 60 to 400; and (3) there are presently 145 officially approved, evidence-based, manualized treatments for 51 of the 397 possible DSM-IV diagnostic groups.  Certainly, given such “growth,” we therapists must be more effective with more people than ever before.  Unfortunately, however, instead of advancing, we’ve stagnated, mistaking our feverish pedaling for real progress in the Tour de Therapy.

Truth is, no one has been keeping score, least of all us individual practitioners. True, volumes of research now prove beyond any doubt that psychotherapy works.  Relying on such evidence to substantiate the effectiveness of one’s own work, however, is a bit like Tiger Woods telling you the par for a particular hole rather than how many strokes it took him to sink the ball.  The result, research indicates, is that effectiveness rates plateau very early in most therapists’ careers while confidence levels continue to grow.

In one study, for example, when clinicians were asked to rate their job performance from A+ to F, fully two-thirds considered themselves A or better. No one, not a single person in the lot, rated him or herself as below average. As researchers Sapyta, Riemer, and Bickman (2005) conclude, “most clinicians believe that they produce patient outcomes that are well above average” (p. 146). In another study, Deirdre Hiatt and George Hargrave used peer and provider ratings, as well as a standardized outcome measure, to assess the success rates of therapists in a sample of mental health professionals. As one would expect, providers were found to vary significantly in their effectiveness. What was disturbing is that the least effective therapists in the sample rated themselves on par with the most effective!

The reason for stagnant success rates in psychotherapy should be clear to all: why try to improve when you already think you’re the best or, barring that, at least above average?

Here again, expanding our search for excellence beyond the narrow field of psychotherapy to the subject of expertise and expert performance in general can provide some helpful insights. In virtually every profession, from carpentry to police work, medicine to mathematics, average performers overestimate their abilities, confidently assigning themselves to the top tier. Therapists are simply doing what everyone else does. Alas, they are average among the average.

Our own work and research prove that clinicians can break away from the crowd of average achievers by using a couple of simple, valid, and reliable tools for assessing outcome. As hard as it may be to believe, the empirical evidence indicates that performance increases by between 65% and 300% (click here to read the studies). Next time, I’ll review these simple tools as well as a few basic methods for determining exactly how effective you are. Subscribe now so you’ll be the first to know.

One more note: after posting last time, I heard from several readers who had difficulty subscribing. After doing some research, we learned that you must use IE 7 or Firefox 3.0.7 or later for the subscribe function to work properly.  I look forward to hearing from you!

In the meantime, the transcript below is from a recent interview I did for Shrinkrap Radio.  It’s focused on our current work:

Supershrinks: An Interview with Scott Miller about What Clinicians can Learn from the Field’s Most Effective Practitioners from Scott Miller


Filed Under: Behavioral Health, excellence, Top Performance Tagged With: cdoi, evidence based practice, excellence, mental health, outcome measures, psychology, psychotherapy, srs, supershrinks

My New Year’s Resolution: The Study of Expertise

January 2, 2009 By scottdm Leave a Comment

Most of my career has been spent providing and studying psychotherapy.  Together with my colleagues at the Institute for the Study of Therapeutic Change, I’ve now published 8 books and many, many articles and scholarly papers.  If you are interested, you can read more about them and even download many of my publications here.

Like most clinicians, I spent the early part of my career focused on how to do therapy.  To me, the process was confusing, and the prospect of sitting opposite a real, suffering client, daunting.  I was determined to understand and be helpful, so I went to graduate school, read books, and attended literally hundreds of seminars.

Unfortunately, as detailed in my article, Losing Faith, written with Mark Hubble, the “secret” to effective clinical practice always seemed to elude me.  Oh, I had ideas, and many of the people I worked with claimed our work together helped.  At the same time, doing the work never seemed as simple or effortless as professional books and trainings made it appear.

Each book and paper I’ve authored and co-authored over the last 20 years has been an attempt to mine the “mystery” of how psychotherapy actually works.  Along the way, my colleagues and I have paradoxically uncovered a great deal about what contributes little or nothing to treatment outcome! Topping the list, of course, are treatment models.  In spite of the current emphasis on “evidence-based” practice, there is no evidence that using particular treatment models for specific diagnostic groups improves outcome.  It’s also hugely expensive!  Other factors that occupy a great deal of professional attention but ultimately make little or no difference include client age, gender, DSM diagnosis, and prior treatment history, as well as therapist age, gender, years of experience, professional discipline, degree, training, amount of supervision, personal therapy, licensure, and certification.

In short, we spend a great deal of time, effort, and money on matters that matter very little.

For the last 10 years, my work has focused on factors common to all therapeutic approaches. The logic guiding these efforts was simple and straightforward. The proven effectiveness of psychotherapy, combined with the failure to find differences between competing approaches, meant that elements shared by all approaches accounted for the success of therapy. And make no mistake, treatment works. The average person in treatment is better off than 80% of those with similar problems who do not get professional help.

In The Heart and Soul of Change, my colleagues and I, joined by some of the field’s leading researchers, summarized what was known about the effective ingredients shared by all therapeutic approaches. The factors included the therapeutic alliance; placebo, hope, and expectancy; and structure and techniques, in combination with a huge, hairy amount of unexplained “stuff” known as “extratherapeutic factors.”

Our argument, at the time, was that effectiveness could be enhanced by practitioners purposefully working to enhance the contribution of these pantheoretical ingredients.  At a minimum, we believed that working in this manner would help move professional practice beyond the schoolism that had long dominated the field.

Ultimately, though, we were coming dangerously close to simply proposing a new model of therapy–this one based on the common factors.  In any event, practitioners following the work treated our suggestions as such.  Instead of, say, “confronting dysfunctional thinking,” they understood us to be advocating for a “client-directed” or strength-based approach.  Discussion of particular “strategies” and “skills” for accomplishing these objectives did not lag far behind.  Additionally, while the common factors enjoyed overwhelming empirical support (especially as compared to so-called specific factors), their adoption as a guiding framework was de facto illogical.  Think about it: if the effectiveness of the various and competing treatment approaches is due to a shared set of common factors, and yet all models work equally well, why would anyone need to learn about the common factors?

Since the publication of the first edition of The Heart and Soul of Change in 1999, I’ve struggled to move beyond this point. I’m excited to report that in the last year our understanding of effective clinical practice has taken a dramatic leap forward.  All hype aside, we discovered the reason our previous efforts had long failed: our research had been too narrow.  Simply put, we’d been focusing on therapy rather than on expertise and expert performance.  The path to excellence, we have learned, will never be found by limiting explorations to the world of psychotherapy, with its attendant theories, tools, and techniques.  Instead, attention needs to be directed to superior performance, regardless of calling or career.

A significant body of research shows that the strategies used by top performers to achieve superior success are the same across a wide array of fields, including chess, medicine, sales, sports, computer programming, teaching, music, and therapy!  Not long ago, we published our initial findings from a study of thousands of top-performing clinicians in an article titled “Supershrinks.”  I must say, however, that we have just “scratched the surface.”  Using outcome measures to identify and track top-performing clinicians over time is enabling us, for the first time in the history of the profession, to “reverse engineer” expertise.  Instead of assuming that popular trainers (and the methods they promote) are effective, we are studying clinicians who have a proven track record.  The results are provocative and revolutionary, and will be reported first here on the Top Performance Blog!  So, stay tuned.  Indeed, why not subscribe? That way, you’ll be among the first to know.

Filed Under: Behavioral Health, excellence, Top Performance Tagged With: behavioral health, cdoi, DSM, feedback informed treatment, mental health, ors, outcome measurement, psychotherapy, routine outcome measurement, srs, supervision, therapeutic alliance, therapy

