Scott D. Miller - For the latest and greatest information on Feedback Informed Treatment


Using Feedback Informed Treatment to Improve Medication Adherence and Reduce Healthcare Costs

September 10, 2014 By scottdm Leave a Comment


Medication adherence is a BIG problem. According to recent research, nearly one-third of the prescriptions written are never filled. Other data document that more than 60% of people who actually go to the pharmacy and get the drug do not take it as prescribed.

What’s the problem, you may ask? Inefficiency aside, the health risks are staggering. Consider, for example, that the prescriptions least likely to be filled are those aimed at treating headache (51 percent), heart disease (51.3 percent), and depression (36.8 percent).


When cost is factored into the equation, the impact of the problem on an already overburdened healthcare system becomes even more obvious. Research indicates that not taking prescribed medicines costs an estimated $290 billion per year–or nearly $1,000 for every man, woman, and child living in the United States. It’s not hard to imagine more useful ways such money could be spent.

What can be done?


Enter Dr. Jan Pringle, director of the Program Evaluation Research Unit and Professor of Pharmacy and Therapeutics at the University of Pittsburgh. As I blogged back in 2009, Jan and I met at a workshop I did on feedback-informed treatment (FIT) in Pittsburgh. Shortly thereafter, she went to work training pharmacists in a community pharmacy to use the Session Rating Scale ([SRS], a four-item measure of the therapeutic alliance) in their encounters with customers.

It wasn’t long before Jan had results.  Her first study found that administering and discussing the SRS at the time medications were dispensed resulted in significantly improved adherence (you can read the complete study below).

She didn’t stop there, however.


Just a few weeks ago, Jan forwarded the results of a much larger study, one involving 600 pharmacists and nearly 60,000 patients (via a special arrangement with the publisher, the entire study is available by clicking the link on the publications page of her University website).

Suffice it to say that using the measures, in combination with a brief interview between pharmacist and patient, significantly improved adherence across five medication classes aimed at treating chronic health conditions: calcium channel blockers, oral diabetes medications, beta-blockers, statins, and renin-angiotensin system antagonists. In addition to the obvious health benefits, the study also documented significant cost reductions. She estimates that using the brief, easy-to-use tools would result in annual savings of $1.4 million for any insurer or payer covering at least 10,000 lives!

Prior to Jan’s research, the evidence base for the ORS and SRS was focused exclusively on behavioral health services. These two studies point to exciting possibilities for using feedback to improve the effectiveness and efficiency of healthcare in general.

The tools used in the pharmacy research have been reviewed and deemed evidence-based by the Substance Abuse and Mental Health Services Administration.

Known as PCOMS, the measures and feedback process are described in detail at www.whatispcoms.com. It’s easy to get started, and the measures are free for individual healthcare practitioners!

Filed Under: Feedback Informed Treatment - FIT, medication adherence Tagged With: depression, healthcare, heart disease, medication adherence, medicine, mental health, ors, outcome rating scale, pharmacy, prescriptions, SAMHSA, session rating scale, srs

Dealing with Scientific Objections to the Outcome and Session Rating Scales: Real and Bogus

December 15, 2012 By scottdm Leave a Comment

The available evidence is clear: seeking formal feedback from consumers of behavioral health services decreases dropout and deterioration while simultaneously improving effectiveness. When I teach practitioners how to use the ORS and SRS to elicit feedback regarding progress and the therapeutic relationship, three common and important concerns are raised:

  1. How can such simple and brief scales provide meaningful information?
  2. Are consumers going to be honest?
  3. Aren’t these measures merely assessing satisfaction rather than anything meaningful?

Recently, I was discussing these concerns with ICCE Associate and Certified Trainer, Dan Buccino.

Briefly, Dan is a clinical supervisor and student coordinator in the Adult Outpatient Community Psychiatry program at Johns Hopkins. He’d not only encountered the concerns noted above but several additional objections. As he said in his email, “they were at once baffling and yet exciting, because they were so unusual and rigorous.”

“It’s a sign of the times,” I replied. “As FIT (feedback-informed treatment) becomes more widespread, the supporting evidence will be scrutinized more carefully. It’s a good sign.”

Together with psychologist and ICCE Senior Associate and Trainer Jason Seidel, Dan crafted a detailed response. When I told them that I believed the ICCE community would value having access to the document they created, both agreed to let me publish it on the Top Performance blog. So…here it is. Please read it and feel free to pass it along to others.


Filed Under: Feedback Informed Treatment - FIT Tagged With: accountability, behavioral health, Certified Trainers, evidence based practice, feedback, interviews, mental health, ors, practice-based evidence, psychometrics, research, srs

The Importance of "Whoops" in Improving Treatment Outcome

December 2, 2012 By scottdm Leave a Comment

“Ring the bells that still can ring,
Forget your perfect offering
There is a crack in everything,
That’s how the light gets in.”

Leonard Cohen, Anthem

Making mistakes. We all do it, in both our personal and professional lives. “To err is human…,” the old saying goes. And most of us say, if asked, that we agree wholeheartedly with the adage–especially when it refers to someone else! When the principle becomes personal, however, it is much more difficult to be so broad-minded.

Think about it for a minute: can you name five things you are wrong about?  Three?  How about the last mistake you made in your clinical work?  What was it?  Did you share it with the person you were working with?  With your colleagues?

Research shows there are surprising benefits to being wrong, especially when the person making the error views it differently. As author Alina Tugend points out in her fabulous book, Better by Mistake, custom wrongly defines a mistake as “the failure of a planned sequence of mental or physical activities to achieve its intended outcome.” When you forget a client’s name during a session or push a door instead of pulling it, that counts as a slip or lapse. A mistake, by contrast, is when “the plan itself is inadequate to achieve its objectives” (p. 11). Knowing the difference, she continues, “can be very helpful in avoiding mistakes in the future” because it leads exploration away from assigning blame and toward examining the systems, processes, and conditions that either cause mistakes or thwart their detection.

Last week, I was working with a talented and energetic group of helping professionals in New Bedford, Massachusetts. The topic was “Achieving Excellence: Pushing One’s Clinical Performance to the Next Level of Effectiveness.” As part of my presentation, I talked about becoming more “error-centric” in our work; specifically, using ongoing measurement of the alliance to identify opportunities for improving our connection with consumers of behavioral health services. As an example of the benefits of making mistakes the focus of professional development efforts, I showed a brief video of Rachel Hsu and Roger Chen, two talented musicians who performed at the last Achieving Clinical Excellence (ACE) conference. Rachel plays a piece by Liszt, Roger one by Mozart. Both compositions are extremely challenging to play. You tell me how they did (by the way, Rachel is 8 years old, Roger 9):

Following her performance, I asked Rachel if she’d made any mistakes. She laughed, and then said, “Yes, a lot!” When I asked her what she did about that, she replied, “Well, it’s impossible to learn from my mistakes while I’m playing. So I note them and then later practice those small bits, over and over, slowly at first, then speeding up, until I get them right.”

After showing the video in New Bedford, a member of the audience raised his hand: “I get it, but that whole idea makes me a bit nervous.” I knew exactly what he was thinking. Highlighting one’s mistakes in public is risky business. Studies documenting that the most effective clinicians experience more self-doubt and are more willing to admit making mistakes are simply not convincing when one’s professional self-esteem or job may be on the line. Neither is research showing that healthcare professionals who admit making mistakes and apologize to consumers are significantly less likely to be sued. Becoming error-centric requires a change in culture, one that not only invites disclosure but connects it with the kind of support and structure that leads to superior results.

Creating a “whoops-friendly” culture will be a focus of the next Achieving Clinical Excellence conference, scheduled for May 16-18th, 2013 in Amsterdam, Holland.  Researchers and clinicians from around the world will gather to share their data and experience at this unique event.  I promise you don’t want to miss it.  Here’s a short clip of highlights from the last one:

My colleague Susanne Bargmann and I will also be teaching the latest research and evidence-based methods for transforming mistakes into improved clinical performance at the upcoming FIT Advanced Intensive training in Chicago, Illinois. I look forward to meeting you at one of these upcoming events. In the meantime, here’s a fun, brief, but informative video from the TED Talks series on mistakes:

By the way, the house pictured above is real.  My family and I visited it while vacationing in Niagara Falls, Canada in October.  It’s a tourist attraction actually.  Mistakes, it seems, can be profitable.

Filed Under: Feedback Informed Treatment - FIT Tagged With: accountability, Alliance, behavioral health, cdoi, conferences, continuing education, deliberate practice, evidence based practice, feedback, mental health, Therapist Effects, top performance

Clinical Support Tools for the ORS and SRS

November 20, 2012 By scottdm 1 Comment

I have so much to be grateful for at this time.  Most of all, I’m happy to be home with my family.  As we have in the past, this year we’ll be spending the holiday at the home of our long time friends John and Renee Dalton.  The two always put out a fantastic spread and our son, Michael, is fast friends with their two kids.

I’m also grateful for the International Center for Clinical Excellence (ICCE) community. Currently, ICCE has over 4200 members located around the world, making it the largest web-based community of professionals, educators, managers, and clinicians dedicated to using feedback to pursue excellence in the delivery of behavioral health services. Recently, the site was highlighted as one of the best resources for practitioners available on the web. Articles, how-to videos, and discussion forums are available every day, all day–and for free! No come-ons for books or webinars and no “cult of personality”–just sharing among peers. If you are not a member, you can join at www.centerforclinicalexcellence.com.

A special thanks goes to several ICCE senior advisors and associates, including Susanne Bargmann, Jason Seidel, Cynthia Maeschalck, Bob Bertolino, Bill Plum, Julie Tilsen, and Robbie Babins-Wagner. These folks are the backbone of the organization. Together, they make it work. Most recently, we all joined together to create the ICCE Feedback Informed Treatment and Training Manuals, a cutting-edge series covering every aspect of FIT–from the empirical foundations to implementation–in support of our application to SAMHSA for recognition as an “evidence-based practice.”

As a way of supporting everyone using the ORS and SRS, I wanted to make a few clinical support tools available. If you are using the measures, the first item will need no introduction. It’s a 10 cm ruler! Save the file and print it off, and you’ll also have a ready reminder of the Achieving Clinical Excellence conference coming up in May 2013. Like last time, it will feature the latest information about feedback-informed practice! The second item is a reliable change graph. If you are using the paper-and-pencil measures rather than one of the existing web-based systems (www.fit-outcomes.com, www.myoutcomes.com), you can use this tool to determine whether a change in scores from session to session is reliable (that is, greater than chance, the passage of time, and measurement error, and therefore attributable to the care being provided) or even clinically significant (that is, both reliable and indicative of recovery). The last item is an impressive summary of various systems for monitoring progress in treatment.
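
For readers who like to see the arithmetic behind a reliable change graph, here is a minimal sketch of the Jacobson and Truax (1991) Reliable Change Index, the standard way of asking whether a pre-post difference is larger than measurement error alone. The standard deviation and reliability figures in the example are placeholders, not the published norms for the ORS; substitute the values appropriate to the measure and population you are working with.

```python
# A minimal sketch of the Jacobson & Truax (1991) Reliable Change Index (RCI).
# The sd and reliability values in the example are placeholders, NOT the
# published ORS norms; substitute the norms for your measure and population.
import math

def reliable_change_index(pre: float, post: float, sd: float, reliability: float) -> float:
    """RCI = (post - pre) / SEdiff, where SEdiff = sd * sqrt(2) * sqrt(1 - reliability).

    |RCI| > 1.96 indicates a change unlikely (p < .05) to be due to
    measurement error alone.
    """
    se_measurement = sd * math.sqrt(1 - reliability)
    se_diff = math.sqrt(2) * se_measurement
    return (post - pre) / se_diff

# Example with hypothetical intake and later-session total scores:
rci = reliable_change_index(pre=18.0, post=26.0, sd=7.0, reliability=0.85)
print(f"RCI = {rci:.2f} -> {'reliable change' if abs(rci) > 1.96 else 'within measurement error'}")
```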

In addition, ACE Health has developed openFIT, a plug-in that seamlessly integrates the ORS, SRS, and associated algorithms into any existing Electronic Health Record, Case Management System, or eMental Health application.

I wish everyone a peaceful and rewarding Thanksgiving holiday.

 

Filed Under: FIT Software Tools Tagged With: behavioral health, cdoi, excellence, feedback, healthcare, icce, mental health, ors, Outcome, practice-based evidence, srs

Mental Health Practice in a Global Economy

April 17, 2012 By scottdm 2 Comments

Did you feel it? The seismic shift that occurred in the field of mental health just a little over a month ago? No? Nothing? Well, in truth, it wasn’t so much a rip in the space-time continuum as a run. That “run,” however, promises to forever alter the fabric of clinical practice–in particular, how clinicians earn and maintain a certain standard of living.

For decades, licensing statutes have protected behavioral health professionals from competing with providers living outside their state and local jurisdiction. In order to bill or receive reimbursement, mental health professionals needed to be licensed in the state in which treatment services were offered. Over the years, the various professional organizations have worked to make it easier for professionals to become licensed when they move from one state to another. Still, it ain’t easy and, some practitioners and professional groups would argue, for good reason. Such laws, to some extent, ensure that fees charged for services are commensurate with the cost of living in the place where therapists live and work. The cost of therapy in Manhattan varies considerably, for example, depending on whether one means the city in the state of New York or the one in Kansas.

As far as outcomes are concerned, however, there is no evidence that people who pay more necessarily get better results. Indeed, as reviewed here on this blog, available evidence indicates little or no difference in outcome between highly trained (and expensive) clinicians and minimally trained (and less expensive) paraprofessionals and students. If the traditional geographic (licensing) barriers were reduced or eliminated, consumers would, with few exceptions, gravitate to the best value for their money. In the 1980s and 90s, for example, consumers deserted small, Main Street retailers when big-box stores opened on the outskirts of town offering the same merchandise at a lower price. Now, big-box retailers are closing en masse as consumers shift their purchases to less expensive, web-based outlets.

And that’s precisely the shift that began a little over a month ago in the field of mental health. The U.S. Military eliminated the requirement that civilian providers be licensed in the same jurisdiction or state in which treatment is offered. The new law allows care to be provided wherever the recipient of services lives, regardless of where the provider is licensed. Public announcements argued that the change was needed to make services available to service members and veterans living in isolated or rural areas where few providers may be available. Whatever the reason, the implications are profound: in the future, clinicians, like Main Street retailers, will be competing with geographically distant providers.

Just one week prior to the announcement by the U.S. Military, I posted a blogpost highlighting a recent New York Times column by author and trend watcher, Thomas Friedman.  In it, I argued that “Globalization and advances in information technology were…challenging the status quo…access. At one time, being average enabled one to live an average life, live in an average neighborhood and, most importantly, earn an average living.  Not so anymore.  Average is now plentiful, easily accessible, and cheap. What technology can’t do in either an average or better way, a younger, less-trained but equally effective provider can do for less. A variety of computer programs and web-based systems provide both psychological advice and treatment.”

Truth is, the change is likely to be a boon to consumers of mental health services: easier access to services at a better price.  What can clinicians do?  First, begin measuring outcome.  Without evidence of their effectiveness, individual providers will lose out to the least expensive provider.  No matter how much people complain about “big box and internet retailers,” most use them.  The savings are too great to ignore.

What else can clinicians do? The advice of Friedman, which I quoted in my recent blogpost, applies: “everyone needs to find their extra–their unique value contribution that makes them stand out in whatever is their field.” Measuring outcome and finding that “something special” is what the International Center for Clinical Excellence is all about. If you are not a member, please join the thousands of other professionals online today. After that, why not spend time with peers and cutting-edge instructors at the upcoming “advanced intensive” or “training of trainers” workshops this summer.

Filed Under: Behavioral Health, Feedback Informed Treatment - FIT, ICCE Tagged With: behavioral health, brief therapy, cdoi, evidence based practice, mental health, Thomas Friedman

Is the "Summer of Love" Over? Positive Publication Bias Plagues Pharmaceutical Research

March 27, 2012 By scottdm Leave a Comment


Evidence-based practice is only as good as the available “evidence”–and on this subject, research points to a continuing problem with both the methodology and the type of studies that make it into the professional literature. Last week, PLoS Medicine, a peer-reviewed, open-access journal of the Public Library of Science, published a study showing a positive publication bias in research on so-called atypical antipsychotic drugs. In comparing articles appearing in journals to the FDA database, researchers found that almost all positive studies were published, while clinical trials with negative or questionable results either were not published or–and get this–were published as having positive results!

Not long ago, similar yet stronger results appeared in the same journal regarding antidepressants. Again, in a comparison with the FDA registry, researchers found that all positive studies were published, while clinical trials with negative or questionable results either were not published or–and get this–were reported as positive. The problem is far from insignificant: a staggering 46% of studies with negative results were either not published, or were published but reported as positive.

Maybe the “summer of love” is finally over for the field and broader American public.  Today’s Chicago Tribune has a story by Kate Kelland and Ben Hirschler reporting data about sagging sales of anti-depressants and multiple failures to bring new, “more effective” drug therapies to market.  Taken together, robust placebo effects, the FDA mandate to list all trials (positive and negative), and an emphasis in research on conducting fair comparisons (e.g., comparing any new “products” to existing ones) make claims about “new and improved” effectiveness challenging.

Still, one sees ads on TV making claims about the biological basis of depression–the so-called “biochemical imbalance.” Perhaps this explains why a recent study of Medicaid clients found that the costs of treating depression rose by 30% over the last decade while outcomes did not improve at all during the same period. The cause of the rise in costs? Increased use of psychiatric drugs–in particular, antipsychotics in cases of depression.

“It’s a great time for brain science, but at the same time a poor time for drug discovery for brain disorders,” says David Nutt, professor of neuropsychopharmacology, cited in the Chicago Tribune, “That’s an amazing paradox which we need to do something about.”

Here’s an idea: how about not assuming that problems in living are reducible to brain chemistry? That the direction of causality for much of what ails people is not brain to behavior but perhaps behavior to brain? On this point, it is sad to note that while the percentage of clients prescribed drugs rose from 81% to 87%–with no improvement in effect–the number receiving psychotherapy dropped from 57% to 38%.

Here’s what we know about psychotherapy: it works, and it has a far less troublesome side-effect profile than psychotropic drugs. No warnings needed for dry mouth, dizziness, blood and liver problems, or sexual dysfunction. The time has come to get over the collective 1960s delusion of better living through chemistry.

Filed Under: Practice Based Evidence Tagged With: behavioral health, continuing education, depression, evidence based practice, icce, Medicaid, mental health, psychotherapy

What’s disturbing Mental Health? Opportunities Lost

November 29, 2011 By scottdm Leave a Comment

In a word, paperwork. Take a look at the book pictured above. That massive tome on the left is the 2011 edition of “Laws and Regulations” governing mental health practice in the state of California. Talk about red tape! Hundreds and hundreds of pages of statutes informing, guiding, restricting, and regulating the “talking cure.” Now, on top of that, layer federal and third-party payer policies and paperwork and you end up with…lost opportunities. Many lost opportunities. Indeed, as pointed out in our recent article, The Road to Mastery, as much as 30% of clinicians’ time is spent completing paperwork required by various funding bodies and regulatory agencies. THIRTY PERCENT. Time and money that could be spent far more productively serving people with mental health needs. Time and money that could be spent on improving treatment facilities and the training of behavioral health professionals.

In the latest edition of our book, The Heart and Soul of Change, authors Bob Bohanske and Michael Franczak described their struggle to bring sanity to the paperwork required in public mental health service settings in the state of Arizona. “The forms needed to obtain a marriage certificate, buy a new home, lease an automobile, apply for a passport, open a bank account, and die of natural causes were assembled,” they wrote, “…and altogether weighed 1.4 ounces. By contrast, the paperwork required for enrolling a single mother in counseling to talk about difficulties her child was experiencing at school came in at 1.25 pounds” (p. 300). What gives?

The time has come to confront the unpleasant reality and say it out loud: regulation has lost touch with reality. Ostensibly, the goal of paperwork and oversight procedures is to improve accountability. In these evidence-based times, that leads me to say, “show me the data.” Consider the widespread practice–mandate, in most instances–of treatment planning. Simply put, it is less science than science fiction. Perhaps this practice improves outcomes in a galaxy far, far away, but on planet Earth the supporting evidence is sparse to non-existent (see the review in The Heart and Soul of Change, 2nd Edition).

No amount of medication will resolve this craziness. Perhaps a hefty dose of CBT might do some good in identifying and correcting the distorted thinking that has led to the current state of affairs. Whatever happens, the field needs an alternative. What practice not only ensures accountability but simultaneously improves the quality and outcome of behavioral health services? Routine outcome measurement and feedback (ROMFb). As I’ve blogged about several times, numerous RCTs document increased effectiveness and efficiency and decreased costs and rates of deterioration. Simply put, as the slide below summarizes, everybody wins. Clinicians. Consumers. Payers.
Everybody wins

Learn about or deepen your knowledge of feedback-informed treatment (FIT) by attending the upcoming “Advanced Intensive” workshop, March 19th-22nd, 2012. We will have four magical days together. Space is filling rapidly, so register now. And then, at the end of the last day of the training, fly to Washington, D.C. to finish off the week by attending the Psychotherapy Networker conference. Excellence is front and center at the event, and I’ve been asked to deliver the keynote on the subject on the first day!

Filed Under: Behavioral Health, Conferences and Training, Feedback Informed Treatment - FIT Tagged With: bob bohanske, counselling, mental health, michael franczak, The Heart and Soul of Change

Yes, More Evidence: Spanish version of the ORS Validated by Chilean Researchers

June 16, 2011 By scottdm Leave a Comment

Last week, Chile. This week, Perth, Australia. Yesterday, I landed in Sydney following a 30-hour flight from the United States. I managed to catch the last flight out to Perth before all air travel was grounded due to another ash cloud–this time coming from Chile! I say “another” because just over a year ago I was trapped behind the cloud of ash from the Icelandic eruption! So far so good. Today, I’ll spend the day talking about “excellence” in behavioral healthcare.

Before heading out to teach for the day, I wanted to upload a report from a recent research project conducted in Chile investigating the statistical properties of the ORS. I’ve attached the report here so you can read it for yourself. That said, let me present the highlights:

  • The Spanish version of the ORS is reliable (alpha coefficients .90-.95; see the sketch after this list for what that statistic summarizes).
  • The Spanish version of the ORS shows good construct and convergent validity (correlations with the OQ-45 of .50 and .58).
  • The Spanish version of the ORS is sensitive to change in a treated population.
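
For readers curious about what an alpha coefficient in the .90-.95 range actually summarizes, here is a minimal sketch of how Cronbach’s alpha is computed from item-level scores for a four-item measure such as the ORS. The respondent scores below are invented purely for illustration and are not data from the Chilean study.

```python
# A minimal sketch of Cronbach's alpha for a four-item measure such as the ORS.
# The respondent scores below are invented purely for illustration.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: respondents x items matrix.

    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)
    """
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Four hypothetical respondents rating the four ORS domains (0-10 scale):
scores = np.array([
    [3.2, 4.1, 3.8, 4.0],
    [7.5, 7.0, 6.8, 7.2],
    [5.1, 5.6, 4.9, 5.3],
    [8.8, 9.0, 8.5, 8.7],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```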

The authors of the report, which was presented at the Society for Psychotherapy Research meeting, conclude: “The ORS is a valid instrument to be used with the Chilean population.”

As I asked in my blogpost last week, “how much more evidence is needed?” Now, more than ever, clinicians need simple, valid, reliable, and feasible tools for evaluating the process and outcome of behavioral healthcare. The ORS and SRS FIT the bill!

Filed Under: FIT, PCOMS, Practice Based Evidence Tagged With: behavioral health, cdoi, Chile, evidence based practice, mental health, ors, outcome rating scale, session rating scale, srs

The War on Unhappiness Heats Up

November 24, 2010 By scottdm Leave a Comment

Back in September, I blogged about an article by Gary Greenberg published in the August issue of Harper’s magazine that took aim at the “helping profession.” He cast a critical eye on the history of the field, its colorful characters, constantly shifting theoretical landscape, and claims and counterclaims regarding “best practice.” Several paragraphs were devoted to my own work; specifically, research documenting the relatively inconsequential role that particular treatment approaches play in successful treatment and the importance of using ongoing feedback to inform and improve mental health services.

Just this last week, while I was overseas teaching in Romania (more on that trip soon), I received an email from Dr. Dave of ShrinkRapRadio who felt the piece by Greenberg was unfair to the field in general and a mischaracterization of the work by many of the clinicians cited in the article, including me.  “I’ve got a blog on the Psychology Today website and I’m planning to take him to task a bit,” he wrote.

If you have not had a chance to read the Greenberg article, you can find it on my original blogpost.  It’s a must read, really.  As I said then, whatever your opinion about the present state of practice, “Greenberg’s review of current and historical trends is sobering to say the least–challenging mental health professionals to look in the mirror and question what we really know for certain–and a must read for any practitioner hoping to survive and thrive in the current practice environment.”  Then, take a moment and read Dr. Dave’s response.  With his permission, I’ve posted it below!

  

Popping The Happiness Bubble: The Backlash Against Positive Psychology

Readers will recall that in Part 1, I suggested that a backlash against the ebullience of the positive psychology movement was probably inevitable. The most visible sign of that rebellion was last year’s best-selling book by Barbara Ehrenreich, Bright-Sided: How The Relentless Promotion of Positive Thinking Has Undermined America. While I found myself in agreement with much of her appraisal of American culture and our historical fascination with “positive thinking,” I thought her critique of positive psychology fell short by equating positive psychology to “positive thinking.” It also seemed to me that she failed to recognize that a huge body of research conducted by an army of independent researchers is emerging on a very diverse range of topics, which have been subsumed under the general heading of positive psychology. And, finally, much of her argument was based on an ad hominem attack on Martin Seligman.

I found further evidence of this backlash in the lead article in the October 2010 issue of Harper’s by psychotherapist Gary Greenberg, “The War on Unhappiness: Goodbye Freud, Hello Positive Thinking.” Greenberg is the author of Manufacturing Depression, a book that came out earlier this year. In addition, he is a prolific writer who has published articles that bridge science, politics, and ethics in a number of leading magazines. So he’s got great credentials both as a psychologist and a writer. Yet, I found this particular article unsatisfying. At least, that was my reaction upon first reading. As I later read it a second time to write about it here, I got a clearer sense of what he was up to and found myself in substantial agreement with his overall thrust.

The stimulus for Greenberg’s piece appears to have been his attendance at the annual Evolution of Psychotherapy Conference in Anaheim earlier this year. He seems to take a pretty dyspeptic view of the whole event: “Wandering the conference, I am acquainted, or reacquainted, with Cognitive Behavioral Therapy, Ericksonian Hypnosis, Emotionally Focused Therapy, Focusing, Buddhist Psychology, Therapist Sculpting, Facilitating Gene Expression, and Meditative methods.” As a forty-year veteran of the California personal-growth/therapy scene myself, I find it easy to develop a jaundiced eye over time as a panoply of approaches comes and goes. Yet I have to say my own view, shaped by over 300 podcast interviews with psychologists across a broad spectrum of orientations, is that there is more of a developing consensus and that the differences between many approaches are relatively minor.

By contrast, Greenberg seems to go into despair.

As I say, it took two readings of Greenberg’s article to really get the overall sweep. On first reading, it seems to be a bit of a meander, beginning with some slighting anecdotes about Freud. Then we’re on to the Anaheim conference and some handwringing about the seeming Tower of Babel created by the profusion of therapeutic approaches. This segues into a discussion of Rosenzweig’s 1936 “Dodo Bird Effect,” which asserts that therapeutic orientation doesn’t matter because all orientations work. As the Dodo pronounces in Alice in Wonderland, “Everyone has won and all must have prizes.” According to Greenberg, the Dodo Bird Effect has been borne out in subsequent studies, and the requisite common ingredient for therapeutic success is faith, both the client’s and the therapist’s.

Greenberg goes on to describe several of the presentations, most notably by Otto Kernberg, Scott D. Miller, David Burns, and Martin Seligman. Part of what put me off about this article on my first reading is that I have conducted in-depth interviews with the first three of these gentlemen and I would not have recognized them from Greenberg’s somewhat muddled account.

Otto Kernberg, MD, one of the grand old men of psychoanalysis, is characterized as intoning “the old mumbo jumbo about the Almost Untreatable Narcissistic Patient…” In my opinion, this really slights his lifetime commitment to research, his many contributions to object relations theory, and his role as Director of The Institute for Personality Disorders at the Cornell Medical Center. In my interview with Dr. Kernberg, I was struck by the flexibility of this octogenarian in incorporating the findings of neuroscience, genetics, and even cognitive behavioral therapy into his thinking.

Greenberg seems to use Dr. Scott D. Miller’s research as supporting the Dodo Bird effect. I attended a daylong workshop with Scott Miller a few years ago and it was one of the best presentations I’ve ever seen. I also interviewed him for one of my podcasts. The key takeaway for me from Scott Miller’s work is that the Dodo Bird effect shows up only when therapeutic effectiveness is averaged across therapists. That is, on average, all psychotherapies are moderately effective. However, Miller reports that not all therapists are equally effective and that, if you look at therapists who are consistently rated as effective by their clients vs. therapists who are consistently rated as ineffective, then therapy emerges as a highly worthwhile enterprise.

As Miller said in my interview with him, “If the consumer is able to feed back information to the system about their progress, whether or not progress is being made, those two things together can improve outcomes by as much as 65%.”

As I say, I had difficulty recognizing Miller in Greenberg’s account. Evidently, Greenberg is critical of Miller for having developed a standardized set of rating scales for clients to provide feedback to their therapists. Greenberg sees these scales as playing into the hands of managed care and the trend towards “manualized” therapies. However, in my interview with Miller, he is very clearly critical of managed care, at least in terms of its emphasis on particular treatments for particular diagnostic categories. As Miller said in his interview with me, “If there were inter-rater reliability that would be one thing; the major problem with the DSM is that it lacks validity, however. That these groupings of symptoms actually mean anything… and that data is completely lacking… We are clustering symptoms together much the way medicine did in the medieval period: this is the way we treated people and thought about people when we talked about them being phlegmatic for example; or the humors that they had. Essentially they were categorizing illnesses based on clusters of symptoms.”

I also had difficulty recognizing Stanford psychiatry professor David Burns from Greenberg’s summary of the session he attended with Burns. In short, Greenberg portrays Burns, who has developed a Therapist’s Toolkit inventory, as wishing to replace “open-ended conversation with a five-item test… to take an X-ray of our inner lives.” This runs counter to my experience of Burns, who, for example, in my interview with him about his cognitive therapy approach to couples work said, “…cognitive therapy has become probably the most widely practiced and researched form of psychotherapy in the world. But I really don’t consider myself a cognitive therapist or any other school of therapy; I’m in favor of tools, not schools of therapy. I think all the schools of therapy have had important discoveries and important angles, but the problem is they are headed up by gurus who push too hard trying to say cognitive therapy is the answer to everything, or rational emotive therapy is the answer to everything, or psychoanalysis is the answer to everything. And that is reductionism, and kind of foolish thinking to my point of view.” This hardly sounds like someone who thinks he’s invented a paper-and-pencil test that will be the end-all of psychotherapy.

And then Greenberg goes on to skewer positive psychology, which is what drew me to his article in the first place. After all, the title “The War on Unhappiness” seems to promise that. Like Ehrenreich’s, however, Greenberg’s critique is largely an ad hominem attack on Seligman. For example, referring to his earlier work subjecting dogs to electric shock boxes to study learned helplessness, Greenberg characterizes Seligman as “More curious about dogs than about the people who tortured them…” He goes on to recount Seligman’s presentation to the CIA on learned helplessness, which became the basis for enhanced “interrogation” techniques in Iraq. Now, we are told, Seligman is working with the U.S. Army to teach resilience to our troops. In Greenberg’s view, Seligman would have us go his dogs one better by “thriving on the shocks that come our way rather than merely learning to escape them.”

So, it turns out that Greenberg’s attack on positive psychology is rather incidental to his larger concern: that clinical psychology has sold its soul to the evidence-based, managed-care lobby in order to feed at the trough of medical reimbursement.

Greenberg’s article is a circular ramble that begins with slighting references to Freud and psychoanalysis and then ends with Freud as the champion of doubt.

It took me two readings to see that Greenberg is essentially using Miller, Burns, and Seligman as foils to attack smug certainty and blind optimism, the enemies of doubt. Of himself, Greenberg concludes, “I’m wondering now why I’ve always put such faith in doubt itself, or, conversely, what it is about certainty that attracts me so much, that I have spent twenty-seven years, thousands of hours, millions of other people’s dollars to repel it.”

Greenberg evidently values the darker side, the questions, the unknown, the mystery. “Even if Freud could not have anticipated the particulars – the therapists-turned-bureaucrats, the gleaming prepackaged stories, the trauma-eating soldiers – he might have deduced that a country dedicated in its infancy to the pursuit of happiness would grow up to make it a compulsion. He might have figured that American ingenuity would soon, maybe within a century, find a way to turn his gloomy appraisal of humanity into a psychology of winners.”

I think I’m in agreement with at least some of Greenberg’s larger argument. My fear, however, is that the general reader will come away with the impression that psychotherapists don’t know what they are doing and that the whole enterprise is a waste of time and money. That would be too bad, both because I don’t think it’s true and because I don’t think Greenberg does either.

I encourage you to find Greenberg’s article and to post your own reactions here in the comments area.

I had planned to stake out my own position on positive psychology in response to the critiques of Ehrenreich and Greenberg. It’s looking like there may need to be a Part 3. Stay tuned!

Filed Under: Practice Based Evidence Tagged With: Barbara Ehrenreich, evidence based practice, gary greenberg, healthcare, Manufacturing Depression, mental health, psychology today

Goodbye Freud, Hello Common Factors

September 14, 2010 By scottdm Leave a Comment

Gary Greenberg certainly has a way with words. In his most recent article, The War on Unhappiness, published in the August issue of Harper’s magazine, Greenberg focuses on the “helping profession”–its colorful characters, constantly shifting theoretical landscape, and claims and counterclaims regarding “best practice.” He also gives prominence to the most robust and replicated finding in psychotherapy outcome research: the “dodo bird verdict.” Simply put, the finding that all approaches developed over the last 100 years–now numbering in the thousands–work about equally well. Several paragraphs are devoted to my own work; specifically, research documenting the relatively inconsequential role that particular treatment approaches play in successful treatment and the importance of using ongoing feedback to inform and improve mental health services. In any event, Greenberg’s review of current and historical trends is sobering to say the least–challenging mental health professionals to look in the mirror and question what we really know for certain–and a must read for any practitioner hoping to survive and thrive in the current practice environment. OK. Enough said. Read it yourself here.


Filed Under: Behavioral Health Tagged With: cdoi, gary greenberg, healthcare, mental health, psychotherapy

Ohio Update: Use of CDOI improves outcome, retention, and decreases "board-level" complaints

August 5, 2010 By scottdm Leave a Comment

A few days ago, I received an email from Shirley Galdys, the Associate Director of the Crawford-Marion Alcohol and Drug/Mental Health Services Board in Marion, Ohio. Back in January, I blogged about the steps the group had taken to deal with the cutbacks, shortfalls, and all-around tough economic circumstances facing agencies in Ohio. At that time, I noted that its dedicated administrators and clinicians had improved the effectiveness and efficiency of treatment so much through their systematic use of Feedback-Informed Treatment (FIT) that they were able to absorb cuts in funding and loss of staff without having to cut services to their consumers.

Anyway, Shirley was writing because she wanted to share some additional good news.  She’d just seen an advance copy of the group’s annual report.  “Since we began using FIT over two years ago,” she wrote, “board level complaints and grievances have decreased!”

In the past, the majority of such complaints have centered on client rights.  “Because of FIT,” she continued, “we are making more of an effort to explain to people what we can and cannot do for them as part of the ‘culture of feedback’….we took a lot for granted about what people understood about behavioral health care prior to FIT.”

The Crawford-Marion Alcohol and Drug/Mental Health Services Board is now into the second full year of implementation.  They are not merely surviving, they are thriving!  In the video below, directors Shirley Galdys, Bob Moneysmith, and Elaine Ring talk about the steps for a successful implementation.

Filed Under: Behavioral Health, Feedback Informed Treatment - FIT, FIT, Implementation Tagged With: addiction, behavioral health, cdoi, mental health, shirley galdys

Evidence-based practice or practice-based evidence? Article in the Los Angeles Times addresses the debate in behavioral health

January 18, 2010 By scottdm Leave a Comment


January 11th, 2010

“Debate over Cognitive & Traditional Mental Health Therapy” by Eric Jaffe

The debate between different factions, interest groups, and scholars within the field of mental health hit the pages of the Los Angeles Times this last week. At issue? Supposedly, whether the field will become “scientific” in practice or remain mired in the traditions of the past. On one side are the enthusiastic supporters of cognitive-behavioral therapy (CBT), who claim that existing research provides overwhelming support for the use of CBT for the treatment of specific mental disorders. On the other side are traditional, humanistic, “feel-your-way-as-you-go” practitioners who emphasize the qualitative over the quantitative.

My response? Spuds or potatoes. Said another way, I can’t see any difference between the two warring factions. Yes, research indicates that CBT works. That exact same body of literature shows overwhelmingly, however, that any and all approaches intended to be therapeutic are effective. And yes, certainly, quality is important. The question is, however, “what counts as quality?” and, more importantly, “who gets to decide?”

In the Los Angeles Times article, I offer a third way; what has loosely been termed, “practice-based evidence.”  The bottom line?  Practitioners must seek and obtain valid, reliable, and ongoing feedback from consumers regarding the quality and effectiveness of the services they offer.  After all, what person following unsuccessful treatment would say, “well, at least I got CBT!” or, “I’m sure glad I got the quality treatment.”

Filed Under: Behavioral Health, Dodo Verdict, Practice Based Evidence Tagged With: behavioral health, cognitive-behavioral therapy (CBT), evidence based practice, icce, Los Angeles Times, mental health, meta-analysis, public behavioral health

Five Incredible Days in Anaheim

December 15, 2009 By scottdm 2 Comments

From December 9-13th, eight thousand five hundred mental health practitioners from countries around the globe gathered in Anaheim, California to attend the “Evolution of Psychotherapy” conference. Held every five years since 1985, the conference started big and has only grown larger. “Only a few places in the US can accommodate such a large gathering,” says Jeffrey K. Zeig, Ph.D., who has organized the conference since the first one.

The event brings together 40 of the field’s leading researchers, practitioners, trend setters, and educators to deliver keynote addresses and workshops, host discussion panels, and offer clinical demonstrations on every conceivable subject related to clinical practice. Naturally, I spoke about my current work on “Achieving Clinical Excellence” and served on several topical panels, including “Evidence-Based Practice” (with Don Meichenbaum), “Research on Psychotherapy” (with Steven Hayes and David Barlow), and “Severe and Persistent Mental Illness” (with Marsha Linehan and Jeff Zeig).

Most exciting of all, the Evolution of Psychotherapy conference also served as the official launching point for the International Center for Clinical Excellence.  Here I am pictured with long-time colleague and friend, Jeff Zeig, and psychologist and ICCE CEO, Brendan Madden, in front of the ICCE display in the convention center hall.

Over the five days, literally hundreds of visitors stopped by booth #128 to chat with me, Brendan, and Senior ICCE Associates and Trainers Rob Axsen, Jim Walt, Cynthia Maeschalck, Jason Seidel, Bill Andrews, Gunnar Lindfeldt, and Wendy Amey. Among other things, a cool M and M dispenser passed out goodies to folks (if they pressed the right combination of buttons), we talked about and handed out leaflets advertising the upcoming “Achieving Clinical Excellence” conference, and people watched a brief video introducing the ICCE community. Take a look yourself:


More to come from the week in Anaheim….

Filed Under: Behavioral Health, Conferences and Training, excellence, ICCE Tagged With: Achieving Clinical Excellence, brendan madden, david barlow, Don Meichenbaum, evidence based practice, Evolution of Psychotherapy, icce, Jeff Zeig, jeffrey K. zeig, Marsha Linehan, mental health, psychotherapy, Steve Hayes

Outcomes in OZ III

December 4, 2009 By scottdm Leave a Comment

Dateline: November 28, 2009 Brisbane, Australia


Crown Plaza Hotel
Pelican Waters Golf Resort & Spa

As their name implies, LifeLine Australia is the group people call when they need a helping hand. During the last leg of my tour of eastern Australia, I was lucky enough to spend two days working with Lifeline’s dedicated and talented clinicians on improving the retention and outcomes of the clinical services they offer.

The two-day conference was the kickoff for a “transformation project” whose stated goal, as Trevor Carlyon, executive director of Lifeline Community Care, points out in the video segment below, is “putting clients back at the center of care.” Nearly 200 clinicians working with a diverse clientele located throughout northern Queensland gathered for the event. I look forward to returning in the future as the ideas are implemented across services throughout the system.

 

Filed Under: Behavioral Health, CDOI, evidence-based practice, Feedback Informed Treatment - FIT, Implementation Tagged With: australia, lifeline community care, mental health

Where is Scott Miller going? The Continuing Evolution

November 16, 2009 By scottdm 2 Comments

I’ve just returned from a week in Denmark providing training for two important groups.  On Wednesday and Thursday, I worked with close to 100 mental health professionals presenting the latest information on “What Works” in Therapy at the Kulturkuset in downtown Copenhagen.  On Friday, I worked with a small group of select clinicians working on implementing feedback-informed treatment (FIT) in agencies around Denmark.  The day was organized by Toftemosegaard and held at the beautiful and comfortable Imperial Hotel.

In any event, while I was away, I received a letter from my colleague and friend, M. Duncan Stanton.  For many years, “Duke,” as he’s known, has been sending me press clippings and articles both helping me stay “up to date” and, on occasion, giving me a good laugh.  Enclosed in the envelope was the picture posted above, along with a post-it note asking me, “Are you going into a new business?!”

As readers of my blog know, while I’m not going into the hair-styling and spa business, there’s a grain of truth in Duke’s question.  My work is indeed evolving.  For most of the last decade, my writing, research, and training focused on factors common to all therapeutic approaches. The logic guiding these efforts was simple and straightforward. The proven effectiveness of psychotherapy, combined with the failure to find differences between competing approaches, meant that elements shared by all approaches accounted for the success of therapy (e.g., the therapeutic alliance, placebo/hope/expectancy, structure and techniques, extratherapeutic factors).  As first spelled out in Escape from Babel: Toward a Unifying Language for Psychotherapy Practice, the idea was that effectiveness could be enhanced by practitioners purposefully working to enhance the contribution of these pantheoretical ingredients.  Ultimately though, I realized the ideas my colleagues and I were proposing came dangerously close to a new model of therapy.  More importantly, there was (and is) no evidence that teaching clinicians a “common factors” perspective led to improved outcomes–which, by the way, had been my goal from the outset.

The measurable improvements in outcome and retention–following my introduction of the Outcome and Session Rating Scales to the work being done by me and my colleagues at the Institute for the Study of Therapeutic Change–provided the first clues to the coming evolution.  Something happened when formal feedback from consumers was provided to clinicians on an ongoing basis–something beyond either the common or specific factors–a process I believed held the potential for clarifying how therapists could improve their clinical knowledge and skills.  As I began exploring, I discovered an entire literature of which I’d previously been unaware; that is, the extensive research on experts and expert performance.  I wrote about our preliminary thoughts and findings together with my colleagues Mark Hubble and Barry Duncan in an article entitled, “Supershrinks” that appeared in the Psychotherapy Networker.

Since then, I’ve been fortunate to be joined by an internationally renowned group of researchers, educators, and clinicians in the formation of the International Center for Clinical Excellence (ICCE). Briefly, the ICCE is a web-based community where participants can connect, learn from, and share with each other. It has been specifically designed using the latest web 2.0 technology to help behavioral health practitioners reach their personal best. If you haven’t already done so, please visit the website at www.iccexcellence.com to register to become a member (it’s free, and you’ll be notified the minute the entire site is live)!

As I’ve said before, I am very excited by this opportunity to interact with behavioral health professionals all over the world in this way.  Stay tuned, after months of hard work and testing by the dedicated trainers, associates, and “top performers” of ICCE, the site is nearly ready to launch.

Filed Under: excellence, Feedback Informed Treatment - FIT, Top Performance Tagged With: denmark, icce, Institute for the Study of Therapeutic Change, international center for clinical excellence, istc, mental health, ors, outcome rating scale, psychotherapy, psychotherapy networker, session rating scale, srs, supershrinks, therapy

History doesn’t repeat itself, but it does rhyme

September 20, 2009 By scottdm 2 Comments

Mark Twain photo portrait (image via Wikipedia)

“History doesn’t repeat itself,” the celebrated American author Mark Twain once observed, “but it does rhyme.” There is no better example of Twain’s wry comment than recurring claims about specific therapeutic approaches. As any clinician knows, every year witnesses the introduction of new treatment models. Invariably, the developers and proponents claim superior effectiveness of the approach over existing treatments. In the last decade or so, such claims, and the publication of randomized clinical trials, have enabled some to assume the designation of an “evidence-based practice” or “empirically supported treatment.” Training, continuing education, funding, and policy changes follow.

Without exception, within a few short years, other research appears showing the once widely heralded “advance” to be no more effective than what existed at the time. Few notice, however, as professional attention is once again captured by a “newer” and “more improved” treatment model. Studies conducted by my colleagues and me (downloadable from the “scholarly publications” area of my website) document this pattern with treatments for kids, alcohol abuse and dependence, and PTSD over the last 30-plus years.

As folks who’ve attended my recent workshops know, I’ve been using DBT as an example of approaches that have garnered significant professional attention (and funding) despite a relatively small number of studies (and participants) and no evidence of differential effectiveness.  In any event, the American Journal of Psychiatry will soon publish, “A Randomized Trial of Dialectical Behavior Therapy versus General Psychiatric Management for Borderline Personality Disorder.”

As described by the authors, this study is “the largest clinical trial comparing dialectical behavior therapy and an active high-standard, coherent, and principled approach derived from APA guidelines and delivered by clinicians with expertise in treating borderline personality disorder.”

And what did these researchers find?

“Dialectical behavior therapy was not superior to general psychiatric management with both intent-to-treat and per-protocol analyses; the two were equally effective across a range of outcomes.”  Interested readers can request a copy of the paper from the lead investigator, Shelley McMain at: Shelley_McMain@camh.net.

Below, readers can also find a set of slides summarizing and critiquing the current research on DBT. In reviewing the slides, ask yourself, “how could an approach based on such a limited and narrow sample of clients, and with no evidence of differential effectiveness, achieve worldwide prominence?”

Of course, the results summarized here do not mean that there is nothing of value in the ideas and skills associated with DBT. Rather, they suggest that the field, including clinicians, researchers, and policy makers, needs to adopt a different approach when attempting to improve the process and outcome of behavioral health practices. Rather than continuously searching for the “specific treatment” for a “specific diagnosis,” research showing the general equivalence of competing therapeutic approaches indicates that emphasis needs to be placed on: (1) studying the factors shared by all approaches that account for success; and (2) developing methods for helping clinicians identify what works for individual clients. This is, in fact, the mission of the International Center for Clinical Excellence: identifying the empirical evidence most likely to lead to superior outcomes in behavioral health.

Dbt Handouts 2009 from Scott Miller

Filed Under: Behavioral Health, Dodo Verdict, Practice Based Evidence Tagged With: alcohol abuse, American Psychological Association, American Journal of Psychiatry, APA, behavioral health, CEU, continuing education, CPD, evidence based medicine, evidence based practice, mental health, psychiatry, PTSD, randomized control trial, Training

Practice-Based Evidence Goes Mainstream

September 5, 2009 By scottdm 4 Comments

For years, my colleagues and I have been using the phrase “practice-based evidence” to refer to clinicians’ use of real-time feedback to develop, guide, and evaluate behavioral health services.  Against a tidal wave of support from professional and regulatory bodies, we argued that “evidence-based practice”–the notion that certain treatments work best for certain diagnoses–was not supported by the evidence.

Along the way, I published, with my colleagues, several meta-analytic studies showing that all therapies work about equally well (click here to access recent studies on children, alcohol abuse and dependence, and post-traumatic stress disorder).  The challenge, it seemed to me, was not finding what worked for a particular disorder or diagnosis, but rather what worked for a particular individual–and that required ongoing monitoring and feedback.  In 2006, following years of controversy and wrangling, the American Psychological Association finally revised its official definition to be consistent with “practice-based evidence.”  You can read the definition in the May-June issue of the American Psychologist, volume 61, pages 271-285.

Now, a recent report on the Medscape Journal of Medicine channel provides further evidence that practice-based evidence is going mainstream.  I think you’ll find the commentary interesting, as it provides compelling evidence that an alternative to the dominant paradigm currently guiding professional discourse is taking hold.  Watch it here.

Filed Under: Behavioral Health, evidence-based practice, Practice Based Evidence Tagged With: behavioral health, conference, deliberate practice, evidence based medicine, evidence based practice, mental health, Therapist Effects

Superior Performance as a Psychotherapist: First Steps

April 1, 2009 By scottdm Leave a Comment

So what is the first step to improving your performance?  Simply put, knowing your baseline.  Whatever the endeavor, you have to keep score.  All great performers do.  As a result, performance in most fields has improved steadily over the last 100 years.

Consider, for instance, the Olympics.  Over the last century, the best performance in every event has improved–in some cases by 50%!  The gold-medal-winning time for the marathon at the 1896 Olympics was just one minute faster than the entry time currently required simply to participate in the Chicago and Boston marathons.

By contrast, the effectiveness of psychological therapies has not improved by a single percentage point over the last 30 years.  How, you may wonder, could that be?  During the same time period: (1) more than 10,000 how-to books on psychotherapy have been published; (2) the number of treatment approaches has mushroomed from 60 to 400; and (3) there are presently 145 officially approved, evidence-based, manualized treatments for 51 of the 397 possible DSM-IV diagnostic groups.  Certainly, given such “growth,” we therapists must be more effective with more people than ever before.  Unfortunately, however, instead of advancing, we’ve stagnated, mistaking our feverish pedaling for real progress in the Tour de Therapy.

Truth is, no one has been keeping score, least of all us individual practitioners.  True, volumes of research now prove beyond any doubt that psychotherapy works.  Relying on such evidence to substantiate the effectiveness of one’s own work, however, is a bit like Tiger Woods telling you the par for a particular hole rather than how many strokes it took him to sink the ball.  The result, research indicates, is that effectiveness rates plateau very early in most therapists’ careers while confidence levels continue to grow.

In one study, for example, when clinicians were asked to rate their job performance from A+ to F, fully two-thirds considered themselves A or better.  No one, not a single person in the lot, rated him- or herself as below average.  As researchers Sapyta, Riemer, and Bickman (2005) concluded, “most clinicians believe that they produce patient outcomes that are well above average” (p. 146).  In another study, Deirdre Hiatt and George Hargrave used peer and provider ratings, as well as a standardized outcome measure, to assess the success rates of therapists in a sample of mental health professionals.  As one would expect, providers varied significantly in their effectiveness.  What was disturbing is that the least effective therapists in the sample rated themselves on par with the most effective!

The reason for stagnant success rates in psychotherapy should be clear to all: why try to improve when you already think you’re the best or, barring that, at least above average?

Here again, expanding our search for excellence beyond the narrow field of psychotherapy to the subject of expertise and expert performance in general can provide some helpful insights.  In virtually every profession, from carpentry to police work, medicine to mathematics, average performers overestimate their abilities, confidently assigning themselves to the top tier.  Therapists are simply doing what everyone else does.  Alas, they are average among the average.

Our own work and research show that clinicians can break away from the crowd of average achievers by using a couple of simple, valid, and reliable tools for assessing outcome.  As hard as it may be to believe, the empirical evidence indicates that performance increases by between 65% and 300% (click here to read the studies).  Next time, I’ll review these simple tools as well as a few basic methods for determining exactly how effective you are.  Subscribe now so you’ll be the first to know.
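In the meantime, for readers who want a concrete sense of what “keeping score” can look like, here is a minimal back-of-the-envelope sketch in Python.  The scores below are invented for illustration, and the calculation–average change expressed in intake standard deviations–is just one simple way of summarizing a caseload, not a substitute for the tools and benchmarks I’ll be discussing:

    # A minimal sketch: summarizing a clinician's outcomes from
    # hypothetical intake and final-session scores on a 0-40 measure
    # such as the ORS. All numbers below are invented.
    intake_scores = [19, 22, 15, 25, 18, 21, 17, 23]
    final_scores  = [24, 30, 18, 31, 26, 25, 20, 29]

    n = len(intake_scores)
    mean_pre = sum(intake_scores) / n
    mean_post = sum(final_scores) / n

    # Sample standard deviation of the intake scores
    sd_pre = (sum((x - mean_pre) ** 2 for x in intake_scores) / (n - 1)) ** 0.5

    # Average change expressed in intake standard deviations
    # (a simple pre-post effect size)
    effect_size = (mean_post - mean_pre) / sd_pre

    print(f"Average change: {mean_post - mean_pre:.1f} points")
    print(f"Pre-post effect size: {effect_size:.2f}")

Tracked over time and compared against published norms, even a rough number like this gives you the baseline against which any genuine improvement can be measured.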

One more note, after posting last time, I heard from several readers who had difficulty subscribing. After doing some research, we learned that you must use IE 7 or Firefox 3.0.7 or later for the subscribe function to work properly.  Look forward to hearing from you!

In the meantime, the transcript below is of a recent interview I did for Shrinkrap radio.  It’s focused on our current work:

Supershrinks: An Interview with Scott Miller about What Clinicians can Learn from the Field’s Most Effective Practitioners from Scott Miller

 

Filed Under: Behavioral Health, excellence, Top Performance Tagged With: cdoi, evidence based practice, excellence, mental health, outcome measures, psychology, psychotherapy, srs, supershrinks

My New Year’s Resolution: The Study of Expertise

January 2, 2009 By scottdm Leave a Comment

Most of my career has been spent providing and studying psychotherapy.  Together with my colleagues at the Institute for the Study of Therapeutic Change, I’ve now published 8 books and many, many articles and scholarly papers.  If you are interested, you can read about and even download many of my publications here.

Like most clinicians, I spent the early part of my career focused on how to do therapy.  To me, the process was confusing and the prospect of sitting opposite a real, suffering client daunting.  I was determined to understand and be helpful, so I went to graduate school, read books, and attended literally hundreds of seminars.

Unfortunately, as detailed in my article Losing Faith, written with Mark Hubble, the “secret” to effective clinical practice always seemed to elude me.  Oh, I had ideas, and many of the people I worked with claimed our work together helped.  At the same time, doing the work never seemed as simple or effortless as professional books and trainings made it appear.

Each book and paper I’ve authored or co-authored over the last 20 years has been an attempt to mine the “mystery” of how psychotherapy actually works.  Along the way, my colleagues and I have paradoxically uncovered a great deal about what contributes little or nothing to treatment outcome!  Topping the list, of course, are treatment models.  In spite of the current emphasis on “evidence-based” practice, there is no evidence that using particular treatment models for specific diagnostic groups improves outcome.  It’s also hugely expensive!  Other factors that occupy a great deal of professional attention but ultimately make little or no difference include client age, gender, DSM diagnosis, and prior treatment history, as well as therapist age, gender, years of experience, professional discipline, degree, training, amount of supervision, personal therapy, licensure, and certification.

In short, we spend a great deal of time, effort, and money on matters that matter very little.

For the last 10 years, my work has focused on factors common to all therapeutic approaches.  The logic guiding these efforts was simple and straightforward.  The proven effectiveness of psychotherapy, combined with the failure to find differences between competing approaches, meant that elements shared by all approaches accounted for the success of therapy.  And make no mistake, treatment works.  The average person in treatment is better off than 80% of those with similar problems who do not get professional help–a figure that corresponds to an average treatment effect of roughly 0.8 standard deviations.

In The Heart and Soul of Change, my colleagues and I, joined by some of the field’s leading researchers, summarized what was known about the effective ingredients shared by all therapeutic approaches.  The factors included the therapeutic alliance, placebo/hope/expectancy, and structure and technique, in combination with a huge, hairy amount of unexplained “stuff” known as “extratherapeutic factors.”

Our argument, at the time, was that effectiveness could be improved by practitioners purposefully working to enhance the contribution of these pantheoretical ingredients.  At a minimum, we believed that working in this manner would help move professional practice beyond the schoolism that had long dominated the field.

Ultimately, though, we were coming dangerously close to simply proposing a new model of therapy–this one based on the common factors.  In any event, practitioners following the work treated our suggestions as such.  Instead of, say, “confronting dysfunctional thinking,” they understood us to be advocating for a “client-directed” or strength-based approach.  Discussion of particular “strategies” and “skills” for accomplishing these objectives did not lag far behind.  Additionally, while the common factors enjoyed overwhelming empirical support (especially as compared to so-called specific factors), their adoption as a guiding framework was de facto illogical.  Think about it: if the effectiveness of the various competing treatment approaches is due to a shared set of common factors, and yet all models work equally well, why would anyone need to learn about the common factors?

Since the publication of the first edition of The Heart and Soul of Change in 1999, I’ve struggled to move beyond this point.  I’m excited to report that in the last year our understanding of effective clinical practice has taken a dramatic leap forward.  All hype aside, we discovered the reason our previous efforts had failed: our research had been too narrow.  Simply put, we’d been focusing on therapy rather than on expertise and expert performance.  The path to excellence, we have learned, will never be found by limiting explorations to the world of psychotherapy, with its attendant theories, tools, and techniques.  Instead, attention needs to be directed to superior performance, regardless of calling or career.

A significant body of research shows that the strategies top performers use to achieve superior success are the same across a wide array of fields, including chess, medicine, sales, sports, computer programming, teaching, music, and therapy!  Not long ago, we published our initial findings from a study of thousands of top-performing clinicians in an article titled “Supershrinks.”  I must say, however, that we have just scratched the surface.  Using outcome measures to identify and track top-performing clinicians over time is enabling us, for the first time in the history of the profession, to “reverse engineer” expertise.  Instead of assuming that popular trainers (and the methods they promote) are effective, we are studying clinicians who have a proven track record.  The results are provocative and revolutionary, and will be reported first here on the Top Performance Blog!  So, stay tuned.  Indeed, why not subscribe?  That way, you’ll be among the first to know.

Filed Under: Behavioral Health, excellence, Top Performance Tagged With: behavioral health, cdoi, DSM, feedback informed treatment, mental health, ors, outcome measurement, psychotherapy, routine outcome measurement, srs, supervision, therapeutic alliance, therapy


