SCOTT D Miller - For the latest and greatest information on Feedback Informed Treatment

  • About
    • About Scott
    • Publications
  • Training and Consultation
  • Workshop Calendar
  • FIT Measures Licensing
  • FIT Software Tools
  • Online Store
  • Top Performance Blog
  • Contact Scott
info@scottdmiller.com 773.404.5130

Do Psychotherapists Improve with Time and Experience?

October 27, 2015 By scottdm 18 Comments

The practice known as “routine outcome measurement,” or ROM, is resulting in the publication of some of the biggest and most clinically relevant psychotherapy studies in history.  Freed from the limits of the randomized clinical trial, and its accompanying obsession with manuals and methods, researchers are finally able to examine what happens in real-world clinical practice.

A few weeks ago, I blogged about the largest study of psychotherapy ever published.  More than 1,400 therapists participated.  The progress of over 26,000 people (aged 16-95) treated over a 12-year period in primary care settings in the UK was tracked on an ongoing basis via ROM.  The results?  In an average of 8 visits, 60% of those treated by this diverse group of practitioners achieved both reliable and clinically significant change—results on par with tightly controlled RCTs.  The study is a stunning confirmation of the effectiveness of psychotherapy.
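For readers curious what “reliable and clinically significant change” actually means, the standard comes from the well-known Jacobson-Truax criteria: a client’s pre-post difference must exceed what measurement error alone could produce. As a rough sketch (the scale parameters below are illustrative, not figures from the study), the reliable-change piece can be computed like this:

```python
from math import sqrt

def reliable_change_index(pre, post, sd, reliability):
    """Jacobson-Truax Reliable Change Index: the pre-post difference
    divided by the standard error of the difference between two scores."""
    se_measurement = sd * sqrt(1 - reliability)   # standard error of measurement
    se_difference = sqrt(2) * se_measurement      # SE of a difference score
    return (post - pre) / se_difference

# Illustrative numbers only: a symptom scale with SD = 10 and
# test-retest reliability = 0.80, where lower scores mean less distress.
rci = reliable_change_index(pre=30, post=16, sd=10, reliability=0.80)
print(round(rci, 2))  # prints -2.21; |RCI| > 1.96 counts as reliable change
```

An |RCI| above 1.96 means the change is unlikely (p < .05) to be measurement noise; “clinically significant” additionally requires the final score to cross from the clinical into the functional range.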

This week, another mega-study was accepted for publication in the Journal of Counseling Psychology.  Once more, ROM was involved.  In this one, researchers Goldberg, Rousmaniere, Miller, Whipple, Nielsen, Hoyt, and Wampold examined a large, naturalistic data set that included outcomes of 6,500 clients treated by 170 practitioners whose results had been tracked for an average of 5 years.

Their question?

Do therapists become more effective with time and experience?

Their answer?  No.

For readers of this blog, such findings will not be particularly newsworthy.  As I’ve frequently pointed out, experience has never proven to be a significant predictor of effectiveness.

What might be a bit surprising is that the study found clinicians’ outcomes actually worsened with time and experience.  That’s right.  On average, the longer a therapist practiced, the less effective they became!  Importantly, this finding remained even when controlling for several patient-level, caseload-level, and therapist-level characteristics, as well as when excluding several types of outliers.

Such findings are noteworthy for a number of reasons but chiefly because they contrast sharply with results from other, equally large studies documenting that therapists see themselves as continuously developing in both knowledge and ability over the course of their careers.  To be sure, the drop in performance reported by Goldberg and colleagues wasn’t steep.  Rather, the pattern was a slow, inexorable decline from year to year.

Where, one can wonder, does the disconnect come from?  How can therapists’ assessments of themselves and their work be so at odds with the facts?  Especially considering that, in the study by Goldberg and colleagues, participating clinicians had ongoing access to data regarding their effectiveness (or lack thereof) on a real-time basis!  Even in the study I blogged about previously—the largest in history, in which the outcomes of psychotherapy were shown to be quite positive—a staggering 40% of people treated experienced little or no change whatsoever.  How can such findings be reconciled with others indicating that clinicians routinely overestimate their effectiveness by 65%?

Turns out, the boundary between “belief in the process” and “denial of reality” is remarkably fuzzy.  Hope is a significant contributor to outcome—accounting for as much as 30% of the variance in results.  At the same time, it becomes toxic when actual outcomes are distorted in a manner that causes practitioners to miss important opportunities to grow and develop—not to mention help more clients.  Recall studies documenting that top performing therapists evince more of what researchers term “professional self-doubt.”  Said another way, they are less likely to see progress where none exists and more likely to value outcomes over therapeutic process.

What’s more, unlike their more average counterparts, highly effective practitioners actually become more effective with time and experience.  In the article below, my colleagues and I at the International Center for Clinical Excellence identify several evidence-based steps any practitioner can follow to achieve similar results.

Let me know your thoughts.

Until next time,

Scott

Scott D. Miller, Ph.D.
Registration is now open for our March Intensives in Chicago.  Join colleagues from around the world for the FIT Advanced and the FIT Supervision workshops.

Do therapists improve (preprint)
The Outcome of Psychotherapy: Yesterday, Today, and Tomorrow (Miller, Hubble, Chow, & Seidel, Psychotherapy, 2013)

 

Filed Under: Behavioral Health, evidence-based practice, Feedback, Feedback Informed Treatment - FIT, FIT, Top Performance Tagged With: excellence, outcome rating scale, psychotherapy

Public Attitudes Toward Mental Health Services: A Change for the Worse

July 3, 2014 By scottdm 1 Comment

Here it is

The results are not encouraging.  A recent meta-analysis found that public attitudes toward psychotherapy have become progressively more negative over the last 40 years.  The impact on practitioners is staggering.  Between 1997 and 2007, use of psychotherapy declined by 35%.  Not surprisingly, clinicians’ incomes also suffered, dropping 15-20% over the last decade.

So, if not psychotherapy, what do consumers of mental health services really want?

Well, if you trust the study I’ve cited, the answer seems clear: drugs.  During the same time period that talking fell out of favor, use of pharmaceuticals increased a whopping 75%!  Some blame society’s short attention span and desire for a “quick fix.”  Such an argument hardly seems credible, however, given that psychotherapy works to alleviate distress as fast or faster than most psychotropics.

Others, including the authors of the meta-analysis, blame public education campaigns and pharmacological marketing aimed at “convincing the public that mental disorders have a neurobiological etiology that require biological treatments” (p. 103).  At first glance, this idea is compelling.  After all, every year, the pharmaceutical industry spends $5 billion on direct-to-consumer advertising.

And yet, what is it the drug companies are really selling in those ads?  In one of the most well-known TV commercials for a popular antidepressant, less than 7 seconds is spent on the supposed neurobiological cause.  Instead, the majority of the time is spent depicting the positive results one can expect from the product.   It’s marketing 101: focus on the benefits not the features of whatever you’re selling.

What do consumers want?  The answer is: results.  Your training, degree, certification, and treatment approach are irrelevant, mere features most consumers couldn’t care less about.  Your rate of effectiveness is another matter entirely–it’s the benefit people are looking for from working with you.

So, how effective are you?  Do you know?  Not a guess or a hunch, but the actual number of people you treat who are measurably improved?  If not, it’s easy to get started.  Begin by downloading two simple, free, SAMHSA-approved scales for measuring the progress and quality of mental health services.  Next, visit www.whatispcoms.com to learn how individual practitioners and agencies can use these tools to monitor and improve outcome and retention in treatment, as well as communicate results effectively to consumers.

To see how outcomes attract consumers, just take a look at the Colorado Center for Clinical Excellence website.   This Denver-based group of practitioners is a model for the future of clinical practice.

Filed Under: Behavioral Health Tagged With: antidepressants, Colorado Center for Clinical Excellence, drugs, meta-analysis, ors, outcome rating scale, pharmacological, psychotherapy, SAMHSA, session rating scale, srs

Is your therapy making your clients worse? The Guardian Strikes Again

June 12, 2014 By scottdm 1 Comment


Last week, an article appeared in The Guardian, one of the U.K.’s largest daily newspapers.  “Counselling and Therapy can be Harmful,” the headline boldly asserted, citing results of a study yet to be published.  It certainly got my attention.

Do some people in therapy get worse?  The answer is, most assuredly, “Yes.”  Research dating back several decades puts the figure at about 10% (Lambert, 2010).  Said another way, at termination, roughly one out of ten people are functioning more poorly than they were at the beginning of treatment.

The cause?  Here’s what we know.  Despite claims to the contrary (e.g., Lilienfeld, 2007), no psychotherapy approach tested in a clinical trial has ever been shown to reliably lead to or increase the chances of deterioration.  NONE.  Scary stories about dangerous psychological treatments are limited to a handful of fringe therapies–approaches that have never been vetted scientifically and which all but a few practitioners avoid.

So, if it’s not about the method, then how to account for deterioration?  As the article points out, “some therapists had a lot more clients [who] deteriorated than others.”  And yet, while that statement is true–lots of prior research shows that some do more harm than others–there are too few such clinicians to account for the total number of clients who worsen.  Moreover, beyond that 10%, between 30 and 50% of people in treatment experience no benefit whatsoever!

Here is where the old adage, “an ounce of prevention is worth a pound of cure,” applies.  Whatever the cause, lack of progress and risk of deterioration are issues for all clinicians.  A growing body of research makes clear, the key to addressing the problem is tracking the progress of clients from visit to visit so that those not improving, or getting worse, can be identified and offered alternatives.
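For readers who like the logic spelled out, here is a minimal sketch of what visit-to-visit flagging might look like in code. The clinical cutoff (25) and reliable-change threshold (5 points) are illustrative values for a 0-40 scale like the ORS, assumptions made for this sketch rather than figures stated in the research cited above:

```python
# Illustrative thresholds for a 0-40 outcome scale (higher = better).
CLINICAL_CUTOFF = 25   # scores at/above this suggest non-clinical functioning
RELIABLE_CHANGE = 5    # minimum gain treated as more than measurement noise

def flag_at_risk(scores):
    """Return True when a client's latest score shows neither reliable
    improvement over intake nor movement past the clinical cutoff."""
    if len(scores) < 2:
        return False  # only one data point: too early to judge
    intake, latest = scores[0], scores[-1]
    improved_reliably = (latest - intake) >= RELIABLE_CHANGE
    crossed_cutoff = intake < CLINICAL_CUTOFF <= latest
    return not (improved_reliably or crossed_cutoff)

print(flag_at_risk([18, 19, 17]))  # True  -> flat trajectory, discuss alternatives
print(flag_at_risk([18, 22, 26]))  # False -> reliable, clinically meaningful gain
```

Real feedback systems (and the research behind them) use session-by-session expected-change trajectories rather than a single intake-to-latest comparison, but the core idea is the same: make lack of progress visible early enough to do something about it.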

It’s not hard to get started.  You can learn a simple, evidence-based method for tracking progress and the quality of the relationship at: www.whatispcoms.com.  Best of all, practitioners can access the tools for free!

After that, join fellow practitioners from the US, Canada, Europe, and Australia for one of our intensive trainings coming up this August in Chicago.  I promise you’ll leave prepared to address the issue of deterioration directly and successfully.

Filed Under: Feedback Informed Treatment - FIT Tagged With: clinical trial, counselling, lilienfeld, michael lambert, psychotherapy, the guardian, therapy, Training, whatispcoms

Dumb and Dumber: Research and the Media

April 2, 2014 By scottdm 1 Comment


“Just when I thought you couldn’t get any dumber, you go and do something like this… and totally redeem yourself!”
– Harry in Dumb & Dumber

On January 25th, my inbox began filling with emails from friends and fellow researchers around the globe.  “Have you seen the article in the Guardian?” they asked.  “What do you make of it?” others inquired, “Have you read the study the authors are talking about?  Is it true?!”  A few of the messages were snarkier, even gloating,  “Scott, research has finally proven the Dodo verdict is wrong!”

The article the emails referred to was titled, Are all psychological therapies equally effective?  Don’t ask the dodo.  The subtitle boldly announced, “The claim that all forms of psychotherapy are winners has been dealt a blow.”

Honestly, my first thought on reading the headline was, “Why is an obscure topic like the ‘Dodo verdict’ the subject of an article in a major newspaper?”  Who in their right mind–outside of researchers and a small cadre of psychotherapists–would care?  What possible interest would a lengthy dissertation on the subject–including references to psychologist Saul Rosenzweig (who first coined the expression in the 1930s) and researcher allegiance effects–hold for the average Joe or Jane reader of The Guardian?  At a minimum, it struck me as odd.

And odd it stayed, until I glanced down to see who had written the piece.  The authors were psychologist Daniel Freeman–a strong proponent of empirically supported treatments–and his journalist brother, Jason.


Briefly, advocates of ESTs hold that certain therapies are better than others in the treatment of specific disorders.  Lists of such treatments are created–for example, the NICE Guidelines–dictating which of the therapies are deemed “best.”  Far from innocuous, such lists are, in turn, used to direct public policy, including both the types of treatment offered and the reimbursement given.

Interestingly, in the article, Freeman and Freeman base their conclusion that “the dodo was wrong” on a single study.  Sure enough, that one study, comparing CBT to psychoanalysis, found that CBT resulted in superior effects in the treatment of bulimia.  No other studies were mentioned to bolster this bold claim–an assertion that would effectively overturn nearly 50 years of robust research findings documenting no difference in outcome among competing treatment approaches.

In contrast to what is popularly believed, extraordinary findings from single studies are fairly common in science.  As a result, scientists have learned to require replication by multiple investigators working in different settings.

The media, though, are another story.  They love such studies.  The controversy generates interest, capturing readers’ attention.  Remember cold fusion?  In 1989, researchers Stanley Pons and Martin Fleischmann–then two of the world’s leading electrochemists–claimed that they had produced a nuclear reaction at room temperature–a finding that would, if true, not only overturn decades of prior research and theory but, more importantly, revolutionize energy production.

The media went nuts.  TV and print couldn’t get enough of it.  The hope for a cheap, clean, and abundant source of energy was simply too much to ignore.  The only problem was that, in the time that followed, no one could replicate Pons and Fleischmann’s results.  No one.  While the media ran off in search of other, more tantalizing findings to report, cold fusion quietly disappeared, becoming a footnote in history.

Back to The Guardian.  Curiously, Freeman and Freeman did not mention the publication of another, truly massive study published in Clinical Psychology Review—a study available in print at the time their article appeared.  In it, the researchers used the statistically rigorous method of meta-analysis to review results from 53 studies of psychological treatments for eating disorders.  Fifty-three!  Their finding?  Confirming mountains of prior evidence: no difference in effect between competing therapeutic approaches.  NONE!

Obviously, however, such results are not likely to attract much attention.


Sadly, the same day that the article appeared in The Guardian, John R. Huizenga passed away.  Huizenga is perhaps best known as one of the physicists who helped build the atomic bomb.  Importantly, however, he was also among the first to debunk the claims about cold fusion made by Pons and Fleischmann.  His real-world experience, and decades of research, made clear that the reports were a case of dumb (cold fusion) being followed by dumber (media reports about cold fusion).

“How ironic this stalwart of science died on this day,” I thought, “and how inspiring his example is of ‘good science.'”

I spent the rest of the day replying to my emails, including the link to the study in Clinical Psychology Review (Smart).  “Don’t believe the hype,” I advised, “stick to the data” (and smarter)!

Filed Under: Practice Based Evidence Tagged With: CBT, Clinical Psychology Review, Daniel Freeman, dodo verdict, eating disorder, Jason Freeman, Martin Fleischmann, meta-analysis, NICE, psychoanalysis, psychotherapist, psychotherapy, research, Saul Rosenzweig, Stanley Pons, the guardian

Are you any good as a therapist? The Legacy of Paul W. Clement

March 26, 2014 By scottdm 4 Comments

Paul Clement

Twenty years ago, I came across an article published in the journal Professional Psychology.  It was written by a psychologist in private practice, Paul Clement.  The piece caught my eye for a number of reasons.  First, although we’d never met, Paul lived and worked in a town near my childhood home: Pasadena, California.  Second, the question he opened his article with was provocative, to say the least: “Are you any good?”  In other words, how effective are YOU as a psychotherapist?  Third, and most important, he had compiled and was reporting a quantitative analysis of his results over his 26 years as a practicing clinician.  It was both riveting and stunning.  No one I knew had ever published anything similar before.

In graduate school, I’d learned to administer a variety of tests (achievement, vocational, personality, projective, IQ, etc.).  Not once, however, did I attend a course or sit in a lecture about how to measure my results.   I was forced to wonder, “How could that be?”  Six years in graduate school and not a word about evaluating one’s outcomes.  After all, if we don’t know how effective we are, how are any of us supposed to improve?

What was the reason for the absence of measurement, evaluation, and analysis?   It certainly wasn’t because psychotherapy wasn’t effective.  A massive amount of research existed documenting the effectiveness of treatment.  Paul’s research confirmed these results.  Of those he’d worked with, 75% were improved at termination.  Moreover, such results were obtained in a relatively brief period of time, the median number of sessions used being 12.

Other results he reported were not so easy to accept.  In short, Paul’s analysis showed that his outcomes had not improved over the course of his career.   At the conclusion of the piece, he observed, “I had expected to find that I had gotten better and better over the years, but my data failed to suggest any systematic change in my therapeutic effectiveness across the 26 years in question…it was a bad surprise for me.” (p. 175).

For years, I carried the article with me in my briefcase, hoping that one day, I might better understand his findings.   Maybe, I thought, Clement was simply an outlier?  Surely, we get better with experience.  It was hard for me to believe I hadn’t improved since my first, ham-handed sessions with clients.  Then again, I didn’t really know.  I wasn’t measuring my results in any meaningful way.

The rest is history.  Within a few short years, I was routinely monitoring the outcome and alliance at every session I did with clients.  Thanks to my undergraduate professor, Michael Lambert, Ph.D., I began using the OQ-45 to assess outcomes.  Another mentor, Dr. Lynn Johnson, had developed a 10-item scale for evaluating the quality of the therapeutic relationship, known as the Session Rating Scale.  Both tools became an integral part of the way I worked.  Eventually, a suggestion by Haim Omer, Ph.D., led me to consider creating shorter, less time-consuming visual analogue versions of both measures.  In time, working with colleagues, I developed and tested the ORS and SRS.  Throughout this process, Paul Clement and his original study remained an important, motivating force.

Just over a year ago, Paul sent me an article evaluating 40 years of his work as a psychotherapist.   Once again, I was inspired by his bold, brave, and utterly transparent example.  Not only had his outcomes not improved, he reported, they’d actually deteriorated!  Leave it to him to point the way!   As readers of this blog know, our group is busy at work researching what it takes to forestall such deterioration and improve effectiveness.  Last year, we summarized our findings in the 50th Anniversary issue of Psychotherapy.  As I write, we are preparing a more detailed report for publication in the same journal.

Yesterday, I was drafting an email, responding to one I’d recently received from him, when I learned Paul had died.  I will miss him.  In this, I know I’m not alone.

Filed Under: Top Performance Tagged With: clinician, Haim Omer, Lynn Johnson, Michael Lambert, OQ45, ors, outcome rating scale, Paul Clement, popular psychology, practice-based evidence, psychotherapy, session rating scale, srs, top performance

Good News and Bad News about Psychotherapy

March 25, 2014 By scottdm 3 Comments


Have you seen this month’s issue of The National Psychologist?  If you do counseling or psychotherapy, you should read it.  The headline screams, “Therapy: No Improvement for 40 Years.”  And while I did not know the article would be published, I was surprised by neither the title nor its contents.  The author and associate editor, John Thomas, was summarizing the invited address I gave at the recent Evolution of Psychotherapy conference.

Fortunately, it’s not all bad news.  True, the outcomes of psychotherapy have not been improving.  Neither is there much evidence that clinicians become more effective with age and experience.  That said, we can get better.  Results from studies of top performing clinicians point the way.  I also reviewed this exciting research in my presentation.

Even if you didn’t attend the conference, you can see it here thanks to the generosity of the Milton H. Erickson Foundation.  Take a look at the article and video, then drop me a line and let me know what you think.  To learn more, you can access a variety of articles for free in the scholarly publications section of the website.

Click here to access the article from the National Psychologist about Scott Miller’s speech at the Evolution of Psychotherapy Conference in Anaheim, California (US) 

Filed Under: Top Performance Tagged With: accountability, Alliance, counselling, deliberate practice, erickson, evidence based practice, Evolution of Psychotherapy, feedback, healthcare, john thomas, psychotherapy, The National Psychologist, therapy

Do you do psychotherapy?

September 26, 2013 By scottdm 1 Comment

You know psychotherapy works. Forty years of research evidence backs up your faith in the process. And yet, fewer and fewer people are seeking out the services of professionals. Between 1998 and 2007, psychotherapy use decreased by 35%. People still sought help, they just went elsewhere to get it. For instance, use of psychotropic drugs is up 40% over the last decade.

A recent article in Popular Science traced the decline and outlined 3 provocative steps for saving the field. If you provide psychotherapy, it’s worth a read. The article is dead serious when recommending:

1. It’s time to GO BIG;

2. Getting a cute commercial; and

3. Dropping the biology jargon.

You’ve got to admit that the field’s fascination with biology is curious. A mountain of evidence points instead to the relationship between the provider and recipient of care. Other research shows that psychotherapy promotes more lasting change, at less cost and with fewer side effects than medication.

How to get the message out?

Many people and organizations are making a valiant effort.  Ryan Howes almost single-handedly established September 25 as National Psychotherapy Day.  The American Psychological Association published a rare, formal resolution on the efficacy of psychotherapy.

Frankly though, the best commercial for psychotherapy is our results. Consider the approach taken by the Colorado Center for Clinical Excellence. They don’t merely cite studies supporting psychotherapy in general, they report their actual results!

You can begin doing the same by downloading two free, simple to use measures here.

Then, learn how to use the scales to determine your effectiveness at an upcoming Feedback Informed Treatment Intensive (FIT) training.

There, you’ll also learn how to use the data to improve both the quality and outcome of your services. That’s why the Substance Abuse and Mental Health Services Administration (SAMHSA) recently listed FIT on the National Registry of Evidence Based Programs and Practices!

So, now is the time to GO BIG by joining us.  The next training is coming up in March!  Register now at: http://ai2014.eventbrite.ie/.


Filed Under: behavioral health, Conferences and Training, Feedback Informed Treatment - FIT Tagged With: American Psychological Association, NREPP, Popular Science, psychotherapy, SAMHSA

The Revolution in Swedish Mental Health Services: UPDATE on the CBT Monopoly

April 5, 2013 By scottdm Leave a Comment

No blog post I’ve ever published has received as much attention as the one on May 13th, 2012 detailing changes to Swedish mental health practice.  At the time, I reported on research results showing that the massive investment of resources in training therapists in CBT had not translated into improved outcomes or efficiency in the treatment of people with depression and anxiety.  In short, the public experiment of limiting training and treatment to so-called “evidence-based methods” had failed to produce tangible results.  The findings generated publications in Swedish journals, as well as commentary in Swedish newspapers and on the radio.

I promised to keep people updated if and when research became available in languages other than Swedish.  This week, the journal Psychotherapy published an article comparing the outcomes of three different treatment approaches: CBT, psychodynamic, and integrative-eclectic psychotherapy.  The results, gathered at 13 outpatient clinics over a three-year period, showed that psychotherapy was remarkably effective regardless of the type of treatment offered!

Read the study yourself and then ask: when will a simpler, less expensive, and more client-centered approach to ensuring effective and efficient behavioral health services be adopted?  Routinely seeking feedback from consumers regarding the process and outcome of care provides such an alternative.  The failure to find evidence that adopting specific models for specific disorders improves outcomes indicates the time has come.  You can learn more about feedback-informed treatment (FIT), a practice recently designated “evidence-based” by the Substance Abuse and Mental Health Services Administration (SAMHSA), by visiting the International Center for Clinical Excellence web-based community or attending an upcoming training with me in Chicago or on the road.

  • Learn more about what is going on in Sweden by reading:

Everyday Evidence: Outcomes of Psychotherapies in Swedish Public Health Services (Werbart et al., Psychotherapy, 2013)

  • Here’s one additional reference for those of you who read Swedish.  It’s the official summary of the results from the study that started this entire thread:
Delrapport ii slutversion

Filed Under: Practice Based Evidence Tagged With: CBT, evidence based practice, ors, outcome rating scale, psychotherapy, session rating scale, srs, sweden

Believing is Seeing: How Wishing Makes Things So

January 3, 2013 By scottdm Leave a Comment

Yesterday evening, my family and I were watching a bit of TV.  My son, Michael, commented on all the ads for nutritional supplements, juicing machines, and weight loss programs and devices.  “Oh yeah,” I thought, then explained to him, “It’s the start of a new year.”  Following “spending more time with family,” available evidence shows exercise and weight loss top the bill of resolutions.  Other research shows that a whopping 80% eventually break these well-intentioned commitments.  Fully a third won’t even make it to the end of the month!  Most attribute the failure to being too busy, others to a lack of motivation.  Whatever the cause, it’s clear that, when it comes to change, hope and belief will only take you so far.

What can help?  More on that in a moment.

In the meantime, consider a recent study on the role of hope and belief in research on psychotherapy.  Beginning in the 1970’s, study after study, and studies of studies, have found a substantial association between the effectiveness of particular treatment models and the beliefs of the researchers who conduct the specific investigations.  In the literature, the findings are referred to under the generic label, “research allegiance” or R.A.  Basically, psychotherapy outcome researchers tend to find in favor of the approach they champion, believe in, and have an affinity towards.  Unlike New Year’s resolutions, it seems, the impact of hope and belief in psychotherapy outcome research is not limited; indeed, it carries investigators all the way to success–albeit a result that is completely “in the eye of the beholder.”  That is, if one believes the research.  Some don’t.

Hang with me now as I review the controversy about this finding.  As robust as the results on researcher allegiance appear, an argument can be made that the phenomenon is a reflection rather than a cause of differences in treatment effectiveness.  The argument goes: researcher allegiance is produced by the very thing it is said to explain, namely, real differences in outcome between approaches.  In short, researchers’ beliefs do not cause the effects so much as the superior effects of the methods cause researchers to believe.  Makes sense, right?  And there the matter has largely languished, unresolved, for decades.

That is, until recently.  Turns out, believing is seeing.  Using a sample of studies in which treatments with equivalent efficacy were directly compared within the same study, researchers Munder, Fluckiger, Gerger, Wampold, and Barth (2012) found that a researcher’s allegiance to a particular method systematically biases their results in favor of their chosen approach.  The specific methods included in this study were all treatments designated as “trauma-focused” and deemed “equally effective” by panels of experts such as the U.K.’s National Institute for Clinical Excellence.  Since the trauma-focused approaches are equivalent in outcome, researcher allegiance should not have been predictive of outcome.  Yet, it was–accounting for an incredible 12% of the variance.  When it comes to psychotherapy outcome research, wishing makes it so.
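To put “12% of the variance” in perspective: variance explained is the square of the correlation coefficient, so an r-squared of 0.12 corresponds to an allegiance-outcome correlation of roughly 0.35, a medium-sized effect by Cohen’s conventional benchmarks. The arithmetic:

```python
from math import sqrt

# r^2 = proportion of variance explained. Here, researcher allegiance
# accounted for 12% of the variance in reported outcomes.
r_squared = 0.12
r = sqrt(r_squared)
print(round(r, 2))  # prints 0.35
```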

What’s the “take-away” for practitioners?  Belief is powerful stuff: it can either help you see possibilities or blind you to important realities.  Moreover, you cannot check your beliefs at the door of the consulting room, nor would you want to.  Every day, therapists encourage people to take the first steps toward a happier, more meaningful life by rekindling hope.  However, if researchers, bound by adherence to protocol and subject to peer review, can be fooled, so can therapists.  The potentially significant consequences of unchecked belief become apparent when one considers a recently published study by Walfish et al. (2012), which found that therapists on average overestimate their effectiveness by 65%.

When it comes to keeping New Year’s resolutions, experts recommend avoiding broad promises and grand commitments, and instead advise setting small, concrete, measurable objectives.  Belief, it seems, is most helpful when its aims are clear and its effects routinely verified.  One simple way to implement this sage counsel in psychotherapy is to routinely solicit feedback from consumers about the process and outcome of the services offered.  Doing so, research clearly shows, improves both retention and effectiveness.

You can get two simple, easy-to-use scales for free by registering at: http://scottdmiller.com/srs-ors-license/.  A worldwide community of behavioral health professionals is available to support your efforts at: www.centerforclinicalexcellence.com.

You can also join us in Chicago for four days of intensive training.  We promise to challenge your beliefs and provide you with the skills and tools necessary for pushing your clinical performance to the next level of effectiveness.

Filed Under: Feedback Informed Treatment - FIT Tagged With: NICE, ors, outcome rating scale, psychotherapy, session rating scale, srs, wampold

Psychotherapy Training: Is it Worth the Bother?

October 29, 2012 By scottdm 2 Comments

Big bucks.  That's what training in psychotherapy costs.  Take graduate school in psychology as an example.  According to the US Department of Education's National Center for Education Statistics (NCES), a typical doctoral program takes five years to complete and costs between US$240,000 and US$300,000.

Who has that kind of money lying around after completing four years of college?  The solution?  Why, borrow the money, of course!  And students do.  In 2009, the average amount of debt of those doctoral students in psychology who borrowed was a whopping US$88,000–an amount nearly double that of the prior decade.  Well, the training must be pretty darn good to warrant such expenditures–especially when one considers that entry-level salaries are on the decline and not terribly high to start!

Oh well, so much for high hopes.

Here are the facts, as recounted in a recent, concisely written summary of the evidence by John Malouff:

1. Studies comparing treatments delivered by professionals and paraprofessionals either show that paraprofessionals have better outcomes or that there is no difference between the two groups;

2. There is virtually no evidence that supervision of students by professionals leads to better client outcomes (you should have guessed this after reading the first point);

3. There is no evidence that required coursework in graduate programs leads to better client outcomes.

If you are hoping that post-doctoral experience will make up for the shortcomings of professional training, well, keep hoping.  In truth, professional experience does not correlate significantly or consistently with client therapy outcomes.

What can you do?  As Malouff points out, "For accrediting agencies to operate in the realm of principles of evidence-based practice, they must produce evidence…and this evidence needs to show that…training…contribute(s) to psychotherapy outcomes…[and] has positive benefits for future clients of the students" (p. 31).

In my workshops, I often advise therapists to forgo additional training until they determine just how effective they are right now.  Doing otherwise risks perceiving progress where, in fact, none exists.  What golfer would buy new clubs or pursue expensive lessons without first knowing their current handicap?  How will you know whether the training you attend is "worth the bother" if you can't accurately measure its impact on your performance?

Determining one’s baseline rate of effectiveness is not as hard as it might seem.  Simply download the Outcome Rating Scale and begin using it with your clients.  It’s free.  You can then aggregate and analyze the data yourself or use one of the existing web-based systems (www.fit-outcomes.com or www.myoutcomes.com) to get data regarding your effectiveness in real time.
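To make the idea concrete, here is a minimal sketch of the kind of aggregation such systems perform, using the Jacobson-Truax reliable change index.  All of the numbers below (client scores, scale reliability, standard deviation, clinical cutoff) are illustrative assumptions for a 0-40 scale like the ORS, not published norms:

```python
import math

def reliable_change_index(pre, post, sd_baseline, reliability):
    """Jacobson-Truax reliable change index for a single client.

    An index above 1.96 means the pre-post change is unlikely
    to be explained by measurement error alone.
    """
    se_diff = sd_baseline * math.sqrt(2) * math.sqrt(1 - reliability)
    return (post - pre) / se_diff

# Hypothetical (pre, post) scores on a 0-40 outcome scale.
clients = [(18, 29), (22, 24), (15, 33), (27, 26)]

# Assumed psychometric values -- illustrative only.
SD, RELIABILITY, CUTOFF = 8.0, 0.85, 25

improved = 0
for pre, post in clients:
    rci = reliable_change_index(pre, post, SD, RELIABILITY)
    # "Clinically significant": started below the cutoff, ended at or above it.
    crossed_cutoff = pre < CUTOFF <= post
    if rci > 1.96 and crossed_cutoff:
        improved += 1

print(f"{improved}/{len(clients)} clients achieved reliable, "
      f"clinically significant change")
```

Web-based systems like those mentioned above do this bookkeeping automatically and in real time; the point of the sketch is simply that the arithmetic behind knowing your baseline is not complicated.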

After that, join your colleagues at the upcoming Advanced Intensive Training in Feedback Informed Treatment.  This is an "evidence-based" training event.  You will learn:

• How to use outcome management tools (e.g., the ORS) to inform and improve the treatment services you provide;

• Specific skills for determining your overall clinical success rate;

• How to develop an individualized, evidence-based professional development plan for improving your outcome and retention rate.

There’s a special “early bird” rate available for a few more weeks.  Last year, the event filled up several months ahead of time, so don’t wait.

On another note, I just received the schedule for the 2013 Evolution of Psychotherapy conference.  I'm very excited to have been invited once again to this prestigious event and will be bringing the latest information and research on achieving excellence as a behavioral health practitioner.  On that note, the German artist and psychologist Andreas Steiner has created a really cool poster and card game for the event, featuring all of the various presenters.  Here's the poster.  Next to it is the "Three of Hearts."  I'm pictured there with two of my colleagues, mentors, and friends, Michael Yapko and Stephen Gilligan:

Filed Under: Conferences and Training, Feedback Informed Treatment - FIT, Top Performance Tagged With: Andreas Steiner, evidence based medicine, evidence based practice, Evolution of Psychotherapy conference, john malouff, Michael Yapko, ors, outcome management, outcome measurement, outcome rating scale, paraprofessionals, psychology, psychotherapy, session rating scale, srs, Stephen Gilligan, therapy, Training, US Department of Education's National Center (NCES)

Is the "Summer of Love" Over? Positive Publication Bias Plagues Pharmaceutical Research

March 27, 2012 By scottdm Leave a Comment


Evidence-based practice is only as good as the available "evidence"–and on this subject, research points to a continuing problem with both the methodology and type of studies that make it into the professional literature.  Last week, PLoS Medicine, a peer-reviewed, open-access journal of the Public Library of Science, published a study showing a positive publication bias in research on so-called atypical antipsychotic drugs.  In comparing articles appearing in journals to the FDA database, researchers found that almost all positive studies were published, while clinical trials with negative or questionable results either were not or–and get this–were published as having positive results!

Not long ago, similar yet stronger results appeared in the same journal on anti-depressants.  Again, in a comparison with the FDA registry, researchers found that all positive studies were published, while clinical trials with negative or questionable results either were not or–and get this–were published as having positive results!  The problem is far from insignificant.  Indeed, a staggering 46% of studies with negative results were either not published, or published but reported as positive.

Maybe the “summer of love” is finally over for the field and broader American public.  Today’s Chicago Tribune has a story by Kate Kelland and Ben Hirschler reporting data about sagging sales of anti-depressants and multiple failures to bring new, “more effective” drug therapies to market.  Taken together, robust placebo effects, the FDA mandate to list all trials (positive and negative), and an emphasis in research on conducting fair comparisons (e.g., comparing any new “products” to existing ones) make claims about “new and improved” effectiveness challenging.

Still, one sees ads on TV making claims about the biological basis of depression–the so-called "biochemical imbalance."  Perhaps this explains why a recent study of Medicaid clients found that the costs of treating depression rose by 30% over the last decade while outcomes did not improve at all during the same period.  The cause for the rise in costs?  Increased use of psychiatric drugs–in particular, anti-psychotics in cases of depression.

“It’s a great time for brain science, but at the same time a poor time for drug discovery for brain disorders,” says David Nutt, professor of neuropsychopharmacology, cited in the Chicago Tribune, “That’s an amazing paradox which we need to do something about.”

Here's an idea: how about not assuming that problems in living are reducible to brain chemistry?  That the direction of causality for much of what ails people is not brain to behavior but perhaps behavior to brain?  On this note, it is sad to report that while the percentage of clients prescribed drugs rose from 81% to 87%–with no improvement in effect–the number of those receiving psychotherapy dropped from 57% to 38%.

Here’s what we know about psychotherapy: it works and it has a far less troublesome side effect profile than psychotropic drugs.  No warnings needed for dry mouth, dizziness, blood and liver problems, or sexual dysfunction.  The time has come to get over the collective 1960’s delusion of better living through chemistry.

Filed Under: Practice Based Evidence Tagged With: behavioral health, continuing education, depression, evidence based practice, icce, Medicaid, mental health, psychotherapy

Are Mental Health Practitioners Afraid of Research and Statistics?

September 30, 2011 By scottdm Leave a Comment

A few weeks back I received an email from Dr. Kevin Carroll, a marriage and family therapist in Iowa.  Attached were the findings from his doctoral dissertation.  The subject was near and dear to my heart: the measurement of outcome in routine clinical practice.  The findings were inspiring.  Although few graduate-level programs include training on using outcome measures to inform clinical practice, Dr. Carroll found that 64% of those surveyed reported utilizing such scales with about 70% of their clients!  It was particularly rewarding for me to learn that the most common measures employed were the…Outcome and Session Rating Scales (ORS & SRS)!

As readers of this blog know, there are multiple randomized clinical trials documenting the impact that routine use of the ORS and SRS has on the retention, quality, and outcome of behavioral health services.  Such scales also provide direct evidence of effectiveness.  Last week, I posted a tongue-in-cheek response to Alan Kazdin's broadside against individual psychotherapy practitioners.  He was bemoaning the fact that he could not find clinicians who utilized "empirically supported treatments."  Such treatments, when utilized, are assumed to lead to better outcomes.  However, as all beginning psychology students know, there is a difference between "efficacy" and "effectiveness" studies.  The former tell us whether a treatment has an effect; the latter look at how much benefit actual people gain from "real life" therapy.  If you were a client, which kind of study would you prefer?  Unfortunately, most of the guidelines regarding treatment models are based on efficacy rather than effectiveness research.  The sine qua non of effectiveness research is measuring the quality and outcome of psychotherapy locally.  After all, what client, having sought out but ultimately gained nothing from psychotherapy, would say, "Well, at least the treatment I got was empirically supported"?  Ludicrous.

Dr. Carroll's research clearly indicates that clinicians are not afraid of measurement, research, or even statistics.  In fact, just this last week, I was in Denmark teaching a specialty course in research design and statistics for practitioners.  That's right.  Not a course on research in psychotherapy or treatment.  Rather, measurement, research design, and statistics.  Pure and simple.  Their response convinces me even more that the much-discussed "clinician-researcher gap" is not due to a lack of interest on practitioners' part but rather, and most often, a result of different agendas.  Clinicians want to know "what will work" for this client.  Research rarely addresses this question, and the aims and goals of some in the field remain hopelessly far removed from day-to-day clinical practice.  Anyway, watch the video yourself:

Filed Under: Feedback, Feedback Informed Treatment - FIT Tagged With: continuing education, holland, icce, ors, Outcome, psychotherapy, Session Rating Scales, srs

Goodbye Freud, Hello Common Factors

September 14, 2010 By scottdm Leave a Comment

Gary Greenberg certainly has a way with words.  In his most recent article, The War on Unhappiness, published in the August issue of Harper's magazine, Greenberg focuses on the "helping profession"–its colorful characters, constantly shifting theoretical landscape, and claims and counterclaims regarding "best practice."  He also gives prominence to the most robust and replicated finding in psychotherapy outcome research: the "dodo bird verdict."  Simply put, this is the finding that all approaches developed over the last 100 years–now numbering in the thousands–work about equally well.  Several paragraphs are devoted to my own work; specifically, research documenting the relatively inconsequential role that particular treatment approaches play in successful treatment and the importance of using ongoing feedback to inform and improve mental health services.  In any event, Greenberg's review of current and historical trends is sobering to say the least–challenging mental health professionals to look in the mirror and question what we really know for certain–and a must-read for any practitioner hoping to survive and thrive in the current practice environment.  OK.  Enough said.  Read it yourself here.


Filed Under: Behavioral Health Tagged With: cdoi, gary greenberg, healthcare, mental health, psychotherapy

So you want to be a better therapist? Take a hike!

July 16, 2010 By scottdm Leave a Comment

How best to improve your performance as a clinician?  Take the continuing education multiple-choice quiz:

a. Attend a two-day training;
b. Have an hour of supervision from a recognized expert in a particular treatment approach;
c. Read a professional book, article, or research study;
d. Take a walk or nap.

If you chose a, b, or c, welcome to the world of average performance!  As reviewed on my blog (March 2010), there is exactly zero evidence that attending a continuing education event improves performance.  Zero.  And supervision?  In the most recent review of the research, researchers Beutler et al. (2005) concluded, “Supervision of psychotherapy cases has been the major method of ensuring that therapists develop proficiency and skill…unfortunately, studies are sparse…and apparently, supervisors tend to rate highly the performance of those who agree with them” (p. 246).  As far as professional books, articles, and studies are concerned–including those for which a continuing education or “professional development” point may be earned–the picture is equally grim.  No evidence.  That leaves taking a walk or nap!

K. Anders Ericsson–the leading researcher in the area of expertise and expert performance–points out that the type and intensity of practice required to improve performance "requires concentration that can be maintained only for limited periods of time."  As a result, he says, "expert performers from many domains engage in practice without rest for only around an hour…The limit…holds true for a wide range of elite performers in different domains…as does their increased tendency to take recuperative naps" (p. 699, Ericsson, 2006).  By the way, Ericsson will deliver a keynote address at the upcoming "Achieving Clinical Excellence" conference.  Sign up now for this event to reserve your space!


Two recently released studies add to the evidence base on rest and expertise.  The first, conducted at the University of California, Berkeley by psychologist Matthew Walker found that a midday nap markedly improved the brain’s learning capacity.  The second, published last week in the European Journal of Developmental Psychology, found that simply taking a walk–one where you are free to choose the speed–similarly improved performance on complex cognitive tasks.

So, there you go.  I’d say more but I’m feeling sleepy.

Filed Under: Behavioral Health, deliberate practice, evidence-based practice, excellence Tagged With: cdoi, European Journal of Developmental Psychology, evidence based practice, K. Anders Ericsson, professional development, psychotherapy, supervision

ICCE Membership Hits 1000!

April 28, 2010 By scottdm Leave a Comment

Just yesterday, the membership of the International Center for Clinical Excellence burst through the 1000 mark, making it the largest community of behavioral health professionals dedicated to excellence and feedback informed treatment (FIT).  And there’s more news…click on the video below.

Filed Under: ICCE Tagged With: addiction, behavioral health, cdoi, common factors, psychotherapy, Therapist Effects

Where Necessity is the Mother of Invention: Forming Alliances with Consumers on the Margins

April 11, 2010 By scottdm 3 Comments

Spring of last year, I traveled to Gothenburg, Sweden to provide training for GCK–a top-notch organization, led by Ulla Hansson and Ulla Westling-Missios, that provides cutting-edge training on "what works" in psychotherapy.  I'll be back this week, doing an open workshop and an advanced training for the group.

While I’m always excited to be out and about traveling and training, being in Sweden is special for me.  It’s like my second home.  My family roots are Swedish and Danish and, it just so happens, I speak the language.  Indeed, I lived and worked in the country for two years back in the late seventies.  If you’ve never been, be sure and put it on your short list of places to visit…

AND IMPORTANTLY, go in the summer!  (Actually, the photos above are from the famous "Ice Hotel"–that's right, a hotel made completely of ice.  The lobby, bar, chairs, beds.  Everything!  If you find yourself in Sweden during the winter months, it's a must-see.  I promise you'll never forget the experience).

Anyway, the last time I was in Gothenburg, I met a clinician whose efforts to deliver consumer-driven and outcome-informed services to people on the margins of society were truly inspiring.   During one of the breaks at the training, therapist Jan Larsson introduced himself, told me he had been reading my books and articles, and then showed me how he managed to seek and obtain feedback from the people he worked with on the streets.  “My work does not look like ‘traditional’ therapeutic work since I do not meet clients at an office.  Rather, I meet them where they live: at home, on a bench in the park, or sitting in the library or local activity center.”

Most of Jan’s clients have been involved with the “psychiatric system” for years and yet, he says, continue to struggle and suffer with many of the same problems they entered the system with years earlier.  “Oftentimes,” he observed, “a ‘treatment plan’ has been developed for the person that has little to do with what they think or want.”

So Jan began asking.  And each time they met, they also completed the ORS and SRS–"just to be sure," he said.  No computer.  No iPhone app.  No sophisticated web-based administration system.  With a pair of scissors, he simply trimmed copies of the measures to fit in his pocket-sized appointment book.

His experience thus far?  In Swedish, Jan says, "Det finns en livserfarenhet hos klienterna som bara väntar på att bli upptäckt och bli lyssnad till. Klienterna är så mycket mer än en diagnos. Frågan är om vi är nyfikna på den eftersom diagnosen har stulit deras livberättelse."  Translated: "There is life experience in these clients that is just waiting to be discovered and listened to.  Clients are so much more than a diagnosis.  The question is whether we are curious about it, since the diagnosis has stolen their life story."

I look forward to catching up with Jan and the crew at GCK this coming week.  I'll also be posting interviews with Ulla and Ulla as well as ICCE certified trainers Gun-Eva Langdahl (who I'll be working with in Skelleftea) and Gunnar Lindfeldt (who I'll be meeting in Stockholm).  In the meantime, let me post several articles he sent by Swedish researcher Alain Topor on developing helpful relationships with people on the margins.  Dr. Topor was talking about the "recovery model" among people considered "severely and persistently mentally ill" long before it became popular here in the States.  Together with others, such as psychologist Jan Blomqvist (who I blogged about late last year), Alain's work is putting the consumer at the center of service delivery.

Filed Under: Behavioral Health, Feedback Informed Treatment - FIT Tagged With: evidence based practice, Hypertension, Jan Blomqvist, ors, outcome rating scale, Pharmacology, psychotherapy, randomized clinical trial, recovery model, session rating scale, srs, sweden, Training

Neurobabble Redux: Comments from Dr. Mark Hubble on the Latest Fad in the World of Therapy Spark Comment and Controversy

April 8, 2010 By scottdm 2 Comments

 


Last week, my long-time colleague and friend, Dr. Mark Hubble, blogged about the current interest of non-medically trained therapists in the so-called "neurobiology of human behavior."  In my intro to his post, I "worried" out loud about the field's tendency to search for legitimacy by aligning with the medical model.  Over the years, psychotherapy has flirted with biology, physics, religion, philosophy, chaos, and "energy meridians" as both the cause of what ails people and the source of psychotherapy's effectiveness.

For whatever reason, biological explanations have always had particular cachet in the world of psychotherapy.  When I first entered the field, the "dexamethasone suppression test" was being touted as the first "blood test" for depression.  Some twenty years on, it's hard to remember the hope and excitement surrounding the DST.

Another long-time friend and colleague, psychologist Michael Valentine, is fond of citing the many problems–social, physical, and otherwise–attributed to genetics (including but not limited to: anxiety, depression, addictions, promiscuity, completed suicides, thrill seeking, obscene phone calls, smoking, gambling, and the amount of time one spends watching TV) for which either: (a) there is precious little or inconsistent evidence; or (b) the variance attributable to genetics is small and insignificant compared to the size and scope of the problem.

In any event, I wanted to let readers know that response to Mark’s post has been unusually strong.  The numerous comments can be found on the syndicated version of my blog at the International Center for Clinical Excellence.  Don’t miss them!

Filed Under: Behavioral Health Tagged With: behavioral health, brief therapy, dexamethasone suppression test, icce, mark hubble, meta-analysis, Michael Valentine, psychotherapy, public behavioral health

Neurobabble: Comments from Dr. Mark Hubble on the Latest Fad in the World of Therapy

March 24, 2010 By scottdm Leave a Comment


Rarely does a day go by without hearing about another "advance" in the neurobiology of human behavior.  Suddenly, it seems, the world of psychotherapy has discovered that people have brains!  And now, where the unconscious, childhood, emotions, behaviors, and cognitions once were…neurons, plasticity, and magnetic resonance imaging now are.  Alas, we are a field forever in search of legitimacy.  My long-time colleague and friend, Mark Hubble, Ph.D., sent me the following review of recent developments.  I think you'll enjoy it, along with a video by comedian John Cleese on the same subject.

Mark Hubble, Ph.D.

Today, while contemplating the numerous chemical imbalances that are unhinging the minds of Americans — notwithstanding the longstanding failure of the left brain to coach the right with reason, and the right to enlighten the left with intuition — I unleashed the hidden power of my higher cortical functioning to the more pressing question of how to increase the market share for practicing therapists. As research has dismantled once and for all the belief that specific treatments exist for specific disorders, the field is left, one might say, in an altered state of consciousness. If we cannot hawk empirically supported therapies or claim any specialization that makes any real difference in treatment outcome, we are truly in a pickle. All we have is ourselves, the relationships we can offer to our clients, and the quality of their participation to make it all work. This, of course, hardly represents a propitious proposition for a business already overrun with too many therapists, receiving too few dollars.

Fortunately, the more energetic and enterprising among us, undeterred by the demise of psychotherapy as we know it, are ushering in the age of neuro-mythology and the new language of neuro-babble.  Seemingly accepting wholesale the belief that the brain is the final frontier, some are determined to sell us the map thereto and make more than a buck while they are at it.  Thus, we see terms such as "Somatic/Sensorimotor Psychotherapy," "Interpersonal Neurobiology," "Neurogenesis and Neuroplasticity," "Unlocking the Emotional Brain," "NeuroTherapy," "Neuro Reorganization," and so on.  A moment's look into this burgeoning literature quickly reveals the existence of an inverse relationship between the number of scientific-sounding assertions and actual studies proving the claims made.  Naturally, this finding is beside the point, because the purpose is to offer the public sensitive, nuanced, brain-based solutions for timeless problems.  Traditional theories and models are out; psychotherapies-informed-by-neuroscience, with the aura of greater credibility, are in.

Neurology and neuroscience are worthy pursuits. To suggest, however, that the data emerging from these disciplines have reached the stage of offering explanatory mechanisms for psychotherapy, including the introduction of “new” technical interventions, is beyond the pale. Metaphor and rhetoric, though persuasive, are not the same as evidence emerging from rigorous investigations establishing and validating cause and effect, independently verified, and subject to peer review.

Without resorting to obfuscation and pseudoscience, already, we have a pretty good idea of how psychotherapy works and what can be done now to make it more effective for each and every client. From one brain to another, to apply that knowledge, is a good case of using the old noggin.

Filed Under: Brain-based Research, Practice Based Evidence Tagged With: behavioral health, brief therapy, continuing education, mark hubble, meta-analysis, neuro-mythology, Norway, psychotherapy, public behavioral health

Deliberate Practice, Expertise, & Excellence

February 3, 2010 By scottdm 2 Comments

Later today, I board United flight 908 on my way to workshops scheduled in Holland and Belgium.  My routine in the days leading up to an international trip is always the same.  I slowly gather together the items I'll need while away: computer (check); European electric adapter (check); presentation materials (check); clothes (check).  And, oh yeah, two decks of playing cards and a close-up performance mat.

That’s me (pictured above) practicing a “ribbon spread” in my hotel room following a day of training in Marion, Ohio.  It’s a basic skill in magic and I’ve been working hard on this (and other moves using cards) since last summer.  Along the way, I’ve felt both hopeful and discouraged.  But I’ve kept on nonetheless taking heart from what I’m reading about skill acquisition.

Research on expertise indicates that the best performers (in chess, medicine, music, sports, etc.) practice every day of the week (including weekends) for up to four hours a day.  Sounds tiring, for sure.  And yet, the same body of evidence shows that world-class performers are able to sustain such high levels of practice because they view the acquisition of expertise as a long-term process.  Indeed, in a study of children, researcher Gary McPherson found that the answer to a simple question predicted kids' musical ability a year later: "How long do you think you'll play your instrument?"  The factors shown to be irrelevant to performance level were: initial musical ability, IQ, aural sensitivity, math skills, sense of rhythm, income level, and sensorimotor skills.

The type of practice also matters.  When researchers Kitsantas and Zimmerman studied the skill acquisition of experts, they found that 90% of the variation in ability could be accounted for by how the performers described their practice: the types of goals they set, and how they planned and executed strategies, self-monitored, and adapted their performance in response to feedback.

So, I take my playing cards and close-up mat with me on all of my trips (both domestic and international).  I don't practice on planes.  I gave that up after getting some strange stares from fellow passengers as they watched me repeat, in obsessive fashion, the same small segment of my performance over, and over, and over again.  It only made matters worse if they found out I was a psychologist.  I'd get that "knowing look" that seemed to say, "Oh yeah."  Anyway, I also managed to lose a fair number of cards when the deck–because of my inept handling while trying to master some particular move–went flying all over the cabin (you can imagine why I've been less successful in keeping last year's New Year's resolution to learn to play the ukulele).

Once I'm comfortably situated in my room, the mat and cards come out and I practice a specific handling for up to 30 minutes, followed by a 15-20 minute break.  Believe it or not, learning–or, perhaps better said, attempting to learn–magic has really been helpful in understanding the acquisition of expertise in my chosen field: psychology and psychotherapy.  Together with my colleagues, I am translating this experience and the latest research on expertise into steps for improving the performance and outcome of behavioral health services.  This is, in fact, the focus of the newest workshop I'm teaching, "Achieving Clinical Excellence."  It's also the organizing theme of the ICCE Achieving Clinical Excellence conference that will be held in Kansas City, Kansas in October 2010.  Click on the photo below for more information.

In the meantime, check out the two videos I’ve uploaded to ICCETV featuring two fun magic effects.  And yes, of course, feedback is always appreciated!

Filed Under: Conferences and Training, deliberate practice, excellence, Feedback Informed Treatment - FIT Tagged With: achieving clinical excellence, Alliance, Belgium, Carl Rogers, common factors, holland, icce, Norway, psychology, psychotherapy, randomized clinical trial, Therapist Effects

Outcomes in the Arctic: An Interview with Norwegian Practitioner Konrad Kummernes

January 21, 2010 By scottdm Leave a Comment

Dateline: Mosjoen, Norway

The last stop on my training tour around northern Norway was Mosjoen.  The large group of psychologists, social workers, psychiatrists, case managers, and physicians laughed uproariously when I talked about the bumpy, "white-knuckler" ride aboard the small twin-engine airplane that delivered me to the snowy, mountain-rimmed town.  They were all too familiar with the peculiar path pilots must follow to navigate safely between the sharp, angular peaks populating the region.

Anyway, I'd been invited nearly two years earlier to conduct the day-long training on "what works in treatment."  The event was sponsored by Helgelandssykehuset-Mosjoen and organized by Norwegian practitioner Konrad Kummernes.  I first met Konrad at a conference held in another beautiful location in Norway (is there any other type in this country?!), Stavanger–best known for its breathtaking fjords.  The goal for the day in Mosjoen?  To facilitate collaboration among the many different service providers and settings, thereby enabling the delivery of the most effective and comprehensive clinical services.  Meeting Konrad again and working with the many dedicated professionals in Mosjoen was an inspiration.  Here's Konrad:

Filed Under: Behavioral Health, Conferences and Training, Feedback Informed Treatment - FIT Tagged With: cdoi, evidence based practice, icce, Norway, psychotherapy

Practice-Based Evidence in Norway: An Interview with Psychologist Mikael Aagard

January 19, 2010 By scottdm Leave a Comment

For those of you following me on Facebook–and if you're not, click here to start–you know that I was traveling above the Arctic Circle in Norway last week.  I always enjoy visiting the Scandinavian countries.  My grandparents immigrated from nearby Sweden.  I lived there myself for a number of years (and speak the language).  And I am married to a Norwegian!  So, I consider Scandinavia to be my second home.

In a prior post, I talked a bit about the group I worked with during my three day stay in Tromso.  Here, I briefly interview psychologist Mikael Aagard, the organizer of the conference.  Mikael works at KORUS Nord, an addiction technology transfer center, which sponsored the training.  His mission?  To help clinicians working in the trenches stay up-to-date with the research on “what works” in behavioral health.  Judging by the tremendous response–people came from all over the disparate regions of far northern Norway to attend the conference–he is succeeding.

Listen as he describes the challenges facing practitioners in Norway and the need to balance the “evidence-based practice” movement with “practice-based evidence.”  If you’d like any additional information regarding KORUS, feel free to connect with Mikael and his colleagues by visiting their website.  Information about the activities of the International Center for Clinical Excellence in Scandinavia can be found at: www.centerforclinicalexcellence.org.

Filed Under: Behavioral Health, Drug and Alcohol, evidence-based practice, Practice Based Evidence Tagged With: cdoi, evidence based practice, icce, meta-analysis, psychotherapy

"What Works" in Norway

January 13, 2010 By scottdm 1 Comment

Dateline: Tromso, Norway
Place: Rica Ishavshotel

For the last two days, I’ve had the privilege of working with 125+ clinicians (psychotherapists, psychologists, social workers, psychiatrists, and addiction treatment professionals) in far northern Norway.  The focus of the two-day training was on “What Works” in treatment, in particular examining what constitutes “evidence-based practice” and how to seek and utilize feedback from consumers on an ongoing basis.  The crowd was enthusiastic, the food fantastic, and the location, well, simply inspiring.  Tomorrow, I’ll be working with a smaller group of practitioners, doing an advanced training.  More to come.

Filed Under: Behavioral Health, Conferences and Training, evidence-based practice Tagged With: behavioral health, evidence based practice, icce, Norway, psychotherapy, public behavioral health, Therapist Effects

New Year’s Resolutions: Progress Report and Future Plans

January 1, 2010 By scottdm Leave a Comment

One year ago today, I blogged about my New Year’s resolution to “take up the study of expertise and expert performance.”  The promise marked a significant departure from my work up to that point in time and was not without controversy:

“Was I no longer interested in psychotherapy?”

“Had I given up on the common factors?”

“What about the ORS and SRS?”

“Was I abandoning the field to pursue magic as a profession?”

Seriously.

The answer to all of these questions was, of course, an emphatic “NO!”  At the same time, I recognized that I’d reached an empirical precipice–or, stated more accurately, a dead end.  The common factors, while explaining why therapy works, did not and could never tell us how to work.  And while seeking and obtaining ongoing feedback (via the ORS and SRS) had proven successful in boosting treatment outcomes, there was no evidence that the practice had a lasting impact on the professionals providing the service.

Understanding how to improve my performance as a clinician has, as is true of many therapists, been a goal and passion from the earliest days of my career.  The vast literature on expertise and expert performance appeared to provide the answers I’d long sought.   In fields as diverse as music and medicine, researchers had identified specific principles and methods associated with superior performance.  On January 2nd, 2009, I vowed to apply what I was learning to, “a subject I know nothing about…put[ting] into practice the insights gleaned from the study of expertise and expert performance.”

The subject? Magic (and the ukulele).

How have I done?  Definitely better than average, I’d say.  In a column written by Barbara Brotman in today’s Chicago Tribune, psychologist Janine Gauthier notes that while 45% of people make New Year’s resolutions, only 8% actually keep them!  I’m a solid 50%.  I am still studying and learning magic–as attendees at the 2009 “Training of Trainers” and my other workshops can testify.  The uke is another story, however.  To paraphrase 1988 Democratic vice-presidential candidate Lloyd Bentsen: “I know great ukulele players, and Scott, you are no Jake Shimabukuro.”

I first saw Jake Shimabukuro play the ukulele at a concert in Hawaii.  I was in the islands working with behavioral health professionals in the military (Watch the video below and tell me if it doesn’t sound like more than one instrument is playing even though Jake is the only one pictured).

Interestingly, the reasons for my success with one and failure with the other are as simple and straightforward as the principles and practices that researchers say account for superior (and inferior) performance.  I promise to lay out these findings, along with my experiences, over the next several weeks.  If you are about to make a New Year’s resolution, let me give you step numero uno: make sure your goal/resolution is realistic.  I know, I know…how mundane.  And yet, while I’ve lectured extensively about the relationship between goal-setting and successful psychotherapy for over 15 years, my reading about expert performance, combined with my attempts to master two novel skills, has made me aware of aspects I never knew about or considered before.

Anyway, stay tuned for more.  In the meantime, just for fun, take a look at the video below from master magician Bill Malone.  The effect he is performing is called, “Sam the Bellhop.”  I’ve been practicing this routine since early summer, using what I’ve learned from my study of the literature on expertise to master the effect (Ask me to perform it for you on break if you happen to be in attendance at one of my upcoming workshops).

Filed Under: Behavioral Health, deliberate practice, excellence, Top Performance Tagged With: Alliance, cdoi, ors, outcome rating scale, psychotherapy, session rating scale, srs, Therapist Effects, training of trainers

Five Incredible Days in Anaheim

December 15, 2009 By scottdm 2 Comments

From December 9-13th, eight thousand five hundred mental health practitioners from countries around the globe gathered in Anaheim, California to attend the “Evolution of Psychotherapy” conference.  Held every five years since 1985, the conference started big and has only grown larger.  “Only a few places in the US can accommodate such a large gathering,” says Jeffrey K. Zeig, Ph.D., who has organized the conference since the first.

The event brings together 40 of the field’s leading researchers, practitioners, trend setters, and educators to deliver keynote addresses and workshops, host discussion panels, and offer clinical demonstrations on every conceivable subject related to clinical practice.  Naturally, I spoke about my current work on “Achieving Clinical Excellence” and served on several topical panels, including “Evidence-Based Practice” (with Don Meichenbaum), “Research on Psychotherapy” (with Steven Hayes and David Barlow), and “Severe and Persistent Mental Illness” (with Marsha Linehan and Jeff Zeig).

Most exciting of all, the Evolution of Psychotherapy conference also served as the official launching point for the International Center for Clinical Excellence.  Here I am pictured with long-time colleague and friend, Jeff Zeig, and psychologist and ICCE CEO, Brendan Madden, in front of the ICCE display in the convention center hall.

Over the five days, literally hundreds of visitors stopped by booth #128 to chat with me, Brendan, and Senior ICCE Associates and Trainers Rob Axsen, Jim Walt, Cynthia Maeschalck, Jason Seidel, Bill Andrews, Gunnar Lindfeldt, and Wendy Amey.  Among other things, a cool M&M dispenser passed out goodies to folks (if they pressed the right combination of buttons), we talked about and handed out leaflets advertising the upcoming “Achieving Clinical Excellence” conference, and people watched a brief video introducing the ICCE community.  Take a look yourself:


More to come from the week in Anaheim….

Filed Under: Behavioral Health, Conferences and Training, excellence, ICCE Tagged With: Achieving Clinical Excellence, brendan madden, david barlow, Don Meichenbaum, evidence based practice, Evolution of Psychotherapy, icce, Jeff Zeig, jeffrey K. zeig, Marsha Linehan, mental health, psychotherapy, Steve Hayes

Evolution of Psychotherapy and the International Center for Clinical Excellence

December 9, 2009 By scottdm Leave a Comment


Dateline: Chicago, Illinois
December 7, 2009

I’ve just finished packing my bags and am heading for the airport.  Tomorrow the “Evolution of Psychotherapy” begins.  Nearly 25 years after volunteering at the first “Evolution” conference, I’m back a second time to present.  Tomorrow, I’ll be talking about “Achieving Clinical Excellence.”  On the days that follow, I’m on panels with my friend Don Meichenbaum, as well as David Barlow, Marsha Linehan, and others.  I’m really looking forward to the four days in Anaheim.

Of everything going on in sunny southern California, I have to say that I’m most excited about the launch of the International Center for Clinical Excellence.  We have a booth (#128) in the exhibitor hall where folks can stop by, talk, and peruse our new website.  As promised, it is a true web 2.0 experience, enabling clinicians, researchers, and educators around the world to connect, share, and learn from each other.

We’ll be streaming video to Facebook and Twitter. Stay tuned to my blog and Twitter accounts as well for updates, videos, and pictures from the conference.

Filed Under: Conferences and Training, excellence, ICCE Tagged With: achieving clinical excellence, david barlow, Don Meichenbaum, Evolution of Psychotherapy, Marsha Linehan, psychotherapy

Outcomes in Oz

November 20, 2009 By scottdm Leave a Comment

Greetings from beautiful Melbourne, Australia!   For the next couple of weeks, I’ll be traveling up and down the east coast of this captivating country, conducting workshops and providing consultations on feedback-informed clinical work.

Actually, I’ve had the privilege of visiting and teaching in Australia about once a year since the late 1990’s. Back then, Liz Sheehan, the editor of the “must read” journal Psychotherapy in Australia, brought me in to speak about the then recently published first edition of The Heart and Soul of Change.  By the way, if you are not from Australia and are unfamiliar with the journal, please do visit the website.  Liz makes many of the articles that appear in the print version available online.  I’ve been a subscriber for years now and await the arrival of each issue with great anticipation.  I’m never disappointed.

In any event, on Wednesday this week, I spent the entire day with Mark Buckingham, Fiona Craig, and the clinical staff of Kedesh Rehabilitation Services in Wollongong, Australia–a scenic sea-side location about 45 minutes south of Sydney.  Briefly, Kedesh is a residential treatment facility providing cutting-edge, consumer driven, outcome-informed services to people with drug, alcohol, and mental health problems.  The crew at Kedesh is using the ORS and SRS to guide service delivery and is, in fact, one of the first to fully implement CDOI in the country.

I’ll be back with more soon, so please check back tomorrow.  In the meantime, check out the video with Mark and Fiona.

Filed Under: Behavioral Health, evidence-based practice, excellence, Feedback Informed Treatment - FIT, PCOMS Tagged With: australia, kedesh, liz sheehan, psychotherapy

Where is Scott Miller going? The Continuing Evolution

November 16, 2009 By scottdm 2 Comments

I’ve just returned from a week in Denmark providing training for two important groups.  On Wednesday and Thursday, I worked with close to 100 mental health professionals presenting the latest information on “What Works” in Therapy at the Kulturkuset in downtown Copenhagen.  On Friday, I worked with a small group of select clinicians working on implementing feedback-informed treatment (FIT) in agencies around Denmark.  The day was organized by Toftemosegaard and held at the beautiful and comfortable Imperial Hotel.

In any event, while I was away, I received a letter from my colleague and friend, M. Duncan Stanton.  For many years, “Duke,” as he’s known, has been sending me press clippings and articles both helping me stay “up to date” and, on occasion, giving me a good laugh.  Enclosed in the envelope was the picture posted above, along with a post-it note asking me, “Are you going into a new business?!”

As readers of my blog know, while I’m not going into the hair-styling and spa business, there’s a grain of truth in Duke’s question.  My work is indeed evolving.  For most of the last decade, my writing, research, and training focused on factors common to all therapeutic approaches. The logic guiding these efforts was simple and straightforward. The proven effectiveness of psychotherapy, combined with the failure to find differences between competing approaches, meant that elements shared by all approaches accounted for the success of therapy (e.g., the therapeutic alliance, placebo/hope/expectancy, structure and techniques, extratherapeutic factors).  As first spelled out in Escape from Babel: Toward a Unifying Language for Psychotherapy Practice, the idea was that effectiveness could be improved by practitioners purposefully working to strengthen the contribution of these pantheoretical ingredients.  Ultimately, though, I realized the ideas my colleagues and I were proposing came dangerously close to a new model of therapy.  More importantly, there was (and is) no evidence that teaching clinicians a “common factors” perspective led to improved outcomes–which, by the way, had been my goal from the outset.

The measurable improvements in outcome and retention–following my introduction of the Outcome and Session Rating Scales to the work being done by me and my colleagues at the Institute for the Study of Therapeutic Change–provided the first clues to the coming evolution.  Something happened when formal feedback from consumers was provided to clinicians on an ongoing basis–something beyond either the common or specific factors–a process I believed held the potential for clarifying how therapists could improve their clinical knowledge and skills.  As I began exploring, I discovered an entire literature of which I’d previously been unaware; that is, the extensive research on experts and expert performance.  I wrote about our preliminary thoughts and findings together with my colleagues Mark Hubble and Barry Duncan in an article entitled, “Supershrinks” that appeared in the Psychotherapy Networker.

Since then, I’ve been fortunate to be joined by an internationally renowned group of researchers, educators, and clinicians in the formation of the International Center for Clinical Excellence (ICCE).  Briefly, the ICCE is a web-based community where participants can connect, learn from, and share with each other.  It has been specifically designed using the latest web 2.0 technology to help behavioral health practitioners reach their personal best.  If you haven’t already done so, please visit the website at www.iccexcellence.com to register to become a member (it’s free and you’ll be notified the minute the entire site is live)!

As I’ve said before, I am very excited by this opportunity to interact with behavioral health professionals all over the world in this way.  Stay tuned: after months of hard work and testing by the dedicated trainers, associates, and “top performers” of ICCE, the site is nearly ready to launch.

Filed Under: excellence, Feedback Informed Treatment - FIT, Top Performance Tagged With: denmark, icce, Institute for the Study of Therapeutic Change, international center for cliniclal excellence, istc, mental health, ors, outcome rating scale, psychotherapy, psychotherapy networker, session rating scale, srs, supershrinks, therapy

Common versus Specific Factors and the Future of Psychotherapy: A Response to Siev and Chambless

October 31, 2009 By scottdm 4 Comments

Early last summer, I received an email from my long-time friend and colleague Don Meichenbaum alerting me to an article published in the April 2009 edition of the Behavior Therapist–the official “newsletter” of the Association for Behavioral and Cognitive Therapies–that was critical of the work I and others have done on the common factors.

Briefly, the article, written by two proponents of the “specific treatments for specific disorders” approach to “evidence-based practice” in psychology, argued that the common factors position–the idea that the efficacy of psychotherapy is largely due to shared rather than unique or model-specific factors–was growing in popularity despite being based on “fallacious reasoning” and a misinterpretation of the research.

Although the article claimed to provide an update on research bearing directly on the validity of the “dodo verdict”–the idea that all treatment approaches work equally well–it simply repeated old criticisms and ignored a vast body of contradictory evidence.  Said another way, rather than seizing the opportunity they were given to educate clinicians and address the complex issues involved in questions surrounding evidence-based practice, Siev and Chambless instead wrote to “shore up the faithful.”  “Do not doubt,” the authors were counseling their adherents, “science is on our side.”

That differences and tensions exist in the interpretation of the evidence is clear and important.  At the same time, more should be expected from those who lead the field.  Read the articles and decide for yourself.  The issues at stake are critical to the future of psychotherapy.  As I will blog about next week, there are forces at work in the United States and abroad seeking to limit the types of approaches clinicians can employ when working with clients.  While well-intentioned, available evidence indicates they are horribly misguided.  Once again, the question clinicians and consumers face is not “which treatment is best for this problem?” but rather “which approach fits with, engages, and helps this particular consumer at this moment in time?”

Behavior Therapist (April 2009) from Scott Miller

Dissemination of EST’s (November 2009) from Scott Miller

Filed Under: Dodo Verdict, evidence-based practice, Practice Based Evidence Tagged With: Association for Behavioral and Cognitive Therapies, behavior therapist, Don Meichenbaum, evidence based medicine, evidence based practice, psychology, psychotherapy

Top Resources for Top Performers

September 28, 2009 By scottdm 1 Comment

Since the 1960’s, over 10,000 “how-to” books on psychotherapy have been published.  I joke about this fact at my workshops, stating, “Any field that needs ten thousand books to describe what it’s doing…surely doesn’t know what it’s doing!” I continue, pointing out that, “There aren’t 10,000-plus books on ‘human anatomy,’ for example.  There are a handful!  And the content of each is remarkably similar.”  The mere existence of so many divergent points of view makes it difficult for any practitioner to sort the proverbial wheat from the chaff.

Over the last 100 years or so, the field has employed three solutions to deal with the existence of so many competing theories and approaches.  First, ignore the differences and continue with “business as usual”–this, in fact, is the approach that’s been used for most of the history of the field.  Second, force a consolidation or reduction by fiat–this, in my opinion, is what is being attempted with much of the current evidence-based practice (“specific treatments for specific disorders”) movement.  Third, respect the field’s diverse nature and approaches while attempting to understand the “DNA” common to all–said another way, identify and train clinicians in the factors common to all approaches so that they can tailor their work to their clients.

Let’s face it: option one is no longer viable.  Changes in both policy and funding make clear that ignoring the problem will result in further erosion of clinical autonomy.  For anyone choosing option two–either enthusiastically or by inaction–I will blog later this week about developments in the United States and U.K. on the “evidence-based practice” front that I’m sure will give you pause.  Finally, for those interested in moving beyond the rival factions and delivering the best clinical service to clients, I want to recommend two resources.  First, Derek Truscott’s Becoming an Effective Psychotherapist.  The title says it all.  Whether you are new to the field or an experienced clinician, this book will help you sort through the various competing psychotherapy approaches and find a style that works for you and the people you work with.  The second volume is Mick Cooper’s Essential Research Findings in Counselling and Psychotherapy.  What can I say about this book?  It is a gem.  Thorough, yet readable.  Empirical in nature, but clinically relevant.  When I’m out and about teaching around the globe and people ask me what to read in order to understand the empirical literature on psychotherapy, I recommend this book.

OK, enough for now.  Stay tuned for further updates this week. In the meantime, I did manage to find a new technique making the rounds on the workshop circuit.  Click on the video below.

Filed Under: Behavioral Health, Practice Based Evidence Tagged With: common factors, counselling, Derek Truscott, evidence based practice, icce, Mick Cooper, psychotherapy, randomized clinical trial

The Evolution of Psychotherapy: Twenty-Five Years On

September 1, 2009 By scottdm Leave a Comment

In 1985, I was starting my second year as a doctoral student at the University of Utah.  Like thousands of other graduate students, I’d watched the “Gloria” films.  Carl Rogers, Albert Ellis, and Fritz Perls were all impressive, if not confusing, given their radically different styles.  I also knew that I would soon have the opportunity to meet each one live and in person.  Thanks to Jeffrey K. Zeig, Ph.D. and the dedicated staff at the Milton H. Erickson Foundation, nearly every well-known therapist, guru, and psychotherapy cult-leader would gather for the first mega-conference ever held, the field’s Woodstock: the Evolution of Psychotherapy.

Having zero resources at my disposal, I wrote to Jeff asking if I could volunteer for the event in exchange for the price of admission.  Soon after completing the multiple-page application, I received notice that I had been chosen to work at the event.  I was ecstatic.  When December finally came around, I loaded up my old car with food and a sleeping bag and, together with long-time friend Paul Finch, drove from Salt Lake City to Phoenix.   What can I say?  It was alternately inspiring and confusing.  I learned so very much and also felt challenged to make sense of the disparate theories and approaches.

At that time, I had no idea that some twenty years later, I’d receive a call from Jeff Zeig asking me to participate as one of the “State of the Art” faculty for the 2005 Evolution conference.  Actually, I can remember exactly where I was when my cell phone rang: driving on Highway 12 in southwest Michigan toward Indian Lake, where my family has a small cottage.  In any event, I’m looking forward to attending and presenting at the 2009 conference.  I encourage all of the readers of my blog to attend.  Registration information can be found at the conference website: www.evolutionofpsychotherapy.com.  The highlight of the event for me is a debate/discussion I’ll be having with my friend and colleague, Don Meichenbaum, Ph.D., on the subject of “evidence-based practice.”

One more thing.  To get a feel for the event, I included a clip of a panel discussion from the first Evolution conference featuring Carl Rogers.  Not trying to be hyperbolic, but listening to Rogers speak changed my life.  I won’t bore you with the details but the night following his presentation, I had a dream…(more later)…

Filed Under: Behavioral Health, Conferences and Training, Dodo Verdict, evidence-based practice, excellence Tagged With: albert ellis, carl rogers, Don Meichenbaum, erickson, evidence based practice, Evolution of Psychotherapy, fritz perls, jeffrey k. zeig, psychotherapy
