It’s true. Adding to a growing literature showing that the person of the therapist is more important than theoretical orientation, years of experience, or discipline, a new study documents that clients are sensitive to the quality of their therapist’s life outside of treatment. In short, they can tell when you are happy or not. Despite our best efforts to conceal it, they see it in how we interact with them in therapy. By contrast, therapists’ judgments regarding the quality of the therapy are biased by their own sense of personal well-being. The solution? Short of being happy, it means we need to check in with our clients on a regular basis regarding the quality of the therapeutic relationship. Multiple randomized clinical trials show that formally soliciting feedback regarding progress and the alliance improves outcomes and continued engagement in treatment. One approach, “Feedback-Informed Treatment,” is now listed on SAMHSA’s National Registry of Evidence-Based Programs and Practices. Step-by-step instructions and videos for getting started are available on a new website: www.pcomsinternational.com. Seeking feedback from clients not only helps to identify and correct potential problems in therapy, but is also the first step in pushing one’s effectiveness to the next level. In case you didn’t see it, I review the research and steps for improving performance as a therapist in an article/interview on the Psychotherapy.net website. It’s sure to make you happy!
Just yesterday, the membership of the International Center for Clinical Excellence burst through the 1000 mark, making it the largest community of behavioral health professionals dedicated to excellence and feedback-informed treatment (FIT). And there’s more news… click on the video below.
When it comes to healthcare, it can be said without risk of exaggeration that “revolution is in the air.” The most sweeping legislation in history has just been passed in the United States. Elsewhere, as I’ve been documenting in my blogs, countries, states, provinces, and municipalities are struggling to maintain quality while containing costs of the healthcare behemoth.
Back in January, I talked about the approach being taken in Holland where, in contrast to many countries, the government-run healthcare system was being jettisoned in favor of private insurance reimbursement. Believe me, it is a change no less dramatic in scope and impact than what is taking place in the U.S. At the time, I noted that Dutch practitioners were, in response, “‘thinking ahead,’ preparing for the change–in particular, understanding what the research literature indicates works as well as adopting methods for documenting and improving the outcome of treatment.” As a result, I’ve been traveling back and forth, at least twice a quarter, providing trainings to professional groups and agencies across the length and breadth of the country.
Not long ago, I was invited to speak at the 15th anniversary of Cenzo, a franchise organization with 85 registered psychologist members. Basically, the organization facilitates (some would say “works to smooth”) the interaction between practitioners and insurance companies. In addition to helping with contracts, paperwork, administration, and training, Cenzo also has an ongoing “quality improvement” program consisting of routine outcome monitoring and feedback as well as client satisfaction metrics. Everything about this forward-thinking group is “top notch,” including a brief film they made about the day and the workshop. Whether you work in Holland or not, I think you’ll find the content interesting! If you understand the language, click here to download the 15th-anniversary Cenzo newsletter.
Later today, I board United flight 908 on my way to workshops scheduled in Holland and Belgium. My routine in the days leading up to an international trip is always the same. I slowly gather together the items I’ll need while away: computer (check); European electric adapter (check); presentation materials (check); clothes (check). And, oh yeah, two decks of playing cards and a close-up performance mat.
That’s me (pictured above) practicing a “ribbon spread” in my hotel room following a day of training in Marion, Ohio. It’s a basic skill in magic, and I’ve been working hard on this (and other moves using cards) since last summer. Along the way, I’ve felt both hopeful and discouraged. But I’ve kept on nonetheless, taking heart from what I’m reading about skill acquisition.
Research on expertise indicates that the best performers (in chess, medicine, music, sports, etc.) practice every day of the week (including weekends) for up to four hours a day. Sounds tiring for sure. And yet, the same body of evidence shows that world-class performers are able to sustain such high levels of practice because they view the acquisition of expertise as a long-term process. Indeed, in a study of children, researcher Gary McPherson found that the answer to a simple question predicted the musical ability of kids a year later: “how long do you think you’ll play your instrument?” The factors that were shown to be irrelevant to performance level were: initial musical ability, IQ, aural sensitivity, math skills, sense of rhythm, income level, and sensorimotor skills.
The type of practice also matters. When researchers Kitsantas and Zimmerman studied the skill acquisition of experts, they found that 90% of the variation in ability could be accounted for by how the performers described their practice: the types of goals they set, how they planned and executed strategies, self-monitored, and adapted their performance in response to feedback.
So, I take my playing cards and close-up mat with me on all of my trips (both domestic and international). I don’t practice on planes. I gave that up after getting some strange stares from fellow passengers as they watched me repeat, in obsessive fashion, the same small segment of my performance over, and over, and over again. It only made matters worse if they found out I was a psychologist. I’d get that “knowing look” that seemed to say, “Oh yeah.” Anyway, I also managed to lose a fair number of cards when the deck–because of my inept handling while trying to master some particular move–went flying all over the cabin. (You can imagine why I’ve been less successful in keeping last year’s New Year’s resolution to learn to play the ukulele.)
Once I’m comfortably situated in my room, the mat and cards come out and I work, practicing a specific handling for up to 30 minutes followed by a 15-20 minute break. Believe it or not, learning–or perhaps better said, attempting to learn–magic has really been helpful in understanding the acquisition of expertise in my chosen field: psychology and psychotherapy. Together with my colleagues, I am translating our experience and the latest research on expertise into steps for improving the performance and outcome of behavioral health services. This is, in fact, the focus of the newest workshop I’m teaching, “Achieving Clinical Excellence.” It’s also the organizing theme of the ICCE Achieving Clinical Excellence conference that will be held in Kansas City, Kansas in October 2010. Click on the photo below for more information.
In the meantime, check out the two videos I’ve uploaded to ICCETV featuring two fun magic effects. And yes, of course, feedback is always appreciated!
Several years ago, I was contacted by a group of practitioners located in the largest city in the north of the Netherlands–actually the capital of the province known as Groningen. The “Platform,” as they are known, were wondering if I’d be willing to come and speak at one of their upcoming conferences. The practice environment was undergoing dramatic change, the group’s leadership (Dorti Been & Pico Tuene) informed me. Holland would soon be switching from a government-run to a private insurance reimbursement system. Dutch practitioners were “thinking ahead,” preparing for the change–in particular, understanding what the research literature indicates works in clinical practice as well as learning methods for documenting and improving the outcome of treatment.
I was then, and remain now, deeply impressed with the abilities and dedication of Dutch practitioners. During that visit to Groningen, and the many that have followed (to Amsterdam, Rotterdam, Beilen, etc.), it’s clear that clinicians in the Netherlands are determined to lead rather than be led. I’ve been asked to meet with university professors, practitioner organizations, training coordinators, and insurance company executives. In a very short period of time, two Dutch therapists–physician Flip Van Oenen and psychologist Mark Crouzen–have completed the “Training of Trainers” course and become recognized trainers and associates for the International Center for Clinical Excellence. And finally, a study will soon be published showing sound psychometric properties of the Dutch translations of the ORS and SRS.
I’ve also been working closely with the Dutch company Reflectum–a group dedicated to supporting outcome-informed healthcare and clinical excellence. Briefly, Reflectum has organized several conferences and expert meetings between me and clinicians, agency managers, and insurance companies. One thing for sure: we will be working closely together to train a network of trainers and consultants to promote, support, and train agencies and practitioners in outcome-informed methods in order to meet the demands of the changing practice climate.
Check out the video below, filmed at Schiphol airport during one of my recent trips to Holland:
“What works” in therapy? Believe it or not, that question–as simple as it is–has sparked, and continues to spark, considerable debate. For decades, the field has been divided. On one side are those who argue that the efficacy of psychological treatments is due to specific factors (e.g., changing negative thinking patterns) inherent in the model of treatment (e.g., cognitive behavioral therapy) and remedial to the problem being treated (e.g., depression); on the other is a smaller but no less committed group of researchers and writers who posit that the general efficacy of behavioral treatments is due to a group of factors common to all approaches (e.g., relationship, hope, expectancy, client factors).
While the overall effectiveness of psychological treatment is now well established–studies show that people who receive care are better off than 80% of those who do not, regardless of the approach or the problem treated–one fact cannot be avoided: outcomes have not improved appreciably over the last 30 years! Said another way, the common-versus-specific-factor battle, while generating a great deal of heat, has not shed much light on how to improve the outcome of behavioral health services. Despite the incessant talk about and promotion of “evidence-based” practice, there is no evidence that adopting “specific methods for specific disorders” improves outcome. At the same time, as I’ve pointed out in prior blogposts, the common factors, while accounting for why psychological therapies work, do not and cannot tell us how to work. After all, if the effectiveness of the various and competing treatment approaches is due to a shared set of common factors, and all models work equally well, why learn about the common factors? More to the point, there simply is no evidence that adopting a “common factors” approach leads to better performance.
The problem with the specific and common factor positions is that both–and hang onto your seat here–have the same objective at heart; namely, contextlessness. Each hopes to identify a set of principles and/or practices that are applicable across people, places, and situations. Thus, specific factor proponents argue that particular “evidence-based” (EBP) approaches are applicable for a given problem regardless of the people or places involved (it’s amazing, really, when you consider that various approaches are being marketed to different countries and cultures as “evidence-based” when there is no evidence that these methods work beyond their very limited and unrepresentative samples). On the other hand, the common factors camp, in place of techniques, proffers an invariant set of, well, generic factors. Little wonder that outcomes have stagnated. It’s a bit like trying to learn a language either by memorizing a phrase book–in the case of EBP–or studying the parts of speech–in the case of the common factors.
What to do? For me, clues for resolving the impasse began to appear when, in 1994, I followed the advice of my friend and longtime mentor, Lynn Johnson, and began formally and routinely monitoring the outcome and alliance of the clinical work I was doing. Crucially, feedback provided a way to contextualize therapeutic services–to fit the work to the people and places involved–that neither a specific nor a common factors informed approach could.
Numerous studies (21 RCTs, including 4 studies using the ORS and SRS) now document the impact of using outcome and alliance feedback to inform service delivery. One study, for example, showed a 65% improvement over baseline performance rates with the addition of routine alliance and outcome feedback. Another, more recent study of couples therapy found that divorce/separation rates in the feedback condition were half (50%) those of the no-feedback condition!
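For readers curious about the mechanics, here is a minimal sketch in Python of the kind of session-to-session classification that routine outcome monitoring rests on. The specific numbers (a 0-40 total score, a clinical cutoff of 25, and a reliable change index of 5 points) are the commonly cited values for the ORS, but treat them as illustrative assumptions rather than the definitive scoring rules of any particular system.

```python
# Sketch only: flagging change on an ORS-style 0-40 outcome measure.
# ASSUMED values: clinical cutoff = 25, reliable change index (RCI) = 5.
# Verify against the instrument's own psychometric documentation before use.

CLINICAL_CUTOFF = 25  # scores at or above suggest non-clinical-range functioning
RCI = 5               # change smaller than this may be measurement noise

def classify_change(intake_score: float, current_score: float) -> str:
    """Classify a client's change from intake to the current session."""
    delta = current_score - intake_score
    if delta >= RCI and current_score >= CLINICAL_CUTOFF:
        # Reliable improvement that also crosses the clinical cutoff
        return "clinically significant improvement"
    if delta >= RCI:
        return "reliable improvement"
    if delta <= -RCI:
        return "reliable deterioration"
    return "no reliable change"

# Example: a client who scored 18 at intake and 27 at a later session.
print(classify_change(18, 27))  # clinically significant improvement
```

The point of a sketch like this is not the arithmetic, which is trivial, but what the therapist does with the flag: a “no reliable change” or “reliable deterioration” result is a prompt to discuss the alliance and the approach with the client, not a verdict.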
Such results have, not surprisingly, led the practice of “routine outcome monitoring” (PROMS) to be deemed “evidence-based.” At the recent Evolution of Psychotherapy conference, I was on a panel with David Barlow, Ph.D.–a longtime proponent of “specific treatments for specific disorders” (EBP)–who, in response to my brief remarks about the benefits of feedback, stated unequivocally that all therapists would soon be required to measure and monitor the outcome of their clinical work. Given that my work has focused almost exclusively on seeking and using feedback for the last 15 years, you would think I’d be happy. And while gratifying on some level, I must admit to being both surprised and frightened by his pronouncement.
My fear? Focusing on measurement and feedback misses the point. Simply put: it’s not seeking feedback that is important. Rather, it’s what feedback potentially engenders in the user that is critical. Consider the following: while the results of trials to date clearly document the benefit of PROMS to those seeking therapy, there is currently no evidence that the practice has a lasting impact on those providing the service. “The question is,” as researcher Michael Lambert notes, “have therapists learned anything from having gotten feedback? Or, do the gains disappear when feedback disappears? … We found that there is little improvement from year to year…” (quoted in Miller et al.).
Research on expertise in a wide range of domains (including chess, medicine, physics, computer programming, and psychotherapy) indicates that in order to have a lasting effect, feedback must increase a performer’s “domain-specific knowledge.” Feedback must result in the performer knowing more about his or her area, and more about how and when to apply that knowledge to specific situations, than others. Master-level chess players, for example, have been shown to possess 10 to 100 times more chess knowledge than “club-level” players. Not surprisingly, master players’ vast information about the game is consolidated and organized differently than their less successful peers’; namely, in a way that allows them to access, sort, and apply potential moves to the specific situation on the board. In other words, their immense knowledge is context specific.
A mere handful of studies document similar findings among superior-performing therapists: not only do they know more, they know how, when, and with whom to apply that knowledge. I noted these and highlighted a few others in the research pipeline during my workshop on “Achieving Clinical Excellence” at the Evolution of Psychotherapy conference. I also reviewed what 30 years of research on expertise and expert performance has taught us about how feedback must be used in order to ensure that learning actually takes place. Many of those in attendance stopped by the ICCE booth following the presentation to talk with our CEO, Brendan Madden, or one of our Associates and Trainers (see the video below).
Such research, I believe, holds the key to moving beyond the common versus specific factor stalemate that has long held the field in check–providing therapists with the means for developing, organizing, and contextualizing clinical knowledge in a manner that leads to real and lasting improvements in performance.
For the last 7 years, I’ve been traveling to the small, picturesque village of Brattleboro, Vermont to work with clinicians, agency managers, and various state officials on integrating outcomes into behavioral health services. Peter Albert, the director of Governmental Affairs and PrimariLink at the Brattleboro Retreat, has tirelessly crisscrossed the state, promoting outcome-informed clinical work and organizing the trainings and ongoing consultations. Over time, I’ve done workshops on the common factors, “what works” in therapy, using outcome to inform treatment, working with challenging clinical problems and situations, and, most recently, the qualities and practices of super-effective therapists. In truth, outcome-informed clinical work both grew up and “came of age” in Vermont. Indeed, Peter Albert was the first to bulk-purchase the ASIST program and distribute it for free to any provider interested in tracking and improving the effectiveness of their clinical work.
If you’ve never been to the Brattleboro area, I can state without reservation that it is one of the most beautiful areas I’ve visited in the U.S.–particularly during the Fall, when the leaves are changing color. If you are looking for a place to stay for a few days, the Crosy House is my first and only choice. The campus of the Retreat is also worth visiting. It’s no accident that the trainings are held there as it has been a place for cutting edge services since being founded in 1874. The radical idea at that time? Treat people with respect and dignity. The short film below gives a brief history of the Retreat and a glimpse of the serene setting.
Anyway, this last week, I spent an entire day together with a select group of therapists dedicated to improving outcomes and delivering superior service to their clients. Briefly, these clinicians have been volunteering their time to participate in a project to implement outcome-informed work in their clinical settings. We met in the boardroom at the Retreat, discussing the principles and practices of outcome-informed work as well as reviewing graphs of their individual and aggregate ORS and SRS data.
It has been and continues to be an honor to work with each and every one in the PrimariLink project. Together, they are making a real difference in the lives of those they work with and in the field of behavioral health in Vermont. If you are a clinician located in Vermont or provide services to people covered by MVP or PrimariLink and would like to participate in the project, please email Peter Albert. At the same time, if you are a person in need of behavioral health services and looking for a referral, you could do no better than contacting one of the providers in the project!
Since the 1960s, over 10,000 “how-to” books on psychotherapy have been published. I joke about this fact at my workshops, stating, “Any field that needs ten thousand books to describe what it’s doing… surely doesn’t know what it’s doing!” I continue, pointing out that, “There aren’t 10,000-plus books on ‘human anatomy,’ for example. There are a handful! And the content of each is remarkably similar.” The mere existence of so many divergent points of view makes it difficult for any practitioner to sort the proverbial “wheat from the chaff.”
Over the last 100 years or so, the field has employed three solutions to deal with the existence of so many competing theories and approaches. First, ignore the differences and continue with “business as usual”–this, in fact, is the approach that’s been used for most of the history of the field. Second, force a consolidation or reduction by fiat–this, in my opinion, is what is being attempted with much of the current evidence-based practice (“specific treatments for specific disorders”) movement. Third, and finally, respect the field’s diverse nature and approaches while attempting to understand the “DNA” common to all–said another way, identify and train clinicians in the factors common to all approaches so that they can tailor their work to their clients.
Let’s face it: option one is no longer viable. Changes in both policy and funding make clear that ignoring the problem will result in further erosion of clinical autonomy. For anyone choosing option two–either enthusiastically or by inaction–I will blog later this week about developments in the United States and U.K. on the “evidence-based practice” front that I’m sure will give you pause. Finally, for those interested in moving beyond the rival factions and delivering the best clinical service to clients, I want to recommend two resources. First, Derek Truscott’s Becoming an Effective Psychotherapist. The title says it all. Whether you are new to the field or an experienced clinician, this book will help you sort through the various and competing psychotherapy approaches and find a style that works for you and the people you work with. The second volume is Mick Cooper’s Essential Research Findings in Counselling and Psychotherapy. What can I say about this book? It is a gem. Thorough, yet readable. Empirical in nature, but clinically relevant. When I’m out and about teaching around the globe and people ask me what to read in order to understand the empirical literature on psychotherapy, I recommend this book.
OK, enough for now. Stay tuned for further updates this week. In the meantime, I did manage to find a new technique making the rounds on the workshop circuit. Click on the video below.