SCOTT D Miller - For the latest and greatest information on Feedback Informed Treatment


Dumb and Dumber: Research and the Media

April 2, 2014 By scottdm 1 Comment


“Just when I thought you couldn’t get any dumber, you go and do something like this… and totally redeem yourself!”
– Harry in Dumb & Dumber

On January 25th, my inbox began filling with emails from friends and fellow researchers around the globe.  “Have you seen the article in the Guardian?” they asked.  “What do you make of it?” others inquired, “Have you read the study the authors are talking about?  Is it true?!”  A few of the messages were snarkier, even gloating,  “Scott, research has finally proven the Dodo verdict is wrong!”

The article the emails referred to was titled, Are all psychological therapies equally effective?  Don’t ask the dodo.  The subtitle boldly announced, “The claim that all forms of psychotherapy are winners has been dealt a blow.”

Honestly, my first thought on reading the headline was, “Why is an obscure topic like the ‘Dodo verdict’ the subject of an article in a major newspaper?”  Who in their right mind–outside of researchers and a small cadre of psychotherapists–would care?  What possible interest would a lengthy dissertation on the subject–including references to psychologist Saul Rosenzweig (who first coined the expression in the 1930s) and researcher allegiance effects–hold for the average Joe or Jane reader of The Guardian?  At a minimum, it struck me as odd.

And odd it stayed, until I glanced down to see who had written the piece.  The authors were psychologist Daniel Freeman–a strong proponent of empirically supported treatments–and his journalist brother, Jason.


Briefly, advocates of empirically supported treatments (ESTs) hold that certain therapies are better than others in the treatment of specific disorders.  Lists of such treatments are created–for example, the NICE Guidelines–dictating which therapies are deemed “best.”  Far from innocuous, such lists are, in turn, used to direct public policy, including both the types of treatment offered and the reimbursement given.

Interestingly, in the article, Freeman and Freeman base their conclusion that “the dodo was wrong” on a single study.  Sure enough, that one study, comparing CBT to psychoanalysis, found that CBT resulted in superior effects in the treatment of bulimia.  No other studies were mentioned to bolster this bold claim–an assertion that would effectively overturn nearly 50 years of robust research findings documenting no difference in outcome among competing treatment approaches.

In contrast to what is popularly believed, extraordinary findings from single studies are fairly common in science.  As a result, scientists have learned to require replication by multiple investigators working in different settings.

The media, however, are another story.  They love such studies.  The controversy generates interest, capturing readers’ attention.  Remember cold fusion?  In 1989, researchers Stanley Pons and Martin Fleischmann–then two of the world’s leading electrochemists–claimed that they had produced a nuclear reaction at room temperature–a finding that would, if true, not only overturn decades of prior research and theory but, more importantly, revolutionize energy production.

The media went nuts.  TV and print couldn’t get enough of it.  The hope for a cheap, clean, and abundant source of energy was simply too much to ignore.  The only problem was that, in the time that followed, no one could replicate Pons and Fleischmann’s results.  No one.  While the media ran off in search of other, more tantalizing findings to report, cold fusion quietly disappeared, becoming a footnote in history.

Back to The Guardian.  Curiously, Freeman and Freeman did not mention another, truly massive study published in Clinical Psychology Review–a study available in print at the time their article appeared.  In it, the researchers used the statistically rigorous method of meta-analysis to review results from 53 studies of psychological treatments for eating disorders.  Fifty-three!  Their finding?  Confirming mountains of prior evidence: no difference in effect between competing therapeutic approaches.  NONE!

Obviously, however, such results are not likely to attract much attention.


Sadly, the same day that the article appeared in The Guardian, John R. Huizenga passed away.  Huizenga is perhaps best known as one of the physicists who helped build the atomic bomb.  Importantly, however, he was also among the first to debunk the claims about cold fusion made by Pons and Fleischmann.  His real-world experience and decades of research made clear that the reports were a case of dumb (cold fusion) being followed by dumber (media reports about cold fusion).

“How ironic this stalwart of science died on this day,” I thought, “and how inspiring his example is of ‘good science.'”

I spent the rest of the day replying to my emails, including a link to the study in Clinical Psychology Review (smart).  “Don’t believe the hype,” I advised.  “Stick to the data” (and smarter)!


NIMH Dumps the DSM-5: The No News Big News

May 10, 2013 By scottdm 1 Comment

Almost a year ago, I blogged about results from field trials of the soon-to-be-released fifth edition of the Diagnostic and Statistical Manual of Mental Disorders.  Turns out, many of the diagnoses in the “new and improved” version were simply unreliable.  In fact, the likelihood that two clinicians applying the same criteria to the same person would agree on the two most common mental health conditions–anxiety and depression–was worse than with the DSM-IV, the ICD-10, or the DSM-III!

The question of validity, that is how well the diagnoses relate to real world phenomena, has never been addressed empirically in any edition.  Essentially, DSM is a collection of symptom clusters, not too dissimilar from categorizing people according to the four humours—and, it turns out, about as helpful in determining the appropriate or likely outcome of any treatment provided.

Despite these serious shortcomings, the volume has exerted tremendous power and influence over research and practice for the last three decades.  Nearly all graduate programs teach it, research is organized around its content, and insurance companies and payers (including the Federal government) demand it for reimbursement.  In short, everyone acted “as if” it were true–that is, until last week, when NIMH Director Thomas Insel announced the organization was abandoning the DSM.  As if the field had awoken from a thirty-year nap, the reason given was the volume’s lack of validity!  Really?

The day the announcement was made, I received a bunch of emails.   Most of the writers were elated.  They knew I’d been critical of the volume for many years.  “Finally,” one said, “a return to sanity.”  My response?  Not so fast.

To begin, DSM is not going away any time soon.  Sorry, but if you want to be paid, keep your trusty copy nearby.

More troubling— if you read the fine print—NIMH is promising a better system, based on “a new idea everyone should welcome.”   Just what is that idea?   Mental health problems are biological in origin.  To achieve better outcomes, NIMH funded researchers need to map the “cognitive, circuit, and genetic aspects of mental disorders” so as to identify “new and better targets for treatment.”  Insel calls it, “precision medicine.”

Now, I don’t know about you, but the new idea sounds a heck of a lot like the old one to me!  Psychiatry’s biological bandwagon blew into town last century and has been playing the same tune ever since.  Remember the “dexamethasone suppression test” for differentiating endogenous from non-endogenous depression?  How about the claims made about Xanax in the treatment of panic, or the “new” anti-psychotics?  There’s always prefrontal lobotomy which, like the DSM, proponents continued to use and promote long after its lack of efficacy and brain-disabling side effects were known.  Heck, the originator won a Nobel Prize!

As far as the promise of something better is concerned, history should chasten any hope one might feel.  Honestly, when was the last time the field failed to claim significant progress was being made?  Each new treatment approach is pitched as a vast improvement over “old ideas.”  CBT is better than psychodynamic, specific is better than eclectic, evidence-based treatments are better than routine clinical practice, and so on–except none of these widely promulgated notions holds empirical water.

If “news” = new + different, then the NIMH announcement, like so much of what you find on TV and social media, is definitely not news.  It’s more of the same.  Precision medicine in mental health is 90% promise + 10% hyperbole, or marketing.

Here are a few newsworthy facts with immediate implications for mental health policy, practice, and research:

  1. Treatment works.  Evidence gathered over the last four decades documents that people who receive therapy are better off than 80% of those with the same problem or concern who go without treatment.
  2. A majority of potential consumers (78%) cite “lack of confidence” in the outcome of treatment as a barrier to seeking help from a mental health professional.
  3. Tracking a consumer’s engagement and progress during treatment enables clinicians to tailor services to the individual, resulting in lower costs, fewer drop outs, and as much as three times the effects!

Just a thought–if we really want to step into the future, perhaps the field could start by listening to consumers rather than geneticists, neurologists, and radiologists.  That’s exactly the point Ernesto Sirolli made in a recent TED talk.  If you haven’t seen it, it’s well worth watching.


The Revolution in Swedish Mental Health Services: UPDATE on the CBT Monopoly

April 5, 2013 By scottdm Leave a Comment

No blog post I’ve ever published has received as much attention as the one on May 13th, 2012 detailing changes to Swedish mental health practice.  At the time, I reported on research results showing that the massive investment of resources in training therapists in CBT had not translated into improved outcomes or efficiency in the treatment of people with depression and anxiety.  In short, the public experiment of limiting training and treatment to so-called “evidence-based methods” had failed to produce tangible results.  The findings generated publications in Swedish journals, as well as commentary in Swedish newspapers and on the radio.

I promised to keep people updated if and when research became available in languages other than Swedish.  This week, the journal Psychotherapy published an article comparing the outcomes of three different treatment approaches: CBT, psychodynamic, and integrative-eclectic psychotherapy.  The results, gathered over a three-year period at 13 outpatient clinics, showed that psychotherapy was remarkably effective regardless of the type of treatment offered!  Read the study yourself and then ask: when will a simpler, less expensive, and more client-centered approach to ensuring effective and efficient behavioral health services be adopted?  Routinely seeking feedback from consumers regarding the process and outcome of care provides such an alternative.  The failure to find evidence that adopting specific models for specific disorders improves outcomes indicates the time has come.  You can learn more about feedback-informed treatment (FIT), a practice recently designated “evidence-based” by the Substance Abuse and Mental Health Services Administration (SAMHSA), by visiting the International Center for Clinical Excellence web-based community or attending an upcoming training with me in Chicago or on the road.

  • Learn more about what is going on in Sweden by reading:

Everyday Evidence: Outcomes of Psychotherapies in Swedish Public Health Services (Psychotherapy, Werbart et al., 2013)

  • Here’s one additional reference for those of you who read Swedish.  It’s the official summary of the results from the study that started this entire thread:
Delrapport II slutversion (“Interim report II, final version”)


More from Sweden

June 4, 2012 By scottdm Leave a Comment

Three short weeks ago, I was in Stockholm, Sweden talking about “what works” in clinical practice.  As I announced at the time, my visit coincided with an announcement by the organization governing mental health practice in the country.  For the better part of a decade, CBT enjoyed near-exclusive status as “evidence-based.”  Indeed, payment for training clinicians and treating clients in other approaches disappeared as over two billion Swedish crowns were spent on CBT.

The result? The widespread adoption of the method had no effect whatsoever on the outcome of people disabled by depression and anxiety.  The conclusion?  Guidelines for clinical practice were reviewed and expanded.  Research on feedback is in full swing in the largest randomized clinical trial on FIT in history.

More news…

Today, I received notice from Swedish clinician and publisher, Bengt Weine, that my article, “The Road to Mastery” (written together with my longtime friend and collaborator, Mark A. Hubble, Ph.D.), had been translated into Swedish and accepted for publication in SFT, the Swedish Family Therapy journal.  If you understand the language, click here to access a copy.

Helping clinicians and agencies along the “road to mastery” is what the upcoming Advanced Intensive and Training of Trainers events are all about.  Join colleagues from around the globe for these fun, intense days of training in Chicago.


Psychologist Alan Kazdin Needs Help: Please Give

September 25, 2011 By scottdm Leave a Comment

Look at this picture.  This man needs help.  He is psychologist Alan Kazdin, former president of the American Psychological Association and current Professor of Psychology at Yale University.  A little over a week ago, to the surprise and shock of many in the field, he disclosed a problem in his professional life.  In an interview that appeared online at Time’s Healthland, Dr. Kazdin reported being unable to find a therapist or treatment program to which he could refer clients–even in New York, the nation’s largest city!

After traveling the length and breadth of the United States for the last decade, and meeting and working with hundreds of agencies and tens of thousands of therapists, I know there are many clinicians who can help Dr. Kazdin with his problem.  Our group has been tracking the outcomes of numerous practitioners over the last decade and found average outcomes to be on par with those obtained in tightly controlled randomized clinical trials!  That’s good news for Dr. Kazdin.

Now, just to be sure, it should be pointed out that Dr. Kazdin is asking for practitioners who adhere to the Cochrane Review’s and the American Psychological Association’s definition of evidence-based practice (EBP)–or, I should say, I believe that is what he is asking for, as the interview is not entirely clear on this point and appears to imply that EBP is about using specific treatment methods (the most popular, of course, being CBT).  The actual definition contains three main points and clearly states that EBP is the integration of:

  1. The best available research;
  2. Clinical expertise; and
  3. The client’s culture, values, and preferences.

Interestingly, the official APA policy on evidence-based practice further defines clinical expertise as the “monitoring of patient progress (and of changes in the patient’s circumstances)…that may suggest the need to adjust the treatment.  If progress is not proceeding adequately, the psychologist alters or addresses problematic aspects of the treatment (e.g., problems in the therapeutic relationship or in the implementation of the goals of the treatment) as appropriate.”

I say “interestingly” for two reasons.  First, the definition of EBP clearly indicates that clinicians must tailor psychotherapy to the individual client.  And yet, the interview with Dr. Kazdin specifically quotes him as saying, “That’s a red herring. The research shows that no one knows how to do that. [And they don’t know how to monitor your progress].”  Now, admittedly, the research is new and, as Dr. Kazdin says, “Most people practicing who are 50 years or older”–like himself–may not know about it, but there are over a dozen randomized clinical trials documenting how routinely monitoring progress and the relationship, and adjusting accordingly, improves outcome.  The interview also reports him saying that “there is no real evidence” that the relationship (aka alliance) between the therapist and client matters when, in fact, the APA Interdivisional Task Force on Evidence-Based Therapy Relationships concluded that there is abundant evidence that “the therapy relationship accounts for substantial and consistent contributions to…outcome…at least as much as the particular method.”  (Incidentally, the complete APA policy statement on EBP can be found in the May-June 2006 issue of the American Psychologist).

Who knows how these two major bloopers managed to slip through the editing process?  I sure know I’d be embarrassed and would immediately issue a clarification if I’d been misquoted making statements so clearly at odds with the facts.  Perhaps Dr. Kazdin is still busy looking for someone to whom he can refer clients.  If you are a professional who uses your clinical expertise to tailor the application of scientifically sound psychotherapy practices to client preferences, values, and culture, then you can help.


Problems in Evidence-Based Land: Questioning the Wisdom of "Preferred Treatments"

March 29, 2010 By scottdm Leave a Comment

This last week, Jeremy Laurance, Health Editor for the U.K. Independent published an article entitled, “The big question: Does cognitive therapy work? And should the NHS (National Health Service) provide more of it?” Usually such questions are limited to professional journals and trade magazines. Instead, it ran in the “Life and Style” section of one of Britain’s largest daily newspapers. Why?

In 2007, the government earmarked £173,000,000 (approximately 260,000,000 U.S. dollars) to train up an army of new therapists. Briefly, the money was allocated following an earlier report by Professor Richard Layard of the London School of Economics which found that a staggering 38% of illness and disability claims were accounted for by “mental disorders.” The sticking point—and part of the reason for the article by Laurance—is that training was largely limited to a single treatment approach: cognitive-behavioral therapy (CBT).  And research released this week indicates that the efficacy of the method has been seriously overestimated due to “publication bias.”
Researchers Cuijpers, Smit, Bohlmeijer, Hollon, and Andersson (2010) examined the effect sizes of 117 trials and found that the tendency of journals to accept trials showing positive results and reject those with null or negative findings had inflated the apparent efficacy of CBT; correcting for this publication bias reduced the reported effectiveness of the method by as much as 33 percent!

Combine such findings with evidence from multiple meta-analyses showing no difference in outcome between treatment approaches intended to be therapeutic, and one has to wonder why CBT continues to enjoy a privileged position among policy makers and regulatory bodies.  Despite the evidence, the governmental body in the UK responsible for reviewing research and making policy recommendations–the National Institute for Health and Clinical Excellence (NICE)–continues to advocate for CBT.  It’s not only unscientific, it’s bad policy.  Alas, when it comes to treatment methods, CBT enjoys what British psychologist Richard Wiseman calls the “get out of a null effect free” card.

What would work?  If the issue is truly guaranteeing effective treatment, the answer is measurement and feedback.  The single largest contributor to outcome is who provides the treatment, not what treatment approach is employed.  More than a dozen randomized clinical trials–the design of choice of NICE and SAMHSA–indicate that outcomes and retention rates are improved while costs are decreased–in many cases dramatically so.
I respectfully ask, “What is the hold up?”


How NOT to Achieve Clinical Excellence: The Sorry State of Continuing Professional Education

September 30, 2009 By scottdm 5 Comments

Greg Neimeyer, Ph.D., is causing quite a stir in continuing education circles.  What has he done?  In several scholarly publications, he’s reviewed the existing empirical literature and found that continuing professional education in behavioral health is not particularly, well, …educational.  Indeed, in a soon-to-be-published piece in the APA journal, Professional Psychology, he notes, “While the majority of studies report high levels of participants’ satisfaction with their CE experiences, little attention has been paid to assessing actual levels of learning, the translation of learning into practice, or the impact of CE on actual professional service delivery outcomes.”  Neimeyer then goes on to cite a scholarly review published in 2002 by Daniels and Walter, which pointed out that “a search [of the research literature] revealed no controlled studies of the impact of continuing education in the…behavioral health disciplines” (p. 368).  Said another way, the near-ubiquitous mandate that clinicians attend so many hours per year of approved “CE” events in order to further their knowledge and skill base has no empirical support.

Personally, my guess is that any study that might be done on CE in behavioral health would show little or no impact on performance anyway.  Why?  Studies in other fields (e.g., medicine, flight training) have long documented that traditional CE activities (e.g., attending conferences, lectures, reading articles) have no demonstrable effect.  So, what does work?  The same research that calls the efficacy of current CE activities into question provides clear guidance: namely, brief, circumscribed, skill-based training, followed by observed practice, real-time feedback, and performance measurement.  Such characteristics are, in fact, part and parcel of expert performance in any field.  And yet, this approach is virtually non-existent in behavioral health.

Let me give you an example of a CE offering that arrived in my mailbox just this week.  The oversized, multi-color, tri-fold brochure boldly advertises a workshop on CBT featuring the “top evidence-based techniques.”  Momentarily setting aside the absolute lack of evidence in support of such trainings, consider the promised content–and I’m not kidding: clinical applications of cognitive behavior therapy, motivational interviewing, cognitive therapy, mindfulness and acceptance based therapies, and behavior therapy.  As if that were not enough, the outline for the training indicates that participants will learn 52 other bulleted points, including but not limited to: why CBT, integration of skills into practice, identifying brain-based CBT strategies, the latest research on CBT, the stages of change, open-ended and reflective listening, behavioral activation, acceptance and commitment, emotional regulation and distress tolerance skills, the ABC technique to promote rational beliefs, homework assignments that test core beliefs, rescripting techniques for disturbing memories and images…and so on…AND ALL IN A SINGLE 6-HOUR DAY!  You say you have no money?  Your agency has suffered budget cuts?  No worries, the ad states in giant print, as the same content is available via CD, web, and podcast.

Such an agenda not only defies the evidence but strains credulity to the breaking point.  Could anyone accomplish so much in so little time?  Clinicians deserve and should demand more from the CE events they register for and, in many instances, are mandated to attend in order to maintain licensure and certification.  The International Center for Clinical Excellence web platform will soon be launched.  The mission of the site, as indicated in my blog post of August 25th, is to “support clinical excellence through creating virtual clinical networks, groups and clinical communities where clinicians can be supported in the key behavior changes required for developing clinical excellence.”  Members of the site will use a variety of social networking and collaborative tools to learn skills, obtain real-time feedback, and measure their performance.  Anyway, kudos to Dr. Greg Neimeyer for confronting the ugly truth about CE in behavioral health and saying it out loud!

