Scott D. Miller - For the latest and greatest information on Feedback Informed Treatment


Is Professional Training a Waste of Time?

March 18, 2010 By scottdm 6 Comments

Every year, thousands of students graduate from professional programs with degrees enabling them to work in the field of behavioral health. Many more who have already graduated and are working as social workers, psychologists, counselors, or marriage and family therapists attend continuing education events, often by legal mandate. The costs of such training in terms of time and money are not insignificant.

Most graduates enter the professional world in significant debt, taking years to pay back student loans and recoup the income lost during the years they were out of the job market attending school. Continuing professional education is also costly for agencies and practicing clinicians, who must arrange time off from work and pay for the training.

To most, the need for training seems self-evident. And yet, in the field of behavioral health the evidence is at best discouraging. While I was traveling in New Zealand this week, my long-time colleague and friend, Dr. Bob Bertolino, forwarded an article on the subject appearing in the latest issue of the Journal of Counseling and Development (volume 88, number 2, pages 204-209). In it, researchers Nyman and Nafziger reported the results of their study on the relationship between therapist effectiveness and level of training.

First, the good news: “clients who obtained services…experienced moderate symptom relief over the course of six sessions.” Now the bad news: it didn’t matter if the client was “seen by a licensed doctoral-level counselor, a pre-doctoral intern, or a practicum student” (p. 206, emphasis added). The authors conclude, “It may be that researchers are loathe to face the possibility that the extensive efforts involved in educating graduate students to become licensed professionals result in no observable differences in client outcome” (p. 208, emphasis added).

In case you were wondering, such findings are not an anomaly. Not long ago, Atkins and Christensen (2001) reviewed the available evidence in an article published in the Australian Psychologist (volume 36, pages 122-130) and concluded much the same; to wit, professional training has little if any impact on outcome. As for continuing professional education, readers of this blog will know there is not a single supportive study in the literature.

“How,” you may wonder, “could this be?” The answer: content and methods. First, training at both the graduate and professional level continues to focus on the weakest link in the outcome chain, namely model and technique. Recall that the available evidence indicates the approach used accounts for 1% or less of the variance in treatment outcome (see Wampold’s chapter in the latest edition of The Heart and Soul of Change). As just one example, consider the workshops being conducted around the United States using precious resources to train clinicians in the methods studied in the “Cannabis Youth Treatment” (CYT) project, a study which found that the treatment methods used contributed zero to the variance in treatment outcome. Let me just say, where I come from zero is really close to nothing!

Second, and even more important, traditional methods of training (e.g., classroom lectures, reading, attending conferences) simply do not work. And sadly, behavioral health is one of the few professions that continue to rely on such outdated and ineffective training methods.

The literature on expertise and expert performance provides clear, compelling, and evidence-based guidelines about the qualities of effective training. I’ve highlighted such data in a number of recent blog posts. The information has already had a profound impact on the way the ICCE organizes and conducts trainings. Thanks to Cynthia Maeschalck, Rob Axsen, and Bob, the curriculum and methods used for the annual “Training of Trainers” event have been entirely revamped. Suffice it to say, agencies and individuals who invest precious time and resources attending the training will not only learn but also be able to document the impact of the training on their performance. More later.

Filed Under: Top Performance Tagged With: behavioral health, Carl Rogers, cdoi, continuing professional education, healthcare, holland, icce, Journal of Counseling and Development, psychometrics

How NOT to Achieve Clinical Excellence: The Sorry State of Continuing Professional Education

September 30, 2009 By scottdm 5 Comments

Greg Neimeyer, Ph.D., is causing quite a stir in continuing education circles. What has he done? In several scholarly publications, he’s reviewed the existing empirical literature and found that continuing professional education in behavioral health is not particularly, well, …educational. Indeed, in a soon-to-be-published piece in the APA journal Professional Psychology, he notes, “While the majority of studies report high levels of participants’ satisfaction with their CE experiences, little attention has been paid to assessing actual levels of learning, the translation of learning into practice, or the impact of CE on actual professional service delivery outcomes.” Neimeyer then goes on to cite a scholarly review published in 2002 by Daniels and Walter, which pointed out that “a search [of the research literature] revealed no controlled studies of the impact of continuing education in the…behavioral health disciplines” (p. 368). Said another way, the near-ubiquitous mandate that clinicians attend so many hours per year of approved “CE” events in order to further their knowledge and skill base has no empirical support.

Personally, my guess is that any study of CE in behavioral health would show little or no impact on performance anyway. Why? Studies in other fields (e.g., medicine, flight training) have long documented that traditional CE activities (e.g., attending conferences, lectures, reading articles) have no demonstrable effect. So, what does work? The same research that calls the efficacy of current CE activities into question provides clear guidance: namely, brief, circumscribed, skill-based training, followed by observed practice, real-time feedback, and performance measurement. Such characteristics are, in fact, part and parcel of expert performance in any field. And yet, they are virtually non-existent in behavioral health.

Let me give you an example of a CE offering that arrived in my mailbox just this week. The oversized, multi-color, tri-fold brochure boldly advertises a workshop on CBT featuring the “top evidence-based techniques.” Momentarily setting aside the absolute lack of evidence in support of such trainings, consider the promised content (and I’m not kidding): clinical applications of cognitive behavior therapy, motivational interviewing, cognitive therapy, mindfulness and acceptance-based therapies, and behavior therapy. As if that were not enough, the outline for the training indicates that participants will learn 52 other bulleted points, including but not limited to: why CBT, integration of skills into practice, identifying brain-based CBT strategies, the latest research on CBT, the stages of change, open-ended and reflective listening, behavioral activation, acceptance and commitment, emotional regulation and distress tolerance skills, the ABC technique to promote rational beliefs, homework assignments that test core beliefs, rescripting techniques for disturbing memories and images…and so on…AND ALL IN A SINGLE 6-HOUR DAY! You say you have no money? Your agency has suffered budget cuts? No worries, the ad states in giant print: the same content is available via CD, web, and podcast.

Such an agenda not only defies the evidence but strains credulity to the breaking point. Could anyone accomplish so much in so little time? Clinicians deserve, and should demand, more from the CE events they register for and, in many instances, are mandated to attend in order to maintain licensure and certification. The International Center for Clinical Excellence web platform will soon be launched. The mission of the site, as indicated in my blog post of August 25th, is to “support clinical excellence through creating virtual clinical networks, groups and clinical communities where clinicians can be supported in the key behavior changes required for developing clinical excellence.” Members of the site will use a variety of social networking and collaborative tools to learn skills, obtain real-time feedback, and measure their performance. Anyway, kudos to Dr. Greg Neimeyer for confronting the ugly truth about CE in behavioral health and saying it out loud!

Filed Under: Behavioral Health, Conferences and Training, evidence-based practice, Feedback, ICCE Tagged With: behavioral health, brief therapy, CBT, CE, CEUs, continuing professional education, icce, meta-analysis, psychology, psychometrics
