Every year, thousands of students graduate from professional programs with degrees enabling them to work in the field of behavioral health. Many more who have already graduated and are working as social workers, psychologists, counselors, or marriage and family therapists attend continuing education events, often by legal mandate. The costs of such training, in both time and money, are considerable.
Most graduates enter the professional world in significant debt, taking years to pay back student loans and to recoup the income lost during the years they were out of the job market attending school. Continuing professional education is also costly for agencies and practicing clinicians, who must arrange time off from work and pay for the training itself.
To most, the need for training seems self-evident. And yet, in the field of behavioral health the evidence is at best discouraging. While traveling in New Zealand this week, I received an article on the subject from my long-time colleague and friend, Dr. Bob Bertolino; it appears in the latest issue of the Journal of Counseling and Development (volume 88, number 2, pages 204-209). In it, researchers Nyman and Nafziger report the results of their study on the relationship between therapist effectiveness and level of training.
First, the good news: “clients who obtained services…experienced moderate symptom relief over the course of six sessions.” Now the bad news: it didn’t matter if the client was “seen by a licensed doctoral-level counselor, a pre-doctoral intern, or a practicum student” (p. 206, emphasis added). The authors conclude, “It may be that researchers are loathe to face the possibility that the extensive efforts involved in educating graduate students to become licensed professionals result in no observable differences in client outcome” (p. 208, emphasis added).
In case you were wondering, such findings are not an anomaly. Not long ago, Atkins and Christensen (2001) reviewed the available evidence in an article published in the Australian Psychologist (volume 36, pages 122-130) and concluded much the same; to wit, professional training has little, if any, impact on outcome. As for continuing professional education, regular readers of this blog already know there is not a single supportive study in the literature.
“How,” you may wonder, “could this be?” The answer is: content and methods. First, training at both the graduate and professional level continues to focus on the weakest link in the outcome chain: model and technique. Recall that the available evidence indicates the approach used accounts for 1% or less of the variance in treatment outcome (see Wampold’s chapter in the latest edition of The Heart and Soul of Change). As just one example, consider the workshops being conducted around the United States using precious resources to train clinicians in the methods studied in the “Cannabis Youth Treatment” (CYT) project, a study which found that the treatment methods used contributed zero to the variance in treatment outcome. Let me just say, where I come from zero is really close to nothing!
Second, and even more important, traditional methods of training (e.g., classroom lecture, reading, attending conferences) simply do not work. And sadly, behavioral health is one of the few professions that continue to rely on such outdated and ineffective training methods.
The literature on expertise and expert performance provides clear, compelling, and evidence-based guidelines about the qualities of effective training. I’ve highlighted such data in a number of recent blog posts. The information has already had a profound impact on the way the ICCE organizes and conducts trainings. Thanks to Cynthia Maeschalck, Rob Axsen, and Bob, the curriculum and methods used for the annual “Training of Trainers” event have been entirely revamped. Suffice it to say, agencies and individuals who invest precious time and resources in attending will not only learn but will be able to document the impact of the training on their performance. More later.