When perusing addiction articles, as I do, I often come across work that is well done, well researched, has substance, and contributes solid knowledge to our field. This holds particularly true for articles that put forth new and needed addiction treatments. I try to pass on such articles in my Research to Practice column.
Then there is the stuff that is the exact opposite, work that quite frankly needs mountains of renovation, as well as mountains of relevant research, to come anywhere close to the term “well done.” I recently read something that belongs in this camp.
The article focused on an alternative to the traditional 12-Step program format, proposing 10 steps to recovery rather than 12. The authors attempted to strengthen support for this new Step approach with a reasonably good reference list. But upon closer examination the references came up short. Essentially, there just wasn’t anything cited in those references that actually put this new Step program to the test.
What the authors did instead was cite a single research article from 1980 claiming that becoming self-aware leads people to work harder to “fix” things. (An embedded premise of these 10 steps.)
Now here comes the criticism. First, it seemed a stretch for the authors to call up a single study as credible support for their new Step program. Second, the research they cited is 31 years old! If you are going to cite supporting research for a new 10-Step program, should you not cite something a little more recent?
There were a few other glaring things wrong with the article, but one hit me between the eyes. There was absolutely no data on how this new Step program measured up in terms of outcome research (e.g., pre-post control studies, comparative analyses with other forms of treatment). None. Nada. Zero.
I find it difficult to fathom that in this day and age someone would put forth a new approach to addiction recovery without first testing it in a rigorous manner. Maybe that was how we did things decades ago, but this is the 21st century, and new treatment approaches now need statistics, facts, and data to back their claims.
My central argument is that you and I have to be careful of “new therapies” that do not offer empirical support. Any treatment without research support needs to be viewed with a suspicious eye. This new Step concept is a prime example of what not to do when introducing a new addiction treatment to our field.
I fear many who read the “10 steps” will be immediately caught up in them, perceive them as something fresh, and apply them to their very next client. That’s sad.
It is alluring to believe that the things you see online or in print are true. Yet reading any and all addiction articles calls for questions and some critical thought.
So, with articles of this nature it is supremely important for you and me to seek the data. If there is none, that is a sure sign we need to be suspicious.
Would you give someone with an illness a pill that had not first been thoroughly tested? The same sentiment applies here.