Sick of lame citing
I consider it my job to always cite the best relevant science that I can find, such as it is, and that’s still the rule pretty much everywhere on PainScience.com, in hundreds of articles. But there is an exception now.
Over many years of grinding out dozens of informal narrative reviews, I have grown weary of and disillusioned by the extremely poor average quality of trials of interventions in musculoskeletal medicine. More and more, I think the best that “evidence-based” can mean is a rigorous critical analysis of plausibility and safety. Good direct evidence of efficacy is hardly ever actually available. It sucks, but that’s where we’re at.
So in my new article on laser therapy, published last October, I did something new: I decided not to cite any “garbage in, garbage out” meta-analyses, because I am frankly just sick to death of citing science that only muddies the waters, and of having to explain the same problems with it ad nauseam. How many times have I had to come up with fresh ways to describe “methodological flaws,” or to explain that “more and better research is needed”? I’m running out of patience. So in this case, I dropped it. If the citations aren’t useful, don’t use them!
This upset some people, of course. After a decade of constantly being in people’s crosshairs for citing, suddenly I was getting flak for not citing. If by some staggering coincidence I chose to do this with a topic where there actually is good, positive research that I missed… well, wouldn’t that be ironic! Ultimately I’d enjoy being wrong in that way, but I am not holding my breath for that satisfaction.
Even if the evidence is useless, maybe it was a tactical mistake not to cite it. Obviously its absence makes the article easier to criticize and dismiss; maybe I can’t escape the need to cite the research for the sake of scholarly appearances. But, for the record, I didn’t neglect the allegedly “positive” evidence because I was unaware of it, but because it’s lame.