Bogus Citations

11 classic ways to self-servingly screw up references to science, like “the sneaky reach” or “the uncheckable”

Updated (first published 2009)
by Paul Ingraham, Vancouver, Canada
I am a science writer and a former Registered Massage Therapist with a decade of experience treating tough pain cases. I was the Assistant Editor of ScienceBasedMedicine.org for several years. I’ve written hundreds of articles and several books, and I’m known for readable but heavily referenced analysis, with a touch of sass. I am a runner and ultimate player. • more about me • more about PainScience.com

Someone’s devoted to good citations! Karina Pry’s wrist upgrade, photo by her.

Just because there are footnotes doesn’t make it true.[1] References to “scientific evidence” are routinely misleading and scammy.

A trivial, common example in my job: I got an email from a reader who wanted to make a point he thought I might be interested in. He included a reference to support his point — this is welcome, but quite unusual, as most people are just not that diligent. Unfortunately for his credibility, I actually checked that reference — imagine! — and I quickly realized that the paper did not actually support his point. And this wasn’t ambiguous.[2]

Unfortunately, even professionals can be that careless with their citations. The phenomenon of the bogus citation is almost a tradition in both real science and (especially) in science reporting. Full-blown irrelevancy in citations is surprisingly common, and milder forms of bogosity are epidemic. Real live scientists — not the better ones, of course, but scientists nevertheless — often “pad” their books and articles with barely relevant citations. They probably pulled the same crap with their term papers as undergrads!

Saddest and most common of all: there is now an actual industry of fraudulent and junky journals[3][4] that exists just to supply meaningless citations to people who want to create the appearance of scientific support for their sacred cows and cash cows. And so these days it is easy to find “support” for or “promising” evidence of basically anything — and no one will know the difference unless they are research literate and check.

This kind of stuff gives science a black eye. But it’s not that science is bad — it’s just bad science, bad scholarship, and fraud. This is why nothing can ever be “known” until several credible lines of research have all converged on the same point.

So, a word to the wise: citations are not inherently trustworthy.

[better citation needed]

11 common wrong ways (at least!) to refer to scientific evidence

The right way to refer to scientific evidence is to link to a relevant, good-quality study — no glaring flaws, like “statistical significance” being passed off as “importance” — published in a real peer-reviewed scientific journal (not a junk journal). There are quite a few wrong ways…

The clean miss: cite something that is topically relevant but just doesn’t actually support your point. It looks like a perfectly good footnote, but where’s the beef?

The sneaky reach: make just a bit too much of good and relevant evidence … without even realizing it yourself, probably.

The big reach: make waaaaay too much of otherwise good evidence.

The curve ball: cite perfectly good science that has little or nothing to do with the point.

The bluff (a.k.a. “the name drop”): cite something plucked more or less at random from a famous scientific journal like The New England Journal of Medicine … because no one actually checks references, do they?

The ego trip: cite your own work … which in turn cites only your own work … and so on …

The slum cite: cite research that perfectly supports your point, but is published by hacks and quacks in a crap journal no one’s ever heard of or ever will again.

The uncheckable: cite a chapter in a book no one can actually read, or would ever want to, because it has a title like Gaussian-Verdian Analysis of Phototropobloggospheric Keynsian Infidelivitalismness … and it’s been out of print for decades and it’s in Russian.

The backfire: cite science that actually undermines your point instead of supporting it. (Hard to believe, but it’s surprisingly common.)

The fake: cite case studies and other tarted-up anecdotal evidence — a nice way to cite lite.

The really low road: just make stuff up!

“Citogenesis”

xkcd #978 © xkcd.com by Randall Munroe

Geek-artist Randall Munroe made a highly relevant comment about that comic:

I just read a pop-science book by a respected author. One chapter, and much of the thesis, was based around wildly inaccurate data which traced back to … Wikipedia. To encourage people to be on their toes, I’m not going to say what book or author.

I am not completely innocent

I work my butt off to prevent bogus citations on this website, but I’m not perfect, it’s a hard job, and I’ve been learning the whole time I’ve been doing it. Sometimes I revisit a study I cited long ago and think, “Was I high?” It’s usually clear in retrospect that it was a straightforward case of wishful interpretation — reporting only the most useful, desired interpretation of the study — but once in a while I find one so wrong that my rationale for using it is just a head-scratcher.

This is not a common problem on PainScience.com, but it’s wise to acknowledge that I also crap out the occasional bogus citation.

Example: a clean miss of a clean miss!

Here’s a rich example of a “clean miss” double-whammy, from a generally good article about controversy over the safety and effectiveness of powerful narcotic drugs. The authors are generally attacking the credibility of the American Pain Foundation’s position, and in this passage they accuse the APF of supporting a point with an irrelevant reference — a “clean miss.” But the accusation is based on a citation that isn’t actually there — a clean miss of their own! (At least, not one I could find in a good twenty minutes of poring over the document. Maybe it’s there, but after twenty minutes of looking I was beginning to question the sanity of the time investment.)

Another guide, written for journalists and supported by Alpharma Pharmaceuticals, likewise is reassuring. It notes in at least five places that the risk of opioid addiction is low, and it references a 1996 article in Scientific American, saying fewer than 1 percent of children treated with opioids become addicted.

But the cited article does not include this statistic or deal with addiction in children.

Actually, after a careful search, I can find no such Scientific American article cited in the APF document at all. So the APF’s point does not appear to be properly supported, but then again neither is the accusation.

Example: SleepTracks.com tries to sell a dubious product with bogus references to science

I was making some corrections to my insomnia tutorial when curiosity got the better of me and I clicked on an advertisement for SleepTracks.com, where a personable blogger named “Yan” is hawking an insomnia cure: “brain entrainment” by listening to “isochronic tones,” allegedly superior to the more common “binaural beats” method.

Yan goes to considerable lengths to portray his product as scientifically valid, advanced and modern, and he actually had me going for a while. He tells readers that he’s done “a lot of research.” To my amazement, he even cited some scientific papers. I had been so lulled by his pleasant writing tone that I almost didn’t check ‘em.

“Here are a few sleep-related scientific papers you can reference,” Yan writes, and then he supplies three references.

Notice anything odd about them? They lack dates. Hmm. I wonder why? Could it be because they’re from the stone age?

Those papers are from 1967, 1981, and 1982. In case you’ve lost track of time, 1982 was 35 years ago — not exactly “recent” research, particularly when you consider how far neuroscience has come in the last twenty years. Now, old research isn’t necessarily useless, but Yan was bragging about how his insomnia treatment method is based on modern science. And the only three references he can come up with pre-date the internet by a decade? One of them pre-dates me.

Clearly, these are references intended to make him look good. Yan didn’t actually think anyone would look them up. Yan was wrong. I looked them up. And their age isn’t the worst of it.

The 1981 study had negative results. (The backfire!) The biofeedback methods studied — which aren’t even the same thing Yan is selling, just conceptually related — didn’t actually work: “No feedback group showed improved sleep significantly.” Gosh, Yan, thanks for doing that research! I sure am glad to know that your product is based on thirty-year-old research that showed that a loosely related treatment method completely flopped!

The 1982 study? This one actually had positive results, but again it studied something only sorta related. And the sample size? Sixteen patients — a microscopically small study, good for proving nothing.
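To put a number on “good for proving nothing,” here’s a quick, hypothetical power calculation. The numbers are my own illustrative assumptions, not figures from the actual study: assuming two equal groups of eight patients and a moderate true effect, a standard two-sample test would detect that effect only about 15% of the time.

    # A rough sketch of the power problem with a 16-patient trial.
    # Assumptions (mine, for illustration only): two equal groups of 8,
    # a moderate true effect (Cohen's d = 0.5), and the usual alpha of 0.05.
    from statsmodels.stats.power import TTestIndPower

    power = TTestIndPower().power(effect_size=0.5, nobs1=8, ratio=1.0, alpha=0.05)
    print(f"Chance of detecting the effect: {power:.0%}")  # roughly 15%

In other words, even if the treatment worked moderately well, a study this small would most likely fail to show it; and when a study this small does “succeed,” the result is that much more likely to be a fluke.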

The 1967 study? Not even a test of a therapy: just basic research. Fine if you’re interested in what researchers thought about brain waves and insomnia before ABBA. Fine as one of numerous other references. But as one of only three references intended to support the efficacy of your product? Is this a joke?

So Yan gets the Bogus Citation Prize of the Year: not only are his citations barely relevant and ancient, but they were deliberately cited without dates, obviously to keep them from looking as silly as they are.

“The fake” 

Anecdotal evidence isn’t real evidence … but it can kind of seem like real evidence if it’s in a footnote! Illustration used with the kind permission of Zach Weiner, of Saturday Morning Breakfast Cereal. Thanks, Zach!

Better citations and footnotes

Here in the Salamander’s domain, there are no bogus citations. (Certainly none I know about!)

Citations here are harvested and masterfully hand-crafted from only the finest pure, organic, heirloom, artisan, fair-trade sources. If I had a TV ad for this, there’d be oak barrels and a kindly old Italian gentleman in a leather apron, holding up sparkling citations in the dappled sunshine before uploading them to ye olde FTP server.

But how are you to believe this?

To actually trust citations without checking them yourself, you have to really trust the author. But you can only trust an author by actually checking their references quite a few times first. Even after trust is established, you should probably still check the occasional reference — for your own edification, if nothing else. And of course you should always check references when the truth actually matters to you. That goes without saying, right?

This is why I have gone to considerable technological lengths on PainScience.com not just to cite my sources for every key point, but also to provide user-friendly links to the original material … which makes them easy to check! This is a basic principle of responsible publishing of health care information online.

If you’re not going to leverage technology to facilitate reference-checking, why even bother?


About Paul Ingraham

I am a science writer, former massage therapist, and I was the assistant editor at ScienceBasedMedicine.org for several years. I have had my share of injuries and pain challenges as a runner and ultimate player. My wife and I live in downtown Vancouver, Canada. See my full bio and qualifications, or my blog, Writerly. You might run into me on Facebook or Twitter.

What’s new in this article?

September: Added a brief acknowledgement of my own fallibility in the citation department.

July: Added two citations substantiating the prevalence of predatory/fraudulent journals.

February: Added a paragraph about the existence and purpose of the junk journal industry.

2009: Publication.

Notes

  1. This footnote, for instance, in no way supports the statement it follows, and is present only for the purpose of being mildly amusing.
  2. It wasn’t a matter of interpretation or opinion. It was just way off: he had read much more into the paper than the researchers had ever intended anyone to get from it. The authors had clearly defined the limits of what could be interpreted from their evidence.
  3. Beall J. What I learned from predatory publishers. Biochem Med (Zagreb). 2017 Jun;27(2):273–278. PubMed #28694718. PainSci #52870. Abstract: “This article is a first-hand account of the author’s work identifying and listing predatory publishers from 2012 to 2017. Predatory publishers use the gold (author pays) open access model and aim to generate as much revenue as possible, often foregoing a proper peer review. The paper details how predatory publishers came to exist and shows how they were largely enabled and condoned by the open-access social movement, the scholarly publishing industry, and academic librarians. The author describes tactics predatory publishers used to attempt to be removed from his lists, details the damage predatory journals cause to science, and comments on the future of scholarly publishing.”
  4. Gasparyan AY, Yessirkepov M, Diyanova SN, Kitas GD. Publishing Ethics and Predatory Practices: A Dilemma for All Stakeholders of Science Communication. J Korean Med Sci. 2015 Aug;30(8):1010–6. PubMed #26240476. PainSci #52903. “Over the past few years, numerous illegitimate or predatory journals have emerged in most fields of science.”