
Bogus Citations

13 classic ways to self-servingly screw up references to science, like “the sneaky reach” or “the uncheckable”

updated (first published 2009)
by Paul Ingraham, Vancouver, Canada
I am a science writer and a former Registered Massage Therapist with a decade of experience treating tough pain cases. I was the Assistant Editor of ScienceBasedMedicine.org for several years. I’ve written hundreds of articles and several books, and I’m known for readable but heavily referenced analysis, with a touch of sass. I am a runner and ultimate player. • more about me • more about PainScience.com

Someone’s devoted to good citations! Karina Pry’s wrist upgrade, photo by her.

Just because there are footnotes doesn’t make it true.1 References to “scientific evidence” are routinely misleading and scammy.

A trivial, common example in my job: I got an email from a reader who wanted to make a point he thought I might be interested in. He included a reference to support his point — this is welcome, but quite unusual, as most people are just not that diligent. Unfortunately for his credibility, I actually checked that reference — imagine! — and I quickly realized that the paper did not actually support his point. Like, not even close. It wasn’t ambiguous.2

Unfortunately, even professionals can be that careless with their citations. The phenomenon of the bogus citation is almost a tradition in both real science and (especially) in science reporting. Full-blown irrelevancy in citations is surprisingly common, and milder forms of bogosity are epidemic. Real live scientists — not the better ones, of course, but scientists nevertheless — often “pad” their books and articles with barely relevant citations. They probably pulled the same crap with their term papers as undergrads!

But the most common and saddest problem of all: there is now an actual industry — a huge industry — of both junky and outright fraudulent journals3,4 (full discussion of this problem below). And so it is now easy to find “support” for “promising” evidence of basically anything — and no one will know the difference unless they are research literate and diligent enough to check.

This kind of stuff gives science a black eye. But it’s not that science is bad — it’s just bad science, bad scholarship, and fraud making real science look bad. It’s a major reason why nothing can ever be “known” until several credible lines of research have all converged on the same point.

So, a word to the wise: citations are not inherently trustworthy.

[better citation needed]

13 common wrong ways (at least!) to cite

The right way to cite scientific evidence is to link to a relevant, good-quality study — no glaring flaws, like “statistical significance” being passed off as “importance” — published in a real peer-reviewed scientific journal (not a junk journal). There are quite a few wrong ways…

The clean miss: cite something that is topically relevant but which just doesn’t actually support your point. It looks like a perfectly good footnote, but where’s the beef?

The sneaky reach: make just a bit too much of good and relevant evidence … without even realizing it yourself, probably.

The big reach: make waaaaay too much of otherwise good evidence.

The curve ball: reference to perfectly good science that has little or nothing to do with the point.

“New Study”

xkcd #1240 © xkcd.com by Randall Munroe

The non-cite: reference to junk science that is not actually peer-reviewed at all — published in a fraudulent “predatory journal” that will publish literally anything. This is becoming an extremely common type of bogus citation.

The junk cite: reference to junk science in “real” but terrible journals, only technically “peer-reviewed science,” but peer-reviewed badly. This is science done by hacks and quacks, and approved for publication by their highly biased and unqualified peers, for a rag of a journal that exists just to publish what real journals won’t touch. Alt-med has thousands of these. The emphasis in alt-med research is on setting the bar low enough to get over it without anyone but those nasty skeptics noticing.

The bluff (A.K.A. “the name drop”): citation selected at random from a famous scientific journal like The New England Journal of Medicine … because no one actually checks references, do they? See also the discussion of NASA studies below.

The ego trip: cite your own work … which in turn cites only your own work … and so on …

The uncheckable: citing a chapter in a book no one can or would ever want to actually read, because it has a title like Gaussian-Verdian Analysis of Phototropobloggospheric Keynsian Infidelivitalismness … and it’s been out of print for decades and it’s in Russian.

The fake: just make stuff up! Often hard to distinguish from the uncheckable.

The backfire: citing science that actually undermines the point, instead of supporting it. It sounds hard to believe, but this is actually common!

The lite cite: case studies and other tarted up anecdotal evidence are a nice way to cite lite.

The dump cite: many, many references. A reference dump is meant to overwhelm rhetorical opposition with sheer volume — too many to meaningfully evaluate or respond to.

“Citogenesis”

xkcd #978 © xkcd.com by Randall Munroe

Geek-artist Randall Munroe made a highly relevant comment about that comic:

I just read a pop-science book by a respected author. One chapter, and much of the thesis, was based around wildly inaccurate data which traced back to … Wikipedia. To encourage people to be on their toes, I’m not going to say what book or author.

A weird bonus bogus citation type: NASA studies

Any time you hear that something in health care is allegedly “based on a NASA study,” it’s probably at least partially bullshit. Its importance is being carelessly or deliberately inflated, often for commercial purposes, by leveraging the power of the NASA brand.

The finding often does not actually come from NASA at all (a “name drop” citation). Even if it does, it probably doesn’t really support the claim being made (a “big reach”), or it’s not even relevant at all (a “clean miss”). Many people are so dazzled by the idea of NASA as a source that they will cite NASA research if it’s even remotely connected to what they care about.

NASA does fund a lot of research, but they do not magically produce better research than other organizations. Emphasizing that they funded a study usually means someone isn’t focusing on what matters: results, not sources.

Confession: I have used bogus citations myself occasionally, by accident

I work my butt off to prevent bogus citations on this website, but I’m not perfect, it’s a hard job, and I’ve been learning the whole time I’ve been doing it. Sometimes I re-read a citation I wrote long ago and think, “Was I high?” It’s usually clear in retrospect that it was a straightforward case of wishful interpretation — reporting only the most useful, pleasing, headline-making interpretation of the study — but once in a while I find one so wrong that my rationale for ever using it is just a head-scratcher.

This is not a common problem on PainScience.com, but it’s wise to acknowledge that I also crap out the occasional bogus citation.

So many times I’ve been intrigued by the abstract of a scientific paper, dug into the whole thing & COULD NOT FIND THE REST OF THE PEPPERONI.

Today there is mind-bogglingly more junk science available to cite than ever before, because of the “predatory journal” crisis

Pseudoscience and crappy journals have always existed. But it’s worse now. Much worse.

There are now several thousand (at a bare minimum) completely fraudulent scientific journals that exist only to take money from gullible and/or desperate academics who must “publish or perish.” It’s a scam that has exploded into a full-blown industry on an almost unthinkable scale since the 1990s. These so-called journals collectively publish millions of papers annually with “no or trivial peer review, no obvious quality control, and no editorial board oversight.”5

Tragically, most of the papers they publish have every superficial appearance of being real scientific papers. The substantive difference is that real scientific papers are peer-reviewed, and these papers are not! But the authors are actual scientists, who work at real organizations, who actually care about their research and their careers — if they didn’t, they wouldn’t pay predatory journals to publish their papers! Some of their research might even be half decent, and might have even made it into a real journal. But only some. Much of it is blatantly incompetent, fraudulent, or just daft, and wouldn’t have had a snowball’s chance in hell of being accepted by a real journal. Even most of the better ones wouldn’t cut it at a real journal, or at least not without significant improvements — because that’s what peer review is all about, separating the wheat from the chaff.

But it’s easy to get published in one of these predatory journals, no matter how bad your paper is. They will publish literal gibberish — this has been demonstrated repeatedly — as long as you pay them. But the real problem is that they publish so many papers that look like real science to the casual observer.

All of this not-peer-reviewed research now constitutes a substantial percentage of the papers I have to wade through when I’m doing my job, trying to get to the truth about pain science. When I complain that there’s a lot of junky science out there, I don’t just mean that there’s some low quality science — which was true long before predatory journals existed — I mean that the scientific literature is severely polluted with actual non-science, with an insane number of papers that were published under entirely false pretenses, the fruit of fraud.

This isn’t just a “problem” for science—it’s a rapidly developing disaster that “wastes taxpayer money, chips away at scientific credibility, and muddies important research.”

“Dubious Study”

xkcd #1847 © xkcd.com by Randall Munroe

Examples of bogus citations in the wild

A clean miss of a clean miss!

Here’s a rich example of a “clean miss” double-whammy, from a generally good article about controversy over the safety and effectiveness of powerful narcotic drugs. The authors are generally attacking the credibility of the American Pain Foundation’s position, and in this passage they accuse the APF of supporting a point with an irrelevant reference — a “clean miss.” But the accusation is based on a citation that isn’t actually there — a clean miss of their own! (None that I could find in a good 20 minutes of poring over the document, anyway. Maybe it’s there, but after 20 minutes of looking I was beginning to question the sanity of the time investment.)

Another guide, written for journalists and supported by Alpharma Pharmaceuticals, likewise is reassuring. It notes in at least five places that the risk of opioid addiction is low, and it references a 1996 article in Scientific American, saying fewer than 1 percent of children treated with opioids become addicted.

But the cited article does not include this statistic or deal with addiction in children.

Actually, after a careful search, I can find no such Scientific American article cited in the APF document at all. So the APF’s point does not appear to be properly supported, but then again neither is the accusation.

SleepTracks.com tries to sell a dubious product with bogus references to science

I was making some corrections to my insomnia tutorial when, out of curiosity, I clicked on an advertisement for SleepTracks.com, where a personable blogger named “Yan” is hawking an insomnia cure: “brain entrainment” by listening to “isochronic tones,” allegedly superior to the more common “binaural beats” method.

Yan goes to considerable lengths to portray his product as scientifically valid, advanced and modern, and he actually had me going for a while. He tells readers that he’s done “a lot of research.” To my amazement, he even cited some scientific papers. I had been so lulled by his pleasant writing tone that I almost didn’t check ‘em.

“Here are a few sleep-related scientific papers you can reference,” Yan writes, and then he supplies three references.

The odd thing about those references? They lack dates. Hmm. I wonder why? Could it be because they’re from the stone age?

Those papers are from 1967, 1981, and 1982. In case you’ve lost track of time, 1982 was 36 years ago — not exactly “recent” research, particularly when you consider how far neuroscience has come in the last twenty years. Now, old research isn’t necessarily useless, but Yan was bragging about how his insomnia treatment method is based on modern science. And the only three references he can come up with pre-date the internet by a decade? One of them pre-dates me.

Clearly, these are references intended to make him look good. Yan didn’t actually think anyone would look them up. Yan was wrong. I looked them up. And their age isn’t the worst of it.

The 1981 study had negative results. (The Backfire!) The biofeedback methods studied — which aren’t even the same thing Yan is selling, just conceptually related — didn’t actually work: “No feedback group showed improved sleep significantly.” Gosh, Yan, thanks for doing that research! I sure am glad to know that your product is based on thirty-year-old research that showed that a loosely related treatment method completely flopped!

The 1982 study? This one actually had positive results, but again studying something only sorta related. And the sample size? Sixteen patients — a microscopically small study, good for proving nothing.

The 1967 study? Not even a test of a therapy: just basic research. Fine if you’re interested in what researchers thought about brain waves and insomnia before ABBA. Fine as one citation among many others. But as one of three references intended to support the efficacy of your product? Is this a joke?

So Yan gets the Bogus Citation Prize of the Year: not only are his citations barely relevant and ancient, but they are obviously deliberately cited without dates to keep them from looking as silly as they are.

“The fake” 

Anecdotal evidence isn’t real evidence … but it can kind of seem like real evidence if it’s in a footnote! Illustration used with the kind permission of Zach Weiner, of Saturday Morning Breakfast Cereal. Thanks, Zach!

Better citations and footnotes

Here in the Salamander’s domain, there are no bogus citations. (Certainly none I know about!)

Citations here are harvested and hand-crafted masterfully from only the finest pure organic heirloom artisan fair-trade sources. If I had a TV ad for this, there’d be oak barrels and a kindly old Italian gentleman wearing a leather apron and holding up sparkling citations in the dappled sunshine before uploading them to ye olde FTP server.

But how are you to believe this?

To actually trust citations without checking them yourself, you have to really trust the author. But you can only trust an author by actually checking their references quite a few times first. Even after trust is established, you should probably still check the occasional reference — for your own edification, if nothing else. And of course you should always check references when the truth actually matters to you. That goes without saying, right?

This is why I have gone to considerable technological lengths on PainScience.com not just to cite my sources for every key point, but also to provide user-friendly links to the original material … which makes them easy to check! This is a basic principle of responsible publishing of health care information online.

If you’re not going to leverage technology to facilitate reference-checking, why even bother?


About Paul Ingraham

Headshot of Paul Ingraham, short hair, neat beard, suit jacket.

I am a science writer, former massage therapist, and I was the assistant editor at ScienceBasedMedicine.org for several years. I have had my share of injuries and pain challenges as a runner and ultimate player. My wife and I live in downtown Vancouver, Canada. See my full bio and qualifications, or my blog, Writerly. You might run into me on Facebook or Twitter.

What’s new in this article?

Five updates have been logged for this article since publication (2009). All PainScience.com updates are logged to show a long-term commitment to quality, accuracy, and currency. Like good footnotes, update logging sets PainScience.com apart from most other health websites and blogs. It’s fine print, but important fine print, in the same spirit of transparency as the editing history available for Wikipedia pages.

I log any change to articles that might be of interest to a keen reader. Complete update logging started in 2016. Prior to that, I only logged major updates for the most popular and controversial articles.

See the What’s New? page for all recent site updates.

August: Added two xkcd comics: “Dubious Study” and “New Study.” This article is now chock-a-block with comics! Probably more comics per square inch than any other page on PainScience.com.

June: Big update today: added thorough discussion of the problem of predatory journals. Added three new bogus citation types: the non-cite, the junk cite, and the dump cite. Added a short section about citations to NASA studies.

2017: Added a brief acknowledgement of my own fallibility in the citation department.

2017: Added two citations substantiating the prevalence of predatory/fraudulent journals.

2017: Added a paragraph about the existence and purpose of the junk journal industry.

2009: Publication.

Notes

  1. This footnote, for instance, in no way supports the statement it follows, and is present only for the purpose of being mildly amusing. BACK TO TEXT
  2. It wasn’t a matter of interpretation or opinion. It was just way off: he had read much more into the paper than the researchers had ever intended anyone to get from it. The authors had clearly defined the limits of what could be interpreted from their evidence. BACK TO TEXT
  3. Beall J. What I learned from predatory publishers. Biochem Med (Zagreb). 2017 Jun;27(2):273–278. PubMed #28694718. PainSci #52870.

    ABSTRACT

    This article is a first-hand account of the author's work identifying and listing predatory publishers from 2012 to 2017. Predatory publishers use the gold (author pays) open access model and aim to generate as much revenue as possible, often foregoing a proper peer review. The paper details how predatory publishers came to exist and shows how they were largely enabled and condoned by the open-access social movement, the scholarly publishing industry, and academic librarians. The author describes tactics predatory publishers used to attempt to be removed from his lists, details the damage predatory journals cause to science, and comments on the future of scholarly publishing.

    BACK TO TEXT
  4. Gasparyan AY, Yessirkepov M, Diyanova SN, Kitas GD. Publishing Ethics and Predatory Practices: A Dilemma for All Stakeholders of Science Communication. J Korean Med Sci. 2015 Aug;30(8):1010–6. PubMed #26240476. PainSci #52903. “Over the past few years, numerous illegitimate or predatory journals have emerged in most fields of science.” BACK TO TEXT
  5. Blogs.jwatch.org [Internet]. Sax P. Predatory Journals Are Such a Big Problem It's Not Even Funny; 2018 June 14 [cited 18 Jun 21]. BACK TO TEXT