
13 Kinds of Bogus Citations

Classic ways to self-servingly screw up references to science, like “the sneaky reach” or “the uncheckable”

Paul Ingraham

Someone’s devoted to good citations! Karina Pry’s wrist upgrade, photo by her.

Just because there are footnotes doesn’t make it true.1 References to “scientific evidence” are routinely misleading and scammy.

A trivial, common example in my job: I got an email from a reader who wanted to make a point he thought I might be interested in. He included a reference to support his point — this is welcome, but quite unusual, as most people are just not that diligent. Unfortunately for his credibility, I actually checked that reference — imagine! — and I quickly realized that the paper did not actually support his point. Like, not even close. It wasn’t ambiguous.2

Unfortunately, even professionals can be that careless with their citations. The phenomenon of the bogus citation is almost a tradition in both real science and (especially) in science reporting. Full-blown irrelevancy in citations is surprisingly common, and milder forms of bogosity are epidemic. Real live scientists — not the better ones, of course, but scientists nevertheless — often “pad” their books and articles with barely relevant citations. They probably pulled the same crap with their term papers as undergrads!

But the most common and saddest problem: there is now an actual industry — a huge industry — of both junky and outright fraudulent journals3,4 (a full discussion of this problem appears below). And so it is now easy to find “support” for “promising” evidence of basically anything — and no one will know the difference unless they are research literate and diligent enough to check.

This kind of stuff gives science a black eye. But it’s not that science is bad — it’s just bad science, bad scholarship, and fraud making real science look bad. It’s a major reason why nothing can ever be “known” until several credible lines of research have all converged on the same point.

So, a word to the wise: citations are not inherently trustworthy.

[better citation needed]

13 common wrong ways (at least!) to cite

The right way to cite scientific evidence is to link to a relevant, good-quality study — no glaring flaws, like “statistical significance” being passed off as “importance” — published in a real peer-reviewed scientific journal (not a junk journal). There are quite a few wrong ways …

The clean miss: cite something that is topically relevant but which just doesn’t actually support your point. It looks like a perfectly good footnote, but where’s the beef?

The sneaky reach: make just a bit too much of good and relevant evidence … without even realizing it yourself, probably.

The big reach: make waaaaay too much of otherwise good evidence.

The curve ball: reference to perfectly good science that has little or nothing to do with the point.

“New Study”

xkcd #1240 © xkcd.com by Randall Munroe

The non-cite: reference to junk science that is not actually peer-reviewed at all — published in a fraudulent “predatory journal” that will publish literally anything. This is becoming an extremely common type of bogus citation.

The junk cite: reference to junk science in “real” but terrible journals, only technically “peer-reviewed science,” but peer-reviewed badly. This is science done by hacks and quacks, and approved for publication by their highly biased and unqualified peers, for a rag of a journal that exists just to publish what real journals won’t touch. Alt-med has thousands of these. The emphasis in alt-med research is on setting the bar low enough to get over it without anyone but those nasty skeptics noticing.

The bluff (A.K.A. “the name drop”): citation selected at random from a famous scientific journal like The New England Journal of Medicine… because no one actually checks references, do they? See also NASA studies, below.

The ego trip: cite your own work … which in turn cites only your own work … and so on …

The uncheckable: citing a chapter in a book no one can or would ever want to actually read, because it has a title like Gaussian-Verdian Analysis of Phototropobloggospheric Keynsian Infidelivitalismness… and it’s been out of print for decades and it’s in Russian.

The fake: just make stuff up! Often hard to distinguish from the uncheckable. There have been cases of extensive phantom citations to papers that never existed.

The backfire: citing science that actually undermines the point, instead of supporting it. It sounds hard to believe, but this is actually common!

The lite cite: case studies and other tarted up anecdotal evidence are a nice way to cite lite.

The dump cite: many, many references. A reference dump is meant to overwhelm rhetorical opposition with sheer volume — too many to meaningfully evaluate or respond to.

So many times I’ve been intrigued by the abstract of a scientific paper, dug into the whole thing & COULD NOT FIND THE REST OF THE PEPPERONI.

Cherry-picking and nit-picking

Citations are often “cherry picked” — that is, only the best evidence supporting a point is cited. The citations themselves may not be bogus, but the pattern of citation is, because it’s suspiciously lacking contrary evidence.

Less familiar: advanced cherry pickers also exaggerate the flaws of studies they don’t like (to justify dismissing them). Such nitpicking can easily seem like credible critical analysis, but it’s easy to find problems with even the best studies. Research is a messy, collaborative enterprise, and perfect studies are as rare as perfect movies. When we don’t like the conclusions, we are much likelier to see research flaws and blow them out of proportion. It works like this…

Flow chart: first cell, new study published. Second, does it confirm my beliefs? If yes, must be a good study. If no, must be a bad study, nitpick and find flaws, bad study confirmed. Both pathways lead to the conclusion: I was right all along!

No one is immune to bias, and evaluating scientific evidence fairly is really tricky. At this point you might be wondering how we can ever trust any citations. But it gets even worse…

What if citations avoid all of these pitfalls? They can still be bogus!

Genuine and serious research flaws are often invisible. Famously, “Most Published Research Findings Are False,”5 even when there’s nothing obviously wrong with them. It’s amazing how many ways “good” studies can still be bad, how easily they can go wrong, or be made to go wrong, by well-intentioned researchers trying to prove pet theories.6 Musculoskeletal medicine is plagued by junky, underpowered studies with “positive” results that just aren’t credible — all they do is muddy the waters.7
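A quick back-of-the-envelope illustration of how that happens, with made-up but plausible numbers (purely for illustration): suppose only 10% of the hypotheses tested in a field are actually true, and the studies are so small that they have just 30% power, with the standard 5% false-positive rate. Out of 1,000 studies, about 30 will correctly detect a true effect (100 true hypotheses × 30% power), while about 45 will “detect” an effect that isn’t there (900 false hypotheses × 5%). That is, 45 of the 75 “positive” results, fully 60%, are false. A literature dominated by small studies can be overflowing with “promising” findings and still be mostly wrong.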

So beware of citations to a small number of small studies… even if there’s nothing obviously wrong with them.

The life cycle of bogus citations

I’ll now pass the baton to geek-cartoonist Randall Munroe to explain how bogus references proliferate:

“Citogenesis”

xkcd #978 © xkcd.com by Randall Munroe

Munroe made a highly relevant comment about that comic:

I just read a pop-science book by a respected author. One chapter, and much of the thesis, was based around wildly inaccurate data which traced back to … Wikipedia. To encourage people to be on their toes, I’m not going to say what book or author.

A weird bonus bogus citation type: NASA studies

Any time you hear that something in health care is allegedly “based on a NASA study,” it’s probably at least partially bullshit. Its importance is being carelessly or deliberately inflated, often for commercial purposes, by leveraging the power of the NASA brand.

The finding often does not actually come from NASA at all (a “name drop” citation). Even if it does, it probably doesn’t really support the claim being made (a “big reach”), or it’s not even relevant at all (a “clean miss”). Many people are so dazzled by the idea of NASA as a source that they will cite NASA research if it’s even remotely connected to what they care about.

NASA does fund a lot of research, but they do not magically produce better research than other organizations. Emphasizing that they funded a study usually means someone isn’t focusing on what matters: results, not sources.

Photo of Mindy Kaling playing a doctor, captioned “No really tell me again how great the study is… based on the abstract.”

Most abstracts look perfectly cromulent. Damning & dubious details are often not just “missing” from the abstract, but deliberately excluded.

Confession: I have used bogus citations myself occasionally, by accident

I work my butt off to prevent bogus citations on this website, but I’m not perfect, it’s a hard job, and I’ve been learning the whole time I’ve been doing it. Sometimes I read a scientific study that I wrote about long ago and I think, “Was I high?” It’s usually clear in retrospect that it was a straightforward case of wishful interpretation — reporting only the most useful, pleasing, headline-making interpretation of the study — but once in a while I find one so wrong that my rationale for ever using it is just a head-scratcher.

This is not a common problem on PainScience.com, but it’s wise to acknowledge that I also crap out the occasional bogus citation, and I’ve probably unfairly demoted some studies I didn’t like. No one is immune to bias. The only real defense is a relentless and transparent effort to combat citation bogusity.

Today there is mind-bogglingly more junk science available to cite than ever before, because of the “predatory journal” crisis

Pseudoscience and crappy journals have always existed. But it’s worse now. Much worse.

There are now, at a bare minimum, several thousand completely fraudulent scientific journals that exist only to take money from gullible and/or desperate academics who must “publish or perish.” It’s a scam that has exploded into a full-blown industry on an almost unthinkable scale since the 1990s. These so-called journals collectively publish millions of papers annually with “no or trivial peer review, no obvious quality control, and no editorial board oversight.”8 Oy.

Tragically, most of the papers they publish have every superficial appearance of being real scientific papers. The substantive difference is that real scientific papers are peer-reviewed, and these papers are not! But the authors are actual scientists, who work at real organizations, who actually care about their research and their careers — if they didn’t, they wouldn’t pay predatory journals to publish their papers! Some of their research might even be half decent, and might have even made it into a real journal. But only some. Much of it is blatantly incompetent, fraudulent, or just daft, and wouldn’t have had a snowball’s chance in hell of being accepted by a real journal. Even most of the better ones wouldn’t cut it at a real journal, or at least not without significant improvements — because that’s what peer review is all about, separating the wheat from the chaff.

But it’s easy to get published in one of these predatory journals, no matter how bad your paper is. They will publish literal gibberish — this has been demonstrated repeatedly — as long as you pay them. But the real problem is that they publish so many papers that look like real science to the casual observer.

All of this not-peer-reviewed research now constitutes a substantial percentage of the papers I have to wade through when I’m doing my job, trying to get to the truth about pain science. When I complain that there’s a lot of junky science out there, I don’t just mean that there’s some low quality science — which was true long before predatory journals existed — I mean that the scientific literature is severely polluted with actual non-science, with an insane number of papers that were published under entirely false pretenses, the fruit of fraud.

This isn’t just a “problem” for science—it’s a rapidly developing disaster that “wastes taxpayer money, chips away at scientific credibility, and muddies important research.”

“Dubious Study”

xkcd #1847 © xkcd.com by Randall Munroe

Examples of bogus citations in the wild

A clean miss of a clean miss!

Here’s a rich example of a “clean miss” double-whammy, from a generally good article about controversy over the safety and effectiveness of powerful narcotic drugs. The authors are generally attacking the credibility of the American Pain Foundation’s position, and in this passage they accuse the APF of supporting a point with an irrelevant reference — a “clean miss.” But the accusation is based on a citation that isn’t actually there — a clean miss of their own! (None that I could find, anyway, in a good 20 minutes of poring over the document. Maybe it’s there, but I was beginning to question the sanity of the time investment.)

Another guide, written for journalists and supported by Alpharma Pharmaceuticals, likewise is reassuring. It notes in at least five places that the risk of opioid addiction is low, and it references a 1996 article in Scientific American, saying fewer than 1 percent of children treated with opioids become addicted.

But the cited article does not include this statistic or deal with addiction in children.

Actually, after a careful search, I can find no such Scientific American article cited in the APF document at all. So the APF’s point does not appear to be properly supported, but then again neither is the accusation.

SleepTracks.com tries to sell a dubious product with bogus references to science

I was making some corrections to my insomnia tutorial when I curiously clicked on an advertisement for SleepTracks.com, where a personable blogger named “Yan” is hawking an insomnia cure: “brain entrainment” by listening to “isochronic tones,” allegedly superior to the more common “binaural beats” method.

Yan goes to considerable lengths to portray his product as scientifically valid, advanced and modern, and he actually had me going for a while. He tells readers that he’s done “a lot of research.” To my amazement, he even cited some scientific papers. I had been so lulled by his pleasant writing tone that I almost didn’t check ‘em.

“Here are a few sleep-related scientific papers you can reference,” Yan writes, and then he supplies three references.

Notice anything odd about them? They lack dates. Hmm. I wonder why? Could it be because they’re from the stone age?

Those papers are from 1967, 1981, and 1982. In case you’ve lost track of time, 1982 was 37 years ago — not exactly “recent” research, particularly when you consider how far neuroscience has come in the last twenty years. Now, old research isn’t necessarily useless, but Yan was bragging about how his insomnia treatment method is based on modern science. And the only three references he can come up with pre-date the internet by a decade? One of them pre-dates me.

Clearly, these are references intended to make him look good. Yan didn’t actually think anyone would look them up. Yan was wrong. I looked them up. And their age isn’t the worst of it.

The 1981 study had negative results. (The backfire!) The biofeedback methods studied — which aren’t even the same thing Yan is selling, just conceptually related — didn’t actually work: “No feedback group showed improved sleep significantly.” Gosh, Yan, thanks for doing that research! I sure am glad to know that your product is based on thirty-year-old research showing that a loosely related treatment method completely flopped!

The 1982 study? This one actually had positive results, but again it studied something only sorta related. And the sample size? Sixteen patients — a microscopically small study, good for proving nothing.

The 1967 study? Not even a test of a therapy: just basic research. Fine if you’re interested in what researchers thought about brain waves and insomnia before ABBA. Fine as one of numerous other references. But as one of only three references intended to support the efficacy of your product? Is this a joke?

So Yan gets the Bogus Citation Prize of the Year: not only are his citations barely relevant and ancient, but they were obviously cited without dates deliberately, to keep them from looking as silly as they are.

“The fake” 

Anecdotal evidence isn’t real evidence … but it can kind of seem like real evidence if it’s in a footnote! Illustration used with the kind permission of Zach Weiner, of Saturday Morning Breakfast Cereal. Thanks, Zach!

Who cites so badly? Dramatis personae

Bogus citations are often accompanied by strange claims of science expertise and credentials. This happens so often, especially on social media, that I have started to classify them. I think there is a spectrum you can simplify into three major types: the Bullshitter, the Lightweight, and the Bad Scientist.

The Bullshitter is just lying. They toss their completely invented credential around in the hopes it will make them more credible, but they are so clueless that they don’t realize how transparent the lie is. For instance, someone who claims to have a degree in “science” obviously knows nothing about science, or they would have been more specific. It’s like a cat fluffing up to fool a human into thinking it’s bigger than it is — we know what you’re doing, kitty.

Photo of a smug hippy, captioned “I have a degree in science. And I also know better than science.”

The Lightweight is technically telling the truth: they actually have a relevant background, but it doesn’t mean much. They were at the bottom of their class, they aren’t a working scientist, and they haven’t cracked a book since they graduated. Competence goes way beyond just squeaking through a degree program, even a master’s, even in a hard science. But, a decade later, there they are on social media claiming to be a “scientist”!

The Bad Scientist is technically a real scientist, someone working as a scientist in a scientific context. But even “real scientists” can be terrible at their jobs, and many are. They can also have terrible jobs that do not challenge them to continue learning. Real scientists can be sloppy and dishonest and even just not that bright; they can fall into pseudoscientific rabbit holes and disappear from legitimate science forever; and they can be remarkably ignorant of the history and process of science. Bottom line: most scientists aren’t a Carl Sagan or even a Bill Nye. Many aren’t even up to the level of the MythBusters. In short, many scientists are much more like technicians than the curiosity-driven polymaths and super dorks that have come to represent “scientist.”

Better citations and footnotes

Here in the Salamander’s domain, there are hopefully not many bogus citations, and certainly none I know about!

Citations here are harvested and hand-crafted masterfully from only the finest pure organic heirloom artisan fair-trade sources. If I had a TV ad for this, there’d be oak barrels and a kindly old Italian gentleman wearing a leather apron and holding up sparkling citations in the dappled sunshine before uploading them to ye olde FTP server.

But how are you to believe this?

To actually trust citations without checking them yourself, you have to really trust the author. But you can only trust an author by actually checking their references quite a few times first. Even after trust is established, you should probably still check the occasional reference — for your own edification, if nothing else. And of course you should always check references when the truth actually matters to you. That goes without saying, right?

This is why I have gone to considerable technological lengths on PainScience.com not just to cite my sources for every key point, but also to provide user-friendly links to the original material… which makes them easy to check! This is a basic principle of responsible publishing of health care information online.

If you’re not going to leverage technology to facilitate reference-checking, why even bother?

Did you find this article useful? Please support independent science journalism with a donation. See the donation page for more information and options.


About Paul Ingraham

Headshot of Paul Ingraham, short hair, neat beard, suit jacket.

I am a science writer, former massage therapist, and I was the assistant editor at ScienceBasedMedicine.org for several years. I have had my share of injuries and pain challenges as a runner and ultimate player. My wife and I live in downtown Vancouver, Canada. See my full bio and qualifications, or my blog, Writerly. You might run into me on Facebook or Twitter.


What’s new in this article?

Eight updates have been logged for this article since publication (2009). All PainScience.com updates are logged to show a long-term commitment to quality, accuracy, and currency. When’s the last time you read a blog post and found a list of many changes made to that page since publication? Like good footnotes, this sets PainScience.com apart from other health websites and blogs. Although footnotes are more useful, the update logs are important. They are “fine print,” but more meaningful than most of the comments that most Internet pages waste pixels on.

I log any change to articles that might be of interest to a keen reader. Complete update logging of all noteworthy improvements to all articles started in 2016. Prior to that, I only logged major updates for the most popular and controversial articles.

See the What’s New? page for all recent site updates.

August: New section, “Who cites so badly? Dramatis personae.” The Bullshitter, the Lightweight, and the Bad Scientist.

2018: New section on cherry-picking and confirmation bias in evaluating studies. This article is now quite a comprehensive review of how citing can go wrong.

2018: Tiny new section about non-obviously bogus citations: “What if citations avoid all of these pitfalls? They can still be bogus!”

2018: Added two xkcd comics: “Dubious Study” and “New Study.” This article is now chock-a-block with comics! Probably more comics per square inch than any other page on PainScience.com.

2018: Big update: added a thorough discussion of the problem of predatory journals. Added three new bogus citation types: the non-cite, the junk cite, and the dump cite. Added a short section about citations to NASA studies.

2017: Added a brief acknowledgement of my own fallibility in the citation department.

2017: Added two citations substantiating the prevalence of predatory/fraudulent journals.

2017: Added a paragraph about the existence and purpose of the junk journal industry.

2009: Publication.

Notes

  1. This footnote, for instance, in no way supports the statement it follows, and is present only for the purpose of being mildly amusing. BACK TO TEXT
  2. It wasn’t a matter of interpretation or opinion. It was just way off: he had read much more into the paper than the researchers had ever intended anyone to get from it. The authors had clearly defined the limits of what could be interpreted from their evidence. BACK TO TEXT
  3. Beall J. What I learned from predatory publishers. Biochem Med (Zagreb). 2017 Jun;27(2):273–278. PubMed #28694718.  PainSci #52870. 

    ABSTRACT

    This article is a first-hand account of the author's work identifying and listing predatory publishers from 2012 to 2017. Predatory publishers use the gold (author pays) open access model and aim to generate as much revenue as possible, often foregoing a proper peer review. The paper details how predatory publishers came to exist and shows how they were largely enabled and condoned by the open-access social movement, the scholarly publishing industry, and academic librarians. The author describes tactics predatory publishers used to attempt to be removed from his lists, details the damage predatory journals cause to science, and comments on the future of scholarly publishing.

    BACK TO TEXT
  4. Gasparyan AY, Yessirkepov M, Diyanova SN, Kitas GD. Publishing Ethics and Predatory Practices: A Dilemma for All Stakeholders of Science Communication. J Korean Med Sci. 2015 Aug;30(8):1010–6. PubMed #26240476.  PainSci #52903.  “Over the past few years, numerous illegitimate or predatory journals have emerged in most fields of science.” BACK TO TEXT
  5. Ingraham PS. Ioannidis: Making Medical Science Look Bad Since 2005: A famous and excellent scientific paper … with an alarmingly misleading title. PainScience.com. 2896 words. BACK TO TEXT
  6. Cuijpers P, Cristea IA. How to prove that your therapy is effective, even when it is not: a guideline. Epidemiol Psychiatr Sci. 2016 Oct;25(5):428–435. PubMed #26411384.  BACK TO TEXT
  7. Most of these were someone’s attempt to “prove” that their methods work, clinicians playing at research, using all kinds of tactics (mostly unconsciously) to get the results they wanted, such as (paraphrasing Cuijpers et al):

    • Sell it! Inflate expectations! Make sure everyone knows it’s the best therapy EVAR.
    • But don’t compare to existing therapies!
    • And reduce expectations of the trial: keep it small and call it a “pilot.”
    • Use a waiting list control group.
    • Analyse only subjects who finish, ignore the dropouts.
    • Measure “success” in a variety of ways, but report only the good news.

    And so on. And many of these tactics leave no trace, or none that’s easy to find.

    BACK TO TEXT
  8. Blogs.jwatch.org [Internet]. Sax P. Predatory Journals Are Such a Big Problem It's Not Even Funny; 2018 June 14 [cited 18 Nov 28]. BACK TO TEXT