PainScience.com readers often ask things like:
- Who did your website design/programming?
- How can one person possibly publish so much?
- Are you using WordPress? Or some other content management system?
- What do you use to organize your references?
- How can I get a website like yours?
I have been publishing webpages since the dawn of the web in the early 90s, and PainScience.com is entirely based on custom code written either by me or contractors, a thick stew of technologies that facilitate efficient production of heavily referenced content, and highly automated ecommerce. I use a free Mac app, BibDesk, to manage a huge BibTeX bibliography; custom PHP extracts citations from that database for use on PainScience.com — an exotic setup, and the technological heart of the website.
I chuckle when people ask, “How can I get a website like yours?” It’s quirky and exclusive, by design. It could be imitated, but not easily!
Bibliography-first design: powering good footnotes
Top-notch referencing is so important to me that it has driven much of the design and technology of PainScience.com. I manage a huge bibliographic database in BibTeX, an obscure plain-text format used by academics, with a terrific Mac app, BibDesk. My own articles are also stored in that database, so they can be cited like any other reference.
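For readers who have never seen the format: a BibTeX entry is just structured plain text, one record per reference. Here is what the Näslund citation from the notes below might look like as an entry (the citation key and field layout are my illustration; the bibliographic details are from the paper itself):

```bibtex
@article{Naslund2006,
  author  = {N{\"a}slund, J. and N{\"a}slund, U. B. and Odenbring, S. and Lundeberg, T.},
  title   = {Comparison of symptoms and clinical findings in subgroups
             of individuals with patellofemoral pain},
  journal = {Physiotherapy Theory and Practice},
  year    = {2006},
  volume  = {22},
  number  = {3},
  pages   = {105--118}
}
```

Because everything is stored as fields like this, any program that can parse the format can re-render the same reference in any style.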
The code for PainScience.com parses the BibTeX database and then uses the data to make web-ready citations in many styles: an inline citation, say, versus full bibliographic details for a footnote, all pulled from the BibTeX database and turned into complex HTML. (Footnote formatting is all automated too.1) If I tweak a citation format, every usage on the site changes in one stroke. And if I change the summary of a scientific paper, it automatically changes everywhere that summary is used! This clever system was originally written in about 2005 by a friend of mine, a programming prodigy who works for Apple and remains a coding tutor/mentor to me. We still get together for lunch regularly.
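The core idea is simple even though the site's actual implementation (custom PHP) is not public: one bibliographic record, many output styles, so a format tweak propagates everywhere the reference is cited. A minimal sketch of that idea, in Python rather than PHP, with a hypothetical record structure:

```python
# Sketch only: one record, rendered in two different citation styles.
# The record fields mirror BibTeX; the functions are illustrative,
# not the site's real formatting code.

record = {
    "author": "Näslund J, Näslund UB, Odenbring S, Lundeberg T",
    "title": "Comparison of symptoms and clinical findings in subgroups "
             "of individuals with patellofemoral pain",
    "journal": "Physiotherapy Theory and Practice",
    "year": "2006",
}

def inline_citation(rec: dict) -> str:
    """Short in-text form, e.g. 'Näslund 2006'."""
    surname = rec["author"].split()[0].rstrip(",")
    return f"{surname} {rec['year']}"

def footnote_citation(rec: dict) -> str:
    """Full bibliographic details as HTML for a footnote."""
    return (f"{rec['author']}. <em>{rec['title']}</em>. "
            f"{rec['journal']}. {rec['year']}.")

print(inline_citation(record))   # Näslund 2006
```

Change `footnote_citation` once and every footnote on the site that cites any paper changes with it; that is the "one stroke" payoff described above.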
And so I can write like a writer and cite on the fly with efficiency that would make a scholar envious. It’s a thing of highly abstract beauty. I get fan mail from librarians. I am not kidding.
The need for speed (and security)
I am obsessed with efficiency and automation, which is the only way one person can make a project like this financially viable. And so I have dozens of custom programs for automation of all kinds of common publishing chores: lots of shell, Perl, and PHP scripts.
A major goal is to have a publishing system that really gets out of my way and lets me write, write, write. Making websites is so complex and tricky that even the best tools available — and there are some amazing ones — are missing many features I consider essential (like the footnoting and referencing features), and include much more that I don’t need or like. There has always been a serious disconnect between my priorities and the nature of most web publishing tools.
After years of getting irritated by blogging platforms of every description, I decided the only hope was to write my own, as every writerly geek must eventually do. And so my production workflow now has lower friction than that of any blogging platform I've ever used or heard of. Many publishing and formatting chores have been reduced to a few keystrokes. A short blog post can be written and uploaded in less than a minute, and yet look richly formatted.
The code weirdness also makes the site virtually unhackable (security by obscurity). PainScience.com will never be directly harmed by a script kiddie, or even by a sophisticated hacker, because there is nothing remotely standard running on this server.
The result of all this is fast, secure pages made of pure, organic, free-range HTML5 and CSS and little else — barely even any jQuery.2 Some other key tools round out the workflow:
- rsync scripts and Transmit for moving files around.
- Querious for messing around with my ecommerce database.
- Fork and Bitbucket for file version control.
- Homebrew for managing my Mac’s testing server (the AMP stack, etc).
- LaunchBar is a Mac app that gives quick access via typed abbreviations to nearly anything on the system. It is probably my most important productivity tool, and has been for well over a decade now. Typinator is also critical, powering a huge collection of shorthands that I have been refining for years (it’s a lot easier to type “gtps” than “greater trochanteric pain syndrome”).
- Evernote is my main notes app — not especially beloved, but heavily used — and the only other app I use for actual writing.
Related reading:
- About PainScience.com — 19 years of publishing science-based advice about aches, pains, and injuries
- The Pricing of PainScience.com e-Books — A candid explanation of my prices and how I present them to new visitors
- The PainScience.com origin story — PainScience.com is an unusual business success and many readers are curious to know how it started
- Smarter and Funnier — Publication standards for PainScience.com and why you can trust the information published here
- Studying the Studies — Tips and musings about how to understand (and write about) pain and musculoskeletal health science
- Why So “Negative”? — Answering accusations of negativity, and my reasons and methods for debunking bad treatment options for pain and injury.
Appendix: The making of a footnote (many hours of science wrangling to create one small footnote about hot kneecaps)
As you can see from the above, I take footnotes really seriously, technologically. But I also take them very seriously as a matter of scholarship.3 There are a lot of ways to do referencing wrong,4 so I work hard to understand and correctly explain relevant science. While preparing a new footnote — just a single, “minor” footnote for my IT band syndrome book — I was struck by what a huge process it is to produce one teensy little number, both technologically and in terms of scholarship.
They’re like icebergs: there’s a lot more to them than you can see.
So this is the life cycle of a typical show-me-the-science footnote: one that’s interesting enough to deserve some real effort, but still not important enough to end up as anything much more than a wee footnote. It begins easily enough …
- 30 seconds to notice a scientific paper and make a note to deal with it later. (I find most papers while browsing the RSS feeds for top journals.)
- 10 minutes to add it to the bibliography, tag it, and rate it so that I can find it when I’m looking for the best science updates on particular topics.
- Weeks or months before I finally get back to it. Or longer. A few have waited years! I have a backlog of literally hundreds of papers I want to process like this.
- 5 minutes to really carefully read the abstract and start trying to grasp the nugget of the paper.
- 15 minutes to write a good quality plain language translation of the abstract. The effort has now surpassed what 97% of bloggers would bother with.
- 30 minutes to find and/or buy the full text and read it, highlighting and raising my eyebrows and muttering things like, “Yes, but where’s the chart for that data?” and “Have these people never heard of an analysis of variance?” I would guess that most papers are actually read by no better than 1% of the non-scientists who cite them.
- 20 minutes to mostly re-write my translation now that I actually understand the paper and realize that the abstract was oversimplified to the point of being almost meaningless. (This makes me worry a great deal about all the summaries I’ve written without reading the whole paper.)
- 1 hour to find the best places on PainScience.com to cite the evidence. I have to work a bit of a summary into the text in several different places, and then customize footnotes to match. There’s usually a lot of editing of the surrounding material to reflect what I’ve learned. Most citations I choose because they generally support my position, but they usually improve my understanding anyway. And if the evidence doesn’t agree with my position? This step can double or triple in length.
- 10 minutes to revise the summary again, based on all the editing and shoehorning I’ve been doing. I usually think of better and better ways to explain it as I go.
- 30 minutes to revise other related citations, and then cross-reference them. For instance, if New Study seems to contradict Old Study, but both have strengths, I need to add a “see also New Study” note to the summary of Old Study. If my understanding of a topic has changed, it could affect the wording of my summaries of a dozen other papers. This step can sometimes get down-the-rabbit-hole complicated — effectively infinite.
- 20 minutes for drop-in citations, adding references to the paper in places where it can be “just a footnote” and doesn’t require changes to the text.
- 10 minutes to actually publish all the updates.
- 40 minutes for repairing problems that I noticed mere moments after “finishing.” The act of declaring “all done!” is like a magic incantation that forces some glitch out into the open. In the case of the footnote that inspired this, I was about to call it a wrap when my eye chanced upon … a duplicate of the citation, already in the database since 2004. I did a slapdash job on it back then and got a half dozen things wrong, but it also contained a particularly elegant phrase of summary, much craftier than anything I came up with this time, so of course I just had to revise everything again to use the superior phrasing. Oh, and the old summary also mentioned another closely related paper that should have been mentioned in all the places I’d already edited. And so on …
- 10 minutes to re-publish all the updates.
- 20 minutes to “debate” it on Facebook with some smartass who harshly criticized my interpretation after reading only the abstract, demonstrating a perfect lack of knowledge of statistics.
- Six months later a reader reports a glaring typo in the summary, which of course has been duplicated everywhere the citation is used.
(Image: what my iMac sees during the final stages of this process.)
I do something like this two or three times per week, and have for many years now. And that’s how a typical footnote is born — in this case, one which shows (among other things) that about half of people with unexplained anterior knee pain have “hot” kneecaps.5
- And, of course, the footnote itself links nicely back to the document body. You can imagine what a headache it would be to “manually” create and rearrange thousands of footnotes. Imagine a large, heavily footnoted document … and then trying to add a new footnote somewhere in the middle! Yikes. Software handles all that: specifically, a quite powerful custom Perl filter that inserts and renumbers footnotes at the press of a button. Return to text.
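The renumbering trick is simple in principle. A toy sketch, in Python rather than the actual Perl filter, using a made-up `[fn:…]` marker syntax (the real markers are surely different): footnotes are written as named markers in the draft, and sequential numbers are assigned only at render time, so inserting a footnote mid-document renumbers everything downstream for free.

```python
import re

def number_footnotes(text: str) -> str:
    """Replace each [fn:...] marker with the next sequential number."""
    counter = 0
    def repl(match: re.Match) -> str:
        nonlocal counter
        counter += 1
        return f"<sup>{counter}</sup>"
    return re.sub(r"\[fn:[^\]]*\]", repl, text)

draft = "Knees hurt.[fn:cite Naslund] Often.[fn:cite Patton]"
print(number_footnotes(draft))
# Knees hurt.<sup>1</sup> Often.<sup>2</sup>
```

Because the numbers never live in the source text, there is nothing to rearrange by hand when a new marker appears in the middle.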
- For an example, here’s a very serious citation:
Patton D, McIntosh A. Head and neck injury risks in heavy metal: head bangers stuck between rock and a hard bass. BMJ. 2008;337:a2825. PubMed #19091761 ❐
It turns out that head-banging, “a popular dance form,” constitutes “a definite risk of mild traumatic brain injury” and the study “helps to explain why metal concert goers often seem dazed, confused, and incoherent.” I can think of other reasons! Part of the reason I wanted to share this beautiful piece of science is that I grew up in a youth culture dominated by heavy metal — a small industry town in northern Canada, Prince George. I was surrounded by head bangers, and they were definitely dazed, confused, and incoherent. And worse! But there is a strong possibility that the daze preceded the heavy metal in many cases.
The risk of neck injury also increases with head banging intensity — although less than one might expect, which we can infer from the way people are able to keep doing it.
- Ingraham. 13 Kinds of Bogus Citations: Classic ways to self-servingly screw up references to science, like “the sneaky reach” or “the uncheckable”. ❐ PainScience.com. 3755 words.
- Näslund J, Näslund UB, Odenbring S, Lundeberg T. Comparison of symptoms and clinical findings in subgroups of individuals with patellofemoral pain. Physiotherapy Theory and Practice. 2006 Jun;22(3):105–18. PubMed #16848349 ❐
Researchers bone-scanned and X-rayed 80 patients diagnosed with PFPS, with many common similar diagnoses eliminated: a nice “pure” selection of unexplained knee pain patients. They divided them into three groups: 17 with pathology, 29 with “hot” kneecaps (metabolically active), and 29 without any findings (5 dropped out). All patients and 48 healthy subjects without any knee pain were then interviewed and examined by a surgeon and a physical therapist.
They could not diagnose the pathologies without the scans: all patients with pain tested about the same, and their symptoms were indistinguishable. Q-for-quadriceps angles were about 4˚ bigger in the afflicted, but the authors carefully explain that a 4˚ difference is too small to be reliably detected. The most interesting result of the study is that almost half the PFPS patients had kneecaps throbbing with metabolic activity — that’s a fairly strong pattern.