SCOTUS and Pincites

The judicial branch is my favorite arm of the government of the United States. Ok, the judiciary isn’t as sexy as the lawgivers and warmongers, but at least it doesn’t hang up a “closed for business” sign when the justices are having a spat, and even the choleric Justice Thomas has the decency not to bomb anyone in his foul moods. Further, in contrast to some of the people in the other branches one might mention #cough# Joe Wilson #cough#, the members of the Supreme Court of the United States (SCOTUS) wear their office with quiet dignity. And if they stage a constitutional coup from time to time, well, at least their web site works.

Where does SCOTUS intersect with Evolution and Human Behavior? Recently (hat tip: Ray Hames who, I should add, I did not mean to imply was dead; he’s doing just fine), an article in Evolution and Human Behavior was cited in the context of a Supreme Court case. Kyle Gibson, the author of the piece in question, has a brief discussion of this on his blog. The case, Hollingsworth, et al. v. Perry, was about same-sex marriage, and the paper was cited in an amicus brief in support of the point that “adoption is good for children in need of permanent families.” So good for Kyle, homosexual couples, and children in need of permanent families. (Still, he added wistfully, nothing is truly forever…)

Reading the amicus brief reminded me of the various peculiarities of writing for the law. I have been fortunate to collaborate with legal scholars a couple of times, and got a front row seat to view the publication process for a paper in a law review.

I thought I would take a few moments to explain how the publication process in legal scholarship ought to be a point of deepest shame for those of us publishing in the (social) scientific literature.

I’m not talking about the fact that their papers are typeset so that only 100 words or so appear on each page. I’m sure there’s some reason or tradition that explains why they have eight inch margins on either side and on the bottom of their pages, which I’m guessing probably has something to do with lawyers having equity positions in paper firms.

Instead, I’m talking about a practice in publishing for law reviews that we in the social sciences don’t have that, to me, makes a certain amount of sense.

They make sure citations are right.

Many readers might already be aware of this, but after a paper has been accepted to a law review, an excruciating process is set in motion that makes the excruciating process of dealing with the copy editors at our journals feel as effortless as signing up for federal health insurance. (Ok, bad example…) Intrepid law students, who work for the respective law reviews, meticulously – and I mean meticulously – pore over each sentence, word, and syllable of the submitted paper and ensure that each claim, no matter how small, is cited and, moreover, is properly cited. To give you a sense of the no-claim-is-too-pedestrian-to-cite mentality, in explaining the data we presented in our paper, I reminded the reader that correlation coefficients range between -1.0 and +1.0. The student editor called out this sentence with a request that I “supply a source for this claim.” (Note, speaking of accurate citations: I actually don’t recall if that’s an exact quote. Pretty close, though.) Such diligence leads, it is true, to a tremendous amount of material “below the line,” occasionally to the point where the main text is dwarfed by the supporting documentation below it.
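In fairness to the student editor, the claim does have a proper source: the bound follows from the Cauchy–Schwarz inequality. For readers who prefer an empirical sanity check to a citation, here is a minimal sketch in pure Python (illustrative only, not from the paper in question) that computes the Pearson coefficient and confirms it never escapes [-1.0, +1.0]:

```python
import math
import random

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# By the Cauchy-Schwarz inequality, r must lie in [-1.0, +1.0];
# check that bound on a batch of random data.
random.seed(0)
for _ in range(1000):
    xs = [random.gauss(0, 1) for _ in range(30)]
    ys = [random.gauss(0, 1) for _ in range(30)]
    assert -1.0 <= pearson_r(xs, ys) <= 1.0
```

Perfectly linear data hits the endpoints exactly: `pearson_r([1, 2, 3, 4], [2, 4, 6, 8])` is 1.0, and negating one series gives -1.0.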

Further, authors are asked to provide quoted material from the cited source that supports the claim made in the text. Often but not always these quotations are left in the footnotes for the reader’s reference.

Impressive, right? Sounds right scholarly, doesn’t it? But wait, there’s more.

When I was working on the papers for the law reviews, I learned the term “pin cite” or, sometimes as one word, “pincite.” This term refers to the practice of telling the reader, for the source cited, what page the supporting ideas can be found on. This would seem to be a fairly good idea; as Wikipedia puts it, a pincite “gives helpful information about the cited authority to the reader.” It does indeed. How often do you see a citation in a journal to a book or other lengthy publication that you’re sure would take you hours to track down if you actually bothered to try to look for it? Pincites add a burden to authors to find the precise place where a claim supports the point they’re making, but ease the burden on the scholar consuming the work who wants to backtrack through the literature. We provide page numbers for quotations, yes, but rarely for anything else.

It makes a certain amount of sense that the legal community has these exacting standards. After all, legal decisions are often based on prior legal decisions, and of course there is the powerful principle of precedent that permeates SCOTUS and other corners of the bench. Law, in some important sense, is supposed to be accretive, building on prior decisions, and present decisions should be able to be traced back to prior decisions, legislation and, in some cases, relevant data, such as findings regarding whether adoption is good for children in need of permanent families. (It is. See Kyle Gibson, Differential Parental Investment in Families with Both Adopted and Genetic Children, 30 EVOL. & HUM. BEHAV. 184, 187 (2009). That “187” is the pincite, by the way.)

All of which would seem to be true of science as well. It’s not clear that the sciences ought to be any less accretive than the law: surely we’re supposed to be building upon prior knowledge, and a reader’s ability to interpret the present findings depends at least sometimes on the prior findings on which the present data and arguments rest. Not only do we not provide page numbers in our citations, but many of us have had the experience of tracking back through a citation and finding that a source doesn’t really support the claim for which the author cites it. (Raise your hand if you have… all of you…? Oh…)

A process such as that used at law reviews could go some way to ameliorating this way in which social scientific scholarship looks to be curably deficient. Why aren’t authors required to indicate the page or pages on which supporting information is to be found?

More importantly, why don’t armies of graduate students – or postdocs or whoever – pore over each and every citation in papers before publication to ensure that citations are accurate?

This is not, in fact, a rhetorical question. And, sure, there are barriers. Will the students be paid? If so, where will the money come from? If they won’t be paid, how will they be recognized or compensated? What tasks will not be done because graduate students are devoting time to checking the accuracy of citations?

These are important questions. Still… Are we willing to say that the law requires greater care than science in documenting the connections to prior scholarship? Is it merely momentum (on our part) and tradition (on theirs) that explains the vast gulf in practices between the two fields?

Psychology is undergoing a number of transitions in the way that we do business, from statistics to replications to data archiving and more.

This is a good time to introduce innovations to try to make each paper more useful by ensuring that citations are both precise and accurate.

Now accepting proposals regarding how to proceed.

14. November 2013 by kurzbanepblog
Categories: Blog | 7 comments

Comments (7)

  1. We should raise the standards of journals by having fact-checkers, grad students who get paid to do this. I recently had a journal editor actually fact-check an article of mine, but it’s way, way too rare. (I was thrilled when I realized why he was asking me for unpublished documents we were citing — he was fact-checking what we said!!!)

    There are multiple reasons I have always fact-checked my own work, line-by-line: (1) I know I can make mistakes; (2) I want to have the reputation for being meticulous; (3) I don’t want to drive other scholars crazy with bad citations or “facts”; (4) ****when you fact-check your own work, you end up back with your sources and you find all sorts of great stuff you forgot to include or somehow totally missed; so it actually improves the content of your work!; (5) I don’t want Terry Turner suing me. (Obviously I can’t stop him from calling me demonic. But I kind of like the idea that Satan is into facts.)

    Over and over again I’ve asked scholars in the social sciences and humanities, “Do you fact-check your work?”, and they look at me like I’m from Mars. Why is that? They spend more time checking the route to the conference hotel than checking their papers for accuracy.

    Obviously a pet peeve…..

  2. I completely agree with you that scientific papers should be more accurate in their citations. Many times I find citations that not only don’t say what the author of the paper thinks they say, but actually say quite the opposite. However, in many cases you can’t give a specific page or pages to support your argument, because the entire article deals with it. For example, would you just cite the actual statistical analysis, or should you include the discussion?

    Also, I find the way law papers cite very annoying. They are full of footnotes that are sometimes longer than the main text (historians do this too). It’s much easier to read citations in the main text. In popular books it’s even worse. The citations and notes are at the end, and the reader must go back and forth to read them. I understand that this is done to make the book more readable for the casual non-academic reader, but it sure can be frustrating for an academic who wants to know who did the study quoted in the text.

  3. Interesting post. I love your example of requesting a citation for the correlation coefficient!

    My academic training included very little humanities, but when I worked with humanities students as a tutor I was impressed by humanities-folks’ focus on carefully and accurately representing the claims they are engaging in their own work. In psychology, I have frequently had the experience of trying to trace the logic of an argument made by an article by reading the cited work, and realizing that the former misrepresented the latter. I’ve seen it occur in two ways: (1) the newer article misinterpreted claims made by the older article; (2) the newer article made unwarranted inferences about the empirical results presented in the older article. (Unfortunately, I don’t have an example on hand that I can cite, but fortunately this isn’t a law journal!) In the end I think it goes beyond mere fact-checking and precision and requires some careful thinking about logic and evidence.

    I found your comments on the accretive nature of law vs. science interesting as well. In science there’s this “Baconian” idea that the data will eventually add up and speak for themselves, whereas in law the accretive nature depends on interpretation in context (for which one receives extensive training in the humanities). To my mind, data are not some objective entity in the world but a part of a larger argument which require just as much interpretation in context. I think we’re seeing some focus on this with the questioning of psychology’s traditional p < .05 metric.

    My guess is that anyone in psychology who attempts to do this "checking" will tumble down a rabbit hole and crash into a house of cards.

  4. There’s this little thing called overlyhonestmethods. I think it might have been a hashtag on Twitter (I don’t know the reference of the thing, ironically). In any case, when some of the popular submissions were posted up on Facebook, they were met with a warm, relatable reception, apparently.

    In any case, one of them in particular bugged me: “We didn’t read half the papers we cite because they’re behind a paywall”.

    So that would be bad: people not actually reading the sources. I inquired around to find, much to my dismay, that almost no one I talked to reported actually, you know, reading most of the articles they cite. They skimmed them, maybe; sometimes they just read the discussion. Granted, these were graduate students, but I imagine the problem isn’t limited to them.

  5. This entry makes me feel all warm and fuzzy. Having fact-checked some of your writings, I can attest to the fact that it is a tedious and time-consuming job. It is also a great way to get a better understanding of the research. I think it’s completely reasonable for students to be tasked with fact-checking and be paid either monetarily or with school credits. At the very least, I would think it would look good on a resume (or CV for the academics) to be able to say “I was a fact-checker for [insert impressive journal here].”

  6. Pincite: 100% agree. The other part is an issue of a “so much effort vs. benefit” conundrum.

  7. Without naming names – so this carries no citations – I have encountered a number of instances of shoddy citation in my career, usually used for rhetorical effect, so a strategy favoured by our more sophist colleagues who enjoy ad hominem statements rather than anything more sustained.

    However, I have also encountered the view that the law is a machine for simulating knowledge. So the grinding drudgery of these grad student citation checkers is some kind of Orwellian vision of mechanical epistemology. Science, on the other hand, ultimately generates knowledge through coherent argument and experiment. The rub here is that perhaps citation should be reduced – thereby making it more accurate perhaps, or easier to check – in favour of good argument. Who cares who owns the idea, just so long as the idea is a good ‘un?

    Of course, originality seems prized, hence our disgust reactions around plagiarism, but I suspect that is a social fact rather than an epistemological one. Indeed, I wonder, would students cheat less if they were allowed to think more… Mmm…
