In the scientific community, researchers call it salami slicing. Appropriately, the act consists of shaving down a collected dataset until a scientist reaches the smallest scrap of result that still constitutes an original idea.
This decontextualized whisper defines the least publishable unit: the publon. The researcher proceeds to neatly separate the salami slice from the rest of the data and use it as the meat of a submission to a peer-reviewed journal. The next step fills out the sandwich: Authors repeat the submission process several times with various combinations of the same publons and different academic journals.
Taken together, the publons coherently describe the results from a given set of experiments. Separated, they boost the publishing rates of researchers. In a meritocracy where tenure is dictated by industry prestige—measured by CV length—it is no surprise that many researchers are slicing their salami as thinly as possible.
Jonah Lehrer is not a research scientist, but he was my hero. A Rhodes Scholar, Lehrer entered Oxford with a Columbia degree in neuroscience and left with a penchant for science communication. Early this summer, he became one of The New Yorker’s youngest staff writers of all time. His three books—Proust Was a Neuroscientist, How We Decide, and Imagine: How Creativity Works—garnered heaps of public excitement about understanding the brain.
As a young science writer, he did everything a young science writer is supposed to do: He distilled complex scientific concepts in concrete, accessible terms, he got people excited about original research, and he looked damn good in edgy glasses and tailored jackets.
But Jonah Lehrer’s skyrocket to fame turned out to be largely fueled by the journalistic version of salami slicing. In the journalistic sphere, writers call it self-plagiarism. Gradually, Lehrer’s close readers realized that his New Yorker posts were similar to pieces they had read years earlier, when Lehrer wrote for Wired. He had recycled paragraphs, ideas, and nearly whole columns. But in the journalistic moral court, there is no verdict on self-plagiarism. It seems ethically squishy, morally ambiguous.
The diehard Lehrer fans may have been getting a double dose, but new readers were engaging with new, exciting concepts. Because of the lack of a formal position on self-plagiarism, Lehrer was able to lie low for a couple of weeks and maintain his position at The New Yorker.
That is, of course, until Michael Moynihan published a now-infamous piece on July 30. Moynihan’s piece chronicled an investigation with the science writer that led to Lehrer’s admission that several Bob Dylan quotes in Imagine were either completely fabricated or taken out of context to further a point. Facing such ethically inexcusable errors, Lehrer promptly resigned from his staff writer position. Over the following weeks, new information continued to surface in reference to his books and other work, including several instances of factual inaccuracy, as well as formal examples of plagiarism and press-release plagiarism. Wired dropped him as a contributing writer, Houghton Mifflin yanked Imagine from booksellers worldwide, and The Wall Street Journal completely deleted Lehrer’s existence from its archives.
The exposé cast Lehrer’s self-plagiarism in a new light, and the resulting irony is jaw-dropping. Consider the wasteland that was his New Yorker-hosted Frontal Cortex blog. A glance at a single post neatly summarizes the crux of the scandal. Sandwiched between “The New Neuroscience of Choking” and “Why Smart People Are Stupid”—two concepts with which the science writer should be particularly familiar—is Lehrer’s June 7 piece, “Why We Don’t Believe in Science.” And just beneath the byline, in its coy, italic glory, rests an inadvertent thesis statement: “Editors’ Note: Portions of this post appeared in a similar form in a December, 2009, piece by Jonah Lehrer for Wired magazine. We regret the duplication of material.” Reading any further is superfluous; the editors answer the question laid out in the post’s title. (As appetizing as the salami sandwich may be, close your mouth.)
In the instant gratification generation of TED Talks and Facebook Mobile push notifications, industry standards encourage writers to sacrifice content for wowdacity. In his GOOD post “Jonah Lehrer and the Tyranny of the Big Idea,” Andrew Price lays out the problem explicitly:
“Whatever you think about Jonah Lehrer's transgressions, his treatment in the media, and his plummet from what is arguably the highest perch in American journalism, it's helpful to bear in mind that there's a demand side of this equation.
What made Lehrer so successful—with his books, at Wired, and then, for a time, at The New Yorker—was his ability to mold the results of hard science into tidy, consumer-friendly, and often unexpected insights. That's exactly what smart, curious, and busy readers like you and I want: surprising, Fun-Size ideas with just enough academic heft.
Jonah Lehrer isn't the only one capitalizing on this demand for Wow! stories. There's a whole industry. Malcolm Gladwell, the Freakonomics guys, certain TED Talks, Slate—they all trade, to some extent, on the snappy, mind-blowing idea you didn't see coming but totally seems kind of true.”
Proust Was a Neuroscientist sold well because of the intellectual quirkiness proposed in the book’s title. It was one of those “surprising, Fun-Size ideas with just enough academic heft.” But one person can only be expected to have so many “unexpected insights.” The need to continually produce groundbreaking ideas suggests a mismatch between supply and demand. To ignore the idea-hungry nature of the publishing industry is to ignore its foundation.
This is where blogs enter the fray. The pressures behind salami slicing and self-plagiarism should be absent in the blogosphere, supposedly the ancestral homeland of ethical publishing. Fact-checkers, polemicists, freewriters: bloggers theoretically have the power to think critically about an issue without being subjected to the systemic issues inherent to journalistic publishing. They publish themselves, so there should be no pressure to file on time—each individual blogger determines his rate of publication. The science blogosphere is democratic and fair, and as a whole it has come to foster the public trust in science that writers like Lehrer tend to erode.
Unfortunately, it turns out the same pressures that lead to salami slicing in academia and self-plagiarism in journalism do trickle down to the blogosphere, fueling similar brands of reductionism and hasty publishing. As soon as the Tablet exposé went up, bloggers inherited the scandal. It was a veritable swarm. Within hours of Moynihan’s piece going live, they were fact-checking, polemicizing, and freewriting on Lehrer’s errors. Piece after piece flowed in, each positing a motivation for Lehrer’s plagiarism. He had no apprenticeship. He was too young and rose to prestige too quickly. He was not a scientist. He lacked a moral compass. Aside from the psychoanalysis, there was a vast array of publishing industry critiques. Some bloggers advocated on behalf of industry, demanding higher fact-checking standards and stronger self-plagiarism consequences. Some praised Lehrer, congratulating him on managing to point out the flaws in the industry that allowed him to sneak through the cracks. Someone performed a close reading of a poem Lehrer had written as an undergraduate. Sam Harris used the opportunity to push his book Lying.
But there is something wrong with this picture. There is only so much insight a person can have between learning of a scandal and self-publishing a few hours later. For all of the psychoanalysis, the only person who knew why Lehrer did what he did was Lehrer himself, and he certainly was not blogging about it the day the Tablet piece went up. Whether it even matters why Lehrer did what he did is another question, and one that may not have an answer. The industry standards at play were the more important issues, but they were glossed over in favor of succinctness, personal interest, and speed of publication. Yet the more egregious error of the blogosphere is that, unknowingly, the bloggers mirrored the process they critiqued. In jostling to publish quickly and with biting insight—as is the nature of the blog game—bloggers were too reductionist in their approach and focused on trite speculations. They emulated young researchers scrambling for their first NIH R01 grant. They continued to feed the beast of their industry.
Combing through the responses to the scandal actually ends up revealing more common threads between bloggers than differences. Price begins his “Tyranny” piece by introducing Lehrer as a “wunderkind,” and then encourages readers to Google “jonah lehrer wunderkind.” There are 138,000 hits. Even meta-critiques of the “schadenfreude” responses to the Moynihan article hit harmonics across the industry. Robert Wright at The Atlantic manages to use the two words in the same sentence: “God knows how many species of schadenfreude have been loosed by the fate that befell wunderkind writer Jonah Lehrer today after he got caught fabricating quotes.” It was “the fall of a hipster intellectual” and “a grievous oraculism.” It was “Gladwellization.” In The Book of Laughter and Forgetting, Kundera writes, “Once the writer in every individual comes to life (and that time is not far off), we are in for an age of universal deafness and lack of understanding.” Just as salami slicing data across journal articles prevents a coherent understanding of the whole experiment, infinite speculation and reductionism on the part of the Lehrers of the world and the blogosphere sacrifice vital underlying complexity and distance us further and further from the truth.
Of course neuroscience is that much closer to home when the secrets of the mind-brain-body complex were wholly imagined by Marcel Proust. The brain is that much easier to understand when it is portrayed as a representative bundle of ten neurons giving rise to consciousness. But there are 10¹¹ neurons in the human brain. Public adherence to industry pressures of unrealistic, wow-hungry publication standards—in academia, journalism, and the blogosphere—ends up letting the industry down at an intimately fundamental level. Bloggers play a dangerous game, because they do have industry standards of their own. The blog, hailed as a free writing workshop, is only free to a certain extent. Especially among science bloggers, the self-inflicted need to have a new take on a concept as quickly as possible manifests itself in a form of competition that ends up detracting from the issues at hand.
Yet the struggle for ethical publishing is a battle being fought on two fronts, and we are losing on both. While considering the “demand side of this equation” makes for some exceptionally idealistic pontification—no doubt, it is the subject of this post—we cannot ignore the personal responsibility at stake. It may make sense that we cater to the industry standards, but the act of catering itself hosts a store of ethical transgression. Researchers are still slicing their salami thinner and thinner; Lehrer still copied and pasted his own writing and others’ in order to further his career—and yes, he is already working on a new book to tell his side of the story.
With each slice, science communication becomes more and more diluted. Ultimately, it is the individual that makes the decision to plagiarize, self-plagiarize, or publish snippets. Maybe we should read Lying after all. As they say, change comes from within (the sandwich?). In the parabolic tennis set that is the blame game between industry and individual, a personal, moral obligation to the truth may be the only impetus for match point. On all fronts, scientific publishing deserves the same level of rigor and veracity it purports to share. Psychoanalysis takes more than an hour. The brain has more than ten neurons. Proust was no neuroscientist.