Can a big name lead to a boost, even for low-profile work?

Indeed it can, according to an analysis that found that scientific papers written by well-known scholars get more attention than they otherwise would because of their authors’ high profiles - but there are some subtle twists in how this happens.

The study reports that citations of a scientist’s papers increase by 12 percent above the expected level when the scientist is awarded prestigious investigator status at the Howard Hughes Medical Institute (HHMI), a major private research organization. However, certain kinds of research papers are boosted more than others by the increased prestige that accompanies the HHMI award - the effect was stronger for papers published more recently, just before the prize.

The greatest gains come for papers in new areas of research, and for papers published in lower-profile journals. Papers by younger researchers who previously had lower profiles were also more likely to see a boost.

“We found that there was an effect of status,” says Pierre Azoulay, an associate professor at the MIT Sloan School of Management and co-author of the paper, published in Management Science. But that effect, he adds, is not overwhelming. “The effect was much more pronounced when there was more reason to be uncertain about the quality of the science or the scientist before the prize,” Azoulay observes.

The ‘Matthew Effect’?

The term “Matthew Effect” was coined by sociologist Robert K. Merton to describe the possibility that the work of those with high status receives greater attention than equivalent work by those who are not as well known. But is it real?

Positively identifying this phenomenon in citations is difficult because it is hard to separate the status of the author from the quality of the paper. It is entirely possible that better-known researchers simply produce higher-quality papers, which get more attention as a result - and that lesser-known people see a larger boost in citations for papers written before the award because that is when they were doing the work that earned them the award.

But the authors say they were able to address this concern: They looked at papers first published before the authors became HHMI investigators, then examined the citation rates for those papers after the HHMI appointments occurred, compared to a baseline of similar papers whose authors did not receive HHMI appointments.

More specifically, each paper in the study is paired with what Azoulay calls a “fraternal twin,” that is, another paper published in the exact same journal, at the same time, with the same initial citation pattern. For good measure, the authors of the papers in this comparison group were all scientists who had received other early-career awards.
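For illustration only - the study’s actual matching procedure is more elaborate - the “fraternal twin” idea can be sketched as follows. All field names here (`journal`, `year`, `pre_citations`) are hypothetical, not taken from the study:

```python
def match_twin(treated, controls):
    """Pick the control paper that best mirrors `treated` before the award.

    A candidate must come from the same journal and the same year; among
    candidates, the one whose pre-award yearly citation counts are closest
    (smallest sum of absolute yearly gaps) is chosen.
    """
    candidates = [
        c for c in controls
        if c["journal"] == treated["journal"] and c["year"] == treated["year"]
    ]
    if not candidates:
        return None
    return min(
        candidates,
        key=lambda c: sum(
            abs(a - b)
            for a, b in zip(treated["pre_citations"], c["pre_citations"])
        ),
    )

# Toy example: the first control tracks the treated paper's early citations
# closely; the second is a much more heavily cited paper.
hhmi_paper = {"journal": "Cell", "year": 1995, "pre_citations": [5, 9, 14]}
controls = [
    {"journal": "Cell", "year": 1995, "pre_citations": [6, 8, 15]},
    {"journal": "Cell", "year": 1995, "pre_citations": [30, 40, 55]},
]
twin = match_twin(hhmi_paper, controls)
```

Once each paper has its twin, any post-award divergence in the pair’s citation trajectories can be attributed to the prize rather than to the journal, the timing, or the paper’s early reception.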

In all, from 1984 through 2003, 443 scientists were named HHMI investigators. The current study examines 3,636 papers written by 424 of those scientists, comparing them to 3,636 papers in the control group.

“You couldn’t tell them [the pairs of papers] apart in terms of citation trajectories, up until the time of the prize,” Azoulay says.

Beyond the overall 12 percent increase in citations, the effect was nearly twice as great for papers published in lower-profile journals. Conversely, Azoulay points out, “If your paper was published in Cell or Nature or Science, the HHMI [award] doesn’t add a lot.”

Whether citation data accurately represent the quality of papers is itself a matter of debate; worthy research can escape wide notice for extended periods of time. So scientists do not necessarily put much stock in studies of citations.

Still, citations are among the few quantitative measures available, so scholars use citation data to try to glean new insights and quantify observations about the scientific enterprise. For instance, drawing on his own proprietary database of more than 12,000 life scientists, Azoulay has found that bioscience advances are encouraged by longer-term grants that give researchers more freedom, and that physical proximity among scientists increases citation rates, among other things.