A group contends that the journal impact factor (JIF), which ranks scholarly journals by the average number of citations their articles attract in a set period, has increasingly become an obsession in science. The impact factor is used to evaluate research for funding, hiring, promotion, and institutional effectiveness.
But the people behind The San Francisco Declaration on Research Assessment, convened by the American Society for Cell Biology (ASCB) last December in San Francisco, say impact factors warp the way that research is conducted, reported, and funded. After five months of discussion, the San Francisco declaration group made 18 recommendations for change in the scientific culture at all levels — individual scientists, publishers, institutions, funding agencies, and the bibliometric services themselves — to reduce the dominant role of the impact factor in evaluating research and researchers and instead to focus on the content of primary research papers, regardless of publication venue.
With some 25,000 journals now in existence, and the increased ability for anything to be declared peer-reviewed in return for an open access fee, how could anyone know which research is worth the money that governments increasingly provide? Funding agencies would literally have to be experts in every field to know whether work is valid. Journals make standards possible; publication in a high-impact journal means a paper has undergone real peer review, not editorial review or other "light peer review" shortcuts. The system is not perfect and certainly has flaws, but it's hard to find something better.
There are other citation ranking systems but the oldest and most influential is the "two-year JIF" devised by Eugene Garfield in the early 1950s and originally published by his Institute for Scientific Information (ISI) as a subscription buying tool for academic and medical librarians. The JIF appears once a year in Journal Citation Reports as part of the Thomson Reuters (ISI) Web of Knowledge and is the average number of citations received in a year per paper published in the journal during the two preceding years. The earliest that a new journal can have a JIF is the end of its third full year of publication.
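The two-year calculation described above reduces to a single division. The sketch below uses hypothetical numbers (the counts are invented for illustration, not drawn from any real journal):

```python
def two_year_jif(citations_this_year, items_published_prev_two_years):
    """Two-year JIF: citations received this year to items from the
    previous two years, divided by the number of those items."""
    return citations_this_year / items_published_prev_two_years

# Hypothetical journal: 80 items published in year 1, 120 in year 2,
# and those 200 items collected 500 citations in year 3.
jif = two_year_jif(500, 80 + 120)
print(jif)  # 2.5
```

This also shows why a new journal cannot have a JIF before the end of its third full year: the numerator only exists once a third year of citations to the first two years' output has accumulated.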
Even though the JIF is only a measure of a journal's average citation frequency, it has become a powerful proxy for scientific value and is being widely misused to assess individual scientists and research institutions, say the DORA framers. The JIF has become even more powerful in China, India, and other nations emerging as global research powers.
As would be expected, scientists prefer to publish in the most prestigious journals first. In many cases, they only go to lower impact factor journals if they are turned down at higher ones.
But editors at even some journals with high impact factors agree that the system is flawed in the way it is now used as a crutch. DORA has been endorsed by editors at Science, the Journal of Cell Biology, Genetics, and more. The full list is here and is heavily weighted toward life sciences.
The San Francisco declaration cites studies that outline known defects in the JIF, distortions that skew results within journals, that gloss over differences between fields, and that lump primary research articles in with much more easily cited review articles. Further, the JIF can be "gamed" by editors and authors, while the data used to compute the JIF "are neither transparent nor openly available to the public," according to DORA.
If the data are not known to anyone outside Thomson Reuters, how are editors and authors gaming it?
Since the JIF is based on the mean of the citations to papers in a given journal, rather than the median, a handful of highly cited papers can drive the overall JIF, says Bernd Pulverer, Chief Editor of the EMBO Journal. "My favorite example is the first paper on the sequencing of the human genome. This paper, which has been cited just under 10,000 times to date, single handedly increased Nature's JIF for a couple of years."
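Pulverer's point about means versus medians is easy to demonstrate with a toy distribution. The citation counts below are invented, with one blockbuster paper standing in for something like the genome-sequencing article:

```python
from statistics import mean, median

# Hypothetical citation counts for one journal's papers in the JIF window;
# the last entry is a single runaway hit.
citations = [0, 1, 2, 2, 3, 4, 5, 9800]

print(mean(citations))    # 1227.125 — dominated by the outlier
print(median(citations))  # 2.5 — what a typical paper actually gets
```

One paper lifts the mean (and hence the JIF) by three orders of magnitude over the median, which is why a journal-level average says little about any individual article.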
Editors at Nature are not on the list of signatories, making it safe to single out that magazine.
"The Journal Impact Factor (JIF) was developed to help librarians make subscription decisions, but it's become a proxy for the quality of research," says Stefano Bertuzzi, ASCB Executive Director, one of more than 70 institutional leaders to sign the declaration on behalf of their organizations. "Researchers are now judged by where they publish not by what they publish. This is no longer a question of selling subscriptions. The 'high-impact' obsession is warping our scientific judgment, damaging careers, and wasting time and valuable work."
The SF declaration urges all stakeholders to focus on the content of papers, rather than the JIF of the journal in which they were published, says Bertuzzi. "The connection is flawed and the importance of a finding as reflected in the light of a high JIF number is often completely misleading, because it is always only a very small number of papers published in a journal that receive most of the citations, so it is flawed to measure the impact of a single article by this metric. Great papers appear in journals with low JIFs and vice versa."
The DORA coalition calls on all individuals and organizations engaged in scientific research to sign the San Francisco declaration: http://www.ascb.org/SFdeclaration.html.