Measuring the value and impact of a scientist on her field of research is no easy matter, even when the data at hand include her scientific papers, the number of citations those papers received, and the prestige of the journals that published them.

Grading Researchers: The H-Index

There is a large body of literature on how best to account for all these factors together: the discipline is called "scientometrics". Of course, the goal is to summarize the productivity of a scholar in a single number, possibly one of no more than two digits, since decision-makers who hire or fund are usually incapable of handling more complex data. One notable attempt is the Hirsch Index, proposed in 2005 by a physicist, Jorge Hirsch.

The H-index combines the number of your publications with the number of citations each received: its value is the largest number N such that N of your publications have at least N citations each. Suppose you have 100 publications and you order them by the number of citations each got: if the top 15 of them got 15 or more citations, and the others got fewer than that, your H-index is 15. So if you write a lot but nobody reads you, your H-index is zero. Kind of smart, but maybe a bit too succinct. Good for a CEO, as I said.
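To make the definition concrete, here is a minimal sketch in Python; the function name and the example citation counts are my own illustration, not part of any standard scientometrics tool.

def h_index(citations):
    """Return the largest N such that at least N papers have >= N citations."""
    # Rank papers by citation count, most cited first.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank      # the paper at this rank still has enough citations
        else:
            break         # from here on, ranks exceed citation counts
    return h

# The example from the text: the 15 most-cited of 100 papers each have
# at least 15 citations, the remaining 85 have fewer, so the index is 15.
papers = [40, 33, 30, 28, 25, 24, 22, 21, 20, 19, 18, 17, 16, 15, 15] + [3] * 85
print(h_index(papers))  # prints 15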

Coming Clean

For some reason I never really got excited by the whole business. Perhaps this is because in experimental particle physics, my field of research, you just need to be accepted as a member of a large collaboration and you are done: papers will be published with your name on them even if you cannot spell their title. I have been in this business for almost twenty years, and my paper count is well over 700. My H-index is probably somewhere around 60, which would be stratospheric in almost any other field of research, but is not uncommon among high-energy physicists of my age.

You might be asking yourself whether I consider those papers as my own. Did I write them? No, I wrote or helped write only a small fraction of that large number. Did I at least review them? No, I reviewed maybe two hundred of them (and believe me, that is a large fraction by the field's standards!). Hell, did I at least read them all? No; perhaps I read a third. If this coming clean about my true contribution to the papers I signed shocks you, please consider: I did not ask to sign those papers, it happens automatically!

Indeed, in a world full of honest people, one could invent a way to correct for the overinflated output of particle physicists: an "opt-in" policy could be enforced by large collaborations, whereby anybody who wishes to be listed as an author of a paper would have to declare it. By default, the author list of a paper would be empty (or, say, would start out containing only the name of the paper's editor); then, people would add their names, maybe with a line justifying their request.

The above mechanism would be quite nice, but unfortunately it would hardly work: the few honest scientists who stuck to the idea of only signing papers to which they feel they actually contributed would quickly end up at the bottom of any scientometric ranking, surpassed by their less upright colleagues.

The Italian Way

So we are stuck with huge lists of papers we don't really own. The funny thing in Italy is that, at least until some time ago (though I believe the practice is still in place), if you took part in a competition for an academic position you were obliged to include in your application an envelope with a faithful copy of all your publications.

No kidding: putting together the material and sending it cost you a week of work and at least a few tens of euros. And if you cannot imagine the mess at the receiving end, I can describe it: rooms filled with boxes of scientific papers, usually the same ones in many boxes, since applicants were often members of the same collaborations! And those boxes, needless to say, were not even opened by the examiners.

The revenge, however, came during the oral phase of the exam: you could be asked to describe the details of any one of your publications. I know stories of people caught completely unaware of having even presented a given publication. Dramatic stuff.

I was led to think back to all of the above when I became aware of a "disturbance" among Italian researchers caused by a new evaluation method for Italian research institutes. The method (called "VQR") is actually a good one: every researcher belonging to the institute is asked to insert in a database a list of papers he or she acknowledges having authored. The system, however, needs to sort out the papers such that every researcher claims a different set, a subset of those they authored. Then the whole scientific output of the institute can be evaluated with that data.
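Just to make the combinatorial nature of that requirement explicit, here is a toy sketch in Python of one way such a disjoint assignment could be made; the function, the quota, and the greedy ordering are my own assumptions, not the actual VQR algorithm.

QUOTA = 6  # assumed cap on papers credited to each researcher

def assign_papers(authored):
    """authored: dict mapping researcher -> list of paper ids they authored.
    Returns a dict giving each researcher papers claimed by nobody else."""
    taken = set()      # papers already credited to someone
    assignment = {}
    # Let researchers with the shortest publication lists choose first,
    # so they are not left empty-handed by collaborators with huge lists.
    for researcher in sorted(authored, key=lambda r: len(authored[r])):
        picked = []
        for paper in authored[researcher]:
            if paper not in taken and len(picked) < QUOTA:
                picked.append(paper)
                taken.add(paper)
        assignment[researcher] = picked
    return assignment

# Toy example: two members of the same collaboration share most papers.
print(assign_papers({
    "Rossi":   ["P1", "P2", "P3"],
    "Bianchi": ["P1", "P2", "P3", "P4", "P5"],
}))
# {'Rossi': ['P1', 'P2', 'P3'], 'Bianchi': ['P4', 'P5']}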

The disturbances I was mentioning above are caused by the fact that the set of papers the system attributes to each individual constitutes an official record, and in principle these data could one day be used to judge the value of individual researchers. This would be silly and dangerous, since a researcher would then be judged on the basis of a more or less random attribution of a subset of the papers they authored. Given the sorry situation of research careers in Italy, due to the chronic lack of funding, you can well imagine that 50-year-old scientists still at the bottom of the career ladder are growing anxious: they have published countless papers, and now there is an official record certifying that they recognize six of those (more or less randomly picked) as their main contribution to science.

I am confident that the problem will soon be solved... My attitude is that if I ever grow tired of the slowness of my career in Italy, I will move somewhere else. Almost anywhere else in the world I would be guaranteed a better salary and higher recognition for my status than in my sorry little country.