If you want to understand the modern academy, it wouldn’t hurt to start at “impact factor.”
Every year, the company Thomson Reuters assigns every academic journal an “impact factor.” Impact factors measure, roughly, how often papers published in one journal are cited by other journals. It is an ecological measurement, in other words. You’d recognize the names of journals with the highest impact factors — Nature, Science, etc. — but the world of scholarly journals is enormous, and there’s crowding at the bottom.
Two stories today illustrate the problems with impact factors, and the difficulty of measuring knowledge through any metric.
First, Nature News revealed that a Brazilian citation cartel had been outed by Thomson Reuters. That’s right: a citation cartel.
The Brazilian government measures graduate schools based on the impact factor of the journals that those schools’ students publish in. Brazilian journals, many of which are newer, have low impact factors, so Brazilian graduate students often publish in journals abroad. This makes them and their graduate program look better, but it means the commercial benefit of Brazilian scholarship flows, in part, to non-Brazilian companies.
So editors at a set of Brazilian journals began citing each other’s journals... a lot. The flurry of cross-citation made every journal appear more influential, and succeeded in raising the journals’ impact factors in 2011. For a moment, the scheme worked.
Until it didn’t. Earlier this year, Thomson Reuters began using a new algorithm to look for more elaborate exercises in factor-raising “self-citation,” and it turned up four Brazilian journals. The editor of one of those journals has been fired. He’s indicated the project went beyond just those four: “There are a few others which played a part in this game, and they escaped,” he told Nature.
Yoni Appelbaum, an Atlantic contributor, summed up the situation:
Tie dollars to any metric, and you'll reshape behavior, as people try to game it by means fair and foul: http://t.co/3FCxdLWklv
The second story is domestic. Sam Wineburg, a professor of education and history at Stanford, writes in the Chronicle of Higher Education about his most recent scholarly project. It began when he and his graduate students made some rigorous, high-quality history curricula available to five San Francisco high schools:
[Research] showed that students who used our curriculum not only outperformed peers on tests of historical knowledge but also grew in reading comprehension. When district officials asked us to make our materials available to every San Francisco teacher, we created a simple Web site and uploaded 75 PDFs.
It soon became clear that teachers were forwarding links to friends elsewhere. After six months, we had 50,000 downloads; 200,000 by the end of the first year.
He’s now expanded the project, choosing “real-world impact over impact factor.” But he laments that all this work — widely seen, widely used — is invisible to his institution’s reviewers. A scholar who chose to work this rigorously and accessibly but who, unlike him, did not already have tenure would likely never secure it. “I no longer believe that the scholarly enterprise of education has much to do with educational betterment,” he says:
I no longer believe that when I publish articles in journals with minuscule circulations I am contributing to the field—if by “field” we mean the thousands of well-meaning individuals who go to work each day in places called schools.
Impact factor: a corporate assessment of not-for-profit output, which may have very little to do with real-world impact but is tabulated and ranked by governments regardless. Another journal editor told Nature that the Brazilian “cartel” sought to raise the profile of Brazilian scholarship, not to boost its impact factor. But it’s hard not to see the Brazilian “cartel” as a brilliant ecological solution for an ecological problem. Banding together to raise the profile of scholarship? It sounds like an organized, cooperative innovation to fix a biased system: it sounds like reform.