UdeM has signed DORA to support the development and promotion of best practices in university research assessment.
Université de Montréal has joined the 21,000 signatories of the San Francisco Declaration on Research Assessment (DORA), a global initiative to make the research ecosystem fairer and more inclusive, while underscoring the vital role of scholarly journals in the dissemination of research findings.
One of DORA’s central themes is the view that the Journal Impact Factor (JIF) has been given undue weight in assessing research outputs. To understand the issue, we talked to Vincent Larivière, Professor in the School of Library and Information Sciences at UdeM and Associate Vice-Rector, Strategic Planning and Communications.
What is the Journal Impact Factor and why is DORA challenging it? What are its adverse effects?
The JIF measures a scholarly journal’s impact based on the number of citations it receives. It was initially developed to help libraries decide which journals to subscribe to. However, with the advent of electronic journals in the 1990s, libraries started subscribing to all journals and the JIF lost its relevance as a yardstick for building periodical collections. Since nature abhors a vacuum, the JIF has been taken up as a tool for assessing the merits of research outputs.
The idea is that if an article is published in a journal with a high impact factor, the research must be of better quality. Basically, articles are being judged by their cover. This approach has led to distortions over the past 20 years. For one thing, it has strengthened the status of English in academic publishing, since the journals with the highest impact factors are in English. For another, the impact factor influences the choice of research topics: it pressures researchers to focus on trendy subjects instead of the in-depth work that science often demands. Some countries have even introduced publishing premiums that can be worth more than C$150,000, based on the JIF.
The JIF is a poor predictor of an individual article’s impact. An article may be published in a leading journal and still fizzle. There are existing tools that measure the reception of a specific article based on factors such as the number of citations or the attention it receives on social media or in the press. These indicators also have limitations, however, and can also be swayed by fashion or reputation. It must be borne in mind that "research quality" is subjective, as many studies of peer review have shown. The best way to judge an article is to read it and dissect it.
We realized that it is never too late to be part of the solution instead of part of the problem. UdeM has always used best practices to evaluate the work of its research community. We need to move with the times and continue adopting the fairest and most equitable practices in order to guard against questionable behaviour by researchers and give all work its due. We don’t do research to score points but to advance knowledge and benefit society.
We want UdeM to be DORA’s standard-bearer in the French-speaking world. Our goal for 2022 is to help revive the movement, explain why we need to change our thinking, and make sure our community is aware of the limitations of research impact indicators. In short, we want UdeM to be part of the international conversation about research assessment and to help ensure that the benefit to society is the overriding consideration.