
Open Research Practices

Research Metrics

Competition between universities for places on commercially constructed "league tables" has risen sharply in recent decades (UNESCO).

The most aggressively marketed products are "Times Higher Education World University Rankings (THE WUR)", "Academic Ranking of World Universities (ARWU)" (ShanghaiRanking Consultancy), and "Quacquarelli Symonds World University Rankings (QS WUR)".

Each of the three lists is produced entirely by a for-profit corporation, and they are all based on closed, proprietary datasets. Examining the conflicting data sources for the commercial rankings highlights why institutions perform differently on purportedly "objective" scales.

The figure below ("Figure 2") shows the indicators used by ARWU, THE WUR, and QS WUR for their rankings in 2022 (Ranking the University, p. 12).

The full list of indicators is broken down in the report. Highlights include:

  • "Prizes" for ARWU is determined by the number of staff (20%) and alumni (10%) who are Nobel Prize laureates or Fields Medallists.
  • "Publications" and "Citations" for ARWU are based on Nature (10%), Science (10%), and Clarivate indices (40%). Scopus/Elsevier is the source for "Citations" for both THE WUR (36%) and QS WUR (20%).
  • "Research reputation", "teaching reputation", and "employer reputation" are collected via surveys conducted by THE and QS. Two academics who were invited to complete the surveys, Christopher P. Hood and Leon Rocha, subsequently posted samples of the questions.
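To make the weighting concrete, here is a minimal sketch of how indicator scores combine into a single composite figure. The `nature`, `science`, and `clarivate` weights are the ARWU bibliometric weights quoted above; the remaining weights and all of the indicator scores are invented for illustration and do not reflect any real institution or any ranker's published methodology.

```python
# Illustrative sketch only: combining weighted indicator scores into a
# composite ranking score. Weights for nature/science/clarivate are the
# ARWU figures quoted above; "prizes" and "per_capita" weights and all
# indicator scores are hypothetical.

def composite_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of normalised indicator scores (each on a 0-100 scale)."""
    return sum(scores[name] * weight for name, weight in weights.items())

weights = {
    "nature": 0.10,      # publications in Nature (ARWU, quoted above)
    "science": 0.10,     # publications in Science (ARWU, quoted above)
    "clarivate": 0.40,   # Clarivate indices (ARWU, quoted above)
    "prizes": 0.30,      # hypothetical weight for the prize indicators
    "per_capita": 0.10,  # hypothetical weight for a per-capita indicator
}

# Hypothetical institution with normalised indicator scores out of 100.
scores = {"nature": 60.0, "science": 55.0, "clarivate": 70.0,
          "prizes": 20.0, "per_capita": 50.0}

print(round(composite_score(scores, weights), 1))  # → 50.5
```

Even this toy version shows why "objective" rankings diverge: the composite is dominated by whichever indicators carry the largest weights, and those weights are editorial choices made by the ranking's producer.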

As with other open research practices highlighted in this Guide, new tools are being developed to open up institutional evaluation processes, such as the CWTS Leiden Ranking Open Edition. In contrast to the closed edition of the CWTS Leiden Ranking, which is based on data from Clarivate's Web of Science, the Open Edition draws its data from OpenAlex, an open-source, open access database of research publications (see "Tools for more transparent metrics" below).

From 2024, the CWTS Open Edition is being produced in parallel with the original (closed) version of the Ranking, with the aim of replacing it within two years (Leiden Madtrics). To promote more responsible use of rankings, the CWTS Leiden team provides article-level data from OpenAlex to demonstrate exactly how the scores are calculated, and how institutions can interrogate the data for further analysis.

Responsible Assessment

As awareness of the structural inequity of commercial metrics has increased, a number of alternative initiatives have emerged. In 2023, Utrecht University (NL) chose not to submit data to the THE World University Ranking (Utrecht University News). In the announcement, they wrote "rankings put too much stress on scoring and competition, while we want to focus on collaboration and open science". Another reason was "the makers of rankings use data and methods that are highly questionable", as demonstrated in a report from the Universities of the Netherlands (Ranking the University).

 


The San Francisco Declaration on Research Assessment (DORA) was written in 2012 by a group of scholarly journal editors and publishers.

(DORA logo by Nick Duffield, CC BY-SA)

Its general recommendation is "Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions" (Declaration). As of December 2024, more than 25,000 organisations and individuals in 166 countries have signed DORA.

 

The Coalition for Advancing Research Assessment (CoARA) is a group of organisations "committed to reforming the methods and processes by which research, researchers, and research organisations are evaluated". The 800+ organisational signatories to CoARA have each committed to creating action plans and implementing them within a five-year time-frame.

The University of Galway has signed CoARA, and it is also a member of the Irish National Chapter. The National Open Research Forum has funded the ABOARD project to "develop a roadmap for system-level incentivisation of open research practices", which includes leading the Irish National Chapter of CoARA.

Rewards and recognition

Recognising and rewarding individuals who demonstrate their commitment to open research practices is essential to embedding those practices across institutions. Such incentives need to be tangible in employers' hiring and retention processes, and in funders' review and award of funding applications.

The Latin American Forum on Research Assessment (FOLEC) is an initiative of the Latin American Council of Social Sciences (CLACSO). Representing 937 research and postgraduate centres in 56 countries, CLACSO is a non-governmental institution dedicated to advancing the humanities and social sciences. FOLEC has published a "Declaration of Principles" on research assessment with a number of recommendations, including:

  • "Scientific knowledge is a collective construction, so it is essential that research assessment gives adequate weight to teamwork and its different forms of organization and construction
  • Assessment processes should be evolutionary, self-reflective, transparent, and participatory, promoting mechanisms that encourage dialogue and mutual learning, and ensure continuous improvement, not only for the scientific community but also for citizens, including social and community referents in its development
  • It is essential to guarantee the representation of women and diversities in the assessment systems and processes
  • Writing in English does not confer a merit per se superior to publications in other languages. Multilingualism favors the development of socially relevant research and the sustaining of cultural diversity."

 

In the Netherlands, the Open Science NL Steering Board has approved grants up to €50,000 each to 23 "knowledge institutions" to "integrate the recognition and rewarding of open science into their HR policies for recruitment and promotion" through 2026 (Open Science NL). This is administered through the Recognition and Rewards programme, whose goals are to:

  • "create more dynamic differentiation in career paths
  • place more emphasis on the quality of work and less on quantitative results
  • do justice to individual academic achievements and ambitions as well as to contributions that serve collective goals
  • encourage high-quality leadership
  • and stimulate open science".

 

The Open Science Career Evaluation Matrix (OS-CAM) was launched in 2017 by the EU to encourage and incentivise open research practices. Universities Norway (UHR) adapted the matrix and released the Norwegian Career Evaluation Matrix (NOR-CAM), built around three core assessment principles: "more transparency, greater breadth, and comprehensive assessments as opposed to one-sided use of indicators".

 

In Finland, the Steering Group for Responsible Researcher Assessment is developing the Finnish Career Assessment Matrix (FIN-CAM). This is being supported by the Federation of Finnish Learned Societies, who conducted a survey of researchers.

 

The Recognising and Rewarding Open Research Practices toolkit is a self-assessment tool and maturity framework that institutions can use to assess their readiness to implement recognition and reward for open research practices. The toolkit was launched in October 2024 by the Open and Responsible Researcher Reward and Recognition Project (OR4), which is part of the UK Reproducibility Network. This is the first iteration of the toolkit, which will continue to be developed through 2027.

Tools for more transparent metrics

OpenAlex is a fully open bibliographic database, with broader coverage than commercial products such as Web of Science or Scopus. It is built on several open persistent identifiers, including digital object identifiers (DOIs), ORCID, ROR, and Wikidata.


Some of the key features that distinguish OpenAlex from commercial products include:

  • open-source code and a dataset released under a CC0 (public domain) license
  • records in languages other than English, to improve global inclusion and equity
  • records for books and book chapters, to improve bibliodiversity
  • data on article processing charges (APCs), to improve transparency regarding gold open access

To increase its reusability, the data can be accessed in three ways: through the online interface, via an API, or by directly downloading the entire dataset.
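As a sketch of the API route, the snippet below builds a query against the public OpenAlex REST API (api.openalex.org). The helper function is our own illustrative choice, the ROR identifier shown is a placeholder, and the filter syntax should be checked against the current OpenAlex documentation before use.

```python
# Sketch: querying the OpenAlex /works endpoint for an institution's
# output in a given year. build_works_url is an illustrative helper;
# the filter fields follow the public OpenAlex API documentation.

API_BASE = "https://api.openalex.org"

def build_works_url(ror_id: str, year: int, mailto: str) -> str:
    """Build a /works query filtered by institution (ROR ID) and year."""
    # The mailto parameter identifies you to OpenAlex and routes your
    # requests to the faster "polite pool" of the free API.
    return (
        f"{API_BASE}/works"
        f"?filter=authorships.institutions.ror:{ror_id},publication_year:{year}"
        f"&mailto={mailto}"
    )

# Placeholder ROR ID; substitute a real identifier from ror.org.
url = build_works_url("https://ror.org/xxxxxxxxx", 2023, "you@example.org")
print(url)

# A live request would then look like:
#   import json, urllib.request
#   result = json.load(urllib.request.urlopen(url))
#   print(result["meta"]["count"])  # number of matching works
```

Because the same query can be rerun by anyone, analyses built this way are reproducible in a way that queries against closed databases are not.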

On 1 January 2024, the Sorbonne began using OpenAlex as its sole source for bibliometrics after cancelling its Clarivate subscriptions, including Web of Science (Sorbonne Université).

 

The Lens (Lens.org) is another freely searchable database that links scholarly works with patent records and patent sequences.


By making it possible to move between patent records and the academic papers that cite them, The Lens supports faster research innovation and patent development.

The underlying patent data is aggregated from national and international patent offices, including the European Patent Office, the World Intellectual Property Organization, and the US Patent and Trademark Office. The scholarly works are aggregated from OpenAlex, Crossref, and PubMed.

(The Lens logo by Aaron Ballagh, CC BY-SA)

NB: While both of these tools are free at the point of use for individual users, there are pricing plans for organisations that provide faster support, more bandwidth, bespoke services, etc.