Monitoring and evaluating UKRI’s Open Access policy: Recommendations for implementation


Since August 2022, Research Consulting has been working to support the development of a monitoring and evaluation framework for UK Research and Innovation’s (UKRI) Open Access policy. In November 2022, we shared high-level findings arising from our consultation with 76 representatives from research and publishing communities. We are now ready to release the set of monitoring and evaluation questions proposed to UKRI, alongside a recommended strategy for implementation.

In reading this article, please keep in mind that UKRI will have the final say on what questions are implemented and operationalised. This is therefore a starting point for UKRI to further reflect and build on over the coming months, including in liaison with research and publishing stakeholders.

Proposed monitoring and evaluation strategy

Our recommended monitoring and evaluation strategy is based on contribution analysis, an approach that is especially useful when determining the impact of an intervention (such as UKRI’s Open Access policy) on observed outcomes. Contribution analysis is a powerful yet versatile methodology that permits the use of a wide range of evidence types to draw conclusions, making it particularly suitable for investigating complex policy landscapes with multiple stakeholders.

For UKRI’s monitoring and evaluation efforts, we suggest using a combination of quantitative (including bibliometric) data on individual research outputs and qualitative information on the publishing landscape to assess policy efficacy and compliance. The recommended monitoring and evaluation questions (see figure below) will also help with value-for-money evaluations, although an accurate assessment would require additional financial data beyond the scope of this work.

In shortlisting the monitoring and evaluation questions, which were identified through a mix of desk research and stakeholder input, we considered several dimensions, including:

  • the prominence of a requirement in UKRI’s Open Access policy
  • perceived significance of the question to external stakeholders
  • estimation of feasibility and resource intensity
  • expected reporting burdens
  • availability of data sources (including open data)
Prioritised list of monitoring and evaluation questions and their intended purpose (the full set of questions is available separately).

As part of the proposed monitoring and evaluation strategy, we recommend the use of difference-in-differences analysis to inform contribution analysis. This is a quasi-experimental method for comparing trends across comparable groups before and after an intervention is implemented. For instance, comparing UKRI-funded and non-UKRI-funded authors affiliated with UK institutions would enable behaviours in critical policy dimensions (e.g. no embargo period, choice of appropriate licence, choice of appropriate Open Access pathway) to be monitored from the moment the policy was introduced.
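To illustrate the logic of difference-in-differences, the sketch below computes the estimate from a small set of hypothetical figures (the groups, outcome measure and rates are invented for illustration, not drawn from UKRI data):

```python
# Minimal difference-in-differences sketch with hypothetical data.
# Treated group: UKRI-funded authors at UK institutions.
# Comparison group: non-UKRI-funded authors at UK institutions.
# Outcome: share of outputs published under a policy-compliant licence.

# Hypothetical mean outcomes before and after the policy took effect
treated_before, treated_after = 0.40, 0.75   # UKRI-funded authors
control_before, control_after = 0.38, 0.50   # non-UKRI-funded authors

# Change within each group over time
treated_change = treated_after - treated_before   # change among funded authors
control_change = control_after - control_before   # background trend

# The DiD estimate: the treated group's change net of the background trend
did_estimate = treated_change - control_change

print(f"Estimated policy contribution: {did_estimate:+.2f}")
```

In practice the estimate would be derived from record-level bibliometric data with appropriate controls, but the core idea is the same: the comparison group absorbs sector-wide trends, so the remainder can more credibly be attributed to the policy.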

At this stage, the recommended data sources to underpin the proposed monitoring and evaluation approach are still under discussion, but they are likely to include a mix of open and proprietary sources. The final choice of data sources will be informed by data availability and coverage, as well as by the need to strike a sustainable balance between value for money, the skills and resources available within UKRI, and technical requirements. UKRI should communicate its chosen direction of travel transparently, so that the strengths and limitations of the final set of data sources are clearly understood by all stakeholders involved.

Recommendations for UKRI

The recommended timeline for the monitoring and evaluation framework’s first six years is summarised in the diagram below. Naturally, deploying the monitoring and evaluation approach will require planning and collaboration with external stakeholders. This is why we recommend a pilot phase with a small set of key stakeholders, such as an external review group, to help validate the terminology and definitions, the overall approach, and the technical solutions for data storage, analysis and sharing. These dimensions will need to be assessed hand in hand with emerging results, to confirm that they remain consistent and appropriate for answering the chosen monitoring and evaluation questions. After piloting, it will be appropriate to fully operationalise a (likely evolved) version of the monitoring and evaluation approach, with ensuing reporting and data releases.

Recommended implementation timeline.

The implementation timeline should be read in combination with the following recommendations, which arose throughout our research and consultation.

Principles for monitoring and evaluation

UKRI should:

  • focus on positively supporting future policymaking and improving communication, awareness-raising and sharing of best practice
  • use contribution analysis as the core evaluation strategy
  • seek to achieve a balance between the collection of new evidence and the creation of reporting burdens for external stakeholders (including, for example, higher education institutions, researchers and publishers)
  • monitor the use of a variety of funding models and not only focus on the use of article processing charges and book processing charges
  • consider monitoring the impact of the Open Access policy on global equity

Practical next steps

UKRI should:

  • ensure that terminology is clear, by engaging research performing organisations and publishers to test definitions prior to implementation
  • consider a pilot stage to reassess and update the prioritised monitoring and evaluation questions based on emerging results
  • iterate and review the monitoring and evaluation approach for long-form outputs as the landscape continues to develop
  • carefully scope out questions on the societal impact of open access, to avoid scope creep

Technical implementation

UKRI should:

  • consider the use of open data sources, in combination with external datasets where proprietary or confidential information is required and can be used for the intended purposes
  • pursue an automated rather than manual approach to data collection and analysis, leveraging Application Programming Interfaces (APIs) and cloud-based analytics tools as appropriate
  • explore potential to collaborate with existing data providers, to minimise data collection burdens where open data may not be available
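As an illustration of the automated, API-driven approach recommended above, the sketch below builds a query against the Crossref REST API for works acknowledging a given funder. The funder identifier and contact address are placeholders, not UKRI’s actual details, and a production pipeline would add pagination, error handling and result parsing:

```python
from urllib.parse import urlencode

# Hypothetical funder ID -- replace with the relevant Crossref funder identifier.
FUNDER_ID = "100000000"

def build_works_query(funder_id: str, from_date: str, rows: int = 100) -> str:
    """Build a Crossref /funders/{id}/works request URL, filtered by publication date."""
    base = f"https://api.crossref.org/funders/{funder_id}/works"
    params = {
        "filter": f"from-pub-date:{from_date}",
        "rows": rows,
        # Crossref asks automated clients to identify themselves (polite pool):
        "mailto": "monitoring@example.org",  # placeholder contact address
    }
    return f"{base}?{urlencode(params)}"

url = build_works_query(FUNDER_ID, "2022-04-01")
print(url)
```

Queries like this, scheduled and combined with other open sources, would let routine data collection run without manual effort while keeping the methodology reproducible.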

Outputs and public sharing

UKRI should:

  • produce a monitoring and evaluation report as well as a record-level dataset and additional qualitative evidence
  • share monitoring and evaluation results and lessons learned publicly, in the spirit of open data and transparency
  • clearly note the limitations of the chosen set of data sources as part of the published monitoring and evaluation methodology and any data releases

As UKRI updates and implements the framework, there is an opportunity for them to stand out as one of the leaders in understanding the immediate, intermediate and long-term effects of Open Access policies. By delving into both qualitative and bibliometric data, UKRI will be able to share nuanced lessons learned with other national and international funding organisations, potentially leading to valuable exchanges and continued progress in the journey towards open science.

Towards implementation

We are now in the process of preparing our final report and executive summary for release. UKRI expects to publish these in the summer of 2023 and will consider them in developing their final monitoring and evaluation framework.

We note that this is the final update on this project to be published via Research Consulting’s website. Please refer to UKRI’s Open Access page for further information.
