By Dan King, first published 21/02/2020, updated 03/03/2020
Prompted by our response to the review consultation, this blog shares some of our thoughts and observations on HE-BCI and the consultation. Dan King has enjoyed a (professional) relationship with this data for nearly 20 years, using HE-BCI data to inform university performance and now in consultancy work with various UK and overseas clients.
Research Consulting regularly supports work focused on knowledge exchange, including a contributory report on effective practice in KE that informed the 2016 McMillan Review. The data available from HE-BCI frequently informs our work, setting contexts for scale, breadth and priorities.
The Higher Education-Business and Community Interactions (HE-BCI) survey has now been with us since 1999, and it is timely that HESA (who now collect the data) are undertaking a major review. The importance of this data can be seen in the evidence drawn from it in multiple reviews of university-industry partnerships (most recently the Muscatelli Report on innovation in Scotland).
Disciplines matter: can we improve the granularity of data?
There is no doubt that the information captured within HE-BCI is comprehensive and globally leading in both its breadth and its detail of categorisation. A good example is seen below in the infographic by AUTM – a US-based association of over 3,100 technology transfer professionals – which has run an annual survey of institutions involved in technology transfer since 2008. The infographic shows the various spin-off and start-up categories, including the differentiation between staff and student start-ups; this can be contrasted with the equivalent metric in Australia, which does not differentiate in as much detail.
However, one of the weaknesses of HE-BCI is that the public data stops at institutional level. Here I would contrast HE-BCI with the data HESA collects on research income, where analysis can take into account the discipline mix of institutions. Enabling a better understanding of knowledge exchange performance at discipline level is the biggest ‘gap’ for users of HE-BCI.
Alignment of some HE-BCI metrics to HESA Cost Centres would greatly add to the ability for benchmarking analysis to draw on the full range of data available, and in ways that are sensitive to disciplinary contexts and the characteristics of universities and institutions. This change would also serve to better align HE-BCI to the needs of institutions in responding to the KEF, providing institutions with discipline-relevant KE metrics that better illustrate their context and allow performance benchmarking.
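To illustrate the kind of benchmarking this alignment would enable, here is a minimal sketch in Python. The institutions, cost centres and figures are invented purely for illustration – HE-BCI does not currently publish income at cost-centre level, so the join below is an assumption about what aligned data could look like.

```python
import pandas as pd

# Hypothetical HE-BCI contract research income, reported (as proposed)
# against HESA cost centres. All figures in £m are invented.
hebci = pd.DataFrame({
    "institution": ["Uni A", "Uni A", "Uni B", "Uni B"],
    "cost_centre": ["Engineering", "Social sciences"] * 2,
    "contract_research_income": [5.0, 0.8, 2.0, 1.5],
})

# HESA research income by the same cost centres (also invented).
hesa = pd.DataFrame({
    "institution": ["Uni A", "Uni A", "Uni B", "Uni B"],
    "cost_centre": ["Engineering", "Social sciences"] * 2,
    "research_income": [20.0, 4.0, 10.0, 12.0],
})

# With a shared cost-centre classification, benchmarking becomes a join.
merged = hebci.merge(hesa, on=["institution", "cost_centre"])

# KE "intensity": contract research income per £ of research income,
# comparable across institutions with very different discipline mixes.
merged["ke_intensity"] = (
    merged["contract_research_income"] / merged["research_income"]
)
print(merged[["institution", "cost_centre", "ke_intensity"]])
```

The point of the sketch is simply that, once the discipline splits share a classification with existing HESA returns, discipline-sensitive comparisons require no new analytical machinery.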
From our work with individual universities, we know that internal data collection approaches already allow this kind of discipline-level assessment, but no external comparisons are possible. That said, we know that some universities may find this extra granularity hard to report against, requiring adjustments to their data collection approach. But if the discipline splits are aligned with existing data collection standards, then the approach would allow for process efficiency and alignment.
Of course, this can’t easily work for every HE-BCI metric, but in consultancy, contract research and regeneration, for example, additional discipline context adds value and indicates more clearly how institutions are delivering their KE mission. Greater transparency at discipline levels also starts to provide evidence against some national challenges. For example, in the social sciences, there has been a sustained concern over the level of engagement with private sector businesses. This was evident in the 2015 review of Doctoral Training Partnerships and is one of the issues to be examined in ESRC’s major review of doctoral support during 2020.
International perspectives – an effective tool for international comparisons
Through our international work, we also recognise growing interest and developments in, for example, research-led intellectual property (IP) and commercialisation. IP Pragmatics looked at this in their 2016 report to HEFCE on benchmarking for KE, and more recently this was exemplified in Mike Rees’ report on university-investor links. The data and evidence in HE-BCI are helpful for the UK on the world stage and featured in recent work we undertook for the British Council.
A review of HE-BCI should take into account the international landscape and equivalent data collected in the US and Canada (by AUTM), Australia (via the National Survey of Research Commercialisation), Ireland and also in Japan (through the University Technology Transfer Survey, deliberately modelled on the AUTM survey, which has run annually since 2007). The question is whether HE-BCI could more easily demonstrate UK performance set against international surveys and landscapes.
Thus far, equivalent models across Europe have, to our knowledge, yet to emerge to a consistent standard. The ASTP, in their 2018 and 2019 reports on knowledge transfer activities in Europe, secured information from ~470 Knowledge Transfer Offices across Europe. Tellingly, a high percentage of responses (around 35%) were from the UK, suggesting that the availability of data (no doubt due to HE-BCI) is much greater here than elsewhere in Europe. The reports do indicate that national associations are developing metrics systems that contribute to the survey aims. However, there is some acknowledged incompatibility between these datasets and the ASTP survey, and the 2019 report notes additional difficulties in securing comparable research income (or research staffing) metrics to better allow meaningful comparison.
The Australian National Survey of Research Commercialisation, Snapshot 2018 Report, outlines the performance of Australia’s universities in research commercialisation.
Improving the ability of HE-BCI data to influence?
The HESA consultation requested views on the metrics used in HE-BCI, and on their usefulness. Here, we make two observations on regeneration funding and the capture of “event” data around Social, Community and Cultural Engagement.
Big changes to funding environments in regeneration and economic development, but an opportunity to better capture the impact from this funding on beneficiaries
This class of funds is distinctive in that it typically involves a grant from a sponsor to the university, with the expectation that the university delivers an agreed form of support to businesses and/or individuals. This support may cover entrepreneurial activity and skills, technical facilities and knowledge for R&D&I, SME leadership and growth, or (quite frequently) recruitment of graduate talent.
There is no doubt that the locally allocated European Structural Funds have, over many years, been highly important funding sources for universities in developing their SME-facing KE portfolio. The ESIF 2014-2020 programme (which includes sub-funds such as ERDF and ESF) will now be the final programme of this type. In 2017-18, ERDF alone delivered over £90m of income to universities – accounting for 40% of all the reported regeneration income for that year. The majority of these projects will have been focused on delivering benefit to SMEs in their local area – but we cannot see this within the current HE-BCI data. One local example (that we have benefitted from as a company) is the Enabling Innovation ERDF project, where three universities in the D2N2 LEP area worked together to deliver support to 2,000 SMEs.
So, the question is whether, within HE-BCI, we need to consider what the form of support is to the beneficiary organisations and who is delivering it (i.e. which part of the university). This data would be helpful in better understanding the role and scale of universities in supporting regions and local areas.
Beyond ESIF 2014-2020, there is the expectation of continued university engagement via the expected replacement fund: the UK Shared Prosperity Fund (SPF). Details remain scant, but an indicative timetable is emerging and suggests that the Government will bring forward plans for the SPF “shortly”.
Social, community and cultural engagement: is the data influential?
Our experience has been that data on social, community and cultural engagement events has less value to us in its present form. Essentially, this derives from the lack of a typology describing the events and a lack of robustness in the data, which casts doubt on the value of further meaningful analysis. In most cases, universities appear to report more attendees in “other” event types than in the four defined event types (public lectures, performance arts, exhibitions and museum education) combined.
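The reporting pattern just described is easy to state concretely. The attendee figures below are invented purely to illustrate the shape of the problem: when the “other” category outweighs the four defined categories combined, most of the reported activity is effectively uncategorised.

```python
# Invented attendee counts illustrating the reporting pattern described
# above; no real institution's figures are used.
events = {
    "public lectures": 12_000,
    "performance arts": 8_000,
    "exhibitions": 5_000,
    "museum education": 3_000,
    "other": 40_000,
}

defined_total = sum(n for name, n in events.items() if name != "other")

# When "other" exceeds the defined categories combined, further analysis
# by event type tells us little about what the activity actually was.
mostly_uncategorised = events["other"] > defined_total
print(defined_total, mostly_uncategorised)
```

A richer typology of event types would shrink the “other” bucket and make the data analysable, which is the substance of the concern raised here.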
Individual universities do use these data in reports outlining the scale of community engagement activities, a useful indicator of the wider value universities bring to local communities. But, beyond this, we have seen little evidence that this data capture is influential. Indeed, the only recent report we have come across on academic support for pro-bono public engagement or knowledge exchange work relied on a separate survey rather than HE-BCI data (Viewforth Consulting, 2018). That report drew on analysis of academic staff time spent delivering these events.
In conclusion, we await with interest the possibilities for the review of HE-BCI to reinforce and improve this important data set.