How do we measure the return on investment of academic research? What impact have the academic studies that have been carried out actually had? What value for money has the research performed at a university delivered?

In search of excellence

These questions, and others like them, have long been important but difficult to answer for many higher education institutions. That is why they are the focus of the Research Excellence Framework (REF), a revised system for assessing the quality of research in UK higher education institutions, whose results will be finalised in 2014. The REF is undertaken by the four UK higher education funding bodies (HEFCE, SFC, HEFCW and DELNI) to help them decide where to allocate funding, to provide accountability for public investment in research, and to benchmark universities in the UK. It is important to note that the REF is a selective assessment exercise, not an audit: institutions make their own submissions and can choose which researchers to include, which outputs represent their best work, and how to demonstrate the social impact derived from them. Its focus is therefore genuinely on excellence.

In time

In 2006, the UK Government announced its intention to reform the then-current framework for assessing and funding research. What followed included (1):

  • some initial studies on the potential use of bibliometric indicators;
  • a bibliometrics pilot exercise;
  • proposals to assess the social impact of research;
  • another pilot exercise to test and develop the proposed approach.

In March 2011 the funding bodies announced their decisions on the weighting and assessment of impact within the REF. In November 2011, a conference was organised at the Royal Society in London to examine in detail how the REF will work in practice (2). In this article, Research Trends combines insights from that meeting with background information to give you a complete and up-to-date picture.

Force of impact

Impact is defined in the broadest sense. The REF considers several aspects of impact, such as scientific, economic and social, and uses case studies in particular to demonstrate social impact. Impact is evaluated by expert panels conducting peer review, and these experts will draw on whatever types of information and sources they deem appropriate. In doing so, they aim to arrive at the fairest possible evaluation, one based on many different aspects of impact. To ensure that the expert panels include sufficient breadth and depth of expertise to produce robust assessments and carry the confidence of the community, submissions can be made to 36 different units of assessment, or subject areas.

Bibliometric indicators derived from SciVerse Scopus will be available to 11 of the 36 panels (see Table 1 for details), which may use them to complement and/or confirm their peer-review findings. Most panels in the health, life and physical sciences will have bibliometric information available. Fields such as engineering and the social sciences, where citation data have less uptake, will not make use of this option.

REF unit of assessment Bibliometrics data available?
1 Clinical Medicine Yes
2 Public Health, Health Services and Primary Care Yes
3 Allied Health Professions, Dentistry, Nursing and Pharmacy Yes
4 Psychology, Psychiatry and Neuroscience Yes
5 Biological Sciences Yes
6 Agriculture, Veterinary and Food Science Yes
7 Earth Systems and Environmental Sciences Yes
8 Chemistry Yes
9 Physics Yes
10 Mathematical Sciences No
11 Computer Science and Informatics Yes
12 Aeronautical, Mechanical, Chemical and Manufacturing Engineering No
13 Electrical and Electronic Engineering, Metallurgy and Materials No
14 Civil and Construction Engineering No
15 General Engineering No
16 Architecture, Built Environment and Planning No
17 Geography, Environmental Studies and Archaeology No
18 Economics and Econometrics Yes
19 Business and Management Studies No
20 Law No
21 Politics and International Studies No
22 Social Work and Social Policy No
23 Sociology No
24 Anthropology and Development Studies No
25 Education No
26 Sports-Related Studies No
27 Area Studies No
28 Modern Languages No
29 English Language and Literature No
30 History No
31 Classics No
32 Philosophy No
33 Theology and Religious Studies No
34 Art and Design: History, Practice and Theory No
35 Music, Drama, Dance and Performing Arts No
36 Communication, Cultural and Media Studies, Library and Information Management No

Table 1 - Units of assessment in REF 2014, indicating which ones will have bibliometric information available as part of the toolkit to evaluate impact.

A unique example of impact

You may know that Amy Williams won the gold medal in skeleton bobsleigh at the 2010 Winter Olympics. But did you know that two PhD students helped tailor her sled's design to her body contours and steering technique? Rachel Blackburn and James Roche, from the University of Southampton, helped realise this achievement. Dr Stephen Turnock, Blackburn and Roche's supervisor at the University of Southampton's School of Engineering Sciences, said that they had “demonstrated that engineering excellence can be delivered by a small dedicated team with a clear vision”. (3)

What’s your number?

Some quotes from the Panel criteria and working methods document (4) clarify the REF's position on the use of bibliometrics in this exercise.

On using more than one indicator:

“Where available and appropriate, citation data will be considered as a positive indicator of the academic significance of the research output. This will only be one element* to inform peer-review judgments about the quality of the output, and will not be used as a primary tool in the assessment.” (p. 13)

On reliability and comparability:

“… the citation count is sometimes, but not always, a reliable indicator. (…) such data may not always be available, and the level of citations can vary across disciplines (…). Sub-panels will be mindful that citation data may be an unreliable indicator for some forms of output (for example, relating to applied research) and for recent outputs.” (p.42)

On putting a number into context:

“Where available on the Scopus citation database, the REF team will provide citation counts for submitted outputs, at a pre-determined date and in a standard format. The sub-panels will also receive discipline-specific contextual information about citation rates for each year of the assessment period to inform, if appropriate, the interpretation of citation data”. (p. 42)
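To illustrate how such contextual information might be read, the short Python sketch below divides an output's raw citation count by the average for its discipline and publication year. This is purely hypothetical: the baseline figures and the function are invented for illustration and do not describe the REF's actual procedure.

    # Hypothetical sketch (not the REF's actual procedure): relate a raw citation
    # count to a discipline/year baseline so counts from different fields and
    # publication years can be read in context.

    # Average citations per output by (discipline, publication year).
    # These figures are invented; the REF supplies its own contextual data to sub-panels.
    BASELINE = {
        ("Clinical Medicine", 2009): 14.2,
        ("Physics", 2009): 9.8,
    }

    def relative_citation_rate(citations: int, discipline: str, year: int) -> float:
        """Return the output's citation count divided by the discipline/year average."""
        return citations / BASELINE[(discipline, year)]

    # The same raw count reads differently depending on the field it comes from.
    print(round(relative_citation_rate(21, "Clinical Medicine", 2009), 2))  # 1.48
    print(round(relative_citation_rate(21, "Physics", 2009), 2))            # 2.14

The same count of 21 citations sits roughly 1.5 times above the illustrative clinical medicine average but more than twice the physics average, which is exactly why the sub-panels receive discipline-specific context rather than bare numbers.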

Cause for concern

Much of the original criticism of the REF focused on the measurement of impact and whether it can be done objectively (see, for example, (5)). It was often noted that impact cannot capture everything: it relies on strong underlying science, and several speakers at the conference stressed that institutions should not be penalised for “curiosity science” or “risk science”, even if it will not pay off as consistently in terms of impact as more “conservative science” inevitably will.

Other concerns have been expressed about specific subject areas, especially the Arts & Humanities, where it may be more difficult to demonstrate impact, not only in terms of citation counts but also in terms of impact on society. In this issue of Research Trends we describe the role of library and information science journals in generating patents, which is one potential way of showing concrete impact. Other examples of impact could include improving public understanding, improving patient outcomes, or influencing policy.

Watch this space

Final results will not be published until 2014, but Research Trends will follow up and report on any interesting developments, as fostering excellence is crucial for the research of the future. The REF is not simply an exercise in assessing what was done, but what was done over and above what was expected.

Links

  1. http://www.hefce.ac.uk/research/ref/
  2. http://www.hepi.ac.uk/478-2001/HEPI's-Autumn-Conference-will-focus-on-the-new-Research-Excellence-Framework-which-is-due-to-go-live-in-2014.html
  3. http://www.epsrc.ac.uk/newsevents/news/2010/Pages/gold-winningsled.aspx
  4. http://www.hefce.ac.uk/research/ref/pubs/2012/01_12/
  5. http://www.brass.cf.ac.uk/uploads/Research_Excellence_Framework290410.pdf

Notes

*emphasis by authors
