
Feature Academic Medicine

Beyond the impact factor

BMJ 2009; 338 doi: (Published 13 February 2009) Cite this as: BMJ 2009;338:b553
Geoff Watts, freelance journalist, London

geoff{at}

    Details of how the new research excellence framework will assess UK research are expected later this year. Geoff Watts looks at the possibilities for fairer evaluation of applied medical research

    The RAE is dead; long live the REF.1 We’re talking research quality: what’s good and what’s not; what’s worth doing and what isn’t. But how is good quality to be defined? And to what extent should that judgment take account not only of the intellectual excellence of a research programme but also, in the case of medicine, its value to clinical practice? The question is a live one, and set to become livelier. One answer may lie in a still embryonic form of appraisal: the “social impact” of research.

    For the benefit of readers who don’t follow the vicissitudes of the research community, the more familiar of these three-letter abbreviations, RAE, stands for research assessment exercise: the process by which, until last year, the Higher Education Funding Council for England surveyed the quality of research carried out in British universities. The REF, the research excellence framework,2 is its replacement. In a nutshell, the old system was based principally on peer review of an academic department’s research, while the new one will rely far more on metrics: a raft of statistical measures of the research under scrutiny. These could include a department’s total non-government research income, the number of its postgraduates, and—worrisome to some—a bibliometric analysis of its research output.

    Universities take understandable pride in being able to quote good research ratings—but there’s more at stake here than academic egos. The assessment figures are used to determine how research funds are allocated.

    What is good research?

    One of the pitfalls in any judgment of research is the “apples and pears” problem: like is not being compared with like. Richard Smith, a previous …
