How impact factors changed medical publishing—and science
BMJ 2007; 334 doi: https://doi.org/10.1136/bmj.39142.454086.AD (Published 15 March 2007) Cite this as: BMJ 2007;334:561
All rapid responses
Rapid responses are electronic comments to the editor. They enable our users to debate issues raised in articles published on bmj.com. A rapid response is first posted online. If you need the URL (web address) of an individual response, simply click on the response headline and copy the URL from the browser window. A proportion of responses will, after editing, be published online and in the print journal as letters, which are indexed in PubMed. Rapid responses are not indexed in PubMed and they are not journal articles. The BMJ reserves the right to remove responses which are being wilfully misrepresented as published articles or when it is brought to our attention that a response spreads misinformation.
For a single article to generate “impact” in the academic community,
it must first be read. Ubiquitous, searchable electronic databases such
as Medline allow publications in most journals to be found according to
the researcher’s interests, rather than according to the impact factor of
the journal in which the work is published. Scientific scrutiny of the
work then determines whether it is cited, and thus whether it “impacts”
future research. For individual authors, it is this process that can
determine the success of their academic career and their status in the
research community.
Therefore, it is imperative that assessment of an individual’s impact
on their field is based on their own scientific track record, and not that
of the journals in which they publish, given that journal-level indices
were not designed for such a task and often do not produce a fair
assessment (1-3). Individualised rating scales based on citation rates
have recently been suggested, such as the Hirsch index (4), and others
have also been developed.
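For illustration, the Hirsch index has a simple operational definition: a researcher has index h if h of their papers have at least h citations each. A minimal sketch in Python (the citation counts are invented for the example):

```python
def h_index(citations):
    """Return the Hirsch index: the largest h such that the
    researcher has h papers with at least h citations each."""
    # Sort citation counts in descending order, then keep advancing
    # while each paper's count still meets or exceeds its rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one researcher's papers:
print(h_index([10, 8, 5, 4, 3]))  # 4 papers with >= 4 citations each -> 4
```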
In today’s increasingly flexible digital world, complete with
modifiable databases of information on a researcher’s activity, such
indices could be refined to specify the information requested and to
remove some of their pitfalls. An individual’s impact factor could be
determined from citations within a limited time frame, and only for
first-author publications or primary research (e.g. excluding reviews).
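One such refinement could be sketched as follows; the data model and field names here are hypothetical, not any real database’s schema:

```python
from dataclasses import dataclass

@dataclass
class Paper:
    year: int           # publication year
    first_author: str   # surname of the first author
    is_review: bool     # True for review articles
    citations: int      # citations received to date

def individual_impact(papers, author, start_year, end_year):
    """Mean citations per paper, restricted to first-author primary
    research published within [start_year, end_year] inclusive."""
    eligible = [p for p in papers
                if p.first_author == author
                and not p.is_review
                and start_year <= p.year <= end_year]
    if not eligible:
        return 0.0
    return sum(p.citations for p in eligible) / len(eligible)
```

Excluding reviews and limiting the window are exactly the kinds of filters the paragraph above envisages; any real implementation would depend on the citation database actually used.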
It must be remembered that an individual’s scientific pedigree should
not be reduced to a single number. But as bibliometric measurement comes
to form an important part of academic appraisals (5), it will be
important to ensure that whichever process is adopted (and verified)
takes into account the individual researcher’s “impact”, rather than
that of the journals in which their work is published.
1. Brown H. How impact factors changed medical publishing--and
science. BMJ 2007;334:561-4.
2. Williams G. Should we ditch impact factors? BMJ 2007;334:568.
3. Hobbs R. Should we ditch impact factors? BMJ 2007;334:569.
4. Hirsch JE. An index to quantify an individual's scientific research
output. PNAS 2005;102:16569-72.
5. Hobbs FDR, Stewart PM. How should we rate research? BMJ 2006;332:983-4.
Competing interests: None declared
The Australian Government will begin the Research Quality Framework
(RQF) assessment process in 2008, which will affect funding for
universities from 2009. A proportion of the assessment will be based on
citation indices. The Quality Metrics Working Group recommended that
bibliometrics, citation data, grant income data and tiered ranking of
discipline-specific research outputs be used in the RQF assessment process
to assist rating research quality.
Whilst impact factors will not be the sole determinant of funding
outcomes, and particular fields of research will be examined using
different methods, it remains clear that impact factors will play a role
in the future distribution of funds within the higher education sector
in Australia.
Competing interests: None declared
The impact factor is a poor measure of the importance of an
individual paper, given that 80% of a journal’s impact factor is
determined by 20% of the papers it publishes.
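The standard impact factor is simply a ratio: citations received in the census year to items published in the preceding two years, divided by the number of citable items in those years. A toy calculation (all numbers invented) shows how a skewed citation distribution lets a handful of papers dominate the figure:

```python
# Toy journal: 10 citable items published in the two-year window.
# Citations received in the census year, per item (made-up, skewed):
citations = [40, 35, 3, 2, 1, 1, 1, 1, 1, 0]

# Two-year impact factor: total citations / citable items.
impact_factor = sum(citations) / len(citations)

# Share of the impact factor contributed by the top 20% of papers:
top_20_percent = sorted(citations, reverse=True)[:2]
top_share = sum(top_20_percent) / sum(citations)

print(impact_factor)        # 8.5
print(round(top_share, 2))  # 0.88
```

Here two papers out of ten account for nearly 90% of the journal’s figure, while the median paper receives one citation.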
In recent years many new indices have been created as alternatives to
the IF algorithm: Sevinc’s proposal to recalculate the IF without
counting self-citations is a good example of a different option.
Van Leeuwen and Moed have developed an alternative journal impact
measure, the Journal to Field Impact Score (JFIS), which addresses biases
in four areas: “non-citable” items included in the numerator of the IF
calculation; the relative distribution of research articles, technical
notes, and reviews; different citing behaviour across subject fields; and
the fixed two-year citation window.
Asai proposed an Adjusted Impact Factor that counts a weighted sum of
citations per month over a four-year period.
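In essence this is a weighted sum over a 48-month citation window; the sketch below uses illustrative linearly decaying weights, not Asai’s actual coefficients:

```python
def adjusted_impact(monthly_citations, weights):
    """Weighted sum of per-month citation counts: each month's
    citations are multiplied by that month's weight and summed."""
    if len(monthly_citations) != len(weights):
        raise ValueError("one weight per month is required")
    return sum(c * w for c, w in zip(monthly_citations, weights))

# Illustrative linearly decaying weights over a 48-month window,
# giving recent months more influence than older ones:
months = 48
weights = [(months - m) / months for m in range(months)]
```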
There are other alternatives, such as the Cited Half-Life Impact
Factor (CHAL-IF), the Median Impact Factor (MIF), the Disciplinary
Impact Factor (DIF), and the “Prestige Factor” (PF).
Particular mention should be made of the Euro-Factor (EF), created in
Europe to address the language bias (the prevalence of English) and the
perceived US-centricity of the SCI database.
Perhaps the best way to evaluate the real impact of a scientific
article or journal is to take several parameters into account rather
than only one, avoiding “IF supremacy” in research and funding.
Competing interests: None declared
An important but often overlooked effect of impact factors is the
role they play in the publishing of sponsored studies. Such papers are
often, if not usually, developed by companies that specialize in
“publication plans”: the carefully planned release of information in
targeted venues, often over several years. In addition to making
information available, these plans also create a “citation trail”, so
that later papers use earlier ones to support findings and conclusions.
The impact factor is one of the most important criteria used for
choosing where to submit such a paper.
This may explain in part why papers appearing in such journals are
often not the ones later cited as landmark publications.
Competing interests: None declared
I agree with most of the sentiments expressed about over-reliance
on impact factors as a measure of quality and the potential for their
misuse by funding bodies and appointment committees. However, it seems
unlikely that the impact factor has had a major influence on the falling
number of clinical academics over the past decade. The major driver
behind this is surely financial. Remuneration for clinical academics
falls substantially behind that of their colleagues from an early stage
and never makes up the ground. Even the merit award system favours the
non-academic medic. This week the media have highlighted the fact that
the average GP has a six-figure annual income under the new GP contract.
Where is the incentive to be the best clinical academic when you can
earn 25% more as an average GP?
Competing interests: None declared
Impact of the impact factor in Spain
Spanish researchers have observed with interest and no little irony
the debate on the virtues and vices of the impact factor and its potential
use in the UK Research Assessment Exercises beginning in 2008. Perhaps
Spain is not so different from other European countries, since the same
debate took place in our country more than 15 years ago.
In Spain publication of research reports in journals with a high
impact factor has, since 1989, been an official part of the national
system for evaluating researchers’ productivity. As stated in the Spanish
parliamentary record (1), a bonus is awarded only for “those articles of
scientific worth in journals of recognized prestige, which shall be
accepted to mean those that occupy relevant positions in the lists for
science fields in the Subject Category Listings of the Journal Citation
Reports of the Science Citation Index (Institute for Scientific
Information (ISI), Philadelphia, PA, USA)”. This criterion, which is the
determining factor in the government’s evaluation of a researcher’s
output, has become the key which opens the door to a professional career,
public funding, and favourable judgements of scientific productivity (2).
The aim of this reward system was to improve the quality and the
international visibility of Spanish science, but what effects has this
system, based almost entirely on a bibliometric criterion, actually had in
Spain?
The most immediate effect was mass emigration of the best research
articles to foreign journals. The increase in the number of Spanish source
items in the Thomson Scientific databases since this policy came into
effect has been spectacular (3). The mean increase in Spanish source items
in the ISI database was 592 items per year until 1990, but jumped to 1512
items per year between 1991 and 2004, a more than 2.5-fold increase.
Nevertheless, this
huge increase beginning in the 1990s coincided with a freeze in public
investment in research and development (R&D), which decreased in real
terms after adjustment for inflation, to the point that the unit cost per
published article declined steadily in Spain for several years.
Other effects of the change in policy were the internationalization
of Spanish science, and its incorporation into mainstream research, a
process which has enhanced the rigour and quality of work done in this
country, and has raised the bar in terms of the range and scope of the
goals the research aims to achieve. Articles authored by Spanish
researchers have increased not only in quantity but also in expected and
observed impact.
However, progress toward international-quality research has been
accompanied by increasing neglect of Spanish journals, to which
researchers rarely submit their best work (4). Another negative effect has
been the destruction of Spanish as a language of science. These negative
consequences obviously cannot be extrapolated to the UK, a country which
speaks English and whose scientists were not cut off from the rest of the
international scientific community during most of the 20th century.
However, other, more worrisome and unpredictable consequences have begun
to appear which may eventually infect UK research activity and assessment:
1. Many research groups have altered their research agendas. In
Spain, research with potential practical applications, and research on
topics that are local, regional and national in scope, has been replaced
by basic research in topics more likely to be better received by the
international research community, and therefore more likely to have a
greater impact in terms of citation counts.
2. Impactitis (5), a new disease, has become epidemic in Spain. Its
symptoms are altered publication and citation behaviour in response to an
obsessive compulsion to use the impact factor as the single,
incontrovertible quality criterion for scientific articles. This epidemic
has been propagated by all types of academic institutions and researchers,
invading all tissues of science and infecting all those who take part in
scientific communication. Numerous cases are diagnosed daily in authors
and editors. Among the former, the signs of illness are choosing the
target journal on the basis of the impact factor alone without considering
which audience is most appropriate for their work, hypertrophic self-
citing, joining invisible citation colleges intended to increase the
impact of their publications, and deliberately omitting to cite their
scientific rivals or enemies. Infected editors, determined to see their
journal indexed in Thomson Scientific’s databases, sink to manipulating
editorial policies in order to increase the journal’s visibility both
among scholars and in the mass media.
Publishing in journals that occupy the leading positions in the
Thomson Scientific impact factor rankings has become the fuel which fires
Spanish science. Although the reasons why the impact factor should be
used with caution have been spelled out (6), and even though Garfield
himself has warned against this type of abuse of the ranking (7), the
impact factor has become the number that is devouring Spanish science (8).
The consequences, both positive and negative, of using bibliometric
criteria to evaluate research in Spain may be predictive of some of the
effects that such a policy could have for science in the UK in the next
few years if a similar method of research evaluation is adopted.
(1) Boletín Oficial del Estado, Pub. L. No. 28563, (September 9,
1989). Boletín Oficial del Estado, Pub. L. No. 3566, (February 6, 1990).
Boletín Oficial del Estado, Pub. L. No. 37030, (December 3, 1994). Boletín
Oficial del Estado, Pub. L. No. 35028, (November 20, 1996).
(2) Jiménez Contreras E, Moya Anegón F, Delgado López-Cózar E. The
evolution of research activity in Spain. The impact of the National
Commission for the Evaluation of Research Activity (CNEAI). Research
Policy 2003; 32(1): 123-42.
(3) Jiménez Contreras E, Delgado López-Cózar E, Ruiz Pérez R, Fernández
VM. Impact-factor rewards affect Spanish research. Nature 2002; 417(6892):
898.
(4) Díaz M, Asensio B, Llorente GA, Moreno E, Montori A, Palomares F, et
al. El futuro de las revistas científicas españolas: un esfuerzo
científico, social e institucional. Rev Esp Doc Cient. 2001; 24(3): 306-
14.
(5) Camí J. Impactolatría: diagnóstico y tratamiento. Med Clín (Barc) 1997;
109: 515-24.
(6) Seglen PO. Why the impact factor of journals should not be used for
evaluating research. BMJ 1997; 314: 498-502.
(7) Garfield E. The Agony and the Ecstasy: The History and Meaning of the
Journal Impact Factor. International Congress on Peer Review and
Biomedical Publication. Chicago, 16 September 2005 [cited 19 April 2007].
Available from:
http://garfield.library.upenn.edu/papers/jifchicago2005.pdf
(8) Monastersky R. The Number That’s Devouring Science. The Chronicle of
Higher Education [serial on the Internet]. 2005 [cited 19 April 2007];
52(8): A12. Available from: http://chronicle.com/free/v52/i08/08a01201.htm
Translator: K. Shashok
Competing interests: None declared