Evaluating computerised health information systems: hard lessons still to be learnt
BMJ 2003;326:860. doi: https://doi.org/10.1136/bmj.326.7394.860 (Published 19 April 2003)
Rapid responses
Littlejohns and co-authors have realistically identified the reasons for the failed implementation of a hospital information system (HIS) in South Africa (1). However, they do not sufficiently stress the reasons related to the training and epidemiological experience of the professionals in charge of the HIS, or the need for health professionals to be closely involved in it.
In 1997, we conducted a six-month field test of prototype tools and information flows, with the overall goal of developing a computerised HIS at the three university teaching hospitals (totalling 1,500 beds) in Abidjan, Ivory Coast. The objectives of the project were to improve the management of patient care, to increase management efficiency, and to promote the use of the HIS database for clinical and epidemiological research.
The HIS design and management procedures were similar in the three university hospitals. In each hospital, the HIS was managed by a team from the administrative department, without a hospital physician or trained epidemiologist. Before the field test began, the administrative staff underwent three weeks of intensive training (use of ICD-10, computerised data entry, data analysis, HIS-informed decision making, and monthly feedback and reporting). Five volunteer clinical departments in each hospital were included in the field test. Project presentation workshops with clinicians and nurses were organised in all hospitals. A Ministry of Health supervisory team of epidemiologists was responsible for technical implementation and follow-up of the project.
After six months, the assessment showed that the HIS implementation had largely failed. Three main reasons for failure were identified. First, day-to-day HIS management generated a heavy workload for administrative staff whose medical and epidemiological training was inadequate and whose profile was therefore ill suited to the task. Second, the medical teams were only marginally involved in the project, possibly because responsibility for it had been assigned to the administrative departments. Third, practitioners found it difficult to see the implementation of an HIS as a public health priority given the scarce financial resources, a perception partly related to insufficient knowledge of the goals and functions of a computerised HIS (2).
One of the recommendations of the evaluation was to create dedicated medical information units in each hospital, staffed by highly specialised epidemiologists, and to involve health care professionals closely in any subsequent project, giving them comprehensive information about the system's functions, the informational approach to decision making, and the advantages and limitations of an HIS (3).
References
1. Littlejohns P, Wyatt JC, Garvican L. Evaluating computerised health information systems: hard lessons still to be learnt. BMJ 2003;326:860-3.
2. Osunlaja AA, Olabode JA. Role of an effective hospital information system in a depressed economy. Methods Inf Med 1997;36:141-3.
3. Gladwin J, Dixon RA, Wilson TD. Rejection of an innovation: health information management training materials in east Africa. Health Policy Plan 2002;17:354-61.
Competing interests: None declared
Littlejohns, Wyatt and Garvican describe a failed attempt to implement a computerized information system across 42 hospitals in the Limpopo Province of South Africa. They provide a typology of possible reasons for this failure, retrospectively summarizing the lessons learnt from this experience (1).
Implementation of an information system can be viewed as an organizational innovation process contributing to the maintenance or improvement of the organization's overall performance and effectiveness (2). When access to previously impenetrable organizational information puts some users in a position of greater power and influence, others who have traditionally enjoyed legitimate power may find that power base eroded by the innovation. If this potential problem is not dealt with during the implementation process, these individuals may undermine the process through incomplete implementation, rejection or even sabotage (3).
These researchers have also conceptualized implementation as a clinical intervention and used randomized controlled trials of users as their study method to evaluate the system's effectiveness. Given that innovation implementation is a complex, longitudinal process involving phases that evolve over time and comprise many layers of social interaction (4), this positivist research strategy appears misguided. Organizational learning and progressive user empowerment need to be accounted for in any information system evaluative framework in which both the user and the innovation are fluid and evolving. Real success may be achieved by iteratively recognizing the human factor changes that arise within organizations during implementation, such as redistribution of responsibility, alteration of management structures and changes in attitudes (3), and by accounting for these phenomena in one's choice of study methods.
Thus, an innovation process theory approach in the examination of
information system implementation has particular relevance to
understanding the relationship between inputs to and outputs from the
system, including the actions of key stakeholders and their collective
effects on overall organizational change (5). This approach is amenable
to both the long time frame and complex network of social interactions
characteristic of most information system implementation studies.
Consideration of this body of literature could help anticipate problems
before much time and money is lost.
(1) Littlejohns P, Wyatt JC, Garvican L. Evaluating computerised
health information systems: hard lessons still to be learnt. BMJ 2003;
326(7394):860-863.
(2) Damanpour F. Organizational innovation: a meta-analysis of
effects of determinants and moderators. Academy of Management Journal
1991; 34(3):555-590.
(3) Linton JD. Implementation research: State of the art and future
directions. Technovation 2002; 22(2):65-79.
(4) Dobbins M, Cockerill R, Barnsley J, Ciliska D. Factors of the
innovation, organization, environment, and individual that predict the
influence five systematic reviews had on public health decisions.
International Journal of Technology Assessment in Health Care 2001;
17(4):467-478.
(5) Kaplan B. Models of change and information systems research. In:
Nissen H-E, Klein HK, Hirschheim R, editors. Information Systems Research:
Contemporary Approaches and Emergent Traditions. Amsterdam: Elsevier
Science, 1991: 593-611.
Competing interests: None declared
Littlejohns et al.'s paper [1] makes interesting and yet depressing reading. How are we still getting it so wrong? They describe a broad range of reasons, although I make no apologies for focussing on just one.
Lord Hunt, following his departure from the Department of Health, stated that "my greatest fear is that we won't get the doctors on board" [2]. Like most clinicians and other "hands on" NHS staff (I suspect), I favour a bottom-up approach to ICT within the NHS. This does not mean I am against the regional rollout and efforts of the National Programme. It makes sense to use economies of scale to provide the infrastructure and systems, and standardisation is, after all, one of the key principles of health informatics. What I am concerned about is the power of clinicians in the equation.
If Lord Hunt believes that "the doctors and nurses … will make or break it", why is more not being done to involve us? Ownership comes not from consultation but from active involvement. My experience is that NHS staff are generally interested in the potential benefits of ICT but feel they lack the knowledge or skills, or resent 'outsiders' telling them how to do their jobs. Existing processes seem to do little to improve this picture and, I believe, threaten the success of the National Programme.
Resistance to ICT changes can be reduced by getting clinical staff involved in the training and development processes. Given that widespread participation may be difficult to achieve, it is important to rely on so-called "clinical champions" to drive things forward. This is not an easy task, though. Meetings often clash with clinical commitments, and it is not feasible to cancel a clinic at short notice to attend an ICT meeting. Trusts must recognise the importance of ICT in the clinical process (and of clinicians in the ICT process) and be prepared to release doctors from clinical sessions to play an active role.
It is an unfortunate paradox that limited resources inhibit creativity in NHS ICT. "Quick win" projects usually come from the shop floor and tend to provide cheap yet significant benefits to staff and patients alike. The days of hospital IT as a "cottage industry" [3] may be numbered, but this cottage industry tends to produce quality goods designed by the users, for the users. Those responsible for implementing the new systems would do well to remember that.
Highly trained professionals do not respond well to being told what to do. Instead they require equal standing in the decision-making process alongside managers and ICT professionals. Systems should be in place so that interested clinicians are intimately involved at all stages in the analysis, design, implementation and evaluation of systems. Clinician involvement will not happen by chance: it needs active encouragement.
[1] Littlejohns P, Wyatt JC, Garvican L. Evaluating computerised health information systems: hard lessons still to be learnt. BMJ 2003;326:860-3.
[2] Health Service Journal, 3 April 2003, Volume 113, No 5849, page 18.
[3] Trusts must not stop local IT investment. www.e-health-media.com, 22 January 2003.
Competing interests: being a doctor
Some lessons have been learnt.
Sir,
As a former IBM employee who was intimately involved in the Limpopo project, I would like (in my private capacity) to expand on some of the points made by Littlejohns et al. I fear that the attempt to re-affirm universals has obscured some of the particulars. The result is prejudicial to all of us who were involved. Regrettably, the article conveys the impression of an incompetent client and an overzealous vendor.
In practice, all of the "lessons" mentioned were identified during the project (independently of the evaluation) but could not be applied, for contractual and "real life" reasons:
1. I believe the authors have misinterpreted the role of culture. Irrespective of other concerns, the introduction of a computerised information system amounted to a large-scale culture change. Hospitals which had traditionally operated on paper-based, "batch mode" principles, with a degree of laissez faire, suddenly had to migrate to "on-line, real-time" principles, with control measures and instantaneous reporting. In the best of circumstances, such a culture change might take many months (witness the reluctance of many clinicians to work on-line). In ours, it might take several years. By our internal assessments, hospitals which had already had some experience of computerised systems experienced less upheaval. But how much preparatory education is sufficient for such large-scale change, and how effective is it if it is not temporally proximate to the change?
2. As regards evaluations, a measuring instrument applied at, say, six months might not capture a slow, evolutionary change. I say this with hindsight, as I was one of the "special advisory group of experts".
3. Surfeit of change. In Limpopo, four previous administrations, of varying efficiency and probity, were being amalgamated. Hospital and departmental management were suffering from "change fatigue". Many of the managers were recent appointees, grappling with multiple challenges. Perhaps such projects should not be undertaken during times of great transition, but that would leave developing countries waiting for Godot.
4. In our latitudes there is often an inappropriate expectation, no doubt fuelled by the computer industry, that information technology (IT) is the answer to intractable organisational problems. In practice, IT amplifies process and managerial deficiencies. The lack of satisfaction with the subsequent, less sophisticated system is therefore not surprising.
5. There is a perverse "efficiency" in a shortage of skilled staff: fewer people deal with a constant or increasing workload. An injection of investment often has no apparent effect on efficiency, because it allows a return to a more "normal" state.
6. Inefficient processes generally have one or more rate-limiting steps. In Limpopo at the time, some 300 doctors were manning the 42 hospitals. It would therefore be naïve to expect a decrease in patient waiting time. As the research completed by Mbananga et al demonstrated, all we achieved was to shift the queues from the administrative waiting area to the consultation waiting area (but at least few files were lost).
7. The distortions introduced by public tenders are significant. In general, the scope and time-scale cannot be changed (the subsequent criticism of the Ethniks tender refers). This meant that:
a. Our suggestions to de-scope the project, to a scale assimilable by the hospitals, were not accepted.
b. Our attempts to introduce a formal change management programme (not part of the original tender) were unsuccessful.
c. The functional modules stipulated in the tender were not necessarily aligned with the achievement of some of the objectives. Offers to substitute more suitable modules were not accepted.
d. Even a request by the head of Department to delay the project for budgetary reasons fell foul of regulations.
e. We were not in a position to influence the turn-around times of other hardware suppliers operating under different tenders.
8. The pharmacy system was supposedly the remit of a different vendor,
under another tender, and we did not therefore supply pharmacy management.
9. Half of the laboratories in the province were owned by an external
organisation, with its own information system. The other half were
provincially-owned, but the Department chose not to implement our Lab
modules.
10. All the tendered modules were available at the time of purchase. However, some necessary adaptations took time to complete.
11. It is surprising to learn that management reporting should receive
priority over the basic functioning of the system. Our decision makers
generally still do not have experience in the use of information in
institutional management. Unreliable information is unlikely to further
the cause.
12. Our training programmes (“how-to”) have been variously criticised as
occurring too early or too late. In reality, they occurred whenever
circumstances permitted.
The authors point out that enormous IT investments have had no discernible effect on the productivity of health professionals. Paradoxically, such investments continue. Perhaps increased productivity of professionals is not the primary aim. Absent a folie en masse among healthcare decision makers, one presumes that other benefits accrue. Measurement has been deucedly difficult, perhaps because of simplistic yardsticks. In today's tight financial conditions, many decision makers demand a measurable return on investment (ROI) before committing funds to an IT project. Yet there are few convincing ways to measure ROI. The Gartner Group, a respected IT consultancy, now proposes other measures. Perhaps we in health care are also measuring the wrong things?
Since so many projects fail, perhaps the real question is when to admit failure and withdraw funding. There are no easy answers: many successful complex projects at some point looked like failures. I have been associated with another large healthcare information system project in Gauteng Province, South Africa. At the end of the third year, the six hospitals and five clinics were largely in no better shape than when we started. In some ways, they were worse off, because of staff attrition. Yet in the current (fifth) year, there is a significant difference. None of the hospitals would now agree to the abandonment of the system. When I recently mooted the discontinuation of nursing care plans (notoriously difficult to implement successfully), I was greeted with outrage and told to improve the functionality instead. The users had already incorporated use of the module into a new system of performance measurement.
It is only now that we are getting to grips with information-assisted
management.
Hardly a paragon of success, but indubitably a small victory – made
possible by the tenacity of the client and the project team. Should the
plug have been pulled after year three?
Perhaps the money would have been better spent on aspirin. However, it could be argued that this was a necessary first step in moving into the information age. It is instructive to compare the amount wasted here with the amount quoted by the authors for one large hospital. Perhaps some lessons have been learnt, after all.
Competing interests: B. Longano was an employee of IBM South Africa from February 1996 to February 2002. He has a collaborative relationship with IBM and is an expert on the Medicom Health Care Information System.