Career Focus

Adverse incident reporting and significant event auditing: AIR and SEA rescue

BMJ 2004; 328 doi: https://doi.org/10.1136/bmj.328.7447.s173-a (Published 01 May 2004) Cite this as: BMJ 2004;328:s173
  1. Karen Dalby, Medicolegal adviser and clinical risk manager with the Medical Defence Union

Abstract

Karen Dalby, clinical risk manager at the Medical Defence Union, explains why learning from events and near misses is becoming enshrined in medical culture

Adverse incident reporting and significant event auditing are two of the many terms which doctors will have heard being used increasingly by NHS managers and policymakers. But what do these rather unwieldy terms mean and how will they help doctors at the coalface?

The National Patient Safety Agency

Incident reporting is a useful risk management technique that can provide valuable, timely information about problems that occur during clinical treatment. An independent national body, the National Patient Safety Agency (NPSA), was set up in July 2001 to collate all reports of patient safety incidents, analyse patterns and trends, and issue risk management recommendations through its National Reporting and Learning System (NRLS).

The NPSA calls adverse events “patient safety incidents” and near misses “prevented patient safety incidents.” It has established an anonymous electronic reporting system for any practitioner who does not wish to report directly through the local system. During 2004 the NPSA will be rolling out the system across England and Wales, as well as providing training in how to undertake root cause analysis, the process of establishing the core reasons that underlie an event (www.npsa.nhs.uk).

The NPSA has an agreement with the National Assembly for Wales, meaning it has the same role there as in England, and it maintains “close contact” with colleagues in Scotland and Northern Ireland. For more information on the NPSA's geographical remit see its annual report (www.npsa.nhs.uk/admin/publications/docs/annual_report_2002_2003(1).pdf).

The following fictitious cases illustrate the kinds of mistake that could be analysed using an incident reporting system.

Case examples

A 72 year old man telephoned his general practitioner (GP) complaining of acute knee pain. The GP made a provisional diagnosis of osteoarthritis and advised an over the counter non-steroidal anti-inflammatory drug (NSAID). He did not review the patient's medical record and was unaware that the patient was on long term warfarin treatment. The patient telephoned three days later complaining of passing blood in his urine. Arrangements were made for an urgent international normalised ratio (INR) test, and the patient was advised to stop taking the NSAID. The INR result was very high and was telephoned to the practice that afternoon. The receptionist put the result in the GP's pigeonhole. The GP was on study leave that afternoon and did not see the result until the next day. By that time the patient had been admitted to hospital overnight with an acute haemorrhage; he was discharged three days later.

In another case, a patient was successfully resuscitated after a cardiac arrest. During intubation the blade on the first laryngoscope came loose, and the spare was found to have no functioning light source. Fortunately, the anaesthetist was still able to intubate using the spare laryngoscope, and there were no consequences for the patient.

In the past, doctors in these situations may have been relieved that the patient made a full recovery, offered the patient an explanation and an apology, and made a mental note to try to ensure the same thing didn't occur again. In the GP example, for instance, arrangements may have been made to cover GPs on study leave. In the hospital case, the doctor may have made a quick check of the equipment after coming on duty. But the emphasis now is on the whole team reviewing incidents like these to learn valuable lessons about system failure, introduce appropriate changes, and improve patient safety.

Doctors are also obliged to take part in reporting and learning from incidents. The General Medical Council (GMC) says in Good Medical Practice that doctors must “take part in confidential enquiries and adverse event recognition and reporting to help reduce risk to patients.”

Common errors

While these scenarios may sound like unlikely chains of events, the Medical Defence Union (MDU) has found that such cases are by no means isolated incidents. For example, with the average GP prescribing around 15 000 items a year, it is hardly surprising that 25% of all settled medical negligence claims against GP members of the MDU relate to medication errors. Even a GP with a prescribing accuracy of 99.9% can expect at least 15 adverse events due to prescribing errors each year, some with potentially serious consequences.
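That figure is simple arithmetic based on the numbers quoted above, assuming, for the purpose of the calculation, that every prescribing error counts as an adverse event:

    15 000 items a year × (100% − 99.9%) error rate = 15 errors a year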

The chief medical officer, Sir Liam Donaldson, set the NHS a target of reducing the number of serious errors in the use of prescribed drugs by 40% by 2005. Setting up and using a robust system to learn from events can help prevent inadvertent harm to patients from incidents related to prescribing, delayed diagnosis, and poor communication and record keeping, particularly as MDU research shows that around a third of all complaints about GPs stem from system errors rather than from a lack of clinical knowledge.

Fair blame culture

Implementing a reporting system is not easy. In primary care, a practice should nominate someone to oversee the process, to ensure an appropriate level of investigation and that lessons are learnt and fed back to staff. In a hospital setting, this person is usually the trust risk manager. It is vital to develop a culture in which all staff feel comfortable about reporting incidents. They need to understand the benefits of reporting incidents and to be involved in setting up the system. Staff need to be reassured that the system is not intended as a disciplinary procedure and encouraged to report incidents in the knowledge that they will not be unfairly blamed for doing so.

The aim of this process is to get to the underlying causes of problems with processes and systems, so as to prevent or reduce the risk of recurrence. Occasionally, however, circumstances might suggest a problem with the performance, conduct, or health of a healthcare professional. The GMC advises that you must act to protect patients, initially by investigating concerns to establish whether they are well founded. If such a situation arose during the analysis of an adverse event, you would need to refer the matter to any local procedures in place for addressing performance, conduct, or health issues. The GMC advises that if no such systems are in place, or if local systems cannot resolve the problem and a doctor remains concerned about patient safety, they should inform the relevant regulatory body. These situations, though rare, are often difficult to resolve, and it is wise to discuss the issues and options with your defence organisation.

Setting up an adverse incident reporting system

  1. Identify categories of incidents to be reported

    For example:

    • failure or delay in diagnosis (malignancy, orthopaedic problem, and so on)

    • medication error (wrong drug or dose, contraindicated drug, and so on)

    • system failure (test results, lost messages, and so on)

    • record problem (unavailable, failure to record information, and so on)

    • procedural or surgical error (consent, postoperative care, lack of equipment, and so on)

    The NPSA has developed a dataset which is tailored to each healthcare setting.

  2. Establish the method for reporting incidents

    The NPSA is rolling out a national reporting system, the National Reporting and Learning System (NRLS), and has developed an electronic reporting form, but practices or trusts may already have devised their own local system. The NRLS interfaces with all the commercial risk management systems in use in most NHS organisations.

  3. Nominate an individual to be responsible for overseeing the process

  4. Train staff

    Staff need to know what is expected of them so that they can understand the benefits of a reporting system and take part. They will also need to be reassured that the system is not intended as a disciplinary procedure.

  5. Report an adverse incident to the nominated person

    The person who witnessed, discovered, or was involved in the event should complete the appropriate form as soon as possible after the incident. The report should be factual.

  6. Feed back the findings

    Staff should receive information about the outcome of the process, lessons learnt, and any proposals for change.

  7. Deal with the consequences

It goes without saying that you will also need to deal with the consequences of the incident. When something has gone wrong, you should explain to the patient what has happened and offer an apology when appropriate.

The box shows how an adverse incident reporting system can be established to complement the processes already in place to assess and minimise risks.
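By way of illustration only, the sketch below shows one way a practice's local electronic reporting form (steps 1, 2, and 5 in the box) might be represented in software. It is a hypothetical example written in Python: the record fields, category names, and the IncidentReport class are assumptions made for this sketch and are not the NPSA dataset or the NRLS reporting form, whose content is defined by the NPSA for each healthcare setting.

    from dataclasses import dataclass, field
    from datetime import datetime
    from enum import Enum

    class IncidentCategory(Enum):
        # Categories taken from step 1 of the box; a real dataset would be far more detailed
        DIAGNOSIS_FAILURE_OR_DELAY = "failure or delay in diagnosis"
        MEDICATION_ERROR = "medication error"
        SYSTEM_FAILURE = "system failure"
        RECORD_PROBLEM = "record problem"
        PROCEDURAL_OR_SURGICAL_ERROR = "procedural or surgical error"

    @dataclass
    class IncidentReport:
        """Hypothetical local incident report (step 5): completed promptly and kept factual."""
        reported_by: str              # the person who witnessed, discovered, or was involved
        category: IncidentCategory
        description: str              # a factual account of what happened
        near_miss: bool               # a "prevented patient safety incident" in NPSA terms
        occurred_at: datetime
        reported_at: datetime = field(default_factory=datetime.now)

    # Example: recording the laryngoscope near miss described earlier
    report = IncidentReport(
        reported_by="duty anaesthetist",
        category=IncidentCategory.SYSTEM_FAILURE,
        description="Blade on first laryngoscope came loose; spare had no functioning light.",
        near_miss=True,
        occurred_at=datetime(2004, 3, 1),
    )

The nominated person (step 3) could then collate reports of this kind, look for recurring categories, and feed the findings back to the team (step 6).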

Significant events

Learning from errors and near misses and sharing good practice is also central to the concept of significant event audit (SEA), an active approach to case analysis used mainly in primary care. It is an open and supportive discussion of a case, usually involving all those responsible for the patient's care. Apart from helping to identify practical changes to improve patient care and safety, a set of completed SEA records provides good evidence of reflective learning for the purposes of revalidation by the GMC. It is worth noting that SEA can also be used to celebrate a success in clinical care; it is not only for occasions when things go wrong, as in the following fictitious example.

A 46 year old woman telephoned her GP complaining of “flu like” symptoms, nausea, and some mild chest discomfort. The receptionist took a message, and the GP arranged to leave a prescription for an antibiotic for the patient to collect. Two days later the patient telephoned again, saying that she had retrosternal pain radiating to both arms and had vomited. The visit request was noted in the visit book. The patient telephoned again two hours later and said that the pain was worse and that no doctor had visited. The receptionist was unable to locate the duty doctor, and when she returned the patient's call she discovered that the patient had been taken by ambulance to hospital, where a myocardial infarction was diagnosed.

In this case the possible outcomes of a SEA would include a review of the management of the patient, particularly with regard to the practice's communication and messaging systems. The case also highlights the need to develop a robust telephone protocol for receptionists and a training programme so that reception staff know when to refer a call to a doctor. It also raises questions about the arrangements for patients to access GPs.

Summary

Analysing incidents and near misses is something that doctors and other healthcare professionals have been doing informally for many years. But, with the help of initiatives such as the NPSA's National Reporting and Learning System, this is set to become a far more effective and valuable process in future, with the goal of greatly improving patient safety.

Footnotes

  • Doctors with specific concerns are advised to contact their medical defence organisation for advice.
